We are currently seeking an experienced Data Architect / Senior Data Engineer to join a global data & analytics initiative, working onsite in Monterrey, Mexico. This is a contractor position, and all interviews and day-to-day work will be conducted in English, so advanced/fluent English is required.

In this role, you will be responsible for designing, building, and optimizing scalable data pipelines and cloud data architecture, enabling analytics, machine learning, and business intelligence across the organization.

Additional information:
- Location: Monterrey, Mexico (onsite at client's office)
- Contract type: Contractor
- Duration: 12 months (extendable)
- Language: English (mandatory for interviews and daily work)

Responsibilities:
- Design, build, and maintain ETL/ELT pipelines to ingest, transform, and store data from multiple sources
- Develop and manage data lakes, data warehouses, and lakehouse architectures, primarily using Azure Databricks
- Implement and optimize batch and real-time data processing using Apache Spark, Databricks, and Kafka
- Optimize data workflows for performance, scalability, reliability, and cost efficiency
- Apply best practices for data modeling, partitioning, indexing, and schema design
- Work with structured and unstructured data across SQL and NoSQL environments
- Ensure data quality, governance, security, and compliance (e.g., GDPR, CCPA)
- Maintain data documentation, metadata management, and data lineage
- Collaborate closely with data analysts, engineers, and global stakeholders to support analytics initiatives

Required qualifications:
- 8+ years of experience in data engineering, data architecture, or related roles
- Strong hands-on experience with Azure Databricks and Azure Data Factory (ADF)
- Proficiency in Scala, Python, and SQL
- Solid experience with Apache Spark and large-scale data processing
- Strong understanding of medallion architecture, lakehouse concepts, and cloud data platforms
- Experience designing scalable, secure, and high-performance data solutions
- Ability to work onsite in Monterrey and collaborate with global teams in English

Nice-to-have skills:
- Experience with streaming data platforms (Kafka, Event Hubs, Kinesis)
- Familiarity with CI/CD pipelines and infrastructure as code (Terraform, CloudFormation)
- Cloud certifications (Azure, AWS, or GCP)