Job Title: Senior Data Pipeline Engineer (Python/Airflow)
Location: Guadalajara, Mexico

Responsibilities:
- Design, build, and maintain Airflow DAGs using TaskFlow, dynamic DAGs, deferrable operators, providers, and the secrets backend (a minimal TaskFlow sketch appears after the requirements list).
- Develop Python ETL/ELT code to ingest from APIs, object storage, message buses, and databases.
- Operate Airflow on managed or self-hosted platforms (e.g., Azure, Kubernetes deployments).
- Implement data quality and testing: unit tests for operators/hooks and DAG validation in CI (test sketches appear after the requirements list).
- Model and manage data stores across SQL and blob storage.
- Security & governance: apply least-privilege IAM, secrets management, PII handling, and data contracts; enforce RBAC in Airflow and in the warehouses.
- CI/CD & IaC: build pipelines to lint, test, and deploy DAGs and Python packages; provision infrastructure with Terraform/Helm; containerize with Docker.
- Cost & performance: tune task parallelism, autoscaling, storage formats, and compute footprints to optimize cost/performance.
- Collaboration: work closely with Android/backend teams to define interfaces and data contracts.

Requirements:
- 8+ years in data engineering or backend engineering with strong Python expertise.
- 2+ years with Airflow 2.
- Proven experience designing reliable ETL/ELT at scale (batch and streaming) with robust testing and monitoring.
- Strong SQL and data modeling skills; hands-on with one or more data warehouses (BigQuery, Redshift, Snowflake) and relational systems (PostgreSQL/MySQL).
- Experience with service integrations, API gateways, and secrets management (Vault/AWS Secrets Manager/GCP Secret Manager).
- Comfortable operating in production: monitoring, troubleshooting, and performance tuning.
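
For illustration, here is a minimal sketch of the kind of TaskFlow DAG with dynamic task mapping this role covers. The DAG name, endpoint list, and schedule are hypothetical placeholders, and the `schedule` parameter assumes Airflow 2.4+.

```python
# A minimal TaskFlow sketch with dynamic task mapping; endpoints are
# hypothetical placeholders, not a real API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def api_ingest():
    @task
    def list_endpoints() -> list[str]:
        # Hypothetical source list; in practice this might come from an
        # Airflow Variable or a config service.
        return ["https://api.example.com/orders", "https://api.example.com/users"]

    @task
    def extract(url: str) -> int:
        import requests

        # Fetch raw records; a real task would land them in object storage
        # and return the path so XCom payloads stay small.
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return len(response.content)

    @task
    def report(byte_counts: list[int]) -> None:
        # Downstream "reduce" step over the mapped extract results.
        print(f"ingested {sum(byte_counts)} bytes total")

    # Dynamic task mapping: one extract task per endpoint, decided at runtime.
    counts = extract.expand(url=list_endpoints())
    report(counts)


api_ingest()
```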
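
DAG validation in CI often amounts to a small pytest module over Airflow's DagBag. A sketch, assuming the repository keeps its DAG files under a `dags/` folder:

```python
# A minimal DAG-validation sketch for CI; the dags/ path is an assumption.
from airflow.models import DagBag


def test_dags_import_without_errors():
    # Parse every DAG file and fail the build if any raises on import.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    assert not dag_bag.import_errors, f"DAG import errors: {dag_bag.import_errors}"


def test_dags_have_tags():
    # A lightweight policy check that is cheap to enforce in CI.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    for dag_id, dag in dag_bag.dags.items():
        assert dag.tags, f"{dag_id} is missing tags"
```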
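
Operators can also be unit-tested by invoking `execute()` directly, with no scheduler or metadata database. A sketch using the stock BashOperator; hooks that reach external systems would typically be mocked with `unittest.mock` instead:

```python
# A minimal operator unit-test sketch using a built-in operator.
from airflow.operators.bash import BashOperator


def test_bash_operator_returns_last_stdout_line():
    # BashOperator.execute returns the last line of stdout, which Airflow
    # would push to XCom; calling it directly keeps the test hermetic.
    op = BashOperator(task_id="echo_hello", bash_command="echo hello")
    assert op.execute(context={}) == "hello"
```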