Core responsibilities
- Build and manage Airflow DAGs (TaskFlow API, dynamic DAGs, deferrable operators).
- Develop Python-based ETL/ELT pipelines for APIs, object storage, Kafka, and databases.
- Operate Airflow on Azure/Kubernetes with blue/green or canary releases.
- Implement data quality checks, unit tests, and CI/CD DAG validation.
- Build near-real-time event pipelines with proper schema management.
- Model SQL/blob datasets with optimal partitioning and retention.
- Drive observability (metrics, logs, alerts, SLAs) and lead incident reviews.
- Enforce security best practices: IAM, secrets, RBAC, PII handling, data contracts.
- Build CI/CD pipelines, containerize with Docker, and use Terraform/Helm for infrastructure.
- Optimize performance, cost, and parallelism.
- Collaborate with Android/backend teams on data interfaces and documentation.

Qualifications
- 8+ years in data/backend engineering with strong Python skills.
- 2+ years with Airflow 2.x, including advanced operators/hooks and scheduler tuning.
- Experience building reliable large-scale ETL/ELT pipelines (batch and streaming).
- Strong SQL and data modeling; hands-on experience with BigQuery/Redshift/Snowflake.
- Production debugging, tuning, and monitoring experience.
- Knowledge of RBAC, secrets management, OAuth/OIDC, and API gateways.
- Strong communication and documentation skills.
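As a rough illustration of the data-quality work listed above, the kind of check involved can be sketched in plain Python as a null-rate guard on a required column; the threshold, function names, and row format here are illustrative assumptions, not part of the role description.

```python
# Minimal data-quality check: fail a pipeline step when too many rows
# carry a NULL (None) in a required column. Names and the 1% default
# threshold are illustrative assumptions.

def null_fraction(rows, column):
    """Fraction of rows where `column` is None (0.0 for empty input)."""
    if not rows:
        return 0.0
    nulls = sum(1 for row in rows if row.get(column) is None)
    return nulls / len(rows)

def check_required_column(rows, column, max_null_fraction=0.01):
    """Raise ValueError if the null rate for `column` exceeds the threshold."""
    frac = null_fraction(rows, column)
    if frac > max_null_fraction:
        raise ValueError(
            f"column {column!r}: null fraction {frac:.2%} exceeds "
            f"threshold {max_null_fraction:.2%}"
        )
    return frac
```

In an Airflow pipeline a check like this would typically run as its own task, so a threshold breach fails loudly and alerts before bad data reaches downstream consumers.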
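The schema-management responsibility for near-real-time event pipelines can likewise be sketched as a lightweight per-event validation step; the event fields and schema shape below are hypothetical examples, not a schema defined by this posting.

```python
# Lightweight event-schema check for a streaming consumer: verify that
# an incoming event carries the expected fields with the expected types
# before forwarding it downstream. The schema itself is hypothetical.

EVENT_SCHEMA = {
    "event_id": str,
    "user_id": str,
    "ts": int,   # epoch milliseconds (assumed convention)
}

def validate_event(event, schema=EVENT_SCHEMA):
    """Return a list of problems; an empty list means the event is valid."""
    problems = []
    for field, expected_type in schema.items():
        if field not in event:
            problems.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            problems.append(
                f"field {field!r}: expected {expected_type.__name__}, "
                f"got {type(event[field]).__name__}"
            )
    return problems
```

In production this per-message check is usually paired with a schema registry so producers and consumers evolve event shapes compatibly rather than relying on ad hoc dictionaries.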