Requirements
* Experience as a DataOps Engineer working with AWS
* Strong knowledge of AWS services, including Amazon S3, Redshift, Glue, and IAM
* Experience designing and maintaining data pipelines
* Knowledge of data lake and data warehouse architectures
* Experience with workflow orchestration tools such as Airflow
* Experience deploying data infrastructure using Terraform
* Knowledge of monitoring, alerting, and cost management in AWS
* Experience managing AWS permissions and access policies
* Experience integrating external data sources into cloud platforms
* Ability to analyze data pipeline issues and perform root cause analysis
* Strong collaboration and communication skills
Responsibilities
* Design, build, and maintain data pipelines in AWS
* Build and manage data lake and data warehouse pipelines using Amazon S3 and Redshift
* Deploy data infrastructure and pipelines using Terraform
* Create and manage pipelines across development, UAT, and production environments
* Manage AWS permissions and IAM policies for data platform access
* Monitor data pipelines and AWS services, including alerting and cost monitoring
* Perform root cause analysis for data and infrastructure issues
* Coordinate and engage AWS Support for troubleshooting and issue resolution
* Ingest and process Shopify, Segment, and GA4 data into the data lake and warehouse
* Support data transformations using AWS Glue and Airflow (see the sketch after this list)
* Maintain operational documentation and improve DataOps processes
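For a sense of the day-to-day work, here is a minimal sketch of the Glue/Airflow orchestration described above. It assumes Airflow 2.x with the apache-airflow-providers-amazon package and a configured AWS connection; the DAG id, Glue job name, schedule, and region are hypothetical placeholders, not this team's actual pipeline:

```python
# Minimal illustrative sketch: trigger an existing AWS Glue job from
# Airflow and wait for it to complete. Assumes Airflow 2.4+ with
# apache-airflow-providers-amazon installed; all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="shopify_to_lake",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a pre-existing Glue job that transforms raw Shopify exports
    # landed in S3 into the curated data lake layer.
    transform = GlueJobOperator(
        task_id="transform_shopify_raw",
        job_name="shopify-raw-to-lake",  # hypothetical Glue job name
        region_name="us-east-1",
        wait_for_completion=True,
    )
```

In a setup like this, the Glue job itself and its IAM role would typically be provisioned with Terraform, in line with the infrastructure responsibilities above.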
Required Languages
Advanced English (80-95%)
Location
Mexico/Bogotá and surroundings (hybrid); rest of Colombia (remote)