Role: Data Engineer – GCP / BigQuery
Role Summary
We are seeking a highly skilled Data Engineer with strong experience in Google Cloud Platform (GCP), particularly BigQuery, to design, build, and maintain scalable data pipelines. The ideal candidate will have hands-on experience with Python, SQL, Airflow/Composer, and CI/CD practices, and will play a key role in expanding and migrating Adobe-based data pipelines while integrating and operationalizing Orion datasets across hybrid cloud environments.
Required Skills and Qualifications
Experience: 7 to 9 years of applicable engineering experience
* Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.
* Proficiency in Python and SQL for data engineering use cases.
* Experience with Apache Airflow and Cloud Composer.
* Hands-on experience with GitHub and CI/CD pipelines.
* Solid understanding of data warehousing concepts, including:
  * Star schemas
  * Dimensional modeling
  * OLTP vs. OLAP architectures
* Experience designing and supporting ETL/ELT pipelines in large-scale data environments.
* Familiarity with Adobe data platforms and event-based data pipelines.
* Experience working in multi-cloud and hybrid environments (AWS, GCP, Snowflake).
* Strong problem-solving skills and attention to data quality and performance.
Key Responsibilities
* Design, build, and maintain ETL/ELT data pipelines using Python, SQL, Airflow, and Cloud Composer.
* Support the migration and expansion of Adobe-based data pipelines.
* Integrate Orion datasets into existing enterprise datasets and data platforms.
* Develop new data pipelines for Orion event data collection.
* Implement and manage data ingestion, transformation, and orchestration workflows in BigQuery.
* Optimize pipeline performance, including runtime, compute usage, storage tiering, and query costs, with a strong focus on BigQuery cost optimization.
* Establish data quality checks, validation logic, and reconciliation processes to ensure data accuracy and reliability.
* Work across hybrid data environments, supporting data movement and transformation across AWS, GCP, and Snowflake.
* Implement and maintain CI/CD pipelines using GitHub and enterprise DevOps practices.
* Create and maintain clear, comprehensive documentation for all data pipelines, integrations, and operational processes.
* Collaborate with cross-functional teams to support analytics, reporting, and downstream data consumers.
Languages
* Fluent English (mandatory)
Our Benefits:
* 100% payroll
* Major medical insurance
* Vision and dental insurance
* Life insurance
* 3% food coupons
* 12 vacation days
* 15 days' Christmas bonus (aguinaldo)
* 25% vacation bonus (prima vacacional)
* IMSS / AFORE / INFONAVIT
* Home office bonus
LTM is an equal opportunity employer committed to diversity in the workplace. Employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, disability, genetic information, union affiliation, affectional or sexual orientation, or any other characteristic protected by applicable law.