About the role
We are seeking an experienced GCP Data Engineer to lead end-to-end development of complex data engineering use cases and drive the evolution of our cloud-based data platform. The ideal candidate combines deep technical expertise in cloud-native data technologies with proven leadership skills and a passion for building robust, scalable data platforms that deliver strategic business insights.
* Design and implement enterprise-scale data pipelines and platform architecture for end-to-end data products.
* Develop fault-tolerant, petabyte-scale data processing systems using advanced GCP services.
* Evaluate and recommend new technologies, tools, and architectural approaches.
Key requirements
* GCP services: 5+ years of hands-on experience with BigQuery (advanced SQL, scripting, ML integration), Cloud Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, and Vertex AI.
* Programming: expert-level Python and Java; proficiency in Python or Scala for Spark development.
* Advanced technologies: deep experience with Apache Beam, Airflow, Kubernetes, Docker, and distributed computing.
Drive innovation
In this role, you will design and implement cutting-edge data solutions that drive business growth and improve customer experiences. If you are passionate about staying current with the latest trends and technologies, we want to hear from you.