Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will support the end-to-end management of our data, including ETL/ELT, data warehouse/data lake (DW/DL), data staging, and data governance, and you will manage the different layers of data required to deliver successful BI & reporting for the PEC. You will work with multiple types of data spanning multiple functional areas of expertise, including fleet, MRO & energy, travel, and professional services, among others.
How will you do it?
* Serve as the main technical resource for any data-related requirement.
* Communicate technical knowledge effectively through project management and contributions to product strategy.
* Deploy data ingestion processes through Azure Data Factory to load data models into Azure Synapse as required.
* Design and build robust, modular, and scalable ETL/ELT pipelines with Azure Data Factory, Python, and/or dbt.
* Assemble large, complex data sets that meet functional and non-functional business requirements.
* Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using data lakehouse technologies and ADF.
* Develop data models that enable data visualization, reporting, and advanced analytics, striving for optimal performance across all models.
* Maintain conceptual, logical, and physical data models along with their corresponding metadata.
* Manage the DevOps pipeline deployment model, including automated testing procedures.
* Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules.
* Ensure compliance with system architecture, methods, standards, and practices, and participate in their creation.
* Communicate clearly with, and effectively influence, both business and technical teams.
* Perform the data ingestion, cleansing, transformation, and business-rule coding needed to support annual procurement bidding activities.
* Support the deployment of a global data standard.
* Create data tools that help analytics and data science team members build and optimize our product into an innovative industry leader.
* Support rate repository management as required, including rate card uploads to our DW.
* Perform other procurement duties as assigned.
What are we looking for?
* Bachelor's degree in a related field (engineering, computer science, data science, or similar).
* 4+ years of relevant professional experience in BI engineering, data modeling, data engineering, software engineering, or other relevant roles.
* Strong SQL knowledge and experience working with relational databases.
* Knowledge of DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems, and metadata management.
* Experience building and optimizing data pipelines, architectures, and data sets.
* Azure data engineering certification (DP-203) preferred.
* 4+ years of ETL/ELT development experience; ADF, dbt, and Snowflake preferred.
* Ability to resolve ETL/ELT problems by proposing and implementing tactical/strategic solutions.
* Strong project management and organizational skills.
* Experience with object-oriented and functional scripting languages such as Python, Scala, and R.
* Experience with NoSQL databases is a plus, to support the transition from on-premises to cloud.
* Excellent problem-solving, critical-thinking, and communication skills.
* Relevant experience with Azure DevOps (CI/CD, Git/repo management) is a plus.
* Due to the global nature of the role, proficiency in English is a must.