Job duties and responsibilities
* Design and implement scalable data models and transformations using dbt within the Microsoft Fabric ecosystem.
* Collaborate with business stakeholders to understand data needs and deliver high-quality datasets.
* Ensure data quality, consistency, and governance through testing, documentation, and version control.
* Optimize the performance of data transformations and queries across large datasets.
* Monitor and troubleshoot data workflows, proactively resolving issues.
* Contribute to the development of best practices and standards for data modeling, transformation, and source control.
Education and qualifications
* 2–5 years of experience in data engineering, analytics engineering, or a related field.
* Proficiency with SQL, ideally using dbt (Core or Cloud) for data transformation and modeling.
* Hands-on experience with Microsoft Fabric, including OneLake, Lakehouse, Warehouse, and Power BI artifacts.
* Strong SQL skills and experience working with large-scale data warehouses or lakehouses.
* Familiarity with Git-based version control and CI/CD practices.
* Excellent problem-solving skills and attention to detail.
* Strong communication and collaboration skills.
* Experience with Power BI semantic models and DAX.
Skills and competencies
* Experience with data models for SAP ECC, Oracle EBS, and/or QAD strongly preferred.
* Experience with data models for Salesforce, GetPaid, iNexus, Concur, SAP S/4HANA, SAP SuccessFactors, Hyperion Financial Management, Hyperion Planning (PBCS), and Siemens Teamcenter.
* Knowledge of data governance, lineage, and observability tools.
* Familiarity with Python or other scripting languages.
* Experience working in Agile or DevOps environments.