ID: 14686
*What you will do*
- Design and implement data pipelines using dbt for transformation and modeling;
- Manage and optimize data warehouse solutions on Snowflake;
- Develop and maintain ETL processes using Fivetran for data ingestion;
- Utilize Terraform for infrastructure as code (IaC) to provision and manage resources in AWS, Snowflake, Kubernetes, and Fivetran;
- Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions;
- Implement workflow automation using Argo Workflows to streamline data processing tasks;
- Ensure data quality and integrity throughout the data lifecycle.
*Must haves*
- Bachelor’s degree in Computer Science, Engineering, or a related field;
- 5+ years of experience working with Python;
- Proven experience as a Data Engineer with a focus on dbt, Snowflake, Argo Workflows, and Fivetran;
- Strong SQL skills for data manipulation and querying;
- Experience with cloud platforms such as AWS or Azure;
- Experience with Kubernetes;
- Familiarity with data modeling concepts and best practices;
- Excellent problem-solving skills and attention to detail;
- Ability to work independently and collaborate effectively in a team environment;
- Upper-intermediate English level.
*Nice to haves*
- 2+ years of experience with Go (Golang).
*The benefits of joining us*
- *Professional growth*
Accelerate your professional journey with mentorship, tech talks, and personalized growth roadmaps.
- *Competitive compensation*
We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
- *A selection of exciting projects*
Join projects involving modern solution development for top-tier clients, including Fortune 500 enterprises and leading product brands.
- *Flextime*
Tailor your schedule for an optimal work-life balance with the option to work from home or from the office, whichever makes you happiest and most productive.
Work location: remote