About the role
Join a high-impact U.S.-based startup transforming the music industry through automation. As a senior data engineer, you'll architect intelligent pipelines that help music creators get paid fairly by reconciling millions of rows of royalty data across 15+ streaming platforms. You'll build the data layer from scratch using modern tools and take full ownership of performance, quality, and auditability.
About you
● You have 5+ years of experience as a data engineer working on production-grade ETL pipelines.
● You're passionate about clean, reliable data and solving complex transformation problems.
● You communicate effectively with both technical and non-technical stakeholders.
● You thrive in fast-paced environments and love building systems from the ground up.
What you'll be doing
● Design and implement an end-to-end ETL pipeline to process millions of rows of royalty data per day.
● Develop platform-specific CSV parsers and build a robust dbt project with clear transformation layers.
● Integrate with streaming APIs (Spotify, Chartmetric) for data enrichment and validation.
● Ensure high data quality using frameworks like Great Expectations.
● Build monitoring dashboards and set up alerts to maintain pipeline health.
● Normalize financial data across currencies and territories, and handle song metadata matching (see the sketch below).
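To make that last responsibility concrete, here is a minimal, hypothetical sketch of the currency-normalization step in Python. The `RATES_TO_USD` table, the column names (`isrc`, `currency`, `amount`), and the `normalize_royalties` helper are illustrative assumptions, not our actual schema; a production pipeline would source rates from a versioned reference table and express the checks declaratively, e.g. as a Great Expectations suite or dbt tests.

```python
from decimal import Decimal

import pandas as pd

# Hypothetical FX table (illustrative rates); in production these would
# come from a versioned reference source keyed by statement period.
RATES_TO_USD = {
    "USD": Decimal("1.00"),
    "EUR": Decimal("1.08"),
    "GBP": Decimal("1.27"),
}


def normalize_royalties(df: pd.DataFrame) -> pd.DataFrame:
    """Convert per-platform royalty rows to USD with basic sanity checks.

    Assumes columns: isrc, territory, currency, amount (illustrative names).
    """
    # Lightweight quality gates in the spirit of Great Expectations:
    # no missing identifiers, no unknown currencies, no negative amounts.
    if df["isrc"].isna().any():
        raise ValueError("royalty rows with missing ISRC")
    unknown = set(df["currency"]) - RATES_TO_USD.keys()
    if unknown:
        raise ValueError(f"unmapped currencies: {sorted(unknown)}")
    if (df["amount"] < 0).any():
        raise ValueError("negative royalty amounts")

    out = df.copy()
    # Decimal arithmetic avoids float rounding drift in financial sums.
    out["amount_usd"] = [
        Decimal(str(amount)) * RATES_TO_USD[currency]
        for amount, currency in zip(df["amount"], df["currency"])
    ]
    return out
```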
What we're looking for
● Proactivity and a solution-oriented mindset.
● Attention to detail in financial data handling.
● Ability to prioritize tasks and manage time effectively.
● Comfort working independently in a startup environment.
Technical requirements
Must-haves:
● 5+ years of experience in data engineering roles.
● 3+ years of experience with dbt in production.
● Advanced SQL and Python skills.
● Experience with orchestration tools like Airflow or Dagster.
● Strong knowledge of PostgreSQL, including schema design and optimization.
● Familiarity with Git and CI/CD pipelines.
Nice-to-haves:
● Knowledge of the music industry or financial reconciliation processes.
● Experience with Great Expectations or similar tools for data quality.
● Integration experience with 5+ third-party APIs.
● Experience handling multi-currency and multi-territory data.
● Experience with dbt Cloud, Terraform, Looker, or Metabase.