Important Information
Years of experience: 7 years in product management, data engineering, or technical program management
Job mode: Full-time
Work mode: Remote

Job Summary
The Product Manager – Enterprise Data Lake will lead the delivery, execution, and rollout of a new enterprise data platform built on Snowflake or Databricks. This role is responsible for translating business and data requirements into technical execution plans and for driving the development of ingestion pipelines, data models, and cloud infrastructure that enable scalable analytics and data science operations. The ideal candidate combines strong technical understanding with exceptional project management and stakeholder coordination skills.

Responsibilities and Duties
- Own the delivery and execution of an enterprise data lake on Snowflake or Databricks, from initial setup through production rollout.
- Translate business and data requirements into clear, actionable technical stories and sprint plans for engineering teams.
- Define and manage data ingestion and integration pipelines from core SaaS platforms and enterprise systems.
- Partner with data engineers and architects to design efficient data models, storage layers, and optimization strategies.
- Oversee the full development lifecycle, ensuring milestones, dependencies, and deliverables are met on time.
- Prioritize and manage the product backlog, balancing business impact, technical complexity, and cross-functional dependencies.
- Build and maintain a phased delivery roadmap, including the MVP and subsequent enhancement releases.
- Collaborate with infrastructure and DevOps teams to establish reliable, scalable, and cost-effective cloud environments.
- Define and monitor operational KPIs such as data pipeline success rates, processing latency, and system uptime.
- Ensure readiness for production release, including documentation, monitoring, and a smooth handoff to analytics and data science teams.

Qualifications and Skills
- Proven experience leading data platform or data lake initiatives in cloud environments.
- Deep understanding of Agile methodologies, sprint planning, and backlog management.
- Strong technical literacy in data architecture, pipelines, and storage optimization.
- Excellent communication and stakeholder management skills across technical and business domains.
- Ability to balance short-term delivery with long-term platform scalability and maintainability.

Role-Specific Requirements
- Hands-on experience with data lake solutions on Snowflake or Databricks.
- Familiarity with data integration tools and ETL/ELT pipelines (e.g., Airflow, dbt, Fivetran, Azure Data Factory).
- Knowledge of cloud infrastructure (AWS, Azure, or GCP) and DevOps processes for data environments.
- Understanding of enterprise data governance, security, and compliance frameworks.
- Ability to define KPIs for operational efficiency and data reliability.

Technologies
- Snowflake / Databricks
- Python, SQL, dbt, Airflow
- AWS / Azure / GCP
- ETL/ELT tools (Fivetran, Data Factory, Glue, etc.)
- Agile tools (Jira, Confluence)

Skillset Competencies
- Data platform product management
- Agile execution and roadmapping
- Cross-functional collaboration
- Cloud data architecture understanding
- Backlog prioritization
- KPI tracking and reporting

About Encora
Encora is the preferred digital engineering and modernization partner of some of the world’s leading enterprises and digital native companies.
With over 9,000 experts in 47 offices and innovation labs worldwide, Encora’s technology practices include Product Engineering & Development, Cloud Services, Quality Engineering, DevSecOps, Data & Analytics, Digital Experience, Cybersecurity, and AI & LLM Engineering.

At Encora, we hire professionals based solely on their skills and qualifications, and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.