*Snowflake Data Engineer*
The ability to design, implement, and optimize large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is essential.
Expertise with Amazon S3 is a must.
*Responsibilities*:
A Snowflake Data Engineer is responsible for:
- Overall responsibility for managing and maintaining the Snowflake environment, from both an administration and a development standpoint.
- Implementing ELT pipelines within and outside of the data warehouse, including with Snowflake's SnowSQL.
- Querying Snowflake using SQL, with expertise in creating complex views and UDFs.
- Developing ELT jobs in Talend for extracting, loading, and transforming data.
- Assisting with production issues in the data warehouse, such as reloading data, transformations, and translations; quick at finding issues, with strong debugging experience.
- Developing database and reporting designs by creating complex views, based on business intelligence and reporting requirements.
- Supporting BI solutions that report on data extracted from CRM and ERP systems (e.g., Salesforce, CPQ, NetSuite, Oracle Apps, AX Dynamics, Siebel CRM, Oracle E-Business Suite).
*Qualifications*:
Snowflake Data Engineers are required to have the following qualifications:
- A minimum of one year of experience designing and implementing a full-scale data warehouse solution based on Snowflake.
- A minimum of three years of experience developing production-ready data ingestion and processing pipelines using ELT tools (Talend).
- Knowledge of Amazon S3 is a must.
- A solid understanding of data science concepts is an additional advantage.
- Data analysis expertise.
- Working knowledge of ELT tools such as Talend, Informatica, or similar.
- Knowledge of BI tools such as Tableau, Power BI, and Qlik Sense.
- Two years of hands-on experience with complex data warehouse solutions on Teradata, Oracle, or DB2 platforms.
- Expertise and excellent proficiency with Snowflake internals and the integration of Snowflake with other technologies for data processing and reporting.
- A highly effective communicator, both orally and in writing.
- Problem-solving and architecting skills when requirements are unclear.
- A minimum of one year of experience architecting large-scale data solutions, performing architectural assessments, examining architectural alternatives, and choosing the best solution in collaboration with both IT and business stakeholders.
- Extensive experience with Talend and Informatica, and with building data ingestion pipelines.
- Expertise with Amazon Web Services, Microsoft Azure, and Google Cloud.
- Good knowledge of MSSQL from a DBA perspective is an additional advantage.
*About Imperva*:
*Rewards*:
*Legal Notice*:
Imperva is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, ancestry, pregnancy, age, sexual orientation, gender identity, marital status, protected veteran status, medical condition or disability, or any other characteristic protected by law.