13 - 20 years
35 - 45 Lacs
Pune, Ahmedabad
Hybrid
Role & responsibilities We are seeking a skilled LEAD Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible for all users. Responsibilities Design and build scalable data pipeline architecture that can handle large volumes of data Develop ELT/ETL pipelines to extract, load and transform data from various sources into our data warehouse Optimize and maintain the data infrastructure to ensure high availability and performance Collaborate with data scientists and analysts to identify and implement improvements to our data pipeline and models Develop and maintain data models to support business needs Ensure data security and compliance with data governance policies Identify and troubleshoot data quality issues Automate and streamline processes related to data management Stay up-to-date with emerging data technologies and trends to ensure the continuous improvement of our data infrastructure and architecture. Analyze the data products and requirements to align with data strategy Assist in extracting or researching data for cross-functional business partners for consumer insights, supply chain, and finance teams Enhance the efficiency, automation, and accuracy of existing reports Follow best practices in data querying and manipulation to ensure data integrity Requirements Bachelor's or masters degree in computer science, Data Science, or a related field Must have 13 + years of experience as a Snowflake Data Engineer or related role Must have experience with Snowflake Strong Snowflake experience building, maintaining and documenting data pipelines Expertise in Snowflake concepts like RBAC management, virtual warehouse, file format, streams, zero copy clone, time travel and understand how to use these features Strong SQL development experience including SQL queries and stored procedures Strong knowledge of ELT/ETL no-code/low-code tools like Informatica / SnapLogic. Well versed in data standardization, cleansing, enrichment, and modeling Proficiency in one or more programming languages such as Python, Java, or C# Experience with cloud computing platforms such as AWS, Azure, or GCP Knowledge of ELT/ETL processes, data warehousing, and data modeling Familiarity with data security and governance best practices Excellent hands-on experience in problem-solving and analytical skills and improving the performance of processes Strong communication and collaboration skills
Posted 2 weeks ago