4 - 7 years
5 - 14 Lacs
Posted: 3 weeks ago
Work from Office
Full Time
We are looking for an experienced Data Engineer to design, develop, and maintain our data pipelines, primarily focused on ingesting data into our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake and practical experience with AWS services, particularly using S3 as a landing zone and entry point to the Snowflake environment. You will be responsible for building efficient, reliable, and scalable data pipelines that are critical to our data-driven decision-making processes.

Role & responsibilities

1. Design, develop, implement, and maintain scalable and robust data pipelines to ingest data from various sources into the Snowflake data platform.
2. Utilize AWS S3 as the primary landing zone for data, ensuring efficient data transfer and integration with Snowflake.
3. Develop and manage ETL/ELT processes, focusing on data transformation, cleansing, and loading within the Snowflake and AWS ecosystem.
4. Write complex SQL queries and stored procedures in Snowflake for data manipulation, transformation, and performance optimization.
5. Monitor, troubleshoot, and optimize data pipelines for performance, reliability, and scalability.
6. Collaborate with data architects, data analysts, data scientists, and business stakeholders to understand data requirements and deliver effective solutions.
7. Ensure data quality, integrity, and governance across all data pipelines and within the Snowflake platform.
8. Implement data security best practices in AWS and Snowflake.
9. Develop and maintain comprehensive documentation for data pipelines, processes, and architectures.
10. Stay up to date with emerging technologies and best practices in data engineering, particularly related to Snowflake and AWS.
11. Participate in Agile/Scrum development processes, including sprint planning, daily stand-ups, and retrospectives.

Preferred candidate profile

1. Strong, hands-on proficiency with Snowflake: in-depth knowledge of Snowflake architecture and features (e.g., Snowpipe, Tasks, Streams, Time Travel, Zero-Copy Cloning); experience designing and implementing Snowflake data models (schemas, tables, views); expertise in writing and optimizing complex SQL queries in Snowflake; experience with data loading and unloading techniques in Snowflake.
2. Solid experience with AWS Cloud services: proficiency in using AWS S3 for data storage, staging, and as a landing zone for Snowflake; experience with other relevant AWS services (e.g., IAM for security, Lambda for serverless processing, Glue for ETL, where applicable).
3. Strong experience in designing and building ETL/ELT data pipelines, with proficiency in at least one programming language commonly used in data engineering (e.g., Python, Scala, Java); Python is highly preferred.
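For candidates unfamiliar with the S3-to-Snowflake loading pattern this role centers on, the sketch below builds the two Snowflake SQL statements such a pipeline typically issues: creating an external stage over an S3 location, then a COPY INTO from that stage. This is an illustrative sketch only; all object names and the bucket URI are hypothetical, and a production stage would additionally require credentials or a storage integration.

```python
# Illustrative sketch: generate the Snowflake SQL for loading files from an
# S3 landing zone into a table. Stage, table, and bucket names are
# hypothetical; a real CREATE STAGE would also need a STORAGE_INTEGRATION
# or CREDENTIALS clause to access the bucket.

def build_s3_load_sql(stage: str, table: str, s3_uri: str,
                      file_format: str = "TYPE = CSV SKIP_HEADER = 1") -> list[str]:
    """Return CREATE STAGE and COPY INTO statements for an S3-to-Snowflake load."""
    create_stage = (
        f"CREATE STAGE IF NOT EXISTS {stage} "
        f"URL = '{s3_uri}' "
        f"FILE_FORMAT = ({file_format});"
    )
    copy_into = f"COPY INTO {table} FROM @{stage};"
    return [create_stage, copy_into]

if __name__ == "__main__":
    for stmt in build_s3_load_sql("landing_stage", "raw_events",
                                  "s3://example-bucket/landing/"):
        print(stmt)
```

In practice these statements would be executed through a Snowflake session (e.g., via the `snowflake-connector-python` package), or the COPY step would be automated with Snowpipe for continuous ingestion.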
Safran Engineering Services
Aerospace & Defense
Approximately 600 Employees