8 - 13 years

2 Lacs

Posted: Just now | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description


Data Engineer Position Overview

Role Summary

We are searching for a talented and motivated Data Engineer to join our team. The ideal candidate will have expertise in data modeling, analytical thinking, and developing ETL processes using Python. In this role, you will be pivotal in transforming raw data from landing tables into reliable, curated master tables, ensuring accuracy, accessibility, and integrity within our Snowflake data platform.

Experience

Overall 6+ years of experience, including 4+ years of relevant experience

Main Responsibilities

Design, Develop, and Maintain ETL Processes:

Build and maintain scalable ETL pipelines in Python to extract, transform, and load data into Snowflake master tables. Automate data mastering, manage incremental updates, and ensure consistency between landing and master tables.

Data Modeling:

Create and optimize logical and physical data models in Snowflake for efficient querying and reporting. Translate business needs into well-structured data models, defining tables, keys, relationships, and constraints.

Analytical Thinking and Problem Solving:

Analyze complex datasets, identify trends, and work with analysts and stakeholders to resolve data challenges. Investigate data quality issues and design robust solutions aligned with business goals.

Data Quality and Governance:

Implement routines for data validation, cleansing, and error handling to ensure accuracy and reliability in Snowflake. Support the creation and application of data governance standards.

Automation and Optimization:

Seek automation opportunities for data engineering tasks, enhance ETL processes for performance, and scale systems as data volumes grow within Snowflake.

Documentation and Communication:

Maintain thorough documentation of data flows, models, transformation logic, and pipeline configurations. Clearly communicate technical concepts to all stakeholders.



Collaboration:

Work closely with data scientists, analysts, and engineers to deliver integrated data solutions, contributing to cross-functional projects with your data engineering expertise.

Required Qualifications

Bachelor’s or Master’s degree in Computer Science, IT, Engineering, Mathematics, or a related field

At least 2 years of experience as a Data Engineer or similar role

Strong Python skills, including experience developing ETL pipelines and automation scripts

Solid understanding of relational and dimensional data modeling

Hands-on experience with Snowflake, including SQL development, schema design, and pipeline management

Proficient in SQL for querying and data analysis in Snowflake

Strong analytical and problem-solving skills

Familiarity with data warehousing concepts and best practices

Knowledge of data quality, cleansing, and validation techniques

Experience with version control systems like Git and collaborative workflows

Excellent communication, both verbal and written

Preferred Qualifications

In-depth knowledge of Snowflake features like Snowpipe, Streams, Tasks, and Time Travel

Experience with cloud platforms such as AWS, Azure, or Google Cloud

Familiarity with workflow orchestration tools like Apache Airflow or Luigi

Understanding of big data tools like Spark, Hadoop, or distributed databases

Experience with CI/CD pipelines in data engineering

Background in streaming data and real-time processing

Experience deploying data pipelines in production

Sample Responsibilities in Practice

Develop automated ETL pipelines in Python to ingest daily CSVs into a Snowflake landing table, validate data, and merge clean records into a master table, handling duplicates and change tracking.

Design scalable data models in Snowflake to support business intelligence reporting, ensuring both integrity and query performance.

Collaborate with business analysts to adapt data models and pipelines to evolving needs.

Monitor pipeline performance and troubleshoot inconsistencies, documenting causes and solutions.
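The first sample responsibility above, validating landing rows and merging clean records into a master table with duplicate handling and change tracking, could be sketched as follows. This is a minimal, self-contained Python sketch: in a production pipeline the final step would typically be a Snowflake MERGE statement issued through a connector, and the record fields used here (id, amount) are hypothetical.

```python
from datetime import date

def validate(record):
    # A landing row is considered valid (for this sketch) if it has a
    # non-empty id and a non-negative amount that parses as a float.
    try:
        return bool(record.get("id")) and float(record.get("amount", "")) >= 0
    except ValueError:
        return False

def merge_into_master(master, landing, batch_date):
    """Upsert validated landing rows into the master dict, keeping the
    last occurrence of each duplicate id and stamping changed rows with
    the batch date (a simple form of change tracking)."""
    clean = {}
    for rec in landing:
        if validate(rec):
            clean[rec["id"]] = rec  # later duplicates overwrite earlier ones
    for rid, rec in clean.items():
        existing = master.get(rid)
        if existing is None or existing["amount"] != float(rec["amount"]):
            master[rid] = {
                "id": rid,
                "amount": float(rec["amount"]),
                "updated_on": batch_date,  # change tracking
            }
    return master

# Example batch: one duplicate id and one invalid row.
master = {}
landing = [
    {"id": "A1", "amount": "10.0"},
    {"id": "A1", "amount": "12.5"},  # duplicate: last record wins
    {"id": "",   "amount": "99"},    # invalid: missing id, dropped
]
master = merge_into_master(master, landing, date(2024, 1, 1))
```

In Snowflake itself, the same upsert semantics map naturally onto a MERGE from the landing table into the master table keyed on the id column.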

Key Skills and Competencies

Technical Skills: Python (including pandas, SQLAlchemy); Snowflake SQL and management; schema design; ETL process development

Analytical Thinking: Ability to translate business requirements into technical solutions; strong troubleshooting skills

Collaboration and Communication: Effective team player; clear technical documentation

Adaptability: Willingness to adopt new technologies and proactively improve processes

Our Data Environment

Our organization manages diverse data sources, including transactional systems, third-party APIs, and unstructured data. We are dedicated to building a top-tier Snowflake data infrastructure for analytics, reporting, and machine learning. In this role, you will influence our data architecture, implement modern data engineering practices, and contribute to a culture driven by data.


Algoleap Technologies

Information Technology

San Francisco
