Posted: 4 days ago
On-site | Full Time
Job Title: Data Engineer
Experience: 4-9 Years
Location: Noida, Chennai & Pune
Skills: Python, PySpark, Snowflake & Redshift
Key Responsibilities
Migration & Modernization
• Lead the migration of data pipelines, models, and workloads from Redshift to Snowflake/Yellowbrick.
• Design and implement landing, staging, and curated data zones to support scalable ingestion and consumption patterns.
• Evaluate and recommend tools and frameworks for migration, including file formats, ingestion tools, and orchestration.
• Design and build robust ETL/ELT pipelines using Python, PySpark, SQL, and orchestration tools (e.g., Airflow, dbt).
• Support both batch and streaming pipelines, with real-time processing via Kafka, Snowpipe, or Spark Structured Streaming.
• Build modular, reusable, and testable pipeline components that handle high volume and ensure data integrity.
• Define and implement data modeling strategies (star, snowflake, normalization/denormalization) for analytics and BI layers.
• Implement strategies for data versioning, late-arriving data, and slowly changing dimensions.
• Implement automated data validation and anomaly detection (using tools like dbt tests, Great Expectations, or custom checks).
• Build logging and alerting into pipelines to monitor SLA adherence, data freshness, and pipeline health.
• Contribute to data governance initiatives including metadata tracking, data lineage, and access control.
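To illustrate the automated data validation and alerting responsibilities above, here is a minimal sketch of custom batch-level DQ checks in plain Python. The `run_quality_checks` helper, its row layout, and the freshness threshold are illustrative assumptions, not part of any specific framework such as dbt tests or Great Expectations:

```python
from datetime import datetime, timedelta, timezone

def run_quality_checks(rows, freshness_limit=timedelta(hours=24), now=None):
    """Run basic DQ rules over a batch of ingested rows.

    Each row is assumed to be a dict with 'id' and 'loaded_at' keys
    (hypothetical schema for illustration). Returns a list of failure
    messages; an empty list means the batch passed.
    """
    now = now or datetime.now(timezone.utc)
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Primary-key integrity: no missing or duplicate keys.
        if row.get("id") is None:
            failures.append(f"row {i}: missing primary key 'id'")
        elif row["id"] in seen_ids:
            failures.append(f"row {i}: duplicate id {row['id']}")
        else:
            seen_ids.add(row["id"])
        # Freshness: data older than the SLA window is flagged.
        loaded_at = row.get("loaded_at")
        if loaded_at is None or now - loaded_at > freshness_limit:
            failures.append(f"row {i}: stale or missing 'loaded_at'")
    return failures
```

In practice such checks would be wired into the orchestrator so a non-empty failure list fails the pipeline task and triggers an alert, which is how SLA adherence and data freshness get monitored.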
Required Skills & Experience
• 10+ years in data engineering roles with increasing responsibility.
• Proven experience leading data migration or re-platforming projects.
• Strong command of Python, SQL, and PySpark for data pipeline development.
• Hands-on experience with modern data platforms like Snowflake, Redshift, Yellowbrick, or BigQuery.
• Proficient in building streaming pipelines with tools like Kafka, Flink, or Snowpipe.
• Deep understanding of data modeling, partitioning, indexing, and query optimization.
• Expertise with ETL orchestration tools (e.g., Apache Airflow, Prefect, Dagster, or dbt).
• Comfortable working with large datasets and solving performance bottlenecks.
• Experience in designing data validation frameworks and implementing DQ rules.
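As an illustration of the slowly-changing-dimension work mentioned in the responsibilities, below is a minimal Slowly Changing Dimension Type 2 merge sketched in plain Python. The `apply_scd2` helper and its record layout are hypothetical; a production version would typically be a `MERGE` statement in Snowflake or Redshift:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply an SCD Type 2 merge to an in-memory dimension table.

    dimension: list of dicts with keys 'key', 'attrs', 'valid_from',
               'valid_to' (None marks the current version).
    incoming:  dict mapping business key -> latest attribute dict.
    Returns the updated dimension list.
    """
    today = today or date.today()
    current = {r["key"]: r for r in dimension if r["valid_to"] is None}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is None:
            # Brand-new key: open a fresh current record.
            dimension.append({"key": key, "attrs": attrs,
                              "valid_from": today, "valid_to": None})
        elif row["attrs"] != attrs:
            # Attribute change: close the old version, open a new one.
            row["valid_to"] = today
            dimension.append({"key": key, "attrs": attrs,
                              "valid_from": today, "valid_to": None})
    return dimension
```

Keeping closed versions rather than overwriting them is what lets BI layers reconstruct history as of any date, which is the point of Type 2 over a simple upsert.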
TerraGiG