Senior Associate - Data Engineer

4 - 7 years

3 - 7 Lacs

Posted: 1 week ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

  • We are looking for a skilled Data Engineer with expertise in Snowflake and ETL processes to build, maintain, and troubleshoot data pipelines that support research data services.
  • This role ensures that the data infrastructure is robust and scalable and that it delivers accurate, timely data to end users.

Essential Skills

  • 5+ years of experience in data engineering or ETL development.
  • Strong experience in developing and debugging ETL pipelines using Python.
  • Hands-on expertise in Databricks for data processing and workflow orchestration.
  • Proficiency in SQL for complex queries, transformations, and performance tuning.
  • Deep knowledge of Snowflake for data loading, modeling, and optimization.
  • Ability to test, validate, and optimize pipelines for accuracy, reliability, and performance.
  • Proficiency in documenting pipeline architecture and operational processes for maintainability.
  • Knowledge of data integrity, quality assurance, and best practices for robust data workflows.
  • Familiarity with CI/CD practices and version control for data engineering.
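
Several of these skills come together in routine pipeline work. As a minimal, hedged sketch (the table and column names are invented for illustration), a Python ETL step might assemble an idempotent Snowflake `MERGE` statement for an upsert:

```python
# Minimal sketch: build an idempotent MERGE statement for a Snowflake
# upsert. Table and column names here are hypothetical; in a real
# pipeline the statement would be executed via a Snowflake connector
# or a Databricks job rather than printed.

def build_merge_sql(target, staging, key, cols):
    """Return a MERGE that upserts rows from `staging` into `target` on `key`."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    insert_cols = ", ".join([key] + cols)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

sql = build_merge_sql("research.prices", "staging.prices", "ticker",
                      ["close_px", "as_of"])
print(sql)
```

Because the `MERGE` keys on `ticker`, re-running the same load updates existing rows instead of duplicating them, which keeps the step safe to retry.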

Good to Have

  • Experience with cloud platforms (AWS, Azure, GCP).
  • Familiarity with ETL orchestration tools (Airflow, dbt).
  • Familiarity with research data sources (e.g., Capital IQ).
  • Knowledge of Power BI and Tableau.
  • Awareness of data security and compliance in financial data.

Key Responsibilities

  • Write and debug ETL pipelines in Python on Databricks leveraging Snowflake
  • Develop and implement data transformations and loading processes in Snowflake
  • Test and validate new pipelines for accuracy, performance, and reliability
  • Document pipelines, architecture, and operational procedures; conduct knowledge transfer
  • Maintain data integrity and quality across ingestion and transformation layers
  • Collaborate with Data Quality Analysts to embed validation checks in pipelines
  • Optimize Snowflake usage (compute, warehouse sizing, caching, micro-partitions)
  • Ensure observability: logging, metrics, alerts, and SLAs for pipelines
  • Manage jobs/workflows and handle incident response and root-cause analysis
  • Uphold security, compliance, and governance standards
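
The validation-check responsibility above can be sketched in plain Python. This is a hedged illustration (the rules, field names, and sample rows are invented; real checks would run against Databricks/Snowflake tables before loading):

```python
# Minimal sketch of a pre-load validation check a pipeline might embed.
# Required fields and sample rows are hypothetical.

def validate_rows(rows, required_fields):
    """Return a list of error strings; an empty list means the batch passes."""
    errors = []
    if not rows:
        errors.append("empty batch")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                errors.append(f"row {i}: missing {field}")
    return errors

batch = [
    {"ticker": "ABC", "close_px": 101.5},
    {"ticker": "", "close_px": 99.0},  # fails the required-field check
]
issues = validate_rows(batch, required_fields=["ticker", "close_px"])
print(issues)  # → ['row 1: missing ticker']
```

Failing batches would then be quarantined and surfaced through the pipeline's logging and alerting rather than loaded silently.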

Key Qualifications

  • Proven Snowflake expertise.
  • ETL development and troubleshooting experience.
  • Strong SQL and Python proficiency.
  • Strong documentation and knowledge transfer capabilities.
  • Demonstrated problem-solving skills in debugging and troubleshooting pipelines.
  • Evidence of collaboration with cross-functional teams (Data Quality, Analytics).

Behavioral Competencies

  • Good communication (verbal and written)
  • Experience in managing client stakeholders

Aarti Industries

Chemicals and Pharmaceuticals

Mumbai