Sr. DevOps Engineer

Experience: 2 - 4 years

Salary: 7 - 11 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Position Overview


We are seeking a skilled Data Engineer with 2-4 years of experience to design, build, and maintain scalable data pipelines and infrastructure. You will work with modern data technologies to enable data-driven decision-making across the organisation.


Key Responsibilities


  • Design and implement ETL/ELT pipelines using Apache Spark and orchestration tools (Airflow/Dagster)
  • Build and optimize data models on Snowflake and cloud platforms
  • Collaborate with analytics teams to deliver reliable data for reporting and ML initiatives
  • Monitor pipeline performance, troubleshoot data quality issues, and implement testing frameworks
  • Contribute to data architecture decisions and work with cross-functional teams to deliver quality data solutions
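The pipelines described above follow an extract-transform-load pattern. As a minimal, framework-agnostic sketch of that pattern (plain Python, with the stdlib sqlite3 module standing in for Snowflake; the table, columns, and sample data are illustrative assumptions, not from the posting):

```python
import sqlite3

def extract(rows):
    """Extract: in production this would read from a source system or data lake."""
    return rows

def transform(rows):
    """Transform: normalise fields and drop records failing basic quality checks."""
    cleaned = []
    for order_id, amount, country in rows:
        if amount is None or amount < 0:
            continue  # a real pipeline would quarantine bad records, not silently drop them
        cleaned.append((order_id, round(amount, 2), country.strip().upper()))
    return cleaned

def load(conn, rows):
    """Load: idempotent upsert into the warehouse table, so reruns are safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

source = [(1, 19.99, " in "), (2, -5.0, "us"), (3, 42.5, "de")]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(source)))
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # prints 2: the negative-amount row is dropped
```

In Spark the transform step would be DataFrame operations and the three functions would become Airflow/Dagster tasks, but the staged structure and the idempotent load are the same idea.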


Required Skills & Experience


  • 2-4 years of experience in data engineering or related field
  • Strong proficiency with Snowflake including data modeling, performance optimisation, and cost management
  • Hands-on experience building data pipelines with Apache Spark (PySpark)
  • Experience with workflow orchestration tools (Airflow, Dagster, or similar)
  • Proficiency with dbt for data transformation, modeling, and testing
  • Proficiency in Python and SQL for data processing and analysis
  • Experience with cloud platforms (AWS, Azure, or GCP) and their data services
  • Understanding of data warehouse concepts, dimensional modeling, and data lake architectures
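Dimensional modeling, listed above, separates numeric measures (facts) from descriptive attributes (dimensions). A small star-schema sketch using stdlib sqlite3 (the schema and sample data are illustrative assumptions, not from the posting):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- dimension: one row per customer, descriptive attributes only
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
    -- fact: one row per sale, a foreign key to the dimension plus numeric measures
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             customer_key INTEGER REFERENCES dim_customer,
                             amount REAL);
""")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "Acme", "APAC"), (2, "Globex", "EMEA")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])

# A typical analytical query joins the fact to its dimension and aggregates
rows = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 75.0)]
```

On Snowflake the same shape applies at scale; the fact table grows fast and is the usual target of the clustering, pruning, and cost-management work the role mentions.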


Preferred Qualifications


  • Experience with infrastructure as code tools (Terraform, CloudFormation)
  • Knowledge of streaming technologies (Kafka, Kinesis, Pub/Sub)
  • Familiarity with containerisation (Docker, Kubernetes)
  • Experience with data quality frameworks and monitoring tools
  • Understanding of CI/CD practices for data pipelines
  • Knowledge of data catalog and governance tools
  • Advanced dbt features including macros, packages, and documentation
  • Experience with table format technologies (Apache Iceberg, Apache Hudi)
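Data quality frameworks like those mentioned above generally run declarative checks (non-null, ranges, uniqueness) against each batch and report what failed. A minimal pure-Python sketch of that idea (the check names and rules are illustrative assumptions):

```python
def run_checks(rows, checks):
    """Apply each named predicate to every row; return check name -> failing row indexes."""
    failures = {name: [] for name in checks}
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures[name].append(i)
    # keep only checks that actually failed, so an empty dict means a clean batch
    return {name: idxs for name, idxs in failures.items() if idxs}

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": -4.0},
]
checks = {
    "amount_not_null": lambda r: r["amount"] is not None,
    "amount_non_negative": lambda r: r["amount"] is not None and r["amount"] >= 0,
}
print(run_checks(batch, checks))  # {'amount_not_null': [1], 'amount_non_negative': [1, 2]}
```

Tools such as dbt tests or Great Expectations express the same pattern declaratively and wire it into the orchestrator, so a failing check can block downstream models.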

Technical Environment


  • Data Warehouse: Snowflake
  • Processing: Apache Spark, Python, SQL
  • Orchestration: Airflow/Dagster
  • Transformation: dbt
  • Cloud: AWS/Azure/GCP
  • Version Control: Git
  • Monitoring: DataDog, Grafana, or similar

Company: Auriga It (Information Technology), San Francisco


Location: Turbhe Khurd, Navi Mumbai, Maharashtra