Data Engineer - 3

5 - 9 years

5 - 9 Lacs

Posted: 6 days ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

  • This role combines core data engineering excellence with optional exposure to financial correctness & reconciliation pipelines
  • You will design, build, and optimize data systems that handle high-volume, high-variety datasets and power analytics, reporting, financial systems, and operational products
  • The work will be across ingestion, transformation, data modeling, data marts, and workflow orchestration, ensuring reliability, scalability, and data correctness across the entire platform
  • Design and build scalable, high-performance data platforms leveraging Databricks, Lakehouse architectures, Spark ETL, and Kafka-based streaming pipelines

What You'll Do

  • Build & Optimize Large-Scale Data Pipelines: Own end-to-end ETL/ELT pipelines using Databricks (PySpark, Spark SQL). Build streaming and batch workflows processing billions of events per day. Implement efficient ingestion frameworks using Kafka/MSK, Auto Loader, and CDC. Optimize jobs for performance, cost efficiency, cluster utilisation, and scalability. Ensure pipelines meet defined SLAs for latency, throughput, and freshness.
  • Own Lakehouse Design & Data Modeling: Design bronze/silver/gold layers using Delta Lake best practices. Build curated data marts for analytics, BI, finance, risk, and product teams. Implement dimensional models, fact/event stores, and conformed dimensions. Drive schema governance, catalog organization, versioning, and lineage.
  • Build High-Quality Data Integrations: Integrate with internal microservices, third-party APIs, streaming sources, and databases. Develop connectors/pipelines for structured, semi-structured, and unstructured data. Implement robust idempotency, incremental ingestion, and change data capture. Build scalable workflows for multi-system integration (custody, payments, trading, CRM, etc.).
  • Workflow Orchestration & Reliability: Build highly reliable workflows using Airflow, Databricks Workflows, Lambda, and Step Functions. Implement retry logic, DLQs, backfill strategies, and pipeline auto-recovery. Maintain pipeline uptime above the agreed SLA for critical systems.
  • Data Quality, Observability & Governance: Implement DQ checks for schema, completeness, freshness, and referential integrity. Build monitoring and alerting for ingestion lag, pipeline failures, and cost anomalies. Own documentation, runbooks, and dashboards (Datadog/CloudWatch/Lakehouse monitoring). Ensure compliance with internal governance (S3 structure, catalog rules, PII handling).
  • Cross-Functional Collaboration: Work closely with Analytics, Product, Finance, Ops, Risk, and Engineering teams. Translate requirements into scalable data pipelines and optimized datasets. Participate in design reviews, code reviews, and architecture discussions. Mentor SDE-1/SDE-2 engineers and uplift engineering excellence.
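Several of the responsibilities above (incremental ingestion, CDC, idempotency) rest on two patterns: watermark-based extraction and keyed upserts. A minimal, framework-free Python sketch of both follows; the helper names are illustrative, and in a real Databricks pipeline the same semantics would come from Auto Loader checkpoints and Delta `MERGE INTO`, not in-memory lists.

```python
def incremental_extract(source_rows, last_watermark):
    """Watermark-based incremental load: pull only rows newer than the last
    committed watermark, so reruns never re-read already-ingested data.
    `source_rows` stands in for a JDBC/Kafka/Auto Loader read."""
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

def idempotent_upsert(target, batch, key="id"):
    """MERGE-style upsert keyed on a business key: replaying the same batch
    (e.g. after a retry) leaves the target state unchanged."""
    for row in batch:
        target[row[key]] = row  # insert or overwrite by key
    return target
```

Running `incremental_extract` a second time with the returned watermark yields an empty batch, and replaying a batch through `idempotent_upsert` is a no-op, which is exactly what makes retries and backfills safe.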
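The bronze/silver/gold layering mentioned above can be illustrated without Spark: bronze holds raw events, silver deduplicates and cleans them, gold aggregates them into a mart. A toy Python sketch (field names like `event_id` and `amount` are assumptions, not from any specific schema):

```python
def to_silver(bronze_rows):
    """Silver layer: drop rows without an event_id and deduplicate,
    keeping the latest version of each event by timestamp."""
    latest = {}
    for r in bronze_rows:
        if r.get("event_id") is not None:
            prev = latest.get(r["event_id"])
            if prev is None or r["ts"] > prev["ts"]:
                latest[r["event_id"]] = r
    return list(latest.values())

def to_gold(silver_rows):
    """Gold layer: aggregate cleaned events into a daily amount per user,
    a miniature stand-in for a curated data mart / fact table."""
    facts = {}
    for r in silver_rows:
        key = (r["user_id"], r["ts"][:10])  # (user, ISO date)
        facts[key] = facts.get(key, 0) + r["amount"]
    return facts
```

In Delta Lake each layer would be its own table, with the dedup step typically a window function over the event key and the gold step a scheduled aggregation job.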
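The orchestration bullet asks for retry logic and DLQs; the core mechanism is the same whether Airflow or Step Functions provides it. A plain-Python sketch of exponential backoff with a dead-letter queue (the function and parameter names are illustrative):

```python
import time

def run_with_retry(task, record, max_attempts=3, base_delay=0.01, dlq=None):
    """Run `task(record)` with exponential backoff; after the final failed
    attempt, route the record and error to a dead-letter queue."""
    for attempt in range(max_attempts):
        try:
            return task(record)
        except Exception as exc:
            if attempt == max_attempts - 1:
                if dlq is not None:
                    dlq.append((record, str(exc)))
                return None
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ...
```

Airflow exposes the same idea declaratively (`retries`, `retry_exponential_backoff` on an operator); the DLQ makes failures replayable during backfills instead of silently lost.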
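The three DQ dimensions named above (completeness, schema, freshness) reduce to simple batch assertions. A minimal sketch in plain Python, assuming rows carry a numeric `ts` timestamp; real stacks would express these as expectations in a DQ framework rather than hand-rolled checks:

```python
def dq_check(rows, required_cols, max_age_seconds, now):
    """Return a list of data-quality issues for a batch:
    completeness (non-null required columns), and freshness
    (newest row within max_age_seconds of `now`)."""
    issues = []
    for i, r in enumerate(rows):
        missing = [c for c in required_cols if r.get(c) is None]
        if missing:
            issues.append(f"row {i}: missing {missing}")
    if rows:
        newest = max(r["ts"] for r in rows)
        if now - newest > max_age_seconds:
            issues.append("freshness: newest row too old")
    else:
        issues.append("completeness: empty batch")
    return issues
```

Wiring the returned issue list into alerting (Datadog/CloudWatch) is what turns a check into observability: a non-empty list blocks promotion of the batch and pages the owner.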

You'll Excel in This Role If You

  • Have strong hands-on experience in Data Engineering Core: Databricks (PySpark, Spark SQL, Delta Lake) - mandatory; Kafka/MSK streaming pipelines; expert-level PySpark and Python; Spark SQL optimisation & query tuning; ETL engineering for large-scale workflows
  • Designing data marts, fact tables, dimensions, and business layers; strong experience in data warehouse concepts
  • Cloud & Infrastructure: AWS (S3, Glue, Lambda, EMR/Databricks, DMS, IAM, CloudWatch)
  • Cluster management, autoscaling, and job tuning; CI/CD for data pipelines
  • Data Integration: REST APIs, event ingestion, CDC, RDBMS ingestion (Postgres/MySQL)
  • Handling schema evolution & incremental loads
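Handling schema evolution usually means additive changes: new columns widen the schema without breaking existing readers. A minimal Python sketch of that merge rule (the schema representation here, column name to type name, is an assumption; Delta Lake does this natively via `mergeSchema`):

```python
def evolve_schema(current, incoming_row):
    """Additive schema evolution: unknown fields in an incoming row are
    added to the schema with their inferred type; existing columns and
    their declared types are never changed or dropped."""
    evolved = dict(current)  # never mutate the registered schema in place
    for col, val in incoming_row.items():
        evolved.setdefault(col, type(val).__name__)
    return evolved
```

The key property is that old queries keep working: existing columns are untouched, and readers that don't know the new column simply ignore it.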

You'll Know You're Winning

  • 4-6 key pipelines redesigned or optimized for performance/cost; high-reliability ingestion for multiple systems in production
  • Data marts operational with measurable improvement in query performance
  • Zero severe data-quality issues attributed to owned systems
  • Documented and automated workflows replacing manual processes

CoinDCX

Cryptocurrency

Mumbai
