Senior Data Pipeline Engineer

6 - 10 years

0 - 1 Lacs

Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Experience:

Work Location:

Shift:

Job Summary:

We are looking for a highly skilled AWS Data Engineer with strong experience in data architecture, DevOps practices, and code-quality tools such as SonarQube. You will be responsible for designing scalable data storage and access patterns using AWS services such as DynamoDB, Redis, Glue, Lambda, and Kafka. This role focuses on ensuring efficient, secure, and maintainable data solutions that support both short-term and long-term consumption across our platforms.

Key Responsibilities:

  • Design and implement robust data architectures that support scalability, high availability, and performance within the AWS ecosystem.
  • Develop optimized data access patterns for DynamoDB, Redis, and PostgreSQL based on usage and business needs.
  • Architect short-term and long-term data storage strategies aligned with consumption patterns.
  • Integrate and utilize AWS services, including API Gateway, Lambda, Glue, Kafka, and CloudWatch, to enable real-time and batch data processing.
  • Collaborate with engineering and pipeline teams to ensure best practices in data modeling, partitioning, and storage optimization.
  • Monitor and correct improper storage practices, particularly in DynamoDB and streaming systems.
  • Build reusable Lambda functions and manage their deployment using DevOps pipelines.
  • Set up and maintain CI/CD workflows using GitHub Actions and integrate with quality gates via SonarQube.
  • Apply DevOps principles to infrastructure provisioning, automated testing, monitoring, and continuous delivery.
  • Ensure code quality and maintainability by incorporating SonarQube analysis into the development lifecycle.
  • Use Kafka for building real-time data ingestion pipelines and managing event-driven architecture.
  • Maintain and integrate PostgreSQL databases with AWS-native and hybrid environments.
  • Analyze stored data and expose insights securely via API Gateway endpoints.

Required Skills and Experience:

  • 3+ years of hands-on experience as a Data Engineer or Data Architect in an AWS environment.
  • Deep knowledge of AWS services: DynamoDB, Redis, Lambda, Glue, API Gateway, and Kafka.
  • Strong experience with data modeling, access pattern design, and schema optimization.
  • Experience with CI/CD pipelines, DevOps tools, and infrastructure automation.
  • Hands-on experience with GitHub, GitHub Actions, and integrating with SonarQube for code quality assurance.
  • Solid understanding of PostgreSQL and its integration into scalable architectures.
  • Strong problem-solving skills and the ability to work independently or in a cross-functional team.
  • Clear understanding of security, data governance, and monitoring best practices in cloud environments.
