Senior Data Engineer, ATG, Data Technology - Delivery

4 - 6 years

0 Lacs

Posted: 2 weeks ago | Platform: Indeed


Work Mode

On-site

Job Description

    IT
    Pune Corporate Office - Mantri
    Posted On
    27 Jul 2025
    End Date
    27 Jul 2026
    Required Experience
    4 - 6 Years

BASIC SECTION

Job Level

GB04

Job Title

Senior Data Engineer, ATG, Data Technology - Delivery

Job Location

Country

India

State

MAHARASHTRA

Region

West

City

Pune

Location Name

Pune Corporate Office - Mantri

Tier

Tier 1

Skills

SKILL

SKILLS AS PER JD

Minimum Qualification

OTHERS

JOB DESCRIPTION

Job Purpose

The Senior Data Engineer will be responsible for designing, building, and maintaining scalable and efficient data pipelines and architectures for the Enterprise Data Platform. This role will focus on enabling high-quality, reliable, and timely data access for analytics, reporting, and business decision-making. Working closely with business analysts, data scientists, and architects, the Senior Data Engineer will ensure data solutions meet business needs and adhere to best practices and governance standards.

Duties and Responsibilities

  • Design and implement robust, scalable, and high-performance data pipelines and ETL/ELT processes.

  • Develop, optimize, and maintain data architectures including databases, data lakes, and data warehouses.

  • Ensure the quality, integrity, and security of data through robust data validation and data quality frameworks.

  • Collaborate with business analysts and stakeholders to understand business data requirements and translate them into technical designs.

  • Work closely with data architects to align with enterprise architecture standards and strategies.

  • Implement data integration solutions with various internal and external data sources.

  • Monitor, troubleshoot, and optimize system performance and data workflows.

  • Support the migration of on-premises data solutions to cloud-based environments (e.g., AWS, Azure, GCP).

  • Stay up to date with the latest industry trends and technologies in data engineering and recommend innovative solutions.

  • Create and maintain comprehensive documentation for all developed data pipelines and systems.

  • Mentor junior data engineers and contribute to the development of best practices.

Key Decisions / Dimensions

  • Selecting appropriate technologies, tools, and frameworks for data pipeline development.

  • Designing data models and database schemas that optimize for both performance and scalability.

  • Establishing standards for code quality, data validation, and monitoring processes.

  • Identifying performance bottlenecks and recommending architectural improvements.

Major Challenges

  • Managing and processing large volumes of structured and unstructured data with efficiency.

  • Designing systems that can handle scaling needs as business requirements and data volumes grow.

  • Balancing the need for quick delivery with the necessity for scalable and maintainable code.

  • Ensuring data quality and compliance with data governance and security policies.

  • Integrating disparate data sources with differing formats and standards into unified models.

Required Qualifications and Experience

    a) Qualifications
  • Bachelor's Degree in Computer Engineering, Computer Science, Information Technology, or a related field.

  • Professional certifications such as Google Professional Data Engineer, AWS Certified Data Analytics - Specialty, or Microsoft Certified: Azure Data Engineer Associate are a plus.

    b) Work Experience
  • Minimum of 4 years of experience in data engineering or a related role.

  • Strong expertise in building and optimizing ETL/ELT pipelines and data workflows.

  • Proficient in programming languages such as Python, Java, or Scala.

  • Hands-on experience with SQL and relational database systems (e.g., PostgreSQL, SQL Server, MySQL).

  • Experience with big data technologies (e.g., Hadoop, Spark, Kafka).

  • Familiarity with cloud platforms (AWS, Azure, GCP) and cloud-native data services (e.g., Redshift, BigQuery, Snowflake, Databricks).

  • Solid understanding of data modeling, data warehousing concepts, and best practices.

  • Knowledge of CI/CD pipelines and infrastructure-as-code (IaC) is a plus.

  • Strong problem-solving skills and the ability to work independently or in a team.

    c) Skills Keywords
  • Data Architecture

  • Delivery Management

  • Project Management

  • Cloud Data Platforms (e.g., Azure, AWS, GCP)

  • Data Modeling

  • Data Governance

  • Stakeholder Management

  • Quality Assurance

  • Agile Methodology

  • Team Leadership

  • Budget Management

  • Risk Management

  • Data Integration

  • Scalable Data Solutions
