Data Engineer

5 - 7 years

5 - 8 Lacs

Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description


Job Summary:
Cummins is seeking a skilled Data Engineer to support the development, maintenance, and optimization of our enterprise data and analytics platform. This role involves hands-on experience in software development, ETL processes, and data warehousing, with strong exposure to tools like Snowflake, OBIEE, and Power BI. The engineer will collaborate with cross-functional teams, transforming data into actionable insights that enable business agility and scale.
Please Note: While the role is categorized as remote, it will follow a hybrid work model based out of our Pune office.
Key Responsibilities:
  • Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools.
  • Build and automate data integration workflows that extract, transform, and load data from various sources including Oracle EBS and other enterprise systems.
  • Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods.
  • Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders.
  • Work with IT and Business teams to gather reporting requirements and translate them into scalable technical solutions.
  • Participate in data modeling and storage architecture using star and snowflake schema designs.
  • Contribute to the implementation of data governance, metadata management, and access control mechanisms.
  • Maintain documentation for solutions and participate in testing and validation activities.
  • Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture.
  • Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes.

Why Join Cummins?
  • Opportunity to work with a global leader in power solutions and digital transformation.
  • Be part of a collaborative and inclusive team culture.
  • Access to cutting-edge data platforms and tools.
  • Exposure to enterprise-scale data challenges and finance domain expertise.
  • Drive impact through data innovation and process improvement.

Competencies
  • Data Extraction & Transformation - Ability to perform ETL activities from varied sources with high data accuracy.
  • Programming - Capable of writing and testing efficient code using industry standards and version control systems.
  • Data Quality Management - Detect and correct data issues for better decision-making.
  • Solution Documentation - Clearly document processes, models, and code for reuse and collaboration.
  • Solution Validation - Test and validate changes or solutions based on customer requirements.
  • Problem Solving - Address technical challenges systematically to ensure effective resolution and prevention.
  • Customer Focus - Understand business requirements and deliver user-centric data solutions.
  • Communication & Collaboration - Work effectively across teams to meet shared goals.
  • Values Differences - Promote inclusion by valuing diverse perspectives and backgrounds.
Education, Licenses, Certifications
  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related technical discipline.
  • Certifications in data engineering or relevant tools (Snowflake, Power BI, etc.) are a plus.

Experience

Must have skills:
  • 5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment.
  • Proficient in ETL tools, SQL, and data warehouse development.
  • Proficient in Snowflake, Power BI, and OBIEE reporting platforms, with hands-on implementation experience using these tools and technologies.
  • Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases.
  • Working knowledge of Oracle databases and Oracle EBS structures.

Preferred Skills:
  • Experience with Qlik Replicate, data replication, or data migration tools.
  • Familiarity with data governance, data quality frameworks, and metadata management.
  • Exposure to cloud-based architectures, Big Data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB).
  • Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.
    Cummins

    Engineering, Manufacturing

    Columbus
