Data Engineer

Experience: 5 years

Salary: 8 - 9 Lacs

Posted: 4 hours ago | Platform: GlassDoor

Work Mode: On-site

Job Type: Part Time

Job Description

GE Vernova is accelerating the path to more reliable, affordable, and sustainable energy, while helping our customers power economies and deliver the electricity that is vital to health, safety, security, and improved quality of life. Are you excited at the opportunity to electrify and decarbonize the world?
We are seeking a passionate, creative, and results-driven Data Engineer with substantial experience in designing, building, and maintaining the infrastructure and data pipelines that enable effective data collection, processing, quality control, storage, and analysis for innovative AI/ML applications in the energy domain. This role requires at least 5 years of experience in the energy sector or related domains such as smart infrastructure or industrial automation. As part of the Grid Automation (GA) organization, you will work closely with the CTO AI/ML Team, product lines, R&D teams, product management, and other GA functions. You will also support cross-functional initiatives by developing scalable and unified data frameworks to address critical customer problems and enable rapid prototyping and deployment, both at the edge and in the cloud.

Essential Responsibilities:

  • Design and maintain database architectures, schemas, and data models tailored to grid innovation and energy system applications.
  • Utilize efficient data storage technologies (e.g., relational databases, data lakes, NoSQL) to ensure scalable and secure data access.
  • Build, optimize, and maintain reliable data pipelines for data ingestion, cleaning, transformation, and feature extraction from structured and unstructured sources.
  • Develop and manage integrations with internal and external data sources and APIs to enable seamless data flow.
  • Identify new and relevant datasets to improve product capabilities and decision-making across the business.
  • Automate data integration and transformation workflows for diverse data formats and operational needs.
  • Monitor performance and scalability of data systems and implement enhancements to increase efficiency and reliability.
  • Apply data governance policies and implement data quality checks to ensure data integrity across systems.
  • Collaborate with Application Architects, Data Scientists, ML Engineers, and other technical stakeholders to deliver relevant, ready-to-use datasets.
  • Work closely with product management and R&D teams to gather requirements and develop innovative data solutions that support product development.

Must-Have Requirements:

  • PhD, Master’s, or Bachelor’s degree in Data Science, Computer Science, Electrical Engineering, or a related field with a focus on data engineering.
  • Proven experience in the energy, smart infrastructure, or industrial automation sectors, with hands-on project experience building and managing data pipelines, typically gained over a minimum of 5 years.
  • Proficiency in Python, SQL, and at least one other programming language commonly used in data engineering (e.g., Scala, Java).
  • Experience with relational databases (e.g., PostgreSQL), NoSQL databases (e.g., MongoDB, Cassandra), and data warehousing technologies such as Snowflake or Redshift.
  • Familiarity with cloud platforms like AWS, Azure, or GCP for deploying and managing data systems.
  • Extensive experience with ETL (Extract, Transform, Load) processes and automating data pipeline workflows.
  • Experience with data visualization tools such as Tableau, Power BI, or similar platforms for building reports and dashboards.
  • Familiarity with big data tools and technologies, such as Hadoop, Kafka, and Spark.
  • Ability to collaborate effectively in a team environment, contributing ideas and taking initiative to solve problems.
  • Adaptability to work in a dynamic, multi-tasking environment, with the ability to address evolving challenges.
  • Effective communication skills, with the ability to collaborate smoothly with cross-functional teams and resolve conflicts proactively.

Nice-to-Have Requirements:

  • Familiarity with data governance frameworks and validation standards in the energy sector.
  • Knowledge of distributed computing environments and model deployment at scale.

Additional Information

Relocation Assistance Provided: Yes
