JSW One Platforms - Data Engineer - ETL/BigQuery

2 - 5 years


Posted: 2 weeks ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

Role Overview

We are looking for a mid-level Data Engineer to help build, maintain, and evolve our data infrastructure, ensuring that business analysts and stakeholders across different lines of business have reliable, unified access to data and enabling self-service reporting and dashboards. You will work closely with analytics, operations, and business stakeholders to transform raw data from multiple sources into actionable, high-quality datasets.

Responsibilities

  • Build, maintain, and optimize data pipelines (batch and streaming, as needed) to extract, transform, and load (ETL/ELT) data from multiple, disparate source systems into a centralized data warehouse.
  • Implement and manage data warehousing solutions using cloud-native technologies (e.g. on Google Cloud Platform: a data warehouse like BigQuery, plus data ingestion/orchestration tools); a minimal pipeline sketch follows this list.
  • Integrate data from diverse operational platforms (for different lines of business/departments) to create a unified, analytics-ready dataset.
  • Perform data modeling and schema design to support reporting and analytics needs (e.g. star/snowflake schemas, dimensional models) and optimize storage and query performance.
  • Ensure data quality, consistency, and reliability: implement validation/cleaning/transformation logic, monitor pipelines, handle issues, and maintain data governance and compliance processes.
  • Work with business analysts, stakeholders and reporting/BI teams to understand business reporting requirements, translate them into data solutions, and deliver datasets for dashboards.
  • Collaborate with BI/analytics/reporting teams and tools to support the creation of dashboards and reports for various departments (e.g. sales performance, operations, executive dashboards, order management, customer service).
  • Document data pipelines, data models, data flows, business logic, and data definitions to support maintainability and team knowledge sharing.
  • Help optimize data processing workflows for performance and cost-efficiency, including efficient querying and storage.
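
The following is a minimal sketch, in Python, of the kind of batch ETL step described in the first two bullets above: read a source extract, apply light cleaning and a basic quality check, and load the result into BigQuery. The project, dataset, table, and file names are hypothetical placeholders, and the validation rule is only illustrative; a real pipeline would follow the team's own orchestration and governance setup.

    # Minimal batch ETL sketch: read a source extract, clean it, and load it
    # into a BigQuery table. All names below are hypothetical placeholders.
    import pandas as pd
    from google.cloud import bigquery

    PROJECT_ID = "example-project"                 # placeholder GCP project
    TABLE_ID = f"{PROJECT_ID}.analytics.orders"    # placeholder dataset.table

    def run_pipeline(source_csv: str) -> None:
        # Extract: read the raw export from the source system.
        df = pd.read_csv(source_csv)

        # Transform: normalize column names and drop rows missing key fields.
        df.columns = [c.strip().lower() for c in df.columns]
        df = df.dropna(subset=["order_id", "order_date"])
        df["order_date"] = pd.to_datetime(df["order_date"])

        # Simple data-quality gate: refuse to load an empty batch.
        if df.empty:
            raise ValueError("No valid rows to load; aborting.")

        # Load: overwrite the target BigQuery table with the cleaned batch.
        client = bigquery.Client(project=PROJECT_ID)
        job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
        client.load_table_from_dataframe(df, TABLE_ID, job_config=job_config).result()

    if __name__ == "__main__":
        run_pipeline("orders_export.csv")

In practice, a step like this would normally be scheduled and monitored by an orchestration tool rather than run by hand, and incremental (append/merge) loads would usually replace the full overwrite shown here.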

Required Qualifications & Skills

  • Bachelor's degree in Computer Science, Engineering, or related technical discipline (or comparable experience).
  • 2 - 5 years of practical experience in data engineering, ETL/ELT pipelines, or related backend/data roles.
  • Strong proficiency in SQL and a programming/scripting language such as Python (or similar).
  • Experience with data warehousing platforms and cloud-based data infrastructure (preferably cloud-native warehouses or lakehouses).
  • Solid understanding of data modeling, schema design (dimensional models), data normalization/denormalization, and data transformation concepts; see the query sketch after this list.
  • Experience developing and maintaining ETL/ELT pipelines, working with both batch and (optionally) streaming datasets.
  • Strong problem-solving skills, attention to detail, and commitment to data quality, consistency and reliability.
  • Good communication and collaboration skills - able to liaise with non-technical stakeholders (business analysts, operations, leadership), understand requirements, and translate them into technical solutions.
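
As a companion to the SQL and dimensional-modeling points above, here is a hedged sketch of a star-schema query run through the BigQuery client. The fact and dimension tables, columns, and project name are hypothetical and only illustrate the shape of such a query.

    # Sketch of a star-schema query: join a fact table to its dimension tables
    # to produce an analytics-ready summary. All names are hypothetical.
    from google.cloud import bigquery

    STAR_SCHEMA_QUERY = """
    SELECT
      d.calendar_month,
      r.region_name,
      SUM(f.net_amount) AS total_sales,
      COUNT(DISTINCT f.order_id) AS order_count
    FROM `example-project.analytics.fact_orders` AS f
    JOIN `example-project.analytics.dim_date`   AS d ON f.date_key = d.date_key
    JOIN `example-project.analytics.dim_region` AS r ON f.region_key = r.region_key
    GROUP BY d.calendar_month, r.region_name
    ORDER BY d.calendar_month, total_sales DESC
    """

    def monthly_sales_by_region() -> None:
        client = bigquery.Client(project="example-project")  # placeholder project
        for row in client.query(STAR_SCHEMA_QUERY).result():
            print(row.calendar_month, row.region_name, row.total_sales, row.order_count)

    if __name__ == "__main__":
        monthly_sales_by_region()
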
(ref:hirist.tech)
