GAIN - Central IT - Data Warehouse Developer


Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Data Engineer / Data Warehouse Developer

Location: Mumbai

Reports to: Head of Engineering

Direct Reports: N/A

Primary Purpose

To design, build, and maintain a scalable internal data repository that consolidates data from key business systems (operational platforms, finance, sales & marketing, and timesheets) into a secure, well-modelled cloud data warehouse to power analytics and self‑service reporting in Power BI.

Main Responsibilities:

Technology

  • Design, develop, and maintain robust data pipelines to ingest data from multiple internal systems via APIs, integration platforms, and batch processes into a central Snowflake data warehouse.
  • Model and implement scalable data warehouse structures (staging, core, marts) to support analytics and reporting use cases across finance, sales & marketing, operations, and resourcing/timesheets.
  • Collaborate with business stakeholders (Finance, Sales, Marketing, Operations, HR/Resourcing) to understand data requirements, map source systems, and define a consistent, joined‑up data model.
  • Implement and maintain ELT/ETL processes using appropriate tools (e.g. integration platforms, orchestration tools, SQL, scripting), ensuring performance, reliability, and maintainability.
  • Work closely with BI developers and analysts to expose clean, well‑documented datasets and semantic models optimised for Power BI.
  • Ensure data quality and integrity through validation rules, reconciliation checks, and monitoring across the full data pipeline from source to reporting.
  • Optimise compute, storage, and queries for cost‑efficiency and performance, including clustering, caching, and workload management.
  • Maintain clear technical documentation for data models, data flows, lineage, and transformations to support collaboration and ongoing maintenance.
  • Stay current with modern data engineering practices, tooling, and cloud data platform capabilities, proposing improvements to the data architecture and pipelines.
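As one illustration of the validation and reconciliation work described above, a pipeline step might compare a source extract against what actually landed in a staging table. This is a minimal sketch only; the function name, key column, and sample records are hypothetical, not taken from the role:

```python
# Minimal sketch of a reconciliation check between a source extract and a
# warehouse staging load, both represented here as lists of dicts.
# All names and sample records are illustrative assumptions.

def reconcile(source_rows, staging_rows, key="id"):
    """Return a report of row-count and key-level differences."""
    source_keys = {r[key] for r in source_rows}
    staging_keys = {r[key] for r in staging_rows}
    return {
        "source_count": len(source_rows),
        "staging_count": len(staging_rows),
        "missing_in_staging": sorted(source_keys - staging_keys),
        "unexpected_in_staging": sorted(staging_keys - source_keys),
    }

source = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.0}]
staging = [{"id": 1, "amount": 100.0}]

report = reconcile(source, staging)
# A pipeline run would fail (or raise an alert) when the key sets diverge.
```

In practice a check like this would run after each load and feed a monitoring dashboard or alerting channel rather than a return value.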

Process, Quality and Information Security

  • Manage your own workload, delivering committed work within the iteration and to agreed definitions of done.
  • Ensure all data pipelines and transformations are appropriately tested (unit, integration, regression) and integrated into the release process, supporting higher levels of deployment automation.
  • Contribute to and adopt CI/CD practices for data and analytics assets (e.g. version control for SQL/scripts, automated deployment of data models and pipelines).
  • Adhere to Information Security policies and implement "security by design" across data pipelines and platforms, including access control, encryption, and secure handling of PII/financial data.
  • Collaborate with IT Operations, Information Security, and Software Engineering teams to align on infrastructure requirements, environments, and operational support models.
  • Support change and release processes (including CAB where required), ensuring changes to data pipelines and warehouse structures are well‑planned and low risk.
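The unit-testing expectation above might look like the following minimal sketch for a single transformation function; the transformation and its test values are illustrative assumptions, not a prescribed implementation:

```python
# Sketch of unit-testing a transformation before it enters the release
# process. The function and test data are hypothetical examples; in CI
# a test like this would typically run under a runner such as pytest.

def to_pennies(amount_str):
    """Parse a currency string like '1,234.50' into integer pennies."""
    return round(float(amount_str.replace(",", "")) * 100)

def test_to_pennies():
    assert to_pennies("1,234.50") == 123450
    assert to_pennies("0.99") == 99

test_to_pennies()  # invoked directly here for the sake of the sketch
```

Version-controlling such tests alongside the SQL/scripts they cover is what lets the release process gate deployments on them automatically.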

Professional Skills / Experience:

  • Strong experience as a Data Engineer / Data Warehouse Developer or similar role, ideally in a cloud‑native environment.
  • Expert SQL skills with experience building and optimising complex queries and transformations.
  • Hands‑on experience designing and implementing data warehouses or data lakehouses (star/snowflake schemas, dimensional modelling, slowly changing dimensions, etc.).
  • Practical experience with cloud data warehouses such as Snowflake, including databases, roles, virtual warehouses, performance tuning, and cost optimisation.
  • Experience ingesting data from APIs and/or integration platforms (e.g. iPaaS tools, ETL/ELT orchestrators) and working with a variety of data formats (JSON, CSV, Parquet, etc.).
  • Familiarity with BI / visualisation tools, preferably Power BI, and how to structure data for efficient reporting and self‑service analytics.
  • Understanding of data quality management, metadata, data lineage, and data governance practices.
  • Experience with cloud platforms (e.g. Azure, AWS, or GCP) and modern DevOps practices (version control, CI/CD, environment management).
  • Scripting or programming experience (e.g. Python) for data processing and automation is highly desirable.
  • Knowledge of information security standards and regulatory requirements relating to data (e.g. ISO 27001, GDPR) is a plus.
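The slowly changing dimensions mentioned above are worth a concrete sketch. A Type 2 update closes out the old version of a changed row and appends a new current one; in a warehouse this is normally a SQL MERGE, but the logic can be shown in plain Python. All table and column names here are hypothetical:

```python
from datetime import date

# Illustrative Type 2 slowly-changing-dimension update in plain Python.
# In practice this would be a MERGE in the warehouse; names are examples.

def scd2_apply(dimension, incoming, key, tracked, today):
    """Close out changed current rows and append new current versions."""
    current = {r[key]: r for r in dimension if r["is_current"]}
    for new in incoming:
        old = current.get(new[key])
        # Insert when the key is new, or when any tracked column changed.
        if old is None or any(old[c] != new[c] for c in tracked):
            if old is not None:
                old["is_current"] = False
                old["valid_to"] = today
            dimension.append({**new, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

dim = [{"cust_id": 1, "city": "Mumbai", "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
incoming = [{"cust_id": 1, "city": "Pune"}]

dim = scd2_apply(dim, incoming, key="cust_id", tracked=["city"],
                 today=date(2025, 1, 1))
# The Mumbai row is closed out and a new current Pune row is appended.
```

The `valid_from`/`valid_to`/`is_current` columns are what let reporting queries reconstruct the dimension as of any date.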

Personal Qualities

  • Strong problem solver who enjoys working with complex, messy data and turning it into reliable, usable assets.
  • Able to build trust and rapport across Finance, Sales, Marketing, Operations, IT, and Engineering teams.
  • Clear and confident communicator, capable of explaining technical concepts to non‑technical stakeholders.
  • Team player with a collaborative, flexible approach and a "can do" mindset.
  • Able to prioritise effectively, manage multiple data initiatives, and resolve issues quickly.
  • High attention to detail with a focus on data accuracy, consistency, and reliability.
