Data Engineering Analyst

8 - 13 years

45 - 80 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Posted: 1 day ago | Platform: Naukri

Work Mode

Hybrid

Job Type

Full Time

Job Description

Purpose of the role

As the Analytics Manager, you will play a vital role in the MDS Data & Analytics team, supporting initiatives within our company's Procurement function. You will collaborate closely with cross-functional teams, including Procurement, Marketing, Supply Chain, and Finance, to drive the data strategy and empower Procurement teams with actionable insights that accelerate product development cycles, enhance decision-making, and ultimately contribute to Mondelez's mission of delighting consumers around the world. This position offers an exciting opportunity to work in a dynamic environment, partnering closely with both technical and business teams while managing the projects and analytics roadmap in your area of responsibility.

Job Specific Requirements

The Manager, Analytics, is a member of the Mondelez Digital Services (MDS) Data & Analytics team, working in close partnership with the Procurement team.

- 8+ years of overall industry experience, including a minimum of 6 years building and deploying large-scale data processing pipelines in a production environment

- Hands-on experience coding in GCP BigQuery, Databricks, and SQL

- Focus on excellence: has practical experience with data-driven approaches, is familiar with the application of data security strategy, and is familiar with well-known data engineering tools and platforms

- Technical depth and breadth: able to build and operate data pipelines and data storage; has worked on big data architecture within distributed systems; is familiar with infrastructure definition and automation in this context; is aware of technologies adjacent to those they have worked on and can speak to the alternative technology choices made on their projects.

- Implementation and automation of Internal data extraction from SAP BW / HANA

- Implementation and automation of External data extraction from openly available internet data sources via APIs

- Data cleaning, curation, and enrichment using Alteryx, SQL, Python, R, PySpark, and SparkR

- Preparing consolidated data marts for use by Data Scientists, and managing SQL databases

- Exposing data via Alteryx and SQL databases for consumption in Tableau

- Maintaining and updating data documentation

- Collaboration and workflow using a version control system (e.g., GitHub)

- Learning ability: Is self-reflective, has a hunger to improve, and has a keen interest in driving their own learning. Applies theoretical knowledge to practice

- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.

- Data engineering concepts: experience working with data lakes, data warehouses, and data marts; has implemented ETL/ELT and SCD concepts.

- ETL or Data integration tool: Experience in Talend is highly desirable.

- Analytics: fluent in SQL and PL/SQL; has used analytics tools such as BigQuery for data analytics

- Cloud experience: experienced with GCP services such as Cloud Functions, Cloud Run, Dataflow, Databricks, and BigQuery.

- Data sources: experience working with structured data sources such as SAP, BW, flat files, RDBMS, etc., and semi-structured data sources such as PDF, JSON, XML, etc.

- Flexible Working Hours: This role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week.

- Data Processing: Experience working with any of the Data Processing Platforms like Dataflow and Databricks.

- Orchestration: experience orchestrating/scheduling data pipelines using tools such as Airflow or Alteryx

- Create data tools that help analytics and data science team members build and optimize our product into an innovative industry leader.

- Work with data and analytics experts to strive for greater functionality in our data systems.
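The ETL/ELT and SCD concepts listed among these requirements can be illustrated with a small sketch. Below is a simplified Slowly Changing Dimension Type 2 update in plain Python; the field names (`supplier_id`, `category`) are invented for illustration, and a production pipeline would more likely express this as a MERGE in BigQuery SQL or Databricks rather than in application code:

```python
from datetime import date

def apply_scd2(dimension, updates, effective):
    """Slowly Changing Dimension Type 2: when a tracked attribute changes,
    expire the current row and append a new row with fresh validity dates."""
    result = [dict(row) for row in dimension]  # work on copies
    for upd in updates:
        current = next(
            (r for r in result
             if r["supplier_id"] == upd["supplier_id"] and r["is_current"]),
            None,
        )
        if current is None:
            # Unseen key: insert as the first (current) version.
            result.append({**upd, "valid_from": effective,
                           "valid_to": None, "is_current": True})
        elif current["category"] != upd["category"]:
            # Tracked attribute changed: close the old row, open a new one.
            current["valid_to"] = effective
            current["is_current"] = False
            result.append({**upd, "valid_from": effective,
                           "valid_to": None, "is_current": True})
        # Unchanged records are left untouched, preserving history.
    return result

dim = [{"supplier_id": 1, "category": "Cocoa",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"supplier_id": 1, "category": "Dairy"}],
                 effective=date(2024, 6, 1))
# dim now holds two rows: the expired "Cocoa" version and the current "Dairy" one.
```

The key design point of SCD Type 2, which this sketch preserves, is that history is never overwritten: every change produces a new row, so downstream analytics can reconstruct the dimension as of any date.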

Skills and Experience

- Deep knowledge in manipulating, processing, and extracting value from datasets.

- 5+ years of experience in data engineering, business intelligence, data science, or related field.

- Proficiency with programming languages: SQL, Python, R, Spark, and PySpark for data processing.

- Strong project management skills and ability to plan and prioritize work in a fast-paced environment.

- Experience with: Google Cloud Platform, SQL Database, SAP BW/ECC/HANA, Alteryx, Tableau.

- Ability to think creatively, highly driven, and self-motivated.

- Knowledge of SAP BW for HANA (Extractors, Transformations, Modeling aDSOs, Queries, OpenHubs).
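Several of the requirements above involve turning semi-structured sources (JSON, XML) into tabular records ready for a data mart. As a rough illustration of the kind of flattening step involved (the payload shape and field names here are hypothetical):

```python
import json

def flatten(record, parent_key="", sep="."):
    """Flatten nested JSON-style dicts into a single-level dict whose
    keys use dotted paths, so the result maps cleanly onto table columns."""
    items = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, full_key, sep))
        else:
            items[full_key] = value
    return items

# A hypothetical API response with one level of nesting.
raw = json.loads('{"id": 7, "price": {"amount": 12.5, "currency": "USD"}}')
row = flatten(raw)
# row == {"id": 7, "price.amount": 12.5, "price.currency": "USD"}
```

Real pipelines would layer schema validation, error handling, and incremental loading on top, but the core transformation from nested payload to flat row is the same.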

Mondelez

Food and Beverage Manufacturing

Greater Chicago Area, IL
