o9 D&A Lead – E2E Lead

8 years

0 Lacs

Posted: 15 hours ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

Are You Ready to Make It Happen at Mondelēz International?

Join our Mission to Lead the Future of Snacking. Make It Uniquely Yours.

Support the day-to-day operations of the company's GCP-based data pipelines, ensuring data governance, reliability, and performance optimization. Hands-on experience with GCP data services such as Dataflow, BigQuery, Dataproc, Pub/Sub, and real-time streaming architectures is preferred. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. This role requires a flexible working schedule, including potential weekend support for critical operations, while maintaining a 40-hour work week.

The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.

A key aspect of the MDLZ Google Cloud BigQuery platform is handling the complexity of inbound data, which often does not follow a global design (e.g., variations in channel inventory, customer PoS, hierarchies, distribution, and promo plans). You will assist in ensuring the robust operation of pipelines that translate this varied inbound data into the standardized o9 global design. This also includes managing pipelines for different data drivers (> 6 months vs. 0-6 months), ensuring consistent input to o9.
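
The harmonization work described above follows a common pattern: map each market's inbound layout onto one standard schema, then route records by horizon. Below is a minimal PySpark sketch of that pattern; the bucket paths, column names, and the reading of the 0-6 month split as a date filter are illustrative assumptions, not the actual o9 global design.

    # Hypothetical sketch: two markets deliver PoS data in different layouts;
    # map both onto one standardized schema, then split by data-driver horizon.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("o9_harmonize_sketch").getOrCreate()

    # Illustrative inbound feeds; each market uses its own column names.
    feed_a = spark.read.parquet("gs://inbound/market_a/")   # sku, qty, sale_dt
    feed_b = spark.read.parquet("gs://inbound/market_b/")   # item_code, units, dt

    standard = feed_a.select(
        F.col("sku").alias("product_id"),
        F.col("qty").cast("double").alias("pos_qty"),
        F.to_date("sale_dt").alias("sales_date"),
    ).unionByName(
        feed_b.select(
            F.col("item_code").alias("product_id"),
            F.col("units").cast("double").alias("pos_qty"),
            F.to_date("dt").alias("sales_date"),
        )
    )

    # Route the two data-driver horizons (0-6 months vs. > 6 months) to
    # separate outputs so each feeds o9 consistently.
    cutoff = F.add_months(F.current_date(), -6)
    standard.filter(F.col("sales_date") >= cutoff) \
        .write.mode("overwrite").parquet("gs://standard/pos_0_6m/")
    standard.filter(F.col("sales_date") < cutoff) \
        .write.mode("overwrite").parquet("gs://standard/pos_6m_plus/")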

How You Will Contribute

  • 8+ years of overall industry experience, including a minimum of 8-10 years building and deploying large-scale data processing pipelines in a production environment
  • Focus on excellence: Has practical experience with data-driven approaches, is familiar with applying a data security strategy, and knows the well-known data engineering tools and platforms
  • Technical depth and breadth: Able to build and operate data pipelines and data storage; has worked on big data architecture within distributed systems; is familiar with infrastructure definition and automation in this context; is aware of technologies adjacent to those they have worked with, and can speak to the alternative tech choices to those made on their projects
  • Implementation and automation of Internal data extraction from SAP BW / HANA
  • Implementation and automation of External data extraction from openly available internet data sources via APIs
  • Data cleaning, curation, and enrichment using Alteryx, SQL, Python, R, PySpark, and SparkR
  • Preparing consolidated data marts for use by data scientists, and managing SQL databases
  • Exposing data via Alteryx and SQL Database for consumption in Tableau
  • Maintaining and updating data documentation
  • Collaboration and workflow using a version control system (e.g., GitHub)
  • Learning ability: Self-reflective, with a hunger to improve and a keen interest in driving their own learning; applies theoretical knowledge in practice
  • Flexible Working Hours: This role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week.
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
  • Data engineering concepts: Experience working with data lakes, data warehouses, and data marts, and implementing ETL/ELT and SCD concepts (a minimal SCD Type 2 sketch follows this list)
  • ETL or Data integration tool: Experience in Talend is highly desirable.
  • Analytics: Fluent in SQL and PL/SQL, and has used analytics tools such as BigQuery for data analytics
  • Cloud experience: Experienced with GCP services such as Cloud Functions, Cloud Run, Dataflow, Dataproc, and BigQuery
  • Data sources: Experience working with structured data sources such as SAP BW, flat files, and RDBMS, and semi-structured data sources such as PDF, JSON, and XML
  • Programming: Understanding of OOP concepts and hands-on experience with Python/Java for programming and scripting
  • Data processing: Experience with data processing platforms such as Dataflow or Databricks
  • Orchestration: Experience orchestrating/scheduling data pipelines using tools such as Airflow or Alteryx (a minimal DAG sketch follows the skills list below)
  • Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
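
For the SCD bullet above, here is a minimal SCD Type 2 sketch using the BigQuery Python client, shown purely as a reference for the concept: expire the current dimension rows whose tracked attribute changed, then insert fresh current versions. The project, dataset, table, and column names are all hypothetical.

    # Hypothetical SCD Type 2 sketch; all table and column names are illustrative.
    from google.cloud import bigquery

    client = bigquery.Client()

    # Step 1: close out current rows whose tracked attribute changed in staging.
    expire_sql = """
    UPDATE `proj.mart.dim_customer` d
    SET eff_to = CURRENT_DATE(), is_current = FALSE
    WHERE d.is_current
      AND EXISTS (
        SELECT 1 FROM `proj.stage.stg_customer` s
        WHERE s.customer_id = d.customer_id AND s.segment != d.segment
      )
    """

    # Step 2: insert a new current-version row for new and changed customers
    # (after step 1, neither group has a current row left to match).
    insert_sql = """
    INSERT INTO `proj.mart.dim_customer`
      (customer_id, segment, eff_from, eff_to, is_current)
    SELECT s.customer_id, s.segment, CURRENT_DATE(), DATE '9999-12-31', TRUE
    FROM `proj.stage.stg_customer` s
    LEFT JOIN `proj.mart.dim_customer` d
      ON d.customer_id = s.customer_id AND d.is_current
    WHERE d.customer_id IS NULL
    """

    client.query(expire_sql).result()   # expire must run before the insert
    client.query(insert_sql).result()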

Skills and Experience

  • Deep knowledge in manipulating, processing, and extracting value from datasets.
  • 5+ years of experience in data engineering, business intelligence, data science, or a related field
  • Proficiency with programming languages: SQL, Python, R
  • Spark, PySpark, SparkR, and SQL for data processing
  • Strong project management skills and ability to plan and prioritize work in a fast-paced environment.
  • Experience with: MS Azure Data Factory, MS Azure Data Lake Store, SQL Database, SAP BW / ECC / HANA, Alteryx, Tableau
  • Ability to think creatively; highly driven and self-motivated
  • Knowledge of SAP BW for HANA (extractors, transformations, modeling aDSOs, queries, Open Hubs)
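
For the orchestration bullet referenced above, a minimal Airflow DAG sketch follows; the DAG id, schedule, and task bodies are placeholders for whatever extract/standardize/load steps a given pipeline needs.

    # Hypothetical daily ingestion DAG: extract -> transform -> load.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        ...  # e.g., pull inbound PoS files from a landing bucket

    def transform():
        ...  # e.g., map inbound columns onto the standardized schema

    def load():
        ...  # e.g., load the standardized output into BigQuery

    with DAG(
        dag_id="pos_ingest_sketch",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",          # the "schedule" argument needs Airflow 2.4+
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
    ):
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task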

What You Will Bring

A desire to drive your future and accelerate your career. You will bring experience and knowledge in:
  • 10-12 years of experience in data engineering / ingestion
  • Project management including knowledge of Agile
  • Technical expertise and a passion for innovation, understanding a variety of disruptive technology products and services
  • Technical and business acumen to evaluate opportunities with internal and external partners
  • Machine learning/artificial intelligence programming languages
  • The start-up landscape
  • Familiarity with the fast-moving consumer goods or related sector
  • Leading internal and external teams through complex challenges and developing creative solutions/options
  • Driving for results, and analytical skills

More about this role:


Education / Certifications:

Job specific requirements:


Travel requirements:

Work schedule: Hybrid

No relocation support available

Business Unit Summary

Mondelez India Foods Private Limited (formerly Cadbury India Ltd.) has been in India for over 70 years, making sure our mouth-watering and well-loved local and global brands such as Cadbury chocolates, Bournvita and Tang powdered beverages, Oreo and Cadbury Bournvita biscuits, and Halls and Cadbury Choclairs Gold candies get safely into our customers' hands – and mouths. Headquartered in Mumbai, the company has more than 3,300 employees proudly working across sales offices in New Delhi, Mumbai, Kolkata and Chennai; in manufacturing facilities in Maharashtra, Madhya Pradesh, Himachal Pradesh and Andhra Pradesh; at our global Research & Development Technical Centre and Global Business Hub in Maharashtra; and in a vast distribution network across the country. We are also proud to be recognised by Avtar as one of the Best Companies for Women in India in 2019 – the fourth time we've received this award.

Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type

Regular

Digital Strategy & Innovation | Technology & Digital
