Data Engineer

Experience: 2 - 9 years

Salary: 15 - 17 Lacs

Posted: 11 hours ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.
You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building the infrastructure, data pipelines, and retrieval mechanisms that support our data needs.
How you will contribute
You will:
  • Operationalize and automate activities for efficiency and timely production of data visuals
  • Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
  • Search for ways to get new data sources and assess their accuracy
  • Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
  • Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
  • Validate information from multiple sources.
  • Assess issues that might prevent the organization from making maximum use of its information assets
What you will bring
A desire to drive your future and accelerate your career and the following experience and knowledge:
  • Extensive experience in data engineering in a large, complex business with multiple systems and data sources (e.g. SAP, internal and external data), including setting up, testing, and maintaining new systems
  • Experience with a wide variety of languages and tools (e.g. scripting languages) to retrieve, merge, and combine data
  • Ability to simplify complex problems and communicate them to a broad audience
In This Role
As a Senior Data Engineer, you will design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, implementing data quality and validation processes to keep data accurate and consistent. You will ensure efficient data storage and retrieval for optimal performance, collaborate closely with data teams, product owners, and other stakeholders, and stay updated with the latest cloud technologies and best practices.
Role & Responsibilities:
  • Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
  • Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes (a minimal sketch follows this list).
  • Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
  • Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
  • Collaborate and Innovate: Work closely with data teams, product owners, and other stakeholders, and stay updated with the latest cloud technologies and best practices.
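To make the pipeline and data-quality responsibilities above concrete, the sketch below shows a minimal, hypothetical PySpark extract-transform-load step with a simple row-count quality gate. The storage path, table name, and 95% pass threshold are illustrative placeholders, not Mondelēz's actual pipeline or standards.

    # Minimal ETL sketch with a data-quality gate (hypothetical names and thresholds).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("sales_etl_sketch").getOrCreate()

    # Extract: read a raw file landed in cloud storage (placeholder path).
    raw = spark.read.option("header", True).csv("s3://landing-zone/sales/2024-01-01/")

    # Transform: normalise column types and drop obviously bad rows.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
           .filter(F.col("order_id").isNotNull())
    )

    # Data-quality gate: fail the run if too many rows were rejected (placeholder threshold).
    total, kept = raw.count(), clean.count()
    if total > 0 and kept / total < 0.95:
        raise ValueError(f"Data-quality check failed: only {kept}/{total} rows passed")

    # Load: append to a warehouse/lake table (placeholder table name).
    clean.write.mode("append").saveAsTable("analytics.fact_sales")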
Technical Requirements:
  • Programming: Python, PySpark, Go/Java
  • Database: SQL, PL/SQL
  • ETL & Integration: dbt, Databricks with Delta Live Tables (DLT), AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
  • Data Warehousing: SCD (slowly changing dimensions), schema types, data marts
  • Visualization: Databricks Notebooks, Power BI (optional), Tableau (optional), Looker
  • GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
  • AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
  • Azure Cloud Services: Azure Data Lake Storage Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics
  • Supporting Technologies: Graph databases (Neo4j), Erwin, Collibra, Ataccama DQ, Kafka, Airflow (see the orchestration sketch after this list)
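As a rough illustration of the orchestration layer (Airflow is listed under Supporting Technologies above), the sketch below chains hypothetical extract, transform, and load tasks into a minimal Airflow DAG. The DAG id, schedule, and task bodies are placeholders only, not this team's actual setup.

    # Minimal Airflow DAG sketch (hypothetical DAG id, schedule, and task logic).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull source data (e.g. SAP extracts, external feeds) into a landing zone.
        print("extracting raw data")

    def transform():
        # Placeholder: clean, conform, and validate the data.
        print("transforming data")

    def load():
        # Placeholder: publish curated tables to the warehouse or data mart.
        print("loading to warehouse")

    with DAG(
        dag_id="sales_pipeline_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task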
Soft Skills:
  • Problem-Solving: The ability to identify and solve complex data-related challenges.
  • Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
  • Analytical Thinking: The capacity to analyze data and draw meaningful insights.
  • Attention to Detail: Meticulousness in data preparation and pipeline development.
  • Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.
Job Type: Regular
Data Science, Analytics & Data Science

Mondelez
Food and Beverage Manufacturing
Greater Chicago Area, IL