Data Engineer

Experience: 10 years
Salary: 0 Lacs
Posted: 1 day ago | Platform: LinkedIn
Work Mode: On-site
Job Type: Full Time

Job Description

Job Title: Data Engineer
Location: Hyderabad

Opella, the Consumer Healthcare company, is the third-largest player globally in the Over The Counter (OTC) & Vitamins, Minerals & Supplements (VMS) market. We believe in the power of self-care and the role it can play in creating a healthier society and a healthier planet. That’s why we want to make self-care as simple as it should be by being consumer-led always. Through our over 100 loved brands such as Allegra, Dulcolax and Buscopan, we deliver our mission: helping more than half a billion consumers worldwide take their health in their hands. This mission is brought to life by an 11,000-strong team, 13 best-in-class manufacturing sites, and 4 specialized science and innovation development centers. We are proudly B Corp certified in multiple markets. We aim to be a positive force by embedding sustainability throughout our operations and culture. To succeed, we seek talented individuals who can transform our business and support our mission to become the world's best fast-moving consumer healthcare (FMCH) company in and for the world.

About The Role

  • We are seeking a highly skilled Data Engineer with a strong background in IBM Cognos TM1, a deep understanding of the Finance P&L domain, and hands-on experience with modern data stack technologies including dbt, Snowflake, Informatica IICS, Databricks, Airflow, Unity Catalog, Iceberg, Python, and AWS or Azure. This role focuses on data strategy, the design of data models, data consolidation (from IBM Cognos TM1, ERP systems, etc.), optimization of data structures and data processing, and the development and maintenance of the corresponding documentation.
  • The Data Engineer is expected to play a key role in designing and building secure, scalable, and governed data solutions, working closely with technical and business stakeholders to drive enterprise-level financial reporting and analytics.
  • The ideal candidate will apply modern cloud data technologies, work both independently and as part of a team, and partner with diverse stakeholders including data analysts, data scientists, data engineers, data architects, and product leads/managers to deliver business-aligned outcomes.

Key Responsibilities

  • Strategic governance and continuous development of the Finance Data Foundation in Snowflake, ensuring scalability, efficiency, and long-term stability.
  • Design and maintain scalable data pipelines and ETL processes to support the Finance Data Foundation, reporting, and analytics, using Informatica IICS, Apache Airflow, Iceberg, dbt, Databricks, and Snowflake.
  • Collaborate with Finance stakeholders to understand P&L structures, reporting requirements, and business logic
  • Integrate TM1 with other data sources (ERP, CRM, data lakes, etc.) and cloud platforms (AWS/Azure) to ensure data consistency and accuracy
  • Develop and maintain documentation for data models, processes, and workflows.
  • Create and refine Python scripts for efficient ETL/ELT processes
  • Oversee Snowflake cloud database operations with focus on security, performance, and availability
  • Implement structured data transformations through dbt for enhanced modeling and reporting
  • Utilize Elementary for comprehensive data quality monitoring and reliability assurance
  • Partner with diverse teams to capture requirements, design data models, and drive data initiatives
  • Ensure optimal workflow performance through continuous monitoring and optimization to meet business standards
  • Apply governance and security best practices to maintain data integrity and compliance
  • Support analytics teams by preparing high-quality datasets for analysis and machine learning projects

Qualifications

Required Experience & Skills:

  • Technical Expertise: 10+ years of hands-on data engineering experience with proficiency in AWS/Azure, Snowflake (mandatory), dbt, Airflow, Python, and Databricks or Iceberg. Hands-on experience with Informatica IICS is a plus.
  • Proficiency in identifying trends, inconsistencies, and data quality issues in financial reporting environments.
  • Solid understanding of Finance P&L concepts and Financial KPIs.
  • Knowledge of ERP systems (e.g., SAP, Oracle) and their integration with TM1.
  • Python Development: Strong capabilities in Python programming for data engineering, automation, and scripting.
  • Data Orchestration: Deep understanding of Apache Airflow for pipeline orchestration and workflow management.
  • Exposure to Agile methodologies and DevOps practices.
  • Cloud Database: Extensive experience with Snowflake architecture, including Snowpipe implementation, warehouse optimization, backup and recovery planning, and query performance tuning.
  • Data Sources: TM1, ERP, and the SAP ecosystem.
  • Data Transformation: Expertise in using dbt for building scalable data models and transformation workflows.
  • Data Quality: Practical experience with Elementary for pipeline observability and data quality assurance.
  • Data Architecture: Proven experience in data modeling, schema design, and performance optimization.
  • SQL Proficiency: Advanced SQL skills for complex data querying and transformation.
  • Analytics: Exposure to Power BI or similar BI tools and ability to prepare Snowflake datasets accordingly.
  • Governance & Security: Practical knowledge of implementing Unity Catalog in a modern data platform. Solid understanding of data governance frameworks, security best practices, and privacy regulations.
  • Collaboration: Excellent problem-solving abilities with strong attention to detail, capable of working both independently and in team environments.
  • Functional Domain: Finance and FMCH (Fast Moving Consumer Health).

Preferred Additional Skills

  • AI enthusiasm and automation expertise
  • Experience with GitHub Actions and workflows
  • Knowledge of CI/CD methodologies and Git version control
  • Understanding of modern data architectures including data lakes and real-time processing
  • Familiarity with BI tools such as Power BI, Tableau, Looker

Education & Languages

  • Bachelor’s degree in Computer Science, Information Technology, or a similar quantitative field of study.
  • Fluent in English, and French is a plus.
  • Function effectively within teams of varied cultural backgrounds and expertise.

Why us?

At Opella, you will enjoy doing challenging, purposeful work, empowered to develop consumer brands with passion and creativity. This is your chance to grow new skills and be part of a bold, collaborative, and inclusive culture where people can thrive and be at their best every day.

We Are Challengers.

We are dedicated to making self-care as simple as it should be. That starts with our culture. We are challengers by nature, and this is how we do things:

All In Together:

We keep each other honest and have each other's backs.

Courageous:

We break boundaries and take thoughtful risks with creativity.

Outcome-Obsessed:

We are personally accountable, driving sustainable impact and results with integrity.

Radically Simple:

We strive to make things simple for us and simple for consumers, as it should be.

Join us on our mission. Health. In your hands. www.opella.com/en/careers
