
6 Ataccama Jobs

Set up a Job Alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

12.0 - 16.0 years

0 Lacs

Maharashtra

On-site

Are you ready to make it happen at Mondelēz International? Join our mission to lead the future of snacking and make it with pride. Together with analytics team leaders, you will support our business by providing excellent data models to uncover trends that can drive long-term business results.

Your role will involve:
- Working closely with the business leadership team to execute the analytics agenda
- Identifying and incubating best-in-class external partners for strategic projects
- Developing custom models/algorithms to uncover signals, patterns, and trends for long-term business performance
- Executing the business analytics program agenda using a methodical approach that communicates the deliverables to stakeholders effectively

To excel in this position, you should possess:
- Experience in using data analysis to make recommendations to senior leaders
- Technical expertise in best-in-class analytics practices
- Experience in deploying new analytical approaches in a complex organization
- Proficiency in utilizing analytics techniques to create business impact

The Data COE Software Engineering Capability Tech Lead will be part of the Data Engineering and Ingestion team, responsible for defining and implementing software engineering best practices, frameworks, and tools that support scalable data ingestion and engineering processes.

Key responsibilities include:
- Leading the development of reusable software components, libraries, and frameworks for data ingestion, transformation, and orchestration
- Designing and implementing intuitive user interfaces using React.js and modern frontend technologies
- Developing backend APIs and services to support data engineering tools and platforms
- Defining and enforcing software engineering standards and practices for developing and maintaining data products
- Collaborating with data engineers, platform engineers, and other COE leads to build fit-for-purpose engineering tools
- Integrating observability and monitoring features into data pipeline tooling
- Mentoring and supporting engineering teams in using the frameworks and tools developed

Qualifications required:
- Bachelor's or master's degree in computer science, engineering, or a related discipline
- 12+ years of full-stack software engineering experience, with at least 3 years in data engineering, platform, or infrastructure roles
- Strong expertise in front-end development with React.js and component-based architecture
- Backend development experience in Python with exposure to microservices architecture, FastAPI, and RESTful APIs
- Experience with data engineering tools such as Apache Airflow, Kafka, Spark, Delta Lake, and dbt
- Familiarity with the GCP cloud platform, containerization (Docker, Kubernetes), and DevOps practices
- Strong understanding of CI/CD pipelines, testing frameworks, and software observability
- Ability to work cross-functionally and influence without direct authority

Preferred skills include:
- Experience building internal developer platforms or self-service portals
- Familiarity with data catalogue, metadata, and lineage tools (e.g., Collibra)
- Understanding of data governance and data mesh concepts
- An agile delivery mindset with a focus on automation and reusability

In this role, you will play a strategic part in developing the engineering backbone for a next-generation enterprise Data COE.
You will work with cutting-edge data and software technologies in a highly collaborative and innovative environment, driving meaningful change and enabling data-driven decision-making across the business. Join us at Mondelēz International to be part of our purpose to empower people to snack right, offering a broad range of delicious, high-quality snacks made with sustainable ingredients and packaging. With a rich portfolio of globally recognized brands, we are proud to lead in biscuits, chocolate, and candy globally, with a diverse community of makers and bakers across the world who are energized for growth and committed to living our purpose and values. This is a regular job opportunity in the field of Analytics & Modelling.
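As an illustration of the backend-API responsibility this posting describes, here is a minimal sketch of a FastAPI service exposing pipeline run metadata to internal tooling. The service name, routes, and `PipelineRun` model are hypothetical, invented for this sketch, not part of any Mondelēz system.

```python
# Hypothetical sketch: a FastAPI service that frontend tooling could poll for
# pipeline run status. All names (PipelineRun, /pipelines) are illustrative.
from datetime import datetime
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Data COE Pipeline Registry")

class PipelineRun(BaseModel):
    pipeline_id: str
    status: str          # e.g. "running", "succeeded", "failed"
    started_at: datetime

# In a real service this store would be a database or Airflow's REST API.
_RUNS: dict[str, PipelineRun] = {}

@app.post("/pipelines/{pipeline_id}/runs")
def register_run(pipeline_id: str, run: PipelineRun) -> PipelineRun:
    """Record the latest run for a pipeline so frontends can poll it."""
    _RUNS[pipeline_id] = run
    return run

@app.get("/pipelines/{pipeline_id}/runs/latest")
def latest_run(pipeline_id: str) -> PipelineRun:
    """Return the most recent run, or 404 if the pipeline is unknown."""
    if pipeline_id not in _RUNS:
        raise HTTPException(status_code=404, detail="pipeline not found")
    return _RUNS[pipeline_id]
```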

Posted 2 weeks ago

Apply

7.0 - 9.0 years

17 - 32 Lacs

Hyderabad, Telangana, India

On-site

Technical Lead Data Engineer

Experience: 7 to 10 years
Location: Hyderabad / Chennai
Notice: Immediate joiners, or candidates serving official notice periods ending by 15th September / 30 days

Technical Lead, Data Ingestion / ETL
- Solution Design & Architecture: Lead the design and implementation of scalable data ingestion and ETL pipelines using tools such as PySpark, Python, SQL, and Snowflake. Deployment: Jenkins CI/CD. Scheduling tool: Autosys. Cloud: AWS. Good to have: Ataccama (DQ profiling tool).
- Technical Leadership: Mentor and guide development teams, enforce coding standards, and ensure best practices in data engineering.
- Performance Optimization: Monitor, troubleshoot, and optimize ETL workflows for speed, scalability, and reliability.
- Collaboration & Delivery: Work closely with data architects, domain data analysts, and business stakeholders to translate requirements into technical solutions and ensure timely delivery.
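For context on the pipeline stack this role names, a minimal PySpark ingestion sketch follows. The paths, column names, and cleansing steps are invented; a production job along these lines would land the curated output in Snowflake and run under Autosys with Jenkins CI/CD, as the posting specifies.

```python
# Minimal ingestion sketch: read raw files, apply basic cleansing, write a
# curated layer. Paths and columns are hypothetical illustrations.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer_ingest").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://raw-zone/customers/")           # hypothetical landing path
)

curated = (
    raw.dropDuplicates(["customer_id"])        # basic deduplication
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .filter(F.col("customer_id").isNotNull())  # simple data quality gate
    .withColumn("ingest_ts", F.current_timestamp())
)

curated.write.mode("overwrite").parquet("s3://curated-zone/customers/")
```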

Posted 4 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Work Location: Hyderabad

What Gramener offers you: Gramener will offer you an inviting workplace, talented colleagues from diverse backgrounds, career paths, and steady growth prospects with great scope to innovate. We aim to create an ecosystem of easily configurable data applications focused on storytelling for public and private use.

Data Architect

We are seeking an experienced Data Architect to design and govern scalable, secure, and efficient data platforms in a data mesh environment. You will lead data architecture initiatives across multiple domains, enabling self-serve data products built on Databricks and AWS, and support both operational and analytical use cases.

Key Responsibilities
- Design and implement enterprise-grade data architectures leveraging the medallion architecture (Bronze, Silver, Gold).
- Develop and enforce data modelling standards, including flattened data models optimized for analytics.
- Define and implement MDM strategies (Reltio), data governance frameworks (Collibra), and data classification policies.
- Lead the development of data landscapes, capturing sources, flows, transformations, and consumption layers.
- Collaborate with domain teams to ensure consistency across decentralized data products in a data mesh architecture.
- Guide best practices for ingesting and transforming data using Fivetran, PySpark, SQL, and Delta Live Tables (DLT).
- Define metadata and data quality standards across domains.
- Provide architectural oversight for data platform development on Databricks (Lakehouse) and the AWS ecosystem.

Key Skills & Qualifications

Must-Have Technical Skills (Reltio, Collibra, Ataccama, Immuta):
- Experience in the Pharma domain.
- Data modeling (dimensional, flattened, common data model, canonical, and domain-specific; entity-level data understanding from a business-process point of view).
- Master Data Management (MDM) principles and tools (Reltio).
- Data Governance and Data Classification frameworks.
- Strong experience with Fivetran, PySpark, SQL, and Python.
- Deep understanding of Databricks (Delta Lake, Unity Catalog, Workflows, DLT).
- Experience with AWS data services (e.g., S3, Glue, Redshift, IAM).
- Experience with Snowflake.

Architecture & Design
- Proven expertise in Data Mesh or domain-oriented data architecture.
- Experience with medallion/lakehouse architecture.
- Ability to create data blueprints and landscape maps across complex enterprise systems.

Soft Skills
- Strong stakeholder management across business and technology teams.
- Ability to translate business requirements into scalable data designs.
- Excellent communication and documentation skills.

Preferred Qualifications
- Familiarity with regulatory and compliance frameworks (e.g., GxP, HIPAA, GDPR).
- Background in building data products.

About Us: We consult and deliver solutions to organizations where data is the core of decision-making. We undertake strategic data consulting for organizations, laying out the roadmap for data-driven decision-making, helping them convert data into a strategic differentiator. Through a host of our products, solutions, and service offerings, we analyze and visualize large amounts of data. To know more about us, visit the Gramener Website and Gramener Blog.
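The medallion layering this posting names maps naturally onto Delta Live Tables. Below is a compact sketch, assuming a Databricks DLT pipeline environment (where `dlt` and `spark` are provided) and using invented table names, columns, and paths; it is an illustration of the Bronze/Silver/Gold pattern, not Gramener's actual pipeline.

```python
# Medallion flow (Bronze -> Silver -> Gold) as a Delta Live Tables sketch.
# Runs only inside a Databricks DLT pipeline; all names are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw orders landed as-is from the source extract")
def orders_bronze():
    return spark.read.format("json").load("/mnt/landing/orders/")  # hypothetical path

@dlt.table(comment="Silver: validated, deduplicated, typed orders")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # DQ expectation
def orders_silver():
    return (
        dlt.read("orders_bronze")
        .dropDuplicates(["order_id"])
        .withColumn("order_ts", F.to_timestamp("order_ts"))
    )

@dlt.table(comment="Gold: flattened daily revenue model for analytics")
def daily_revenue_gold():
    return (
        dlt.read("orders_silver")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("revenue"))
    )
```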

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The position supports the Data Governance Office (DGO) in delivering best-in-class methodologies for designing and implementing the Data Quality Program. You will support design, development, and delivery by providing solutions for end-to-end implementation of the Data Quality platform in partnership with key Business and IT stakeholders. Your main responsibilities will include:
- Collecting and managing metadata
- Supporting the identification and management of Critical Data Elements
- Developing business rules and hypotheses for testing consistency of usage
- Assisting with Data Quality dashboard and to-do list requirements for monitoring and remediation
- Coordinating with Data Stewards to create data quality controls and monitoring tools
- Implementing Data Quality tools to deliver end-to-end Data Quality solutions

To excel in this role, you should have a BA/BS in Business Administration, Communications, Accounting, Computer Science, Finance, or a related field, with 2-4 years of experience in Data Quality implementation using tools such as IDQ, AXON, or EDC. You should be well-versed in each stage of the data quality development cycle; proficient in creating profiles and interpreting profile results; skilled in writing complex SQL queries and Data Quality rules on a Data Quality platform; and experienced in creating glossaries, catalogues, and DQ monitoring projects and integrating with tools like Collibra and Power BI. Familiarity with UNIX or any scripting language, and experience with Data Governance tools like Ataccama, will also be beneficial.

MetLife, recognized on Fortune magazine's 2024 list of the "World's Most Admired Companies" and Fortune's World's 25 Best Workplaces for 2024, is one of the world's leading financial services companies. Operating in over 40 markets, MetLife provides insurance, annuities, employee benefits, and asset management services to individual and institutional customers globally. The company's purpose is to help colleagues, customers, communities, and the world create a more confident future. United by purpose and guided by empathy, MetLife aims to transform the next century in financial services. Join MetLife and be part of creating a more confident future together.
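To make the "complex SQL queries and Data Quality rules" requirement concrete, here is a small self-contained sketch that expresses DQ rules as violation-counting SQL. It uses SQLite purely for portability, and the `customers` table and rule names are invented for illustration, not drawn from any MetLife platform.

```python
# Illustrative DQ rules as SQL: each query counts violating rows, and a rule
# passes when the count is zero. Table, data, and rule names are invented.
import sqlite3

RULES = {
    "customer_id_not_null": "SELECT COUNT(*) FROM customers WHERE customer_id IS NULL",
    "email_has_at_sign":    "SELECT COUNT(*) FROM customers WHERE email NOT LIKE '%@%'",
}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@example.com"), (None, "b@example.com"), (3, "not-an-email")],
)

for name, sql in RULES.items():
    violations = conn.execute(sql).fetchone()[0]
    print(f"{name}: {'PASS' if violations == 0 else f'FAIL ({violations} rows)'}")
```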

Posted 1 month ago

Apply

2.0 - 7.0 years

10 - 19 Lacs

Bengaluru

Remote

Ataccama Support Engineer

Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai

- 2+ years of experience supporting the Ataccama software application
- Experience in data quality, governance, and metadata management
- Extensive knowledge of Ataccama, ADF, and SQL
- Open to 24x7 support shift rotation
- Experience in business process mapping for data and analytics solutions
- Monitor and support Ataccama Data Quality rule execution and profiling jobs
- Troubleshoot data validation, anomaly detection, and scorecard generation issues
- Perform patching and software upgrades, and ensure compliance with the latest platform updates
- Work with business teams to resolve data integrity and governance-related incidents
- Maintain SLA commitments for resolving incidents and ensuring data accuracy
- Experience with the Ataccama ONE platform and knowledge of SQL for data validation
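As a rough illustration of the monitoring and SLA duties listed above, here is a stdlib-only triage sketch over job-run records. The records, statuses, and 4-hour SLA are assumptions made for the example; it does not use Ataccama's actual monitoring interfaces.

```python
# Generic run-triage sketch: scan recent DQ job runs and flag failures or SLA
# breaches. Job names, statuses, and the SLA value are hypothetical.
from datetime import timedelta

SLA = timedelta(hours=4)  # assumed runtime SLA, for illustration only

runs = [  # would normally come from the platform's job history
    {"job": "profile_customers",  "status": "FINISHED", "runtime": timedelta(minutes=25)},
    {"job": "dq_scorecard_daily", "status": "FAILED",   "runtime": timedelta(minutes=3)},
    {"job": "validate_orders",    "status": "FINISHED", "runtime": timedelta(hours=5)},
]

def triage(runs):
    """Yield (job, reason) pairs that need a support ticket."""
    for r in runs:
        if r["status"] != "FINISHED":
            yield r["job"], f"run ended with status {r['status']}"
        elif r["runtime"] > SLA:
            yield r["job"], f"runtime {r['runtime']} exceeded the {SLA} SLA"

for job, reason in triage(runs):
    print(f"ALERT {job}: {reason}")
```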

Posted 2 months ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. As a Package Consultant at IBM, get ready to tackle numerous mission-critical company directives. Our team takes on the challenge of designing, developing, and re-engineering highly complex application components and integrating software packages using various tools. You will use a mix of consultative skills, business knowledge, and technical expertise to effectively integrate packaged technology into our clients' business environments and achieve business results.

Your role and responsibilities
- Collaborate with stakeholders to gather requirements and design data quality, data governance, and master data management solutions using Ataccama.
- Design and implement data matching and deduplication strategies using Ataccama Data Matching.
- Develop and maintain data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- BE / B.Tech in any stream, or M.Sc. (Computer Science/IT) / M.C.A., with a minimum of 3-5 years of experience
- Experience in the optimization of Ataccama data management solutions
- Develop and maintain data quality rules, data profiling, and data cleansing processes within Ataccama Data Quality Center
- Design and implement data matching and deduplication strategies using Ataccama Data Matching

Preferred technical and professional experience
- Utilize Ataccama Data Catalog for metadata management, data lineage tracking, and data discovery
- Provide expertise in integrating Ataccama with other data management tools and platforms within the organization's ecosystem
- Collaborate with stakeholders to gather requirements and design data quality, data governance, and master data management solutions using Ataccama
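For readers unfamiliar with data matching, here is a toy sketch of the exact-key matching idea that engines like Ataccama Data Matching generalize with fuzzy scoring and survivorship rules. The records and normalization choices are invented; this is a conceptual illustration, not Ataccama's implementation.

```python
# Toy deduplication: normalize identifying fields into a match key, then
# group records sharing a key as duplicate candidates. All data is invented.
from collections import defaultdict

def match_key(record: dict) -> tuple:
    """Build a normalized (name, email) key for exact-key matching."""
    name = " ".join(record["name"].lower().split())  # collapse whitespace
    email = record["email"].strip().lower()
    return (name, email)

records = [
    {"id": 1, "name": "Asha  Rao", "email": "ASHA.RAO@example.com"},
    {"id": 2, "name": "asha rao",  "email": "asha.rao@example.com "},
    {"id": 3, "name": "R. Mehta",  "email": "r.mehta@example.com"},
]

groups = defaultdict(list)
for rec in records:
    groups[match_key(rec)].append(rec["id"])

for key, ids in groups.items():
    if len(ids) > 1:
        print(f"duplicate candidates {ids} share key {key}")
```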

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
