
3523 Informatica Jobs

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

3.0 - 8.0 years

2 Lacs

Hyderabad

Work from Office


Key responsibilities:
- Understand the program's service catalog and document the list of tasks to be performed for each service.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse (see the illustrative sketch below).
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.

Required skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.

Qualified candidates should apply now for immediate consideration. Please hit Apply to provide the required information, and we will be back in touch as soon as possible. Thank you!

About Innova Solutions: Founded in 1998 and headquartered in Atlanta, Georgia, Innova Solutions employs approximately 50,000 professionals worldwide and reports annual revenue approaching $3 billion. Through our global delivery centers across North America, Asia, and Europe, we deliver strategic technology and business transformation solutions to our clients, enabling them to operate as leaders within their fields.

Recent recognitions:
- One of the largest IT consulting and staffing firms in the USA, recognized as #4 by Staffing Industry Analysts (SIA 2022)
- ClearlyRated Client Diamond Award winner (2020)
- One of the largest certified MBE companies in the NMSDC network (2022)
- Advanced Tier Services partner with AWS and Gold partner with Microsoft
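The mappings for a role like this are built in Informatica's IDMC/IICS designer rather than hand-written, but the extract-transform-load flow with a data-quality gate that the posting describes can be sketched in plain Python. Everything below is illustrative only: the tables (src_orders, dw_orders), the columns, and the in-memory SQLite stand-ins for the source and warehouse connections are assumptions, not details from the posting.

import sqlite3

# Stand-in source and warehouse; in an IDMC setup these would be real
# connections configured in the tool, not SQLite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL, country TEXT);
    INSERT INTO src_orders VALUES (1, 19.991, ' in '), (2, NULL, 'us'), (3, 5.0, 'de');
    CREATE TABLE dw_orders (order_id INTEGER PRIMARY KEY, amount REAL, country TEXT);
""")

def extract(conn):
    # Extract: pull raw rows from the source system.
    return conn.execute("SELECT order_id, amount, country FROM src_orders").fetchall()

def transform(rows):
    # Transform: reject incomplete records, normalize amounts and country codes.
    cleaned = []
    for order_id, amount, country in rows:
        if order_id is None or amount is None:
            continue  # data-quality rule: drop incomplete records
        cleaned.append((order_id, round(float(amount), 2), country.strip().upper()))
    return cleaned

def load(conn, rows):
    # Load: idempotent upsert into the warehouse table.
    conn.executemany("INSERT OR REPLACE INTO dw_orders VALUES (?, ?, ?)", rows)
    conn.commit()

src = extract(conn)
good = transform(src)
load(conn, good)
# Data-quality gate: fail the run if too many rows were rejected.
if len(good) / len(src) < 0.5:
    raise ValueError(f"only {len(good)}/{len(src)} rows passed validation")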

Posted 5 hours ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Pune

Work from Office


As a Big Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Developing, maintaining, evaluating, and testing big data solutions.
- Data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address the client's needs (see the sketch below).

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Big data development with Hadoop, Hive, Spark, and PySpark; strong SQL.
- Ability to incorporate a variety of statistical and machine learning techniques.
- Basic understanding of cloud platforms (AWS, Azure, etc.).
- Ability to use programming languages such as Java, Python, and Scala to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use Extract, Transform, and Load (ETL), data integration, or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- Basic understanding of or experience with predictive/prescriptive modeling.
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
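As a rough illustration of the source-to-target pipeline work the posting mentions, here is a minimal PySpark batch job. The sample rows, column names, and the commented-out S3 paths are hypothetical; the in-memory frame simply lets the sketch run anywhere.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("source-to-target").getOrCreate()

# Stand-in for raw data landed from a source system (in a real job this
# would be, e.g., spark.read.option("header", True).csv("s3://landing/orders/")).
raw = spark.createDataFrame(
    [("o1", "19.99", "2025-06-16 10:00:00"), ("o2", None, "2025-06-16 11:30:00")],
    ["order_id", "amount", "order_ts"],
)

# Transform: type the columns, drop bad rows, derive a partition field.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: a real job would write partitioned Parquet for Hive/analytics access:
# orders.write.mode("overwrite").partitionBy("order_date").parquet("s3://warehouse/orders/")
orders.show()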

Posted 6 hours ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Novo Nordisk Global Business Services (GBS), India
Department: Supply Chain, Global Business Services (GBS)

Are you passionate about supply chain and/or pharma? Are you ready to take on a new challenge and join our Supply Chain Management team? We are looking for a talented individual to fill the role of Senior Professional B1 in our Global Business Services (GBS) department. If you have a passion for data analysis and a strong understanding of supply chain concepts, read on and apply today for a life-changing career!

The position:
As a Senior Professional B1 at Novo Nordisk, you will be responsible for creating and maintaining master data in SAP and Winshuttle according to existing business processes and rules, and will handle CR cases and DV related to the creation of master data. You will be entrusted with the responsibilities below:
- Create and maintain Bills of Materials (BOMs) in SAP, individually and in mass.
- Perform data cleansing to ensure data accuracy and integrity.
- Create and maintain Standard Operating Procedures (SOPs) and instructions per Novo Nordisk standards.
- Manage stakeholders and support operations for new transitions.
- Adhere to Key Performance Indicators (KPIs) and actively participate in daily stand-up meetings.

Qualifications:
- Bachelor's degree in supply chain management, production, mechanical engineering, or equivalent from a well-recognised institute.
- 3-5 years of experience with SAP master data, preferably within pharma or supply chain.
- Ability to analyse and process data.
- Good understanding of supply chain concepts (Plan, Make, Source, Deliver, and Return) and the supporting master data.
- Proficient user of Microsoft Office (Excel, PowerPoint).
- Experience in automation with advanced Excel and macros, or ETL knowledge with Informatica/Winshuttle.
- Experience in conducting meetings with peers, including preparation and facilitation.
- Knowledge of business rules for processes and attributes within SAP.
- Excellent communication skills in English, both written and oral.

About the department:
Supply Chain was established in March 2017 as part of the Product Supply Devices & Supply Chain Management business plan. The business plan has three parts, Robust, Ready, and Effective, and Supply Chain Global Business Services (GBS) is part of the last, focusing on supporting the business through offshoring. The unit is anchored under Supply Chain Planning (SCP) in headquarters and is the agreed place to consolidate supply chain activities across Novo Nordisk. The supply chain offshoring journey has started in D&S, the Service Delivery Catalogue is taking form, and other areas within Product Supply can soon join or add to the Catalogue to optimise costs and reduce complexity by operating an effective supply chain.

Posted 6 hours ago

Apply

7.0 - 9.0 years

9 - 14 Lacs

Hyderabad

Work from Office


Want to be part of the Data & Analytics organization, whose strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters?

About the Role:
We are looking for a highly motivated individual with strong organizational and technical skills for the position of Lead Data Engineer / Data Engineering Manager (Snowflake). You will play a critical role working on the cutting edge of data engineering and analytics, leveraging predictive models, machine learning, and generative AI to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives. In this role you will:
- Effectively communicate across various levels, including executives, and functions within the global organization.
- Demonstrate strong leadership skills, with the ability to drive projects/tasks to deliver value.
- Engage with stakeholders, business analysts, and the project team to understand the data requirements.
- Design analytical frameworks to provide insights into a business problem.
- Explore and visualize multiple data sets to understand the data available and prepare data for problem solving.
- Design database models (if a data mart or operational data store is required to aggregate data for modeling).

About You:
You're a fit for the Lead Data Engineer / Data Engineering Manager (Snowflake) role if your background includes:
- Qualifications: B.Tech/M.Tech/MCA or equivalent.
- Experience: 7-9 years of corporate experience.
- Location: Bangalore, India.
- Hands-on experience in developing data models for large-scale data warehouses/data lakes (Snowflake, BW); a hypothetical sketch appears at the end of this posting.
- Ability to map the data journey from operational system sources, through any transformations in transit, to its delivery into enterprise repositories (warehouse, data lake, master data, etc.).
- Enabling the overall master and reference data strategy, including the procedures to ensure the consistency and quality of Finance reference data.
- Experience across ETL, SQL, and other emerging data technologies, with experience in integrations of a cloud-based analytics environment.
- Building and refining end-to-end data workflows to offer actionable insights.
- Fair understanding of data strategy and data governance processes.
- Knowledge of BI analytics and visualization tools: Power BI, Tableau.
#LI-NR1

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
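As a hedged illustration of the Snowflake data-modeling work this role centers on, the sketch below creates a minimal star schema through the Snowflake Python connector and runs one analytical query. The account credentials, warehouse, database, and table names are placeholders, not details from the posting.

import snowflake.connector

# Placeholder credentials; a real setup would use a secrets manager.
conn = snowflake.connector.connect(
    user="<user>", password="<password>", account="<account>",
    warehouse="ANALYTICS_WH", database="EDW", schema="SALES",
)
cur = conn.cursor()

# Dimension and fact tables for a minimal star schema.
cur.execute("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_sk INTEGER AUTOINCREMENT PRIMARY KEY,
        customer_id STRING,
        region STRING
    )
""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS fct_orders (
        order_id STRING,
        customer_sk INTEGER,
        amount NUMBER(12,2),
        order_date DATE
    )
""")

# A typical analytical query joining the fact table to its dimension.
cur.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fct_orders f JOIN dim_customer d USING (customer_sk)
    GROUP BY d.region
""")
print(cur.fetchall())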

Posted 6 hours ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office


The Data Scientist organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.

About the role:
- Work with low to minimum supervision to solve business problems using data and analytics.
- Work in multiple business domain areas, including Customer Experience and Service, Operations, Finance, and Sales and Marketing.
- Work with various business stakeholders to understand and document requirements.
- Design an analytical framework to provide insights into a business problem.
- Explore and visualize multiple data sets to understand the data available for problem solving.
- Build end-to-end data pipelines to handle and process data at scale.
- Build machine learning models and/or statistical solutions, including predictive models.
- Use Natural Language Processing to extract insight from text.
- Design database models (if a data mart or operational data store is required to aggregate data for modeling).
- Design visualizations and build dashboards in Tableau and/or Power BI.
- Extract business insights from the data and models, and present results to stakeholders (telling stories using data) via PowerPoint and/or dashboards.
- Work collaboratively with other team members.

About you:
- Overall 3+ years of experience in technology roles, with a minimum of 1 year of experience working in the data science domain.
- Has used frameworks/libraries such as scikit-learn, PyTorch, Keras, and NLTK.
- Highly proficient in Python and SQL.
- Experience with Tableau and/or Power BI.
- Has worked with Amazon Web Services and SageMaker.
- Ability to build data pipelines for data movement using tools such as Alteryx, AWS Glue, or Informatica.
- Proficient in machine learning, statistical modelling, and data science techniques.
- Experience with one or more of the following types of business analytics applications: predictive analytics for customer retention, cross-sales, and new customer acquisition (a toy example appears at the end of this posting); pricing optimization models; segmentation; recommendation engines.
- Experience in one or more of the following business domains: Customer Experience and Service, Finance, Operations.
- Good presentation skills and the ability to tell stories using data and PowerPoint/dashboard visualizations.
- Excellent organizational, analytical, and problem-solving skills.
- Ability to communicate complex results in a simple and concise manner at all levels within the organization.
- Ability to excel in a fast-paced, startup-like environment.
#LI-SS5

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
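As a toy example of the "predictive analytics for customer retention" application area listed above, the following scikit-learn snippet trains a churn classifier on synthetic data; the features and the label-generating coefficients are invented for illustration.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.integers(1, 60, n),        # tenure_months
    rng.poisson(3, n),             # support_tickets
    rng.uniform(10, 500, n),       # monthly_spend
])
# Synthetic churn label: short tenure and many tickets raise churn odds.
p = 1 / (1 + np.exp(0.08 * X[:, 0] - 0.5 * X[:, 1] + 1.0))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("holdout AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))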

Posted 6 hours ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Chennai

Hybrid


Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)

Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM and Tuning Advisor, and in analyzing AWR reports.
- Strong SQL skills, with the ability to guide application teams on improvements.
- Efficient management of compute and storage in the Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC (see the sketch below).
- Utilize Time Travel, cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.

Skills: Snowflake database design, coding, documentation, Apps DBA, Oracle, performance tuning, OEM, Tuning Advisor, AWR report analysis, SQL, compute and storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing.
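Several of the Snowflake features named above (RBAC grants, dynamic masking, Time Travel, zero-copy cloning) are plain SQL; the sketch below runs illustrative statements through the Snowflake Python connector. All object names, roles, and credentials are hypothetical placeholders.

import snowflake.connector

conn = snowflake.connector.connect(user="<user>", password="<pwd>", account="<acct>")
cur = conn.cursor()

statements = [
    # RBAC: grant a reporting role read access to one schema.
    "GRANT USAGE ON DATABASE edw TO ROLE reporting_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA edw.sales TO ROLE reporting_role",
    # Dynamic masking: hide emails from everyone but privileged roles.
    """CREATE MASKING POLICY IF NOT EXISTS mask_email AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***' END""",
    "ALTER TABLE edw.sales.customers MODIFY COLUMN email SET MASKING POLICY mask_email",
    # Time Travel: query yesterday's state, then restore it via a zero-copy clone.
    "SELECT COUNT(*) FROM edw.sales.orders AT(OFFSET => -86400)",
    "CREATE TABLE edw.sales.orders_restored CLONE edw.sales.orders AT(OFFSET => -86400)",
]
for stmt in statements:
    cur.execute(stmt)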

Posted 6 hours ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Role 1: Snowflake Developer (Coding, Documentation)
Locations: Multiple locations (Bangalore, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Budget: 18 Lacs (max)

Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM and Tuning Advisor, and in analyzing AWR reports.
- Strong SQL skills, with the ability to guide application teams on improvements.
- Efficient management of compute and storage in the Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC.
- Utilize Time Travel, cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.

Skills: Snowflake database design, coding, documentation, Apps DBA, Oracle, performance tuning, OEM, Tuning Advisor, AWR report analysis, SQL, compute and storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing.

Posted 6 hours ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Chennai

Hybrid


Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary:
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities:
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience in designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows.
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets (see the sketch below).
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills:
- Experience in building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, ETL pipelines, AWS, S3, Airflow, Control-M, SQL, Unix/Linux, shell scripting, Hive, Hadoop, Cloudera, Hortonworks, CDC, data modeling, data validation, performance tuning, unit test cases, batch and real-time integration, Jupyter Notebook, Zeppelin, PyCharm, Agile methodologies, CI/CD, Jenkins, Git, API integration, ETL tools, Informatica, Tableau, Jasper, QlikView, AI/ML model development.
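A minimal sketch of the CDC handling the posting asks for: apply the latest change event per key on top of the current snapshot, in plain PySpark. The frames are built in memory so the sketch runs anywhere; in a real job both inputs would come from S3 or Hive, and the op/change_ts columns are assumed conventions, not part of the posting.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("cdc-apply").getOrCreate()

# Current snapshot and a batch of CDC events (op: I = insert, U = update, D = delete).
current = spark.createDataFrame(
    [("C1", "Pune"), ("C2", "Delhi")], ["customer_id", "city"])
changes = spark.createDataFrame(
    [("C2", "Mumbai", "U", 2), ("C2", "Chennai", "U", 3), ("C3", "Goa", "I", 1)],
    ["customer_id", "city", "op", "change_ts"])

# Keep only the newest change per key...
latest = (changes.withColumn("rn", F.row_number().over(
              Window.partitionBy("customer_id").orderBy(F.col("change_ts").desc())))
          .filter("rn = 1").drop("rn"))

# ...then overlay onto the snapshot: untouched rows plus upserts, minus deletes.
merged = (current.join(latest, "customer_id", "left_anti")
          .unionByName(latest.filter("op <> 'D'").drop("op", "change_ts")))
merged.show()   # C1 unchanged, C2 updated to Chennai, C3 inserted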

Posted 6 hours ago

Apply

8.0 years

0 Lacs

Hyderabad

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Teradata SME

We are seeking a highly experienced and knowledgeable Teradata Subject Matter Expert (SME) to provide deep technical expertise and strategic guidance on our existing Teradata data warehouse environment, with a focus on its integration, migration, and potential modernization within the Google Cloud Platform (GCP). You will be the go-to person for complex Teradata-related challenges, optimization initiatives, and architectural decisions, particularly as they relate to our cloud strategy on GCP. You will collaborate with data engineers, cloud architects, analysts, and business stakeholders to ensure our data landscape effectively leverages both Teradata and GCP capabilities.

Responsibilities:
- Serve as the primary point of contact and expert resource for all Teradata-related technical inquiries and issues, including those related to GCP integration.
- Provide deep technical expertise in Teradata architecture, utilities, performance tuning, and query optimization, with an understanding of how these aspects translate to or interact with GCP services.
- Lead efforts to integrate Teradata with GCP services for data ingestion, processing, and analysis.
- Provide guidance and expertise on potential migration strategies from Teradata to GCP data warehousing solutions like BigQuery (a hypothetical offload sketch appears at the end of this posting).
- Optimize Teradata performance in the context of data pipelines that may involve GCP components.
- Troubleshoot and resolve complex Teradata system and application issues, considering potential interactions with GCP.
- Develop and maintain best practices, standards, and documentation for Teradata development and administration, with a focus on cloud integration scenarios.
- Collaborate with cloud architects and data engineers to design hybrid data solutions leveraging both Teradata and GCP.
- Provide guidance and mentorship to team members on Teradata best practices and techniques within a cloud-focused context.
- Participate in capacity planning and forecasting for the Teradata environment, considering its future within our GCP strategy.
- Evaluate and recommend Teradata upgrades, patches, and new features, assessing their compatibility and value within a GCP ecosystem.
- Ensure adherence to data governance policies and security standards across both Teradata and GCP environments.
- Stay current with the latest Teradata features, trends, and best practices, as well as relevant GCP data warehousing and integration services.

Qualifications we seek in you!
Minimum Qualifications / Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Extensive and deep experience (typically 8+ years) working with Teradata data warehouse systems.
- Expert-level knowledge of Teradata architecture, including MPP concepts, BYNET, and storage management.
- Proven ability to write and optimize complex SQL queries in Teradata.
- Strong experience with Teradata utilities (e.g., BTEQ, FastLoad, MultiLoad, TPump).
- Deep understanding of Teradata performance tuning techniques, including workload management and query optimization.
- Experience with Teradata data modeling principles and best practices.
- Excellent analytical, problem-solving, and troubleshooting skills specific to Teradata environments, with an aptitude for understanding cloud integration.
- Strong communication, collaboration, and interpersonal skills, with the ability to explain complex technical concepts clearly, including those bridging Teradata and GCP.
- Familiarity with Google Cloud Platform (GCP) and its core data services (e.g., BigQuery, Cloud Storage, Dataflow).

Preferred Qualifications / Skills:
- Teradata certifications.
- Google Cloud certifications (e.g., Cloud Architect, Data Engineer).
- Experience with Teradata Viewpoint and other monitoring tools.
- Knowledge of data integration tools (e.g., Informatica, Talend) and their interaction with both Teradata and GCP.
- Experience with workload management and prioritization in Teradata, and how it might be approached in GCP.
- Familiarity with data security concepts and implementation within both Teradata and GCP.
- Experience with migrating data to or from Teradata, especially to GCP.
- Exposure to cloud-based data warehousing solutions (specifically BigQuery) and their architectural differences from Teradata.
- Scripting skills (e.g., Shell, Python) for automation of tasks across both Teradata and GCP.

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation.
- Make an impact – Drive change for global enterprises and solve business challenges that matter.
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 16, 2025, 11:49:57 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
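As a hypothetical sketch of one Teradata-to-BigQuery offload step of the kind this role would guide, the snippet below reads a slice from Teradata with the teradatasql driver and appends it to BigQuery with the google-cloud-bigquery client. Hosts, credentials, and table names are placeholders; a production migration would typically use bulk export utilities rather than a DataFrame hop.

import pandas as pd
import teradatasql
from google.cloud import bigquery

# Extract a daily slice from Teradata (placeholder host and table).
with teradatasql.connect(host="<td-host>", user="<user>", password="<pwd>") as con:
    df = pd.read_sql(
        "SELECT * FROM edw.daily_sales WHERE sale_date = DATE '2025-06-16'", con)

# Load it into a BigQuery table in append mode.
client = bigquery.Client(project="<gcp-project>")
job = client.load_table_from_dataframe(
    df,
    "analytics.daily_sales",
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
)
job.result()  # wait for the load job to finish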

Posted 9 hours ago

Apply

7.0 years

5 - 7 Lacs

Hyderabad

On-site

About the Role:
Grade Level (for internal use): 10

Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P Capital IQ Pro to serve up value-added ratings, research, and related information to institutional clients.

The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format which can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools like Oracle/SQL/.NET/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?

Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
- Developing innovative functionality in existing and new products.
- Supporting and maintaining high-revenue productionized products.
- Achieving the above intelligently and economically using best practices.

Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated, and collaborate with developers, business analysts, and product managers who are experts in their domain.

Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
- Complete SDLC: architecture, design, development, and support of tech solutions.
- Play a key role in the development team to build high-quality, high-performance, scalable code.
- Engineer components and common services based on standard corporate development models, languages, and tools.
- Produce technical design documents and conduct technical walkthroughs.
- Collaborate effectively with technical and non-technical stakeholders.
- Be part of a culture that continuously improves the technical design and code base.
- Document and demonstrate solutions using technical design docs, diagrams, and stubbed code.

Our Hiring Manager says: "I'm looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company."

Qualifications
Required:
- Bachelor's degree in Computer Science, Information Systems, or Engineering.
- 7+ years of experience with transactional databases like SQL Server, Oracle, and PostgreSQL, and with NoSQL databases like Amazon DynamoDB and MongoDB.
- Strong database development skills on SQL Server and Oracle.
- Strong knowledge of database architecture, data modeling, and data warehousing.
- Knowledge of object-oriented design and design patterns.
- Familiarity with various design and architectural patterns.
- Strong development experience with Microsoft SQL Server.
- Experience in cloud-native development and AWS is a big plus.
- Experience with Kafka/Sonic broker messaging systems.

Nice to have:
- Experience in developing data pipelines using Java or C# is a significant advantage.
- Strong knowledge of ETL tools (Informatica, SSIS); exposure to Informatica is an advantage.
- Familiarity with Agile and Scrum models.
- Working knowledge of VSTS.
- Working knowledge of AWS cloud is an added advantage.
- Understanding of fundamental design principles for building a scalable system.
- Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, and Index/Benchmarks is desirable.
- Additionally, experience with Scala, Python, and Spark applications is a plus.

About S&P Global Market Intelligence:
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)

Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India

Posted 9 hours ago

Apply

4.0 years

3 - 6 Lacs

Hyderabad

On-site

About Us:
Location: Hyderabad, India
Department: Product R&D
Level: Professional
Working Pattern: Work from office
Benefits: Benefits at Ideagen
DEI: DEI strategy
Salary: this will be discussed at the next stage of the process; if you do have any questions, please feel free to reach out!

We are seeking a Technical Business Analyst who will play a crucial role in ensuring smooth and efficient data migration and integration between diverse systems with varying architectures, databases, and APIs. This role is primarily responsible for translating complex business requirements into actionable specifications for data engineers to build and implement data pipelines.

Responsibilities:
- Conduct thorough business analysis of source and target systems involved in data migrations and integrations.
- Develop a deep understanding of the functional and technical aspects of both systems, including their operational workflows and data structures.
- Identify and document system modules and the corresponding relationships between the two systems.
- Prepare migration/integration scoping documents that outline the system objects to be migrated or integrated.
- Define and document detailed field-to-field data mapping for various objects, specifying how data fields from the source system map to the target system (a toy illustration follows this posting).
- Identify, analyze, and document migration criteria, considerations, limitations, and required data transformations.
- Collaborate with system owners, business stakeholders, and the data operations team to ensure migration requirements are fully captured and aligned with business objectives.
- Work closely with data engineers to facilitate automation of migration/integration processes.
- Support data validation and reconciliation efforts post-migration to ensure data accuracy and integrity.
- Maintain clear and structured documentation to support future migrations and integrations.

The ideal candidate will bridge the gap between business and technical teams, ensuring successful and seamless data transfers.

Competencies, Characteristics & Traits:
Mandatory experience: a minimum of 4 years of experience in preparing specifications and liaising on data engineering and data migration projects.
- Experience documenting technical requirements from business needs to assist data engineers in building pipelines.
- Good knowledge of data migration and engineering processes and concepts.
- Proficiency in SQL and data analysis tools.
- Understanding of cloud and on-premises database technologies and application services.
- Experience with agile project practices.
- Excellent written and verbal communication skills to effectively interact with both technical and non-technical stakeholders.
- Critical thinking and collaboration skills.
- Ability to analyze complex data issues, identify root causes, and propose solutions.

Skills and Experience:
Essential:
- Experience liaising on data engineering and data migration projects.
- Experience documenting technical requirements from business needs to assist data engineers in building pipelines.
- Proven experience working with relational databases (e.g., SQL Server, Oracle, MySQL), data structures, and APIs.
- Good knowledge of data migration and engineering processes and concepts.
- Experience with data modeling documentation and related tools.
- Proficiency in SQL and data analysis tools.
- Excellent written and verbal communication skills to effectively interact with both technical and non-technical stakeholders.

Desirable:
- Understanding of cloud and on-premises database technologies and application services.
- Experience with migration tools such as SnapLogic, Talend, Informatica, Fivetran, or similar.
- Industry-specific knowledge in Audit, Healthcare, and Aviation is a plus.
- Experience with agile project practices.
- Business Analysis certifications (CBAP, CCBA, PMI-PBA) are a plus.

About Ideagen:
Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help those people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including in America, Australia, Malaysia and India, with people doing lots of different and exciting jobs.

What is next?
If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!
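A toy illustration of the field-to-field mapping specification this role documents, and how a data engineer's pipeline might apply it; the field names and transforms are invented for the example.

from datetime import datetime

# Mapping spec: target field -> (source field, transform).
FIELD_MAP = {
    "customer_name": ("cust_nm", str.strip),
    "country_code":  ("country", lambda v: v.upper()[:2]),
    "signup_date":   ("created", lambda v: datetime.strptime(v, "%d/%m/%Y").date()),
}

def map_record(source: dict) -> dict:
    # Apply each documented mapping rule to one source record.
    return {tgt: fn(source[src]) for tgt, (src, fn) in FIELD_MAP.items()}

print(map_record({"cust_nm": " Acme Ltd ", "country": "gb", "created": "01/02/2024"}))
# -> {'customer_name': 'Acme Ltd', 'country_code': 'GB', 'signup_date': date(2024, 2, 1)}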

Posted 9 hours ago

Apply

5.0 years

6 - 9 Lacs

Hyderabad

Remote

Job Description

Role Overview: A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and infrastructure that facilitate the collection, storage, and processing of large datasets. They collaborate with data scientists and analysts to ensure data is accessible, reliable, and optimized for analysis. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and managing databases and cloud-based systems. Data engineers play a crucial role in enabling data-driven decision-making and ensuring data quality across organizations.

What will you do in this role:
- Develop comprehensive high-level technical design and data mapping documents to meet specific business integration requirements.
- Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
- Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
- Define and implement robust test strategies and test plans, ensuring end-to-end accountability for middleware testing and evidence management.
- Collaborate with the solutions architecture and business analyst teams to analyze system requirements and prototype innovative integration methods.
- Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
- Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
- Work closely with various platforms and competencies to enrich the purpose of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
- Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
- Analyze data requirements and translate them into technical specifications for ETL processes.
- Develop and maintain ETL workflows, ensuring optimal performance and error-handling mechanisms are in place (an orchestration sketch follows this posting).
- Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
- Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
- Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.

What should you have:
- Bachelor's degree in Information Technology, Computer Science, or any technology stream.
- 5+ years of working experience with enterprise data integration technologies: Informatica PowerCenter and Informatica Intelligent Data Management Cloud services (CDI, CAI, Mass Ingest, Orchestration).
- Integration experience utilizing REST and custom API integration.
- Experience with relational database technologies and cloud data stores from AWS, GCP, and Azure.
- Experience utilizing the AWS Well-Architected Framework, deployment and integration, and data engineering.
- Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, Artifactory, etc.
- Proven expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency.
- Extensive experience in the design of reusable integration patterns using cloud-native technologies.
- Extensive experience with process orchestration and scheduling integration jobs in AutoSys or Airflow.
- Experience in Agile development methodologies and release management techniques.
- Excellent analytical and problem-solving skills.
- Good understanding of data modeling and data architecture principles.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully:
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business, Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R353285
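A minimal Airflow DAG sketching the ingest-then-validate orchestration style the posting describes. The posting names Informatica IDMC and AutoSys/Airflow; this sketch assumes Airflow 2.x, and the REST endpoint and file paths are placeholders.

import json
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest(**_):
    # Pull one page from a (placeholder) REST source and land it as JSON.
    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    with open("/tmp/orders.json", "w") as f:
        json.dump(resp.json(), f)

def validate(**_):
    # Basic data-quality gate before any downstream loading step.
    with open("/tmp/orders.json") as f:
        rows = json.load(f)
    assert rows, "no records ingested"

with DAG("rest_ingest", start_date=datetime(2025, 1, 1),
         schedule="@daily", catchup=False) as dag:
    PythonOperator(task_id="ingest", python_callable=ingest) >> \
        PythonOperator(task_id="validate", python_callable=validate)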

Posted 9 hours ago

Apply

0 years

4 - 6 Lacs

Hyderabad

On-site

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world’s leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career.Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Customer and Employee experience. About the role: In this opportunity as Application Support Analyst, you will: Experience on Informatica support. The engineer will be responsible for supporting Informatica Development, Extractions, and loading. Fixing the data discrepancies and take care of performance monitoring. Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes. Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs. Thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, operational health management. Experience in supporting applications built on modern application architecture and cloud infrastructure, Informatica PowerCenter/IDQ, Javascript frameworks and Libraries, HTML/CSS/JS, Node.JS, TypeScript, jQuery, Docker, AWS/Azure. About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes: 3 to 8+ experienced Informatica Developer and Support will be responsible for implementation of ETL methodology in Data Extraction, Transformation and Loading. Have Knowledge in ETL Design of new or changing mappings and workflows with the team and prepares technical specifications. Should have experience in creating ETL Mappings, Mapplets, Workflows, Worklets using Informatica PowerCenter 10.x and prepare corresponding documentation. Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.). Should be able to perform source system analysis as required. Works with DBAs and Data Architects to plan and implement appropriate data partitioning strategy in Enterprise Data Warehouse. Implements versioning of the ETL repository and supporting code as necessary. Develops stored procedures, database triggers and SQL queries where needed. Implements best practices and tunes SQL code for optimization. Loads data from SF Power Exchange to Relational database using Informatica. Works with XML's, XML parser, Java and HTTP transformation within Informatica. Experience in Integration of various data sources like Oracle, SQL Server, DB2 and Flat Files in various formats like fixed width, CSV, Salesforce and excel Manage. Have in depth knowledge and experience in implementing the best practices for design and development of data warehouses using Star schema & Snowflake schema design concepts. 
Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
Carried out support and development activities in a relational database environment; designed tables, procedures/functions, packages, triggers, and views in relational databases; and used SQL proficiently in database programming, including SNFL (Snowflake).

What’s in it For You?
Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 9 hours ago

Apply

6.0 years

10 Lacs

Hyderābād

On-site

Experience: 6+ years
Work Mode: Hybrid

Job Summary: We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and Business Intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in the manufacturing, automotive, transportation, and engineering domains.

Key Responsibilities:
Design, develop, and maintain ETL workflows using Informatica PowerCenter.
Troubleshoot and optimize ETL jobs for performance and reliability.
Analyze complex data sets and write advanced SQL queries for data validation and transformation.
Collaborate with data architects and business analysts to implement data warehousing solutions.
Apply SDLC methodologies throughout the ETL development lifecycle.
Support production environments by identifying and resolving data and performance issues.
Work with Unix shell scripting for job automation and scheduling.
Contribute to the design of technical architectures that support digital transformation.

Required Skills:
3–5 years of hands-on experience with Informatica PowerCenter.
Proficiency in SQL and familiarity with NoSQL platforms.
Experience in ETL performance tuning and troubleshooting.
Solid understanding of Unix/Linux environments and scripting.
Excellent verbal and written communication skills.

Preferred Qualifications:
AWS certification or experience with cloud-based data integration is a plus.
Exposure to data modeling and data governance practices.

Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): What is your current CTC? What is your expected CTC? What is your current location? What is your notice period/LWD? Are you comfortable attending an L2 face-to-face interview in Hyderabad?
Experience: Informatica PowerCenter: 5 years (Required); total work: 6 years (Required)
Work Location: In person
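For illustration of the "advanced SQL for data validation" responsibility above, reconciliation checks between a staging table and its warehouse target often look like the following sketch; the table and column names are assumptions, and sqlite3 stands in for whatever relational database is actually in use:

```python
import sqlite3  # placeholder for any relational source/target connection

# Compare row counts and a numeric checksum between staging and target
# for one load date; a mismatch means the load needs investigation.
CHECK_SQL = """
SELECT
    (SELECT COUNT(*)                 FROM stg_orders WHERE load_date = :d) AS stg_rows,
    (SELECT COUNT(*)                 FROM dw_orders  WHERE load_date = :d) AS dw_rows,
    (SELECT COALESCE(SUM(amount), 0) FROM stg_orders WHERE load_date = :d) AS stg_amount,
    (SELECT COALESCE(SUM(amount), 0) FROM dw_orders  WHERE load_date = :d) AS dw_amount
"""

def validate(conn: sqlite3.Connection, load_date: str) -> None:
    """Raise if staging and target disagree for the given load date."""
    stg_rows, dw_rows, stg_amt, dw_amt = conn.execute(
        CHECK_SQL, {"d": load_date}
    ).fetchone()
    if stg_rows != dw_rows or stg_amt != dw_amt:
        raise ValueError(
            f"Reconciliation failed for {load_date}: "
            f"rows {stg_rows}->{dw_rows}, amount {stg_amt}->{dw_amt}"
        )
```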

Posted 9 hours ago

Apply

7.0 years

7 - 7 Lacs

Gurgaon

On-site

Engineer III, Database Engineering
Gurgaon, India; Hyderabad, India
Information Technology
316332

Job Description
About The Role: Grade Level (for internal use): 10

Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P Capital IQ Pro to serve up value-added ratings, research, and related information to institutional clients.

The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format that can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools such as Oracle, SQL, .NET, Informatica, Kafka, and Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?

Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
Developing innovative functionality in existing and new products
Supporting and maintaining high-revenue productionized products
Achieving the above intelligently and economically using best practices

Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated, and collaborate with developers, business analysts, and product managers who are experts in their domains.

Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
Complete SDLC: architecture, design, development, and support of tech solutions
Play a key role in the development team to build high-quality, high-performance, scalable code
Engineer components and common services based on standard corporate development models, languages, and tools
Produce technical design documents and conduct technical walkthroughs
Collaborate effectively with technical and non-technical stakeholders
Be part of a culture of continuously improving the technical design and code base
Document and demonstrate solutions using technical design docs, diagrams, and stubbed code

Our Hiring Manager says: "I’m looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company."

Qualifications Required:
Bachelor’s degree in Computer Science, Information Systems, or Engineering.
7+ years of experience with transactional databases like SQL Server, Oracle, and PostgreSQL, and NoSQL databases like Amazon DynamoDB and MongoDB
Strong database development skills on SQL Server and Oracle
Strong knowledge of database architecture, data modeling, and data warehousing
Knowledge of object-oriented design and design patterns
Familiarity with various design and architectural patterns
Strong development experience with Microsoft SQL Server
Experience in cloud-native development and AWS is a big plus
Experience with Kafka/Sonic broker messaging systems

Nice to have:
Experience in developing data pipelines using Java or C# is a significant advantage
Strong knowledge of ETL tools such as Informatica and SSIS; exposure to Informatica is an advantage
Familiarity with Agile and Scrum models
Working knowledge of VSTS
Working knowledge of the AWS cloud is an added advantage
Understanding of fundamental design principles for building a scalable system
Understanding of financial markets and asset classes like equity, commodity, fixed income, options, and indices/benchmarks is desirable
Additionally, experience with Scala, Python, and Spark applications is a plus

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India
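Since the role above lists Kafka/Sonic messaging alongside database work, here is a minimal Kafka producer sketch using the confluent-kafka Python client; the broker address, topic, and payload are illustrative assumptions, not details from the posting:

```python
from confluent_kafka import Producer

# Illustrative broker config; the address is an assumption.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Called once per message after the broker acknowledges (or rejects) it.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()}[{msg.partition()}] @ {msg.offset()}")

# Publish one change event, keyed by entity id so updates for the same
# entity land on the same partition (preserving per-entity ordering).
producer.produce(
    "ratings-updates",                               # assumed topic name
    key="entity-42",
    value='{"rating": "AA-", "asOf": "2025-06-16"}',
    callback=on_delivery,
)
producer.flush()  # block until all queued messages are delivered
```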

Posted 9 hours ago

Apply

8.0 years

20 - 28 Lacs

Gurgaon

On-site

Job Title: Tableau Developer
Location: Gurgaon (Work From Office)
Job Type: Full-Time Role
Experience Level: 8-12 Years

Job Summary: We are seeking a talented Tableau Developer to join our Business Intelligence and Analytics team. The ideal candidate will be responsible for designing, developing, and maintaining visually compelling and insightful dashboards and reports using Tableau. You will work closely with business stakeholders to understand requirements, translate data into actionable insights, and support data-driven decision-making.

Key Responsibilities:
Design and develop interactive Tableau dashboards, visualizations, and reports based on business needs.
Collaborate with business analysts, data engineers, and stakeholders to gather requirements and define KPIs.
Optimize dashboard performance and usability.
Write complex SQL queries to extract and transform data from various sources (e.g., SQL Server, Oracle, Snowflake).
Conduct data validation and ensure data quality and accuracy.
Schedule and publish dashboards to Tableau Server / Tableau Online for end-user access.
Provide training, documentation, and support to business users.

Required Skills and Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Statistics, or a related field.
8-12+ years of hands-on experience with Tableau Desktop and Tableau Server.
Proficiency in SQL for data manipulation and analysis.
Strong understanding of data warehousing concepts and relational databases.
Ability to analyze large datasets and turn them into meaningful visual insights.
Experience with data blending, LOD (Level of Detail) expressions, filters, parameters, and calculated fields in Tableau.

Preferred Qualifications:
Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
Knowledge of ETL tools (e.g., Alteryx, Talend, Informatica) or scripting languages (Python, R).
Understanding of data governance and security principles.
Tableau certification (Desktop Specialist, Certified Associate, etc.) is a plus.
Exposure to Agile methodologies.

Job Type: Full-time
Pay: ₹2,000,000.00 - ₹2,800,000.00 per year
Work Location: In person
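For readers new to the LOD expressions this posting asks for: a Tableau FIXED LOD computes an aggregate at a chosen grain and joins it back to every row. The same shape in pandas, with assumed column names, looks like this sketch:

```python
import pandas as pd

# Illustrative order-level data; column names are assumptions.
orders = pd.DataFrame({
    "customer": ["A", "A", "B", "B", "B"],
    "sales": [100, 250, 80, 120, 50],
})

# Equivalent of the Tableau calc { FIXED [Customer] : SUM([Sales]) }:
# aggregate at customer grain, broadcast back to every row.
orders["customer_total"] = orders.groupby("customer")["sales"].transform("sum")

# Share of each order within its customer's total, a common LOD use case.
orders["pct_of_customer"] = orders["sales"] / orders["customer_total"]
print(orders)
```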

Posted 9 hours ago

Apply

4.0 - 6.0 years

0 Lacs

Bhubaneshwar

On-site

Position: Data Migration Engineer (NV46FCT RM 3324)

Required Qualifications:
4–6 years of experience in data migration, data integration, and ETL development.
Hands-on experience with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
Experience in Google BigQuery for data ingestion, transformation, and performance optimization.
Proficiency in SQL and scripting languages such as Python or Shell for custom ETL logic.
Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue.
Experience working in cloud environments such as AWS, GCP, or Azure.
Solid understanding of data modeling, schema design, and transformation best practices.

Preferred Qualifications:
Experience in BigQuery optimization, federated queries, and integration with external data sources.
Exposure to data warehouses and lakes such as Redshift, Snowflake, or BigQuery.
Experience with streaming data ingestion tools like Kafka, Debezium, or Google Dataflow.
Familiarity with workflow orchestration tools such as Apache Airflow or dbt.
Knowledge of data security, masking, encryption, and compliance requirements in migration scenarios.

Soft Skills:
Strong problem-solving and analytical mindset with high attention to data quality.
Excellent communication and collaboration skills to work with engineering and client teams.
Ability to handle complex migrations under tight deadlines with minimal supervision.

Job Category: Digital_Cloud_Web Technologies
Job Type: Full Time
Job Location: Bhubaneshwar, Noida
Experience: 4-6 years
Notice period: 0-30 days
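A minimal sketch of the BigQuery ingestion work described above, using the google-cloud-bigquery client; the project, bucket path, and table id are assumptions, not from the posting:

```python
from google.cloud import bigquery

# Illustrative project; requires Google Cloud credentials to be configured.
client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # header row
    autodetect=True,       # infer schema; prefer an explicit schema in production
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load a CSV export landed in GCS into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders_2025-06-16.csv",
    "my-project.staging.orders",
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on failure
print(f"Loaded {client.get_table('my-project.staging.orders').num_rows} rows")
```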

Posted 9 hours ago

Apply

5.0 years

0 Lacs

Orissa

Remote

No. of Positions: 1
Position: Lead Data Engineer
Location: Hybrid or Remote
Total Years of Experience: 5+ years

Key Responsibilities:
Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers that use platforms such as Azure, Salesforce, and AWS technologies.
Monitor active ETL jobs in production.
Build out data lineage artifacts to ensure all current and future systems are properly documented.
Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
Discover efficiencies in shared data processes and batch schedules to help ensure no redundancy and smooth operations.
Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs.
Hands-on experience in developing and implementing large-scale data warehouses and Business Intelligence and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:
This job has no supervisory responsibilities.
Bachelor’s Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field, AND 6+ years’ experience in business analytics, data science, software development, data modeling, or data engineering work.
5+ years’ experience with strong SQL query/development skills.
Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
Experience working in the healthcare industry with PHI/PII.
Creative, lateral, and critical thinker. Excellent communicator. Well-developed interpersonal skills. Good at prioritizing tasks and time management. Ability to describe, create, and implement new solutions.
Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
Knowledge of and hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau).

Don’t see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.

Posted 9 hours ago

Apply

8.0 years

0 Lacs

Orissa

Remote

No. of Positions: 1
Position: Data Integration Technical Lead
Location: Hybrid or Remote
Total Years of Experience: 8+ years

Experience:
8+ years of experience in data integration, cloud technologies, and API-based integrations.
At least 3 years in a technical leadership role overseeing integration projects.
Proven experience in integrating cloud-based systems, on-premise systems, databases, and legacy platforms.
Informatica Cloud (IICS) or MuleSoft certifications are preferable.

Technical Expertise:
Expertise in designing and implementing integration workflows using IICS, MuleSoft, or other integration platforms.
Proficient in integrating cloud and on-premise systems, databases, and legacy platforms using API integrations, REST/SOAP, and middleware tools.
Strong knowledge of Salesforce CRM, Microsoft Dynamics CRM, and other enterprise systems for integration.
Experience in creating scalable, secure, and high-performance data integration solutions.
Deep understanding of data modelling, transformation, and normalization techniques for integrations.
Strong experience in troubleshooting and resolving integration issues.

Key Responsibilities:
Work with architects and client stakeholders to design data integration solutions that align with business needs and industry best practices.
Lead the design and implementation of data integration pipelines, frameworks, and cloud integrations.
Lead and mentor a team of data integration professionals, conducting code reviews and ensuring high-quality deliverables.
Design and implement integrations with external systems using APIs, middleware, and cloud services.
Develop data transformation workflows and custom scripts to integrate data between systems.
Stay updated on new integration technologies and recommend improvements as necessary.
Excellent verbal and written communication skills to engage with both technical and non-technical stakeholders; proven ability to explain complex technical concepts clearly and concisely.

Don’t see a role that fits? We are growing rapidly and always on the lookout for passionate and smart engineers! If you are passionate about your career, reach out to us at careers@hashagile.com.
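A simplified sketch of the REST-based integration pattern this role centers on, using the requests library; the endpoints, auth scheme, and field names are assumptions for illustration only:

```python
import requests

# Illustrative endpoints: not from the posting.
SOURCE_URL = "https://source.example.com/api/v1/accounts"
TARGET_URL = "https://target.example.com/api/v1/customers"

def sync_accounts(api_token: str) -> int:
    """Pull accounts from the source API, reshape, and push to the target."""
    headers = {"Authorization": f"Bearer {api_token}"}
    resp = requests.get(SOURCE_URL, headers=headers, timeout=30)
    resp.raise_for_status()

    synced = 0
    for account in resp.json():
        # Minimal field mapping/normalization between the two systems.
        payload = {
            "externalId": account["id"],
            "name": account["name"].strip().title(),
        }
        post = requests.post(TARGET_URL, json=payload, headers=headers, timeout=30)
        post.raise_for_status()
        synced += 1
    return synced
```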

Posted 9 hours ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction
Work with Match360, Publisher, and watsonx integrations to modernize MDM workloads. Drive architectural decisions and ensure alignment with product roadmaps and enterprise standards.

Secondary: Informatica MDM (Desirable Skillset)
Understand key concepts of Informatica MDM, including: landing, staging, and base objects; trust and match rules; hierarchy configuration, E360 views, and SIF/REST API integrations. Support data ingestion processes (batch and real-time), transformation, and cleansing routines via IDQ and Java-based user exits. Provide insights and inputs to help us strategically position IBM MDM against Informatica, shaping unique assets and accelerators.

Cross-Functional and Strategic Responsibilities
Collaborate with data governance and business teams to implement DQ rules, lineage, and business glossaries. Mentor junior developers; participate in design/code reviews and knowledge-sharing sessions. Create and maintain documentation: architecture diagrams, integration blueprints, solution specs. Stay current with modern MDM practices, AI/ML in data mastering, and cloud-first platforms (e.g., CP4D, IICS, Snowflake, Databricks). Experience with other database platforms and technologies (e.g., DB2, Oracle, SQL Server). Experience with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools. Knowledge of database regulatory compliance requirements (e.g., GDPR, HIPAA).

Your Role And Responsibilities
We are seeking an experienced and self-driven Senior MDM Consultant to design, develop, and maintain enterprise-grade Master Data Management solutions, with a primary focus on IBM MDM and foundational knowledge of Informatica MDM. This role will play a key part in advancing our data governance, quality, and integration strategies across the customer, product, and party domains. Experience with IBM DataStage, Knowledge Catalog, Cloud Pak for Data, and Manta is important. You will work closely with cross-functional teams, including Data Governance, Source System Owners, and Business Data Stewards, to implement robust MDM solutions that ensure the consistency, accuracy, and trustworthiness of enterprise data.

Strong Hands-on Experience With:
Informatica MDM 10.x, IDQ, and Java-based user exits
MDM components: base/landing/staging tables, relationships, mappings, hierarchy, E360
Informatica PowerCenter, IICS, or similar ETL tools
REST APIs, SOA, event-based integrations, and SQL/RDBMS
IBM MDM core knowledge in matching, stewardship UI, workflows, and metadata management
Excellent understanding of data architecture, governance, data supply chain, and lifecycle management
Strong communication, documentation, and stakeholder management skills
Experience with cloud MDM/SaaS solutions and DevOps automation for MDM deployments
Knowledge of BAW, Consent Management, and Account & Macro Role configuration

Preferred Education: Bachelor's Degree
Required Technical And Professional Expertise: as described above.
Preferred Technical And Professional Experience
Other required skills: IBM DataStage, Knowledge Catalog, Cloud Pak for Data, Manta

Posted 10 hours ago

Apply

15.0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction
Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best: running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators, always willing to help and be helped, as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving with issue- and hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success, with both your career and solving clients’ business challenges, this role is for you. To help achieve this win-win outcome, a ‘day in the life’ of this opportunity may include, but not be limited to:

Solving Client Challenges Effectively: Understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations.
Technical Solution Workshops: Conducting and participating in technical solution workshops.
Building Effective Relationships: Developing successful relationships at all levels, from engineers to CxOs, with experience of navigating challenging debate to reach healthy resolutions.
Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions.
Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Preferred Education: Bachelor's Degree

Required Technical And Professional Expertise
In-depth knowledge of the IBM Data & AI portfolio.
15+ years of experience in software services.
10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms.
Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines.
10+ years’ experience with ETL and database technologies.
Experience in architectural planning and implementation for the upgrade/migration of these specific products.
Experience in designing and implementing data quality solutions.
Experience with installation and administration of these products.
Excellent understanding of cloud concepts and infrastructure.
Excellent verbal and written communication skills are essential.

Preferred Technical And Professional Experience
Experience with any of the DataStage, Informatica, SAS, or Talend products.
Experience with any of IKC, IGC, or Axon.
Experience with programming languages like Java/Python.
Experience in AWS, Azure, Google, or IBM cloud platforms.
Experience with Red Hat OpenShift.
Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA.

Posted 10 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Hi All, we are hiring for our investment banking client in Mumbai (Powai) location.
Location: Mumbai (locals only)
Experience: 4-8 years
Budget: Open, competitive market rate
Interview Mode: 1st round virtual; 2nd/3rd rounds compulsorily face-to-face; there may be more than 3 rounds.
Required Details: Total Experience; Relevant Experience; Current Company; Current Designation; Current CTC; Expected CTC; Notice Period; Current Location; Expected Location; Offer in Hand; Reason for Job Change; Degree; CGPA; Passed-Out Year.

JD: Requirements (indicate mandatory and/or preferred):
Mandatory:
Must have extensive development experience in Informatica.
Sound knowledge of transformations, mappings, and workflows.
Good knowledge of relational databases (MSSQL/Oracle) and SQL.
Good knowledge of Oracle/SQL stored procedures, packages, and functions.
Good knowledge of Unix shell scripting.
Good communication skills; must be able to interact at all levels on a wide range of issues.
Must adapt to dynamic business requirements that alter project flows; flexible about change and able to multi-task.
Hard-working and self-motivated.
Preferred:
Investment banking domain knowledge.
Proactive and willing to learn.
Knowledge of Autosys.
Knowledge of Python.

Posted 10 hours ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
Execute the business analytics agenda in conjunction with analytics team leaders
Work with best-in-class external partners who leverage analytics tools and processes
Use models/algorithms to uncover signals/patterns and trends to drive long-term business performance
Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
Using data analysis to make recommendations to analytics leaders
Understanding of best-in-class analytics practices
Knowledge of key performance indicators (KPIs) and scorecards
Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus

In This Role
As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices to remain current in the field.

Technical Requirements:
Programming: Python, PySpark, Go/Java
Database: SQL, PL/SQL
ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
Data Warehousing: SCD, schema types, data marts
Visualization: Databricks Notebook, Power BI, Tableau, Looker
GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
Supporting Technologies: Graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow
Experience with the RGM.ai product would be an added advantage.

Soft Skills:
Problem-Solving: The ability to identify and solve complex data-related challenges.
Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
Analytical Thinking: The capacity to analyse data and draw meaningful insights.
Attention to Detail: Meticulousness in data preparation and pipeline development.
Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
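A minimal PySpark pipeline in the shape this role describes (extract, cleanse with a simple quality gate, load partitioned Parquet); the paths and column names are illustrative assumptions:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daas-orders-pipeline").getOrCreate()

# Extract: read raw CSV landed in the lake (path is an assumption).
raw = spark.read.option("header", True).csv("s3a://lake/raw/orders/")

# Transform: basic cleansing plus a data-quality gate before loading.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
)
bad_rows = raw.count() - clean.count()
if bad_rows > 0:
    print(f"Dropped {bad_rows} rows failing quality checks")

# Load: write partitioned Parquet for efficient downstream retrieval.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://lake/curated/orders/"
)
spark.stop()
```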

Posted 10 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


The ideal candidate will be responsible for designing, developing, and deploying scalable ETL processes using Informatica PowerCenter to support our data warehousing and analytics initiatives. You will collaborate with business and technical stakeholders to ensure high data quality, availability, and performance.

Key Responsibilities:
Design, develop, and maintain ETL workflows and mappings using Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS).
Extract, transform, and load data from various source systems (e.g., SQL Server, Oracle, flat files, cloud APIs) into data warehouses or operational data stores.
Optimize ETL performance, conduct tuning, and ensure error handling and logging.
Collaborate with data architects and analysts to understand data requirements and deliver high-quality data solutions.
Work with QA teams to support data validation and testing efforts.
Support data integration, migration, and transformation initiatives.
Document ETL processes, data flows, and job schedules.
Monitor daily ETL jobs and resolve production issues in a timely manner.

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience).
3+ years of experience with Informatica PowerCenter or Informatica IICS.
Strong SQL skills and experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL).
Solid understanding of data warehousing concepts and dimensional modeling.
Experience in performance tuning and troubleshooting ETL processes.
Hands-on experience with job scheduling tools (e.g., Autosys, Control-M, Tidal).
Familiarity with version control systems and DevOps practices.

Preferred Qualifications:
Experience with cloud data platforms (e.g., Snowflake, AWS Redshift, Azure Synapse).
Exposure to data governance and data quality tools.
Knowledge of scripting languages (e.g., Shell, Python).
Experience working in Agile/Scrum environments.
Familiarity with BI tools (e.g., Tableau, Power BI) is a plus.

Benefits
This position comes with a competitive compensation and benefits package:
Competitive salary and performance-based bonuses
Comprehensive benefits package
Home-office model
Career development and training opportunities
Flexible work arrangements (remote and/or office-based)
Dynamic and inclusive work culture within a globally known group
Private health insurance
Pension plan
Paid time off
Training & development
Note: Benefits differ based on employee level.
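The monitoring and production-support duties above usually reduce to wrapping each ETL step in logging and bounded retries so the scheduler sees a clean success or failure; a generic sketch, with the job name and step function as placeholders:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retries(step_name, step_fn, max_attempts=3, backoff_seconds=60):
    """Run one ETL step, logging failures and retrying with a fixed backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            log.info("Starting %s (attempt %d/%d)", step_name, attempt, max_attempts)
            result = step_fn()
            log.info("Finished %s", step_name)
            return result
        except Exception:
            log.exception("%s failed on attempt %d", step_name, attempt)
            if attempt == max_attempts:
                raise  # surface the failure to the scheduler (Autosys/Control-M/Tidal)
            time.sleep(backoff_seconds)

# Example usage with a placeholder step:
if __name__ == "__main__":
    run_with_retries("load_customers", lambda: print("loading..."))
```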

Posted 11 hours ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


To lead the India-based team responsible for driving foundational data capabilities, supporting enterprise data governance, and ensuring high-quality data assets across Orange Business. This role is critical in operationalizing our data strategy and supporting data enablement across our business domains.

Data Management Operations: Ensure operational execution of core data quality management activities, including metadata management, data lineage, data observability, data quality monitoring, and data cataloging.
Team Leadership: Lead and develop a team of data quality experts in charge of data quality remediation, monitoring, and performance. Provide mentorship, guidance, and performance management, and ensure new recruitment and low turnover.
Collaboration and Support: Partner closely with data owners, data stewards, and business units to support the implementation of data governance policies, data quality rules, and standards.
Tools & Technologies: Administer and optimize enterprise data management tools (e.g., Collibra, Informatica, AccelData, or equivalent). Ensure proper onboarding, user training, and operational support.
Governance Alignment: Ensure alignment with the Orange Business Data Governance framework, providing execution support to Data Councils, Domain Owners, and Data Protection teams.
Reporting & Metrics: Develop KPIs and dashboards to measure progress on data management maturity, quality improvement, and usage of data assets.
Innovation & Best Practices: Promote continuous improvement, automation, and adoption of best practices in data management processes and tooling.
People Development: Support team skill development and ensure knowledge sharing within the team.

Knowledge and abilities:
Ability to understand the complexities of the Orange Business data management landscape.
Strong understanding of data governance principles and regulatory frameworks (GDPR, etc.).
An agile way of working with a can-do approach.
Expertise in metadata management, data cataloging, data quality, and master data processes would be a plus.
Hands-on experience with enterprise data governance or data management platforms (Collibra, Informatica, Talend, Atlan, etc.).
Excellent communication and stakeholder management skills.

Education, qualifications, and certifications:
Bachelor's or Master’s degree in Computer Science, Information Systems, Data Management, or a related field.
Other professional certifications such as SCRUM, ITIL, PMP, or SAFe will be an advantage.

Experience:
A minimum of 8 years of experience in data management, with at least 3 years in team leadership or managerial roles.
Experience working in a global matrix environment is a strong plus.
Knowledge of the telecom or B2B services sector is desirable.
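The data-quality KPIs mentioned above are often as simple as per-column completeness and a duplicate rate; an illustrative pandas sketch, with assumed column names:

```python
import pandas as pd

def quality_kpis(df: pd.DataFrame, key_column: str) -> pd.DataFrame:
    """Compute simple data-quality KPIs: completeness per column plus a duplicate rate."""
    completeness = df.notna().mean().rename("completeness")
    kpis = completeness.to_frame()
    # Share of rows whose key duplicates an earlier row.
    kpis.loc["__duplicate_rate__", "completeness"] = (
        df.duplicated(subset=key_column).mean()
    )
    return kpis

# Illustrative customer records; 'customer_id' is an assumed key column.
records = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "country": ["FR", "IN", "IN", None],
})
print(quality_kpis(records, key_column="customer_id"))
```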

Posted 11 hours ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:
  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:
  1. Junior Developer
  2. Informatica Developer
  3. Senior Developer
  4. Informatica Tech Lead
  5. Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:
  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced; a worked sketch follows this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
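On the slowly changing dimensions question above: Informatica implements SCD type-2 through mappings, but the effective logic is "expire the current row, insert the new version". A runnable sketch of that logic, with an illustrative table and columns:

```python
import sqlite3
from datetime import date

# Minimal type-2 SCD illustration: close out the current row, insert the
# new version so full history is preserved. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,      -- NULL marks the current version
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES (42, 'Pune', '2024-01-01', NULL, 1);
""")

def apply_scd2(conn, customer_id, new_city, change_date):
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if cur and cur[0] != new_city:
        # Expire the old version...
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (change_date, customer_id),
        )
        # ...and insert the new one.
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date),
        )

apply_scd2(conn, 42, "Hyderabad", str(date.today()))
for row in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)
```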

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies