Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
12.0 - 15.0 years
30 - 35 Lacs
Kochi, Thrissur, Kozhikode
Work from Office
The Development Data Enablement team is responsible for implementing data management solutions and driving adoption of data management strategies across Drug Development line functions, with particular emphasis on delivering Data Engineering, Integration, Master Data Management, Metadata Management, and Data Quality initiatives. The team drives adoption of Enterprise Data Management principles and guidelines across Development systems to improve data maturity and reliability, in compliance with Data Governance, Data Management Framework, and Data Architecture standards.

In this role, your principal responsibility is to adopt and implement the data architecture across diverse data management initiatives. This demands close collaboration with a wide range of team members including stakeholders, data leaders, SMEs, architects, modelers, and data engineers. The ideal candidate possesses a robust background in data architecture. The role reports to the Data Delivery Lead in Data, Analytics and DS&AI for the Development domain.

Key Responsibilities:
As a Data Architect, your main responsibility is to apply your deep expertise in defining how data will be stored, consumed, integrated, and managed by different entities and IT systems.
- Contribute to the creation and adoption of the data strategy in close cooperation with Enterprise Data Owners and Enterprise Data Architects.
- Define function-specific data by defining and documenting conceptual and logical models, establishing business process definitions of the data, and using taxonomies to establish relationships between the data.
- Contribute to the creation and documentation of enterprise conceptual and logical models while ensuring solutions are aligned and documented according to Novartis data architecture principles.
- Collaborate closely with domain and solution architects to design solution architectures that comply with all Novartis standards.
- Define and track key performance measures related to data architecture deliverables.
- Define and maintain the data landscape/data flows for the respective domain.
- Own the data architecture review as part of demand management, including deliverables such as Data Flow Design, Data Landscape, and Data Model.
- Articulate solutions and recommendations to business users.
- Provide pathways to manage data effectively for analytical uses.
- Coordinate, prioritize, and efficiently allocate team resources to critical initiatives: plan resources proactively, anticipate and actively manage change, set stakeholder expectations as required, identify operational risks and enable the team to drive issues to resolution, balance multiple priorities, and minimize surprise escalations.
- Collaborate with internal stakeholders, external partners and institutions, and cross-functional teams to solve critical business problems and propose operational efficiencies and innovative approaches.
- Ensure exemplary communication with all stakeholders, including senior business leaders.
- Ensure on-time, within-budget, compliant, secure, and quality delivery of the portfolio of projects in Data, Analytics and DS&AI.
- Assist project IT teams during the full lifecycle of execution, including gap assessments, cost estimations, data architecture, and execution.
- Work with senior stakeholders in business, operations, and technology to implement and adopt the data management roadmap.

Education & Qualifications:
Bachelor's degree in a computer science, engineering, or information technology discipline; an advanced degree and related accreditations are a plus.

Experience:
- A seasoned professional with 12+ years of total work experience, of which 8+ years are dedicated to data architecture design, creation, and management, leading data architecture initiatives.
- Deep understanding of and significant experience with data modeling, metadata technologies, data governance tools, and data management principles.
- Equipped to handle complex enterprise-level data architecture design and maintenance.
- Demonstrable experience in aligning data architecture with business strategy, effectively communicating technical concepts to non-technical stakeholders, and leading cross-functional teams on data architecture projects.
- Proficiency in managing data governance processes and ensuring adherence to data regulations and privacy standards.
- Solid problem-solving skills in addressing data architecture challenges and performance issues, as well as the ability to fine-tune data models for scalability.
- Adept at working with data governance frameworks such as DCAM and DAMA-DMBOK, and at implementing their principles in existing structures.
- 8+ years of relevant experience in data management.
- 5+ years of relevant experience in Pharma Life Sciences.

Technical / Functional Skills & Knowledge:
- Key skills include the ability to implement a data strategy, maintain data governance frameworks, design and maintain data models, and apply data integration techniques and tools.
- Knowledge and experience of data security practices, big data and visualization technologies, data quality implementation tools, and ETL tools (Databricks) is expected of this role.
- Industry knowledge: deep understanding of the Pharma Life Sciences industry, its data needs, and regulatory requirements, along with a grasp of data governance frameworks like DCAM and DAMA-DMBOK.
- Demonstrated strong interpersonal skills, accountability, written and verbal communication skills, and time management aligned with Novartis Values & Behaviors; deep technical expertise and understanding of the business processes and systems.
- Customer orientation: proven ability to communicate across various internal and external stakeholders and to align with all levels of IT and business stakeholders.
Hands-on experience with data governance, data quality, and master/reference data tools such as Collibra, Informatica Data Quality, Reltio, TIBCO EBX, and SQL.
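The data quality tooling named above formalizes rule-based checks over enterprise data. As a minimal illustration only (the record layout, field names, and rules here are hypothetical, not from the posting), two of the most common rule types look like this in plain Python:

```python
# Sketch of two basic data-quality rules (completeness and uniqueness)
# of the kind tools like Informatica Data Quality operationalize at scale.

def check_completeness(records, field):
    """Return records where a mandatory field is missing or blank."""
    return [r for r in records if not str(r.get(field, "")).strip()]

def check_uniqueness(records, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in records:
        v = r.get(field)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical master-data records with two injected defects.
patients = [
    {"patient_id": "P001", "country": "IN"},
    {"patient_id": "P002", "country": ""},      # completeness violation
    {"patient_id": "P001", "country": "CH"},    # uniqueness violation
]

missing_country = check_completeness(patients, "country")
duplicate_ids = check_uniqueness(patients, "patient_id")
```

In a governance tool these rules would be declared as metadata and scheduled against sources, with violations routed to data stewards rather than returned as lists.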
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Pune
Work from Office
We are seeking an experienced ETL Architect to design, develop, and optimize data extraction, transformation, and loading (ETL) solutions, and to work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions. The person may lead technical squads. These solutions will be leveraged across Enterprise, Pharma, and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This role requires deep expertise in Python, AWS Cloud, and ETL tools to build and maintain scalable data pipelines and architectures. The ETL Architect will work closely with cross-functional teams to ensure efficient data integration, storage, and accessibility for business intelligence and analytics.

Key Responsibilities:
- ETL Design & Development: Architect and implement high-performance ETL pipelines using AWS cloud services, Snowflake, and ETL tools such as Talend, dbt, Informatica, ADF, etc.
- Data Architecture: Design and implement scalable, efficient, and cloud-native data architectures.
- Data Integration & Flow: Ensure seamless data integration across multiple source systems, leveraging AWS Glue, Snowflake, and other ETL tools.
- Performance Optimization: Monitor and tune ETL processes for performance, scalability, and cost-effectiveness.
- Governance & Security: Establish and enforce data quality, governance, and security standards for ETL processes.
- Collaboration: Work with data engineers, analysts, and business stakeholders to define data requirements and ensure effective solutions.
- Documentation & Best Practices: Maintain comprehensive documentation and promote best practices for ETL development and data transformation.
- Troubleshooting & Support: Diagnose and resolve performance issues, failures, and bottlenecks in ETL processes.

Required Qualifications
Education: Bachelor's or master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
Experience:
- 6+ years of experience in ETL development, with 3+ years in an ETL architecture role.
- Expertise in Snowflake or any MPP data warehouse (including Snowflake data modeling, optimization, and security best practices).
- Strong experience with AWS Cloud services, especially AWS Glue, AWS Lambda, S3, Redshift, and IAM, or Azure/GCP cloud services.
- Proficiency in ETL tools such as Informatica, Talend, Apache NiFi, SSIS, or DataStage.
- Strong SQL skills and experience with relational and NoSQL databases.
- Experience in API integrations.
- Proficiency in scripting languages (Python, Shell, PowerShell) for automation.
- Prior experience in the Pharmaceutical, Diagnostics, or healthcare domain is a plus.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and documentation skills.
- Ability to work collaboratively in a fast-paced, cloud-first environment.

Preferred Qualifications:
- Certifications in AWS, Snowflake, or ETL tools.
- Experience in real-time data streaming, microservices-based architectures, and DevOps for data pipelines.
- Knowledge of data governance, compliance (GDPR, HIPAA), and security best practices.
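The extract-transform-load pattern this role architects can be sketched in a few lines of plain Python. This is an illustrative toy only: the source rows, field names, and in-memory "warehouse" dict are stand-ins for the AWS Glue jobs and Snowflake targets the posting actually describes.

```python
# Minimal ETL sketch: extract raw rows, transform (normalize, cast, reject
# bad rows), and load (upsert by key). Real pipelines swap each stage for a
# connector (e.g., Glue source, Snowflake MERGE) but keep the same shape.

def extract(source_rows):
    """Extract: read raw rows from a source system (here, just a list)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize casing, cast types, drop invalid rows."""
    out = []
    for row in rows:
        try:
            out.append({
                "sku": row["sku"].strip().upper(),
                "qty": int(row["qty"]),
            })
        except (KeyError, ValueError):
            # In production this row would go to a reject/audit path.
            continue
    return out

def load(rows, warehouse):
    """Load: upsert rows into the target table keyed by sku."""
    for row in rows:
        warehouse[row["sku"]] = row
    return warehouse

raw = [{"sku": " abc1 ", "qty": "5"}, {"sku": "xyz9", "qty": "oops"}]
warehouse = load(transform(extract(raw)), {})
```

Keeping the three stages as separate functions mirrors how ETL tools separate mappings from connections, which is what makes pipelines testable and re-runnable.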
Posted 2 weeks ago
4.0 - 9.0 years
3 - 8 Lacs
Pune
Work from Office
Design, develop, and maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into EDW systems and the data lake. Optimize and troubleshoot complex SQL queries and ETL jobs to ensure efficient data processing and high performance. Technologies: SQL, Informatica PowerCenter, Talend, Big Data, Hive.
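A common instance of the "optimize complex SQL queries" duty above is adding an index so a selective lookup stops scanning the whole table. The sketch below uses SQLite (Python's stdlib) purely as a stand-in for an EDW; the table, column, and index names are made up for illustration.

```python
# Demonstrate query tuning with an index: inspect the query plan before
# and after, using SQLite's EXPLAIN QUERY PLAN.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, i % 100) for i in range(1000)],
)

def plan(sql):
    """Return the query-plan detail text for a statement."""
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM orders WHERE customer_id = 42"
before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX ix_orders_cust ON orders (customer_id)")
after = plan(query)    # with the index: an index search

count = conn.execute(query).fetchone()[0]
```

The same habit, reading the optimizer's plan before and after a change, carries over to Teradata's EXPLAIN or any EDW's plan viewer.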
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Divisional Overview
The Corporate Planning & Management (CPM) Division unifies Finance & Planning, Spend Management, Operational Risk and Resilience, and CPM Engineering teams to deliver business planning and analytics, expense management, third-party risk management, and governance strategies across the firm. CPM has five operating pillars.

Finance & Planning supports the execution of the firm's strategic objectives through management of the planning process, firmwide reporting, and analytics and insights into the firm's business plans and budgets. It develops a consistent framework for revenue division projections, creating transparency, accountability, and efficiency around projections. This pillar also includes the CF&O, EO, and Engineering divisional CFOs, who are strategic finance advisors helping the firm and the non-revenue divisions achieve commercial financial opportunities. Product Finance is responsible for the overall governance and proactive management of the firm's non-compensation expenses.

Spend Management encompasses the functions responsible for managing all aspects of the firm's spend with third parties, advising on commercial agreements and driving operating efficiency. Departments include Strategic Sourcing, Procure to Pay, Integrated Travel and Expense, Infrastructure and Transformation, and Sustainable Operations.

Operational Risk & Resilience drives firmwide Operational Risk programs along with second-line teams and implements required changes within CPM. The Corporate Insurance & Advisory team in this pillar identifies, procures, and manages corporate insurance needs for the firm and its investing businesses.

The CPM Engineering team provides engineering solutions that enable the firm to manage third-party spend, data and automation, plan budgets, forecast financial scenarios, allocate expenses, and support corporate decision making in line with the firm's strategic objectives.
Role Overview
Professionals in CPM have an analytical mindset, exhibit intellectual curiosity, and come from diverse academic backgrounds. This role sits within the Spend Management pillar. It requires regular collaboration with different functions across the firm, the ability to work independently, and the ability to interact with senior professionals across the firm. It also entails in-depth analysis and reporting for senior management, requiring diligence and a commercial mindset. The candidate is required to work closely with global counterparts and should have excellent verbal and written communication skills.

Job Responsibilities Will Include, But Are Not Limited To:
- Manage activities of the Business Intelligence (BI) team in the Procure to Pay department; the role requires experience in automation, data modelling, and analytics.
- Use digital analytics and automation tools to analyze data and identify patterns, trends, and anomalies that may indicate potential risks.
- Use predictive modeling techniques to forecast future events and potential impacts on the business.
- Collaborate with different functions and stakeholders to understand specific risk management and process automation needs, define key process metrics and key performance indicators, and build reporting solutions that equip businesses to take informed actions and decisions.
- Use BI tools to develop interactive dashboards.
- Implement systems for continuous monitoring of key indicators and goals.
- Administer and optimize databases to ensure efficient data storage, retrieval, and processing.
- Develop and maintain data models; create entity relationship diagrams for databases.
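The "identify anomalies" responsibility above often starts with something as simple as a z-score screen over a metric. A hedged sketch, with made-up spend figures and an illustrative threshold (BI platforms like Alteryx wrap equivalent logic in drag-and-drop tooling):

```python
# Flag points whose z-score magnitude exceeds a threshold.
import statistics

def flag_anomalies(values, threshold=3.0):
    """Return indices of values whose |z-score| exceeds the threshold."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / sd > threshold]

daily_spend = [100, 102, 98, 101, 99, 100, 500]  # one obvious outlier
anomalies = flag_anomalies(daily_spend, threshold=2.0)
```

In a continuous-monitoring setup, the same check runs on a schedule and routes flagged indices to a dashboard or alert rather than a return value.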
Qualifications
Academic Qualifications:
- A bachelor's or master's degree in analytics/data science or any math/statistics/quant background, or a bachelor's degree in Information Technology or Computer Science with proven experience in data engineering/business intelligence tools, especially a strong understanding of database concepts and data modeling.
- Minimum 4 years of experience as a Business Intelligence Specialist or Data Scientist.
- Certifications: relevant certifications in database technologies (e.g., Certified Data Professional), ETL tools (e.g., Alteryx Core and Advanced certified), and BI platforms (e.g., Tableau Desktop Specialist, RPA, etc.).
- Strong understanding of ETL processes, data integration, and data quality frameworks, and the ability to design and implement efficient data pipelines.
- Data science, analytics, and automation tools/platforms: Alteryx (preferred), RapidMiner, Informatica, Qlik Sense, AI, or similar.
- Familiarity with data warehousing concepts and experience in designing, building, and maintaining data warehouses to facilitate efficient querying and reporting. Experience optimizing data warehouse performance through indexing, partitioning, and other techniques.
- Data Quality: understanding of data quality best practices and experience implementing data governance principles.
- Data Security: experience implementing security measures to protect sensitive data and ensure compliance with data protection regulations.
- Experience working in a Procure to Pay function within the Financial Services industry will be an advantage.

Competencies
- Functional Expertise: keeps up to date with emerging business, economic, and market trends.
- Technical Skills: demonstrates strong technical skills required for the role and attention to detail, takes initiative to broaden their knowledge, and demonstrates appropriate analytical and risk management skills.
- Analytical Skills: presents sound, persuasive rationale for ideas or opinions.
Takes a position on issues and influences others' opinions with persuasive recommendations. Able to independently challenge ORR program stakeholders using data to drive business outcomes.
- Drive and Motivation: successfully handles multiple tasks, takes initiative to improve their own performance, works intensely towards extremely challenging goals, and persists in the face of obstacles or setbacks to drive collaborative execution. Effective and efficient time management skills.
- Client and Business Focus: effectively handles difficult requests, builds trusting, long-term relationships with clients, helps clients to identify and define needs, and manages client/business expectations.
- Teamwork: acts as a strong team player, collaborates with others within and across global teams, encourages other team members to participate and contribute, and acknowledges others' contributions.
- Communication: communicates what is relevant and important in a clear and concise manner and shares information and new ideas with others. Able to clearly articulate observations and linked recommendations.
- Judgment and Problem Solving: thinks ahead, anticipates questions, plans for contingencies, finds alternative solutions, and identifies clear objectives. Sees the big picture and effectively analyses complex issues.
- Risk Management: able to effectively apply a risk mindset while re-evaluating process controls, and balances risks versus benefits to protect the firm.

About Goldman Sachs
At Goldman Sachs, we commit our people, capital, and ideas to help our clients, shareholders, and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities, and investment management firm. Headquartered in New York, we maintain offices around the world. We believe who you are makes you better at what you do.
We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings, and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disabilitystatement.html Applicants who wish to request a medical or religious accommodation, or any other accommodation required under applicable law, can do so later in the process. Please note that accommodations are not guaranteed and are decided on a case-by-case basis. © The Goldman Sachs Group, Inc., 2023. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity.
Posted 2 weeks ago
4.0 - 9.0 years
7 - 17 Lacs
Hyderabad
Hybrid
Mega Walk-in Drive for Senior Software Engineer - Informatica, Teradata, SQL

Your future duties and responsibilities:
Job Summary: CGI is seeking a skilled and detail-oriented Informatica Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing ETL (Extract, Transform, Load) workflows using Informatica PowerCenter (or Informatica Cloud), as well as optimizing data pipelines and ensuring data quality and integrity across systems.

Key Responsibilities:
- Develop, test, and deploy ETL processes using Informatica PowerCenter or Informatica Cloud.
- Work with business analysts and data architects to understand data requirements and translate them into technical solutions.
- Integrate data from various sources including relational databases, flat files, APIs, and cloud-based platforms.
- Create and maintain technical documentation for ETL processes and data flows.
- Optimize existing ETL workflows for performance and scalability.
- Troubleshoot and resolve ETL and data-related issues in a timely manner.
- Implement data validation, transformation, and cleansing techniques.
- Collaborate with QA teams to support data testing and verification.
- Ensure compliance with data governance and security policies.

Required qualifications to be successful in this role:
- Minimum 4 years of experience with Informatica PowerCenter or Informatica Cloud.
- Proficiency in SQL and experience with databases like Oracle, SQL Server, Snowflake, or Teradata.
- Strong understanding of ETL best practices and data integration concepts.
- Experience with job scheduling tools like Autosys, Control-M, or equivalent.
- Knowledge of data warehousing concepts and dimensional modeling.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Good to have: Python or other programming knowledge.
- Bachelor's degree in Computer Science, Information Systems, or a related field.
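The "data validation, transformation, and cleansing" responsibility can be illustrated in plain Python; in practice these rules live inside Informatica mappings and transformations. The field names and rules below are assumptions made for the sketch, not taken from the posting.

```python
# Cleanse then validate a record: trimming, case normalization, and type
# coercion first; rule checks second, returning a list of violations.

def cleanse(row):
    """Trim strings, lowercase emails, coerce amount to float."""
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in row.items()}
    if "email" in cleaned:
        cleaned["email"] = cleaned["email"].lower()
    cleaned["amount"] = float(cleaned["amount"])
    return cleaned

def validate(row):
    """Return a list of rule violations; an empty list means the row passes."""
    errors = []
    if "@" not in row.get("email", ""):
        errors.append("email: invalid format")
    if row.get("amount", 0) < 0:
        errors.append("amount: must be non-negative")
    return errors

row = cleanse({"email": "  USER@Example.COM ", "amount": "19.99"})
errors = validate(row)
```

Separating cleansing from validation mirrors how ETL tools stage transformations: rows are standardized first so the validation rules can be written once against a predictable shape.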
Preferred Qualifications:
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with Big Data/Hadoop tools (e.g., Spark, Hive) and modern data architectures.
- Informatica certification is a plus.
- Experience with Agile methodologies and DevOps practices.

Shift Timings: General Shift (5 days work from office for the initial 8 weeks)
Skills: Data Engineering, Hadoop, Hive, Python, SQL, Teradata
Notice Period: 0-45 days
Prerequisites: a copy of Aadhaar card, a copy of PAN card, UAN
Disclaimer: The selected candidates will initially be required to work from the office for 8 weeks before transitioning to a hybrid model with 2 days of work from the office each week.
Posted 2 weeks ago
6.0 - 8.0 years
10 - 15 Lacs
Hyderabad
Hybrid
Mega Walk-in Drive for Senior Software Engineer - Informatica Developer

Your future duties and responsibilities:
Job Summary: CGI is seeking a skilled and detail-oriented Informatica Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing ETL (Extract, Transform, Load) workflows using Informatica PowerCenter (or Informatica Cloud), as well as optimizing data pipelines and ensuring data quality and integrity across systems.

Key Responsibilities:
- Develop, test, and deploy ETL processes using Informatica PowerCenter or Informatica Cloud.
- Work with business analysts and data architects to understand data requirements and translate them into technical solutions.
- Integrate data from various sources including relational databases, flat files, APIs, and cloud-based platforms.
- Create and maintain technical documentation for ETL processes and data flows.
- Optimize existing ETL workflows for performance and scalability.
- Troubleshoot and resolve ETL and data-related issues in a timely manner.
- Implement data validation, transformation, and cleansing techniques.
- Collaborate with QA teams to support data testing and verification.
- Ensure compliance with data governance and security policies.

Required qualifications to be successful in this role:
- Minimum 6 years of experience with Informatica PowerCenter or Informatica Cloud.
- Proficiency in SQL and experience with databases like Oracle, SQL Server, Snowflake, or Teradata.
- Strong understanding of ETL best practices and data integration concepts.
- Experience with job scheduling tools like Autosys, Control-M, or equivalent.
- Knowledge of data warehousing concepts and dimensional modeling.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Good to have: Python or other programming knowledge.
- Bachelor's degree in Computer Science, Information Systems, or a related field.
Preferred Qualifications:
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with Big Data/Hadoop tools (e.g., Spark, Hive) and modern data architectures.
- Informatica certification is a plus.
- Experience with Agile methodologies and DevOps practices.

Skills: Hadoop, Hive, Informatica, Oracle, Teradata, Unix
Notice Period: 0-45 days
Prerequisites: a copy of Aadhaar card, a copy of PAN card, UAN
Disclaimer: The selected candidates will initially be required to work from the office for 8 weeks before transitioning to a hybrid model with 2 days of work from the office each week.
Posted 2 weeks ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Job Description:
- Hands-on experience in IICS / Informatica PowerCenter.
- Demonstrated involvement in end-to-end IICS/IDMC projects.
- Structured Query Language (SQL) and data warehouse expertise.
- Experience in Extract, Transform, Load (ETL) testing.
- Effective communication skills.

Key Responsibilities:
- Design and develop ETL processes using Informatica IICS / Informatica PowerCenter.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Apply strong expertise in IICS Data Integration / Informatica PowerCenter, Application Integration, and Oracle SQL.
- Implement data integration solutions that ensure data accuracy and consistency.
- Monitor and optimize existing workflows for performance and efficiency.
- Troubleshoot and resolve any issues related to data integration and ETL processes.
- Maintain documentation for data processes and integration workflows.
Mandatory Skill Sets: ETL Informatica
Preferred Skill Sets: ETL Informatica
Years of Experience Required: 4+
Education Qualification: BTech/MBA/MCA
Education (if blank, degree and/or field of study not specified). Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: ETL (Informatica)
Optional Skills: Informatica ETL
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
- Extensive experience in developing solutions with Informatica Intelligent Cloud Services (IICS).
- Design data mapping strategies between source systems, Salesforce Service Cloud, and Oracle CPQ.
- Assist in mapping source data to target fields within Salesforce.
- Validate data before and after migration to ensure its accuracy.
- Generate reconciliation files and document any discrepancies identified.
- Design and develop ETL workflows using IICS to extract, transform, and load data into Salesforce Service Cloud.
- Implement data transformation rules according to mapping specifications.
- Assist with data loading into Salesforce using tools like Data Loader, when necessary.
- Conduct various types of testing, including unit testing, system testing, and user acceptance testing (UAT), to ensure data accuracy and integrity.
- Log and track defects until they are resolved.
- Maintain comprehensive documentation of data migration processes, mappings, and validation rules.
- Exhibit strong communication skills (both verbal and written), with the ability to effectively document development workflows, pipelines, and reporting functionalities.
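The reconciliation step in the responsibilities above — comparing source data against what landed in the target and documenting discrepancies — can be sketched in a few lines. The record shapes and the `case_id` key below are hypothetical illustrations, not the posting's actual schema.

```python
# Compare source rows to target rows by key: report keys missing from the
# target and keys whose row content drifted during the load.

def reconcile(source, target, key):
    """Return (missing_in_target, mismatched) lists of key values."""
    tgt = {r[key]: r for r in target}
    missing, mismatched = [], []
    for row in source:
        k = row[key]
        if k not in tgt:
            missing.append(k)
        elif tgt[k] != row:
            mismatched.append(k)
    return missing, mismatched

source_rows = [
    {"case_id": "C1", "status": "Closed"},
    {"case_id": "C2", "status": "Open"},
    {"case_id": "C3", "status": "Open"},
]
target_rows = [
    {"case_id": "C1", "status": "Closed"},
    {"case_id": "C2", "status": "Closed"},  # value drifted during load
]
missing, mismatched = reconcile(source_rows, target_rows, "case_id")
```

A real reconciliation file would serialize these lists with row-level detail (key, field, source value, target value) so each discrepancy can be triaged and retried.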
Good to Have:
- Experience in data migration
- Experience in Snowflake
- Experience in SOQL

Mandatory Skill Sets: IICS, Salesforce, ETL
Preferred Skill Sets: IICS, Salesforce, ETL
Years of Experience Required: 4-8 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified). Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Informatica Intelligent Cloud Services
Optional Skills:
Desired Languages (if blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:
Posted 2 weeks ago
2.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. 
Responsibilities / Job Description:
- A minimum of 2 years working as an MDM consultant or directly with clients, leveraging popular tools like Informatica, Reltio, etc.
- A minimum of 2 years in a role that took ownership of the organization's data assets to provide users with high-quality data that is accessible in a consistent manner.
- A minimum of 2 years facilitating data cleansing and enrichment through data de-duplication and construction.
- A minimum of 2 years in a role that captured the current state of the system, encompassing processes such as data discovery, profiling, and inventories.
- A minimum of 2 years in a role that defined processes including data classification, business glossary creation, and business rule definition.
- A minimum of 2 years in a role that applied processes aimed at operationalizing and ensuring compliance with policies, including automating rules, workflows, collaboration, etc.
- Experience in a role that led measurement and monitoring to determine the value generated, including impact analysis, data lineage, proactive monitoring, operational dashboards, and business value.
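As a rough illustration of the de-duplication work mentioned above, here is a stdlib-only Python sketch. The similarity threshold, record shape, and first-record-wins survivorship rule are simplifying assumptions for the example, not how Informatica or Reltio actually implement match-and-merge:

```python
# Toy fuzzy de-duplication of master-data records; all rules are invented.
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def dedupe(records, threshold=0.85):
    """First-record-wins clustering: a later record that matches an existing
    master above the threshold is recorded as its duplicate."""
    masters = []
    for rec in records:
        for master in masters:
            if similarity(rec["name"], master["name"]) >= threshold:
                master["duplicates"].append(rec["id"])
                break
        else:
            masters.append({"id": rec["id"], "name": rec["name"], "duplicates": []})
    return masters

records = [
    {"id": 1, "name": "Acme Corporation"},
    {"id": 2, "name": "Acme Corporation Inc"},  # near-duplicate of record 1
    {"id": 3, "name": "Globex Ltd"},
]
masters = dedupe(records)
print(masters)
```

Production MDM tools add deterministic and probabilistic match rules, trust scores, and configurable survivorship; the sketch only shows the clustering idea.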
Experience performing: Master Data Management; Metadata Management; Data Management and Integration; Systems Development Lifecycle (SDLC); Data Modeling Techniques and Methodologies; Database Management; Database Technical Design and Build; Extract, Transform & Load (ETL) Tools; Cloud Data Architecture; Data Architecture Principles; Online Analytical Processing (OLAP); Data Processes; Data Architecture Estimation.
Mandatory Skill Sets: Master Data Management, ETL, Database Management
Preferred Skill Sets: Master Data Management, ETL, Database Management
Years of Experience Required: 2-4 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study Required: Master of Engineering, Bachelor of Engineering
Required Skills: Data Management
Posted 2 weeks ago
1.0 - 6.0 years
5 - 8 Lacs
Noida
Work from Office
Manages the Accounts Payable dispute resolution and helpdesk process department of an engagement, and may be assigned an additional team within Accounts Payable. Responsible for the execution of the Accounts Payable helpdesk and dispute activities undertaken on behalf of the client. Resolves all business requestor and supplier escalations that can be handled within the engagement, and drives the resolution of escalations, in collaboration with the client, through conference calls with requestor groups, procurement teams, VMD teams, and other client groups and contacts. Takes ownership in driving issues to resolution. Drives the AP process and implements appropriate strategies and quality improvements where necessary. Maximizes the use of technology to reduce manual effort and drive an effective and efficient Accounts Payable process. Looks strategically at business rules and dynamic strategies, works with reporting and insights teams to develop predictive analysis, and works with technology teams to ensure a maximized level of automation across all processes. Primary Skills: Provides mentoring to the team as needed and conducts no-stripes meetings with teams. Creates a strong team Accounts Payable culture. Ensures team SLA and KPI targets are met, reviews critical supplier accounts, and takes ownership in resolving key account issues. Secondary Skills: Sets forth new process and team targets. Has an end-to-end vision of the overall objectives and synergies between the teams. Re-evaluates process timelines to decrease turnaround time and improve end-user satisfaction (client business users and supplier teams).
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Minimum 3 years of experience in developing application programs to implement ETL workflows by creating ETL jobs and data models in data marts using Snowflake, DBT, Unix, and SQL technologies. Redesign Control-M batch processing for the ETL job build to run efficiently in production. Study existing systems to evaluate effectiveness and develop new systems to improve efficiency and workflow. Responsibilities: Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support. The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a must, as the candidate is expected to work directly with client counterparts. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge gained by attending educational workshops and reviewing publications. Preferred technical and professional experience: Responsible for developing triggers, functions, and stored procedures to support this effort. Assist with impact analysis of changing upstream processes to data warehouse and reporting systems. Assist with design, testing, support, and debugging of new and existing ETL and reporting processes. Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.
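The data-profiling task mentioned above can be illustrated with a small stdlib sketch: count rows, nulls, and distinct values per column. The orders table and its columns are invented, and sqlite3 stands in for a warehouse like Snowflake; the SQL itself would carry over largely unchanged.

```python
# Minimal column-profiling sketch; table and columns are hypothetical.
import sqlite3

def profile(conn, table, columns):
    cur = conn.cursor()
    total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    stats = {}
    for col in columns:
        nulls = cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        distinct = cur.execute(f"SELECT COUNT(DISTINCT {col}) FROM {table}").fetchone()[0]
        stats[col] = {"rows": total, "nulls": nulls, "distinct": distinct}
    return stats

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1,'EMEA',100.0), (2,'EMEA',250.0),
                              (3,NULL,75.0), (4,'APAC',NULL);
""")
stats = profile(conn, "orders", ["region", "amount"])
print(stats["region"])   # note COUNT(DISTINCT col) ignores NULLs
print(stats["amount"])
```

High null counts or unexpectedly low distinct counts are exactly the signals such a profile would surface before an ETL load.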
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate.
Job Description & Summary:
Area: MDM CoE (MDM)
Grade: Associate/Sr. Associate
# of people: 9
Skill Set: Informatica MDM
Location: Gurgaon / Bangalore
YoE: 4-7
Comments: Should be able to lead MDM delivery as a solution architect and contribute to BD.
Mandatory Skill Sets: Informatica, MDM
Preferred Skill Sets: Informatica, MDM
Years of Experience Required: 4-8 years
Qualifications: BTech/MBA/MTech/MCA
Required Skills: Informatica MDM
Posted 2 weeks ago
9.0 - 11.0 years
10 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Detailed JD (Roles and Responsibilities) — Mandatory skills: Minimum 5 years of relevant Informatica technical experience with Enterprise Data Warehouse (DWH) design and development using Informatica. • Design, develop, and optimize robust ETL/ELT pipelines using Informatica PowerCenter and other data integration tools. • Manage and maintain IPP application-related components (BDM). • Ability to do performance tuning of Informatica BDM jobs. • Write complex SQL queries for data extraction, transformation, and analysis across relational databases. • Architect and implement data solutions leveraging AWS services such as S3. • Collaborate with business stakeholders to understand data requirements and deliver solutions. • Monitor and troubleshoot data pipelines to ensure high availability and data quality. • Implement best practices in data governance, security, and compliance. • Create, maintain, and review design documentation. • Implement and manage change requests pertaining to IPP application-related components (S3, BDM, and MFT). • Work with other relevant teams (SCP, IOC, Production Control, Infra, Security, Audit) as required. • Work with platform and application teams to manage and close incidents/issues, implement changes, and troubleshoot and resolve complex data issues within SLA. • Document and update Standard Operating Procedures (SOPs). Desired skills: Architect and implement data solutions leveraging Python, Lambda, and EMR. • Build and maintain scalable data workflows and orchestration using Apache Airflow.
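The Airflow orchestration called out under desired skills boils down to running pipeline tasks in dependency order. Here is a hedged stdlib sketch of that idea using `graphlib` (Python 3.9+); the task names are hypothetical, and real Airflow adds scheduling, retries, and operators on top of this ordering:

```python
# Dependency-ordered task execution, the core idea behind DAG orchestrators.
from graphlib import TopologicalSorter

# task -> upstream tasks that must complete first (all names invented)
pipeline = {
    "extract_s3": set(),
    "transform_bdm": {"extract_s3"},
    "load_warehouse": {"transform_bdm"},
    "publish_report": {"load_warehouse"},
}

run_log = []
for task in TopologicalSorter(pipeline).static_order():
    run_log.append(task)  # a real runner would invoke the job here

print(run_log)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is the same validation an orchestrator performs when a DAG is parsed.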
Posted 2 weeks ago
8.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Dos:
1. Managing the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights, and account reviews to the client base.
e. Identify areas to increase efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.
2. Analyze the data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content.
b. Design and carry out surveys, and analyze survey data as per the customer requirement.
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs, and visualizations to showcase business performance, and also provide sector and competitor benchmarking.
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with the clients as per their requirement.
Deliver:
1. Performance Parameter: Analyses data sets and provides relevant information to the client. Measure: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.
Mandatory Skills: Informatica Data Analyst. Experience: 8-10 years.
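As a toy example of the KPI tracking described above, performance measures such as on-time delivery rate and average CSAT can be rolled up from ticket records like this; the sample data is invented:

```python
# KPI roll-up over invented ticket records.
from statistics import mean

tickets = [
    {"id": 1, "on_time": True,  "csat": 5},
    {"id": 2, "on_time": True,  "csat": 4},
    {"id": 3, "on_time": False, "csat": 3},
    {"id": 4, "on_time": True,  "csat": 5},
]

on_time_rate = sum(t["on_time"] for t in tickets) / len(tickets)
avg_csat = mean(t["csat"] for t in tickets)
print(f"on-time delivery: {on_time_rate:.0%}, average CSAT: {avg_csat:.2f}")
```

In practice these numbers would come from a reporting tool or warehouse query rather than in-memory records, but the aggregation is the same.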
Posted 2 weeks ago
0 years
0 Lacs
Telangana
On-site
About Chubb Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com. About Chubb India At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning. Position Details Key Responsibilities: Design, develop, and implement ETL processes using IICS to extract, transform, and load data from various sources into our data warehouse. Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical specifications.
Optimize and maintain existing data pipelines to ensure high performance and reliability. Work with Informatica Data Management Cloud (IDMC) to manage and integrate data across cloud and on-premise environments. Monitor data quality and implement data validation and cleansing processes to ensure data accuracy and integrity. Troubleshoot and resolve data-related issues and provide support for data integration processes. Document data engineering processes, technical specifications, and workflows. Stay updated with the latest trends and best practices in data engineering and cloud data management. Qualifications: Bachelor’s degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer or in a similar role. Strong proficiency in Informatica Intelligent Cloud Services (IICS) and ETL tools. Experience with Informatica Data Management Cloud (IDMC) is highly desirable. Solid understanding of data warehousing concepts and data modeling. Proficiency in SQL and experience with relational databases. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence A Great Place to work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026 Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness where excellence is a mindset and a way of being. 
We constantly seek new and innovative ways to excel at work and deliver outstanding results Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment Employee Benefits Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include: Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs. Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits. Application Process Our recruitment process is designed to be transparent, and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. 
Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable). Step 4: Final interaction with Chubb leadership. Join Us: With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply Now: Chubb External Careers TBD
Posted 2 weeks ago
5.0 years
4 - 5 Lacs
Hyderābād
On-site
Who We Are: In today's work environment, employees use a myriad of devices to access IT applications and data over multiple networks to stay productive, wherever and however they work. Ivanti elevates and secures Everywhere Work so that people and organizations can thrive. While our headquarters is in the U.S., half of our employees and customers are outside the country. We have 36 offices in 23 nations, with significant offices in London, Frankfurt, Paris, Sydney, Shanghai, Singapore, and other major cities around the world. Ivanti's mission is to be a global technology leader enabling organizations to elevate Everywhere Work, automating tasks that discover, manage, secure, and service all their IT assets. Through diverse and inclusive hiring, decision-making, and commitment to our employees and partners, we will continue to build and deliver world-class solutions for our customers. Our Culture - Everywhere Work Centered Around You At Ivanti, our success begins with our people. This is why we embrace Everywhere Work across the globe, where Ivantians and our customers are thriving. We believe in a healthy work-life blend and act on it by fostering a culture where all perspectives are heard, respected, and valued. Through Ivanti's Centered Around You approach, our employees benefit from programs focused on their professional development and career growth. We align through our core values by locking arms in collaboration, being champions for our customers, focusing on the outcomes that matter most and fighting the good fight against cyber-attacks. Are you ready to join us on the journey to elevate Everywhere Work? Why We Need you! Ivanti is looking for a highly experienced Dell Boomi developer to join our Information Technology team. 
The successful candidate will have a proven track record of developing and implementing integration solutions using Dell Boomi, a strong understanding of data integration and EDI, and expertise in the associated technologies and integration methods. Experience with Informatica Cloud Application Integration (CAI) is a very strong plus. What You Will Be Doing: Primarily, you will play a critical role in maintaining current and future Boomi integrations. You may be asked to develop and deploy Informatica Cloud Application Integration solutions and to ensure that our architectures are scalable, reliable, and adhere to best practices. Analyze business needs, document requirements, and design innovative technical solutions. Collaborate with application developers, business analysts, and architecture teams to design, develop, test, deploy, and support transactional communication processes. Convert business requirements into actionable and efficient technical implementations. Maintain existing Boomi integrations and update them as needed. Develop, optimize, and implement scalable integrations to automate business processes using the Boomi platform. To Be Successful in The Role, You Will Have: At least 5 years of hands-on experience with complex integration projects involving SaaS and business applications. Minimum 3+ years of experience developing integration processes on the Boomi platform. 1+ year of experience working with Informatica Cloud Application Integrations. Strong verbal and written communication skills, with the ability to collaborate with cross-functional teams and stakeholders to gather and refine requirements. Creative thinker with robust planning, organizational, and problem-solving skills. Proficiency in integration design patterns, EDI, cloud technologies, and Service Oriented Architecture (SOA). Advanced expertise with the Dell Boomi platform, including connectors, process building, and API design.
Knowledge of Electronic Data Interchange (EDI) processes. Hands-on coding experience with JavaScript, Python, and related technologies. Track record of integrating enterprise applications such as Salesforce, SAP, and others. Boomi Certification is a significant plus. Exceptional analytical skills paired with excellent communication and collaboration abilities. This job posting will remain active until a qualified candidate is identified. At Ivanti, we are committed to providing an environment of mutual respect where equal employment opportunities are available to all applicants and teammates without regard to race, color, religion, sex, pregnancy (including childbirth, lactation and related medical conditions), national origin, age, physical and mental disability, marital status, sexual orientation, gender identity, gender expression, genetic information (including characteristics and testing), military and veteran status, and any other characteristic protected by applicable law. Ivanti believes that diversity and inclusion among our teammates is critical to our success as a global company, and we seek to recruit, develop and retain the most talented people from a diverse candidate pool. #LI-VG2
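Much of the Boomi/CAI work described above reduces to mapping one system's payload shape onto another's. A hedged Python sketch of such a map step follows; the field names and the uppercase transform are invented for illustration, and real Boomi maps are configured visually rather than coded:

```python
# Toy field-mapping step between two integration payloads; names are invented.
def map_payload(source, mapping, transforms=None):
    """Copy source fields to target names, applying optional per-field transforms."""
    transforms = transforms or {}
    target = {}
    for src_field, tgt_field in mapping.items():
        value = source.get(src_field)
        if src_field in transforms:
            value = transforms[src_field](value)
        target[tgt_field] = value
    return target

crm_record = {"AccountName": "Acme Corp", "Phone": "555 0100", "Country": "us"}
mapping = {"AccountName": "name", "Phone": "phone", "Country": "country_code"}
erp_record = map_payload(crm_record, mapping, {"Country": str.upper})
print(erp_record)
```

Keeping the mapping as data rather than code is the same design choice the visual tools make: the map can be reviewed and changed without touching the transport logic.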
Posted 2 weeks ago
0 years
0 Lacs
Hyderābād
Remote
When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What you’ll be doing… You’ll be part of a highly skilled group of marketing technologists who are responsible for the Verizon Business Group customer experience (CX) and marketing technology (martech) stack. In this dynamic role, you'll be responsible for developing, maintaining, integrating, and implementing platforms and data flows across the stack to deliver omni-channel customer journeys and insights. Working with the stakeholders and technology partners, you’ll be creating solutions designed to advance and mature the martech stack, ensuring it supports current and future marketing strategies and campaign needs, and integrates seamlessly with other corporate technologies. You’ll also be part of new technology evaluations and proofs of concept for innovative new solutions including GenAI. You will play a leading role supporting our core Marketing and CX teams. Responsibilities include: Partnering on cross functional projects with enterprise architecture, martech, CX, and GTS teams to deliver revenue impacting campaigns, end-to-end journeys, and insights. Creating technical documentation, data flow and system integration diagrams, and user stories. Executing technical development like APIs, business or third-party system integrations, hands-on in-platform configurations, and scripting. 
Finding ways to automate manual tasks and improve efficiencies. Supporting troubleshooting efforts. Identifying opportunities for new processes and optimization. Delivering forward-thinking solutions on time and on budget. Coordinating and leading UAT testing efforts. Other duties as assigned. What we’re looking for... You take a lot of pride in your work, and as a self-starter you enjoy learning new skills and taking on new tasks. While you work well independently, your team orientation and collaborative nature make you an obvious choice to lead cross-functional teams. No stranger to succeeding in a fast-paced, ever-changing marketing environment, you balance competing priorities with ease. Talking to people comes very naturally to you, and as a strong communicator you can effectively explain issues and resolutions to all levels of the organization. You are detail oriented and forward thinking, looking for the future-proof solution.
You'll need to have:
- Bachelor’s degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Three or more years of experience working on complex projects that involve technology implementation and innovation for large enterprises.
- Three or more years of hands-on experience working to optimize and troubleshoot data flows between marketing, sales, or business systems.
- Hands-on experience developing Application Programming Interface (API) integrations and Java and/or Python scripting.
- Hands-on experience creating system architecture documentation and authoring technical user stories.
- Experience creating Extract, Transform, Load (ETL) scripts for automated data transformation and load into CX, sales, and marketing technology platforms.
Even better if you have one or more of the following:
- Hands-on experience with Adobe Experience Platform and other Adobe Experience Cloud applications.
- A solid understanding of Salesforce data structures and relational databases.
- Strong SQL skills and the ability to create complex SQL queries for data analysis.
- Experience with Informatica and Adobe Fusion.
- Adobe Business Practitioner certifications on the products above (or equivalent).
- Previous experience with Pega Customer Decision Hub.
- Understanding of / experience with MarTech tools: Adobe Analytics, Google Analytics, Marketo.
- Pega certification.
If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above. Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours: 40. Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
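The ETL-script qualification mentioned above can be illustrated end to end with stdlib Python. The lead feed, its field names, and the sqlite3 target are all invented for the sketch; a real pipeline would pull from an API or file drop and load a marketing platform rather than an in-memory database.

```python
# Minimal extract-transform-load sketch; feed and schema are hypothetical.
import csv, io, sqlite3

raw_feed = """email,campaign,score
ALICE@EXAMPLE.COM,q3_launch,72
bob@example.com,q3_launch,41
"""

rows = list(csv.DictReader(io.StringIO(raw_feed)))   # extract

for row in rows:                                     # transform
    row["email"] = row["email"].lower()              # normalise emails
    row["score"] = int(row["score"])                 # cast scores to integers

conn = sqlite3.connect(":memory:")                   # load
conn.execute("CREATE TABLE leads (email TEXT, campaign TEXT, score INTEGER)")
conn.executemany("INSERT INTO leads VALUES (:email, :campaign, :score)", rows)
loaded = conn.execute("SELECT email, score FROM leads ORDER BY score DESC").fetchall()
print(loaded)
```

Separating the three stages keeps each independently testable, which matters once the same transform feeds several downstream platforms.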
Posted 2 weeks ago
5.0 years
5 - 8 Lacs
Hyderābād
On-site
As one of the world’s leading asset managers, Invesco is dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, we provide a wide range of investment strategies and vehicles to our clients around the world. If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, come explore your potential at Invesco. Make a difference every day! Job Description Work Experience: 5 to 6 years of relevant experience working in an Information Technology environment. Experience supporting financial applications tech-ops activities like access provisioning, environment refresh activities, database password rotations, etc. Proficiency with Oracle Financials Cloud ERP technical BIP report development; PL/SQL is a must. Experience with ETL tools like Informatica, MuleSoft, etc. Experience with the Agile software development lifecycle process. Experience within financial services (including asset management) is desirable. Hands-on experience in developing and maintaining BI Publisher reports in Cloud ERP is preferred. Education/Training: Bachelor’s degree in Computer Science, Engineering, Accounting, or Finance. Minimum 5 years of professional experience in business analysis, application design, implementation, support, or audit in an online environment. Minimum knowledge, skills and abilities required: Must be a detail-oriented self-starter with strong organizational skills.
Required Skills: SQL, PL/SQL – expert level. Exposure to UAC, Autosys, Informatica, Power BI, Power Apps is a plus. Oracle Financials Cloud ERP BIP report development in any of the applications like Accounts Receivable (AR), Accounts Payable (AP), General Ledger (GL), Subledger Accounting (SLA), Project Costing, Project Accounting, Financial Accounting Hub (FAH), Procurement, Tax, and Order to Cash. Exposure to OTBI and BI Publisher reports – intermediate. Financial close, consolidation, general ledger accounting – basic. Accounting operations – basic. Desired Skills: Exposure to third-party software like Revport, FIS Integrity (Treasury Management System). Knowledge of Oracle table structures and APIs. Strong problem-solving skills; able to multitask and meet deadlines. Excellent written and oral communication skills. Self-directed; able to work in a team and independently and collaborate in a group. Understanding of relational databases and query tools. Ability to learn new applications (Oracle and non-Oracle). Ability to understand business requirements and evaluate the impact on systems. Knowledge of PM techniques is required; cross-functional experience is strongly preferred. Full Time / Part Time: Full time. Worker Type: Employee. Job Exempt: Yes. Workplace Model: At Invesco, our workplace model supports our culture and meets the needs of our clients while providing the flexibility our employees value. As a full-time employee, compliance with the workplace policy means working with your direct manager to create a schedule where you will work in your designated office at least three days a week, with two days working outside an Invesco office. Why Invesco: At Invesco, we act with integrity and do meaningful work to create impact for our stakeholders. We believe our culture is stronger when we all feel we belong, and we respect each other’s identities, lives, health, and well-being.
We come together to create better solutions for our clients, our business and each other by building on different voices and perspectives. We nurture and encourage each other to ensure our meaningful growth, both personally and professionally. We believe in a diverse, inclusive, and supportive workplace where everyone feels equally valued, and this starts at the top with our senior leaders having diversity and inclusion goals. Our global focus on diversity and inclusion has grown exponentially, and we encourage connection and community through our many employee-led Business Resource Groups (BRGs).

What's in it for you?
As an organization we support personal needs and diverse backgrounds, and provide internal networks as well as opportunities to get involved in the community and in the world. Our benefits policy includes, but is not limited to:
- Competitive Compensation
- Flexible, Hybrid Work
- 30 days' Annual Leave + Public Holidays
- Life Insurance
- Retirement Planning
- Group Personal Accident Insurance
- Medical Insurance for Employee and Family
- Annual Health Check-up
- 26 weeks' Maternity Leave
- Paternal Leave
- Adoption Leave
- Near-site Childcare Facility
- Employee Assistance Program
- Study Support
- Employee Stock Purchase Plan
- ESG Commitments and Goals
- Business Resource Groups
- Career Development Programs
- Mentoring Programs
- Invesco Cares
- Dress for your Day

At Invesco, we offer development opportunities that help you thrive as a lifelong learner in a constantly evolving business environment and ensure your continuous growth. Our AI-enabled learning platform delivers curated content based on your role and interests. We ensure our managers and leaders also have many opportunities to advance their skills and competencies, which is pivotal in their continuous pursuit of performance excellence.
To know more about us:
About Invesco: https://www.invesco.com/corporate/en/home.html
About our Culture: https://www.invesco.com/corporate/en/about-us/our-culture.html
About our D&I policy: https://www.invesco.com/corporate/en/our-commitments/diversity-and-inclusion.html
About our CR program: https://www.invesco.com/corporate/en/our-commitments/corporate-responsibility.html
Apply for the role @ Invesco Careers: https://careers.invesco.com/india/
Posted 2 weeks ago
3.0 years
5 - 8 Lacs
Gurgaon
On-site
At Yum! we're looking for a Software Engineer to add to our dynamic and rapidly scaling team. We're making this investment to help us optimize our digital channels and technology innovations, with the end goal of creating competitive advantages for our restaurants around the globe. We're looking for a solid lead engineer who brings fresh ideas from past experiences and is eager to tackle new challenges in our company. We're in search of a candidate who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (e.g., Python and SQL). The Yum! data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.

As a Software Engineer, you will:
- Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to build data pipelines that enable best-in-class restaurant technology solutions.
- Play a key role in our AIDA team, developing data solutions responsible for driving Yum! growth.
- Develop and maintain high-performance, scalable data solutions.
- Design and develop data pipelines, both streaming and batch, to move data from point-of-sale, back-of-house, operational platforms and more to our Global Data Hub.
- Contribute to standardizing and developing a framework to extend these pipelines across brands and markets.
- Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-of-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting and other data integration points.
- Develop scalable REST APIs in Python.
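A batch extract-transform step of the kind described in these responsibilities can be sketched in plain Python. This is a minimal illustration only: the record schema and the `clean_order`/`run_batch` names are hypothetical, not part of any Yum! codebase, and a production pipeline would use PySpark or an orchestration framework rather than in-memory lists.

```python
import csv
import io
import json

def clean_order(row):
    """Normalize one point-of-sale record (illustrative schema)."""
    return {
        "store_id": row["store_id"].strip(),
        "item": row["item"].strip().lower(),
        "amount": round(float(row["amount"]), 2),
    }

def run_batch(source_csv):
    """Minimal extract-transform step: raw CSV in, cleaned records out."""
    reader = csv.DictReader(io.StringIO(source_csv))
    return [clean_order(r) for r in reader]

if __name__ == "__main__":
    raw = "store_id,item,amount\n 101 ,Burrito,5.499\n102, FRIES ,2.0\n"
    print(json.dumps(run_batch(raw)))
```

The same transform function can then be reused unchanged inside a streaming consumer or a scheduled batch job, which is one way the "extend these pipelines across brands and markets" goal is commonly approached.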
Develop and maintain backend services using Python (e.g., FastAPI, Flask, Django).

Minimum Requirements:
- Broad background in all things data-related (3+ years of experience)
- AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.)
- Experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus
- High level of proficiency with SQL (Snowflake a big plus)
- Proficiency with Python for transforming data and automating tasks
- Experience with Kafka, Pulsar, or other streaming technologies
- Experience orchestrating complex task flows across a variety of technologies
- Bachelor's degree from an accredited institution or relevant experience
- Experience with at least one NoSQL database (MongoDB, Elasticsearch, etc.)

The Yum! Brands story is simple. We have four distinctive, relevant and easy global brands – KFC, Pizza Hut, Taco Bell and The Habit Burger Grill – born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company we have a clear and compelling mission: to build the world's most loved, trusted and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role. Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless.
Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old finger lickin' good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL; Louisville, KY; Irvine, CA; Plano, TX; and other markets around the world. We don't just say we are a great place to work – our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks' vacation PLUS holidays, sick leave and 2 paid days to volunteer at the cause of their choice, plus a dollar-for-dollar matching gift program; generous parental leave; competitive benefits including medical, dental, vision and life insurance; as well as a 6% 401(k) match – all encompassed in Yum!'s world-famous recognition culture.
Posted 2 weeks ago
4.0 years
2 - 6 Lacs
Gurgaon
On-site
Key Responsibilities:
- Understand and analyze ETL requirements, data mapping documents, and business rules.
- Design, develop, and execute test cases, test scripts, and test plans for ETL processes.
- Perform data validation, source-to-target data mapping, and data integrity checks.
- Write complex SQL queries for data verification and backend testing.
- Conduct regression, integration, and system testing for ETL pipelines and data warehouse environments.
- Work with BI tools to validate reports and dashboards where applicable.
- Collaborate with developers, business analysts, and data engineers to ensure testing coverage and resolve issues.
- Document defects and test results, and provide detailed bug reports and testing status.

Required Skills and Experience:
- 4+ years of experience in ETL testing or data warehouse testing.
- Strong proficiency in SQL for data validation and analysis.
- Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
- Knowledge of data warehousing concepts, star/snowflake schemas, and data modeling.
- Experience with test management tools (e.g., JIRA, HP ALM, TestRail).
- Understanding of automation in data testing is a plus (e.g., Python, Selenium with databases).
- Familiarity with cloud platforms (e.g., AWS Redshift, Google BigQuery, Azure Data Factory) is a plus.

Job Type: Full-time
Work Location: In person
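The source-to-target checks this role describes (row counts, measure totals, keys missing from the target) can be prototyped with Python's stdlib sqlite3 module. The table and column names below are hypothetical stand-ins for real staging and warehouse tables; against a production warehouse the same queries would run through the appropriate database driver.

```python
import sqlite3

def validate_counts_and_sums(conn, source, target, key, measure):
    """Compare row counts and a measure total between source and target tables,
    and list keys present in source but missing from target."""
    cur = conn.cursor()
    src_count, src_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {source}").fetchone()
    tgt_count, tgt_sum = cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM({measure}), 0) FROM {target}").fetchone()
    # Rows present in source but missing from target: a core ETL-test query
    missing = cur.execute(
        f"SELECT {key} FROM {source} EXCEPT SELECT {key} FROM {target}").fetchall()
    return {
        "count_match": src_count == tgt_count,
        "sum_match": src_sum == tgt_sum,
        "missing_keys": [m[0] for m in missing],
    }

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders(id INTEGER, amount REAL);
        CREATE TABLE dw_orders(id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
        INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
    """)
    print(validate_counts_and_sums(conn, "stg_orders", "dw_orders", "id", "amount"))
```

Wrapping checks like this in assertions is also the usual bridge from manual SQL verification to the automated data testing the listing mentions as a plus.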
Posted 2 weeks ago
4.0 - 9.0 years
5 - 12 Lacs
New Delhi, Gurugram, Delhi / NCR
Work from Office
Role & Responsibilities:
- Understand and analyze ETL requirements, data mapping documents, and business rules.
- Design, develop, and execute test cases, test scripts, and test plans for ETL processes.
- Perform data validation, source-to-target data mapping, and data integrity checks.
- Write complex SQL queries for data verification and backend testing.
- Conduct regression, integration, and system testing for ETL pipelines and data warehouse environments.
- Work with BI tools to validate reports and dashboards where applicable.
- Collaborate with developers, business analysts, and data engineers to ensure testing coverage and resolve issues.
- Document defects and test results, and provide detailed bug reports and testing status.

Required Skills and Experience:
- 4+ years of experience in ETL testing or data warehouse testing.
- Strong proficiency in SQL for data validation and analysis.
- Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar.
- Knowledge of data warehousing concepts, star/snowflake schemas, and data modeling.
- Experience with test management tools (e.g., JIRA, HP ALM, TestRail).
- Understanding of automation in data testing is a plus (e.g., Python, Selenium with databases).
- Familiarity with cloud platforms (e.g., AWS Redshift, Google BigQuery, Azure Data Factory) is a plus.

Preferred candidate profile: please indicate years of experience in each area of the skill matrix – ETL Testing, SQL, ETL Tool, Testing Tool (e.g., JIRA), Cloud, Data Warehouse.
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Archive Specialist
Level: Senior Associate
Department: Data Management & Compliance

Role Summary:
As a Data Archive Specialist, you will play a key role in safeguarding the organization's data assets by implementing and managing comprehensive data archival strategies. This involves ensuring compliance with data retention policies and regulatory frameworks such as GDPR, HIPAA, and FDA. You will work closely with IT teams to integrate and optimize archival platforms such as AWS S3, Azure Blob Storage, and OpenText, ensuring seamless access and efficient data management. Your expertise will support the organization's data governance, providing insights through detailed reporting and audits to enhance data integrity and compliance.

Key Responsibilities:
1. Data Archival Strategy Development:
- Design and implement comprehensive data archival strategies aligned with organizational goals and regulatory requirements.
- Evaluate and select appropriate archival platforms to ensure data integrity and security.
2. Regulatory Compliance:
- Ensure all data archival processes comply with industry regulations such as GDPR, HIPAA, and FDA.
- Conduct regular audits to verify compliance and address discrepancies.
3. Platform Management:
- Collaborate with IT to manage and optimize archival platforms like AWS S3, Azure Blob Storage, and OpenText.
- Ensure smooth integration of archival systems with existing IT infrastructure.
4. Data Retention and Retrieval:
- Establish and enforce data retention policies to manage the lifecycle of archived data.
- Develop efficient retrieval processes to ensure quick access to archived data when required.
5. Reporting and Monitoring:
- Create comprehensive reports to monitor archival activities, data integrity, and compliance status.
- Present findings to senior management and recommend improvements to archival processes.
Mandatory Skills:
- Handling inactive data across various systems and applications, ensuring compliance, integrity, and long-term accessibility.
- ETL tools: Informatica (including IICS) and custom Python/Shell scripts.
- Archival platforms: OpenText or similar, AWS S3/Glacier, Azure Blob Storage.
- Database: Oracle or similar.
- Knowledge of regulatory frameworks: HIPAA, GDPR, FDA.

Good-to-Have Skills:
- IT collaboration skills for integrating archival platforms with existing systems.
- Proficiency in creating detailed reports to monitor archival activities and compliance status.
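The retention-and-retrieval responsibility above boils down to a policy decision per record, which the custom Python scripts the listing mentions typically encode. The sketch below is illustrative only: the record classes and year counts are invented assumptions, not actual GDPR/HIPAA/FDA retention periods, and the 365-day year ignores leap days.

```python
from datetime import date, timedelta

# Hypothetical retention rules (years of inactivity before archival); real
# values must come from the applicable regulatory retention schedule.
RETENTION_YEARS = {"clinical": 10, "financial": 7, "operational": 3}

def archival_action(record_type, last_active, today=None):
    """Decide whether an inactive record should be archived under the policy."""
    today = today or date.today()
    years = RETENTION_YEARS.get(record_type)
    if years is None:
        return "review"  # unknown data class: route to a data steward
    cutoff = today - timedelta(days=365 * years)  # approximate, ignores leap days
    return "archive" if last_active < cutoff else "retain"

if __name__ == "__main__":
    print(archival_action("financial", date(2015, 1, 1), today=date(2025, 1, 1)))
```

A batch job built on this would then move "archive" records to cold storage (e.g., S3 Glacier) and log each decision for the compliance audits the role describes.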
Posted 2 weeks ago
5.0 years
1 - 10 Lacs
Noida
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Support the full data engineering lifecycle, including research, proofs of concept, design, development, testing, deployment, and maintenance of data management solutions
- Utilize knowledge of various data management technologies to drive data engineering projects
- Work with Operations and Product Development staff to support applications/processes that facilitate the effective and efficient implementation/migration of new clients' healthcare data through the Optum Impact Product Suite
- Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate the client data warehouse and power analytics across numerous healthcare domains
- Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading data marts and reporting aggregates
- Eliminate unwarranted complexity and unneeded interdependencies
- Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges
- Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
- Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
- Effectively create data transformations that address business requirements and other constraints
- Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
- Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation and data movement
- Leverage DevOps tools to enable code versioning and code deployment
- Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
- Leverage processes and diagnostic tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues
- Continuously support technical debt reduction, process transformation, and overall optimization
- Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
- Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
- 5+ years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage
- 5+ years of combined experience working with industry-standard relational, dimensional or non-relational data storage systems
- 5+ years of experience in designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS, PL/SQL, T-SQL, etc.
- 5+ years of experience in managing data assets using SQL, Python, Scala, VB.NET or other similar querying/coding languages
- 3+ years of experience working with healthcare data or data to support healthcare organizations

Preferred Qualifications:
- 5+ years of experience in creating source-to-target mappings and ETL designs for integration of new/modified data streams into the data warehouse/data marts
- Experience in Unix, PowerShell, or other batch scripting languages
- Experience supporting data pipelines that power analytical content within common reporting and business intelligence platforms (e.g. Power BI, Qlik, Tableau, MicroStrategy, etc.)
- Experience supporting analytical capabilities inclusive of reporting, dashboards, extracts, BI tools, analytical web applications and other similar products
- Experience contributing to cross-functional efforts with proven success in creating healthcare insights
- Experience and credibility interacting with analytics and technology leadership teams
- Depth of experience and a proven track record creating and maintaining sophisticated data frameworks for healthcare organizations
- Exposure to Azure, AWS, or Google Cloud ecosystems
- Exposure to Amazon Redshift, Amazon S3, Hadoop HDFS, Azure Blob, or similar big data storage and management components
- Demonstrated desire to continuously learn and seek new options and approaches to business challenges
- Willingness to leverage best practices, share knowledge, and improve the collective work of the team
- Demonstrated ability to effectively communicate concepts verbally and in writing
- Demonstrated awareness of when to appropriately escalate issues/risks

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
Posted 2 weeks ago
2.0 years
18 Lacs
India
On-site
About the Role:
We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities:
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM.
- Develop and maintain batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
- Perform data profiling, data quality assessments, and master data matching/merging.
- Implement governance, stewardship, and metadata management practices.
- Optimize the performance of Informatica MDM Hub, IDD, and associated components.
- Write complex SQL queries and stored procedures as needed.

Senior Software Engineer – Additional Responsibilities:
- Lead design discussions and code reviews; mentor junior engineers.
- Architect scalable data integration solutions using Informatica and complementary tools.
- Drive adoption of best practices in data modeling, governance, and engineering.
- Work closely with cross-functional teams to shape the data strategy.

Required Qualifications:
Software Engineer:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2–4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
- Strong SQL and data modeling skills.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.
Senior Software Engineer:
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
- 4+ years of experience in Informatica MDM, with at least 2 years in a lead role.
- Proven track record of designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have):
- Experience with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.

Job Type: Full-time
Pay: Up to ₹1,853,040.32 per year
Benefits: Flexible schedule, health insurance, life insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus, yearly bonus
Application Question(s): What is your notice period?
Education: Bachelor's (Preferred)
Experience: Informatica: 4 years (Preferred)
Location: Noida H.O, Noida, Uttar Pradesh (Preferred)
Work Location: In person
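Informatica MDM's match/merge rules are configured inside the product itself, but the underlying idea, fuzzy-matching an incoming record against existing master records and deciding whether to merge or create a new golden record, can be illustrated with a stdlib-only sketch. The 0.85 threshold and the name-only comparison are simplifying assumptions; real match rules weigh multiple attributes.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Fuzzy string similarity in [0, 1], after normalizing case and whitespace."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_records(incoming, master, threshold=0.85):
    """Classify an incoming record against master records: merge with the best
    candidate above the threshold, otherwise create a new master record."""
    best, best_score = None, 0.0
    for rec in master:
        score = similarity(incoming["name"], rec["name"])
        if score > best_score:
            best, best_score = rec, score
    return ("merge", best) if best_score >= threshold else ("new", None)

if __name__ == "__main__":
    master = [{"id": 1, "name": "Acme Corporation"}, {"id": 2, "name": "Globex Ltd"}]
    print(match_records({"name": "Acme Corporatoin"}, master))  # typo still matches
```

In an MDM hub the "merge" branch would also apply survivorship rules to pick the winning attribute values, which this sketch deliberately omits.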
Posted 2 weeks ago
3.0 years
0 Lacs
Andhra Pradesh
On-site
Technical Skills

Microsoft Purview Expertise (Required):
- Unified Data Catalog: experience setting up and configuring the catalog, managing collections, classifications, glossary terms, and metadata curation.
- Data Quality (DQ): implementing DQ rules, defining metrics (accuracy, completeness, consistency), and using quality scorecards and reports.
- Data Map and Scans: ability to configure sources, schedule scans, manage ingestion, and troubleshoot scan issues.
- Data Insights and Lineage: experience visualizing data lineage and interpreting catalog insights.

Azure Platform Knowledge (Desirable):
- Azure Data Factory
- Azure Synapse Analytics
- Microsoft Fabric, including OneLake

Experience:
- 3 to 5+ years in data governance or data platform projects, ideally with enterprise clients.
- 2+ years implementing Microsoft Purview or similar tools (Collibra, Informatica, Alation).
- Hands-on experience configuring and implementing the Microsoft Purview Unified Catalog and Data Quality.
- Experience onboarding multiple data sources (on-prem, cloud).
- A background in data management, data architecture, or business intelligence is highly beneficial.

Certifications (Desirable):
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: Azure Solutions Architect Expert

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally who care about your growth, and who seek to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
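The DQ metrics named above (accuracy, completeness, consistency) can be prototyped outside Purview to sanity-check rule definitions before configuring them in the tool. This is a minimal sketch of a completeness scorecard with invented field names; it does not use Purview's API.

```python
def completeness(records, field):
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def scorecard(records, fields):
    """Per-field completeness scores, like a minimal DQ scorecard."""
    return {f: round(completeness(records, f), 2) for f in fields}

if __name__ == "__main__":
    rows = [
        {"id": 1, "email": "a@x.com", "country": "IN"},
        {"id": 2, "email": "", "country": "IN"},
        {"id": 3, "email": "c@x.com"},  # country missing entirely
    ]
    print(scorecard(rows, ["id", "email", "country"]))
```

The other metrics follow the same shape: accuracy compares values against a reference source, and consistency checks agreement between systems, each producing a per-field score for the report.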
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 2 weeks ago
The informatica job market in India is thriving with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum

A typical career progression in the informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect

In addition to informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
As you prepare for informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the informatica field and secure rewarding career opportunities. Good luck!