
2331 Informatica Jobs - Page 38

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

We are seeking a highly skilled and detail-oriented Data Governance Analyst with a minimum of 5 years of experience in the BFSI domain. The ideal candidate will support the implementation and management of enterprise data governance strategies to ensure data quality, consistency, security, and regulatory compliance across the organization. The role demands strong collaboration with data stewards, business units, and IT teams to support data-driven decision-making and governance practices.

Key Responsibilities:
- Collaborate with data owners and stewards across business and IT to define and implement data governance policies, standards, and procedures.
- Facilitate the establishment and maintenance of a data governance framework that aligns with regulatory requirements.
- Conduct data quality assessments and support data remediation initiatives in line with business priorities (a sketch follows this posting).
- Monitor data governance metrics and provide regular reporting to leadership on data quality, policy adherence, and issue resolution status.
- Support metadata management initiatives, including cataloging of critical data elements, definitions, and data lineage tracking.
- Coordinate with compliance and risk teams to ensure data governance efforts support internal audits, regulatory reporting, and data privacy mandates.
- Provide training and guidance to stakeholders on data governance principles, roles, and responsibilities.
- Leverage tools and platforms such as Collibra, Informatica, Alation, or similar for metadata management, data cataloging, and workflow automation.

Required Skills & Qualifications:
- Minimum 5 years of experience in Data Governance or Data Management within the BFSI sector.
- Strong understanding of data governance frameworks (e.g., DAMA-DMBOK) and data management best practices.
- Experience with regulatory and compliance standards relevant to BFSI (e.g., RBI, SEBI, FATCA, GDPR, Basel).
- Proficiency in data quality tools, metadata repositories, and data lineage solutions.
- Hands-on experience with tools like Collibra, Informatica Axon/EDC, Alation, or equivalent.
- Knowledge of SQL and basic data analysis techniques.
- Excellent analytical, problem-solving, and stakeholder communication skills.
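Since the role pairs SQL knowledge with data quality assessment work, here is a minimal sketch of what such an assessment can look like in practice, checking completeness, uniqueness, and validity in one pass. The table and column names (customer_master, pan_number, and so on) are assumptions invented for illustration, not details from the posting.

```sql
-- Hypothetical data quality profile; all object names are illustrative assumptions.
SELECT
    COUNT(*)                                AS total_rows,
    COUNT(*) - COUNT(pan_number)            AS rows_missing_pan,        -- completeness
    COUNT(*) - COUNT(DISTINCT customer_id)  AS duplicate_customer_ids,  -- uniqueness
    SUM(CASE WHEN email IS NOT NULL
              AND email NOT LIKE '%_@_%._%'
             THEN 1 ELSE 0 END)             AS invalid_emails           -- validity
FROM customer_master;
```

In a governance program, queries like this are typically scheduled and their results fed into the data quality metrics reported to leadership, as the responsibilities above describe.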

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

On-site

Source: Indeed

Location: Hyderabad, Telangana, India
Category: Technology Careers
Job Id: JREQ191891
Job Type: Full time, Hybrid

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data- and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards an optimal customer and employee experience.

About the role: In this opportunity as Application Support Analyst, you will:
- Support Informatica: the engineer will be responsible for supporting Informatica development, extractions, and loading, fixing data discrepancies, and taking care of performance monitoring.
- Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
- Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs.
- Apply a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management.
- Support applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes:
- 3 to 8+ years of experience as an Informatica developer and support engineer, responsible for implementing ETL methodology for data extraction, transformation, and loading.
- Knowledge of ETL design: designing new or changed mappings and workflows with the team and preparing technical specifications.
- Experience creating ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
- Designing and building integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Ability to perform source system analysis as required.
- Working with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implementing versioning of the ETL repository and supporting code as necessary.
- Developing stored procedures, database triggers, and SQL queries where needed; implementing best practices and tuning SQL code for optimization.
- Loading data from Salesforce (via PowerExchange) to a relational database using Informatica.
- Working with XML, the XML parser, Java, and HTTP transformations within Informatica.
- Experience integrating various data sources such as Oracle, SQL Server, DB2, Salesforce, and flat files in formats like fixed width, CSV, and Excel.
- In-depth knowledge of, and experience implementing, best practices for the design and development of data warehouses using star schema and snowflake schema design concepts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Support and development activities in a relational database environment: designing tables, procedures/functions, packages, triggers, and views, and using SQL proficiently in database programming with Snowflake (SNFL).

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us: Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT (see the sketch after this posting). Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: informatica, data integration, data engineering, sql, gcp, dbt, power bi, snowflake, fivetran, business intelligence, python, etl, airflow, data modeling, azure, luigi, workflow management tools, data analysis, powerbi, azkaban, data warehousing, aws, informatica administration, cloud services
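The posting names SCD Type-2 implemented with DBT; the idiomatic dbt construct for that is a snapshot. Below is a minimal dbt-SQL sketch under assumed names (the snapshot, source, and columns are invented for illustration); dbt maintains the validity-range columns automatically.

```sql
{% snapshot dim_customer_snapshot %}
{{
    config(
        target_schema='snapshots',
        unique_key='customer_id',
        strategy='check',
        check_cols=['name', 'segment', 'region']
    )
}}
-- On each run dbt closes out rows whose check_cols changed and inserts the
-- new version, adding dbt_valid_from / dbt_valid_to -- the Type-2 history
-- the posting describes. All names here are assumptions.
select customer_id, name, segment, region
from {{ source('crm', 'customers') }}
{% endsnapshot %}
```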

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: informatica, data integration, data engineering, sql, gcp, dbt, power bi, snowflake, fivetran, business intelligence, python, etl, airflow, data modeling, azure, luigi, workflow management tools, data analysis, powerbi, azkaban, data warehousing, aws, informatica administration, cloud services

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Job Title: Salesforce Development Lead
Location: Remote

Company Overview: At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

Job Summary: We are seeking a highly experienced Salesforce Development Lead to take full ownership of project delivery across the Salesforce ecosystem. In this leadership role, you will be responsible for guiding the technical direction, overseeing the development lifecycle, and ensuring successful deployment of Salesforce solutions that align with business goals. You will collaborate closely with cross-functional teams, mentor junior developers, and drive the adoption of best practices in architecture, integration, and development.

Key Responsibilities:
- Lead end-to-end delivery of Salesforce projects, from requirements gathering through to deployment and support, ensuring alignment with organizational objectives.
- Architect scalable, secure, and maintainable Salesforce solutions using Apex, Visualforce, Lightning Web Components (LWC), and other platform capabilities.
- Collaborate with business analysts, product owners, and stakeholders to gather functional and technical requirements, and translate them into well-architected solutions.
- Oversee and contribute to the customization and configuration of Salesforce to support complex business processes.
- Design and implement robust data migration and integration strategies using APIs, middleware, and ETL tools.
- Define and enforce coding standards, perform code reviews, and ensure development practices align with industry and platform best practices.
- Provide technical leadership, mentorship, and coaching to development team members, fostering a culture of collaboration and continuous improvement.
- Lead troubleshooting efforts and resolve high-priority issues related to Salesforce applications, integrations, and customizations.
- Stay current with Salesforce platform updates, features, and industry trends to continuously enhance system capabilities and team knowledge.
- Drive the adoption of DevOps practices, including version control, automated testing, and CI/CD pipelines.
- Collaborate with QA, DevOps, and other IT teams to ensure end-to-end quality and performance in solution delivery.
- Maintain comprehensive technical documentation including architectural diagrams, solution designs, and development processes.
- Evaluate and recommend new technologies, tools, and methodologies to improve the efficiency and effectiveness of Salesforce delivery.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 5+ years of hands-on Salesforce development experience, with at least 2 years in a technical leadership or lead developer role.
- Salesforce Platform Developer I certification is required; additional certifications such as Platform Developer II, Application Architect, or System Architect are highly preferred.
- Deep expertise in Apex, Visualforce, Lightning Components (Aura and LWC), SOQL, and other Salesforce development tools.
- Strong understanding of Salesforce integration patterns, including REST/SOAP APIs, middleware solutions (e.g., MuleSoft), and ETL tools (e.g., Data Loader, Informatica).
- Experience with version control systems (e.g., Git) and CI/CD tools (e.g., Copado, Gearset, Jenkins).
- In-depth knowledge of software engineering best practices, including design patterns, testing methodologies, and agile delivery models.
- Proven ability to lead technical teams, manage project timelines, and deliver solutions that meet quality and performance benchmarks.
- Excellent communication skills with the ability to convey complex technical concepts to both technical and non-technical audiences.
- Demonstrated ability to proactively identify areas of improvement and drive innovation within the Salesforce ecosystem.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Job Description

About KPMG in India: KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Job Title: Technical Director
Location: Mumbai
Experience: 12+ years

Job Summary: We are seeking a highly experienced and visionary Senior Data Architect to lead the design and implementation of scalable, secure, and high-performance data platforms across cloud and hybrid environments. The ideal candidate will bring deep expertise in data engineering, cloud architecture, and modern data paradigms such as Data Mesh and Lakehouse, with a proven track record of delivering enterprise-grade solutions.

Key Responsibilities
- Lead the architecture, design, and implementation of data platforms including Data Lakes, Data Warehouses, and Lakehouses on AWS, Azure, and GCP.
- Define and implement data strategies, governance, and best practices for data ingestion, transformation, and consumption.
- Collaborate with cross-functional teams including business stakeholders, data scientists, and engineering teams to deliver robust data solutions.
- Provide technical leadership in pre-sales engagements, RFP responses, and solutioning for new business opportunities.
- Mentor and guide data engineering teams, fostering a culture of innovation and continuous learning.
- Drive the adoption of modern data architecture principles such as Data Mesh and real-time streaming.
- Evaluate and recommend tools, frameworks, and platforms to enhance data capabilities.

Required Skills & Qualifications
- 15+ years of experience in data engineering and architecture roles.
- Strong hands-on experience with cloud platforms: AWS, Azure, GCP.
- Expertise in tools and technologies such as Snowflake, Databricks, ElasticSearch, Kafka, Informatica, Pentaho, Apache Spark, Hive.
- Proficiency in Python, SQL, PL/SQL, and real-time data processing (CDC, Debezium, Kafka).
- Deep understanding of Data Lake, Data Warehouse, Data Mesh, and Lakehouse architectures.
- Experience in leading large-scale data migration and modernization projects.

Equal employment opportunity information: KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

Qualifications: Any engineering degree or postgraduate qualification

Posted 1 week ago

Apply

2.0 - 5.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

What you will do

Let’s do this. Let’s change the world. In this vital role you will be responsible for working on data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. This role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.

- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses or data lakes.
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems.
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements (see the sketch after this posting).
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you

Master’s degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Basic Qualifications:
- Strong expertise in ETL development, data integration and managing complex ETL workflows, performance tuning, and debugging.
- Strong proficiency in SQL for querying databases, writing scripts, and troubleshooting ETL processes.
- Understanding of data modeling concepts, various schemas and normalization.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience working in a DevOps environment, which involves designing, developing and maintaining software applications and solutions that meet business needs.

Preferred Qualifications:
- Extensive experience in Informatica PowerCenter or Informatica Cloud for data integration and ETL development.

Professional Certifications:
- SAFe® for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

Shift Information: This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
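As a concrete illustration of the "cleanse, enrich, filter, and aggregate" transformation work described above, here is a minimal SQL sketch of a staging-to-summary load. Every table and column name is an assumption made up for the example, and in this role the same logic would typically live in an Informatica mapping rather than hand-written SQL.

```sql
-- Hypothetical staging-to-summary transformation; all names are illustrative.
INSERT INTO sales_summary (region, sale_month, order_count, total_amount)
SELECT
    UPPER(TRIM(region))             AS region,       -- cleanse: normalize case/whitespace
    DATE_TRUNC('month', order_date) AS sale_month,
    COUNT(*)                        AS order_count,
    SUM(COALESCE(amount, 0))        AS total_amount  -- enrich: default missing amounts
FROM staging_orders
WHERE order_status <> 'CANCELLED'                    -- filter: drop cancelled orders
  AND order_date IS NOT NULL
GROUP BY UPPER(TRIM(region)), DATE_TRUNC('month', order_date);  -- aggregate
```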

Posted 1 week ago

Apply

2.0 - 5.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

What you will do

Let’s do this. Let’s change the world. In this vital role you will be responsible for working on data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. This role requires performing data transformation tasks to ensure data accuracy and integrity, working closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and automating operations, monitoring system health, and responding to incidents to minimize downtime.

- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses or data lakes.
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems.
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements.
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you

Bachelor’s degree and 0 to 3 years of Computer Science, IT or related field experience OR Diploma and 4 to 7 years of Computer Science, IT or related field experience.

Basic Qualifications:
- Expertise in ETL development, data integration and managing complex ETL workflows, performance tuning, and debugging.
- Proficient in SQL for querying databases, writing scripts, and troubleshooting ETL processes.
- Understanding of data modeling concepts, various schemas and normalization.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience working in a DevOps environment, which involves designing, developing and maintaining software applications and solutions that meet business needs.

Preferred Qualifications:
- Expertise in Informatica PowerCenter or Informatica Cloud for data integration and ETL development.

Professional Certifications:
- SAFe® for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

Shift Information: This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work during evening shifts, as required based on business requirements.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.

Posted 1 week ago

Apply

3.0 - 6.0 years

1 - 4 Lacs

Hyderabad

Work from Office

Source: Naukri

Master Data Analyst - Material & Production

What you will do

Let’s do this. Let’s change the world. In this vital role, the Master Data Analyst at Amgen will support the accuracy and consistency of master data (Material, Production, Quality, Customer, Transportation and/or Plant) across the organization. This role will manage data validation, cleansing, and enrichment while collaborating with teams to resolve issues and ensure data integrity. The analyst will support key performance monitoring, data governance, and compliance efforts, as well as assist in data migration and integration projects. Candidates should have experience in enterprise applications like SAP or Oracle, familiarity with data governance frameworks and compliance standards, and strong analytical skills.

Roles & Responsibilities:
- Perform data operations tasks, mainly maintenance and validation, to ensure the accuracy and integrity of master data (see the validation sketch at the end of this posting).
- Support process optimization initiatives to improve data management workflows and enhance efficiency.
- Conduct data analysis to identify trends, discrepancies, and opportunities for improvement.
- Deliver training and support to partners, customers, and end-users on master data processes, tools, and standard methodologies.
- Maintain data quality reports to monitor performance metrics and ensure data compliance.
- Collaborate multi-functionally with business, IT, and operations teams to resolve data-related issues and ensure alignment with organizational goals.

What we expect of you

We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Bachelor’s degree in a STEM field and 3-5 years of experience in SAP ECC, master data management, data governance, or data operations, preferably in healthcare or biotech supply chains.
- Technical Proficiency: Experience in SAP/Oracle, Microsoft Office (Excel, PowerPoint), and other data management tools (e.g., Informatica, Oracle MDM).
- Analytical Skills: Solid ability to analyze large datasets and deliver actionable insights.
- Problem Solving: Skilled at identifying root causes of data issues and implementing effective solutions.
- Attention to Detail: High accuracy and attention to detail, with a solid focus on data quality.
- Communication: Excellent written and verbal communication skills, with the ability to communicate findings to both technical and non-technical partners.

Functional Skills:

Must-Have Skills:
- Working knowledge of SAP/Oracle
- Understanding of master data management processes, frameworks, and governance
- Proficiency in Excel and the MS Office Suite, with experience in data analysis
- Basic understanding of data governance frameworks and ensuring data accuracy and quality
- Good communication skills for presenting data insights to both technical and non-technical audiences

Good-to-Have Skills:
- SAP S/4, SAP MDG, SAP TM

Soft Skills:
- Good analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation, centered around data perfection
- Team-oriented, with a focus on achieving team goals

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
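The validation sketch referenced in the responsibilities above: a minimal SQL check that flags material master records sharing a manufacturer part number but disagreeing on base unit of measure, a typical consistency test run before remediation. The table and column names are assumptions for illustration, not an actual schema.

```sql
-- Hypothetical master data consistency check; all names are assumed.
SELECT
    manufacturer_part_no,
    COUNT(DISTINCT material_id) AS materials_sharing_part_no,
    COUNT(DISTINCT base_uom)    AS conflicting_units_of_measure
FROM material_master
GROUP BY manufacturer_part_no
HAVING COUNT(DISTINCT material_id) > 1   -- possible duplicate materials
    OR COUNT(DISTINCT base_uom)   > 1    -- inconsistent attributes
ORDER BY materials_sharing_part_no DESC;
```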

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

Let’s do this. Let’s change the world. In this vital role you will work closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:
- Take ownership of complex software projects from conception to deployment for Clinical Study Planning, Design Optimization & Start-up (CSPDS) systems.
- Design, develop and implement custom applications using Apex, JavaScript, Visualforce, AJAX, HTML, and CSS; apply standard methodologies and experience to build Salesforce.com applications.
- Handle day-to-day development activities on the Salesforce.com platform using Apex, Visualforce, Lightning Web Components and Aura.
- Manage software delivery scope, risk, and timeline.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Provide technical guidance and mentorship to junior developers.
- Contribute to both front-end and back-end development using cloud technology.
- Stay updated with the latest trends, advancements and standard processes for CSPDS systems.
- Maintain knowledge of trends in application development frameworks on Salesforce.com and related new technologies to recommend and deliver standard-methodology solutions.
- Analyze and understand the functional and technical requirements of CSPDS applications, solutions and systems and translate them into software architecture and design specifications.
- Develop innovative solutions using generative AI technologies.
- Conduct code reviews to ensure code quality and adherence to best practices.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software, following IS change control and the GxP validation process.
- Identify and resolve software bugs and performance issues.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you

We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree OR Master’s degree and 4 to 6 years of experience in Computer Science, IT or related field OR Bachelor’s degree and 6 to 8 years of experience in Computer Science, IT or related field OR Diploma and 10 to 12 years of experience in Computer Science, IT or related field

Preferred Qualifications:

Must-Have Skills:
- Proficiency in Salesforce.com configuration/customization
- Proficiency in writing SQL queries and in programming languages such as Python and JavaScript (preferred) or other programming languages
- Strong understanding of software development methodologies, including Agile and Scrum
- Proficient in designing efficient and scalable data structures within Salesforce
- Experience with version control systems like Git
- Ability to manage and mentor junior developers and guide projects

Good-to-Have Skills:
- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.)
- Knowledge of Artificial Intelligence (AI), Robotic Process Automation (RPA), Machine Learning (ML), and Natural Language Processing (NLP) automation technologies to meet business requirements
- Outstanding written and verbal communication skills, and ability to explain technical concepts to non-technical clients
- Sharp learning agility, problem solving and analytical thinking
- Experience managing GxP systems and implementing GxP projects
- Extensive expertise in SDLC, including requirements, design, testing, data analysis, change control
- Knowledge of reporting tools such as Tableau, Spotfire and Power BI
- Experience with API integrations such as MuleSoft
- Experience with ETL tools (Informatica, Databricks)

Professional Certifications:
- SAFe® for Teams certification (preferred)
- Certified Salesforce Developer/Administrator (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Willingness to learn new technologies

Posted 1 week ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Informatica Intelligent Cloud Services
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Informatica Intelligent Cloud Services.
- Strong understanding of application development methodologies.
- Experience with cloud-based application deployment and management.
- Familiarity with data integration and transformation processes.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Intelligent Cloud Services.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

About the Role: We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The ideal candidate should have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.

Key Responsibilities:
- Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
- Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
- Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance.
- Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Cloning, Time Travel, and Zero-Copy Cloning (see the sketch after this posting).
- Implement data ingestion strategies using Snowpipe, stages, and external tables.
- Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
- Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
- Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
- Suggest improvements and automation opportunities during migration.
- Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
- Maintain version control, documentation, and an audit trail for all converted workflows.

Required Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- Must have at least 4 years of hands-on experience in designing and developing scalable data solutions using the Snowflake Data Cloud platform.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
- Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
- Python programming experience focused on data engineering.
- Experience with data APIs and batch/stream processing.
- Solid understanding of data transformation logic: joins, unions, filters, formulas, aggregations, pivots, and transpositions.
- Experience in performance tuning and optimization of SQL queries in Snowflake.
- Familiarity with Snowflake features like CTEs, Window Functions, Tasks, Streams, Stages, and External Tables.
- Exposure to migration or modernization projects from ETL tools (like Alteryx/Informatica) to SQL-based cloud platforms.
- Strong documentation skills and attention to detail.
- Experience working in Agile/Scrum development environments.
- Good communication and collaboration skills.
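The sketch referenced above shows the Stream + Task pattern the posting names: a stream captures changes on a staging table, and a scheduled task merges them into the target. This is a minimal illustration only; every object name (tables, stream, task, warehouse) is an assumption.

```sql
CREATE OR REPLACE STREAM orders_stream ON TABLE staging_orders;

CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO orders_final AS t
  USING orders_stream AS s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, status, amount)
                        VALUES (s.order_id, s.status, s.amount);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```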

Posted 1 week ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

Experience: 5+ Years
Job Title: Platform Administrator / Data Platform Administrator

Job Summary: We are seeking a highly skilled Platform Administrator with expertise in cloud solutions, data platform management, and data pipeline orchestration. The ideal candidate will possess a strong background in AWS architecture, experience with tools like Airflow, Informatica IDMC, and Snowflake, and a proven ability to design, implement, and maintain robust data platforms that support scalable, secure, and cost-effective operations.

Key Responsibilities:
- Administer and manage cloud-based data platforms, ensuring high availability, scalability, and performance optimization.
- Architect and implement solutions on AWS to support data integration, transformation, and analytics needs.
- Leverage AWS services (such as EC2, S3, Lambda, RDS, and Redshift) to design and deploy secure and efficient cloud infrastructures.
- Manage and optimize data pipelines using Apache Airflow, ensuring reliable scheduling and orchestration of ETL processes.
- Oversee data integration and management processes using Informatica IDMC for data governance, quality, and privacy in the cloud.
- Administer and optimize Snowflake data warehousing solutions for querying, storage, and data analysis (see the sketch after this posting).
- Collaborate with cross-functional teams to ensure seamless data flow, system integration, and business intelligence capabilities.
- Ensure compliance with industry standards and best practices for cloud security, data privacy, and cost management.

Required Qualifications:
- AWS Certified Solutions Architect - Professional or equivalent.
- Strong experience in platform administration and cloud architecture.
- Hands-on experience with Airflow, Informatica IDMC, and Snowflake.
- Proficiency in designing, deploying, and managing cloud-based solutions in AWS.
- Familiarity with data integration, ETL processes, and data governance best practices.
- Knowledge of scripting languages (Python, Shell, etc.) for automation and workflow management.
- Excellent troubleshooting, problem-solving, and performance optimization skills.
- Strong communication skills and the ability to collaborate with teams across technical and non-technical disciplines.

Preferred Qualifications:
- Experience in multi-cloud environments.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Familiarity with data visualization tools and business intelligence platforms.
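As a sketch of the Snowflake administration work listed above, the statements below cap monthly credit spend with a resource monitor and attach it to an auto-suspending warehouse, one common cost-management lever. The names and quotas are assumptions for illustration.

```sql
CREATE OR REPLACE RESOURCE MONITOR etl_monthly_monitor
  WITH CREDIT_QUOTA = 100                 -- assumed monthly budget
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80  PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

CREATE OR REPLACE WAREHOUSE etl_wh
  WAREHOUSE_SIZE   = 'MEDIUM'
  AUTO_SUSPEND     = 300                  -- seconds idle before suspending
  AUTO_RESUME      = TRUE
  RESOURCE_MONITOR = etl_monthly_monitor;
```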

Posted 1 week ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

- Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Must have minimum 6 years of relevant experience in IT.
- Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
- Proficiency in DBT (Data Build Tool) for data transformation and modelling.
- Experience with ETL/ELT processes and integrating data from multiple sources.
- Experience in designing Tableau dashboards, data visualizations, and reports.
- Familiarity with data warehousing concepts and best practices.
- Strong problem-solving skills and ability to work in cross-functional teams.

Posted 1 week ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Source: Naukri

Job ID: 197269
Required Travel: No Travel
Managerial: No
Location: India - Pune (Amdocs Site)

In one sentence: Analyzes and reconciles Compensation & Benefits (C&B) data, and executes regional and/or global C&B policies and processes.

All you need is...

Education:
- Bachelor's degree holder / Master's degree holder (full-time degree)
- Master's in accounting, finance, economics, or other quantitative fields preferred

Experience:
- Minimum 3-6 years of work experience in Sales Commissions or relevant areas like Finance, HR, etc.
- Experience in Sales Performance Management and Incentive Compensation plan design, modeling, operations, and reporting.
- Experience in compensation reporting on tools such as Xactly, Varicent, etc. is a big plus.

Skills:
- Proficiency with Microsoft Excel, including advanced formulas and functions
- Strong analytical and problem-solving skills
- Strong interpersonal and communication skills, with the ability to collaborate across multiple departments
- Fluent in English, both written and spoken
- Data extraction, manipulation, and analysis of one-time and periodic requests from sales leadership (primarily in Excel)
- Eagerness to learn and build competencies in new software, tools, and techniques

Responsibilities:
- Gather an in-depth understanding of existing compensation and reporting capabilities across business units
- Drive operational excellence and improvements by working with various stakeholders
- Collaborate with sales leadership to extract, manipulate, and analyze data for one-time and periodic requests
- Design, model, and report on incentive compensation plans
- Ensure accurate and timely compensation reporting using tools such as Xactly, Varicent, etc.
- Demonstrate a positive, can-do attitude and work effectively as a team player

Attributes:
- A team player with a positive, can-do attitude
- Demonstrating commitment and a results-oriented attitude
- Taking ownership and making required judgments whenever necessary

Why you will love this job:
- Be part of the core unit that will define, shape and organize the compensation and benefits structure of the company
- Discover and hone your skills
- Be surrounded by talented HR professionals who collaborate towards a common goal

What will your job look like:
- You will implement Compensation and/or Benefits strategy
- You will execute regional and/or global C&B processes to support and align with the business's needs and strategy
- You will support the implementation of C&B policies and procedures
- You will analyze complex data and provide performance and compensation & benefits analyses and simulations on individuals or groups to support decision making
- You will provide ongoing support to the business on the various aspects of compensation & benefits
- You will maintain ongoing contact with external compensation and benefits vendors
- You will be involved with ad hoc projects such as restructuring activities, mergers & acquisitions, and rebadge transfers

Who are we: Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both the individual end user and enterprise customers. Our approximately 30,000 employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations.
Listed on the NASDAQ Global Select Market, Amdocs had revenue of $4.89 billion in fiscal 2023. Amdocs is an equal opportunity employer. We welcome applicants from all backgrounds and are committed to fostering a diverse and inclusive workforce.

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

Job description

Job Title: ETL Tester

Job Responsibilities:
- Design and execute test cases for ETL processes to validate data accuracy and integrity (see the sketch after this posting).
- Collaborate with data engineers and developers to understand ETL workflows and data transformations.
- Use Tableau to create visualizations and dashboards that help in data analysis and reporting.
- Work with Snowflake to test and validate data stored in the cloud data warehouse.
- Identify, document, and track defects and issues in the ETL process.
- Perform data profiling and data quality assessments.
- Create and maintain test documentation, including test plans, test scripts, and test results.
- Exposure to Salesforce and proficiency in developing SQL queries.

The ideal candidate will have a strong background in ETL processes, data validation, and experience with Tableau and Snowflake. You will be responsible for ensuring the quality and accuracy of data as it moves through the ETL pipeline.
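The sketch referenced above: a minimal source-to-target reconciliation of the kind ETL testers script, comparing row counts and an amount checksum between a staging source and the warehouse target. The table and column names are assumptions for illustration.

```sql
WITH src AS (
    SELECT COUNT(*) AS row_count, SUM(amount) AS amount_sum
    FROM staging_orders                     -- assumed source table
),
tgt AS (
    SELECT COUNT(*) AS row_count, SUM(amount) AS amount_sum
    FROM dw_orders                          -- assumed target table
)
SELECT
    src.row_count  AS source_rows,
    tgt.row_count  AS target_rows,
    src.amount_sum AS source_amount,
    tgt.amount_sum AS target_amount,
    CASE WHEN src.row_count = tgt.row_count
          AND src.amount_sum = tgt.amount_sum
         THEN 'PASS' ELSE 'FAIL' END AS result
FROM src CROSS JOIN tgt;
```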

Posted 1 week ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 6+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Must have exposure to working with Airflow.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.

Posted 1 week ago

Apply

10.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

Source: Naukri

Position: Salesforce Solution Architect
Location: Bangalore/Pune (Hybrid)
Experience: 10+ Years

We are seeking a talented and experienced Salesforce Solution Architect. In this role you will design, architect, and implement innovative, comprehensive Salesforce solutions for our clients. You will translate business requirements into technical solutions with your deep knowledge of Salesforce and passion for innovation, and collaborate with clients and developers to bring ideas to life.

Key Responsibilities:

  • Solution Design & Architecture: Design scalable and robust Salesforce solutions, including system integrations and data migrations, to meet business requirements.
  • Technical Leadership: Provide technical leadership and guidance to the development team, ensuring best practices and standards are followed.
  • Stakeholder Management: Collaborate with business stakeholders to understand their requirements, translate them into technical specifications, and ensure successful implementation.
  • Customization & Configuration: Customize and configure Salesforce applications to build workflows, custom objects, fields, and validation rules.
  • Integration: Design and implement integrations between Salesforce and other systems using APIs, middleware, and other technologies.
  • Data Management: Ensure data integrity and consistency across Salesforce and related systems through effective data management strategies.
  • Project Management: Lead project planning, execution, and delivery, ensuring projects are completed on time and within budget. Work closely with UX/UI designers, business analysts, and other stakeholders to translate business requirements into technical specifications and intuitive designs.
  • Documentation: Create and maintain comprehensive documentation of designs, configurations, and project deliverables.
  • Training & Support: Provide training and support to end-users and junior team members.

Required Skills:

  • Proven experience in understanding complex business roadmaps and solution thinking.
  • Solution Design: Develop robust and scalable Salesforce solutions to address complex business requirements, considering best practices and industry standards. Design and develop custom Lightning Web Components to enhance the Salesforce user experience.
  • Architectural Blueprints: Create high-level architectural blueprints that outline the proposed Salesforce solution, including integrations and customizations.
  • Proof of Concept (PoC): Develop and present PoCs to demonstrate the feasibility and value of proposed solutions.
  • Customization Proposals: Suggest and design custom solutions using Salesforce Lightning Web Components, the Lightning Design System, and other front-end technologies.
  • Architecture: Design and document solution architecture, including customizations, data modeling, data migration, data sharing and security, multi-component solutions, Salesforce configuration, connected apps, user experience (UX), environment management, and deployments, while leading a team of consultants.
  • Technical Skills: Proficiency in Salesforce configuration, Apex, Visualforce, Lightning Components, and SOQL.
  • Strong understanding of the Salesforce data model, security, and sharing rules.
  • Strong experience with Salesforce integration tools (e.g., MuleSoft, Informatica) and APIs (REST/SOAP).
  • Proficiency with web technologies (HTML, CSS, JavaScript).
  • Deep understanding of the Salesforce platform, encompassing Sales Cloud, Service Cloud, Marketing Cloud, Education Cloud, Health Cloud, Data Cloud, and the broader Salesforce product suite.
  • Quality Assurance: Ensure the quality of Salesforce solutions by thoroughly testing and validating configurations and customizations.
  • Client Collaboration: Work closely with clients to understand their needs and objectives, translating them into effective, scalable solutions through discovery workshops, requirement gathering, solution demos, and technical presentations.
  • Documentation: Create and maintain comprehensive documentation for proposed solutions, including architecture diagrams, data flows, integration points, and best practices.
  • Technical Specifications: Write detailed technical specifications and scope documents that outline the proposed solution.
  • Effort Estimation: Provide accurate estimates of the time, cost, and resources required to implement the proposed solution.
  • RFP/RFI Responses: Assist in responding to Requests for Proposals (RFPs) and Requests for Information (RFIs) with technical and architectural insights.
  • Training and Support: Provide training and ongoing support to clients and internal teams to ensure successful adoption and maintenance of Salesforce solutions.
  • Communication: Outstanding written and verbal communication; effectively communicate with stakeholders and team members; strong organizational, documentation, and interpersonal skills.
  • Leadership & Mentorship: Mentor junior developers and provide guidance on best practices for LWC, SLDS, and front-end development within the Salesforce ecosystem.

Qualifications:

  • Bachelor's degree in computer science, information technology, or a related field.
  • 10 to 20 years of hands-on experience, with at least 3 years as a Salesforce Solution Architect with implementation experience.
  • Certifications: Salesforce Application Architect, System Architect, Sales Cloud, Service Cloud, Education Cloud, Health Cloud, Marketing Cloud, CPQ, Data Cloud, etc.
  • Experience working with enterprise clients is a must.

About Us: Relanto.ai is a cutting-edge IT services company at the forefront of revolutionizing business operations through Intelligent Automation, Digital Transformation, and harnessing the power of Data and AI. Our passion lies in creating solutions that empower businesses to achieve unprecedented efficiency, innovation, and growth. To learn more about us and our culture, visit www.relanto.ai & https://www.linkedin.com/company/relanto-inc/
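Since the role stresses REST/SOAP integrations and SOQL, here is a minimal, hedged sketch of querying Salesforce through its REST API from Python. The instance URL and token are hypothetical placeholders; a real integration would obtain the token through an OAuth 2.0 flow, and the API version would match the org.

# Hypothetical Salesforce REST query; URL and token are placeholders.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"  # hypothetical org
ACCESS_TOKEN = "<oauth-access-token>"               # obtain via OAuth 2.0

def query_accounts(soql: str) -> list[dict]:
    """Run a SOQL query through the Salesforce REST API and return records."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v58.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

if __name__ == "__main__":
    for record in query_accounts("SELECT Id, Name FROM Account LIMIT 5"):
        print(record["Id"], record["Name"])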

Posted 1 week ago

Apply

9.0 - 14.0 years

22 - 35 Lacs

Gurugram, Bengaluru, Delhi / NCR

Work from Office


Requirement: Data Architect & Business Intelligence
Experience: 9+ Years
Location: Gurgaon (Remote)
Preferred: Immediate Joiners

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:

  • Design and implement scalable and efficient data architecture solutions for enterprise applications.
  • Develop and maintain robust data models that support business intelligence and analytics.
  • Build data warehouses to support structured and unstructured data storage needs.
  • Optimize data pipelines, ETL processes, and real-time data processing.
  • Work with business stakeholders to define data strategies that support analytics and reporting.
  • Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
  • Establish and enforce data governance, security policies, and best practices.
  • Conduct performance tuning and optimization for large-scale databases and data processing systems.
  • Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:

  • Strong experience in data architecture, data warehousing, and data modeling.
  • Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
  • Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
  • Experience in designing scalable, high-performance, and secure data environments.
  • Ability to work with big data frameworks and BI tools for reporting and visualization.
  • Strong analytical, problem-solving, and communication skills.
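As a small illustration of the Snowflake-centric pipeline work described above, the sketch below upserts staged rows into a curated table with a Snowflake MERGE using the snowflake-connector-python package. The account, credentials, and table names are hypothetical placeholders, not a prescribed design.

# Hypothetical incremental upsert step; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # hypothetical account locator
    user="ETL_SVC",
    password="<secret>",           # use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Merge staged customer rows into the curated dimension table.
    cur.execute("""
        MERGE INTO analytics.curated.dim_customer AS tgt
        USING analytics.staging.customer_stream AS src
            ON tgt.customer_id = src.customer_id
        WHEN MATCHED THEN UPDATE SET tgt.email = src.email
        WHEN NOT MATCHED THEN INSERT (customer_id, email)
            VALUES (src.customer_id, src.email)
    """)
finally:
    conn.close()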

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 7 Lacs

Hyderabad

Work from Office


What you will do: In this vital role on the Veeva Vault team you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:

  • Possess strong rapid prototyping skills and quickly translate concepts into working code.
  • Lead day-to-day operations and maintenance of Amgen's R&D Veeva Vaults and hosted applications.
  • Stay updated with the latest trends, advancements, and standard practices for the Veeva Vault Platform ecosystem.
  • Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
  • Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
  • Develop and implement unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process and demonstrating expertise in Risk-Based Validation methodology.
  • Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
  • Maintain detailed documentation of software designs, code, and development processes.
  • Integrate with other systems and platforms to ensure seamless data flow and functionality.
  • Stay up to date on Veeva Vault features, new releases, and standard practices around Veeva Platform governance.

Basic Qualifications:

  • Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
  • Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
  • Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Must-Have Skills:

  • Experience with the Veeva Vault Platform and products, including Veeva configuration settings and custom builds.
  • Strong knowledge of information systems and network technologies.
  • 6-8 years of experience working in the global pharmaceutical industry.
  • Experience in building configured and custom solutions on the Veeva Vault Platform.
  • Experience in managing systems and in implementing and validating projects in GxP-regulated environments.
  • Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
  • Proficiency in programming languages such as Python and JavaScript.
  • Solid understanding of software development methodologies, including Agile and Scrum.
  • Experience with version control systems such as Git.

Preferred Qualifications:

  • Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL).
  • Outstanding written and verbal communication skills, and the ability to translate technical concepts for non-technical audiences.
  • Experience with ETL tools (Informatica, Databricks).
  • Experience with API integrations such as MuleSoft.
  • Proficiency in writing SQL queries.
  • Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.

Professional Certifications:

  • Veeva Vault Platform Administrator or equivalent Vault certification (Mandatory)
  • SAFe for Teams (Preferred)

Soft Skills:

  • Excellent analytical and troubleshooting skills.
  • Strong verbal and written communication skills.
  • Ability to work effectively with global, virtual teams.
  • Team-oriented, with a focus on achieving team goals.
  • Strong presentation and public speaking skills.

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
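Because this role leans on Veeva Vault platform scripting, here is a hedged sketch of authenticating against the Vault REST API and listing documents from Python. The vault DNS name and API version are hypothetical placeholders; consult your Vault's API documentation for the exact version and endpoints.

# Hypothetical Vault REST integration; DNS and version are placeholders.
import requests

VAULT = "https://example.veevavault.com"  # hypothetical vault DNS
API = f"{VAULT}/api/v23.1"                # assumed API version

def get_session_id(username: str, password: str) -> str:
    """Authenticate and return a Vault session id."""
    resp = requests.post(
        f"{API}/auth",
        data={"username": username, "password": password},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["sessionId"]

session_id = get_session_id("svc.user@example.com", "<secret>")
# List document metadata using the session id as the Authorization header.
docs = requests.get(
    f"{API}/objects/documents",
    headers={"Authorization": session_id},
    timeout=30,
)
print(docs.json())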

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office


What you will do: In this vital role on the Veeva Vault team you will be responsible for designing, developing, and maintaining software applications and solutions in Amgen's Vault PromoMats and Vault MedComm that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:

  • Possess strong rapid prototyping skills and quickly translate concepts into working code.
  • Lead day-to-day operations and maintenance of Amgen's Vault PromoMats and Vault MedComm and their hosted applications.
  • Stay updated with the latest trends, advancements, and standard practices for the Veeva Vault Platform ecosystem.
  • Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
  • Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
  • Develop and implement unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process and demonstrating expertise in Risk-Based Validation methodology.
  • Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
  • Maintain detailed documentation of software designs, code, and development processes.
  • Integrate with other systems and platforms to ensure seamless data flow and functionality.
  • Stay up to date on Veeva Vault features, new releases, and standard practices around Veeva Platform governance.

Basic Qualifications:

  • Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
  • Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
  • Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Must-Have Skills:

  • Experience with Amgen's Vault PromoMats and Vault MedComm, including Veeva configuration settings and custom builds.
  • Strong knowledge of information systems and network technologies.
  • 6-8 years of experience working in the global pharmaceutical industry.
  • Experience in building configured and custom solutions on the Veeva Vault Platform.
  • Experience in managing systems and in implementing and validating projects in GxP-regulated environments.
  • Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
  • Proficiency in programming languages such as Python and JavaScript.
  • Solid understanding of software development methodologies, including Agile and Scrum.
  • Experience with version control systems such as Git.

Preferred Qualifications:

  • Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL).
  • Outstanding written and verbal communication skills, and the ability to translate technical concepts for non-technical audiences.
  • Experience with ETL tools (Informatica, Databricks).
  • Experience with API integrations such as MuleSoft.
  • Proficiency in writing SQL queries.
  • Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.

Professional Certifications:

  • Veeva Vault Platform Administrator or equivalent Vault certification (Must-Have)
  • SAFe for Teams (Preferred)

Soft Skills:

  • Excellent analytical and troubleshooting skills.
  • Strong verbal and written communication skills.
  • Ability to work effectively with global, virtual teams.
  • Team-oriented, with a focus on achieving team goals.
  • Strong presentation and public speaking skills.

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
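Given the emphasis on unit testing under GxP change control, the sketch below shows the kind of small, self-contained pytest check such work relies on. The function under test is a hypothetical stand-in, not an actual Vault PromoMats rule.

# Hypothetical unit under test plus pytest checks; run with `pytest`.
import pytest

def classify_claim(status: str) -> str:
    """Map a hypothetical Vault document status to a review bucket."""
    mapping = {"Draft": "in_progress", "Approved": "released"}
    if status not in mapping:
        raise ValueError(f"unknown status: {status}")
    return mapping[status]

def test_known_statuses():
    assert classify_claim("Draft") == "in_progress"
    assert classify_claim("Approved") == "released"

def test_unknown_status_rejected():
    with pytest.raises(ValueError):
        classify_claim("Obsolete")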

Posted 1 week ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office


Requirement: Senior Business Analyst (Data Application & Integration)
Experience: 10+ Years
Location: Gurgaon (Hybrid)
Budget Max: 35 LPA
Preferred: Immediate Joiners

Job Summary: We are seeking an experienced Senior Business Analyst (Data Application & Integration) to drive key data and integration initiatives. The ideal candidate will have a strong business analysis background and a deep understanding of data applications, API integrations, and cloud-based platforms such as Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:

  • Gather, document, and analyze business requirements for data application and integration projects.
  • Work closely with business stakeholders to translate business needs into technical solutions.
  • Design and oversee API integrations to ensure seamless data flow across platforms.
  • Collaborate with cross-functional teams including developers, data engineers, and architects.
  • Define and maintain data integration strategies, ensuring high availability and security.
  • Work on Salesforce, Informatica, and Snowflake to streamline data management and analytics.
  • Develop use cases, process flows, and documentation to support business and technical teams.
  • Ensure compliance with data governance and security best practices.
  • Act as a liaison between business teams and technical teams, providing insights and recommendations.

Key Skills & Requirements:

  • Strong expertise in business analysis methodologies and data-driven decision-making.
  • Hands-on experience with API integration and data application management.
  • Proficiency in Salesforce, Informatica, DBT, IICS, and Snowflake.
  • Strong analytical and problem-solving skills.
  • Ability to work in an Agile environment and collaborate with multi-functional teams.
  • Excellent communication and stakeholder management skills.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

2 - 5 Lacs

Hyderabad

Work from Office


What you will do: In this vital role you will design, develop, and maintain software applications and solutions in Amgen's Veeva Vault RIM to meet business requirements and ensure the availability and performance of critical systems and applications. This role involves collaborating with product managers, designers, and other engineers to create scalable software solutions, automate operations, monitor system health, and address incidents to reduce downtime.

Roles & Responsibilities:

  • Possess strong rapid prototyping skills and quickly translate concepts into working code.
  • Lead day-to-day operations and maintenance of Amgen's Veeva Vault RIM and hosted applications.
  • Stay updated with the latest trends, advancements, and standard practices for the Veeva Vault Platform ecosystem.
  • Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
  • Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
  • Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
  • Develop and implement unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process and demonstrating expertise in Risk-Based Validation methodology.
  • Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
  • Maintain detailed documentation of software designs, code, and development processes.
  • Integrate with other systems and platforms to ensure seamless data flow and functionality.
  • Stay up to date on Veeva Vault features, new releases, and standard methodologies around Veeva Platform governance.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:

  • Master's degree and 4 to 6 years of Computer Science, IT, or related field experience; OR
  • Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience; OR
  • Diploma and 10 to 12 years of Computer Science, IT, or related field experience

Must-Have Skills:

  • Experience with Veeva Vault RIM and its applications, including Veeva configuration settings and custom builds.
  • Strong knowledge of data lake technologies such as Databricks.
  • Experience in MuleSoft, Python scripting, and REST API script development.
  • Extensive knowledge of enterprise architecture frameworks and technologies, and 6-8 years of experience working in the global pharmaceutical industry.
  • Experience in building configured and custom solutions on the Veeva Vault Platform.
  • Experience in handling systems and in implementing and validating projects in GxP-regulated environments.
  • Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and creating and leading change controls.
  • Proficiency in programming languages such as Python and JavaScript.
  • Strong understanding of software development methodologies, including Agile and Scrum.
  • Experience with version control systems such as Git.

Good-to-Have Skills:

  • Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL).
  • Outstanding written and verbal communication skills, and the ability to translate technical concepts for non-technical audiences.
  • Experience with ETL tools (Informatica, Databricks).
  • Experience with API integrations such as MuleSoft.
  • Proficiency in writing SQL queries.
  • Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.

Professional Certifications:

  • Veeva Vault Platform Administrator or equivalent Vault certification (Mandatory)
  • SAFe for Teams (Preferred)

Soft Skills:

  • Excellent analytical and problem-solving skills.
  • Strong verbal and written communication skills.
  • Ability to work effectively with global, virtual teams.
  • Team-oriented, with a focus on achieving team goals.
  • Strong presentation and public speaking skills.

Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
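Reflecting the REST API scripting this role calls out, here is a minimal Python sketch of a GET with retry and exponential backoff, a common pattern when integrating Vault RIM with downstream systems. The endpoint URL is a hypothetical placeholder.

# Hypothetical resilient GET; the URL is a placeholder endpoint.
import time

import requests

def get_with_retry(url: str, retries: int = 3, backoff: float = 2.0) -> dict:
    """Fetch JSON from an API, retrying transient failures with backoff."""
    for attempt in range(1, retries + 1):
        try:
            resp = requests.get(url, timeout=30)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt == retries:
                raise  # out of retries: surface the last error
            time.sleep(backoff ** attempt)

data = get_with_retry("https://api.example.com/v1/regulatory-objects")
print(data)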

Posted 2 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office


Job Location: Working full time from the Strategy Pune office.

We are seeking a highly skilled and experienced Senior ETL Developer to join our dynamic team. This role is crucial in ensuring the integrity, usability, and performance of our data solutions. The ideal candidate will have extensive experience with ETL processes, database design, and Informatica PowerCenter/IICS.

Key Responsibilities:

  • ETL Development and Maintenance: Engage with stakeholders to understand business objectives and design effective ETL processes aligned with organizational goals. Maintain existing ETL processes, ensuring data accuracy and adequate process performance.
  • Data Warehouse Design & Development: Develop and maintain essential database objects, including tables, views, and stored procedures, to support data analysis and reporting functions. Proficiently use SQL queries to retrieve and manipulate data as required.
  • Data Quality and Analysis: Analyze datasets to identify gaps, inconsistencies, and other quality issues, and devise strategic solutions to enhance data quality. Implement data quality improvement strategies to ensure the accuracy and reliability of data.
  • Performance Optimization: Investigate and resolve database and query performance issues to ensure optimal system functionality. Continuously monitor system performance and make recommendations for improvements.
  • Business Collaboration: Collaborate with business users to gather comprehensive data and reporting requirements. Facilitate user-acceptance testing in conjunction with the business, resolving any issues that arise.

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Minimum of 5 years of hands-on experience with Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS).
  • Proven expertise in designing, implementing, and managing ETL processes and data warehouses.
  • Proficiency with SQL and experience in optimizing queries for performance.
  • Strong analytical skills with the ability to diagnose data issues and recommend comprehensive solutions.
  • Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams.
  • Detail-oriented with strong problem-solving capabilities.
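As a taste of the automation that often surrounds Informatica PowerCenter operations, the sketch below generates a session-scoped parameter file in the standard [folder.WF:workflow.ST:session] form. The folder, workflow, session, and parameter names are hypothetical placeholders.

# Hypothetical PowerCenter parameter-file generator; names are placeholders.
from datetime import date

def write_param_file(path: str, load_date: date) -> None:
    """Write a session-scoped parameter file for a daily load."""
    content = (
        "[SALES_DW.WF:wf_daily_load.ST:s_m_load_orders]\n"
        f"$$LoadDate={load_date:%Y-%m-%d}\n"
        "$$SourceSystem=ORDERS\n"
    )
    with open(path, "w") as fh:
        fh.write(content)

write_param_file("wf_daily_load.param", date.today())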

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Job Title: ADF + Databricks
Number of Openings: 4
Duration of Contract: 6 months+
Total Experience: 8+
Relevant Experience in Mandatory Skills: 5+

Job Description:

  • High proficiency and experience in designing/developing data analytics and data warehouse solutions with Azure Databricks, Azure Data Factory, services related to Azure Analytics, Azure SQL, Azure Function App, and Logic App.
  • Experience in designing large data distribution and integration with service-oriented architecture and/or data warehouse solutions.
  • Proficient coding experience using Spark (Scala/Python) and T-SQL.
  • Prior ETL development experience using an industry tool, e.g., Informatica, SSIS, or Talend.
  • A proven track record of cloud work experience (MS Azure platform preferred) in an agile SDLC environment, leveraging modern programming languages, DevOps, and test-driven development.
  • Azure Databricks/Data Factory/Data Lake/DevOps/Spark/Scala/Python.
  • Production-grade work experience with Python.
  • Good understanding of core deep learning theory, language models, and Transformers.
  • Solid understanding of LLM fundamentals: prompting, RAG, function calling, agentic workflows.

Vendor Billing Range: INR 9000-11000/day
Specific Work Location & Mode of Work: Pune only
Client Interview (Yes/No): Yes
Background Verification Outcome Needed (Before or After Onboarding): Pre-Onboarding
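For the Spark (Python) portion of this role, here is a minimal PySpark sketch of a Databricks-style aggregation from raw CSV to a Delta table. The storage paths and column names are hypothetical, and writing Delta assumes a Databricks or Delta-enabled Spark runtime.

# Hypothetical daily revenue aggregation; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

# Read raw orders landed in the (hypothetical) raw container.
orders = spark.read.option("header", True).csv(
    "abfss://raw@account.dfs.core.windows.net/orders/"
)

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Persist to the curated zone as Delta (requires a Delta-enabled runtime).
daily_revenue.write.mode("overwrite").format("delta").save(
    "abfss://curated@account.dfs.core.windows.net/daily_revenue/"
)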

Posted 2 weeks ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced; a worked sketch follows this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
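
For the slowly-changing-dimensions question above, here is a hedged, plain-Python illustration of the SCD Type 2 pattern: expire the current row and insert a new version when a tracked attribute changes. In Informatica this is typically built with Lookup and Update Strategy transformations; the code below only demonstrates the logic, and all names are hypothetical.

# Minimal SCD Type 2 illustration; table and column names are placeholders.
from datetime import date

dim_customer = [  # current state of the dimension
    {"customer_id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim: list[dict], incoming: dict, as_of: date) -> None:
    """Expire the current version if attributes changed, then insert anew."""
    current = next((r for r in dim
                    if r["customer_id"] == incoming["customer_id"]
                    and r["is_current"]), None)
    if current and current["city"] == incoming["city"]:
        return  # no change, nothing to do
    if current:
        current["valid_to"], current["is_current"] = as_of, False
    dim.append({**incoming, "valid_from": as_of,
                "valid_to": None, "is_current": True})

apply_scd2(dim_customer, {"customer_id": 1, "city": "Hyderabad"}, date.today())
for row in dim_customer:
    print(row)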

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
