Home
Jobs

2335 Informatica Jobs - Page 15

JobPe aggregates results for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

1 - 6 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Source: Naukri

Informatica L2/L3 Support
Experience: 8+ yrs
Job Location: Pan India (Hybrid)
Notice Period: Only immediate joiners
Shift Timings: 6 AM to 3:30 PM & 12:30 PM to 9:30 PM
Mandatory Skills: L2/L3 Support, Informatica, Tableau, UNIX, SQL
Must have 5 or more years of hands-on development experience and the ability to deep dive into issues in the related technologies: Informatica 10.5.6 (PowerCenter), Tableau, Unix/Linux shell scripting, PL/SQL.
Nice-to-have skills: SQL Developer/Toad, PuTTY, Control-M scheduling, Teradata SQL Assistant, Teradata Viewpoint, SAP, etc.
Release/deployment, operational readiness, and production governance experience is a must.
Interested candidates can share their CV to kalyan.v@talent21.in

Posted 5 days ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

What you will do
We are seeking a detail-oriented and proactive Data Steward to support our pharmaceutical data governance initiatives. The ideal candidate will ensure the quality, consistency, integrity, and security of data across our systems, particularly within commercial, regulatory, R&D, and supply chain functions. You will play a key role in managing master data, supporting compliance (e.g., GxP, GDPR), and facilitating data-driven decision-making across the organization.
- Maintain and monitor master data for key domains (e.g., customer, product, vendor, material)
- Identify, investigate, and remediate data issues using profiling tools and dashboards
- Define and enforce data definitions, naming conventions, and standard operating procedures
- Identify and correct data quality issues and anomalies using established data validation protocols
- Implement data cleansing and enrichment processes
- Support company-wide data governance policies and frameworks
- Ensure data complies with internal standards and external regulations (e.g., FDA, EMA, GDPR)
- Work closely with regulatory and quality teams to ensure alignment with compliance standards
- Use statistical tools and modeling techniques to interpret trends, patterns, and correlations
- Maintain data pipelines and manage data cleaning processes
- Help identify data gaps and recommend solutions for improvement
- Ensure data accuracy, consistency, and integrity across systems
- Analyze large and complex datasets from internal and external sources (e.g., clinical trials, sales, market data)
- Serve as a liaison between business units (e.g., marketing, regulatory, supply chain) and IT
- Collaborate with data owners and SMEs to define and enforce data standards
- Provide training and support to end-users on data management best practices
- Maintain the accuracy and completeness of master data in key systems (e.g., SAP, Veeva, Oracle)
- Support data migration and integration projects, ensuring clean and standardized data input
- Assist in data extraction, reporting, and analytics to support business operations and decision-making
- Generate periodic data quality reports and metrics

Basic Qualifications:
- Degree in Computer Science, Data Management, Engineering, or a related field, with 2-5 years of software development experience
- 2-5 years of experience in a Data Steward, Data Analyst, or related role in the pharmaceutical industry
- Experience with data systems such as SAP, Veeva, Salesforce, Oracle, or Informatica is a plus
- Excellent communication and collaboration skills

Preferred Qualifications:
- Experience with global data governance or enterprise data management initiatives
- Knowledge of GxP, FDA/EMA guidelines, and industry-specific data compliance
- Proficiency in SQL and at least one programming language (e.g., Python, R)
- Strong skills in data visualization tools (Power BI, Tableau, Qlik, etc.)
- Excellent communication skills, with the ability to present complex data in a simple, understandable way

Good-to-Have Skills:
- PostgreSQL/MongoDB, SQL databases, vector databases for large language models, Databricks or RDS, S3 buckets
- Design patterns, data structures, data modelling, data algorithms

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, remote teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Posted 5 days ago

Apply

4.0 - 9.0 years

6 - 12 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT THE ROLE
Role Description:
We are seeking an experienced MDM Senior Data Engineer with 6-9 years of experience and expertise in backend engineering to work closely with the business on development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio plus data engineering experience. This role will also involve guiding junior data engineers/analysts and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, AWS, and API integrations, along with knowledge of Master Data Management (MDM).

Roles & Responsibilities:
- Develop the MDM backend solutions and implement ETL and data engineering pipelines using Databricks, AWS, Python/PySpark, SQL, etc.
- Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms
- Perform data profiling and identify the data quality (DQ) rules needed
- Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows
- Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance
- Ensure data integrity, lineage, and traceability across MDM pipelines and solutions
- Provide mentorship and technical leadership to junior team members and ensure project delivery timelines
- Help the custom UI team integrate with backend data using APIs or other integration methods for a better user experience in data stewardship

Basic Qualifications and Experience:
- Master's degree with 4-6 years of experience in Business, Engineering, IT or related field, OR
- Bachelor's degree with 6-9 years of experience in Business, Engineering, IT or related field, OR
- Diploma with 10-12 years of experience in Business, Engineering, IT or related field

Functional Skills:
Must-Have Skills:
- Strong understanding and hands-on experience of Databricks and AWS cloud services
- Proficiency in Python, PySpark, SQL, and Unix for data processing and orchestration
- Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ)
- Knowledge of customer master data (HCP, HCO, etc.)
- Experience with data modeling, governance, and DCR lifecycle management
- Ability to implement end-to-end integrations, including API-based, batch, and flat-file-based integrations
- Strong experience with external data enrichment such as D&B
- Strong experience implementing match/merge and survivorship rules
- Very good understanding of reference data and its integration with MDM
- Hands-on experience with custom workflows or building data pipelines/orchestrations

Good-to-Have Skills:
- Experience with Tableau or Power BI for reporting MDM insights
- Exposure to or knowledge of data science and GenAI capabilities
- Exposure to Agile practices and tools (JIRA, Confluence)
- Prior experience in Pharma/Life Sciences
- Understanding of compliance and regulatory considerations in master data

Professional Certifications:
- Any MDM certification (e.g., Informatica, Reltio)
- Databricks certification (Data Engineer or Architect)
- Any cloud certification (AWS or Azure)

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams

Posted 5 days ago

Apply

8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Data and Analytics - Manager - Guidewire Data Manager

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers, asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. In this way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity
We are seeking an experienced Guidewire Manager with a strong background in the insurance domain and extensive knowledge of traditional ETL tools. The ideal candidate will have a robust understanding of data warehousing architecture and hands-on experience with various ETL tools, including Informatica PowerCenter, SSIS, SAP BODS, and Talend.

Your Key Responsibilities
- Lead and manage Guidewire implementation projects, ensuring alignment with business objectives and technical requirements
- Oversee the design, development, and maintenance of data warehousing solutions
- Collaborate with cross-functional teams to gather and analyze business requirements
- Develop and implement ETL processes using tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend
- Ensure data quality, integrity, and security across all data warehousing and ETL processes
- Provide technical guidance and mentorship to team members
- Stay updated with industry trends and best practices in data warehousing and ETL

Skills And Attributes For Success
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 8-11 years of experience in data warehousing and ETL processes
- Strong background in the insurance domain
- Hands-on experience with ETL tools such as Informatica PowerCenter, SSIS, SAP BODS, and Talend
- Excellent understanding of data warehousing architecture and best practices
- Proven leadership and project management skills
- Strong analytical and problem-solving abilities
- Excellent communication and interpersonal skills

To qualify for the role, you must have
- Experience with Guidewire implementation projects
- Knowledge of additional ETL tools and technologies
- Certification in relevant ETL tools or data warehousing technologies

Why EY?
At EY, we offer a dynamic and inclusive work environment where you can grow your career and make a meaningful impact. Join us to work on challenging projects, collaborate with talented professionals, and contribute to innovative solutions in the insurance domain.

Ideally, you'll also have
- Good exposure to any ETL tools
- Knowledge of life insurance
- Understanding of Business Intelligence, Data Warehousing and Data Modelling
- Experience leading a team of at least 4 members
- Experience in the insurance domain
- Prior client-facing skills; self-motivated and collaborative

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

1.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Source: Naukri

Career Category: Information Systems

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description:
We are seeking an MDM Admin/Infrastructure Resource with 2-5 years of experience to support and maintain our enterprise MDM (Master Data Management) platforms using Informatica MDM and IDQ. This role is critical in ensuring the reliability, availability, and performance of master data solutions across the organization, utilizing modern tools like Databricks and AWS for automation, backup, recovery, and preventive maintenance. The ideal candidate will have strong experience in server maintenance, data recovery, data backup, and MDM software support.

Roles & Responsibilities:
- Administer and maintain customer, product, and study master data using Informatica MDM and IDQ solutions
- Perform data recovery and data backup processes to ensure master data integrity
- Conduct server maintenance and preventive maintenance activities to ensure system reliability
- Leverage Unix/Linux, Python, and Databricks for scalable data processing and automation
- Collaborate with business and data engineering teams for continuous improvement in MDM solutions
- Implement automation processes for data backup, recovery, and preventive maintenance
- Utilize AWS cloud services for data storage and compute processes related to MDM
- Support MDM software maintenance and upgrades
- Track and manage data issues using tools such as JIRA, and document processes in Confluence
- Apply Life Sciences/Pharma industry context to ensure data standards and compliance

Functional Skills:
Must-Have Skills:
- Strong experience administering and maintaining Informatica MDM and IDQ platform configurations
- Strong experience in data recovery and data backup processes
- Strong experience in server maintenance and preventive maintenance activities (strong hands-on Linux/Unix and server upgrade experience)
- Expertise in handling data backups, server backups, MDM product upgrades, and server upgrades
- Good understanding of, and hands-on experience with, access control
- Experience with IDQ, data modeling, and approval workflow/DCR
- Advanced SQL expertise and data wrangling
- Knowledge of MDM, data governance, stewardship, and profiling practices

Good-to-Have Skills:
- Familiarity with Databricks and AWS architecture
- Background in Life Sciences/Pharma industries
- Familiarity with project tools like JIRA and Confluence
- Basics of data engineering concepts

Basic Qualifications and Experience:
- Master's degree with 1-3 years of experience in Business, Engineering, IT or related field, OR
- Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or related field, OR
- Diploma with 6-8 years of experience in Business, Engineering, IT or related field

Professional Certifications (preferred):
- Any ETL certification (e.g., Informatica)
- Any data analysis certification (SQL)
- Any cloud certification (AWS or Azure)

Soft Skills:
- Excellent written and verbal communication skills (English), translating technology content into business language at various levels
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills
- Strong time and task management skills to estimate and successfully meet project timelines, bringing consistency and quality assurance across projects

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 5 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Purpose:
The primary purpose of this role is to work on QA and development for the Product Processor Technology area. This involves working closely with the business and SMEs to prioritize business requests, manage the ETL development workslate and QA automation efforts, provide effort estimates, ensure timely delivery on committed items, and project manage all aspects of software development according to the Software Development Lifecycle (SDLC).

Job Background/Context:
The role forms part of the Product Processor Development Team in Pune and supports the GTPL application, which serves the Product Control & Finance Department. GTPL is the global finance product control's strategic product processor for all cash products and internally traded futures. GTPL will be the one-stop shop enabling consistent and granular accounting globally, accepting the latest global reference and market data to reduce manual adjustments and produce cleaner reconciliations. GTPL will continue enabling several global functions, such as Compliance, Risk (including BASEL), Tax and Regulatory Reporting, and firm-wide strategic initiatives, by being the gateway to 100+ systems.

Key Responsibilities:
- Understand business and functional requirements provided by Business Analysts, convert them into Technical Design Documents, and lead the development team to deliver on those requirements
- Lead a technical team in Pune supporting GTPL in Product Processor Departments
- Ensure project plans are created and PTS documentation is up to date
- Work closely with cross-functional teams, e.g. Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Controls and Production Support
- Prepare handover documents; manage SIT, UAT, and automation of unit testing
- Identify and proactively resolve issues that could impact system performance, reliability, and usability
- Demonstrate an in-depth understanding of how the development function integrates within the overall business/technology to achieve objectives; requires a good understanding of the industry
- Work proactively and independently to address testing requirements and articulate issues/challenges with enough lead time to address risks
- Understand complex data problems, analyze them, and provide generic solutions compatible with existing infrastructure
- Design, implement, integrate and test new features
- Own success: take responsibility for successful delivery of the solutions
- Mentor other developers on their implementations as needed, and organize review activities such as design review, code review and technical document review to ensure successful delivery
- Explore existing application systems; determine areas of complexity and potential risks to successful implementation
- Contribute to continual improvement by suggesting improvements to software architecture, the software development process and new technologies
- Build relationships with business and technology stakeholders

Knowledge/Experience:
- 10+ years of software development and QA experience
- 6+ years of Oracle PL/SQL experience
- 6+ years of ETL QA experience (Ab Initio or Informatica)
- Hands-on experience in testing complex ETL applications
- Development experience in a fast-paced, time-to-market-driven environment
- Experience with test automation, and with test scenario and test script creation and modification
- Comfortable writing complex queries
- Experience with reporting tools
- Hands-on experience with test automation tools
- Proficiency with Oracle PL/SQL, SQL tuning, and writing packages, triggers, functions and procedures
- Experience with data conversion/migration
- Excellent troubleshooting and debugging skills
- Experience working in an onsite-offshore model

Skills:
- Strong analytic skills
- Excellent communication and internal customer management skills
- Excellent written and verbal communication skills
- Excellent facilitation skills
- Ability to build relationships at all levels

Qualifications:
- B.E/B.Tech or Master's degree in Computer Science, Engineering or a related discipline

Competencies:
- Strong work organization and prioritization capabilities
- Takes ownership and accountability for assigned work
- Ability to manage multiple activities
- Focused and determined in getting the job done right
- Ability to identify and manage key risks and issues
- Personal maturity and sense of responsibility
- Shows drive, integrity, sound judgment, adaptability, creativity, self-awareness and an ability to multitask and prioritize
- Sensitive to cultural and background differences and environments
- Confident and assertive
- Values diversity: demonstrates an appreciation of a diverse workforce; appreciates differences in style or perspective

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Technology Quality
------------------------------------------------------
Time Type: Full time
------------------------------------------------------

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

Source: LinkedIn

About This Role
Aladdin Data is at the heart of Aladdin, and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin.

Data Pipeline Engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation and distribution systems. They partner with data and analytics experts to deliver high-quality analytical and derived data to our consumers. We are looking for data engineers who like to innovate and seek out complex problems. We recognize that strength comes from diversity and will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual. We are committed to open source, and we regularly give our work back to the community. Engineers looking to work in the areas of orchestration, data modeling, data pipelines, APIs, storage, distribution, distributed computation, consumption and infrastructure are ideal candidates.

Responsibilities
Data Pipeline Engineers are expected to be involved from the inception of projects: understand requirements, then architect, develop, deploy, and maintain data pipelines (ETL/ELT). Typically, they work in a multi-disciplinary squad (we follow Agile!), which involves partnering with program and product managers to expand the product offering based on business demands. Design is an iterative process, whether for UX, services or infrastructure. Our goal is to drive up user engagement and adoption of the platform while constantly working towards modernizing and improving platform performance and scalability. Deployment and maintenance require close interaction with various teams. This requires maintaining a positive and collaborative working relationship with teams within DOE as well as with the wider Aladdin developer community. Production support for applications is usually required for issues that cannot be resolved by the operations team. Creative and inventive problem-solving skills for reduced turnaround times are highly valued. Preparing user documentation to maintain both development and operations continuity is integral to the role.

An ideal candidate would have
- At least 4 years' experience as a data engineer
- Experience in SQL, Sybase, and Linux (a must)
- Experience coding in at least two of these languages for server-side/data processing: Java, Python, C++
- 2+ years' experience using a modern data stack (Spark, Snowflake, BigQuery, etc.) on cloud platforms (Azure, GCP, AWS)
- Experience building ETL/ELT pipelines for complex data engineering projects (Airflow, dbt, Great Expectations would be a plus)
- Experience with database modeling and normalization techniques
- Experience with object-oriented design patterns
- Experience with DevOps tools like Git, Maven, Jenkins, GitLab CI, Azure DevOps
- Experience with Agile development concepts and related tools
- Ability to troubleshoot and fix performance issues across the codebase and database queries
- Excellent written and verbal communication skills
- Ability to operate in a fast-paced environment
- Strong interpersonal skills with a can-do attitude under challenging circumstances
- BA/BS or equivalent practical experience

Skills that would be a plus
- Perl, ETL tools (Informatica, Talend, dbt, etc.)
- Experience with Snowflake or other cloud data warehousing products
- Exposure to workflow management tools such as Airflow
- Exposure to messaging platforms such as Kafka
- Exposure to NoSQL platforms such as Cassandra, MongoDB
- Building and delivering REST APIs

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment: the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 5 days ago

Apply

10.0 years

0 Lacs

West Bengal, India

On-site

Source: LinkedIn

Requirements
- 10+ years of strong experience with data transformation & ETL on large data sets
- Experience designing customer-centric datasets (e.g., CRM, call center, marketing, offline, point of sale)
- 5+ years of data modeling experience (e.g., relational, dimensional, columnar, big data)
- 5+ years of complex SQL or NoSQL experience
- Experience with advanced data warehouse concepts
- Experience with industry ETL tools (e.g., Informatica, Unifi)
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation
- Experience with reporting technologies (e.g., Tableau, Power BI)
- Experience in professional software development
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects
- Strong verbal & written communication skills to interface with the sales team and lead customers to a successful outcome
- Must be self-managed, proactive and customer-focused
- Degree in Computer Science, Information Systems, Data Science, or a related field

Posted 6 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Entity: Technology
Job Family Group: IT&S Group

Job Description:

Work location: Pune

You will work with
As part of a digital delivery data group supporting real-time data for our Solutions group, you will apply your domain knowledge and familiarity with domain data processes to support the organisation. The data team provides daily operational data management, data engineering and analytics support to this organisation across a broad range of activity.

About The Role
The Data Steward applies practitioner-level knowledge of a business domain to curate and validate the accuracy, security and referential integrity of data required to drive compliance, safety and business-critical decision making. They are responsible for implementing technical changes and controls across systems of record and communicating planned changes to the data owners. They are responsible for implementing the data requirements to populate systems of record and transpose data between packaged software, including input quality checks on received data, and for providing technical insights into creating, remediating and maintaining data definitions.

What you will deliver
- Act as a custodian of real-time engineering, reliability, maintenance and facilities data, ensuring data integrity, consistency, and compliance across the organization, prioritising safety and operational efficiency for the business. Your focus areas will include production data sets, production accounting, forecasting and production optimisation.
- Enforce data governance policies, standards, and regulations; participate in improvement of these based on business need.
- Assess, report on and resolve data quality issues through root cause analysis and remediation planning.
- Ensure that data, documents and models represent the physical reality of our assets.
- Implement the data requirements to populate systems of record and transpose data between packaged software, including input quality checks on received data.
- Work closely with data engineers and business analysts to ensure high-quality, standardized data.
- Support business users by providing guidance on data usage, access, and policies.
- Implement technical changes and controls across systems of record and communicate planned changes to the data owners.
- Assist in metadata management, ensuring all critical datasets are properly captured.
- Facilitate collaboration between business and technology teams to improve data literacy and governance.
- Support regulatory and compliance efforts related to data privacy, security, and access control.

What you will need to be successful (experience and qualifications)

Essential:
- Bachelor's degree in a STEM area or equivalent experience
- Experience with real-time facility telemetry and industrial data platforms
- Hands-on experience with the AVEVA PI portfolio, including AVEVA PI Vision
- Experience with SCADA systems
- Knowledge and understanding of managing piping and instrumentation diagrams
- Working knowledge of oil and gas process equipment (separators, valves, heat exchangers, pumps, produced water systems, glycol systems, distillation)
- 2+ years of experience in stewardship of datasets within an operating oil and gas organisation or other asset-intensive industry
- Strong understanding of data governance frameworks, master data management principles, policies, and compliance within the wells and subsurface data domain
- Ability to work with business and technical teams to resolve data quality issues
- Excellent communication and documentation skills
- Analytical mindset with a strong focus on data accuracy and process improvement

Desired:
- Proficiency in SQL and ability to work with large datasets
- Experience with Palantir Foundry
- Familiarity with cloud data platforms (AWS, Azure, or GCP)
- Experience with data governance tools (e.g., Collibra, Alation, Informatica)

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
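The data-quality stewardship duties in the listing above (completeness checks, duplicate detection, remediation planning) can be sketched in a few lines of Python. The records and field names below are invented for illustration and are not bp's actual tooling; a real check would read from the system of record rather than an in-memory list.

```python
# A minimal sketch of an automated data-quality check a data steward
# might run: measure completeness and flag duplicate keys.
from collections import Counter

# Hypothetical sample records; field names are invented for the example.
records = [
    {"tag": "WELL-001", "rate_bpd": 1200.0},
    {"tag": "WELL-002", "rate_bpd": None},    # missing value
    {"tag": "WELL-003", "rate_bpd": 890.5},
    {"tag": "WELL-001", "rate_bpd": 1200.0},  # duplicate key
]

def quality_report(rows, key, value_field):
    """Return completeness and uniqueness metrics for a dataset."""
    total = len(rows)
    missing = sum(1 for r in rows if r[value_field] is None)
    dupes = [k for k, n in Counter(r[key] for r in rows).items() if n > 1]
    return {
        "completeness": 1 - missing / total,  # share of rows with a value
        "duplicate_keys": dupes,              # keys needing remediation
    }

report = quality_report(records, "tag", "rate_bpd")
print(report)  # {'completeness': 0.75, 'duplicate_keys': ['WELL-001']}
```

In practice the steward would route the flagged keys into a remediation workflow and trend the completeness metric over time rather than inspecting a single snapshot.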

Posted 6 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Entity: Technology
Job Family Group: IT&S Group

Job Description:

Enterprise Technology Engineers in bp
bp is reinventing itself, and digital capability is at the core of this vision. As a Senior Enterprise Technology Engineer you are a digital expert bringing deep specialist expertise to bp. Enterprise Technology Engineers work on the strategic technology platforms we exploit from the market, or come with deep skills in the implementation and integration of market solutions into our overall technology landscape. You will bring a broad base of digital technical knowledge and a strong understanding of software delivery principles. You will be familiar with lifecycle methods, with Agile delivery and the DevOps approach at the core. You will be skilled in the application of approaches such as Site Reliability Engineering in the delivery and operation of the technologies you deliver, working as part of multi-disciplinary squads.

You thrive in a culture of continuous improvement within teams, encouraging and empowering innovation and the delivery of changes that optimise operational efficiency and user experience. You are curious and improve your skills through continuous learning of new technologies, trends and methods, applying knowledge gained to improve bp standards and the capabilities of the engineering community. You coach others in the field to drive improved performance across our business. You embrace a culture of change and agility, evolving continuously and adapting to our changing world. You are an effective teammate, looking beyond your own area and organizational boundaries to consider the bigger picture and the perspective of others, while understanding cultural differences. You continually enhance your self-awareness and seek guidance from others on your impact and effectiveness. Well organized, you balance proactive and reactive approaches and multiple priorities to complete tasks on time. You apply judgment and common sense, using insight and good judgment to inform actions and respond to situations as they arise.

Key Accountabilities
- Act as technical lead for the invoice processing application eBilling.
- Manage reliability of service and deliver to the agreed SLA.
- Collaborate with platform and security teams on patching and vulnerability management.
- The safety of our people and our customers is our highest priority; the role will advocate and lead in this and promote security and safety in everything that we do.
- Work as part of evolving multi-disciplinary teams, which may include Software Engineers, Enterprise Technology Engineers, Designers, SecOps, and Product Owners, to deliver value through the application of specialist skills.
- Work with vendors and partners providing market solutions to optimize the usage and value that can be delivered from the appropriate technology platform.
- Ensure operational integrity of what you build, assuring operational compliance with architectural and security standards, as well as compliance and policy controls defined by Strategy.
- Mentor others and become a conduit to connect the broader organization.
- Define and document standard runbooks and operating procedures; create and maintain system information and architecture diagrams.

Education
A first degree from a recognized institute of higher learning, ideally computer science or engineering based.

Essential Experience And Job Requirements
- 8+ years of total experience, with good knowledge of the Order to Cash process (preferably in the aviation domain)
- Informatica ETL
- MS SQL
- Data integration patterns (preferably with XML invoice processing)
- Experience leading teams
- Demonstrable knowledge of modern service delivery methods, from Site Reliability Engineering to traditional ITIL, and an understanding of product-based delivery
- Strong communication skills and a high EQ, with the ability to operate across complex business environments and collaborators up to senior executive level

Desirable Criteria
- Project management experience delivering IT-led projects
- Broad experience contributing and collaborating to design, plan, implement, maintain, and document services and solutions
- Development experience in one or more object-oriented or applicable programming languages (e.g. Python, Go, Java, C/C++)

Skills That Set You Apart
- Passion for mentoring and coaching engineers in both technical and soft skills
- A focus on delighting customers with outstanding user experiences and customer service
- Comfort operating in an environment that is loosely coupled but tightly aligned toward a shared vision

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms, Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 6 days ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Bangalore Rural, Bengaluru

Hybrid


Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED.

Contract to Hire (C2H) Role
Location: Bangalore
Payroll: BCforward
Work Mode: Hybrid

JD: Agile Champion with Informatica, Oracle, shell scripting, and Rally.

Please share your updated resume, PAN card soft copy, passport-size photo and UAN history. Interested applicants can share an updated resume to g.sreekanth@bcforward.com

Note: Looking for immediate to 15-day joiners at most. All the best!

Posted 6 days ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
Join a dynamic team transforming BCG into a data-driven organization! As part of the Data Team, you'll help build essential data platforms, products, and capabilities that empower our clients and colleagues with high-quality, actionable insights. Our Data Portfolio Team focuses on creating scalable, governed data solutions to drive informed decision-making across the company.

Your work will enable:
- Better decision-making through improved data integrity and governance
- New insights by strengthening product data analytics and reporting
- Operational excellence by optimizing data workflows and governance practices
- Risk minimization through strong compliance and governance frameworks

The Product Owner for Master Data Management (MDM) will lead an Agile development squad to deliver trusted, uniform, and secure enterprise master data using the Informatica MDM SaaS platform. This role is pivotal in ensuring harmonized master data to power key analytical and operational use cases across the organization, in alignment with BCG's broader data strategy.

As the Product Owner for Master Data Management, you will:
- Define and drive the strategic vision for MDM, aligning it with BCG's enterprise data roadmap.
- Manage the end-to-end lifecycle of MDM solutions, from strategy to execution and adoption.
- Collaborate with cross-functional teams (Finance, HR, Procurement, IT) to prioritize, refine, and implement MDM enhancements.
- Ensure seamless integration of master data across enterprise systems, analytics platforms, and digital products.
- Own the MDM backlog, defining and prioritizing features based on customer impact, business value, and delivery feasibility.
- Enable scalable, high-quality master data through data governance, quality controls, and data lineage tracking.
- Drive user adoption by partnering with stakeholders, data stewards, and technical teams to embed MDM into business processes.
- Monitor and optimize MDM performance, addressing issues and identifying opportunities for continuous improvement.

Your key responsibilities include:

Define and Manage the MDM Vision and Roadmap
- Develop and own the strategic roadmap for Master Data Management, aligning with BCG's broader data strategy.
- Collaborate with Data Leadership, Data Product Owners, Data Governance teams, and global stakeholders (Finance, HR, Procurement, Legal, etc.) to prioritize MDM initiatives based on business value.
- Engage with users, business leaders, and technical teams to define MDM enhancements and ensure alignment with key enterprise-wide initiatives (e.g., ERP transformation, Source-to-Pay implementation).
- Take an MVP-centric approach to delivering value while keeping a long-term vision in focus.
- Collaborate with and influence software and services vendors to shape their roadmaps and service offerings to meet BCG's changing MDM solution needs.

Ensure Sustainable MDM Delivery and Adoption
- Translate the MDM roadmap into actionable epics, features, and user stories.
- Own and manage the product backlog, defining priorities based on customer impact, business value, cost, and speed to delivery.
- Oversee the release planning and validation process for MDM enhancements and new data entities.
- Ensure that MDM components are built cohesively with scalable and modular architectures.
- Track product performance and usability to refine future enhancements.
- Collaborate with technical teams to monitor the Informatica MDM platform, identify capability gaps, and drive improvements.
- Establish KPIs and measurable outcomes to assess the effectiveness of MDM adoption.

Support and Enable the MDM Squad
- Work closely with Data Owners, Sponsors, Portfolio Management, and Data Stewards to align priorities and ensure seamless collaboration.
- Drive cross-functional engagement, ensuring the squad understands and aligns with business needs (e.g., Finance, Procurement, BCG X teams).
- Work with Scrum Leads, the MDM Technical Lead, and Chapters to monitor squad progress, offer constructive feedback, and clarify requirements as needed.
- Proactively identify and escalate risks, along with mitigation plans, to key stakeholders and the PMO.

What You'll Bring
- 10+ years of experience in Master Data Management / Reference Data Management, with Product Owner or project leadership experience
- Hands-on experience with MDM and DQ tools; Informatica MDM SaaS and IDMC preferred
- Strong knowledge of enterprise data architecture, data quality, and metadata management
- Experience collaborating with Finance, IT, HR, Procurement, and Legal functions on data and digital transformation programs
- Exceptional communication and stakeholder management skills, bridging business and technical discussions
- Proven ability to drive data adoption and business process transformation
- Agile certification preferred (e.g., Professional Scrum Product Owner, Certified Scrum Product Owner)
- Consulting experience is a plus

Who You'll Work With
- Data Products Portfolio Leadership, Data Product Owners, and Chapter Leads
- Global Data Owners and Data Stewards for enterprise-wide domains (e.g., Supplier, Worker, Client)
- Stakeholders across BCG functions, including Finance, BI&A, HR, and IT leadership
- Technical teams across BCG (Enterprise Architecture, APIs, Security, Data Engineering) to select the right MDM model and manage cross-tribe dependencies
- Software and services vendors to expand MDM's offerings
- Members of the MDM Squad and Operations Support Team

Additional Info

You're good at:
- Customer-centric thinking: understanding and anticipating stakeholder needs to shape impactful MDM solutions.
- Data strategy and governance: deep knowledge of MDM, data quality, and governance best practices.
- Cross-functional collaboration: engaging with business, technical, and leadership teams to align goals.
- Product ownership and Agile leadership: driving execution in an Agile environment, with expertise in Scrum methodologies.
- Stakeholder engagement and communication: the ability to influence senior leadership and translate complex data topics into business language.
- Operational excellence and innovation: continuously identifying opportunities to optimize, automate, and scale MDM processes.
- Data-driven decision making: using data analytics, KPIs, and reporting to prioritize and measure MDM success.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
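To make the MDM concepts in the listing above concrete, here is a hedged sketch of match-and-merge "survivorship": duplicate source records describing one real-world entity are consolidated into a single golden record. The matching is assumed to have already happened, the survivorship rule (latest non-null value wins) and all field names are invented for illustration, and none of this represents BCG's or Informatica's actual implementation.

```python
# Hypothetical supplier records from two source systems describing the
# same real-world entity, already matched as duplicates.
from datetime import date

sources = [
    {"name": "Acme Corp", "country": "US", "email": None,
     "updated": date(2023, 1, 10)},
    {"name": "ACME Corporation", "country": "US", "email": "ap@acme.example",
     "updated": date(2024, 6, 2)},
]

def golden_record(records, fields):
    """Merge matched records: the latest non-null value survives per field."""
    merged = {}
    for field in fields:
        candidates = [r for r in records if r[field] is not None]
        merged[field] = max(candidates, key=lambda r: r["updated"])[field]
    return merged

record = golden_record(sources, ["name", "country", "email"])
print(record)
# {'name': 'ACME Corporation', 'country': 'US', 'email': 'ap@acme.example'}
```

Real MDM platforms let stewards configure survivorship per field (most recent, most trusted source, longest value, and so on); the point of the sketch is only that a golden record is derived data governed by explicit, auditable rules.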

Posted 6 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Persistent
We are an AI-led, platform-driven Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative global companies, 60% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our disruptor's mindset, commitment to client success, and agility to thrive in the dynamic environment have enabled us to sustain our growth momentum by reporting $1,409.1M revenue in FY25, delivering 18.8% Y-o-Y growth. Our 23,900+ global team members, located in 19 countries, have been instrumental in helping market leaders transform their industries. We are also pleased to share that Persistent won in four categories at the prestigious 2024 ISG Star of Excellence™ Awards, including the Overall Award based on the voice of the customer. We were included in the Dow Jones Sustainability World Index, setting high standards in sustainability and corporate responsibility. We were awarded for our state-of-the-art learning and development initiatives at the 16th TISS LeapVault CLO Awards. In addition, we were cited as the fastest-growing IT services brand in the 2024 Brand Finance India 100 Report. Throughout our market-leading growth, we've maintained a strong employee satisfaction score of 8.2/10. At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential. For more details, please visit www.persistent.com.

About The Position
We are looking for a Big Data Developer to carry out coding or programming of Hadoop applications and develop software using Hadoop technologies such as Spark, Scala, Python, HBase, Hive, and Cloudera. In this role, you will need to concentrate on creating, testing, implementing, and monitoring applications designed to meet the organization's strategic goals.

What You'll Do
- Develop (code) for Hadoop, Spark, Java and AngularJS
- Collaborate with like-minded team members to establish best practices and identify optimal technical solutions (20%)
- Review code and provide feedback relative to best practices; improve performance
- Design, develop and test a large-scale, custom distributed software system using the latest Java, Scala and Big Data technologies
- Adhere to appropriate SDLC and Agile practices
- Contribute actively to the definition of the technological strategy (design, architecture and interfaces) in order to respond effectively to our client's business needs
- Participate in technology watch and the definition of standards to ensure that our systems and data warehouses are efficient, resilient and durable
- Provide guidance and coaching to associate software developers
- Use Informatica or similar products, with an understanding of heterogeneous data replication techniques
- Conduct performance tuning, improvement, balancing, usability and automation

Expertise You'll Bring
- Experience developing code on distributed databases using Spark, HDFS, Hive
- 3+ years of experience as an Application Developer / Data Architect, or in an equivalent role
- Strong knowledge of data and data models
- Good understanding of data consumption patterns by business users
- Solid understanding of business processes and structures
- Basic knowledge of the securities trading business and risk

Benefits
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with a disability and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent: persistent.com/careers

Posted 6 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


This Bengaluru listing repeats the Persistent Big Data Developer posting above (Pune) verbatim; see that posting for the full description.

Posted 6 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About Persistent We are an AI-led, platform-driven Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative global companies, 60% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our disruptor’s mindset, commitment to client success, and agility to thrive in the dynamic environment have enabled us to sustain our growth momentum by reporting $1,409.1M revenue in FY25, delivering 18.8% Y-o-Y growth. Our 23,900+ global team members, located in 19 countries, have been instrumental in helping the market leaders transform their industries. We are also pleased to share that Persistent won in four categories at the prestigious 2024 ISG Star of Excellence™ Awards, including the Overall Award based on the voice of the customer. We were included in the Dow Jones Sustainability World Index, setting high standards in sustainability and corporate responsibility. We were awarded for our state-of-the-art learning and development initiatives at the 16th TISS LeapVault CLO Awards. In addition, we were cited as the fastest-growing IT services brand in the 2024 Brand Finance India 100 Report. Throughout our market-leading growth, we’ve maintained a strong employee satisfaction score of 8.2/10. At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential.

Posted 6 days ago

Apply

10.0 - 14.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Naukri logo

ABOUT THE ROLE Role Description: We are seeking an experienced MDM Manager with 10–14 years of experience to lead the strategic development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio. This role will involve managing a team of data engineers, architects, and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy. To succeed in this role, the candidate must have strong MDM experience along with Data Governance, Data Quality (DQ), and Data Cataloging implementation knowledge; candidates must therefore have a minimum of 6–8 years of core MDM technical experience (within a total experience range of 10–14 years). Roles & Responsibilities: Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms. Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows. Define match/merge and survivorship strategy and lead its implementation. Design and deliver MDM processes and data integrations using Unix, Python, and SQL. Collaborate with the backend data engineering team on robust integrations and with the frontend custom UI team on a seamless, enhanced user experience. Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance. Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals. Establish data quality metrics and monitor compliance using automated profiling and validation tools. Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs). Ensure data integrity, lineage, and traceability across MDM pipelines and solutions. Provide mentorship and technical leadership to junior team members and ensure project delivery timelines.
Lead custom UI design for a better data stewardship user experience Basic Qualifications and Experience: Master's degree with 8–10 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 10–14 years of experience in Business, Engineering, IT or related field OR Diploma with 14–16 years of experience in Business, Engineering, IT or related field Functional Skills: Must-Have Skills: Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines and integrations for data mastering and orchestration of ETL pipelines Very good understanding of reference data and hierarchies and their integration with MDM Hands-on experience with custom workflows (e.g., AVOS, Eclipse) Strong experience with external data enrichment services such as D&B and AddressDoctor Strong experience with match/merge and survivorship rule strategy and implementation Strong experience with group fields, cross-reference data and UUIDs Strong understanding of AWS cloud services and Databricks architecture. Proficiency in Python, SQL, and Unix for data processing and orchestration. Experience with data modeling, governance, and DCR lifecycle management. Proven leadership and project management in large-scale MDM implementations. Able to implement end-to-end integrations, including API-based, batch, and flat-file-based integrations Must have worked on at least 3 end-to-end MDM implementations Good-to-Have Skills: Experience with Tableau or PowerBI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data. Professional Certifications: Any MDM certification (e.g.
Informatica, Reltio etc) Any Data Analysis certification (SQL) Any cloud certification (AWS or AZURE) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 6 days ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Chennai

Work from Office

Naukri logo

We are looking for a highly motivated Informatica Developer with strong hands-on experience in ETL development and Python scripting. The ideal candidate will be responsible for building and maintaining efficient, scalable data pipelines and integrating them with business logic using Python. This is an urgent requirement, and we're looking for professionals who can join immediately or within 15 days. Role: Informatica Developer Experience Level: 5+ years (relevant experience) Notice Period: Immediate to 1 month ETL Experience: Build and optimize ETL pipelines, integrate data from multiple sources, and handle data transformations (Informatica IDQ/Cloud). Python Expertise: Advanced knowledge of Python, including libraries like pandas and requests. API Development: SOAP/REST APIs Database Management: Proficient in SQL and data warehousing.
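The extract-transform-load loop this role describes can be sketched with the standard library alone; everything below (the source records, table, and field names) is invented for illustration, not taken from the posting, and SQLite stands in for whatever warehouse the project actually targets.

```python
import sqlite3

# Hypothetical source records, standing in for rows staged by an
# Informatica mapping or pulled from a REST API; all names are illustrative.
source_rows = [
    {"id": 1, "name": "  Acme Corp ", "revenue": "1200.50"},
    {"id": 2, "name": "Globex", "revenue": None},
    {"id": 3, "name": "Initech", "revenue": "980.00"},
]

def transform(row):
    """Trim whitespace and cast revenue, defaulting missing values to 0.0."""
    return (row["id"], row["name"].strip(), float(row["revenue"] or 0.0))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, revenue REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [transform(r) for r in source_rows])

total = conn.execute("SELECT SUM(revenue) FROM customers").fetchone()[0]
print(total)  # 2180.5
```

In a real pipeline the transform step would also enforce the validation rules the posting alludes to (type checks, null handling) before the load.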

Posted 6 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description To provide data and reporting solutions with a focus on Treasury, leveraging SAP and non-SAP technologies such as S/4HANA, Informatica MDM and Google BigQuery. To troubleshoot complex production issues, including data integrity and performance issues. To establish technical standards and evaluate custom solutions. To lead the vendor architecture evaluation and conduct hands-on proofs of concept. To deliver data and functional solutions for the greenfield Treasury and Finance & Accounting design. To deliver the technical design for data analytics, integration fabric, security and cyber defense, hosting and client infrastructure. To thrive in a fast-paced working environment and manage multiple deliverables simultaneously. Responsibilities Architecture solutioning experience and familiarity with TOGAF In-depth data reporting and functional knowledge of the S/4HANA ecosystem (CM, TM, MBC, IHC, FX, FI, CO, Group Reporting, SD, MM, CA, etc.) SAP technical experience is a definite plus. Active practitioner of Agile methodology and familiar with the architecture runway. Qualifications Familiar with SAP testing strategy. Strong hands-on PoC skills Negotiation skills and the ability to achieve consensus with a large and diverse set of stakeholders/customers. Strong and open communication skills, both verbal and written. Self-starter, with the ability to work independently with minimum supervision.

Posted 6 days ago

Apply

5.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Job Title: Senior PySpark Data Engineer (share only quality profiles) Location: Pune/Hybrid Experience: 5-8 years Budget: 8–11 LPA Notice Period: Immediate to 15 days Mandatory Skills: · Python · SQL · ETL · Informatica PowerCenter · AWS/Azure Good to Have: · IDMC Tech Stack Table (for each skill, state experience and a rating out of 10): Python, SQL, ETL, Informatica PowerCenter, AWS/Azure Job Summary: We are seeking a Senior PySpark Data Engineer with extensive experience in developing, optimizing, and maintaining data processing jobs using PySpark. The ideal candidate will possess a robust background in SQL and ETL processes, along with proficiency in cloud platforms such as AWS or Azure. This role will require excellent analytical skills and the ability to communicate effectively with both technical and non-technical stakeholders. Key Responsibilities: · Design, develop, and optimize PySpark jobs for enhanced performance and scalability. · Collaborate with data architects and business analysts to understand data requirements and translate them into technical specifications. · Redesign and maintain complex SQL queries and stored procedures to support data extraction and transformation processes. · Utilize ETL tools, specifically Informatica PowerCenter, to build effective data pipelines. · Troubleshoot and resolve data quality issues and performance bottlenecks. · Mentor and provide technical guidance to a team of developers to enhance productivity and code quality. · Stay updated with new technologies and practices to continually improve data processing capabilities. Qualifications: · Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. · Experience: 5-8 years of experience in data engineering, with a strong focus on PySpark and ETL processes. Technical Skills: Must-Have: 1. Extensive experience with PySpark, focusing on job optimization techniques. 2.
Proficiency in SQL, with experience in SQL Server, MySQL, or other relational databases. 3. Strong knowledge of ETL concepts and tools, particularly Informatica PowerCenter and IDMC. 4. Excellent analytical and troubleshooting skills. 5. Strong communication skills for effective collaboration. Good to Have: 1. Basic knowledge of Unix commands and shell scripting. 2. Experience in leading and mentoring development teams. 3. Familiarity with Azure/Fabric. Kindly share profiles only in this tracker format, and attach the tracker to the body of the mail. Without the tracker format and Tech Stack Table, the profile will not be considered. Tracker columns: S.no, Date, Position, Name of the Candidate, Mobile Number, Email ID, Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period / On Paper, Current Organisation, Current Location, Address with Pin Code, Reason for Leaving, DOB, Offer in Hand, Vendor Name. Regards Damodar 91-8976334593 info@d-techworks.com D-TechWorks Pvt Ltd USA | INDIA www.d-techworks.com Information Technology Services Technology | Consulting | Development | Staff Augmentation
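The SQL-tuning bullet above ("troubleshoot and resolve performance bottlenecks") often comes down to checking query plans before and after indexing. A minimal sketch, using SQLite purely as a stand-in for the SQL Server/MySQL databases the role names; the table and index names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

# Without an index, a filter on customer_id scans the whole table.
before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()[0][-1]

# With an index, the same query becomes an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()[0][-1]

print(before)  # e.g. "SCAN orders" (exact wording varies by SQLite version)
print(after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"
```

The same before/after discipline applies to PySpark jobs via `df.explain()`, though the mechanics (partitioning, broadcast joins) differ.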

Posted 6 days ago

Apply

40.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

India - Hyderabad JOB ID: R-213468 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 12, 2025 CATEGORY: Information Systems ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. ABOUT THE ROLE Role Description: We are seeking an MDM Associate Data Engineer with 2–5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are therefore not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks and AWS, along with knowledge of MDM (Master Data Management). Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms.
Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance. Basic Qualifications and Experience: Master’s degree with 1 - 3 years of experience in Business, Engineering, IT or related field OR Bachelor’s degree with 2 - 5 years of experience in Business, Engineering, IT or related field OR Diploma with 6 - 8 years of experience in Business, Engineering, IT or related field Functional Skills: Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Must have knowledge of MDM, data governance, stewardship, and profiling practices. In addition to above, candidates having experience with Informatica or Reltio MDM platforms will be preferred. Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts. Professional Certifications: Any ETL certification (e.g. Informatica) Any Data Analysis certification (SQL, Python, Databricks) Any cloud certification (AWS or AZURE) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. 
Ability to work effectively with global, virtual teams EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
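The match/merge and survivorship work that both MDM postings above describe can be illustrated with a toy "most recent non-null value wins" rule. The sources, dates, and fields below are invented for the example; real MDM platforms such as Informatica or Reltio configure survivorship rules declaratively rather than in code like this.

```python
from datetime import date

# Invented duplicate records for one customer, as matched across systems.
records = [
    {"source": "CRM", "updated": date(2024, 1, 10), "email": "a@example.com", "phone": None},
    {"source": "ERP", "updated": date(2024, 3, 2), "email": None, "phone": "555-0101"},
    {"source": "Web", "updated": date(2023, 11, 5), "email": "old@example.com", "phone": "555-0000"},
]

def survive(records, fields):
    """Build a golden record: per field, the most recent non-null value wins."""
    golden = {}
    # Newest records first, so the first non-null value seen per field wins.
    for rec in sorted(records, key=lambda r: r["updated"], reverse=True):
        for field in fields:
            if golden.get(field) is None and rec[field] is not None:
                golden[field] = rec[field]
    return golden

golden = survive(records, ["email", "phone"])
print(golden["email"])  # a@example.com (CRM's value; ERP's newer record has none)
print(golden["phone"])  # 555-0101 (ERP's value, the newest non-null)
```

Production rule sets layer source-system trust ranking and field-level overrides on top of this recency idea.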

Posted 6 days ago

Apply

3.0 years

6 - 8 Lacs

Hyderābād

On-site

GlassDoor logo

The Data Scientist organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics. About the role: Work with low to minimum supervision to solve business problems using data and analytics. Work in multiple business domain areas including Customer Experience and Service, Operations, Finance, Sales and Marketing. Work with various business stakeholders to understand and document requirements. Design an analytical framework to provide insights into a business problem. Explore and visualize multiple data sets to understand data available for problem solving. Build end-to-end data pipelines to handle and process data at scale. Build machine learning models and/or statistical solutions. Build predictive models. Use Natural Language Processing to extract insight from text. Design database models (if a data mart or operational data store is required to aggregate data for modeling). Design visualizations and build dashboards in Tableau and/or PowerBI. Extract business insights from the data and models. Present results to stakeholders (and tell stories using data) using PowerPoint and/or dashboards. Work collaboratively with other team members. About you: Overall 3+ years' experience in technology roles. Must have a minimum of 1 year of experience working in the data science domain. Has used frameworks/libraries such as Scikit-learn, PyTorch, Keras, NLTK. Highly proficient in Python. Highly proficient in SQL. Experience with Tableau and/or PowerBI. Has worked with Amazon Web Services and SageMaker.
Ability to build data pipelines for data movement using tools such as Alteryx, GLUE, Informatica. Proficient in machine learning, statistical modelling, and data science techniques. Experience with one or more of the following types of business analytics applications: Predictive analytics for customer retention, cross sales and new customer acquisition. Pricing optimization models. Segmentation. Recommendation engines. Experience in one or more of the following business domains Customer Experience and Service. Finance. Operations. Good presentation skills and the ability to tell stories using data and PowerPoint/Dashboard Visualizations. Excellent organizational, analytical and problem-solving skills. Ability to communicate complex results in a simple and concise manner at all levels within the organization. Ability to excel in a fast-paced, startup-like environment. #LI-SS5 What’s in it For You? Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. 
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. 
At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here . Learn more on how to protect yourself from fraudulent job postings here . More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 6 days ago

Apply

3.0 - 5.0 years

2 - 7 Lacs

Hyderābād

On-site

GlassDoor logo

Country/Region: IN Requisition ID: 26436 Work Model: Position Type: Salary Range: Location: INDIA - HYDERABAD - BIRLASOFT OFFICE Title: Technical Lead-Data Engg Description: Area(s) of responsibility We are seeking a skilled Informatica ETL Developer with 3–5 years of experience in ETL and Business Intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in manufacturing, automotive, transportation, and engineering domains. Key Responsibilities: Design, develop, and maintain ETL workflows using Informatica PowerCenter. Troubleshoot and optimize ETL jobs for performance and reliability. Analyze complex data sets and write advanced SQL queries for data validation and transformation. Collaborate with data architects and business analysts to implement data warehousing solutions. Apply SDLC methodologies throughout the ETL development lifecycle. Support production environments by identifying and resolving data and

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description Join the Ford HR Management Security & Controls department! Our team is dedicated to ensuring robust and secure user access management for Human Resource (HR) applications in our global environment. We are responsible for the tools and processes that allow HR staff, Ford employees, and suppliers to request and authorize access efficiently and securely. We also maintain critical interfaces that connect our access management systems to various downstream applications. A key focus area for our team is the configuration and management of security roles within our global HR system, Oracle HCM. Oracle HCM (Human Capital Management) is Ford's comprehensive global HR platform. This includes Core HR processes (like employee data management, promotions, and internal transfers), as well as Compensation, Learning & Development, Talent Management, Recruiting and Payroll. We are looking for a skilled and experienced IT Analyst/Specialist with deep knowledge of Oracle HCM, particularly its security and access management capabilities. This role is critical to ensuring the integrity and security of our HR data and systems. You will also leverage your skills in SQL and Informatica PowerCenter to support data analysis, reporting, and ETL processes vital to our operations. You'll be joining a dynamic, globally distributed IT team with members located in the US, India, and Germany, collaborating across regions to achieve our shared goals. Responsibilities Configure, manage, and maintain security roles, profiles, and permissions within the global Oracle HCM system, ensuring compliance with security policies. Design, develop, and maintain Extract, Transform, Load (ETL) processes using Informatica PowerCenter to move and integrate data from various sources. Utilize SQL for data extraction, analysis and validation. Collaborate closely with HR functional teams and other IT teams to understand security and data requirements. 
Ensure implemented solutions adhere to security best practices and internal controls. Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience. 3+ years' experience with Oracle HCM, with a strong focus on security configuration and user access management. 3+ years' experience with SQL for data querying, analysis, and manipulation. Hands-on experience designing, developing, and maintaining ETL processes (e.g., using Informatica IICS). Understanding of data security principles and best practices, especially in an HR context. Experience troubleshooting complex technical issues related to access, security, or data integration. Strong analytical and problem-solving skills. Excellent communication and collaboration skills, comfortable working with global teams across different time zones. Desired Skills: Experience with other Oracle HCM security modules Experience with other Oracle technologies or modules within HCM (e.g., Oracle BI Publisher). Experience working in a large, global enterprise environment.

Posted 6 days ago

Apply

3.0 years

0 Lacs

Chennai

On-site

GlassDoor logo

Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title Professional, Software Development Engineering What does a successful Professional, Data Conversions do at Fiserv? A Conversion Professional is responsible for the timely and accurate conversion of new and existing Bank/Client data to Fiserv systems, from both internal and external sources. This role is responsible for providing data analysis for client projects and accommodating other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines. What will you do
The person stepping in as the backup would need to review the specification history, then review and understand the code being developed to resolve the issue or change. The same hand-off would occur on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue or change.

What you will need to have:
- Bachelor's degree in programming or a related field
- Minimum 3 years' relevant experience in data processing (ETL) conversions or the financial services industry
- 3–5 years' experience and strong knowledge of MS SQL/PSQL, MS SSIS, and data warehousing concepts
- Strong communication skills and the ability to provide technical information to non-technical colleagues
- Team player with the ability to work independently
- Experience in the full software development life cycle using agile methodologies; good understanding of Agile ceremonies and the ability to run them
- Efficiency in reviewing, coding, testing, and debugging application/bank programs
- Ability to work under pressure while resolving critical issues in the production environment
- Good communication skills and experience working with clients
- Good understanding of the banking domain

What would be great to have:
- Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access, and Microsoft Excel
- Experience with card management systems; debit card processing is a plus
- Ability to manage and prioritize a work queue across multiple workstreams
- Highest attention to detail and accuracy

Thank you for considering employment with Fiserv. Please apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable; both are preferable).
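Conversion work of this kind typically ends with a reconciliation step proving the target matches the source. A minimal, hedged sketch of such a balancing check in Python, using the standard-library sqlite3 as a stand-in for the MS SQL systems mentioned above (the table and column names are hypothetical, not Fiserv's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source (legacy bank system) and target (converted) tables.
cur.execute("CREATE TABLE legacy_accounts (acct TEXT, balance REAL)")
cur.execute("CREATE TABLE converted_accounts (acct TEXT, balance REAL)")
cur.executemany("INSERT INTO legacy_accounts VALUES (?, ?)",
                [("A1", 1000.0), ("A2", 250.5), ("A3", 0.0)])
cur.executemany("INSERT INTO converted_accounts VALUES (?, ?)",
                [("A1", 1000.0), ("A2", 250.5), ("A3", 0.0)])

def totals(table):
    # Row count and balance sum are typical control totals for a conversion.
    return cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM {table}").fetchone()

src, tgt = totals("legacy_accounts"), totals("converted_accounts")
ok = (src == tgt)  # conversion balances when control totals agree
```

In practice a conversion team would compare many more control totals (per product, per branch, per status code), but the count-and-sum pattern above is the core of the check.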
Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 6 days ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Gurugram

Remote


Requirement: Data Architect & Business Intelligence
Experience: 9+ years
Location: Remote

Job Summary: We are looking for a Data Architect & Business Intelligence expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 6 days ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies by experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis
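These skills come together in even the smallest ETL job: extract raw rows, transform them (type casting, cleansing, standardization), and load them into a warehouse table. A minimal sketch in Python, using the standard-library sqlite3 as a stand-in for a real source system and warehouse (all table and column names are hypothetical):

```python
import sqlite3

# In-memory database stands in for both the source system and the warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: hypothetical raw source table with inconsistent data.
cur.execute("CREATE TABLE src_orders (id INTEGER, amount TEXT, region TEXT)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                [(1, "100.50", "north"), (2, None, "south"), (3, "75.25", "NORTH")])

# Transform: cast amounts to numbers, normalize region case,
# and reject rows with null amounts (data cleansing).
rows = cur.execute("SELECT id, amount, region FROM src_orders").fetchall()
clean = [(i, float(a), r.upper()) for i, a, r in rows if a is not None]

# Load: write the cleansed rows to the target (warehouse) table.
cur.execute("CREATE TABLE dw_orders (id INTEGER, amount REAL, region TEXT)")
cur.executemany("INSERT INTO dw_orders VALUES (?, ?, ?)", clean)
conn.commit()

print(cur.execute("SELECT COUNT(*), SUM(amount) FROM dw_orders").fetchone())
# prints (2, 175.75)
```

Tools like Informatica PowerCenter express the same extract-transform-load steps graphically (source qualifier, expression and filter transformations, target), but the underlying data-flow reasoning is identical.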

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
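Several of the questions above contrast transformations. Informatica configures these in the Designer GUI rather than in code, but the semantic difference between a filter (a single condition; non-matching rows are silently dropped) and a router (multiple output groups plus a default group for rows matching no condition) can be sketched in plain Python. The group names and conditions below are hypothetical, chosen only to illustrate the behavior:

```python
rows = [{"id": 1, "amount": 500}, {"id": 2, "amount": 50}, {"id": 3, "amount": -10}]

# Filter transformation: one condition; rows failing it are dropped entirely.
filtered = [r for r in rows if r["amount"] > 100]

# Router transformation: each row is tested against every group condition;
# rows matching no group fall through to the DEFAULT group instead of being lost.
groups = {"HIGH": lambda r: r["amount"] > 100,
          "LOW": lambda r: 0 <= r["amount"] <= 100}
routed = {name: [] for name in groups}
routed["DEFAULT"] = []
for r in rows:
    matched = False
    for name, cond in groups.items():
        if cond(r):
            routed[name].append(r)
            matched = True
    if not matched:
        routed["DEFAULT"].append(r)
```

Here the filter keeps only row 1, while the router preserves all three rows: row 1 in HIGH, row 2 in LOW, and row 3 in DEFAULT. This "one filter drops, one router distributes" distinction is exactly what the interview question is probing.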

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies