
3523 Informatica Jobs - Page 49

JobPe aggregates job listings for easy access, but applications are made directly on the original job portal.

3.0 years

0 Lacs

Andhra Pradesh, India

On-site


Technical Skills

Microsoft Purview Expertise (Required)
- Unified Data Catalog: experience setting up and configuring the catalog; managing collections, classifications, glossary terms, and metadata curation.
- Data Quality (DQ): implementing DQ rules, defining metrics (accuracy, completeness, consistency), and using quality scorecards and reports (see the sketch below).
- Data Map and Scans: ability to configure sources, schedule scans, manage ingestion, and troubleshoot scan issues.
- Data Insights and Lineage: experience visualizing data lineage and interpreting catalog insights.

Azure Platform Knowledge (Desirable)
- Azure Data Factory
- Azure Synapse Analytics
- Microsoft Fabric, including OneLake

Experience
- 3 to 5+ years in data governance or data platform projects, ideally with enterprise clients.
- 2+ years implementing Microsoft Purview or similar tools (Collibra, Informatica, Alation).
- Hands-on experience configuring and implementing Microsoft Purview Unified Catalog and Data Quality.
- Experience onboarding multiple data sources (on-premises and cloud).
- A background in data management, data architecture, or business intelligence is highly beneficial.

Certifications (Desirable)
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: Azure Solutions Architect Expert
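The Data Quality bullet above asks for metrics such as accuracy, completeness, and consistency. Below is a minimal, self-contained sketch of how such column-level scores can be computed, using pandas as an illustration rather than Purview itself; the column names, sample data, and regex are assumptions, not taken from the posting.

```python
# Minimal sketch of column-level data-quality metrics (completeness,
# consistency). Column names, sample data, and the pattern are illustrative.
import pandas as pd

def completeness(series: pd.Series) -> float:
    """Share of non-null values in a column."""
    return 1.0 - series.isna().mean()

def consistency(series: pd.Series, pattern: str) -> float:
    """Share of non-null values matching an expected format."""
    non_null = series.dropna().astype(str)
    if non_null.empty:
        return 0.0
    return non_null.str.fullmatch(pattern).mean()

if __name__ == "__main__":
    df = pd.DataFrame({
        "customer_id": ["C001", "C002", None, "C004"],   # hypothetical data
        "email": ["a@x.com", "bad-email", "c@y.org", None],
    })
    scorecard = {
        "customer_id.completeness": completeness(df["customer_id"]),
        "email.consistency": consistency(df["email"], r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    }
    for rule, score in scorecard.items():
        print(f"{rule}: {score:.2f}")
```

In a Purview deployment these thresholds would live in DQ rules and scorecards rather than ad hoc scripts; the sketch only shows the metric logic.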

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Primary Skill: Informatica Master Data Management
Total Experience: 5-8 years
Notice Period: 1 month
Job Location: PAN India

JD:
• Informatica MDM Lead with 5-8 years of experience in data management, data mastering, data governance, data quality, and data integration activities.
• Experience working with Informatica tools such as MDM and ActiveVOS.
• Extensive experience using the MDM Hub console, PT, Java/J2EE, RDBMS, flat files, XML, SQL, and Unix.
• Expertise in MDM Hub configurations, ActiveVOS workflow implementation, SIF/BES API calls (see the sketch below), User Exit implementation, and PT configurations.
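The posting mentions SIF/BES API calls. As a rough illustration only, the sketch below issues a read against a Business Entity Services (BES) style REST endpoint with `requests`; the host, ORS name, business entity, credentials, and the exact URL pattern are all assumptions to verify against your MDM Hub documentation, not a confirmed API.

```python
# Hedged sketch of a read call against an Informatica MDM BES-style REST
# endpoint. Host, ORS ("localhost-orcl"), entity ("Person"), and credentials
# are hypothetical placeholders; check the URL pattern against your Hub docs.
import requests

MDM_HOST = "https://mdm.example.com:8080"   # hypothetical host
ORS = "localhost-orcl"                      # hypothetical operational reference store
ENTITY = "Person"                           # hypothetical business entity

def read_business_entity(rowid: str) -> dict:
    url = f"{MDM_HOST}/cmx/cs/{ORS}/{ENTITY}/{rowid}"
    resp = requests.get(url, auth=("mdm_user", "mdm_password"), timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    record = read_business_entity("12345")  # hypothetical row id
    print(record)
```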

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Introduction
In this role, you will work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Your primary responsibilities include:
- Developing and maintaining data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools (see the sketch below).
- Liaising with the business team and technical leads to gather requirements, identify data sources and data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT.
- Working with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- Expertise in data warehousing, information management, data integration, and business intelligence using the ETL tool Informatica PowerCenter.
- Knowledge of cloud, Power BI, and data migration on cloud.
- Experience in Unix shell scripting and Python.
- Experience with relational SQL, Big Data, etc.

Preferred Technical and Professional Experience
- Knowledge of MS Azure Cloud.
- Experience in Informatica PowerCenter.
- Experience in Unix shell scripting and Python.
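As a loose illustration of the batch pipeline work described above, here is a minimal extract-transform-load sketch in plain Python, with SQLite standing in for the warehouse; the file name, table, and cleansing rule are assumptions, and a real implementation would use Informatica PowerCenter or a cloud ETL/ELT tool as the posting states.

```python
# Minimal batch ETL sketch: extract from CSV, apply a cleansing rule,
# load into a warehouse table. File, table, and rule are illustrative.
import csv
import sqlite3

def run_batch_load(src_csv: str = "orders.csv", db_path: str = "warehouse.db") -> int:
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    loaded = 0
    with open(src_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            # Simple cleansing: skip rows with a missing or non-numeric amount.
            try:
                amount = float(row["amount"])
            except (KeyError, ValueError):
                continue
            con.execute(
                "INSERT OR REPLACE INTO orders (order_id, amount) VALUES (?, ?)",
                (row["order_id"], amount),
            )
            loaded += 1
    con.commit()
    con.close()
    return loaded

if __name__ == "__main__":
    print(f"rows loaded: {run_batch_load()}")
```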

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site


P1,C3,STS

Responsibilities:
- Design and build data cleansing and imputation; map to a standard data model; transform to satisfy business rules and statistical computations; and validate data content.
- Develop, modify, and maintain Python and Unix scripts and complex SQL.
- Performance-tune existing code to avoid bottlenecks and improve performance.
- Build end-to-end data flows from sources to fully curated and enhanced data sets.
- Develop automated Python jobs for ingesting data from various source systems.
- Provide technical expertise in areas of architecture, design, and implementation.
- Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team.
- Write SQL queries for data validation (see the sketch below).
- Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality.
- Optimize and tune ETL processes for performance and scalability.
- Develop and maintain documentation for ETL processes, data flows, and data mappings.
- Monitor and troubleshoot ETL processes to ensure data accuracy and availability.
- Implement data validation and error-handling mechanisms.
- Work with large data sets and ensure data integrity and consistency.

Skills:
- Python; ETL tools such as Informatica, Talend, SSIS, or similar
- SQL and MySQL; expertise in Oracle, SQL Server, and Teradata
- DevOps, GitLab
- Experience with AWS Glue or Azure Data Factory
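A minimal sketch of the source-versus-target validation referenced in the responsibilities above, using SQLite purely as a stand-in for Oracle, SQL Server, or Teradata; the database paths and table name are assumptions.

```python
# Sketch of SQL-based post-load validation: compare source and target row
# counts. Connection targets and table names are hypothetical.
import sqlite3

def row_count(conn, table: str) -> int:
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return count

def validate_load(src_db: str, tgt_db: str, table: str) -> bool:
    with sqlite3.connect(src_db) as src, sqlite3.connect(tgt_db) as tgt:
        src_n, tgt_n = row_count(src, table), row_count(tgt, table)
        print(f"{table}: source={src_n} target={tgt_n}")
        return src_n == tgt_n

if __name__ == "__main__":
    ok = validate_load("source.db", "warehouse.db", "orders")
    print("PASS" if ok else "FAIL: row counts differ")
```

Real validations usually go beyond counts (checksums, null checks, referential checks), but the pattern is the same.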

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Role Purpose
The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do
- Provide adequate support in architecture planning, migration, and installation for new projects in own tower (platform/database/middleware/backup).
- Lead the structural/architectural design of a platform, middleware, database, backup, etc., according to various system requirements to ensure a highly scalable and extensible solution.
- Conduct technology capacity planning by reviewing current and future requirements.
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable.
- Strategize and implement disaster recovery plans, and create and implement backup and recovery plans.
- Manage the day-to-day operations of the tower: troubleshoot issues, conduct root cause analysis (RCA), and develop fixes to avoid similar issues.
- Plan for and manage upgrades, migration, maintenance, backup, installation, and configuration functions for own tower.
- Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges.
- Develop a shift roster for the team to ensure no disruption in the tower.
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities with respect to progress, updates, status, and next steps.
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness.

Team Management

Resourcing
- Forecast talent requirements as per current and future business needs.
- Hire adequate and right resources for the team.
- Train direct reportees to make the right recruitment and selection decisions.

Talent Management
- Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness.
- Build an internal talent pool of high-potential employees and ensure their career progression within the organization.
- Promote diversity in leadership positions.

Performance Management
- Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports.
- Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities presented by such programs, for themselves and the levels below them.

Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team.
- Track team satisfaction scores and identify initiatives to build engagement within the team.
- Proactively challenge the team with larger and enriching projects/initiatives for the organization or team.
- Exercise employee recognition and appreciation.

Deliver

No. | Performance Parameter | Measure
1 | Operations of the tower | SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2 | New projects | Timely delivery; no unauthorised changes; no formal escalations

Mandatory Skills: Informatica Admin
Experience: 5-8 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Greetings from Infosys BPM Ltd,

We are hiring for UX with JavaScript, ETL Testing + Python Programming, Automation Testing with Java/Selenium/BDD/Cucumber, ETL DB Testing, and ETL Testing Automation skills. Please walk in for an interview on 10th and 11th June 2025 at the Pune location.

Note: Please carry a copy of this email to the venue, and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application. Please mention your Candidate ID on top of the resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215162

Interview details
Interview Date: 10th and 11th June 2025
Interview Time: 10 AM till 1 PM
Interview Venue: Pune: Hinjewadi Phase 1, Infosys BPM Limited, Plot No. 1, Building B1, Ground floor, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi Phase 1, Pune, Maharashtra-411057

Work from Office. A minimum of 2 years of project experience is mandatory.

Job Description: UX with JavaScript
- Design tools (e.g., Photoshop, XD, Figma); strong knowledge of HTML, CSS, and JavaScript; experience with SharePoint customization.
- Experience with wireframes, prototypes, intuitive and responsive design, and documentation; able to lead a team.
- Nice to have: understanding of SharePoint Framework (SPFx) and modern SharePoint development.

Job Description: ETL Testing + Python Programming
- Experience in data migration testing (ETL testing), manual and automation, with Python programming.
- Strong at writing complex SQL for data migration validations.
- Work experience with Agile Scrum methodology.
- Functional testing: UI test automation using Selenium and Java.
- Financial domain experience; AWS knowledge is good to have.

Job Description: Automation Testing with Java, Selenium, BDD, Cucumber
- Hands-on experience in automation; Java, Selenium, BDD, and Cucumber expertise is mandatory.
- Banking or financial domain experience is good to have.
- Automation talent with TOSCA skills and payment domain skills is preferable.

Job Description: ETL DB Testing
- Strong experience in ETL testing, data warehousing, and business intelligence.
- Strong proficiency in SQL; experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions.
- Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
- Solid understanding of data warehousing concepts, database systems, and quality assurance.
- Experience with test planning, test case development, and test execution.
- Familiarity with defect tracking tools (e.g., Jira).
- Experience with cloud platforms like AWS, Azure, or GCP is a plus.
- Experience with Python or other scripting languages for test automation is a plus; experience with data quality tools is a plus.
- Experience in testing large datasets; experience in agile development is a must.
- Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
- Strong experience in ETL testing and automation (see the test sketch after this posting).
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
- Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
- Hands-on experience in developing and maintaining test automation frameworks.
- Proficiency in at least one programming language (e.g., Python, Java).
- Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
- Strong understanding of data warehousing concepts and methodologies.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
- Experience with data quality tools is a plus.

REGISTRATION PROCESS:
The Candidate ID and SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to successfully complete the registration. (Talents without registration and assessment will not be allowed for the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored, and talent is evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots on the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
- Personal details: name, email address, mobile number, PAN number.
- Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
- Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
- Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
- Interview mode: walk-in.

Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will be treated as grounds for rejection. Once you've finished, submit the assessment and make a note of the AMCAT ID (15 digits) used for the assessment.

Documents to carry:
- Please have a note of the Candidate ID and AMCAT ID along with the registered email ID.
- Please carry 2 sets of your updated resume/CV (hard copy).
- Please carry original ID proof for security clearance.
- Please carry individual headphones/Bluetooth for the interview.

Pointers to note:
- Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
- An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.
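For reference, the "ETL Testing Automation" profile above asks for SQL-driven test automation in a framework such as PyTest. A minimal sketch follows; the database paths and the orders table are illustrative assumptions, with SQLite standing in for the warehouses the posting names.

```python
# PyTest sketch of two common ETL checks: row-count parity and no NULL keys
# in the target. Paths and table name are hypothetical.
import sqlite3
import pytest

@pytest.fixture
def connections():
    src = sqlite3.connect("source.db")
    tgt = sqlite3.connect("warehouse.db")
    yield src, tgt
    src.close()
    tgt.close()

def test_row_counts_match(connections):
    src, tgt = connections
    (src_n,) = src.execute("SELECT COUNT(*) FROM orders").fetchone()
    (tgt_n,) = tgt.execute("SELECT COUNT(*) FROM orders").fetchone()
    assert src_n == tgt_n, f"row count mismatch: {src_n} != {tgt_n}"

def test_no_null_keys(connections):
    _, tgt = connections
    (nulls,) = tgt.execute(
        "SELECT COUNT(*) FROM orders WHERE order_id IS NULL"
    ).fetchone()
    assert nulls == 0, f"{nulls} rows loaded with NULL keys"
```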

Posted 2 weeks ago

Apply


10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


As the hands-on Staff Database Engineer for Clearwater Analytics, you will play a crucial role in designing, developing, and maintaining data systems and architectures to collect, store, process, and analyze large volumes of data. You will build data pipelines, optimize data models, and ensure data quality and security. You will collaborate with cross-functional teams to meet business objectives and stay updated with emerging technologies and industry best practices.

Responsibilities and Duties:
- Extensive experience with Snowflake, including proficiency in the SnowSQL CLI, Snowpipe, creating custom functions, developing Snowflake stored procedures, schema modeling, and performance tuning.
- In-depth expertise in Snowflake data modeling and ELT processes using Snowflake SQL (see the sketch below), as well as implementing complex stored procedures and leveraging Snowflake task orchestration for advanced data workflows.
- Strong background in the DBT CLI, DBT Cloud, and GitHub version control, with the ability to design and develop complex SQL processes and ELT pipelines.
- Take a hands-on approach in designing, developing, and supporting low-latency data pipelines, prioritizing data quality, accuracy, reliability, and efficiency.
- Advanced SQL knowledge and hands-on experience in complex query writing using analytical functions; troubleshooting, problem-solving, and performance tuning of SQL queries accessing the data warehouse; strong knowledge of stored procedures.
- Collaborate closely with cross-functional teams, including enterprise architects, business analysts, product owners, and solution architects, actively engaging in gathering comprehensive business requirements and translating them into scalable data cloud and Enterprise Data Warehouse (EDW) solutions that precisely align with organizational needs.
- Play a hands-on role in conducting data modeling, ETL (Extract, Transform, Load) development, and data integration processes across all Snowflake environments.
- Develop and implement comprehensive data governance policies and procedures to fortify the accuracy, security, and compliance of Snowflake data assets across all environments.
- Independently conceptualize and develop innovative ETL and reporting solutions, driving them through to successful completion.
- Create comprehensive documentation for database objects and structures to ensure clarity and consistency.
- Troubleshoot and resolve production support issues post-deployment, providing effective solutions as needed.
- Devise and sustain comprehensive data dictionaries, metadata repositories, and documentation to bolster governance and facilitate usage across all Snowflake environments.
- Remain abreast of the latest industry trends and best practices, actively sharing knowledge and encouraging the team to continuously enhance their skills.
- Continuously monitor the performance and usage metrics of the Snowflake database and Enterprise Data Warehouse (EDW), conducting frequent performance reviews and implementing targeted optimization efforts.

Skills Required:
- Familiarity with big data and data warehouse architecture and design principles.
- Strong understanding of database management systems, data modeling techniques, and data profiling and data cleansing techniques.
- Expertise in Snowflake architecture, administration, and performance tuning.
- Experience with Snowflake security configurations and access controls.
- Knowledge of Snowflake's data sharing and replication features.
- Proficiency in SQL for data querying, manipulation, and analysis.
- Experience with ETL (Extract, Transform, Load) tools and processes.
- Ability to translate business requirements into scalable EDW solutions.
- Streaming technologies like AWS Kinesis.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 10 years of hands-on experience in data warehousing, ETL development, and data modeling, with a strong track record of designing and implementing scalable Enterprise Data Warehouse (EDW) solutions.
- 3+ years of extensive hands-on experience with Snowflake, demonstrating expertise in leveraging its capabilities.
- Proficiency in SQL and deep knowledge of various database management systems (e.g., Snowflake, Azure, Redshift, Teradata, Oracle, SQL Server).
- Experience utilizing ETL tools and technologies such as DBT, Informatica, SSIS, and Talend.
- Expertise in data modeling techniques, with a focus on dimensional modeling and star schema design.
- Familiarity with data governance principles and adeptness in implementing security best practices.
- Excellent problem-solving and troubleshooting abilities, coupled with a proven track record of diagnosing and resolving complex database issues.
- Demonstrated leadership and team management skills, with the ability to lead by example and inspire others to strive for excellence.
- Experience in the finance industry is a significant advantage.
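A brief sketch of the kind of ELT statement this role describes, run through the Snowflake Python connector. The account, credentials, warehouse, and table names are placeholders; in practice Snowpipe, Snowflake tasks, or DBT models would orchestrate such a MERGE rather than an ad hoc script.

```python
# Hedged sketch: incremental upsert from a staging table into a curated
# table via the Snowflake Python connector. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",           # hypothetical user
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute("""
        MERGE INTO curated.orders AS t
        USING staging.orders AS s
          ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.amount = s.amount
        WHEN NOT MATCHED THEN INSERT (order_id, amount)
             VALUES (s.order_id, s.amount)
    """)
    print(f"rows affected: {cur.rowcount}")
finally:
    conn.close()
```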

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Data Engineer (6 months to 2 years of experience; 2023-2024 graduates only)

Company Description
Visa is a world leader in payments and technology, with over 259 billion payment transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive, driven by a common purpose: to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa.

Job Description
We are seeking a Data Engineer with a strong background in data engineering. This role involves managing system requirements, design, development, integration, quality assurance, implementation, and maintenance of corporate applications.
- Work with product owners, business stakeholders, and internal teams to understand business requirements and desired business outcomes.
- Assist in scoping and designing analytic data assets, implementing modelled attributes, and contributing to brainstorming sessions.
- Build and maintain a robust data engineering process to develop and implement self-serve data and tools for Visa's product management teams and data scientists.
- Find opportunities to create, automate, and scale repeatable analyses or build self-service tools for business users.
- Execute data engineering projects ranging from small to large, either individually or as part of a project team.
- Set the benchmark in the team for good data engineering practices and assist leads and architects in solution design.
- Exhibit a passion for optimizing existing solutions and making incremental improvements.

This is a hybrid position. Expectation of days in office will be confirmed by your hiring manager.

Qualifications

Basic Qualifications
- Bachelor's degree, or 3+ years of relevant work experience.

Preferred Qualifications
- A minimum of 1 year of experience building data engineering pipelines.
- Design and coding skills with big data technologies like Hadoop, Spark, Hive, and MapReduce (see the PySpark sketch below).
- Mastery of PySpark or Scala.
- Expertise in a programming language such as Java or Python; knowledge of OOP concepts like inheritance and polymorphism, and of implementing design patterns, is needed.
- Experience with cloud platforms like AWS, GCP, or Azure is good to have.
- Excellent problem-solving skills and the ability to think critically.
- Experience with any one ETL tool, such as Informatica, SSIS, Pentaho, or Azure Data Factory.
- Knowledge of successful design and development of data-driven real-time and batch systems.
- Experience in data warehousing and expertise in any one RDBMS, such as SQL Server or Oracle.
- Nice to have: reporting skills in Power BI, Tableau, or QlikView.
- Strong understanding of cloud architecture and service offerings, including compute, storage, databases, networking, AI, and ML.
- Passion for delivering zero-defect code that meets or exceeds the proposed defect SLA, with a high sense of accountability for quality and timeliness of deliverables.
- Experience developing as part of an Agile/Scrum team is preferred, along with hands-on Jira experience.
- Understanding of basic CI/CD functionality and Git concepts is a must.

Additional Information
Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
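As referenced in the preferred qualifications above, here is a minimal PySpark batch-aggregation sketch; the input path, schema, and column names are hypothetical.

```python
# Minimal PySpark sketch: read a (hypothetical) transactions file and
# aggregate spend per merchant, the kind of batch job the posting describes.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-aggregation").getOrCreate()

txns = spark.read.csv("s3://bucket/transactions.csv",  # hypothetical path
                      header=True, inferSchema=True)
summary = (
    txns.groupBy("merchant_id")
        .agg(F.count("*").alias("txn_count"),
             F.sum("amount").alias("total_amount"))
        .orderBy(F.desc("total_amount"))
)
summary.show(10)
spark.stop()
```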

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Role: PySpark Developer
Locations: Hyderabad & Bangalore
Work Mode: Hybrid
Interview Mode: Virtual (2 rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark and Python, and experience with modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
- Proficiency in Python for scripting, automation, and building reusable components.
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows (see the Airflow sketch below).
- Familiarity with the AWS ecosystem, especially S3 and related file system operations.
- Strong understanding of Unix/Linux environments and shell scripting.
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks.
- Ability to handle CDC (Change Data Capture) operations on large datasets.
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting.
- Strong knowledge of data modeling, data validation, and writing unit test cases.
- Exposure to real-time and batch integration with downstream/upstream systems.
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
- Experience building or integrating APIs for data provisioning.
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
- Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, ETL pipelines, AWS S3, Airflow, Control-M, SQL, Unix/Linux, shell scripting, Hive, Hadoop, Cloudera, Hortonworks, CDC, data modeling, data validation, performance tuning, unit test cases, batch and real-time integration, Jupyter Notebook, Zeppelin, PyCharm, Agile methodologies, CI/CD, Jenkins, Git, Tableau, QlikView, Jasper, Informatica, API integration, AI/ML model development
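A small sketch of the Airflow orchestration called out above, in Airflow 2.x style; the DAG id, schedule, and callable are illustrative, and in a real pipeline the task would submit a PySpark application rather than print.

```python
# Sketch of a daily Airflow DAG that triggers a (hypothetical) PySpark
# ingestion step. DAG id, schedule, and callable are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_ingestion(**context):
    # Placeholder for a spark-submit or PySpark job invocation.
    print(f"ingesting partition for {context['ds']}")

with DAG(
    dag_id="daily_pyspark_ingest",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=run_ingestion)
```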

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

India

On-site


Title: Data Support Engineer
Duration: 6-month contract
Skills: data engineering and support activities, SQL, Python

Mandatory Skills:
- 3+ years of experience in data engineering and support activities.
- Strong expertise in SQL (advanced queries, optimization, indexing).
- Proficiency in Python and shell scripting for automation.
- Knowledge of data warehousing concepts (star schema, partitioning, etc.).
- Good understanding of monitoring tools (see the freshness-check sketch below).
- Good to have: technical knowledge of ETL tools (Informatica) and cloud data platforms (Snowflake, Databricks).

Desired Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced environment to meet project timelines.
- Bachelor's degree in Computer Science, IT, or a related field.
- Relevant certifications (Azure Data Engineer, etc.) will be a plus.
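As a concrete example of the support-style monitoring automation above, here is a data-freshness check sketch in Python, with SQLite standing in for the real platform; the table, timestamp column, and one-hour SLA are assumptions.

```python
# Freshness-check sketch: alert if the newest loaded_at timestamp in a table
# is older than an SLA. Table, column, and SLA are hypothetical.
import sqlite3
from datetime import datetime, timedelta, timezone

def check_freshness(db_path: str, table: str, max_lag: timedelta) -> bool:
    con = sqlite3.connect(db_path)
    (latest,) = con.execute(f"SELECT MAX(loaded_at) FROM {table}").fetchone()
    con.close()
    if latest is None:
        print(f"ALERT: {table} is empty")
        return False
    latest_dt = datetime.fromisoformat(latest)
    if latest_dt.tzinfo is None:
        # Assume stored timestamps are UTC if they carry no timezone.
        latest_dt = latest_dt.replace(tzinfo=timezone.utc)
    lag = datetime.now(timezone.utc) - latest_dt
    if lag > max_lag:
        print(f"ALERT: {table} is {lag} behind (SLA {max_lag})")
        return False
    print(f"OK: {table} lag {lag}")
    return True

if __name__ == "__main__":
    check_freshness("warehouse.db", "orders", timedelta(hours=1))
```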

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Greetings,

We have an immediate opportunity for an ETL Developer (5-7 years) at Synechron, Mumbai.

Job Role: ETL Developer
Job Location: Mumbai

About Synechron
We began life in 2001 as a small, self-funded team of technology specialists. Since then, we've grown our organization to 14,500+ people across 58 offices in 21 countries, in key global markets. We're now a leading global digital consulting firm, providing innovative technology solutions for business. As a trusted partner, we're always at the forefront of change as we lead digital optimization and modernization journeys for our clients. Our expertise in AI, Consulting, Data, Digital, Cloud & DevOps, and Software Engineering delivers customized, end-to-end solutions that drive business value and growth.

Responsibilities:
• Perform business/data analysis to investigate various business problems and propose solutions, working closely with clients and the team.
• Analyze report requirements, perform gap analysis, understand existing systems and data flows, model the required changes, and source, enrich, and publish to Axiom for final reporting.
• Conduct detailed requirements analysis with end users to understand business (reporting) needs.
• Optimize performance and scalability to support large-scale deployments.
• Define new features in conjunction with product management, and provide specifications.
• Ensure quality and completeness of the final product through unit testing, documentation, and maintenance as appropriate.

Technical Skills
• 3 years of experience in database development (SQL, data modelling, stored procedures).
• 3 years of ETL programming/data engineering (tools like Informatica not necessary).
• Job scheduler tools: Autosys.
• UNIX.

Preferred Skills
• Axiom CV10 experience.
• Understanding of U.S. risk and regulatory requirements.
• DevOps tooling.
• Experience with Java/Scala/Python and the Spark framework.
• Exposure to the BFSI and finance industry.

Non-Technical Skills:
• High interest in finance business and data analysis required; interest in accounting preferred.
• Eager to learn the finance business (financial products, regulatory requirements, accounting), working closely with internal clients.
• Ownership/accountability and a desire to take on an expanded role over time.
• Ability to multi-task and work against deadlines/priorities.
• Eager to work with new technologies and apply them toward enterprise-level data solutions.
• Capable of working with a global team and thriving in a team development environment.
• Advanced problem-solving skills and the ability to deal with real-world business issues.

For more information on the company, please visit our website or LinkedIn community. If you find this opportunity interesting, kindly share your updated profile at Mandar.Jadhav@synechron.com along with: current CTC, expected CTC, notice period, current location, whether you are ready to relocate to Mumbai, and whether you have interviewed with Synechron before (if yes, when).

Regards,
Mandar Jadhav
Mandar.Jadhav@synechron.com

Posted 2 weeks ago

Apply

0 years

0 Lacs

Greater Delhi Area

On-site


About the Company
R Systems International Limited (https://www.rsystems.com/about-us/factsheet/) is a Blackstone portfolio company founded in 1993, headquartered in El Dorado Hills, California, United States, with offshore delivery centers located in Noida, Pune, and Chennai. R Systems International Limited is publicly listed on the NSE and BSE, with a current share price of around Rs 500+. It is a leading digital product engineering company that designs and builds next-gen products, platforms, and digital experiences, empowering clients across various industries to overcome digital barriers, put their customers first, and achieve higher revenues as well as operational efficiency. We constantly innovate and bring fresh perspectives to harness the power of the latest technologies like cloud, automation, AI, ML, analytics, and mixed reality.

Role: Data Engineer

Responsibilities
- Strong expertise in ETL processes and tools like Informatica, with experience in designing and maintaining data pipelines.
- Proficiency in PySpark for big data processing and distributed computing.
- Advanced SQL skills and experience with PL/SQL for managing and querying large datasets in relational databases.
- Exposure to cloud platforms like AWS and GCP for data storage, data processing, and deployment of ETL solutions.
- Experience in data integration, transformation, and loading between different systems and databases.
- Familiarity with data modeling, data warehousing, and database performance tuning.
- Understanding of Agile methodologies and participation in sprint-based development cycles.

Qualifications: BE/BTech

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana

On-site


Senior ETL and Backend Developer (Salesforce)
Hyderabad, India; Ahmedabad, India | Information Technology | 316835

Job Description

About The Role:
Grade Level (for internal use): 10

Title: Senior ETL and Backend Developer (Salesforce)
Job Location: Hyderabad, Ahmedabad, Gurgaon, Virtual-India

The Team: We are seeking a skilled Senior ETL and Backend Developer with extensive experience in Informatica and Salesforce. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes and backend systems to ensure seamless data integration and management. The team works in a challenging environment that gives ample opportunities to use innovative ideas to solve complex problems. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe.

The Impact: You will make significant contributions to building solutions for web applications using new front-end technologies and microservices. Your work will deliver products and solutions for S&P Global Commodity Insights customers.

Responsibilities:
- ETL Development: Design, develop, and maintain ETL processes using Informatica PowerCenter and other ETL tools.
- Data Integration: Integrate data from various sources, including databases, APIs, flat files, and cloud storage, into data warehouses or data lakes.
- Backend Development: Develop and maintain backend systems using relevant programming languages and frameworks.
- Salesforce Integration: Implement and manage data integration between Salesforce and other systems (see the sketch after this posting).
- Performance Tuning: Optimize ETL processes and backend systems for speed and efficiency.
- Data Quality: Ensure data quality and integrity through rigorous testing and validation.
- Monitoring and Maintenance: Continuously monitor ETL processes and backend systems for errors or performance issues and make necessary adjustments.
- Collaboration: Work closely with data architects, data analysts, and business stakeholders to understand data requirements and deliver solutions.

Qualifications:

Basic Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent.
- A minimum of 8+ years of experience in software engineering and architecture.
- A minimum of 5+ years of experience in ETL development, backend development, and data integration.
- A minimum of 3+ years of Salesforce development, administration, and integration.
- Proficiency in Informatica PowerCenter and other ETL tools.
- Strong knowledge of SQL and database management systems (e.g., Oracle, SQL Server).
- Experience with Salesforce integration and administration.
- Proficiency in backend development languages (e.g., Java, Python, C#).
- Familiarity with cloud platforms (e.g., AWS, Azure) is a plus.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Nice to have: GenAI, Java, Spring Boot, Knockout JS, RequireJS, Node.js, Lodash, TypeScript, VSTest/MSTest/nUnit.

Preferred Qualifications:
- Proficiency with software development lifecycle (SDLC) methodologies like SAFe, Agile, and test-driven development.
- Experience with other ETL tools and data integration platforms.
- Informatica Certified Professional.
- Salesforce Certified Administrator or Developer.
- Knowledge of back-end technologies such as C#/.NET, Java, or Python.
- Excellent problem-solving, analytical, and technical troubleshooting skills.
- Able to work well individually and with a team.
- Good work ethic, self-starter, and results-oriented.
- Excellent communication skills are essential, with strong verbal and writing proficiencies.

About S&P Global Commodity Insights
At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the energy transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture, and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)

Job ID: 316835
Posted On: 2025-06-03
Location: Hyderabad, Telangana, India
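The Salesforce integration responsibility above could start with an extraction like the sketch below, which uses the simple_salesforce client purely as an illustration (the posting's actual stack centers on Informatica); the credentials and SOQL query are placeholders.

```python
# Hedged sketch: pull a slice of Salesforce account data for ETL staging
# via simple_salesforce. Credentials and query are hypothetical.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="etl_user@example.com",   # hypothetical
    password="***",
    security_token="***",
)

# Query a small batch of accounts to stage into the warehouse.
result = sf.query("SELECT Id, Name, Industry FROM Account LIMIT 100")
rows = [(r["Id"], r["Name"], r["Industry"]) for r in result["records"]]
print(f"fetched {len(rows)} accounts for staging")
```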

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Noida H.O , Noida, Uttar Pradesh

On-site


About the Role: We are seeking talented and detail-oriented Data Engineers with expertise in Informatica MDM to join our fast-growing data engineering team. Depending on your experience, you'll join as a Software Engineer or Senior Software Engineer, contributing to the design, development, and maintenance of enterprise data management solutions that support our business objectives. As a key player, you will be responsible for building reliable data pipelines, working with master data management, and ensuring data quality, governance, and integration across systems.

Responsibilities:
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM.
- Develop and maintain batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
- Perform data profiling, data quality assessments, and master data matching/merging (see the sketch below).
- Implement governance, stewardship, and metadata management practices.
- Optimize the performance of the Informatica MDM Hub, IDD, and associated components.
- Write complex SQL queries and stored procedures as needed.

Senior Software Engineer: Additional Responsibilities
- Lead design discussions and code reviews; mentor junior engineers.
- Architect scalable data integration solutions using Informatica and complementary tools.
- Drive adoption of best practices in data modeling, governance, and engineering.
- Work closely with cross-functional teams to shape the data strategy.

Required Qualifications:

Software Engineer:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2-4 years of experience with Informatica MDM (Customer 360, Business Entity Services, match/merge rules).
- Strong SQL and data modeling skills.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.

Senior Software Engineer:
- Bachelor's or Master's in Computer Science, Data Engineering, or a related field.
- 4+ years of experience with Informatica MDM, with at least 2 years in a lead role.
- Proven track record of designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

Preferred Skills (Nice to Have):
- Experience with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.

Job Type: Full-time
Pay: Up to ₹1,853,040.32 per year
Benefits: Flexible schedule, health insurance, life insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus, yearly bonus
Application Question(s): What is your notice period?
Education: Bachelor's (Preferred)
Experience: Informatica: 4 years (Preferred)
Location: Noida H.O, Noida, Uttar Pradesh (Preferred)
Work Location: In person
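To make the match/merge idea above concrete, here is a toy fuzzy-matching sketch. Real Informatica MDM match rules are configured in the Hub rather than hand-coded, and the threshold and fields here are assumptions for illustration only.

```python
# Toy illustration of MDM-style duplicate matching: flag record pairs whose
# names look like the same master entity. Threshold and fields are assumptions.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_candidates(records, threshold: float = 0.85):
    """Yield pairs of records whose names exceed the match threshold."""
    for i, left in enumerate(records):
        for right in records[i + 1:]:
            if similarity(left["name"], right["name"]) >= threshold:
                yield left, right

if __name__ == "__main__":
    customers = [
        {"id": 1, "name": "Acme Corp"},     # hypothetical sample data
        {"id": 2, "name": "ACME Corp."},
        {"id": 3, "name": "Globex Ltd"},
    ]
    for a, b in match_candidates(customers):
        print(f"possible duplicate: {a['id']} <-> {b['id']}")
```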

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Greater Kolkata Area

On-site


Contract Duration: 12 Months
Location: PAN India
Experience: 6 to 10 years

Key Responsibilities
- Assist developers in implementing enterprise data governance and management tools (data catalog, metadata management, data quality, master data management).
- Conduct testing and validation of data governance tools and processes to ensure compliance with standards.
- Monitor deployment practices and enforce governance guardrails.
- Coordinate with stakeholders to ensure requirements meet acceptance criteria related to data governance frameworks, the data dictionary, metadata, and access controls.
- Ensure remediation plans are implemented for data failing governance standards.
- Support business owners and data stewards in resolving enterprise data issues using industry standards.
- Communicate changes, issues, and business impacts clearly within the team.
- (Good to have) Understanding of Agile methodologies (Scrum, Kanban, Lean).

Minimum Requirements
- Bachelor's/Master's degree in Computer Science, Software Engineering, or equivalent.
- 5+ years in data governance, data quality, data migration, or data preparation.
- Minimum 3 years of hands-on experience with Informatica CDGC, Axon, EDC, and IDQ.
- Experience working with Agile methodologies.
- Strong critical thinking and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to work effectively with senior management and cross-functional teams.

Good To Have
- Understanding of cloud technologies (AWS Data Lake, S3, Glue, Snowflake).
- Knowledge of BI tools like ThoughtSpot and Power BI.
- Experience in the real estate industry (preferred).
- Intellectual curiosity and willingness to learn new technical skills.

Posted 2 weeks ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  1. Junior Developer
  2. Informatica Developer
  3. Senior Developer
  4. Informatica Tech Lead
  5. Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced; see the SCD Type 2 sketch after this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced; see the lookup-cache sketch after this list)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
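
Several of these questions benefit from a concrete sketch. Below is a minimal, illustrative Python example of the core idea behind an SCD Type 2 load: when a tracked attribute changes, the current dimension row is expired and a new version is inserted. The table, column names, and data here are hypothetical, and in Informatica this logic would typically be built with a lookup, router, and update strategy transformation rather than hand-written code.

```python
import sqlite3
from datetime import date

# Hypothetical dimension table, used only for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,     -- NULL means the row is still current
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (101, 'Pune', '2023-01-01', NULL, 1)")

def apply_scd2(customer_id, new_city, load_date):
    """Expire the current row if the tracked attribute changed,
    then insert the new current version (SCD Type 2)."""
    cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] == new_city:
        return  # No change detected; nothing to do.
    if row:
        # Close out the existing current record.
        cur.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
    # Insert the new current version, preserving history.
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, load_date),
    )
    conn.commit()

apply_scd2(101, "Bangalore", str(date.today()))
for record in cur.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(record)
```

Running the sketch prints the expired Pune row alongside the new current Bangalore row, which is the history-preserving behavior interviewers usually probe for.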

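Likewise, the static versus dynamic cache distinction is easiest to show with data. With a static lookup cache, the cache is built once at session start and never updated, so two source rows carrying the same new key are both flagged as inserts; a dynamic cache is updated as rows pass through, so the duplicate is caught. The sketch below models both behaviors in plain Python with hypothetical data; it is an analogy for discussion, not Informatica's internal implementation.

```python
# Hypothetical incoming rows: customer 202 appears twice in the same load.
incoming = [
    {"customer_id": 202, "city": "Chennai"},
    {"customer_id": 202, "city": "Chennai"},
]
target_keys = {101}  # Keys already present in the target dimension.

def classify(rows, dynamic_cache):
    """Flag each row as INSERT or UPDATE using a lookup cache.
    With dynamic_cache=True, newly inserted keys are added to the cache."""
    cache = set(target_keys)  # Cache built from the target at session start.
    actions = []
    for row in rows:
        if row["customer_id"] in cache:
            actions.append(("UPDATE", row["customer_id"]))
        else:
            actions.append(("INSERT", row["customer_id"]))
            if dynamic_cache:
                cache.add(row["customer_id"])  # Dynamic cache learns new keys.
    return actions

print("static cache: ", classify(incoming, dynamic_cache=False))
# -> [('INSERT', 202), ('INSERT', 202)]  duplicate insert risk
print("dynamic cache:", classify(incoming, dynamic_cache=True))
# -> [('INSERT', 202), ('UPDATE', 202)]  duplicate caught in-flight
```
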
Closing Remark

As you prepare for Informatica job opportunities in India, keep enhancing your skills, stay current with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!
