
3523 Informatica Jobs - Page 7

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Location: Hyderabad
Contract Duration: 6 Months
Experience Required: 8+ years (overall), 5+ years (relevant)

Primary Skills: Python, Spark (PySpark), SQL, Delta Lake

Key Responsibilities & Skills:
- Strong understanding of Spark core: RDDs, DataFrames, Datasets, Spark SQL, Spark Streaming
- Proficiency with Delta Lake features: time travel, schema evolution, data partitioning
- Experience designing and building data pipelines using Spark and Delta Lake
- Solid experience in Python/Scala/Java for Spark development
- Knowledge of data ingestion from files, APIs, and databases
- Familiarity with data validation and quality best practices
- Working knowledge of data warehouse concepts and data modeling
- Hands-on with Git for code versioning
- Exposure to CI/CD pipelines and containerization tools
- Nice to have: experience with ETL tools such as DataStage, Prophecy, Informatica, or Ab Initio
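The data validation and quality practices this posting asks for can be sketched in plain Python. This is a minimal illustration only, with a hypothetical record layout and rules that are not part of the posting: each ingested record is checked against an expected schema before being written downstream.

```python
# A minimal sketch of a pre-load validation step; the record layout,
# field names, and rules below are hypothetical, not from the posting.
EXPECTED_SCHEMA = {"id": int, "event_ts": str, "amount": float}
REQUIRED = {"id", "event_ts"}

def validate(record: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    problems = []
    for field, typ in EXPECTED_SCHEMA.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif record[field] is None:
            if field in REQUIRED:
                problems.append(f"null in required field: {field}")
        elif not isinstance(record[field], typ):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    return problems

good = {"id": 1, "event_ts": "2024-01-01T00:00:00Z", "amount": 9.5}
bad = {"id": None, "amount": "9.5"}
print(validate(good))  # []
print(validate(bad))   # null id, missing event_ts, bad type for amount
```

In a real pipeline this kind of check would typically run as a quarantine step before a Delta Lake write, so that bad records are diverted rather than merged.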

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica Intelligent Cloud Services
Good-to-have skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: proficiency in Informatica Intelligent Cloud Services.
- Strong understanding of application development methodologies.
- Experience with cloud-based application deployment and management.
- Familiarity with data integration and transformation processes.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Intelligent Cloud Services.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Description

About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

About the Role
Our Financial Crimes specialist teams provide solutions to BFSI clients by conducting model validation testing for AML risk models and frameworks, sanctions screening, and transaction monitoring systems, to ensure the efficiency and efficacy of the underlying frameworks both functionally and statistically. We are looking to hire colleagues with advanced data science and analytics skills to support our Financial Crimes team. You will play a crucial role in helping clients tackle the multifaceted challenges of financial crime. By utilizing advanced analytics and deep technical knowledge, our team helps top clients reduce risks associated with financial crime, terrorist financing, and sanctions violations. We also work to enhance their screening and transaction monitoring systems. Our team of specialized analysts ensures that leading financial institutions adhere to industry best practices for robust programs and controls. Through a variety of project experiences, you will develop your professional skills, assisting clients in understanding and addressing complex issues, and implementing top-tier solutions to resolve identified problems.

Minimum work experience: 3+ years of advanced analytics
Preferred experience: 1+ years in AML model validation

Responsibilities
- Support functional SME teams in building data-driven Financial Crimes solutions.
- Conduct statistical testing of screening matching algorithms, risk rating models, and thresholds configured for detection rules.
- Validate data models of AML systems built on platforms such as SAS Viya, Actimize, LexisNexis, Napier, etc.
- Develop, validate, and maintain AML models to detect suspicious activities and transactions.
- Conduct Above-the-Line and Below-the-Line testing.
- Conduct thorough model validation processes, including performance monitoring, tuning, and calibration.
- Ensure compliance with regulatory requirements and internal policies related to AML model risk management.
- Collaborate with cross-functional teams to gather and analyze data for model development and validation.
- Perform data analysis and statistical modeling to identify trends and patterns in financial transactions.
- Prepare detailed documentation and reports on model validation findings and recommendations.
- Assist in feature engineering to improve GenAI prompts for automating AML/screening-related investigations.
- Use advanced machine learning (e.g. XGBoost) and GenAI approaches.

Criteria
- Bachelor's degree from an accredited university.
- 3+ years of hands-on experience in Python, with experience in Java and the Fast, Django, Tornado, or Flask frameworks.
- Working experience with relational and NoSQL databases such as Oracle, MS SQL, MongoDB, or Elasticsearch.
- Proficiency in BI tools such as Power BI, Tableau, etc.
- Proven experience in data model development and testing.
- Educational background in Data Science and Statistics.
- Strong proficiency in programming languages such as Python, R, and SQL.
- Expertise in machine learning algorithms, statistical analysis, and data visualization tools.
- Familiarity with regulatory guidelines and standards for AML.
- Experience in AML-related model validation and testing.
- Expertise in techniques and algorithms including sampling, optimization, logistic regression, cluster analysis, neural networks, decision trees, and supervised and unsupervised machine learning.

Preferred Experience
- Validation of AML compliance models, such as statistical testing of customer/transaction risk models, screening algorithm testing, etc.
- Experience developing proposals (especially for new solutions).
- Experience working with AML technology platforms, e.g. Norkom, SAS, LexisNexis, etc.
- Hands-on experience with data analytics tools such as Informatica, Kafka, etc.

Equal Employment Opportunity Information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability, or other legally protected status. KPMG India values diversity, and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary, and refusal to submit such information will not be prejudicial to you.

Qualifications
- Bachelor's degree from an accredited university
- Educational background in Data Science and Statistics
- 3+ years of hands-on experience in data science and data analytics
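The Above-the-Line / Below-the-Line testing this role mentions samples alerts scoring just above and just below a detection threshold, to check whether the threshold actually separates productive from non-productive alerts. A rough plain-Python sketch, with hypothetical scores and band width:

```python
import random

def atl_btl_sample(scores, threshold, band=0.1, n=5, seed=42):
    """Sample n scores from just above (ATL) and just below (BTL)
    a detection threshold; band and n here are illustrative choices."""
    atl = [s for s in scores if threshold <= s < threshold + band]
    btl = [s for s in scores if threshold - band <= s < threshold]
    rng = random.Random(seed)
    return (rng.sample(atl, min(n, len(atl))),
            rng.sample(btl, min(n, len(btl))))

# Synthetic risk scores, purely for illustration.
scores = [i / 100 for i in range(100)]   # 0.00 .. 0.99
atl, btl = atl_btl_sample(scores, threshold=0.8)
print(sorted(atl), sorted(btl))
```

In practice the sampled ATL alerts would be reviewed to confirm they are worth alerting on, and the BTL ones to confirm nothing suspicious is being missed below the cut-off.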

Posted 2 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Minimum of 8+ years of experience in ETL development.

Technical skills (in order of priority): Informatica BDM, Snowflake experience.
- Strong analytical and problem-solving skills.
- Strong understanding of ETL development best practices.
- Strong understanding of database concepts and performance tuning in SQL and Snowflake.
- Proven ability to work independently in a dynamic environment with multiple assigned projects and tasks.
- Outstanding ability to communicate, both verbally and in writing.
- Experience in production support.
- Experience in the banking domain.

Locations: Bangalore, IN, 562110
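One concrete instance of the SQL performance tuning this role calls for is making a selective filter use an index instead of a full table scan. Sketched here with SQLite purely for illustration (table, column, and index names are hypothetical; in Snowflake the analogous levers are clustering keys and partition pruning rather than user-created indexes):

```python
import sqlite3

# Hypothetical transactions table, used only to demonstrate the
# scan-vs-index-search difference in a query plan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany("INSERT INTO txns (account, amount) VALUES (?, ?)",
                 [(f"acct{i % 100}", float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail).
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM txns WHERE account = 'acct7'"
before = plan(query)                       # full table scan
conn.execute("CREATE INDEX idx_txns_account ON txns (account)")
after = plan(query)                        # index search
print(before)
print(after)
```

The same diagnostic habit (inspect the plan, change the physical design, inspect again) carries over to Snowflake via its `EXPLAIN` and query profile.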

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Welcome to Warner Bros. Discovery... the stuff dreams are made of.

Who We Are...
When we say "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms, and the dreamers creating what's next... From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.

Your New Role
Sr. Software Developer: this role is responsible for developing, testing, and deploying scalable, modern, and efficient solutions for the financial applications and integrations in the Enterprise Technology Finance Planning & Deal Management Solutions team's portfolio. You will collaborate closely with internal and external teams, provide guidance to junior team members, and ensure high-quality delivery of software products and solutions.

Your Role Accountabilities

Operations/Project Management
- Participate in all required project meetings and provide status of assigned tasks.
- Build diagrams representing the as-is (current state) architecture and the target (future state) architecture.
- Communicate technical information to non-technical stakeholders and provide technical guidance to junior team members.
- Collaborate with the rest of the development team to prioritize and deliver projects.
- Provide post go-live support on completed projects.
- Perform routine administrative tasks such as monitoring jobs, addressing vulnerabilities, and assisting infrastructure/database teams.
- Participate in SOX and internal audit testing.

Analyze/Build/Test/Deploy
- Work on various projects and assist project teams, business analysts, and project managers in executing business requirements using standard tools.
- Assist with implementing large-scale, innovative IT solutions to drive business transformation.
- Build software applications using application-specific technology.
- Build and implement cloud-based solutions using AWS services, third-party development tools, APIs, etc.
- Participate in code reviews and ensure adherence to coding standards and best practices.
- Address issues and ad-hoc tasks as assigned.
- Write and execute unit tests, both manual and automated, using various tools.
- Perform integration testing and debugging, as well as regression testing.
- Deploy the developed solutions to production using standard methods.
- Prepare technical documents (TDD, integration diagrams, knowledge-transfer documents).
- Assist support team members with resolution of issues in the production environment.

Continuous Learning & Improvement
- Stay up to date with the latest technologies and trends.
- Attend in-person/online trainings and conferences to improve skills.

Qualifications & Experience
- 5+ years of prior experience in a related field (media, entertainment, business development).
- At least 3 years of experience as a developer in the ERP Finance, Supply Chain Management, and Financial Solutions (packaged software, custom solutions, SaaS applications) area.
- A bachelor's or master's degree, or foreign equivalent, in Computer Science, Information Technology, Management Information Systems, Electronics, Management, or a related technical field.
- Experience developing architecture diagrams and data models using Visio or Lucid Charts.
- Working knowledge of one or more financial systems (ERP: SAP/S4, Oracle ERP Cloud, PeopleSoft Financials, Workday Financials), Ariba, and eInvoicing platforms.
- Knowledge of AWS, Azure, Informatica, and Data Lake.
- Experience working with SaaS, PaaS, custom, and third-party applications.
- Superior analytical and problem-solving skills.
- Excellent written and verbal communication.
- Superb relationship-building skills; works collaboratively with small teams.
- Ability to handle multiple assignments concurrently.

Not Required but Preferred Experience
- Experience working in a global company.
- Experience working with distributed teams in multiple geographical locations.
- Some visualization tool knowledge would be helpful (e.g. Tableau, Power BI).
- Comfortable working in a highly iterative and somewhat unstructured environment.

How We Get Things Done...
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day-to-day. We hope they resonate with you and look forward to discussing them during your interview.

Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds, and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability, or any other category protected by law. If you're a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Job Description
An experienced consulting professional with a broad understanding of solutions, industry best practices, and multiple business processes or technology designs within the financial services industry. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that require independent judgment in order to implement Oracle Finance and Risk products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. A strong technical resource with experience implementing OFSAA EPM and/or DIH solutions in the financial services industry. Domain knowledge and experience with the OFSAA platform and applications in a technical capacity. Analyzes user requirements, procedures, and problems to automate and improve systems. Career Level: IC3

Responsibilities

Our Team
The OFS consulting team implements OFSAA (Oracle Financial Services Analytical Applications) products in Tier 1, Tier 2, and Tier 3 banks and financial institutions. A key strength of OFS consulting is our experience in deploying OFSAA product solutions for our clients on a global basis. We have implemented our solutions across Asia Pacific, Africa, the Americas, the Middle East, and Europe, in countries such as the United Arab Emirates, Kuwait, Singapore, Malaysia, Japan, Korea, Brazil, America, the United Kingdom, Spain, Greece, Jordan, Lebanon, Switzerland, India, etc. The Oracle OFSAA product architecture offers a variety of implementation options, ranging from individual sites to local and regional hubs, cloud, and centralized compliance. The OFSAA product technology aligns to the operating model of the firm, rather than requiring business processes to map to the technology. The OFSAA applications fall under three major categories: ERM (Enterprise Risk Management), EPM (Enterprise Performance Management), and FCCM (Financial Crime and Compliance Management), with other technical/functional applications under Data Management.

We are looking for an ERM/EPM/IFRS17 Technical Consultant with implementation experience in banking and/or banking analytics, who will provide functional inputs and work closely with functional subject matter experts and other project team members to successfully deploy the OFSAA products. The candidate should have experience in the installation, configuration, deployment, and execution of:
- EPM applications such as Profitability (PFT), Fund Transfer Pricing (FTP), Balance Sheet Planning (BSP), and GL Reconciliation
- ERM applications such as Basel regulatory capital, credit risk (CR), market risk (MR), ALM, liquidity risk, economic capital, operational risk, etc.
- IFRS (International Financial Reporting Standards): IFRS9, IFRS17

Install and deploy the latest versions of ERM/EPM solutions in a customer environment, including coordinating with customer technical resources to ensure that third-party products are correctly configured for integration with OFSAA products. Candidates without OFS ERM/EPM/IFRS17 product experience should have experience implementing similar products, such as Actimize, Fortent, or Norkom compliance solutions, in a technical capacity. Support all phases of deploying the OFSAA ERM/EPM/IFRS17 solutions at customer sites, including initial installation, patches and upgrades, application configuration, user acceptance testing, and go-live. Interact with functional and technical consultants to ensure the successful deployment of the OFSAA products. Strong customer interaction skills and the ability to assess a client's IT processes and strategies are required; in addition, the candidate must be able to lead clients through the process of integrating the OFSAA ERM/EPM/IFRS17 solutions into their operational environment. Implementation experience with OFSAA solutions, working with functional consultants on data mapping. Should be able to produce the configuration, technical design, and batch execution documents. Should have SDLC and Agile experience; cloud-based implementation experience will also be handy. Excellent English written and oral communication skills.

Your Opportunity
You will own the technical delivery by liaising with the client's IT and compliance experts: installing OFS ERM/EPM/IFRS17, conducting product technical workshops, supporting data element mapping, documenting the configuration and the technical/architecture documents, and working with the functional consultant to configure the application for SIT and UAT and to support the production rollout.

Your Qualifications
- Graduate-level degree focused on Computer Science, Software Engineering, or Systems Engineering
- Hands-on experience installing and configuring OFSAA ERM/EPM/IFRS17, including large implementations involving multi-country rollouts
- Expert in PL/SQL and Oracle DB concepts, various web server configurations (WebLogic, WebSphere, Tomcat, etc.), configuring LDAP, and schedulers such as Control-M
- Years of experience implementing OFSAA ERM/EPM/IFRS17 or similar products in a functional capacity

Our Ideal Candidate
- Preferably an OFSAA ERM/EPM/IFRS17 technical expert who has worked on large transformation projects
- Expert in Agile methodologies, with strong Oracle DB/PL-SQL querying skills
- Strong knowledge of OFSAA ERM/EPM/IFRS17 application installation, migration, and configuration, performed independently
- Desire to upskill to the latest functional areas, such as machine learning and blockchain, and to apply them in projects
- Exposure to any of the following tools: Informatica, DataStage, Ab Initio, Oracle BI tools, the SQL Server BI tool set, Business Objects, Cognos, ERwin
- Good understanding of databases (Oracle, DB2, Exadata)
- Working knowledge of job scheduling products such as AutoSys and Control-M
- Ability to read and edit Unix shell script files
- Java, JSP, and J2EE standards with Oracle database
- Knowledge of web servers such as WebSphere, WebLogic, etc.
- Excellent English written and oral communication skills; must be able to clearly articulate OFSAA ERM/EPM/IFRS17 functionality and requirements to both clients and colleagues at all levels, from engineering staff to senior executive management

Your Responsibilities
As an integral part of the development team, you will be responsible for:
- Understanding the requirements from the client functional and business teams and qualifying the requirements against the product functionality
- Conducting workshops and trainings on the product's functional scope
- Taking on functional delivery responsibility: mapping support, BRD preparation, and support for configuration, testing, and rollout
- Working with an Agile/Water-Scrum-Fall development methodology
- Supporting the testing phases (System Testing, SIT, UAT, Production) and ensuring quick turnaround of defect fixes

Minimum Required Skills
- 5+ years of experience implementing OFSAA ERM/EPM/IFRS17 or similar products in a technical capacity
- Worked as part of a team, or led a team, implementing large OFSAA ERM/EPM/IFRS17 transformation projects
- Exposure to any of the following tools: Informatica, DataStage, Ab Initio, Oracle BI tools, the SQL Server BI tool set, Business Objects, Cognos, ERwin
- Good understanding of databases (Oracle, DB2, Exadata)
- Working knowledge of job scheduling products such as AutoSys and Control-M
- Java, JSP, and J2EE standards with Oracle database
- SaaS cloud implementation knowledge is a plus
- Knowledge of web servers such as WebSphere, WebLogic, etc.

Diversity and Inclusion
An Oracle career can span industries, roles, countries, and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical, life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Description An experienced consulting professional who has a broad understanding of solutions, industry best practices, multiple business processes or technology designs within the financial services industry. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, in order to implement Oracle Finance and Risk Products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Strong Technical Resource with experience implementing OFSAA EPM and/Or DIH solutions in the financial services industry. Domain knowledge and experience in OFSAA platform and Applications in a technical capacity. Analyze user requirements, procedure, and problems to automate/improve systems. Career Level - IC3 Responsibilities Our Team OFS consulting team is a team which Implements OFSAA (Oracle Financial Services Analytical Application) products in various Tier1, Tier2, Tier 3 Banks and financial institutions. Key strength of OFS consulting is our experience in deploying OFSAA Product solutions for our clients on a global basis. We have implemented our solutions across the Asia Pacific, Africa, Americas, Middle East and Europe, in leading countries such as United Arab Emirates, Kuwait, Singapore, Malaysia, Japan, Korea, Brazil, America, United Kingdom, Spain, Greece, Jordan, Lebanon Switzerland, India etc. The Oracle OFSAA Product architecture offers a variety of implementation options, ranging from individual sites, local and regional hubs, cloud and centralized compliance. The OFSAA Product technology aligns to the operating model of the firm, rather than requiring their business processes to map to the technology. The OFSAA Applications are categorized under three major categories – ERM – Enterprise Risk Management, EPM – Enterprise Performance Management, FCCM – Financial Crime and Compliance Management. 
Other Technical / Functional and rest under Data Management Looking for ERM / EPM / IFRS17 Technical Consultant who has Implementation experience of Banking and /or Banking Analytics experience will provide his functional inputs and work closely with Functional subject matter experts and other project team members to successfully deploy the OFSAA products. Should have experience in Installation, Configuration , deployment and execution of EPM Applications like Profitability (PFT), Fund Transfer Pricing (FTP), Balance sheet planning(BSP) , GL – Recon ERM applications like Basel Regulatory capital, CR, MR, ALM, Liquidity risk, Economic capital , operational Risk etc. IFRS (International Financial Reporting Standards) – IFRS9, IFRS17 Install and deploy the latest versions of ERM / EPM solutions in a customer environment. This includes coordinating with customer technical resources to ensure that third party products are correctly configured for integration with OFSAA products. If they do not have OFS ERM / EPM / IFRS17 Product experience, the resource should have experience in implementing similar products like Actimize, Fortent or Norkom compliance solutions in a Technical capacity Support all phases of deploying the OFSAA ERM / EPM / IFRS17 solutions at customer sites including initial installation, patches and upgrades, application configuration, user acceptance testing and go-live. Interact with Functional and Technical Consultants to ensure the successful deployment of the OFSAA Products. Strong customer interaction skills and the ability to assess a client’s IT processes and strategies. In addition, must be able to lead clients through the process of integrating the OFSAA ERM / EPM / IFRS17 solutions into their operational environment. Implementation experience with OFSAA solutions and has been working with Functional consultant to help on data mapping Should be able to come up with the Configuration, Technical design and Batch execution documents. 
Should have SDLC and Agile concepts; cloud-based implementation experience will be handy. Excellent English written and oral communication skills.

Your Opportunity You will be responsible for owning the technical delivery by liaising with the client IT and Compliance experts in installing OFS ERM / EPM / IFRS17, conducting product technical workshops, supporting data element mapping, documenting configuration and technical/architecture documents, and working with the Functional consultant to configure the application for SIT and UAT and to support the production rollout.

Your Qualifications Graduate level degree focused on Computer Science, Software Engineering or Systems Engineering. Hands-on experience in installing and configuring OFSAA ERM / EPM / IFRS17, having worked on large implementations involving multi-country rollouts. Expert in PL/SQL, Oracle DB concepts, various web server configurations (WebLogic, WebSphere, Tomcat etc.), configuring LDAP, schedulers like Control-M etc. Years of experience in implementation of OFSAA ERM / EPM / IFRS17 or similar products in a functional capacity.

Our Ideal Candidate Preferably an OFSAA ERM / EPM / IFRS17 technical expert who has worked on large transformation projects. Expert in Agile methodologies, with strong Oracle DB / PL/SQL querying skills. Strong knowledge of OFSAA ERM / EPM / IFRS17 application installation, migration and configuration, performed independently. Desire to get upskilled on the latest in functional areas like machine learning, blockchain etc. and implement them in the project. Exposure to any of the following tools: Informatica, DataStage, Ab Initio, Oracle BI Tools, SQL Server BI tool set, Business Objects, Cognos, ERWIN. Good understanding of databases (Oracle, DB2, Exadata). Working knowledge of job scheduling products such as AutoSys and Control-M. Ability to read and edit Unix shell script files. Java, JSP, J2EE standards with Oracle database. Knowledge of any of the web servers like WebSphere, WebLogic etc.
Excellent English written and oral communication skills. The consultant must be able to clearly articulate OFSAA ERM / EPM / IFRS17 functionality and requirements to both clients and colleagues at all levels, from engineering staff to senior executive management.

Your Responsibilities As an integral part of the development team, you will be responsible for the following: understanding the requirements from the client Functional and Business teams and qualifying the requirements against the product functionality; conducting workshops and trainings on the product functional scope; taking care of the functional delivery responsibility by delivering mapping support, BRD preparation, and support for configuration, testing and rollout; working on an Agile / Water-Scrum-Fall based development methodology; supporting the testing phases (System Testing, SIT, UAT, Production) and ensuring quick turnaround of defect fixes.

Minimum Required Skills 5+ years of experience in OFSAA ERM / EPM / IFRS17 product implementation in a technical capacity, or in a similar product. Worked as part of a team, or led a team, in implementing large OFSAA ERM / EPM / IFRS17 transformation projects. Exposure to any of the following tools: Informatica, DataStage, Ab Initio, Oracle BI Tools, SQL Server BI tool set, Business Objects, Cognos, ERWIN. Good understanding of databases (Oracle, DB2, Exadata). Working knowledge of job scheduling products such as AutoSys and Control-M. Java, JSP, J2EE standards with Oracle database. SaaS cloud implementation knowledge is a plus. Knowledge of any of the web servers like WebSphere, WebLogic etc.

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry.
In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of Employee Benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as Medical, Life Insurance, access to Retirement Planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sectorβ€”and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. 
Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 days ago


3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Data Engineer- Customer Lifecycle Engineering, CMR ABOUT US: LSEG (London Stock Exchange Group) is more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a dedication to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It’s how we’ve contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services – and our open-access model – we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. LSEG is headquartered in the United Kingdom, with significant operations in 65 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific. LSEG’s ticker symbol is LSEG. OUR PEOPLE: People are at the heart of what we do and drive the success of our business. Our values of Integrity, Partnership, Excellence and Change shape how we think, how we do things and how we help our people fulfil their potential. We embrace diversity and actively seek to attract individuals with unique backgrounds and perspectives. We break down barriers and encourage teamwork, enabling innovation and rapid development of solutions that make a difference. Our workplace generates an enriching and rewarding experience for our people and customers alike. Our vision is to build an inclusive culture in which everyone feels encouraged to fulfil their potential. 
We know that real personal growth cannot be achieved by simply climbing a career ladder – which is why we encourage and enable a wealth of avenues and interesting opportunities for everyone to broaden and deepen their skills and expertise. As a global organisation spanning 65 countries and one rooted in a culture of growth, opportunity, diversity and innovation, LSEG is a place where everyone can grow, develop and fulfil their potential with meaningful careers.

ROLE PROFILE As a Data Engineer, your primary focus will be designing data integration/migration using big data technologies, developing new features, and implementing, testing and managing data workloads within the standard continuous development and continuous delivery processes in a cloud-based environment.

TECH PROFILE/ESSENTIAL SKILLS 3-5 years of technical experience. 3 years of experience on data migration using ETL tools. Able to translate business requirements into data solutions. Expert level skill on Informatica IICS or Matillion or an equivalent ETL tool, covering everything from source extraction to building complex mappings. Proficient in the use and extraction of data from Salesforce. Expert level skill on Power BI/Tableau. Knowledge of Salesforce data modelling is desirable. Proficient level coding knowledge on Python and SQL. Understanding of Salesforce concepts is desirable. Experience with providing technical solutions and supporting documentation.

PREFERRED SKILLS AND EXPERIENCE Must have experience on Snowflake storage and database. Should have experience of working with cloud-native applications using AWS/Azure. Understanding of the SDLC and agile delivery methodology. Experience working with databases and data, performing data cleanup, and/or data manipulation and migration to and from Salesforce.com. Ability to work independently and in a team environment. Ability to communicate effectively in English with all levels of staff, both orally and in writing.
Ability to handle own work and multitask to meet tight deadlines without losing sight of priorities under minimum supervision. Highly motivated, self-directed individual with a positive and proactive demeanor at work. Customer and service focused, with determination to meet their needs and expectations. Driven and committed to the goals and objectives of the team and organization. Experience with JIRA and Confluence.

EDUCATION AND PROFESSIONAL SKILLS BE/MS degree in Computer Science, Software Engineering or a STEM degree (desirable). Solid English reading/writing capability required. Good communication and articulation skills. Curious about new technologies and tools; creative thinking and initiative taking. Agile related certifications preferable.

DETAILED RESPONSIBILITIES Proficient in data discovery, data migration, data quality analysis, and end-to-end data reconciliations by profiling the source systems using ETL/SQL (IICS/Matillion preferable). Analyses the data quality of the sources and lists the data quality metrics of the source systems. Proficient in MDM and customer data solutions; involved in Single Customer View and Customer 360 implementations. Proficient in data migration strategies for large-scale programs. Develops and improves data governance and business data processes within the Technology and business organizations, and understands client requirements, specifying and analyzing these to a sufficient level of detail to ensure transparency of definition and the ability for technical teams to translate them into a technical solution design. Works with developers, architects, and solution designers to translate sophisticated business requirements and provides feedback on technical solutions proposed. Responsible for building and maintaining relationships with business collaborators and impacted users. Proactively identifies, recommends, and implements improvements to the process as it relates to assigned projects.
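The end-to-end data reconciliation work described above usually reduces to comparing row counts, column aggregates, and key coverage between a source and a target system. A minimal sketch of that idea, using Python's built-in sqlite3 as a stand-in for the real source and target databases (table and column names here are hypothetical, not from the listing):

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_col, amount_col):
    """Compare row counts, an amount checksum, and key coverage between two tables."""
    cur = conn.cursor()
    metrics = {}
    for side, table in (("source", source_table), ("target", target_table)):
        # Row count and a simple numeric checksum per side
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        count, total = cur.fetchone()
        metrics[side] = {"rows": count, "sum": total}
    # Keys present in source but missing from target (a common migration defect)
    cur.execute(
        f"SELECT {key_col} FROM {source_table} "
        f"EXCEPT SELECT {key_col} FROM {target_table}"
    )
    metrics["missing_in_target"] = [row[0] for row in cur.fetchall()]
    return metrics

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_accounts (id INTEGER, balance REAL);
    CREATE TABLE tgt_accounts (id INTEGER, balance REAL);
    INSERT INTO src_accounts VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_accounts VALUES (1, 100.0), (2, 250.5);
""")
report = reconcile(conn, "src_accounts", "tgt_accounts", "id", "balance")
print(report)
```

In practice the same count/sum/anti-join queries would run against the actual source and warehouse engines (via IICS, Matillion, or plain SQL), with the resulting metrics feeding a data quality dashboard.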
Frequently keeps up to date with the latest industry development and technology innovations. Flexible in approach, adapting plans and strategies to help manage risks around ambiguity. Strategic problem solver with strong intuition for business and well-versed in current technological trends and business concepts. LSEG PURPOSE AND VALUES Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Underpinning our purpose, our values of Integrity, Partnership, Excellence and Change set the standard for everything we do, every day. They guide the way we interact with each other, the partners we work with and our customers. Delivering on our purpose and living up to our values is a responsibility that we all share. To achieve our ambitions through a strong culture, People Leaders need to role model our Values and create the culture for everyone at LSEG to be at their best. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership , Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. 
Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.

Posted 2 days ago


1.0 - 3.0 years

2 - 6 Lacs

Bengaluru

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description As a Senior Model Builder you are expected to interact with the Business Analyst or Solution Architect at onsite and understand the requirements. Basic understanding of source and target systems. Assign the user stories and assist in sprint planning. Provide the velocity for each story by technically assessing the dependencies. Create models, modules, lists, line items, subsets, and line item subsets, and use calculation functions. Create list properties. Ability to use Lookup and Sum functions. Ability to work on multiple timescales such as Day/Week/Month/Quarter/Year. Integration knowledge of Anaplan Connect and Informatica on Cloud will be helpful. UAT support and deployment using ALM. Anaplan certification of L1, L2 & L3. At least 1-3 years' experience in modelling sales, marketing, planning or finance business processes. Experience in territory quota planning and incentive compensation would be an advantage.

Deliver (Performance Parameter: Measure)
1. Continuous Integration, Deployment & Monitoring of Software: 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & Reporting: 100% on-time MIS & report generation

Mandatory Skills: Anaplan. Experience: 3-5 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 days ago


5.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote


Master Data Management Architect

CONMED is a global medical technology company that specializes in the development and manufacturing of surgical devices and equipment. With a mission to empower healthcare professionals to deliver exceptional patient care, CONMED is dedicated to innovation, quality, and excellence in all aspects of our operations. This is a remote opportunity for people living in India.

Role Overview The Master Data Management (MDM) Architect provides technical and administrative support for Master Data Management, focusing on improving data quality and aligning data governance to support data transformation. The role ensures consistent and accurate master data across the enterprise for better decision-making and operational efficiency. The ideal candidate is self-driven and adaptable, meticulously attentive, and brings relevant experience completing major data transformations.

Responsibilities Lead MDM projects involving data cleansing, standardization, and migration to support digital transformation initiatives. Collaborate with cross-functional teams across the organization to understand data needs, gather requirements, and ensure alignment with business transformation objectives. Establish data quality metrics, monitor data quality issues, and implement corrective actions to maintain high data integrity throughout the transformation process. Monitor and evaluate the effectiveness of MDM processes and identify opportunities for further optimization to support evolving business needs. Manage the on-premise application and administration of the Syniti and/or Informatica MDM platform. Develop and enforce data governance policies. Train staff on data management protocols.

Qualifications Bachelor’s degree in Information Technology, Computer Science, Business Administration, or a related field. Experience with the Syniti and/or Informatica MDM platform and SAP S/4HANA knowledge preferred.
5+ years of proven experience in managing MDM projects and implementing MDM solutions across large organizations. Strong understanding of data governance principles, data quality best practices, and data cleansing techniques. Strong analytical and problem-solving skills to identify data quality issues and develop solutions. Excellent communication and stakeholder management skills to collaborate with cross-functional teams. Self-driven and adaptable.

Preferred 3-5 years of experience supporting ERP transformations.

Key Competencies Technical Expertise: Experience with MDM projects, data governance principles, data quality best practices, and data cleansing techniques. Analytical Skills: Strong problem-solving abilities to identify and address data quality issues. Communication Skills: Excellent ability to collaborate with cross-functional teams and manage stakeholders. Project Management: Proven experience in managing MDM projects and implementing solutions across large organizations. Adaptability: Self-driven and adaptable to evolving business needs. Platform Knowledge: Preferred experience with the Syniti platform and SAP.
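The cleansing and standardization work at the heart of MDM is about collapsing record variants onto one golden record. A toy sketch of the idea in plain Python; the standardization rules and survivorship policy ("first record wins") are invented for illustration, whereas real platforms like Syniti or Informatica MDM express them as configurable match-and-merge rules:

```python
import re
from collections import defaultdict

# Hypothetical standardization rules of the kind an MDM platform encodes
SUFFIXES = {"incorporated": "inc", "corporation": "corp", "limited": "ltd"}

def standardize(name: str) -> str:
    """Lowercase, strip punctuation, and normalize legal suffixes into a match key."""
    tokens = re.sub(r"[^\w\s]", "", name.lower()).split()
    return " ".join(SUFFIXES.get(t, t) for t in tokens)

def dedupe(records):
    """Group records sharing a match key; the first record in each group survives."""
    groups = defaultdict(list)
    for rec in records:
        groups[standardize(rec["name"])].append(rec)
    return [recs[0] for recs in groups.values()]

records = [
    {"id": 1, "name": "Acme Corporation"},
    {"id": 2, "name": "ACME Corp."},   # duplicate of id 1 after standardization
    {"id": 3, "name": "Globex Limited"},
]
golden = dedupe(records)
print([r["id"] for r in golden])
```

Production survivorship is usually richer (most-recent wins, trusted-source wins, field-level merge), but the match-key-then-merge shape is the same.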

Posted 2 days ago


3.0 years

0 Lacs

India

Remote


Title: Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory. Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable. Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.

Tech Stack Azure | Databricks | PySpark | SQL

What We’re Looking For 3+ years experience in data engineering or analytics engineering. Hands-on with cloud data platforms and large-scale data processing. Strong problem-solving mindset and a passion for clean, efficient data design.

Job Description: Min 3 years of experience in modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks etc. Azure experience is preferred over other cloud platforms. 5 years of proven experience with SQL, schema design and dimensional data modelling. Solid knowledge of data warehouse best practices, development standards and methodologies. Experience with ETL/ELT tools like ADF, Informatica, Talend etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery etc. Strong experience with big data tools (Databricks, Spark etc.) and programming skills in PySpark and Spark SQL. Be an independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment. Excellent communication and teamwork abilities.

Nice-to-Have Skills: Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge. SAP ECC/S/4 and HANA knowledge.
Intermediate knowledge of Power BI. Azure DevOps and CI/CD deployments, cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, and basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
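The schema design and dimensional modelling experience the listing asks for centres on star schemas: a fact table of measures keyed to descriptive dimension tables. A toy illustration using Python's built-in sqlite3 (the tables and values are invented for the example; a real warehouse would be Synapse, Redshift, Snowflake, etc.):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: descriptive attributes behind a surrogate key
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    -- Fact: measures plus foreign keys into the dimensions
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_date   TEXT,
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (1, '2024-01-05', 10.0), (1, '2024-01-06', 15.0),
                                  (2, '2024-01-05', 7.5);
""")

# A typical analytic query: roll the fact up by a dimension attribute
rows = conn.execute("""
    SELECT d.category, ROUND(SUM(f.amount), 2) AS total
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchall()
print(rows)
```

The payoff of the star shape is exactly this query pattern: one join per dimension, with grouping and filtering done on small, descriptive dimension columns rather than the wide fact table.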

Posted 2 days ago


8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Testing Engineer
Exp: 8+ years
Location: Hyderabad and Gurgaon (Hybrid)
Notice Period: Immediate to 15 days

Job Description:
● Develop, maintain, and execute test cases to validate the accuracy, completeness, and consistency of data across different layers of the data warehouse.
● Test ETL processes to ensure that data is correctly extracted, transformed, and loaded from source to target systems while adhering to business rules.
● Perform source-to-target data validation to ensure data integrity and identify any discrepancies or data quality issues.
● Develop automated data validation scripts using SQL, Python, or testing frameworks to streamline and scale testing efforts.
● Conduct testing in cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake), ensuring performance and scalability.
● Familiarity with ETL testing tools and frameworks (e.g., Informatica, Talend, dbt).
● Experience with scripting languages to automate data testing.
● Familiarity with data visualization tools like Tableau, Power BI, or Looker.
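The source-to-target validation described above can be sketched as a field-by-field comparison keyed on a business identifier, emitting one discrepancy record per mismatch. A minimal, hypothetical Python version (real suites would pull the two row sets via SQL and run inside a framework like pytest or dbt tests):

```python
def validate_source_to_target(source_rows, target_rows, key="id"):
    """Compare every source field against the target row with the same key.

    Returns a list of (key, field, expected, actual) discrepancy tuples.
    """
    target_by_key = {row[key]: row for row in target_rows}
    issues = []
    for src in source_rows:
        tgt = target_by_key.get(src[key])
        if tgt is None:
            # Row never made it through the load
            issues.append((src[key], "missing in target", None, None))
            continue
        for field, expected in src.items():
            if tgt.get(field) != expected:
                issues.append((src[key], field, expected, tgt.get(field)))
    return issues

source = [
    {"id": 1, "country": "IN", "amount": 100},
    {"id": 2, "country": "US", "amount": 200},
]
target = [
    {"id": 1, "country": "IN", "amount": 100},
    {"id": 2, "country": "USA", "amount": 200},  # transformation defect
]
print(validate_source_to_target(source, target))
```

The same shape scales down to spot checks and up to automated regression suites: only the row-fetching layer changes per platform (Redshift, BigQuery, Snowflake).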

Posted 2 days ago


8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Location: Hyderabad
Contract Duration: 6 Months
Experience Required: 8+ years (Overall), 5+ years (Relevant)

Primary Skills: Python, Spark (PySpark), SQL, Delta Lake

Key Responsibilities & Skills: Strong understanding of Spark core: RDDs, DataFrames, DataSets, SparkSQL, Spark Streaming. Proficient in Delta Lake features: time travel, schema evolution, data partitioning. Experience designing and building data pipelines using Spark and Delta Lake. Solid experience in Python/Scala/Java for Spark development. Knowledge of data ingestion from files, APIs, and databases. Familiarity with data validation and quality best practices. Working knowledge of data warehouse concepts and data modeling. Hands-on with Git for code versioning. Exposure to CI/CD pipelines and containerization tools. Nice to have: experience in ETL tools like DataStage, Prophecy, Informatica, or Ab Initio.
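The data partitioning that Delta Lake (and Spark generally) relies on uses Hive-style partition directories such as `country=IN/`, so that a filter on the partition column lets the engine prune whole directories instead of scanning every file. A rough stdlib-only illustration of that layout idea; this is not Delta itself, and the column and file names are invented for the example:

```python
import csv
import tempfile
from pathlib import Path

# Write records into Hive-style partition directories, mirroring the layout
# Spark/Delta produce when a table is partitioned by a column.
root = Path(tempfile.mkdtemp())
records = [
    {"country": "IN", "city": "Hyderabad"},
    {"country": "IN", "city": "Pune"},
    {"country": "US", "city": "Austin"},
]
for rec in records:
    part_dir = root / f"country={rec['country']}"
    part_dir.mkdir(exist_ok=True)
    with open(part_dir / "part-0000.csv", "a", newline="") as f:
        csv.writer(f).writerow([rec["city"]])

def read_partition(root, country):
    """'Partition pruning': a filter on the partition column touches one directory."""
    rows = []
    for path in (root / f"country={country}").glob("*.csv"):
        with open(path, newline="") as f:
            rows += [r[0] for r in csv.reader(f)]
    return rows

print(sorted(read_partition(root, "IN")))
```

Delta adds a transaction log on top of this layout, which is what enables the time travel and schema evolution features the listing mentions, but the directory-per-partition-value idea is the same.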

Posted 3 days ago


Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with opportunities for skilled professionals across industries. Companies are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies by experience and expertise:

  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:

  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:

  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between connected and unconnected lookup transformations. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between the filter and router transformations in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between the aggregator and joiner transformations in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and a mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
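Several of these questions, the slowly-changing-dimensions one in particular, come down to how attribute changes are tracked against a dimension table. As a hedged, tool-agnostic illustration of the Type 2 approach (plain Python with invented column names such as `effective_from` and `is_current`; in Informatica this would typically be built with lookup, expression, and update-strategy transformations), the core bookkeeping is: expire the current record and append a new version whenever a tracked attribute changes:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Type 2 merge sketch: expire the current row and append a new
    version when any tracked attribute changes; insert new keys as current.
    `dimension` rows carry 'effective_from', 'effective_to', and
    'is_current' bookkeeping columns (names are illustrative)."""
    today = today or date.today().isoformat()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # Brand-new key: insert as the current version.
            dimension.append({**rec, "effective_from": today,
                              "effective_to": None, "is_current": True})
        elif any(old[c] != rec[c] for c in tracked):
            # Tracked attribute changed: expire the old version,
            # then append the new one as current.
            old["effective_to"] = today
            old["is_current"] = False
            dimension.append({**rec, "effective_from": today,
                              "effective_to": None, "is_current": True})
        # Unchanged rows are left alone.
    return dimension
```

This preserves full history: querying for `is_current` rows gives the latest state, while the date columns reconstruct the table as of any point in time.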

Closing Remark

As you prepare for Informatica job opportunities in India, keep your skills sharp, stay current with trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!
