Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 8.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners
KPI Partners is a leading provider of technology consulting and solutions, specializing in delivering high-quality services that enable organizations to optimize their operations and achieve their strategic objectives. We are committed to empowering businesses through innovative solutions and a strong focus on customer satisfaction.

Job Description
We are seeking an experienced and detail-oriented ODI Developer to join our dynamic team. The ideal candidate will have a strong background in Oracle Data Integration and ETL processes, possess excellent problem-solving skills, and demonstrate the ability to work collaboratively within a team environment. As an ODI Developer at KPI Partners, you will play a crucial role in designing, implementing, and maintaining data integration solutions that support our clients' analytics and reporting needs.

Key Responsibilities
- Design, develop, and implement data integration processes using Oracle Data Integrator (ODI) to extract, transform, and load (ETL) data from various sources.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Optimize ODI processes and workflows for performance improvements and ensure data quality and accuracy (see the reconciliation sketch after this listing).
- Troubleshoot and resolve technical issues related to ODI and data integration processes.
- Maintain documentation related to data integration processes, including design specifications, integration mappings, and workflows.
- Participate in code reviews and ensure adherence to best practices in ETL development.
- Stay updated with the latest developments in ODI and related technologies to continuously improve solutions.
- Support production deployments and provide maintenance and enhancements as needed.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ODI Developer or in a similar ETL development role.
- Strong knowledge of Oracle Data Integrator and its components (repositories, models, mappings, etc.).
- Proficient in SQL and PL/SQL for querying and manipulating data.
- Experience with data warehousing concepts and best practices.
- Familiarity with other ETL tools is a plus.
- Excellent analytical and troubleshooting skills.
- Strong communication skills, both verbal and written.
- Ability to work independently and in a team-oriented environment.

Why Join KPI Partners?
- Opportunity to work with a talented and diverse team on cutting-edge projects.
- Competitive salary and comprehensive benefits package.
- Continuous learning and professional development opportunities.
- A culture that values innovative thinking and encourages collaboration.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
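By way of illustration, the data quality and accuracy checks this role describes often reduce to source-to-target reconciliation. A minimal sketch, assuming the python-oracledb driver; connection details and table names are hypothetical placeholders.

```python
# Minimal source-vs-target reconciliation sketch (assumes python-oracledb;
# all connection details and table names are hypothetical).
import oracledb

def count_rows(conn, table: str) -> int:
    # Table names are trusted constants here for brevity; never interpolate
    # untrusted input into SQL.
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]

def reconcile(src_conn, tgt_conn, src_table: str, tgt_table: str) -> bool:
    src = count_rows(src_conn, src_table)
    tgt = count_rows(tgt_conn, tgt_table)
    print(f"{src_table} -> {tgt_table}: source={src}, target={tgt}")
    return src == tgt

if __name__ == "__main__":
    # Hypothetical DSNs; a real job would read these from secure configuration.
    src = oracledb.connect(user="stg", password="***", dsn="src-host/ORCLPDB1")
    tgt = oracledb.connect(user="dwh", password="***", dsn="dwh-host/DWHPDB1")
    ok = reconcile(src, tgt, "SALES.ORDERS", "DWH.FACT_ORDERS")
    raise SystemExit(0 if ok else 1)
```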
Posted 1 day ago
7.0 years
0 Lacs
Hyderābād
On-site
Digital Solutions Consultant I - HYD015Q
Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager

We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role
As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases or file stores, or from SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads (see the sketch after this listing).
- Build quick prototypes and proofs of concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You
To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor’s degree in Computer Science or related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
- Experience in working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience in working with different data integration patterns (batch and streaming), implementing incremental data loads.
- Proficient in scripting in Java, Windows and PowerShell.
- Proficient in at least one programming language like Python or Scala. Expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks or similar on AWS/GCP.
- Experience in using ETL tools (like Informatica IICS Data Integration) is an advantage.
- Strong understanding of Data Quality principles and experience in implementing them.

Moving forward together
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here.

Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
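To make the batch incremental-load pattern above concrete, here is a minimal watermark-based sketch in PySpark with Delta Lake, in line with the Databricks/Azure stack the posting names; all table and column names are hypothetical.

```python
# Watermark-based incremental load sketch (assumes a Databricks/Delta Lake
# environment; table and column names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("incremental_load").getOrCreate()

TARGET = "dwh.orders"           # assumed Delta table
SOURCE = "staging.orders_raw"   # assumed staging table with a modified_ts column

# 1. Find the high-water mark already loaded into the target.
watermark = (spark.table(TARGET)
             .agg(F.max("modified_ts").alias("wm"))
             .collect()[0]["wm"])

# 2. Pull only source rows newer than the watermark (full load if target empty).
incoming = spark.table(SOURCE)
if watermark is not None:
    incoming = incoming.where(F.col("modified_ts") > F.lit(watermark))

# 3. Upsert the delta into the target Delta table.
(DeltaTable.forName(spark, TARGET).alias("t")
 .merge(incoming.alias("s"), "t.order_id = s.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```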
Posted 1 day ago
3.0 years
0 Lacs
Gurgaon
On-site
JOB DESCRIPTION

About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

About the role
Our Financial Crimes specialist teams provide solutions to BFSI clients by conducting model validation testing for AML risk models and frameworks, sanctions screening and transaction monitoring systems, to ensure the efficiency and efficacy of the underlying frameworks both functionally and statistically. We are looking to hire colleagues with advanced data science and analytics skills to support our financial crimes team. You will play a crucial role in helping clients tackle the multifaceted challenges of financial crime. By utilizing advanced analytics and deep technical knowledge, our team aids top clients in reducing risks associated with financial crime, terrorist financing, and sanctions violations. We also work to enhance their screening and transaction monitoring systems. Our team of specialized analysts ensures that leading financial institutions adhere to industry best practices for robust programs and controls. Through a variety of project experiences, you will develop your professional skills, assisting clients in understanding and addressing complex issues, and implementing top-tier solutions to resolve identified problems.

Minimum work experience: 3+ years of advanced analytics
Preferred experience: 1+ years in AML model validation

Responsibilities
· Support functional SME teams to build data-driven Financial Crimes solutions
· Conduct statistical testing of the screening matching algorithms, risk rating models and thresholds configured for detection rules
· Validate data models of AML systems built on platforms such as SAS Viya, Actimize, LexisNexis, Napier, etc.
· Develop, validate, and maintain AML models to detect suspicious activities and transactions.
· Conduct Above-the-Line and Below-the-Line testing (see the simplified sketch after this listing)
· Conduct thorough model validation processes, including performance monitoring, tuning, and calibration.
· Ensure compliance with regulatory requirements and internal policies related to AML model risk management.
· Collaborate with cross-functional teams to gather and analyze data for model development and validation.
· Perform data analysis and statistical modeling to identify trends and patterns in financial transactions.
· Prepare detailed documentation and reports on model validation findings and recommendations.
· Assist in feature engineering to improve Gen AI prompts applicable to automation of AML/screening-related investigations
· Use advanced machine learning (e.g. XGBoost) and GenAI approaches

Criteria:
· Bachelor’s degree from an accredited university
· 3+ years of hands-on experience in Python, with experience in Java and the FastAPI, Django, Tornado or Flask frameworks
· Working experience in relational and NoSQL databases like Oracle, MS SQL, MongoDB or Elasticsearch
· Proficiency in BI tools such as Power BI, Tableau, etc.
· Proven experience in data model development and testing
· Education background in Data Science and Statistics
· Strong proficiency in programming languages such as Python, R, and SQL.
· Expertise in machine learning algorithms, statistical analysis, and data visualization tools.
· Familiarity with regulatory guidelines and standards for AML
· Experience in AML-related model validation and testing
· Expertise in techniques and algorithms including sampling, optimization, logistic regression, cluster analysis, neural networks, decision trees, and supervised and unsupervised machine learning

Preferred experiences:
· Validation of AML compliance models, such as statistical testing of customer/transaction risk models, screening algorithm testing, etc.
· Experience with developing proposals (especially new solutions)
· Experience working with AML technology platforms, e.g. Norkom, SAS, LexisNexis, etc.
· Hands-on experience with data analytics tools such as Informatica, Kafka, etc.

Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

QUALIFICATIONS
· Bachelor’s degree from an accredited university
· Education background in Data Science and Statistics
· 3+ years of hands-on experience in data science and data analytics
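To make the Above-the-Line/Below-the-Line idea concrete, here is a simplified threshold-tuning sketch in pandas. The file and column names are hypothetical, and a real validation would use proper sampling design and statistical tests rather than raw means.

```python
# Simplified ATL/BTL tuning sketch for a single transaction-monitoring
# threshold. Input file and columns are hypothetical.
import pandas as pd

THRESHOLD = 10_000  # current rule threshold (e.g., aggregated amount)

alerts = pd.read_csv("scored_transactions.csv")
# Expected columns: amount, is_suspicious (1 = investigator-confirmed positive)

atl = alerts[alerts["amount"] >= THRESHOLD]
btl = alerts[(alerts["amount"] < THRESHOLD) &
             (alerts["amount"] >= THRESHOLD * 0.8)]  # 20% band below the line

atl_productivity = atl["is_suspicious"].mean()  # hit rate of current alerts
btl_miss_rate = btl["is_suspicious"].mean()     # risk left below the threshold

print(f"ATL alert productivity: {atl_productivity:.1%}")
print(f"BTL sampled miss rate:  {btl_miss_rate:.1%}")
# A material BTL miss rate suggests the threshold sits too high and needs
# re-calibration; a very low ATL productivity suggests it sits too low.
```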
Posted 1 day ago
0 years
5 - 7 Lacs
Pune
Remote
Entity: Finance
Job Family Group: Business Support Group

Job Description:
We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat and mobility to millions of people, every day. In India, we operate bp’s FBT, which is a coordinated part of bp. Our people want to play their part in solving the big, sophisticated challenges facing our world today and, guided by our bp values, are working to help meet the world’s need for more energy while lowering carbon emissions. In our offices at Pune, we work in customer service, finance, accounting, procurement, HR services and other enabling functions – providing solutions across all of bp. Would you like to discover how our diverse, hardworking people are owning the way in making energy cleaner and better – and how you can play your part in our outstanding team? Join our team, and develop your career in an encouraging, forward-thinking environment!

Key Accountabilities

Data Quality/Modelling/Design thinking:
- Drawing on SAP MDG/ECC experience, investigates and performs root cause analysis for assigned use cases. Able to work with Azure Data Lake (via Databricks) using SQL/Python.
- Identifies and builds the data models (conceptual and physical) needed to provide an automated mechanism to monitor ongoing DQ issues. Multiple workshops may be needed to work through the options and identify the one that is most efficient and effective.
- Works with the business (Data Owners/Data Stewards) to profile data and expose patterns indicating data quality issues (see the profiling sketch after this listing). Able to identify the impact on specific CDEs deemed relevant for each individual business.
- Identifies the financial impact of a Data Quality Issue, as well as the business benefit (quantitative/qualitative) of remediation, along with leading implementation timelines.
- Schedules regular working groups with businesses that have identified DQ issues and ensures progression of RCA/remediation or presentation in DGFs.
- Identifies business DQ rules on which critical metrics/measures are stood up, feeding the dashboarding/workflows used for BAU monitoring. Red flags are raised and investigated.
- Understanding of the Data Quality value chain – Critical Data Element concepts, Data Quality Issues, Data Quality metrics/measures – is needed, along with experience owning and completing Data Quality Issue assessments to aid improvements to operational processes and BAU initiatives.
- Highlights risks and hidden DQ issues to the Lead/Manager for further guidance or escalation. Interpersonal skills are significant in this role, as it is outward facing and the focus has to be on clearly articulating messages.

Dashboarding & Workflow:
- Builds and maintains effective analytics and escalation mechanisms which detect poor data and help business lines drive resolution
- Supports crafting, building and deployment of data quality dashboards via Power BI
- Resolves critical issue paths and constructs workflows and alerts which advise process and data owners of unresolved data quality issues
- Collaborates with IT & analytics teams to drive innovation (AI, ML, cognitive science etc.)

DQ Improvement Plans:
- Creates, embeds and drives business ownership of DQ improvement plans
- Works with business functions and projects to create data quality improvement plans
- Sets targets for data improvements; monitors and intervenes when sufficient progress is not being made
- Supports initiatives which are driving data clean-up of the existing data landscape

Project Delivery:
- Oversees and advises Data Quality Analysts and participates in delivery of data quality activities, including profiling, establishing conversion criteria and resolving technical and business DQ issues
- Owns and develops relevant data quality work products as part of the DAS data change methodology
- Ensures data quality aspects are delivered as part of Gold and Silver data-related change projects
- Supports the creation of business cases with insight into the cost of poor data

Essential Experience and Job Requirements:
- 11-15 years total experience in Oil & Gas or Financial Services/Banking within the Data Management space
- Experience of working with data models/structures and investigating how to design and fine-tune them
- Experience of Data Quality Management preferred, i.e. governance, DQI management (root cause analysis, remediation/solution identification), governance forums (papers production, quorum maintenance, minutes publication), CDE identification, data lineage (identification of authoritative data sources). Understanding of the metrics/measures needed as well
- Experience of having worked with senior partners in multiple data domains/business areas, CDO and Technology. Ability to operate in global teams within multiple time zones
- Ability to operate in a multifaceted and changing setup and identify priorities; able to operate independently without much direction

Desirable criteria
- SAP MDG/SAP ECC experience (T-codes, table structures etc.)
- Azure Data Lake/AWS/Databricks
- Crafting dashboards and workflows (Power BI, QlikView or Tableau etc.)
- Crafting analytics and insight in a DQ setting (Power BI/Power Query)
- Profiling and analysis skills (SAP DI, Informatica or Collibra)
- Persuading, influencing and communicating at senior management level
- Certification in Data Management, Data Science, Python/R desirable

Travel Requirement: No travel is expected with this role
Relocation Assistance: This role is eligible for relocation within country
Remote Type: This position is a hybrid of office/remote working

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
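As an illustration of the profiling and DQ-metric work described above, a minimal pandas sketch; the extract, CDE columns, rules and thresholds are all hypothetical.

```python
# Minimal data-profiling sketch: simple completeness/validity metrics for
# critical data elements. File, columns and rules are hypothetical.
import pandas as pd

df = pd.read_parquet("vendor_master.parquet")  # hypothetical CDE extract

def completeness(s: pd.Series) -> float:
    """Share of non-null values."""
    return 1.0 - s.isna().mean()

def validity(s: pd.Series, pattern: str) -> float:
    """Share of non-null values matching a format rule."""
    return s.dropna().astype(str).str.fullmatch(pattern).mean()

metrics = {
    "vendor_id completeness": completeness(df["vendor_id"]),
    "tax_id validity":        validity(df["tax_id"], r"\d{2}-\d{7}"),
    "country completeness":   completeness(df["country"]),
}
for rule, score in metrics.items():
    flag = "" if score >= 0.98 else "  <-- raise DQ issue"
    print(f"{rule}: {score:.2%}{flag}")
```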
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Description

Role Overview
A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and infrastructure that facilitate the collection, storage, and processing of large datasets. They collaborate with data scientists and analysts to ensure data is accessible, reliable, and optimized for analysis. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and managing databases and cloud-based systems. Data engineers play a crucial role in enabling data-driven decision-making and ensuring data quality across organizations.

What Will You Do In This Role
- Develop comprehensive high-level technical design and data mapping documents to meet specific business integration requirements.
- Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
- Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
- Define and implement robust test strategies and test plans, ensuring end-to-end accountability for middleware testing and evidence management.
- Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
- Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
- Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
- Work closely with various platforms and competencies to enrich the purpose of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
- Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
- Analyze data requirements and translate them into technical specifications for ETL processes.
- Develop and maintain ETL workflows, ensuring optimal performance and error-handling mechanisms are in place.
- Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
- Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
- Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.

What Should You Have
- Bachelor’s degree in Information Technology, Computer Science or any technology stream.
- 5+ years of working experience with enterprise data integration technologies – Informatica PowerCenter, Informatica Intelligent Data Management Cloud Services (CDI, CAI, Mass Ingest, Orchestration).
- Integration experience utilizing REST and custom API integration.
- Experience in relational database technologies and cloud data stores from AWS, GCP and Azure.
- Experience utilizing the AWS Well-Architected Framework, deployment and integration, and data engineering.
- Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, Artifactory etc.
- Proven expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency.
- Extensive experience in the design of reusable integration patterns using cloud-native technologies.
- Extensive experience with process orchestration and scheduling integration jobs in AutoSys or Airflow (see the DAG sketch after this listing).
- Experience in Agile development methodologies and release management techniques.
- Excellent analytical and problem-solving skills.
- Good understanding of data modeling and data architecture principles.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business, Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.
Requisition ID: R353285
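For the orchestration and scheduling point above, a minimal Airflow DAG sketch using the Airflow 2.x API; the DAG id, task callables and schedule are hypothetical placeholders.

```python
# Minimal ingest-then-validate DAG sketch (Airflow 2.x; names hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_from_api(**context):
    """Placeholder: pull a delta from a REST source and land it in storage."""
    pass

def validate_load(**context):
    """Placeholder: run row-count and schema checks on the landed data."""
    pass

with DAG(
    dag_id="ingest_orders_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_from_api)
    validate = PythonOperator(task_id="validate", python_callable=validate_load)
    ingest >> validate  # validation runs only after a successful ingest
```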
Posted 1 day ago
5.0 - 7.0 years
3 - 4 Lacs
Chennai
On-site
Overview: 5 to 7 years of experience, full-time WFO.
- Hands-on development experience in ETL using ODI 11g/12c
- Oracle SQL and PL/SQL programming experience
OR
- Hands-on development experience in ETL using IICS
- Proficient in data migration techniques and data integration
- Oracle SQL and PL/SQL programming experience
- Experience in Data Warehouses and/or Data Marts

Qualifications: B.E. or any qualification

Essential skills:
- Hands-on development experience in ETL using ODI 11g/12c
- Oracle SQL and PL/SQL programming experience
- Proficiency in warehousing architecture techniques
- Experience in Data Warehouses and/or Data Marts
- Good communication skills; should be self-sufficient in collaborating with project teams

Good to have:
- Experience in database modeling – Enterprise Data Warehouse
- Exposure to any other ETL tool like Informatica
- MySQL or SQL Server hands-on
Posted 1 day ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Required Skills And Experience
- 8+ years in IT operations, scheduling, and workflow automation using Control-M.
- Strong experience integrating Control-M with AWS cloud services.
- Hands-on experience working with enterprise ETL tools like Ab Initio or Informatica.
- Experience supporting data migration and orchestration involving modern cloud data platforms like Snowflake.
- Proficiency in Python scripting for automation and custom tooling around Control-M (see the sketch after this listing).
- Familiarity with real-time data streaming platforms such as Kafka or Kinesis.
- Solid understanding of job scheduling concepts, batch processing, and event-driven automation.
- Experience with CI/CD pipelines, Git, and automation of deployment workflows.
- Strong troubleshooting, root cause analysis, and incident resolution skills.

Preferred Qualifications
- Bachelor's degree in Computer Science, IT, or related field.
- Experience managing large-scale Control-M environments in enterprise settings.
- Knowledge of cloud data architecture and modern data engineering practices.
- Familiarity with Snowflake features and cloud data warehousing concepts.
- Certification in Control-M administration or related scheduling tools is a plus.
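As a sketch of the Python custom tooling mentioned above, the snippet below orders a Control-M folder through the Control-M Automation API. Treat the endpoint paths, auth header and payloads as assumptions to verify against your Control-M version's documentation; the host, credentials and names are hypothetical.

```python
# Hedged sketch of Control-M Automation API tooling. Endpoint paths follow
# the public Automation API docs but must be verified per version; host,
# credentials, server and folder names are hypothetical.
import requests

BASE = "https://ctm.example.com:8443/automation-api"  # hypothetical host

def login(user: str, password: str) -> str:
    """Open a session and return its token (assumed /session/login endpoint)."""
    r = requests.post(f"{BASE}/session/login",
                      json={"username": user, "password": password})
    r.raise_for_status()
    return r.json()["token"]

def order_folder(token: str, ctm_server: str, folder: str) -> dict:
    """Order (trigger) all jobs in a folder on demand (assumed /run/order)."""
    r = requests.post(f"{BASE}/run/order",
                      headers={"Authorization": f"Bearer {token}"},
                      json={"ctm": ctm_server, "folder": folder})
    r.raise_for_status()
    return r.json()

if __name__ == "__main__":
    tok = login("svc_scheduler", "***")  # credentials from a vault in practice
    print(order_folder(tok, "PROD-CTM", "DAILY_DWH_LOAD"))
```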
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a detail-oriented Data Test Engineer to join our data migration and cloud modernization team. The ideal candidate will have hands-on experience testing complex ETL pipelines, data migration workflows, and cloud data platforms like Snowflake, with exposure to legacy ETL tools such as Ab Initio or Informatica. Experience in automating data validation, performance testing, and supporting real-time ingestion using Kafka or similar technologies is essential.

Key Responsibilities
- Design, develop, and execute test plans for data migration projects moving data from legacy systems to Snowflake.
- Validate data pipelines developed using ETL tools like Ab Initio and Informatica, ensuring data quality, accuracy, and integrity.
- Develop automated test scripts and frameworks using Python for data validation, reconciliation, and regression testing (see the pytest sketch after this listing).
- Perform end-to-end data validation including schema validation, volume checks, transformation logic verification, and performance benchmarking.
- Test real-time data ingestion workflows integrating Kafka, Snowpipe, and Snowflake COPY commands.
- Collaborate closely with development, data engineering, and DevOps teams to identify defects, track issues, and ensure timely resolution.
- Participate in designing reusable test automation frameworks tailored for cloud data platforms.
- Ensure compliance with data governance, security, and regulatory requirements during testing.
- Document test cases and results, and provide clear reporting to stakeholders.
- Support CI/CD pipelines by integrating automated testing into the deployment workflow.

Required Skills And Experience
- 5+ years in data testing or quality assurance with strong experience in data validation and ETL testing.
- Hands-on experience testing data migrations to Snowflake or other cloud data warehouses.
- Familiarity with legacy ETL tools like Ab Initio or Informatica and their testing methodologies.
- Proficient in scripting languages such as Python for test automation and data validation.
- Knowledge of real-time data streaming platforms such as Kafka, Kinesis, or equivalents.
- Strong SQL skills for writing complex queries to validate data integrity and transformations.
- Experience with automated testing tools and frameworks for data quality checks.
- Understanding of cloud environments, particularly AWS services (S3, Lambda, Glue).
- Familiarity with CI/CD tools and practices to integrate automated testing.

Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field.
- Experience with performance and load testing of data pipelines.
- Knowledge of data governance and compliance frameworks.
- Exposure to BI tools such as Tableau or Power BI for validating data consumption layers.
- Certifications in data quality or cloud platforms (Snowflake, AWS) are a plus.
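A minimal sketch of the automated validation described above, written as pytest tests over the snowflake-connector-python package; the account, credentials and table names are hypothetical placeholders.

```python
# Pytest-style reconciliation tests against Snowflake (assumes
# snowflake-connector-python; account, credentials, tables hypothetical).
import pytest
import snowflake.connector

@pytest.fixture(scope="module")
def sf_conn():
    conn = snowflake.connector.connect(
        account="xy12345", user="qa_user", password="***",
        warehouse="QA_WH", database="ANALYTICS", schema="PUBLIC",
    )
    yield conn
    conn.close()

def scalar(conn, sql: str):
    """Run a query and return the first column of the first row."""
    cur = conn.cursor()
    try:
        return cur.execute(sql).fetchone()[0]
    finally:
        cur.close()

def test_row_counts_match(sf_conn):
    src = scalar(sf_conn, "SELECT COUNT(*) FROM STG.ORDERS_RAW")
    tgt = scalar(sf_conn, "SELECT COUNT(*) FROM DWH.FACT_ORDERS")
    assert src == tgt, f"Row count mismatch: staging={src}, target={tgt}"

def test_no_duplicate_keys(sf_conn):
    dupes = scalar(sf_conn, """
        SELECT COUNT(*) FROM (
            SELECT ORDER_ID FROM DWH.FACT_ORDERS
            GROUP BY ORDER_ID HAVING COUNT(*) > 1)""")
    assert dupes == 0, f"{dupes} duplicate ORDER_IDs in target"
```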
Posted 1 day ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description:

About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview
The Compliance team of ERF APS provides end-to-end technology solutions for applications supporting Global Financial Crime, Global Compliance, Operational Risk and Trade Surveillance. The team is engaged in over 100 applications for production support and related activities. GBAMT’s types of work and services include architecture, design, development, change management, implementation and support using a wide range of technologies. The APS India team provides extensive support for these applications by adapting ITIL processes.

Job Description
The Production Services Specialist is responsible for ensuring availability and reliability of assigned production applications within the Compliance (Sanctions and AML) lines of business (LOBs). To perform this function, the resource will perform daily tasks assigned by the Production Support Lead. These tasks are aligned to the key services outlined within the Global Support Services Service Catalog:
- Monitoring
- Incident Management
- Request Management
- Disaster Recovery / Application Recovery Certification Exercise
- Metrics Reporting
- Application Capacity Management

Responsibilities
- User and batch job issue resolution; manage highly critical applications – support BAU and ensure no business impact
- Creating support documentation and updating existing documentation
- Initiate the incident management process and lead triage when BAU is impacted
- Provide regular communication and generate reports for all stakeholders
- Investigation of root cause analysis and irreversible corrective action
- Identify risk and drive remediations – ensure an any-day audit-ready situation for the team
- Work closely with business partners and the development team
- Hands-on support of the applications
- Disaster recovery test coordination, preparation and execution
- Monitoring of daily batch processing
- Timely adherence to all deliverables
- Mentor/guide team members on issues, queries and other deliverables
- Perform capacity management for applications in scope

Requirements
Education: Degree from a reputed university
Certifications (if any): Informatica, Oracle
Experience Range: 2 - 5 years

Foundational skills:
- Unix / shell scripting
- Oracle/SQL Server
- Informatica
- Autosys
- ITRS/Splunk/Dynatrace

Desired skills:
- Knowledge of incident management (ITSM Remedy, MyITSM)
- ITIL process knowledge
- Excellent verbal and written communication
- Willing to be flexible, at times providing stand-by out-of-hours support on a rotational basis for production systems (as needed)
- Good understanding of the financial/banking industry
- Creative and strong problem-solving skills
- Ability to operate in high-pressure situations
- Results oriented; must be able to effectively interact with senior management and business partners
- Self-driven

Work Timings
Shift 1: 7:30 AM to 4:30 PM
Shift 2: 11:30 AM to 8:30 PM
Rotational weekend shifts.

Job Location: Chennai
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We’ve Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.

As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers’ digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.

How will you make an impact in this role?
- Build the NextGen data strategy, data virtualization, and data lakes/warehousing.
- Transform and improve performance of existing reporting and analytics use cases with more efficient, state-of-the-art data engineering solutions.
- Drive analytics development to realize the advanced analytics vision and strategy in a scalable, iterative manner.
- Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering.
- Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes.
- Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness.
- Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions.
- Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions.

Minimum Qualifications:
- Bachelor’s degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred.
- 5+ years of hands-on experience in implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
- Proven experience in Business Intelligence, reporting on large datasets, data virtualization tools, Big Data, GCP, Java, and microservices.
- Strong systems integration architecture skills and a high degree of technical expertise across a number of technologies, with a proven track record of turning new technologies into business solutions.
- Proficiency in at least one programming language (Python/Java), with a good understanding of data structures.
- GCP/cloud knowledge is an added advantage.
- Good knowledge and understanding of Power BI, Tableau and Looker.
- Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication.
- Experience managing in a fast-paced, complex, and dynamic global environment.

Preferred Qualifications:
- Bachelor’s degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred.
- 5+ years of hands-on experience in implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
- Proven experience in Business Intelligence and reporting on large datasets with Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, data virtualization tools, Oracle PL/SQL, Informatica, and other ETL tools like Talend and Java.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 day ago
2.5 years
0 Lacs
Mumbai, Maharashtra, India
On-site
SQL-ETL Developer

Role & responsibilities / preferred candidate profile:
- 2.5+ years of hands-on experience in SQL-ETL development, Informatica, SQL, and stored procedures.

Other Details:
- Experience: 2.5+ years
- Location: Mumbai
- Desirable: Experienced in SQL-ETL development, Informatica, SQL, stored procedures
- Work Mode: Work from Office
- Immediate joining (30 days notice period preferred)

Company Description
Infocus Technologies Pvt Ltd is a Kolkata-based consulting company that provides SAP, ERP and cloud consulting services. The company is ISO 9001:2015 DNV certified, CMMI Level 3 certified, and a Gold Partner of SAP in Eastern India. Infocus helps customers migrate and host SAP infrastructure on AWS cloud. Its services in the ERP domain include implementation, version upgrades, and Enterprise Application Integration (EAI) solutions.
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Title - QA Manual Testing Experience - 5-8 Years Location - Pune & Gurgaon (Hybrid) Key Responsibilities: Understand business requirements and data flows to create comprehensive test plans and test cases for ETL jobs. Perform data validation and reconciliation between source systems, staging, and target data stores (DWH, data lakes, etc.). Develop and execute automated and manual tests to ensure data accuracy and quality. Work with SQL queries to validate data transformations and detect anomalies (see the sketch below). Identify, document, and track defects and inconsistencies in data processing. Collaborate with data engineering and BI teams to improve ETL processes and data pipelines. Maintain QA documentation and contribute to continuous process improvements. Must Have Skills: Strong SQL skills – ability to write complex queries for data validation and transformation testing. Hands-on experience in ETL testing – validating data pipelines, transformations, and data loads. Knowledge of data warehousing concepts – dimensions, facts, slowly changing dimensions (SCD), etc. Experience in test case design, execution, and defect tracking. Experience with QA tools like JIRA, TestRail, or equivalent. Ability to work independently and collaboratively in an Agile/Scrum environment. Good to Have Skills: Experience with ETL tools like Informatica, Talend, DataStage, or Azure/AWS/GCP native ETL services (e.g., Dataflow, Glue). Knowledge of automation frameworks using Python/Selenium/pytest or similar tools for data testing. Familiarity with cloud data platforms – Snowflake, BigQuery, Redshift, etc. Basic understanding of CI/CD pipelines and QA integration. Exposure to data quality tools such as Great Expectations, Deequ, or DQ frameworks. Understanding of reporting/BI tools such as Power BI, Tableau, or Looker. Educational Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or related field.
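The source-to-target reconciliation work this posting describes usually reduces to comparing counts and aggregates between systems. A minimal Python sketch, where all table and column names are assumptions and sqlite3 stands in for whatever DB-API driver the real source and warehouse require:

```python
# Source-to-target reconciliation sketch: compare row counts and a numeric
# aggregate between a source table and its warehouse target.
# Table/column names are illustrative; sqlite3 stands in for any
# DB-API 2.0 driver (pyodbc, psycopg2, snowflake-connector-python, ...).
import sqlite3

def fetch_metrics(conn, table: str, amount_col: str) -> tuple[int, float]:
    """Return (row_count, summed_amount) for one side of the comparison."""
    count, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
    ).fetchone()
    return count, float(total)

def reconcile(src_conn, tgt_conn, src_table: str, tgt_table: str, col: str) -> list[str]:
    """Compare source vs. target and return a list of human-readable issues."""
    issues = []
    src_count, src_sum = fetch_metrics(src_conn, src_table, col)
    tgt_count, tgt_sum = fetch_metrics(tgt_conn, tgt_table, col)
    if src_count != tgt_count:
        issues.append(f"row count mismatch: source={src_count}, target={tgt_count}")
    if abs(src_sum - tgt_sum) > 0.01:  # small tolerance for rounding in transforms
        issues.append(f"amount mismatch: source={src_sum}, target={tgt_sum}")
    return issues
```

In practice, each transformation rule gets its own probe query layered on top of baseline checks like these.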
Posted 1 day ago
9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Lead Software Quality Automation Engineer Experience: 9 to 12 Years Notice period: Candidates with an official notice period of maximum 1 month All key skills must be clearly mentioned in the project details section of the resume. Validate relocation cases thoroughly. Work Mode: Hybrid (2-3 days work from office per week) Ready to work in flexible working hours and collaborate with US/India/Colombia teams Excellent communication skills (written, verbal, listening, and articulation) Candidate should have team-leading experience (minimum 2 reportees). Responsibilities Perform a lead role in ETL testing, UI testing, DB testing, and team management. Understand the holistic requirements, review and analyse stories, specifications, and technical design documents and develop detailed test cases and test data to ensure business functionality is thoroughly tested – both Automation & Manual. Validate ETL workflows, ensuring data integrity, accuracy, and the transformation rules using complex Snowflake SQL queries. Working knowledge of DBT is a plus. Create, execute, and maintain automation scripts in BDD – Gherkin/Behave, Pytest (see the pytest sketch below). Experience in writing DB queries (preferably in Postgres/Snowflake/MySQL/RDS). Preparation, review, and update of test cases and relevant test data consistent with system requirements, including functional, integration & regression, and UAT testing. Coordinate with cross-team subject matter experts to develop, maintain, and validate test scenarios in the best interest of that POD. Take ownership of creating and maintaining artifacts: test strategy, BRD, defect count/leakage reports, and different quality issues. Collaborate with the DevOps/SRE team to integrate test automation into CI/CD pipelines (Jenkins, Rundeck, GitHub, etc.). Should have the ability to oversee and guide a team of minimum 4 testers, lead them by example, institutionalizing best practices in testing processes & automation in agile methodology. Meet with internal stakeholders to review current testing approaches, provide feedback on ways to improve/extend/automate along with data-backed inputs, and provide senior leadership with consolidated metrics. Maximize the opportunity to excel in an open and recognized work culture. Be a problem solver and a team player. Requirements 8-11 years of strong expertise in STLC, defect management, and test strategy designing, planning, and approach. Should have experience with test requirement understanding, test data, and test plan & test case designing. Should have a minimum of 6+ years of strong work experience in UI, database, and ETL testing. Experience in ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL). Any experience with AWS/cloud-hosted applications is an added advantage. Hands-on experience in writing DB queries (preferably in Postgres/Snowflake/MySQL/RDS). Should have 3+ years of experience with automation script execution, maintenance & enhancements with Selenium WebDriver (v3+)/Playwright, with programming experience in Python (MUST) with BDD – Gherkin and Behave, Pytest. Key competencies required: strong analytical, problem-solving, and communication skills, collaboration, accountability, stakeholder management, passion to drive initiatives, risk highlighting, and team-leading capabilities. Proven team leadership experience with minimum 2 people reporting. Experienced working with Agile methodologies, such as Scrum and Kanban. MS Power BI reporting. Front-end vs back-end validation – good to have.
Advantage if the candidate has Healthcare/Life Sciences domain experience, a working knowledge of manual and automation testing, and ETL testing. Professional approach: ready to work in flexible working hours and collaborate with US/India/Colombia teams. Skills: automation testing, ETL testing, MS Power BI, DB testing, Behave, Postgres, RDS, DBT, Snowflake SQL, Rundeck, Gherkin, MySQL, software quality automation, ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL), GitHub, Selenium WebDriver, Python, UI testing, Jenkins, Pytest
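A minimal sketch of the pytest-style data checks referenced above, assuming hypothetical table and column names and an in-memory SQLite database standing in for Snowflake or Postgres:

```python
# Pytest-based data checks: each test runs a validation query that should
# return no violations. sqlite3 stands in for the real warehouse driver;
# dim_customer and its columns are assumptions for illustration.
import sqlite3
import pytest

@pytest.fixture
def conn():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE dim_customer (customer_id INTEGER, email TEXT)")
    db.execute("INSERT INTO dim_customer VALUES (1, 'a@x.com'), (2, 'b@x.com')")
    yield db
    db.close()

def test_no_duplicate_business_keys(conn):
    dupes = conn.execute(
        "SELECT customer_id, COUNT(*) FROM dim_customer "
        "GROUP BY customer_id HAVING COUNT(*) > 1"
    ).fetchall()
    assert dupes == [], f"duplicate business keys: {dupes}"

def test_mandatory_columns_populated(conn):
    (nulls,) = conn.execute(
        "SELECT COUNT(*) FROM dim_customer WHERE email IS NULL"
    ).fetchone()
    assert nulls == 0, f"{nulls} rows have NULL email"
```

Wired into Jenkins or a GitHub pipeline, the same tests give the CI/CD integration the posting asks for.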
Posted 1 day ago
10.0 years
0 Lacs
India
On-site
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. About the Role The candidate will be responsible for leading data modeling initiatives and ensuring compliance with healthcare regulations while collaborating with various stakeholders to translate business requirements into technical solutions. Responsibilities: Data Architecture & Modeling Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management. Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment). Create and maintain data lineage documentation and data dictionaries for healthcare datasets. Establish data modeling standards and best practices across the organization. Technical Leadership Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica. Architect scalable data solutions that handle large volumes of healthcare transactional data. Collaborate with data engineers to optimize data pipelines and ensure data quality. Healthcare Domain Expertise Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI). Design data models that support analytical, reporting and AI/ML needs. Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations. Partner with business stakeholders to translate healthcare business requirements into technical data solutions. Data Governance & Quality Implement data governance frameworks specific to healthcare data privacy and security requirements. Establish data quality monitoring and validation processes for critical health plan metrics. Lead efforts to standardize healthcare data definitions across multiple systems and data sources. Required Qualifications: Technical Skills 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data. Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches. Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing. Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks). Proficiency with data modeling tools (Hackolade, ERwin, or similar). Healthcare Industry Knowledge Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data. Experience with healthcare data standards and medical coding systems. Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment). Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI). Leadership & Communication Proven track record of leading data modeling projects in complex healthcare environments. Strong analytical and problem-solving skills with ability to work with ambiguous requirements. Excellent communication skills with ability to explain technical concepts to business stakeholders. Experience mentoring team members and establishing technical standards. 
Preferred Qualifications Experience with Medicare Advantage, Medicaid, or Commercial health plan operations. Cloud platform certifications (AWS, Azure, or GCP). Experience with real-time data streaming and modern data lake architectures. Knowledge of machine learning applications in healthcare analytics. Previous experience in a lead or architect role within healthcare organizations.
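As a worked illustration of the dimensional modeling this role centers on, here is a minimal health-plan star-schema fragment; every table and column name is an assumption for illustration, not a reference model:

```python
# A claims fact table keyed to member and provider dimensions -- the basic
# star-schema shape behind claims analytics. All names are illustrative.
import sqlite3

DDL = """
CREATE TABLE dim_member (
    member_key   INTEGER PRIMARY KEY,
    member_id    TEXT,      -- source-system identifier
    plan_type    TEXT,      -- e.g. HMO, PPO
    effective_dt DATE
);
CREATE TABLE dim_provider (
    provider_key INTEGER PRIMARY KEY,
    npi          TEXT,      -- National Provider Identifier
    specialty    TEXT
);
CREATE TABLE fact_claim (
    claim_key    INTEGER PRIMARY KEY,
    member_key   INTEGER REFERENCES dim_member (member_key),
    provider_key INTEGER REFERENCES dim_provider (provider_key),
    service_dt   DATE,
    icd10_code   TEXT,      -- primary diagnosis code
    paid_amount  NUMERIC(12, 2)
);
"""

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)  # quick smoke test that the DDL parses
    print("star-schema fragment created")
```

Measures live on the fact table and descriptive attributes on the dimensions, which is what keeps HEDIS- or Stars-style aggregations straightforward.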
Posted 1 day ago
7.0 years
0 Lacs
India
Remote
About Lemongrass Lemongrass is a software-enabled services provider, synonymous with SAP on Cloud, focused on delivering superior, highly automated Managed Services to Enterprise customers. Our customers span multiple verticals and geographies across the Americas, EMEA and APAC. We partner with AWS, SAP, Microsoft and other global technology leaders. We are seeking an experienced Cloud Data Engineer with a strong background in AWS, Azure, and GCP. The ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, and other ETL tools like Informatica, SAP Data Intelligence, etc. You will be responsible for designing, implementing, and maintaining robust data pipelines and building scalable data lakes. Experience with various data platforms like Redshift, Snowflake, Databricks, Synapse, and others is essential. Familiarity with data extraction from SAP or ERP systems is a plus. Key Responsibilities: Design and Development: Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.). Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP). Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery, and Azure Synapse (a minimal incremental-load sketch follows this posting). Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others. Develop and optimize data processing jobs using Spark and Scala. Data Integration and Management: Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems into the data lake. Ensure data quality and integrity through rigorous testing and validation. Perform data extraction from SAP or ERP systems when necessary. Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes. Implement best practices for data management, including data governance, security, and compliance. Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Collaborate with cross-functional teams to design and implement data solutions that meet business needs. Documentation and Maintenance: Document technical solutions, processes, and workflows. Maintain and troubleshoot existing ETL pipelines and data integrations. Qualifications Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus. Experience: 7+ years of experience as a Data Engineer or in a similar role. Proven experience with cloud platforms: AWS, Azure, and GCP. Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc. Experience with other ETL tools like Informatica, SAP Data Intelligence, etc. Experience in building and managing data lakes and data warehouses. Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse. Experience with data extraction from SAP or ERP systems is a plus. Strong experience with Spark and Scala for data processing. Skills: Strong programming skills in Python, Java, or Scala. Proficient in SQL and query optimization techniques. Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts. Knowledge of data governance, security, and compliance best practices.
Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Preferred Qualifications: Experience with other data tools and technologies such as Apache Spark, or Hadoop. Certifications in cloud platforms (AWS Certified Data Analytics – Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate). Experience with CI/CD pipelines and DevOps practices for data engineering. Selected applicant will be subject to a background investigation, which will be conducted and the results of which will be used in compliance with applicable law. What we offer in return: Remote Working: Lemongrass has always offered and always will offer 100% remote work Flexibility: Work where and when you like most of the time Training: A subscription to A Cloud Guru and a generous budget for taking certifications and other resources you'll find helpful State of the art tech: An opportunity to learn and run the latest industry standard tools Team: Colleagues who will challenge you, giving you the chance to learn from them and them from you Lemongrass Consulting is proud to be an Equal Opportunity and Affirmative Action employer. We do not discriminate on the basis of race, religion, color, national origin, religious creed, gender, sexual orientation, gender identity, gender expression, age, genetic information, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
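One recurring concern in the pipeline work this posting describes is loading only changed rows, idempotently. A minimal watermark-based sketch, assuming ISO-8601 timestamps, an orders source table, and a stg_orders staging table keyed on order_id (all names are illustrative):

```python
# Watermark-based incremental load: read the last loaded timestamp from the
# target, pull only newer rows from the source, and upsert idempotently so a
# rerun after a failure cannot duplicate rows. Schema names are assumptions;
# stg_orders is assumed to have order_id as its primary key.
import sqlite3

def incremental_load(src: sqlite3.Connection, tgt: sqlite3.Connection) -> int:
    # High-water mark from the previous run (ISO-8601 strings compare correctly).
    (last_ts,) = tgt.execute(
        "SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM stg_orders"
    ).fetchone()

    # Only rows changed since the last successful load.
    rows = src.execute(
        "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
        (last_ts,),
    ).fetchall()

    # Upsert keyed on order_id keeps the load idempotent.
    tgt.executemany(
        "INSERT INTO stg_orders (order_id, amount, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET amount = excluded.amount, "
        "updated_at = excluded.updated_at",
        rows,
    )
    tgt.commit()
    return len(rows)
```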
Posted 1 day ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Testing Engineer Exp: 8+ years Location: Hyderabad and Gurgaon (Hybrid) Notice Period: Immediate to 15 days Job Description: Develop, maintain, and execute test cases to validate the accuracy, completeness, and consistency of data across different layers of the data warehouse. - Test ETL processes to ensure that data is correctly extracted, transformed, and loaded from source to target systems while adhering to business rules. - Perform source-to-target data validation to ensure data integrity and identify any discrepancies or data quality issues. - Develop automated data validation scripts using SQL, Python, or testing frameworks to streamline and scale testing efforts (see the sketch below). - Conduct testing in cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake), ensuring performance and scalability. - Familiarity with ETL testing tools and frameworks (e.g., Informatica, Talend, dbt). - Experience with scripting languages to automate data testing. - Familiarity with data visualization tools like Tableau, Power BI, or Looker.
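A minimal sketch of the automated validation scripts referenced above: a rule table of SQL probes, each expected to return zero violating rows. The rule names, queries, and tables are hypothetical:

```python
# Config-driven data-quality runner: every rule is a (name, probe-SQL) pair
# whose result set should be empty. Table and rule names are illustrative;
# conn can be any DB-API connection (Redshift, BigQuery, Snowflake drivers).
import sqlite3

RULES = [
    ("orphan order customers",
     "SELECT o.order_id FROM orders o LEFT JOIN customers c "
     "ON o.customer_id = c.customer_id WHERE c.customer_id IS NULL"),
    ("negative amounts",
     "SELECT order_id FROM orders WHERE amount < 0"),
    ("future order dates",
     "SELECT order_id FROM orders WHERE order_dt > DATE('now')"),
]

def run_checks(conn) -> dict[str, int]:
    """Run every rule; return violating-row counts keyed by rule name."""
    return {name: len(conn.execute(sql).fetchall()) for name, sql in RULES}

def main(conn) -> int:
    """Exit non-zero on any violation so a CI pipeline fails the build."""
    results = run_checks(conn)
    for name, count in results.items():
        print(f"{name}: {'OK' if count == 0 else f'{count} violations'}")
    return 1 if any(results.values()) else 0
```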
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet. Job Title: Data Fullstack - Descriptive Analytics Location: Chennai Work Type: Onsite Position Description: The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google Big Query, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core line of businesses for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily Operational tasks until the products are well understood and will then progress to assisting with Engineering tasks. Skills Required: GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift Experience Required: Position Qualifications: Bachelor's Degree in a relevant field At least 5 years of experience with Descriptive Analytics technologies Dev/Ops experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell and managing large GCP installations (OR) System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engines or OpenShift Virtualization VMs Strong troubleshooting and problem-solving skills Understanding of Product Life Cycle Ability to coordinate issue resolution with vendors on behalf of the client Strong written and verbal communication skills Understanding of technologies like Power BI, Big Query, Teradata, SQL Server, Oracle DB2, etc. Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.) Experience Preferred: Recommended: Experience with PowerApps and Power Automate Familiarity with Jira Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC) Education Required: Bachelor's Degree TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 1 day ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Summary: We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making. Experience: 7 - 12 years Work Location: Hyderabad (Hybrid) / Remote Mandatory skills: AWS, Python, SQL, Airflow, DBT (a minimal Airflow sketch follows this posting). Must have delivered one or two projects in the clinical domain. Responsibilities: Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads. Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage. Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks. Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases. Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions. Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering. Required Skills: Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience: 7 - 12+ years of experience in data engineering. Cloud Platforms: Strong expertise in AWS data services. Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake. Programming: Proficiency in Python, Scala, or Java for data processing and automation. ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica. Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
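To ground the mandatory Airflow skill, here is a minimal Airflow 2.x DAG sketch; the dag_id, schedule, and task bodies are placeholders, and a real pipeline would swap the print calls for extract/load code and a dbt invocation:

```python
# Minimal daily extract -> transform -> load DAG (Airflow 2.x style).
# Everything here is a placeholder sketch, not a production pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from source systems into S3/staging")

def transform():
    print("run dbt models (or in-pipeline transforms) on the staged data")

def load():
    print("publish curated tables for analytics consumers")

with DAG(
    dag_id="clinical_elt_sketch",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t
```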
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Contract: 6 months Location: Remote in India • Mandatory 5+ years of experience in Informatica MDM C360 Cloud solutions. • On-prem experience alone is not sufficient; Cloud experience is a must. • Experienced and highly skilled SME, responsible for providing knowledge and guidance in the implementation and optimization of Informatica MDM C360 Cloud solutions. • Collaborate with business stakeholders and technical teams to understand their master data management challenges and requirements, help design MDM solutions using Informatica C360 Cloud, and assist in the implementation, configuration, and maintenance of these solutions. • Responsible for developing the master data management system, including data integration, data modeling and data migration, ensuring data quality, data integration, and consistency across enterprise systems. • Play a key role in handling critical production data issues and collaborating with cross-functional teams to deliver high-quality solutions around Customer MDM. • Provide architecture and design, use case development, and solution implementation advice, responding promptly to internal customer questions with technical explanations of product features and capabilities when needed, and being able to prepare and deliver unique solution presentations or technical proposals. QUALIFICATIONS AND SKILLS • Bachelor's or master's degree in Computer Science, Information Systems, or a related field. • 5+ years of experience in Data Management. • 3+ years of hands-on experience in Informatica SaaS solutions, preferably in Informatica Intelligent Data Management Cloud (IDMC) Customer 360. • Experience implementing full-lifecycle MDM projects. • Hands-on experience as an MDM expert/specialist or a similar role, specifically with Informatica MDM Customer 360 and/or Multidomain MDM and/or Reference 360, including handling critical production data issues, hotfixes, and patches. • Strong understanding of master data management concepts, data governance principles, and data integration strategies. • Experience in designing and implementing MDM solutions, data models, and data hierarchies. • Proficiency in data profiling, data cleansing, and data matching techniques. • Excellent analytical and problem-solving skills with the ability to translate business requirements into technical solutions. • Strong communication and interpersonal skills to effectively collaborate with clients and cross-functional teams. • Hands-on experience in Microsoft Power BI and Snowflake is desirable. • Strong ability and passion to document things and present to different audiences. • Strong understanding of MDM best practices and industry standards in the customer domain. • Experience in integrating external business applications with the MDM hub. • Strong understanding of MDM architectures and business processes. • Solid understanding of Data Integration, Data Quality, Data Architecture, and Master Data Management. • Familiarity with other related Informatica services is a plus. • Relevant certifications in Informatica MDM, such as Informatica MDM Developer or Administrator, are a plus.
Posted 1 day ago
7.0 years
0 Lacs
India
On-site
Job Title: Informatica Architect Job Type: Full-time, Contractor Location: Hybrid- Bengaluru | Pune About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Job Summary Join our customer's team as an Informatica Architect and play a critical role in shaping data governance, data catalog, and data quality initiatives for enterprise-level products. As a key leader, you will collaborate closely with Data & Analytics leads, ensuring the integrity, accessibility, and quality of business-critical data assets across multiple domains. Key Responsibilities Lead data governance, data catalog, and data quality efforts utilizing Informatica and other industry-leading tools. Design, develop, and manage data catalogs and enterprise data assets to support analytics and reporting across the organization. Configure and optimize Informatica CDQ and Data Quality modules, ensuring adherence to enterprise data standards and policies. Implement and maintain business glossaries, data domains, data lineage, and data stewardship resources for enterprise-wide use. Collaborate with cross-functional teams to define critical data elements, data governance rules, and quality policies for multiple data sources. Develop dashboards and visualizations to support data quality monitoring, compliance, and stewardship activities. Continuously review, assess, and enhance data definitions, catalog resources, and governance practices to stay ahead of evolving business needs. Required Skills and Qualifications Minimum 7-8 years of enterprise data integration, management, and governance experience with proven expertise in EDW technologies. At least 5 years of hands-on experience with Informatica CDQ and Data Quality solutions, having executed 2+ large-scale Data Governance and Quality projects from inception to production. Demonstrated proficiency configuring business glossaries, policies, dashboards, and search functions within Informatica or similar platforms. In-depth expertise in data quality, data cataloguing, and data governance frameworks and best practices. Strong background in Master Data Management (MDM), ensuring oversight and control of complex product catalogs. Exceptional written and verbal communication skills, able to effectively engage technical and business stakeholders. Experience collaborating with diverse teams to deliver robust data governance and analytics solutions. Preferred Qualifications Administration and management experience with industry data catalog tools such as Collibra, Alation, or Atlan. Strong working knowledge of configuring user groups, permissions, data profiling, and lineage within catalog platforms. Hands-on experience implementing open-source data catalog tools in enterprise environments.
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet. Job Title: Systems Engineering Practitioner Location: Chennai Duration: 12 Months Work Type: Onsite Position Description The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google Big Query, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core line of businesses for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily Operational tasks until the products are well understood and will then progress to assisting with Engineering tasks. Skills Required GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift Experience Required Position Qualifications: Bachelor's Degree in a relevant field At least 5 years of experience with Descriptive Analytics technologies Dev/Ops experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell and managing large GCP installations (OR) System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engines or OpenShift Virtualization VMs Strong troubleshooting and problem-solving skills Understanding of Product Life Cycle Ability to coordinate issue resolution with vendors on behalf of the client Strong written and verbal communication skills Understanding of technologies like Power BI, Big Query, Teradata, SQL Server, Oracle DB2, etc. Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.) Experience Preferred Recommended: Experience with PowerApps and Power Automate Familiarity with Jira Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC) Education Required Bachelor's Degree TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings! One of our esteemed clients is a Japanese multinational information technology (IT) service and consulting company headquartered in Tokyo, Japan. The company acquired Italy-based Value Team S.p.A. and launched Global One Teams. Join this dynamic, high-impact firm where innovation meets opportunity — and take your career to new heights! 🔍 We Are Hiring: Informatica Administrator (5-10 years) Note - We need a pure Informatica Admin; developer profiles will not be considered. Shift Timings: 9am to 6pm Relevant Experience – 5+ years Work Location and Address: Hi-tech City Layout, Madhapur, Hyderabad - 500 081 Interview process - 2 rounds (1 in-person round is a MUST) Mandatory skills - Informatica Administration in MDM-E360/PIM-P360, Oracle DB, Unix; Kafka configuration is an add-on. JD - To install, configure, manage, and support Informatica MDM and PIM platforms, ensuring high availability, performance, and data integrity for enterprise-level master and product data domains. Installation & Configuration: Install and configure Informatica MDM (Hub, IDD, E360) and PIM (Informatica Product 360). Set up application tiers including database, application server (WebLogic/JBoss/Tomcat), and web server. Configure integration points with source/target. Experience in upgrading PC, IDQ, MDM/E360, and PIM/P360 to higher versions. Experience in migrating PC, IDQ, MDM/E360, and PIM/P360 objects, and good at troubleshooting performance bottlenecks. Interested candidates, please share your updated resume along with the following details: Total Experience: Relevant Experience in Informatica Admin: Current Location: Current CTC: Expected CTC: Notice Period: 🔒 We assure you that your profile will be handled with strict confidentiality. 📩 Apply now and be part of this incredible journey. Thanks, Syed Mohammad!! syed.m@anlage.co.in
Posted 1 day ago
The informatica job market in India is thriving with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
A typical career progression in the informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect
In addition to informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
As you prepare for informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the informatica field and secure rewarding career opportunities. Good luck!