Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
5 to 7 years of experience, full time, work from office (WFO)

Hands-on development experience in ETL using ODI 11G/12C with Oracle SQL and PL/SQL programming experience, OR hands-on development experience in ETL using IICS. Proficient in data migration techniques and data integration. Experience in Data Warehouses and/or Data Marts.

Qualifications
B.E. or any qualification

Essential Skills
Hands-on development experience in ETL using ODI 11G/12C
Oracle SQL and PL/SQL programming experience
Proficiency in warehousing architecture techniques
Experience in Data Warehouses and/or Data Marts
Good communication skills; should be self-sufficient in collaborating with project teams

Good to Have
Experience in database modeling – Enterprise Data Warehouse
Exposure to any other ETL tool such as Informatica
Hands-on experience with MySQL or SQL Server
Posted 2 days ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Required Skills And Experience
8+ years in IT operations, scheduling, and workflow automation using Control-M.
Strong experience integrating Control-M with AWS cloud services.
Hands-on experience working with enterprise ETL tools like Ab Initio or Informatica.
Experience supporting data migration and orchestration involving modern cloud data platforms like Snowflake.
Proficiency in Python scripting for automation and custom tooling around Control-M.
Familiarity with real-time data streaming platforms such as Kafka or Kinesis.
Solid understanding of job scheduling concepts, batch processing, and event-driven automation.
Experience with CI/CD pipelines, Git, and automation of deployment workflows.
Strong troubleshooting, root cause analysis, and incident resolution skills.

Preferred Qualifications
Bachelor's degree in Computer Science, IT, or a related field.
Experience managing large-scale Control-M environments in enterprise settings.
Knowledge of cloud data architecture and modern data engineering practices.
Familiarity with Snowflake features and cloud data warehousing concepts.
Certification in Control-M Administration or related scheduling tools is a plus.
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a detail-oriented Data Test Engineer to join our data migration and cloud modernization team. The ideal candidate will have hands-on experience testing complex ETL pipelines, data migration workflows, and cloud data platforms like Snowflake, with exposure to legacy ETL tools such as Ab Initio or Informatica. Experience in automating data validation, performance testing, and supporting real-time ingestion using Kafka or similar technologies is essential.

Key Responsibilities
Design, develop, and execute test plans for data migration projects moving data from legacy systems to Snowflake.
Validate data pipelines developed using ETL tools like Ab Initio and Informatica, ensuring data quality, accuracy, and integrity.
Develop automated test scripts and frameworks using Python for data validation, reconciliation, and regression testing (a minimal reconciliation sketch follows this posting).
Perform end-to-end data validation including schema validation, volume checks, transformation logic verification, and performance benchmarking.
Test real-time data ingestion workflows integrating Kafka, Snowpipe, and Snowflake COPY commands.
Collaborate closely with development, data engineering, and DevOps teams to identify defects, track issues, and ensure timely resolution.
Participate in designing reusable test automation frameworks tailored for cloud data platforms.
Ensure compliance with data governance, security, and regulatory requirements during testing.
Document test cases and results, and provide clear reporting to stakeholders.
Support CI/CD pipelines by integrating automated testing into the deployment workflow.

Required Skills And Experience
5+ years in data testing or quality assurance with strong experience in data validation and ETL testing.
Hands-on experience testing data migrations to Snowflake or other cloud data warehouses.
Familiarity with legacy ETL tools like Ab Initio or Informatica and their testing methodologies.
Proficiency in scripting languages such as Python for test automation and data validation.
Knowledge of real-time data streaming platforms such as Kafka, Kinesis, or equivalents.
Strong SQL skills for writing complex queries to validate data integrity and transformations.
Experience with automated testing tools and frameworks for data quality checks.
Understanding of cloud environments, particularly AWS services (S3, Lambda, Glue).
Familiarity with CI/CD tools and practices to integrate automated testing.

Preferred Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience with performance and load testing of data pipelines.
Knowledge of data governance and compliance frameworks.
Exposure to BI tools such as Tableau and Power BI for validating data consumption layers.
Certifications in data quality or cloud platforms (Snowflake, AWS) are a plus.
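A minimal sketch of the kind of source-to-target reconciliation test described above, assuming the snowflake-connector-python package and pytest; account, table, and credential values are illustrative placeholders, and the legacy-source count is left abstract because the source system is not specified in the posting.

```python
# Source-to-target row-count reconciliation sketch (volume check).
import pytest
import snowflake.connector

TABLES = ["CUSTOMER", "CLAIMS"]  # hypothetical tables under migration

def snowflake_count(table: str) -> int:
    # Credentials would normally come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="my_account", user="qa_user", password="***",
        warehouse="QA_WH", database="MIGRATION_DB", schema="TARGET",
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()

def legacy_count(table: str) -> int:
    # Placeholder for the legacy source (e.g. Oracle via python-oracledb);
    # kept abstract here on purpose.
    raise NotImplementedError

@pytest.mark.parametrize("table", TABLES)
def test_row_counts_match(table):
    # Volume check: the migrated table must contain the same number of rows.
    assert snowflake_count(table) == legacy_count(table)
```

In a real suite the same pattern extends to column-level checksums and transformation-rule checks, with connections provided by fixtures rather than opened per test.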
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You Lead the Way. We've Got Your Back.

With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.

As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.

How will you make an impact in this role?
Build NextGen data strategy, data virtualization, data lakes and warehousing.
Transform and improve performance of existing reporting and analytics use cases with more efficient, state-of-the-art data engineering solutions.
Drive analytics development to realize the advanced analytics vision and strategy in a scalable, iterative manner.
Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering.
Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes.
Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness.
Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions.
Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions.

Minimum Qualifications:
Bachelor's degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred.
5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
Proven experience in Business Intelligence, reporting on large datasets, data virtualization tools, Big Data, GCP, Java, and microservices.
Strong systems integration architecture skills and a high degree of technical expertise across a number of technologies, with a proven track record of turning new technologies into business solutions.
Proficiency in at least one programming language (Python or Java).
Good understanding of data structures.
GCP/cloud knowledge is an added advantage.
Good knowledge and understanding of Power BI, Tableau, and Looker.
Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-team communication.
Experience managing in a fast-paced, complex, and dynamic global environment.

Preferred Qualifications:
Bachelor's degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred.
5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
Proven experience in Business Intelligence, reporting on large datasets, Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, data virtualization tools, Oracle PL/SQL, Informatica, other ETL tools like Talend, and Java.
Proficiency in at least one programming language (Python or Java).
Strong grasp of data structures and reasoning.
GCP or other cloud knowledge is an added advantage.
Good knowledge and understanding of Power BI, Tableau, and Looker.
Strong systems integration architecture skills and a high degree of technical expertise across several technologies, with a proven track record of turning new technologies into business solutions.
Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-team communication.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 2 days ago
2.5 years
0 Lacs
Mumbai, Maharashtra, India
On-site
SQL-ETL Developer

Role & Responsibilities / Preferred Candidate Profile:
2.5+ years of hands-on experience in SQL-ETL development, Informatica, SQL, and stored procedures.

Other Details:
Experience: 2.5+ years
Location: Mumbai
Desirable: Experienced in SQL-ETL development, Informatica, SQL, and stored procedures.
Work Mode: Work from Office
Immediate joining (30 days' notice period preferred)

Company Description
Infocus Technologies Pvt Ltd is a Kolkata-based consulting company that provides SAP, ERP & Cloud consulting services. The company is ISO 9001:2015 DNV certified, CMMI Level 3 certified, and a Gold partner of SAP in Eastern India. Infocus helps customers migrate and host SAP infrastructure on AWS cloud. Its services in the ERP domain include implementation, version upgrades, and Enterprise Application Integration (EAI) solutions.
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Title: QA Manual Testing
Experience: 5-8 Years
Location: Pune & Gurgaon (Hybrid)

Key Responsibilities:
Understand business requirements and data flows to create comprehensive test plans and test cases for ETL jobs.
Perform data validation and reconciliation between source systems, staging, and target data stores (DWH, data lakes, etc.).
Develop and execute automated and manual tests to ensure data accuracy and quality.
Work with SQL queries to validate data transformations and detect anomalies.
Identify, document, and track defects and inconsistencies in data processing.
Collaborate with data engineering and BI teams to improve ETL processes and data pipelines.
Maintain QA documentation and contribute to continuous process improvements.

Must Have Skills:
Strong SQL skills – ability to write complex queries for data validation and transformation testing.
Hands-on experience in ETL testing – validating data pipelines, transformations, and data loads.
Knowledge of data warehousing concepts – dimensions, facts, slowly changing dimensions (SCD), etc. (see the SCD check sketch after this posting).
Experience in test case design, execution, and defect tracking.
Experience with QA tools like JIRA, TestRail, or equivalent.
Ability to work independently and collaboratively in an Agile/Scrum environment.

Good to Have Skills:
Experience with ETL tools like Informatica, Talend, DataStage, or Azure/AWS/GCP native ETL services (e.g., Dataflow, Glue).
Knowledge of automation frameworks using Python/Selenium/pytest or similar tools for data testing.
Familiarity with cloud data platforms – Snowflake, BigQuery, Redshift, etc.
Basic understanding of CI/CD pipelines and QA integration.
Exposure to data quality tools such as Great Expectations, Deequ, or DQ frameworks.
Understanding of reporting/BI tools such as Power BI, Tableau, or Looker.

Educational Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
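A small, self-contained sketch (using sqlite3 from the standard library) of one common warehouse check implied above: every business key in an SCD Type 2 dimension should have exactly one current, open-ended row. The table and column names are illustrative, not taken from the posting.

```python
# SCD Type 2 integrity check: flag keys with more than one "current" row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id TEXT, name TEXT,
    effective_from TEXT, effective_to TEXT  -- NULL means 'current'
);
INSERT INTO dim_customer VALUES
    ('C1', 'Asha',   '2023-01-01', '2024-06-30'),
    ('C1', 'Asha K', '2024-07-01', NULL),
    ('C2', 'Ravi',   '2023-03-15', NULL),
    ('C2', 'Ravi',   '2024-01-01', NULL);      -- defect: two current rows
""")

violations = conn.execute("""
    SELECT customer_id, COUNT(*) AS current_rows
    FROM dim_customer
    WHERE effective_to IS NULL
    GROUP BY customer_id
    HAVING COUNT(*) <> 1
""").fetchall()

# Any rows returned are defects to document and track (e.g. in JIRA).
for customer_id, current_rows in violations:
    print(f"SCD2 violation: {customer_id} has {current_rows} current rows")
```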
Posted 2 days ago
9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Lead Software Quality Automation Engineer
Experience: 9 to 12 years
Notice period: candidates with an official notice period of at most 1 month
All key skills must be clearly mentioned in the project details section of the resume. Validate relocation cases thoroughly.
Work Mode: Hybrid (2-3 days WFO per week)
Ready to work flexible hours and collaborate with US/India/Colombia teams
Excellent communication skills (written, verbal, listening, and articulation)
Candidate should have team leading experience (minimum 2 reportees).

Responsibilities
Perform a lead role in ETL testing, UI testing, DB testing and team management.
Understand the holistic requirements; review and analyse stories, specifications, and technical design documents; and develop detailed test cases and test data to ensure business functionality is thoroughly tested, both automation and manual.
Validate ETL workflows, ensuring data integrity, accuracy and the transformation rules using complex Snowflake SQL queries. Working knowledge of DBT is a plus.
Create, execute and maintain automation scripts in BDD – Gherkin/Behave, Pytest (see the behave sketch after this posting).
Experience in writing DB queries (preferably in Postgres/Snowflake/MySQL/RDS).
Prepare, review and update test cases and relevant test data consistent with system requirements, including functional, integration, regression and UAT testing.
Coordinate with cross-team subject matter experts to develop, maintain, and validate test scenarios in the best interest of that POD.
Take ownership of creating and maintaining artifacts: test strategy, BRD, defect count/leakage reports and other quality issues.
Collaborate with the DevOps/SRE team to integrate test automation into CI/CD pipelines (Jenkins, Rundeck, GitHub, etc.).
Oversee and guide a team of at least 4 testers, leading them by example and institutionalizing best practices in testing processes and automation in agile methodology.
Meet with internal stakeholders to review current testing approaches, provide feedback on ways to improve/extend/automate along with data-backed inputs, and provide senior leadership with consolidated metrics.
Maximize the opportunity to excel in an open and recognized work culture. Be a problem solver and a team player.

Requirements
8-11 years of strong expertise in STLC, defect management, test strategy design, planning and approach.
Experience with test requirement understanding, test data, test plan and test case design.
Minimum 6+ years of strong work experience in UI, database and ETL testing.
Experience in ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL).
Any experience with AWS/cloud-hosted applications is an added advantage.
Hands-on experience in writing DB queries (preferably in Postgres/Snowflake/MySQL/RDS).
3+ years of experience with automation script execution, maintenance and enhancements with Selenium WebDriver (v3+)/Playwright, with programming experience in Python (must) with BDD – Gherkin and Behave, Pytest.
Key competencies required: strong analytical, problem-solving and communication skills, collaboration, accountability, stakeholder management, passion to drive initiatives, risk highlighting and team leading capabilities.
Proven team leadership experience with at least 2 people reporting.
Experience working with Agile methodologies such as Scrum and Kanban.
MS Power BI reporting.
Front-end vs back-end validation – good to have.

Advantage if the candidate:
Has Healthcare/Life Sciences domain experience
Has working knowledge of manual and automation testing, and ETL testing

Professional Approach
Ready to work flexible hours and collaborate with US/India/Colombia teams

Skills: automation testing, ETL testing, MS Power BI, DB testing, Behave, Postgres, RDS, DBT, Snowflake SQL, Rundeck, Gherkin, MySQL, software quality automation, ETL/data warehouse testing (DBT/Informatica, Snowflake, SQL), GitHub, Selenium WebDriver, Python, UI testing, Jenkins, Pytest
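A minimal sketch of a BDD-style ETL check with behave (Python), assuming a feature file like the one in the comment, a `context.target_conn` database connection created in `environment.py`, and a `context.source_count` set by an earlier extract step; table names and the tolerance are illustrative only.

```python
# features/steps/etl_load_steps.py
#
# Corresponding Gherkin scenario (features/etl_load.feature):
#   Feature: Nightly claims load
#     Scenario: Row counts reconcile after the load
#       Given the nightly ETL run has completed
#       When I count rows in the target table "CLAIMS"
#       Then the count matches the source extract within 0 rows
from behave import given, when, then

@given("the nightly ETL run has completed")
def step_etl_completed(context):
    # In a real suite this would poll the scheduler for job status;
    # here we simply assume the run has finished.
    context.etl_done = True

@when('I count rows in the target table "{table}"')
def step_count_target(context, table):
    cur = context.target_conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    context.target_count = cur.fetchone()[0]

@then("the count matches the source extract within {tolerance:d} rows")
def step_compare_counts(context, tolerance):
    # context.source_count is assumed to be set by an earlier extract step.
    assert abs(context.target_count - context.source_count) <= tolerance
```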
Posted 2 days ago
10.0 years
0 Lacs
India
On-site
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

About the Role
The candidate will be responsible for leading data modeling initiatives and ensuring compliance with healthcare regulations while collaborating with various stakeholders to translate business requirements into technical solutions.

Responsibilities:

Data Architecture & Modeling
Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management.
Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
Establish data modeling standards and best practices across the organization.

Technical Leadership
Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
Architect scalable data solutions that handle large volumes of healthcare transactional data.
Collaborate with data engineers to optimize data pipelines and ensure data quality.

Healthcare Domain Expertise
Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
Design data models that support analytical, reporting and AI/ML needs.
Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations.
Partner with business stakeholders to translate healthcare business requirements into technical data solutions.

Data Governance & Quality
Implement data governance frameworks specific to healthcare data privacy and security requirements.
Establish data quality monitoring and validation processes for critical health plan metrics.
Lead efforts to standardize healthcare data definitions across multiple systems and data sources.

Required Qualifications:

Technical Skills
10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data.
Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches.
Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
Proficiency with data modeling tools (Hackolade, ERwin, or similar).

Healthcare Industry Knowledge
Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data.
Experience with healthcare data standards and medical coding systems.
Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).

Leadership & Communication
Proven track record of leading data modeling projects in complex healthcare environments.
Strong analytical and problem-solving skills with the ability to work with ambiguous requirements.
Excellent communication skills with the ability to explain technical concepts to business stakeholders.
Experience mentoring team members and establishing technical standards.

Preferred Qualifications
Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
Cloud platform certifications (AWS, Azure, or GCP).
Experience with real-time data streaming and modern data lake architectures.
Knowledge of machine learning applications in healthcare analytics.
Previous experience in a lead or architect role within healthcare organizations.
Posted 2 days ago
7.0 years
0 Lacs
India
Remote
About Lemongrass
Lemongrass is a software-enabled services provider, synonymous with SAP on Cloud, focused on delivering superior, highly automated Managed Services to Enterprise customers. Our customers span multiple verticals and geographies across the Americas, EMEA and APAC. We partner with AWS, SAP, Microsoft and other global technology leaders.

We are seeking an experienced Cloud Data Engineer with a strong background in AWS, Azure, and GCP. The ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, and GCP Dataflow, as well as other ETL tools like Informatica and SAP Data Intelligence. You will be responsible for designing, implementing, and maintaining robust data pipelines and building scalable data lakes. Experience with data platforms such as Redshift, Snowflake, Databricks, Synapse and others is essential. Familiarity with data extraction from SAP or ERP systems is a plus.

Key Responsibilities:

Design and Development:
Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.); a PySpark sketch follows this posting.
Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP).
Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse.
Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others.
Develop and optimize data processing jobs using Spark and Scala.

Data Integration and Management:
Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems, into the data lake.
Ensure data quality and integrity through rigorous testing and validation.
Perform data extraction from SAP or ERP systems when necessary.

Performance Optimization:
Monitor and optimize the performance of data pipelines and ETL processes.
Implement best practices for data management, including data governance, security, and compliance.

Collaboration and Communication:
Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
Collaborate with cross-functional teams to design and implement data solutions that meet business needs.

Documentation and Maintenance:
Document technical solutions, processes, and workflows.
Maintain and troubleshoot existing ETL pipelines and data integrations.

Qualifications

Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus.

Experience:
7+ years of experience as a Data Engineer or in a similar role.
Proven experience with cloud platforms: AWS, Azure, and GCP.
Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.
Experience with other ETL tools like Informatica, SAP Data Intelligence, etc.
Experience in building and managing data lakes and data warehouses.
Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse.
Experience with data extraction from SAP or ERP systems is a plus.
Strong experience with Spark and Scala for data processing.

Skills:
Strong programming skills in Python, Java, or Scala.
Proficient in SQL and query optimization techniques.
Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts.
Knowledge of data governance, security, and compliance best practices.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.

Preferred Qualifications:
Experience with other data tools and technologies such as Apache Spark or Hadoop.
Certifications in cloud platforms (AWS Certified Data Analytics – Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate).
Experience with CI/CD pipelines and DevOps practices for data engineering.

Selected applicants will be subject to a background investigation, which will be conducted and the results of which will be used in compliance with applicable law.

What we offer in return:
Remote Working: Lemongrass always has been and always will offer 100% remote work
Flexibility: Work where and when you like most of the time
Training: A subscription to A Cloud Guru and a generous budget for taking certifications and other resources you'll find helpful
State of the art tech: An opportunity to learn and run the latest industry standard tools
Team: Colleagues who will challenge you, giving you the chance to learn from them and them from you

Lemongrass Consulting is proud to be an Equal Opportunity and Affirmative Action employer. We do not discriminate on the basis of race, religion, color, national origin, religious creed, gender, sexual orientation, gender identity, gender expression, age, genetic information, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
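A minimal PySpark sketch of the ingest-transform-load step described in the responsibilities above, assuming a Spark runtime with S3 access already configured (for example on Glue or Databricks); bucket paths and column names are placeholders, not details from the posting.

```python
# Simple extract-transform-load: raw CSV from the lake's raw zone to
# partitioned Parquet in the curated zone.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Extract: raw CSV extracts landed in the data lake's raw zone.
raw = (spark.read
       .option("header", "true")
       .csv("s3://example-raw-zone/orders/"))

# Transform: type the columns, drop obviously bad rows, add a load date.
clean = (raw
         .withColumn("order_amount", F.col("order_amount").cast("double"))
         .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
         .filter(F.col("order_id").isNotNull())
         .withColumn("load_date", F.current_date()))

# Load: write partitioned Parquet for the warehouse or lakehouse to consume.
(clean.write
 .mode("overwrite")
 .partitionBy("load_date")
 .parquet("s3://example-curated-zone/orders/"))
```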
Posted 2 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Testing Engineer
Experience: 8+ years
Location: Hyderabad and Gurgaon (Hybrid)
Notice Period: Immediate to 15 days

Job Description:
● Develop, maintain, and execute test cases to validate the accuracy, completeness, and consistency of data across different layers of the data warehouse.
● Test ETL processes to ensure that data is correctly extracted, transformed, and loaded from source to target systems while adhering to business rules.
● Perform source-to-target data validation to ensure data integrity and identify any discrepancies or data quality issues.
● Develop automated data validation scripts using SQL, Python, or testing frameworks to streamline and scale testing efforts (a pandas-based sketch follows this posting).
● Conduct testing in cloud-based data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake), ensuring performance and scalability.
● Familiarity with ETL testing tools and frameworks (e.g., Informatica, Talend, dbt).
● Experience with scripting languages to automate data testing.
● Familiarity with data visualization tools like Tableau, Power BI, or Looker.
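A minimal pandas sketch of the kind of automated validation script mentioned above, assuming source and target extracts have been exported to CSV; file names, key column, and tolerance are illustrative placeholders.

```python
# Column-level validation between a source extract and a target extract.
import pandas as pd

source = pd.read_csv("source_extract.csv")
target = pd.read_csv("target_extract.csv")

checks = {
    # Completeness: no rows lost or duplicated in the load.
    "row_count_match": len(source) == len(target),
    # Consistency: the key column has no nulls or duplicates in the target.
    "no_null_keys": target["account_id"].notna().all(),
    "no_duplicate_keys": not target["account_id"].duplicated().any(),
    # Accuracy: an additive measure reconciles within rounding tolerance.
    "amount_totals_match": abs(source["amount"].sum() - target["amount"].sum()) < 0.01,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise AssertionError(f"Data validation failed: {failed}")
print("All data validation checks passed")
```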
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview:
TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet.

Job Title: Data Fullstack - Descriptive Analytics
Location: Chennai
Work Type: Onsite

Position Description:
The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google BigQuery, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core lines of business for both employees and dealers.

This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily operational tasks until the products are well understood and will then progress to assisting with engineering tasks.

Skills Required:
GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift

Experience Required (Position Qualifications):
Bachelor's Degree in a relevant field
At least 5 years of experience with Descriptive Analytics technologies
DevOps experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell, and managing large GCP installations, OR System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engines or OpenShift Virtualization VMs
Strong troubleshooting and problem-solving skills
Understanding of the Product Life Cycle
Ability to coordinate issue resolution with vendors on behalf of the client
Strong written and verbal communication skills
Understanding of technologies like Power BI, BigQuery, Teradata, SQL Server, Oracle DB2, etc.
Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.)

Experience Preferred:
Experience with PowerApps and Power Automate
Familiarity with Jira
Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC)

Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 2 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Summary:
We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.

Experience: 7 - 12 years
Work Location: Hyderabad (Hybrid) / Remote
Mandatory skills: AWS, Python, SQL, Airflow, DBT
Must have done 1 or 2 projects in the clinical domain/clinical industry.

Responsibilities:
Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads.
Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage (an Airflow/dbt sketch follows this posting).
Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.

Required Skills:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7 - 12+ years of experience in data engineering.
Cloud Platforms: Strong expertise in AWS data services.
Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
Programming: Proficiency in Python, Scala, or Java for data processing and automation.
ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica.
Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
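A minimal sketch of a daily Airflow DAG that lands raw data and then runs dbt models, matching the mandatory AWS/Python/SQL/Airflow/DBT stack listed above. It assumes Airflow 2.4+ and a dbt project installed on the worker; the DAG id, bucket, project path, and function body are illustrative placeholders.

```python
# dags/daily_clinical_load.py
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator

def extract_to_s3():
    # Placeholder: pull the day's increment from the source system and land
    # it in s3://example-bucket/raw/ (e.g. with boto3).
    pass

with DAG(
    dag_id="daily_clinical_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --profiles-dir .",
    )
    # Run transformations only after the raw landing succeeds.
    extract >> transform
```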
Posted 2 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Contract: 6 months
Location: Remote in India

• Mandatory 5+ years of experience in Informatica MDM C360 Cloud solutions.
• Cloud experience is a must; on-prem experience alone is not sufficient.
• Experienced and highly skilled SME, responsible for providing knowledge and guidance in the implementation and optimization of Informatica MDM C360 Cloud solutions.
• Collaborate with business stakeholders and technical teams to understand their master data management challenges and requirements, help design MDM solutions using Informatica C360 Cloud, and assist in the implementation, configuration, and maintenance of these solutions.
• Responsible for developing the master data management system, including data integration, data modeling and data migration, ensuring data quality, data integration, and consistency across enterprise systems.
• Play a key role in handling critical production data issues and collaborating with cross-functional teams to deliver high-quality solutions around Customer MDM.
• Provide architecture and design, use case development, and solution implementation advice, responding promptly to internal customer questions with technical explanations of product features and capabilities when needed, and preparing and delivering solution presentations or technical proposals.

QUALIFICATIONS AND SKILLS
• Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
• 5+ years of experience in Data Management.
• 3+ years of hands-on experience in Informatica SaaS solutions, preferably Informatica Intelligent Data Management Cloud (IDMC) Customer 360.
• Experience implementing full-lifecycle MDM projects.
• Hands-on experience as an MDM expert/specialist or in a similar role, specifically with Informatica MDM Customer 360 and/or Multidomain MDM and/or Reference 360, including handling critical production data issues, hot fixes, and patches.
• Strong understanding of master data management concepts, data governance principles, and data integration strategies.
• Experience in designing and implementing MDM solutions, data models, and data hierarchies.
• Proficiency in data profiling, data cleansing, and data matching techniques.
• Excellent analytical and problem-solving skills with the ability to translate business requirements into technical solutions.
• Strong communication and interpersonal skills to effectively collaborate with clients and cross-functional teams.
• Hands-on experience with Microsoft Power BI and Snowflake is desirable.
• Strong ability and passion for documenting things and presenting to different audiences.
• Strong understanding of MDM best practices and industry standards in the customer domain.
• Experience in integrating external business applications with the MDM hub.
• Strong understanding of MDM architectures and business processes.
• Solid understanding of data integration, data quality, data architecture, and master data management.
• Familiarity with other related Informatica services is a plus.
• Relevant certifications in Informatica MDM, such as Informatica MDM Developer or Administrator, are a plus.
Posted 2 days ago
7.0 years
0 Lacs
India
On-site
Job Title: Informatica Architect
Job Type: Full-time, Contractor
Location: Hybrid - Bengaluru | Pune

About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary
Join our customer's team as an Informatica Architect and play a critical role in shaping data governance, data catalog, and data quality initiatives for enterprise-level products. As a key leader, you will collaborate closely with Data & Analytics leads, ensuring the integrity, accessibility, and quality of business-critical data assets across multiple domains.

Key Responsibilities
Lead data governance, data catalog, and data quality efforts utilizing Informatica and other industry-leading tools.
Design, develop, and manage data catalogs and enterprise data assets to support analytics and reporting across the organization.
Configure and optimize Informatica CDQ and Data Quality modules, ensuring adherence to enterprise data standards and policies.
Implement and maintain business glossaries, data domains, data lineage, and data stewardship resources for enterprise-wide use.
Collaborate with cross-functional teams to define critical data elements, data governance rules, and quality policies for multiple data sources.
Develop dashboards and visualizations to support data quality monitoring, compliance, and stewardship activities.
Continuously review, assess, and enhance data definitions, catalog resources, and governance practices to stay ahead of evolving business needs.

Required Skills and Qualifications
Minimum 7-8 years of enterprise data integration, management, and governance experience with proven expertise in EDW technologies.
At least 5 years of hands-on experience with Informatica CDQ and Data Quality solutions, having executed 2+ large-scale Data Governance and Quality projects from inception to production.
Demonstrated proficiency configuring business glossaries, policies, dashboards, and search functions within Informatica or similar platforms.
In-depth expertise in data quality, data cataloguing, and data governance frameworks and best practices.
Strong background in Master Data Management (MDM), ensuring oversight and control of complex product catalogs.
Exceptional written and verbal communication skills, able to effectively engage technical and business stakeholders.
Experience collaborating with diverse teams to deliver robust data governance and analytics solutions.

Preferred Qualifications
Administration and management experience with industry data catalog tools such as Collibra, Alation, or Atlan.
Strong working knowledge of configuring user groups, permissions, data profiling, and lineage within catalog platforms.
Hands-on experience implementing open-source data catalog tools in enterprise environments.
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview
TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities and the planet.

Job Title: Systems Engineering Practitioner
Location: Chennai
Duration: 12 Months
Work Type: Onsite

Position Description
The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google BigQuery, and more - all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core lines of business for both employees and dealers.

This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily operational tasks until the products are well understood and will then progress to assisting with engineering tasks.

Skills Required
GCP, Tekton, GitHub, Terraform, PowerShell, OpenShift

Experience Required (Position Qualifications)
Bachelor's Degree in a relevant field
At least 5 years of experience with Descriptive Analytics technologies
DevOps experience with GitHub, Tekton pipelines, Terraform code, Google Cloud Services, and PowerShell, and managing large GCP installations, OR System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engines or OpenShift Virtualization VMs
Strong troubleshooting and problem-solving skills
Understanding of the Product Life Cycle
Ability to coordinate issue resolution with vendors on behalf of the client
Strong written and verbal communication skills
Understanding of technologies like Power BI, BigQuery, Teradata, SQL Server, Oracle DB2, etc.
Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA or vault keys, etc.)

Experience Preferred
Experience with PowerApps and Power Automate
Familiarity with Jira
Familiarity with the client EAA, RTP, and EAMS processes and the client security policies (GRC)

Education Required
Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 2 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Greetings!

One of our esteemed clients is a Japanese multinational information technology (IT) service and consulting company headquartered in Tokyo, Japan. The company acquired Italy-based Value Team S.p.A. and launched Global One Teams. Join this dynamic, high-impact firm where innovation meets opportunity — and take your career to new heights!

🔍 We Are Hiring: Informatica Administrator (5-10 years)
Note: We need a pure Informatica Admin; no developer profiles required.
Shift Timings: 9am to 6pm
Relevant Experience: 5+ years
Work Location and Address: Hi-tech City Layout, Madhapur, Hyderabad - 500 081
Interview process: 2 rounds (1 in-person round is a MUST)

Mandatory skills:
Informatica Administration in MDM-E360/PIM-P360
Oracle DB
Unix
Kafka configuration is an add-on.

JD:
To install, configure, manage, and support Informatica MDM and PIM platforms, ensuring high availability, performance, and data integrity for enterprise-level master and product data domains.

Installation & Configuration:
Install and configure Informatica MDM (Hub, IDD, E360) and PIM (Informatica Product 360).
Set up application tiers including the database, application server (WebLogic/JBoss/Tomcat), and web server.
Configure integration points with source/target systems.
Experience in upgrading PC, IDQ, MDM/E360, PIM/P360 to higher versions.
Experience in migrating PC, IDQ, MDM/E360, PIM/P360 objects and strong troubleshooting of performance bottlenecks.

Interested candidates, please share your updated resume along with the following details:
Total Experience:
Relevant Experience in Informatica Admin:
Current Location:
Current CTC:
Expected CTC:
Notice Period:

🔒 We assure you that your profile will be handled with strict confidentiality.
📩 Apply now and be part of this incredible journey.

Thanks,
Syed Mohammad
syed.m@anlage.co.in
Posted 2 days ago
15.0 years
0 Lacs
India
On-site
SAS Solution Designer

We are seeking a highly experienced SAS Solution Designer to join our team in a solution engineering lead capacity. This role requires in-depth knowledge of SAS technologies, cloud-based platforms, and data solutions. The ideal candidate will be responsible for end-to-end solution design aligned with enterprise architecture standards and business objectives, providing technical leadership across squads and development teams.

Mitra AI is currently looking for experienced SAS Solution Designers who are based in India and are open to relocating. This is a hybrid opportunity in Sydney, Australia.

JOB SPECIFIC DUTIES & RESPONSIBILITIES
Own and define the end-to-end solution architecture for data platforms, ensuring alignment with business objectives, enterprise standards, and architectural best practices.
Design reliable, stable, and scalable SAS-based solutions that support long-term operational effectiveness.
Lead solution engineers and Agile squads to ensure the delivery of high-quality, maintainable data solutions.
Collaborate independently with business and technical stakeholders to understand requirements and translate them into comprehensive technical designs.
Provide high-level estimates for proposed features and technical initiatives to support business planning and prioritization.
Conduct and participate in solution governance forums to secure approval for data designs and strategies.
Drive continuous improvement by identifying technical gaps and implementing best practices, emerging technologies, and enhanced processes.
Facilitate work breakdown sessions and actively participate in Agile ceremonies such as sprint planning and backlog grooming.
Ensure quality assurance through rigorous code reviews, test case validation, and enforcement of coding and documentation standards.
Troubleshoot complex issues by performing root cause analysis, log reviews, and coordination with relevant teams for resolution.
Provide mentoring and coaching to solution engineers and technical leads to support skills growth and consistency in solution delivery.

REQUIRED COMPETENCIES AND SKILLS
Deep expertise in SAS technologies and the SAS ecosystem.
Strong proficiency in cloud-based technologies and data platforms (e.g., Azure, Hadoop, Teradata).
Solid understanding of RDBMS, ETL/ELT tools (e.g., Informatica), and real-time data streaming.
Ability to work across relational and NoSQL databases and integrate with various data and analytics tools.
Familiarity with BI and reporting tools such as Tableau and Power BI.
Experience guiding Agile delivery teams, supporting full-stack solution development through DevOps and CI/CD practices.
Capability to define and implement secure, scalable, and performant data solutions.
Strong knowledge of metadata management, reference data, and data lineage concepts.
Ability to communicate effectively with both technical and non-technical stakeholders.
Problem-solving mindset with attention to detail and an emphasis on delivering high-quality solutions.

REQUIRED EXPERIENCE AND QUALIFICATIONS
Minimum of 15+ years of experience in solution design and development roles, including leadership responsibilities.
Strong exposure to SAS and enterprise data platforms in the financial services industry.
Prior experience working within risk, compliance, or credit risk domains is highly desirable.
Practical experience with Agile methodologies and DevOps principles.
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
Experience working in cross-functional teams with a focus on business alignment and technology delivery.
Posted 2 days ago
10.0 years
0 Lacs
Greater Kolkata Area
On-site
We are looking for a Senior Data Lead to lead enterprise-level data modernization and innovation. In this highly strategic role, you will design scalable, secure, and future-ready data architectures, modernize legacy systems, and provide trusted technical leadership across both technology and business teams. This is a unique opportunity to make a company-wide impact by influencing data strategy and enabling smarter, faster decision-making through data.

Key Responsibilities
Architect & Design: Lead the development of robust, scalable data models, data management systems, and integration frameworks to ensure enterprise-wide data accuracy, consistency, and security.
Domain Expertise: Act as a subject matter expert across key business functions such as Supply Chain, Product Engineering, Sales & Marketing, Manufacturing, Finance, and Legal.
Modernization Leadership: Drive the transformation of legacy systems and manage end-to-end cloud migrations with minimal business disruption.
Collaboration: Partner with data engineers, scientists, analysts, and IT leaders to build high-performance, scalable data pipelines and transformation solutions.
Governance & Compliance: Establish and maintain data governance frameworks including metadata repositories, data dictionaries, and data lineage documentation.
Strategic Advisory: Provide guidance on data architecture best practices, technology selection, and roadmap alignment to senior leadership and cross-functional teams.
Mentorship: Serve as a mentor and thought leader to junior data professionals, fostering a culture of innovation, knowledge sharing, and technical excellence.
Innovation & Trends: Stay abreast of emerging technologies in cloud, data platforms, and AI/ML to identify and implement innovative solutions.
Communication: Translate complex technical concepts into clear, actionable insights for technical and non-technical audiences alike.

Required Qualifications
10+ years of experience in data architecture, engineering, or enterprise data management roles.
Demonstrated success leading large-scale data initiatives in life sciences or other highly regulated industries.
Deep expertise in modern data architecture paradigms such as Data Lakehouse, Data Mesh, or Data Fabric.
Strong hands-on experience with cloud platforms like AWS, Azure, or Google Cloud Platform (GCP).
Proficiency in data modeling, ETL/ELT frameworks, and enterprise integration patterns.
Deep understanding of data governance, metadata management, master data management (MDM), and data quality practices.
Experience with tools and platforms including but not limited to: Data Integration: Informatica, Talend; Data Governance: Collibra; Modeling/Transformation: dbt; Cloud Platforms: Snowflake, Databricks.
Excellent problem-solving skills with the ability to translate business requirements into scalable data solutions.
Exceptional communication skills and experience engaging with both executive stakeholders and engineering teams.

Preferred Qualifications (Nice to Have)
Experience with AI/ML data pipelines or real-time streaming architectures.
Certifications in cloud technologies (e.g., AWS Certified Solutions Architect, Azure Data Engineer).
Familiarity with regulatory frameworks such as GxP, HIPAA, or GDPR.
Posted 2 days ago
50.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Client:
Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organizations accelerate their transition to a digital and sustainable world. They provide a variety of services, including consulting, technology, professional, and outsourcing services.

Job Details:
Location: Hyderabad
Mode of Work: Hybrid
Notice Period: Immediate Joiners
Experience: 6-8 yrs
Type of Hire: Contract to Hire

JOB DESCRIPTION:
• Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL and Spark Streaming.
• Experience with Spark optimization techniques.
• Deep knowledge of Delta Lake features like time travel, schema evolution, and data partitioning (a Delta Lake sketch follows this posting).
• Ability to design and implement data pipelines using Spark and Delta Lake as the data storage layer.
• Proficiency in Python/Scala/Java for Spark development and integration with ETL processes.
• Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database).
• Understanding of data quality best practices and data validation techniques.

Other Skills:
• Understanding of data warehouse concepts and data modelling techniques.
• Expertise in Git for code management.
• Familiarity with CI/CD pipelines and containerization technologies.
• Nice to have: experience using data integration tools like DataStage/Prophecy/Informatica/Ab Initio.
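A minimal sketch of the two Delta Lake features called out above, schema evolution and time travel, assuming a SparkSession already configured with the delta-spark package (for example via configure_spark_with_delta_pip or on Databricks); the path and columns are placeholders.

```python
# Delta Lake schema evolution and time travel, end to end on a toy table.
from pyspark.sql import SparkSession

spark = SparkSession.getActiveSession()  # assumed already configured for Delta Lake
path = "/tmp/delta/events"

# Initial load creates version 0 of the Delta table.
spark.createDataFrame([(1, "signup")], ["user_id", "event"]) \
     .write.format("delta").mode("overwrite").save(path)

# Append with an extra column; mergeSchema lets the table schema evolve.
spark.createDataFrame([(2, "login", "mobile")], ["user_id", "event", "channel"]) \
     .write.format("delta").mode("append") \
     .option("mergeSchema", "true").save(path)

# Time travel: read the table as it existed at version 0.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
v0.show()
```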
Posted 2 days ago
4.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from TCS!

TCS is hiring for Informatica PowerCenter with Teradata/Oracle.
Location: Chennai
Desired Experience Range: 4 - 6 Years

Must-Have
Informatica, IICS
Teradata/Oracle
Unix

Good-to-Have
Control-M
Designing and impact analysis experience with the above technologies
Agile Scrum experience
Exposure to data ingestion from disparate sources onto a big data platform

Thanks,
Anshika
Posted 2 days ago
7.0 years
0 Lacs
Gurugram, Haryana
On-site
Engineer III, Database Engineering Gurgaon, India; Hyderabad, India Information Technology 316332 Job Description About The Role: Grade Level (for internal use): 10 Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P CapitalIQ Pro to serve-up value-added Ratings, Research and related information to the Institutional clients. The Team: Our team is responsible for the gathering data from multiple sources spread across the globe using different mechanism (ETL/GG/SQL Rep/Informatica/Data Pipeline) and convert them to a common format which can be used by Client facing UI tools and other Data providing Applications. This application is the backbone of many of S&P applications and is critical to our client needs. You will get to work on wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global? Impact: Our Team is responsible to deliver essential and business critical data with applied intelligence to power the market of the future. This enables our customer to make decisions with conviction. Contribute significantly to the growth of the firm by- Developing innovative functionality in existing and new products Supporting and maintaining high revenue productionized products Achieve the above intelligently and economically using best practices Career: This is the place to hone your existing Database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts and product managers who are experts in their domain. Your skills: You should be able to demonstrate that you have an outstanding knowledge and hands-on experience in the below areas: Complete SDLC: architecture, design, development and support of tech solutions Play a key role in the development team to build high-quality, high-performance, scalable code Engineer components, and common services based on standard corporate development models, languages and tools Produce technical design documents and conduct technical walkthroughs Collaborate effectively with technical and non-technical stakeholders Be part of a culture to continuously improve the technical design and code base Document and demonstrate solutions using technical design docs, diagrams and stubbed code Our Hiring Manager says: I’m looking for a person that gets excited about technology and motivated by seeing how our individual contribution and team work to the world class web products affect the workflow of thousands of clients resulting in revenue for the company. Qualifications Required: Bachelor’s degree in computer science, Information Systems or Engineering. 7+ years of experience on Transactional Databases like SQL server, Oracle, PostgreSQL and other NoSQL databases like Amazon DynamoDB, MongoDB Strong Database development skills on SQL Server, Oracle Strong knowledge of Database architecture, Data Modeling and Data warehouse. Knowledge on object-oriented design, and design patterns. 
Familiar with various design and architectural patterns
Strong development experience with Microsoft SQL Server
Experience in cloud-native development and AWS is a big plus
Experience with Kafka/Sonic broker messaging systems

Nice to have:
Experience in developing data pipelines using Java or C# is a significant advantage
Strong knowledge of ETL tools – Informatica, SSIS; exposure to Informatica is an advantage
Familiarity with Agile and Scrum models
Working knowledge of VSTS
Working knowledge of AWS cloud is an added advantage
Understanding of fundamental design principles for building a scalable system
Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options and Index/Benchmarks is desirable
Additionally, experience with Scala, Python and Spark applications is a plus

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India
Posted 2 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad
Contract Duration: 6 Months
Experience Required: 8+ years (Overall), 5+ years (Relevant)

🔧 Primary Skills
Python
Spark (PySpark)
SQL
Delta Lake

📌 Key Responsibilities & Skills
Strong understanding of Spark core: RDDs, DataFrames, DataSets, SparkSQL, Spark Streaming
Proficient in Delta Lake features: time travel, schema evolution, data partitioning
Experience designing and building data pipelines using Spark and Delta Lake
Solid experience in Python/Scala/Java for Spark development
Knowledge of data ingestion from files, APIs, and databases
Familiarity with data validation and quality best practices
Working knowledge of data warehouse concepts and data modeling
Hands-on with Git for code versioning
Exposure to CI/CD pipelines and containerization tools
Nice to have: experience in ETL tools like DataStage, Prophecy, Informatica, or Ab Initio
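For context on the Delta Lake capabilities named above (partitioned writes, schema evolution, time travel), a minimal PySpark sketch is shown below. The paths, column names and configuration are illustrative assumptions, not details of this engagement.

```python
# Minimal Spark + Delta Lake pipeline sketch (illustrative only).
# Assumes pyspark and delta-spark are installed; paths/columns are hypothetical.
from pyspark.sql import SparkSession, functions as F
from delta import configure_spark_with_delta_pip

builder = (
    SparkSession.builder.appName("delta-pipeline-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Ingest raw files and apply a basic data-quality filter.
raw = (
    spark.read.option("header", True).csv("/data/raw/transactions/")
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
)

# Write to a partitioned Delta table; mergeSchema allows schema evolution.
(
    raw.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")
    .partitionBy("txn_date")
    .save("/data/delta/transactions")
)

# Time travel: read an earlier version of the table for reconciliation.
previous = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/data/delta/transactions")
)
print(previous.count(), raw.count())
```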
Posted 2 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
ABOUT US:
Bain & Company is a global consultancy that helps the world's most ambitious change makers define the future. Across 61 offices in 39 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN plays a critical role in supporting Bain's case teams globally with analytics and research across all industries, for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services.

WHO YOU'LL WORK WITH:
This role is based out of the People & ORG CoE, which sits in the broader Data & Tech cluster at the BCN. The People & ORG CoE builds and deploys analytical solutions pertaining to the Operating Model and Organization practice. The team primarily helps Bain case teams, across geographies and industries, solve critical client issues by applying battle-proven diagnostics and solutions that identify client pain points related to org, culture, and talent. The team also plays a significant role in creating, testing, and contributing to proprietary products and Bain IP within the Org domain. This role will focus on the development, maintenance and evolution of the state-of-the-art org tool and data assets.

WHAT YOU'LL DO:
Become an expert and own, maintain and evolve advanced internal tools (Python focused), as well as help develop new tools with LLMs and GenAI, in an individual capacity
Be responsible for end-to-end handling of the entire tool process, i.e., developing Python scripts, troubleshooting errors, etc.
Help with case delivery related to those tools and generate meaningful insights for Bain clients
Potentially build and maintain internal web applications using front-end technologies (HTML, CSS, JavaScript) and frameworks like Streamlit; ensure compatibility across devices and browsers
Work under the guidance of a Team Manager / Sr.
Team Manager, playing a key role in driving the team's innovation, especially on GenAI topics – identifying areas for automation and augmentation and helping the team create efficiency gains
Lead internal team calls and effectively communicate data, knowledge, insights and actionable next steps on tool development, relaying implications to your own internal team where necessary
Keep abreast of new and current statistical, database, machine learning, and advanced analytics techniques

ABOUT YOU:
Candidates should be graduates/post-graduates from a top-tier college with a strong academic background
Must have 4+ years of relevant experience in Python; experience using or building tools with GenAI, LLMs or machine learning will be preferred
An advanced understanding of database design and of Azure/AWS server functioning would be preferred
Good to have: experience in SQL and Git, and hands-on experience with statistical and machine learning models (e.g., regression, classification, clustering, NLP, ensemble methods), including practical application in business contexts
Good to have: experience with HTML, CSS, JavaScript (ES6+), pgAdmin and low-code development tools such as Streamlit, Mendix, Power Apps
Experience with data science/data analytics and ETL tools such as Alteryx or Informatica will be a plus
Must be able to generate and screen realistic answers based on sound reality checks and recommend actionable solutions
Must be willing to own and maintain a high-visibility, high-impact product
Experience in managing productized solutions will be helpful
Excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical senior stakeholders
Ability to prioritize projects, manage multiple competing priorities and drive projects to completion under tight deadlines

WHAT MAKES US A GREAT PLACE TO WORK:
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
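As an illustration of the lightweight internal web applications mentioned above, a minimal Streamlit sketch might look like the following; the file upload, column names and chart are hypothetical examples for context, not an actual Bain tool.

```python
# Minimal Streamlit sketch of an internal data tool (illustrative only).
# Run with: streamlit run app.py
import pandas as pd
import streamlit as st

st.title("Org Diagnostic Explorer (illustrative)")

uploaded = st.file_uploader("Upload a survey extract (CSV)", type="csv")
if uploaded is not None:
    df = pd.read_csv(uploaded)
    st.write(f"Loaded {len(df)} rows")

    # Hypothetical columns; a real tool would validate the schema first.
    if {"department", "engagement_score"}.issubset(df.columns):
        summary = df.groupby("department")["engagement_score"].mean()
        st.bar_chart(summary)

    st.dataframe(df.head(50))
```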
Posted 2 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica Intelligent Cloud Services
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica Intelligent Cloud Services.
- Strong understanding of application development methodologies.
- Experience with cloud-based application deployment and management.
- Familiarity with data integration and transformation processes.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Intelligent Cloud Services.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 2 days ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Description
About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

About
Our Financial Crimes specialist teams provide solutions to BFSI clients by conducting model validation testing for AML risk models and frameworks, sanctions screening and transaction monitoring systems, to ensure the efficiency and efficacy of the underlying frameworks both functionally and statistically. We are looking to hire colleagues with advanced data science and analytics skills to support our Financial Crimes team. You will play a crucial role in helping clients tackle the multifaceted challenges of financial crime. By utilizing advanced analytics and deep technical knowledge, our team aids top clients in reducing risks associated with financial crime, terrorist financing, and sanctions violations. We also work to enhance their screening and transaction monitoring systems. Our team of specialized analysts ensures that leading financial institutions adhere to industry best practices for robust programs and controls. Through a variety of project experiences, you will develop your professional skills, assisting clients in understanding and addressing complex issues, and implementing top-tier solutions to resolve identified problems.

Minimum work experience: 3+ years of advanced analytics
Preferred experience: 1+ years in AML model validation

Responsibilities
Support functional SME teams to build data-driven Financial Crimes solutions
Conduct statistical testing of screening matching algorithms, risk rating models and thresholds configured for detection rules
Validate data models of AML systems built on platforms such as SAS Viya, Actimize, Lexis Nexis, Napier, etc.
Develop, validate, and maintain AML models to detect suspicious activities and transactions
Conduct Above the Line and Below the Line testing
Conduct thorough model validation processes, including performance monitoring, tuning, and calibration
Ensure compliance with regulatory requirements and internal policies related to AML model risk management
Collaborate with cross-functional teams to gather and analyze data for model development and validation
Perform data analysis and statistical modeling to identify trends and patterns in financial transactions
Prepare detailed documentation and reports on model validation findings and recommendations
Assist in feature engineering to improve GenAI prompts applicable to the automation of AML/screening-related investigations
Use advanced machine learning deployment (e.g., XGBoost) and GenAI approaches

Criteria
Bachelor's degree from an accredited university
3+ years of hands-on experience in Python, with experience in Java, FastAPI, Django, Tornado or Flask frameworks
Working experience with relational and NoSQL databases like Oracle, MS SQL, MongoDB or Elasticsearch
Proficiency in BI tools such as Power BI, Tableau, etc.
Proven experience in data model development and testing
Educational background in Data Science and Statistics
Strong proficiency in programming languages such as Python, R, and SQL
Expertise in machine learning algorithms, statistical analysis, and data visualization tools
Familiarity with regulatory guidelines and standards for AML
Experience in AML-related model validation and testing
Expertise in techniques and algorithms including sampling, optimization, logistic regression, cluster analysis, neural networks, decision trees, and supervised and unsupervised machine learning

Preferred Experiences
Validation of AML compliance models, such as statistical testing of customer/transaction risk models, screening algorithm testing, etc.
Experience with developing proposals (especially new solutions)
Experience working with AML technology platforms, e.g., Norkom, SAS, Lexis Nexis, etc.
Hands-on experience with data analytics tools such as Informatica, Kafka, etc.

Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

Qualifications
Bachelor's degree from an accredited university
Educational background in Data Science and Statistics
3+ years of hands-on experience in data science and data analytics
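For candidates new to Above/Below the Line testing, a minimal sketch of threshold tuning for a detection score is shown below; the file name, column names and threshold grid are illustrative assumptions, not a KPMG or vendor-specific methodology.

```python
# Sketch of above/below-the-line threshold analysis for an AML detection score,
# assuming a labelled sample of historical alerts (columns are hypothetical).
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

alerts = pd.read_csv("scored_alerts_sample.csv")   # hypothetical extract
y_true = alerts["confirmed_suspicious"]            # 1 = escalated / SAR filed
score = alerts["model_score"]                      # model or rule score

# Discrimination of the risk model on the validation sample.
print("AUC:", round(roc_auc_score(y_true, score), 3))

# Compare candidate thresholds: alert volume vs. share of true positives kept.
for threshold in np.linspace(score.min(), score.max(), 11):
    flagged = score >= threshold
    volume = int(flagged.sum())
    capture = y_true[flagged].sum() / max(y_true.sum(), 1)
    print(f"threshold={threshold:.2f} alerts={volume} capture_rate={capture:.2%}")
```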
Posted 2 days ago
The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.
The average salary range for Informatica professionals in India varies based on experience and expertise:
- Entry-level: INR 3-5 lakhs per annum
- Mid-level: INR 6-10 lakhs per annum
- Experienced: INR 12-20 lakhs per annum
A typical career progression in the Informatica field may include roles such as:
- Junior Developer
- Informatica Developer
- Senior Developer
- Informatica Tech Lead
- Informatica Architect
In addition to Informatica expertise, professionals in this field are often expected to have skills in:
- SQL
- Data warehousing
- ETL tools
- Data modeling
- Data analysis
As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated on the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!