
1529 Talend Jobs - Page 33

JobPe aggregates listings for convenient access, but applications are submitted directly on the original job portal.

11.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are seeking an experienced Senior Manager - Business Intelligence to lead and expand our BI team. This role requires a visionary leader who can drive strategic decision-making through data insights, foster a data-driven culture, and manage end-to-end BI processes. The ideal candidate will bring over 11 years of expertise in Business Intelligence, Data Warehousing, and Analytics, with a proven track record of delivering scalable BI solutions. You will be responsible for leading the design and implementation of leading-edge solutions that meet our clients' growing needs to maximize the potential of data in realizing their business goals. A key aspect of the role is maintaining an insightful understanding of evolving technology and industry trends in the Business Intelligence and Analytics landscape, and applying those insights to the development strategy and solution offering portfolio so that we can best meet current and future client requirements and market opportunities.

KEY RESPONSIBILITIES:
- Take primary responsibility for the successful delivery of client engagements and initiatives.
- Lead a team of BI and Analytics professionals.
- Provide subject matter expertise and advisory input to client engagements as required: strategy and solution advisory, technical resource management, client workshop facilitation, and overall engagement management.
- Support business development activities, grow your network, and contribute to the proposal development process, identifying and developing relationships with new and existing business units.
- Design, develop, and maintain the data warehouse and analytics architecture to meet business analysis and reporting needs, with the scale and reliability of a modern distributed system at its heart.
- Understand business needs and apply analytical concepts to generate actionable business insights; handle assigned tasks as an individual contributor where needed.
- Collaborate with and mentor the team of data analysts to provide development coverage, support, and knowledge sharing.
- Understand the client's priority areas and business challenges to recommend solutions.
- Set goals and drive the roadmap for the business intelligence team.

ESSENTIAL SKILLS:
- Entrepreneurial mindset; able to identify projects and lead them from conceptualization to solution.
- 10+ years of work experience in Business Intelligence, ETL, and Data Warehousing.
- Extensive knowledge of at least 2-3 data visualization tools such as Tableau, Power BI, DOMO, or Google Data Studio.
- Hands-on experience with database methodology, data analysis, advanced SQL queries, ETL, and business intelligence applications.
- Hands-on experience with multiple ETL platforms such as Talend, Alteryx, and Informatica.
- Implementation of multiple data warehousing projects from the design phase onward.
- Well versed in databases such as PostgreSQL, MySQL, and MS SQL.
- Experience working with various data sources: web services, application APIs, XML, relational databases, document-oriented NoSQL databases, etc.
- Experience with big data storage and query engines such as AWS Redshift and BigQuery.
- Experience with cloud infrastructure providers such as Amazon AWS, Google Cloud, or Azure.
- Strong people management and project management experience.
- Educational background from a premium institute.

Posted 1 month ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a skilled Data Engineer with over 6 years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools.

Key Responsibilities:
- Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer, Control-M, Cron, Luigi, and similar tools.
- Build and optimize data architectures, including data lakes and data warehouses.
- Integrate data from multiple sources, ensuring data quality and consistency.
- Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions.
- Analyze complex datasets to identify trends, generate actionable insights, and support decision-making.
- Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation.
- Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB.
- Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
- Utilize cloud data services and warehouses such as AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery.
- Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools.
- Ensure data governance, security, and compliance standards are met.
- Participate in Agile and DevOps processes to enhance data engineering workflows.

Required Qualifications:
- 6+ years of professional experience in data engineering and data analysis roles.
- Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB.
- Hands-on experience with big data tools like Hadoop and Apache Spark.
- Proficiency in Python programming.
- Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks.
- Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi.
- Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow.
- Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory).
- Understanding of data modeling concepts and data lake/data warehouse architectures.
- Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows.
- Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB.
- Exposure to Agile and DevOps methodologies.
- Experience with Amazon Web Services (S3, Glue, Redshift, Lambda, Athena).

Preferred Skills:
- Strong problem-solving and communication skills.
- Ability to work independently and collaboratively in a team environment.
- Experience with service development, REST APIs, and automation testing is a plus.
- Familiarity with version control systems and workflow automation.
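Roles like this one revolve around the classic extract-transform-load pattern. As a minimal, hypothetical sketch (plain Python rather than Airflow, Talend, or any other tool named above, with made-up record data), a pipeline can be expressed as three composable stages:

```python
# Minimal ETL sketch: extract -> transform -> load, as plain Python functions.
# All data, field names, and function names are illustrative only.

def extract():
    # Stand-in for reading from an API, file, or source database.
    return [
        {"id": 1, "amount": "120.50", "region": "south"},
        {"id": 2, "amount": "n/a",    "region": "north"},
        {"id": 3, "amount": "75.00",  "region": "south"},
    ]

def transform(rows):
    # Cleanse: drop rows with unparseable amounts, normalize types and casing.
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would route this to a reject/error table
        clean.append({"id": row["id"], "amount": amount,
                      "region": row["region"].upper()})
    return clean

def load(rows, target):
    # Stand-in for writing to a warehouse table; returns the row count loaded.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 of the 3 source rows survive validation
```

Orchestrators such as Airflow or Luigi essentially schedule and monitor stages like these as a dependency graph, adding retries, backfills, and alerting around the same core idea.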

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed; Technical Test performance

Mandatory Skills: Talend Big Data.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

6.0 - 11.0 years

6 - 16 Lacs

Bengaluru

Work from Office

Responsibilities:
- Collaborate with cross-functional teams on MuleSoft integrations
- Ensure data accuracy through testing and validation processes
- Design, develop, and maintain ETL solutions using Talend

Posted 1 month ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- 15 years of full-time education is required.
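Rule-based data quality scoring of the kind described above can be illustrated with a small Python sketch. This is a generic completeness/validity check over made-up customer records, not the actual Collibra rule engine or its API:

```python
# Hypothetical rule-based data quality scoring, in the spirit of the
# completeness and validity dimensions mentioned above. Not Collibra's API;
# all names and data are illustrative.

def completeness(rows, field):
    # Fraction of rows where the field is present and non-empty.
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def validity(rows, field, predicate):
    # Fraction of rows whose field value passes a rule predicate.
    valid = sum(1 for r in rows if predicate(r.get(field)))
    return valid / len(rows)

customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "not-an-email"},
    {"id": 4, "email": "d@example.com"},
]

score_complete = completeness(customers, "email")  # 3 of 4 are filled
score_valid = validity(customers, "email",
                       lambda v: isinstance(v, str) and "@" in v)  # 2 of 4 pass
print(score_complete, score_valid)
```

Platforms like CDQ generalize this idea: rules are defined per column or dataset, evaluated on a schedule, and the resulting scores feed dashboards and issue-management workflows rather than a `print` call.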

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Position: Talend Production Support
Location: Hyderabad (5 days' work from the office)
Shifts: Rotational
Education: BE, B.Tech, MCA, or M.Tech only
Interview rounds: 1st Technical (virtual), 2nd Technical (face-to-face), 3rd VP level (face-to-face)
Please share your updated resume with kjallepalli@innominds.com

Roles and Responsibilities:
- Participate in the on-call rotation for 24x7 production support.
- Provide technical support for the Talend ETL tool, ensuring timely resolution of issues and excellent customer service.
- Troubleshoot complex problems related to data integration, processing, and quality control using SQL queries and debugging techniques.
- Collaborate with cross-functional teams to identify root causes of errors and implement solutions that meet business requirements.
- Develop expertise in multiple modules within the application, including Open Studio, Data Quality, Data Management, and more.

Please share the details below:
- Full name
- Contact number
- Email id
- Current company
- Reason for the change (in detail)
- Number of companies worked with
- Current CTC
- Expected CTC
- Any offers in hand / pipeline
- Notice period
- Education details
- Current location
- Preferred location

Regards,
Kiran Kumar Jallepalli (please connect with me on LinkedIn)

Posted 1 month ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Mantas Scenario Developer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Develop and support scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class degree in Engineering/Technology (4-year graduate course)
- 3 to 4 years' experience implementing data-intensive solutions using agile methodologies
- Experience with relational databases and using SQL for data querying, transformation, and manipulation
- Experience modelling data for analytical consumers
- Hands-on Mantas (Oracle FCCM) Scenario Development experience throughout the full development life cycle
- Ability to automate and streamline the build, test, and deployment of data pipelines
- Experience with cloud-native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience building data pipelines; proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica
- Mantas: Expert in Oracle Mantas/FCCM, Scenario Manager, and Scenario Development; thorough knowledge and hands-on experience with Mantas FSDM, DIS, and Batch Scenario Manager
- Big Data: Exposure to 'big data' platforms such as Hadoop, Hive, or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of data warehousing concepts; relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala
- DevOps: Exposure to concepts and enablers: CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc.; demonstrable understanding of the underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls
- Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
- File Formats: Exposure to event/file/table formats such as Avro, Parquet, Iceberg, and Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management
- Certification on any of the above topics would be an advantage
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Data Analysis & Interpretation
Good-to-have skills: Data Engineering
Minimum Experience: 5 years
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing insights and recommendations to enhance application functionality and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: Advanced proficiency in Snowflake Data Cloud technology, DBT, and cloud data warehousing
- Good-to-have skills: Experience with Talend
- Strong analytical skills to interpret complex data sets.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Cloud technology.
- A minimum of 15 years of full-time education is required.

Posted 1 month ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Chennai

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed; Technical Test performance

Mandatory Skills: Snowflake.
Experience: 5-8 Years.

Posted 1 month ago

Apply

5.0 - 8.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed; Technical Test performance

Mandatory Skills: Talend DI.
Experience: 5-8 Years.

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed; Technical Test performance

Mandatory Skills: Snowflake.
Experience: 5-8 Years.

Posted 1 month ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Pune

Work from Office

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com

Mandatory Skills: Tableau. Experience: 5-8 years. Expertise in Tableau, including proficiency in Advanced Analytics. Proven experience in developing and working on Tableau-driven dashboards and analytics. Experience working with large and complex data sets in Tableau while maximizing workbook performance. Ability to write queries to create calculated fields. Working knowledge of Tableau Architecture. Knowledge of interactive data visualization best practices. Understanding and hands-on use of ETL tooling (Tableau Prep, Talend, etc.) and replication processes to centralize and standardize data coming from single or multiple sources. Solid understanding of RDBMS and data modelling with hands-on experience in SQL. Empirical understanding of the data systems and processes within the ecosystem (source systems, data feeds, reference data, data blending, data warehouse and marts, data ETL). Knowledge and experience with the full SDLC and Agile development methodologies. Exposure to the Big Data ecosystem.

Additional Skills: Hands-on experience with the Tableau REST and JavaScript APIs. Experience with a scripting language (Python, JavaScript). CI/CD pipelines (Harness, Jenkins). Working knowledge of Tableau Administration. Experience working in the finance business domain, among others. Experience and desire to work in a diverse, multi-stakeholder, global delivery environment.
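The listing above calls for hands-on experience with the Tableau REST API. A minimal sketch of composing the documented sign-in request for that API follows; the server URL, site name, credentials, and API version here are placeholder assumptions, not values from the listing.

```python
# Hedged sketch: composing a Tableau Server REST API sign-in request body.
# The endpoint path and JSON shape follow Tableau's documented REST API;
# server URL, site name, and credentials are illustrative placeholders.
import json

API_VERSION = "3.19"  # assumption: match your server's supported REST version

def build_signin_payload(username: str, password: str, site_content_url: str = "") -> dict:
    """Return the JSON body for POST /api/{version}/auth/signin."""
    return {
        "credentials": {
            "name": username,
            "password": password,
            "site": {"contentUrl": site_content_url},
        }
    }

def signin_url(server: str) -> str:
    return f"{server}/api/{API_VERSION}/auth/signin"

payload = build_signin_payload("analyst", "secret", "finance")
print(signin_url("https://tableau.example.com"))
print(json.dumps(payload))
```

In practice the payload would be POSTed with an HTTP client and the returned token passed in the `X-Tableau-Auth` header on subsequent calls.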

Posted 1 month ago

Apply

3.0 - 10.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Snowflake Must Have:
- Relevant experience: 3-10 years
- Snowflake development experience (not migration)
- SQL: basic and advanced SQL is a must (joins, null handling, performance tuning, windowing functions such as PARTITION BY, RANK, etc.)
- Know-how of architecture
- Stored procedures
- ETL/ELT/ETLT, pipelines
- Basic Python scripting
- Snowflake features: Time Travel, zero-copy cloning, data sharing
Good to Have: advanced Python, DBT
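The windowing functions named in this listing are a common interview topic. The sketch below states the technique as Snowflake-style SQL, then emulates `RANK() OVER (PARTITION BY ... ORDER BY ...)` in plain Python on sample rows; the employee data is invented for illustration.

```python
# Hedged sketch: RANK() with PARTITION BY, shown as SQL and emulated in Python.
from itertools import groupby

SQL = """
SELECT emp, dept, salary,
       RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk
FROM employees;
"""

rows = [
    ("ana", "eng", 120), ("bob", "eng", 120), ("carol", "eng", 100),
    ("dan", "ops", 90),  ("eve", "ops", 80),
]

def rank_by_dept(rows):
    out = []
    # Partition by dept, order by salary descending within each partition.
    for dept, grp in groupby(sorted(rows, key=lambda r: (r[1], -r[2])),
                             key=lambda r: r[1]):
        rank, prev, seen = 0, None, 0
        for emp, _, salary in grp:
            seen += 1
            if salary != prev:
                rank = seen          # RANK() leaves gaps after ties
                prev = salary
            out.append((emp, dept, salary, rank))
    return out

print(rank_by_dept(rows))
```

Note the tie handling: ana and bob share rank 1, so carol gets rank 3, which is exactly the gap behavior that distinguishes `RANK()` from `DENSE_RANK()`.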

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary: Senior Analyst, Big Data Analytics & Engineering. Overview: Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India). About Mastercard: Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all. Position Overview: This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard’s internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. 
The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact. Role Responsibilities Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation. Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts. Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. 
Experience with database management systems and data warehousing solutions. Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI. Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making. Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams. Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value. Education Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred. Why Us? At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives. 
Location: Gurgaon/Pune, India. Employment Type: Full-Time. Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard’s security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-249139
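The extraction, transformation, and loading (ETL) duties this listing describes can be sketched end to end with only the standard library; here an in-memory CSV feed and a sqlite3 table stand in for the source feed and the warehouse, and the column names and the null-amount rule are illustrative assumptions.

```python
# Hedged sketch: a minimal extract-transform-load pass of the kind the role
# describes. sqlite3 stands in for the warehouse; schema is illustrative.
import csv
import io
import sqlite3

feed = io.StringIO("account,amount\nA1,100.5\nA2,\nA1,49.5\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (account TEXT, amount REAL)")

# Extract + transform: enforce a data-integrity rule and cast types.
clean = []
for row in csv.DictReader(feed):
    if not row["amount"]:          # drop rows with null amounts
        continue
    clean.append((row["account"], float(row["amount"])))

# Load, then verify with an aggregation check.
conn.executemany("INSERT INTO spend VALUES (?, ?)", clean)
total = conn.execute("SELECT ROUND(SUM(amount), 2) FROM spend").fetchone()[0]
print(total)
```

The closing aggregation check is the simplest form of the data-integrity validation the role calls for: the loaded total should reconcile with the cleaned feed.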

Posted 1 month ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

The Testing Sr Analyst is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new techniques and the improvement of processes and workflow for the area or function. Integrates subject matter and industry expertise within a defined area. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the function and overall business. Evaluates moderately complex and variable issues with substantial potential impact, where development of an approach/taking of an action involves weighing various alternatives and balancing potentially conflicting situations using multiple sources of information. Requires good analytical skills in order to filter, prioritize and validate potentially complex and dynamic material from multiple sources. Strong communication and diplomacy skills are required. Regularly assumes informal/formal leadership role within teams. Involved in coaching and training of new recruits. Significant impact in terms of project size, geography, etc. by influencing decisions through advice, counsel and/or facilitating services to others in area of specialization. Work and performance of all teams in the area are directly affected by the performance of the individual. Responsibilities: Supports initiatives related to User Acceptance Testing (UAT) process and product rollout into production. Testing specialists work with technology project managers, UAT professionals and users to design and implement appropriate scripts/plans for an application testing strategy/approach. Tests and analyzes a broad range of systems and applications to ensure they meet or exceed specified standards and end-user requirements. Works closely with key stakeholders to understand business and functional requirements to develop test plans, test cases and scripts. Works complex testing assignments. 
Executes test scripts according to application requirements documentation. Identifies defects and recommends appropriate course of action; performs root cause analyses. Coordinates multiple testers and testing activities within a project. Retests after corrections are made to ensure problems are resolved. Documents, evaluates and researches test results for future replication. Identifies, recommends and implements process improvements to enhance testing strategies. Analyzes requirements and design aspects of projects. Interfaces with client leads and development teams. Exhibits sound understanding of concepts and principles in own technical area and a basic knowledge of these elements in other areas. Requires in-depth understanding of how own area integrates within IT testing and has basic commercial awareness. Makes evaluative judgments based on analysis of factual information in complicated and novel situations. Participates in test strategy meetings. Has direct impact on the team and closely related teams by ensuring the quality of the tasks, services, and information provided by self and others. Requires sound and comprehensive communication and diplomacy skills to exchange complex information. Provides metrics related to the cost, effort, and milestones of Quality activities on a project level. Acts as advisor and mentor for junior members of the team. Regularly assumes informal/formal leadership role within teams. Performs other duties and functions as assigned. Appropriately assesses risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. 
Qualifications: 10+ years Testing Analyst experience. Familiarity with the Software Development Lifecycle (SDLC) and how Quality Assurance methodology fits into the SDLC. Knowledge of relevant operating systems, languages and database tools. Knowledge of defect tracking systems and processes, including change management. Knowledge of automated regression testing tools. Experience testing trading platforms or similar software. Ability to work under pressure during tight deadlines. Requires a methodical approach to testing and problem solving. Requires theoretical and analytical skills, with demonstrated ability in planning and operations. Excellent communication and stakeholder management skills with a proactive attitude, always seeking opportunities to add value. Specific software languages will be dependent on area of business. Education: Bachelor’s/University degree or equivalent experience. We are seeking a highly skilled ETL Automation Tester with strong expertise in complex SQL, file-to-database validation, and data quality assurance. The ideal candidate will have hands-on experience validating various feed file formats (.csv, .json, .xls), and be comfortable with automation frameworks and tools for enhancing test efficiency. This role involves close collaboration with developers, data engineers, and stakeholders to ensure the integrity, consistency, and quality of data across our systems. Experience in testing reporting systems with Cognos/Tableau is required. Key Responsibilities: Lead the end-to-end validation of ETL processes, including data extraction, transformation, and loading validation across large volumes of structured and semi-structured data. Drive data quality assurance initiatives by defining test strategy, creating comprehensive test plans, and executing test cases based on data mapping documents and transformation logic. Validate file-based feeds (.csv, .json, .xls, etc.) by ensuring accurate ingestion into target data warehouse environments. 
Develop and optimize complex SQL queries to perform deep data audits, aggregation checks, and integrity validations across staging and warehouse layers. Own the defect lifecycle using tools like JIRA, providing high-quality defect reporting and traceability across all testing cycles. Collaborate with business analysts, developers, and data architects to ensure test alignment with business expectations and technical design. Perform report-level validations in tools such as Cognos or Tableau, ensuring consistency between backend data and visual representations. Mentor junior testers, review test artifacts, and guide the team in best practices for ETL testing and documentation. Contribute to QA process improvements, testing templates, and governance initiatives to standardize data testing practices across projects. Required Skills: Strong hands-on experience in ETL and data warehouse testing. Advanced proficiency in SQL and strong experience with RDBMS technologies (Oracle, SQL Server, PostgreSQL, etc.). In-depth experience with file-to-database validation and knowledge of various data formats. Proven track record in test strategy design, test planning, and defect management for large-scale data migration or ETL initiatives. Experience with ETL tools like Talend, or custom data processing scripts (tool-specific expertise not mandatory). Strong understanding of data modeling concepts, referential integrity, and transformation rules. Familiarity with Agile methodologies and experience working in fast-paced environments with iterative delivery. Excellent communication, stakeholder management, and documentation skills. Good to Have: Exposure to BI tools like Cognos, Tableau, etc. for end-user report validation. Prior experience in validating front-end UI connected to data dashboards or reports. 
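The file-to-database validation described above can be sketched in a few lines: reconcile a .csv feed against its target table on row count and a column checksum. sqlite3 stands in for the RDBMS, and the table, columns, and data are illustrative assumptions.

```python
# Hedged sketch of file-to-database validation: compare a CSV feed against
# its target table on row count and a SUM() checksum. Schema is illustrative.
import csv
import io
import sqlite3

feed = io.StringIO("id,amount\n1,10\n2,20\n3,30\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO target VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])

# Source-side metrics from the feed file.
rows = list(csv.DictReader(feed))
src_count, src_sum = len(rows), sum(int(r["amount"]) for r in rows)

# Target-side metrics from the loaded table.
tgt_count, tgt_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM target"
).fetchone()

assert (src_count, src_sum) == (tgt_count, tgt_sum), "feed/table mismatch"
print("reconciled:", src_count, "rows, checksum", src_sum)
```

Real test suites extend the same pattern to per-column checksums, duplicate checks, and transformation-rule audits across staging and warehouse layers.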
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Technology Quality ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 month ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

🔍 Hiring: Big Data Engineer (AWS + SQL Expertise) 📍 Locations: Chennai (Primary), Gurugram, Pune 💼 Experience Level: A – 6 years | AC – 8 years ✅ Key Skills Required (Must Have): Cloud: AWS Big Data Stack – S3, Glue, Athena, EMR Programming: Python, Spark, SQL, Mulesoft, Talend, dbt Data Warehousing & ETL: Redshift / Snowflake, ETL/ELT pipeline development Data Handling: Structured & semi-structured data transformation Process & Optimization: Data ingestion, transformation, performance tuning ✅ Preferred Skills (Good to Have): AWS Data Engineer Certification Experience with Spark, Hive, Kafka, Kinesis, Airflow Familiarity with ServiceNow, Jira (ITSM tools) Data modeling experience 🎯 Key Responsibilities: Build scalable data pipelines (ETL/ELT) for diverse data sources Clean, validate, and transform data for business consumption Develop data model objects (views, tables) for downstream usage Optimize storage and query performance in AWS environments Collaborate with business & technical stakeholders Support code reviews, documentation, and incident resolution. 🎓 Qualifications: Bachelor's in Computer Science, IT, or related field (or equivalent experience) Strong problem-solving, communication, and documentation skills
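The "structured & semi-structured data transformation" skill above typically means flattening nested JSON events into tabular rows before loading them to Redshift or Snowflake. A minimal sketch follows; the order-event shape is invented for illustration.

```python
# Hedged sketch: flattening a semi-structured JSON record into tabular rows,
# a typical transform step in an S3 -> Glue -> warehouse flow. The event
# shape and field names are illustrative assumptions.
import json

raw = ('{"order_id": 7, "customer": {"id": "C9", "region": "south"}, '
       '"items": [{"sku": "X", "qty": 2}, {"sku": "Y", "qty": 1}]}')

def flatten(record: dict):
    """Yield one output row per line item, repeating parent fields."""
    for item in record["items"]:
        yield {
            "order_id": record["order_id"],
            "customer_id": record["customer"]["id"],
            "region": record["customer"]["region"],
            "sku": item["sku"],
            "qty": item["qty"],
        }

rows = list(flatten(json.loads(raw)))
print(rows)
```

The same denormalization logic carries over directly to a PySpark `explode` on the `items` array when the feed volume demands distributed processing.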

Posted 1 month ago

Apply

12.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Skill: Technical Business Analyst. Exp: 10+ Yrs. Location: Gurgaon. Key Responsibilities: Requirements Gathering & Analysis – Engage with stakeholders (business users, risk, compliance, operations) to elicit detailed functional and non-functional requirements. – Translate complex banking processes into clear user stories, process flows, and specification documents. Solution Design & Validation – Collaborate with data engineers, architects, and BI teams to define data models, ETL pipelines, and integration points. – Validate technical designs against business needs and regulatory requirements (e.g., KYC, AML, Basel norms). Development Support – Leverage your hands-on development experience (SQL, Python/Java) to prototype data extracts, transformations, and reports. – Assist the development team with code reviews, test data setup, and troubleshooting. Testing & Quality Assurance – Define acceptance criteria; design and execute system, integration, and user-acceptance test cases. – Coordinate defect triage and ensure timely resolution. Documentation & Training – Maintain up-to-date functional specifications, data dictionaries, and user guides. – Conduct workshops and training sessions for users and support teams. Project & Stakeholder Management – Track project deliverables, highlight risks and dependencies, and communicate progress. – Act as the primary point of contact between business, IT, and external vendors. Required Skills & Experience: Domain Expertise – 10–12 years’ experience in banking (retail, corporate, or investment) with focus on data-driven initiatives. – Strong understanding of banking products (loans, deposits, payments) and regulatory landscape. Technical Proficiency – Hands-on development background: advanced SQL; scripting in Python or Java. – Experience designing and supporting ETL processes (Informatica, Talend, or equivalent). – Familiarity with data warehousing concepts, dimensional modeling, and metadata management. 
Exposure to cloud data services (AWS Redshift, Azure Synapse, GCP BigQuery) is a plus. Analytical & Process Skills: – Solid experience in data profiling, data quality assessment, and root-cause analysis. – Comfortable with Agile methodologies; adept at sprint planning and backlog management. Communication & Collaboration: – Excellent verbal and written skills; able to explain technical concepts to non-technical audiences. – Proven track record of stakeholder management at all levels.
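The data-profiling step mentioned above reduces, at its simplest, to per-column null counts and distinct counts over a sample. A minimal sketch with invented loan-book columns:

```python
# Hedged sketch: basic data profiling (null count and distinct count per
# column). Column names and sample rows are illustrative assumptions.
from collections import defaultdict

rows = [
    {"loan_id": "L1", "product": "mortgage", "branch": None},
    {"loan_id": "L2", "product": "mortgage", "branch": "N01"},
    {"loan_id": "L3", "product": None,       "branch": "N01"},
]

def profile(rows):
    nulls, distinct = defaultdict(int), defaultdict(set)
    for row in rows:
        for col, val in row.items():
            if val is None:
                nulls[col] += 1       # data-quality signal: missing value
            else:
                distinct[col].add(val)
    return {c: {"nulls": nulls[c], "distinct": len(distinct[c])}
            for c in rows[0]}

print(profile(rows))
```

In a real engagement the same metrics would be computed in SQL against source tables and compared against thresholds agreed with risk and compliance stakeholders.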

Posted 1 month ago

Apply

5.0 years

2 - 9 Lacs

Mumbai Metropolitan Region

On-site

Senior MIS Analyst Industry: Digital Transformation & Analytics Consulting We empower enterprises to unlock data-driven efficiency by building robust Management Information Systems (MIS) that turn raw operational data into strategic insight. Join our on-site analytics hub in India and steer mission-critical reporting initiatives end-to-end. Role & Responsibilities Own the complete MIS lifecycle—data extraction, transformation, validation, visualization, and scheduled distribution. Design automated dashboards and reports in Excel/Power BI that track KPIs, SLAs, and cost metrics for cross-functional leadership. Write optimized SQL queries and ETL scripts to consolidate data from ERP, CRM, and cloud platforms into a single reporting warehouse. Establish strong data governance, ensuring integrity, version control, and auditability of all reports. Collaborate with finance, operations, and technology teams to gather requirements, translate into reporting specs, and deliver within committed timelines. Mentor junior analysts on advanced Excel functions, VBA macros, and visualization best practices. Skills & Qualifications Must-Have Bachelor’s degree in Information Systems, Computer Science, or equivalent. 5+ years professional experience in MIS or Business Intelligence. Expert-level Excel with pivots, Power Query, and VBA scripting. Proficiency in SQL and relational databases (MySQL/SQL Server/PostgreSQL). Hands-on building interactive dashboards in Power BI or Tableau. Demonstrated ability to translate raw data into executive-ready insights. Preferred Experience with ETL tools (Informatica, Talend) or Python pandas. Knowledge of cloud data stacks (Azure Synapse, AWS Redshift, or GCP BigQuery). Understanding of statistical methods for forecasting and trend analysis. Benefits & Culture Highlights High-ownership role with direct visibility to C-suite decision makers. Continuous learning budget for BI certifications and advanced analytics courses. 
Collaborative, innovation-first culture that rewards data-driven thinking. Apply now to transform complex datasets into strategic clarity and accelerate your analytics career. Skills: business intelligence, IMS, automation, SQL, FMS, Google Sheets, data governance, VBA, Excel, ETL, analytics, PMS, data visualization, dashboard design, data analysis, Looker, Power BI, dashboards

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description Hiring Locations: Chennai, Trivandrum, Kochi. Experience Range: 3 to 6 years. The L1 Data Ops Analyst / Data Pipeline Developer is responsible for developing, testing, and maintaining robust data pipelines and monitoring operational dashboards to ensure smooth data flow. This role demands proficiency in data engineering tools, SQL, and cloud platforms, with the ability to work independently and in 24x7 shift environments. The candidate should be capable of analyzing data, troubleshooting issues using SOPs, and collaborating effectively across support levels. Key Responsibilities: Development & Engineering: Design, code, test, and implement scalable and efficient data pipelines. Develop features in accordance with requirements and low-level design. Write optimized, clean code using Python, PySpark, SQL, and ETL tools. Conduct unit testing and validate data integrity. Maintain comprehensive documentation of work. Monitoring & Support: Monitor dashboards, pipelines, and databases across assigned shifts. Identify, escalate, and resolve anomalies using defined SOPs. Collaborate with L2/L3 teams to ensure timely issue resolution. Analyze trends and anomalies using SQL and Excel. Process Adherence & Contribution: Follow configuration and release management processes. Participate in estimation, knowledge sharing, and defect management. Adhere to SLA and compliance standards. Contribute to internal documentation and knowledge bases. Mandatory Skills: Strong command of SQL for data querying and analysis. Proficiency in Python or PySpark for data manipulation. Experience in ETL tools (any of the following): Informatica, Talend, Apache Airflow, AWS Glue, Azure ADF, GCP Dataproc/Dataflow. Experience working with cloud platforms (AWS, Azure, or GCP). Hands-on experience with data validation and performance tuning. Working knowledge of data schemas and data modeling. 
Good To Have Skills Certification in Azure, AWS, or GCP (foundational or associate level). Familiarity with monitoring tools and dashboard platforms. Understanding of data warehouse concepts. Exposure to BigQuery, ADLS, or similar services. Soft Skills Excellent written and verbal communication in English. Strong attention to detail and analytical skills. Ability to work in a 24x7 shift model, including night shifts. Ability to follow SOPs precisely and escalate issues appropriately. Self-motivated with minimal supervision. Team player with good interpersonal skills. Outcomes Expected Timely and error-free code delivery. Consistent adherence to engineering processes and release cycles. Documented and trackable issue handling with minimal escalations. Certification and training compliance. High availability and uptime of monitored pipelines and dashboards. Skills Sql,Data Analysis,Ms Excel,Dashboards
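The data-validation duty above, applied at L1 in the spirit of an SOP, amounts to checking incoming rows against an expected schema before loading and escalating anything that fails. A minimal sketch with an invented schema:

```python
# Hedged sketch: schema validation of incoming pipeline rows before load.
# The expected schema and sample rows are illustrative assumptions.
EXPECTED = {"event_id": int, "source": str, "value": float}

def validate(row: dict):
    """Return a list of validation errors; an empty list means the row passes."""
    errors = []
    for col, typ in EXPECTED.items():
        if col not in row:
            errors.append(f"missing {col}")
        elif not isinstance(row[col], typ):
            errors.append(f"{col}: expected {typ.__name__}")
    return errors

good = {"event_id": 1, "source": "crm", "value": 3.5}
bad = {"event_id": "1", "source": "crm"}

print(validate(good))
print(validate(bad))
```

Rows with a non-empty error list would be routed to a quarantine table and escalated to L2/L3 per the SOP rather than silently dropped.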

Posted 1 month ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role Grade Level (for internal use): 09 S&P Global Mobility The Role: ETL Developer The Team The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs along with unit testing, integration testing, regression testing, deployments & production operations. The team has an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar and innovation is what the team runs on! The Impact The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders. What’s In It For You Constant learning, working in a dynamic and challenging environment! Total Rewards. Monetary, beneficial, and developmental rewards! Work Life Balance. You can't do a good job if your job is all you do! Diversity & Inclusion. HeForShe! Internal Mobility. Grow with us! Responsibilities Using prior experience with file loading, cleansing and standardization, should be able to translate business requirements into ETL design and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred). Knowledge of TIBCO would be a preferred skill as well. Understand relational database technologies and data warehousing concepts and processes. 
Using prior experience with high-volume data processing, be able to deal with complex technical issues. Works closely with all levels of management and employees across the Automotive business line. Participates as part of cross-functional teams responsible for investigating issues, proposing solutions and implementing corrective actions. Good communication skills required for interfacing with various stakeholder groups; detail oriented with analytical skills. What We’re Looking For The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development & operations efforts in the ETL (Informatica) domain. Primary Skills And Qualifications Required Experience with Informatica and/or Talend ETL tools. Bachelor’s degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ year of SQL experience. 3+ years of Informatica design and architecture experience and 1+ years of optimization and performance tuning of ETL code on Informatica. 1+ years of Python development experience and SQL, XML experience. Working knowledge of cloud-based technologies, development, and operations is a plus. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. 
Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing the energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings (Strategic Workforce Planning)

Job ID: 316976 | Posted On: 2025-06-25 | Location: Gurgaon, Haryana, India

Posted 1 month ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Responsibilities:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
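The list above asks for hands-on experience implementing CDC or SCD type 2. As a minimal illustration (not any team's actual pipeline), here is a pure-Python sketch of the Type 2 merge logic; in a real Snowflake/DBT stack this would typically be a MERGE statement or a dbt snapshot, and the table and column names below are hypothetical.

```python
from datetime import date

def scd2_merge(dimension, incoming, key, tracked, load_date):
    """Apply Slowly Changing Dimension Type 2 logic to an in-memory table.

    `dimension` is a list of dicts carrying `effective_date`, `end_date`,
    and `is_current` bookkeeping columns; `incoming` is the staged source
    extract. Changed rows are closed out and a new current version is
    appended, so full history is preserved.
    """
    current_by_key = {r[key]: r for r in dimension if r["is_current"]}
    for src in incoming:
        cur = current_by_key.get(src[key])
        changed = cur is None or any(cur[c] != src[c] for c in tracked)
        if not changed:
            continue
        if cur is not None:            # expire the previous version
            cur["end_date"] = load_date
            cur["is_current"] = False
        dimension.append({             # insert the new current version
            key: src[key],
            **{c: src[c] for c in tracked},
            "effective_date": load_date,
            "end_date": None,
            "is_current": True,
        })
    return dimension
```

The same compare-expire-insert shape is what a warehouse-side MERGE expresses declaratively; here it is spelled out row by row for clarity.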
Good to have: experience with repository tools such as GitHub/GitLab or Azure Repos.

Qualifications/Minimum Qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Lead Consultant | Primary Location: India-Hyderabad | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jun 24, 2025, 10:43:09 PM | Unposting Date: Dec 22, 2025, 2:43:09 AM | Master Skills List: Digital | Job Category: Full Time

Posted 1 month ago

Apply

0 years

5 - 9 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
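The Airflow requirement above boils down to expressing ETL jobs as a DAG and running them in dependency order. As a toy illustration of that idea (not Airflow itself), Python's stdlib `graphlib` module can compute such an order; the task names below are invented for the example.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def run_order(dependencies):
    """Return one valid execution order for an ETL DAG.

    `dependencies` maps each task to the set of tasks it depends on,
    mirroring how an orchestrator such as Airflow sequences jobs so
    that every task runs only after all of its upstreams.
    """
    return list(TopologicalSorter(dependencies).static_order())
```

In a real deployment the same dependency declaration would live in an Airflow DAG file (`task_a >> task_b`), with the scheduler handling retries, backfills, and SLAs on top of this ordering.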
Good to have: experience with repository tools such as GitHub/GitLab or Azure Repos.

Qualifications/Minimum Qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Principal Consultant | Primary Location: India-Hyderabad | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jun 24, 2025, 11:18:37 PM | Unposting Date: Ongoing | Master Skills List: Digital | Job Category: Full Time

Posted 1 month ago

Apply

0 years

3 - 8 Lacs

Hyderābād

On-site

Job Description:

Job Purpose
Intercontinental Exchange, Inc. (ICE) presents a unique opportunity to work with cutting-edge technology to provide solutions to business challenges in the financial sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. We are seeking an Integration Developer to join our collaborative Enterprise Information Management team to support the delivery of solutions to various business organizations. This candidate will be a significant part of the Integration team, supporting cross-system application and data integrations. The candidate will be working with a team of experts in data, ETL, and integrations. This position requires technical proficiency as well as an eager attitude, professionalism, and solid communication skills. An Integration Developer will be a member of the team who drives strategy for tools and development. This person will not have direct reports.

Responsibilities
- Build, maintain, and support applications in a global software platform and various other corporate systems, tools, and scripts
- Collaborate with other internal groups to translate business and functional requirements into technical implementation for the automation of existing processes and the development of new applications
- Communicate with internal customers in non-technical terms, understand business requirements, and propose solutions
- Manage projects from specification gathering, to development, to QA, user acceptance testing, and deployment to production
- Document changes and follow proper SDLC procedures
- Enhance the team and coworkers through knowledge sharing and by implementing best practices in day-to-day activities
- Take initiative to continually learn and enhance technical knowledge and skills

Knowledge and Experience
- BS degree, preferably in CS or EE or a related discipline
- 2–3 years' experience as an integration developer using applications such as Talend, MuleSoft, or similar
- Familiarity with building multi-threaded applications, and some understanding of distributed systems such as Kafka and RabbitMQ
- Experience in developing REST-based services
- Familiarity with different data formats such as JSON and XML
- High proficiency in RDBMS concepts and SQL
- Understanding of design patterns and object-oriented design concepts
- Experience with deployment automation tools such as Jenkins, Artifactory, and Maven
- Strong written and verbal communication skills
- Ability to multitask and work independently on multiple projects

Preferred
- Linux, Bash, SSH familiarity
- Experience with applications such as Salesforce, ServiceNow, ORMB, and other financial applications
- Financial industry expertise
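Several of the skills listed above (REST-based services, JSON and XML formats) amount to translating payloads between systems. Here is a minimal stdlib sketch of one such translation, offered purely as an illustration; the field names and `record` root tag are hypothetical.

```python
import json
import xml.etree.ElementTree as ET

def json_record_to_xml(payload, root_tag="record"):
    """Convert a flat JSON object (as a REST endpoint might return)
    into an XML string -- the kind of format bridging an integration
    developer routinely performs between systems."""
    data = json.loads(payload)
    root = ET.Element(root_tag)
    for field, value in data.items():
        child = ET.SubElement(root, field)   # one element per JSON field
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")
```

A tool like Talend or MuleSoft wraps this kind of mapping in visual components and connectors, but the underlying transformation is the same.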

Posted 1 month ago

Apply

3.0 years

6 - 9 Lacs

Noida

On-site

Job Description

Job ID: ANALY014365 | Employment Type: Regular | Work Style: On-site | Location: Noida, UP, India | Role: Analytics Consultant II

Company Overview:
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose — people — then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.

Job Description
The Analytics Consultant II (Level 2) is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and the UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer.
The Analytics Consultant II will also be responsible for developing custom analytics solutions and reports to the specifications provided, and for supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Responsibilities include:
- Interact with other business and technical project stakeholders to gather business requirements
- Deploy and configure the UKG Analytics and Data Hub products based on the design documents
- Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT, or Power BI
- Put together a test plan, validate the deployed solution, and document the results
- Provide support during production cutover, and after go-live act as the first level of support for any requests that come through from the customer or other consultants
- Analyse the customer's data to spot trends and issues and present the results back to the customer

Qualifications
- 3+ years' experience designing and delivering Analytical/Business Intelligence solutions required
- Cognos, BIRT, Power BI, or other business intelligence toolset experience required
- ETL experience using Talend or other industry-standard ETL tools strongly preferred
- Advanced SQL proficiency is a plus
- Knowledge of Google Cloud Platform, Azure, or something similar is desired, but not required
- Knowledge of Python is desired, but not required
- Willingness to learn new technologies and adapt quickly is required
- Strong interpersonal and problem-solving skills
- Flexibility to support customers in different time zones is required

Where we're going
UKG is on the cusp of something truly special.
Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKG is proud to be an equal-opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process. Disability Accommodation For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
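The SQL and ETL skills this posting emphasizes can be illustrated with a tiny stdlib round trip: stage rows into an in-memory SQLite table and aggregate them with SQL, the core loop behind most BI deliverables. The table and column names are invented for the example.

```python
import sqlite3

def top_spenders(rows, limit=2):
    """Tiny ETL round trip: load staged (customer, amount) rows into
    SQLite, then aggregate with SQL to rank customers by total spend."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    cur = con.execute(
        "SELECT customer, SUM(amount) AS total FROM orders "
        "GROUP BY customer ORDER BY total DESC LIMIT ?",
        (limit,),
    )
    return cur.fetchall()
```

A production pipeline would swap SQLite for a warehouse and wrap the query in a dashboard or report, but the extract-load-aggregate pattern is identical.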

Posted 1 month ago

Apply

0 years

0 Lacs

Calcutta

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Associate - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
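The posting above asks for proficiency with DBT's testing features. Conceptually, dbt's built-in `unique` and `not_null` schema tests reduce to checks like the pure-Python sketch below, which is an analogy for the idea rather than dbt's implementation; in dbt itself these are declared in YAML and compiled to SQL.

```python
def unique_not_null(rows, column):
    """Emulate the intent of dbt's `unique` and `not_null` schema tests:
    return the list of offending values (an empty list means the test
    passed for the given column)."""
    seen, failures = set(), []
    for row in rows:
        value = row.get(column)
        if value is None:
            failures.append(("not_null", value))   # null key violates not_null
        elif value in seen:
            failures.append(("unique", value))     # duplicate violates unique
        else:
            seen.add(value)
    return failures
```

Running such checks after every transformation is what lets a pipeline fail fast on bad data instead of publishing it downstream.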
Good to have: experience with repository tools such as GitHub/GitLab or Azure Repos.

Qualifications/Minimum Qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Senior Associate | Primary Location: India-Kolkata | Schedule: Full-time | Education Level: Bachelor's / Graduation / Equivalent | Job Posting: Jun 24, 2025, 8:23:36 AM | Unposting Date: Ongoing | Master Skills List: Digital | Job Category: Full Time

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
