585 Talend Jobs - Page 14

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

About UHG
UnitedHealth Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two platforms: UnitedHealthcare (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information- and technology-enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, develops broad-based and targeted analytics solutions across different verticals for all lines of business.

Primary Responsibilities
- Gather, analyze and document business requirements, leveraging knowledge of claims, clinical and other healthcare systems.
- Develop ETL jobs using Talend, Python, a cloud-based data warehouse, Jenkins, Kafka and an orchestration tool.
- Write advanced SQL queries.
- Create and interpret functional and technical specifications and design documents.
- Understand the business and how various data elements and subject areas are utilized in order to develop and deliver reports to the business.
- Be an SME on the claims, member or provider module.
- Provide regular status updates to senior management.
- Design, develop, and implement scalable, high-performing data models and solutions using Snowflake and Oracle.
- Manage and optimize data replication and ingestion processes using Oracle and Snowflake.
- Develop and maintain ETL pipelines using Azure Data Factory (ADF) and Databricks.
- Optimize query performance and reduce latency by leveraging pre-aggregated tables and efficient data processing techniques.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Implement data security measures and ensure compliance with industry standards.
- Automate data governance and security controls to maintain data integrity and compliance.
- Develop and maintain comprehensive documentation for data architecture, data flows, ETL processes, and configurations.
- Continuously optimize the performance of data pipelines and queries to improve efficiency and reduce costs.
- Apply a structured, standard approach to work.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree or 4-year university degree.
- 5+ years of experience.
- Experience developing ETL jobs using Snowflake, ADF, Databricks and Python.
- Experience writing efficient, advanced SQL queries.
- Experience both producing and consuming data using Kafka.
- Experience working on large-scale cloud-based data warehouses (Snowflake, Databricks).
- Good experience building data pipelines using ADF.
- Knowledge of Agile methodologies, roles, responsibilities and deliverables.
- Proficiency in Python for data processing and automation.
- Demonstrated ability to learn and adapt to new data technologies.

Preferred Qualifications
- Certification in Azure data engineering (e.g., DP-203).
- Extensive experience with Azure cloud services (Azure Data Factory, Azure Databricks, Azure SQL Database, etc.).
- Solid understanding of CI/CD principles and tools (e.g., Jenkins, GitLab CI/CD).
- Knowledge of SQL and NoSQL databases.
- Proven excellent time management, decision-making, and presentation skills.
- Proven problem-solving and communication skills.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
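The Kafka requirement above (producing and consuming data) can be made concrete with a short sketch. This is not Optum's code: it assumes the third-party kafka-python package, a broker at localhost:9092, and an invented topic and payload.

```python
# Minimal Kafka produce/consume sketch (assumes `pip install kafka-python`
# and a broker at localhost:9092; topic and payload are illustrative).
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# Publish one illustrative claims event.
producer.send("claims-events", {"claim_id": "C-1001", "status": "ADJUDICATED"})
producer.flush()

consumer = KafkaConsumer(
    "claims-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive
)
for message in consumer:
    print(message.value)  # e.g. {'claim_id': 'C-1001', 'status': 'ADJUDICATED'}
```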

Posted 2 weeks ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Source: Naukri

Senior Consultant, Technical - ETL + SQL Expertise

insightsoftware (ISW) is a growing, dynamic computer software company that helps businesses achieve greater levels of financial intelligence across their organization with our world-class financial reporting solutions. At insightsoftware, you will learn and grow in a fast-paced, supportive environment that will take your career to the next level. The Data Conversion Specialist is a member of the insightsoftware Project Management Office (PMO) who demonstrates teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude.

Location: Hyderabad (Work from Office - Hybrid)
Working Hours: 2:30 PM to 11:30 PM IST for 3 days, and 5:00 PM to 2:00 AM IST or 6:00 PM to 3:00 AM IST for 2 days; candidates should be comfortable working night shifts as required.

Position Summary
The Senior Consultant will integrate and map customer data from client source system(s) to our industry-leading platform. The role will include, but is not limited to:
- Using strong technical data migration, scripting, and organizational skills to ensure the client data is converted efficiently and accurately to the insightsoftware (ISW) platform.
- Performing extract, transform, load (ETL) activities to ensure accurate and timely data conversions.
- Providing in-depth research and analysis of complex scenarios to develop innovative solutions to meet customer needs whilst remaining within project governance.
- Mapping and maintaining business requirements to the solution design using tools such as requirements traceability matrices (RTM).
- Presenting findings, requirements, and problem statements for ratification by stakeholders and working groups.
- Identifying and documenting data gaps to allow change impact and downstream impact analysis to be conducted.

Qualifications
- Experience assessing data and analytic requirements to establish mapping rules from source to target systems to meet business objectives.
- Experience with real-time, batch, and ETL for complex data conversions.
- Working knowledge of extract, transform, load (ETL) methodologies and tools such as Talend, Dell Boomi, etc.
- Ability to use data mapping tools to prepare data for data loads based on target system specifications.
- Working experience with various data applications/systems such as Oracle SQL, Excel, .csv files, etc.
- Strong SQL scripting experience.
- Communicate with clients and/or the ISW Project Manager to scope, develop, test, and implement conversions/integrations.
- Effectively communicate with ISW Project Managers and customers to keep projects on target.
- Continually drive improvements in the data migration process.
- Collaborate via phone and email with clients and/or the ISW Project Manager throughout the conversion/integration process.
- Demonstrated collaboration and problem-solving skills.
- Working knowledge of software development lifecycle (SDLC) methodologies including, but not limited to, Agile, Waterfall, and others.
- Clear understanding of cloud and application integrations.
- Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously.
- Ensure client data is converted/integrated accurately and within deadlines established by the ISW Project Manager.
- Experience in customer SIT, UAT, migration and go-live support.

Additional Information
At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located.
Background checks are required for employment with insightsoftware, where permitted by country, state/province.
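To make the source-to-target mapping work described above concrete, here is a minimal, standard-library-only sketch of the pattern: read a client CSV extract, apply mapping and cleansing rules, and load a staging table. The file name, column names, mapping rules, and SQLite staging table are all hypothetical, not insightsoftware's actual tooling.

```python
# Source-to-target mapping sketch: CSV extract -> transform -> SQLite staging.
# Standard library only; file, columns, and mapping rules are illustrative.
import csv
import sqlite3

MAPPING = {"CustNo": "customer_id", "CustName": "customer_name", "Bal": "balance"}

conn = sqlite3.connect("staging.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS stg_customer ("
    "customer_id TEXT PRIMARY KEY, customer_name TEXT, balance REAL)"
)

with open("client_extract.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Rename source columns to target names per the mapping rules.
        target = {MAPPING[src]: value for src, value in row.items() if src in MAPPING}
        target["customer_name"] = target["customer_name"].strip().title()  # cleanse
        target["balance"] = float(target["balance"] or 0)  # default missing balances
        conn.execute(
            "INSERT OR REPLACE INTO stg_customer "
            "VALUES (:customer_id, :customer_name, :balance)",
            target,
        )
conn.commit()
conn.close()
```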

Posted 2 weeks ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Source: Naukri

Technical Consultant (ETL + SQL + Data Migration)

insightsoftware (ISW) is a growing, dynamic computer software company that helps businesses achieve greater levels of financial intelligence across their organization with our world-class financial reporting solutions. At insightsoftware, you will learn and grow in a fast-paced, supportive environment that will take your career to the next level. The Data Conversion Specialist is a member of the insightsoftware Project Management Office (PMO) who demonstrates teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude.

Location: Hyderabad (Work from Office)
Working Hours: 5:00 PM to 2:00 AM IST or 6:00 PM to 3:00 AM IST; candidates should be comfortable working night shifts as required.

Position Summary
The Consultant will integrate and map customer data from client source system(s) to our industry-leading platform. The role will include, but is not limited to:
- Using strong technical data migration, scripting, and organizational skills to ensure the client data is converted efficiently and accurately to the insightsoftware (ISW) platform.
- Performing extract, transform, load (ETL) activities to ensure accurate and timely data conversions.
- Providing in-depth research and analysis of complex scenarios to develop innovative solutions to meet customer needs whilst remaining within project governance.
- Mapping and maintaining business requirements to the solution design using tools such as requirements traceability matrices (RTM).
- Presenting findings, requirements, and problem statements for ratification by stakeholders and working groups.
- Identifying and documenting data gaps to allow change impact and downstream impact analysis to be conducted.

Qualifications
- Experience assessing data and analytic requirements to establish mapping rules from source to target systems to meet business objectives.
- Experience with real-time, batch, and ETL for complex data conversions.
- Working knowledge of extract, transform, load (ETL) methodologies and tools such as Talend, Dell Boomi, etc.
- Ability to use data mapping tools to prepare data for data loads based on target system specifications.
- Working experience with various data applications/systems such as Oracle SQL, Excel, .csv files, etc.
- Strong SQL scripting experience.
- Communicate with clients and/or the ISW Project Manager to scope, develop, test, and implement conversions/integrations.
- Effectively communicate with ISW Project Managers and customers to keep projects on target.
- Continually drive improvements in the data migration process.
- Collaborate via phone and email with clients and/or the ISW Project Manager throughout the conversion/integration process.
- Demonstrated collaboration and problem-solving skills.
- Working knowledge of software development lifecycle (SDLC) methodologies including, but not limited to, Agile, Waterfall, and others.
- Clear understanding of cloud and application integrations.
- Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously.
- Ensure client data is converted/integrated accurately and within deadlines established by the ISW Project Manager.
- Experience in customer SIT, UAT, migration and go-live support.

Additional Information
At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located. Background checks are required for employment with insightsoftware, where permitted by country, state/province.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Hybrid

Source: Naukri

Job Title: Talend Developer
Location: Pune, Chennai & Bangalore

Job Summary: We are seeking a skilled Talend Developer to join our team. The ideal candidate will have a strong background in data integration and ETL processes, with expertise in Talend tools. You will be responsible for designing, developing, and maintaining data integration solutions to support our business needs.

Key Responsibilities:
- Design and develop ETL processes using Talend to integrate data from various sources.
- Implement data transformation and cleansing using Talend components such as tMap, tJoin, and others.
- Manage input and output components for files and databases, ensuring seamless data flow.
- Develop error-handling mechanisms to ensure data integrity and reliability.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize Talend jobs for performance and scalability.
- Document ETL processes and maintain technical documentation.

Must-Have Skills:
- Proficiency in Talend Data Integration tools.
- Experience with input/output components for files and databases.
- Strong knowledge of transformation components like tMap and tJoin.
- Expertise in error handling within Talend jobs.
- Familiarity with Talend best practices and performance optimization techniques.
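Talend's tMap is a graphical component, but the lookup-join-with-reject pattern it implements can be sketched in plain Python for illustration. The row shapes below are invented, and this is an analogue of the pattern, not Talend code.

```python
# Rough Python analogue of a tMap lookup join with a reject flow:
# main rows that find a match in the lookup go to the output;
# rows with no match go to a reject list (illustrative data shapes).
orders = [
    {"order_id": 1, "customer_id": "C1", "amount": 120.0},
    {"order_id": 2, "customer_id": "C9", "amount": 75.5},   # no matching customer
]
customers = {"C1": {"customer_id": "C1", "name": "Acme Ltd"}}

output, rejects = [], []
for row in orders:
    match = customers.get(row["customer_id"])
    if match is None:
        rejects.append(row)  # tMap "inner join reject" equivalent
    else:
        output.append({**row, "customer_name": match["name"]})  # joined columns

print(output)   # [{'order_id': 1, ..., 'customer_name': 'Acme Ltd'}]
print(rejects)  # [{'order_id': 2, 'customer_id': 'C9', 'amount': 75.5}]
```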

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Source: Naukri

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must have: proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based in Hyderabad.
- 15 years of full-time education is required.
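As a rough illustration of the data-integration and data-warehousing pattern such a role oversees, here is a minimal staging-to-target upsert sketch in SQLite. The table and column names are hypothetical, and a Talend job would typically express the same logic with its own ELT components rather than hand-written SQL.

```python
# Minimal staging-to-target upsert in SQLite (needs SQLite >= 3.24 for
# ON CONFLICT); table and column names are illustrative of a common
# warehouse-load pattern, not a specific Talend job.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_product (product_id INTEGER PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, price REAL);
    INSERT INTO dim_product VALUES (1, 'Widget', 9.99);
    INSERT INTO stg_product VALUES (1, 'Widget', 10.49), (2, 'Gadget', 24.00);
""")

# Upsert: new keys are inserted, existing keys get refreshed attributes.
# (WHERE true disambiguates SELECT ... ON CONFLICT in SQLite's grammar.)
conn.execute("""
    INSERT INTO dim_product (product_id, name, price)
    SELECT product_id, name, price FROM stg_product WHERE true
    ON CONFLICT(product_id) DO UPDATE SET
        name = excluded.name,
        price = excluded.price
""")
conn.commit()
print(conn.execute("SELECT * FROM dim_product ORDER BY product_id").fetchall())
# [(1, 'Widget', 10.49), (2, 'Gadget', 24.0)]
```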

Posted 2 weeks ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking a skilled and experienced Cognos TM1 Developer with a strong background in ETL processes and Python development. The ideal candidate will be responsible for designing, developing, and supporting TM1 solutions, integrating data pipelines, and automating processes using Python. This role requires strong problem-solving skills, business acumen, and the ability to work collaboratively with cross-functional teams.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4+ years of hands-on experience with IBM Cognos TM1 / Planning Analytics.
- Strong knowledge of TI processes, rules, dimensions, cubes, and TM1 Web.
- Proven experience building and managing ETL pipelines (preferably with tools like Informatica, Talend, or custom scripts).
- Proficiency in Python programming for automation, data processing, and system integration.
- Experience with REST APIs, JSON/XML data formats, and data extraction from external sources.

Preferred technical and professional experience:
- Strong SQL knowledge and ability to work with relational databases.
- Familiarity with Agile methodologies and version control systems (e.g., Git).
- Excellent analytical, problem-solving, and communication skills.
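IBM Planning Analytics exposes TM1 objects over an OData REST API, which lines up with the Python and REST API requirements above. A hedged sketch using the requests package follows; the host, port, and credentials are placeholders, and it assumes a TM1 server with the REST API enabled.

```python
# List cube names from a TM1 / Planning Analytics server via its OData REST
# API. Host, port, and credentials are placeholders; requires `requests`.
import requests

BASE = "https://tm1-host:8010/api/v1"  # illustrative server address

resp = requests.get(
    f"{BASE}/Cubes?$select=Name",
    auth=("admin", "secret"),  # TM1 basic auth; placeholder credentials
    verify=False,              # only acceptable for a self-signed sandbox cert
    timeout=30,
)
resp.raise_for_status()
for cube in resp.json()["value"]:  # OData wraps collections in "value"
    print(cube["Name"])
```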

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

Source: LinkedIn

The ideal candidate is a highly skilled Production Support Engineer with at least 3 years of relevant experience and a strong focus on ETL and data warehousing. The candidate should have a good understanding of DevOps practices and ITIL concepts.

You will:
- Monitor the daily data pipeline runs and ensure timely data loads by proactively identifying and troubleshooting issues.
- Perform root cause analysis (RCA) to identify the underlying causes of issues in the data warehouse and ETL pipelines; document findings and implement corrective actions to prevent recurrence.
- Collaborate with various teams, including data engineers, DevOps engineers, architects, and business analysts, to resolve issues and implement improvements.
- Communicate effectively with stakeholders to provide updates on issue resolution and system performance.
- Maintain detailed documentation of data warehouse configurations, ETL processes, operational procedures, and issue resolutions.
- Participate in an on-call rotation and operate effectively in a global 24x7 environment.
- Ensure data integrity and accuracy, and take action to resolve data discrepancies.
- Generate regular reports on system performance, issues, and resolutions.

Your skills:
- Strong experience with Oracle databases and AWS cloud services.
- Proficiency in SQL and PL/SQL.
- Familiarity with monitoring tools such as Dynatrace, CloudWatch, etc.
- Familiarity with other AWS services and tasks such as account creation, VPC, CloudFront, IAM, ALB, EC2, RDS, Route 53, Auto Scaling, Lambda, etc.
- Experience with ETL tools and processes (e.g., Informatica, Talend, AWS Glue).
- Familiarity with scripting languages (e.g., Python, shell scripting).
- Familiarity with DevOps tools and practices (e.g., GitHub, Jenkins, Docker, Kubernetes).
- Strong analytical and problem-solving abilities.
- Experience performing root cause analysis and implementing corrective actions.
- Ability to work independently as well as in a collaborative team environment.
- Excellent written and verbal communication skills.
- Bachelor's degree in computer science, information technology, or a related field.
- Minimum of 3 years of experience in a support engineer role, preferably in data warehousing and ETL environments.
- Certification in AWS, Oracle, or relevant DevOps tools is a plus.

Your benefits: We offer a hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working, including up to 25 days per year working from abroad. We believe in rewarding performance, and our compensation and benefits package includes a company bonus scheme, pension, employee shares program and multiple employee discounts (details vary by location). From career development and digital learning programs to international career mobility, we offer lifelong learning for our employees worldwide and an environment where innovation, delivery and empowerment are fostered. Flexible working and health and wellbeing offers (including healthcare and parental leave benefits) support balancing family and career, and help our people return from career breaks with experience that nothing else can teach.

About Allianz Technology
Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry.
We oversee the full digitalization spectrum, from one of the industry's largest IT infrastructure projects, including data centers, networking and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age.

D&I statement
Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, or nationality.

Allianz Group is one of the most trusted insurance and asset management companies in the world. Caring for our employees, their ambitions, dreams and challenges is what makes us a unique employer. Together we can build an environment where everyone feels empowered and has the confidence to explore, to grow and to shape a better future for our customers and the world around us. We at Allianz believe in a diverse and inclusive workforce and are proud to be an equal opportunity employer. We encourage you to bring your whole self to work, no matter where you are from, what you look like, who you love or what you believe in. We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation. Join us. Let's care for tomorrow.
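As one illustration of the proactive pipeline monitoring this role describes, here is a small boto3 sketch that checks the latest AWS Glue job run. The job name, region, and alert threshold are hypothetical, and the actual environment may monitor different services entirely.

```python
# Proactive data-pipeline check: inspect the most recent AWS Glue job run and
# flag failures or overruns. Job name and threshold are illustrative; assumes
# boto3 is installed and AWS credentials are configured in the environment.
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

runs = glue.get_job_runs(JobName="daily-dwh-load", MaxResults=1)["JobRuns"]
if not runs:
    raise SystemExit("No runs found for daily-dwh-load")

latest = runs[0]
state = latest["JobRunState"]             # e.g. SUCCEEDED, FAILED, RUNNING
elapsed = latest.get("ExecutionTime", 0)  # seconds

if state == "FAILED":
    print(f"ALERT: load failed: {latest.get('ErrorMessage', 'no error message')}")
elif state == "RUNNING" and elapsed > 2 * 3600:
    print(f"ALERT: load running for {elapsed}s, exceeds the 2h threshold")
else:
    print(f"OK: state={state}, elapsed={elapsed}s")
```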

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position
In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions.

Position Overview
We are seeking an experienced ETL Architect to design, develop, and optimize data extraction, transformation, and loading (ETL) solutions, and to work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions. The person may lead technical squads. These solutions will be leveraged across Enterprise, Pharma and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This role requires deep expertise in Python, AWS Cloud, and ETL tools to build and maintain scalable data pipelines and architectures. The ETL Architect will work closely with cross-functional teams to ensure efficient data integration, storage, and accessibility for business intelligence and analytics.
Key Responsibilities
- ETL Design & Development: Architect and implement high-performance ETL pipelines using AWS cloud services, Snowflake, and ETL tools such as Talend, dbt, Informatica, ADF, etc.
- Data Architecture: Design and implement scalable, efficient, and cloud-native data architectures.
- Data Integration & Flow: Ensure seamless data integration across multiple source systems, leveraging AWS Glue, Snowflake, and other ETL tools.
- Performance Optimization: Monitor and tune ETL processes for performance, scalability, and cost-effectiveness.
- Governance & Security: Establish and enforce data quality, governance, and security standards for ETL processes.
- Collaboration: Work with data engineers, analysts, and business stakeholders to define data requirements and ensure effective solutions.
- Documentation & Best Practices: Maintain comprehensive documentation and promote best practices for ETL development and data transformation.
- Troubleshooting & Support: Diagnose and resolve performance issues, failures, and bottlenecks in ETL processes.

Required Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- Experience: 6+ years of experience in ETL development, with 3+ years in an ETL architecture role.
- Expertise in Snowflake or any MPP data warehouse (including Snowflake data modeling, optimization, and security best practices).
- Strong experience with AWS cloud services, especially AWS Glue, AWS Lambda, S3, Redshift, and IAM, or Azure/GCP cloud services.
- Proficiency in ETL tools such as Informatica, Talend, Apache NiFi, SSIS, or DataStage.
- Strong SQL skills and experience with relational and NoSQL databases.
- Experience in API integrations.
- Proficiency in scripting languages (Python, Shell, PowerShell) for automation.
- Prior experience in the pharmaceutical, diagnostics or healthcare domain is a plus.

Soft Skills
- Strong analytical and problem-solving abilities.
- Excellent communication and documentation skills.
- Ability to work collaboratively in a fast-paced, cloud-first environment.

Preferred Qualifications
- Certifications in AWS, Snowflake, or ETL tools.
- Experience in real-time data streaming, microservices-based architectures, and DevOps for data pipelines.
- Knowledge of data governance, compliance (GDPR, HIPAA), and security best practices.

Who we are
A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together.

Roche is an Equal Opportunity Employer.
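To ground the Snowflake-plus-ETL requirement, here is a brief sketch using the official snowflake-connector-python package; the account, credentials, stage, and table names are placeholders rather than anything from Roche's environment.

```python
# Load staged files into a Snowflake table with COPY INTO, then verify the
# row count. Account, credentials, stage, and table names are placeholders;
# assumes `pip install snowflake-connector-python`.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_SVC",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Bulk-load CSV files already uploaded to an internal stage.
    cur.execute("""
        COPY INTO RAW.PATIENT_VISITS
        FROM @ETL_STAGE/visits/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    cur.execute("SELECT COUNT(*) FROM RAW.PATIENT_VISITS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```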

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

We are seeking a Business Analyst focused on designing and implementing a Data Governance strategy and Master Data Management (MDM) framework. This role will support the high-level design and detailed design phases of a transformative project involving systems such as 3DS PLM, SAP, Teamcenter, and Blue Yonder. The ideal candidate will bring a blend of business analysis expertise, data governance knowledge, and automotive/manufacturing domain experience to drive workshops, map processes, and deliver actionable recommendations. Working closely with the GM of Master Data and MDM technical resources, you will play a pivotal role in aligning people, processes, and technology to achieve M&M's data governance and MDM objectives.

Key Responsibilities
- Requirements Gathering & Workshops: Lead and facilitate workshops with business and IT stakeholders to elicit requirements, define data governance policies, and establish MDM strategies for automotive-specific data domains (e.g., parts, engineering data, bill of material, service parts, supplier and dealer master data).
- Process Mapping & Design: Document and design master data-related processes, including data flows between systems such as 3DS, SAP, Talend, and Blue Yonder, ensuring alignment with business needs and technical feasibility.
- Analysis & Recommendations: Analyse existing data structures, processes, and system integrations to identify gaps and opportunities; provide clear, actionable recommendations to support the Data Governance and MDM strategy.
- Stakeholder Collaboration: Act as a bridge between business units, IT teams, and technical resources (e.g., 3DS specialists) to ensure cohesive delivery of the project objectives.
- Documentation & Communication: Create high-quality deliverables, including process maps, requirement specifications, governance frameworks, and summary reports, tailored to both technical and non-technical audiences.
- Support Detailed Design: Collaborate with the 3DS/Talend technical resource to translate high-level designs into detailed MDM solutions, ensuring consistency across people, process, and technology components.
- Project Support: Assist MDM leadership in planning, tracking, and executing project milestones, adapting to evolving client needs.

Required Skills & Qualifications
- Experience: 5+ years as a Business Analyst, with a focus on data governance and master data management (MDM) platforms such as Talend, Informatica, Reltio, etc. Proven track record of working on automotive/manufacturing industry projects, ideally with exposure to systems like 3DS, Teamcenter, SAP S/4HANA, MDG, or Blue Yonder.
- Technical Knowledge: Strong understanding of MDM concepts, data flows, and governance frameworks. Familiarity with auto-specific data domains (e.g., ECCMA/E-Class schema). Experience with process modelling tools (e.g., Visio, Lucidchart, or BPMN) and documentation standards.
- Soft Skills: Exceptional communication and facilitation skills, with the ability to engage diverse stakeholders and drive consensus in workshops. Methodical and structured approach to problem-solving and project delivery. Ability to summarize complex information into clear, concise recommendations.
- Education: Bachelor's degree in business, information systems, or a related field (or equivalent experience).
- Certifications: Relevant certifications (e.g., CBAP, PMP, or MDM-specific credentials) are a plus but not required.

Preferred Qualifications
- Prior consulting experience in a client-facing role.
- Hands-on experience with MDG, Talend, Informatica, Reltio, or similar MDM platforms.
- Exposure to data quality analysis or profiling (not required to be at a Data Analyst level).

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

We are seeking a Business Analyst focused on designing and implementing a Data Governance strategy and Master Data Management (MDM) framework. This role will support the high-level design and detailed design phases of a transformative project involving systems such as 3DS PLM, SAP, Teamcenter, and Blue Yonder. The ideal candidate will bring a blend of business analysis expertise, data governance knowledge, and automotive/manufacturing domain experience to drive workshops, map processes, and deliver actionable recommendations. Working closely with the GM of Master Data and MDM technical resources, you will play a pivotal role in aligning people, processes, and technology to achieve M&M's data governance and MDM objectives.

Key Responsibilities
- Requirements Gathering & Workshops: Lead and facilitate workshops with business and IT stakeholders to elicit requirements, define data governance policies, and establish MDM strategies for automotive-specific data domains (e.g., parts, engineering data, bill of material, service parts, supplier and dealer master data).
- Process Mapping & Design: Document and design master data-related processes, including data flows between systems such as 3DS, SAP, Talend, and Blue Yonder, ensuring alignment with business needs and technical feasibility.
- Analysis & Recommendations: Analyse existing data structures, processes, and system integrations to identify gaps and opportunities; provide clear, actionable recommendations to support the Data Governance and MDM strategy.
- Stakeholder Collaboration: Act as a bridge between business units, IT teams, and technical resources (e.g., 3DS specialists) to ensure cohesive delivery of the project objectives.
- Documentation & Communication: Create high-quality deliverables, including process maps, requirement specifications, governance frameworks, and summary reports, tailored to both technical and non-technical audiences.
- Support Detailed Design: Collaborate with the 3DS/Talend technical resource to translate high-level designs into detailed MDM solutions, ensuring consistency across people, process, and technology components.
- Project Support: Assist MDM leadership in planning, tracking, and executing project milestones, adapting to evolving client needs.

Required Skills & Qualifications
- Experience: 5+ years as a Business Analyst, with a focus on data governance and master data management (MDM) platforms such as Talend, Informatica, Reltio, etc. Proven track record of working on automotive/manufacturing industry projects, ideally with exposure to systems like 3DS, Teamcenter, SAP S/4HANA, MDG, or Blue Yonder.
- Technical Knowledge: Strong understanding of MDM concepts, data flows, and governance frameworks. Familiarity with auto-specific data domains (e.g., ECCMA/E-Class schema). Experience with process modelling tools (e.g., Visio, Lucidchart, or BPMN) and documentation standards.
- Soft Skills: Exceptional communication and facilitation skills, with the ability to engage diverse stakeholders and drive consensus in workshops. Methodical and structured approach to problem-solving and project delivery. Ability to summarize complex information into clear, concise recommendations.
- Education: Bachelor's degree in business, information systems, or a related field (or equivalent experience).
- Certifications: Relevant certifications (e.g., CBAP, PMP, or MDM-specific credentials) are a plus but not required.

Preferred Qualifications
- Prior consulting experience in a client-facing role.
- Hands-on experience with MDG, Talend, Informatica, Reltio, or similar MDM platforms.
- Exposure to data quality analysis or profiling (not required to be at a Data Analyst level).

Posted 2 weeks ago

Apply

0.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Source: Indeed

About the Role
Grade Level (for internal use): 09

The Team: We are looking for a highly motivated Engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in the realm of software engineering, irrespective of hierarchy, promoting innovation. Everyone should feel empowered to iterate over ideas and experiment without fear of failure.

Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with "curated" alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with exposure to the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities: We are looking for a self-motivated, enthusiastic and passionate software engineer to develop technology solutions for the S&P Global Marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies consisting of web applications, data pipelines, big data, machine learning and multi-cloud. The development is already underway, so the candidate would be expected to get up to speed very quickly and start contributing.
- Experience implementing web services (WCF, RESTful JSON, SOAP, TCP), Windows services, and unit tests.
- Past experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible and Prometheus, or related cloud technologies.
- Good understanding of single-, hybrid- and multi-cloud architecture, preferably with hands-on experience.
- Active participation in all scrum ceremonies; follow Agile best practices effectively.
- Play a key role in the development team to build high-quality, high-performance, scalable code.
- Produce technical design documents and conduct technical walkthroughs.
- Document and demonstrate solutions using technical design docs, diagrams and stubbed code.
- Collaborate effectively with technical and non-technical stakeholders.
- Respond to and resolve production issues.

What we are looking for:
- Minimum of 5-8 years of significant experience in application development.
- Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development.
- Experience working with high-volume data and computationally intensive systems.
- Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must.
- Proficiency in the development environment, including IDE, web and application servers, Git, continuous integration, unit-testing tools and defect management tools.
- Domain knowledge in the financial industry and capital markets is a plus.
- Excellent communication skills are essential, with strong verbal and writing proficiencies.
- Ability to mentor teams, innovate and experiment, give shape to business ideas and present to key stakeholders.

Required technical skills:
- Build data pipelines.
- Utilize platforms like Snowflake, Talend, Databricks, etc.
- Utilize cloud managed services like AWS Step Functions, AWS Lambda, and AWS DynamoDB.
- Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow.
- Develop federated data services to provide scalable and performant data APIs (REST, GraphQL, OData).
- Write infrastructure as code to develop sandbox environments.
- Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
- Feed data at scale to clients that are geographically distributed.
- Experience building sophisticated and highly automated infrastructure.
- Experience with automation tools such as Terraform, cloud technologies, CloudFormation, Ansible, etc.
- Demonstrated ability to adapt to new technologies and learn quickly.

Desirable technical skills: Java, Spring Boot, React, HTML/CSS, API development, microservices patterns, cloud technologies and managed services (preferably AWS), big data and analytics, relational databases (preferably PostgreSQL), NoSQL databases.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people; that's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings (Strategic Workforce Planning)
Job ID: 311642
Posted On: 2025-06-02
Location: Hyderabad, Telangana, India
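Among the required technical skills listed above are AWS Lambda and DynamoDB. As a hedged illustration (not S&P Global's code), here is a minimal Lambda handler that persists an incoming event, with an invented table name and event shape.

```python
# Illustrative AWS Lambda handler that persists an incoming event to
# DynamoDB. Table name and event shape are hypothetical; boto3 is available
# in the Lambda runtime by default.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("marketplace-events")  # placeholder table name

def handler(event, context):
    # Expecting e.g. {"dataset_id": "ds-42", "action": "download"}
    table.put_item(
        Item={
            "dataset_id": event["dataset_id"],
            "action": event.get("action", "unknown"),
            "request_id": context.aws_request_id,  # Lambda-provided id
        }
    )
    return {"statusCode": 200, "body": "stored"}
```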

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: Indeed

Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key, and whether you are an expert in a legacy software system or are fluent in a variety of coding languages, you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients.

Requisition ID: R-10352294
Date posted: 06/02/2025
End date: 06/30/2025
Location: Pune, Maharashtra, India (Onsite)

Calling all innovators: find your future at Fiserv. We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day, quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Tech Lead, Software Development Engineering / Lead, Software Development - .Net

What does a .Net Developer do at Fiserv? As an experienced member of our Payment Exchange development team, you will own requirement analysis, technical design and development of our product.

What You Will Do:
- Independently develop and code assignments, maintaining the highest degree of quality for all deliverables.
- Lead features end to end through to production and be accountable for the release.
- Provide estimates for analysis, design, development, and unit testing of application functionality for the assignment.
- Build requirement understanding through the different sources of information/documents available to the team.
- Assist architect(s) in design activities and estimation of the work required.
- Learn the best practices and methodology used in the program and adhere to the standards.
- Provide T3 support as and when necessary for production issues.
- Analyze production issues, provide short-term and long-term fixes, and help the client support team throughout the process.
- Mentor/train existing and new team members.
- Identify and collaborate with all necessary stakeholders to reach agreement in accordance with defined project goals with little to no assistance.
- Track progress against assigned tasks, report status, and proactively identify issues and report them to the product owners, architects and management team.
- Work successfully in a team environment and demonstrate a willingness to help team members achieve their project goals as required.
- Take ownership and accountability for delivering assigned tasks and deliverables within the established schedule.

What You Will Need to Have:
- Minimum of 8+ years of software development experience.
- Technical skillset: strong experience in ASP.NET Core, ASP.NET MVC 5 and SQL Server, WCF and REST APIs.
- Strong knowledge of UI technologies like React and Angular.
- Strong knowledge of Agile processes and associated tools such as JIRA, GitHub, and Jenkins.
- Strong knowledge of security scans such as FOP & Sonatype, WebInspect.
- Good debugging and problem-solving skills.
- Good understanding of frameworks and design patterns, and the ability to understand complex architectural aspects.
- Experience integrating with third-party applications.
- Good written and verbal communication skills.
- Experience in the retail banking domain.

What Would Be Great to Have:
- Experience with microservices, Docker and Kubernetes.
- Hands-on experience with CI/CD tools like Jenkins/Harness.
- Knowledge of tools like Talend/SSIS.
- Experience with Java Spring Boot.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 2 weeks ago

Apply

0.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Indeed logo

Engineer, Software Engineering | Hyderabad, India | Information Technology | Job ID 311642

Job Description

About The Role: Grade Level (for internal use): 09

The Team: We are looking for a highly motivated Engineer to join our team supporting the Marketplace Platform. The S&P Global Marketplace technology team consists of geographically distributed software engineers responsible for developing scalable solutions in direct partnership with the product development team. Our team culture promotes equality among software engineers irrespective of hierarchy and encourages innovation: everyone should feel empowered to iterate on ideas and experiment without fear of failure.

Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with “curated” alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients’ channel of choice to help them make better investment and business decisions, with confidence.

What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with exposure to the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities: We are looking for a self-motivated, enthusiastic, and passionate software engineer to develop technology solutions for the S&P Global Marketplace product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies spanning web applications, data pipelines, big data, machine learning, and multi-cloud. Development is already underway, so the candidate is expected to get up to speed quickly and start contributing.
- Experience implementing Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and unit tests.
- Experience working with AWS, Azure DevOps, Jenkins, Docker, Kubernetes/EKS, Ansible, and Prometheus or related cloud technologies.
- Good understanding of single-, hybrid-, and multi-cloud architecture, preferably with hands-on experience.
- Active participation in all scrum ceremonies; follow Agile best practices effectively.
- Play a key role in the development team to build high-quality, high-performance, scalable code.
- Produce technical design documents and conduct technical walkthroughs.
- Document and demonstrate solutions using technical design docs, diagrams, and stubbed code.
- Collaborate effectively with technical and non-technical stakeholders.
- Respond to and resolve production issues.

What we are looking for: Minimum of 5-8 years of significant experience in application development. Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development. Experience working with high-volume data and computationally intensive systems. Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must. Proficiency in the development environment, including IDE, web and application servers, Git, continuous integration, unit-testing tools, and defect management tools. Domain knowledge in the financial industry and capital markets is a plus. Excellent communication skills are essential, with strong verbal and writing proficiencies. Mentor teams, innovate and experiment, give shape to business ideas, and present to key stakeholders.

Required technical skills:
- Build data pipelines.
- Utilize platforms like Snowflake, Talend, Databricks, etc.
- Utilize cloud managed services like AWS Step Functions, AWS Lambda, and AWS DynamoDB.
- Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow.
- Develop federated data services to provide scalable and performant data APIs (REST, GraphQL, OData).
- Write infrastructure as code to develop sandbox environments.
- Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
- Feed data at scale to clients that are geographically distributed.
- Experience building sophisticated and highly automated infrastructure.
- Experience with automation tools such as Terraform, CloudFormation, Ansible, and related cloud technologies.
- Demonstrated ability to adapt to new technologies and learn quickly.

Desirable technical skills: Java, Spring Boot, React, HTML/CSS, API development, microservices pattern, cloud technologies and managed services (preferably AWS), big data and analytics, relational databases (preferably PostgreSQL), NoSQL databases.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What’s In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)

Job ID: 311642
Posted On: 2025-06-02
Location: Hyderabad, Telangana, India
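For candidates picturing the Step Functions/Lambda/DynamoDB stack this role names, here is a minimal, illustrative AWS Lambda handler that persists incoming events to DynamoDB with boto3. The table name and event schema are assumptions for illustration, not details from the listing.

```python
import json
import boto3

# Hypothetical table name; the listing does not specify one.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("marketplace-events")

def handler(event, context):
    """Minimal AWS Lambda handler: persist each incoming record to DynamoDB.

    Illustrates the Lambda + DynamoDB pattern only; batching, retries, and
    error handling are omitted for brevity.
    """
    records = event.get("records", [])
    for record in records:
        table.put_item(Item={
            "event_id": record["id"],     # assumed partition key
            "payload": json.dumps(record),
        })
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```

In a Step Functions workflow, a state machine would invoke a handler like this as one task state and route failures to a retry or dead-letter branch.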

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have skills: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based in Hyderabad.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Andhra Pradesh

On-site


Responsibilities:
- Design, develop, and maintain ETL processes using Ab Initio and other ETL tools.
- Manage and optimize data pipelines on AWS.
- Write and maintain complex PL/SQL queries for data extraction, transformation, and loading.
- Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Stay updated with the latest industry trends and technologies in ETL and cloud computing.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certification in Ab Initio.
- Proven experience with AWS and cloud-based data solutions.
- Strong proficiency in PL/SQL and other ETL tools.
- Experience in providing Level 3 support for ETL processes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other ETL tools such as Informatica, Talend, or DataStage.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with scripting languages (e.g., Python, shell scripting).

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
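Since this role pairs PL/SQL with Level 3 support, here is a minimal sketch of how a support engineer might invoke and triage a PL/SQL load from Python using the python-oracledb driver. The package, procedure, and connection details are hypothetical, not taken from the listing.

```python
import logging
import oracledb  # python-oracledb, the successor to cx_Oracle

logging.basicConfig(level=logging.INFO)

# Placeholder connection details; real values would come from secured config.
conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1")

def run_load(batch_id: int) -> None:
    """Invoke a (hypothetical) PL/SQL load procedure and surface failures,
    the kind of check an L3 engineer runs when a scheduled job breaks."""
    try:
        with conn.cursor() as cur:
            cur.callproc("etl_pkg.load_customer_dim", [batch_id])
        conn.commit()
    except oracledb.DatabaseError as exc:
        logging.error("Load failed for batch %s: %s", batch_id, exc)
        conn.rollback()
        raise
```

Wrapping the procedure call this way keeps the failure, the batch id, and the Oracle error text in one log line, which shortens triage.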

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Position Overview:
- Designing and developing efficient data ingestion processes, data models, and data pipelines leveraging the power of Snowflake
- Collaborating extensively with clients to gain a deep understanding of their data requirements and translating them into robust technical solutions
- Implementing end-to-end ETL/ELT workflows to seamlessly extract, transform, and load data into Snowflake
- Optimizing data pipelines to achieve exceptional performance, scalability, and reliability
- Conducting thorough data quality assessments and implementing effective data governance best practices
- Monitoring and troubleshooting data integration processes to ensure the utmost accuracy and integrity of data
- Staying at the forefront of industry trends and best practices related to Snowflake and data engineering

Qualifications & Experience:
- Looking for top-notch lead engineering talent who can thrive in an entrepreneurial environment that demands quick turnaround for mission-critical technical solutions, and who holds extremely high standards with a low tolerance for low-quality output.
- Strong background in computer science concepts.
- Minimum of 3+ years of server-side development using Java and/or Python.
- Excellent oral and written communication and problem-solving skills are required.
- Candidates should be comfortable working in a fast-paced environment and able to help build APIs and calculators on new cutting-edge cloud and big data technologies such as Snowflake, Fivetran, DBT, etc.
- Experience using ETL/ELT tools and technologies such as Talend, Informatica, or SSIS is a plus.
- Embrace data platform thinking; design and develop data pipelines keeping security, scale, uptime, and reliability in mind.
- Expertise in relational and dimensional data modeling.
- Ideal candidates should have strong analytical skills and a penchant for tackling complex problems and designs of scale.
- Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members.

Apollo provides equal employment opportunities regardless of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, colour, nationality, ethnic or national origin, religion or belief, veteran status, gender/sex or sexual orientation, or any other criterion or circumstance protected by applicable law, ordinance, or regulation. The above criteria are intended to be used as a guide only; candidates who do not meet all the above criteria may still be considered if they are deemed to have relevant experience or equivalent levels of skill or knowledge to fulfil the requirements of the role. Any job offer will be conditional upon and subject to satisfactory reference and background screening checks, all necessary corporate and regulatory approvals or certifications as required from time to time, and entering into definitive contractual documentation satisfactory to Apollo.
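To make the Snowflake ELT workflow this listing describes concrete, here is a minimal sketch using the snowflake-connector-python package. The account, stage, and table names are illustrative assumptions.

```python
import snowflake.connector

# All identifiers below (account, stage, tables) are placeholders.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)

with conn.cursor() as cur:
    # ELT: land staged files first, then transform inside the warehouse.
    cur.execute(
        "COPY INTO raw_trades FROM @landing_stage/trades/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("""
        INSERT INTO curated.trades
        SELECT trade_id,
               TO_DATE(trade_date),
               TRY_TO_NUMBER(notional)   -- bad values become NULL, not errors
        FROM raw_trades
        WHERE trade_id IS NOT NULL
    """)
conn.close()
```

Pushing the transformation into Snowflake (rather than pulling rows into Python) is what keeps pipelines like this scalable as volumes grow.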

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Infrastructure Lead/Architect
Job Type: Full-Time
Location: On-site (Hyderabad, Pune, or New Delhi)

Job Summary: Join our customer's team as an Infrastructure Lead/Architect and play a pivotal role in architecting, designing, and implementing next-generation cloud infrastructure solutions. You will drive cloud and data platform initiatives, ensure system scalability and security, and act as a technical leader, shaping the backbone of our customers’ mission-critical applications.

Key Responsibilities:
- Architect, design, and implement robust, scalable, and secure AWS cloud infrastructure utilizing services such as EC2, S3, Lambda, RDS, Redshift, and IAM.
- Lead the end-to-end design and deployment of high-performance, cost-efficient Databricks data pipelines, ensuring seamless integration with business objectives.
- Develop and manage data integration workflows using modern ETL tools in combination with Python and Java scripting.
- Collaborate with Data Engineering, DevOps, and Security teams to build resilient, highly available, and compliant systems aligned with operational standards.
- Act as a technical leader and mentor, guiding cross-functional teams through infrastructure design decisions and conducting in-depth code and architecture reviews.
- Oversee project planning, resource allocation, and deliverables, ensuring projects are executed on time and within budget.
- Proactively identify infrastructure bottlenecks, recommend process improvements, and drive automation initiatives.
- Maintain comprehensive documentation and uphold security and compliance standards across the infrastructure landscape.

Required Skills and Qualifications:
- 8+ years of hands-on experience in IT infrastructure, cloud architecture, or related roles.
- Extensive expertise with AWS cloud services; AWS certifications are highly regarded.
- Deep experience with Databricks, including cluster deployment, Delta Lake, and machine learning integrations.
- Strong programming and scripting proficiency in Python and Java.
- Advanced knowledge of ETL/ELT processes and tools such as Apache NiFi, Talend, Airflow, or Informatica.
- Proven track record in project management, leading cross-functional teams; PMP or Agile/Scrum certifications are a plus.
- Familiarity with CI/CD workflows and Infrastructure as Code tools like Terraform and CloudFormation.
- Exceptional problem-solving, stakeholder management, and both written and verbal communication skills.

Preferred Qualifications:
- Experience with big data platforms such as Spark or Hadoop.
- Background in regulated environments (e.g., finance, healthcare).
- Knowledge of Kubernetes and AWS container orchestration (EKS).

Posted 2 weeks ago

Apply

4.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Role: ETL Test Engineer
Experience range: 4-10 years
Location: Hyderabad ONLY

Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL: expert-level knowledge of core SQL concepts and querying.
3. ETL automation: experience in Datagap; good to have experience in tools like Informatica, Talend, and Ab Initio.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts: OLAP vs. OLTP and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA: hands-on experience with any test management tool, preferably ADO or JIRA.
9. Agile concepts: good understanding of agile methodology (Scrum, Lean, etc.).
10. Communication: good communication skills to understand and collaborate with all the stakeholders within the project.
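Much of ETL testing reduces to source-to-target reconciliation queries. Below is a minimal, tool-agnostic sketch; sqlite3 is used as a stand-in engine so the example is self-contained, and real tests would point at the warehouse through its own connector.

```python
import sqlite3

def reconcile(conn: sqlite3.Connection, source: str, target: str, key: str) -> None:
    """Two core ETL checks: row-count parity, then orphan detection.

    Table and key names come from the test suite itself, not user input,
    so plain string interpolation is acceptable in this context.
    """
    src_n, = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()
    tgt_n, = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()
    assert src_n == tgt_n, f"Row count mismatch: {src_n} source vs {tgt_n} target"

    missing, = conn.execute(
        f"SELECT COUNT(*) FROM {source} s "
        f"LEFT JOIN {target} t ON s.{key} = t.{key} "
        f"WHERE t.{key} IS NULL"
    ).fetchone()
    assert missing == 0, f"{missing} source rows missing from target"
```

Tools like Datagap automate this pattern at scale; writing it once by hand makes their reports much easier to interpret in interviews and on the job.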

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About Markovate: At Markovate, we don't just follow trends; we drive them. We transform businesses through innovative AI and digital solutions that turn vision into reality. Our team harnesses breakthrough technologies to craft bespoke strategies that align seamlessly with our clients' ambitions. From AI consulting and Gen AI development to pioneering AI agents and agentic AI, we empower our partners to lead their industries with forward-thinking, unmatched precision.

Overview: We are seeking a highly experienced and innovative Senior Data Engineer with a strong background in hybrid cloud data integration, pipeline orchestration, and AI-driven data modelling. This role is responsible for designing, building, and optimizing robust, scalable, and production-ready data pipelines across both AWS and Azure platforms, supporting modern data architectures such as CEDM and Data Vault 2.0.

Requirements:
- 9+ years of experience in data engineering and data architecture.
- Excellent communication and interpersonal skills, with the ability to engage with teams.
- Strong problem-solving, decision-making, and conflict-resolution abilities.
- Proven ability to work independently and lead cross-functional teams.
- Ability to work in a fast-paced, dynamic environment and handle sensitive issues with discretion and professionalism.
- Ability to maintain confidentiality and handle sensitive information with attention to detail and discretion.
- The candidate must have strong work ethics and trustworthiness.
- Must be highly collaborative and team-oriented.

Responsibilities:
- Design and develop hybrid ETL/ELT pipelines using AWS Glue and Azure Data Factory (ADF).
- Process files from AWS S3 and Azure Data Lake Gen2, including schema validation and data profiling.
- Implement event-based orchestration using AWS Step Functions and Apache Airflow (Astronomer).
- Develop and maintain bronze → silver → gold data layers using DBT or Coalesce.
- Create scalable ingestion workflows using Airbyte, AWS Transfer Family, and Rivery.
- Integrate with metadata and lineage tools like Unity Catalog and OpenMetadata.
- Build reusable components for schema enforcement, EDA, and alerting (e.g., MS Teams).
- Work closely with QA teams to integrate test automation and ensure data quality.
- Collaborate with cross-functional teams including data scientists and business stakeholders to align solutions with AI/ML use cases.
- Document architectures, pipelines, and workflows for internal stakeholders.

Skills:
- Experience with cloud platforms: AWS (Glue, Step Functions, Lambda, S3, CloudWatch, SNS, Transfer Family) and Azure (ADF, ADLS Gen2, Azure Functions, Event Grid).
- Skilled in transformation and ELT tools: Databricks (PySpark), DBT, Coalesce, and Python.
- Proficient in data ingestion using Airbyte, Rivery, SFTP/Excel files, and SQL Server extracts.
- Strong understanding of data modeling techniques including CEDM, Data Vault 2.0, and dimensional modelling.
- Hands-on experience with orchestration tools such as AWS Step Functions, Airflow (Astronomer), and ADF Triggers.
- Expertise in monitoring and logging with CloudWatch, AWS Glue metrics, MS Teams alerts, and Azure Data Explorer (ADX).
- Familiar with data governance and lineage tools: Unity Catalog, OpenMetadata, and schema drift detection.
- Proficient in version control and CI/CD using GitHub, Azure DevOps, CloudFormation, Terraform, and ARM templates.
- Experienced in data validation and exploratory data analysis with pandas profiling and AWS Glue Data Quality.

Great to have:
- Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data and AI services.
- Knowledge of ETL tools and frameworks (e.g., Apache NiFi, Talend, Informatica).
- Deep understanding of AI/Generative AI concepts and frameworks (e.g., TensorFlow, PyTorch, Hugging Face, OpenAI APIs).
- Experience with data modeling, data structures, and database design.
- Proficiency with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake).
- Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka).
- Proficiency in SQL and at least one programming language (e.g., Python).

What it's like to be at Markovate:
- We thrive on collaboration and embrace every innovative idea.
- We invest in continuous learning to keep our team ahead in the AI/ML landscape.
- Transparent communication is key: every voice at Markovate is valued.
- Our agile, data-driven approach transforms challenges into opportunities.
- We offer flexible work arrangements that empower creativity and balance.
- Recognition is part of our DNA: your achievements drive our success.
- Markovate is committed to sustainable practices and positive community impact.
- Our people-first culture means your growth and well-being are central to our mission.

Location: hybrid model, 2 days onsite. (ref:hirist.tech)
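A bronze-to-silver hop of the kind this listing describes might look like the following PySpark sketch; the lake paths and column names are assumptions for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Hypothetical lake layout: raw landed data in the bronze layer.
bronze = spark.read.parquet("s3://lake/bronze/orders/")

silver = (
    bronze
    .dropDuplicates(["order_id"])                        # basic dedup
    .withColumn("order_ts", F.to_timestamp("order_ts"))  # enforce types
    .filter(F.col("order_id").isNotNull())               # quality gate
)

# Cleaned, conformed data goes to the silver layer; the gold layer would
# aggregate from here for consumption.
silver.write.mode("overwrite").parquet("s3://lake/silver/orders/")
```

In a DBT or Coalesce setup, each of these hops becomes a versioned model rather than an ad hoc script, which is what makes the layering auditable.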

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


hackajob is collaborating with OneAdvanced to connect them with exceptional tech professionals for this role.

Senior Data Integration Engineer (IN-KA-Bengaluru)

Role Introduction: We are seeking a Data Integration Specialist who will be responsible for ensuring seamless data flow between various systems using several integration toolsets, managing data integration processes, and maintaining data quality and accessibility. You will work closely with our Data Analyst and report to the Data Eco-System Leader. This is a new role in a developing team: you will be in on the ground floor, so you will help shape how we mature in this space.

What You Will Do (Key Responsibilities):
- Design and Develop Data Integration Solutions: Create and implement data integration processes using ETL (Extract, Transform, Load) tools to consolidate data from various sources into cohesive data models.
- Build Integration Scripts and Flows: As defined by business stakeholders, build and/or change integrations already developed within the business.
- Data Quality Management: Conduct data quality assessments and implement measures to enhance data accuracy and integrity. Operationalise the handling of data exceptions and the reporting around data integration.
- Collaboration: Work closely within and across the functional teams to gather requirements and understand diverse data sources, ensuring that integration strategies align with business objectives.
- Monitoring and Troubleshooting: Oversee data integration workflows, resolve issues, and optimize performance to ensure reliable data flow.
- Documentation: Maintain comprehensive documentation of data integration processes, data flows, and system configurations.
- Stay Updated: Keep abreast of industry trends and best practices in data integration and management.

What You Will Have (Technical Skills & Qualifications):
- Delivery focus: you will have led a team or been the deputy manager for a delivery-focused team, preferably cross-discipline.
- Technical Expertise: Extensive knowledge of data integration tools and languages, such as Dell Boomi, REST APIs, Microsoft Fabric, Integration Hub, SQL, ETL, and XML.
- Problem-Solving Skills: Strong analytical skills to interpret complex data and troubleshoot integration issues effectively.
- Communication Skills: Effective communication skills to liaise with multiple technical and business teams and explain complex data issues clearly.
- Experience: Proven experience as a Data Integration Specialist or a similar role, with hands-on experience using ETL tools like Talend, Informatica, or Apache NiFi.
- Education: A bachelor's degree in a related field such as Computer Science, Information Technology, or Engineering is typically required, or proven experience in data mining, ETL, and data analysis.

Would be really good to have:
- Tools: Experience with Boomi, REST APIs, ServiceNow Integration Hub, JIRA, and ITSM platforms is beneficial.
- Scripting: Understanding and ability to design and script workflows and automations.
- Enterprise Systems: An understanding of data structures in Salesforce.

What We Do For You:
- Wellbeing focused: Our people are our greatest assets, and ensuring everyone feels their best self to come to work is integral.
- Annual Leave: 20 days of annual leave, plus public holidays.
- Employee Assistance Programme: Free advice, support, and confidential counselling available 24/7.
- Personal Growth: We're committed to enabling your growth personally and professionally through development programmes.
- Life Insurance: 2x annual salary.
- Personal Accident Insurance: Providing cover in the event of serious injury/illness.
- Performance Bonus: Our group-wide bonus scheme enables you to reap the rewards of your success.

Who We Are: OneAdvanced is one of the UK's largest providers of business software and services, serving 20,000+ global customers with an annual turnover of £330M+. We manage 1.5 million 111 calls per month, support over 2 million Further Education learners across the UK, handle over 10 million wills, and so much more. Our mission is to power the world of work and, as you can see, our software underpins some of the UK's most critical sectors. We invest in our brilliant people. They are at the heart of our success as we strive to be a diverse, inclusive and engaging place to work that not only powers the world of work, but empowers the growth, ambitions and talent of our people.
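A typical extract step in the REST-based integrations this role covers can be sketched as follows; the endpoint, token, and response shape are hypothetical.

```python
import requests
from requests.adapters import HTTPAdapter, Retry

# Retry transient failures (throttling, gateway errors) before giving up;
# this is the usual resilience baseline for scheduled integration flows.
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=Retry(
    total=3, backoff_factor=1, status_forcelist=[429, 502, 503],
)))

def pull_records(since: str) -> list[dict]:
    """Fetch records changed since the last run from a source system's
    REST API; endpoint and auth are placeholders for illustration."""
    resp = session.get(
        "https://source.example.com/api/records",
        params={"updated_since": since},
        headers={"Authorization": "Bearer <token>"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"]
```

Platforms like Boomi or ServiceNow Integration Hub wrap this same request-retry-transform loop in visual flows; knowing the raw form makes their connectors easier to debug.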

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Company: Gentrack provides leading utilities across the world with innovative cleantech solutions. The global pace of change is accelerating, and utilities need to rebuild for a more sustainable future. Working with some of the world's biggest energy and water companies, as well as innovative challenger brands, we are helping companies reshape what it means to be a utilities business. We are driven by our passion to create positive impact. That is why utilities rely on us to drive innovation, deliver great customer experiences, and secure profits. Together, we are renewing utilities.

Our Values and Culture: Colleagues at Gentrack are one big team, working together to drive efficiency in two of the planet's most precious resources, energy and water. We are passionate people who want to drive change through technology and believe in making a difference. Our values drive decisions and how we interact and communicate with customers, partners, shareholders, and each other. Our core values are: Respect for the planet; Respect for our customers; and Respect for each other. Gentrackers are a group of smart thinkers and dedicated doers. We are a diverse team who love our work and the people we work with, and who collaborate and inspire each other to deliver creative solutions that make our customers successful. We are a team that shares knowledge, asks questions, raises the bar, and are expert advisers. At Gentrack we care about doing honest business that is good for not just customers but families, communities, and ultimately the planet. Gentrackers continuously look for a better way and drive quality into everything they do.

This is a truly exciting time to join Gentrack, with a clear growth strategy and a world-class leadership team working to fulfil Gentrack's global aspirations by having the most talented people, an inspiring culture, and a technology-first, people-centric business.

In line with our value of 'Respect for the Planet', we encourage all our people to create awareness of behaviours aligned to our Sustainability Charter through supporting organisational change and actively engaging in our global sustainability programs, including enabling our people to engage and partake in events.

The Opportunity: We are currently looking for a Data Migration Engineer (Senior Role). A Senior Data Migration Engineer plays a pivotal role in utility billing transformation programs, ensuring seamless data transitions from legacy systems to modern platforms. This role demands strong technical expertise, cross-functional collaboration, and a strategic mindset to manage complex data workflows with accuracy and efficiency.

Key Responsibilities:
- Lead end-to-end data migration activities for utility billing systems, including analysis, extraction, transformation, loading (ETL), and validation.
- Design and implement robust SQL queries and Snowflake-based pipelines to manage large volumes of structured data efficiently.
- Utilize advanced SQL and Snowflake skills to analyze large datasets, identify data quality issues, and implement robust transformation logic.
- Leverage data migration tools and automation frameworks to streamline data movement while ensuring compliance and auditability.
- Configure and orchestrate migration pipelines using ETL solutions.
- Ensure high standards of data quality through meticulous data validation, reconciliation, and error-handling strategies.
- Collaborate with functional analysts to understand source and target system data models and business logic.
- Maintain detailed documentation using tools like Jira and Confluence, tracking migration processes and issue resolution.
- Provide post-migration support, including defect fixes, data repair, and customer coordination.
- Apply domain expertise to interpret complex data structures from source systems, map them accurately to target utility billing systems, and reconcile them post-migration.
- Develop and maintain automation scripts for data validation, consistency checks, and rollback plans.
- Coordinate with testing and support teams to validate migrated data and troubleshoot anomalies.

Key Skills & Expertise:
- SQL: Advanced proficiency for querying, transformation logic, and performance tuning.
- Snowflake: Deep hands-on experience with cloud-based data warehousing, ELT processes, and security best practices.
- Data Migration Tools: Experience with Informatica, Talend, or custom migration frameworks. You will leverage tools like Jira and Confluence for project and knowledge management, and utilize Git, Jenkins, and AWS for version control, automation, and cloud deployment. Your analytical mindset and attention to detail support critical tasks such as data reconciliation and validation, ensuring that migrated data is accurate, complete, and compliant with regulatory standards.
- Cloud Platform: Familiarity with AWS services (S3, Glue, Lambda, etc., or equivalent) for large-scale migration and data pipeline management.
- Project Tools: Jira, Confluence.
- Reconciliation & Validation: Data comparison, row/column-level verification, exception reporting.
- Domain Expertise: Deep understanding of utility billing systems, including source (legacy) and target system data structures.
- Experience in automation using Python, shell scripting, and writing stored procedures in SQL/PL SQL will be an added advantage.

Qualifications:
- Bachelor's or Master's Degree in Computer Science, Information Systems, or a related field.
- 6-8 years of experience in data migration or data engineering roles, preferably with 3-5 years of domain expertise in utility or telecom billing systems.
- Proven track record of delivering successful migrations in complex enterprise environments.

Personal Attributes:
- Strong analytical and problem-solving skills.
- Ability to manage cross-functional teams and work under pressure.
- Excellent communication and stakeholder engagement skills.
- Commitment to data integrity, data security, and customer satisfaction.

Domain Expertise:
- In-depth understanding of utility billing systems and operational data such as customer information, meter data, contracts, billing, and settlements.
- Clear grasp of source and target system functional and data models, enabling precise mapping and impact analysis.
- Proven track record in handling multi-domain responsibilities: configuration management, development, test integration, system operations, and customer support.

Soft Skills:
- Strong analytical and problem-solving mindset.
- Excellent communication and coordination with both technical teams and non-technical stakeholders.
- Ability to work independently under pressure while maintaining focus on deliverables and quality.
- This role also demands cross-functional collaboration, as you'll coordinate with developers, testers, system administrators, and customer support teams. Your multi-domain expertise will be pivotal in addressing configuration, integration, and support challenges, making you a key contributor to the project's overall success.

Additional Tasks: In addition to this, you are required to carry out any other duties as reasonably requested by your direct line leader.

What we offer in return:
- Personal growth in leadership, commercial acumen, and technical excellence.
- To be part of a global, winning, high-growth organization, with a career path to match.
- A vibrant culture full of people passionate about transformation and making a difference, with a one-team, collaborative ethos.
- A competitive reward package that truly awards our top talent.
- A chance to make a true impact on society and the planet.

Gentrack wants to work with the best people, no matter their background. So, if you are passionate about learning new things and keen to join the mission, you will fit right in.
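The row/column-level verification called out under Reconciliation & Validation can be sketched in plain Python: hash the columns that must survive the migration and diff source against target. The key and column names below are illustrative.

```python
import hashlib

def row_fingerprint(row: dict, columns: list[str]) -> str:
    """Stable hash of the columns that must arrive unchanged in the target."""
    joined = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.sha256(joined.encode()).hexdigest()

def reconcile(source_rows, target_rows, key: str, columns: list[str]):
    """Row-level comparison between legacy and target billing extracts.

    Returns keys missing from the target and keys whose data drifted,
    which feeds the exception reporting the role describes.
    """
    src = {r[key]: row_fingerprint(r, columns) for r in source_rows}
    tgt = {r[key]: row_fingerprint(r, columns) for r in target_rows}
    missing = sorted(set(src) - set(tgt))
    drifted = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, drifted
```

In practice the fingerprints would be computed inside Snowflake or the legacy database so only hashes cross the wire, but the comparison logic is exactly this.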

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description

Azure Cloud – Technology Assurance: As a Risk Assurance Senior, you’ll contribute technically to Risk Assurance client engagements and internal projects. An important part of your role will be to assist fellow Seniors and Managers while actively participating within the client engagement. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. In line with EY's commitment to quality, you’ll confirm that work is of high quality and is reviewed by the next-level reviewer. As a member of the team, you’ll help to create a positive learning culture and assist fellow team members while delivering an assignment.

The opportunity: We’re looking for professionals with at least 3 years of experience. You’ll be part of a cross-functional team that’s responsible for the full software development life cycle, from conception to deployment. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of a new service offering.

Skills and Summary of Accountabilities:
- Designing, architecting, and developing solutions leveraging Azure cloud to ingest, process, and analyse large, disparate data sets to exceed business requirements.
- Proficient in Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake Storage for data storage and processing; designed data pipelines using these technologies.
- Working knowledge of data warehousing/modelling, ETL/ELT pipelines, and data democratization using cloud services.
- Design, build, and maintain efficient, reusable, and reliable code, ensuring the best possible performance, quality, and responsiveness of applications using reliable Python code.
- Automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse Studio, ADF, etc.
- Exposure working in client-facing roles; collaborate with cross-functional teams including internal audit, IT security, and business stakeholders to assess control effectiveness and facilitate remediation activities.
- Preferred knowledge/understanding of IT controls, risk, and compliance: designing IT risk control frameworks such as IT SOX, and testing internal controls such as IT general controls, IT application controls, IPE-related controls, interface controls, etc.

To qualify for the role, you must have:
- 3 years of experience in building end-to-end business solutions using big data and data engineering.
- Expertise in core Microsoft Azure Data Services (e.g., Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, Data Lake services, etc.).
- Familiarity with integration services: Azure Logic Apps, Function Apps, Stream Analytics, Triggers, Event Hubs, etc.
- Expertise in cloud-related big data integration and infrastructure tech stack using Azure Databricks and the Apache Spark framework.
- Python and SQL are required; R and Scala are preferred.
- Experience developing software tools using utilities, pandas, NumPy, and other libraries/components.
- Hands-on expertise in using Python frameworks (like Django, Pyramid, Flask).
- Preferred: a substantial background in data extraction and transformation, developing data pipelines using MS SSIS, Informatica, Talend, or other on-premises tools.
- Preferred: knowledge of Power BI or other BI tools.
- Good understanding of version control with Git and JIRA, change/release management, build/deploy, and CI/CD with Azure DevOps.

Ideally, you'll also have:
- Bachelor's degree or above in mathematics, information systems, statistics, computer science, data analytics, or related disciplines.
- Experience with AI/ML is a plus.
- Preferred certification: DP-203 Azure Data Engineer or equivalent.
- Ability to communicate clearly and concisely, using strong writing and verbal skills to communicate facts, figures, and ideas to others.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
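As an illustration of the ADF automation this role touches, the Azure SDK for Python can trigger a pipeline run roughly as follows; the subscription, resource group, factory, and pipeline names are placeholders, and real setups usually wire this into CI/CD rather than a standalone script.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All identifiers are placeholders for illustration.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a pipeline run with a runtime parameter.
run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-analytics",
    pipeline_name="ingest_sales",
    parameters={"load_date": "2025-06-01"},
)
print("Triggered ADF run:", run.run_id)
```

DefaultAzureCredential resolves to whatever identity is available (managed identity, CLI login, environment variables), which is why this pattern works unchanged across local development and deployed automation.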

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site


Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Roles and Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with at least 3 years of experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Should have experience building data ingestion pipelines.
- Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Should have good experience in implementing CDC or SCD Type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skill sets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, data warehousing concepts.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: May 30, 2025, 6:23:05 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
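The CDC/SCD Type 2 requirement above can be sketched as a Snowflake MERGE issued through the Python connector. The dimension and staging tables here are hypothetical, and note that a production job would follow the MERGE with an insert of the new current rows for the changed keys.

```python
import snowflake.connector

# Expire the current dimension row when staged data differs (SCD Type 2).
# A second statement (omitted for brevity) would then insert the new
# current version of each changed key.
SCD2_EXPIRE = """
MERGE INTO dim_customer d
USING stg_customer s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND (d.email <> s.email OR d.segment <> s.segment) THEN
  UPDATE SET d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (customer_id, email, segment, valid_from, valid_to, is_current)
  VALUES (s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE)
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="MART",
)
with conn.cursor() as cur:
    cur.execute(SCD2_EXPIRE)
conn.close()
```

In a DBT project the same history-keeping is usually expressed declaratively as a snapshot rather than a hand-written MERGE, which is one reason the listing pairs the two skills.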

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Hyderābād

On-site


Job Description:
- At least 6-14+ years of Maximo technical experience.
- Experience with Maximo 7.x.
- Must have experience generating precise functional and technical designs and a data migration strategy for the Maximo Asset Management, Work Management, Purchasing, Spatial, and Inventory modules.
- Define application configuration design, data migration strategy, and integration designs; prepare the necessary documentation.
- Extensive experience with Maximo data conversion/migration using MX Data Loader, Talend, and scripts.
- Experience with Maximo integration technologies (MIF, MEA, Object Structures, services, channels, etc.), user exit classes, XSLT/XML, SOAP/REST APIs, etc.
- Experience with Maximo customization technologies (JavaScript, Python/Jython, Java/J2EE, SQL).
- Experience with WebSphere 8.5/9.0 application server administration (Linux environment preferred).
- Experience with BIRT reports.
- Knowledge of MAS and a MAS Functional/Technical certification would be an added advantage.

Job summary: We are seeking a Sr. Analyst MAP with 6 to 8 years of experience to join our team. The ideal candidate will have expertise in IBM Maximo Asset Management and will work in a hybrid model. This role involves day shifts and does not require travel. The candidate will play a crucial role in managing and analyzing geospatial data to support our asset management initiatives.

Responsibilities:
- Manage and analyze geospatial data to support asset management initiatives.
- Oversee the implementation and maintenance of IBM Maximo Asset Management systems.
- Provide technical expertise in geospatial data analysis and asset management.
- Collaborate with cross-functional teams to ensure data accuracy and integrity.
- Develop and maintain geospatial databases and related systems.
- Conduct regular audits to ensure data quality and compliance with industry standards.
- Create detailed reports and visualizations to support decision-making processes.
- Train and support team members in the use of geospatial tools and technologies.
- Identify and address any issues related to geospatial data and asset management.
- Ensure that all geospatial data is up to date and accurately reflects current asset conditions.
- Work closely with stakeholders to understand their geospatial data needs and requirements.
- Contribute to the development of best practices and standards for geospatial data management.
- Stay updated with the latest trends and advancements in geospatial technologies.

Qualifications:
- A strong background in IBM Maximo Asset Management with hands-on experience.
- Expertise in geospatial data analysis and management.
- Excellent problem-solving and analytical skills.
- Proficiency in creating detailed reports and visualizations.
- Strong communication and collaboration skills.
- Familiarity with industry standards and best practices for geospatial data management.
- Ability to train and support team members in geospatial tools.
- Up to date with the latest trends in geospatial technologies.
- Detail-oriented, ensuring data accuracy and integrity.
- Experience in conducting data audits and ensuring compliance.
- Able to work in a hybrid model with day shifts.
- Able to work closely with stakeholders.
- Commitment to continuous learning and improvement.

Certifications Required: Certified Maximo Asset Management Professional; GIS Certification.
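Maximo's REST/OSLC integration style can be illustrated with a small query sketch. The host, object structure, and auth header below are assumptions for illustration only; real deployments vary by Maximo version and security configuration.

```python
import requests

# Endpoint shape follows Maximo's OSLC REST convention; the host, the
# MXASSET object structure, and apikey auth are assumed, not confirmed.
resp = requests.get(
    "https://maximo.example.com/maximo/oslc/os/mxasset",
    params={
        "oslc.select": "assetnum,description,status",  # project only needed fields
        "oslc.pageSize": 50,                           # page results
    },
    headers={"apikey": "<api-key>"},
    timeout=30,
)
resp.raise_for_status()

# Response member list shape is also an assumption and varies by config.
for asset in resp.json().get("member", []):
    print(asset.get("assetnum"), asset.get("status"))
```

The same object structures back the MIF integrations the listing names, so being able to query them directly is a useful debugging skill.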

Posted 2 weeks ago

Apply

0 years

3 - 9 Lacs

Hyderābād

On-site


Hyderabad, India | Business Support | In-Office | 9412

Job Description

Job Purpose: Intercontinental Exchange, Inc. (ICE) presents a unique opportunity to work with cutting-edge technology to provide solutions to business challenges in the financial sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. We are seeking an Integration Developer to join our collaborative Enterprise Information Management team to support the delivery of solutions to various business organizations. This candidate will be a significant part of the Integration team, supporting cross-system application and data integrations, and will work with a team of experts in data, ETL, and integrations. This position requires technical proficiency as well as an eager attitude, professionalism, and solid communication skills. The Integration Developer will be a member of the team who drives strategy for tools and development. This person will not have direct reports.

Responsibilities:
- Build, maintain, and support applications in a global software platform and various other corporate systems, tools, and scripts.
- Collaborate with other internal groups to translate business and functional requirements into technical implementations, for the automation of existing processes and the development of new applications.
- Communicate with internal customers in non-technical terms, understand business requirements, and propose solutions.
- Manage projects from specification gathering, to development, to QA, user acceptance testing, and deployment to production.
- Document changes and follow proper SDLC procedures.
- Enhance the team and coworkers through knowledge sharing and implementing best practices in day-to-day activities.
- Take initiative to continually learn and enhance technical knowledge and skills.

Knowledge and Experience:
- BS degree, preferably in CS or EE or a related discipline.
- 2-3 years' experience as an integration developer using applications like Talend, MuleSoft, or similar.
- Familiarity with building multi-threaded applications, and some understanding of distributed systems like Kafka and RabbitMQ.
- Experience in developing REST-based services.
- Familiarity with different data formats like JSON, XML, etc.
- High proficiency in RDBMS concepts and SQL.
- Understanding of design patterns and object-oriented design concepts.
- Experience with deployment automation tools such as Jenkins, Artifactory, and Maven.
- Strong written and verbal communication skills.
- Ability to multitask and work independently on multiple projects.

Preferred:
- Linux, Bash, SSH familiarity.
- Experience with applications like Salesforce, ServiceNow, ORMB, and other financial applications.
- Financial industry expertise.
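For the Kafka side of this role, a minimal producer sketch with the kafka-python package looks like the following; the broker address and topic are placeholders.

```python
import json
from kafka import KafkaProducer  # kafka-python package

# Broker address and topic are placeholders, not details from the listing.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish(event: dict) -> None:
    """Send one integration event as JSON; flush() blocks until the broker
    acknowledges, which keeps small scripts deterministic."""
    producer.send("corp.integration.events", value=event)
    producer.flush()

publish({"source": "billing", "type": "invoice.created", "id": 12345})
```

Downstream systems subscribe to the topic independently, which is the decoupling that makes a message bus preferable to point-to-point REST calls for cross-system integration.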

Posted 2 weeks ago

Apply

Exploring Talend Jobs in India

Talend is a popular data integration and management tool used by many organizations in India. As a result, there is a growing demand for professionals with expertise in Talend across various industries. Job seekers looking to explore opportunities in this field can expect a promising job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

These cities have a high concentration of IT companies and organizations that frequently hire for Talend roles.

Average Salary Range

The average salary range for Talend professionals in India varies based on experience level:
  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

A typical career progression in the field of Talend may follow this path: Junior Developer → Developer → Senior Developer → Tech Lead → Architect.

As professionals gain experience and expertise, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to expertise in Talend, professionals in this field are often expected to have knowledge or experience in the following areas:
  • Data Warehousing
  • ETL (Extract, Transform, Load) processes
  • SQL
  • Big Data technologies (e.g., Hadoop, Spark)

Interview Questions

  • What is Talend and how does it differ from traditional ETL tools? (basic)
  • Can you explain the difference between tMap and tJoin components in Talend? (medium)
  • How do you handle errors in Talend jobs? (medium)
  • What is the purpose of a context variable in Talend? (basic)
  • Explain the difference between incremental and full loading in Talend. (medium; see the sketch after this list for the general pattern)
  • How do you optimize Talend jobs for better performance? (advanced)
  • What are the different deployment options available in Talend? (medium)
  • How do you schedule Talend jobs to run at specific times? (basic)
  • Can you explain the use of tFilterRow component in Talend? (basic)
  • What is metadata in Talend and how is it used? (medium)
  • How do you handle complex transformations in Talend? (advanced)
  • Explain the concept of schema in Talend. (basic)
  • How do you handle duplicate records in Talend? (medium)
  • What is the purpose of the tLogRow component in Talend? (basic)
  • How do you integrate Talend with other systems or applications? (medium)
  • Explain the use of tNormalize component in Talend. (medium)
  • How do you handle null values in Talend transformations? (basic)
  • What is the role of the tRunJob component in Talend? (medium)
  • How do you monitor and troubleshoot Talend jobs? (medium)
  • Can you explain the difference between tMap and tMapLookup components in Talend? (medium)
  • How do you handle changing business requirements in Talend jobs? (advanced)
  • What are the best practices for version controlling Talend jobs? (advanced)
  • How do you handle large volumes of data in Talend? (medium)
  • Explain the purpose of the tAggregateRow component in Talend. (basic)
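
As an example answer to the incremental vs. full loading question above, here is a minimal, tool-agnostic Python sketch. Talend implements the same idea with components and context variables; sqlite3 is used here only so the example is self-contained, and the table and column names are illustrative.

```python
import sqlite3
from datetime import datetime, timezone

def incremental_load(src: sqlite3.Connection, tgt: sqlite3.Connection,
                     last_run: str) -> str:
    """Copy only rows changed since the previous run (incremental load).

    A full load would omit the WHERE filter and truncate the target first.
    Assumes the target table already exists with the same columns.
    """
    rows = src.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
        (last_run,),
    ).fetchall()
    tgt.executemany(
        "INSERT OR REPLACE INTO customers (id, name, updated_at) VALUES (?, ?, ?)",
        rows,
    )
    tgt.commit()
    # The new watermark becomes last_run for the next cycle; in Talend this
    # is typically persisted through a context variable or a control table.
    return datetime.now(timezone.utc).isoformat()
```

The trade-off interviewers usually probe: incremental loads are cheaper and faster but depend on a reliable change marker, while full loads are simpler and self-healing but scale poorly with data volume.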

Closing Remark

As you explore opportunities in the Talend job market in India, remember to showcase your expertise, skills, and knowledge during the interview process. With preparation and confidence, you can excel in securing a rewarding career in this field. Good luck!
