
1502 Talend Jobs - Page 2

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be a seasoned Senior ETL/DB Tester with expertise in data validation and database testing across modern data platforms. Your role will involve designing, developing, and executing comprehensive test plans for ETL and database validation processes, and validating data transformations and integrity across multiple stages and systems such as Talend, ADF, Snowflake, and Power BI. Your responsibilities will include performing manual testing and defect tracking using tools like Zephyr or Tosca, analyzing business and data requirements to ensure full test coverage, and writing and executing complex SQL queries for data reconciliation. Identifying data-related issues and conducting root cause analysis in collaboration with developers will be a crucial part of the role, as will tracking and managing bugs and enhancements through appropriate tools and optimizing testing strategies for performance, scalability, and accuracy in ETL processes.

Your skills should include proficiency in ETL tools such as Talend and ADF, experience working on data platforms like Snowflake, and experience with reporting/analytics tools like Power BI and VPI. You should also bring expertise in testing tools like Zephyr or Tosca, manual testing experience, and strong SQL skills for validating complex data. Exposure to API testing and familiarity with advanced Power BI features such as dashboards, DAX, and data modeling will be beneficial for this role.
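To make the SQL reconciliation work concrete, here is a minimal sketch of a source-to-target check in Python. It uses an in-memory SQLite database as a stand-in for the platforms named above, and the table and column names are invented for illustration.

```python
# Minimal sketch of a source-to-target reconciliation check. Table and column
# names are hypothetical; in practice the connection would point at the actual
# staging and warehouse platforms rather than SQLite.
import sqlite3


def reconcile_counts(conn, source_table, target_table):
    """Compare row counts between a source and a target table."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source_rows": src, "target_rows": tgt, "match": src == tgt}


def reconcile_column_sum(conn, source_table, target_table, column):
    """Compare a SUM on a numeric column as a coarse data-integrity check."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COALESCE(SUM({column}), 0) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COALESCE(SUM({column}), 0) FROM {target_table}").fetchone()[0]
    return {"source_sum": src, "target_sum": tgt, "match": src == tgt}


if __name__ == "__main__":
    # Demo with an in-memory database standing in for staging and warehouse tables.
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.5);
        INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.5);
        """
    )
    print(reconcile_counts(conn, "stg_orders", "dw_orders"))
    print(reconcile_column_sum(conn, "stg_orders", "dw_orders", "amount"))
```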

Posted 2 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The incumbent will be part of the Global-BI/Data Engineering team, responsible for developing and maintaining solutions on Talend and working closely with a team of developers, QAs, and support analysts. Reporting functionally to the Team Leader, Global-BI, this individual is an active partner and a business process visionary who shapes technology demand among the customer-facing employee community across global businesses. The role requires a broad range of technical and functional knowledge, a good blend of problem-solving and communication skills, and the ability to align regional requirements with global templates and deliver solutions across multiple projects. It also provides consultative leadership to both technical and functional resources across the organization, from strategic decision making to project planning to overall governance and oversight. You will engage key business and technology stakeholders across the enterprise, be highly collaborative, drive communication to business, delivery, and support teams, and apply analytical and problem-solving skills to deliver solutions aligned with industry best practices.

Experience: 4+ years of relevant technical experience, with 3+ years in Java. Experience with Talend Data Integration: designing and developing Talend ETL scripts and creating and deploying end-to-end Talend Data Integration solutions. Experience in Talend Studio and Talend Cloud. Good knowledge of RDBMS and SQL scripting. Good knowledge of Snowflake, Google Cloud Platform (GCP) services, Git, and Python. Automation, orchestration, and performance tuning of ETL processes, along with implementing best practices. Experience with development and production support. Experience in designing, developing, validating, and deploying Talend ETL pipelines.

Non-Technical Requirements: Proven success in contributing to a team-oriented environment. Proven ability to work creatively and analytically in a problem-solving environment. Excellent communication (written and oral) and interpersonal skills.

Note: The work schedule is Thursday to Monday, 11:00 AM to 8:00 PM IST, with Tuesday and Wednesday as off days.
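As a hedged illustration of the Snowflake side of this stack, the sketch below runs a simple post-load row-count check with the Snowflake Python connector. The account, credentials, warehouse, and table names are placeholders, not details taken from the posting.

```python
# A minimal sketch of the kind of post-load check a Talend Data Integration
# job might be paired with, using the Snowflake Python connector.
import os

import snowflake.connector


def table_row_count(table_name: str) -> int:
    """Return the current row count of a Snowflake table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",    # hypothetical warehouse
        database="ANALYTICS",   # hypothetical database
        schema="STAGING",       # hypothetical schema
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table_name}")
        (count,) = cur.fetchone()
        cur.close()
        return count
    finally:
        conn.close()


if __name__ == "__main__":
    print(table_row_count("STG_CUSTOMERS"))  # hypothetical table name
```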

Posted 2 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Responsibilities: Develop and maintain data pipelines using Snowflake and Talend. Collaborate with data architects, data engineers, and business analysts to understand data requirements and translate them into technical solutions. Design and implement data transformation logic using SQL, Python, or other programming languages. Optimize data pipelines for performance and scalability. Implement data security and governance best practices. Troubleshoot and resolve issues related to data pipelines and processing. Document data pipelines, processes, and best practices.

Qualifications: Strong experience with Snowflake's cloud data platform, including data modeling, performance tuning, and security. Experience with SQL and other programming languages for data manipulation and analysis. Familiarity with cloud computing concepts and services, particularly Azure. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Knowledge of Talend will be an added advantage.

Experience: Up to 5 years of experience in data engineering, ETL development, or a related field, including at least 2-3 years of experience in Snowflake. Experience working with data warehousing and data integration projects. Talend, Snowflake, or Azure certification will be an added advantage.
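The sketch below illustrates the kind of data transformation logic in Python the posting mentions: standardising and de-duplicating records before they are loaded into Snowflake. The column names are invented for the example; a real pipeline would read from and write to actual stages or tables.

```python
# A small illustration of transformation logic in Python: normalise and
# de-duplicate records before loading them to a warehouse table.
import pandas as pd

raw = pd.DataFrame(
    {
        "customer_id": [101, 101, 102],
        "email": ["A@Example.com ", "a@example.com", "b@example.com"],
        "signup_date": ["2024-01-05", "2024-01-05", "2024-02-10"],
    }
)

cleaned = (
    raw.assign(
        email=lambda df: df["email"].str.strip().str.lower(),      # normalise emails
        signup_date=lambda df: pd.to_datetime(df["signup_date"]),  # enforce a date type
    )
    .drop_duplicates(subset=["customer_id", "email"])               # remove exact repeats
    .reset_index(drop=True)
)

print(cleaned)
# From here the frame would typically be written to a Snowflake stage or table,
# for example via the connector's write APIs or a COPY INTO from cloud storage.
```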

Posted 2 days ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior. We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills: Key roles played in multiple large global transformation programs on business process management. Experience in database querying using SQL. Experience building and integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Expertise in complex data management or application integration solutions and their deployment in the areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional). Experience with cloud data tools (Microsoft Azure, Amazon S3, or data lakes). Knowledge of cloud infrastructure and of Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of Autosys scheduling. Good experience in database technologies. Good knowledge of Unix systems.

Responsibilities: Work as a team member contributing to the various technical streams of data integration projects. Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Complete assigned tasks on time and report status regularly to the lead. Help build a quality culture. Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

Qualification: BE/BTech/MCA (must) with 3-7 years of industry experience. Experience in Talend jobs, joblets, and custom components. Knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Experience with at least 3-4 clients on short-duration projects of 6-8+ months, or with at least 2 clients on projects lasting 1-2 years or more. Commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
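As a small, hedged illustration of the data profiling and reconciliation skills listed above, the Python sketch below computes per-column null and distinct counts on a sample frame. The sample data and column names are invented.

```python
# A minimal data-profiling sketch: per-column null counts, distinct counts and
# an example value, computed before a table is loaded or reconciled.
import pandas as pd


def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return a simple per-column profile of a data set."""
    return pd.DataFrame(
        {
            "nulls": df.isna().sum(),
            "distinct": df.nunique(dropna=True),
            "example": df.apply(lambda s: s.dropna().iloc[0] if s.notna().any() else None),
        }
    )


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"account_id": [1, 2, None], "balance": [100.0, None, 50.0], "country": ["IN", "IN", "US"]}
    )
    print(profile(sample))
```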

Posted 2 days ago

Apply

10.0 years

0 Lacs

Telangana

On-site

About Chubb: Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India: At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Reporting to the VP, COG ECM Enterprise Forms Portfolio Delivery Manager, this role will be responsible for managing and supporting implementation of a new document solution for identified applications within the CCM landscape in APAC. OpenText xPression and Duck Creek have been the corporate document generation tools of choice within Chubb, but xPression is going end of life and will be unsupported from 2025. A new Customer Communications Management (CCM) platform, Quadient Inspire, has been selected by a global working group to replace xPression, and this role covers implementation of the new tool (including migration of existing forms/templates from xPression where applicable). Apart from migrating from xPression, there are multiple existing applications to be replaced with Quadient Inspire. The role is based in Hyderabad, India, with some travel to other Chubb offices. Although there are no direct line management responsibilities within this role, the successful applicant will be responsible for task management of business analysts and an onshore/offshore development team. The role will require the ability to manage multiple project/enhancement streams with a variety of levels of technical/functional scope and across a number of different technologies.

In this role, you will: Lead the design and development of comprehensive data engineering frameworks and patterns. Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across COG (Chubb Overseas General). Drive innovation and build highly scalable real-time data pipelines and data platforms to support the business needs. Act as mentor and lead for a data engineering organization that is business-focused, proactive, and resilient. Promote data governance and master/reference data management as a strategic discipline. Implement strategies to monitor the effectiveness of data management. Be an engineering leader, coach data engineers, and be an active member of the data leadership team. Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform. Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements. Collaborate with data modelers to create data models (conceptual, logical, and physical). Architect metadata management processes to ensure data lineage, data definitions, and ownership are well documented and understood. Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals. Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders. Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability.

Qualifications: Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred. Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus on P&C insurance domains preferred. Proven track record of successful implementation of data architecture within large-scale transformation programs or projects. Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices. Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos DB), data transformation (Informatica IICS, Databricks), change data capture, and data streaming (Apache Kafka, Apache Flink) technologies. Proven expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache NiFi). Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database). Expertise in ensuring data security patterns (e.g., tokenization, encryption, obfuscation). Knowledge of insurance policy operations, regulations, and compliance frameworks specific to consumer lines. Familiarity with Agile methodologies and experience working in Agile project environments. Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture.

Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience, along with a start-up-like culture, empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence. A great place to work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026. Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results. Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter. Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits: Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include: Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits and car lease that help employees optimally plan their finances. Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like education reimbursement programs, certification programs and access to global learning programs. Health and welfare benefits: We care about our employees’ well-being in and out of work and have benefits like an Employee Assistance Program (EAP), yearly free health campaigns and comprehensive insurance benefits.

Application Process: Our recruitment process is designed to be transparent and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable). Step 4: Final interaction with Chubb leadership.

Join Us: With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply now: Chubb External Careers.
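Since the qualifications above call out data security patterns such as tokenization and obfuscation, here is a toy Python sketch of deterministic tokenization with a keyed hash. The key and field names are placeholders; a real design would pull the secret from a vault or KMS.

```python
# A toy illustration of a tokenization/obfuscation pattern: replace a sensitive
# field with a keyed, deterministic token so records can still be joined
# without exposing the raw value.
import hashlib
import hmac

SECRET_KEY = b"demo-only-key"  # hypothetical; never hard-code real keys


def tokenize(value: str) -> str:
    """Deterministically tokenize a value with an HMAC so joins still work."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()


def mask_policy_record(record: dict) -> dict:
    """Return a copy of the record with PII fields replaced by tokens."""
    masked = dict(record)
    for field in ("national_id", "email"):  # hypothetical PII fields
        if masked.get(field) is not None:
            masked[field] = tokenize(str(masked[field]))
    return masked


if __name__ == "__main__":
    print(mask_policy_record({"policy_no": "P-1001", "email": "holder@example.com", "national_id": "XXXX1234"}))
```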

Posted 2 days ago

Apply

10.0 years

5 - 6 Lacs

Hyderābād

On-site

Job Information: Date Opened 07/29/2025. Job Type: Full time. Industry: IT Services. City: Hyderabad. State/Province: Telangana. Country: India. Zip/Postal Code: 500081.

About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Title: Lead Cloud Data Engineer / Technical Architect. Experience: 10+ years. Location: Hyderabad.

Job Summary: We are seeking a highly skilled and experienced Cloud Data Engineer with a strong foundation in AWS, data warehousing, and application migration. The ideal candidate will be responsible for designing and maintaining cloud-based data solutions, leading teams, collaborating with clients, and ensuring smooth migration of on-premises applications to the cloud.

Key Responsibilities: Engage directly with clients to understand requirements, provide solution design, and drive successful project delivery. Lead cloud migration initiatives, specifically moving on-premises applications and databases to AWS cloud platforms. Design, develop, and maintain scalable, reliable, and secure data applications in a cloud environment. Lead and mentor a team of engineers; oversee task distribution, progress tracking, and issue resolution. Develop, optimize, and troubleshoot complex SQL queries and stored procedures. Design and implement robust ETL pipelines using tools such as Talend, Informatica, or DataStage. Ensure optimal usage and performance of Amazon Redshift and implement performance tuning strategies. Collaborate across teams to implement best practices in cloud architecture and data management.

Required Skills and Qualifications: Strong hands-on experience with the AWS ecosystem, including services related to storage, compute, and data analytics. In-depth knowledge of data warehouse architecture and best practices. Proven experience in on-premises-to-cloud migration projects. Expertise in at least one ETL tool: Talend, Informatica, or DataStage. Strong command of SQL and stored procedures. Practical knowledge and usage of Amazon Redshift. Demonstrated experience in leading teams and managing project deliverables. Strong understanding of performance tuning for data pipelines and databases.

Good to Have: Working knowledge or hands-on experience with Snowflake.

Educational Qualification: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

Benefits: As per company standards.
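As a hedged sketch of the Redshift loading pattern implied above, the snippet below issues a COPY from S3 into a staging table over a standard PostgreSQL connection. The cluster endpoint, credentials, IAM role, bucket, and table names are placeholders, not details from the posting.

```python
# A minimal sketch of bulk-loading files staged in S3 into Amazon Redshift
# with a COPY command, using psycopg2 against the cluster endpoint.
import os

import psycopg2

COPY_SQL = """
    COPY analytics.stg_orders
    FROM 's3://example-bucket/exports/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;
"""


def load_stage():
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=5439,
        dbname="analytics",
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    try:
        with conn, conn.cursor() as cur:
            cur.execute(COPY_SQL)                          # Redshift pulls the files from S3
            cur.execute("ANALYZE analytics.stg_orders;")   # refresh planner statistics
    finally:
        conn.close()


if __name__ == "__main__":
    load_stage()
```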

Posted 2 days ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Hyderābād

On-site

Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.

Job Title: Senior Software Engineer - Data Analyst. Position: Senior Software Engineer - Data Analyst. Experience: 3 to 6 years. Main location: India, Telangana, Hyderabad. Position ID: J0525-1616. Shift: General shift (5 days WFO for the initial 8 weeks). Employment Type: Full time.

Your future duties and responsibilities: Design, develop, and optimize complex SQL queries for data extraction, transformation, and loading (ETL). Work with Teradata databases to perform high-volume data analysis and support enterprise-level reporting needs. Understand business and technical requirements to create and manage Source to Target Mapping (STM) documentation. Collaborate with business analysts and domain SMEs to map banking-specific data such as transactions, accounts, customers, products, and regulatory data. Analyze large data sets to identify trends, data quality issues, and actionable insights. Participate in data migration, data lineage, and reconciliation processes. Ensure data governance, quality, and security protocols are followed. Support testing and validation efforts during system upgrades or new feature implementations.

Required qualifications to be successful in this role: Advanced SQL - joins, subqueries, window functions, performance tuning. Teradata - query optimization, utilities (e.g., BTEQ, FastLoad, MultiLoad), DDL/DML. Experience with ETL tools (e.g., Informatica, Talend, or custom SQL-based ETL pipelines). Hands-on experience preparing STM (Source to Target Mapping) documents. Familiarity with data modeling and data warehouse concepts (star/snowflake schema). Proficiency in Excel and/or BI tools (Power BI, Tableau, etc.) for data visualization and analysis.

Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team - one of the largest IT and business consulting services firms in the world.
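To illustrate the advanced SQL this role leans on, here is a short sketch of a window-function query plus a quick source-to-target mapping check, expressed as SQL strings in Python. The schema, table, and column names are invented for the example.

```python
# A sketch of two typical queries for this kind of role: keep the latest record
# per account with ROW_NUMBER(), and find keys present in the source but
# missing in the target (a quick STM-driven check).
LATEST_PER_ACCOUNT = """
    SELECT account_id, txn_amount, txn_ts
    FROM (
        SELECT account_id,
               txn_amount,
               txn_ts,
               ROW_NUMBER() OVER (
                   PARTITION BY account_id
                   ORDER BY txn_ts DESC
               ) AS rn
        FROM stg.transactions
    ) t
    WHERE rn = 1
"""


def stm_check_sql(source: str, target: str, key: str) -> str:
    """Build a row-level check: keys present in the source but missing in the target."""
    return f"""
        SELECT s.{key}
        FROM {source} s
        LEFT JOIN {target} t ON s.{key} = t.{key}
        WHERE t.{key} IS NULL
    """


if __name__ == "__main__":
    print(LATEST_PER_ACCOUNT)
    print(stm_check_sql("stg.transactions", "dw.fact_transactions", "txn_id"))
```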

Posted 2 days ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior. We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills: Key roles played in multiple large global transformation programs on business process management. Experience in database querying using SQL. Experience building and integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Expertise in complex data management or application integration solutions and their deployment in the areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional). Experience with cloud data tools (Microsoft Azure, Amazon S3, or data lakes). Knowledge of cloud infrastructure and of Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of Autosys scheduling. Good experience in database technologies. Good knowledge of Unix systems.

Responsibilities: Work as a team member contributing to the various technical streams of data integration projects. Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Complete assigned tasks on time and report status regularly to the lead. Help build a quality culture. Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

Qualification: BE/BTech/MCA (must) with 3-7 years of industry experience. Experience in Talend jobs, joblets, and custom components. Knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Experience with at least 3-4 clients on short-duration projects of 6-8+ months, or with at least 2 clients on projects lasting 1-2 years or more. Commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 days ago

Apply

6.0 years

15 - 18 Lacs

Indore

On-site

Location: Indore. Experience: 6+ years. Work Type: Hybrid. Notice Period: 0-30 days joiners.

We are hiring for a Digital Transformation Consulting firm that specializes in the advisory and implementation of AI, Automation, and Analytics strategies for healthcare providers. The company is headquartered in NJ, USA, and its India office is in Indore, MP.

Job Description: We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive the data initiatives in the company. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience in database technologies, data modeling, ETL processes, and cloud-based data platforms.

Key Responsibilities: Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions. Define best practices for data architecture, modeling, and governance. Oversee data integration, transformation, and migration strategies. Ensure high availability, performance tuning, and optimization of databases and ETL pipelines. Implement data security, compliance, and backup strategies.

Required Skills & Qualifications: 6+ years of experience in database and data engineering roles. Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS). Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery). Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica). Experience with cloud data platforms (AWS, Azure, GCP). Proficiency in programming/scripting languages (Python, SQL, shell scripting). Strong problem-solving, leadership, and communication skills.

Preferred Skills (Good to Have): Experience with big data technologies (Hadoop, Spark, Kafka). Knowledge of real-time data processing. Exposure to AI/ML technologies and working with ML algorithms.

Job Types: Full-time, Permanent. Pay: ₹1,500,000.00 - ₹1,800,000.00 per year. Schedule: Day shift.

Application Question(s): We must fill this position urgently. Can you start immediately? Have you held a lead role in the past?

Experience: Extract, Transform, Load (ETL): 6 years (Required). Python: 5 years (Required). Big data technologies (Hadoop, Spark, Kafka): 6 years (Required). Snowflake: 6 years (Required). Data warehouse: 6 years (Required). Location: Indore, Madhya Pradesh (Required). Work Location: In person.
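Since the role names Apache Airflow among the expected ETL frameworks, the sketch below shows a minimal Airflow DAG wiring an extract-transform-load chain. It assumes Airflow 2.4 or later, and the DAG id and task bodies are placeholders rather than anything specified in the posting.

```python
# A minimal Apache Airflow sketch of an extract -> transform -> load
# orchestration (assumes Airflow 2.4+; task bodies and DAG id are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull incremental rows from the source system")


def transform():
    print("apply cleansing and business rules")


def load():
    print("write curated rows to the warehouse")


with DAG(
    dag_id="healthcare_daily_etl",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```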

Posted 2 days ago

Apply

0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

Remote

Job Description. Position: Data Engineer Intern. Location: Remote. Duration: 2-6 months. Company: Collegepur. Type: Unpaid Internship.

About the Internship: We are seeking a skilled Data Engineer to join our team, with a focus on cloud data storage, ETL processes, and database/data warehouse management. If you are passionate about building robust data solutions and enabling data-driven decision-making, we want to hear from you!

Key Responsibilities: 1. Design, develop, and maintain scalable data pipelines to process large datasets from multiple sources, both structured and unstructured. 2. Implement and optimize ETL (Extract, Transform, Load) processes to integrate, clean, and transform data for analytical use. 3. Manage and enhance cloud-based data storage solutions, including data lakes and data warehouses, using platforms such as AWS, Azure, or Google Cloud. 4. Ensure data security, privacy, and compliance with relevant standards and regulations. 5. Collaborate with data scientists, analysts, and software engineers to support data-driven projects and business processes. 6. Monitor and troubleshoot data pipelines to ensure efficient, real-time, and batch data processing. 7. Maintain comprehensive documentation and data mapping across multiple systems.

Requirements: 1. Proven experience with cloud platforms (AWS, Azure, or Google Cloud). 2. Strong knowledge of database systems, data warehousing, and data modeling. 3. Proficiency in programming languages such as Python, Java, or Scala. 4. Experience with ETL tools and frameworks (e.g., Airflow, Informatica, Talend). 5. Familiarity with data security, compliance, and governance practices. 6. Excellent analytical, problem-solving, and communication skills. 7. Bachelor’s degree in Computer Science, Information Technology, or related field.
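For a beginner-level picture of the extract-transform-load flow described above, here is a short Python sketch using pandas and SQLite so it runs locally. The file contents, table, and column names are made up for the example.

```python
# A simple ETL sketch: extract a raw feed, apply basic transformations, and
# load the result into a local table standing in for a cloud data warehouse.
import sqlite3
from io import StringIO

import pandas as pd

RAW_CSV = StringIO(
    "user_id,signup_date,country\n1,2024-01-02,in\n2,2024-01-03,us\n2,2024-01-03,us\n"
)


def run_pipeline() -> int:
    # Extract: read the raw feed (an in-memory CSV standing in for S3/GCS).
    df = pd.read_csv(RAW_CSV)

    # Transform: enforce types, normalise codes, drop duplicate rows.
    df["signup_date"] = pd.to_datetime(df["signup_date"])
    df["country"] = df["country"].str.upper()
    df = df.drop_duplicates()

    # Load: write to a local table (SQLite standing in for a cloud DWH).
    with sqlite3.connect("demo_warehouse.db") as conn:
        df.to_sql("dim_users", conn, if_exists="replace", index=False)
    return len(df)


if __name__ == "__main__":
    print(f"loaded {run_pipeline()} rows")
```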

Posted 2 days ago

Apply

3.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – SSIS – Senior. We’re looking for Informatica or SSIS engineers with a cloud background (AWS, Azure).

Primary skills: Key roles played in multiple large global transformation programs on business process management. Experience in database querying using SQL. Experience building and integrating data into a data warehouse. Experience in data profiling and reconciliation. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Expertise in complex data management or application integration solutions and their deployment in the areas of data migration, data integration, application integration, or data quality. Experience in data processing, orchestration, parallelization, transformations, and ETL fundamentals. Leverages a variety of programming languages and data crawling/processing tools to ensure data reliability, quality, and efficiency (optional). Experience with cloud data tools (Microsoft Azure, Amazon S3, or data lakes). Knowledge of cloud infrastructure and of Talend Cloud is an added advantage. Knowledge of data modelling principles. Knowledge of Autosys scheduling. Good experience in database technologies. Good knowledge of Unix systems.

Responsibilities: Work as a team member contributing to the various technical streams of data integration projects. Provide product- and design-level technical best practices. Interface and communicate with the onsite coordinators. Complete assigned tasks on time and report status regularly to the lead. Help build a quality culture. Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

Qualification: BE/BTech/MCA (must) with 3-7 years of industry experience. Experience in Talend jobs, joblets, and custom components. Knowledge of error handling and performance tuning in Talend. Experience in big data technologies such as Sqoop, Impala, Hive, YARN, Spark, etc. Informatica PowerCenter/IBM DataStage/SSIS development. Strong proficiency in SQL/PLSQL. Good experience in performance tuning ETL workflows and suggesting improvements. Experience with at least 3-4 clients on short-duration projects of 6-8+ months, or with at least 2 clients on projects lasting 1-2 years or more. Commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 days ago

Apply

15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction: Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best: running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators, always willing to help and be helped, as you apply passion to work that will positively impact the world around us.

Your Role and Responsibilities: As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success with both your career and solving clients’ business challenges, this role is for you. To help achieve this win-win outcome, a ‘day in the life’ of this opportunity may include, but not be limited to: Solving client challenges effectively: understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency. Agile planning and execution: creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations. Technical solution workshops: conducting and participating in technical solution workshops. Building effective relationships: developing successful relationships at all levels, from engineers to CxOs, with experience of navigating challenging debate to reach healthy resolutions. Self-motivated problem solver: demonstrating a natural bias towards self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions. Collaboration and communication: strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Preferred Education: Bachelor's degree.

Required Technical and Professional Expertise: In-depth knowledge of the IBM Data & AI portfolio. 15+ years of experience in software services. 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms. Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines. 10+ years' experience with ETL and database technologies. Experience in architectural planning and implementation for the upgrade/migration of these specific products. Experience in designing and implementing Data Quality solutions. Experience with installation and administration of these products. Excellent understanding of cloud concepts and infrastructure. Excellent verbal and written communication skills are essential.

Preferred Technical and Professional Experience: Experience with any of the DataStage, Informatica, SAS, or Talend products. Experience with any of IKC, IGC, Axon. Experience with programming languages like Java/Python. Experience with the AWS, Azure, Google, or IBM cloud platforms. Experience with Red Hat OpenShift. Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA.

Posted 2 days ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site

Job Title: Data Engineer (4+ Years Experience). Location: Pan India. Job Type: Full-Time. Experience: 4+ years. Notice Period: Immediate to 30 days preferred.

Job Summary: We are looking for a skilled and motivated Data Engineer with over 4 years of experience in building and maintaining scalable data pipelines. The ideal candidate will have strong expertise in AWS Redshift and Python/PySpark, with exposure to AWS Glue, Lambda, and ETL tools being a plus. You will play a key role in designing robust data solutions to support analytical and operational needs across the organization.

Key Responsibilities: Design, develop, and optimize large-scale ETL/ELT data pipelines using PySpark or Python. Implement and manage data models and workflows in AWS Redshift. Work closely with analysts, data scientists, and stakeholders to understand data requirements and deliver reliable solutions. Perform data validation, cleansing, and transformation to ensure high data quality. Build and maintain automation scripts and jobs using Lambda and Glue (if applicable). Ingest, transform, and manage data from various sources into cloud-based data lakes (e.g., S3). Participate in data architecture and platform design discussions. Monitor pipeline performance, troubleshoot issues, and ensure data reliability. Document data workflows, processes, and infrastructure components.

Required Skills: 4+ years of hands-on experience as a Data Engineer. Strong proficiency in AWS Redshift, including schema design, performance tuning, and SQL development. Expertise in Python and PySpark for data manipulation and pipeline development. Experience working with structured and semi-structured data (JSON, Parquet, etc.). Deep knowledge of data warehouse design principles, including star/snowflake schemas and dimensional modeling.

Good to Have: Working knowledge of AWS Glue and building serverless ETL pipelines. Experience with AWS Lambda for lightweight processing and orchestration. Exposure to ETL tools like Informatica, Talend, or Apache NiFi. Familiarity with workflow orchestrators (e.g., Airflow, Step Functions). Knowledge of DevOps practices, version control (Git), and CI/CD pipelines.

Preferred Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field. AWS certifications (e.g., AWS Certified Data Analytics, Developer Associate) are a plus.
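As a minimal PySpark sketch of the pipeline work described above, the snippet below applies a small transformation and writes the result as Parquet. The local output path stands in for an S3 location, and the column names are illustrative only.

```python
# A minimal PySpark transformation written out as Parquet; a local path keeps
# the sketch runnable where an s3:// location would be used in practice.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_demo").getOrCreate()

orders = spark.createDataFrame(
    [(1, "2024-01-05", 120.0), (2, "2024-01-06", 80.5)],
    ["order_id", "order_date", "amount"],
)

curated = (
    orders
    .withColumn("order_date", F.to_date("order_date"))       # enforce a date type
    .withColumn("amount_usd", F.round(F.col("amount"), 2))    # derived column
    .drop("amount")
)

# In AWS this path would typically be an S3 prefix consumed by Redshift
# Spectrum or a COPY command.
curated.write.mode("overwrite").parquet("/tmp/curated_orders")

spark.stop()
```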

Posted 2 days ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity. Develop automated test scripts using Python or R for data validation and reconciliation. Perform source-to-target data verification, transformation logic testing, and regression testing. Collaborate with data engineers and analysts to understand business requirements and data flows. Identify data anomalies and work with development teams to resolve issues. Maintain test documentation, including test cases, test results, and defect logs. Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications: Strong experience in ETL testing across various data sources and targets. Proficiency in Python or R for scripting and automation. Solid understanding of SQL and relational databases. Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS). Experience with test management tools (e.g., JIRA, TestRail). Knowledge of data profiling, data quality frameworks, and validation techniques. Excellent analytical and communication skills.
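A small pytest-style sketch of the automated ETL checks this listing describes appears below: one test for source-to-target row counts and one for a transformation rule. The inline frames stand in for real source and target extracts, and the column names are invented.

```python
# Automated ETL validation sketch: run with `pytest` to check row counts and
# a transformation rule between a source and a target extract.
import pandas as pd


def load_source() -> pd.DataFrame:
    return pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.0]})


def load_target() -> pd.DataFrame:
    # The target is expected to carry amounts converted to cents.
    return pd.DataFrame({"id": [1, 2], "amount_cents": [1000, 2000]})


def test_row_counts_match():
    assert len(load_source()) == len(load_target())


def test_amount_transformation():
    merged = load_source().merge(load_target(), on="id")
    assert (merged["amount"] * 100 == merged["amount_cents"]).all()
```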

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

delhi

On-site

You will drive growth by identifying, developing, and closing new business opportunities while building relationships with potential clients. Understanding their needs will enable you to effectively showcase how the company's IT solutions can add value. As a Business Development Analyst, you will be instrumental in driving the growth and success of the company through strategic analysis and development initiatives. Your responsibilities will include generating new business leads and establishing strong relationships with clients. You will conduct market research and analysis to identify business opportunities, utilizing data analytics tools such as Tableau and SQL to extract insights and trends. Developing financial models and forecasts to support strategic decision-making will be a key aspect of your role. Collaborating with cross-functional teams to implement business development strategies, creating business proposals and presentations, and monitoring industry trends and competitor activities to identify potential risks and opportunities will also be part of your duties.

The ideal candidate will have proficiency in project management methodologies and strong analytical skills, with experience in data analysis using tools such as VBA, Python, ETL, and Talend. Familiarity with business intelligence tools like Tableau for data visualization, and the ability to conduct market research and analyze data to drive business growth, are essential. Knowledge of SQL for database querying and manipulation, and experience in business analytics, forecasting, and trend analysis, are also desired. Familiarity with monitoring industry trends for strategic insights is a plus. This is a full-time, permanent position; the work location is in person.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

indore, madhya pradesh

On-site

You are a Data Architect / Data Engineer with expertise in Qlik Sense Cloud, responsible for designing the organization's data infrastructure, building and maintaining data pipelines, and managing ETL processes for seamless data integration. Your role involves integrating business intelligence tools, supporting data visualization and reporting requirements, and ensuring high-quality outputs from the data analyst teams. Your key responsibilities include designing and implementing data architectures, developing scalable data pipelines and ETL workflows, ensuring data security and governance, managing reporting standards, integrating Qlik Sense Cloud, managing data models for business dashboards, collaborating with stakeholders, optimizing data systems, and maintaining documentation of data architectures and best practices.

To excel in this role, you should have deep knowledge of database technologies and big data platforms, experience with ETL tools and cloud data platforms, expertise in Qlik Sense Cloud, proficiency in programming languages, strong collaboration and leadership skills, and the ability to troubleshoot ETL issues and optimize data pipeline performance. If you are passionate about data architecture, data modeling, and delivering actionable insights through scalable reporting solutions, this role at Clinisupplies in Indore is the perfect opportunity for you.

Posted 3 days ago

Apply

0.0 - 6.0 years

15 - 18 Lacs

Indore, Madhya Pradesh

On-site

Location: Indore
Experience: 6+ years
Work Type: Hybrid
Notice Period: 0-30 days (immediate joiners preferred)

We are hiring for a Digital Transformation Consulting firm that specializes in the advisory and implementation of AI, Automation, and Analytics strategies for healthcare providers. The company is headquartered in NJ, USA, and its India office is in Indore, MP.

Job Description: We are seeking a highly skilled Tech Lead with expertise in database management, data warehousing, and ETL pipelines to drive the company's data initiatives. The ideal candidate will lead a team of developers, architects, and data engineers to design, develop, and optimize data solutions. This role requires hands-on experience with database technologies, data modeling, ETL processes, and cloud-based data platforms.

Key Responsibilities:
- Lead the design, development, and maintenance of scalable database, data warehouse, and ETL solutions.
- Define best practices for data architecture, modeling, and governance.
- Oversee data integration, transformation, and migration strategies.
- Ensure high availability, performance tuning, and optimization of databases and ETL pipelines.
- Implement data security, compliance, and backup strategies.

Required Skills & Qualifications:
- 6+ years of experience in database and data engineering roles.
- Strong expertise in SQL, NoSQL, and relational database management systems (RDBMS).
- Hands-on experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery).
- Deep understanding of ETL tools and frameworks (e.g., Apache Airflow, Talend, Informatica); a minimal Airflow sketch follows this listing.
- Experience with cloud data platforms (AWS, Azure, GCP).
- Proficiency in programming/scripting languages (Python, SQL, shell scripting).
- Strong problem-solving, leadership, and communication skills.

Preferred Skills (Good to Have):
- Experience with big data technologies (Hadoop, Spark, Kafka).
- Knowledge of real-time data processing.
- Exposure to AI/ML technologies and experience working with ML algorithms.

Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹1,800,000.00 per year
Schedule: Day shift

Application Question(s):
- We must fill this position urgently. Can you start immediately?
- Have you held a lead role in the past?

Experience:
- Extract, Transform, Load (ETL): 6 years (Required)
- Python: 5 years (Required)
- Big data technologies (Hadoop, Spark, Kafka): 6 years (Required)
- Snowflake: 6 years (Required)
- Data warehouse: 6 years (Required)

Location: Indore, Madhya Pradesh (Required)
Work Location: In person
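
To make the orchestration requirement above concrete, here is a minimal Apache Airflow DAG sketch with extract, transform, and load steps. The DAG id, task logic, and the conversion rule are assumptions made up for illustration; a real pipeline would call the actual source systems and warehouse.

    # Minimal Airflow 2.x DAG sketch: extract -> transform -> load, run daily.
    # Task bodies are placeholders standing in for real source/warehouse calls.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Pull rows from the source system (placeholder data).
        return [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.0}]

    def transform(**context):
        rows = context["ti"].xcom_pull(task_ids="extract")
        # Example transformation rule: add a derived column.
        return [{**r, "amount_usd": round(r["amount"] / 83.0, 2)} for r in rows]

    def load(**context):
        rows = context["ti"].xcom_pull(task_ids="transform")
        print(f"Would load {len(rows)} rows into the warehouse")  # placeholder

    with DAG(
        dag_id="nightly_claims_etl",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> transform_task >> load_task

The same structure maps onto Talend or Informatica jobs; the scheduler and dependency chain change, but the extract/transform/load boundaries do not.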

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

The Lead, Software Engineer role at Mastercard plays a crucial part in the data unification process across different data assets, creating a unified view of data from multiple sources. The position focuses on driving insights from available data sets and supporting the development of new data-driven cyber products, services, and actionable insights. The Lead, Software Engineer will collaborate with teams such as Product Management, Data Science, Platform Strategy, and Technology to understand data needs and requirements and to deliver data solutions that bring business value.

Key responsibilities include performing data ingestion, aggregation, and processing to derive relevant insights; manipulating and analyzing complex data from various sources; identifying innovative ideas, delivering proofs of concept and prototypes, and proposing new products and enhancements; integrating and unifying new data assets to enhance customer value; analyzing transaction and product data to generate actionable recommendations for business growth; and collecting feedback from clients, development, product, and sales teams to shape new solutions.

The ideal candidate should have a good understanding of streaming technologies such as Kafka and Spark Streaming, proficiency in programming languages such as Java, Scala, or Python, experience with an enterprise business intelligence or data platform, strong SQL and higher-level programming skills, knowledge of data mining and machine learning algorithms, and familiarity with data integration (ETL/ELT) tools such as Apache NiFi, Azure Data Factory, Pentaho, and Talend. They should also be able to work in a fast-paced, deadline-driven environment, collaborate effectively with cross-functional teams, and articulate solution requirements for different groups within the organization.

All employees working at or on behalf of Mastercard must adhere to the organization's security policies and practices, ensure the confidentiality and integrity of accessed information, report any suspected information security violations or breaches, and complete all mandatory security trainings in accordance with Mastercard's guidelines.

The Lead, Software Engineer role at Mastercard offers an exciting opportunity to contribute to innovative data-driven solutions that drive business growth and enhance the customer value proposition.
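
For the streaming side of this role, the following PySpark Structured Streaming sketch shows the general shape of ingesting a Kafka topic and aggregating it; it is a hedged illustration, not Mastercard's actual pipeline. The broker address, topic name, and message schema are invented for the example, and running it requires the Spark Kafka connector package on the classpath.

    # Sketch: consume JSON events from Kafka and aggregate per merchant.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("txn-stream-demo").getOrCreate()

    schema = StructType([
        StructField("merchant_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
           .option("subscribe", "transactions")               # placeholder topic
           .load())

    # Kafka delivers bytes; parse the value column as JSON using the schema above.
    events = (raw
              .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    per_merchant = events.groupBy("merchant_id").agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )

    query = (per_merchant.writeStream
             .outputMode("complete")
             .format("console")   # a real job would write to a table or topic
             .start())
    query.awaitTermination()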

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an Integration Solutions Developer, you will design, develop, and implement integration solutions using technologies and platforms such as API gateways, ESBs, iPaaS, and message queues. You will analyze business requirements and translate them into technical integration designs, and develop and maintain APIs and data transformation processes. Your role will involve configuring and managing integration platforms and tools, monitoring integration flows, troubleshooting issues, and implementing solutions.

Collaboration is key in this role: you will work closely with software developers, system administrators, and business stakeholders to understand integration needs and deliver effective solutions. Ensuring the security, reliability, and scalability of integration solutions will be a priority, along with documenting integration designs, specifications, and processes. Staying updated on the latest integration technologies and best practices is essential, as is participating in code reviews and providing technical guidance.

To be successful in this position, you should have 3-5 years of experience in software development or integration, with proven experience designing and implementing system integrations. Proficiency in one or more programming languages such as Java, Python, C#, or Node.js is required, along with a strong understanding of API concepts (REST, SOAP) and experience working with APIs. Experience with integration patterns and principles, data formats such as XML and JSON, databases, and querying languages (e.g., SQL) is necessary. Troubleshooting and debugging skills in complex integrated environments, as well as excellent communication and collaboration skills, are also important.

Preferred qualifications include experience with specific integration platforms or tools such as MuleSoft, Dell Boomi, Apache Camel, Talend, Informatica, Azure Integration Services, or AWS integration services. Familiarity with message queuing systems (e.g., Kafka, RabbitMQ, ActiveMQ), cloud computing platforms (e.g., AWS, Azure, GCP), containerization (e.g., Docker, Kubernetes), security best practices in integration, CI/CD pipelines for integration deployments, and industry standards or protocols relevant to the business is a plus.

Ideally, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience, and you should keep up with the technology landscape and continuously enhance your skills to deliver effective integration solutions.
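
As a small, hedged example of the point-to-point integration work described above, the sketch below pulls records from one REST API, reshapes the JSON, and posts it to a downstream service. Both endpoints, the field names, and the authentication scheme are hypothetical.

    # Illustrative REST-to-REST integration: extract, map fields, push downstream.
    import requests

    SOURCE_API = "https://source.example.com/api/orders"     # placeholder
    TARGET_API = "https://target.example.com/api/v1/orders"  # placeholder

    def sync_orders(api_key: str) -> int:
        resp = requests.get(
            SOURCE_API,
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=30,
        )
        resp.raise_for_status()

        pushed = 0
        for order in resp.json():
            # Map source fields onto the payload shape the target expects.
            payload = {
                "externalId": order["id"],
                "customer": order.get("customer_name"),
                "total": order.get("amount", 0),
                "currency": order.get("currency", "USD"),
            }
            out = requests.post(TARGET_API, json=payload, timeout=30)
            out.raise_for_status()
            pushed += 1
        return pushed

    if __name__ == "__main__":
        print(sync_orders(api_key="REPLACE_ME"), "orders synced")

Platforms such as MuleSoft or Boomi wrap the same pattern in connectors, error handling, and retries; the field mapping and contract validation remain the developer's job.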

Posted 3 days ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

YASH Technologies is a leading technology integrator that specializes in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. We pride ourselves on being a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in bringing real positive changes in an increasingly virtual world, driving us beyond generational gaps and future disruptions.

We are currently seeking SAP MDM Professionals with the following qualifications:
- 10+ years of experience
- Mix of directed and self-directed work
- Techno-functional experience
- Experience in SAP ECC and S/4 (cutovers, ongoing support, governance)
- Experience in multiple SAP modules (MM, PP, FI, WH/EWM, QM) and a strong understanding of SAP table structures and their purpose
- End-to-end data movement in SAP modules, focusing on MM, PP, FI, WH/EWM, QM
- Experience in manufacturing
- Experience in kitting would be nice but is not required
- Working data experience with BOMs, routings, inspection plans, work centers, inventory, production versions, PIR/SL
- Experience determining relevancy/extraction rules ahead of and during field mapping
- Experience with interfaces is a plus
- Lead data mapping workshops and conduct follow-up meetings with business and IT teams
- Coordinate with the business on specifying data cleansing rules to standardize existing source-system data
- Ownership of tasks in field mapping, build-out coordination with developer contacts, and data issue resolution with business/dev
- Exposure to common industry ETL tools, plus analysis experience for data quality and exception review, to support ETL development and execution
- Work with the Talend developer to develop load-ready files (LRFs); a small illustrative sketch follows this listing
- Experience working with functional specs, with responsibility for loading data and troubleshooting issues
- Coordinate and drive post-load validations with the business for data verification
- Functional unit test planning and execution
- Strong communication and people skills are ideal

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
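
The load-ready file step mentioned in the list above could look roughly like the pandas sketch below: apply the agreed field mapping and cleansing rules to a legacy extract and write a delimited LRF for the Talend load. The column names, mapping, and rules are assumptions for illustration only.

    # Hypothetical example: build a material-master load-ready file (LRF).
    import pandas as pd

    # Legacy extract (normally read from the source system or a staging table).
    legacy = pd.DataFrame({
        "MATNR_OLD": ["abc-100 ", "xyz-200"],
        "PLANT": ["1000", "1000"],
        "UOM": ["ea", "kg"],
    })

    # Field mapping agreed in the data mapping workshop (target field -> source field).
    field_map = {"MATNR": "MATNR_OLD", "WERKS": "PLANT", "MEINS": "UOM"}
    lrf = pd.DataFrame({target: legacy[source] for target, source in field_map.items()})

    # Example cleansing/standardization rules from the business.
    lrf["MATNR"] = lrf["MATNR"].str.strip().str.upper()
    lrf["MEINS"] = lrf["MEINS"].str.upper()

    # Pipe-delimited load-ready file handed to the Talend developer.
    lrf.to_csv("material_master_lrf.csv", index=False, sep="|")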

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

indore, madhya pradesh

On-site

We are looking for a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction to join our team. With 8-12 years of experience, you will specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, using tools such as SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role.

Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.

Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification. Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions, and you will document data workflows, technical designs, and operational procedures.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP) for data extraction. Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills such as analytical thinking, problem-solving, and clear communication are essential for this role.

Location: Bhilai, Indore
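
As a hedged sketch of the Snowflake load pattern this posting describes (bulk COPY from an external stage followed by an incremental merge), the example below uses the Snowflake Python connector. The account, credentials, stage, and table and column names are placeholders, not a specific client setup.

    # Sketch: COPY staged files into a raw table, then MERGE incrementally.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",       # placeholder
        user="ETL_USER",
        password="REPLACE_ME",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    COPY_SQL = """
    COPY INTO RAW.ORDERS_STG
    FROM @RAW.S3_ORDERS_STAGE
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE'
    """

    MERGE_SQL = """
    MERGE INTO CORE.ORDERS t
    USING RAW.ORDERS_STG s
      ON t.ORDER_ID = s.ORDER_ID
    WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.UPDATED_AT = s.UPDATED_AT
    WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
      VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT)
    """

    cur = conn.cursor()
    try:
        cur.execute(COPY_SQL)   # bulk-load new files from the external stage
        cur.execute(MERGE_SQL)  # incremental upsert into the curated table
    finally:
        cur.close()
        conn.close()

Snowpipe automates the COPY step for continuously arriving files, and a streams-and-tasks setup can replace the hand-written MERGE, but the staging-then-merge shape stays the same.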

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As an experienced Data Migration and Integration engineer, you will be part of the STS group in capital markets, focusing on migrating client data from legacy/third-party systems into FIS products. You will work alongside specialists in Adeptia, PL/SQL, and SSIS, with solid domain knowledge of Lending, Risk, and Treasury products. In this role, you will be responsible for completing Data Migration/Conversion/Integration projects within the specified time frame and to the highest quality. Your duties will include clear and timely communication, problem escalation, and resolution efforts to ensure project success. To excel in this position, you should hold a Bachelor's degree in Computer Science or a related field (B.Sc./B.C.A./B.Tech./B.E./M.C.A.) with a minimum of 8-10 years of overall experience, primarily in ETL. Your expertise should include 5-6 years of experience with ETL tools such as SSIS, Talend, and Adeptia, as well as proficiency in PL/SQL or T-SQL programming for Oracle/SQL Server databases. Additionally, you should possess strong knowledge of RDBMS concepts, OLTP system architecture, and analytical/reporting tools such as Power BI and Crystal Reports/SSRS. Experience with source code control (Git/Bitbucket), XML and JSON structures, Jenkins, job scheduling, SOAP, and REST, along with strong problem-solving skills, is also essential for this role. Strong written and verbal communication, interpersonal skills, and the ability to work independently in high-pressure situations are key attributes. Previous experience in the banking or financial industry is preferred, along with mentorship skills and hands-on experience in languages such as Python, Java, or C#. At FIS, you will have the opportunity to learn, grow, and have a significant impact on your career. We offer extensive health benefits, career mobility options, award-winning learning programs, a flexible home-office work model, and the chance to collaborate with global teams and clients. FIS is dedicated to safeguarding the privacy and security of personal information processed for client services. Our recruitment model primarily involves direct sourcing, and we do not accept resumes from agencies not on our preferred supplier list. If you are ready to advance the world of fintech and meet the criteria outlined above, we invite you to join us at FIS.
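
A minimal, assumption-laden sketch of the migration load pattern this role works with: batch-insert extracted legacy rows into a SQL Server staging table with pyodbc, then upsert into the product table with a T-SQL MERGE. The connection string, table, and column names are invented for the example.

    # Sketch: stage legacy rows, then MERGE them into the target product table.
    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=target-host;DATABASE=LENDING;UID=migrator;PWD=REPLACE_ME"  # placeholder
    )

    rows = [("L-1001", "ACTIVE", 250000.0), ("L-1002", "CLOSED", 0.0)]  # sample batch

    MERGE_SQL = """
    MERGE dbo.LOAN AS t
    USING dbo.LOAN_STG AS s ON t.LOAN_ID = s.LOAN_ID
    WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.BALANCE = s.BALANCE
    WHEN NOT MATCHED THEN INSERT (LOAN_ID, STATUS, BALANCE)
        VALUES (s.LOAN_ID, s.STATUS, s.BALANCE);
    """

    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True
        cur.executemany(
            "INSERT INTO dbo.LOAN_STG (LOAN_ID, STATUS, BALANCE) VALUES (?, ?, ?)",
            rows,
        )
        cur.execute(MERGE_SQL)
        conn.commit()

In an SSIS or Talend implementation the staging load and merge become package components, but the validation and reconciliation around them still tend to be SQL.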

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an Associate Technical Product Analyst - Global Data & Analytics Platform at McDonald's Corporation in Hyderabad, you will be an integral part of the Global Technology Enterprise Products & Platforms (EPP) Team. In this role, you will focus on data management and operations within the Global Data & Analytics Platform (GDAP) to support integrations with core corporate accounting, financial, and reporting applications. Your vision will align with McDonald's goal to be a people-led, product-centric, forward-thinking, and trusted technology partner. Your responsibilities will include supporting the Technical Product Management leadership on technical/IT delivery topics such as trade-offs in implementation approaches and tech stack selection. You will provide technical guidance for developers and squad members, manage the output of internal and external squads to ensure adherence to McDonald's standards, participate in roadmap and backlog preparation, and maintain technical process flows and solution architecture diagrams at the product level. Additionally, you will lead acceptance criteria creation, validate development work, support the hiring and development of engineers, and act as a technical developer as needed. To excel in this role, you should have a Bachelor's degree in computer science or engineering, along with at least 3 years of hands-on experience designing and implementing solutions using AWS Redshift and Talend. Experience with data warehouses is a plus, as is familiarity with accounting and financial solutions across different industries. Knowledge of Agile software development processes, collaborative problem-solving skills, and excellent communication abilities are essential for success in this position. Preferred qualifications include proficiency in SQL, data integration tools, and scripting languages, as well as a strong understanding of Talend, AWS Redshift, and other AWS services. Experience with RESTful APIs, microservices architecture, DevOps practices, and tools such as Jenkins and GitHub is highly desirable. Additionally, foundational expertise in security standards, cloud architecture, and Oracle cloud security will be advantageous. This full-time role based in Hyderabad, India, offers a hybrid work mode. If you are a detail-oriented individual with a passion for leveraging technology to drive business outcomes and are eager to contribute to a global team dedicated to innovation and excellence, we invite you to apply for the position of Associate Technical Product Analyst at McDonald's Corporation.
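
To ground the Redshift/Talend stack mentioned above, here is a hedged Python sketch of a common warehouse load step: COPY staged S3 files into a Redshift staging table, then refresh the curated table in one transaction. The cluster endpoint, IAM role, bucket path, and table names are placeholders, not McDonald's systems.

    # Sketch: COPY from S3 into staging, then upsert the curated table.
    import psycopg2

    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="REPLACE_ME",
    )

    COPY_SQL = """
    COPY staging.gl_entries
    FROM 's3://example-bucket/finance/gl/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
    FORMAT AS CSV
    IGNOREHEADER 1;
    """

    with conn:
        with conn.cursor() as cur:
            cur.execute(COPY_SQL)
            # Delete-then-insert keeps the curated table idempotent per batch.
            cur.execute(
                "DELETE FROM core.gl_entries USING staging.gl_entries s "
                "WHERE core.gl_entries.entry_id = s.entry_id"
            )
            cur.execute("INSERT INTO core.gl_entries SELECT * FROM staging.gl_entries")
    conn.close()

A Talend job typically generates or wraps the same COPY and upsert statements; keeping them idempotent is what makes reruns and backfills safe.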

Posted 3 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Overview

With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers is on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose, people, then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

The Sr. Analytics Consultant is a business intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with business and technical project stakeholders to gather business requirements and ensure successful delivery, and should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Sr. Analytics Consultant will also develop custom analytics solutions and reports to the specifications provided and support the solutions delivered. The candidate must be able to communicate ideas effectively, both verbally and in writing, at all levels of the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Responsibilities Include:
- Interact with business and technical project stakeholders to gather business requirements
- Deploy and configure the UKG Analytics and Data Hub products based on the design documents
- Develop and deliver best-practice visualizations and dashboards using BI tools such as Cognos, BIRT, or Power BI
- Put together a test plan, validate the deployed solution, and document the results
- Provide support during production cutover and, after go-live, act as the first level of support for any requests that come through from the customer or other consultants
- Analyse the customer’s data to spot trends and issues and present the results back to the customer

Qualifications:
- 5+ years’ experience designing and delivering Analytical/Business Intelligence solutions required
- Cognos, BIRT, Power BI, or other business intelligence toolset experience required
- ETL experience using Talend or other industry-standard ETL tools strongly preferred
- Advanced SQL proficiency is a plus
- Knowledge of Google Cloud Platform, Azure, or similar is desired, but not required
- Knowledge of Python is desired, but not required
- Willingness to learn new technologies and adapt quickly is required
- Strong interpersonal and problem-solving skills
- Flexibility to support customers in different time zones is required

Where We’re Going

UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio, designed to support customers of all sizes, industries, and geographies, that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation in the Application and Interview Process

For individuals with disabilities who need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com.

Posted 3 days ago

Apply


