
1529 Talend Jobs - Page 26

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

12.0 years

0 Lacs

Delhi

On-site

Job requisition ID: 80039 | Date: Jul 4, 2025 | Location: Delhi | Designation: Manager | Entity: Deloitte Touche Tohmatsu India LLP
Business title: Data Architect (dated 18-Feb-2025)

POSITION DESCRIPTION

WHAT YOU'LL DO
Define and design the future-state data architecture for risk data products. Partner with Technology, Data Stewards and various Product teams in an Agile work stream while meeting program goals and deadlines. Create user personas to support various business initiatives. Engage with line-of-business, operations, and project partners to gather process improvements. Lead the design and build of new models to efficiently deliver risk results to senior management. Evaluate data-related tools and technologies and recommend appropriate implementation patterns and standard methodologies to keep our data ecosystem modern. Collaborate with Enterprise Data Architects in establishing and adhering to enterprise standards, and perform POCs to ensure those standards are implemented. Provide technical expertise and mentorship to Data Engineers and Data Analysts on the data architecture. Develop and maintain processes, standards, policies, guidelines, and governance so that a consistent framework and set of standards is applied across the company. Create and maintain conceptual/logical data models to identify key business entities and their relationships. Work with business and IT teams to understand data requirements. Maintain a data dictionary consisting of table and column definitions (see the sketch after this posting). Review data models with both technical and business audiences.

YOU'RE GOOD AT
Designing, documenting and training the team on the overall processes and process flows for the data architecture. Resolving technical challenges in critical situations that require immediate resolution. Developing relationships with external stakeholders to maintain awareness of data and security issues and trends. Reviewing work from other tech team members and providing feedback for growth. Implementing data security policies that align with governance objectives and regulatory requirements.

EXPERIENCE & QUALIFICATIONS
Bachelor's degree or equivalent combination of education and experience; a bachelor's degree in information science, data management, computer science or a related field is preferred.
- 12+ years of IT experience with a major focus on data warehouse/database projects
- Expertise in cloud databases such as Snowflake/Redshift, data catalogues, MDM, etc.
- Expertise in writing SQL and database procedures
- Proficient in data modelling: conceptual, logical, and physical
- Proficient in documenting all architecture-related work performed
- Hands-on experience with data storage, ETL/ELT and data analytics tools and technologies, e.g. Talend, dbt, Attunity, GoldenGate, Fivetran, APIs, Tableau, Power BI, Alteryx
- Experienced in data warehousing design/development and BI/analytical systems
- Experience working on projects using Agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs)
- Strong hands-on experience with data and analytics architecture, solution design, and engineering
- Experience with cloud big data technologies such as AWS, Azure, GCP and Snowflake
- Ability to review existing databases, data architecture and data models across multiple systems and propose architecture enhancements for cross-compatibility and target systems
- Excellent written, oral communication and presentation skills to present architecture, features, and solution recommendations

YOU'LL WORK WITH
Global functional portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, and functional area teams across levels; Global Data Portfolio Management and its teams (Enterprise Data Model, Data Catalog, Master Data Management); consulting and internal Data Portfolio teams.
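The data-dictionary duty mentioned above (table and column definitions) can be bootstrapped from database metadata. A minimal, hypothetical sketch in Python using the standard-library sqlite3 module; the schema, table names and columns are invented purely for illustration and are not part of the posting:

```python
import sqlite3

# Hypothetical schema used only to illustrate the idea.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE risk_position (position_id INTEGER PRIMARY KEY,
                                book TEXT NOT NULL,
                                notional REAL,
                                as_of_date TEXT);
    CREATE TABLE book_dim (book TEXT PRIMARY KEY, desk TEXT, region TEXT);
""")

def build_data_dictionary(conn):
    """Return (table, column, type, nullable, is_pk) rows from SQLite metadata."""
    rows = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
    for (table,) in tables:
        for cid, name, col_type, notnull, default, pk in conn.execute(
                f"PRAGMA table_info({table})"):
            rows.append((table, name, col_type,
                         "NO" if notnull else "YES", bool(pk)))
    return rows

for entry in build_data_dictionary(conn):
    print(entry)
```

In a real engagement the same idea would be pointed at the warehouse's information schema and the output exported to whatever catalogue tool the team uses.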

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru

On-site

Key Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you.

Technical Requirements: Primary skills: Technology - Data Management - Data Integration - Talend.

Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies. Knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical and debugging skills.

Preferred Skills: Technology -> Data Management - Data Integration -> Talend

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Desired Competencies (Technical/Behavioral Competency/Skills)

Must-Have: ETL workflows using Talend/Informatica.

Good to Have: B.Tech / M.Tech / MCA or equivalent educational qualification. 5-10 years of experience in the IT industry, with the ability to analyze data to identify trends, patterns, and insights that inform business decisions.

Responsibilities of / Expectations from the Role:
1. Design, develop, and deploy ETL workflows using Talend/Informatica to extract, transform, and load data from various sources.
2. Integrate data from different databases, APIs, flat files, and other data sources into the company's data warehouse.
3. Work with data architects to design scalable data pipelines that support current and future data needs.
4. Perform data quality checks, cleansing, and validation to ensure data accuracy and integrity (see the sketch below).
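As a rough illustration of the data-quality step in point 4 (not part of the posting itself), the sketch below validates a small batch of records before load; the field names and rules are hypothetical:

```python
from datetime import date

# Hypothetical incoming records; in practice these would come from a Talend/Informatica extract.
records = [
    {"customer_id": 101, "email": "a@example.com", "signup_date": "2024-05-01"},
    {"customer_id": 101, "email": "a@example.com", "signup_date": "2024-05-01"},  # duplicate
    {"customer_id": None, "email": "b@example.com", "signup_date": "2024-06-15"},  # missing key
    {"customer_id": 103, "email": "",              "signup_date": "not-a-date"},   # bad values
]

def validate(rows):
    """Return (clean_rows, rejects) applying simple null, duplicate and format checks."""
    seen, clean, rejects = set(), [], []
    for row in rows:
        errors = []
        if row["customer_id"] is None:
            errors.append("customer_id is null")
        elif row["customer_id"] in seen:
            errors.append("duplicate customer_id")
        if not row["email"]:
            errors.append("email is empty")
        try:
            date.fromisoformat(row["signup_date"])
        except ValueError:
            errors.append("signup_date is not ISO format")
        if errors:
            rejects.append((row, errors))
        else:
            seen.add(row["customer_id"])
            clean.append(row)
    return clean, rejects

clean, rejects = validate(records)
print(f"{len(clean)} clean rows, {len(rejects)} rejected")
```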

Posted 1 month ago

Apply

8.0 years

0 Lacs

Panchkula, Haryana, India

On-site

Job Description
We are looking for a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our data engineering team. In this role, you will be responsible for designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. You'll also contribute to architectural decisions, lead delivery planning, and provide mentorship to team members. Your hands-on expertise in ETL tools, cloud platforms, and scripting will be key to building efficient, scalable, and reliable data solutions for enterprise-level implementations.

Key Skills
- Strong hands-on experience with ETL tools like SSIS, DataStage, Informatica, or Talend.
- Deep understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, Fact & Dimension tables (see the sketch after this posting).
- Proficient in working with relational databases: SQL Server, Oracle, Teradata, DB2, or MySQL.
- Solid scripting/programming skills in Python.
- Hands-on experience with cloud platforms such as AWS or Azure.
- Knowledge of middleware architecture and enterprise data integration strategies.
- Familiarity with reporting/BI tools like Tableau and Power BI.
- Strong grasp of data modeling principles and performance optimization.
- Ability to write and review high- and low-level design documents.
- Strong communication skills and experience working with cross-cultural, distributed teams.

Roles And Responsibilities
- Design and develop ETL workflows and data integration strategies.
- Create and review high- and low-level designs adhering to best practices.
- Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions.
- Coach and mentor junior engineers to support skill development and performance.
- Ensure timely delivery, escalate issues proactively, and manage QA and validation processes.
- Participate in planning, estimations, and recruitment activities.
- Work on multiple projects simultaneously, ensuring quality and consistency in delivery.
- Experience in Sales and Marketing data domains.
- Exposure to reporting and analytics projects.
- Strong problem-solving abilities with a data-driven mindset.
- Ability to work independently and collaboratively in a fast-paced environment.
- Prior experience in global implementations and managing multi-location teams is a plus.
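For the star-schema and fact/dimension item above, here is a minimal, hypothetical sketch (standard-library sqlite3, invented table and column names, not the company's actual model) of resolving a dimension surrogate key while loading a fact row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_sk INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id TEXT UNIQUE,          -- natural/business key
        customer_name TEXT
    );
    CREATE TABLE fact_sales (
        customer_sk INTEGER,
        sale_date   TEXT,
        amount      REAL
    );
""")
conn.execute("INSERT INTO dim_customer (customer_id, customer_name) VALUES ('C001', 'Acme Ltd')")

def load_fact(conn, customer_id, sale_date, amount):
    """Look up (or create) the dimension surrogate key, then insert the fact row."""
    row = conn.execute(
        "SELECT customer_sk FROM dim_customer WHERE customer_id = ?", (customer_id,)
    ).fetchone()
    if row is None:
        # Late-arriving dimension: insert a stub member so the fact can still load.
        cur = conn.execute(
            "INSERT INTO dim_customer (customer_id, customer_name) VALUES (?, 'UNKNOWN')",
            (customer_id,))
        sk = cur.lastrowid
    else:
        sk = row[0]
    conn.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", (sk, sale_date, amount))

load_fact(conn, "C001", "2024-07-01", 120.0)
load_fact(conn, "C999", "2024-07-01", 75.5)   # unknown customer -> stub dimension row
print(conn.execute("SELECT * FROM fact_sales").fetchall())
```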

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Bengaluru

Work from Office

Diverse Lynx is looking for a Snowflake Developer to join our dynamic team and embark on a rewarding career journey.
- Design and develop data solutions within the Snowflake cloud data platform, including data warehousing, data lake, and data modeling solutions.
- Participate in the design and implementation of data migration strategies.
- Ensure the quality of custom solutions through the implementation of appropriate testing and debugging procedures.
- Provide technical support and troubleshoot issues as needed.
- Stay up-to-date with the latest developments in the Snowflake platform and data warehousing technologies.
- Contribute to the ongoing improvement of development processes and best practices.

Posted 1 month ago

Apply

2.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, tests and debugs defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge.

Key Skills
- Advanced SQL (MySQL, Presto, Oracle, etc.)
- Data modeling (normalization and denormalization)
- ETL tools (Talend, Pentaho, Informatica and creation of custom ETL scripts)
- Big data technologies (Hadoop, Spark, Hive, Kafka, etc.)
- Data warehousing (AWS, BigQuery, etc.)
- Reporting (Tableau, Power BI)

Core Responsibilities
This data-focused role is expected to leverage the skills above to design and implement robust data solutions, mentor junior team members, and ensure the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI; knowledge of data quality principles is good to have. Analyzes and determines integration needs. Evaluates and plans software designs, test results and technical manuals. Reviews literature, patents and current practices relevant to the solution of assigned projects. Programs new software and web applications and supports new applications under development and the customization of current applications. Edits and reviews technical requirements documentation. Works with the Quality Assurance team to determine if applications fit specification and technical requirements. Displays knowledge of engineering methodologies, concepts, skills and their application in the area of the specified engineering specialty. Displays knowledge of, and ability to apply, process design and redesign skills. Displays in-depth knowledge of, and ability to apply, project management skills. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned.

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.

Disclaimer
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education
Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
2-5 Years

Posted 1 month ago

Apply

10.0 - 18.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now.

The Role
As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
- Conceptualise, build and manage an AI/ML platform (with more focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks.
- Drive and take ownership for developing cognitive solutions for internal stakeholders and external customers.
- Conduct research in areas like explainable AI, image segmentation, 3D object detection and statistical methods.
- Evaluate not only algorithms and models but also the tools and technologies available in the market to maximize organizational spend.
- Utilize existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class.
- Analyse marketplace trends - economic, social, cultural and technological - to identify opportunities and create value propositions.
- Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations.

IT Skills & Experience
- Thorough understanding of the complete AI/ML project life cycle, to establish processes and provide guidance and expert support to the team.
- Expert knowledge of emerging technologies in deep learning and reinforcement learning.
- Knowledge of MLOps processes for efficient management of AI/ML projects.
- Must have led project execution with other data scientists/engineers on large and complex data sets.
- Understanding of machine learning algorithms such as k-NN, GBM, neural networks, Naive Bayes, SVM, and decision forests (see the sketch after this posting).
- Experience with AI/ML components like JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch and scikit-learn.
- Strong knowledge of deep learning with special focus on CNN/R-CNN/LSTM/encoder/transformer architectures.
- Hands-on experience with large networks like Inception-ResNet and ResNeXt-50.
- Demonstrated capability using RNNs for text and speech data, and generative models.
- Working knowledge of NoSQL (GraphX/Neo4j), document, columnar and in-memory database models.
- Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS or MapReduce.
- Experience in building KPI/storytelling dashboards on visualization tools like Tableau/Zoomdata.

People Skills
- Professional and open communication to all internal and external interfaces.
- Ability to communicate clearly and concisely, and a flexible mindset to handle a quickly changing culture.
- Strong analytical skills.

Industry Specific Experience
10-18 years of experience of AI/ML project execution and AI/ML research.

Education - Qualifications, Accreditation, Training
Master's or Doctorate degree in Computer Science Engineering / Information Technology / Artificial Intelligence.

Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Company: Worley
Primary Location: IND-MM-Navi Mumbai
Other Locations: IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata
Job: Digital Platforms & Data Science
Schedule: Full-time
Employment Type: Employee
Job Level: Experienced
Job Posting: Jul 4, 2025
Unposting Date: Aug 3, 2025
Reporting Manager Title: Head of Data Intelligence
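Purely as an illustration of the classical algorithms listed under IT Skills & Experience above (not part of the posting), a minimal k-NN example with scikit-learn, assuming scikit-learn is installed and using a toy dataset in place of real project data:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Toy dataset; stands in for whatever structured features a real project would use.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

model = KNeighborsClassifier(n_neighbors=5)   # k-NN: majority vote among the 5 nearest points
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```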

Posted 1 month ago

Apply

14.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while optimizing processes for efficiency and client satisfaction. The role requires a deep understanding of IT services, operational excellence, and client-centric solutions.

Job Requirements and Preferences
Minimum Degree Required: Bachelor's degree in information technology, Data Science, Computer Science, Statistics, or a related field (Master's degree preferred).
Minimum Years of Experience: 14 year(s), with at least 3 years in a managerial or leadership role. Proven experience in managing data analytics services for external clients, preferably within a managed services or consulting environment.

Technical Skills: Experience and know-how of working with a combination/subset of the tools and technologies listed below.
- Proficiency in data analytics tools (e.g., Power BI, Tableau, QlikView), data integration tools (ETL, Informatica, Talend, Snowflake, etc.) and programming languages (e.g., Python, R, SAS, SQL).
- Strong understanding of Data & Analytics cloud platforms (e.g., AWS, Azure, GCP) and services like AWS Glue, EMR, ADF, Redshift, Synapse, BigQuery, plus big data technologies (e.g., Hadoop, Spark).
- Familiarity with traditional data warehousing tools like Teradata, Netezza, etc.
- Familiarity with machine learning, AI, and automation in data analytics.
- Certification in data-related disciplines preferred.

Leadership: Demonstrated ability to lead teams, manage complex projects, and deliver results.
Communication: Excellent verbal and written communication skills, with the ability to present complex information to non-technical stakeholders.

Roles & Responsibilities
Demonstrates intimate abilities and/or a proven record of success as a team leader, emphasizing the following:

Client Relationship Management: Serve as the focal point for client interactions, maintaining strong relationships. Manage client escalations and ensure timely resolution of issues. Act as the face of the team for strategic client discussions, governance and regular cadence with the client.

Service Delivery Management: Responsibly lead end-to-end delivery of managed data analytics services to clients, ensuring projects meet business requirements, timelines, and quality standards. Deliver minor enhancements and bug fixes aligned to the client's service delivery model. Good experience setting up incident management and problem management processes for the engagement. Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to deliver end-to-end solutions. Monitor, manage and report service-level agreements (SLAs) and key performance indicators (KPIs). Solid financial acumen with experience in budget management. Problem-solving and decision-making skills, with the ability to think strategically.

Operational Excellence & Practice Growth: Implement and oversee standardized processes, workflows, and best practices to ensure efficient operations. Utilize tools and systems for service monitoring, reporting, and automation to improve service delivery. Drive innovation and automation in data integration, processing, analysis, and reporting workflows. Keep up to date with industry trends, emerging technologies, and regulatory requirements impacting managed services.

Risk and Compliance: Ensure data security, privacy, and compliance with relevant standards and regulations. Ensure all managed services are delivered in compliance with relevant regulatory requirements and industry standards. Proactively identify and mitigate operational risks that could affect service delivery.

Team Leadership & Development: Lead and mentor a team of service managers and technical professionals to ensure high performance and continuous development. Foster a culture of collaboration, accountability, and excellence within the team. Ensure the team is trained on the latest industry best practices, tools, and methodologies. Capacity management, experience with practice development, and a strong understanding of agile practices, cloud platforms, and infrastructure management.

Pre-Sales Experience: Collaborate with sales teams to identify opportunities for growth and expansion of services. Experience in solutioning of responses and operating models, including estimation frameworks, content contribution, and solution architecture in responding to RFPs.

Posted 1 month ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the continuous improvement and optimisation of the managed services process, tools and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership and consistently deliver quality work that drives value for our clients and success as a team.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Role: Specialist
Tower: Data Analytics & Insights Managed Service
Experience: 1 - 3 years
Key Skills: Data Engineering
Educational Qualification: Bachelor's degree in computer science/IT or relevant field
Work Location: Bangalore, India

Job Description
As a Specialist, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to:
- Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
- Flexible to work in stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Ticket quality and deliverables review, status reporting for the project.
- Adherence to SLAs; experience in incident management, change management and problem management.
- Review your work and that of others for quality, accuracy, and relevance.
- Know how and when to use the tools available for a given situation and explain the reasons for this choice.
- Seek and embrace opportunities which give exposure to different situations, environments, and perspectives.
- Use straightforward communication, in a structured way, when influencing and connecting with others.
- Able to read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
- Good team player; take up cross-competency work and contribute to COE activities.
- Escalation/risk management.

Position Requirements
Required Skills:
Primary Skill: ETL/ELT, SQL, Informatica, Python
Secondary Skill: Azure/AWS/GCP, Talend, DataStage, etc.

Data Engineer
- Should have a minimum of 1 year of Operate/Managed Services/Production Support experience.
- Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption, e.g. Business Intelligence systems, analytics modelling, data scientists, etc.
- Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Should have experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, SSRS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
- Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
- Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
- Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
- Scaling and optimizing schema and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
- Should have experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc.
- Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice to Have
- Certifications in cloud technology are an added advantage.
- Experience with visualization tools like Power BI, Tableau, Qlik, etc.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprise through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.

Within our global Managed Services platform, we provide the Data, Analytics & Insights Managed Service, where we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Application Evolution Service offerings and engagements including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.

Posted 1 month ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

ETL Tester Associate - Operate

Job Summary
A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Minimum Degree Required (BQ): Bachelor's degree
Preferred Field(s) of Study: Computer and Information Science, Management Information Systems
Minimum Year(s) of Experience (BQ): 2 years

Required Knowledge/Skills
As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes. You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.

Key Responsibilities
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes (see the sketch after this posting).
- Ensure compliance with industry standards and best practices in data testing.

Qualifications
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Qualifications
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
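A minimal, hypothetical sketch of the kind of automated ETL check mentioned above, written as a plain Python assertion against in-memory SQLite source and target databases (a real test would point these connections at the actual source system and warehouse):

```python
import sqlite3

def row_count(conn, table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def test_load_preserves_row_count():
    """Source row count should match the target after the (simulated) ETL load."""
    source = sqlite3.connect(":memory:")
    target = sqlite3.connect(":memory:")
    source.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    source.executemany("INSERT INTO orders VALUES (?, ?)",
                       [(1, 10.0), (2, 20.5), (3, 7.25)])

    # Stand-in for the real ETL job: copy every source row into the target.
    target.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    target.executemany("INSERT INTO orders VALUES (?, ?)",
                       source.execute("SELECT id, amount FROM orders").fetchall())

    assert row_count(source, "orders") == row_count(target, "orders")

if __name__ == "__main__":
    test_load_preserves_row_count()
    print("row count check passed")
```

Under pytest this function would be collected automatically; it is invoked directly here so the file runs standalone.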

Posted 1 month ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Create solution outlines and macro designs describing end-to-end product implementation in data platforms, covering system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning and estimation. Contribute to reusable components/assets/accelerators to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery and product reviews and quality assurance, and act as design authority.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems.
- Experience in data engineering and in architecting and implementing data platforms on the Azure cloud platform.
- Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform and Airflow.
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala/PySpark, Python, etc.) with Cloudera or Hortonworks.

Preferred technical and professional experience
- Experience in architecting complex data platforms on the Azure cloud platform and on-prem.
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions like Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric.
- Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Are you looking for a career move that will put you at the heart of a global financial institution? Then bring your skills in ETL applications to our Markets Operations Technology team. By joining Citi, you will become part of a global organisation whose mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress.

Team/Role Overview
Our Markets Operations Technology team is a global team with exposure to countries such as India, America and the UK. The role of this team is to manage the end-to-end processing of a case within ETL applications.

What You'll Do
The Applications Development Senior ETL Programmer/Developer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. This role requires deep expertise in system design, hands-on coding, and strong problem-solving skills to create resilient, high-performing, and secure applications.

What We'll Need From You
- Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development and model development, and establish and implement new or revised application systems and programs to meet specific business needs or user areas.
- Monitor and control all phases of the development process (analysis, design, construction, testing, and implementation) and provide user and operational support on applications to business users.
- Utilize in-depth specialty knowledge of applications development to analyse complex problems/issues, provide evaluation of business processes, system processes, and industry standards, and make evaluative judgements.
- Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality.
- Ensure essential procedures are followed and help define operating standards and processes.
- Act as SME to senior stakeholders and/or other team members.
- Drive the adoption of modern engineering ways of working, including Agile, DevOps, and CI/CD.
- Advocate for automated testing, infrastructure as code, and continuous monitoring to enhance software reliability.
- Apply Behaviour-Driven Development (BDD), Test-Driven Development (TDD), and unit testing to ensure code quality and functionality.
- Conduct thorough code reviews, ensuring adherence to best practices in readability, performance, and security.
- Implement and enforce secure coding practices, performing vulnerability assessments and ensuring compliance with security standards.

Relevant Technologies: ETL technology such as Talend, Python, etc.; version control; strong relational database experience; Unix scripting; job scheduling; CI/CD setup.

Qualifications/Experience
- Experience with ETL applications in the banking industry.
- Experience with high-performance, high-volume integrated ETL development using Talend, and database performance tuning.
- Strong understanding of databases; well versed in performance tuning, stored procedures, etc.
- Sound analytical, problem-solving, presentation and interpersonal skills to handle critical situations.
- Ability to carry out adaptive changes necessitated by changes in business requirements and technology.
- Post-trade processing experience; familiarity with the trade life cycle and associated business processes.

Education: Bachelor's degree/University degree or equivalent experience

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster, the EEO is the Law Supplement, the EEO Policy Statement, and the Pay Transparency Posting.

Posted 1 month ago

Apply

8.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Job Summary
We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.

Key Responsibilities
- Design, develop, and execute comprehensive test plans for ETL and database validation processes.
- Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI).
- Perform manual testing and defect tracking using Zephyr or Tosca.
- Analyze business and data requirements to ensure full test coverage.
- Write and execute complex SQL queries for data reconciliation (see the sketch after this posting).
- Identify data-related issues and conduct root cause analysis in collaboration with developers.
- Track and manage bugs and enhancements through appropriate tools.
- Optimize testing strategies for performance, scalability, and accuracy in ETL workflows.

Mandatory Skills
- ETL tools: Talend, ADF
- Data platforms: Snowflake
- Reporting/analytics: Power BI, VPI
- Testing tools: Zephyr or Tosca, manual testing
- Strong SQL expertise for validating complex data workflows

Good-to-Have Skills
- API testing exposure
- Power BI advanced features (dashboards, DAX, data modelling)
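To illustrate the data-reconciliation item above, a small sketch using SQLite and SQL's EXCEPT operator to surface rows that differ between a source extract and the loaded target; the table names and columns are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, symbol TEXT, qty INTEGER);
    CREATE TABLE tgt_trades (trade_id INTEGER, symbol TEXT, qty INTEGER);
    INSERT INTO src_trades VALUES (1, 'INFY', 100), (2, 'TCS', 50), (3, 'WIPRO', 75);
    INSERT INTO tgt_trades VALUES (1, 'INFY', 100), (2, 'TCS', 55);  -- qty drifted, row 3 missing
""")

-- -- (comments below are Python; SQL above uses -- comments)
missing_in_target = conn.execute(
    "SELECT * FROM src_trades EXCEPT SELECT * FROM tgt_trades").fetchall()
extra_in_target = conn.execute(
    "SELECT * FROM tgt_trades EXCEPT SELECT * FROM src_trades").fetchall()

print("source rows not matched in target:", missing_in_target)
print("target rows not matched in source:", extra_in_target)
```

On real platforms (Snowflake, etc.) the same EXCEPT/MINUS pattern applies, usually keyed on a subset of columns rather than whole rows.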

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description

Roles and Responsibilities
- Lead and manage end-to-end delivery of data-centric projects including data warehousing, data integration, and business intelligence initiatives.
- Drive project planning, execution, monitoring, and closure using industry-standard project management methodologies (Agile/Scrum/Waterfall).
- Collaborate with cross-functional teams, data architects, developers, and business stakeholders to define project scope and requirements.
- Ensure timely delivery of project milestones within the approved scope, budget, and timelines.
- Proactively manage project risks, dependencies, and issues with clear mitigation strategies.
- Establish effective communication plans and engage stakeholders at all levels to ensure project alignment and transparency.
- Maintain and track detailed project documentation including timelines, resource plans, status reports, and governance logs.
- Lead one or more full-lifecycle ETL/data integration implementations from initiation to go-live and support transition.
- Ensure alignment of data architecture and modeling practices with organizational standards and best practices.

Must-Have Skills
- Minimum 5+ years of experience in project management, with at least 3 years managing data-centric projects (e.g., data warehousing, business intelligence, data integration).
- Strong understanding of data architecture principles, data modeling, and database design.
- Proven experience managing full-lifecycle ETL/data integration projects.
- Hands-on exposure to project planning, budgeting, resource management, stakeholder communication, and risk management.
- Ability to drive cross-functional teams and communicate effectively with both technical and non-technical stakeholders.

Good-to-Have Skills
- Working knowledge or hands-on experience with ETL tools such as Informatica, Talend, IBM DataStage, SSIS, AWS Glue, Azure Data Factory, or GCP Dataflow.
- Familiarity with Agile/Scrum methodologies and tools like JIRA, MS Project, or Confluence.
- PMP, PMI-ACP, or Scrum Master certification.
- Prior experience working with cloud-based data solutions.

Skills: Healthcare, ETL, Data Warehousing, Project Management

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderābād

On-site

Skills:
- 3-5 years of data warehouse / business intelligence work experience required, working with Talend Open Studio.
- Extensive experience with the Talend Real-Time Big Data Platform in design, development and testing, with focus on Talend Data Integration and Talend Big Data Real-Time, including big data streaming (Spark) jobs against different databases.
- Experience working with databases like Greenplum, HAWQ, Oracle, Teradata, MS SQL Server, Sybase, Cassandra, MongoDB, flat files and APIs, and with Hadoop/big data concepts (ecosystems like Hive, Pig, Sqoop, and MapReduce).
- Working knowledge of Java is preferred.
- Advanced knowledge of ETL, including the ability to read and write efficient, robust code, follow or implement best practices and coding standards, design and implement common ETL strategies (CDC, SCD, etc.), and create reusable, maintainable jobs (see the sketch after this posting).
- Solid background in database systems (such as Oracle, SQL Server, Redshift and Salesforce) along with strong knowledge of PL/SQL and SQL.
- Knowledge of and hands-on experience with Unix commands and shell scripting.
- Good knowledge of SQL, including the ability to write stored procedures, triggers, functions, etc.
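As a rough illustration of the SCD strategy mentioned above (shown in plain Python with SQLite rather than Talend; the customer table and columns are hypothetical), a minimal Slowly Changing Dimension Type 2 update:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id TEXT,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,        -- NULL means current version
        is_current  INTEGER
    )""")
conn.execute("INSERT INTO dim_customer VALUES ('C001', 'Pune', '2023-01-01', NULL, 1)")

def apply_scd2(conn, customer_id, new_city, as_of=None):
    """SCD Type 2: expire the current row if the attribute changed, then insert a new version."""
    as_of = as_of or date.today().isoformat()
    current = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,)).fetchone()
    if current and current[0] == new_city:
        return  # no change, nothing to do
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (as_of, customer_id))
    conn.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                 (customer_id, new_city, as_of))

apply_scd2(conn, "C001", "Hyderabad", "2024-06-01")
for row in conn.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)
```

The same expire-and-insert logic is what a Talend tMap/tDBOutput flow or a warehouse MERGE statement would implement.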

Posted 1 month ago

Apply

2.0 - 5.0 years

0 Lacs

Chennai

On-site

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast. Job Summary Responsible for planning and designing new software and web applications. Edits new and existing applications. Implements, testing and debugging defined software components. Documents all development activity. Works with moderate guidance in own area of knowledge. Job Description Key Skills: Advanced SQL ( Mysql, Presto, Oracle etc) Data Modeling (Normalization and Denormilation) ETL Tools (Talend, Pentaho, Informatica and Creation of Custom ETL Scripts) Big Data Technologies (Hadoop, Spark, Hive, Kafka etc) Data Warehousing (AWS, Big Query etc) Reporting (Tableau, Power BI) Core Responsibilities: Data focused role would be expected to leverage these skills to design and implement robust data solutions. They would also play a key role in mentoring junior team members and ensuring the quality and efficiency of data processes. Skills in data visualization tools like Tableau and Power BI. Good to have Data Quality principles Analyzes and determines integration needs. Evaluates and plans software designs, test results and technical manuals. Reviews literature, patents and current practices relevant to the solution of assigned projects. Programs new software, web applications and supports new applications under development and the customization of current applications. Edits and reviews technical requirements documentation. Works with Quality Assurance team to determine if applications fit specification and technical requirements. Displays knowledge of engineering methodologies, concepts, skills and their application in the area of specified engineering specialty. Displays knowledge of and ability to apply, process design and redesign skills. Displays in-depth knowledge of and ability to apply, project management skills. Consistent exercise of independent judgment and discretion in matters of significance. Regular, consistent and punctual attendance. Must be able to work nights and weekends, variable schedule(s) and overtime as necessary. Other duties and responsibilities as assigned. Employees at all levels are expected to: Understand our Operating Principles; make them the guidelines for how you do your job. Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services. Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences. Win as a team - make big things happen by working together and being open to new ideas. Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers. 
Drive results and growth.
Respect and promote inclusion & diversity.
Do what's right for each other, our customers, investors and our communities.

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education: Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience: 2-5 Years

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Us
About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin with offices in Dublin, OH, Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description
Job Summary: We are seeking a skilled ETL Data Engineer to design, build, and maintain efficient and reliable ETL pipelines, ensuring seamless data integration, transformation, and delivery to support business intelligence and analytics. The ideal candidate should have hands-on experience with ETL tools like Talend, strong database knowledge, and familiarity with AWS services.

Key Responsibilities
Design, develop, and optimize ETL workflows and data pipelines using Talend or similar ETL tools.
Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
Integrate data from various sources, including databases, APIs, and cloud platforms, into data warehouses or data lakes.
Create and optimize complex SQL queries for data extraction, transformation, and loading.
Manage and monitor ETL processes to ensure data integrity, accuracy, and efficiency.
Work with AWS services like S3, Redshift, RDS, and Glue for data storage and processing.
Implement data quality checks and ensure compliance with data governance standards (see the sketch after this listing).
Troubleshoot and resolve data discrepancies and performance issues.
Document ETL processes, workflows, and technical specifications for future reference.

Requirements
Bachelor's degree in Computer Science, Information Technology, or a related field.
4+ years of experience in ETL development, data engineering, or data warehousing.
Hands-on experience with Talend or similar ETL tools (Informatica, SSIS, etc.).
Proficiency in SQL and a strong understanding of database concepts (relational and non-relational).
Experience working in an AWS environment with services like S3, Redshift, RDS, or Glue.
Strong problem-solving skills and ability to troubleshoot data-related issues.
Knowledge of scripting languages like Python or Shell scripting is a plus.
Good communication skills to collaborate with cross-functional teams.

Benefits
As per company standards.
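By way of illustration (not from the posting itself), the following minimal Python sketch shows the kind of pre-load data quality gate such a pipeline might include before a Redshift load. It assumes the boto3 and pandas packages are available and AWS credentials are configured; the bucket, object key and column names (example-landing-zone, order_id, etc.) are hypothetical placeholders.

# Minimal, hypothetical sketch of a post-extract data quality gate for an
# S3 -> Redshift pipeline. Bucket, key and column names are placeholders.
import io

import boto3          # AWS SDK for Python
import pandas as pd

S3_BUCKET = "example-landing-zone"        # hypothetical bucket
S3_KEY = "sales/2025-07-01/orders.csv"    # hypothetical object key


def load_extract(bucket: str, key: str) -> pd.DataFrame:
    """Read a CSV extract from S3 into a DataFrame."""
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


def run_quality_checks(df: pd.DataFrame) -> None:
    """Fail fast before the load step if basic expectations are violated."""
    if df.empty:
        raise ValueError("Extract is empty - refusing to load.")
    if df["order_id"].isnull().any():
        raise ValueError("Null primary keys found in order_id.")
    if df["order_id"].duplicated().any():
        raise ValueError("Duplicate order_id values found.")


if __name__ == "__main__":
    frame = load_extract(S3_BUCKET, S3_KEY)
    run_quality_checks(frame)
    print(f"{len(frame)} rows passed quality checks; ready for the warehouse load.")

In a production Talend or Glue job the same checks would typically run as a validation step immediately before the load into Redshift.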

Posted 1 month ago

Apply

4.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About KPMG in India
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

Equal employment opportunity information
KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability, or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.

Experience: 4 to 9 years

Key Responsibilities
Design and implement scalable data pipelines using Snowflake Data Cloud.
Develop and maintain ETL/ELT workflows using tools such as Azure Data Factory, AWS Glue, Informatica, Talend, or Qlik Replicate.
Integrate data from various sources and ensure data quality and consistency.
Build and manage workflow orchestration using Apache Airflow, Control-M, or Tidal Automation (see the sketch after this listing).
Write advanced SQL queries and scripts for data transformation and analysis.
Develop Python-based data processing solutions using Pandas, PySpark, or Snowpark.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Optimize data models and queries for performance and scalability.
Ensure adherence to data governance and security standards.

Must-Have Skills
Snowflake Cloud Platform: strong hands-on experience.
ETL/ELT Tools: proficiency in one or more of Azure Data Factory, AWS Glue, Informatica, Talend, Qlik Replicate.
Workflow Orchestration: experience with Apache Airflow, Control-M, or Tidal Automation.
Programming: advanced SQL and Python (including dataframes with Pandas, PySpark, or Snowpark).
Data Engineering Concepts: strong understanding of data pipelines, wrangling, and optimization.

Good-to-Have Skills
SQL scripting and procedural logic.
Data modeling tools such as Erwin or dbt.
Integration tools like Fivetran or Stitch.

Educational Qualification
Bachelor's degree or higher in Information Technology, Engineering, or a related field. B.E / B.Tech / BCA / MBA / MCA full-time education.

Interested candidates can share their resumes at sonalidas4@kpmg.com
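Purely as a sketch of the workflow-orchestration skill listed above (not part of the posting), here is a minimal Apache Airflow 2.x DAG chaining an extract step and a load step. The DAG id, schedule and task bodies are hypothetical placeholders; real tasks would invoke Talend jobs, Snowflake procedures, or ADF/Glue activities.

# Hypothetical Airflow 2.x DAG sketching extract -> load orchestration.
# Uses schedule= (Airflow 2.4+); earlier 2.x versions use schedule_interval=.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull the daily increment from the source system.
    print("extract increment from source system")


def load():
    # Placeholder: load the increment into the Snowflake target.
    print("load increment into Snowflake")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # load runs only after extract succeeds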

Posted 1 month ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Greetings from Infosys BPM Ltd,

We are hiring for WalkMe, ETL Testing + Python Programming, Automation Testing with Java, Selenium, BDD, Cucumber, Test Automation using Java and Selenium with knowledge of testing processes and SQL, ETL DB Testing, and ETL Testing Automation skills. Please walk in for an interview on 9th and 10th July 2025 at the Pune location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application. Please mention the Candidate ID on top of the resume.
https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-217814

Interview details
Interview Date: 9th and 10th July 2025
Interview Time: 10 AM till 1 PM
Interview Venue: Pune: Hinjewadi Phase 1, Infosys BPM Limited, Plot No. 1, Building B1, Ground floor, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi Phase 1, Pune, Maharashtra-411057

Please find the job descriptions below for your reference. Work from Office. A minimum of 2 years of project experience is mandatory.

Job Description: WalkMe
Design, develop, and deploy WalkMe solutions to enhance user experience and drive digital adoption.
Experience in task-based documentation, training and content strategy.
Experience working in a multi-disciplined team with geographically distributed co-workers.
Working knowledge of technologies such as CSS and JavaScript.
Project management and/or Jira experience.
Experience in developing in-app guidance using tools such as WalkMe.
Strong experience in technical writing, instructional video or guided learning experience in a software company.

Job Description: ETL Testing + Python Programming
Experience in Data Migration Testing (ETL Testing), manual and automation, with Python programming.
Strong at writing complex SQLs for data migration validations.
Work experience with Agile Scrum methodology.
Functional Testing: UI test automation using Selenium and Java.
Financial domain experience.
Good to have AWS knowledge.

Job Description: Automation Testing with Java, Selenium, BDD, Cucumber
Hands-on experience in automation. Java, Selenium, BDD and Cucumber expertise is mandatory.
Banking domain experience is good to have.
Financial domain experience.
Automation talent with TOSCA skills and payment domain skills is preferable.

Job Description: Test Automation using Java and Selenium, with knowledge of testing processes and SQL
Java, Selenium automation, SQL, testing concepts, Agile.
Tools: Jira, ALM, IntelliJ.
Functional Testing: UI test automation using Selenium and Java.
Financial domain experience.

Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence.
Strong proficiency in SQL.
Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
Solid understanding of data warehousing concepts, database systems and quality assurance.
Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions.
Familiarity with defect tracking tools (e.g., Jira).
Experience with cloud platforms like AWS, Azure, or GCP is a plus.
Experience with Python or other scripting languages for test automation is a plus.
Experience with data quality tools is a plus.
Experience in testing of large datasets.
Experience in agile development is a must.
Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
Strong experience in ETL testing and automation.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
Hands-on experience in developing and maintaining test automation frameworks.
Proficiency in at least one programming language (e.g., Python, Java).
Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
Strong understanding of data warehousing concepts and methodologies.
Experience with CI/CD pipelines and version control systems (e.g., Git).
Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
Experience with data quality tools is a plus.

REGISTRATION PROCESS
The Candidate ID and SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to complete the registration. (Candidates without registration and assessment will not be allowed for the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored, and candidates are evaluated on basic analytics, English comprehension and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots on the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal Details: Name, Email Address, Mobile Number, PAN number.
Availability: Acknowledgement of work schedule preferences (Shifts, Work from Office, Rotational Weekends, 24/7 availability, Transport Boundary) and reason for career change.
Employment Details: Current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate Information: 10-digit Candidate ID starting with 100XXXXXXX, Gender, Source (e.g., vendor name, Naukri/LinkedIn/Foundit, or Direct), and Location.
Interview Mode: Walk-in.
Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds.
Five or more screen toggles, multiple faces detected, face not detected, or any malpractice will be considered grounds for rejection. Once you've finished, submit the assessment and make a note of the AMCAT ID (15 digits) used for the assessment.

Documents to carry:
Please have a note of the Candidate ID and AMCAT ID along with the registered email ID.
Please carry 2 sets of your updated resume/CV (hard copy).
Please carry original ID proof for security clearance.
Please carry individual headphones/Bluetooth headset for the interview.

Pointers to note:
Please do not carry laptops/cameras to the venue as these will not be allowed due to security restrictions.
An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

ETL Testing

Primary Skills: ETL Testing, SQL, Unix Commands, Python (Automation Exposure)

Analyze business and technical requirements to understand ETL workflows and data transformations.
Develop and execute comprehensive test plans, test cases, and test scripts for ETL pipelines.
Hands-on experience in ETL/Data Warehouse Testing.
Strong proficiency in SQL for data validation and transformation logic (see the sketch after this listing).
Practical experience with Python for data testing or automation scripting.
Good knowledge of Shell scripting and ability to work in Unix/Linux environments.
Understanding of data warehouse concepts, data modeling, and data quality principles.
Experience in test management and defect tracking tools.
Familiarity with ETL tools (e.g., Informatica, Talend, SSIS, or custom Python-based ETL).
Exposure to cloud platforms like AWS, GCP, or Azure.
Basic understanding of version control (Git) and CI/CD pipelines.
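As a rough illustration of SQL-based ETL validation automated in Python (not part of the listing), the sketch below expresses two common checks as pytest tests. It runs against an in-memory SQLite database purely for portability; the src_orders/tgt_orders tables and their columns are hypothetical stand-ins for real source and target systems.

# Illustrative ETL validation checks written as pytest tests.
# Uses an in-memory SQLite database as a stand-in for the real source/target;
# table and column names are hypothetical.
import sqlite3

import pytest


@pytest.fixture()
def db():
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
        """
    )
    yield conn
    conn.close()


def test_row_counts_match(db):
    # Completeness check: every source row should have been loaded.
    src = db.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = db.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt


def test_no_orphan_target_rows(db):
    # Consistency check: the target must not contain keys absent from the source.
    orphans = db.execute(
        """
        SELECT COUNT(*) FROM tgt_orders t
        LEFT JOIN src_orders s ON s.order_id = t.order_id
        WHERE s.order_id IS NULL
        """
    ).fetchone()[0]
    assert orphans == 0

In practice the same tests would point at the real source and target connections and run from a CI/CD pipeline.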

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: Talend Open Studio
Experience: 8 to 12 Years
Location: Chennai

Detailed JD:
Senior Developer with a minimum of 4 years of hands-on experience in designing, developing, and documenting Talend ETL processes, technical architecture, and data pipelines.
Good understanding of the complete SDLC process.
Minimum 4+ years of relevant experience in Talend Big Data Platform and Talend Data Integration, and in data areas such as cloud (Azure), data warehouse, data lake, ETL, data quality, etc.
Proficient experience in designing and developing mappings, transformations, sessions and workflows, and deploying integration solutions using the Talend tool.
Proficient in data integration between different sources (on-prem/cloud).
Extensive experience in data extraction, transformation and loading from various sources such as JSON, CSV, XML and flat files in Talend (illustrated in the sketch after this listing).
Experience in performance tuning.
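For context only, the following Python sketch mirrors the kind of source-to-target mapping a Talend job (for example tFileInputDelimited/tFileInputJSON feeding a tMap) performs when unifying CSV and JSON sources. The file paths, field names, and the assumption that the JSON export is an array of objects are all hypothetical.

# Illustrative only: the kind of mapping a Talend tMap performs, sketched in
# plain Python. File paths and field names are hypothetical.
import csv
import json
from typing import Iterator


def read_csv_customers(path: str) -> Iterator[dict]:
    """Yield normalized records from a delimited flat file."""
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            yield {
                "customer_id": row["id"].strip(),
                "email": row["email"].strip().lower(),
                "source": "csv",
            }


def read_json_customers(path: str) -> Iterator[dict]:
    """Yield normalized records from a JSON export (assumed to be an array of objects)."""
    with open(path, encoding="utf-8") as fh:
        for item in json.load(fh):
            yield {
                "customer_id": str(item["customerId"]),
                "email": item["contact"]["email"].lower(),
                "source": "json",
            }


if __name__ == "__main__":
    records = list(read_csv_customers("customers.csv")) + \
              list(read_json_customers("customers.json"))
    print(f"Unified {len(records)} customer records for the load step.")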

Posted 1 month ago

Apply

0.0 - 8.0 years

0 Lacs

Panchkula, Haryana

On-site

Job Description
We are looking for a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our data engineering team. In this role, you will be responsible for designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. You'll also contribute to architectural decisions, lead delivery planning, and provide mentorship to team members. Your hands-on expertise in ETL tools, cloud platforms, and scripting will be key to building efficient, scalable, and reliable data solutions for enterprise-level implementations.

Key Skills
Strong hands-on experience with ETL tools like SSIS, DataStage, Informatica, or Talend.
Deep understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, Fact & Dimension tables (a brief schema sketch follows this listing).
Proficient in working with relational databases: SQL Server, Oracle, Teradata, DB2, or MySQL.
Solid scripting/programming skills in Python.
Hands-on experience with cloud platforms such as AWS or Azure.
Knowledge of middleware architecture and enterprise data integration strategies.
Familiarity with reporting/BI tools like Tableau and Power BI.
Strong grasp of data modeling principles and performance optimization.
Ability to write and review high- and low-level design documents.
Strong communication skills and experience working with cross-cultural, distributed teams.

Roles and Responsibilities
Design and develop ETL workflows and data integration strategies.
Create and review high- and low-level designs adhering to best practices.
Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions.
Coach and mentor junior engineers to support skill development and performance.
Ensure timely delivery, escalate issues proactively, and manage QA and validation processes.
Participate in planning, estimations, and recruitment activities.
Work on multiple projects simultaneously, ensuring quality and consistency in delivery.
Experience in Sales and Marketing data domains.
Exposure to reporting and analytics projects.
Strong problem-solving abilities with a data-driven mindset.
Ability to work independently and collaboratively in a fast-paced environment.
Prior experience in global implementations and managing multi-location teams is a plus.

Contacts
Email: careers@grazitti.com
Address: HSIIDC Technology Park, Plot No - 19, Sector 22, 134104, Panchkula, Haryana, India
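As a compact reminder of the star-schema concepts named above (not part of the posting), the sketch below builds one fact table and two dimension tables in an in-memory SQLite database and runs the typical fact-to-dimension reporting join. All table and column names are illustrative.

# Minimal star schema: a sales fact table surrounded by two dimensions,
# plus the typical reporting query that joins fact to dimensions.
# Built in-memory with SQLite purely for illustration; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
    CREATE TABLE fact_sales  (
        date_key     INTEGER REFERENCES dim_date(date_key),
        product_key  INTEGER REFERENCES dim_product(product_key),
        quantity     INTEGER,
        revenue      REAL
    );

    INSERT INTO dim_date    VALUES (20250701, '2025-07-01', '2025-07');
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
    INSERT INTO fact_sales  VALUES (20250701, 1, 3, 29.97);
    """
)

# Typical reporting query: aggregate the fact grain by dimension attributes.
for month, category, revenue in conn.execute(
    """
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    """
):
    print(month, category, revenue)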

Posted 1 month ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experience: 6+ years
Location:

Description:
We are seeking a Data Engineer with expertise in Talend pipeline development for Snowflake Type 2 modeling and proficiency in IBM Data Replicator and Qlik Replicate. The role involves analysing current on-premises data sources across containerized DB2, DB2, Oracle, and Hadoop, and designing scalable data pipelines aligned with the company's data management framework. The candidate will handle data replication processes, including Change Data Capture (CDC), for both historical and incremental updates.

Key Responsibilities
Design and build Talend pipelines to implement Type 2 Slowly Changing Dimensions (SCD) models in Snowflake (see the sketch after this listing).
Analyze and assess existing on-premises data sources (DB2, Oracle, Hadoop) for migration and integration.
Develop and optimize data replication strategies using IBM Data Replicator and Qlik Replicate.
Implement one-time data migration processes for history and archives and configure pipelines for CDC-based updates.
Collaborate with data architects and business teams to define and enforce data modeling standards in Snowflake.
Perform data profiling, validation, and reconciliation to ensure data integrity and consistency during migrations.
Monitor and troubleshoot data pipelines, ensuring scalability, reliability, and performance.
Document pipeline designs, workflows, and data mappings for compliance and audit purposes.

Required Skills
Proficiency in Talend ETL development and integration with Snowflake.
Hands-on experience with IBM Data Replicator and Qlik Replicate.
Strong knowledge of Snowflake database architecture and Type 2 SCD modeling.
Expertise in containerized DB2, DB2, Oracle, and Hadoop data sources.
Understanding of Change Data Capture (CDC) processes and real-time data replication patterns.
Experience with SQL, Python, or Shell scripting for data transformations and automation.
Familiarity with cloud platforms (AWS, Azure) and DevOps practices for pipeline automation.

Preferred Skills
Experience in data governance frameworks and metadata management.
Working knowledge of version control tools (e.g., Git) and CI/CD pipelines.
Exposure to Kafka or other streaming platforms for data ingestion.
Strong troubleshooting and performance optimization capabilities. (ref:hirist.tech)
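To make the Type 2 SCD requirement concrete (this is an illustrative sketch, not the employer's design), the Python module below wraps the common two-step Snowflake pattern: expire the current row when tracked attributes change, then insert a new current version for new and changed keys. The schema, table and column names (dw.customer_dim, stg.customer_updates, etc.) are hypothetical, and in practice the statements would be issued from Talend components or the Snowflake Python connector.

# Hypothetical Type 2 SCD load for a Snowflake customer dimension.
# Step 1 expires the current version of any customer whose tracked attributes
# changed; step 2 inserts a fresh current version for new and changed customers.
# Schema, table and column names are illustrative only.

EXPIRE_CHANGED_ROWS = """
MERGE INTO dw.customer_dim t
USING stg.customer_updates s
  ON t.customer_id = s.customer_id AND t.is_current = TRUE
WHEN MATCHED AND (t.email <> s.email OR t.segment <> s.segment) THEN
  UPDATE SET is_current = FALSE, effective_to = s.extract_ts
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dw.customer_dim
  (customer_id, email, segment, effective_from, effective_to, is_current)
SELECT s.customer_id, s.email, s.segment, s.extract_ts, NULL, TRUE
FROM stg.customer_updates s
LEFT JOIN dw.customer_dim t
  ON t.customer_id = s.customer_id AND t.is_current = TRUE
WHERE t.customer_id IS NULL
"""


def run_scd2_load(cursor) -> None:
    """Execute the two-step SCD2 pattern with any DB-API cursor
    (for example one obtained from the Snowflake Python connector)."""
    cursor.execute(EXPIRE_CHANGED_ROWS)
    cursor.execute(INSERT_NEW_VERSIONS)

Because step 1 closes out changed rows first, the anti-join in step 2 naturally inserts new versions for both brand-new and changed customers while leaving unchanged customers untouched.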

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary
We are seeking a highly experienced QA professional with over 10 years of experience to join our Quality Assurance team for the data migration project. The ideal candidate will have a strong background in ETL testing, data validation, and migration projects, with expertise in creating test cases and test plans, as well as hands-on experience with data migration to cloud platforms like Snowflake. The role requires leadership capabilities to manage testing efforts, including coordinating with both on-shore and off-shore teams, ensuring seamless collaboration and delivery. Proficiency in ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage is essential, along with a solid understanding of SQL and semi-structured data formats such as JSON and XML.

Key Responsibilities
Develop and implement comprehensive test strategies and plans for data migration projects, ensuring full coverage of functional and non-functional requirements.
Create detailed test cases, test plans, and test scripts for validating data migration processes and transformations.
Conduct thorough data validation and verification testing, leveraging advanced SQL skills to write and execute complex queries for data accuracy, completeness, and consistency (see the sketch after this listing).
Utilize ETL tools such as Talend, Informatica PowerCenter, or DataStage to design and execute data integration tests, ensuring successful data transformation and loading into target systems like Snowflake.
Validate semi-structured data formats (JSON, XML), ensuring proper parsing, mapping, and integration within data migration workflows.
Lead testing efforts for data migration to cloud platforms, ensuring seamless data transfer and integrity.
Act as the QA Lead to manage and coordinate testing activities with on-shore and off-shore teams, ensuring alignment, timely communication, and delivery of quality outcomes.
Document and communicate test results, defects, and issues clearly to the development and project teams, ensuring timely resolutions.
Collaborate with cross-functional teams to create and maintain automated testing frameworks for ETL processes, improving testing efficiency and coverage.
Monitor adherence to QA best practices and standards while driving process improvements.
Stay updated on the latest QA tools, technologies, and methodologies to enhance project outcomes.

Qualifications
10+ years of experience in Quality Assurance, focusing on ETL testing, data validation, and data migration projects.
Proven experience creating detailed test cases, test plans, and test scripts.
Hands-on experience with ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage.
Proficiency in SQL for complex query writing and optimization for data validation and testing.
Experience with cloud data migration projects, specifically working with databases like Snowflake.
Strong understanding of semi-structured data formats like JSON and XML, with hands-on testing experience.
Proven ability to lead QA efforts, manage teams, and coordinate with on-shore and off-shore teams effectively.
Strong analytical and troubleshooting skills for resolving data quality and testing challenges.

Preferred Skills
Experience with automated testing tools and frameworks, particularly for ETL processes.
Knowledge of data governance and data quality best practices.
Familiarity with AWS or other cloud-based ecosystems.
ISTQB or equivalent certification in software testing. (ref:hirist.tech)
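As an illustration of the reconciliation testing described above (not part of the listing), the sketch below compares row counts and a numeric aggregate between a source and a target table through any two DB-API cursors, for example a legacy source and a Snowflake target. Table and column names are hypothetical.

# Illustrative migration-reconciliation checks of the kind described above.
# Works with any two DB-API cursors; table and column names are hypothetical.
from typing import Sequence


def reconcile_table(src_cur, tgt_cur, table: str, amount_col: str) -> list[str]:
    """Return a list of human-readable mismatches for one migrated table."""
    issues: list[str] = []

    src_cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
    src_count, src_sum = src_cur.fetchone()

    tgt_cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
    tgt_count, tgt_sum = tgt_cur.fetchone()

    if src_count != tgt_count:
        issues.append(f"{table}: row count {src_count} (source) vs {tgt_count} (target)")
    if round(float(src_sum), 2) != round(float(tgt_sum), 2):
        issues.append(f"{table}: SUM({amount_col}) differs between source and target")
    return issues


def reconcile_all(src_cur, tgt_cur, tables: Sequence[tuple[str, str]]) -> None:
    """Run the checks for every (table, amount_col) pair and report the result."""
    problems = [msg for t, col in tables for msg in reconcile_table(src_cur, tgt_cur, t, col)]
    if problems:
        raise AssertionError("Reconciliation failed:\n" + "\n".join(problems))
    print("All tables reconciled successfully.")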

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary
We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.

Responsibilities:
Design, develop, and execute comprehensive test plans for ETL and database validation processes.
Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI).
Perform manual testing and defect tracking using Zephyr or Tosca.
Analyze business and data requirements to ensure full test coverage.
Write and execute complex SQL queries for data reconciliation.
Identify data-related issues and conduct root cause analysis in collaboration with developers.
Track and manage bugs and enhancements through appropriate tools.
Optimize testing strategies for performance, scalability, and accuracy in ETL processes.

Required Skills:
ETL Tools: Talend, ADF
Data Platforms: Snowflake
Reporting/Analytics: Power BI, VPI
Testing Tools: Zephyr or Tosca, manual testing
Strong SQL expertise for validating complex data.

Good-to-Have Skills:
API testing exposure
Power BI advanced features (dashboards, DAX, data modelling)

(ref:hirist.tech)

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies