3.0 - 8.0 years
9 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities

Job Description: We are looking for a skilled ETL Tester to join our data engineering team. The ideal candidate will have hands-on experience in ETL testing using tools like Talend (preferred), Informatica PowerCenter, or IBM DataStage, along with a strong command of SQL for data validation and backend testing.

Key Responsibilities:
- Design, develop, and execute test cases for ETL processes.
- Perform data validation and data integrity testing across various data sources and destinations.
- Work closely with developers, data engineers, and business analysts to understand ETL requirements.
- Troubleshoot data issues and identify root causes through SQL analysis.
- Perform regression, system, integration, and UAT testing.
- Validate ETL performance and load testing as needed.
- Create detailed documentation of test cases, test results, and defect reports.

Required Skills:
- ETL Tools: Proficiency in Talend (preferred), Informatica PowerCenter, or IBM DataStage.
- SQL Expertise: Strong experience in writing complex SQL queries for data comparison, validation, and debugging.
- Data Warehousing Concepts: Good understanding of data models, schemas, and data flows in a DWH environment.
- Testing Skills: Experience in functional, integration, regression, and system testing of ETL workflows.
- Defect Management Tools: Exposure to tools like JIRA, Bugzilla, or similar.
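As an illustration of the SQL-driven validation this posting centers on, below is a minimal sketch of a source-to-target reconciliation check of the kind such testing typically involves. The table and column names (stg_orders, dw_orders, order_amount) are hypothetical, and the generic ANSI-style syntax may need small adjustments for a specific database.

    -- Compare row counts and a numeric checksum between a staging source
    -- and a warehouse target; a mismatch in either row flags a load defect
    -- for SQL-based root-cause analysis.
    SELECT 'row_count' AS check_name,
           (SELECT COUNT(*) FROM stg_orders) AS source_value,
           (SELECT COUNT(*) FROM dw_orders)  AS target_value
    UNION ALL
    SELECT 'amount_checksum',
           (SELECT SUM(order_amount) FROM stg_orders),
           (SELECT SUM(order_amount) FROM dw_orders);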
Posted 2 weeks ago
0 years
0 Lacs
Thane, Maharashtra, India
On-site
Role: ETL Developer
Location: Thane
Work Mode: Work From Office only
Workdays: Monday to Friday

About company: It is a cutting-edge FinTech and RegTech company headquartered in Atlanta, USA, with an R&D center in Thane, India. We specialize in solving complex problems in the banking and payments industry using AI, machine learning, and big data analytics. Our flagship product is an AI-powered platform designed for banks, fintechs, and payment processors. It simplifies operations across four critical areas:
✅ Compliance
✅ Fraud Detection
✅ Reconciliation
✅ Analytics
Built on the powerful HPCC Systems platform, it helps financial institutions improve data accuracy, reduce risk, and increase operational efficiency.

Job Description:
• Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities.
• Investigate data to identify potential issues within ETL pipelines, notify end-users, and propose adequate solutions.
• Develop and implement data collection systems and other strategies that optimize statistical efficiency and data quality.
• Acquire data from primary or secondary data sources and maintain databases/data systems.
• Identify, analyze, and interpret trends or patterns in complex data sets.
• Work closely with management to prioritize business and information needs.
• Locate and define new process improvement opportunities.
• Prepare documentation for further reference.
• Perform quality testing and data assurance.
• High attention to detail.
• Passionate about complex data structures and problem solving.

Qualifications:
• Bachelor’s degree in computer science, electrical engineering, or information technology.
• Experience working in IT.
• Experience working with complex data sets.
• Knowledge of at least one ETL tool (SSIS, Informatica, Talend, etc.).
• Knowledge of HPCC Systems and C++ preferred.
• Familiarity with Kafka on-premise architectures and ELK.
• Understanding of cross-cluster replication, index lifecycle management, and hot-warm architectures.
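To make the "investigate data to identify potential issues" duty concrete, here is a hedged sketch of two common data-quality probes; the transactions table and its columns are hypothetical.

    -- Probe 1: rows with a missing business key that would break downstream joins.
    SELECT COUNT(*) AS null_account_ids
    FROM transactions
    WHERE account_id IS NULL;

    -- Probe 2: duplicate business keys that should be deduplicated upstream.
    SELECT transaction_ref, COUNT(*) AS occurrences
    FROM transactions
    GROUP BY transaction_ref
    HAVING COUNT(*) > 1;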
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior ETL & Data Streaming Engineer at DataFlow Group, a global provider of Primary Source Verification solutions and background screening services, you will be a key player in the design, development, and maintenance of robust data pipelines. With over 10 years of experience, you will leverage your expertise in both batch ETL processes and real-time data streaming technologies to ensure efficient data extraction, transformation, and loading into our Data Lake and Data Warehouse.

Your responsibilities will include designing and implementing highly scalable ETL processes using industry-leading tools, as well as architecting batch and real-time data streaming solutions with technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into effective pipeline solutions, ensuring data quality, integrity, and security across all storage solutions. Monitoring, troubleshooting, and optimizing existing data pipelines for performance, cost-efficiency, and reliability will be a crucial part of your role. Additionally, you will develop comprehensive documentation for all ETL and streaming processes, contribute to data governance policies, and mentor junior engineers to foster a culture of technical excellence and continuous improvement.

To excel in this position, you should have 10+ years of progressive experience in data engineering, with a focus on ETL, ELT, and data pipeline development. Deep expertise in ETL tools like Talend, proficiency in data streaming technologies such as AWS Glue and Apache Kafka, and extensive experience with AWS data services like S3, Glue, and Lake Formation will be essential. Strong knowledge of traditional data warehousing concepts, dimensional modeling, programming languages like SQL and Python, and relational and NoSQL databases will also be required.

If you are a problem-solver with excellent analytical skills, strong communication abilities, and a passion for staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming, we invite you to join our team at DataFlow Group and make a significant impact in the field of data management.
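As a sketch of the batch side of this role, the following shows an incremental upsert from a staging table into a warehouse dimension, a staple pattern in ETL pipeline development. Table and column names are hypothetical, and MERGE syntax varies slightly across platforms.

    -- Upsert staged customer changes into the dimension table:
    -- update rows that already exist, insert the ones that don't.
    MERGE INTO dim_customer AS tgt
    USING stg_customer AS src
      ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN
      UPDATE SET tgt.email = src.email,
                 tgt.updated_at = src.extracted_at
    WHEN NOT MATCHED THEN
      INSERT (customer_id, email, updated_at)
      VALUES (src.customer_id, src.email, src.extracted_at);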
Posted 2 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Location: Chennai, Tamil Nadu, India
Job ID: R-231552
Date posted: 15/07/2025

Job Title: Senior Consultant - Coupa
GCL: D3

Introduction to role
Are you ready to disrupt an industry and change lives? As a Senior Consultant specializing in the Coupa Platform, you'll leverage your technical expertise to support the delivery of life-changing solutions. You'll act as the technical domain expert, driving program management and scaling efforts by collaborating with key stakeholders. Your role will be pivotal in transforming our ability to develop medicines that impact lives.

Accountabilities

Technical Ownership:
- Support Coupa technical solution design and implementation in alignment with design decisions.
- Participate in design discussions and contribute towards decisions.
- Engage in the full lifecycle of Coupa technical delivery—from concept to design to deployment and post-implementation stabilization.
- Gather high-level business requirements, perform analysis, define Coupa technology requirements, and design solutions based on completed analysis.

Integration & Middleware Oversight:
- Support mapping between legacy and target systems (Coupa), coordinating with middleware and interface teams.
- Define and validate integration points across systems, applications, and services.
- Support development and testing of APIs, messaging frameworks, error handling, push-pull mechanisms, and data pipelines.

Data Migration Execution:
- Support large-volume data migration activities, including mock runs, cutover rehearsal, and production go-live support.
- Ensure data cleansing, mapping rules, and exception handling are well-documented and implemented.
- Collaborate with business stakeholders to define data acceptance criteria and validation plans.

DevOps Skills:
- Demonstrate strong knowledge about the Coupa platform and its integrations.
- Actively assess system enhancements and deploy them in accordance with the latest platform product release.
- Identify process improvements and implement changes with clear outcomes of improvement and standardization.
- Undertake diagnostic work to understand specific technical issues or problems in greater depth.
- Manage change management end-to-end and support testing activities by triaging, scenario setting, etc.
- Deliver platform-based projects to improve adoption of the latest features.
- Resolve issues by partnering with technical, finance, procurement teams, and vendors.

Essential Skills/Experience
- Coupa certified
- 8+ years of overall IT experience with a solid background in Coupa technical delivery roles, with proven experience in large-scale data migration and middleware integration
- Experience with integration technologies such as MuleSoft, Dell Boomi, Azure Integration Services, Kafka, or similar
- Proficient in ETL tools and practices (e.g., Informatica, Talend, SQL-based scripting)
- Familiarity with cloud platforms (AWS, Azure, GCP) and hybrid integration strategies
- Strong problem-solving and analytical skills in technical and data contexts
- Ability to translate complex technical designs into business-aligned delivery outcomes
- Leadership in cross-functional and cross-technology environments
- Effective communicator capable of working with developers, data engineers, testers, and business stakeholders
- Experienced with IT Service Management tools like ServiceNow and Jira
- Experience in managing and developing 3rd-party business relationships

Educational Qualifications:
- UG: B.Tech/B.E. or other equivalent technical qualifications

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, we empower our teams to innovate and take ownership of their work. Our dynamic environment encourages experimentation with innovative technology while tackling challenges that have never been addressed before. With a focus on collaboration across diverse areas, we drive simplicity and frictionless interactions. Here you can design your own path with the support needed to thrive and develop. Our commitment to lifelong learning ensures that every day is an opportunity for growth. Ready to make a meaningful impact? Apply now to join us on this exciting journey!

Date Posted: 16-Jul-2025
Closing Date: 29-Jul-2025

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 2 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Location: Chennai, Tamil Nadu, India
Job ID: R-231549
Date posted: 15/07/2025

Job Title: Analyst - Coupa
GCL: C3

Introduction to role:
Are you ready to disrupt an industry and change lives? As an Analyst specializing in the Coupa Platform, you'll leverage your technical expertise to support the design, implementation, and integration of this progressive technology. You'll be the technical domain expert, leading change and scaling solutions by collaborating with stakeholders. Your work will directly impact our ability to develop life-changing medicines and empower the business to perform at its peak.

Accountabilities:

Technical Ownership:
- Support Coupa technical solution design and implementation in alignment with design decisions.
- Participate in design discussions and contribute towards decisions.
- Engage in the full lifecycle of Coupa technical delivery—from concept to design to deployment and post-implementation stabilization.
- Gather high-level business requirements, perform analysis, define Coupa technology requirements, and design solutions based on completed analysis.

Integration & Middleware Oversight:
- Support mapping between legacy and target systems (Coupa), coordinating with middleware and interface teams.
- Define and validate integration points across systems, applications, and services.
- Support development and testing of APIs, messaging frameworks, error handling, push-pull mechanisms, and data pipelines.

Data Migration Execution:
- Support large-volume data migration activities, including mock runs, cutover rehearsal, and production release support.
- Ensure data cleansing, mapping rules, and exception handling are well-documented and implemented.
- Collaborate with business stakeholders to define data acceptance criteria and validation plans.

DevOps Skills:
- Demonstrate strong knowledge about the Coupa platform and its integrations.
- Actively assess system enhancements and deploy them in accordance with the latest platform product release.
- Identify process improvements and implement change with clear outcomes of improvement and standardization.
- Undertake diagnostic work to understand specific technical issues or problems in greater depth.
- Manage change management end-to-end and support testing activities by triaging, scenario setting, etc.
- Deliver platform-based projects to improve adoption of the latest features.
- Resolve issues by partnering with technical, finance, procurement teams, and vendors.

Essential Skills/Experience:
- Coupa certified
- 6+ years of overall IT experience with a good background in Coupa technical delivery roles
- Experience with integration technologies such as MuleSoft, Dell Boomi, Azure Integration Services, Kafka, or similar
- Proficient in ETL tools and practices (e.g., Informatica, Talend, SQL-based scripting)
- Familiarity with cloud platforms (AWS, Azure, GCP) and hybrid integration strategies
- Strong problem-solving and analytical skills in technical and data contexts
- Ability to translate complex technical designs into business-aligned delivery outcomes
- Leadership in cross-functional and cross-technology environments
- Effective communicator capable of working with developers, data engineers, testers, and business stakeholders
- Experienced with IT Service Management tools like ServiceNow and Jira
- Experience in managing and developing 3rd-party business relationships

Educational Qualifications:
- UG: B.Tech/B.E. or other equivalent technical qualifications

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, you'll be part of a dynamic environment where innovation thrives. Our commitment to innovative science combined with leading digital technology platforms empowers us to make a significant impact. With a spirit of experimentation and collaboration across diverse teams, we drive cross-company change to disrupt the industry. Here, you can explore new technologies, shape your own path, and contribute to developing life-changing medicines. Ready to make a difference? Apply now to join our journey!

Date Posted: 16-Jul-2025
Closing Date: 29-Jul-2025

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
SQL and Talend/Snowflake ETL for a banking client.
Posted 2 weeks ago
2.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Intermediate Programmer Analyst position is an intermediate-level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities. Based in India, you will report directly to the team lead.

Your responsibilities will include developing projects from detailed business requirements, working through solutions, and managing execution and rollout of these solutions within a consistent global platform. You will also create T-SQL queries/stored procedures/functions/triggers using SQL Server 2014 and 2017, understand basic data warehousing concepts, and design/develop SSIS packages to pull data from various source systems and load to target tables. Additionally, you may be required to develop dashboards and reports using SSRS and work on BAU JIRAs. Providing detailed analysis and documentation of processes and flows when necessary is also part of the role.

You will consult with users, clients, and other technology groups on issues, recommend programming solutions, and install and support customer exposure systems. Analyzing applications to identify vulnerabilities and security issues, as well as conducting testing and debugging, are crucial responsibilities. The role requires the ability to operate with a limited level of direct supervision, exercising independence of judgment and autonomy.

Qualifications:
- 4-8 years of overall IT experience with 2+ years in the Financial Services industry
- Strong understanding of Microsoft SQL Server, SSIS, SSRS, Autosys
- 2+ years of experience in any ETL tool, preferably SSIS
- Some knowledge of Python can be a differentiator
- Highly motivated, with the ability to multitask and work under pressure
- Strong analytical and problem-solving skills
- Good to have: Talend, GitHub knowledge
- Strong knowledge of database fundamentals and advanced concepts, with the ability to write efficient SQL
- Experience with a reporting tool (e.g., SSRS, QlikView) is a plus
- Experience with a job scheduling tool (e.g., Autosys)
- Experience in the finance industry is desired
- Experience with all phases of the Software Development Life Cycle

Education:
- Bachelor's degree/University degree or equivalent experience

This job description gives a comprehensive overview of the types of work performed. Other job-related duties may be assigned as required.
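As a small illustration of the T-SQL work this posting lists, here is a hedged sketch of a SQL Server stored procedure; the schema and object names (dbo.Trades, usp_GetTradesByDate) are hypothetical.

    -- Return all trades booked on a given date.
    CREATE PROCEDURE dbo.usp_GetTradesByDate
        @TradeDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;  -- suppress row-count messages for cleaner client calls
        SELECT TradeId, Symbol, Quantity, Price
        FROM dbo.Trades
        WHERE TradeDate = @TradeDate;
    END;

    -- Usage: EXEC dbo.usp_GetTradesByDate @TradeDate = '2025-07-16';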
Posted 2 weeks ago
7.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Data Engineering Technology Lead position is a senior-level role where you will be responsible for establishing and implementing new or revised data platform ecosystems and programs in coordination with the Technology team. Your main objective will be to lead the data engineering team in implementing the business requirements.

Your responsibilities will include designing, building, and maintaining batch or real-time data pipelines in the data platform, as well as optimizing the data infrastructure for accurate extraction, transformation, and loading of data from various sources. You will develop ETL processes to extract and manipulate data from multiple sources, automate data workflows, and prepare raw data in Data Warehouses for technical and non-technical stakeholders. Additionally, you will partner with data scientists and functional leaders to deploy machine learning models; build, maintain, and deploy data products for analytics and data science teams; ensure data accuracy, integrity, privacy, security, and compliance; and monitor data systems performance while implementing optimization strategies. You will also collaborate with management teams to integrate functions, resolve high-impact problems/projects, provide expertise in applications programming, ensure application design aligns with the overall architecture blueprint, and develop comprehensive knowledge of how different business areas integrate to achieve goals.

To qualify for this role, you should have 12+ years of total experience with at least 7 years in a relevant data engineering role. Advanced SQL skills, experience with relational databases, and proficiency in object-oriented languages like Python/PySpark are necessary. Experience with data ingestion tools such as Talend and Ab Initio, data lakehouse architectures like Iceberg/Starburst, scripting languages like Bash, and data pipeline and workflow management tools, along with strong project management and organizational skills, are also required. You should possess excellent problem-solving, communication, and organizational skills, a proven ability to work independently and with a team, and experience in managing and implementing successful projects. Demonstrated leadership, project management skills, and clear communication are key qualities for this role.

A Bachelor's degree/University degree or equivalent experience is a minimum educational requirement, with a Master's degree being preferred. Please note that this job description provides a high-level overview of the work performed, and other job-related duties may be assigned as required.
Posted 2 weeks ago
3.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are invited to apply for the position of Lead / Senior ETL & Data Migration QA Engineer at our company based in Hyderabad, India (mandatory 5 days working from office). With 4 to 12 years of experience, you will be a key member of our Quality Assurance team, focusing on a high-impact data migration project. Your responsibilities will include ETL testing, data validation, and cloud migration, utilizing your expertise in SQL, ETL tools (preferably Talend), and cloud platforms like Snowflake. This role necessitates leadership in overseeing QA efforts across global teams to ensure the accuracy of large-scale data transformations.

Your main duties will involve:
- Designing and implementing robust test strategies and plans for data migration and ETL processes.
- Developing and executing detailed test cases, scripts, and plans to validate data accuracy, completeness, and consistency.
- Conducting advanced SQL-based data validation and transformation testing.
- Utilizing ETL tools such as Talend, Informatica PowerCenter, or DataStage to validate data pipelines, and testing semi-structured data formats like JSON and XML.
- Leading QA activities for cloud data migration projects, particularly to Snowflake, and coordinating testing activities across onshore and offshore teams to ensure timely and quality delivery.
- Documenting test results and defects, and collaborating with development teams on resolution.
- Contributing to automated testing frameworks for ETL processes.
- Promoting QA best practices and driving continuous improvement initiatives.

To be eligible for this position, you should have at least 3 years of experience in QA with a focus on ETL testing, data validation, and data migration. Proficiency in SQL for complex queries and data validation is essential, along with hands-on experience in Talend (preferred), Informatica PowerCenter, or DataStage. Experience with cloud data platforms, especially Snowflake, and a strong understanding of semi-structured data formats (JSON, XML) are required. Your excellent analytical and problem-solving skills, along with experience working in distributed teams and leading QA efforts, will be highly valuable in this role.

Preferred skills include experience with automated testing tools for ETL processes, knowledge of data governance and data quality standards, familiarity with AWS or other cloud ecosystems, and an ISTQB or equivalent certification in software testing. If you are passionate about quality assurance, data migration, and ETL processes, and possess the required qualifications and skills, we encourage you to apply for this challenging and rewarding opportunity.
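To ground the semi-structured testing responsibility above, here is a hedged sketch of a Snowflake check over JSON landed in a VARIANT column; the landing_events table and its JSON paths are hypothetical.

    -- Surface records whose required JSON field is missing, which would
    -- fail a completeness check before data moves past the landing zone.
    SELECT raw:customer.id::STRING    AS customer_id,
           raw:customer.email::STRING AS email
    FROM landing_events
    WHERE raw:customer.id IS NULL;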
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a Talend ETL Lead, you will be responsible for leading the design and development of scalable ETL pipelines using Talend, integrating with big data platforms, and mentoring junior developers. This is a high-impact, client-facing role requiring hands-on leadership and solution ownership.

Responsibilities:
- Lead the end-to-end development of ETL pipelines using Talend Data Fabric.
- Collaborate with data architects and business stakeholders to understand requirements.
- Build and optimize data ingestion, transformation, and loading processes.
- Ensure high performance, scalability, and reliability of data solutions.
- Mentor and guide junior developers in the team.
- Troubleshoot and resolve ETL-related issues quickly.
- Manage deployments and promote code through different environments.

Qualifications:
- 7+ years of experience in ETL/Data Engineering.
- Strong hands-on experience with Talend Data Fabric.
- Solid understanding of SQL and the Hadoop ecosystem (HDFS, Hive, Pig, etc.).
- Experience building robust data ingestion pipelines.
- Excellent communication and leadership skills.
Posted 2 weeks ago
5.0 years
0 Lacs
India
Remote
Job Title: Talend with SAP Master Data Specialist for S/4HANA Public Cloud Migration
Remote | Long Term/Full Time

We are seeking a highly skilled and experienced Talend & SAP Master Data Specialist to play a critical role in our SAP S/4HANA Public Cloud data migration project. The ideal candidate will possess deep expertise in the Talend data integration platform and extensive knowledge of SAP Master Data, specifically within the context of SAP S/4HANA Public Cloud. This role will be instrumental in ensuring the successful extraction, transformation, loading, cleanup, and enrichment of data for our new S/4HANA environment.

Responsibilities:
- Data Migration Strategy & Execution: Collaborate with the project team to define and implement data migration strategies for SAP S/4HANA Public Cloud, leveraging Talend for efficient data extraction, transformation, and loading (ETL).
- Talend Development: Design, develop, test, and deploy robust and scalable ETL jobs using Talend Data Integration (or similar Talend products) to migrate master and transactional data from various source systems to SAP S/4HANA Public Cloud.
- SAP Master Data Expertise: Apply in-depth knowledge of SAP Master Data objects (e.g., Material Master, Customer Master, Vendor Master, Business Partner, GL Accounts, etc.) to ensure data integrity, consistency, and adherence to S/4HANA Public Cloud best practices.
- Data Quality & Governance: Lead data cleanup and enrichment activities, identifying and resolving data discrepancies, inconsistencies, and incompleteness. Implement data quality rules and processes to ensure high-quality data in the target S/4HANA system.
- Data Mapping & Transformation: Develop complex data mappings and transformations between source systems and SAP S/4HANA Public Cloud, ensuring accurate and efficient data conversion.
- Collaboration: Work closely with functional consultants, business users, and technical teams to understand data requirements, validate transformed data, and resolve any data-related issues.
- Documentation: Create and maintain comprehensive documentation for data migration processes, ETL jobs, data mappings, and data quality rules.
- Troubleshooting & Optimization: Identify and troubleshoot data migration issues and performance bottlenecks, and provide effective solutions. Optimize ETL processes for efficiency and performance.

Required Skills & Qualifications:
- 5+ years of hands-on experience with Talend Data Integration (or other relevant Talend products) for complex data migration and integration projects.
- Extensive experience (5+ years) with SAP Master Data concepts, structures, and best practices across various modules (e.g., SD, MM, FICO) in an SAP ECC or S/4HANA environment.
- Proven experience with SAP S/4HANA Public Cloud data migration projects is highly desirable.
- Strong understanding of data governance, data quality, and data stewardship principles.
- Proficiency in SQL and experience with various database systems.
- Experience with data profiling, data cleansing, and data enrichment techniques.
- Excellent analytical and problem-solving skills with a keen eye for detail.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and business stakeholders.
- Ability to work independently and as part of a team in a fast-paced project environment.

Preferred Qualifications:
- Certifications in Talend or SAP S/4HANA.
- Experience with other data migration tools or methodologies.
- Knowledge of SAP Activate methodology.
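As a flavor of the data cleanup work described above, here is a hedged sketch of a pre-migration duplicate probe; legacy_vendor_master and its columns are hypothetical illustrations, not SAP-standard tables.

    -- Find vendor records that collapse to the same normalized name, so
    -- they can be merged or flagged before loading into S/4HANA.
    SELECT UPPER(TRIM(vendor_name)) AS normalized_name,
           COUNT(*)                 AS record_count
    FROM legacy_vendor_master
    GROUP BY UPPER(TRIM(vendor_name))
    HAVING COUNT(*) > 1;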
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Title: Data Architect
Location: Bangalore/Hyderabad/Remote

Key Responsibilities:
- Design and implement scalable, secure, and high-performance data architecture solutions tailored for logistics operations.
- Define data standards, models, and governance policies across heterogeneous data sources (e.g., EDI, ERP, TMS, WMS).
- Architect and optimize data pipelines to enable real-time analytics and reporting for warehouse management, freight, and inventory systems.
- Collaborate with business stakeholders to translate operational logistics needs into actionable data strategies.
- Ensure system reliability, data security, and compliance with relevant regulations.
- Evaluate and recommend tools and platforms, including cloud-based data services (Azure, AWS, GCP).
- Lead data integration efforts, including legacy systems migration and EDI transformations.

Required Skills & Qualifications:
- Proven experience as a Data Architect in logistics, transportation, or supply chain domains.
- Strong understanding of EDI formats, warehouse operations, fleet data, and logistics KPIs.
- Hands-on experience with data modeling, ETL, ELT, and data warehousing.
- Expertise in cloud platforms (Azure preferred), relational and NoSQL databases, and BI tools.
- Knowledge of data governance, security, and data lifecycle management.
- Familiarity with tools like Informatica, Talend, SQL Server, Snowflake, or BigQuery is a plus.
- Excellent analytical thinking and stakeholder communication skills.
Posted 2 weeks ago
6.0 - 9.0 years
10 - 15 Lacs
Noida
Work from Office
Must haves: C#, .NET Core, .NET 8, Azure, Microservices, SQL Server. The candidate needs to have 6 to 9 years of relevant experience.

- Writing applications using Azure Functions, Azure Logic Apps, Azure Container Apps, and microservices
- Authentication and authorization of applications such as APIs and other components (Functions, etc.)
- Developing applications using the latest .NET stack (.NET 8), including unit tests and integration tests
- Strong development experience in C# and .NET Core technologies built up across a range of different projects
- Database experience with SQL Server: writing complex queries, indexes, procedures, and functions
- Experience with Git repositories (Azure DevOps repos) and CI/CD pipelines knowledge (pipeline authoring is not a must)
- API integrations and testing them on local environments as well as on Azure, with tools like Postman or other API test tools; experience in implementing Swagger docs
- Ability and willingness to learn quickly and adapt to a fast-changing environment, with a strong interest in continuous improvement and delivery
- Strong problem-solving skills and a good understanding of best practices and the importance of test automation processes
- P&C industry domain experience would be a huge plus (nice to have)
- Documentation of applications (nice to have)
- Azure Data Factory, Azure APIM, Talend (nice to have)
- Knowledge of, or experience with, Talend would be a plus as it is our client's ETL tool (nice to have)

Shift timing: 2 PM-11 PM IST.

Mandatory Competencies:
- Programming Language - .Net - .NET Core
- Beh - Communication and collaboration
- Cloud - Azure - ServerLess (Function App, Logic App)
- Database - SQL Server - SSIS/SSAS
- Middleware - API Middleware - Microservices
- Programming Language - Other Programming Language - C#
Posted 2 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives
Posted 2 weeks ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The impact you will have in this role:
The Enterprise Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL, and data analysis on the bespoke Snowflake data warehouse for Enterprise Intelligence, and will be responsible for managing a growing team of consultants and employees and running the development and production support teams for the Enterprise Intelligence team at DTCC.

Your Primary Responsibilities:
- Working on and leading engineering and development focused projects from start to finish with minimal supervision
- Providing technical and operational support for our customer base as well as other technical areas within the company that utilize Claw
- Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk- and audit-related objectives
- Ensuring incident, problem, and change tickets are addressed in a timely fashion, as well as escalating technical and managerial issues
- Following DTCC's ITIL process for incident, change, and problem resolution
- Managing delivery and production support teams
- Driving delivery independently and autonomously within team and vendor teams
- Liaising with onshore peers to drive seamless quality of service to stakeholders
- Conducting working sessions with users and SMEs to align reporting and reduce use of offline spreadsheets and potentially stale data sources
- Providing technical leadership for projects
- Working closely with other project managers and scrum masters to create and update project plans
- Working closely with peers to improve workflow processes and communication

Qualifications:
- 8+ years of related experience
- Bachelor's degree (preferred) or equivalent experience

Talents Needed for Success:
- Minimum of 12 years of related experience
- Minimum of 8 years of experience in managing data warehousing, SQL, and Snowflake
- Minimum of 5 years of experience in people management and team leadership
- Ability to manage distributed teams with an employee/vendor mix
- Strong understanding of snowflake schemas and data integration methods and tools
- Strong knowledge of managing data warehouses in a production environment, including all phases of lifecycle management: planning, design, deployment, upkeep, and retirement
- Ability to meet deadlines, goals, and objectives
- Ability to facilitate educational and working sessions with stakeholders and users
- Self-starter, continually striving to improve the team's service offerings and one's own skillset
- A problem-solving and innovative mindset to meet a wide variety of challenges
- Willingness and ability to learn all aspects of our operating model as well as new tools
- Developed competencies around essential project management, communication (oral, written), and personal effectiveness
- Good SQL skills and good knowledge of relational databases, specifically Snowflake
- Ability to manage agile development cycles within the DTCC SDLC (SDP) methodology
- Good knowledge of the technical components of Claw (i.e., Snowflake, Talend, PowerBI, PowerShell, Autosys)
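As a sketch of the kind of Snowflake pipeline plumbing such a role can involve, the following defines a scheduled task that rebuilds a summary table; the warehouse, table, and task names are hypothetical, and the actual pattern will differ per environment.

    -- Rebuild the position summary every morning at 06:00 UTC.
    CREATE OR REPLACE TASK refresh_position_summary
      WAREHOUSE = etl_wh
      SCHEDULE  = 'USING CRON 0 6 * * * UTC'
    AS
      INSERT OVERWRITE INTO position_summary
      SELECT account_id, SUM(quantity) AS total_quantity
      FROM positions
      GROUP BY account_id;

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK refresh_position_summary RESUME;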
Posted 2 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
The Impact you will have in this role:
We are seeking a skilled technical troubleshooting resource to join our dynamic team. The ideal candidate will have good experience in understanding web applications, coupled with the ability to troubleshoot user issues and resolve technical issues related to relational databases and web applications built using Appian. The candidate should be eager to learn and able to prioritize between multiple tasks while communicating effectively with a globally dispersed team.

What You'll Do:
- Assist in troubleshooting and resolving technical issues for Appian-based web applications that use relational databases, with the goal of ensuring minimal downtime and optimal performance.
- Maintain a strong understanding of relational databases, with a focus on data integrity, performance, and security.
- Help diagnose and fix problems in Talend workflows, focusing on data extraction, transformation, and loading processes.
- Assist in handling Autosys-based job scheduling and automation, ensuring smooth execution of batch jobs and workflows.
- Collaborate with cross-functional teams to gather requirements, design solutions, and implement troubleshooting strategies.
- Document and track issues, resolutions, and best practices to improve the overall troubleshooting process.
- Provide technical support during production releases and maintenance windows, working closely with the Operations team.
- Stay up to date with the latest industry trends and best practices in troubleshooting and technical support.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Talents Needed for Success:
- Minimum of 3 years of experience in technical troubleshooting and support
- Proficiency in relational databases, including database management and query writing
- Understanding of web applications and integration patterns
- Familiarity with Talend and basic knowledge of ETL processes
- Familiarity with job scheduling and automation
- Strong analytical and problem-solving skills, with the ability to work independently and as part of a team
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels

Additional Skills needed for Success:
- Knowledge of SQL Server is desired
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud) is beneficial
- Exposure to Low Code/No Code systems such as Appian is a plus
- Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence)
- Knowledge of scripting languages such as Python and Shell/Batch programming is a plus
- Understanding of Agile processes and methodologies, with experience working in an Agile framework using Scrum
Posted 2 weeks ago
0 years
0 Lacs
Chennai
On-site
- Talend: designing and developing the technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
- Very strong on PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc. to Extract, Load, and Transform data.
- Good to have Talend knowledge and hands-on experience. Candidates who have worked in PROD support would be preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures (see the Time Travel sketch at the end of this posting).
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and an ever-improving approach.
- Desirable to have Talend/Snowflake certification.
- Excellent SQL coding skills.
- Excellent communication and documentation skills.
- Familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated.
- Work effectively within a global team environment.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us.

Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
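The Time Travel sketch referenced above: a hedged example of one of the Snowflake utilities this posting names; the orders table is hypothetical.

    -- Compare a table's current row count with its state one hour ago,
    -- handy when investigating a reported data issue in PROD support.
    SELECT COUNT(*) AS current_rows FROM orders;

    SELECT COUNT(*) AS rows_one_hour_ago
    FROM orders AT(OFFSET => -3600);  -- seconds back in time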
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary

Position Summary
The Artificial Intelligence & Engineering (AI&E) portfolio is an integrated set of offerings that addresses our clients’ heart-of-the-business issues. This portfolio combines our functional and technical capabilities to help clients transform, modernize, and run their existing technology platforms across industries. As our clients navigate dynamic and disruptive markets, these solutions are designed to help them drive product and service innovation, improve financial performance, accelerate speed to market, and operate their platforms to innovate continuously.

Role: Testing - ETL or Data or DW
Level: Consultant

As a Consultant at Deloitte Consulting, you will be responsible for individually delivering high-quality work products within due timelines in an agile framework. On a need basis, consultants will be mentoring and/or directing junior team members and liaising with onsite/offshore teams to understand the functional requirements. As an ETL/DW Tester, you will be responsible for analyzing non-functional requirements, designing test scenarios/cases/scripts and the RTM, performing test execution, documenting defects, triaging defects, documenting results, and attaining sign-off of the design and results.

The work you will do includes:
- Participating in business requirements discussions with the client, onshore, or offshore teams and attaining any clarifications to design test scenarios/cases/scripts
- Designing clean, efficient, and well-documented ETL and integration test scenarios/scripts and maintaining industry and client standards, based on business requirement analysis
- Understanding data mappings and being able to create data mappings
- Provisioning the test data required to perform test execution
- Performing integration/ETL/data migration test execution, logging defects, tracking and triaging defects to closure, and documenting test results
- Creating performance test results/reports
- Tracking and resolving dependencies that impact test activities
- Reporting and escalating any risks/issues which are blocking test activities

Qualification

Skills / Project Experience:

Must Have:
- 3-6 years of hands-on experience in testing data validation scenarios and data ingestion, pipelines, and transformation processes
- Hands-on experience with cloud platforms like AWS, Azure, GCP, etc.
- Hands-on experience and strong knowledge of complex SQL queries, including aggregation logic
- An understanding of big data engineering tools and how they can be used strategically (e.g., Spark, Hive, Hadoop, Dask)
- Experience in analyzing data mappings and functional requirements and converting them into test scenarios/cases
- Experience in different test phases with standardized QA processes
- Experience with QA-specific tools for test management, test execution progress, and defect tracking and triaging
- Experience in different SDLC/STLC lifecycles and methodologies
- Understanding of test strategy and test planning
- Experience in defect logging, tracking, and closing defects
- Experience in test status reporting and managing a small team of 2-3 members (for tenured consultants)
- Strong understanding of different software development life cycles (Agile, waterfall) and contemporary software quality assurance processes and automated tools
- Knowledge and 3+ years of experience with modern testing development practices and integrated testing products such as PySpark-based automation and market tools (QuerySurge, Talend, ETL Validator, etc.) and their integration with tools such as GitLab, etc.
- Experience with scripting languages like Unix Shell or Python for automating ETL testing
- Hands-on experience with data modeling and data profiling tools
- Experience performing analysis of data migration from various sources to target systems in cloud platform/on-prem environments
- Flexibility to adapt and apply innovation to varied business domains and apply technical solutioning and learnings to use cases across business domains and industries

Good to Have:
- 2+ years of experience in API testing (JSON, XML), API automation, creation of virtualized services, and service virtualization testing using any of the industry tools like CA DevTest/Parasoft/CA LISA
- Ability to perform estimation of test activities using quantitative methods
- Knowledge of and experience working with Microsoft Office tools
- Experience in analyzing slow-performing database queries and execution plan analysis

Education:
B.E./B.Tech/M.C.A./M.Sc (CS) degree or equivalent from an accredited university

Prior Experience:
3-6 years of experience working with ETL testing and data migration testing

Location: BLR/HYD

The team
Deloitte Consulting LLP’s Technology Consulting practice is dedicated to helping our clients build tomorrow by solving today’s complex business problems involving strategy, procurement, design, delivery, and assurance of technology solutions. Our service areas include analytics and information management, delivery, cyber risk services, and technical strategy and architecture, as well as the spectrum of digital strategy, design, and development services. The Core Business Operations practice optimizes clients’ business operations and helps them take advantage of new technologies. It drives product and service innovation, improves financial performance, accelerates speed to market, and operates client platforms to innovate continuously. Learn more about our Technology Consulting practice on www.deloitte.com.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose.
To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300589
Posted 2 weeks ago
7.0 - 10.0 years
4 - 7 Lacs
Bengaluru
On-site
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Join Kyndryl as a Data Analyst where you will unlock the power of data to drive strategic decisions and shape the future of our business. As a key member of our team, you will harness your expertise in basic statistics, business fundamentals, and communication to uncover valuable insights and transform raw data into rigorous visualizations and compelling stories.

In this role, you will have the opportunity to work closely with our customers as part of a top-notch team. You will dive deep into vast IT datasets, unraveling the mysteries hidden within, and discovering trends and patterns that will revolutionize our customers' understanding of their own landscapes. Armed with your advanced analytical skills, you will draw compelling conclusions and develop data-driven insights that will directly impact their decision-making processes.

Your role goes beyond traditional data analysis. You will be a trusted advisor, utilizing your domain expertise, critical thinking, and consulting skills to unravel complex business problems and translate them into innovative solutions. Your proficiency in cutting-edge software tools and technologies will empower you to gather, explore, and prepare data – ensuring it is primed for analysis, business intelligence, and insightful visualizations.

Collaboration will be at the heart of your work. As a Data Analyst at Kyndryl, you will collaborate closely with cross-functional teams, pooling together your collective expertise to gather, structure, organize, and clean data. Together, we will ensure the data is in its finest form, ready to deliver actionable insights.

Your unique ability to communicate and empathize with stakeholders will be invaluable. By understanding the business objectives and success criteria of each project, you will align your data analysis efforts seamlessly with our overarching goals. With your mastery of business valuation, decision-making, project scoping, and storytelling, you will transform data into meaningful narratives that drive real-world impact.

At Kyndryl, we believe that data holds immense potential, and we are committed to helping you unlock that potential. You will have access to vast repositories of data, empowering you to delve deep to determine root causes of defects and variation. By gaining a comprehensive understanding of the data and its specific purpose, you will be at the forefront of driving innovation and making a difference.

If you are ready to unleash your analytical ability, collaborate with industry experts, and shape the future of data-driven decision making, then join us as a Data Analyst at Kyndryl. Together, we will harness the power of data to redefine what is possible and create a future filled with limitless possibilities.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Key Responsibilities:
- Design and develop interactive dashboards and reports using tools like Power BI, Tableau, or Looker.
- Build and maintain ETL/ELT pipelines to support analytics and reporting use cases.
- Work with stakeholders to gather business requirements and translate them into technical specifications.
- Perform data modeling (star/snowflake schemas), data wrangling, and transformation (see the star-schema sketch at the end of this posting).
- Ensure data quality, accuracy, and integrity across reporting layers.
- Collaborate with Data Architects, Analysts, and Engineers to design and implement data solutions that scale.
- Automate report generation, data refreshes, and alerting workflows.
- Maintain version control, CI/CD practices, and documentation for dashboards and data pipelines.

Technical Skills Required:
- 7 to 10 years of overall experience, including 5 years of strong experience with BI tools: Power BI, Tableau, or similar
- Proficiency in SQL for data querying and transformations
- Hands-on experience with ETL tools (e.g., Azure Data Factory, SSIS, Talend, dbt)
- Experience with data warehouses (e.g., Snowflake, Azure Synapse, BigQuery, Redshift)
- Programming/scripting knowledge in Python or PySpark (for data wrangling or pipeline development)
- Familiarity with cloud platforms: Azure, AWS, or GCP (Azure preferred)
- Understanding of data governance, security, and role-based access control (RBAC)

Preferred Qualifications:
- Experience with CI/CD tools (e.g., GitHub Actions, Azure DevOps)
- Exposure to data lake architectures, real-time analytics, and streaming data (Kafka, Event Hub, etc.)
- Knowledge of DAX, MDX, or custom scripting for BI calculations
- Cloud certifications (e.g., Azure Data Engineer, AWS Data Analytics – Specialty)

Soft Skills:
- Strong analytical mindset and attention to detail
- Effective communication and stakeholder management skills
- Ability to manage multiple tasks and priorities in fast-paced environments
- Self-starter with a proactive problem-solving approach

Preferred Skills and Experience:
- Degree in a quantitative discipline, such as industrial engineering, finance, or economics
- Knowledge of data analysis tools and programming languages (e.g., Looker, Power BI, QuickSight, BigQuery, Azure Synapse, Python, R, or SQL)
- Professional certification, e.g., ASQ Six Sigma
- Cloud platform certification, e.g., AWS Certified Data Analytics – Specialty, Google Cloud Looker Business Analyst, or Microsoft Certified: Power BI Data Analyst Associate

Being You
Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.
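The star-schema sketch promised above: a minimal, hedged example of the dimensional modeling this posting lists, with one fact table keyed to two dimensions. All names and types are illustrative only.

    -- Two dimension tables...
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,  -- e.g. 20250716
        calendar_date DATE NOT NULL
    );

    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name VARCHAR(100) NOT NULL
    );

    -- ...and the fact table at the center of the star.
    CREATE TABLE fact_sales (
        date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
        product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
        units_sold  INTEGER NOT NULL,
        revenue     DECIMAL(12, 2) NOT NULL
    );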
What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru
On-site
Req ID: 332592
NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer (Talend & PySpark) to join our team in Bangalore, Karnātaka (IN-KA), India (IN).
Job Duties:
Key Responsibilities:
Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
Demonstrate proficiency in coding, using technologies such as PySpark and Talend to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
Collaborate seamlessly across diverse technical stacks, including AWS.
Develop and deliver detailed presentations to effectively communicate complex technical concepts.
Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
Adhere to Agile practices throughout the solution development process.
Design, build, and deploy databases and data stores to support organizational requirements.
Minimum Skills Required:
Basic Qualifications:
4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
Ability to travel at least 25%.
Preferred Skills:
Demonstrated production experience in core data platforms such as AWS.
Hands-on knowledge of cloud and distributed data storage, including expertise in S3 or other NoSQL storage systems.
Strong understanding of data integration technologies, encompassing Spark, Kafka, eventing/streaming, StreamSets, NiFi, and AWS Data Migration Services.
Professional written and verbal communication skills to effectively convey complex technical concepts.
Undergraduate or graduate degree preferred.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.
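For a sense of the PySpark production coding this posting calls for, here is a minimal, hypothetical batch-curation sketch: read raw JSON events from a landing zone, derive a partition key, drop malformed and duplicate records, and write partitioned Parquet. The paths and columns (event_ts, event_type, event_id) are invented for illustration.

```python
# Minimal PySpark batch job sketch; all paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-events-curation").getOrCreate()

raw = spark.read.json("/data/landing/raw_events/")  # hypothetical landing zone

curated = (
    raw
    .withColumn("event_date", F.to_date("event_ts"))  # derive a partition key
    .filter(F.col("event_type").isNotNull())          # drop malformed records
    .dropDuplicates(["event_id"])                     # keep re-runs idempotent
)

# Partitioning by date keeps downstream reads selective and reloads cheap.
curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/curated/events/"
)
```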
NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 2 weeks ago
8.0 years
0 Lacs
India
Remote
#Hiring #STLC #data migration #data mapping #data validation #data reconciliation #Selenium #Java #Python #Cross-platform #Oracle #Snowflake #SQL Server #BigQuery #SQL #ETL tools #DataStage #Talend #CI/CD tools #AWS #Jenkins #Git #Azure DevOps
Title: Data Migration Testing
Experience: 5 to 10+ years
Location: Remote
🔹 Experience Required: Minimum 5–8 years in software testing, with at least 3 years in data migration testing roles.
🔹 Key Responsibilities:
Participate in all phases of the Software Testing Life Cycle (STLC).
Design and execute test strategies and test cases for data migration and transformation projects.
Perform data mapping, data validation, and data reconciliation between source and target databases.
Identify, document, and track defects; work closely with developers and data teams to ensure resolution.
Use automation tools (e.g., Selenium) to streamline regression and functional testing where applicable.
Participate in Agile ceremonies (scrum, sprint planning, retrospectives) and contribute to story/test grooming.
Support CI/CD pipelines by integrating testing processes within build and deployment workflows.
Validate data pipelines and transformation rules, and ensure proper lineage and traceability of data migration.
🔹 Must-Have Skills:
Strong understanding of the Software Test Life Cycle and quality assurance processes.
Experience with both Agile and Waterfall project methodologies.
Proficient in Selenium or equivalent test automation tools, with solid scripting capabilities (e.g., Java, Python).
Extensive hands-on experience in data migration testing, including:
Cross-platform data movement (e.g., Oracle to Snowflake, SQL Server to BigQuery).
Complex data mapping and transformation validation.
Row-level and summary-level data reconciliation.
Familiarity with tools for database validation (e.g., SQL, ETL tools, DataStage, Talend).
Exposure to CI/CD tools like Jenkins, Git, Azure DevOps, etc.
🔹 Nice to Have:
API testing experience using Postman, RestAssured, or similar frameworks.
Knowledge of data quality frameworks and data profiling.
Experience with cloud platforms (AWS, Azure, GCP) in the context of data migration or integration.
Exposure to data masking and anonymization for secure testing environments.
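To make "row-level and summary-level data reconciliation" concrete, here is a minimal hedged sketch: run the same checks against source and target and compare the results. Two in-memory SQLite connections stand in for the real source and target engines (e.g., Oracle and Snowflake), and the customers table and its columns are invented.

```python
# Reconciliation sketch: identical queries run on both sides must return
# identical results. Table and column names are hypothetical; SQLite stands
# in for the real source/target engines.
import sqlite3

RECON_QUERIES = {
    "row_count":   "SELECT COUNT(*) FROM customers",                # summary-level
    "balance_sum": "SELECT ROUND(SUM(balance), 2) FROM customers",  # summary-level
    "key_range":   "SELECT MIN(customer_id), MAX(customer_id) FROM customers",  # spot check
}

def reconcile(src, tgt):
    """Run each check on source and target; True means the two sides agree."""
    return {name: src.execute(q).fetchone() == tgt.execute(q).fetchone()
            for name, q in RECON_QUERIES.items()}

if __name__ == "__main__":
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (src, tgt):
        db.execute("CREATE TABLE customers (customer_id INTEGER, balance REAL)")
        db.executemany("INSERT INTO customers VALUES (?, ?)",
                       [(1, 100.0), (2, 250.5), (3, 75.25)])
    print(reconcile(src, tgt))  # all True when source and target match
```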
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Staff Technical Product Manager
Are you excited about the opportunity to lead a team within an industry leader in Energy Technology? Are you passionate about improving capabilities, efficiency, and performance? Join our Digital Technology Team! As a Staff Technical Product Manager, this position will operate in lock-step with product management to create a clear strategic direction for build needs, and convey that vision to the service's scrum team. You will direct the team with a clear and descriptive set of requirements captured as stories, and partner with the team to determine what can be delivered through balancing the need for new features, defects, and technical debt.
Partner with the best
As a Staff Technical Product Manager, we are seeking a strong background in business analysis, team leadership, data architecture, and hands-on development skills. The ideal candidate will excel in creating roadmaps, planning with prioritization, resource allocation, key item delivery, and seamless integration of perspectives from various stakeholders, including Product Managers, Technical Anchors, Service Owners, and Developers. A results-oriented leader, capable of building and executing an aligned strategy, leading the data team and cross-functional teams to meet deliverable timelines.
As a Staff Technical Product Manager, you will be responsible for:
Demonstrating wide and deep knowledge in data engineering, data architecture, and data science, with the ability to guide, lead, and work with the team to drive to the right solution.
Engaging frequently (80%) with the development team; facilitating discussions, providing clarification, story acceptance and refinement, testing, and validation; contributing to design activities and decisions; familiarity with waterfall and the Agile scrum framework.
Owning and managing the backlog; continuously ordering and prioritizing to ensure that 1-2 sprints/iterations of backlog are always ready.
Collaborating with UX in design decisions, demonstrating deep understanding of the technology stack and its impact on the final product.
Conducting customer and stakeholder interviews and elaborating on personas.
Demonstrating expert-level skill in problem decomposition and the ability to navigate through ambiguity.
Partnering with the Service Owner to ensure a healthy development process and clear tracking metrics to form a standard and trustworthy way of providing customer support.
Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources.
Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis.
Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis.
Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness.
Implementing and maintaining data governance and security measures to protect sensitive data.
Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes.
Fuel your passion
To be successful in this role you will require:
A Bachelor's or higher degree in Computer Science, Information Systems, or a related field.
A minimum of 6-10 years of proven experience as a Data Engineer or in a similar role, working with large-scale data processing and storage systems.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
Extensive knowledge of working with SAP systems, T-codes, data pipelines in SAP, and Databricks-related technologies.
Experience building complex jobs for SCD-type mappings using ETL tools like PySpark, Talend, Informatica, etc.
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Strong problem-solving and analytical skills, with the ability to handle complex data challenges.
Excellent communication and collaboration skills to work effectively in a team environment.
Experience in data modeling, data warehousing, and ETL principles.
Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
Advanced knowledge of distributed computing and parallel processing.
Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).
Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
Certification in relevant technologies or data engineering disciplines.
Working knowledge of Databricks, Dremio, and SAP is highly preferred.
Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns (where applicable): working flexible hours – flexing the times when you work in the day to help you fit everything in and work when you are the most productive.
Working with us
Our people are at the heart of what we do at Baker Hughes. We know we are better when all our people are developed, engaged, and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent, and develop leaders at all levels to bring out the best in each other.
Working for you
Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we must push the boundaries today. We prioritize rewarding those who embrace challenge with a package that reflects how much we value their input. Join us, and you can expect: contemporary work-life balance policies and wellbeing activities.
About Us
With operations in over 120 countries, we provide better solutions for our customers and richer opportunities for our people. As a leading partner to the energy industry, we're committed to achieving net-zero carbon emissions by 2050, and we're always looking for the right people to help us get there – people who are as passionate as we are about making energy safer, cleaner, and more efficient.
Join Us
Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will challenge and inspire you!
About Us:
We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward – making it safer, cleaner, and more efficient for people and the planet.
Join Us:
Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward.
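As an illustration of the "SCD-type mappings" item in the requirements above, here is a hedged PySpark sketch of a Slowly Changing Dimension Type 2 update: the current version of each changed row is closed out and a new version is appended. The table and column names (dim_customer, stg_customer, customer_id, address) are invented, and the staging table is assumed to share the dimension's business columns.

```python
# SCD Type 2 sketch in PySpark; all table and column names are hypothetical.
# Assumes dim_customer = business columns + valid_from, valid_to, is_current,
# and that stg_customer shares the dimension's business columns.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

dim = spark.table("dim_customer")        # existing dimension (hypothetical)
incoming = spark.table("stg_customer")   # today's extract (hypothetical)

# 1. Changed rows: same business key, but a tracked attribute differs.
current = dim.filter("is_current = true")
changed = (incoming.alias("s")
    .join(current.alias("d"), "customer_id")
    .filter(F.col("s.address") != F.col("d.address"))
    .select("s.*"))

# 2. Close out the current version of each changed key ...
expired = (current.join(changed.select("customer_id"), "customer_id", "left_semi")
    .withColumn("valid_to", F.current_date())
    .withColumn("is_current", F.lit(False)))

# 3. ... and open a new version alongside it.
opened = (changed
    .withColumn("valid_from", F.current_date())
    .withColumn("valid_to", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True)))

# Keep everything except the just-expired current versions, so historical
# rows for changed keys are preserved.
kept = dim.exceptAll(current.join(changed.select("customer_id"),
                                  "customer_id", "left_semi"))

new_dim = kept.unionByName(expired).unionByName(opened)
new_dim.write.mode("overwrite").saveAsTable("dim_customer_new")
```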
Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. R147951
Posted 2 weeks ago
7.0 years
25 - 35 Lacs
Hyderabad, Telangana, India
Remote
We are seeking a highly skilled Lead Data Engineer / Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.
Experience: 7 - 12 years
Work Location: Hyderabad (Hybrid) / Remote
Mandatory skills: Python, SQL, Snowflake
Responsibilities
Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads.
Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage.
Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.
Required Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7 - 12+ years of experience in data engineering.
Cloud Platforms: Strong expertise in AWS/Azure data services.
Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
Programming: Proficiency in Python, Scala, or Java for data processing and automation.
ETL Tools: Experience with tools like Apache Airflow, Talend, dbt, or Informatica.
Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
Skills: Scala, ETL, Java, MongoDB, SQL, Python, Talend, GCP, data engineering, ELT, Azure, cloud, Snowflake, PostgreSQL, AWS, Apache Airflow, Cassandra, Informatica, dbt
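To illustrate the ETL/ELT orchestration at the heart of this role, below is a minimal Apache Airflow sketch (assuming Airflow 2.4+ for the schedule parameter) of a three-step extract-transform-validate pipeline. The DAG id, task names, and callables are invented placeholders; in practice the transform step might invoke a dbt model or Snowflake SQL.

```python
# Minimal Airflow DAG sketch: linear extract -> transform -> validate pipeline.
# All ids and callables are placeholders, not a real production DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    print("pull raw orders from the source system")  # placeholder for real ingestion

def transform_orders(**context):
    print("run transformations, e.g., a dbt model or Snowflake SQL")  # placeholder

def validate_orders(**context):
    print("compare row counts/sums between raw and transformed layers")  # placeholder

with DAG(
    dag_id="orders_elt_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    validate = PythonOperator(task_id="validate", python_callable=validate_orders)

    extract >> transform >> validate  # simple linear dependency chain
```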
Posted 2 weeks ago
5.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
We are looking for a highly experienced Senior Data Engineer to lead our data migration projects from on-premise systems to Azure Cloud, utilizing Azure Databricks, PySpark, SQL, and Python. The successful candidate will be responsible for designing and implementing robust, scalable cloud data solutions to enhance business operations and decision-making processes.
Responsibilities:
Design and implement end-to-end data solutions using Azure Databricks, PySpark, MS SQL Server, and Python for data migration from on-premise to Azure Cloud.
Develop architectural blueprints and detailed documentation for data migration strategies and execution plans.
Construct, test, and maintain optimal data pipeline architectures across multiple sources and destinations within Azure Cloud environments.
Leverage PySpark within Azure Databricks to perform complex data transformations, aggregations, and optimizations.
Ensure seamless migration of large-scale databases from on-premise systems to Azure Cloud, maintaining data integrity and compliance.
Strong hands-on experience with Python and SQL.
Handle technical escalations through effective diagnosis and troubleshooting of client queries.
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.
Mandatory Skills: Talend Big Data.
Experience: 5-8 Years.
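As a hedged illustration of one step in the on-premise-to-Azure migrations described above, the following PySpark sketch pulls a table from SQL Server over JDBC and lands it as Delta in Azure storage. The hostname, table, credentials, and ADLS path are placeholders; a real Databricks job would read credentials from a secret scope, and the SQL Server JDBC driver is assumed to be available on the cluster.

```python
# Migration-step sketch: on-prem SQL Server -> Delta in ADLS. All connection
# details below are placeholders, not real values.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("onprem-to-azure-sketch").getOrCreate()

source_df = (spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")   # in practice, fetch from a secret scope
    .option("password", "***")    # never hard-code real credentials
    .load())

# Landing as Delta preserves schema and enables incremental loads later.
(source_df.write.format("delta")
    .mode("overwrite")
    .save("abfss://curated@mystorageacct.dfs.core.windows.net/orders/"))

# Quick post-copy integrity check: this count should match the source system.
print("migrated rows:", source_df.count())
```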
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Staff Technical Product Manager
Are you excited about the opportunity to lead a team within an industry leader in Energy Technology? Are you passionate about improving capabilities, efficiency, and performance? Join our Digital Technology Team! As a Staff Technical Product Manager, this position will operate in lock-step with product management to create a clear strategic direction for build needs, and convey that vision to the service's scrum team. You will direct the team with a clear and descriptive set of requirements captured as stories, and partner with the team to determine what can be delivered through balancing the need for new features, defects, and technical debt.
Partner with the best
As a Staff Technical Product Manager, we are seeking a strong background in business analysis, team leadership, data architecture, and hands-on development skills. The ideal candidate will excel in creating roadmaps, planning with prioritization, resource allocation, key item delivery, and seamless integration of perspectives from various stakeholders, including Product Managers, Technical Anchors, Service Owners, and Developers. A results-oriented leader, capable of building and executing an aligned strategy, leading the data team and cross-functional teams to meet deliverable timelines.
As a Staff Technical Product Manager, you will be responsible for:
Demonstrating wide and deep knowledge in data engineering, data architecture, and data science, with the ability to guide, lead, and work with the team to drive to the right solution.
Engaging frequently (80%) with the development team; facilitating discussions, providing clarification, story acceptance and refinement, testing, and validation; contributing to design activities and decisions; familiarity with waterfall and the Agile scrum framework.
Owning and managing the backlog; continuously ordering and prioritizing to ensure that 1-2 sprints/iterations of backlog are always ready.
Collaborating with UX in design decisions, demonstrating deep understanding of the technology stack and its impact on the final product.
Conducting customer and stakeholder interviews and elaborating on personas.
Demonstrating expert-level skill in problem decomposition and the ability to navigate through ambiguity.
Partnering with the Service Owner to ensure a healthy development process and clear tracking metrics to form a standard and trustworthy way of providing customer support.
Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources.
Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis.
Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis.
Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness.
Implementing and maintaining data governance and security measures to protect sensitive data.
Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes.
Fuel your passion
To be successful in this role you will require:
A Bachelor's or higher degree in Computer Science, Information Systems, or a related field.
A minimum of 6-10 years of proven experience as a Data Engineer or in a similar role, working with large-scale data processing and storage systems.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
Extensive knowledge of working with SAP systems, T-codes, data pipelines in SAP, and Databricks-related technologies.
Experience building complex jobs for SCD-type mappings using ETL tools like PySpark, Talend, Informatica, etc.
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Strong problem-solving and analytical skills, with the ability to handle complex data challenges.
Excellent communication and collaboration skills to work effectively in a team environment.
Experience in data modeling, data warehousing, and ETL principles.
Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
Advanced knowledge of distributed computing and parallel processing.
Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).
Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
Certification in relevant technologies or data engineering disciplines.
Working knowledge of Databricks, Dremio, and SAP is highly preferred.
Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns (where applicable): working flexible hours – flexing the times when you work in the day to help you fit everything in and work when you are the most productive.
Working with us
Our people are at the heart of what we do at Baker Hughes. We know we are better when all our people are developed, engaged, and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent, and develop leaders at all levels to bring out the best in each other.
Working for you
Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we must push the boundaries today. We prioritize rewarding those who embrace challenge with a package that reflects how much we value their input. Join us, and you can expect: contemporary work-life balance policies and wellbeing activities.
About Us
With operations in over 120 countries, we provide better solutions for our customers and richer opportunities for our people. As a leading partner to the energy industry, we're committed to achieving net-zero carbon emissions by 2050, and we're always looking for the right people to help us get there – people who are as passionate as we are about making energy safer, cleaner, and more efficient.
Join Us
Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will challenge and inspire you!
About Us:
We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward – making it safer, cleaner, and more efficient for people and the planet.
Join Us:
Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward.
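As an illustration of the "real-time data processing and streaming technologies" requirement above, here is a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and maintains per-minute event counts. The broker address and topic are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# Streaming sketch: Kafka -> windowed counts -> console. Broker and topic names
# are placeholders; requires the spark-sql-kafka connector package.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "sensor-events")              # placeholder topic
    .load())

# Kafka delivers raw bytes; cast the payload and count events per minute.
counts = (events
    .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
    .groupBy(F.window("timestamp", "1 minute"))
    .count())

query = (counts.writeStream
    .outputMode("complete")
    .format("console")  # swap for a Delta or warehouse sink in production
    .start())
query.awaitTermination()
```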
Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. R147951
Posted 2 weeks ago