
1015 ETL Process Jobs - Page 4

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the optimization of data pipelines for improved performance and efficiency.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data modeling and database design principles.
- Familiarity with data quality frameworks and best practices.
- Knowledge of cloud data warehousing solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
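For context on what an ETL step against Snowflake can look like in practice, here is a minimal, hedged sketch using the snowflake-connector-python driver; the account, credentials, warehouse, stage, and table names are illustrative placeholders, not details from the posting.

```python
# Minimal sketch of a Snowflake ETL step, assuming snowflake-connector-python is
# installed and a stage named RAW_STAGE exists; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",           # hypothetical service user
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load raw CSV files from an external stage into a staging table.
    cur.execute("""
        COPY INTO staging_orders
        FROM @RAW_STAGE/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Upsert the staged rows into the target table.
    cur.execute("""
        MERGE INTO orders AS tgt
        USING (SELECT * FROM staging_orders) AS src
        ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
        WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (src.order_id, src.amount)
    """)
finally:
    conn.close()
```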

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: AWS Redshift
Good-to-have skills: PySpark
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Coordinate with stakeholders to gather requirements.
- Ensure timely delivery of projects.

Professional & Technical Skills:
- Must-have skills: Proficiency in AWS Glue.
- Good-to-have skills: Experience with PySpark.
- Strong understanding of ETL processes.
- Experience in data transformation and integration.
- Knowledge of cloud computing platforms.
- Ability to troubleshoot and resolve technical issues.

Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Glue.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
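Since the role combines AWS Glue, Redshift, and PySpark, a hedged sketch of a Glue job of the kind described follows; the catalog database, table, connection name, and S3 paths are assumptions for illustration only.

```python
# Illustrative AWS Glue job skeleton (PySpark) loading data into Redshift;
# catalog entries, connection name, and S3 paths are placeholders.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, rename/cast columns, then write to Redshift.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders")   # hypothetical catalog entries

mapped = ApplyMapping.apply(
    frame=source,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amount", "double", "amount", "double")])

glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=mapped,
    catalog_connection="redshift-conn",             # hypothetical Glue connection
    connection_options={"dbtable": "public.orders", "database": "dev"},
    redshift_tmp_dir="s3://my-temp-bucket/redshift/")

job.commit()
```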

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Agile Project Management
Good-to-have skills: Apache Spark
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in ensuring that data is accessible, reliable, and ready for analysis, contributing to informed decision-making across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering practices.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Agile Project Management.
- Good-to-have skills: Experience with Apache Spark, Google Cloud SQL, Python (programming language).
- Strong understanding of data pipeline architecture and design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Agile Project Management.
- This position is based in Chennai (mandatory).
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Coimbatore

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Oracle Business Intelligence Enterprise Edition (OBIEE) Plus
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring that all development aligns with best practices and organizational standards.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Oracle Business Intelligence Enterprise Edition (OBIEE) Plus.
- Strong understanding of data modeling and ETL processes.
- Experience with dashboard creation and report generation.
- Familiarity with SQL and database management.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Oracle Business Intelligence Enterprise Edition (OBIEE) Plus.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and efficiency in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data pipeline architecture and design.
- Experience with ETL processes and data integration techniques.
- Familiarity with data quality frameworks and best practices.
- Knowledge of cloud platforms and services related to data analytics.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Indore office.
- 15 years of full-time education is required.
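To make the Databricks portion concrete, below is a minimal batch-pipeline sketch on the Databricks Unified Data Analytics Platform using PySpark and Delta Lake; the mount point, columns, and target table name are assumptions, not details from the posting.

```python
# Minimal Databricks/Delta Lake pipeline sketch; input path, schema, and target
# table are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

raw = (spark.read
       .option("header", "true")
       .csv("/mnt/raw/events/"))            # hypothetical mount point

cleaned = (raw
           .dropDuplicates(["event_id"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .filter(F.col("event_id").isNotNull()))

# Write as a Delta table partitioned by event date for downstream analytics.
(cleaned
 .withColumn("event_date", F.to_date("event_ts"))
 .write
 .format("delta")
 .mode("overwrite")
 .partitionBy("event_date")
 .saveAsTable("analytics.events_clean"))    # hypothetical target table
```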

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Analyze and troubleshoot data-related issues to ensure optimal performance of data solutions.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Strong understanding of data modeling concepts and database design.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms such as AWS or Azure for data storage and processing.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
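As an illustration of the data-quality responsibilities called out above, here is a small, hedged PySpark sketch of a pipeline gate; the input path, column name, and 5% threshold are illustrative assumptions.

```python
# Sketch of a simple data-quality gate in PySpark, assuming a DataFrame of
# customer records; paths, columns, and the threshold are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

df = spark.read.parquet("/data/staging/customers/")   # hypothetical input path

total = df.count()
null_emails = df.filter(F.col("email").isNull()).count()
null_ratio = null_emails / total if total else 0.0

# Fail the pipeline early if more than 5% of rows are missing an email address.
if null_ratio > 0.05:
    raise ValueError(f"Data quality check failed: {null_ratio:.1%} null emails")

df.write.mode("append").parquet("/data/curated/customers/")
```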

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the optimization of data processing workflows to enhance performance.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services for data storage and processing.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Basis
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data is accessible, reliable, and actionable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture and data models.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data quality frameworks and data governance practices.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Kochi

Remote

Senior Data Engineer (Databricks) - REMOTE
Location: Remote (Portugal)
Type: Contract
Experience: 5+ years
Language: Fluent English required

We are looking for a Senior Data Engineer to join our remote consulting team. In this role, you'll be responsible for designing, building, and optimizing large-scale data processing systems using Databricks and modern data engineering tools. You'll collaborate closely with data scientists, analysts, and technical teams to deliver scalable and reliable data platforms.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines for processing structured/unstructured data
- Build and manage data lakes and data warehouses optimized for analytics
- Optimize data workflows for performance, scalability, and cost-efficiency
- Collaborate with stakeholders to gather data requirements and translate them into scalable solutions
- Implement data governance, data quality, and security best practices
- Migrate legacy data processes (e.g., from SAS) to modern platforms
- Document architecture, data models, and pipelines

Required Qualifications:
- 5+ years of experience in data engineering or related fields
- 3+ years of hands-on experience with Databricks
- Strong command of SQL and experience with PostgreSQL, MySQL, or NoSQL databases
- Programming experience in Python, Java, or Scala
- Experience with ETL processes, orchestration frameworks, and data pipeline automation
- Familiarity with Spark, Kafka, or similar big data tools
- Experience working on cloud platforms (AWS, Azure, or GCP)
- Prior experience migrating from SAS is a plus
- Excellent communication skills in English
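Given the posting's emphasis on Spark, Kafka, and Databricks, a hedged sketch of a streaming ingestion path (Kafka into a Delta bronze table) follows; the broker address, topic, and storage paths are placeholders, not details from the posting.

```python
# Sketch of a Kafka -> Delta streaming ingestion on Spark Structured Streaming;
# broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
          .option("subscribe", "orders")                      # hypothetical topic
          .load()
          .select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp")))

query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/orders/")
         .outputMode("append")
         .start("/mnt/lake/bronze/orders/"))

# In a real job you would block on query.awaitTermination().
```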

Posted 1 week ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Hybrid

WE'RE HIRING!!
Job Title: #PLSQLDeveloper
#Experience: 2+ years
Job Description: Oracle Database SQL development (+++). Advanced PL/SQL, including partitions, performance tuning, local and global indexing, dynamic SQL, exception handling, bulk data handling, etc. ETL concepts. Shell scripting.
Location: #Bangalore
Mode of Work: Hybrid (10 days)
Mode of Interview: 2-3 rounds (2nd level face-to-face is mandatory)
Notice Period: Immediate to 20 days
Kindly apply to the job if it matches the requirement, and also share the post with active job-seeking applicants. Please include: name as listed on the PAN card, total exp, relevant exp, C.CTC, E.CTC, notice period, current and preferred location. Share your #CV to rabecca.p@twsol.com
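To ground the bulk-data-handling requirement, here is a small, hedged sketch showing how a batch load and a packaged PL/SQL procedure might be exercised from Python via the python-oracledb driver; the DSN, table, and procedure names are hypothetical.

```python
# Sketch of exercising PL/SQL bulk handling from Python via python-oracledb;
# connection details, table, and procedure names are assumptions.
import oracledb

conn = oracledb.connect(user="app", password="***", dsn="dbhost/orclpdb1")
cur = conn.cursor()

rows = [(i, f"item-{i}") for i in range(1, 10001)]

# executemany sends the batch in one round trip, the client-side analogue of
# FORALL-style bulk binding inside PL/SQL.
cur.executemany("INSERT INTO staging_items (id, name) VALUES (:1, :2)", rows)

# Invoke a (hypothetical) packaged procedure that merges staging rows into the
# target table with its own exception handling.
cur.callproc("etl_pkg.merge_items")

conn.commit()
conn.close()
```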

Posted 1 week ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.

Your Role
- Should have 5+ years of experience in Informatica PowerCenter.
- Strong knowledge of ETL concepts, data warehouse architecture, and best practices.
- Should be well versed with different file formats for parsing files, such as flat file, XML, and JSON, and with various source systems for integration.
- Must have hands-on development experience as an individual contributor in at least 2 project lifecycles (data warehouse / data mart / data migration) in a client-facing environment.

Your Profile
- Design, develop, unit test, deploy, and support data applications and infrastructure utilizing various technologies to process large volumes of data.
- Strong technical and functional understanding of RDBMS and DWH-BI concepts.
- Should have implemented error handling, exception handling, and an audit balance control framework.
- Good knowledge of either Unix/shell or Python scripting and a scheduling tool.
- Strong SQL and PL/SQL skills, data analytics, and performance tuning capabilities.
- Good to have: knowledge of cloud platforms and technologies.

What you'll love about working here
We recognize the significance of flexible work arrangements to provide support, be it remote work or flexible work hours. You will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Unix and SQL.

Skills (competencies)

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About The Role
Seeking a skilled and detail-oriented OAS/OBIEE Consultant to join our data and analytics team. The ideal candidate will be responsible for designing, developing, and maintaining business intelligence (BI) and dashboarding solutions to support smelter operations and decision-making processes. You will work closely with cross-functional teams to transform raw data into actionable insights using modern BI tools and ETL processes.

Key Responsibilities:
- Develop and maintain interactive dashboards and reports using Microsoft Power BI and Oracle Analytics.
- Design and implement ETL processes using Oracle Data Integrator and other tools to ensure efficient data integration and transformation.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Perform data analysis and validation to ensure data accuracy and consistency across systems.
- Optimize queries and data models for performance and scalability.
- Maintain and support Oracle Database and other RDBMS platforms used in analytics workflows.
- Ensure data governance, quality, and security standards are met.
- Provide technical documentation and user training as needed.

Required Skills and Qualifications:
- Proven experience in BI solutions, data analysis, and dashboard development.
- Strong hands-on experience with Microsoft Power BI, Oracle Analytics, and Oracle Data Integrator.
- Proficiency in Oracle Database, SQL, and relational database concepts.
- Solid understanding of ETL processes, data management, and data processing.
- Familiarity with business intelligence and business analytics best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience in the smelting or manufacturing industry is a plus.
- Knowledge of scripting languages (e.g., Python, Shell) for automation.
- Certification in Power BI, Oracle Analytics, or related technologies.

Posted 1 week ago

Apply

12.0 - 20.0 years

35 - 60 Lacs

Bengaluru

Work from Office

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor – collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success. Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers. In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process. As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents. Your technical expertise will shine through as you present these documents in a professional and concise manner, showcasing your mastery of the subject matter. You’ll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization. Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. 
We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- 10 – 15 years of experience (Specialist Seller / Consultant) is a must, with 3 – 4 years of relevant experience in Data.
- Hands-on experience with data platforms (DWH / data lake) such as Cloudera, Databricks, MS Data Fabric, Teradata, Apache Hadoop, BigQuery, AWS Big Data solutions (EMR, Redshift, Kinesis), Qlik, etc.
- Proven experience modernizing legacy data and applications and transforming them to cloud architectures.
- Strong understanding of data modelling and database design.
- Expertise in data integration and ETL processes.
- Knowledge of data warehousing and business intelligence concepts.
- Experience with data governance and data quality management.
- Good domain experience in the BFSI or Manufacturing area.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data integration techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and data streaming using Python, Kafka for streams, PySpark, DBT, and ETL services.
- Understanding of and experience with data security principles, such as data masking and encryption.
- Knowledge of data governance principles and practices, including data quality, data lineage, data privacy, and compliance.
- Knowledge of systems development, including the system development life cycle, project management approaches and requirements, design, and testing techniques.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Must be organized, self-sufficient, and able to manage multiple initiatives simultaneously.
- Must have the ability to coordinate with other teams and vendors independently.
- Deep knowledge of services offerings and technical solutions in a practice.
- Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions.
- Prior consultative selling experience.
- Externally recognized as an expert in the technology and/or solutioning areas, including technical certifications supporting subdomain focus area(s).
- Responsible for prospecting and qualifying leads, and for doing the relevant product/market research independently in response to a customer's requirement or pain point.
- Advising on and shaping client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners.
- Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements and build the opportunity.
- Understand and analyze the application requirements in client RFPs.
- Design software applications based on the requirements within specified architectural guidelines and constraints.
- Lead, design, and implement proofs of concept and pilots to demonstrate the solution to clients/prospects.
Being You
Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 1 week ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Job Overview
Plan A Technologies is looking for an experienced SQL Developer with hands-on experience in designing, developing, and optimizing high-performance SQL solutions in high-volume environments. This role requires deep technical expertise in SQL development, performance tuning, and a solid understanding of data flows, ETL processes, and relational database structures. The ideal candidate will have a proven ability to translate complex business requirements into scalable and secure database logic, collaborate cross-functionally with product and engineering teams, and thrive in dynamic, data-intensive industries such as finance, gaming, or transactional systems. This is a fast-paced job with room for significant career growth. Please note: you must have at least 8+ years of experience as an SQL developer to be considered for this role.

Job Responsibilities & Experience
- 8+ years of strong SQL development experience in a high-volume environment.
- Develop and maintain complex and high-performing SQL scripts, stored procedures, functions, and views.
- Exposure to ETL, data migration, or data warehousing concepts.
- Translate business requirements into scalable, efficient, and secure database logic.
- Perform in-depth query performance analysis and optimization on large-scale data sets.
- Collaborate with product managers, analysts, and developers to build and enhance database components.
- Identify bottlenecks and troubleshoot performance or logic issues in existing systems.
- Expertise in query optimization, indexing strategies, and execution plan analysis.
- Experience in financial, gaming, or transactional systems.
- Deep understanding of relational database concepts.
- Strong problem-solving and analytical mindset with the ability to understand and interpret business logic.
- Self-motivated and able to take ownership of deliverables with minimal supervision.
- Excellent communication skills, both verbal and written, to interact with cross-functional teams and business users.
- Solid written and verbal English skills.
- Initiative and drive to do great things.

About The Company/Benefits
Plan A Technologies is an American software development and technology advisory firm that brings top-tier engineering talent to clients around the world. Our software engineers tackle custom product development projects, staff augmentation, major integrations and upgrades, and much more. The team is far more hands-on than the giant outsourcing shops, but still big enough to handle major enterprise clients. Read more about us here: PlanAtechnologies,
Location: Work from home 100% of the time, or come in to one of our global offices. Up to you.
Great colleagues and an upbeat work environment: you'll join an excellent team of supportive engineers and project managers who work hard but don't ever compete with each other.
Benefits: you'll get a generous vacation schedule, a brand new laptop, and other goodies.
If this sounds like you, we'd love to hear from you!

Posted 1 week ago

Apply

3.0 - 6.0 years

9 - 13 Lacs

Bengaluru

Work from Office

The Role
We are looking for a Senior Business Analyst with a strong background in Financial Services, Target Operating Models, and Data Management to lead client engagements focused on designing and implementing data solutions. The ideal candidate will have a solid understanding of data governance, ETL processes, and value stream analysis, along with excellent documentation and communication skills.
Target start date: August 2025
Hybrid model: 3 days at Bangalore Office, India
Salary range: 35,00,000 to 42,00,000 INR

Responsibilities
- Lead workshops with business stakeholders to capture current-state data challenges and define future-state solutions.
- Gather, analyse, and document business and data requirements.
- Create and maintain technical documentation using Confluence.
- Develop and manage product backlogs; lead sprint planning and backlog grooming sessions.
- Translate business needs into clear functional and non-functional requirements.
- Define conceptual, logical, and physical data models in collaboration with technical teams.
- Conduct value stream analysis and track expected business outcomes.
- Proactively engage stakeholders to clarify requirements and drive initiatives forward.

What We're Looking For In Our Applicants
- Minimum 10 years of experience in Data, Business Intelligence/Analytics, and Data Warehousing.
- Previous experience working with Financial Services.
- Experience in documenting business processes and defining Target Operating Models.
- Proficiency with process mapping tools such as Microsoft Visio and Draw.io.
- Strong skills in Value Stream Analysis.
- Expertise in Business Process Analysis to identify inefficiencies and drive improvements.
- Deep knowledge of ETL tools and concepts, Data Management, Data Governance, and Data Quality.
- Experience using Confluence for documentation and collaboration.
- Familiarity with the DAMA-DMBOK framework (CDMP certification is a plus).
- Experience implementing data platforms such as Data Lakes or Data Warehouses.
- Strong stakeholder management skills in high-paced environments.
- Proficiency in project management methodologies: Agile, Scrum, Waterfall, PMI.
- Strong analytical and documentation skills.
- Experience developing data governance operating models (e.g., Data Quality, Metadata Management).
- Excellent time management and multitasking abilities.
- Self-motivated, proactive, and team-oriented.
- Ability to work independently and take initiative.

Why Keyrus
Joining Keyrus means joining a market leader in the Data Intelligence field and an (inter)national player in Management Consultancy and Digital Experience. You will be part of a young and ever-learning enterprise with an established international network of thought-leading professionals driven by bridging the gap between innovation and business. You get the opportunity to meet specialised and professional consultants in a multicultural ecosystem. Keyrus gives you the opportunity to showcase your talents and potential, to build up experience through working with our clients, with the opportunity to grow depending on your capabilities and affinities, in a great working and dynamic atmosphere.

Keyrus UK Benefits
- Competitive holiday allowance
- Very comprehensive Private Medical Plan
- Flexible working patterns
- Workplace Pension Scheme
- Sodexo Lifestyle Benefits
- Discretionary Bonus Scheme
- Referral Bonus Scheme
- Training & Development via KLX (Keyrus Learning Experience)

Posted 1 week ago

Apply

1.0 - 3.0 years

9 - 13 Lacs

Pune

Work from Office

Delivery Manager - Data Engineering (Databricks & Snowflake)
Position: Delivery Manager - Data Engineering
Location: Bavdhan/Baner, Pune
Experience: 7-10 years
Employment Type: Full-time

Job Summary
We are seeking a Delivery Manager - Data Engineering to oversee multiple data engineering projects leveraging Databricks and Snowflake. This role requires strong leadership skills to manage teams, ensure timely delivery, and drive best practices in cloud-based data platforms. The ideal candidate will have deep expertise in data architecture, ETL processes, cloud data platforms, and stakeholder management.

Key Responsibilities
Project & Delivery Management:
- Oversee the end-to-end delivery of multiple data engineering projects using Databricks and Snowflake.
- Define project scope, timelines, milestones, and resource allocation to ensure smooth execution.
- Identify and mitigate risks, ensuring that projects are delivered on time and within budget.
- Establish agile methodologies (Scrum, Kanban) to drive efficient project execution.

Data Engineering & Architecture Oversight:
- Provide technical direction on data pipeline architecture, data lakes, data warehousing, and ETL frameworks.
- Ensure optimal performance, scalability, and security of data platforms.
- Collaborate with data architects and engineers to design and implement best practices for data processing and analytics.

Stakeholder & Client Management:
- Act as the primary point of contact for clients, senior management, and cross-functional teams.
- Understand business requirements and translate them into technical solutions.
- Provide regular status updates and manage client expectations effectively.

Team Leadership & People Management:
- Lead, mentor, and develop data engineers, architects, and analysts working across projects.
- Drive a culture of collaboration, accountability, and continuous learning.
- Ensure proper resource planning and capacity management to balance workload effectively.

Technology & Process Improvement:
- Stay up to date with emerging trends in Databricks, Snowflake, and cloud data technologies.
- Continuously improve delivery frameworks, automation, and DevOps for data engineering.
- Implement cost-optimization strategies for cloud-based data solutions.

Required Skills & Experience
Technical Expertise:
- 10+ years of experience in data engineering and delivery management.
- Strong expertise in Databricks, Snowflake, and cloud platforms (AWS, Azure, GCP).
- Hands-on experience in ETL, data modeling, and big data processing frameworks (Spark, Delta Lake, Apache Airflow, DBT).
- Understanding of data governance, security, and compliance standards (GDPR, CCPA, HIPAA, etc.).
- Familiarity with SQL, Python, Scala, or Java for data transformation.

Project & Team Management:
- Proven experience in managing multiple projects simultaneously.
- Strong knowledge of Agile, Scrum, and DevOps practices.
- Experience in budgeting, forecasting, and resource management.

Soft Skills & Leadership:
- Excellent communication and stakeholder management skills.
- Strong problem-solving and decision-making abilities.
- Ability to motivate and lead cross-functional teams effectively.

Preferred Qualifications
- Experience with data streaming (Kafka, Kinesis, or Pub/Sub)
- Knowledge of ML & AI-driven data processing solutions
- Certifications in Databricks, Snowflake, or cloud platforms (AWS/Azure/GCP)

Apply or share your updated CV at hr@anvicybernetics,

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Bharuch

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests (see the example below)
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies

Notice Period: Immediate to 15 days
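To illustrate the "automated testing platforms and unit tests" item referenced above, here is a tiny, hedged pytest-style example of unit-testing a transform function; the function and fields are hypothetical, not part of the posting.

```python
# Minimal example of unit-testing a small transform function (pytest-style);
# the function and record schema are illustrative assumptions.
def normalise_customer(record: dict) -> dict:
    """Lower-case the email and strip whitespace from the name."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].lower(),
    }

def test_normalise_customer():
    raw = {"name": "  Asha Patel ", "email": "Asha@Example.COM"}
    assert normalise_customer(raw) == {
        "name": "Asha Patel",
        "email": "asha@example.com",
    }
```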

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Surendranagar

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Mehsana

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Vadodara

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Surat

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Rajkot

Hybrid

Required Skills:
- Working knowledge of Big Data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate to learn and work on versatile technologies

Notice Period: Immediate to 15 days

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies