25202 Hadoop Jobs - Page 9

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 5.0 years

20 - 25 Lacs

gurugram

Remote

Primary Skills: Design and develop Hadoop applications; hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java/Scala; experience with Core Java, MapReduce programs, Hive programming, and Hive query performance concepts; experience with source code management using Git repositories. Secondary Skills: Exposure to the AWS ecosystem with hands-on knowledge of EC2, S3, and related services; basic SQL programming; knowledge of agile methodology for delivering software solutions; build scripting with Maven/Gradle; exposure to Jenkins.
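For context on what this kind of role involves day to day, below is a minimal, illustrative PySpark-on-Hive job of the sort the listing describes. It is a sketch only: the database and table names (sales_db.orders, sales_db.daily_totals) are hypothetical, and it assumes a cluster where Spark is configured with Hive support.

```python
# Illustrative sketch only; table and database names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily-order-totals")
    .enableHiveSupport()  # assumes Hive metastore access is configured
    .getOrCreate()
)

# Read a Hive table and aggregate completed orders per day.
orders = spark.table("sales_db.orders")
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Persist the result back to Hive; a production job would handle partitions explicitly.
daily_totals.write.mode("overwrite").saveAsTable("sales_db.daily_totals")

spark.stop()
```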

Posted 3 days ago


5.0 - 7.0 years

12 - 17 Lacs

noida, chennai

Work from Office

US Shift. Key Responsibility Areas (key result areas for the incumbent): Engage with clients to precisely identify and fulfill their data engineering needs. Lead and manage special projects to meet strategic goals. Develop advanced models with Snowflake and SSIS to enhance data accuracy and utility. Continuously refine data processing rules and procedures for optimal results. Design and implement scalable data architectures using Snowflake. Maintain and enhance data pipelines, integrating new data sources and APIs as needed. Monitor and ensure high data quality across systems for reliable decision-making. Utilize SSIS for efficient data extraction, transformation, a...

Posted 3 days ago


5.0 - 7.0 years

12 - 17 Lacs

hyderabad

Work from Office

US Shift. Key Responsibility Areas (key result areas for the incumbent): Engage with clients to precisely identify and fulfill their data engineering needs. Lead and manage special projects to meet strategic goals. Develop advanced models with Snowflake and SSIS to enhance data accuracy and utility. Continuously refine data processing rules and procedures for optimal results. Design and implement scalable data architectures using Snowflake. Maintain and enhance data pipelines, integrating new data sources and APIs as needed. Monitor and ensure high data quality across systems for reliable decision-making. Utilize SSIS for efficient data extraction, transformation, a...

Posted 3 days ago


5.0 - 10.0 years

25 - 30 Lacs

gurugram

Work from Office

Key Points - Must Have: Advanced Java proficiency, Microservices, Spring Boot, writing SQL queries (proficient), AWS; 4-5 years in Java, 2-3 years in Spark. Good to have: Unix shell scripting. JD: Minimum of 8 years of experience in building complex Data Platforms and Data Engineering solutions. Minimum of 6 years of hands-on experience in architecture and development of data solutions in an AWS environment using AWS services. Experience with big data technologies such as Spark, EMR, Hadoop, Hive. Experience programming with at least one modern language such as Scala, Java, Python. Hands-on experience with NoSQL DBs like DynamoDB, DocumentDB, MongoDB. Hands-on experience implementing AWS Glue, EMR, Lambda funct...

Posted 3 days ago


4.0 years

0 Lacs

pune, maharashtra, india

On-site

Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role and Responsibilities: As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, ...

Posted 3 days ago


5.0 - 10.0 years

25 - 30 Lacs

hyderabad, pune, bengaluru

Work from Office

Location: Hyderabad/Bangalore/Gurugram/Pune/Mumbai. Work type: Remote, but physical onboarding will be conducted and the candidate needs to visit the office once in 3 to 6 months for conferences. Key Points - Must Have: Advanced Java proficiency, Microservices, Spring Boot, writing SQL queries (proficient), AWS; 4-5 years in Java, Spark. Good to have: Unix shell scripting. JD: Minimum of 8 years of experience in building complex Data Platforms and Data Engineering solutions. Minimum of 6 years of hands-on experience in architecture and development of data solutions in an AWS environment using AWS services. Experience with big data technologies such as Spark, EMR, Hadoop, Hive. Experience programming with at least one modern ...

Posted 3 days ago


4.0 - 5.0 years

25 - 30 Lacs

noida

Remote

C# and strong OOPS concepts. Required Skills: Excellent practical experience in C# for data engineering or development tasks. Strong understanding of DevOps principles and experience with Azure DevOps for CI/CD, automation, and infrastructure management. Experience developing production systems on Azure, including services such as Databricks, Event Hubs, and Function Apps. Desirable: Python and Databricks.

Posted 3 days ago


5.0 - 10.0 years

20 - 27 Lacs

chennai

Hybrid

Role Overview We are seeking an experienced GCP Data Engineer with strong expertise in designing, developing, and managing large-scale data pipelines on Google Cloud Platform. The ideal candidate will have hands-on experience in Big Data technologies (Hadoop, PySpark, BigQuery) and will collaborate closely with cross-functional teams and clients to deliver high-quality data solutions. Key Responsibilities Design, build, and maintain data pipelines and ETL workflows on GCP. Develop and optimize PySpark scripts and Spark-based data processing solutions. Work with BigQuery to manage datasets, write complex SQL queries, and optimize data performance. Utilize the Hadoop ecosystem (HDFS, Hive, Map...
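As a rough illustration of the BigQuery-facing part of such a pipeline, here is a minimal PySpark sketch. It assumes the Spark BigQuery connector is available on the cluster (for example on Dataproc); the bucket, project, dataset, and table names are hypothetical.

```python
# Illustrative sketch; assumes the spark-bigquery connector is on the classpath.
# Bucket, project, dataset, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery").getOrCreate()

# Ingest raw JSON events previously landed in Cloud Storage.
events = spark.read.json("gs://example-raw-bucket/events/dt=2024-01-01/")

# Simple aggregation before loading into the warehouse.
daily_counts = events.groupBy("event_type").agg(F.count("*").alias("events"))

# Write to BigQuery via the connector, staging through a temporary GCS bucket.
(
    daily_counts.write.format("bigquery")
    .option("table", "example-project.analytics.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)

spark.stop()
```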

Posted 3 days ago


4.0 - 5.0 years

25 - 30 Lacs

pune, gurugram, delhi / ncr

Work from Office

This is a remote position. Location: Hyderabad/Bangalore/Gurugram/Pune/Mumbai. Work type: Remote, but physical onboarding will be conducted and the candidate needs to visit the office once in 3 to 6 months for conferences. Key Points - Must Have: Advanced Java proficiency, Microservices, Spring Boot, writing SQL queries (proficient), AWS; 4-5 years in Java, 2-3 years in Spark. Good to have: Unix shell scripting. Minimum of 8 years of experience in building complex Data Platforms and Data Engineering solutions. Minimum of 6 years of hands-on experience in architecture and development of data solutions in an AWS environment using AWS services. Experience with big data technologies such as Spark, EMR, Hadoop, Hive. Experience p...

Posted 3 days ago


3.0 - 7.0 years

20 - 25 Lacs

bengaluru

Remote

This is a remote position. Primary Skills: Design and develop Hadoop applications; hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java/Scala; experience with Core Java, MapReduce programs, Hive programming, and Hive query performance concepts; experience with source code management using Git repositories. Secondary Skills: Exposure to the AWS ecosystem with hands-on knowledge of EC2, S3, and related services; basic SQL programming; knowledge of agile methodology for delivering software solutions; build scripting with Maven/Gradle; exposure to Jenkins.

Posted 3 days ago


5.0 - 10.0 years

0 - 3 Lacs

chennai

Hybrid

Dear Folks, immediate hiring for a Data Engineer role with a CMMI-level client. Experience: 5-10 years. Skills: GCP, Hadoop, PySpark, BigQuery. The candidate should be a team player and will interact with the client on a regular basis. Kindly share your updated resume along with your PAN card and passport photo to jeedi.jessica@kiya.ai

Posted 3 days ago


5.0 - 8.0 years

25 - 30 Lacs

noida, hyderabad, bengaluru

Work from Office

Location: Hyderabad/Bangalore/Gurugram/Pune/Mumbai. Work type: Remote, but physical onboarding will be conducted and the candidate needs to visit the office once in 3 to 6 months for conferences. Key Points - Must Have: Advanced Java proficiency, Microservices, Spring Boot, writing SQL queries (proficient), AWS; 4-5 years in Java, Spark. Good to have: Unix shell scripting. JD: Minimum of 8 years of experience in building complex Data Platforms and Data Engineering solutions. Minimum of 6 years of hands-on experience in architecture and development of data solutions in an AWS environment using AWS services. Experience with big data technologies such as Spark, EMR, Hadoop, Hive. Experience programming with at least one modern ...

Posted 3 days ago


4.0 - 5.0 years

20 - 25 Lacs

hyderabad

Remote

Primary Skills: Design and develop Hadoop applications; hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java/Scala; experience with Core Java, MapReduce programs, Hive programming, and Hive query performance concepts; experience with source code management using Git repositories. Secondary Skills: Exposure to the AWS ecosystem with hands-on knowledge of EC2, S3, and related services; basic SQL programming; knowledge of agile methodology for delivering software solutions; build scripting with Maven/Gradle; exposure to Jenkins.

Posted 3 days ago


5.0 - 8.0 years

12 - 17 Lacs

bengaluru

Work from Office

Key Responsibility Areas (key result areas for the incumbent): Engage with clients to precisely identify and fulfill their data engineering needs. Lead and manage special projects to meet strategic goals. Develop advanced models with Snowflake and SSIS to enhance data accuracy and utility. Continuously refine data processing rules and procedures for optimal results. Design and implement scalable data architectures using Snowflake. Maintain and enhance data pipelines, integrating new data sources and APIs as needed. Monitor and ensure high data quality across systems for reliable decision-making. Utilize SSIS for efficient data extraction, transformation, and loadin...

Posted 3 days ago


7.0 - 12.0 years

35 - 40 Lacs

bengaluru

Work from Office

Role Overview: Looking for experienced Python developers to work on big data solutions using PySpark and Hadoop in a client-facing environment. Key Responsibilities: Develop and optimize data processing pipelines using Python and PySpark. Work with Hadoop ecosystems for large-scale data processing. Collaborate with cross-functional teams to implement scalable solutions. Ensure performance tuning and best coding practices. Required Skills: 5-7 years (SSE) / 7-9 years (TL) of experience in Python development. Strong expertise in PySpark and Hadoop. Hands-on experience in big data processing and analytics.
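For a flavour of the performance-tuning work such a role typically involves, the sketch below shows one common PySpark optimisation: broadcasting a small dimension table to avoid a shuffle-heavy join. All paths and column names are hypothetical.

```python
# Illustrative tuning sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("join-tuning-example").getOrCreate()

facts = spark.read.parquet("hdfs:///data/facts/")            # large fact table
products = spark.read.parquet("hdfs:///data/dim_products/")  # small lookup table

# Broadcasting the small side lets each executor join locally, avoiding a shuffle.
enriched = facts.join(F.broadcast(products), on="product_id", how="left")

# Partitioning the output by date keeps downstream scans selective.
(
    enriched.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("hdfs:///data/enriched_facts/")
)

spark.stop()
```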

Posted 3 days ago


3.0 years

0 Lacs

pune, maharashtra, india

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and d...

Posted 3 days ago


10.0 - 15.0 years

3 - 7 Lacs

gurugram

Remote

This is a remote position. Engagement: Freelance. Role Overview: The Ab Initio Administrator will be responsible for ensuring the stability, performance, and security of the organization's Ab Initio ETL platform. This role involves managing Ab Initio environments, monitoring system health, performing deployments, handling issues, and supporting production workflows. Key Responsibilities: Install, configure, and maintain Ab Initio environments. Administer end-to-end ETL job execution, scheduling, and performance tuning. Handle break-fix issues, root cause analysis, and production support. Set up and manage Ab Initio Control Center (ECC) and monitoring alerts. Manage Metadata Hub, Conduct>IT, Co>Op sy...

Posted 3 days ago


10.0 years

0 Lacs

greater kolkata area

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data trans...

Posted 3 days ago


3.0 years

0 Lacs

pune, maharashtra, india

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Specialist Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data tr...

Posted 3 days ago


5.0 years

0 Lacs

ahmedabad, gujarat, india

On-site

About Us: Zuru Tech is digitalizing the construction process of buildings all around the world. We have a multi-national team developing the world's first digital building fabrication platform: you design, we build it! At ZURU we develop the Zuru Home app, a BIM software meant for the general public, architects, and engineers; from here anyone can buy, design, and send to manufacturing any type of building with complete design freedom. Welcome to the future! What are you going to do? 📌 Architect and develop complex data pipelines, ETL/ELT workflows, and data models on platforms such as Snowflake, Databricks, Azure Synapse, Redshift, BigQuery, etc. 📌 Build scalable data transformation pipelines ...

Posted 3 days ago



5.0 - 10.0 years

12 - 17 Lacs

gurugram

Work from Office

US Shift. Key Responsibility Areas (key result areas for the incumbent): Engage with clients to precisely identify and fulfill their data engineering needs. Lead and manage special projects to meet strategic goals. Develop advanced models with Snowflake and SSIS to enhance data accuracy and utility. Continuously refine data processing rules and procedures for optimal results. Design and implement scalable data architectures using Snowflake. Maintain and enhance data pipelines, integrating new data sources and APIs as needed. Monitor and ensure high data quality across systems for reliable decision-making. Utilize SSIS for efficient data extraction, transformation, a...

Posted 3 days ago


5.0 years

0 Lacs

india

On-site

Job Description: QA AI Engineer. About Us: Capco, a Wipro company, is a global technology and management consulting firm, awarded Consultancy of the Year in the British Bank Award and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities around the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. The projects that will transform the financial servi...

Posted 3 days ago


12.0 years

0 Lacs

pune, maharashtra, india

On-site

About Us: Capco, a Wipro company, is a global technology and management consulting firm, awarded Consultancy of the Year in the British Bank Award and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities around the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. The projects that will transform the financial services industry. MAKE AN IMPACT Inno...

Posted 3 days ago


4.0 years

0 Lacs

hyderābād

On-site

Overview: Are you an experienced Data Scientist who loves what you do? Would you like to be part of a global customer-facing team focused on solving complex, real-world business problems? Would you like to be part of a world-class community of technical leaders, highly specialised in their disciplines and working together as one to bring the best practices of Artificial Intelligence, Machine Learning, Engineering, and Architecture to the world's largest enterprise customers? The Industry Solutions Delivery (ISD) Engineering & Architecture Group (EAG) is a global engineering consulting organisation that supports our most complex and leading-edge customer engagements in improving their busin...

Posted 3 days ago
