1442 MapReduce Jobs

Set Up a Job Alert
JobPe aggregates these listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 9.0 years

0 Lacs

gurugram, haryana, india

On-site

Position Summary: Artificial Intelligence & Engineering. AI & Engineering leverages cutting-edge engineering capabilities to help build, deploy, and operate integrated/verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions and insights are powered by engineering for business advantage, helping transform mission-critical operations. Join our AI & Engineering team to help transform technology platforms, drive innovation, and make a significant impact on our clients' success. You’ll work alongside talented professionals reimagining and re-engineering operations and processes that could be critical...

Posted 13 hours ago

6.0 - 9.0 years

0 Lacs

kolkata, west bengal, india

On-site

Position Summary: Artificial Intelligence & Engineering. AI & Engineering leverages cutting-edge engineering capabilities to help build, deploy, and operate integrated/verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions and insights are powered by engineering for business advantage, helping transform mission-critical operations. Join our AI & Engineering team to help transform technology platforms, drive innovation, and make a significant impact on our clients' success. You’ll work alongside talented professionals reimagining and re-engineering operations and processes that could be critical...

Posted 13 hours ago

1.0 - 5.0 years

3 - 7 Lacs

rajkot

Work from Office

NEX Softsys is looking for a Hadoop Developer to join our dynamic team and embark on a rewarding career journey. A Hadoop Developer is responsible for designing, developing, and maintaining big data solutions using Apache Hadoop. Key responsibilities include: 1. Designing and developing scalable, efficient, and reliable data processing pipelines using Hadoop and related technologies such as MapReduce and Hive. 2. Writing and executing MapReduce jobs to process large datasets stored in the Hadoop Distributed File System (HDFS). 3. Collaborating with stakeholders to understand their data processing requirements and develop solutions that meet their needs. 4. Integrating Hadoop with other data stora...
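
Since the role centres on writing MapReduce jobs against data in HDFS, here is a minimal sketch of a word-count job written for Hadoop Streaming in Python. The file name, input/output paths, and the choice of Streaming rather than native Java MapReduce are illustrative assumptions, not details taken from the posting.

```python
#!/usr/bin/env python3
"""Word-count for Hadoop Streaming: run once as the mapper, once as the reducer."""
import sys
from itertools import groupby


def mapper(stream):
    # Emit one tab-separated (word, 1) pair per word read from stdin.
    for line in stream:
        for word in line.split():
            print(f"{word}\t1")


def reducer(stream):
    # Hadoop sorts mapper output by key, so equal words arrive as one contiguous run.
    pairs = (line.rstrip("\n").split("\t", 1) for line in stream)
    for word, run in groupby(pairs, key=lambda kv: kv[0]):
        print(f"{word}\t{sum(int(count) for _, count in run)}")


if __name__ == "__main__":
    # Hypothetical invocation:
    #   hadoop jar hadoop-streaming.jar -mapper "wordcount.py map" \
    #     -reducer "wordcount.py reduce" -input /data/in -output /data/out
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if role == "map" else reducer)(sys.stdin)
```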

Posted 1 day ago

0.0 - 5.0 years

3 - 6 Lacs

rajkot

Work from Office

NEX Softsys is looking for a Big Data Developer to join our dynamic team and embark on a rewarding career journey. Design, develop, and maintain big data solutions to meet business requirements and support data-driven decision making. Work with stakeholders to understand their data needs and determine how best to use big data technologies to meet those needs. Design and implement scalable, high-performance big data architectures using technologies such as Hadoop, Spark, and NoSQL databases. Extract, transform, and load large data sets into a big data platform for analysis and reporting. Write complex SQL queries and develop custom scripts to process big data. Collaborate with data scientists, data ...

Posted 1 day ago

4.0 - 6.0 years

8 - 13 Lacs

chennai

Work from Office

Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security tea...

Posted 1 day ago

0.0 - 2.0 years

3 - 6 Lacs

visakhapatnam

Work from Office

Enroll N Pro is looking for a Hadoop professional to join our dynamic team and embark on a rewarding career journey. Collaborate with cross-functional teams to achieve strategic outcomes. Apply subject expertise to support operations, planning, and decision-making. Utilize tools, analytics, or platforms relevant to the job domain. Ensure compliance with policies while improving efficiency and outcomes. Disclaimer: This job description has been sourced from a public domain and may have been modified by Naukri.com to improve clarity for our users. We encourage job seekers to verify all details directly with the employer via their official channels before

Posted 2 days ago

4.0 - 6.0 years

9 - 13 Lacs

chennai

Work from Office

As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that th...

Posted 2 days ago

6.0 - 11.0 years

15 - 25 Lacs

pune, chennai, bengaluru

Work from Office

Title: BigData Developer. Location: Bangalore. Position type: Full time. Remote: No; all 5 days work from office. Requirement: Working experience of Hadoop, Hive SQLs, Spark, and Big Data ecosystem tools. • Should be able to tweak queries and work on performance enhancement. • The candidate will be responsible for delivering code, setting up the environment and connectivity, and deploying the code to production after testing. • The candidate should have strong functional and technical knowledge to deliver what is required and should be well acquainted with banking terminology. Occasionally, the candidate may have to act as the primary contact and/or driver for small to medium size projects...
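
The posting asks for query tweaking and performance enhancement on Hive/Spark; one common tuning step is broadcasting a small dimension table so the large side of a join is never shuffled. The sketch below is a hypothetical PySpark illustration only: the table and column names (transactions, branch_master, branch_id) are invented for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join-tuning-sketch").getOrCreate()

# Hypothetical tables: a large fact table and a small dimension table.
transactions = spark.table("transactions")
branches = spark.table("branch_master")

# Broadcasting the small side ships it to every executor, so the large
# table is never shuffled across the network for this join.
enriched = transactions.join(broadcast(branches), on="branch_id", how="left")
enriched.write.mode("overwrite").saveAsTable("transactions_enriched")
```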

Posted 2 days ago

4.0 - 6.0 years

7 - 12 Lacs

chennai

Work from Office

Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team ...

Posted 2 days ago

5.0 - 9.0 years

12 - 17 Lacs

noida

Work from Office

Hive/Python/Spark: technical hands-on data processing. Database SQL knowledge for retrieval of data and transformation queries such as joins (full, left, right), ranking, and group by. Database: SQL/Oracle. Good communication skills. Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage; large-scale data processing model implementation and performance tracking. Mandatory Competencies: Big Data - HIVE; Big Data - PySpark; Big Data - SPARK; Data Science and Machine Learning - Data Analyst; Database - Oracle - PL/SQL Packages; Programming Language - Python - Python Shell; Beh - Communication and collaboration
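
For the transformation queries mentioned above (joins, ranking, group by), here is a small hypothetical PySpark sketch; the orders/customers tables and column names are assumptions for illustration, and the same logic could equally be expressed in Hive SQL.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

# Hypothetical source tables; substitute the real Hive tables.
orders = spark.table("orders")
customers = spark.table("customers")

# Left join, then aggregate spend per customer and region.
totals = (
    orders.join(customers, on="customer_id", how="left")
          .groupBy("customer_id", "region")
          .agg(F.sum("amount").alias("total_amount"))
)

# Rank customers within each region by total spend (a window function).
by_region = Window.partitionBy("region").orderBy(F.desc("total_amount"))
totals.withColumn("rank_in_region", F.rank().over(by_region)).show()
```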

Posted 2 days ago

6.0 - 8.0 years

8 - 10 Lacs

maharashtra

Work from Office

6+ years of overall IT experience in Telecom OSS, especially in the Assurance domain: solution, design, and implementation. - Strong knowledge of the Telecom OSS domain, with excellent experience in ServiceNow for Assurance. - Knowledge and experience of Big Data, data lake solutions, Kafka, Hadoop/Hive. - Experience in Python (PySpark) is essential. - Implementation experience in continuous integration and delivery philosophies and practices, specifically Docker, Git, Jenkins. - Self-driven and highly motivated candidate for a client-facing role in a challenging environment
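
Since the role combines Kafka with a Hadoop/Hive data lake and PySpark, a minimal hypothetical sketch of ingesting a Kafka topic with Spark Structured Streaming follows. The broker address, topic name, and lake paths are invented for illustration, and the job would also need the spark-sql-kafka connector package on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

# Requires the spark-sql-kafka connector (e.g. supplied via --packages).
spark = SparkSession.builder.appName("assurance-ingest-sketch").getOrCreate()

# Hypothetical broker and topic names for network-assurance events.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")
         .option("subscribe", "network-alarms")
         .load()
)

# Kafka delivers key/value as bytes; cast the payload to string before parsing.
alarms = events.select(F.col("value").cast("string").alias("raw_alarm"))

# Land the raw alarms in the data lake, with a checkpoint for progress tracking.
(alarms.writeStream.format("parquet")
       .option("path", "/datalake/assurance/alarms")
       .option("checkpointLocation", "/datalake/assurance/_checkpoints/alarms")
       .start())
```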

Posted 2 days ago

2.0 - 5.0 years

4 - 7 Lacs

maharashtra

Work from Office

Hands-on with advanced SQL, Python, etc. Hands-on in data profiling. Hands-on working on cloud platforms like Azure and cloud data warehouses like Databricks. Hands-on experience with scheduling tools like Airflow, Control-M, etc. Knowledgeable on Big Data tools like Spark (Python/Scala), Hive, Impala, Hue, and storage (e.g. HDFS, HBase). Knowledgeable in CI/CD processes (Bitbucket/GitHub, Jenkins, Nexus, etc.). Knowledgeable in managing structured and unstructured data types.
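
As an example of the scheduling-tool skill set, here is a minimal hypothetical Airflow DAG that runs a single data-profiling task daily. The DAG id, task, and schedule are assumptions, and the `schedule` argument shown is the Airflow 2.4+ spelling (older releases use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def profile_table(**_):
    # Placeholder for the real data-profiling logic (row counts, null ratios, ...).
    print("running data-profiling checks")


with DAG(
    dag_id="daily_profiling_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; earlier versions call this schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="profile_table", python_callable=profile_table)
```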

Posted 2 days ago

2.0 - 7.0 years

4 - 9 Lacs

andhra pradesh

Work from Office

JD: 7+ years of hands-on experience in Python, especially dealing with Pandas and NumPy. Good hands-on experience in Spark, PySpark, and Spark SQL. Hands-on experience in Databricks, Unity Catalog, Delta Lake, Lakehouse Platform, Medallion Architecture, Azure Data Factory, ADLS. Experience in dealing with Parquet and JSON file formats. Knowledge of Snowflake.
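
To illustrate the Parquet/JSON handling the JD asks for, below is a small hypothetical PySpark sketch that reads raw JSON from ADLS and writes partitioned Parquet. The storage paths and column names (event_ts, event_id) are assumptions, and in a Medallion setup the output would typically land in a bronze Delta table instead.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-ingest-sketch").getOrCreate()

# Hypothetical ADLS paths and columns; substitute the real container and schema.
raw = spark.read.json("abfss://landing@myaccount.dfs.core.windows.net/events/")

clean = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Partitioned Parquet output; a Delta write (format("delta")) would be the
# usual choice for a bronze layer on Databricks.
(clean.write.mode("append")
      .partitionBy("event_date")
      .parquet("abfss://bronze@myaccount.dfs.core.windows.net/events/"))
```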

Posted 2 days ago

5.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Mega Walk-In: TCS Chennai Hiring for Big Data Developer. INTERVIEW DETAILS: Interview date: 8th Nov 2025, Saturday. Interview time: 9:00 AM - 12:00 PM. Venue: TCS, Siruseri, GS 4, 2nd Floor, Chennai. Job Summary: Role: Big Data Developer. Experience: 5 to 10 years. Job Location: Chennai. Required Technical Skill Set: MS Azure Cloud Platform, Big Data with Hadoop, Python, Hive, Azure data and admin analysis. Eligibility Criteria: • Minimum 15 years of regular education (10th + 12th + 3 years graduation). • BE/B.Tech/MCA/M.Sc/MS with a minimum of 3 years of relevant post-qualification IT experience. • B.Sc/BCA graduates with a minimum of 3.5 years of relevant post-qualification IT ex...

Posted 2 days ago

3.0 years

0 Lacs

bengaluru, karnataka, india

Remote

Come join us in the Azure Core Economics team, explore your passions, and impact the world! If you join the Azure Core Economics team, you will join a group of economists, data scientists, and software engineers within Azure Engineering that tackles a variety of data-intensive, distributed computing challenges related to cloud economics that are of critical importance to Microsoft. Our team collaborates with a wide range of other teams in Microsoft on problems such as pricing optimization, demand estimation and forecasting, capacity management and planning, fraud detection, virtual machine bin packing, and others. Our software engineering team in India aids in the development of novel softwa...

Posted 4 days ago

5.0 - 10.0 years

0 Lacs

chennai, tamil nadu, india

On-site

Greetings from TATA CONSULTANCY SERVICES! WALK-IN DRIVE: "Big Data". Interview Date: 8th November 2025. Years of Experience: 5 - 10 years. Location: Chennai. Job Description: Hands-on experience in installation, configuration, supporting, and managing Hadoop clusters using the Hortonworks distribution. Hands-on experience in cluster migration / Azure Cloud. Upgrading clusters and configuring Kerberos and Ranger on the cluster. Good knowledge of shell scripting / PySpark. Node commissioning and decommissioning on clusters, and data re-balancing. Experience in setting up high-availability Hadoop clusters. Experience in Distcp configuration between clusters. Experience on Standby Ambari Server i...

Posted 4 days ago

7.0 - 11.0 years

0 Lacs

guwahati, assam

On-site

In this role at Zaloni, you will be a part of the Engineering team responsible for architecting, designing, and building Zaloni's flagship product - Arena. Your key responsibilities will include: - Taking technical ownership for the development and delivery of product features as a member of an agile team, involving design, code, test, technical documentation, technical training, and stakeholder presentations. - Providing subject matter expertise on multiple functional areas, collaborating with developers, testers, architects, documentation, and product managers to ensure high-quality feature deliverables. - Working on a Scrum Team using Agile principles, participating in backlog grooming, t...

Posted 4 days ago

10.0 years

0 - 1 Lacs

illinois, united states

Remote

Abbott is a global healthcare leader that helps people live more fully at all stages of life. Our portfolio of life-changing technologies spans the spectrum of healthcare, with leading businesses and products in diagnostics, medical devices, nutritionals and branded generic medicines. Our 114,000 colleagues serve people in more than 160 countries. We’re focused on helping people with diabetes manage their health with life-changing products that provide accurate data to drive better-informed decisions. We’re revolutionizing the way people monitor their glucose levels with our new sensing technology. Working at Abbott At Abbott, You Can Do Work That Matters, Grow, And Learn, Care For Yourself ...

Posted 4 days ago

6.0 - 11.0 years

8 - 13 Lacs

hyderabad

Work from Office

We are looking for a highly motivated and detail-oriented Catastrophe Data Analyst to join our team at Swiss Re. The ideal candidate should have 0 to 7 years of experience in the field. Roles and Responsibility Analyze and interpret catastrophe data to identify trends and patterns. Develop and maintain databases for storing and managing catastrophe data. Collaborate with cross-functional teams to design and implement new catastrophe models. Conduct research and stay updated on industry developments and advancements in catastrophe modeling. Provide insights and recommendations to stakeholders based on analysis results. Ensure data quality and integrity by implementing data validation and veri...

Posted 5 days ago

6.0 - 11.0 years

2 - 5 Lacs

bengaluru

Work from Office

We are seeking a talented and experienced Databricks Engineer to join our innovative team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and optimising data pipelines and analytics solutions using the Databricks platform. You will collaborate closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business insights and decision-making. Develop and optimise ETL processes using Databricks and Apache Spark Design and implement efficient data models and schemas for optimal data storage and retrieval Collaborate with data scientists and analysts to build and deploy machine learning models Ensure data quality, consist...
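
As a concrete illustration of a Databricks ETL step like those described, here is a minimal hypothetical PySpark snippet that loads a staged table and rewrites it as a managed Delta table. The table names are invented, and on Databricks the `spark` session already exists, so the builder line is only needed when testing elsewhere.

```python
from pyspark.sql import SparkSession

# On Databricks, `spark` is provided; the builder is for local experimentation.
spark = SparkSession.builder.getOrCreate()

# Hypothetical staging and target table names.
staged = spark.read.table("staging_orders")

# Delta gives ACID writes, schema enforcement, and time travel on the target.
(staged.write.format("delta")
       .mode("overwrite")
       .option("overwriteSchema", "true")
       .saveAsTable("analytics.orders"))

# Compact small files produced by frequent loads (Databricks/Delta SQL command).
spark.sql("OPTIMIZE analytics.orders")
```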

Posted 5 days ago

0.0 - 6.0 years

11 - 12 Lacs

gurugram

Work from Office

Design, develop, and maintain scalable ETL/ELT data pipelines on GCP (BigQuery, Dataflow, Cloud Composer, Pub/Sub, etc.). Build and optimize data models and data marts in BigQuery for analytical and reporting use cases. Ensure data quality, integrity, and security across the data lifecycle. Implement data transformation logic using SQL, Python, and Cloud Dataflow/Dataproc. Collaborate with business and analytics teams to understand data requirements and deliver efficient solutions. Automate workflows and orchestrate pipelines using Cloud Composer (Airflow). Monitor and optimize BigQuery performance and manage cost efficiency. Support CI/CD deployment processes and maintain version control using Git ...
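
For the BigQuery transformation work described above, a minimal hypothetical sketch using the google-cloud-bigquery Python client is shown below. The project, dataset, table, and column names are assumptions, and the parameterized date filter is simply one way to keep scans (and cost) bounded.

```python
import datetime

from google.cloud import bigquery

client = bigquery.Client()  # picks up Application Default Credentials

# Hypothetical project/dataset/table; the date parameter keeps the scan bounded.
sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `my-project.sales.transactions`
    WHERE order_date = @run_date
    GROUP BY customer_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", datetime.date(2024, 1, 1))
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_amount)
```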

Posted 5 days ago

6.0 - 11.0 years

14 - 17 Lacs

mumbai

Work from Office

Your role and responsibilities: As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact. Responsibilities: Responsible for managing end-to-end feature development and resolving challenges faced in impl...

Posted 5 days ago

5.0 - 10.0 years

3 - 6 Lacs

kolkata, hyderabad, bengaluru

Work from Office

We are looking for skilled Hadoop Developers with 5-10 years of experience to join our team in Bangalore, Kolkata, Hyderabad, and Pune. The ideal candidate should have strong proficiency in Hadoop, Scala, Spark, and SQL. Roles and Responsibility Design, develop, and implement scalable data processing systems using Hadoop. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data pipelines using Spark and Scala. Troubleshoot and resolve complex technical issues related to Hadoop. Participate in code reviews and ensure high-quality code standards. Stay updated with the latest trends and technologies in Hadoop development. Job...

Posted 5 days ago

7.0 - 12.0 years

3 - 7 Lacs

hyderabad, pune

Work from Office

We are looking for a skilled Hadoop professional with 7-14 years of experience to join our team in Pune and Hyderabad. The ideal candidate will have expertise in Hadoop, Spark, Kafka, Data Visualization, scripting languages in Python and R, SQL and NoSQL, Agile methodology, ETL / ELT data pipelines. Roles and Responsibility Design and develop scalable Hadoop applications using Spark and Kafka. Create data visualizations using tools like Splunk, Qlik, and Grafana. Develop and maintain ETL/ELT data pipelines using Agile methodology. Collaborate with cross-functional teams to identify and prioritize project requirements. Troubleshoot and resolve technical issues related to Hadoop applications. ...

Posted 5 days ago
