3 Hortonworks Hadoop Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the job portal.

2.0 - 5.0 years

6 - 10 Lacs

Pimpri-Chinchwad, Pune

Work from Office

Role & responsibilities
- Hands-on experience with Hadoop and system administration, with sound knowledge of Unix-based operating system internals.
- Working experience with Cloudera CDP/CDH and Hortonworks HDP distributions.
- Linux experience (RedHat, CentOS, Ubuntu).
- Experience setting up and supporting Hadoop environments (cloud and on-premises).
- Ability to set up, configure, and implement security for Hadoop clusters using Kerberos.
- Ability to implement data-at-rest encryption (required) and data-in-transit encryption (optional).
- Ability to set up and troubleshoot data replication peers and policies.
- Experience setting up services such as YARN, HDFS, Zookeeper, Hive, Spark, HBase, etc.
- ...
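Securing a Hadoop cluster with Kerberos ultimately comes down to configuration properties such as `hadoop.security.authentication` and `hadoop.security.authorization` in `core-site.xml` (these property names are standard Hadoop configuration; the helper function below is our own illustrative sketch, not part of any Hadoop tooling):

```python
import xml.etree.ElementTree as ET
from io import StringIO

def kerberos_enabled(core_site_xml: str) -> bool:
    """Return True if the given core-site.xml content enables Kerberos auth."""
    props = {}
    root = ET.parse(StringIO(core_site_xml)).getroot()
    for prop in root.findall("property"):
        name = prop.findtext("name")
        value = prop.findtext("value")
        if name:
            props[name] = (value or "").strip()
    # Kerberos is on when authentication is 'kerberos' and authorization is enabled.
    return (props.get("hadoop.security.authentication", "simple").lower() == "kerberos"
            and props.get("hadoop.security.authorization", "false").lower() == "true")

# Hypothetical core-site.xml fragment for a Kerberized cluster.
sample = """<configuration>
  <property><name>hadoop.security.authentication</name><value>kerberos</value></property>
  <property><name>hadoop.security.authorization</name><value>true</value></property>
</configuration>"""
print(kerberos_enabled(sample))  # -> True
```

A check like this is useful in automated cluster audits, since the default authentication mode ("simple") performs no user verification at all.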

Posted 1 month ago


2.0 - 7.0 years

0 Lacs

Maharashtra

On-site

Role Overview: As a Hadoop Admin for our client in Mumbai, you will be responsible for managing and administering the on-premise Hortonworks Hadoop cluster. Your role will involve user access management, data lake monitoring, designing and setting up Hadoop clusters, and managing multiple Hadoop utilities.

Key Responsibilities:
- Hands-on experience managing and administering an on-premise Hortonworks Hadoop cluster
- Knowledge of user access management
- Data lake monitoring, including cluster health checks, database size, number of connections, I/O, edge node utilities, and load balancing
- Experience in designing, estimating, and setting up Hadoop clusters
- Managing multiple Hadoop utilities ...
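Routine cluster health checks like those above often start from the output of `hdfs dfsadmin -report`. A minimal sketch of extracting a few health numbers from that report (in practice you would capture the real output via `subprocess`; the parser, its key names, and the sample text here are our own simplified assumptions, since the exact report layout varies by Hadoop version):

```python
import re

def parse_dfsadmin_report(report: str) -> dict:
    """Pull a few health indicators out of `hdfs dfsadmin -report` text.

    Targets the common 'Live datanodes (N):' / 'DFS Used%: X%' lines;
    real reports contain many more fields and per-node sections.
    """
    stats = {}
    m = re.search(r"Live datanodes \((\d+)\)", report)
    if m:
        stats["live_datanodes"] = int(m.group(1))
    m = re.search(r"Dead datanodes \((\d+)\)", report)
    if m:
        stats["dead_datanodes"] = int(m.group(1))
    m = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if m:
        stats["dfs_used_pct"] = float(m.group(1))
    return stats

# Abbreviated sample report for illustration.
sample_report = """Configured Capacity: 1099511627776 (1 TB)
DFS Used%: 42.50%
Live datanodes (3):
Dead datanodes (1):
"""
print(parse_dfsadmin_report(sample_report))
```

Feeding numbers like these into an alerting system (e.g. flagging any dead datanodes or DFS usage above a threshold) is a common way to automate the "cluster health checkup" duty described above.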

Posted 1 month ago


8.0 - 13.0 years

25 - 40 Lacs

Chennai

Work from Office

Roles and Responsibilities:
- Working with clients to understand their data; based on that understanding, building the data structures and pipelines.
- Working on the application end to end, collaborating with UI and other development teams.
- Working with cloud providers such as Azure and AWS.
- Engineering data using the Hadoop/Spark ecosystem.
- Designing, building, optimizing, and supporting new and existing data pipelines.
- Orchestrating jobs using tools such as Oozie, Airflow, etc.
- Developing programs for cleaning and processing data.
- Building the data pipelines to migrate and load ...
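At its core, job orchestration with tools like Oozie or Airflow means declaring a task dependency graph and letting the scheduler compute a valid execution order. A minimal sketch of that idea using only the standard library (the task names are illustrative, not from any real pipeline; a real orchestrator adds scheduling, retries, and distributed execution on top of this):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: extract feeds clean and validate, both feed load.
dag = {
    "clean":    {"extract"},
    "validate": {"extract"},
    "load":     {"clean", "validate"},
}

def run_order(task_graph: dict) -> list:
    """Return one valid execution order for the task graph --
    the ordering a scheduler computes before dispatching tasks."""
    return list(TopologicalSorter(task_graph).static_order())

print(run_order(dag))
```

Any printed order satisfies the constraints: `extract` runs first, and `load` runs only after both `clean` and `validate` complete, which is exactly the guarantee an orchestrator provides for dependent pipeline stages.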

Posted 2 months ago



