Role: Hadoop Admin Manager (CDP)
Years of Experience: 10-15 years
Location: Mumbai (Kurla)
Shifts: 24x7 (rotational)
Mode: Onsite
Experience: 10+ years in IT, with at least 7 years in cloud and system administration, and at least 5 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem: Hive, HDFS, MapReduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.
Empowering Your Digital Transformation with Data Modernization and AI.
Job Overview
Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.
Key Responsibilities
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
- Monitor and manage Hadoop cluster performance, capacity, and security.
- Perform routine maintenance tasks such as upgrades, patching, and backups.
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
- Ensure high availability and disaster recovery of Hadoop clusters.
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
- Troubleshoot and resolve issues related to the Hadoop ecosystem.
- Maintain documentation of Hadoop environment configurations, processes, and procedures.
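As a rough illustration of the routine capacity monitoring described above, the sketch below parses the text output of `hdfs dfsadmin -report` and flags the cluster when DFS usage crosses a threshold. The sample report text and the 80% threshold are illustrative assumptions, not part of this posting.

```python
import re

def parse_dfs_report(report: str) -> dict:
    """Extract capacity figures from `hdfs dfsadmin -report` text output.

    Assumes the stock report format, with summary lines such as
    'Configured Capacity: <bytes> (...)' and 'DFS Used: <bytes> (...)'.
    """
    capacity = int(re.search(r"Configured Capacity:\s+(\d+)", report).group(1))
    used = int(re.search(r"DFS Used:\s+(\d+)", report).group(1))
    live = int(re.search(r"Live datanodes \((\d+)\)", report).group(1))
    return {"capacity": capacity, "used": used, "live_datanodes": live,
            "used_pct": 100.0 * used / capacity}

def needs_attention(stats: dict, threshold_pct: float = 80.0) -> bool:
    """Flag the cluster when DFS usage reaches the (assumed) threshold."""
    return stats["used_pct"] >= threshold_pct
```

In production the report text would come from running `hdfs dfsadmin -report` (e.g., via `subprocess.run`), and the alert hook (email, pager, a Nagios passive check) would replace the boolean return.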
Requirement
- Experience installing, configuring, and tuning Hadoop distributions.
- Hands-on experience with Cloudera.
- Understanding of Hadoop design principles and the factors that affect distributed-system performance, including hardware and network considerations.
- Provide infrastructure recommendations, capacity planning, and workload management.
- Develop utilities to better monitor the cluster, using tools such as Ganglia and Nagios.
- Manage large clusters with huge volumes of data.
- Perform cluster maintenance tasks: adding and removing nodes, cluster monitoring, and troubleshooting.
- Manage and review Hadoop log files.
- Install and implement security for Hadoop clusters.
- Install Hadoop updates, patches, and version upgrades.
- Automate these tasks through scripts.
- Act as the point of contact for vendor escalations.
- Work with Hortonworks to resolve issues.
- Conceptual/working knowledge of basic data management concepts such as ETL, reference/master data, data quality, and RDBMS.
- Working knowledge of a scripting language such as Shell, Python, or Perl.
- Experience with orchestration and deployment tools.
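The log-review and scripting items above are the kind of small utilities this role produces. As a minimal sketch (the log4j line format and the `/var/log/hadoop`-style path are assumptions, and the function names are hypothetical), the snippet below counts Hadoop log lines per severity level so error spikes stand out:

```python
from collections import Counter
from pathlib import Path
import re

# Typical log4j line: "2024-05-01 10:22:31,456 ERROR namenode.FSNamesystem: ..."
LEVEL_RE = re.compile(r"^\S+ \S+ (TRACE|DEBUG|INFO|WARN|ERROR|FATAL)\b")

def summarize_log(lines) -> Counter:
    """Count log lines per severity level; lines that do not start with a
    timestamp and level (continuation lines, stack traces) are ignored."""
    counts = Counter()
    for line in lines:
        m = LEVEL_RE.match(line)
        if m:
            counts[m.group(1)] += 1
    return counts

def scan_log_dir(log_dir: str) -> Counter:
    """Aggregate severity counts across every *.log file in a directory
    (the directory is a placeholder for wherever the cluster keeps logs)."""
    total = Counter()
    for path in Path(log_dir).glob("*.log"):
        with open(path, errors="replace") as fh:
            total += summarize_log(fh)
    return total
```

A cron job could run this nightly and page when the `ERROR`/`FATAL` counts jump; hooking it into Nagios or Ganglia, as the monitoring item above suggests, is left out of the sketch.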
Academic Qualification
- BE / B.Tech in Computer Science or equivalent along with hands-on experience in dealing with large data sets and distributed computing in data warehousing and business intelligence systems using Hadoop.
(ref:hirist.tech)