4.0 - 8.0 years
12 - 30 Lacs
Hyderabad
Work from Office
Strong Linux and AWS experience; strong Active Directory. Manage Hadoop clusters on Linux with Active Directory integration. Collaborate with the data science team on project delivery using Splunk & Spark. Experience managing big data clusters in production.
Posted 1 week ago
8.0 - 13.0 years
22 - 37 Lacs
Pune
Hybrid
Role & responsibilities
Role: Hadoop Admin + Automation
Experience: 8+ years
Grade: AVP
Location: Pune
Mandatory skills: Hadoop administration; automation (shell scripting or any programming language, e.g. Java/Python); Cloudera/AWS/Azure/GCP
Good to have: DevOps tools
Primary focus will be on candidates with Hadoop admin and automation experience.
Posted 2 weeks ago
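Automation in a Hadoop admin context usually means wrapping routine checks, such as `hdfs dfsadmin -report`, in small scripts. A minimal illustrative sketch in Python (the 80% threshold is an arbitrary example, and the exact report field label can vary by Hadoop version):

```python
import re
import subprocess

def parse_dfs_usage(report: str) -> float:
    """Extract the 'DFS Used%' figure from `hdfs dfsadmin -report` output."""
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise ValueError("DFS Used% not found in report")
    return float(match.group(1))

def cluster_over_threshold(threshold: float = 80.0) -> bool:
    """Run the report command on a live cluster and flag high DFS usage."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_dfs_usage(report) > threshold

# Offline demonstration with sample report text (no cluster required):
sample = "Configured Capacity: 1000\nDFS Used%: 83.5%\n"
print(parse_dfs_usage(sample))  # 83.5
```

In practice a wrapper like this would be scheduled via cron and wired to an alerting channel rather than printed.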
3.0 - 8.0 years
3 - 8 Lacs
Noida
Work from Office
We are hiring for the position "Hadoop Admin".
Skill set: Hadoop, Cloudera, big data, Spark, Hive, HDFS, YARN, Kafka, SQL databases, Ranger
Experience: 3 years
Location: Noida, Sector-135
Work mode: Work from Office
Budget: 8 LPA
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
Looking for a Hadoop Administrator to manage, monitor, and optimize Hadoop clusters. Responsibilities include deployment, upgrades, performance tuning, and security. Requires 3+ years of experience with Hadoop ecosystem tools and Linux systems.
Required candidate profile: Notice period of immediate to 30 days maximum.
Posted 3 weeks ago
12 - 18 years
20 - 35 Lacs
Navi Mumbai, Mumbai, Mumbai (All Areas)
Work from Office
1. BACKGROUND
Job Title: Senior Technical Infra Manager - Hadoop Systems and Administration
IT Domain: Big Data, Data Analytics and Data Management
Client Location: BKC, Mumbai, INDIA (client-facing role)
Functional Domain: Capital Markets, Banking, BFSI
Experience: Overall 14-18 years in systems administration, with mandatory experience in Hadoop cluster administration on Cloudera or Hortonworks for the last 6 years
Expertise: Infrastructure, systems administration, Hadoop platform administration
Project: Greenfield Hadoop (CDP) based Data Warehouse (DWH) implementation
Company: Smartavya Analytica Private Limited is a niche data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience handling large datasets of up to 20 PB in a single implementation and has delivered many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Hadoop, CDP, big data, cloud, and analytics projects, with super-specialization in very large data platforms. https://smart-analytica.com - Empowering Your Digital Transformation with Data Modernization and AI
Job Summary: For its flagship project of more than 100 CDP nodes, we are looking for a senior and accomplished Senior/General Manager level infrastructure leader in Hadoop, big data, cloud, and data, encompassing infrastructure, the CDP platform, applications, and other Hadoop areas. The person in this role will own and provide leadership for the entire Hadoop infrastructure of 150 nodes encompassing 25+ PB, and will lead a team of 15 Hadoop administrators managing the client CDP cluster around the clock, 24x7, 365 days a year.
In this role, you will play a key part in technical solutioning, infrastructure management, issue resolution, resolving performance bottlenecks, cluster monitoring, cluster administration, Linux administration, security design and management, and team management and mentoring. The ideal candidate possesses a strong analytical mindset, a deep understanding of Hadoop cluster management techniques, and a knack for presenting complex analysis and resolutions in a clear and concise manner.
2. KEY RESPONSIBILITIES & SKILLS
a. Client and Team Leadership and Mentoring
• As owner, interact with VP/Director-level senior clients and provide end-to-end leadership
• Lead a team of 15 infrastructure/Hadoop administrators working on a 24x7 shift basis
• Design and draw the long-term roadmap for the overall data strategy and its implementation
• Roster management: ensure all 3 shifts across all 365 days are staffed and working efficiently
• Train new team members on Hadoop administration and get them project-ready
b. Hadoop Cluster Management and Administration
• Installation and Configuration: Install and configure Hadoop clusters using Cloudera Data Platform (CDP), ensuring an optimal setup tailored to project needs.
• Cluster Access Management: Use the Cloudera Manager Admin Console to manage cluster access and oversee administrative operations.
• Cluster Maintenance and Upgrades: Plan and execute cluster upgrades, patches, and migrations with minimal downtime.
• Resource Management: Ensure high availability and scalability by effectively managing cluster resources and configurations.
c. Hadoop System Monitoring and Performance Optimization
• Health Monitoring: Proactively monitor cluster health and performance using Cloudera Manager tools to ensure system reliability and high availability.
• Performance Tuning: Analyse system metrics to identify bottlenecks and optimize resource utilization for enhanced performance.
• Troubleshooting: Promptly troubleshoot and resolve issues related to cluster operations and data processing.
• Linux Administration: Perform Linux administration tasks and manage system configurations.
• Data Protection: Ensure data integrity and backup procedures, including DR replication.
d. Security Management and Compliance
• Access Control: Configure and manage user access, roles, and permissions within the Hadoop environment to maintain security protocols. Implement and manage security and data governance.
• Data Security: Ensure data security by implementing encryption and authentication mechanisms and adhering to security policies.
• Regulatory Compliance: Maintain compliance with industry regulations pertinent to capital markets, ensuring data handling meets required standards.
• Vulnerability Management: Collaborate with security teams to identify and address vulnerabilities, ensuring system integrity.
3. Qualifications
a. Experience:
• Overall: 14-18 years in senior positions in infrastructure management, data center management, systems administration, and/or as a Big Data/Hadoop/CDP architect or senior solutions architect.
• Hadoop Administration: Minimum of 6 years managing Hadoop clusters using Cloudera CDP or Hortonworks.
b. Technical Expertise:
• Proficient in managing infrastructure and administering Hadoop platform components and tools, including HDFS, YARN, MapReduce, Hive, Ozone, DR replication, Kudu management, Spark Streaming, and related areas.
• Strong understanding of Cloudera Data Platform (CDP) features and administrative tools.
• Experience with Linux/Unix system administration and scripting languages (e.g., Bash, Python).
• Knowledge of data warehouse concepts and big data best practices.
• Hadoop Technologies (preferred): Hands-on experience with HDFS, Hive, and Spark for handling large-scale data environments.
c. Domain Knowledge: Familiarity with capital markets and financial services is highly desirable.
d. Soft Skills:
• Excellent problem-solving and analytical abilities.
• Strong communication skills, both written and verbal.
• Ability to engage effectively with clients and stakeholders.
• Leadership skills with experience guiding technical teams.
4. Educational Background
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
5. What We Offer
• Challenging Projects: Work on a cutting-edge greenfield project in the big data and analytics space.
• Professional Growth: Opportunities for learning and career advancement within the organization.
• Collaborative Environment: Join a dynamic team focused on innovation and excellence.
• Competitive Compensation: Attractive salary package commensurate with experience and expertise.
Posted 2 months ago
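The health-monitoring duties described in the posting above are commonly scripted against the Cloudera Manager REST API rather than done by hand in the console. A minimal sketch, assuming a CM deployment exposes the services endpoint; the host, cluster name, and API version below are placeholders, and authentication is omitted:

```python
import json
from urllib.request import Request, urlopen  # stdlib only

CM_HOST = "http://cm-host.example.com:7180"  # hypothetical Cloudera Manager host
API = f"{CM_HOST}/api/v41/clusters/MyCluster/services"  # path/version vary by CM release

def unhealthy_services(payload: dict) -> list[str]:
    """Return names of services whose healthSummary is not GOOD."""
    return [s["name"] for s in payload.get("items", [])
            if s.get("healthSummary") != "GOOD"]

def fetch_services(url: str = API) -> dict:
    """Live call (requires network access and CM credentials, not shown here)."""
    with urlopen(Request(url)) as resp:
        return json.load(resp)

# Offline demonstration with a sample response shaped like the CM API's:
sample = {"items": [
    {"name": "hdfs", "healthSummary": "GOOD"},
    {"name": "yarn", "healthSummary": "CONCERNING"},
]}
print(unhealthy_services(sample))  # ['yarn']
```

A check like this, run on a schedule, is one way a 24x7 admin team turns "proactive monitoring" into an alert feed instead of a manual console sweep.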
5 - 8 years
20 - 25 Lacs
Bengaluru
Work from Office
Role & responsibilities (very important skill set as per project need)
Hadoop Support Engineer / Consultant to support our client onsite in Bangalore. 100% onsite job opportunity.
Scope - Managed Services, Level 2 & Level 3:
• User and cluster support for all analytical production and non-production clusters
• Application/user query tuning and optimization
• Cluster patching, version upgrades, bug-fix deployment, and deployment of new enhancements
• Deploy security patches and CVE fixes; perform regular maintenance of all clusters in scope, including clean-up, purging, etc.
• Application and user onboarding, resource allocation, user access requests, and resolution of other related platform requests or queries
• Coordinate hardware and network issues with relevant stakeholders and provide resolution to end users of the Hadoop platform
• Cluster expansion on all clusters
• Managing and liaising with relevant stakeholders for escalated issues and enhancements
• Apache Druid/Superset admin support, with bug fixes, security patches, and application-level security fixes
• 18x5 support for tickets during weekdays; on-call support for P1/P2 issues during weekends
• Create the needed JIRA tickets for the L4 development team
All activities will be driven via Visa's ServiceNow portal, called AskNow. The trend for the last 6 months is roughly 60/10 tickets per month across all categories. Incidents, requests, and change approval tasks should be resolved in a timely manner within defined SLAs (indicative SLAs below, to be firmed up during the contracting stage):
• 60% of tickets to be resolved within the same business day.
• 90% of tickets to be resolved within 3 business days.
• Resolution time for the remaining 10% of tickets to be agreed with the Visa manager within 3 business days after ticket creation.
Tech stack and tools primarily include Hadoop, HDFS, Spark, Hive, Kafka, Ranger, YARN, Ambari, Kerberos, Apache Airflow, ZooKeeper, Druid, Superset, etc.
Preferred candidate profile: Bangalore local candidates
Perks and benefits: As per company policy
Posted 3 months ago
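The SLA targets in the posting above (60% of tickets same business day, 90% within 3 business days) can be tracked mechanically from ticket resolution times. A hypothetical sketch, assuming resolution times are already expressed in business days:

```python
def sla_compliance(resolution_days: list[float]) -> dict[str, float]:
    """Fraction of tickets resolved same day and within 3 business days."""
    n = len(resolution_days)
    same_day = sum(1 for d in resolution_days if d < 1) / n
    within_3 = sum(1 for d in resolution_days if d <= 3) / n
    return {"same_day": same_day, "within_3_days": within_3}

# e.g. 10 tickets: 7 same-day, 2 more within 3 days, 1 late
times = [0.2] * 7 + [2.0, 3.0, 5.0]
print(sla_compliance(times))  # {'same_day': 0.7, 'within_3_days': 0.9}
```

A report like this against the monthly ticket export would show at a glance whether the 60%/90% thresholds are being met.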