Jobs
Interviews

30 Hadoop Administration Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

16.0 - 21.0 years

4 - 8 Lacs

Kolkata

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 16 years full-time education

Cloud Database Engineer HANA
Required Skills:
- SAP HANA database administration: knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability
- Proficiency in monitoring and maintaining the health and performance of high-availability systems
- Experience with public cloud platforms such as GCP, AWS, or Azure
- Strong troubleshooting skills and the ability to provide effective resolutions for technical issues
Desired Skills:
- Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop, or Postgres
- Growth and product mindset and a strong focus on automation
- Working knowledge of Kubernetes for container orchestration and scalability
Activities:
- Collaborate closely with cross-functional teams to gather requirements and support SAP teams in executing database initiatives
- Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments
- Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed
- Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions to unblock our partners
Requirements:
- Bachelor's degree in computer science, engineering, or a related field
- Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems
- Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes
- Strong troubleshooting skills and the ability to provide effective resolutions for technical issues
- Familiarity with public cloud platforms such as GCP, AWS, or Azure
- Understands Agile principles and methodologies
Qualification: 16 years full-time education

Posted 2 weeks ago

Apply

3.0 - 6.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Detailed job description - skill set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for a Cisco client is preferred

Posted 3 weeks ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Detailed job description - skill set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for a Cisco client is preferred

Posted 3 weeks ago

Apply

5.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

About the Role
As an SRE (Big Data) Engineer at PhonePe (5 to 7 years), you will be responsible for ensuring the stability, scalability, and performance of distributed systems operating at scale. You will collaborate with development, infrastructure, and data teams to automate operations, reduce manual effort, handle incidents, and continuously improve system reliability. This role requires strong problem-solving skills, operational ownership, and a proactive approach to mentoring and driving engineering excellence.

Roles and Responsibilities
- Ensure the ongoing stability, scalability, and performance of PhonePe's Hadoop ecosystem and associated services.
- Manage and administer Hadoop infrastructure including HDFS, HBase, Hive, Pig, Airflow, YARN, Ranger, Kafka, Pinot, and Druid.
- Automate BAU operations through scripting and tool development.
- Perform capacity planning, system tuning, and performance optimization.
- Set up, configure, and manage Nginx in high-traffic environments.
- Administer and troubleshoot Linux and Big Data systems, including networking (IP, iptables, IPsec).
- Handle on-call responsibilities, investigate incidents, perform root cause analysis, and implement mitigation strategies.
- Collaborate with infrastructure, network, database, and BI teams to ensure data availability and quality.
- Apply system updates and patches, and manage version upgrades in coordination with security teams.
- Build tools and services to improve observability, debuggability, and supportability.
- Participate in Kerberos and LDAP administration.
- Perform capacity planning and performance tuning of Hadoop clusters.
- Work with configuration management and deployment tools such as Puppet, Chef, Salt, or Ansible.

Skills Required
- Minimum 1 year of Linux/Unix system administration experience.
- Over 4 years of hands-on experience in Hadoop administration.
- Minimum 1 year of experience managing infrastructure on public cloud platforms like AWS, Azure, or GCP (optional).
- Strong understanding of networking, open-source tools, and IT operations.
- Proficient in scripting and programming (Perl, Golang, or Python).
- Hands-on experience maintaining and managing Hadoop ecosystem components such as HDFS, YARN, HBase, and Kafka.
- Strong operational knowledge of systems (CPU, memory, storage, OS-level troubleshooting).
- Experience administering and tuning relational and NoSQL databases.
- Experience configuring and managing Nginx in production environments.
- Excellent communication and collaboration skills.

Good to Have
- Experience designing and maintaining Airflow DAGs to automate scalable and efficient workflows.
- Experience in ELK stack administration.
- Familiarity with monitoring tools like Grafana, Loki, Prometheus, and OpenTSDB.
- Exposure to security protocols and tools (Kerberos, LDAP).
- Familiarity with distributed systems such as Elasticsearch or similar high-scale environments.

PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles)
- Insurance Benefits: medical insurance, critical illness insurance, accidental insurance, life insurance
- Wellness Program: employee assistance program, onsite medical center, emergency support system
- Parental Support: maternity benefit, paternity benefit program, adoption assistance program, day-care support program
- Mobility Benefits: relocation benefits, transfer support policy, travel policy
- Retirement Benefits: employee PF contribution, flexible PF contribution, gratuity, NPS, leave encashment
- Other Benefits: higher education assistance, car lease, salary advance policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog: Life at PhonePe | PhonePe in the news
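The capacity-planning duty listed above typically starts from the summary that `hdfs dfsadmin -report` prints. A minimal sketch of parsing that summary and flagging low headroom (the sample report text and the 75% threshold are illustrative assumptions, not from the posting):

```python
# Sketch: parse the summary block of `hdfs dfsadmin -report` and flag
# clusters running short on headroom. The field labels follow the report
# format of recent Hadoop releases; adjust for your version.
import re

def parse_dfs_report(report: str) -> dict:
    """Extract configured and used capacity (bytes) from the report header."""
    fields = {}
    for key, label in [("configured", "Configured Capacity"),
                       ("used", "DFS Used")]:
        m = re.search(rf"{label}:\s*(\d+)", report)
        fields[key] = int(m.group(1)) if m else 0
    return fields

def needs_expansion(report: str, threshold: float = 0.75) -> bool:
    """True when DFS usage exceeds the given fraction of total capacity."""
    f = parse_dfs_report(report)
    return f["configured"] > 0 and f["used"] / f["configured"] > threshold

# Illustrative report fragment, not real cluster output.
SAMPLE = """Configured Capacity: 1000000 (976.56 KB)
Present Capacity: 900000
DFS Remaining: 100000
DFS Used: 800000
"""

print(needs_expansion(SAMPLE))  # prints True (0.8 > 0.75)
```

In practice a check like this would run from cron or Airflow against live `dfsadmin` output and feed an alerting tool rather than print.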

Posted 3 weeks ago

Apply

8.0 - 10.0 years

15 - 27 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Please find below the detailed JD for Big Data Administrator.

Key Responsibilities:
- Lead CDP platform upgrades and migrations, with strong hands-on execution and documentation from planning to go-live.
- Administer and tune Hadoop ecosystem services:
  Core: HDFS, YARN, Hive, Hue, Impala, Sqoop, Oozie
  Streaming: Apache Kafka (broker/topic ops), Apache Flink (streaming jobs)
  NoSQL/Query: HBase, Phoenix
  Security: Kerberos, Ranger, LDAP, TLS
- Manage Cribl Stream deployments: build, configure, secure, and optimize data routing pipelines.
- Monitor and optimize platform performance using Cloudera Manager, New Relic, BigPanda, Prometheus, Grafana, or other observability tools.
- Design and implement backup, recovery, HA, and DR strategies for critical data infrastructure.
- Automate platform operations using Python, Bash/Shell, Scala, and CI/CD workflows.
- Work cross-functionally with Data Engineers, DevOps, InfoSec, and Cloud Engineering teams to support data pipeline reliability and scalability.
- Manage deployments using Docker, Kubernetes, Jenkins, Bitbucket, and optionally Ansible or GitOps practices.
- Support and maintain cloud-native or hybrid deployments, especially in GCP (Anthos) environments.
- Produce and maintain robust architecture documentation, runbooks, and operational SOPs.

Required Qualifications:
- 7+ years of experience in Big Data infrastructure, administration, and operations.
- Proven Cloudera CDP (7.x) experience, including production-grade migrations (7.1.6 to 7.1.9+).
- Deep expertise in:
  Apache Spark: job tuning, executor/resource optimization
  Apache Kafka: security (SASL_SSL, GSSAPI), scaling, topic lifecycle management
  Apache Flink: real-time stream processing in HA environments
  Cribl Stream: full-lifecycle management and observability integration
  HBase & Phoenix: schema evolution, read/write tuning, replication
- Scripting and automation: proficient in Python and Shell (Bash), and optionally Scala.
- Security-first mindset: working knowledge of Kerberos, Ranger policies, LDAP integration, and TLS configuration.
- DevOps experience: hands-on with Docker, Kubernetes, Jenkins, Bitbucket, and monitoring tools like Grafana/Prometheus.
- Comfortable supporting large-scale, multi-tenant environments and production on-call rotations.

Preferred Qualifications:
- Cloudera Certified Administrator (CCA) or equivalent industry certification.
- Experience with Big Data on-prem, cloud, and hybrid data infrastructure, particularly Google Cloud Platform (GCP) and Anthos clusters.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 15 Lacs

Noida

Work from Office

We are hiring for the position of "Hadoop Admin".
Skill Set: Hadoop, Cloudera, Big Data, Spark, Hive, HDFS, YARN, Kafka, SQL databases, Ranger
Experience: 7 years
Location: Noida, Sector-135
Work Mode: Work from Office
Budget: 14-15 LPA

Posted 4 weeks ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Kolkata

Work from Office

We are seeking a highly skilled and experienced Hadoop Administrator to join our dynamic team. The ideal candidate will have extensive experience in managing and optimizing Hadoop clusters, ensuring high performance and availability. You will work with a variety of big data technologies and play a pivotal role in managing data integration, troubleshooting infrastructure issues, and collaborating with cross-functional teams to streamline data workflows.

Key Responsibilities:
- Install, configure, and maintain Hadoop clusters, ensuring high availability, scalability, and performance.
- Manage and monitor various Hadoop ecosystem components, including HDFS, YARN, Hive, Impala, and other related technologies.
- Oversee the integration of data from Oracle Flexcube and other source systems into the Cloudera Data Platform.
- Troubleshoot and resolve complex issues related to Hadoop infrastructure, performance, and applications.
- Collaborate with cross-functional teams including data engineers, analysts, and architects to optimize data workflows and processes.
- Implement and manage data backup, recovery plans, and disaster recovery strategies for Hadoop clusters.
- Perform regular health checks on the Hadoop ecosystem, including managing logs, capacity planning, and system updates.
- Develop, test, and optimize scripts to automate system maintenance and data management tasks.
- Ensure compliance with internal security policies and industry best practices for data protection.
- Provide training and guidance to junior team members and help in knowledge sharing within the team.
- Create and maintain documentation related to Hadoop administration processes, system configurations, troubleshooting steps, and best practices.
- Stay updated with the latest trends in Hadoop technologies and suggest improvements and new tools as necessary.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of hands-on experience in Hadoop administration, with a preference for candidates from the banking or financial sectors.
- Strong knowledge of Oracle Flexcube, Cloudera Data Platform, Hadoop, Hive, Impala, and other big data technologies.
- Proven experience in managing and optimizing large-scale Hadoop clusters, including cluster upgrades and performance tuning.
- Expertise in configuring and tuning Hadoop-related services (e.g., HDFS, YARN, MapReduce).
- Strong understanding of data security principles and implementation of security protocols within Hadoop.
- Excellent analytical, troubleshooting, and problem-solving skills.
- Strong communication and interpersonal skills with the ability to work collaboratively within cross-functional teams.
- Ability to work independently, manage multiple priorities, and meet deadlines.
- Certification in Hadoop administration or related fields is a plus.
- Experience with scripting languages such as Python, Shell, or Perl is desirable.
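The routine health checks mentioned in the responsibilities above usually boil down to comparing a few cluster metrics against thresholds. A minimal sketch of that gate; the metric names mirror NameNode JMX/fsck fields, but the snapshot dict and thresholds here are illustrative stand-ins, not an API from the posting:

```python
# Sketch: a health-check gate for routine Hadoop cluster checks.
# In a real setup the snapshot would come from the NameNode JMX
# endpoint or `hdfs fsck /` output; here it is a plain dict.

def cluster_health_issues(metrics: dict) -> list:
    """Return human-readable issues found in a metrics snapshot."""
    issues = []
    if metrics.get("MissingBlocks", 0) > 0:
        issues.append(f"{metrics['MissingBlocks']} missing blocks")
    if metrics.get("UnderReplicatedBlocks", 0) > 100:  # assumed threshold
        issues.append("under-replicated blocks above threshold")
    if metrics.get("NumDeadDataNodes", 0) > 0:
        issues.append(f"{metrics['NumDeadDataNodes']} dead DataNode(s)")
    return issues

# Illustrative snapshot, not real cluster data.
snapshot = {"MissingBlocks": 0, "UnderReplicatedBlocks": 250, "NumDeadDataNodes": 1}
for issue in cluster_health_issues(snapshot):
    print("ALERT:", issue)
```

A scheduled job would run this after each metrics scrape and page only when the returned list is non-empty.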

Posted 1 month ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

We are looking for a skilled Hadoop Administrator with 5 to 7 years of experience in Hadoop engineering, working with Python, Ansible, and DevOps methodologies. The ideal candidate will have extensive experience in CDP/HDP cluster and server builds, including control nodes, worker nodes, edge nodes, and data copy from cluster to cluster.

Roles and Responsibilities:
- Design and implement scalable and efficient data processing systems using Hadoop technologies.
- Develop and maintain automation scripts using Python, Ansible, and other DevOps tools.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Troubleshoot and resolve complex technical issues related to Hadoop clusters.
- Ensure high-quality standards for data processing and security.
- Participate in code reviews and contribute to the improvement of the overall codebase.

Job Requirements:
- Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, and YARN.
- Experience with the Linux operating system and scripting languages such as Bash or Python.
- Proficient in shell scripting and YAML configuration files.
- Good technical design, problem-solving, and debugging skills.
- Understanding of CI/CD concepts and familiarity with GitHub, Jenkins, and Ansible.
- Hands-on development of solutions using industry-leading cloud technologies.
- Working knowledge of GitOps and DevSecOps.
- Proficient in Agile and knowledgeable in other agile methodologies, ideally certified.
- Strong communication and networking skills.
- Ability to work autonomously and take accountability to execute and deliver on goals.
- Strong commitment to high-quality standards.
- Good communication skills and a sense of ownership to work as an individual contributor.
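The cluster builds described above (control, worker, and edge nodes) are commonly driven from a generated Ansible-style inventory. A minimal sketch; the hostname patterns, role names, and `build_inventory` helper are hypothetical examples, not part of the posting:

```python
# Sketch: derive a role-to-host inventory for a CDP/HDP cluster build.
# Hostname conventions are assumptions; real builds would render this
# into an Ansible inventory file or feed it to Cloudera Manager's API.

def build_inventory(prefix: str, workers: int) -> dict:
    """Map cluster roles (control/worker/edge) to generated hostnames."""
    return {
        "control": [f"{prefix}-master01", f"{prefix}-master02"],
        "worker": [f"{prefix}-worker{i:02d}" for i in range(1, workers + 1)],
        "edge": [f"{prefix}-edge01"],
    }

inv = build_inventory("cdp-prod", workers=3)
print(inv["worker"])  # prints ['cdp-prod-worker01', 'cdp-prod-worker02', 'cdp-prod-worker03']
```

Keeping the role layout in code like this makes the node counts reviewable and repeatable across cluster-to-cluster copies.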

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Hyderabad

Work from Office

- Experience in SQL and understanding of ETL best practices.
- Good hands-on experience in ETL/Big Data development.
- Extensive hands-on experience in Scala.
- Experience in Spark/YARN, and in troubleshooting Spark, Linux, and Python.
- Setting up a Hadoop cluster; backup, recovery, and maintenance.

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Big Data (Hadoop and Spark) skills. Programming languages: Python, Scala.

Job requirement: This position is for a mid-level data engineer with development experience who will focus on creating new capabilities in the Risk space while maturing our code base and development processes.

Qualifications:
- 3 or more years of work experience with a bachelor's degree, or more than 2 years of work experience with an advanced degree (e.g., Master's, MBA, JD, MD)
- Experience in creating data-driven business solutions and solving data problems using a wide variety of technologies such as Hadoop, Hive, Spark, MongoDB, and NoSQL, as well as traditional data technologies like RDBMS; MySQL a plus
- Ability to program in one or more scripting languages such as Perl or Python, and one or more programming languages such as Java or Scala
- Experience with data visualization and business intelligence tools like Tableau is a plus
- Experience with or knowledge of Continuous Integration & Development and automation tools such as Jenkins, Artifactory, Git, etc.
- Experience with or knowledge of Agile and Test-Driven Development methodology
- Strong analytical skills with excellent problem-solving ability

Posted 1 month ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Ahmedabad

Work from Office

Project Role: Business Process Architect
Project Role Description: Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. Work closely with the Application Architect to create the process blueprint and establish business process requirements to drive out application requirements and metrics. Assist in quality management reviews, ensure all business and design requirements are met. Educate stakeholders to ensure a complete understanding of the designs.
Must have skills: Data Analytics, Data Warehouse ETL Testing, Big Data Analysis Tools and Techniques, Hadoop Administration
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Specific undergraduate qualifications, i.e., engineering or computer science

Summary: Experienced Data Engineer with a strong background in Azure data services and broadcast supply chain ecosystems. Skilled in OTT streaming protocols, cloud technologies, and project management.

Roles & Responsibilities:
- Proven experience as a Data Engineer or in a similar role.
- Lead and provide expert guidance and support to the Principal - Solutions & Integration.
- Track and report on project progress using internal applications.
- Transition customer requirements to on-air operations with proper documentation.
- Scope projects and ensure adherence to budgets and timelines.
- Generate design and integration documentation.

Professional & Technical Skills:
- Strong proficiency in Azure data services (Azure Data Factory, Azure Databricks, Azure SQL Database).
- Experience with SQL, Python, and big data tools (Hadoop, Spark, Kafka).
- Familiarity with data warehousing, ETL techniques, and microservices in a cloud environment.
- Knowledge of broadcast supply chain ecosystems (BMS, RMS, MAM, Playout, MCR/PCR, NLE, Traffic).
- Experience with OTT streaming protocols, DRM, and content delivery networks.
- Working knowledge of cloud technologies (Azure, Docker, Kubernetes, AWS basics, GCP basics).
- Basic understanding of AWS Media Services (MediaConnect, Elemental, MediaLive, MediaStore, Media2Cloud, S3, Glacier).

Additional Information:
- Minimum of 5 years' experience in Data Analytics disciplines.
- Good presentation and documentation skills.
- Excellent interpersonal skills.
- Undergraduate qualifications in engineering or computer science.

Networking: Apply basic networking knowledge including TCP/IP, UDP/IP, IGMP, DHCP, DNS, and LAN/WAN technologies to support video delivery systems.

Highly Desirable:
- Experience in defining technical solutions with over 99.999% reliability.

Qualification: Specific undergraduate qualifications, i.e., engineering or computer science

Posted 1 month ago

Apply

8.0 - 12.0 years

14 - 15 Lacs

Pune

Work from Office

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Marketing Title. In this role, you will:
- Bring 8-12 years of experience in Hadoop administration, with working experience in Python, Ansible, and DevOps methodologies.
- Be responsible, as a Hadoop Admin, for building different kinds of solutions on the Big Data Platform (CDP and ODP environments).
- Build Python and Ansible automations and make DevSecOps contributions.
- Use Agile/DevOps methodology to deliver quality software.
- Guide team members in arriving at and delivering the right solutions.
- Monitor and improve the performance of Hadoop platforms.
- Bring CDP/ODP migration and upgrade experience.

Requirements - To be successful in this role, you should meet the following requirements:
- Big data ecosystem and Hadoop administration knowledge; also knowledgeable about Active Directory and Centrify.
- Working knowledge of Python, Ansible, and CI/CD tools.
- Coordinating with vendors and business teams during environment outages.
- Development/coding experience (Java/Python/Groovy/shell scripting).
- Comfortable dealing with frequent testing and incremental releases.
- Understanding of Ops challenges and how they can be addressed during design and development.
- Soft skills for better collaboration across the team.

Posted 1 month ago

Apply

16.0 - 21.0 years

4 - 8 Lacs

Kolkata

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must have skills: SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration, Ansible on Microsoft Azure
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 16 years full-time education

Cloud Database Engineer HANA
Required Skills:
- SAP HANA database administration: knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability
- Proficiency in monitoring and maintaining the health and performance of high-availability systems
- Experience with public cloud platforms such as GCP, AWS, or Azure
- Strong troubleshooting skills and the ability to provide effective resolutions for technical issues
Desired Skills:
- Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop, or Postgres
- Growth and product mindset and a strong focus on automation
- Working knowledge of Kubernetes for container orchestration and scalability
Activities:
- Collaborate closely with cross-functional teams to gather requirements and support SAP teams in executing database initiatives
- Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments
- Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed
- Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions to unblock our partners
Requirements:
- Bachelor's degree in computer science, engineering, or a related field
- Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems
- Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes
- Strong troubleshooting skills and the ability to provide effective resolutions for technical issues
- Familiarity with public cloud platforms such as GCP, AWS, or Azure
- Understands Agile principles and methodologies
Qualification: 16 years full-time education

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of the role is to resolve, maintain, and manage clients' software/hardware/network based on the service requests raised by end users, as per the defined SLAs, ensuring client satisfaction.

Do:
- Ensure timely response to all tickets raised by the client end user.
- Provide service request solutioning while maintaining quality parameters.
- Act as a custodian of the client's network/server/system/storage/platform/infrastructure and other equipment, keeping track of their proper functioning and upkeep.
- Keep a check on the number of tickets raised (dial home/email/chat/IMS), ensuring the right solutioning within the defined resolution timeframe.
- Perform root cause analysis of the tickets raised and create an action plan to resolve the problem, ensuring client satisfaction.
- Provide acceptance and immediate resolution for high-priority tickets/service requests.
- Install and configure software/hardware requirements based on service requests.
- 100% adherence to timeliness as per the priority of each issue, to manage client expectations and ensure zero escalations.
- Provide application/user access as per client requirements and requests to ensure timely solutioning.
- Track all tickets from acceptance to resolution stage, as per the resolution time defined by the customer.
- Maintain timely backup of important data/logs and management resources to ensure the solution is of acceptable quality and maintains client satisfaction.
- Coordinate with the on-site team for complex problem resolution and ensure timely client servicing.
- Review the logs that chat bots gather and ensure all service requests/issues are resolved in a timely manner.

Deliver
No. | Performance Parameter | Measure
1 | 100% adherence to SLA/timelines | Multiple cases of red time; zero customer escalations; client appreciation emails

Mandatory Skills: Hadoop Admin.

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

We are seeking a skilled Hadoop/Cloudera Administrator to provide technical support for data integration and visualization platforms. The ideal candidate will also have exposure to Snowflake and AWS administration.
- Provide technical support to customers and internal teams for data integration and visualization platforms, primarily focused on Hadoop/Cloudera administration. Additional knowledge/experience in Snowflake and AWS administration is a plus.
- Investigate and troubleshoot software and system issues reported by users; perform root cause analysis and implement long-term solutions.
- Collaborate closely with development and QA teams to test and validate fixes and system enhancements.
- Debug application-level issues and provide effective resolutions or temporary workarounds as needed.
- Create and maintain comprehensive documentation for support processes, known issues, and resolution procedures.
- Maintain and update Standard Operating Procedures (SOPs) and the Known Error Database (KEDB) with accurate and actionable information.
- Participate in problem management by identifying patterns in recurring incidents and driving root cause analysis and permanent fixes.
- Participate in on-call rotations to support critical production systems outside of standard business hours.
- Proactively monitor system performance and identify opportunities to enhance platform reliability, scalability, and user experience.
Hadoop Administration, AWS, Cloudera (Hadoop)

Posted 2 months ago

Apply

4.0 - 9.0 years

5 - 8 Lacs

Gurugram

Work from Office

RARR Technologies is looking for a Hadoop Admin to join our dynamic team and embark on a rewarding career journey. Responsible for managing day-to-day administrative tasks and providing support to employees, customers, and visitors.

Responsibilities:
1. Manage incoming and outgoing mail, packages, and deliveries.
2. Maintain office supplies and equipment, and ensure that they are in good working order.
3. Coordinate scheduling and meetings, and make arrangements for travel and accommodations as needed.
4. Greet and assist visitors, and answer and direct phone calls as needed.

Requirements:
1. Experience in an administrative support role, with a track record of delivering high-quality work.
2. Excellent organizational and time-management skills.
3. Strong communication and interpersonal skills, with the ability to interact effectively with employees, customers, and visitors.
4. Proficiency with Microsoft Office and other common office software, including email and calendar applications.

Posted 2 months ago

Apply

8.0 - 13.0 years

22 - 37 Lacs

Pune

Hybrid

Role & responsibilities
Role: Hadoop Admin + Automation
Experience: 8+ years
Grade: AVP
Location: Pune
Mandatory Skills: Hadoop Admin; Automation (shell scripting or any programming language such as Java/Python); Cloudera/AWS/Azure/GCP
Good to have: DevOps tools
Primary focus will be on candidates with Hadoop admin and automation experience.

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: Unix Shell Scripting, Hadoop Administration, PySpark
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient and scalable application solutions.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide constructive feedback to team members.
- Stay updated on industry trends and best practices to enhance application development processes.
- Assist in troubleshooting and resolving application-related issues.

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio.
- Good-to-have skills: experience with Unix shell scripting, Hadoop administration, and PySpark.
- Strong understanding of ETL processes and data integration.
- Experience in developing and optimizing data pipelines.
- Knowledge of data warehousing concepts and methodologies.
- Familiarity with database technologies and SQL queries.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 2 months ago

Apply

3.0 - 8.0 years

3 - 8 Lacs

Noida

Work from Office

We are hiring for the position of "Hadoop Admin".
Skill Set: Hadoop, Cloudera, Big Data, Spark, Hive, HDFS, YARN, Kafka, SQL databases, Ranger
Experience: 3 years
Location: Noida, Sector-135
Work Mode: Work from Office
Budget: 8 LPA

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

Looking for a Hadoop Administrator to manage, monitor, and optimize Hadoop clusters. Responsibilities include deployment, upgrades, performance tuning, and security. Requires 3+ years of experience with Hadoop ecosystem tools and Linux systems.

Required candidate profile: Notice period of immediate to 30 days max.

Posted 2 months ago

Apply

5.0 - 10.0 years

3 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are currently hiring for a Database Administrator position with one of our prestigious clients based in Muscat, Oman. We are offering visa sponsorship, free travel, and 15 days of accommodation for selected candidates.

Position Details:
Role: Database Administrator
Experience: Minimum 7 years in IT, with at least 5 years in database administration and 3 years supporting open-source databases
Location: Muscat, Oman

Key Skills:
- PostgreSQL, Hadoop, MongoDB
- High-availability solutions (Always On, log shipping, mirroring, replication, clustering)
- Disaster recovery and backup strategy
- Security patching and hotfix management
- Windows Server and network coordination
- 24x7 production and on-call support

Responsibilities:
- Design and implement application integrations with existing and new databases
- Coordinate closely with Security, Network, and Windows teams
- Install and upgrade database systems per company standards
- Ensure best practices in database security and performance
- Conduct regular DR drills and ensure system reliability

Posted 2 months ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role: Hadoop Admin
Location: Bengaluru
Band: B3 (7+ years and above)
Notice Period: Immediate to 30 days
Interview rounds: L1 (Virtual), L2 (Face-to-Face)
Mandatory Skills: Hadoop, HDFS, Unix/Linux server setup
Good to have: Scripting languages (Bash, Python)
- Proven experience in Hadoop administration and HDFS management.
- Extensive experience in building and managing data pipelines in Hadoop.
- Strong background in Unix/Linux server setup, maintenance, and upgrades.
- Excellent troubleshooting skills and experience with Linux package installation.
- Skilled in scripting languages (Bash, Python) for automation of tasks and workflows.
- Familiarity with virtualization technologies and Conda/Python environment management.
- Experience with running ML pipelines on NVIDIA GPU clusters.
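The HDFS management and scripting skills this posting asks for often come together in small health-check scripts. A minimal sketch that parses DataNode usage figures from text in the style of `hdfs dfsadmin -report` output; the sample report excerpt, hostnames, and threshold are illustrative assumptions, not real cluster output:

```python
import re

# Sample excerpt in the style of `hdfs dfsadmin -report` output
# (illustrative only; real reports contain many more fields per node).
SAMPLE_REPORT = """\
Name: 10.0.0.11:9866 (dn1.example.com)
Configured Capacity: 1000000000000 (1 TB)
DFS Used%: 91.50%

Name: 10.0.0.12:9866 (dn2.example.com)
Configured Capacity: 1000000000000 (1 TB)
DFS Used%: 42.10%
"""

def overloaded_datanodes(report: str, threshold: float = 85.0):
    """Return (hostname, used_pct) pairs for DataNodes whose DFS usage exceeds threshold."""
    nodes = []
    host = None
    for line in report.splitlines():
        name_match = re.match(r"Name: \S+ \((\S+)\)", line)
        if name_match:
            host = name_match.group(1)
        pct_match = re.match(r"DFS Used%: ([\d.]+)%", line)
        if pct_match and host:
            pct = float(pct_match.group(1))
            if pct > threshold:
                nodes.append((host, pct))
    return nodes

if __name__ == "__main__":
    for host, pct in overloaded_datanodes(SAMPLE_REPORT):
        print(f"WARNING: {host} at {pct:.1f}% DFS usage")
```

In practice the report text would be captured with something like `subprocess.run(["hdfs", "dfsadmin", "-report"], ...)` on a cluster node and the script run from cron.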

Posted 2 months ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Pune

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Hadoop.
Experience: 5-8 years.

Posted 2 months ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Chennai

Work from Office

Experience: 2+ years of experience in IT, with at least 1+ years of experience with cloud and system administration, and at least 2 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem: Hive, HDFS, Map/Reduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.

Key Responsibilities:
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
- Monitor and manage Hadoop cluster performance, capacity, and security.
- Perform routine maintenance tasks such as upgrades, patching, and backups.
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
- Ensure high availability and disaster recovery of Hadoop clusters.
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
- Troubleshoot and resolve issues related to the Hadoop ecosystem.
- Maintain documentation of Hadoop environment configurations, processes, and procedures.

Requirements:
- Experience in installing, configuring, and tuning Hadoop distributions; hands-on experience in Cloudera.
- Understanding of Hadoop design principles and factors that affect distributed system performance, including hardware and network considerations.
- Provide infrastructure recommendations, capacity planning, and workload management.
- Develop utilities to better monitor the cluster using tools such as Ganglia and Nagios.
- Manage large clusters with huge volumes of data.
- Perform cluster maintenance tasks: creation and removal of nodes, cluster monitoring, and troubleshooting.
- Manage and review Hadoop log files.
- Install and implement security for Hadoop clusters.
- Install Hadoop updates, patches, and version upgrades, and automate these through scripts.
- Act as point of contact for vendor escalation; work with Hortonworks in resolving issues.
- Conceptual/working knowledge of basic data management concepts such as ETL, reference/master data, data quality, and RDBMS.
- Working knowledge of a scripting language such as Shell, Python, or Perl.
- Experience with orchestration and deployment tools.

Academic Qualification:
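Responsibilities like "manage and review Hadoop log files" are commonly automated with a short script rather than done by hand. A hedged sketch that tallies log lines by severity, assuming the log4j-style layout Hadoop daemons typically emit; the sample lines, timestamps, and component names are hypothetical:

```python
from collections import Counter

# Hypothetical log4j-style lines in the shape Hadoop daemons typically emit:
# "<date> <time> <LEVEL> <logger>: <message>".
SAMPLE_LOG = """\
2024-05-01 10:00:01,123 INFO  org.apache.hadoop.hdfs.server.namenode.NameNode: startup complete
2024-05-01 10:05:42,456 WARN  org.apache.hadoop.hdfs.server.datanode.DataNode: slow BlockReceiver
2024-05-01 10:06:10,789 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: disk failure detected
2024-05-01 10:07:00,000 WARN  org.apache.hadoop.hdfs.server.datanode.DataNode: slow BlockReceiver
"""

def severity_counts(log_text: str) -> Counter:
    """Count log lines by severity level, taken as the third whitespace-separated field."""
    counts = Counter()
    for line in log_text.splitlines():
        parts = line.split()
        if len(parts) >= 3 and parts[2] in {"DEBUG", "INFO", "WARN", "ERROR", "FATAL"}:
            counts[parts[2]] += 1
    return counts

if __name__ == "__main__":
    for level, count in severity_counts(SAMPLE_LOG).most_common():
        print(f"{level}: {count}")
```

A nightly run over the previous day's logs, feeding the ERROR/FATAL counts into an alert, is one simple way to turn log review into a repeatable check.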

Posted 2 months ago

Apply

10.0 - 15.0 years

8 - 14 Lacs

Chennai

Work from Office

Years of Experience: 10-15 years
Shifts: 24x7 (rotational shift)
Mode: Onsite
Experience: 10+ years of experience in IT, with at least 7+ years of experience with cloud and system administration, and at least 5 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem: Hive, HDFS, Map/Reduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.

Key Responsibilities:
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
- Monitor and manage Hadoop cluster performance, capacity, and security.
- Perform routine maintenance tasks such as upgrades, patching, and backups.
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
- Ensure high availability and disaster recovery of Hadoop clusters.
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
- Troubleshoot and resolve issues related to the Hadoop ecosystem.
- Maintain documentation of Hadoop environment configurations, processes, and procedures.

Requirements:
- Experience in installing, configuring, and tuning Hadoop distributions; hands-on experience in Cloudera.
- Understanding of Hadoop design principles and factors that affect distributed system performance, including hardware and network considerations.
- Provide infrastructure recommendations, capacity planning, and workload management.
- Develop utilities to better monitor the cluster using tools such as Ganglia and Nagios.
- Manage large clusters with huge volumes of data.
- Perform cluster maintenance tasks: creation and removal of nodes, cluster monitoring, and troubleshooting.
- Manage and review Hadoop log files.

Posted 2 months ago

Apply
Page 1 of 2