Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
16.0 - 21.0 years
4 - 8 Lacs
Kolkata
Work from Office
Project Role : Software Development Engineer
Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills : SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration, Ansible on Microsoft Azure
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 16 years full time education

Cloud Database Engineer HANA
Required Skills:
- SAP HANA Database Administration: knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability
- Proficiency in monitoring and maintaining the health and performance of high availability systems
- Experience with public cloud platforms such as GCP, AWS, or Azure
- Strong troubleshooting skills and the ability to provide effective resolutions for technical issues
Desired Skills:
- Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop or Postgres
- Growth and product mindset and a strong focus on automation
- Working knowledge of Kubernetes for container orchestration and scalability
Activities:
- Collaborate closely with cross-functional teams to gather requirements and support SAP teams in executing database initiatives
- Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments
- Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed
- Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions that unblock our partners
Qualifications:
- Bachelor's degree in computer science, engineering, or a related field
- Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems
Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes. Strong troubleshooting skills and the ability to provide effective resolutions for technical issues. Familiarity with public cloud platforms such as GCP, AWS, or Azure. Understands Agile principles and methodologies.
Qualification : 16 years full time education
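The availability-monitoring duties described above usually begin with a simple liveness probe. A minimal sketch in Python, assuming only that the database (HANA, PostgreSQL, etc.) exposes a TCP listener; the hostname and port below are illustrative placeholders:

```python
# Minimal database-availability probe: reports whether a TCP connection
# to the database's listener succeeds within a timeout.
import socket

def is_port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Placeholder host/port; adjust to your database's SQL endpoint.
    print(is_port_open("hana-host.example.com", 30015))
```

In practice a probe like this would be scheduled by a monitoring system and combined with deeper checks (replication lag, service status) rather than used alone.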
Posted 4 days ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to resolve, maintain and manage the client's software/hardware/network based on the service requests raised by the end user, as per the defined SLAs, ensuring client satisfaction.

Do
- Ensure timely response to all tickets raised by the client end user
- Solution service requests while maintaining quality parameters
- Act as a custodian of the client's network/server/system/storage/platform/infrastructure and other equipment, keeping track of their proper functioning and upkeep
- Keep a check on the number of tickets raised (dial home/email/chat/IMS), ensuring the right solutioning within the defined resolution timeframe
- Perform root cause analysis of the tickets raised and create an action plan to resolve the problem and ensure client satisfaction
- Provide acceptance and immediate resolution for high priority tickets/service requests
- Install and configure software/hardware as per service requests
- 100% adherence to timelines as per the priority of each issue, to manage client expectations and ensure zero escalations
- Provide application/user access as per client requirements and requests to ensure timely solutioning
- Track all tickets from acceptance to resolution, within the resolution time defined by the customer
- Maintain timely backups of important data/logs and management resources to ensure the solution is of acceptable quality and maintains client satisfaction
- Coordinate with the on-site team for complex problem resolution and ensure timely client servicing
- Review the logs that chat bots gather and ensure all service requests/issues are resolved in a timely manner

Deliver
No. | Performance Parameter | Measure
1 | 100% adherence to SLA/timelines | Multiple cases of red time; zero customer escalations; client appreciation emails

Mandatory Skills: Hadoop Admin.
Posted 1 week ago
7.0 - 12.0 years
9 - 14 Lacs
Hyderabad
Work from Office
We are seeking a skilled Hadoop/Cloudera Administrator to provide technical support for data integration and visualization platforms. The ideal candidate will also have exposure to Snowflake and AWS administration.
- Provide technical support to customers and internal teams for data integration and visualization platforms, primarily focused on Hadoop/Cloudera administration. Additional knowledge/experience in Snowflake and AWS administration is a plus.
- Investigate and troubleshoot software and system issues reported by users; perform root cause analysis and implement long-term solutions.
- Collaborate closely with development and QA teams to test and validate fixes and system enhancements.
- Debug application-level issues and provide effective resolutions or temporary workarounds as needed.
- Create and maintain comprehensive documentation for support processes, known issues, and resolution procedures.
- Maintain and update Standard Operating Procedures (SOPs) and the Known Error Database (KEDB) with accurate and actionable information.
- Participate in problem management by identifying patterns in recurring incidents and driving root cause analysis and permanent fixes.
- Participate in on-call rotations to support critical production systems outside of standard business hours.
- Proactively monitor system performance and identify opportunities to enhance platform reliability, scalability, and user experience.
Skills: Hadoop Administration, AWS, Cloudera (Hadoop)
Posted 2 weeks ago
4.0 - 9.0 years
5 - 8 Lacs
Gurugram
Work from Office
RARR Technologies is looking for a HADOOP ADMIN to join our dynamic team and embark on a rewarding career journey. Responsible for managing day-to-day administrative tasks and providing support to employees, customers, and visitors.
Responsibilities:
1. Manage incoming and outgoing mail, packages, and deliveries
2. Maintain office supplies and equipment, and ensure that they are in good working order
3. Coordinate scheduling and meetings, and make arrangements for travel and accommodations as needed
4. Greet and assist visitors, and answer and direct phone calls as needed
Requirements:
1. Experience in an administrative support role, with a track record of delivering high-quality work
2. Excellent organizational and time-management skills
3. Strong communication and interpersonal skills, with the ability to interact effectively with employees, customers, and visitors
4. Proficiency with Microsoft Office and other common office software, including email and calendar applications
Posted 2 weeks ago
8.0 - 13.0 years
22 - 37 Lacs
Pune
Hybrid
Role & responsibilities
Role: Hadoop Admin + Automation
Experience: 8+ yrs
Grade: AVP
Location: Pune
Mandatory Skills: Hadoop Admin, Automation (shell scripting or any programming language such as Java/Python), Cloudera/AWS/Azure/GCP
Good to have: DevOps tools
Primary focus will be on candidates with Hadoop admin & automation experience.
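The "Hadoop admin + automation" combination typically means wrapping the Hadoop CLI in scripts. A hedged sketch in Python that extracts the cluster-wide "DFS Used%" figure from `hdfs dfsadmin -report` output (the report lines in the test reflect stock Apache Hadoop; the exact layout can vary by distribution, so treat the regex as an assumption to verify against your cluster):

```python
# Automation helper: parse the summary section of `hdfs dfsadmin -report`
# to obtain the cluster's DFS usage percentage.
import re
import subprocess

def dfs_used_percent(report_text: str) -> float:
    """Return the first 'DFS Used%' value (the cluster summary) as a float."""
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report_text)
    if match is None:
        raise ValueError("no 'DFS Used%' line found in report")
    return float(match.group(1))

def cluster_dfs_used_percent() -> float:
    # Requires the `hdfs` CLI on PATH and a reachable NameNode.
    out = subprocess.run(["hdfs", "dfsadmin", "-report"],
                         capture_output=True, text=True, check=True).stdout
    return dfs_used_percent(out)
```

A cron job could call `cluster_dfs_used_percent()` and alert when the value crosses a threshold.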
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Ab Initio
Good to have skills : Unix Shell Scripting, Hadoop Administration, PySpark
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient and scalable application solutions.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide constructive feedback to team members.
- Stay updated on industry trends and best practices to enhance application development processes.
- Assist in troubleshooting and resolving application-related issues.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Ab Initio.
- Good To Have Skills: Experience with Unix Shell Scripting, Hadoop Administration, PySpark.
- Strong understanding of ETL processes and data integration.
- Experience in developing and optimizing data pipelines.
- Knowledge of data warehousing concepts and methodologies.
- Familiarity with database technologies and SQL queries.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- 15 years of full time education is required.
Qualification : 15 years full time education
Posted 2 weeks ago
3.0 - 8.0 years
3 - 8 Lacs
Noida
Work from Office
We are hiring for the position of "Hadoop Admin".
Skill Set: Hadoop, Cloudera, big data, Spark, Hive, HDFS, YARN, Kafka, SQL databases, Ranger
Experience: 3+ years
Location: Noida, Sector-135
Work Mode: Work from Office
Budget: 8 LPA
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
Looking for a Hadoop Administrator to manage, monitor, and optimize Hadoop clusters. Responsibilities include deployment, upgrades, performance tuning, and security. Requires 3+ years of experience with Hadoop ecosystem tools and Linux systems.
Required Candidate profile: Notice period of immediate to 30 days max.
Posted 3 weeks ago
5.0 - 10.0 years
3 - 3 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are currently hiring for a Database Administrator position with one of our prestigious clients based in Muscat, Oman. We are offering visa sponsorship, free travel, and 15 days of accommodation for selected candidates.
Position Details:
Role: Database Administrator
Experience: Minimum 7 years in IT, with at least 5 years in Database Administration and 3 years supporting open-source databases
Location: Muscat, Oman
Key Skills:
- PostgreSQL, Hadoop, MongoDB
- High-Availability Solutions (Always On, Log Shipping, Mirroring, Replication, Clustering)
- Disaster Recovery & Backup Strategy
- Security Patching and Hot Fix Management
- Windows Server & Network Coordination
- 24x7 Production and On-Call Support
Responsibilities:
- Design and implement application integrations with existing/new databases
- Coordinate closely with Security, Network, and Windows teams
- Install and upgrade database systems per company standards
- Ensure best practices in database security and performance
- Conduct regular DR drills and ensure system reliability
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Location: Bengaluru
Band: B3 (7+ years and above)
Notice Period: Immediate to 30 days
Interview rounds: L1 (Virtual), L2 (Face-to-Face)
Mandatory Skills: Hadoop, HDFS, Unix/Linux server setup
Good to have: Scripting languages (Bash, Python)
- Proven experience in Hadoop administration and HDFS management.
- Extensive experience in building and managing data pipelines in Hadoop.
- Strong background in Unix/Linux server setup, maintenance, and upgrades.
- Excellent troubleshooting skills and experience with Linux package installation.
- Skilled in scripting languages (Bash, Python) for automation of tasks and workflows.
- Familiarity with virtualization technologies and Conda/Python environment management.
- Experience with running ML pipelines on NVIDIA GPU clusters
Role: Hadoop Admin
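The data-pipeline and HDFS-management skills listed above often involve programmatic access to HDFS, for which the standard WebHDFS REST API (`op=LISTSTATUS`) is a common entry point. A minimal sketch; the NameNode hostname is a placeholder, and the default HTTP port 9870 assumes Hadoop 3.x:

```python
# Sketch of pipeline bookkeeping over WebHDFS: build a LISTSTATUS URL and
# turn the JSON response into a {name: size} mapping.
import json
import urllib.request

def list_status_url(namenode: str, path: str) -> str:
    """Build a WebHDFS LISTSTATUS URL (HTTP port 9870 on Hadoop 3.x)."""
    return f"http://{namenode}:9870/webhdfs/v1{path}?op=LISTSTATUS"

def file_sizes(liststatus_json: str) -> dict:
    """Map each entry's pathSuffix to its length from a LISTSTATUS response."""
    statuses = json.loads(liststatus_json)["FileStatuses"]["FileStatus"]
    return {s["pathSuffix"]: s["length"] for s in statuses}

def hdfs_listing(namenode: str, path: str) -> dict:
    # Requires network access to a live NameNode.
    with urllib.request.urlopen(list_status_url(namenode, path)) as resp:
        return file_sizes(resp.read().decode())
```

A pipeline health check might compare these sizes against expected minimums before triggering downstream jobs.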
Posted 3 weeks ago
5.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance

Mandatory Skills: Hadoop.
Experience: 5-8 Years.
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Chennai
Work from Office
Experience: 2+ yrs of experience in IT, with at least 1+ years of experience in cloud and system administration, and at least 2 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem: Hive, HDFS, Map/Reduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.
Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.
Key Responsibilities:
• Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
• Monitor and manage Hadoop cluster performance, capacity, and security.
• Perform routine maintenance tasks such as upgrades, patching, and backups.
• Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
• Ensure high availability and disaster recovery of Hadoop clusters.
• Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
• Troubleshoot and resolve issues related to the Hadoop ecosystem.
• Maintain documentation of Hadoop environment configurations, processes, and procedures.
Requirement:
• Experience in installing, configuring and tuning Hadoop distributions; hands-on experience with Cloudera.
• Understanding of Hadoop design principles and the factors that affect distributed system performance, including hardware and network considerations.
• Provide infrastructure recommendations, capacity planning, and workload management.
• Develop utilities to better monitor the cluster (Ganglia, Nagios, etc.).
• Manage large clusters with huge volumes of data; perform cluster maintenance tasks such as creation and removal of nodes, cluster monitoring and troubleshooting.
• Manage and review Hadoop log files.
• Install and implement security for Hadoop clusters.
• Install Hadoop updates, patches and version upgrades; automate the same through scripts.
• Act as point of contact for vendor escalation; work with Hortonworks in resolving issues.
• Conceptual/working knowledge of basic data management concepts like ETL, ref/master data, data quality, RDBMS.
• Working knowledge of a scripting language like Shell, Python or Perl.
• Experience with orchestration and deployment tools.
Academic Qualification:
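The "develop utilities to better monitor the cluster" requirement above often means writing Nagios-style check plugins, which communicate state through standard exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL). A minimal sketch; the 75%/90% thresholds are illustrative:

```python
# Nagios-style capacity check: classify a DFS usage percentage into the
# standard plugin exit codes and a one-line status message.
import sys

OK, WARNING, CRITICAL = 0, 1, 2

def classify_usage(used_pct: float, warn: float = 75.0, crit: float = 90.0):
    """Return (exit_code, message) for a capacity percentage."""
    if used_pct >= crit:
        return CRITICAL, f"CRITICAL - DFS used {used_pct:.1f}%"
    if used_pct >= warn:
        return WARNING, f"WARNING - DFS used {used_pct:.1f}%"
    return OK, f"OK - DFS used {used_pct:.1f}%"

if __name__ == "__main__":
    code, message = classify_usage(float(sys.argv[1]))
    print(message)
    sys.exit(code)
```

Nagios (or a compatible scheduler) invokes such a script on an interval and alerts on the exit code.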
Posted 3 weeks ago
10.0 - 15.0 years
8 - 14 Lacs
Chennai
Work from Office
Years of Experience: 10-15 yrs
Shifts: 24x7 (rotational shift)
Mode: Onsite
Experience: 10+ yrs of experience in IT, with at least 7+ years of experience in cloud and system administration, and at least 5 years of experience with, and a strong understanding of, 'big data' technologies in the Hadoop ecosystem: Hive, HDFS, Map/Reduce, Flume, Pig, Cloudera, HBase, Sqoop, Spark, etc.
Job Overview: Smartavya Analytica Private Limited is seeking an experienced Hadoop Administrator to manage and support our Hadoop ecosystem. The ideal candidate will have strong expertise in Hadoop cluster administration, excellent troubleshooting skills, and a proven track record of maintaining and optimizing Hadoop environments.
Key Responsibilities:
- Install, configure, and manage Hadoop clusters, including HDFS, YARN, Hive, HBase, and other ecosystem components.
- Monitor and manage Hadoop cluster performance, capacity, and security.
- Perform routine maintenance tasks such as upgrades, patching, and backups.
- Implement and maintain data ingestion processes using tools like Sqoop, Flume, and Kafka.
- Ensure high availability and disaster recovery of Hadoop clusters.
- Collaborate with development teams to understand requirements and provide appropriate Hadoop solutions.
- Troubleshoot and resolve issues related to the Hadoop ecosystem.
- Maintain documentation of Hadoop environment configurations, processes, and procedures.
Requirement:
- Experience in installing, configuring and tuning Hadoop distributions; hands-on experience with Cloudera.
- Understanding of Hadoop design principles and the factors that affect distributed system performance, including hardware and network considerations.
- Provide infrastructure recommendations, capacity planning, and workload management.
- Develop utilities to better monitor the cluster (Ganglia, Nagios, etc.).
- Manage large clusters with huge volumes of data; perform cluster maintenance tasks such as creation and removal of nodes, cluster monitoring and troubleshooting.
- Manage and review Hadoop log files.
Posted 3 weeks ago
5 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance

Mandatory Skills: Hadoop.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
4 - 9 years
11 - 15 Lacs
Bengaluru
Work from Office
About PhonePe Group: PhonePe is India's leading digital payments company with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (Insurance, Mutual Funds, Stock Broking, and Lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, which is India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.
Culture: At PhonePe, we take extra care to make sure you give your best at work, every day! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country and executing on your dreams with purpose and speed, join us!
JOB DESCRIPTION
- Minimum of 1 year of experience in Linux/Unix administration.
- Minimum of 2 years of hands-on experience managing infrastructure on a public cloud, i.e. Azure/AWS/GCP.
- Over 4+ years of experience in Hadoop administration.
- Strong understanding of networking, open-source technologies, and tools.
- Familiar with best practices and IT operations for maintaining always-up, always-available services.
- Experience with and participation in on-call rotation. Excellent communication skills.
- Solid expertise in Linux networking, including IP, iptables, and IPsec.
- Proficient in scripting and coding with languages such as Perl, Golang, or Python.
- Strong knowledge of databases like MySQL, NoSQL, SQL Server.
- Hands-on experience with setting up, configuring and managing Nginx as a reverse proxy and load balancer in high-traffic environments.
- Hands-on experience with both private and public cloud environments.
- Strong troubleshooting skills and operational expertise in areas such as system capacity, bottlenecks, memory, CPU, OS, storage, and networking.
- Practical experience with the Hadoop stack, including HDFS, HBase, Hive, Pig, Airflow, YARN, Ranger, Kafka, and Druid.
- Good to have: experience designing, developing and maintaining Airflow DAGs and tasks to automate BAU processes, ensuring they are robust, scalable and efficient.
- Good to have: experience with ELK stack administration.
- Experience in administering Kerberos and LDAP.
- Familiarity with open-source configuration management and deployment tools like Puppet, Salt, or Ansible.
- Responsible for the implementation and ongoing administration of Hadoop infrastructure.
- Experience in capacity planning and performance tuning of Hadoop clusters.
- Collaborate effectively with infrastructure, network, database, application, and business intelligence teams to ensure high data quality and availability.
- Develop tools and services to enhance debuggability and supportability.
- Work closely with security teams to apply Hadoop updates, OS patches, and version upgrades as needed.
- Troubleshoot complex production issues, identify root causes, and provide mitigation strategies.
- Work closely with teams to optimize the overall performance of the PhonePe Hadoop ecosystem.
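The Nginx reverse-proxy skill above centers on distributing traffic across backends; Nginx's default upstream balancing method is round-robin. A minimal Python illustration of that rotation (backend addresses are placeholders, and this models only the scheduling, not the proxying itself):

```python
# Illustration of round-robin upstream selection, the default balancing
# behavior of an Nginx `upstream` block.
from itertools import cycle

class RoundRobinUpstream:
    def __init__(self, backends):
        # cycle() yields backends in order, wrapping around forever.
        self._cycle = cycle(backends)

    def next_backend(self) -> str:
        """Return the backend that should receive the next request."""
        return next(self._cycle)

upstream = RoundRobinUpstream(["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"])
```

In real deployments Nginx layers health checks and weights on top of this basic rotation.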
- Experience with setting up and managing a monitoring stack like OpenTSDB, Prometheus, ELK, Grafana, Loki.
PhonePe Full Time Employee Benefits (not applicable for intern or contract roles):
- Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy
Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe.
Posted 1 month ago
6 - 10 years
18 - 30 Lacs
Hyderabad
Hybrid
Position: Big Data or Kubernetes Admin
Location: Hyderabad (Hybrid Mode)
Fulltime with CASPEX; End Client: EXPERIAN
Note: For both profiles, good knowledge of Linux administration and cloud experience is necessary. Kubernetes administration is not always DevOps; a typical Linux or cloud engineer can learn Kubernetes administration in their day-to-day work, and that is who we are actually looking for, not someone who only knows DevOps tools without proper Linux and cloud experience.
Linux & AWS & Kubernetes Administrator
Must Have skills:
- Deep understanding of Linux, networking fundamentals and security
- Experience working with the AWS cloud platform and infrastructure services (EC2, S3, VPC, Subnet, ELB, Load Balancer, RDS, Route 53, etc.)
- Experience working with infrastructure as code using Terraform or Ansible
- Experience in building, deploying, and monitoring distributed apps using container systems (Docker) and container orchestration (Kubernetes, EKS)
- Kubernetes administration: cluster setup and management, cluster configuration and networking, upgrades, monitoring and logging, security and compliance, app deployment, etc.
- Experience in automation and CI/CD integration, capacity planning, pod scheduling, resource quotas, etc.
- Experience with OS-level upgrades and patching, including vulnerability remediation
- Ability to read and understand code (Java/Python/R/Scala)
Nice to have skills:
- Experience in SAS Viya administration
- Experience managing large Big Data clusters
- Experience in Big Data tools like Hue, Hive, Spark, Jupyter, SAS and R-Studio
- Professional coding experience in at least one programming language, preferably Python
- Knowledge of analytical libraries like Pandas, NumPy, SciPy, PyTorch, etc.
Big Data Administrator & Linux & AWS
Must Have skills:
- Deep understanding of Linux, networking and security fundamentals.
- Experience working with the AWS cloud platform and infrastructure.
- Experience working with infrastructure as code using Terraform or Ansible.
- Experience managing large Big Data clusters in production (at least one of Cloudera, Hortonworks, EMR).
- Excellent knowledge and solid work experience providing observability for Big Data platforms using tools like Prometheus, InfluxDB, Dynatrace, Grafana, Splunk, etc.
- Expert knowledge of the Hadoop Distributed File System (HDFS) and Hadoop YARN.
- Decent knowledge of various Hadoop file formats like ORC, Parquet, Avro, etc.
- Deep understanding of Hive (Tez), Hive LLAP, Presto and Spark compute engines.
- Ability to understand query plans and optimize performance for complex SQL queries on Hive and Spark.
- Experience supporting Spark with Python (PySpark) and R (sparklyr, SparkR) languages.
- Solid professional coding experience with at least one scripting language, e.g. Shell or Python.
- Experience working with data analysts and data scientists, and with at least one related analytical application like SAS, R-Studio, JupyterHub, H2O, etc.
- Able to read and understand code (Java, Python, R, Scala), with expertise in at least one scripting language like Python or Shell.
Nice to have skills:
- Experience with workflow management tools like Airflow, Oozie, etc.
- Knowledge of analytical libraries like Pandas, NumPy, SciPy, PyTorch, etc.
- Implementation history with Packer, Chef, Jenkins or other similar tooling.
- Prior working knowledge of Active Directory and Windows OS based VDI platforms like Citrix, AWS Workspaces, etc.
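The Kubernetes cluster-management and monitoring duties in this posting often reduce to inspecting node health programmatically. A hedged sketch that parses `kubectl get nodes -o json` output (the JSON shape follows the Kubernetes NodeList API; the subprocess call assumes `kubectl` is on PATH with a configured cluster):

```python
# Cluster-health helper: report which Kubernetes nodes are not Ready,
# based on the NodeList JSON that `kubectl get nodes -o json` emits.
import json
import subprocess

def unready_nodes(nodelist_json: str) -> list:
    """Return names of nodes whose Ready condition is not 'True'."""
    bad = []
    for node in json.loads(nodelist_json)["items"]:
        ready = next((c for c in node["status"]["conditions"]
                      if c["type"] == "Ready"), None)
        if ready is None or ready["status"] != "True":
            bad.append(node["metadata"]["name"])
    return bad

def cluster_unready_nodes() -> list:
    out = subprocess.run(["kubectl", "get", "nodes", "-o", "json"],
                         capture_output=True, text=True, check=True).stdout
    return unready_nodes(out)
```

A non-empty result from `cluster_unready_nodes()` is a natural trigger for paging or for cordoning off workloads.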
Posted 1 month ago
2 - 7 years
4 - 9 Lacs
Ahmedabad
Work from Office
Hadoop Administrator: Job Description: As an Open Source Hadoop Administrator, your role will involve managing and maintaining the Hadoop infrastructure based on open source technologies within an organization. You will be responsible for the installation, configuration, and administration of open source Hadoop clusters and related tools in a production environment. Your primary goal will be to ensure the smooth functioning of the Hadoop ecosystem and support the data processing and analytics needs of the organization. Responsibilities: Hadoop Cluster Management: Install, manually configure, and maintain open source Hadoop clusters and related components such as HDFS, YARN, MapReduce, Hive, Pig, Spark, HBase, etc. Monitor cluster health and performance, troubleshoot issues, and optimize cluster resources. Capacity Planning: Collaborate with data architects and infrastructure teams to estimate and plan for future capacity requirements of the open source Hadoop infrastructure. Scale the cluster up or down based on the changing needs of the organization. Security and Authentication: Implement and manage security measures for the open source Hadoop environment, including user authentication, authorization, and data encryption. Ensure compliance with security policies and best practices. Backup and Recovery: Design and implement backup and disaster recovery strategies for the open source Hadoop ecosystem. Regularly perform backups and test recovery procedures to ensure data integrity and availability. Performance Tuning: Monitor and analyze the performance of open source Hadoop clusters and individual components. Identify and resolve performance bottlenecks, optimize configurations, and fine-tune parameters to achieve optimal performance. Monitoring and Logging: Set up monitoring tools and alerts to proactively identify and address issues in the open source Hadoop environment. Monitor resource utilization, system logs, and cluster metrics to ensure reliability and performance.
Troubleshooting and Support: Respond to and resolve incidents and service requests related to the open source Hadoop infrastructure. Collaborate with developers, data scientists, and other stakeholders to troubleshoot and resolve issues in a timely manner. Documentation and Reporting: Maintain detailed documentation of open source Hadoop configurations, procedures, and troubleshooting guidelines. Generate regular reports on cluster performance, resource utilization, and capacity utilization.
Requirements: Proven experience as a Hadoop Administrator or similar role with open source Hadoop distributions such as Apache Hadoop, Apache HBase, Apache Hive, Apache Spark, etc. Strong knowledge of open source Hadoop ecosystem components and related technologies. Experience with installation, configuration, and administration of open source Hadoop clusters. Proficiency in Linux/Unix operating systems and shell scripting. Familiarity with cluster management and resource allocation frameworks. Understanding of data management and processing concepts in distributed computing environments. Knowledge of security frameworks and best practices in open source Hadoop environments. Experience with performance tuning, troubleshooting, and optimization of open source Hadoop clusters. Strong problem-solving and analytical skills.
Hadoop Developer: Job Responsibilities: A Hadoop developer is responsible for designing, developing, and maintaining Hadoop-based solutions for processing and analyzing large datasets. Their job description typically includes:
1. Data Ingestion: Collecting and importing data from various sources into the Hadoop ecosystem using tools like Apache Sqoop, Flume, or streaming APIs.
2. Data Transformation: Preprocessing and transforming raw data into a suitable format for analysis using technologies like Apache Hive, Apache Pig, or Spark.
3. Hadoop Ecosystem: Proficiency in working with components like HDFS (Hadoop Distributed File System), MapReduce, YARN, HBase, and others within the Hadoop ecosystem.
4. Programming: Strong coding skills in languages like Java, Python, or Scala for developing custom MapReduce or Spark applications.
5. Cluster Management: Setting up and maintaining Hadoop clusters, including tasks like configuring, monitoring, and troubleshooting.
6. Data Security: Implementing security measures to protect sensitive data within the Hadoop cluster.
7. Performance Tuning: Optimizing Hadoop jobs and queries for better performance and efficiency.
8. Data Analysis: Collaborating with data scientists and analysts to assist in data analysis, machine learning, and reporting.
9. Documentation: Maintaining clear documentation of Hadoop jobs, configurations, and processes.
10. Collaboration: Working closely with data engineers, administrators, and other stakeholders to ensure data pipelines and workflows are running smoothly.
11. Continuous Learning: Staying updated with the latest developments in the Hadoop ecosystem and big data technologies.
12. Problem Solving: Identifying and resolving issues related to data processing, performance, and scalability.
Requirements for this role typically include a strong background in software development, knowledge of big data technologies, and proficiency in using Hadoop-related tools and languages. Additionally, good communication skills and the ability to work in a team are important for successful collaboration on data projects.
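The MapReduce model that the programming requirement refers to can be sketched in plain Python without a cluster. This is a toy illustration of the map, shuffle, and reduce phases only; the function names are illustrative and not part of any Hadoop API:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line: str):
    # Emit (word, 1) pairs for each word, as a Hadoop Mapper would.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Group values by key, mimicking the framework's shuffle/sort step.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts per word, as a Hadoop Reducer would.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["hadoop stores data", "spark and hadoop process data"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts)  # 'hadoop' and 'data' each appear twice
```

In real Hadoop or Spark jobs the same map and reduce logic is distributed across nodes and the shuffle is handled by the framework; the per-record logic stays this simple.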
Posted 1 month ago
7 - 12 years
9 - 14 Lacs
Ahmedabad
Work from Office
Project Role : Business Process Architect Project Role Description : Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. Work closely with the Application Architect to create the process blueprint and establish business process requirements to drive out application requirements and metrics. Assist in quality management reviews, ensure all business and design requirements are met. Educate stakeholders to ensure a complete understanding of the designs. Must have skills : Data Analytics, Data Warehouse ETL Testing, Big Data Analysis Tool and Techniques, Hadoop Administration Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : Undergraduate qualification in engineering or computer science Summary : Experienced Data Engineer with a strong background in Azure data services and broadcast supply chain ecosystems. Skilled in OTT streaming protocols, cloud technologies, and project management. Roles & Responsibilities: Proven experience as a Data Engineer or in a similar role. Provide expert guidance and support to the Principal - Solutions & Integration. Track and report on project progress using internal applications. Transition customer requirements to on-air operations with proper documentation. Scope projects and ensure adherence to budgets and timelines. Generate design and integration documentation. Professional & Technical Skills: Strong proficiency in Azure data services (Azure Data Factory, Azure Databricks, Azure SQL Database). Experience with SQL, Python, and big data tools (Hadoop, Spark, Kafka). Familiarity with data warehousing, ETL techniques, and microservices in a cloud environment. Knowledge of broadcast supply chain ecosystems (BMS, RMS, MAM, Playout, MCR/PCR, NLE, Traffic). Experience with OTT streaming protocols, DRM, and content delivery networks.
Working knowledge of cloud technologies (Azure, Docker, Kubernetes, AWS basics, GCP basics). Basic understanding of AWS Media Services (MediaConnect, Elemental, MediaLive, MediaStore, Media2Cloud, S3, Glacier). Additional Information: Minimum of 5 years' experience in Data Analytics disciplines. Good presentation and documentation skills. Excellent interpersonal skills. Undergraduate qualifications in engineering or computer science. Networking: Apply basic networking knowledge including TCP/IP, UDP/IP, IGMP, DHCP, DNS, and LAN/WAN technologies to support video delivery systems. Highly Desirable: Experience in defining technical solutions with over 99.999% reliability. Qualifications: Undergraduate qualification in engineering or computer science.
Posted 1 month ago
6 - 10 years
5 - 9 Lacs
Bengaluru
Work from Office
Exp: 6+ years. Location: initially Bengaluru, but willing to relocate to Poland after 4 months. We need to fulfil a few positions for Hadoop Admin. This particular position is for Poland; however, since we are not getting any good profiles in Poland, the customer has agreed to recruit some good candidates in Bengaluru who will travel to Poland after working for 3-4 months from Bengaluru. Please find the detailed JD below: Hadoop administration. Automation (Ansible, shell scripting or Python scripting). DevOps skills (should be able to code in at least one language, preferably Python). The role involves performing Big Data administration and engineering activities on multiple open-source platforms such as Hadoop, Kafka, HBase, and Spark. The successful candidate will possess strong troubleshooting and debugging skills. The role involves planning and performing capacity expansions and upgrades in a timely manner to avoid scaling issues and bugs. This includes automating repetitive tasks to reduce manual effort and prevent human errors. The successful candidate will tune alerting and set up observability to proactively identify issues and performance problems. They will also work closely with Level-3 teams in reviewing new use cases and cluster hardening techniques to build robust and reliable platforms. The role involves creating standard operating procedure documents and guidelines on effectively managing and utilizing the platforms. The person will leverage DevOps tools, disciplines (incident, problem, and change management), and standards in day-to-day operations. The individual will ensure that the Hadoop platform can effectively meet performance and service level agreement requirements. They will also perform security remediation, automation, and self-healing as required. The individual will concentrate on developing automations and reports to minimize manual effort.
This can be achieved through automation tools such as shell scripting, Ansible, or Python scripting, or by using any other programming language.
Hadoop Administration
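As a concrete, deliberately small example of the kind of automation described, here is a Python pre-flight check one might run before OS patching or capacity expansion. The 80% threshold is an arbitrary example value, not a stated requirement of the role:

```python
import shutil

THRESHOLD_PCT = 80  # example threshold; tune per environment

def disk_usage_pct(path: str = "/") -> float:
    """Return the percentage of disk space used on the given mount point."""
    usage = shutil.disk_usage(path)
    return 100 * usage.used / usage.total

def preflight_check(path: str = "/") -> bool:
    """Fail the check if the filesystem is too full to patch safely."""
    pct = disk_usage_pct(path)
    ok = pct < THRESHOLD_PCT
    print(f"{path}: {pct:.1f}% used -> {'OK' if ok else 'BLOCKED'}")
    return ok

if __name__ == "__main__":
    preflight_check("/")
```

The same pattern (measure, compare to a threshold, report) extends naturally to memory, inode, or service-health checks, and the check can be invoked from Ansible or a cron job rather than by hand.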
Posted 2 months ago
1 - 5 years
3 - 7 Lacs
Allahabad, Noida
Work from Office
Feather Thread Corporation is looking for a Big Data Administrator to join our dynamic team and embark on a rewarding career journey.
Office Management: Oversee general office operations, including maintenance of office supplies, equipment, and facilities. Manage incoming and outgoing correspondence, including mail, email, and phone calls. Coordinate meetings, appointments, and travel arrangements for staff members as needed.
Administrative Support: Provide administrative support to management and staff, including scheduling meetings, preparing documents, and organizing files. Assist with the preparation of reports, presentations, and other materials for internal and external stakeholders. Maintain accurate records and databases, ensuring data integrity and confidentiality.
Communication and Coordination: Serve as a point of contact for internal and external stakeholders, including clients, vendors, and partners. Facilitate communication between departments and team members, ensuring timely and effective information flow. Coordinate logistics for company events, meetings, and conferences.
Documentation and Compliance: Assist with the development and implementation of company policies, procedures, and guidelines. Maintain compliance with regulatory requirements and industry standards. Ensure proper documentation and record-keeping practices are followed.
Project Support: Provide support to project teams by assisting with project coordination, documentation, and tracking of tasks and deadlines. Collaborate with team members to ensure project deliverables are met on time and within budget.
Posted 2 months ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
1. Instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage and work processes. Investigate problem areas following the software development life cycle. Facilitate root cause analysis of system issues and problem statements. Identify ideas to improve system performance and availability. Analyze client requirements and convert them into feasible designs. Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements. Confer with project managers to obtain information on software capabilities.
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Hadoop Admin. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Technology->Big Data - Hadoop->Hadoop Administration. Preferred Skills: Technology->Big Data - Hadoop->Hadoop Administration. Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management. Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. * Location of posting is subject to business requirements
Posted 2 months ago
10 - 20 years
13 - 23 Lacs
Hyderabad
Hybrid
Urgent Hiring - Senior Hadoop Administrator - 10+ years' experience
Preferred candidate profile
Position: Hadoop Administrator. Required Experience: 10+ years. Location: Hyderabad / Pune.
Role & responsibilities: Professional Hadoop administration experience with the CDP distribution, including at least 2 years with Dataflow (for Kafka and NiFi profiles). Knowledge of Linux administration. Dataflow knowledge with respect to streaming tools like Kafka and NiFi. Understanding of Schema Registry. Good experience in administration of a Big Data platform and the allied toolset; Big Data platform software from Hortonworks, Cloudera. Experience working on secured environments using a variety of technologies like Kerberos, Knox, Ranger, KMS, encryption zones and server SSL certificates. Prior experience of Linux system administration. Good knowledge of Hive as a service, HBase, Kafka, Spark. Significant experience with Linux shell scripting. Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.). Working knowledge of Hortonworks DataFlow (HDF) architecture, setup and ongoing administration. Prometheus / Grafana, etc.
If you are interested in exploring this opportunity further, you can apply by sending your updated resume to shalini.purohit@wipro.com Thanks TA
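Since the role lists Prometheus and Grafana, a minimal Python sketch of reading the Prometheus text exposition format may be useful context. The metric names below are made up for illustration, and this simplified parser ignores the labels and timestamps that real exporters emit:

```python
def parse_prometheus_text(text: str) -> dict:
    """Parse a minimal Prometheus text exposition into {metric: value}."""
    metrics = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and HELP/TYPE comment lines
        # The value is the last space-separated token on the line.
        name, _, value = line.rpartition(" ")
        metrics[name] = float(value)
    return metrics

# Hypothetical sample in the Prometheus text format.
SAMPLE = """\
# HELP hdfs_capacity_used_bytes Bytes used on DFS
# TYPE hdfs_capacity_used_bytes gauge
hdfs_capacity_used_bytes 32212254720
yarn_apps_running 17
"""

print(parse_prometheus_text(SAMPLE))
```

In production one would scrape such text from an exporter's `/metrics` endpoint and let Prometheus itself do the collection; a hand-rolled parser like this is only for quick ad-hoc checks.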
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Pune, Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Python (Programming Language), Hadoop Administration Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : MS or equivalent in Computer Science, Information Systems, Engineering, Physics, Maths or other Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Python and Hadoop Administration. Your typical day will involve working with Python, developing and configuring applications, and ensuring their smooth functioning. Roles & Responsibilities: Design, build, and configure applications to meet business process and application requirements using Python and Hadoop Administration. Collaborate with cross-functional teams to identify and prioritize application requirements, ensuring their timely delivery. Develop and maintain technical documentation, including design specifications, test plans, and user manuals. Ensure the smooth functioning of applications by identifying and resolving technical issues, performing regular maintenance, and implementing upgrades and patches. Professional & Technical Skills: Must Have Skills: Proficiency in Python (Programming Language) and Hadoop Administration. Strong understanding of software engineering principles and best practices. Experience with application development frameworks such as Django or Flask. Experience with database technologies such as MySQL, PostgreSQL, or MongoDB. Experience with version control systems such as Git or SVN. Additional Information: The candidate should have a minimum of 5 years of experience in Python (Programming Language).
The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions. This position is based at our Chennai office. Qualifications: MS or equivalent in Computer Science, Information Systems, Engineering, Physics, Maths or other
Posted 3 months ago
8 - 13 years
10 - 17 Lacs
Delhi NCR, Bengaluru, Hyderabad
Hybrid
MUST HAVE - Minimum 8+ years of relevant experience. Big Data admin resource with strong experience in Terraform. Big Data administration with cloud expertise. SRE experience is an added advantage. Data Platform Management and Optimization - designs, builds and manages data storage and workflows/compute in cloud environments, ensuring that data is secure, accessible, and processed efficiently.
Posted 3 months ago