Posted: 2 weeks ago
Hybrid
Full Time
Position: Big Data Administrator or Kubernetes Administrator
Location: Hyderabad (Hybrid)
Employment: Full time with CASPEX
End Client: EXPERIAN

Note: For both profiles, strong Linux administration knowledge and cloud experience are essential. Kubernetes administration is not necessarily DevOps work: a regular Linux or cloud engineer can pick up Kubernetes administration in their day-to-day work, and that is the kind of candidate we are looking for, not someone who knows only DevOps tooling without solid Linux and cloud experience.

Linux & AWS & Kubernetes Administrator

Must Have skills:
- Deep understanding of Linux, networking fundamentals and security
- Experience working with the AWS cloud platform and infrastructure services (EC2, S3, VPC, Subnet, ELB/Load Balancer, RDS, Route 53, etc.)
- Experience with infrastructure as code using Terraform or Ansible
- Experience building, deploying and monitoring distributed applications using containers (Docker) and container orchestration (Kubernetes, EKS)
- Kubernetes administration: cluster setup and management, cluster configuration and networking, upgrades, monitoring and logging, security and compliance, application deployment, etc.
- Experience with automation and CI/CD integration, capacity planning, pod scheduling, resource quotas, etc.
- Experience with OS-level upgrades and patching, including vulnerability remediation
- Ability to read and understand code (Java / Python / R / Scala)

Nice to have skills:
- Experience in SAS Viya administration
- Experience managing large Big Data clusters
- Experience with Big Data tools such as Hue, Hive, Spark, Jupyter, SAS and R-Studio
- Professional coding experience in at least one programming language, preferably Python
- Knowledge of analytical libraries such as Pandas, NumPy, SciPy, PyTorch, etc.

Big Data Administrator & Linux & AWS

Must Have skills:
- Deep understanding of Linux, networking and security fundamentals
- Experience working with the AWS cloud platform and infrastructure
- Experience with infrastructure as code using Terraform or Ansible
- Experience managing large Big Data clusters in production (at least one of Cloudera, Hortonworks, EMR)
- Excellent knowledge and solid hands-on experience providing observability for Big Data platforms using tools such as Prometheus, InfluxDB, Dynatrace, Grafana, Splunk, etc.
- Expert knowledge of the Hadoop Distributed File System (HDFS) and Hadoop YARN
- Working knowledge of Hadoop file formats such as ORC, Parquet, Avro, etc.
- Deep understanding of the Hive (Tez), Hive LLAP, Presto and Spark compute engines
- Ability to understand query plans and optimize the performance of complex SQL queries on Hive and Spark (see the illustrative sketch after this listing)
- Experience supporting Spark with Python (PySpark) and R (sparklyr, SparkR)
- Solid professional coding experience with at least one scripting language (Shell, Python, etc.)
- Experience working with Data Analysts and Data Scientists, and with at least one related analytical application such as SAS, R-Studio, JupyterHub, H2O, etc.
- Ability to read and understand code (Java, Python, R, Scala), with expertise in at least one scripting language such as Python or Shell

Nice to have skills:
- Experience with workflow management tools such as Airflow, Oozie, etc.
- Knowledge of analytical libraries such as Pandas, NumPy, SciPy, PyTorch, etc.
- Implementation history with Packer, Chef, Jenkins or similar tooling
- Prior working knowledge of Active Directory and Windows-based VDI platforms such as Citrix, AWS Workspaces, etc.
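As an illustration of the Hive/Spark query-plan work mentioned above, here is a minimal PySpark sketch, assuming a local SparkSession; the path "/data/events.parquet" and the "event_type" column are hypothetical placeholders, not part of the posting, and a production cluster on YARN or EKS would use different master and resource settings.

    from pyspark.sql import SparkSession

    # Minimal sketch: a local SparkSession; on a real cluster (YARN, EKS) the
    # master URL and resource settings would differ.
    spark = SparkSession.builder.appName("query-plan-check").getOrCreate()

    # Hypothetical Parquet path; substitute an actual HDFS or S3 location.
    events = spark.read.parquet("/data/events.parquet")

    # A simple aggregation; explain() prints the physical plan that an
    # administrator reads when tuning Hive/Spark query performance.
    events.groupBy("event_type").count().explain(mode="formatted")

    spark.stop()

Submitting the same script with spark-submit against the cluster, rather than running it locally, is closer to the day-to-day support work described in this profile.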