Experience: 5 - 10 years
Salary: 7 - 12 Lacs
Posted: 6 days ago
Work from Office | Full Time
Job Summary

Synechron is seeking a dedicated and technically skilled Hadoop Shell Scripting Engineer to manage and optimize our Hadoop ecosystem. The role involves developing automation utilities, troubleshooting complex issues, and collaborating with vendors on platform enhancements. Your expertise will directly support enterprise data processing, performance tuning, and cloud migration initiatives, ensuring a reliable and efficient data infrastructure aligned with organizational goals.

Software Requirements

Required Skills:
- Strong proficiency in UNIX shell scripting, with hands-on experience developing automation utilities
- In-depth understanding of Hadoop architecture and ecosystem components (HDFS, Hive, Spark)
- Experience with SQL querying and database systems
- Familiarity with Git and enterprise version control practices
- Working knowledge of DevOps and CI/CD tools and processes

Preferred Skills:
- Experience with Python scripting for automation and utility development
- Knowledge of the Java programming language
- Familiarity with cloud platforms (AWS, Azure, GCP) as they relate to the Hadoop ecosystem
- Exposure to Hadoop vendor support and collaboration processes (e.g., Cloudera)

Overall Responsibilities
- Develop, maintain, and enhance scripts and utilities that automate Hadoop cluster management and data processing tasks
- Serve as the Level 3 point of contact for Hadoop and Spark platform issues
- Perform performance tuning and capacity planning to support enterprise data workloads
- Conduct proof-of-concept tests of emerging technologies and evaluate their suitability for cloud migration projects
- Collaborate with vendor support teams and internal stakeholders on issue resolution, feature requests, and platform improvements
- Review and validate all changes going into production to ensure stability and performance
- Continuously analyze process inefficiencies and develop new automation utilities to enhance productivity
- Assist in capacity management and performance monitoring activities

Technical Skills (By Category)

Programming Languages:
- Essential: UNIX shell scripting
- Preferred: Python, Java

Databases & Data Management:
- Essential: Knowledge of SQL querying and database systems
- Preferred: Experience with Hive, HDFS

Cloud Technologies:
- Preferred: Basic familiarity with cloud platforms for Hadoop ecosystem support and migration

Frameworks & Libraries:
- Not specifically applicable; the focus is on scripting and platform tools

Development Tools & Methodologies:
- Essential: Git, version control, DevOps practices, CI/CD pipelines
- Preferred: Automation frameworks, monitoring tools

Security Protocols:
- Not explicitly specified, but familiarity with secure scripting and data access controls is advantageous

Experience Requirements
- Minimum of 5 years of hands-on experience working with Hadoop clusters and scripting in UNIX shell
- Proven experience managing enterprise Hadoop/Spark environments
- Experience in performance tuning, capacity planning, and utility development
- Exposure to cloud migrations or proof-of-concept evaluations is a plus
- Background in data engineering or platform support roles preferred

Day-to-Day Activities
- Develop and enhance UNIX shell scripts for Hadoop automation and utility management
- Troubleshoot and resolve complex platform issues as the Level 3 point of contact
- Work with application teams to optimize queries and data workflows
- Engage with vendor support teams on platform issues and feature requests
- Perform system performance reviews, capacity assessments, and tuning activities
- Lead initiatives for process automation, efficiency improvement, and new technology evaluations
- Document procedures, scripts, and platform configurations
- Participate in team meetings, provide technical feedback, and collaborate across teams on platform health

Qualifications

Educational Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Equivalent professional experience in data engineering, platform support, or Hadoop administration

Certifications (Preferred):
- Certifications in the Hadoop ecosystem, Linux scripting, or cloud platforms

Training & Professional Development:
- Ongoing learning related to big data platforms, automation, and cloud migration

Professional Competencies
- Strong analytical and troubleshooting skills
- Excellent written and verbal communication skills
- Proven ability to work independently with minimal supervision
- Collaborative team player with a positive attitude
- Ability to prioritize tasks effectively and resolve issues swiftly
- Adaptability to evolving technologies and environments
- Focus on quality, security, and process improvement
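To give candidates a concrete sense of the "automation utilities" this role involves, below is a minimal illustrative sketch of a capacity-check helper. It is not Synechron's actual tooling: the threshold, function name, and report format are assumptions, and the sample text merely stands in for the output of `hdfs dfsadmin -report` that a real script on a cluster would consume.

```shell
#!/usr/bin/env sh
# Illustrative sketch of a Hadoop capacity-check utility (assumed design,
# not an actual Synechron script). A real version would capture the live
# output of `hdfs dfsadmin -report` instead of a hard-coded sample.

# check_dfs_usage REPORT_TEXT [THRESHOLD]
# Extracts the "DFS Used%" value from dfsadmin-style report text and
# warns when it meets or exceeds the threshold (default 80%).
check_dfs_usage() {
    report="$1"
    threshold="${2:-80}"
    # Third whitespace-separated field of the "DFS Used%" line, e.g. "84.27%";
    # strip the percent sign and truncate to an integer.
    used_pct=$(printf '%s\n' "$report" \
        | awk '/DFS Used%/ {gsub(/%/, "", $3); print int($3); exit}')
    if [ "$used_pct" -ge "$threshold" ]; then
        echo "WARN: DFS usage at ${used_pct}% (threshold ${threshold}%)"
        return 1
    fi
    echo "OK: DFS usage at ${used_pct}%"
    return 0
}

# Sample text standing in for real `hdfs dfsadmin -report` output.
sample_report="Configured Capacity: 1099511627776 (1 TB)
DFS Used%: 84.27%
DFS Remaining%: 15.73%"

check_dfs_usage "$sample_report" 90
check_dfs_usage "$sample_report" 80 || echo "alerting hook would fire here"
```

In day-to-day use a helper like this would typically be wired into cron or a monitoring pipeline, with the non-zero return status driving the alert.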