Senior Consultant (Big Data / Hadoop Components & Platform)

Experience

0 - 4 years

Salary

0 Lacs

Posted: 1 day ago | Platform: Shine

Work Mode

On-site

Job Type

Full Time

Job Description

Hortonworks is seeking an Associate Big Data Developer to join its professional services team in APAC. In this role, you'll develop massively scalable solutions to complex data problems using Hadoop, Spark, and related Big Data technologies. This client-facing role combines consulting skills with deep technical design and development in the Big Data space, and it is a great fit for a junior software developer looking to gain experience in the Big Data stack: you will receive mentoring and training, and work with large customers across Asia Pacific.

Responsibilities:

- Work closely with Hortonworks Architects to design and build Big Data solutions on Hadoop
- Work directly with customers to implement Big Data solutions at scale using the Hortonworks Data Platform and Hortonworks DataFlow
- Help design and implement Hadoop platform architectures and configurations for customers
- Work closely with Hortonworks teams at all levels to help ensure the success of project consulting engagements with customers
- Write and produce technical documentation and knowledge base articles
- Attend speaking engagements when needed
- Keep current with Hadoop and Big Data ecosystem technologies
- Travel up to 75%

Qualifications:

- Aptitude for learning new skills and technologies quickly
- Demonstrable programming experience (in any language)
- Experience in Linux / Unix environments
- Strong analytical and problem-solving abilities
- Theoretical knowledge of big data / analytics concepts
- Proven interest in Big Data development, Data Engineering, or Data Science
- Strong communication, consulting, and presentation skills

Nice to have (but not required):

- Bachelor's degree in Software Engineering, IT, Computer Science, or a related discipline (Honors/Masters/PhD a plus)
- Hands-on development experience with Apache technologies such as Hadoop, Spark, HBase, Hive, Pig, Storm, Solr, Phoenix, Accumulo, Falcon, Sqoop, Flume, Kafka, and Oozie
- Experience with open-source software
- Experience setting up multi-node Hadoop clusters
- Strong SQL skills
- Java, C++, Python, and/or R development experience
- Experience with NoSQL data technologies
- Experience in any cloud environment (AWS, Azure, Google Cloud, SoftLayer, OpenStack)
- Understanding of enterprise security solutions such as LDAP and/or Kerberos
- Understanding of networking concepts
- Understanding of Site Reliability Engineering concepts and practices
- Experience with distributed systems, machine learning, or artificial intelligence algorithms
- Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, and data engineering
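To give a concrete sense of the Spark development work described above, here is a minimal PySpark sketch of a typical aggregation job a consultant might build on the Hortonworks Data Platform. The HDFS paths, column names, and input schema are hypothetical assumptions for illustration, not details from this posting.

```python
# Minimal, hypothetical PySpark sketch: read raw events from HDFS,
# aggregate them, and write results back for downstream Hive queries.
# Paths, columns, and schema below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("event-counts")  # name shown in the YARN / Spark UI
    .getOrCreate()
)

# Assumed input: CSV event logs with columns (event_time, user_id, event_type)
events = (
    spark.read
    .option("header", "true")
    .csv("hdfs:///data/raw/events/")  # hypothetical HDFS input path
)

# Count events per type per day: a typical first aggregation on a new cluster
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_time"))
    .groupBy("event_date", "event_type")
    .count()
)

# Write partitioned Parquet back to HDFS
(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///data/curated/event_counts/")  # hypothetical output path
)

spark.stop()
```

On an HDP cluster, a job like this would typically be packaged and launched with spark-submit under YARN rather than run interactively.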
