Key Responsibilities:
- 5 years of in-depth, hands-on experience developing, testing, deploying, and debugging Spark jobs written in Scala on the Hadoop platform
- In-depth knowledge of Spark Core, including working with RDDs and Spark SQL (see the first sketch after this list)
- In-depth knowledge of Spark optimization techniques and best practices
- Good knowledge of Scala functional programming: Try, Option, Future, and the collections library (see the second sketch after this list)
- Good knowledge of Scala OOP: classes, traits, objects (singleton and companion), and case classes (see the third sketch after this list)
- Good understanding of Scala language features: the type system, implicits, and givens (also shown in the third sketch)
- Hands-on experience working in a Hadoop environment: HDFS, Hive, AWS S3, and EMR
- Working experience with workflow orchestration tools such as Airflow and Oozie
- Experience making API calls from Scala
- Understanding of and exposure to file formats such as Apache Avro, Parquet, and JSON
- Good to have: knowledge of Protocol Buffers and geospatial data analytics
- Experience writing test cases using frameworks such as ScalaTest (see the final sketch after this list)
- In-depth knowledge of build tools such as Gradle and sbt
- Experience using Git: resolving conflicts and working with branches
- Good to have: Python programming skills
- Strong programming skills grounded in data structures and algorithms
- Excellent analytical skills
- Good communication skills
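
For illustration, a minimal sketch of the RDD and Spark SQL skills listed above, assuming Spark is on the classpath; the object name, sample data, and view name are hypothetical, and the local master is set only so the demo runs standalone:

```scala
import org.apache.spark.sql.SparkSession

object SparkSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("rdd-and-sql-sketch")
      .master("local[*]") // demo only; on a cluster this comes from spark-submit
      .getOrCreate()
    import spark.implicits._

    // RDD API: a word count over a small in-memory collection
    val counts = spark.sparkContext
      .parallelize(Seq("spark scala", "spark sql"))
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    // Spark SQL: the same data exposed as a temp view and queried
    counts.toDF("word", "n").createOrReplaceTempView("word_counts")
    spark.sql("SELECT word, n FROM word_counts ORDER BY n DESC").show()

    spark.stop()
  }
}
```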
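
Likewise, a small sketch of the functional constructs named above (Try, Option, Future, and the collections library); parsePort and validPort are hypothetical helpers, and Await appears only to keep the demo synchronous:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.util.Try

object FpSketch {
  // Try captures the failing parse as a value instead of throwing
  def parsePort(raw: String): Try[Int] = Try(raw.trim.toInt)

  // Option models absence; toOption bridges from Try
  def validPort(raw: String): Option[Int] =
    parsePort(raw).toOption.filter(p => p > 0 && p < 65536)

  def main(args: Array[String]): Unit = {
    // Collections: flatMap drops the entries that fail to validate
    val ports = List("8080", "abc", "9090").flatMap(validPort)

    // Future runs the aggregation asynchronously
    val total: Future[Int] = Future(ports.sum)
    println(Await.result(total, 5.seconds)) // prints 17170
  }
}
```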
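
A sketch of the OOP and type-system features above, in Scala 3 syntax (a Scala 2 codebase would write implicit val where this uses given); the Show type class and Event model are hypothetical:

```scala
// Trait as an interface (here, a small type class)
trait Show[A]:
  def show(a: A): String

// Case class: immutable data with equality and pattern matching for free
final case class Event(id: Long, name: String)

// Companion object: factory methods plus a given instance in implicit scope
object Event:
  def fromName(name: String): Option[Event] =
    Option.when(name.nonEmpty)(Event(id = name.hashCode.toLong, name = name))

  given Show[Event] with
    def show(e: Event): String = s"Event(${e.id}, ${e.name})"

// Singleton object as the entry point; summon resolves the given
object OopSketch:
  def main(args: Array[String]): Unit =
    Event.fromName("ingest-complete").foreach { e =>
      println(summon[Show[Event]].show(e))
    }
```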
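
Finally, a minimal ScalaTest suite in the AnyFunSuite style; the hypothetical validPort helper from the earlier sketch is redefined locally so the suite is self-contained:

```scala
import org.scalatest.funsuite.AnyFunSuite
import scala.util.Try

class PortSpec extends AnyFunSuite {
  // Hypothetical function under test, copied from the FP sketch above
  def validPort(raw: String): Option[Int] =
    Try(raw.trim.toInt).toOption.filter(p => p > 0 && p < 65536)

  test("a numeric string in range parses to a port") {
    assert(validPort("8080").contains(8080))
  }

  test("a non-numeric string yields None") {
    assert(validPort("abc").isEmpty)
  }

  test("out-of-range values are rejected") {
    assert(validPort("70000").isEmpty)
  }
}
```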
Technical Requirements:
- Primary skills: Technology->Big Data - Data Processing->Spark, Technology->Cloud Platform->Amazon Webservices DevOps, Technology->Finacle - Core->Ops Multi Entity Advance, Technology->Functional Programming->Scala
Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
Preferred Skills:
- Technology->Functional Programming->Scala
- Technology->Big Data - Data Processing->Spark
- Technology->Cloud Platform->Amazon Webservices DevOps->AWS DevOps
- Technology->Cloud Security->AWS - Container Security->Amazon Elastic Kubernetes Service (EKS)