
Spark, Kafka, Airflow Professional

5 - 10 years

2 - 6 Lacs

Posted: 2 weeks ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

  • Minimum of 5 years of experience designing and building production data pipelines, from ingestion to consumption, within a hybrid big data architecture using Hadoop, Hive, HDFS, HBase, Spark, Kafka, NiFi, and Airflow.
  • Expertise in one of the following programming languages: Scala, Java, or Python.
  • Experience with data lake and data warehouse ETL build and design, and with data migration from legacy systems such as Hadoop, Exadata, Oracle, Teradata, or Netezza.
  • Demonstrable experience using Google Cloud BigQuery is mandatory.
  • Good understanding of batch and streaming data ingestion.
  • Expertise in building modern data pipelines and operations with an ELT philosophy.
  • Experience with Agile methodologies and DevOps/CI-CD principles.
  • Strong knowledge of data technologies and data modeling.
  • As a lead architect, work with implementation teams from concept to operations, providing deep technical subject-matter expertise for successfully deploying large-scale data solutions in the enterprise using modern on-premises data/analytics technologies.
  • Lead big data solutioning and scoping to generate estimates and approaches for customer proposals and SOWs.
  • Create detailed target-state technical architecture and design blueprints incorporating modern data technologies and demonstrating the modernization value proposition.
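The "ELT philosophy" requirement above means raw data is loaded into the warehouse first and transformed there, rather than transformed in flight. A minimal, illustrative sketch in Python, using sqlite3 as a stand-in for a warehouse such as BigQuery (the table names `raw_events` and `user_totals` are hypothetical):

```python
import sqlite3

def extract() -> list[tuple[str, int]]:
    # In production this would read from Kafka or NiFi; inline rows here.
    return [("a", 10), ("b", 5), ("a", 7)]

def load(conn: sqlite3.Connection, rows: list[tuple[str, int]]) -> None:
    # ELT: land the records untransformed in a staging table first.
    conn.execute("CREATE TABLE raw_events (user TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)

def transform(conn: sqlite3.Connection) -> None:
    # The transformation runs inside the warehouse, after loading.
    conn.execute(
        "CREATE TABLE user_totals AS "
        "SELECT user, SUM(amount) AS total FROM raw_events GROUP BY user"
    )

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
print(dict(conn.execute("SELECT user, total FROM user_totals ORDER BY user")))
# {'a': 17, 'b': 5}
```

In an Airflow deployment, `extract`, `load`, and `transform` would typically become separate DAG tasks, with the heavy aggregation pushed down to the warehouse engine.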

Fusion Plus Solutions

Information Technology & Services

Innovation City

50-100 Employees

104 Jobs

Key People

  • Alice Johnson, CEO
  • Bob Smith, CTO
