Posted: 2 weeks ago
Work from Office
Full Time
Job description
Minimum of 5 years of experience designing and building production data pipelines, from ingestion to consumption, within a hybrid big data architecture using Hadoop, Hive, HDFS, HBase, Spark, Kafka, NiFi, and Airflow.
Expertise in at least one programming language: Scala, Java, or Python.
Experience with data lake and data warehouse ETL build and design, and data migration from legacy systems such as Hadoop, Exadata, Oracle, Teradata, or Netezza.
Demonstrable experience using Google Cloud BigQuery is mandatory.
Good understanding of batch and streaming data ingestion.
Expertise in building modern data pipelines and operations with an ELT philosophy.
Experience with Agile methodologies and DevOps/CI-CD principles.
Strong knowledge of data technologies and data modeling.
As lead architect, work with implementation teams from concept to operations, providing deep technical subject matter expertise for successfully deploying large-scale data solutions in the enterprise using modern data and analytics technologies on-premises.
Lead big data solutioning and scoping to generate estimates and approaches for proposals and SOWs for customers.
Create detailed target-state technical architecture and design blueprints that incorporate modern data technologies and demonstrate the modernization value proposition.
Fusion Plus Solutions
Information Technology & Services
50-100 Employees
104 Jobs