Posted: 2 months ago
Work from Office
Full Time
Position Overview:
Cloud Architect with expertise in Hadoop and the Google Cloud Platform (GCP) data stack, along with experience in Big Data architecture and migration. The ideal candidate should have strong proficiency in GCP Big Data tools, including Hadoop, Hive, HDFS, Impala, Spark, MapReduce, MS SQL, Kafka, and Redis. Familiarity with Cloudera, HBase, MongoDB, MariaDB, and Event Hub is a plus.

Key Responsibilities:
- Design, implement, and optimize Big Data architecture on GCP and Hadoop ecosystems.
- Lead data migration projects from on-premises platforms to the cloud (GCP).
- Develop and maintain ETL pipelines using tools such as Spark, Hive, and Kafka.
- Manage Hadoop clusters, HDFS, and related components.
- Work with data streaming technologies such as Kafka and Event Hub for real-time data processing.
- Optimize SQL and NoSQL databases (MS SQL, Redis, MongoDB, MariaDB, HBase) for high availability and scalability.
- Collaborate with data scientists, analysts, and DevOps teams to integrate Big Data solutions.
- Ensure data security, governance, and compliance in cloud and on-premises environments.

Required Skills & Experience:
- 5-10 years of experience as a Cloud Architect
- Strong expertise in Hadoop (HDFS, Hive, Impala, Spark, MapReduce)
- Hands-on experience with GCP Big Data services
- Proficiency in MS SQL, Kafka, and Redis for data processing and analytics
- Experience with Cloudera, HBase, MongoDB, and MariaDB
- Knowledge of real-time data streaming and event-driven architectures
- Understanding of Big Data security and performance optimization
- Ability to design and execute data migration strategies

Location: Koregaon Park, Pune, Maharashtra (India)
Shift Timings: USA time zone (06:30 PM IST to 03:30 AM IST)
Fluid.Live
Salary: 0.5 - 1.25 Lacs P.A.