Data Architect GCP

10 - 15 years

0 Lacs

Posted: 6 days ago | Platform: GlassDoor

Work Mode: On-site

Job Type: Part Time

Job Description

Overview:
As a Data Engineering Architect, you will design and implement robust data solutions that leverage both GCP and Hadoop technologies. You will create scalable data pipelines, optimize data storage, ensure data quality and accessibility, lead data migration projects, and provide pre-sales support to articulate our data solutions to potential clients. Your expertise will be crucial in transforming complex data sets into actionable insights, and your knowledge of the business domain will be instrumental in translating business requirements into effective data strategies.
Responsibilities:
  • Design and architect data solutions on Google Cloud Platform (GCP) and the Hadoop ecosystem to support data ingestion, processing, and storage.
  • Develop and maintain data pipelines using tools such as Apache Beam, Dataflow, Hadoop, BigQuery, and Hive.
  • Lead data migration projects, ensuring seamless transition of data from on-premises to cloud environments.
  • Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
  • Provide pre-sales support by engaging with potential clients, understanding their data needs, and presenting tailored solutions.
  • Implement data governance and data quality frameworks to ensure the integrity and reliability of data.
  • Optimize data models and storage solutions for performance and cost efficiency.
  • Provide technical leadership and mentorship to junior data engineers and analysts.
  • Stay current with the latest trends and technologies in data engineering, GCP, and Hadoop.
  • Work closely with stakeholders to understand their data needs and provide actionable insights.
Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field; Master’s degree preferred.
  • 10 to 15 years of proven experience as a Data Engineer or Data Architect, with a strong focus on GCP and Hadoop.
  • In-depth knowledge of GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Dataproc, as well as Hadoop components like HDFS, MapReduce, and Hive.
  • Demonstrated experience in designing data pipelines and executing data migration projects.
  • Experience with data modeling, ETL processes, and data warehousing concepts.
  • Familiarity with programming languages such as Python, Java, or Scala.
  • Strong analytical skills and the ability to work with large data sets.
  • Excellent problem-solving skills and the ability to work in a fast-paced environment.
  • Strong communication and collaboration skills.
Desirable skills:
  • Experience with machine learning and data analytics tools.
  • Knowledge of data privacy regulations and best practices.
  • Certifications in GCP (e.g., Google Cloud Professional Data Engineer) and Hadoop-related technologies are a plus.

