GCP Data Architect

Experience: 10 years
Salary: 0 Lacs
Posted: 2 days ago | Platform: LinkedIn
Work Mode: On-site
Job Type: Full Time

Job Description

Job Title: GCP Data Architect

Location:

Experience: 10 years

Budget:

Notice Period:




Responsibilities:

1. Understand customers’ overall data platform, business and IT priorities and success measures to design data solutions that drive business value.

2. Apply technical knowledge to architect solutions, create data platform and roadmaps on GCP cloud that meet business and IT needs.

3. Estimate and outline the solutions needed to implement cloud-native architecture and to migrate from on-prem systems.

4. Executing the hands-on implementation of proposed solutions and providing technical guidance to teams during the solutions’ development and deployment phases.

5. Ensure the long-term technical viability and optimization of production deployments, and lead/advise on migration and modernization using GCP-native services.

6. Work with prospective and existing customers to implement POCs/MVPs and guide through to deployment, operationalization, and troubleshooting.

7. Identify and build technical collateral or technical assets for client consumption.

8. Identify, communicate, and mitigate the assumptions, issues, and risks that occur throughout the project lifecycle.

9. Judge and strike a balance between what is strategically logical and what can be accomplished realistically.

10. Assess and validate non-functional attributes and build solutions that exhibit high levels of performance, security, scalability, maintainability, and reliability.


Qualifications:


1.

2. Proven experience as a Data Engineer with a focus on GCP data services.

3. Strong proficiency in GCP data engineering and data warehousing services, including but not limited to BigQuery, Dataflow, Dataproc, Data Fusion, and Cloud Storage.

4. Strong proficiency in ETL processes, SQL, and data integration techniques.

5. Strong proficiency in data modelling and data warehousing concepts.

6. Proficiency in architectural best practices in cloud around user management, data privacy, data security, performance and other non-functional requirements.

7. Programming skills in Python (including PySpark), Java, or Scala.

8. Familiarity with building AI/ML models on GCP-based cloud solutions.

9. Familiarity with version control systems, DevOps, and Infrastructure-as-Code practices.

10. Strong problem-solving and troubleshooting skills.

11. Excellent communication and teamwork skills.



Preferred Skills:

1. Solution architect and/or data engineer certifications from GCP.

2. Experience in the BFSI, Healthcare, or Retail domains.

3. Experience with data governance principles, data privacy and security.

4. Experience with big data and distributed technologies like Hadoop, Hive, Kafka, etc.

5. Experience with data visualization tools like Power BI or Tableau.

Mastech Digital

Information Technology & Staffing Services

Pittsburgh
