Posted: 2 days ago
On-site
Full Time
Responsibilities:
1. Understand customers’ overall data platform, business and IT priorities, and success measures to design data solutions that drive business value.
2. Apply technical knowledge to architect solutions and create data platforms and roadmaps on GCP that meet business and IT needs.
3. Estimate and outline the solutions needed to implement cloud-native architectures and migrate from on-prem systems (a minimal load sketch follows this list).
4. Execute the hands-on implementation of proposed solutions and provide technical guidance to teams during the development and deployment phases.
5. Ensure the long-term technical viability and optimization of production deployments, and lead/advise on migration and modernization using GCP-native services.
6. Work with prospective and existing customers to implement POCs/MVPs and guide them through deployment, operationalization, and troubleshooting.
7. Identify and build technical collateral and assets for client consumption.
8. Identify, communicate, and mitigate the assumptions, issues, and risks that arise throughout the project lifecycle.
9. Judge and strike a balance between what is strategically sound and what can realistically be accomplished.
10. Assess and validate non-functional attributes and build solutions that exhibit high levels of performance, security, scalability, maintainability, and reliability.
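For concreteness, here is a minimal sketch of one common migration step referenced in item 3: loading an on-prem extract that has been staged in Cloud Storage into BigQuery. It assumes the google-cloud-bigquery client library and Application Default Credentials; the bucket, dataset, and table names are hypothetical placeholders, not part of the role description.

```python
# Minimal sketch: load a CSV extract staged in Cloud Storage into BigQuery.
# Bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-migration-bucket/exports/orders.csv",  # hypothetical staging path
    "example_dataset.orders",                            # hypothetical target table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

print(f"Loaded {client.get_table('example_dataset.orders').num_rows} rows")
```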
Qualifications:
1. Proven experience as a Data Engineer with a focus on GCP data services.
2. Strong proficiency in GCP data engineering and data warehousing services, including but not limited to BigQuery, Dataflow, Dataproc, Data Fusion, and Cloud Storage.
3. Strong proficiency in ETL processes, SQL, and data integration techniques.
4. Strong proficiency in data modelling and data warehousing concepts.
5. Proficiency in cloud architectural best practices around user management, data privacy, data security, performance, and other non-functional requirements.
6. Programming skills in languages such as Python (including PySpark), Java, or Scala (see the PySpark sketch after this list).
7. Familiarity with building AI/ML models on cloud solutions built on GCP.
8. Familiarity with version control systems, DevOps, and Infrastructure-as-Code practices.
9. Strong problem-solving and troubleshooting skills.
10. Excellent communication and teamwork skills.
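As a rough illustration of the PySpark and ETL skills listed above (a sketch, not a prescribed implementation), below is a minimal batch transform of the kind typically run on Dataproc; the bucket paths and column names are hypothetical.

```python
# Minimal PySpark ETL sketch: read raw CSV, apply a simple aggregation,
# and write partitioned Parquet. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read the raw extract from a (hypothetical) landing bucket.
orders = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("gs://example-raw-bucket/orders/")
)

# Aggregate revenue and order counts per day and region.
daily_revenue = (
    orders.withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Write the curated output partitioned by date.
(
    daily_revenue.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("gs://example-curated-bucket/daily_revenue/")
)
```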
Preferred Qualifications:
1. Solution Architect and/or Data Engineer certifications from GCP.
2. Experience in the BFSI, Healthcare, or Retail domains.
3. Experience with data governance principles, data privacy, and security.
4. Experience with big data and distributed technologies such as Hadoop, Hive, and Kafka (see the streaming sketch after this list).
5. Experience with data visualization tools such as Power BI or Tableau.
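And as a hedged sketch of the streaming side of item 4 (assuming the spark-sql-kafka connector is available on the cluster; the broker address, topic, and paths are hypothetical), a minimal Structured Streaming job that lands Kafka events in Cloud Storage:

```python
# Minimal Structured Streaming sketch: consume a Kafka topic and land the
# raw events as JSON in Cloud Storage. Broker, topic, and paths are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-events-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
    .option("subscribe", "orders-events")                # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
)

query = (
    events.writeStream.format("json")
    .option("path", "gs://example-raw-bucket/kafka/orders-events/")
    .option("checkpointLocation", "gs://example-raw-bucket/checkpoints/orders-events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```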
Mastech Digital