9 - 14 years
25 - 40 Lacs
Posted: 4 weeks ago
Work from Office
Full Time
Hi, we at HCL are looking for GCP Data Architects with experience in ETL technologies.

Location: Noida, Bangalore and Chennai

Please share the following details:
Total years of experience:
Experience in GCP:
Current CTC:
Expected CTC:
Notice Period:
Current location:
Preferred location:

Reach us at srikanth.domala@hcltech.com

GCP Data Architect
Location: Noida, Bangalore, Chennai
Job Type: FTE
Experience: 8+ Years

Job Summary:
The GCP Data Architect is responsible for creating scalable data architectures that enhance data processing capabilities and streamline data workflows within an organization. This role involves collaborating with cross-functional teams to understand business requirements and translate them into effective data models and infrastructure solutions on GCP.

Key Responsibilities:
- Design and implement data architecture solutions using GCP services such as BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Collaborate with data engineers, analysts, and business stakeholders to ensure data models align with business needs.
- Develop data ingestion pipelines and ETL processes to maximize data availability and quality.
- Identify best practices for data security, governance, and compliance in the cloud environment.
- Optimize existing data solutions for performance, scalability, and cost-efficiency.
- Stay updated on the latest GCP features and technologies, evaluating their potential impact on data architecture.

Required Skills:
- Cloud Services Knowledge: Deep understanding of GCP services such as BigQuery, Cloud Storage, Cloud Pub/Sub, Dataflow, Dataproc, and Cloud SQL.
- Data Modeling and Architecture: Proficiency in designing data models (relational and non-relational) and understanding of data warehousing concepts.
- ETL/ELT Proficiency: Experience with ETL processes and tools, including Dataflow, Apache Beam, or Talend, for data ingestion and transformation.
- Big Data Technologies: Familiarity with big data technologies such as Hadoop, Spark, and NoSQL databases (e.g., Bigtable, Firestore).
- Programming Languages: Proficiency in languages such as Python, Java, or Scala, particularly for data manipulation and automation tasks.
- SQL Expertise: Strong SQL skills for querying and analyzing data, as well as optimizing query performance in GCP.
- Data Governance and Security: Understanding of data governance frameworks, data privacy regulations, and security best practices in cloud environments.
- Architecture Design: Ability to design scalable, efficient data solutions that meet performance and cost requirements.
- Problem-Solving Skills: Strong analytical and troubleshooting skills to optimize data processes and resolve complex data issues.

Required Qualifications:
- Strong experience with GCP services and tools.
- Proficiency in SQL and understanding of data modeling techniques.
- Previous experience designing data solutions for analytics and reporting.
- Knowledge of data warehousing concepts and big data technologies.
- Excellent problem-solving skills and the ability to communicate complex technical concepts.
- Typically 7+ years of experience in data architecture, data engineering, or a related field, especially in cloud environments.
- Demonstrated experience with GCP data solutions and architecture.
- Relevant certifications, such as Google Cloud Professional Data Engineer or Google Cloud Professional Cloud Architect, are highly advantageous.
HCLTech