Posted: 3 months ago
Work Mode: Hybrid
Employment Type: Full Time
Looking for a candidate for the GCP Lead role with one of our Big 4 clients, with experience in Python, PySpark, and SQL, targeting candidates with 8-10 years of experience.

Job Title: GCP Lead Engineer
Location: Bangalore / Gurgaon / Hyderabad
Job Type: Full time
Experience: 8-10 years

Job Description:
We are seeking an experienced and highly skilled GCP Lead Engineer to lead and manage our cloud data engineering team. In this role, you will be responsible for designing, implementing, and optimizing large-scale data processing solutions on Google Cloud Platform (GCP). The ideal candidate should have extensive experience with Python, PySpark, and SQL, as well as in-depth knowledge of GCP services such as BigQuery, Dataflow, and Dataproc. You will work closely with data architects, data scientists, and cross-functional teams to deliver innovative and scalable cloud-based data solutions.

Key Responsibilities:
- Lead and Manage Teams: Lead a team of data engineers and data scientists, guiding them through the design, development, and implementation of cloud data engineering solutions on GCP.
- Cloud Data Architecture: Design and implement end-to-end data pipelines and data models using GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Big Data Processing: Develop and optimize large-scale data processing systems using PySpark, Python, and other GCP-native tools to process both structured and unstructured data.
- SQL Development: Write complex SQL queries to extract and transform data for analysis and reporting, ensuring high performance and scalability.
- Data Integration: Integrate data from diverse sources such as on-premises systems, Google Cloud Storage, APIs, and third-party services.
- Performance Optimization: Optimize the performance of data pipelines and queries by leveraging GCP's scalability and best practices.
- Cloud Migration: Lead cloud migration projects, helping teams migrate legacy systems and applications to GCP.
- Collaboration: Work closely with stakeholders, including business analysts, data scientists, and engineers, to understand business requirements and translate them into scalable data solutions.
- Mentorship: Provide technical leadership and mentorship to junior team members, promoting the adoption of best practices, tools, and technologies.
- Data Security and Governance: Implement data security and governance best practices, ensuring compliance with relevant industry regulations (e.g., GDPR).
- Continuous Improvement: Stay up to date with the latest trends and advancements in GCP, big data technologies, and cloud computing to drive innovation within the team.

Skills & Qualifications:
- Experience: 8-10 years of experience in data engineering, with at least 3-4 years of hands-on experience working with Google Cloud Platform (GCP), PySpark, Python, and SQL.
- Technical Expertise:
  - Proficiency in Python and PySpark for large-scale data processing and ETL jobs.
  - Strong expertise in SQL for data manipulation, querying, and optimization.
  - Deep knowledge of Google Cloud Platform (GCP), particularly services such as BigQuery, Dataflow, Dataproc, Cloud Pub/Sub, Cloud Storage, and Cloud Composer.
  - Familiarity with GCP networking, IAM, security, and cost optimization practices.
  - Experience with data lakes, data warehouses, and real-time data processing on the cloud.
  - Understanding of data integration from various sources, including relational databases, APIs, and file-based data.
  - Familiarity with CI/CD pipelines and automation for GCP deployments.
- Leadership & Management: Proven experience in leading teams, managing technical projects, and mentoring junior engineers.
- Cloud Migration: Experience leading or supporting cloud migration efforts to GCP.
- Problem Solving: Strong analytical and problem-solving skills with the ability to troubleshoot complex data engineering issues.
- Collaboration: Excellent communication and teamwork skills, with the ability to work cross-functionally with diverse teams.
- Education: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Master's degree is a plus.

Preferred Qualifications:
- GCP certification (e.g., Google Cloud Professional Data Engineer) is highly desirable.
- Experience with containerization technologies like Docker and orchestration platforms like Kubernetes is a plus.
- Experience working with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Knowledge of data visualization tools such as Tableau, Power BI, or Looker is a plus.
First Meridian Business Services
Business Services / Technology Solutions
250 Employees
183 Jobs