Position Title:
Software Engineer Consultant/Expert – GCP Data Engineer (34350)
Location:
Chennai
Engagement Type:
Contract
Compensation:
Up to ₹18 LPA
Notice Period:
Immediate joiners preferred
Work Mode:
Onsite
Role Overview
This role is for a proactive Google Cloud Platform (GCP) Data Engineer who will contribute to the modernization of a cloud-based enterprise data warehouse. The ideal candidate will focus on integrating diverse data sources to support advanced analytics and AI/ML-driven solutions, and on designing scalable pipelines and data products for real-time and batch processing.

This opportunity is ideal for individuals who bring both architectural thinking and hands-on experience with GCP services, big data processing, and modern DevOps practices.
Key Responsibilities
- Design and implement scalable, cloud-native data pipelines and solutions using GCP technologies
- Develop ETL/ELT processes to ingest and transform data from legacy and modern platforms
- Collaborate with analytics, AI/ML, and product teams to enable data accessibility and usability
- Analyze large datasets and perform impact assessments across various functional areas
- Build data products (data marts, APIs, views) that power analytical and operational platforms
- Integrate batch and real-time data using tools like Pub/Sub, Kafka, Dataflow, and Cloud Composer
- Operationalize deployments using CI/CD pipelines and infrastructure as code
- Ensure performance tuning, optimization, and scalability of data platforms
- Contribute to best practices in cloud data security, governance, and compliance
- Provide mentorship, guidance, and knowledge-sharing within cross-functional teams
Mandatory Skills
- GCP expertise with hands-on use of services including:
- BigQuery, Dataflow, Data Fusion, Dataform, Dataproc
- Cloud Composer (Airflow), Cloud SQL, Compute Engine
- Cloud Functions, Cloud Run, Cloud Build, App Engine
- Strong knowledge of SQL, data modeling, and data architecture
- Minimum of 5 years of experience in SQL and ETL development
- At least 3 years of experience in GCP cloud environments
- Experience with Python, Java, or Apache Beam
- Proficiency in Terraform, Docker, Tekton, and GitHub
- Familiarity with Apache Kafka, Pub/Sub, and microservices architecture
- Understanding of AI/ML integration, data science concepts, and production datasets
Preferred Experience
- Hands-on expertise in container orchestration (e.g., Kubernetes)
- Experience working in regulated environments (e.g., finance, insurance)
- Knowledge of DevOps pipelines, CI/CD, and infrastructure automation
- Background in coaching or mentoring junior data engineers
- Experience with data governance, compliance, and security best practices in the cloud
- Use of project management tools such as JIRA
- Proven ability to work independently in fast-paced or ambiguous environments
- Strong communication and collaboration skills to interact with cross-functional teams
Education Requirements
- Required: Bachelor's degree in Computer Science, Information Systems, Engineering, or related field
- Preferred: Master's degree or relevant industry certifications (e.g., GCP Data Engineer Certification)