Job Title: Software Engineer Consultant/Expert – GCP Data Engineer
Location: Chennai (Onsite) 34350
Employment Type: Contract
Budget: Up to ₹18 LPA
Assessment: Google Cloud Platform Engineer (HackerRank or equivalent)
Notice Period: Immediate Joiners Preferred
Role Summary
We are seeking a highly skilled GCP Data Engineer to support the modernization of enterprise data platforms. The ideal candidate will design and implement scalable, high-performance data pipelines and solutions on Google Cloud Platform (GCP). You will work with large-scale datasets, integrating legacy and modern systems to enable advanced analytics and AI/ML capabilities. The role requires a deep understanding of GCP services, strong data engineering skills, and the ability to collaborate across teams to deliver robust data solutions.
Key Responsibilities
- Design and develop production-grade data engineering solutions using GCP services such as:
- BigQuery, Dataflow, Dataform, Dataproc, Cloud Composer, Cloud SQL, Airflow, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, Pub/Sub, App Engine
- Develop batch and real-time streaming pipelines for data ingestion, transformation, and processing.
- Integrate data from multiple sources including legacy and cloud-based systems.
- Collaborate with stakeholders and product teams to gather data requirements and align technical solutions to business needs.
- Conduct in-depth data analysis and impact assessments for data migrations and transformations.
- Implement CI/CD pipelines using tools like Tekton, Terraform, and GitHub.
- Optimize data workflows for performance, scalability, and cost-effectiveness.
- Lead and mentor junior engineers; contribute to knowledge sharing and documentation.
- Champion data governance, data quality, security, and compliance best practices.
- Utilize monitoring/logging tools to proactively address system issues.
- Deliver high-quality code using Agile methodologies including TDD and pair programming.
Required Skills & Experience
- Google Cloud Professional Data Engineer certification.
- 5+ years of experience designing and implementing complex data pipelines.
- 3+ years of hands-on experience with GCP.
- Strong expertise in:
- SQL, plus Python, Java, or Apache Beam
- Airflow, Dataflow, Dataproc, Dataform, Data Fusion, BigQuery, Cloud SQL, Pub/Sub
- Infrastructure-as-Code tools such as Terraform
- DevOps tools: GitHub, Tekton, Docker
- Solid understanding of microservice architecture, CI/CD integration, and container orchestration.
- Experience with data security, governance, and compliance in cloud environments.
Preferred Qualifications
- Experience with real-time data streaming using Apache Kafka or Pub/Sub.
- Exposure to AI/ML tools or integration with AI/ML pipelines.
- Working knowledge of data science principles applied to large datasets.
- Experience in a regulated domain (e.g., financial services or insurance).
- Experience with project management and agile tools (e.g., JIRA, Confluence).
- Strong analytical and problem-solving mindset.
- Effective communication skills and ability to collaborate with cross-functional teams.
Education
- Required: Bachelor's degree in Computer Science, Engineering, or a related technical field.
- Preferred: Master's degree or certifications in relevant domains.
Skills: GitHub, BigQuery, Airflow, ML, Pub/Sub, Terraform, Python, Apache Beam, Dataflow, GCP, GCP Data Engineer Certification, Tekton, Java, Dataform, Docker, Data Fusion, SQL, Dataproc, Cloud SQL, Cloud