Job Title: Specialty Development Senior 34263
Location: Chennai
Employment Type: Full-Time (Hybrid)
Job Overview
We are looking for an experienced GCP Data Engineer to join a global data engineering team responsible for building a sophisticated data warehouse and analytics platform on Google Cloud Platform (GCP). This role is ideal for professionals with a strong background in data engineering, cloud migration, and large-scale data transformation, particularly within cloud-native environments.
Key Responsibilities
- Design, build, and optimize data pipelines on GCP to support large-scale data transformations and analytics.
- Lead the migration and modernization of legacy systems to cloud-based architecture.
- Collaborate with cross-functional global teams to support data-driven applications and enterprise analytics solutions.
- Work with large datasets to enable platform capabilities and business insights using GCP tools.
- Ensure data quality, integrity, and performance across the end-to-end data lifecycle.
- Apply agile development principles to rapidly deliver and iterate on data solutions.
- Promote engineering best practices in CI/CD, DevSecOps, and cloud deployment strategies.
Must-Have Skills
- GCP Services: BigQuery, Dataflow, Dataproc, Data Fusion, Cloud Composer (Airflow), Cloud Functions, Cloud SQL, Cloud Spanner, Cloud Storage, Bigtable, Pub/Sub, App Engine, Compute Engine
- Programming & Data Engineering: 5+ years in data engineering and SQL development; experience in building data warehouses and ETL processes
- Cloud Experience: Minimum 3 years in cloud environments (preferably GCP), implementing production-scale data solutions
- Strong understanding of data processing architectures (batch/real-time) and tools such as Terraform, Cloud Build, and Airflow
- Experience with containerized microservices architecture
- Excellent problem-solving skills and ability to optimize complex data pipelines
- Strong interpersonal and communication skills with the ability to work effectively in a globally distributed team
- Proven ability to work independently in high-ambiguity scenarios and drive solutions proactively
Preferred Skills
- GCP Certification (e.g., Professional Data Engineer)
- Experience in regulated or financial domains
- Migration experience from Teradata to GCP
- Programming experience with Python, Java, Apache Beam
- Familiarity with data governance, security, and compliance in cloud environments
- Experience coaching and mentoring junior data engineers
- Knowledge of software architecture, CI/CD, source control (Git), and secure coding standards
- Exposure to Java full-stack development (Spring Boot, Microservices, React)
- Agile development experience including pair programming, TDD, and DevSecOps
- Proficiency in test automation tools like Selenium, Cucumber, REST Assured
- Familiarity with other cloud platforms like AWS or Azure is a plus
Education
- Bachelor’s Degree in Computer Science, Information Technology, or a related field (mandatory)