Job description
We are looking for an experienced GCP Data Engineer with strong hands-on expertise in Python, BigQuery, Dataflow, and Cloud Composer. In this role, you will design, build, and maintain scalable data pipelines on Google Cloud Platform (GCP), working closely with business and technology teams to enable reliable data migration, transformation, and analytics use cases.
Roles & Responsibilities
- Design & Develop Data Pipelines
- Design, develop, and maintain scalable, secure, and reliable data pipelines using GCP services (BigQuery, Dataflow, Cloud Composer, GCS, etc.).
- Build end-to-end data ingestion and transformation workflows from various source systems (databases, APIs, flat files, etc.).
- Data Migration & Onboarding
- Work with client business and technical teams to build complex data migration pipelines and seamlessly onboard new data products to the GCP environment.
- Streamline and optimize data migration workflows to add measurable value to client data hubs and analytics platforms.
- Data Transformation & Curation
- Build robust ETL/ELT processes to cleanse, transform, and curate data into analytics-ready datasets and views.
- Ensure data quality, integrity, and consistency across stages with proper validation, logging, and monitoring.
- Design and build reusable frameworks for ingesting different data patterns (batch, streaming, CDC, etc.), reducing time-to-market for new data products.
- Create and maintain technical documentation for pipelines, frameworks, and processes.
- Collaboration & Agile Delivery
- Work in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives.
- Collaborate with data architects, analysts, SMEs, and business stakeholders to translate requirements into scalable data solutions.
- Deployment, CI/CD & Operations
- Use tools like Spinnaker, Jenkins, and Git to implement CI/CD for data pipelines and related components.
- Deploy and manage data workloads on GKE or other GCP compute services where required.
- Monitor pipeline performance, optimize SQL queries, and troubleshoot issues in production.
Required Experience
- Total Experience: 4 to 6 years in data engineering / ETL / big data roles.
- Relevant Experience: Minimum of 3 years working on GCP-based data engineering projects.
Must-Have Skills (Primary)
- Python (OOP concepts): Strong programming skills with clean, modular, and testable code.
- Dataflow (or Apache Beam) for building scalable data pipelines.
- Cloud Composer (Airflow) for workflow orchestration and scheduling.
- BigQuery for data warehousing, analytics, and performance-optimized SQL.
- Strong understanding of ETL/ELT concepts, data integration patterns, and data warehousing fundamentals.
Secondary Skills (Good Working Knowledge)
- Agile methodologies (Scrum/Kanban), with experience working in sprint-based delivery.
- Spinnaker for deployments in GCP environments.
- Jenkins for building CI/CD pipelines.
- Git for version control and collaborative development.
- GKE (Google Kubernetes Engine): understanding of containerized workloads and orchestration.
Nice-to-Have Skills (Preferred, Not Mandatory)
- Data Modelling Basics: star/snowflake schemas, dimensional modelling, and an understanding of fact and dimension tables.
- Tableau Basics: exposure to building or supporting dashboards and BI analytics.
- SQL Performance Tuning: experience optimizing complex queries and improving BigQuery performance.
Educational Qualification
- B.Tech / B.E. / M.Tech / MCA or equivalent degree in Computer Science, IT, Data Engineering, or related fields.
Desired Candidate Profile
- Strong analytical and problem-solving skills with attention to detail.
- Ability to work independently as well as part of a collaborative, cross-functional team.
- Good communication skills to interact with both technical and non-technical stakeholders.
- Proactive mindset with a focus on automation, reusability, and continuous improvement.
Job Category: Software Development
Job Type: Full Time
Job Location: Hyderabad
Experience: 4 to 6 years