Posted: 1 day ago
Work from Office
Full Time
Requirement ID: 10373065
Job Title: Data Engineer (GCP Framework Implementation)
Work Location: Pune, Hyderabad, Chennai, Bangalore, Delhi
Skills Required: Kubernetes, Google Data Engineering
Experience Range in Required Skills: 5+ Years
Job Description: GCP (E2), BigQuery, Cloud Composer, GKE
Role Overview:
We are looking for a Data Engineer with hands-on experience in Google Cloud Platform (GCP) services to implement and manage data ingestion using our in-house Origin Data Product (ODP) framework. The role primarily involves configuring pipelines, loading data, debugging issues, and ensuring smooth operation of the data ingestion process. The engineer should be capable of handling parameter changes, data issues, and basic fixes within the scope of ingestion jobs. Engineers who understand the framework well are encouraged to suggest improvements and may contribute to enhancements in collaboration with the core development team.

Key Responsibilities:
• Configure and execute data ingestion pipelines using our reusable GCP-based framework.
• Work with services such as GCS, BigQuery, Composer, Data Fusion, and Dataproc for ETL/ELT operations.
• Manage parameters, job configurations, and metadata for ingestion.
• Debug and resolve issues related to data, parameters, and job execution.
• Escalate framework-related bugs to the core development team when required.
• Monitor daily job runs, troubleshoot failures, and ensure SLAs are met.
• Collaborate with cross-functional teams for smooth delivery.
• Follow version control best practices using Git.
• Maintain deployment scripts and infrastructure configurations via Terraform.
• (Nice to have) Work with TWSd job scheduling/monitoring tools.
• Suggest improvements to the framework or processes based on usage experience.
• Contribute to small enhancements in the framework, where applicable.

Required Skills & Experience:
• 5+ years of experience as a Data Engineer or in a similar role.
• Strong working knowledge of Google Cloud Storage (GCS), BigQuery, Composer, Data Fusion, and Dataproc.
• Proficiency in Python for scripting, debugging, and automation tasks.
• Experience with Terraform for infrastructure-as-code.
• Knowledge of Git for version control.
• Understanding of data ingestion concepts, file formats, and ETL workflows.
• Ability to debug and resolve runtime data issues independently.
• Strong problem-solving and analytical skills.

Good to Have:
• Exposure to TWSd or other enterprise job schedulers.
• Basic understanding of SQL optimization in BigQuery.

Soft Skills:
• Attention to detail and an ownership mindset.
• Good communication and collaboration skills.
• Ability to work in a fast-paced environment and meet deadlines.
• Proactive approach towards process improvement and innovation.
Artech