5 - 10 years
5 - 10 Lacs
Posted: 6 hours ago
Work from Office
Full Time
Job Title: Data Engineer - GCP Framework Implementation
Work Location: Pune, Hyderabad, Chennai, Bangalore, Delhi
Experience: 5+ Years
Job Description: GCP (E2), BigQuery, Cloud Composer, GKE
Role Overview:
We are looking for a Data Engineer with hands-on experience in Google Cloud Platform (GCP) services to implement and manage data ingestion using our in-house Origin Data Product (ODP) framework. The role primarily involves configuring pipelines, loading data, debugging issues, and ensuring smooth operation of the data ingestion process. The candidate should be able to handle parameter changes, data issues, and basic fixes within the scope of ingestion jobs. Candidates who understand the framework well are encouraged to suggest improvements and may contribute to enhancements in collaboration with the core development team.
Key Responsibilities:
• Configure and execute data ingestion pipelines using our reusable GCP-based framework (a minimal illustrative sketch follows this list).
• Work with services such as GCS, BigQuery, Composer, Data Fusion, and Dataproc for ETL/ELT operations.
• Manage parameters, job configurations, and metadata for ingestion.
• Debug and resolve issues related to data, parameters, and job execution.
• Escalate framework-related bugs to the core development team when required.
• Monitor daily job runs, troubleshoot failures, and ensure SLAs are met.
• Collaborate with cross-functional teams for smooth delivery.
• Follow version control best practices using Git.
• Maintain deployment scripts and infrastructure configurations via Terraform.
• (Nice-to-have) Work with TWSd job scheduling/monitoring tools.
• Suggest improvements to the framework or processes based on usage experience.
• Contribute to small enhancements in the framework, where applicable.
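The ODP framework itself is in-house and not shown here; as a rough illustration of the kind of pipeline configuration this role involves, below is a minimal, generic Cloud Composer (Airflow) DAG that loads a file from GCS into BigQuery. All names (DAG ID, bucket, dataset, table, schedule) are hypothetical placeholders, not details of the actual framework.

```python
# Illustrative sketch only: a generic GCS-to-BigQuery ingestion DAG for
# Cloud Composer (Airflow). The in-house ODP framework is not represented;
# every identifier below is a hypothetical placeholder.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="example_gcs_to_bq_ingestion",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_csv",
        bucket="example-landing-bucket",           # hypothetical landing bucket
        source_objects=["orders/{{ ds }}/*.csv"],  # date-partitioned source path
        destination_project_dataset_table="example_project.raw.orders",
        source_format="CSV",
        skip_leading_rows=1,                       # skip the CSV header row
        write_disposition="WRITE_APPEND",          # append each daily load
        autodetect=True,                           # infer schema from the file
    )
```

In practice, a reusable framework of this kind typically drives such operators from parameter files or metadata tables rather than hard-coded values, which is why the role emphasizes managing parameters and job configurations.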
Required Skills & Experience:
• 5+ years of experience as a Data Engineer or in a similar role.
• Strong working knowledge of Google Cloud Storage (GCS), BigQuery, Composer, Data Fusion, and Dataproc.
• Proficiency in Python for scripting, debugging, and automation tasks.
• Experience with Terraform for infrastructure-as-code.
• Knowledge of Git for version control.
• Understanding of data ingestion concepts, file formats, and ETL workflows.
• Ability to debug and resolve runtime data issues independently.
• Strong problem-solving and analytical skills.
Good to Have:
• Exposure to TWSd or other enterprise job schedulers.
• Basic understanding of SQL optimization in BigQuery (see the sketch after this section).
Soft Skills:
• Attention to detail and an ownership mindset.
• Good communication and collaboration skills.
• Ability to work in a fast-paced environment and meet deadlines.
• Proactive approach towards process improvement and innovation.
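As a small example of the Python scripting and BigQuery cost awareness mentioned above, the sketch below runs a parameterized query that selects only the needed columns and filters on a partition column, which limits the bytes BigQuery scans. The project, table, and column names are hypothetical and purely illustrative.

```python
# Illustrative sketch only: a partition-pruned, parameterized BigQuery query
# using the google-cloud-bigquery client. Table and column names are
# hypothetical placeholders.
from datetime import date

from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

sql = """
    SELECT order_id, customer_id, amount
    FROM `example_project.raw.orders`
    WHERE ingestion_date = @run_date  -- restricts the scan to one partition
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", date(2024, 1, 1)),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.order_id, row.amount)
```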
Interested candidates can share an updated CV at nilima.dumbre@siroclinpharm.com.
Siro Clinpharm