Posted: Just now
Work from Office
Full Time
Job Description:
Business Title
Data Engineer
Years of Experience
Minimum 3 and maximum 7 years.
Job Description
We are looking for a highly skilled GCP Data Engineer with strong programming fundamentals and cloud-native architecture experience. The candidate should have hands-on expertise in building scalable data pipelines on Google Cloud Platform, integrating external APIs, and applying software engineering principles to data workflows.
Must-have skills
1. 3 to 7 years of experience in data engineering, with at least 1 year on Google Cloud Platform.
2. Strong proficiency in Python, including OOP principles, modular design, and clean coding standards (see the illustrative sketch after this list).
3. Hands-on experience with GCP services: BigQuery, Cloud Functions, Cloud Run, Cloud Build, Dataform, Pub/Sub, Eventarc, Cloud Storage, and Cloud Composer.
4. Solid understanding of SQL, data modeling, and data warehousing concepts.
5. Familiarity with CI/CD pipelines and Git-based workflows.
6. Strong problem-solving skills and the ability to work independently in a fast-paced environment.
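To give a concrete sense of the modular, object-oriented Python style this role calls for, here is a minimal sketch. It is illustrative only: the Source, InMemorySource, and Pipeline names and the sample data are invented for this example and are not taken from the posting or any employer codebase.

```python
# A minimal, hypothetical sketch of a modular pipeline component.
from abc import ABC, abstractmethod
from typing import Iterable


class Source(ABC):
    """Abstraction over any data source (API, file, queue)."""

    @abstractmethod
    def read(self) -> Iterable[dict]:
        ...


class InMemorySource(Source):
    """Concrete source used here so the example runs without any GCP access."""

    def __init__(self, rows: list[dict]) -> None:
        self._rows = rows

    def read(self) -> Iterable[dict]:
        return iter(self._rows)


class Pipeline:
    """Depends on the Source abstraction rather than a concrete class
    (dependency inversion), so new sources plug in without changes here."""

    def __init__(self, source: Source) -> None:
        self._source = source

    def run(self) -> list[dict]:
        # Transformation kept trivially simple for the sketch.
        return [{**row, "processed": True} for row in self._source.read()]


if __name__ == "__main__":
    pipeline = Pipeline(InMemorySource([{"id": 1}, {"id": 2}]))
    print(pipeline.run())
```

Because Pipeline is written against the Source interface, swapping in an API-backed or Cloud Storage-backed source requires no change to the pipeline itself, which is the separation-of-concerns idea the skills list emphasizes.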
Good-to-have skills
1. Familiarity with other Cloud technologies
2. Working experience with JIRA and Agile methodologies.
3. Stakeholder communication.
4. Microsoft Office.
5. Cross-functional teamwork, internally and with external clients.
6. Team leadership.
7. Requirement gathering.
Key Responsibilities
1. Implement modular, object-oriented Python applications for data ingestion and transformation.
2. Apply core OOP principles (encapsulation, inheritance, polymorphism, and abstraction) to build scalable, maintainable data pipelines; structure code as modular systems with well-defined class hierarchies, follow SOLID principles for clean architecture and separation of concerns, and use design patterns to improve readability and testability.
3. Build and maintain ETL/ELT pipelines using GCP services: BigQuery, Cloud Functions, Cloud Run, Cloud Composer, and Dataform.
4. Integrate with external APIs to extract and load data into BigQuery.
5. Develop event-driven architectures using Pub/Sub and Eventarc to trigger workflows and manage real-time data streams (see the sketch after this list).
6. Automate deployment and testing workflows using Cloud Build and CI/CD pipelines.
7. Optimize the performance and cost of data processing and storage across GCP services.
8. Collaborate with cross-functional teams to translate business requirements into scalable data solutions.
9. Ensure data security, governance, and compliance across all pipelines and storage layers.
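As a rough illustration of the event-driven pattern in responsibility 5, the sketch below shows a Pub/Sub-triggered Cloud Function (2nd gen) that appends a message payload to BigQuery, using the public functions-framework and google-cloud-bigquery libraries. The table ID, message shape, and function name are assumptions made up for this example, not the employer's actual design.

```python
# A minimal sketch (not the employer's code) of an event-driven load:
# a Pub/Sub message triggers a Cloud Function that appends a row to BigQuery.
import base64
import json

import functions_framework
from google.cloud import bigquery

TABLE_ID = "my-project.analytics.events"  # hypothetical destination table

# Created at import time so the client is reused across warm invocations.
bq_client = bigquery.Client()


@functions_framework.cloud_event
def ingest_event(cloud_event):
    """Entry point for a Pub/Sub-triggered Cloud Function (2nd gen)."""
    # Pub/Sub delivers the payload base64-encoded inside the CloudEvent.
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    row = json.loads(payload)

    # Streaming insert; errors come back as a list rather than an exception.
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```

In practice the same trigger could be wired through Eventarc, and failed inserts would typically be routed to a dead-letter topic rather than simply raising, but that is beyond this sketch.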
Education Qualification
1. Bachelor's or Master's degree, or an equivalent qualification.
Certifications (if any)
1. GCP Professional Data Engineer Certification.
2. Snowflake Associate / Core
Shift timing
12 PM to 9 PM and/or 2 PM to 11 PM (IST)
Location:
DGS India - Bengaluru - Manyata H2 block
Brand:
Merkle
Time Type:
Full time
Contract Type:
Permanent
Merkle B2B