Posted: 3 months ago
Work from Office
Full Time
We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role in which you will design, build, and maintain scalable ETL pipelines and frameworks within a Data Mesh architecture. You will work with modern tools such as Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.

Key Responsibilities

* Design, Build, and Maintain ETL Pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data Transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products (an illustrative sketch of this kind of work follows the posting).
* Cloud Expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data Quality & Governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks (see the second sketch below).
* Performance Optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & Ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & Standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to data engineering best practices.
* Troubleshooting & Issue Resolution: Proactively identify bottlenecks and issues in pipelines and resolve them quickly with minimal disruption.

Required Skills & Experience

* 10+ years (Lead) or 7+ years (Developer) of hands-on experience designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets, including performance optimization techniques.
* Deep experience with modern transformation tools such as dbt in production environments.
* Strong expertise in cloud platforms, particularly Google Cloud Platform (GCP), with hands-on BigQuery experience.
* Familiarity with Data Mesh principles and distributed data architectures is mandatory.
* Proven ability to handle complex projects under tight deadlines while maintaining high quality standards.
* Exceptional problem-solving skills with a strong focus on delivering results.

What We Expect

This is a demanding role that requires:

1. A proactive mindset: you take initiative without waiting for instructions.
2. A commitment to excellence: no shortcuts or compromises on quality.
3. Accountability: you own your work end-to-end and deliver on time.
4. Attention to detail: precision matters; mistakes are not acceptable.

Location: Pan India
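For illustration only, a representative transformation task of the kind described above might look like the following Python sketch using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical, and in this role the same logic would typically be expressed as a dbt model rather than an ad hoc script:

```python
# Illustrative sketch only: project, dataset, and table names are hypothetical.
from google.cloud import bigquery

PROJECT = "my-gcp-project"                       # hypothetical GCP project id
SOURCE = f"{PROJECT}.raw.orders"                 # hypothetical raw source table
TARGET = f"{PROJECT}.marts.daily_order_totals"   # hypothetical data product table


def build_daily_totals() -> None:
    """Aggregate raw orders into a daily-totals mart (a dbt-style transformation in plain Python)."""
    client = bigquery.Client(project=PROJECT)
    job_config = bigquery.QueryJobConfig(
        destination=TARGET,
        # Rebuild the mart on each run, mirroring a full-refresh dbt model.
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    sql = f"""
        SELECT
            DATE(order_ts) AS order_date,
            SUM(amount)    AS total_amount,
            COUNT(*)       AS order_count
        FROM `{SOURCE}`
        GROUP BY order_date
    """
    job = client.query(sql, job_config=job_config)
    job.result()  # block until the transformation finishes


if __name__ == "__main__":
    build_daily_totals()
```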
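Likewise, a minimal sketch of the data-quality validation work, again with hypothetical names; in practice dbt tests or a dedicated monitoring framework would usually cover checks like this:

```python
# Illustrative data-quality sketch: table and column names are hypothetical.
from google.cloud import bigquery


def check_no_null_keys(client: bigquery.Client, table: str, key_col: str) -> None:
    """Fail loudly if the key column of a data product contains NULLs."""
    sql = f"SELECT COUNT(*) AS bad_rows FROM `{table}` WHERE {key_col} IS NULL"
    bad_rows = next(iter(client.query(sql).result())).bad_rows
    if bad_rows:
        raise ValueError(f"{table}: {bad_rows} rows with NULL {key_col}")


if __name__ == "__main__":
    client = bigquery.Client(project="my-gcp-project")  # hypothetical project
    check_no_null_keys(client, "my-gcp-project.marts.daily_order_totals", "order_date")
```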
Encora