Posted: 6 days ago
Work from Office
Full Time
Responsibilities
- Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow).
- Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets.
- Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes.
- Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses.
- Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management.
- Apply security best practices, including IAM, secrets management, and vulnerability scanning.
- Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer.
- Automate infrastructure provisioning, configuration, and deployment using tools like Terraform, Ansible, or Cloud Deployment Manager.
- Build and manage CI/CD pipelines.
- Design, implement, and manage cloud infrastructure on Google Cloud Platform (GCP).
- Write clean, maintainable, and efficient code following best practices.

Requirements
- 8-12 years of experience in data engineering, with at least 3-5 years of hands-on experience specifically in Google Cloud Platform (GCP).
- BigQuery expertise (data modelling, optimization, security); advanced SQL proficiency with complex data transformations, windowing functions, and analytical querying (see the query sketch after this section).
- Ability to design and develop modular, maintainable SQL models using dbt best practices.
- Experience developing high-performing batch pipelines (see the pipeline sketch after this section).
- Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures.
- Experience with version control systems like Git and branching strategies.

DevOps:
- Strong scripting skills (Bash, Python, etc.).
- Proficiency in building and managing CI/CD pipelines.
- Exposure to monitoring and optimizing cloud resources for performance, cost, and scalability using tools like Stackdriver, Prometheus, or Grafana.
- Exposure to deployment tools like Terraform, Ansible, or Cloud Deployment Manager.
- Monitor system health, identify potential bottlenecks, and ensure high availability and disaster-recovery processes are in place.

General:
- Experience with Agile delivery methodologies (e.g., Scrum, Kanban).
- Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
- Conduct regular workshops, demos, and stakeholder reviews to showcase data solutions and capture feedback.
- Excellent communication and collaboration skills.
- Collaborate with development teams to streamline the software delivery process and improve system reliability.
- Mentor and upskill junior engineers and analysts on GCP tools; participate in continuous improvement and the transformation towards Agile, DevOps, and CI/CD as drivers of improved productivity.
- Ability to translate business objectives into data solutions, with a focus on delivering measurable business value.
- Flexibility to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
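For illustration, here is a minimal sketch of the kind of windowed, analytics-ready SQL transformation described above, run through the official google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders.

```python
# Minimal sketch: deduplicate orders to each customer's latest order using
# a window function -- a common analytics-ready modelling pattern.
# The project/dataset/table and column names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

sql = """
SELECT customer_id, order_id, order_ts, amount
FROM (
  SELECT
    customer_id,
    order_id,
    order_ts,
    amount,
    ROW_NUMBER() OVER (
      PARTITION BY customer_id
      ORDER BY order_ts DESC
    ) AS rn
  FROM `my-project.raw.orders`
)
WHERE rn = 1
"""

# Run the query and iterate over the deduplicated rows.
for row in client.query(sql).result():
    print(row.customer_id, row.order_id, row.amount)
```

The same SELECT could equally live as a dbt model, with dbt handling materialization and dependency ordering.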
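Similarly, a minimal Apache Beam sketch of the kind of batch pipeline that could be submitted to Dataflow; the bucket path, table spec, and schema are assumptions for illustration only.

```python
# Minimal Apache Beam batch pipeline sketch: read raw JSON lines from GCS,
# flatten them, and append to a BigQuery table. All resource names are
# hypothetical; pass --runner=DataflowRunner (plus project/region/temp
# location options) to execute on Dataflow instead of locally.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Turn one raw JSON line into a flat record matching the BQ schema."""
    event = json.loads(line)
    return {"user_id": event["user_id"], "action": event["action"]}


with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read raw files" >> beam.io.ReadFromText("gs://my-bucket/raw/*.json")
        | "Parse JSON" >> beam.Map(parse_event)
        | "Write to BigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="user_id:STRING,action:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```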