Posted: 14 hours ago
Work from Office
Full Time
We are looking for an experienced and motivated Senior GCP Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and optimizing data pipelines, implementing advanced analytics solutions, and maintaining robust data infrastructure using Google Cloud Platform (GCP) services. You will play a key role in enabling data-driven decision-making and enhancing the performance and scalability of our data ecosystem.

Key Responsibilities:
- Design, implement, and optimize data pipelines using GCP services, including Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Lead the design and optimization of schemas for large-scale data systems, ensuring data consistency, integrity, and scalability.
- Work closely with cross-functional teams to understand data requirements and deliver efficient, high-performance solutions.
- Design and execute complex SQL queries for BigQuery and other databases, ensuring optimal performance and efficiency.
- Implement efficient data processing workflows and streaming data solutions using Cloud Pub/Sub and Dataflow.
- Develop and maintain data models, schemas, and data marts to ensure consistency and scalability across datasets.
- Ensure the scalability, reliability, and security of cloud-based data architectures.
- Optimize cloud storage, compute, and query performance, driving cost-effective solutions.
- Collaborate with data scientists, analysts, and software engineers to create actionable insights and drive business outcomes.
- Implement best practices for data management, including governance, quality, and monitoring of data pipelines.
- Mentor junior data engineers and collaborate with them to achieve team goals.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 5+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP).
- Extensive hands-on experience with GCP Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Strong expertise in SQL for query optimization and performance tuning on large-scale datasets.
- Solid experience designing data schemas, data pipelines, and ETL processes.
- Strong understanding of data modeling techniques, and experience with schema design for both transactional and analytical systems.
- Proven experience optimizing BigQuery performance, including partitioning, clustering, and cost optimization strategies.
- Experience managing and processing both streaming and batch data workflows.
- Knowledge of AlloyDB for managing transactional databases in the cloud and integrating them into data pipelines.
- Familiarity with data security, governance, and compliance best practices on GCP.
- Excellent problem-solving skills, with the ability to troubleshoot complex data issues and find efficient solutions.
- Strong communication and collaboration skills, with the ability to work with both technical and non-technical stakeholders.

Preferred Qualifications:
- Master's degree in Computer Science, Data Engineering, or a related field.
- Familiarity with infrastructure-as-code tools such as Terraform or Cloud Deployment Manager.
- GCP certifications (e.g., Google Cloud Professional Data Engineer or Professional Cloud Architect).
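To illustrate the kind of BigQuery partitioning, clustering, and cost optimization referred to above, here is a minimal GoogleSQL DDL sketch; the dataset, table, and column names are hypothetical:

```sql
-- Hypothetical daily-partitioned, clustered events table.
-- Partitioning by event date limits the bytes scanned per query (cost control);
-- clustering by customer_id speeds up queries that filter on that column.
CREATE TABLE my_dataset.events (
  event_ts    TIMESTAMP,
  customer_id STRING,
  payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
OPTIONS (partition_expiration_days = 90);
```

Queries that filter on `DATE(event_ts)` then prune whole partitions instead of scanning the full table, which is the main lever for BigQuery cost optimization on large datasets.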
UPS Supply Chain Solutions (UPS SCS)
Logistics and Supply Chain
40,000 Employees
219 Jobs