Experience: 7.0 - 12.0 years
Salary: 0 Lacs
Location: Pune, Maharashtra
Work mode: On-site
As a GCP DBT Manager, your primary responsibility will be to collaborate with the team in designing, building, and maintaining data pipelines and transformations using Google Cloud Platform (GCP) and the Data Build Tool (dbt). The role involves tools such as BigQuery, Cloud Composer, and Python, and requires strong SQL skills and knowledge of data warehousing concepts. You will also play a crucial role in ensuring data quality, optimizing performance, and working closely with cross-functional teams.

Your key responsibilities will include:

Data Pipeline Development:
- Designing, building, and maintaining ETL/ELT pipelines using dbt and GCP services such as BigQuery and Cloud Composer (a minimal orchestration sketch follows this description).

Data Modeling:
- Creating and managing data models and transformations with dbt to ensure efficient and accurate data consumption for analytics and reporting.

Data Quality:
- Developing and maintaining a data quality framework, including automated testing and cross-dataset validation.

Performance Optimization:
- Writing and optimizing SQL queries to improve data processing efficiency within BigQuery.

Collaboration:
- Working with data engineers, analysts, data scientists, and business stakeholders to deliver effective data solutions.

Incident Resolution:
- Supporting day-to-day incident and ticket resolution related to data pipelines.

Documentation:
- Creating and maintaining comprehensive documentation for data pipelines, configurations, and procedures.

Cloud Platform Expertise:
- Leveraging GCP services such as BigQuery, Cloud Composer, and Cloud Functions for efficient data operations.

Scripting:
- Developing and maintaining SQL/Python scripts for data ingestion, transformation, and automation tasks.

Preferred Candidate Profile:

Requirements:
- 7-12 years of experience in data engineering or a related field.
- Strong hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery.
- Proficiency in using dbt for data transformation, testing, and documentation.
- Advanced SQL skills for data modeling, performance optimization, and querying large datasets.
- Understanding of data warehousing concepts, dimensional modeling, and star schema design.
- Experience with ETL/ELT tools and frameworks such as Apache Beam, Cloud Dataflow, Data Fusion, or Airflow/Composer.

In this role, you will be at the forefront of data pipeline development and maintenance, ensuring data quality, performance optimization, and effective collaboration across teams to deliver impactful data solutions using GCP and dbt.
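For illustration only, below is a minimal sketch of the kind of pipeline orchestration the role describes: a Cloud Composer (Airflow 2.x) DAG that runs dbt models against BigQuery and then runs dbt tests for data quality. The DAG id, schedule, and project path are hypothetical assumptions, not part of the posting.

    # Minimal Cloud Composer (Airflow 2.x) DAG sketch: run dbt, then dbt tests.
    # DAG id, schedule, and dbt project path are hypothetical assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_daily_run",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Build dbt models in BigQuery.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /home/airflow/gcs/dags/dbt_project && dbt run --profiles-dir .",
        )
        # Run dbt tests as an automated data quality gate.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /home/airflow/gcs/dags/dbt_project && dbt test --profiles-dir .",
        )
        dbt_run >> dbt_test

In practice, the dbt invocation, profiles, and BigQuery connection details would follow the team's own project layout and deployment conventions.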
Posted 1 week ago