Posted: 1 week ago
Work from Office
Full Time
Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes focused on data transformation using SQL, Python, and DBT.
- Develop complex SQL queries for data extraction, transformation, and loading (ETL), with a focus on aggregation, cleansing, and modeling.
- Leverage Python to automate transformation tasks and implement custom logic in data workflows.
- Build and manage DBT models for reusable, maintainable, and scalable data pipelines.
- Assemble reusable and scalable datasets aligned with business needs.
- Ensure data accuracy, consistency, and completeness through thorough validation, testing, and documentation.
- Develop and maintain data storage solutions and implement transformation logic for analysis and reporting.
- Collaborate with cross-functional stakeholders to understand data requirements and provide actionable insights.
- Troubleshoot and resolve data-related technical issues, and identify opportunities to improve data quality and reliability.
- Document technical specifications and data transformation processes.
- Adhere to information security policies and ensure data compliance with PII, GDPR, and other relevant regulations.

Required Skills & Experience
- Hands-on experience with Google Cloud Platform (GCP) and Google BigQuery.
- Strong expertise in SQL, database design, and query optimization.
- Proven experience in designing and implementing ETL pipelines and data transformation flows.
- Technical proficiency in data modeling, data mining, and segmentation techniques.
- Excellent numerical, analytical, and problem-solving skills.
- Strong attention to detail and a critical mindset for evaluating information.
- Effective stakeholder management and the ability to handle multiple priorities.
- Excellent verbal and written communication skills.
- Ability to self-manage workload and work collaboratively within a team.

Desirable Skills
- Proficiency in SQL and Python programming.
- Hands-on experience with DBT and version control tools such as GitLab.
- Familiarity with complex financial data models.
- Experience with data visualization tools such as Tableau or Google Data Studio.
- Exposure to Salesforce and Salesforce Einstein Analytics.
- Understanding of Agile/Scrum methodologies.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- A data engineering certification and basic knowledge of AI/ML fundamentals are a plus.
Hyderabad
10.0 - 19.0 Lacs P.A.