
189 dbt Jobs - Page 8

JobPe aggregates listings for easy access; you apply directly on the original job portal.

5 - 10 years

12 - 16 Lacs

Pune, Chennai, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake; experience with Snowflake data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe/SnowProc/SnowSQL (see the sketch after this listing).
3. Technical lead with a strong development background, including 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring; good working knowledge of ETL/ELT tools.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions; work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices; should be willing to work on implementation and support projects; flexible about onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements; ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.

Contact: Soniya, soniya05.mississippiconsultants@gmail.com. We are a recruitment firm based in Pune with various clients globally.
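For a sense of the hands-on Snowpipe/SnowSQL work this listing describes, here is a minimal sketch of bulk-loading staged files into Snowflake from Python. It assumes the snowflake-connector-python package; the account, credentials, stage, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: bulk-loading staged files into Snowflake from Python.
# COPY INTO is the load primitive that Snowpipe automates; running it via
# the Python connector is a common pattern in roles like this one.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="...",         # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO raw.orders
        FROM @raw.orders_stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
    print(cur.fetchall())   # one result row per loaded file
finally:
    conn.close()
```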

Posted 1 month ago

Apply

6 - 10 years

0 - 2 Lacs

Pune, Chennai, Bengaluru

Hybrid

Source: Naukri

Role & responsibilities: Senior Data Engineer
Business domain knowledge: SaaS, SFDC, NetSuite
Areas we support: Product, Finance (ARR reporting), GTM, Marketing, Sales
Tech stack: Fivetran, Snowflake, dbt, Tableau, GitHub
What we are looking for:
1. SQL and data modeling at an intermediate level: writes complex SQL queries, builds data models, and has experience with data transformation.
2. Problem solver: someone who can work through the ambiguity of an ask.
3. Bias for action: asks questions, reaches out to stakeholders, and comes up with solutions.
4. Communication: communicates effectively with stakeholders and team members.
5. Documentation: can create a BRD.
6. Someone well versed in Finance (ARR reporting) and/or GTM (sales and marketing) has an added advantage.
7. Experience with SaaS, NetSuite, and Salesforce is a plus.
8. Independent, self-starting, motivated, and experienced working in an onsite/offshore environment.
The keys are excellent communication, ownership, and working with stakeholders to drive requirements.

Posted 1 month ago

Apply

6 - 11 years

22 - 35 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Source: Naukri

Skill combination: Snowflake + (Python or DBT) + (AWS or Azure) + SQL + data warehousing
Location: Kolkata
Experience & CTC bands:
- Band 4B: 4 to 7 years, up to 21 LPA (fixed)
- Band 4C: 7 to 11 years, up to 28 LPA (fixed)
- Band 4D: 10 to 16 years, up to 35 LPA (fixed)

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python/DBT + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully versed in data warehousing concepts.
- Expertise in and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Experience/knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory.
- Experience implementing Snowflake best practices.
- Snowflake SnowPro Core certification is an added advantage.

Roles and responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, and working with an offshore team.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience with Python/PySpark integration with Snowflake and the cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (extract, transform, load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD type-2 (a sketch follows below).
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience deploying with CI/CD tools and with repositories such as Azure Repos or GitHub.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with good IT experience relevant to the Snowflake Data Engineer role.
Skill matrix: Snowflake, Python/PySpark, DBT, AWS/Azure, ETL concepts, and data warehousing concepts.
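Since the listing calls out CDC/SCD type-2 explicitly, here is a minimal illustrative sketch of the classic two-step SCD type-2 load, driven from Python with the Snowflake connector. The dimension and staging tables and their columns are hypothetical, and real implementations often use Streams or MERGE variants instead.

```python
# SCD type-2 sketch: expire changed dimension rows, then insert new versions.
# All table and column names are hypothetical placeholders.
import snowflake.connector

EXPIRE_CHANGED = """
UPDATE dim_customer d
SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
FROM stg_customer s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND (d.name <> s.name OR d.segment <> s.segment);
"""

INSERT_VERSIONS = """
INSERT INTO dim_customer
    (customer_id, name, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
    ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;  -- new keys plus the rows just expired
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholders
)
try:
    cur = conn.cursor()
    cur.execute("BEGIN")          # both steps commit atomically
    cur.execute(EXPIRE_CHANGED)
    cur.execute(INSERT_VERSIONS)
    cur.execute("COMMIT")
finally:
    conn.close()
```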

Posted 1 month ago

Apply

8 - 18 years

10 - 40 Lacs

Hyderabad, Pune, Delhi / NCR

Work from Office

Source: Naukri

Roles and responsibilities:
- Lead the development of data warehousing solutions using Snowflake on the Microsoft Azure platform.
- Collaborate with cross-functional teams to design, develop, test, and deploy large-scale data pipelines.
- Ensure high-quality project delivery by providing technical guidance and mentorship to junior team members.
- Participate in code reviews and ensure adherence to coding standards.

Job requirements:
- 8-18 years of experience building data warehouses using Snowflake on the Microsoft Azure platform.
- Strong expertise in developing complex SQL queries and in query performance tuning.
- Proficiency in building efficient ETL processes using tools such as Data Build Tool (dbt).
- Experience working with big-data technologies like Hadoop, Spark, and Kafka.

Posted 1 month ago

Apply

5 - 9 years

20 - 25 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Source: Naukri

We're looking for a motivated and detail-oriented Senior Snowflake Developer with strong SQL querying skills and a willingness to learn and grow with our team. As a Senior Snowflake Developer, you will play a key role in developing and maintaining our Snowflake data platform, working closely with our data engineering and analytics teams.

Responsibilities:
- Write optimized, performant ingestion pipelines
- Manage a team of junior software developers
- Write efficient and scalable SQL queries to support data analytics and reporting
- Collaborate with data engineers, architects, and analysts to design and implement data pipelines and workflows
- Troubleshoot and resolve data-related issues and errors
- Conduct code reviews and contribute to improving our Snowflake development standards
- Stay up to date with the latest Snowflake features and best practices

Requirements:
- 5+ years of experience with Snowflake
- Strong SQL querying skills, including data modeling, data warehousing, and ETL/ELT design
- Advanced understanding of data engineering principles and practices
- Familiarity with Informatica Intelligent Cloud Services (IICS) or similar data integration tools is a plus
- Excellent problem-solving skills, attention to detail, and an analytical mindset
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams

Nice to have:
- Experience using Snowflake Streamlit and Cortex
- Knowledge of data governance, data quality, and data security best practices
- Familiarity with Agile development methodologies and version control systems like Git
- Certification in Snowflake or a related data platform

Posted 1 month ago

Apply

5 - 7 years

15 - 25 Lacs

Pune, Mumbai (All Areas)

Hybrid

Source: Naukri

DUTIES AND RESPONSIBILITIES:
- Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers on platforms such as Azure, Salesforce, and AWS.
- Monitor active ETL jobs in production.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.
- Assist with design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
- Assess current and future data transformation needs to recommend, develop, and train teams on new data integration tools.
- Discover efficiencies in shared data processes and batch schedules to eliminate redundancy and ensure smooth operations.
- Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the environment for current and future batch jobs (a small example follows below).

SUPERVISORY RESPONSIBILITIES: This job has no supervisory responsibilities.

QUALIFICATIONS:
- Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field, AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering.
- 3-5 years of experience with strong SQL query/development skills.
- Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
- Experience working in the healthcare industry with PHI/PII.
- Hands-on experience developing and implementing large-scale data warehouses, business intelligence, and MDM solutions, including data lakes/data vaults.
- Creative, lateral, and critical thinker; excellent communicator; well-developed interpersonal skills.
- Good at prioritizing tasks and time management.
- Ability to describe, create, and implement new solutions.
- Experience with related or complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
- Knowledge of, or hands-on experience with, BI tools and reporting software (e.g., Cognos, Power BI, Tableau).
- Big-data stack experience (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume).
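The "checks and balances" duty above usually starts with simple batch validations. A toy, pure-Python illustration; the key column and rules are hypothetical, not from the posting:

```python
# Toy data-quality check: validate a loaded batch before downstream
# jobs consume it. Keys and rules are hypothetical placeholders.
from typing import Any

def check_batch(rows: list[dict[str, Any]]) -> list[str]:
    """Return human-readable data-quality failures for one batch."""
    if not rows:
        return ["empty batch"]
    failures: list[str] = []
    null_keys = sum(1 for r in rows if r.get("member_id") is None)
    if null_keys:
        failures.append(f"{null_keys} rows with NULL member_id")
    ids = [r["member_id"] for r in rows if r.get("member_id") is not None]
    if len(ids) != len(set(ids)):
        failures.append(f"{len(ids) - len(set(ids))} duplicate member_id values")
    return failures

print(check_batch([{"member_id": 1}, {"member_id": 1}, {"member_id": None}]))
# -> ['1 rows with NULL member_id', '1 duplicate member_id values']
```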

Posted 1 month ago

Apply

3 - 8 years

6 - 15 Lacs

Hyderabad

Work from Office

Source: Naukri

Novastrid is hiring an experienced Data Engineer for a leading Tier-1 company in Hyderabad. If you're passionate about building robust, scalable data systems and working with cutting-edge big data technologies, this is your opportunity to work with one of the best in the industry.

Role & responsibilities:
- Design and implement scalable, high-performance batch and real-time data pipelines using Apache Spark, Kafka, Java, and SQL
- Build and maintain ETL/ELT frameworks handling structured, semi-structured, and unstructured data
- Work on streaming data solutions using Spark Structured Streaming and Kafka (a sketch follows below)
- Develop and optimize data models; implement data warehousing solutions on AWS/Azure/GCP
- Automate and orchestrate workflows using Apache Airflow, DBT, or equivalent tools
- Collaborate with cross-functional teams (Data Science, Product, Engineering)
- Monitor, troubleshoot, and ensure the reliability of data systems
- Follow best practices in data governance, security, and cloud cost optimization

Preferred candidate profile:
- 3 to 8 years of hands-on experience in data engineering / big data development
- Strong expertise in Apache Spark, Kafka, Java (production-grade experience), and advanced SQL; Python/Scala optional but a plus
- Experience with cloud platforms (AWS/Azure/GCP)
- Familiarity with Git, CI/CD pipelines, and modern DataOps practices

Good to have:
- Experience with NoSQL (MongoDB, Cassandra, DynamoDB)
- Exposure to Docker and Kubernetes
- Domain experience in banking / FinTech / financial services

Educational qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field
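To give the Structured Streaming bullet some flavor, here is a minimal PySpark sketch that tails a Kafka topic into a Parquet sink. It assumes a Spark runtime with the spark-sql-kafka package on the classpath; the broker, topic, and paths are hypothetical placeholders. (The listing emphasizes Java; the equivalent Java DataFrame API is nearly identical.)

```python
# Minimal Structured Streaming sketch: Kafka topic -> Parquet sink.
# Requires the spark-sql-kafka connector package; names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers raw bytes; cast the value payload to string for parsing.
events = raw.select(F.col("value").cast("string").alias("json"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/tmp/orders/")
         .option("checkpointLocation", "/tmp/chk/orders/")  # exactly-once bookkeeping
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```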

Posted 1 month ago

Apply

4 - 6 years

12 - 15 Lacs

Hyderabad

Remote

Source: Naukri

Job summary: We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.

Key responsibilities:
1. Data modeling & architecture: design and maintain conceptual, logical, and physical data models for structured and unstructured data.
2. SAP ECC data integration: define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
3. Automotive domain modeling: develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
4. Databricks & Delta Lake optimization: design efficient data models for Delta Lake storage and Databricks processing (see the sketch after this listing).
5. Performance tuning: optimize data structures, indexing, and partitioning strategies for performance and scalability.
6. Metadata & data governance: implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
7. Collaboration: work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
8. Documentation: create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation-logic documentation.

Skills & qualifications:
- Data modeling expertise: strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
- Automotive industry knowledge: understanding of customer, vehicle, material, and dealership data models.
- SAP ECC data structures: hands-on experience with SAP ECC tables, business objects, and extraction processes.
- Azure & Databricks proficiency: experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
- SQL & database management: strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
- ETL & data integration: experience collaborating with data engineering teams on data transformation and ingestion processes.
- Data governance & quality: understanding of data governance principles, lineage tracking, and master data management (MDM).
- Strong documentation skills: ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred qualifications:
- Experience with data modeling tools such as Erwin or Lucidchart, or with dbt.
- Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
- Familiarity with Kafka/Event Hub for real-time data streaming.
- Exposure to Power BI/Tableau for data visualization and reporting.
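To make responsibility 4 concrete, a brief PySpark sketch of writing a partitioned Delta table and then clustering it with Databricks' OPTIMIZE/ZORDER, the kind of storage-layout decision this role owns. It assumes a Databricks runtime with Delta Lake; the paths, table, and columns are hypothetical.

```python
# Sketch: a partitioned Delta table for vehicle telemetry on Databricks.
# Paths, schema, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.parquet("/mnt/raw/vehicle_events/")

(df.write
   .format("delta")
   .mode("overwrite")
   .partitionBy("event_date")          # coarse partitions enable pruning
   .option("overwriteSchema", "true")
   .saveAsTable("analytics.vehicle_events"))

# Z-ORDER co-locates rows within partitions for selective lookups.
spark.sql("OPTIMIZE analytics.vehicle_events ZORDER BY (vehicle_id)")
```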

Posted 1 month ago

Apply

- 2 years

3 - 8 Lacs

Lucknow

Hybrid

Source: Naukri

Develop and maintain scalable data pipelines. Collaborate with data scientists and analysts to support business needs. Work with cloud platforms like AWS, Azure, or Google Cloud. Work effectively with cross-functional teams. Data modelling.

Posted 1 month ago

Apply

4 - 9 years

0 - 0 Lacs

Bengaluru

Remote

Source: Naukri

Hi,

Synergy Technologies is a leader in technology services and consulting. We enable clients across the world to create and execute strategies, help them find the right problems to solve, and solve those problems effectively. We bring our expertise and innovation to every project we undertake.

Position: Business Intelligence Developer
Duration: Contract to full time
Location: Remote work

Business Intelligence Developer opportunity

Our mission is clear: to enhance the safety and well-being of workers across the globe. As a trailblazer in software solutions, we empower businesses and their suppliers with a platform that champions safety, sustainability, and risk management within supply chains. Join our close-knit team of Data Systems and Internal Business Intelligence experts, where you can live out our core values daily and contribute to impactful projects that further the company's vision.

About the role: As a Business Intelligence Developer, you will play a critical role in developing impactful business intelligence solutions that empower internal teams with data-driven insights for strategic decision-making. Working closely with business analysts, data engineers, and stakeholders, you'll design and build data models, interactive reports, and dashboards to transform complex data into clear, actionable insights. Your efforts will ensure data quality, accuracy, and governance while enhancing accessibility for business users.

Key responsibilities:
- Develop BI solutions: design, develop, and implement data models, dashboards, and reports using Power BI to support data-driven initiatives.
- Data modeling & integration: collaborate with data engineers and analysts to create optimized data models that aggregate data from multiple sources, ensuring scalability and alignment with business needs.
- Enhance data accuracy: continuously improve data accuracy, standardize key metrics, and refine reporting processes to drive operational efficiency.
- Ensure data governance: adhere to the company's data governance policies, ensuring that all BI solutions comply with data security standards, especially for sensitive information.
- Optimize BI performance: monitor BI solutions to ensure performance and reliable data access, implementing enhancements as needed.
- Documentation & user support: maintain comprehensive documentation of dashboards, data models, and processes; provide end-user training to maximize tool effectiveness.
- Adapt and innovate: stay informed on BI best practices and emerging technologies to proactively enhance BI capabilities.

Qualifications:
- Education: Bachelor's degree in Data Science, Business Analytics, Computer Science, or a related field.
- Experience: minimum of 5 years in business intelligence development, including data modeling, reporting, and dashboard creation.
- Power BI expertise: strong experience with Power BI, including advanced DAX calculations, data modeling, and creating visually engaging, actionable dashboards.
- dbt Labs Cloud IDE: at least 1 year of hands-on experience with the dbt Labs Cloud IDE is required.
- Technical skills: proficiency in SQL and modern cloud-based data warehousing concepts, with experience in Snowflake, SQL Server, or Redshift.
- Cloud and ERP/CRM proficiency: familiarity with platforms such as NetSuite, Salesforce, Fivetran, and API integrations; experience with SaaS systems like Zuora Billing, Churn Zero, Marketo, and Qualtrics is a plus.
- Communication skills: ability to translate technical insights into business-friendly language.

Preferred skills:
- Certifications: Power BI, Snowflake, or similar BI tools.
- Portfolio: ability to provide redacted samples of Power BI dashboards.
- SaaS experience: background in SaaS organizations is beneficial.

Posted 1 month ago

Apply

10 - 18 years

20 - 35 Lacs

Hyderabad

Hybrid

Source: Naukri

Job summary: We are looking for an experienced and highly skilled Senior Python Developer with strong hands-on expertise in Snowflake to join our growing data engineering team. The ideal candidate will have a solid background in building scalable data pipelines, data modeling, and integrating Python-based solutions with Snowflake.

Roles and responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using Python and Snowflake (an illustrative sketch follows below).
- Collaborate with data architects and analysts to understand data requirements and translate them into technical solutions.
- Write complex SQL queries and stored procedures in Snowflake.
- Optimize Snowflake performance using best practices for data modeling, partitioning, and caching.
- Develop and deploy Python-based ETL/ELT processes.
- Integrate Snowflake with other data sources, APIs, or BI tools.
- Implement and maintain CI/CD pipelines for data solutions.
- Ensure data quality, governance, and security standards are maintained.

Required skills and qualifications:
- Strong programming skills in Python with a focus on data processing and automation.
- Hands-on experience with Snowflake, including SnowSQL, Snowpipe, data sharing, and performance tuning.
- Proficiency in SQL and working with large, complex datasets.
- Experience designing and implementing ETL/ELT pipelines.
- Strong understanding of data warehousing concepts and data modeling (star/snowflake schema).
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Experience with version control (e.g., Git) and CI/CD tools.
- Excellent problem-solving skills and attention to detail.

Preferred qualifications:
- Experience with Apache Airflow, DBT, or other workflow orchestration tools.
- Knowledge of data security and compliance standards.
- Experience integrating Snowflake with BI tools (Tableau, Power BI, etc.).
- Certification in Snowflake or relevant cloud platforms is a plus.
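As an illustration of the Python-plus-Snowflake pipelines this role centers on, here is a small ELT sketch that pulls JSON from an HTTP API and lands it in Snowflake via the connector's write_pandas helper. It assumes a recent snowflake-connector-python with pandas support; the endpoint, credentials, and table name are hypothetical placeholders.

```python
# Sketch of a small Python ELT step: pull from an API, land in Snowflake.
# Endpoint, credentials, and table names are hypothetical placeholders.
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

resp = requests.get("https://api.example.com/v1/invoices", timeout=30)
resp.raise_for_status()
df = pd.json_normalize(resp.json()["results"])   # flatten nested JSON

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
try:
    # write_pandas bulk-loads through an internal stage under the hood.
    ok, n_chunks, n_rows, _ = write_pandas(
        conn, df, "INVOICES_RAW", auto_create_table=True
    )
    print(f"loaded={ok} rows={n_rows}")
finally:
    conn.close()
```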

Posted 1 month ago

Apply

8 - 12 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid

Source: Naukri

Role: Snowflake Developer
Experience: 8 - 12 years
- Expert in Python, Snowflake, SQL, and GitHub
- Experience in Dagster or Airflow is a must (a minimal Airflow sketch follows below)
- Should be able to grasp the landscape quickly to test and approve merge requests from data engineers
- Data modelling and architecture-level knowledge is needed
- Should be able to establish connectivity from different source systems, such as SAP and Beeline, to the existing setup and take ownership of it
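Since Dagster or Airflow experience is a must here, this is the general shape such orchestration takes, sketched with the Airflow 2.x TaskFlow API. The schedule, task bodies, and names are hypothetical placeholders, not details from the posting.

```python
# Minimal Airflow 2.x TaskFlow sketch of a daily Snowflake ELT DAG.
# Schedule, task bodies, and names are hypothetical placeholders.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def snowflake_elt():
    @task
    def extract() -> str:
        # e.g. land raw files in a stage; return a batch id for downstream tasks
        return "batch-001"

    @task
    def transform(batch_id: str) -> None:
        # e.g. run dbt models or MERGE statements against Snowflake
        print(f"transforming {batch_id}")

    transform(extract())  # wires the dependency: extract >> transform

snowflake_elt()  # instantiates the DAG for the scheduler to pick up
```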

Posted 1 month ago

Apply

7 - 10 years

17 - 25 Lacs

Pune, Chennai, Bengaluru

Work from Office

Source: Naukri

Data Engineer - DBT, Snowflake, Looker
Location: Remote
Experience: 7 - 10 years

About the role: We are looking for an experienced Data Engineer to design and build scalable data pipelines and enable powerful business insights. You'll work with modern data stack tools like DBT, Snowflake, and Looker to empower data-driven decisions.

Key responsibilities:
- Design and maintain scalable data pipelines (DBT, Snowflake)
- Perform data transformation, cleansing, and enrichment
- Integrate data from multiple sources into a data warehouse/data lake
- Support reporting, analytics, and BI with Looker or similar tools
- Optimize performance and troubleshoot data workflows
- Document processes and ensure data quality

Skills required:
- DBT, Snowflake, Looker (or similar tools)
- Strong SQL and Python (or similar scripting)
- Data modeling, schema design, and database optimization
- Problem-solving and business-requirement translation
- Excellent communication and cross-functional collaboration

Drop your resume at: bhavikas@overturerede.com
Contact: 7428694900
We're hiring! Don't miss the chance to work with cutting-edge data platforms and make an impact. Reach out now!

Posted 1 month ago

Apply

4 - 8 years

0 - 1 Lacs

Mohali

Work from Office

Source: Naukri

Job title: Snowflake Developer (4+ years' experience)
Location: F, 384, Sector 91 Rd, Phase 8B, Industrial Area, Sector 91, Sahibzada Ajit Singh Nagar, Punjab 160055
Job type: Full-time (in-house)

Job overview: We are looking for an experienced Snowflake Developer with 4+ years of hands-on experience with Snowflake Data Warehouse and related tools. You will be responsible for building, managing, and optimizing Snowflake data pipelines, assisting in data integration, and contributing to the overall data architecture. The ideal candidate should have a strong understanding of data modeling and ETL processes, and experience working with cloud-based data platforms.

Responsibilities:
- Design, develop, and maintain Snowflake data warehouses.
- Create and manage Snowflake schemas, tables, views, and materialized views.
- Implement ETL processes to integrate data from various sources into Snowflake.
- Optimize query performance and data storage in Snowflake.
- Work with stakeholders to define data requirements and provide technical solutions.
- Collaborate with data engineers, data scientists, and analysts to build efficient data pipelines.
- Monitor and troubleshoot performance issues in Snowflake environments.
- Automate repetitive data processes and report-generation tasks.
- Ensure data integrity, security, and compliance with data governance policies.
- Assist in data migration and platform upgrades.

Required skills:
- 4+ years of experience working with Snowflake Data Warehouse.
- Proficiency in SQL, SnowSQL, and ETL processes.
- Strong experience in data modeling and schema design in Snowflake.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data pipelines, data lakes, and data integration tools.
- Experience in query optimization and performance tuning in Snowflake.
- Understanding of data governance and best practices.
- Strong knowledge of data security and privacy policies in a cloud environment.
- Experience with tools like dbt, Airflow, or similar orchestration tools is a plus.

Salary: no bar for deserving candidates.
Location: Mohali, Punjab (work from office)
Shift: night shift

Other benefits:
- 5-day work week
- US-based work culture and environment
- Indoor and outdoor events
- Paid leave
- Health insurance
- Employee engagement activities such as month-end and festival celebrations, team outings, and birthday celebrations
- Gaming and sports area

Please comment/DM to know more, or e-mail your resume to priyankaaggarwal@sourcemash.com.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
