
731 DBT Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

18 - 30 Lacs

Kolkata, Chennai, Coimbatore

Work from Office

Mandate Skills: Snowflake, SQL/DBT/Python/AWS/GCP/Azure, dimensional data modeling. Responsibilities: Design, develop, and maintain Snowflake databases and data warehouse solutions. Build and optimize SQL queries for data processing and reporting. Experience with Snowflake tools such as Snowpipe, Time Travel, and Cloning. Collaborate with cross-functional teams to implement data models and pipelines. Ensure data security, quality, and compliance in Snowflake environments. Monitor and troubleshoot Snowflake and SQL systems for performance issues.
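The dimensional data modeling this posting asks for is the classic star-schema pattern. As a minimal sketch (using Python's built-in SQLite in place of Snowflake; the table and column names are invented for illustration), a reporting query joins a fact table to its dimension and aggregates:

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension.
# Table and column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (101, 1, 9.5), (102, 1, 20.0), (103, 2, 5.0);
""")

# Aggregate fact rows by a dimension attribute -- the typical reporting query shape.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('Widget', 29.5), ('Gadget', 5.0)]
```

In Snowflake the query shape would be the same; Snowpipe, Time Travel, and Cloning operate around tables like these rather than changing how they are queried.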

Posted 1 week ago

Apply

5.0 - 10.0 years

18 - 30 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Mandate Skills: Snowflake, SQL/DBT/Python/AWS/GCP/Azure, dimensional data modeling. Responsibilities: Design, develop, and maintain Snowflake databases and data warehouse solutions. Build and optimize SQL queries for data processing and reporting. Experience with Snowflake tools such as Snowpipe, Time Travel, and Cloning. Collaborate with cross-functional teams to implement data models and pipelines. Ensure data security, quality, and compliance in Snowflake environments. Monitor and troubleshoot Snowflake and SQL systems for performance issues.

Posted 1 week ago

Apply

8.0 - 12.0 years

18 - 20 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe/SnowProc/SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solut...

Posted 1 week ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Role & responsibilities Minimum 5 years of experience in Data & Analytics with a strong focus on data transformation and modeling. Hands-on expertise in DBT (Core and Cloud) including model development, testing, and deployment. Proficiency in Snowflake including data warehousing concepts, performance tuning, and integration with DBT. Design, develop, and maintain DBT models for data transformation across cloud and on-premise environments. Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and deliver scalable solutions. Optimize and refactor existing DBT pipelines for performance, maintainability, and scalability. Integrate DBT workflows with...

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Gurugram

Remote

About Straive: Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions across multiple domains. Data Analytics & AI Solutions, Data AI-Powered Operations, and Education & Learning form the core pillars of the company's long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers. Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with hu...

Posted 1 week ago

Apply

3.0 - 7.0 years

3 - 10 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities: Help move products through the development lifecycle, from proofs of concept all the way to final, production-ready products. Engineer, build, and maintain scalable automated data pipelines. Implement best practices in the management of data, including data processing, data quality, and lineage. Manage code repositories and code deployments using Git. Automate, maintain, and manage systems to ensure the availability, performance, and scalability of the product. Support regular and ad-hoc data querying and analysis. Qualifications: 3-5 years of experience in Python and data-focused packages, e.g., Pandas, NumPy; interacting with JSON, XML, CSV, and TSV formatted data. 3-5 years of experience with snow...
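The JSON/CSV interchange mentioned in the qualifications can be sketched with the Python standard library alone; the file contents here are invented and inlined so the example is self-contained:

```python
# Stdlib-only sketch of CSV-to-JSON conversion; data is inlined
# (in practice it would come from files or an API response).
import csv, io, json

csv_text = "id,name\n1,Asha\n2,Ravi\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

payload = json.dumps(rows)  # CSV rows -> JSON string
assert json.loads(payload)[0]["name"] == "Asha"
print(payload)
```

Pandas would handle the same round trip (read_csv / to_json) with type inference on top; DictReader keeps every field as a string.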

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You will be responsible for managing and creating design schema, SQL query tuning, and conducting code reviews. With a minimum of 5 years of professional experience in data engineering, you should have a strong understanding of data platform and DWH development. Additionally, you should possess at least 2 years of hands-on experience in SSAS tabular models and have designed Data Ingestion and Orchestration Pipelines using Kafka-Snowflake and Python. Your role will involve designing and developing DBT models, as well as building data pipeline processes in accordance with DBT scripting. You should be familiar with analytical tools such as SQL, ETL/ELT (Azure Data Factory, APIs), Cloud services...

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

You will be working at Paras Twin Tower, Gurgaon as a full-time employee for Falcon, a Series-A funded cloud-native, AI-first banking technology & processing platform. Falcon specializes in assisting banks, NBFCs, and PPIs to efficiently launch cutting-edge financial products like credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since its inception in 2022, Falcon has processed over USD 1 billion in transactions, collaborated with 12 of India's top financial institutions, and generated revenue exceeding USD 15 million. The company is supported by prominent investors from Japan, the USA, and leading Indian ventures and banks. To gain more insights about Falcon, vis...

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer, you will be responsible for building, maintaining, and optimizing scalable data pipelines that support enterprise data ingestion, ETL workflows, and cloud data warehousing for handling large-scale datasets. You will primarily work with AWS Data Technologies, Redshift, Snowflake, and DBT to ensure efficient data processing and storage. Your main duties will include designing and implementing data pipelines that enable the seamless flow of data across various systems, transforming raw data into valuable insights for the organization. You will collaborate with cross-functional teams to understand data requirements and develop solutions that meet business needs. In this role,...

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a Data Engineer at Vista IT Solutions (VITSPL), you will be a key part of the team responsible for developing the next generation of cloud-native data infrastructure for leading platforms in the U.S. Your primary mission will be to create scalable, production-ready data systems that drive real-time analytics, AI-driven insights, and critical applications. Your role will involve designing, constructing, and managing data pipelines utilizing tools such as Airflow and DBT. You will collaborate with cloud data warehouses like Snowflake, BigQuery, or Redshift and integrate event-driven systems and APIs into data workflows. In addition, you will be responsible for modeling data using the medall...
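The medallion modeling mentioned above layers data as bronze (raw), silver (cleaned), and gold (aggregated). A minimal plain-Python sketch, with invented data and the warehouse and orchestration pieces (Airflow, DBT, Snowflake/BigQuery/Redshift) left out:

```python
# Hedged sketch of medallion (bronze/silver/gold) layering in plain Python.
# In a real pipeline each layer would be a warehouse table materialized
# by Airflow + DBT; the records here are invented.

bronze = [  # raw events, as ingested -- duplicates and bad rows included
    {"user": "a", "amount": "10"},
    {"user": "a", "amount": "10"},    # duplicate
    {"user": "b", "amount": "oops"},  # malformed
    {"user": "b", "amount": "5"},
]

def to_silver(rows):
    """Silver: deduplicate and enforce types, dropping rows that fail."""
    seen, out = set(), []
    for r in rows:
        key = (r["user"], r["amount"])
        if key in seen:
            continue
        seen.add(key)
        try:
            out.append({"user": r["user"], "amount": float(r["amount"])})
        except ValueError:
            pass  # a real pipeline would quarantine these rows
    return out

def to_gold(rows):
    """Gold: business-level aggregate, e.g. spend per user."""
    totals = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'a': 10.0, 'b': 5.0}
```

The design point is that each layer is reproducible from the one below it, so a bad transformation can be fixed and replayed without re-ingesting the source.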

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You are a highly skilled and motivated Data Engineer who will be responsible for designing & developing data transformations and data models to ensure reliable and efficient data processing and analysis. You will work closely with cross-functional teams to support data-driven decision-making processes and contribute to the overall success of insights teams. Your key proficiency and responsibilities include expertise in DBT (Data Build Tool) for data transformation and modeling, proficiency in Snowflake, a strong understanding of data architecture, modeling, and data warehousing best practices, designing, developing, and maintaining robust data pipelines using DBT and Snowflake, implementing ...

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

About AutoZone: AutoZone is the nation's leading retailer and a leading distributor of automotive replacement parts and accessories with more than 6,000 stores in US, Puerto Rico, Mexico, and Brazil. Each store carries an extensive line for cars, sport utility vehicles, vans and light trucks, including new and remanufactured hard parts, maintenance items and accessories. We also sell automotive diagnostic and repair software through ALLDATA, diagnostic and repair information through ALLDATAdiy.com, automotive accessories through AutoAnything.com and auto and light truck parts and accessories through AutoZone.com. Since opening its first store in Forrest City, Ark. on July 4, 1979, the compan...

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Developer (2-4 years) in Snowflake Engineering with proficiency in SQL, DBT, Python, and data quality, you will have the opportunity to enhance and fine-tune Snowflake data solutions. Your responsibilities will include working on ETL/ELT tasks, DBT pipelines, and data quality procedures that underpin large-scale analytics and AI projects. Your primary tasks will involve constructing DBT models and Snowflake transformations, crafting optimized SQL code for ETL/ELT pipelines, executing data validation and quality assessments, as well as providing support for schema modifications and incremental pipelines. The ideal candidate for this role should possess practical experience with SQL and DB...

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As a Data Engineer, you will be responsible for building scalable data pipelines using PySpark. Your role will involve implementing complex business logic using Spark SQL, DataFrame, and RDD APIs. You should have strong programming skills in Python, with a solid understanding of data structures, algorithms, and software engineering principles. Your expertise in designing, developing, and maintaining batch and streaming data pipelines will be crucial. You should be familiar with ETL/ELT processes and best practices for data transformation, data quality, and performance optimization. Knowledge of the modern data engineering ecosystem, including distributed data processing, storage systems, and...
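The Spark SQL / DataFrame logic described above boils down to a filter/group/aggregate shape. PySpark needs a Spark runtime, so this is a plain-Python stand-in; with PySpark the equivalent would be roughly df.filter(...).groupBy("country").agg(F.sum("amount")), and the records below are invented:

```python
# Pure-Python stand-in for a Spark-style filter/group/aggregate pipeline.
# Record shape and values are invented for illustration.
from collections import defaultdict

records = [
    {"country": "IN", "amount": 100, "valid": True},
    {"country": "IN", "amount": 50,  "valid": True},
    {"country": "US", "amount": 70,  "valid": False},  # filtered out
    {"country": "US", "amount": 30,  "valid": True},
]

# filter -> group -> aggregate: the core shape of most batch jobs
valid = (r for r in records if r["valid"])
totals = defaultdict(int)
for r in valid:
    totals[r["country"]] += r["amount"]

print(dict(totals))  # {'IN': 150, 'US': 30}
```

Spark's value is running this same shape partitioned across a cluster; the logic a candidate writes in DataFrame or RDD APIs is the distributed version of this loop.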

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, we transform ideas into impact by bringing together data, science, technology and human ingenuity to deliver better outcomes for all. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their busines...

Posted 1 week ago

Apply

3.0 - 8.0 years

1 - 2 Lacs

Hyderabad

Work from Office

Position: ETL Engineer. Location: Gachibowli, Telangana. Employment Type: Full-time / Contract. Experience: 3-10 Years. Required Skills: Hands-on experience with Fivetran (connectors, transformations, monitoring). Strong working knowledge of Matillion ETL (jobs, orchestration, transformation, performance tuning). Proficiency in SQL (complex queries, optimization, stored procedures). Experience with cloud data warehouses (Snowflake, Redshift, BigQuery, or Azure Synapse). Familiarity with data modelling techniques (star schema, snowflake schema, slowly changing dimensions). Exposure to Qlik Sense or other BI platforms (dashboard integration, data prep). Strong problem-solving skills and attention ...
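Among the modelling techniques this posting lists, slowly changing dimensions are the least self-explanatory. A hedged sketch of a Type 2 update (close out the old row, insert a new current one, so history is preserved), with hypothetical column names:

```python
# Illustrative Type 2 slowly changing dimension (SCD2) update.
# Column names (valid_from, valid_to, is_current) are a common
# convention but hypothetical here; warehouses implement this
# with MERGE statements rather than Python.
from datetime import date

dim = [
    {"customer": "c1", "city": "Pune", "valid_from": date(2020, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim_rows, customer, new_city, as_of):
    for row in dim_rows:
        if row["customer"] == customer and row["is_current"]:
            if row["city"] == new_city:
                return dim_rows       # attribute unchanged: no-op
            row["valid_to"] = as_of   # close out the old version
            row["is_current"] = False
    dim_rows.append({"customer": customer, "city": new_city,
                     "valid_from": as_of, "valid_to": None,
                     "is_current": True})
    return dim_rows

apply_scd2(dim, "c1", "Mumbai", date(2024, 6, 1))
current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["city"])  # 2 Mumbai
```

Contrast with Type 1, which simply overwrites the city and loses the history; fact rows dated before 2024-06-01 still join to the Pune version here.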

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role & responsibilities (Snowflake JD):
• Must have 8-15 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5+ years of experience in Snowflake and 3+ in DBT
• Expertise in Snowflake architecture is a must
• Must have at least 3+ years of experience and a strong hold in Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have experience with AWS services and creating DevOps templates for various AWS services
• Experience in using GitHub, Jenkins
• Good communication and analytical skills
• Snowflake certifi...

Posted 1 week ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Gurugram

Work from Office

Experience in Big Data technologies, specifically Spark, Python, Hive, SQL, Presto (or other query engines), big data storage formats (e.g., Parquet), orchestration tools (e.g., Apache Airflow) and version control (e.g. Bitbucket) Proficiency in developing configuration-based ETL pipelines and user-interface driven tools to optimize data processes and calculations (e.g., Dataiku). Experience in analysing business requirements, solution design, including the design of data models, data pipelines, and calculations, as well as presenting solution options and recommendations. Experience working in a cloud-based environment (ideally AWS), with a solid understanding of cloud computing concepts (EC...

Posted 1 week ago

Apply

8.0 - 12.0 years

15 - 20 Lacs

Kochi, Chennai, Bengaluru

Hybrid

Experience: 8-12 Years. Timing: candidate must be ready to work till 11 PM. Location: Chennai / Bangalore / Kochi. The customer is looking for a Business Intelligence Analyst who has experience in creating and generating reports; the required skills are AWS + SQL + DBT + Looker (or any other BI reporting tool). As a Senior Data Analyst, you will oversee the entire lifecycle of data analysis projects for the customer data platform, including designing, developing, and implementing complex analytical models to extract actionable insights from large datasets. You will work independently and collaborate with various business leaders to support key business outcomes. The ideal ca...

Posted 1 week ago

Apply

5.0 - 10.0 years

18 - 30 Lacs

Indore, Pune, Coimbatore

Work from Office

Greetings from LTIMindtree! About the job: Are you looking for a new career challenge? Are you ready to embark on a data-driven career with LTIMindtree? Working for a global leading manufacturing client, providing an engaging product experience through best-in-class PIM implementation and building rich, relevant, and trusted product information across channels and digital touchpoints, so their end customers can make an informed purchase decision, will surely be a fulfilling experience. Scheduled face-to-face drive on 13th Sep at Pune, Coimbatore, and Indore for Snowflake professionals! If you are available for a face-to-face interview at Pune, Coimbatore, Kolkata, or Indore, kindly apply via the link below: https://...

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You are a Data Engineer with expertise in Databricks, PySpark, SQL, and Azure Data Factory (ADF) to design and optimize scalable data pipelines. You will be responsible for designing, developing, and maintaining ETL/ELT pipelines using Azure Data Factory (ADF) and Databricks. Processing and transforming large datasets using PySpark in Databricks will also be a key aspect of your role. Additionally, you will write and optimize complex SQL queries for data extraction, transformation, and loading. Implementing data lakehouse architectures using Delta Lake in Databricks is part of the responsibilities, along with managing and optimizing Snowflake data warehouses and using DBT for modular and sca...

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role : Mid-Level Snowflake Developer Experience: 4 - 6 yrs About the Role: We are looking for a skilled Mid-Level Snowflake Developer to join our dynamic data engineering team. The ideal candidate will have hands-on experience with Snowflake, data warehousing concepts, and ETL/ELT processes to design, build, and maintain scalable data pipelines and solutions. Key Responsibilities: Develop, optimize, and maintain Snowflake data warehouse solutions to support business intelligence and analytics. Design and implement data models, schemas, and database objects (tables, views, stages, file formats). Build and manage ETL/ELT pipelines using Snowflake-native tools or third-party tools (e.g., dbt, A...

Posted 1 week ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About the Company Urban Company is a technology platform offering customers a variety of services at home. Customers use our platform to book services such as beauty treatments, haircuts, massage therapy, cleaning, plumbing, carpentry, appliance repair, painting, etc., all delivered in the comfort of their home and at a time of their choosing. We promise our customers a high-quality, standardised, and reliable service experience. To fulfill this promise, we work closely with our hand-picked service partners, enabling them with technology, training, products, tools, financing, insurance, and brand, helping them succeed and deliver on this promise. Urban Company started as UrbanClap in Novembe...

Posted 1 week ago

Apply

0.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Role: Data Engineer. Location: Mumbai, 5th Floor, 3, North Avenue, Maker Maxity, Bandra Kurla Complex. Contract: 12+ months, likely long term. Work Mode: Hybrid. Notice Period: Immediate - 15 days. Strong PySpark experience: hands-on expertise in building scalable data pipelines using PySpark; proficiency in using Spark SQL, DataFrame, and RDD APIs to implement complex business logic. Proficient programming skills: solid coding skills in Python (preferred), with strong fundamentals in data structures, algorithms, and software engineering principles. Data pipeline development: proven experience designing, developing, and maintaining batch and streaming data pipelines. Understanding of ETL/ELT proce...

Posted 1 week ago

Apply

8.0 - 13.0 years

22 - 37 Lacs

Bengaluru

Hybrid

Senior Performance Data Engineer - TOP HEALTHCARE PRODUCT, Bangalore - Hybrid. The Opportunity: Translate software requirements into elegant, scalable design specifications. Develop and optimize SQL queries, stored procedures, and database performance strategies. Engineer performance-tuning solutions across MSSQL and Snowflake, including partitions, clustering, cost management, and query optimization. Build logical and algorithmic designs independently while contributing to peer design reviews. Implement enhancements, fix defects, and create integration test scenarios. Act as a subject matter expert for at least one key product/application area. Collaborate closely with product managers, ar...

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
