
398 dbt Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

curatAId is seeking a Senior Snowflake Consultant on behalf of our client, a fast-growing organization focused on data-driven innovation. This role combines Snowflake expertise with DevOps, DBT, and Airflow to support the development and operation of a modern, cloud-based enterprise data platform. The ideal candidate will be responsible for building and managing data infrastructure, developing scalable data pipelines, implementing data quality and governance frameworks, and automating workflows for operational efficiency (an illustrative orchestration sketch follows this listing). To apply for this position, you must register on our platform at www.curataid.com and complete a 10-minute technical quiz on Snowflake.

Title: Senior Data Engineer
Level: Consultant/Deputy Manager/Manager/Senior Manager
Relevant Experience: Minimum of 5 years of hands-on experience with Snowflake, plus DevOps, DBT, and Airflow
Must-Have Skills: Data Engineering, Snowflake, DBT, Airflow, DevOps
Location: Mumbai, Gurgaon, Bengaluru, Chennai, Kolkata, Bhubaneshwar, Coimbatore, Ahmedabad

Qualifications:
- 5+ years of relevant Snowflake experience in a data engineering context (must have).
- 4+ years of relevant experience in DBT, Airflow, and DevOps (must have).
- Strong hands-on experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with cloud data warehouses such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse (must have).
- Experience with version control systems (GitHub, Bitbucket, GitLab).
- Strong SQL expertise.
- Ability to implement best practices for data storage management, security, and retrieval efficiency.
- Experience with pipeline orchestration tools (Fivetran, Stitch, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Java, Scala, etc.).
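For illustration only, here is a minimal sketch of the Snowflake/dbt/Airflow orchestration pattern this role describes. The DAG id, schedule, and dbt project directory are assumptions for the example, not details from the posting.

```python
# A minimal sketch (not this employer's actual pipeline): an Airflow DAG that
# runs dbt transformations against Snowflake and then dbt's data tests.
# The DAG id, schedule, and project directory are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    # Build the dbt models in the Snowflake warehouse.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Validate the freshly built models with dbt's tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```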

Posted 4 days ago

Apply

7.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Location: Bangalore/Gurugram/Hyderabad | Years of experience: 7+

We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities:
- Design and implement ETL/ELT pipelines using Databricks and PySpark (an illustrative sketch follows this listing).
- Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets.
- Develop high-performance SQL queries and optimize Spark jobs.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Ensure data quality and compliance across all stages of the data lifecycle.
- Implement best practices for data security and lineage within the Databricks ecosystem.
- Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills:
- Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
- Strong hands-on skills with PySpark and Spark SQL.
- Solid experience writing and optimizing complex SQL queries.
- Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
- Experience with cloud platforms like Azure or AWS.
- Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with tools like Airflow, Git, Azure Data Factory, or dbt.
- Exposure to streaming data and real-time processing.
- Knowledge of DevOps practices for data engineering.
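For illustration, a minimal PySpark sketch of the Unity Catalog pipeline pattern this posting describes. The catalog, schema, and table names are assumptions for the example.

```python
# A minimal sketch of a Unity Catalog pipeline: read a governed table via the
# catalog.schema.table namespace, aggregate it, and write a Delta table back.
# All catalog/schema/table names below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("uc_pipeline_demo").getOrCreate()

# Unity Catalog exposes tables through a three-level namespace.
orders = spark.table("main.sales.raw_orders")  # hypothetical source table

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

# Write back as a governed Delta table; Unity Catalog records lineage
# between the source and target tables.
(
    daily_revenue.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("main.sales.daily_revenue")
)
```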

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Join Zendesk as a Data Engineering Manager and lead a team of data engineers who deliver meticulously curated data assets to fuel business insights. Collaborate with Product Managers, Data Scientists, and Data Analysts to drive successful implementation of data products. We are seeking a leader with advanced skills in data infrastructure, data warehousing, and data architecture, as well as a proven track record of scaling BI teams. Be a part of our mission to embrace data and analytics and create a meaningful impact within our organization.

You will foster the growth and development of a team of data engineers; design, build, and launch new data models and pipelines in production; and act as a player-coach to amplify the effects of your team's work. Foster connections with diverse teams to comprehend data requirements, and help develop and support your team in technical architecture, project management, and product knowledge. Define processes for operational excellence in project management and system reliability, and set direction for the team to anticipate strategic and scaling-related challenges. Foster a healthy and collaborative culture that embodies our values.

What You Bring to the Role:
- Bachelor's degree in Computer Science/Engineering or a related field.
- 7+ years of proven experience in Data Engineering and Data Warehousing.
- 3+ years as a manager of data engineering teams.
- Proficiency with SQL and at least one programming language (Python/Ruby).
- Experience with Snowflake, BigQuery, Airflow, dbt.
- Familiarity with BI tools (Looker, Tableau) is desirable.
- Proficiency in the modern data stack and architectural strategies.
- Excellent written and oral communication skills.
- Proven track record of coaching/mentoring individual contributors and fostering a culture valuing diversity.
- Experience leading SDLC and SCRUM/Agile delivery teams.
- Experience working with globally distributed teams preferred.

Tech Stack: SQL, Python/Ruby, Snowflake, BigQuery, Airflow, dbt

Please note that this position requires being physically located in and working from Pune, Maharashtra, India.

Zendesk software was built to bring calm to the chaotic world of customer service. We advocate for digital-first customer experiences and strive to create a fulfilling and inclusive workplace experience. Our hybrid working model allows for connection, collaboration, and learning in person at our offices globally, as well as remote work flexibility. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans. If you require an accommodation to participate in the hiring process, please email peopleandplaces@zendesk.com with your specific request.

Posted 4 days ago

Apply

10.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald's: One of the world's largest employers, with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Senior Manager, Integrated Test Lead - Data Product Engineering & Delivery (Sr Manager, Technology Testing). Lead comprehensive testing strategy and execution for complex data engineering pipelines and product delivery initiatives. Drive quality assurance across integrated systems, data workflows, and customer-facing applications while coordinating cross-functional testing efforts.

Who we are looking for:

Primary Responsibilities:

Test Strategy & Leadership:
- Design and implement end-to-end testing frameworks for data pipelines, ETL/ELT processes, and analytics platforms.
- Ensure test coverage across ETL/ELT, data transformation, lineage, and consumption layers.
- Develop integrated testing strategies spanning multiple systems, APIs, and data sources.
- Establish testing standards, methodologies, and best practices across the organization.

Data Engineering Testing:
- Create comprehensive test suites for data ingestion, transformation, and output validation.
- Design data quality checks, schema validation, and performance testing for large-scale datasets (an illustrative reconciliation sketch follows this listing).
- Implement automated testing for streaming and batch data processing workflows.
- Validate data integrity across multiple environments and systems and against business rules.

Cross-Functional Coordination:
- Collaborate with data engineers, software developers, product managers, and DevOps teams.
- Coordinate testing activities across multiple product streams and release cycles.
- Manage testing dependencies and critical-path items in complex delivery timelines.

Quality Assurance & Process Improvement:
- Establish metrics and KPIs for testing effectiveness and product quality to drive continuous improvement in testing processes and tooling.
- Lead root cause analysis for production issues and testing gaps.

Technical Leadership:
- Mentor junior QA engineers and promote testing best practices.
- Evaluate and implement new testing tools and technologies.
- Design scalable testing infrastructure and CI/CD integration.

Skills:
- 10+ years in software testing, with 3+ years in leadership roles.
- 8+ years of experience testing data engineering systems, ETL pipelines, or analytics platforms.
- Proven track record with complex, multi-system integration testing.
- Experience in agile/scrum environments with rapid delivery cycles.
- Strong SQL experience with major databases (Redshift, BigQuery, etc.).
- Experience with cloud platforms (AWS, GCP) and their data services.
- Knowledge of data pipeline tools (Apache Airflow, Kafka, Confluent, Spark, dbt, etc.).
- Proficiency in data warehousing, data architecture, reporting, and analytics applications.
- Scripting languages (Python, Java, Bash) for test automation.
- API testing tools and methodologies.
- CI/CD/CT tools and practices.
- Strong project management and organizational skills.
- Excellent verbal and written communication abilities.
- Experience managing multiple priorities and competing deadlines.

Work location: Hyderabad, India. Work pattern: Full-time role. Work mode: Hybrid.
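For illustration, a minimal sketch of the kind of automated data-quality check this role covers: reconciling row counts and key completeness between a source and target table. sqlite3 stands in for the real warehouse here, and the table and column names are assumptions for the example.

```python
# A minimal reconciliation check: compare row counts between source and target
# tables and count null keys in the target. sqlite3 stands in for a warehouse;
# the reconcile() helper and all table/column names are illustrative.
import sqlite3


def reconcile(conn, source: str, target: str, key: str) -> dict:
    """Return basic integrity metrics comparing a source and a target table."""
    cur = conn.cursor()
    src_rows = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_rows = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    null_keys = cur.execute(
        f"SELECT COUNT(*) FROM {target} WHERE {key} IS NULL"
    ).fetchone()[0]
    return {
        "row_count_match": src_rows == tgt_rows,
        "source_rows": src_rows,
        "target_rows": tgt_rows,
        "null_keys_in_target": null_keys,
    }


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
        CREATE TABLE fact_orders    (order_id INTEGER, amount REAL);
        INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO fact_orders    VALUES (1, 10.0), (2, 20.0);
        """
    )
    print(reconcile(conn, "staging_orders", "fact_orders", "order_id"))
```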

Posted 4 days ago

Apply

8.0 - 13.0 years

0 - 0 Lacs

Pune, Hyderabad, Mumbai City

On-site

Position Overview
We are seeking a highly skilled and experienced Senior Snowflake Developer/Lead to join our dynamic team. This role is ideal for individuals who are passionate about data engineering and analytics and who thrive in a collaborative environment. As a Senior Snowflake Developer, you will play a pivotal role in designing, developing, and implementing data solutions that drive business insights and decision-making. With an annual salary of 20,00,000, this full-time position offers an exciting opportunity to work in a fast-paced environment in one of our key locations: Pune, Mumbai City, or Hyderabad.

Key Responsibilities
- Design and develop scalable data pipelines and ETL processes using Snowflake and other relevant technologies.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Optimize and maintain existing data models and workflows to ensure high performance and reliability.
- Implement best practices for data governance, security, and compliance.
- Lead and mentor junior developers, providing guidance and support in their professional development.
- Conduct code reviews and ensure adherence to coding standards and quality assurance processes.
- Stay updated on the latest industry trends and technologies related to data engineering and analytics.
- Participate in project planning and estimation activities, ensuring timely delivery of high-quality solutions.

Qualifications
The ideal candidate will possess the following qualifications:
- Experience: 8 to 13 years of relevant work experience in data engineering, with a strong focus on Snowflake.
- Technical Skills: Proficiency in SQL, Python, dbt, and Snowflake.
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Leadership Skills: Proven ability to lead projects and mentor team members effectively.
- Analytical Skills: Strong problem-solving skills with the ability to analyze complex data sets and derive actionable insights.
- Communication Skills: Excellent verbal and written communication skills, with the ability to convey technical concepts to non-technical stakeholders.

This is a fantastic opportunity for a motivated individual looking to advance their career in a leading organization. If you are ready to take on new challenges and make a significant impact, we encourage you to apply!

Posted 4 days ago

Apply

4.0 - 9.0 years

25 Lacs

Chennai

Hybrid

About the Role
We are seeking a skilled and driven Data Engineer to join our growing data team.

Required Skills
- Experience with data platforms and tools: Snowflake, DBT, Databricks, Azure (ADF & Fabric), and GCP BigQuery.
- Strong understanding of database optimization techniques: partitioning, indexes, query performance tuning, etc. (an illustrative partitioning sketch follows this listing).
- Proficient in coding with Python, PySpark, and SQL for data extraction, transformation, and analysis; strong SQL expertise is a must.
- Experience in automating and testing data workflows, preferably using Azure ADF.
- Familiarity with analytical techniques including data modeling, data transformation, and algorithmic development.

Role & Responsibilities
- End-to-End Pipeline Development: design, build, and test complete data pipelines for ingestion, integration, and curation.
- Workflow Automation & Data Quality: automate data workflows and testing to boost efficiency and ensure data integrity.
- Pipeline Optimization: continuously enhance pipeline performance to meet evolving business demands.
- Scalable Cloud Data Platform: build and maintain a scalable, high-availability cloud-based data platform.
- Technical Debt Management: identify and reduce technical debt for a clean and efficient codebase.
- Reusable Frameworks: develop modular components and frameworks to accelerate data engineering tasks.
- Stakeholder Collaboration: partner with analysts and cross-functional teams to deliver data-driven solutions.
- Documentation & Maintenance: document workflows and systems for clarity, consistency, and long-term support.
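For illustration, a minimal PySpark sketch of the partitioning technique named above: writing event data partitioned by date so later queries can prune partitions. The paths and column names are assumptions for the example.

```python
# A minimal partitioning sketch: write events partitioned by date, then query
# with a partition-column filter so Spark can skip whole directories
# (partition pruning). Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("partitioning_demo").getOrCreate()

# Hypothetical raw input with an event_ts timestamp column.
events = spark.read.parquet("/data/raw/events")

(
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .write
    .mode("overwrite")
    .partitionBy("event_date")  # physical layout: one directory per date
    .parquet("/data/curated/events")
)

# Filtering on the partition column prunes directories instead of
# scanning the full dataset.
recent = spark.read.parquet("/data/curated/events").filter(
    F.col("event_date") >= "2024-06-01"
)
recent.show()
```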

Posted 4 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Principal Technologist (Data Architect) at Medtronic, you will be responsible for delivering data architecture solutions that align with business capability needs and enterprise standards. In this role, you will collaborate with Enterprise Solution Architects, Business Solution Architects, Technical Architects, and external service providers to ensure that data and information models and technologies are in line with architecture strategies and Medtronic's standards. Your role will involve working with Business Analysts to review business capability needs, define requirements, conduct data analysis, develop data models, write technical specifications, and collaborate with development teams to ensure the successful delivery of designs.

Your technical expertise will be crucial in leveraging tools such as the webMethods suite, Informatica, ETL tools, Kafka, and data transformation techniques to design and implement robust integration solutions. You will oversee the implementation of integration solutions, ensuring they meet technical specifications, quality standards, and best practices. Additionally, you will lead continuous improvement initiatives to enhance integration processes, troubleshoot and resolve integration-related issues, mentor junior team members, collaborate with vendors, optimize performance, and contribute to documentation and knowledge management efforts.

To be successful in this role, you should have at least 8 years of IT experience with a Bachelor's Degree in Engineering, MCA, or MSc. You should also have experience in relevant architecture disciplines (integrations, data, services, infrastructure); Oracle, SAP, or big data platforms; Informatica; PowerDesigner; Python coding; and Snowflake. Specialized knowledge in enterprise-class architecture concepts, data integration, data modeling methodologies, cloud-based solutions, and data governance would be advantageous.

It would be beneficial to have a high degree of learning agility, experience with large enterprise systems, technical modeling and design skills, awareness of architecture frameworks, and strong leadership, teamwork, analytical, and communication skills. Experience in the medical device industry or other regulated industries, as well as the ability to work independently and collaboratively, would also be valuable.

At Medtronic, we offer a competitive salary, a flexible benefits package, and a commitment to recognizing and supporting the contributions of our employees. Our mission is to alleviate pain, restore health, and extend life by boldly addressing the most challenging health problems. As part of our global team of passionate individuals, you will have the opportunity to engineer real solutions for real people and contribute to our mission of making healthcare technology accessible to all. Join us at Medtronic and be a part of a team that is dedicated to innovation, collaboration, and making a meaningful impact on global healthcare technology.

Posted 5 days ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

You will play a crucial role as a Data Engineer, leading the development of data infrastructure. Your responsibilities will involve creating and maintaining systems that ensure the seamless flow, availability, and reliability of data.

Your key tasks at Coforge will include:
- Developing and managing data pipelines to facilitate efficient data extraction, transformation, and loading (ETL) processes.
- Designing and enhancing data storage solutions such as data warehouses and data lakes.
- Ensuring data quality and integrity by implementing data validation, cleansing, and error-handling mechanisms.
- Collaborating with data analysts, data architects, and software engineers to comprehend data requirements and provide relevant data sets for business intelligence purposes.
- Automating and enhancing data processes and workflows to drive scalability and efficiency.
- Staying updated on industry trends and emerging technologies in the field of data engineering.
- Documenting data pipelines, processes, and best practices to facilitate knowledge sharing.
- Contributing to data governance and compliance initiatives to adhere to regulatory standards.
- Working closely with cross-functional teams to promote data-driven decision-making across the organization.

Key skills required for this role:
- Proficiency in data modeling and database management.
- Strong programming capabilities, particularly in Python, SQL, and PL/SQL.
- Sound knowledge of Airflow, Snowflake, and DBT.
- Hands-on experience with ETL (Extract, Transform, Load) processes.
- Familiarity with data warehousing and cloud platforms, especially Azure.

Your experience of 5-10 years will be instrumental in successfully fulfilling the responsibilities of this role, located in Greater Noida with a shift timing of 2:00 PM IST to 10:30 PM IST.

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

Jaipur, Rajasthan

On-site

You are a Sr. Data Engineer with a strong background in building ELT pipelines and expertise in modern data engineering practices. You are experienced with Databricks and DBT, proficient in SQL and Python, and have a solid understanding of data warehousing methodologies such as Kimball or Data Vault. You are comfortable working with DevOps tools, particularly within AWS, Databricks, and GitLab. Your role involves collaborating with cross-functional teams to design, develop, and maintain scalable data infrastructure and pipelines using Databricks and DBT.

Your responsibilities include designing, building, and maintaining scalable ELT pipelines for processing and transforming large datasets efficiently in Databricks. You will implement Kimball data warehousing methodologies or other multi-dimensional modeling approaches using DBT. Leveraging AWS, Databricks, and GitLab, you will implement CI/CD practices for data engineering workflows. Additionally, you will optimize SQL queries and database performance, monitor and fine-tune data pipelines and queries, and ensure compliance with data security, privacy, and governance standards.

Key qualifications for this role include 6+ years of data engineering experience, hands-on experience with Databricks and DBT, proficiency in SQL and Python, experience with Kimball data warehousing or Data Vault methodologies, familiarity with DevOps tools and practices, strong problem-solving skills, and the ability to work in a fast-paced, agile environment. Preferred qualifications include experience with Apache Spark for large-scale data processing, familiarity with CI/CD pipelines for data engineering workflows, understanding of orchestration tools like Apache Airflow, and certifications in AWS, Databricks, or DBT.

In return, you will receive benefits such as medical insurance for employees, spouse, and children; accidental life insurance; provident fund; paid vacation time; paid holidays; employee referral bonuses; reimbursement for high-speed internet at home; a one-month free stay for employees moving from other cities; tax-free benefits; and other bonuses as determined by management.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be working as a Snowflake developer at our organization, with the job location being Noida, Chennai, or Pune (3 times per week). Your primary responsibilities will draw on your expertise in Python, Snowflake, DBT, SQL, data quality, and data modelling. Experience with Snowflake DB, Snowpipe, and Fivetran would be advantageous. As a successful candidate, you should be an expert in DBT and SQL, capable of developing and maintaining DBT models, understanding data flow, conducting data quality assessments, and performing data testing using DBT (an illustrative sketch follows this listing). Your role will involve ensuring efficient data processing and maintaining high data quality standards throughout development and maintenance.
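For illustration, a hedged sketch of the build-and-test loop this role describes, using dbt's programmatic entry point (available in dbt-core 1.5 and later). The project directory is an illustrative placeholder.

```python
# A hedged sketch of invoking dbt programmatically (dbt-core >= 1.5) to build
# models and then run the project's data tests. The project directory below
# is an illustrative placeholder, not a real path from this posting.
from dbt.cli.main import dbtRunner

dbt = dbtRunner()

# "dbt run" builds the models; "dbt test" then executes the data tests
# (unique, not_null, relationships, custom SQL tests) defined in the project.
for step in ("run", "test"):
    result = dbt.invoke([step, "--project-dir", "/opt/dbt/analytics"])
    if not result.success:
        raise RuntimeError(f"dbt {step} failed")
```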

Posted 5 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Gurugram

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5+ Years
Location: Gurgaon
Preferred: Immediate Joiners

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 5 days ago

Apply

6.0 - 11.0 years

30 - 35 Lacs

Pune

Remote

Role & Responsibilities
We are seeking a Production Support Lead with expertise in modern data platforms to oversee the reliability, performance, and user access control of our analytics and reporting environment. This individual will lead operational support across tools such as Snowflake, dbt, Fivetran, Tableau, Azure Entra, AWS, and Terraform, ensuring compliance and high data availability. The ideal candidate will not only resolve technical issues but also guide the team in scaling and automating platform operations.

Posted 5 days ago

Apply

6.0 - 11.0 years

15 - 27 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Role: Azure Data Engineer
Location: Pune, Gurgaon, or Bangalore
Work Mode: Hybrid

Key Roles and Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must Have:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good to Have:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Education: Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.

Key Skills: Azure (Data Factory, Databricks), Snowflake, DBT

Posted 5 days ago

Apply

6.0 - 11.0 years

15 - 27 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Role: Azure Data Engineer
Location: Pune, Gurugram, Bangalore, or Hyderabad
Work Mode: Hybrid

Key Roles and Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must Have:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good to Have:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Education: Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.

Key Skills: Azure (Data Factory, Databricks), Snowflake, DBT

Posted 5 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 9+ Years
Location: Gurgaon (Remote)
Preferred: Immediate Joiners

Job Summary:
We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 5 days ago

Apply

10.0 - 14.0 years

35 - 45 Lacs

Hyderabad

Work from Office

About the Team
At DAZN, the Analytics Engineering team is at the heart of turning hundreds of data points into meaningful insights that power strategic decisions across the business. From content strategy to product engagement, marketing optimization to revenue intelligence, we enable scalable, accurate, and accessible data for every team.

The Role
We're looking for a Lead Analytics Engineer to take ownership of our analytics data pipeline and play a pivotal role in designing and scaling our modern data stack. This is a hands-on technical leadership role where you'll shape the data models in dbt/Snowflake, orchestrate pipelines using Airflow, and enable high-quality, trusted data for reporting.

Key Responsibilities
- Lead the development and governance of DAZN's semantic data models to support consistent, reusable reporting metrics.
- Architect efficient, scalable data transformations on Snowflake using SQL/dbt and best practices in data warehousing (an illustrative sketch follows this listing).
- Manage and enhance pipeline orchestration with Airflow, ensuring timely and reliable data delivery.
- Collaborate with stakeholders across Product, Finance, Marketing, and Technology to translate requirements into robust data models.
- Define and drive best practices in version control, testing, and CI/CD for analytics workflows.
- Mentor and support junior engineers, fostering a culture of technical excellence and continuous improvement.
- Champion data quality, documentation, and observability across the analytics layer.

You'll Need to Have
- 10+ years of experience in data/analytics engineering, with 2+ years leading or mentoring engineers.
- Deep expertise in SQL, cloud data warehouses (preferably Snowflake), and cloud services (AWS/GCP/Azure).
- Proven experience with dbt for data modeling and transformation.
- Hands-on experience with Airflow (or similar orchestrators like Prefect or Luigi).
- Strong understanding of dimensional modeling, ELT best practices, and data governance principles.
- Ability to balance hands-on development with leadership and stakeholder management.
- Clear communication skills: you can explain technical concepts to both technical and non-technical teams.

Nice to Have
- Experience in the media, OTT, or sports tech domain.
- Familiarity with BI tools like Looker or Power BI.
- Exposure to testing frameworks like dbt tests or Great Expectations.
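For illustration, a minimal sketch of the kind of dimensional-model query a semantic layer like the one described above would standardize: a fact table joined to a date dimension. It assumes the snowflake-connector-python package, and every identifier and credential below is an illustrative placeholder, not DAZN's actual schema.

```python
# A hedged sketch of querying a star schema in Snowflake via
# snowflake-connector-python. All credentials, table names, and columns
# (fct_playback, dim_date, watch_seconds, ...) are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder credentials
    user="analytics_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# Join the fact table to the date dimension and aggregate a reusable metric.
query = """
    SELECT d.calendar_month,
           SUM(f.watch_seconds) AS total_watch_seconds
    FROM   fct_playback f
    JOIN   dim_date d ON f.date_key = d.date_key
    GROUP BY d.calendar_month
    ORDER BY d.calendar_month
"""

for month, seconds in conn.cursor().execute(query):
    print(month, seconds)
```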

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

haryana

On-site

About KPMG in India
KPMG entities in India are professional services firm(s) affiliated with KPMG International Limited. Established in August 1993, KPMG leverages a global network of firms and maintains expertise in local laws, regulations, markets, and competition. With offices in major Indian cities such as Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, KPMG caters to national and international clients across various sectors. The services offered by KPMG in India are characterized by their rapid, performance-based, industry-focused, and technology-enabled approach, reflecting a deep understanding of global and local industries and the Indian business environment.

Key Skills: Snowflake, SQL, DBT

Equal employment opportunity information

Qualifications: B.Tech

Posted 6 days ago

Apply

3.0 - 8.0 years

0 Lacs

Delhi

On-site

As a Snowflake Solution Architect, you will be responsible for owning and driving the development of Snowflake solutions and products as part of the COE. Your role will involve working with and guiding the team to build solutions using the latest innovations and features launched by Snowflake. Additionally, you will conduct sessions on the latest and upcoming launches of the Snowflake ecosystem and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will be expected to publish articles and architectures that solve business problems. Furthermore, you will work on accelerators to demonstrate how Snowflake solutions and tools integrate with and compare to other platforms such as AWS, Azure Fabric, and Databricks.

In this role, you will lead the post-sales technical strategy and execution for high-priority Snowflake use cases across strategic customer accounts. You will also be responsible for triaging and resolving advanced, long-running customer issues while ensuring timely and clear communication. Developing and maintaining robust internal documentation, knowledge bases, and training materials to scale support efficiency will also be part of your responsibilities. Additionally, you will support enterprise-scale RFPs focused on Snowflake.

To be successful in this role, you should have at least 8 years of industry experience, including a minimum of 3 years in a Snowflake consulting environment. You should possess experience in implementing and operating Snowflake-centric solutions, along with proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform. An understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, is essential. Strong skills in databases, data warehouses, and data processing, as well as extensive hands-on expertise with SQL and SQL analytics, are required. Familiarity with data science concepts and Python is a strong advantage. Knowledge of Snowflake components is necessary, including Snowpipe, query parsing and optimization, Snowpark, Snowflake ML, authorization and access control management, metadata management, infrastructure management and auto-scaling, the Snowflake Marketplace for datasets and applications, and DevOps and orchestration tools such as Airflow, dbt, and Jenkins. Snowflake certifications would be good to have.

Strong communication and presentation skills are essential, as you will engage with both technical and executive audiences. You should also be skilled in working collaboratively across engineering, product, and customer success teams. This position is open in all Xebia office locations, including Pune, Bangalore, Gurugram, Hyderabad, Chennai, Bhopal, and Jaipur.

If you meet the above requirements and are excited about this opportunity, please share your details here: [Apply Now](https://forms.office.com/e/LNuc2P3RAf)

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

Intellect Care offers a collaborative process aimed at guiding, exploring, and resolving personal, social, or psychological problems and difficulties. Counseling is beneficial for individuals seeking tools or techniques to support their mental health, while psychotherapy focuses on identifying and addressing unhelpful patterns of thinking and behavior that may be limiting the client's quality of life. The goal is to equip clients with more effective ways of thinking, behaving, and relating to others, particularly in addressing moderately severe mental health experiences. Sessions at Intellect Care typically last up to 50 minutes.

Joining Intellect Care means becoming part of a collaborative network of Clinical Psychologists and Counsellors dedicated to expanding access to quality mental health care. As part of the team at Intellect, you will find a supportive community that is focused on enhancing lives and helping clients overcome personal challenges.

As a member of the team, your responsibilities will include providing 1-on-1 telehealth or onsite mental health support for Intellect's clients. You will address a range of clinical cases, such as depression, anxiety, trauma, schizophrenia, eating disorders, addiction, and more. Additionally, you will work in partnership with our internal clinical team to enhance the platform and client programs.

Qualifications required for the role include a minimum Master's Degree in Counseling or Counseling Psychology for Counsellors, with at least 300 hours of counseling experience post-qualification. Clinical Psychologists should hold a minimum Master's Degree in Clinical Psychology with at least 300 hours of clinical/therapy experience post-qualification. Candidates should be skilled in Cognitive Behavioral Therapy or other evidence-based approaches and must hold a valid license from recognized bodies such as the Rehabilitation Council of India (RCI), Clinical Psychology Society of India (CPSI), Counsellors Council of India (CCI), or Bharatiya Counseling Psychology Association (BCPA). A minimum of 300 clinical hours post-master's degree is required, along with proficiency in English and the local language. Preferred backgrounds include prior experience with Employee Assistance Programs (EAP), adult counseling, or coaching managerial roles. Flexibility for occasional onsite work is considered a plus.

If you are passionate about making a meaningful impact and being part of a mission-driven team, we invite you to apply today and join us in transforming mental health care.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

YipitData is a leading market research and analytics firm specializing in the disruptive economy, having recently secured a significant investment from The Carlyle Group at a valuation of over $1B. Recognized for three consecutive years as one of Inc's Best Workplaces, we are a rapidly expanding technology company with offices across various locations globally, fostering a culture centered on mastery, ownership, and transparency.

As a potential candidate, you will have the opportunity to collaborate with strategic engineering leaders and report directly to the Director of Data Engineering. This role involves contributing to the establishment of our Data Engineering team presence in India and working within a global team framework, tackling challenging big data problems.

We are currently in search of a highly skilled Senior Data Engineer with 6-8 years of relevant experience to join our dynamic Data Engineering team. The ideal candidate should possess a solid grasp of Spark and SQL, along with experience in data pipeline development. Successful candidates will play a vital role in expanding our data engineering team, focusing on enhancing reliability, efficiency, and performance within our strategic pipelines.

The Data Engineering team at YipitData sets the standard for all other analyst teams, maintaining and developing the core pipelines and tools that drive our products. This team plays a crucial role in supporting the rapid growth of our business and presents a unique opportunity for the first hire to potentially lead and shape the team as responsibilities evolve.

This hybrid role will be based in India, with training and onboarding requiring overlap with US working hours initially. Subsequently, standard IST working hours are permissible, with occasional meetings with the US team.

As a Senior Data Engineer at YipitData, you will work directly under the Senior Manager of Data Engineering, receiving hands-on training on cutting-edge data tools and techniques. Responsibilities include building and maintaining end-to-end data pipelines, establishing best practices for data modeling and pipeline construction, generating documentation and training materials, and proficiently resolving complex data pipeline issues using PySpark and SQL. Collaborating with stakeholders to integrate business logic into central pipelines and mastering tools like Databricks, Spark, and other ETL technologies are also key aspects of the role.

Successful candidates are likely to have a Bachelor's or Master's degree in Computer Science, STEM, or a related field, with at least 6 years of experience in Data Engineering or similar technical roles. Enthusiasm for problem-solving, continuous learning, and a strong understanding of data manipulation and pipeline development are essential. Proficiency in working with large datasets using PySpark, Delta, and Databricks, aligning data transformations with business needs, and a willingness to acquire new skills are crucial for success. Effective communication skills, a proactive approach, and the ability to work collaboratively with stakeholders are highly valued.

In addition to a competitive salary, YipitData offers a comprehensive compensation package that includes various benefits, perks, and opportunities for personal and professional growth. Employees are encouraged to focus on their impact, self-improvement, and skill mastery in an environment that promotes ownership, respect, and trust.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

We are seeking a Data Modeler with expertise in mortgage banking data to support a large-scale Data Modernization program. As a Data Modeler, your primary responsibilities will include designing and developing enterprise-grade data models (3NF, Dimensional, and Semantic) to serve both analytics and operational use cases. You will collaborate closely with business and engineering teams to define data products that are aligned with specific business domains.

Your role will involve translating complex mortgage banking concepts into scalable and extensible models that meet the requirements of the organization. It is crucial to ensure that the data models align with modern data architecture principles and are compatible with cloud platforms like Snowflake and DBT. Additionally, you will contribute to the creation of canonical models and reusable patterns for enterprise-wide use.

To be successful in this role, you should possess the following qualifications:
- A minimum of 5 years of experience in data modeling with a strong emphasis on mortgage or financial services.
- Hands-on experience in developing 3NF, Dimensional, and Semantic models.
- A profound understanding of data as a product and domain-driven design principles.
- Familiarity with modern data ecosystems and tools such as Snowflake, DBT, and BI tools would be advantageous.
- Excellent communication skills to collaborate effectively with both business and technical teams.

This position requires the candidate to work onsite in either Hyderabad or Ahmedabad.

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 22 Lacs

Bengaluru

Work from Office

Dear Candidate,

Greetings from NAM Info Inc. NAM Info Inc. is a technology-forward talent management organization dedicated to bridging the gap between industry leaders and exceptional human resources. We pride ourselves on delivering quality candidates, deep industry coverage, and knowledge-based training for consultants. Our commitment to long-term partnerships, rooted in ethical practices and trust, positions us as a preferred partner for many industries. Learn more about our vision, achievements, and services at www.nam-it.com.

We have an open position for a Data Engineer with our company in Bangalore, Pune, and Mumbai.

Job Description
Position: Sr/Lead Data Engineer
Location: Bangalore, Pune, and Mumbai
Experience: 5+ years
Required Skills: Azure, data warehousing, Python, Spark, PySpark, Snowflake/Databricks, any RDBMS, any ETL tool, SQL, Unix scripting, GitHub; strong experience in Azure/AWS/GCP
Employment: Permanent with NAM Info Pvt Ltd
Working hours: 12 PM to 9 PM or 2 PM to 11 PM
Work mode: 5 days work from office, Monday to Friday
Interviews: L1 virtual; L2 face-to-face at the Banashankari office (for Bangalore candidates)
Notice period: immediate to 15 days

If you are fine with the above job details, please share your resume with ananya.das@nam-it.com.

Regards,
Recruitment Team
NAM Info INC

Posted 1 week ago

Apply

6.0 - 11.0 years

22 - 27 Lacs

Pune, Bengaluru

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use platforms such as Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented.

Required candidate profile: strong proficiency in SQL query writing and development; experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks; experience in the healthcare industry working with PHI/PII.

Posted 1 week ago

Apply

5.0 - 8.0 years

13 - 19 Lacs

Mumbai, Pune, Mumbai (All Areas)

Hybrid

Role: DBT Developer
Experience: 5 to 8 years
Location: Pune and Mumbai only

Job Description - Required Skills:
- 5-8 years of experience in data engineering or backend development.
- Minimum 2 years of hands-on experience with DBT (Data Build Tool).
- Strong SQL skills with experience in complex query writing and performance tuning.
- Proficiency in Python for scripting and data processing.
- Experience with cloud data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Understanding of data warehousing concepts and best practices.
- Experience working in Agile environments and using version control tools like Git.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

You are invited to join our team as a Mid-Level Data Engineer Technical Consultant with 4+ years of experience. As part of our diverse and inclusive organization, you will be based in Bangalore, KA, working full-time in a permanent position during the general shift, Monday to Friday.

In this role, you will be expected to possess strong written and oral communication skills, particularly in email correspondence. Your experience working with application development teams will be invaluable, along with your ability to analyze and solve problems effectively. Proficiency in Microsoft tools such as Outlook, Excel, and Word is essential for this position.

As a Data Engineer Technical Consultant, you must have at least 4 years of hands-on development experience. Your expertise should include working with Snowflake and PySpark, writing SQL queries, using Airflow, and developing in Python. Experience with DBT and integration programs will be advantageous, as will familiarity with Excel for data analysis and Unix scripting.

Your responsibilities will require a good understanding of data warehousing and practical work experience in this field. You will be accountable for various tasks including understanding requirements, coding, unit testing, integration testing, performance testing, UAT, and Hypercare support. Collaboration with cross-functional teams across different geographies will be a key aspect of this role.

If you are action-oriented, independent, and possess the required technical skills, we encourage you to submit your resume to pallavi@she-jobs.com and explore this exciting opportunity further.

Posted 1 week ago

Apply