
398 Dbt Jobs - Page 3

Set up a Job Alert
JobPe aggregates listings for easy browsing, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a Sr. Data Engineer with over 7 years of experience, specializing in Data Engineering, Python, and SQL. You will join the Data Engineering team in the Enterprise Data Insights organization, responsible for building data solutions, designing ETL/ELT processes, and managing the data platform that supports stakeholders across the organization. Your role is central to driving technology- and data-led solutions that foster growth and innovation at scale.

As a Senior Data Engineer, you will collaborate with cross-functional stakeholders to prioritize requests, identify areas for improvement, and provide recommendations. You will lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes. You will also foster collaboration with corporate engineering, product teams, and other engineering groups, and lead and mentor engineering discussions while advocating for best practices.

To excel in this role, you should hold a degree in Computer Science or a related technical field and have a proven track record of over 5 years in Data Engineering. Your expertise should include designing and constructing ETL/ELT processes, managing data solutions in an SLA-driven environment, and developing data products and APIs. Proficiency in SQL/NoSQL databases (particularly Snowflake, Redshift, or MongoDB) and strong Python programming skills are essential. Experience with columnar OLAP databases, data modeling, and tools such as dbt, Airflow, Fivetran, GitHub, and Tableau reporting is beneficial. Good communication and interpersonal skills are crucial for collaborating with business stakeholders and translating requirements into actionable insights.

A good understanding of Salesforce and NetSuite systems, experience in SaaS environments, experience designing and deploying ML models, and familiarity with events and streaming data are added advantages. Join us in driving data-driven solutions and experiences to shape the future of technology and innovation.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Gurugram

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Noida

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Pune

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 1 week ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe/SnowProc/SnowSQL (see the Snowpipe sketch below).
3. Technical lead with a strong development background, including 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible about onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Should have good communication skills, be a team player, and have good analytical skills. Snowflake certification is preferable.

Greetings from Mississippi Consultant LLP! We are a recruitment firm based in Pune with various clients globally. Contact: Soniya, soniya05.mississippiconsultants@gmail.com
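
Snowpipe, named in item 2 above, is usually configured as a pipe object that continuously copies staged files into a table. A minimal sketch through the snowflake-connector-python package, where the account credentials, stage raw_stage, and table raw_events are all illustrative placeholders rather than values from this listing:

```python
# Minimal sketch: creating a Snowpipe pipe via snowflake-connector-python.
# Account, credentials, and object names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="RAW",
)

# A pipe continuously loads files landing in a stage into a target table;
# AUTO_INGEST relies on cloud-storage event notifications being configured.
conn.cursor().execute("""
    CREATE PIPE IF NOT EXISTS raw_events_pipe
      AUTO_INGEST = TRUE
      AS COPY INTO raw_events
         FROM @raw_stage
         FILE_FORMAT = (TYPE = 'JSON')
""")
conn.close()
```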

Posted 1 week ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Pune, Bengaluru

Hybrid

Role & responsibilities

Job Description - Snowflake Senior Developer
Experience: 8+ years, Hybrid
Employment Type: Full-time

Job Summary: We are seeking a skilled Snowflake Developer with 8+ years of experience in designing, developing, and optimizing Snowflake data solutions. The ideal candidate will have strong expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration. This role involves building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Key Responsibilities:
1. Snowflake Development & Optimization
- Design and develop Snowflake databases, schemas, tables, and views following best practices.
- Write complex SQL queries, stored procedures, and UDFs for data transformation.
- Optimize query performance using clustering, partitioning, and materialized views.
- Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks); see the sketch below.
2. Data Pipeline Development
- Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark.
- Integrate Snowflake with cloud storage (S3, Blob) and data ingestion tools (Snowpipe).
- Develop CDC (Change Data Capture) and real-time data processing solutions.
3. Data Modeling & Warehousing
- Design star schema, snowflake schema, and data vault models in Snowflake.
- Implement data sharing, secure views, and dynamic data masking.
- Ensure data quality, consistency, and governance across Snowflake environments.
4. Performance Tuning & Troubleshooting
- Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage).
- Troubleshoot data pipeline failures, latency issues, and query bottlenecks.
- Work with DevOps teams to automate deployments and CI/CD pipelines.
5. Collaboration & Documentation
- Work closely with data analysts, BI teams, and business stakeholders to deliver data solutions.
- Document data flows, architecture, and technical specifications.
- Mentor junior developers on Snowflake best practices.

Required Skills & Qualifications:
- 8+ years in database development, data warehousing, or ETL.
- 4+ years of hands-on Snowflake development experience.
- Strong SQL or Python skills for data processing.
- Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark).
- Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, DBT).
- Certifications: SnowPro Core Certification (preferred).

Preferred Skills:
- Familiarity with data governance and metadata management.
- Familiarity with DBT, Airflow, SSIS & IICS.
- Knowledge of CI/CD pipelines (Azure DevOps).

If interested, kindly share your updated CV at Himanshu.mehra@thehrsolutions.in
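
Two of the Snowflake features this listing names, Time Travel and Zero-Copy Cloning, are plain SQL statements. A minimal sketch run through snowflake-connector-python, where the connection parameters and the orders table are illustrative assumptions:

```python
# Minimal sketch of Snowflake Time Travel and Zero-Copy Cloning.
# All identifiers and credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="COMPUTE_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: query the table as it looked one hour ago (offset in seconds).
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print("row count one hour ago:", cur.fetchone()[0])

# Zero-Copy Cloning: an instant, storage-free copy for testing or recovery.
cur.execute("CREATE TABLE orders_backup CLONE orders")

conn.close()
```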

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a skilled and experienced DBT (Data Build Tool) / DataStage Developer to design, develop, and maintain robust data transformation and integration solutions. The ideal candidate will have hands-on expertise with either DBT for data modeling in data warehouses or IBM DataStage for ETL processes, coupled with a strong understanding of data warehousing concepts. This role is crucial for building and optimizing our data pipelines, ensuring data quality, and supporting analytical initiatives.

Roles and Responsibilities:
- Design, develop, and implement data transformation logic using either DBT (Data Build Tool) for data modeling within data warehouses (e.g., Snowflake, BigQuery, Redshift) or IBM DataStage for complex ETL (Extract, Transform, Load) processes.
- Create, maintain, and optimize data pipelines to ensure efficient and reliable data flow from source systems to target data platforms.
- Write, test, and deploy SQL-based data transformations using DBT, ensuring data integrity and performance (see the sketch below). For DataStage, develop and fine-tune DataStage jobs, sequences, and routines for data extraction, transformation, and loading.
- Collaborate with data architects, data engineers, and business analysts to understand data requirements and translate them into technical solutions.
- Perform data profiling, data quality checks, and validation to ensure accuracy and consistency of data.
- Troubleshoot and resolve data pipeline issues, performance bottlenecks, and data discrepancies.
- Implement and maintain version control for data models and ETL jobs.
- Contribute to the development of data governance and data quality standards.
- Create and maintain comprehensive technical documentation for data models, ETL processes, and data flows.
- Participate in code reviews and provide constructive feedback to peers.

Required Skills and Qualifications:
- Proven experience as a Data Engineer / Developer with a focus on data transformation.
- Strong expertise in either DBT (Data Build Tool) for data modeling and transformation in modern data warehouses OR extensive experience with IBM DataStage for enterprise-level ETL.
- Solid understanding of data warehousing concepts, dimensional modeling, and ETL/ELT methodologies.
- Highly proficient in SQL (Structured Query Language) for complex data manipulation and querying.
- Experience with cloud data platforms like Snowflake, Google BigQuery, or Amazon Redshift (if DBT focused).
- Familiarity with version control systems (e.g., Git).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills (written and verbal) and ability to work collaboratively in a team environment.
- Knowledge of scripting languages (e.g., Python) is a plus.
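
Since dbt-core 1.5, dbt runs can also be scripted from Python, which is one way to wire the "write, test, and deploy" cycle above into a larger pipeline. A minimal sketch, assuming a dbt project already exists in the working directory; the model name stg_orders is a hypothetical example:

```python
# Minimal sketch: running and testing a dbt model programmatically.
# Requires dbt-core >= 1.5; the model name stg_orders is hypothetical.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Build the model, then run its tests: equivalent to `dbt run` / `dbt test`.
for args in (["run", "--select", "stg_orders"],
             ["test", "--select", "stg_orders"]):
    res: dbtRunnerResult = dbt.invoke(args)
    if not res.success:
        raise RuntimeError(f"dbt {args[0]} failed for stg_orders")
```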

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Join our Engineering team within the Corporate Operations Group, where you will work in a collaborative and supportive environment. Our team is integral to Macquarie Asset Management: we develop robust data platforms and deliver actionable insights that empower strategic decision-making and drive the business forward.

At Macquarie, our advantage is bringing together diverse people and empowering them to shape all kinds of possibilities. We are a global financial services group operating in 31 markets and with 56 years of unbroken profitability. You'll be part of a friendly and supportive team where everyone, no matter what role, contributes ideas and drives outcomes.

In this role, you will lead the design, development, and optimization of scalable data solutions and workflows. You will drive the team's success by delivering high-quality solutions while fostering a culture of innovation and collaboration. Your role will also focus on promoting continuous improvement to support business objectives.

What You Offer:
- Proven experience in leading and managing technical teams, with a focus on mentoring and collaboration.
- Strong hands-on experience with AWS cloud services (e.g., S3, Lambda, EC2, RDS, Redshift, Glue).
- Advanced proficiency in Python for data processing and automation.
- In-depth knowledge of data engineering tools, including dbt for data transformation and modeling.
- Expertise in writing complex SQL queries and optimizing database performance.

We love hearing from anyone inspired to build a better future with us; if you're excited about the role or working at Macquarie, we encourage you to apply.

Technology enables every aspect of Macquarie, for our people, our customers, and our communities. We're a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our aim is to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

You are a technical and hands-on Lead Data Engineer with over 8 years of experience, responsible for driving the modernization of data transformation workflows within the organization. Your primary focus will be migrating legacy SQL-based ETL logic to DBT-based transformations and designing a scalable, modular DBT architecture. You will also audit and refactor legacy SQL code for clarity, efficiency, and modularity.

In this role, you will lead the improvement of CI/CD pipelines for DBT, including automated testing, deployment, and code quality enforcement. Collaboration with data analysts, platform engineers, and business stakeholders is essential to understand current gaps and define future data pipelines. Additionally, you will own the Airflow orchestration redesign where necessary and define coding standards, review processes, and documentation practices. You will coach junior data engineers on DBT and SQL best practices and improve lineage and impact analysis using DBT's built-in tools and metadata.

Key qualifications include proven experience in migrating legacy SQL to DBT, a deep understanding of DBT best practices, proficiency in SQL performance tuning and query optimization, and hands-on experience with modern data stacks such as Snowflake or BigQuery. Strong communication and leadership skills are essential, as you will work cross-functionally and collaborate with teams across the organization. Exposure to Python, data governance and lineage tools, and mentoring experience are nice-to-have qualifications.

If you are passionate about modernizing data transformation workflows and driving technical excellence within the organization, this role is a perfect fit.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be joining the Analytics Engineering team at DAZN, where your primary responsibility will be transforming raw data into valuable insights that drive decision-making across our global business, including content, product development, marketing strategies, and revenue generation. Your role will involve constructing dependable and scalable data pipelines and models to ensure that data is easily accessible and actionable for all stakeholders.

As an Analytics Engineer with a minimum of 2 years of experience, you will play a crucial part in building and maintaining our data platform. Using tools such as dbt, Snowflake, and Airflow, you will create well-organized, well-documented, and reliable datasets. This hands-on position is perfect for individuals aiming to enhance their technical expertise while contributing significantly to our high-impact analytics operations.

Your key responsibilities will involve:
- Developing and managing scalable data models through the use of dbt and Snowflake
- Creating and coordinating data pipelines using Airflow or similar tools
- Collaborating with various teams within DAZN to transform business requirements into robust datasets
- Ensuring data quality through rigorous testing, validation, and monitoring procedures
- Adhering to best practices in code versioning, CI/CD processes, and data documentation
- Contributing to the enhancement of our data architecture and team standards

We are seeking individuals with:
- A minimum of 2 years of experience in analytics/data engineering or related fields
- Proficiency in SQL and a solid understanding of cloud data warehouses (preference for Snowflake)
- Familiarity with dbt for data modeling and transformation
- Knowledge of Airflow or other workflow orchestration tools
- Understanding of ELT processes, data modeling techniques, and data governance principles
- Strong communication and collaboration skills

Nice to have:
- Previous experience in media, OTT, or sports technology sectors
- Familiarity with BI tools such as Looker, Tableau, or Power BI
- Exposure to testing frameworks like dbt tests or Great Expectations (see the sketch below)
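
The testing frameworks named above, dbt tests and Great Expectations, codify checks such as not-null, uniqueness, and accepted values. A minimal, framework-free sketch of the same idea in pandas, using a hypothetical subscriptions dataset:

```python
# Minimal sketch of the checks dbt tests / Great Expectations automate:
# not-null, unique, and accepted-values, on a hypothetical dataset.
import pandas as pd

df = pd.DataFrame({
    "subscription_id": [1, 2, 3],
    "status": ["active", "cancelled", "active"],
})

# not_null: no missing subscription IDs
assert df["subscription_id"].notna().all(), "null subscription_id found"

# unique: subscription IDs must not repeat
assert df["subscription_id"].is_unique, "duplicate subscription_id found"

# accepted_values: status limited to a known set
allowed = {"active", "cancelled", "paused"}
bad = set(df["status"]) - allowed
assert not bad, f"unexpected status values: {bad}"

print("all data-quality checks passed")
```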

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You are a skilled Snowflake + Python + SQL Developer with 4-6 years of experience, ready to join a dynamic team. Your expertise lies in cloud data platforms, Python programming, and SQL database management. Experience with DBT (Data Build Tool) is a plus but not mandatory for this role.

Your primary responsibilities include designing, implementing, and managing data pipelines using Snowflake; developing and optimizing complex SQL queries for data extraction, transformation, and reporting; and handling large-scale data processing and integration using Python (see the sketch below).

Data modeling and optimization are crucial aspects of the role. You will develop and maintain Snowflake data models and warehouse architecture, and optimize data pipelines for performance and scalability. Collaboration is key: you will work closely with cross-functional teams to understand data needs and provide efficient solutions.

ETL development is another essential part of the role. You will develop and maintain ETL/ELT processes to support data analytics and reporting, using Python scripts and Snowflake tools for data transformation and integration. Monitoring performance, troubleshooting issues, and ensuring data integrity are also part of your responsibilities. Leveraging DBT for data transformation within Snowflake is optional but considered advantageous; you may also develop and maintain DBT models to improve the quality of data transformations.

Key skills and qualifications include hands-on experience with Snowflake, Python, and SQL; a strong understanding of SQL databases and data modeling concepts; and experience building scalable data pipelines and ETL/ELT processes using Python and Snowflake. Knowledge of data warehousing best practices, familiarity with cloud platforms such as AWS, Azure, or GCP, and an understanding of version control systems like Git are also beneficial.
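
For the "data processing and integration using Python" part of this role, the snowflake-connector-python package ships a pandas bridge. A minimal sketch, where the credentials and the SALES_RAW table are illustrative placeholders:

```python
# Minimal sketch: loading a pandas DataFrame into Snowflake.
# Requires snowflake-connector-python[pandas]; identifiers are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({"ORDER_ID": [101, 102], "AMOUNT": [250.0, 99.5]})

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="COMPUTE_WH", database="ANALYTICS", schema="RAW",
)

# write_pandas bulk-loads the frame via an internal stage and COPY INTO.
success, nchunks, nrows, _ = write_pandas(
    conn, df, table_name="SALES_RAW", auto_create_table=True
)
print(f"loaded {nrows} rows in {nchunks} chunk(s), success={success}")
conn.close()
```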

Posted 1 week ago

Apply

4.0 - 9.0 years

0 - 0 Lacs

Hyderabad, Chennai

Hybrid

Job Description:
- Design, develop, and maintain data pipelines and ETL processes using AWS and Snowflake.
- Implement data transformation workflows using DBT (Data Build Tool).
- Write efficient, reusable, and reliable code in Python.
- Optimize and tune data solutions for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Stay updated with the latest industry trends and technologies in data engineering.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or similar role.
- Strong proficiency in AWS and Snowflake.
- Expertise in DBT and Python programming.
- Experience with data modeling, ETL processes, and data warehousing.
- Familiarity with cloud platforms and services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad, Pune

Work from Office

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences to drive growth and innovation at scale. The ideal candidate will have a strong Data Engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
- Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
- Take the lead in analyzing, designing, and implementing data solutions, including constructing and designing data models and ETL processes.
- Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
- Lead and mentor engineering discussions, advocating for best practices.
- Actively participate in design and code reviews.
- Access and explore third-party data APIs to determine the data required to meet business needs (see the sketch below).
- Ensure data quality and integrity across different sources and systems.
- Manage data pipelines for both analytics and operational purposes.
- Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
- Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
- Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments.
- Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment.
- Exhibit a strong background in developing data products, APIs, and maintaining testing, monitoring, isolation, and SLA processes.
- Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
- Are proficient in programming with Python or other scripting languages.
- Have familiarity with columnar OLAP databases and data modeling.
- Have experience building ELT/ETL processes using tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau.
- Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements.

Added bonus if you also have:
- A good understanding of Salesforce & NetSuite systems
- Experience in SaaS environments
- Designed and deployed ML models
- Experience with events and streaming data
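
The "third-party data APIs" bullet above usually reduces to paginated HTTP pulls. A minimal sketch with the requests library; the endpoint, token, and cursor-based pagination scheme are all illustrative assumptions, not a real vendor API:

```python
# Minimal sketch: pulling paginated records from a hypothetical third-party API.
# Endpoint, token, and pagination scheme are illustrative assumptions.
import requests

BASE_URL = "https://api.example.com/v1/invoices"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}      # placeholder token

def fetch_all_invoices():
    """Follow cursor-style pagination until the API stops returning a cursor."""
    records, cursor = [], None
    while True:
        params = {"limit": 100}
        if cursor:
            params["cursor"] = cursor
        resp = requests.get(BASE_URL, headers=HEADERS, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["data"])
        cursor = payload.get("next_cursor")
        if not cursor:
            return records

if __name__ == "__main__":
    print(f"fetched {len(fetch_all_invoices())} invoice records")
```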

Posted 1 week ago

Apply

4.0 - 8.0 years

25 - 30 Lacs

Pune

Hybrid

Hi, Greetings!

Role: Data Engineer
Experience: 4+ years
Location: Pune (Hybrid, 3 days in office per week)
Work Model: Hybrid
Main Skills: Data Engineer with Java, ETL, Apache Airflow, SQL

Key Responsibilities:
- Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation.
- Develop backend components and data processing logic using Java.
- Build and maintain DAGs in Apache Airflow for orchestration and automation of data workflows (see the sketch below).
- Ensure the reliability, scalability, and efficiency of data pipelines for ingestion, transformation, and storage.
- Work with cross-functional teams to understand data needs and deliver high-quality solutions.
- Troubleshoot and resolve data pipeline issues in production environments.
- Apply data quality and governance best practices, including validation, logging, and monitoring.
- Collaborate on CI/CD deployment pipelines for data infrastructure.

Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong experience with DBT for modular, testable, and version-controlled data transformation.
- Proficient in Java, especially for building custom data connectors or processing frameworks.
- Deep understanding of Apache Airflow and the ability to design and manage complex DAGs.
- Solid SQL skills and familiarity with data warehouse platforms (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with version control tools (Git), CI/CD pipelines, and Agile methodologies.
- Exposure to cloud environments like AWS, GCP, or Azure.
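
An Airflow DAG of the kind this role maintains is a small Python module. A minimal sketch, assuming Airflow 2.x and a hypothetical dbt project at /opt/dbt; the DAG id, schedule, and paths are illustrative:

```python
# Minimal sketch: an Airflow 2.x DAG that runs a dbt transformation daily.
# Paths, schedule, and task names are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transform",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Run dbt models, then their tests, as two dependent tasks.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt && dbt test",
    )
    dbt_run >> dbt_test
```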

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Lead Data Engineer, you will lead cloud modernization initiatives, develop scalable data pipelines, and enable real-time data processing for enterprise-level systems. Your expertise in Google Cloud Platform (GCP) and BigQuery will be crucial in transforming legacy infrastructure into a robust, cloud-native data ecosystem.

Your key responsibilities will include analyzing legacy on-premises and hybrid cloud data warehouse environments, leading the migration of large-scale datasets to Google BigQuery, and designing data migration strategies that preserve data quality, integrity, and performance. You will also integrate data from various structured and unstructured sources, build real-time streaming pipelines for large-scale ingestion and processing of IoT and telemetry data, and modernize legacy SSIS packages into cloud-native ETL pipelines.

To excel in this role, you should have at least 5 years of experience in Data Engineering with a strong focus on cloud and big data technologies, along with a minimum of 2 years of hands-on experience with GCP, specifically BigQuery. Experience migrating on-premise data systems to the cloud, development with Apache Airflow, Python, and Apache Spark, and expertise in streaming data ingestion are highly valuable. Strong SQL development skills and an understanding of cloud architecture, data modeling, and data warehouse design are essential.

Preferred qualifications include a GCP Professional Data Engineer certification, experience with modern data stack tools like dbt, Kafka, or Terraform, and exposure to ML pipelines, analytics engineering, or DataOps/DevOps methodologies. Joining us will give you the opportunity to work with cutting-edge technologies in a fast-paced, collaborative environment, lead cloud transformation initiatives at scale, and benefit from competitive compensation and remote flexibility with growth opportunities.
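
Querying migrated data in BigQuery is typically done through the official google-cloud-bigquery client. A minimal sketch, assuming application-default credentials are configured; the analytics.events table is a hypothetical example:

```python
# Minimal sketch: a parameterized BigQuery query with the official client.
# Assumes application-default credentials; dataset/table names are placeholders.
import datetime

from google.cloud import bigquery

client = bigquery.Client()  # picks up the default GCP project

query = """
    SELECT event_type, COUNT(*) AS n
    FROM `analytics.events`
    WHERE event_date >= @since
    GROUP BY event_type
    ORDER BY n DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("since", "DATE", datetime.date(2024, 1, 1))
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.event_type, row.n)
```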

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Architect with over 4 years of experience, you will design and optimize data pipelines that integrate various data sources to support business intelligence and advanced analytics. Your role will involve developing data models and flows that enable personalized customer experiences and support omnichannel marketing and customer engagement.

You will lead efforts to ensure data governance, data quality, and data security, ensuring compliance with regulations such as GDPR and CCPA. Additionally, you will implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs. Your responsibilities also include optimizing workflows to streamline data transformation and modeling processes, and leveraging Azure for cloud infrastructure, data storage, and real-time data analytics, ensuring the architecture supports scalability and performance.

Collaboration with cross-functional teams, including data engineers, analysts, and business stakeholders, will be essential to ensure that data architectures meet business needs. Supporting both real-time and batch data integration will be crucial to making data accessible for actionable insights and decision-making. You will also continuously assess and integrate new data technologies and methodologies to enhance the organization's data capabilities.

To qualify, you should have at least 4 years of experience in Data Architecture or Data Engineering, with expertise in Snowflake and Azure. A strong understanding of data modeling, ETL/ELT processes, and modern data architecture frameworks is required, along with experience designing scalable data architectures for personalization and customer analytics across marketing, sales, and customer service domains. Expertise with cloud data platforms (preferably Azure) and Big Data technologies for large-scale data processing, plus hands-on experience with Python for data engineering tasks and scripting, are also preferred.

Primary Skills:
- Around 3+ years of relevant hands-on experience in DBT, Snowflake, CI/CD, Python (nice to have), and SQL
- Takes ownership of tasks
- Eager to learn, good communication skills, enthusiastic about upskilling

If you are a motivated and experienced Data Architect looking to work on challenging projects in a dynamic environment, we encourage you to apply for this position.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Manager at Autodesk, you will lead the BI and Data Engineering team in developing and implementing business intelligence solutions. Your role is crucial in empowering decision-makers through trusted data assets and scalable self-serve analytics. You will oversee the design, development, and maintenance of data pipelines, databases, and BI tools to support data-driven decision-making across the CTS organization.

Reporting to the leader of the CTS Business Effectiveness department, you will collaborate with stakeholders to define data requirements and objectives. Your responsibilities will include leading and managing a team of data engineers and BI developers, fostering a collaborative team culture, managing data warehouse plans, ensuring data quality, and delivering impactful dashboards and data visualizations. You will also collaborate with stakeholders to translate technical designs into business-appropriate representations, analyze business needs, and create data tools for analytics and BI teams. Staying up to date with data engineering best practices and technologies is essential to keep the company ahead of the industry.

To qualify for this role, you should have 3 to 5 years of experience managing data teams and a BA/BS in Data Science, Computer Science, Statistics, Mathematics, or a related field. Proficiency in Snowflake, Python, SQL, Airflow, Git, and big data environments like Hive, Spark, and Presto is required. Experience with workflow management, data transformation tools, and version control systems is preferred. Additionally, familiarity with Power BI, the AWS environment, Salesforce, and remote team collaboration is advantageous. The ideal candidate is a data ninja and leader who can derive insights from disparate datasets, understand Customer Success, tell compelling stories using data, and engage business leaders effectively.

At Autodesk, we are committed to creating a culture where everyone can thrive and realize their potential. Our values and ways of working help our people succeed, leading to better outcomes for our customers. If you are passionate about shaping the future and making a meaningful impact, join us in our mission to turn innovative ideas into reality.

Autodesk offers a competitive compensation package based on experience and location. In addition to base salaries, we provide discretionary annual cash bonuses, commissions, stock grants, and a comprehensive benefits package. If you are interested in a sales career at Autodesk or want to learn more about our commitment to diversity and belonging, please visit our website for more information.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a part-time Mental Health Therapist at Spring Health, you will contribute to our mission of eliminating every barrier to mental health. Spring Health is a comprehensive mental health benefit provider for employers, dedicated to helping employees understand their mental health issues and connecting them with top-tier providers for appropriate treatment.

Your role will involve providing counseling to clients with various benefits, treating adults based on online screenings, and potentially expanding your practice to include other populations such as children, adolescents, couples, or families. It is crucial to maintain a calendar of availability, document sessions promptly, and utilize evidence-based treatment modalities tailored to each client's needs.

Collaboration is key at Spring Health, and you will work closely with our provider support team, licensed clinical care navigation team, and administrative care support team whenever required. Upholding values of diversity and inclusion, you will provide culturally competent and empathetic care to individuals of all backgrounds.

To be successful in this role, you should be a qualified counselor licensed to practice in India with a minimum of 3 years of post-training experience. Comfort with technology, tele-health competence, and experience in evidence-based modalities such as CBT, DBT, EMDR, ACT, and CPT are essential. You should also be capable of providing safety planning and intervention during high-risk situations.

Spring Health offers a flexible work environment where you can set your own schedule, work remotely, and benefit from administrative support that allows you to focus on clinical care. Feedback from evidence-based measures will help you enhance your clinical abilities, while the supportive community and collaborative opportunities with other providers and clinical care navigators will enrich your professional experience. If you are applying for a hybrid role, please note that you will need your own office space.

At Spring Health, we value member advocacy, urgency in making a difference, accountability, diversity, innovation, and open communication. Join us in shaping the future of mental health care while delivering high-quality clinical services in a supportive and dynamic environment.

Posted 1 week ago

Apply

6.0 - 8.0 years

4 - 8 Lacs

Hosur, Bengaluru

Work from Office

Work Location: BANGALORE, KA
Skill Required - Digital: Snowflake
Experience Range in Required Skills: 6-8 years

Job Description: 4+ years of Snowflake developer experience; DBT is mandatory, with 3+ years of relevant experience. Mandatory skills: Snowflake, Airflow, DBT, Python. Excellent problem-solving and analytical skills.

Posted 1 week ago

Apply

10.0 - 15.0 years

20 Lacs

Chennai

Work from Office

Candidate Specification: Any Graduate, with a minimum of 10+ years of relevant experience.

Job Description: Strong hands-on experience with Snowflake, Redshift, and BigQuery. Proficiency in Data Build Tool (DBT) and SQL-based data modeling and transformation. Solid understanding of data warehousing concepts, star/snowflake schemas, and performance optimization. Experience with modern ETL/ELT tools and cloud-based data pipeline frameworks. Familiarity with version control systems (e.g., Git) and CI/CD practices for data workflows. Strong problem-solving skills and attention to detail. Should have excellent interpersonal skills.

Contact Person: Deepikad
Email ID: deepikad@gojobs.biz

Posted 1 week ago

Apply

4.0 - 5.0 years

10 - 16 Lacs

Pune

Work from Office

High Priority Requirement

We have a new, high-priority offshore requirement; candidates must be able to pass the SQL and Python portions of the Glider test prior to submittal. Our client is looking to interview ASAP.

Rate: The client is looking for mid/senior-level experience, so let us know if the budget is off or if this is doable with the multiple openings.
Hours: 8am-4pm EST or 4am-1pm EST; there are openings for both shifts.
Interview Rounds: 3 rounds

Top Skills:
- Strong SQL
- Strong Python

Nice to Have:
- Snowflake
- Databricks
- dbt

Position Summary: The Senior Data Operations Engineer is a crucial team member, providing essential data support within the modern data stack, alongside coaching and hands-on assistance to engineers, analysts, business users, data scientists, and decision-makers across the company. This role demands deep knowledge of SQL, Python, and optimization, as well as familiarity with tools like Snowflake, Databricks, Azure Cloud, DBT, and Git version control. Senior Data Operations Engineers play a key role in managing and enhancing data workflows, combining technical skills with a thorough understanding of data management principles. They contribute to coding new features, assist other principal engineers with architectural plans, and conduct code reviews to ensure that new features and fixes efficiently meet stakeholder needs. Ideal candidates should enjoy teamwork and be adept at publicly sharing their ideas. The Senior Data Operations Engineer reports to the Manager of IT Data Operations Engineering.

Essential Responsibilities:
- Collaborate with business partners to comprehend external system configurations and establish connectivity, facilitating downstream data engineering development
- Oversee the entire data pipeline, from data collection to deployment of data models
- Monitor data pipeline performance and support bug fixing and performance analysis along the data pipeline; resolve any issues or bottlenecks
- Apply extensive experience in optimization and cost savings, with a proven track record of effectively managing and enhancing data workflows to reduce overhead and improve operational efficiency in data operations
- Perform end-to-end unit testing and code reviews to promote data integrity across a variety of products built by the development team
- Identify and implement process improvements, such as automating manual processes
- Provide technical support and training to end-users on data access and usage
- Be comfortable presenting to large groups in public settings with high visibility
- Be a strong advocate for a culture of process and data quality across development teams
- Follow an agile development methodology
- Other duties as assigned

Minimum Experience and Qualifications:
- Bachelor's degree in Computer Science or Engineering; OR demonstrated capability to perform job responsibilities with a combination of a High School Diploma/GED and at least four (4) years of previous relevant work experience
- Five (5) years of relevant experience in a data role working with data warehouses and data analytics tools
- Familiarity with cloud services (AWS, Azure, or Google Cloud) and understanding of data warehousing solutions like Snowflake
- Proficiency with SQL
- Experience with modern Extract/Load/Transform (ELT) orchestration tools like Azure Data Factory or Airflow
- Experience with Git and Git-based workflows
- Experience in optimization and enhancement of cloud environments
- Knowledge of data modeling, data warehousing, and data architecture principles
- Excellent problem-solving skills and the ability to work in a team environment
- Strong communication skills and the ability to convey complex data issues in clear terms to non-technical stakeholders
- Must be legally eligible to work in the country in which the position is located

Preferred Experience and Qualifications:
- Experience implementing best practices for performance tuning and data processes, optimizing resource utilization, and implementing cost-effective solutions to enhance data operations
- Strong knowledge of Python programming
- Proven track record of successfully contributing to a project that transitioned a large enterprise to a new cloud data warehouse, like Snowflake
- Prior airline experience

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 11 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Title: Developer
Work Location: BANGALORE, KA
Skill Required - Digital: Snowflake
Experience Range in Required Skills: 6-8 years

Posted 1 week ago

Apply

4.0 - 6.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Responsibilities:
Role name: Developer
Role Description: Data Engineer with Python, Snowflake, DBT, AWS/Azure
Experience (Years): 4-6
Essential Skills: Data Engineer with Python, Snowflake, DBT, AWS/Azure
Desirable Skills: Data Engineer with Python, Snowflake, DBT, AWS/Azure
Location: BANGALORE

Posted 1 week ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Kanpur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning); see the sketch below
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
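
The Snowflake tuning work listed above (clustering for partition pruning, inspecting clustering health) is again plain SQL. A minimal sketch via snowflake-connector-python, where the credentials and the fact_sales table are illustrative placeholders:

```python
# Minimal sketch: Snowflake performance tuning via a clustering key.
# Credentials and the fact_sales table are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="COMPUTE_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Add a clustering key so range filters on sale_date prune micro-partitions.
cur.execute("ALTER TABLE fact_sales CLUSTER BY (sale_date)")

# Inspect how well the table is clustered on that key.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date)')")
print(cur.fetchone()[0])  # JSON report: clustering depth, overlap, histogram

conn.close()
```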

Posted 1 week ago

Apply