
398 dbt Jobs - Page 8

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

6 - 9 Lacs

Noida, Uttar Pradesh, India

On-site

We are urgently hiring a Snowflake Developer for a reputed client in Noida. Experience: 6-10 years. Looking for IMMEDIATE JOINERS who can start by the 3rd week of June. Role: Snowflake Developer. Skills: Python, SQL, DBT.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Noida, Gurugram

Hybrid

Skills Matrix: Snowflake, Data Build Tool (DBT), SQL. Snowflake Developer openings are for Gurugram/Noida only.

Posted 3 weeks ago

Apply

10.0 - 20.0 years

25 - 30 Lacs

Bengaluru

Remote

Role & responsibilities
- Data Platform: Snowflake, dbt, Fivetran, Oracle OCI
- Visualization: Tableau
- Cloud & Identity: Azure, Microsoft Entra (Entra ID / Azure AD)
- Infrastructure as Code: OpenTofu (Terraform alternative); migration from Terraform
- Scripting & Monitoring: SQL, Python/Bash, monitoring tools

Posted 3 weeks ago

Apply

10.0 - 20.0 years

25 - 35 Lacs

Pune

Remote

Role & responsibilities
We are seeking a Production Support Lead with expertise in modern data platforms to oversee the reliability, performance, and user access control of our analytics and reporting environment. This individual will lead operational support across tools like Snowflake, dbt, Fivetran, Tableau, and Azure Entra, ensuring compliance and high data availability. The ideal candidate will not only resolve technical issues but also guide the team in scaling and automating platform operations.

Key Responsibilities:
- Snowflake Platform Management: Oversee production operations, including query performance, object dependencies, warehouse sizing, replication setup, and role management (see the sketch below).
- Data Ingestion Support: Monitor and manage Fivetran pipelines for data ingestion from Oracle OCI to Snowflake.
- Transformation Layer Oversight: Maintain and troubleshoot dbt jobs, ensuring timely and accurate data transformations.
- Tableau Operations: Validate data sources, dashboard usage, performance, and version control; manage user access and role-level security; ensure SOX compliance for reporting environments.
- Access & Identity Management: Administer access control via Microsoft Entra, including mapping users/groups to appropriate roles across environments.
- IaC Operations: Lead and maintain OpenTofu (Terraform alternative) deployments for infrastructure provisioning.
- Monitoring & Alerts: Set up automated monitoring tools and alerts for proactive detection of system anomalies.
- Incident & Problem Management: Lead root cause analysis for production issues and coordinate with cross-functional teams on permanent fixes.
- Compliance & Governance: Ensure the platform is audit-ready; document SOPs, user access logs, and remediation procedures.

Required Skills:
- Strong knowledge and hands-on experience with Snowflake, dbt, Tableau, Fivetran, Oracle OCI, and OpenTofu/Terraform
- Proven ability in Azure platform administration, especially Entra ID (formerly Azure AD)
- Experience with access governance, data security, and SOX compliance
- Proficiency in SQL, scripting (e.g., Python or Bash), and monitoring tools
- Comfort leading triage calls, managing escalations, and coordinating between dev, infra, and business teams

Preferred Qualifications:
- Experience leading migrations (e.g., Terraform to OpenTofu)
- ITIL or equivalent certification in incident/change/problem management
- Experience working in highly regulated industries (finance, healthcare, manufacturing)

Role Expectations:
- Act as a bridge between engineering, infra, and business stakeholders
- Own SLAs, root cause reviews, and system documentation
- Mentor junior support staff and onboard new hires
- Recommend and implement automation for recurring issues and manual workflows
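For context on the routine warehouse sizing and role-management work described above, a minimal Snowflake SQL sketch (illustrative only; the warehouse, database, schema, and role names are hypothetical, not from the posting):

```sql
-- Resize a warehouse to relieve query queuing (names are hypothetical).
ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Auto-suspend after 5 minutes of idle time to control cost.
ALTER WAREHOUSE reporting_wh SET AUTO_SUSPEND = 300;

-- Grant read-only access on a reporting schema to an analyst role.
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;
```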

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Roles and Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Should have experience building data ingestion pipelines.
- Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight (see the Streams and Tasks sketch below).
- Should have good experience implementing CDC or SCD type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skill sets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts

Why join Genpact?
- Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
- Make an impact - Drive change for global enterprises and solve business challenges that matter
- Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
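As an illustration of the Streams and Tasks utilities listed above, a minimal hedged sketch of a change-capture merge in Snowflake (all object names are hypothetical, not taken from the posting):

```sql
-- Capture row-level changes on a staging table (hypothetical names).
CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

-- A task that merges captured changes on a schedule, only when the
-- stream actually has data to process.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
MERGE INTO analytics.orders AS t
USING orders_stream AS s
  ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
  VALUES (s.order_id, s.status, s.updated_at);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```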

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Pune

Remote

Role & responsibilities
- Minimum 5+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques (see the sketch below).
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects using Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.
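For reference, the clustering-based optimization this posting mentions typically looks like the following in Snowflake (a sketch; table and column names are hypothetical):

```sql
-- Define a clustering key so micro-partition pruning works well for
-- date- and region-filtered queries (hypothetical names).
ALTER TABLE analytics.fact_sales CLUSTER BY (sale_date, region);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.fact_sales',
                                     '(sale_date, region)');
```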

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Bengaluru

Hybrid

Location: Bengaluru (Hybrid)

We are looking for an experienced Senior Data Engineer to join our Marketing Data Engineering Team. Reporting to the Manager, Data Engineering, this is a hybrid position based in Bengaluru. The Data Engineer will help expand and optimize our data and data pipeline architecture, as well as optimize data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support Software Developers, Data Quality Engineers, Data Analysts, and Data Scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

Responsibilities
- Create and maintain optimal data pipeline architecture.
- Assemble complex data sets that meet functional/non-functional requirements.
- Identify, design, and implement internal process improvements, including automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, dbt, and AWS big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into employee experience, operational efficiency, and other key business performance metrics.
- Work with stakeholders to assist with data-related technical issues and support associated data infrastructure needs.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Stay up to date with the latest features and capabilities from public cloud providers (AWS, Azure) and apply them to enhance the team.
- Collaborate with data scientists and analysts to strive for greater functionality in our data systems.

Minimum Qualifications
- 5+ years of experience as a Data Engineer.
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
- 5+ years of hands-on experience with Snowflake.
- 5+ years of experience with dbt, including advanced dbt concepts like macros and Jinja templating (see the sketch below).
- Advanced working SQL experience with relational databases and query authoring.
- Experience with scripting languages such as Python.
- Experience with big data tools such as PySpark.
- Experience with AWS cloud services commonly used for data engineering, including S3, EC2, Glue, Lambda, RDS, or Redshift.
- Experience working with APIs to pull and push data.
- Experience optimizing big data pipelines, architectures, and data sets.
- Experience performing root cause analysis on data and processes to answer business questions and identify improvement opportunities.

Preferred Qualifications
- Experience with AWS CloudFormation templates is a plus.
- Familiarity with Agile and SCRUM methodologies is a plus.
- Experience developing dashboards with Power BI is a plus.
- Analytical skills for working with unstructured datasets.
- Proven history of extracting value from large, disconnected datasets.
- Experience working with agile, globally distributed teams.
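As a pointer to the "advanced dbt concepts like macros and Jinja templating" in the qualifications, a small illustrative sketch (the macro, model, and column names are made up):

```sql
-- macros/cents_to_dollars.sql -- a reusable macro (hypothetical)
{% macro cents_to_dollars(column_name, precision=2) %}
    round({{ column_name }} / 100.0, {{ precision }})
{% endmacro %}

-- models/payment_totals.sql -- a Jinja loop generates one column
-- per payment method at compile time
select
    order_id,
    {% for method in ['card', 'bank_transfer', 'wallet'] %}
    sum(case when payment_method = '{{ method }}'
             then {{ cents_to_dollars('amount_cents') }} else 0 end)
        as {{ method }}_amount{% if not loop.last %},{% endif %}
    {% endfor %}
from {{ ref('raw_payments') }}
group by order_id
```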

Posted 3 weeks ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Noida

Work from Office

Hi All, we are urgently hiring a Snowflake Developer for a reputed client in Noida. Experience: 6-10 years. Looking for IMMEDIATE JOINERS only. Role: Snowflake Developer. Skills: Python, SQL, DBT. Interested candidates, kindly share your resume at varsha.si@peoplefy.com.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

5 - 10 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Design, develop, and maintain data pipelines using Python and SQL.
- Write efficient, optimized SQL queries for data extraction, transformation, and reporting.
- Automate data workflows and integrate APIs or third-party services.
- Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.
- Perform data validation, cleansing, and quality checks.
- Develop dashboards or reports using BI tools (optional, if applicable).
- Document processes, code, and data models for future reference.

Required Skills & Qualifications:
- Strong proficiency in Python (Pandas, NumPy, etc.).
- Advanced knowledge of SQL (joins, subqueries, CTEs, window functions); see the sketch below.
- Experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Familiarity with version control systems like Git.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, GCP).
- Familiarity with data visualization tools (e.g., Power BI, Tableau).
- Knowledge of ETL tools or frameworks (e.g., Airflow, dbt).
- Background in data warehousing or big data technologies.

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
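For a concrete sense of the CTEs and window functions listed under required skills, a short illustrative query (the schema is hypothetical): find the most recent event per customer.

```sql
-- A CTE plus ROW_NUMBER() window function: latest record per customer.
WITH ranked AS (
    SELECT
        customer_id,
        event_type,
        event_ts,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY event_ts DESC
        ) AS rn
    FROM events
)
SELECT customer_id, event_type, event_ts
FROM ranked
WHERE rn = 1;
```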

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Databricks Architect!

In this role, the Databricks Architect is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Responsibilities
- Architect and design solutions to meet functional and non-functional requirements.
- Create and review architecture and solution design artifacts.
- Evangelize re-use through the implementation of shared assets.
- Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc.
- Proactively guide engineering methodologies, standards, and leading practices.
- Guide engineering staff and review as-built configurations during the construction phase.
- Provide insight and direction on roles and responsibilities required for solution operations.
- Identify, communicate, and mitigate risks, assumptions, issues, and decisions throughout the full lifecycle.
- Consider the art of the possible, compare various architectural options based on feasibility and impact, and propose actionable plans.
- Demonstrate strong analytical and technical problem-solving skills.
- Analyze and operate at various levels of abstraction.
- Balance what is strategically right with what is practically realistic.
- Grow the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, and by designing and driving the implementation of those solutions.
- Grow and retain the Data Engineering team with appropriate skills and experience to deliver high-quality services to our customers.
- Support and develop our people, including learning & development, certification, and career development plans.
- Provide technical governance and oversight for solution design and implementation.
- Have the technical foresight to understand new technology and advancements.
- Lead the team in defining best practices and repeatable methodologies in Cloud Data Engineering, including data storage, ETL, data integration and migration, data warehousing, and data governance.
- Have technical experience in Azure, AWS, and GCP cloud data engineering services and solutions.
- Contribute to sales and pre-sales activities including proposals, pursuits, demonstrations, and proof-of-concept initiatives.
- Evangelize the Data Engineering service offerings to both internal and external stakeholders.
- Develop whitepapers, blogs, webinars, and other thought-leadership material.
- Develop go-to-market and service offering definitions for Data Engineering.
- Work with Learning & Development teams to establish appropriate learning and certification paths for the domain.
- Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, doubling up as their trusted business technology advisor.
- Position differentiated and custom solutions to clients, based on market trends, the specific needs of the clients, and the supporting business cases.
- Build new data capabilities, solutions, assets, accelerators, and team competencies.
- Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary.

Qualifications we seek in you!

Minimum qualifications
- Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions.
- Excellent interpersonal communication and organizational skills, required to operate as a leading member of global, distributed teams that deliver quality services and solutions.
- Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team.
- Knowledge of and experience in the IT methodologies and life cycles that will be used.
- Familiarity with solution implementation/management, service/operations management, etc.
- Leadership skills: able to inspire and persuade others.
- Maintains close awareness of new and emerging technologies and their potential application for service offerings and products.
- Bachelor's degree or equivalency (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Experience in a solution architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms.
- Experience in architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities.
- Must have strong hands-on experience with various cloud services, such as ADF/Lambda, ADLS/S3, security, monitoring, and governance.
- Must have experience designing platforms on Databricks.
- Hands-on experience designing and building Databricks-based solutions on any cloud platform.
- Hands-on experience designing and building solutions powered by DBT models and integrating them with Databricks.
- Must be very good at designing end-to-end solutions on cloud platforms.
- Must have good knowledge of data engineering concepts and related cloud services.
- Must have good experience in Python and Spark.
- Must have good experience in setting up development best practices.
- Intermediate-level knowledge of data modelling is required.
- Good to have knowledge of Docker and Kubernetes.
- Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc.
- Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
- Experience building and supporting mission-critical technology components with DR capabilities.
- Experience with multi-tier system and service design and development for large enterprises.
- Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures, with a focus on cloud technologies.
- Exposure to infrastructure and application security technologies and approaches.
- Familiarity with requirements-gathering techniques.

Preferred qualifications
- Must have designed the end-to-end architecture of a unified data platform covering all aspects of the data lifecycle: data ingestion, transformation, serving, and consumption.
- Must have excellent coding skills in either Python or Scala, preferably Python.
- Must have experience in the Data Engineering domain.
- Must have designed and implemented at least 2-3 projects end-to-end in Databricks.
- Must have experience with Databricks components such as:
  - Delta Lake
  - dbConnect
  - db API 2.0
  - SQL Endpoint (Photon engine)
  - Unity Catalog
  - Databricks Workflows orchestration
  - Security management
  - Platform governance
  - Data security
- Must have knowledge of new features available in Databricks and their implications, along with various possible use cases.
- Must have followed various architectural principles to design the solution best suited to each problem.
- Must be well versed in the Databricks Lakehouse concept and its implementation in enterprise environments (a Delta sketch follows below).
- Must have a strong understanding of data warehousing and the various governance and security standards around Databricks.
- Must have knowledge of cluster optimization and its integration with various cloud services.
- Must have a good understanding of how to create complex data pipelines.
- Must be strong in SQL and Spark SQL.
- Must have strong performance-optimization skills to improve efficiency and reduce cost.
- Must have worked on designing both batch and streaming data pipelines.
- Must have extensive knowledge of the Spark and Hive data processing frameworks.
- Must have worked on any cloud (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Must be strong in writing unit tests and integration tests.
- Must have strong communication skills and have worked with cross-platform teams.
- Must have a great attitude towards learning new skills and upskilling existing skills.
- Responsible for setting best practices around Databricks CI/CD.
- Must understand composable architecture to take fullest advantage of Databricks capabilities.
- Good to have REST API knowledge.
- Good to have an understanding of cost distribution.
- Good to have worked on a migration project to build a unified data platform.
- Good to have knowledge of DBT.
- Experience with DevSecOps, including Docker and Kubernetes.
- Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools.
- Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc.
- Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, Alteryx.
- Experience with visualization tools such as Tableau, Power BI.
- Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, AWS SageMaker, etc.
- Experience distilling complex technical challenges into actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary.
- Experience coordinating the intersection of complex system dependencies and interactions.
- Experience in solution delivery using common methodologies, especially SAFe Agile, but also Waterfall, Iterative, etc.
- Demonstrated knowledge of relevant industry trends and standards.

Why join Genpact?
- Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
- Make an impact - Drive change for global enterprises and solve business challenges that matter
- Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
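As background for the Delta Lake and Lakehouse items above, a minimal Databricks SQL sketch of a Delta table upsert (the catalog, schema, and table names are hypothetical, not from the posting):

```sql
-- Create a Delta table in a Unity Catalog namespace (hypothetical names).
CREATE TABLE IF NOT EXISTS lakehouse.silver.customers (
  customer_id BIGINT,
  email       STRING,
  updated_at  TIMESTAMP
) USING DELTA;

-- Upsert incoming changes from a bronze-layer staging table.
MERGE INTO lakehouse.silver.customers AS t
USING lakehouse.bronze.customer_updates AS s
  ON t.customer_id = s.customer_id
WHEN MATCHED THEN
  UPDATE SET t.email = s.email, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN
  INSERT (customer_id, email, updated_at)
  VALUES (s.customer_id, s.email, s.updated_at);
```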

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Should have experience building data ingestion pipelines.
- Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Should have good experience implementing CDC or SCD type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skill sets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts

Why join Genpact?
- Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
- Make an impact - Drive change for global enterprises and solve business challenges that matter
- Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Role & responsibilities
- Minimum 5+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques.
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects using Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.

Posted 1 month ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role & responsibilities - Snowflake JD
• Must have 8-15 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5+ years of experience in Snowflake and 3+ years in DBT
• Expertise in Snowflake architecture is a must
• Must have at least 3+ years of experience and a strong command of Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts (a sketch follows below)
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable
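For a sense of the stored-procedure work this posting describes, a minimal hedged sketch in Snowflake SQL Scripting (the object names and archiving logic are hypothetical):

```sql
-- Archive orders older than a cutoff, returning a summary string.
CREATE OR REPLACE PROCEDURE archive_old_orders(cutoff_days INTEGER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  cutoff DATE;
  moved  INTEGER;
BEGIN
  cutoff := DATEADD(day, -cutoff_days, CURRENT_DATE());
  INSERT INTO analytics.orders_archive
    SELECT * FROM analytics.orders WHERE order_date < :cutoff;
  moved := SQLROWCOUNT;  -- rows affected by the last DML statement
  DELETE FROM analytics.orders WHERE order_date < :cutoff;
  RETURN 'Archived ' || moved || ' rows';
END;
$$;

CALL archive_old_orders(365);
```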

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Pune

Remote

Role & responsibilities
- Minimum 5+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques.
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects using Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.

Posted 1 month ago

Apply

7.0 - 12.0 years

6 - 16 Lacs

Bengaluru

Remote

- 5+ years' experience with strong SQL query/development skills
- Hands-on experience with ETL tools
- Experience working in the healthcare industry with PHI/PII

Posted 1 month ago

Apply

2.0 - 7.0 years

6 - 8 Lacs

Mumbai, Hyderabad, Bengaluru

Work from Office

Hiring Requirements
1. Work Arrangement
- Must be available to do in-person sessions 5 days a week
- Must be open to doing evening sessions (9 PM - 6 AM); the time slot will be approved by clients
- Must be open to a contract role (contract ends after 2 years)
2. Education
- Graduated with a Master's degree in Clinical Psychology, Counselling Psychology, or Counselling
3. Experience
- Minimum of 3 years of clinical experience (including experience gained during the Master's degree)
- Minimum 500 supervised clinical hours (including hours completed during the Master's degree)
- Supervision requirement: open to receiving supervision once every 2 months
4. License (must have a minimum of 1)
- Counsellors Council of India
- Bharatiya Counseling Psychology Association
- Clinical Psychology Society of India
- Rehabilitation Council of India
5. Location
- Hyderabad
- Mumbai
- Bangalore

Job Description
- Provide onsite/in-person mental health counselling for the employees of Intellect's local clients in India
- Report to the client's office for morning, evening, or overnight shifts (depending on the client's request); shifts may be rotated depending on the provided availability, and the provider will be notified of changes at least 30 days in advance
- Facilitate 1-on-1 sessions on various mental health topics (e.g., resilience, stress management, and available well-being resources)
- Lead group sessions on coping strategies for managing stress and challenging information
- Moderate peer support sessions to encourage shared experiences and well-being discussions
- Develop and deliver group sessions and workshops to promote mental health awareness and resilience
- Share a monthly report on utilisation/participation
- Work collaboratively with Intellect's internal clinical team by suggesting improvements to the program and the platform
- Conduct a once-a-month, 1-hour check-in with the internal Intellect team during regular working hours (10 AM - 6 PM Singapore time)

Interested? Please share your updated CV with us at mhn@hireindians.com or call 8700943881

Posted 1 month ago

Apply

9.0 - 11.0 years

0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Snowflake, SQL, stored procedures, Azure Databricks, PySpark, Unity Catalog, Purview, Data Build Tool (DBT), Lakehouse, Delta Tables, optimization and troubleshooting skills, metadata-driven framework. Good to have: security knowledge, Power BI, Scala.

Posted 1 month ago

Apply

5.0 - 9.0 years

15 - 19 Lacs

Chennai

Work from Office

Senior Data Engineer - Azure
Years of experience: 5
Job location: Chennai

Job Description: We are looking for a skilled and experienced Senior Azure Developer to join the team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows/pipelines; constructing data storage (NoSQL, SQL), tools to work with big data (Hadoop, Kafka), and integration tools to connect sources or other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into related coding
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate with all project stakeholders on the project status
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.); a sketch of the SCD pattern follows below
- Knowledgeable in shell/PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering and documentation processes and performing unit testing
- Understanding and implementing QA and various testing processes in the project
- Knowledge of any BI tool will be an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiative
- Ability to adapt to a fast-paced Agile environment

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics
- Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory
- Utilize Azure Databricks for data transformation and processing
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services
- Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages
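As a reference for the slowly-changing-dimension concept listed above, a minimal type 2 pattern in SQL (the tables and the expire-then-insert approach are hypothetical, for illustration only):

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_customer d
SET is_current = FALSE,
    valid_to   = CURRENT_TIMESTAMP()
FROM stg_customer s
WHERE d.customer_id = s.customer_id
  AND d.is_current
  AND d.email <> s.email;

-- Step 2: insert a new current row for changed and brand-new customers
-- (anyone who no longer has a current row after step 1).
INSERT INTO dim_customer (customer_id, email, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL;
```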

Posted 1 month ago

Apply

5.0 - 20.0 years

10 - 35 Lacs

Hyderabad, Pune, Delhi / NCR

Work from Office

Mandatory skills: Snowflake, Matillion

Posted 1 month ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role & responsibilities - Snowflake JD
• Must have 12-22 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5+ years of experience in Snowflake and 3+ years in DBT
• Expertise in Snowflake architecture is a must
• Must have at least 3+ years of experience and a strong command of Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Jaipur

Remote

Lead Databricks to Snowflake migration. Expertise in PySpark, Snowflake, DBT, Airflow, CI/CD, SQL optimization, and orchestration. Ensure scalable, high-performance pipelines with strong DevOps and monitoring practices.

Posted 1 month ago

Apply

6.0 - 10.0 years

12 - 20 Lacs

Pune, Delhi / NCR, Mumbai (All Areas)

Hybrid

Role & responsibilities (6+ years of experience required)

Job Description: Enterprise Business Technology is on a mission to support and create enterprise software for our organization. We're a highly collaborative team that interlocks with corporate functions such as Finance and Product teams to deliver value with innovative technology solutions. Each day, thousands of people rely on Enlyte's technology and services to help their customers during challenging life events. We're looking for a remote Senior Data Analytics Engineer for our Corporate Analytics team.

Opportunity: Technical lead for our corporate analytics practice using dbt, Dagster, Snowflake, Power BI, SQL, and Python.

Responsibilities
- Build our data pipelines for our data warehouse in Python, working with APIs to source data
- Build Power BI reports and dashboards associated with this process
- Contribute to our strategy for new data pipelines and data engineering approaches
- Maintain a medallion-based architecture for data analysis with Kimball modeling (see the dbt sketch below)
- Participate in daily scrum calls, following an agile SDLC
- Create meaningful documentation of your work
- Follow organizational best practices for dbt and write maintainable code

Qualifications
- 5+ years of professional experience as a Data Engineer
- Strong dbt experience (3+ years) and knowledge of the modern data stack
- Strong experience with Snowflake (3+ years)
- Experience using Dagster and running complex pipelines (1+ year)
- Some Python experience; experience with Git and Azure DevOps
- Experience with data modeling in Kimball and medallion-based structures
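As an example of the dbt-on-Snowflake, medallion-style work described above, a hedged sketch of an incremental mart model (the file path, model, and column names are hypothetical):

```sql
-- models/marts/fct_campaign_engagement.sql (hypothetical names)
{{ config(materialized='incremental', unique_key='engagement_id') }}

select
    e.engagement_id,
    e.campaign_id,
    c.campaign_name,
    e.engaged_at
from {{ ref('stg_engagements') }} e   -- silver-layer staging model
join {{ ref('stg_campaigns') }} c     -- silver-layer staging model
  on e.campaign_id = c.campaign_id
{% if is_incremental() %}
-- on incremental runs, only process rows newer than what's already loaded
where e.engaged_at > (select max(engaged_at) from {{ this }})
{% endif %}
```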

Posted 1 month ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Pune, Bengaluru

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented.

Required candidate profile:
- Experience with strong SQL query/development skills
- Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
- Experience in the healthcare industry with PHI/PII

Posted 1 month ago

Apply

7.0 - 12.0 years

30 - 40 Lacs

Hyderabad

Work from Office

Support enhancements to the MDM platform. Develop pipelines using Snowflake, Python, SQL, and Airflow. Track system performance, troubleshoot issues, and resolve production issues.

Required candidate profile:
- 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools like Airflow
- Good understanding of the investment domain
- Experience with dbt, cloud platforms (AWS, Azure), and DevOps

Posted 1 month ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer!

In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
- Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Demonstrate strong analytical and technical problem-solving skills.
- Must have experience in the Data Engineering domain.

Qualifications we seek in you!

Minimum qualifications
- Bachelor's degree or equivalency (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Must have excellent coding skills in either Python or Scala, preferably Python.
- Must have implemented at least 2 projects end-to-end in Databricks.
- Must have experience with Databricks components such as:
  - Delta Lake
  - dbConnect
  - db API 2.0
  - Databricks Workflows orchestration
- Must be well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Must have a good understanding of how to create complex data pipelines.
- Must have good knowledge of data structures and algorithms.
- Must be strong in SQL and Spark SQL.
- Must have strong performance-optimization skills to improve efficiency and reduce cost.
- Must have worked on both batch and streaming data pipelines.
- Must have extensive knowledge of the Spark and Hive data processing frameworks.
- Must have worked on any cloud (Azure, AWS, GCP) and its most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Must be strong in writing unit tests and integration tests.
- Must have strong communication skills and have worked on a team of 5 or more.
- Must have a great attitude towards learning new skills and upskilling existing skills.

Preferred Qualifications
- Good to have Unity Catalog and basic governance knowledge.
- Good to have an understanding of Databricks SQL Endpoints.
- Good to have CI/CD experience to build pipelines for Databricks jobs.
- Good to have worked on a migration project to build a unified data platform.
- Good to have knowledge of DBT.
- Good to have knowledge of Docker and Kubernetes.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies