61 Snowpark Jobs - Page 2

Set Up a Job Alert
JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Delhi

On-site

As a Senior Snowflake Data Engineer at Bright Vision Technologies, your role will involve designing, building, and maintaining large-scale data warehouses using Snowflake. You will be expected to have expertise in DBT (Data Build Tool) and Python. Key Responsibilities: - Designing, building, and maintaining large-scale data warehouses using Snowflake - Utilizing expertise in DBT and Python for data processing, transformation, and loading tasks - Implementing ETL transformations using Snowpark - Collaborating with cross-functional teams for effective communication and problem-solving - Working with Agile development methodologies and version control systems like Git - Utilizing data visualiza...

Posted 1 month ago

AI Match Score
Apply

10.0 - 15.0 years

18 - 20 Lacs

Noida, Gurugram

Work from Office

Lead the design and implementation of scalable data pipelines using Snowflake and Databricks. Drive data architecture and governance. Build ETL/ELT, optimize models, mentor the team, and ensure security and compliance. Strong Snowflake, Databricks, SQL, and Python skills required. Required candidate profile: Experienced Data Analytics Lead skilled in Snowflake, Databricks, SQL, and Python; a proven leader in designing scalable pipelines, data governance, ETL/ELT, and team mentoring.

Posted 1 month ago

AI Match Score
Apply

5.0 - 10.0 years

15 - 19 Lacs

Gurugram

Work from Office

Strong skills in Java 8+, web application frameworks such as Spring Boot, and RESTful API development. Familiarity with AWS toolsets, including but not limited to SQS, Lambda, DynamoDB, RDS, S3, Kinesis, and CloudFormation. Demonstrated experience in designing, building, and documenting customer-facing RESTful APIs. Demonstrable ability to read high-level business requirements and ask clarifying questions. Demonstrable ability to engage in self-paced continuous learning to upskill, with the collaboration of engineering leaders. Demonstrable ability to manage your own time and prioritize how you spend it most effectively. Strong skills with the full lifecycle of development, from analysis...

Posted 2 months ago

AI Match Score
Apply

5.0 - 7.0 years

0 Lacs

Remote, India

On-site

Where Data Does More. Join the Snowflake team. We are looking for a Solutions Consultant to join our Professional Services team and deploy cloud products and services for our customers. This person must be a hands-on self-starter who loves solving innovative problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem to Snowflake's solution and communicate that connection and vision to various technical and executive audiences. The person we're looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get it don...

Posted 2 months ago

AI Match Score
Apply

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (N...

Posted 2 months ago

AI Match Score
Apply

5.0 - 9.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As a Data Engineer at our organization, you will play a crucial role in expanding and optimizing our data and data pipeline architecture. Your responsibilities will include optimizing data flow and collection for cross-functional teams, supporting software developers, database architects, data analysts, and data scientists on data initiatives, and ensuring optimal data delivery architecture throughout ongoing projects. You will be expected to be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The ideal candidate will be experienced in data pipeline building and data wrangling, with a passion for optimizing data systems and building them from ...

Posted 2 months ago

AI Match Score
Apply

4.0 - 7.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Job Summary We are looking for a skilled Snowflake Developer with hands-on experience in Python, SQL, and Snowpark to join our data engineering team. You will be responsible for designing and building scalable data pipelines, developing Snowpark-based data applications, and enabling advanced analytics solutions on the Snowflake Data Cloud platform. Key Responsibilities Develop and maintain robust, scalable, and high-performance data pipelines using Snowflake SQL, Python, and Snowpark. Use Snowpark (Python API) to build data engineering and data science workflows within the Snowflake environment. Perform advanced data transformation, modeling, and optimization to support business reporting an...
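A minimal sketch of the Snowpark (Python API) workflow this posting describes, assuming the snowflake-snowpark-python package; the connection details and table/column names (RAW_ORDERS, DAILY_REVENUE) are placeholders, not taken from the posting:

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Connection parameters would normally come from a secrets manager; placeholders here.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}).create()

# The transformation is pushed down to Snowflake: filter, aggregate, and persist
# the result without pulling raw data out of the platform.
orders = session.table("RAW_ORDERS")
daily_revenue = (
    orders.filter(col("STATUS") == "COMPLETE")
          .group_by(col("ORDER_DATE"))
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
daily_revenue.write.save_as_table("DAILY_REVENUE", mode="overwrite")
session.close()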

Posted 2 months ago

AI Match Score
Apply

4.0 - 7.0 years

4 - 7 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking a proactive Senior Snowflake PySpark Developer to lead the design and maintenance of data pipelines in cloud environments. You will be responsible for building robust ETL processes using Snowflake, PySpark, SQL, and AWS Glue. This role requires strong expertise in data architecture, data modeling, and a collaborative mindset to work effectively with large datasets and cross-functional teams. Roles & Responsibilities: Design, build, and maintain data pipelines in cloud environments, with a focus on AWS. Utilize Snowflake, PySpark, SQL, and AWS Glue to perform ETL tasks. Work with large datasets, applying strong problem-solving skills to transform and analyze data. Collaborate...
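As one illustration of the Snowflake plus PySpark ETL stack named above, a hedged sketch of a batch job that aggregates raw events and writes the result to Snowflake through the Spark-Snowflake connector; the path, option values, and table names are assumptions, and in an AWS Glue job the same transformation logic would typically run inside a Glue script:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch").getOrCreate()

# Read raw events from object storage (illustrative path) and derive a daily aggregate.
events = spark.read.parquet("s3://example-bucket/raw/events/")
daily = (
    events.filter(F.col("event_type") == "purchase")
          .groupBy("event_date")
          .agg(F.sum("amount").alias("total_amount"))
)

# Write to Snowflake via the spark-snowflake connector (requires the connector
# and Snowflake JDBC jars on the Spark classpath).
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>", "sfPassword": "<password>",
    "sfDatabase": "<database>", "sfSchema": "<schema>", "sfWarehouse": "<warehouse>",
}
(daily.write.format("net.snowflake.spark.snowflake")
      .options(**sf_options)
      .option("dbtable", "DAILY_PURCHASES")
      .mode("overwrite")
      .save())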

Posted 3 months ago

AI Match Score
Apply

4.0 - 7.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a proactive Senior Snowflake PySpark Developer to lead the design and maintenance of data pipelines in cloud environments. You will be responsible for building robust ETL processes using Snowflake, PySpark, SQL, and AWS Glue. This role requires strong expertise in data architecture, data modeling, and a collaborative mindset to work effectively with large datasets and cross-functional teams. Roles & Responsibilities: Design, build, and maintain data pipelines in cloud environments, with a focus on AWS. Utilize Snowflake, PySpark, SQL, and AWS Glue to perform ETL tasks. Work with large datasets, applying strong problem-solving skills to transform and analyze data. Collaborate...

Posted 3 months ago

AI Match Score
Apply

4.0 - 7.0 years

4 - 7 Lacs

Delhi, India

On-site

We are seeking a proactive Senior Snowflake PySpark Developer to lead the design and maintenance of data pipelines in cloud environments. You will be responsible for building robust ETL processes using Snowflake, PySpark, SQL, and AWS Glue. This role requires strong expertise in data architecture, data modeling, and a collaborative mindset to work effectively with large datasets and cross-functional teams. Roles & Responsibilities: Design, build, and maintain data pipelines in cloud environments, with a focus on AWS. Utilize Snowflake, PySpark, SQL, and AWS Glue to perform ETL tasks. Work with large datasets, applying strong problem-solving skills to transform and analyze data. Collaborate...

Posted 3 months ago

AI Match Score
Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

As a skilled Snowflake Developer with over 7 years of experience, you will be responsible for designing, developing, and optimizing Snowflake data solutions. Your expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration will be crucial in building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake. Your key responsibilities will include: - Designing and developing Snowflake databases, schemas, tables, and views following best practices. - Writing complex SQL queries, stored procedures, and UDFs for data transformation. - Optimizing query performance using clustering, partitioning, and materialized views. -...

Posted 3 months ago

AI Match Score
Apply

3.0 - 8.0 years

0 Lacs

Delhi

On-site

As a Snowflake Solution Architect, you will be responsible for owning and driving the development of Snowflake solutions and products as part of the COE. Your role will involve working with and guiding the team to build solutions using the latest innovations and features launched by Snowflake. Additionally, you will conduct sessions on the latest and upcoming launches in the Snowflake ecosystem and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will be expected to publish articles and architectures that solve business problems. Furthermore, you will work on accelerators to demonstrate how Snowflake solutions and t...

Posted 3 months ago

AI Match Score
Apply

6.0 - 11.0 years

22 - 27 Lacs

Pune, Bengaluru

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented. Required candidate profile: Experience with strong proficiency in SQL query and development skills. Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks. Experience in the healthcare industry with PHI/PII.

Posted 3 months ago

AI Match Score
Apply

7.0 - 12.0 years

22 - 27 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Work from Office

Job Description - Snowflake Developer Experience: 7+ years Location: India, Hybrid Employment Type: Full-time Job Summary We are looking for a Snowflake Developer with 7+ years of experience to design, develop, and maintain our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake SQL, data modeling, and ETL/ELT processes to build efficient and scalable data solutions. Key Responsibilities 1. Snowflake Development & Implementation Design and develop Snowflake databases, schemas, tables, and views Write and optimize complex SQL queries, stored procedures, and UDFs Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks) Manage virtual wareh...
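A hedged sketch of the Snowflake features this posting lists (Streams & Tasks, Time Travel, Zero-Copy Cloning), driven from Python through a Snowpark session; the object names, warehouse, and schedule are illustrative assumptions only:

from snowflake.snowpark import Session

# Placeholder credentials; in practice these come from a vault or connection profile.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}).create()

# Stream: capture row-level changes on a source table.
session.sql("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS").collect()

# Task: periodically merge the captured changes into a curated table.
session.sql("""
    CREATE OR REPLACE TASK LOAD_ORDERS_TASK
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '5 MINUTE'
    AS
      INSERT INTO CURATED_ORDERS SELECT * FROM ORDERS_STREAM
""").collect()
session.sql("ALTER TASK LOAD_ORDERS_TASK RESUME").collect()  # tasks start suspended

# Time Travel: query the table as it existed one hour ago.
session.sql("SELECT COUNT(*) FROM CURATED_ORDERS AT(OFFSET => -3600)").collect()

# Zero-Copy Cloning: create a dev copy without duplicating storage.
session.sql("CREATE TABLE CURATED_ORDERS_DEV CLONE CURATED_ORDERS").collect()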

Posted 3 months ago

AI Match Score
Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Engineer at Ethoca, a Mastercard Company in Pune, India, you will play a crucial role in driving data enablement and exploring big data solutions within our technology landscape. Your responsibilities will include designing, developing, and optimizing batch and real-time data pipelines using tools such as Snowflake, Snowpark, Python, and PySpark. You will also be involved in building data transformation workflows, implementing CI/CD pipelines, and administering the Snowflake platform to ensure performance tuning, access management, and platform scalability. Collaboration with stakeholders to understand data requirements and deliver reliable data solutions will be a key part ...

Posted 3 months ago

AI Match Score
Apply

2.0 - 5.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Key Responsibilities: Design and implement scalable data models using Snowflake to support business intelligence and analytics solutions. Implement ETL/ELT solutions that involve complex business transformations. Handle end-to-end data warehousing solutions. Migrate data from legacy systems to Snowflake. Write complex SQL queries for extracting, transforming, and loading data, ensuring high performance and accuracy. Optimize SnowSQL queries for better processing speeds. Integrate Snowflake with third-party applications. Use any ETL/ELT technology. Implement data security policies, including user access control and data masking, to maintain compliance with organizational standards. D...
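For the data security responsibility (user access control and data masking), a hedged sketch using the snowflake-connector-python package; the policy, table, role, and schema names are hypothetical:

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)
cur = conn.cursor()

# Dynamic data masking: hide email addresses from all but a privileged role.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY EMAIL_MASK AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'ANALYST_FULL' THEN val ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK")

# Role-based access control: read-only grants for a reporting role.
cur.execute("GRANT USAGE ON SCHEMA ANALYTICS TO ROLE REPORTING_RO")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS TO ROLE REPORTING_RO")

cur.close()
conn.close()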

Posted 3 months ago

AI Match Score
Apply

5.0 - 10.0 years

22 - 27 Lacs

Pune, Bengaluru

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented. Required candidate profile: Experience with strong proficiency in SQL query and development skills. Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks. Experience in the healthcare industry with PHI/PII.

Posted 4 months ago

AI Match Score
Apply

5.0 - 10.0 years

22 - 27 Lacs

Chennai, Mumbai (All Areas)

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented. Required candidate profile: Experience with strong proficiency in SQL query and development skills. Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks. Experience in the healthcare industry with PHI/PII.

Posted 4 months ago

AI Match Score
Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, ...

Posted 4 months ago

AI Match Score
Apply

6.0 - 8.0 years

6 - 8 Lacs

Navi Mumbai, Maharashtra, India

On-site

We are looking for a Senior Big Data Engineer with deep experience in building scalable, high-performance data processing pipelines using Snowflake (Snowpark) and the Hadoop ecosystem. You'll design and implement batch and streaming data workflows, transform complex datasets, and optimize infrastructure to power analytics and data science solutions. Key Responsibilities: Design, develop, and maintain end-to-end scalable data pipelines for high-volume batch and real-time use cases. Implement advanced data transformations using Spark, Snowpark, Pig, and Sqoop. Process large-scale datasets from varied sources using tools across the Hadoop ecosystem. Optimize data storage and retrieval in HB...

Posted 4 months ago

AI Match Score
Apply

6.0 - 8.0 years

6 - 8 Lacs

Delhi, India

On-site

We are looking for a Senior Big Data Engineer with deep experience in building scalable, high-performance data processing pipelines using Snowflake (Snowpark) and the Hadoop ecosystem. You'll design and implement batch and streaming data workflows, transform complex datasets, and optimize infrastructure to power analytics and data science solutions. Key Responsibilities: Design, develop, and maintain end-to-end scalable data pipelines for high-volume batch and real-time use cases. Implement advanced data transformations using Spark, Snowpark, Pig, and Sqoop. Process large-scale datasets from varied sources using tools across the Hadoop ecosystem. Optimize data storage and retrieval in HB...

Posted 4 months ago

AI Match Score
Apply

6.0 - 8.0 years

6 - 8 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

We are looking for a Senior Big Data Engineer with deep experience in building scalable, high-performance data processing pipelines using Snowflake (Snowpark) and the Hadoop ecosystem. You'll design and implement batch and streaming data workflows, transform complex datasets, and optimize infrastructure to power analytics and data science solutions. Key Responsibilities: Design, develop, and maintain end-to-end scalable data pipelines for high-volume batch and real-time use cases. Implement advanced data transformations using Spark, Snowpark, Pig, and Sqoop. Process large-scale datasets from varied sources using tools across the Hadoop ecosystem. Optimize data storage and retrieval in HB...

Posted 4 months ago

AI Match Score
Apply

8.0 - 10.0 years

8 - 10 Lacs

Navi Mumbai, Maharashtra, India

On-site

We are seeking an experienced Big Data Engineer to design and maintain scalable data processing systems and pipelines across large-scale, distributed environments. This role requires deep expertise in tools such as Snowflake (Snowpark), Spark, Hadoop, Sqoop, Pig, and HBase. You will work closely with data scientists and stakeholders to transform raw data into actionable intelligence and power analytics platforms. Key Responsibilities: Design and develop high-performance, scalable data pipelines for batch and streaming processing. Implement data transformations and ETL workflows using Spark, Snowflake (Snowpark), Pig, Sqoop, and related tools. Manage large-scale data ingestion from various ...

Posted 4 months ago

AI Match Score
Apply

8.0 - 10.0 years

8 - 10 Lacs

Delhi, India

On-site

We are seeking an experienced Big Data Engineer to design and maintain scalable data processing systems and pipelines across large-scale, distributed environments. This role requires deep expertise in tools such as Snowflake (Snowpark), Spark, Hadoop, Sqoop, Pig, and HBase. You will work closely with data scientists and stakeholders to transform raw data into actionable intelligence and power analytics platforms. Key Responsibilities: Design and develop high-performance, scalable data pipelines for batch and streaming processing. Implement data transformations and ETL workflows using Spark, Snowflake (Snowpark), Pig, Sqoop, and related tools. Manage large-scale data ingestion from various ...

Posted 4 months ago

AI Match Score
Apply

8.0 - 10.0 years

8 - 10 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

We are seeking an experienced Big Data Engineer to design and maintain scalable data processing systems and pipelines across large-scale, distributed environments. This role requires deep expertise in tools such as Snowflake (Snowpark), Spark, Hadoop, Sqoop, Pig, and HBase. You will work closely with data scientists and stakeholders to transform raw data into actionable intelligence and power analytics platforms. Key Responsibilities: Design and develop high-performance, scalable data pipelines for batch and streaming processing. Implement data transformations and ETL workflows using Spark, Snowflake (Snowpark), Pig, Sqoop, and related tools. Manage large-scale data ingestion from various ...

Posted 4 months ago

AI Match Score
Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.