
37 Snowpark Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 10.0 years

8 - 10 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

We are seeking an experienced Big Data Engineer to design and maintain scalable data processing systems and pipelines across large-scale, distributed environments. This role requires deep expertise in tools such as Snowflake (Snowpark), Spark, Hadoop, Sqoop, Pig, and HBase. You will work closely with data scientists and stakeholders to transform raw data into actionable intelligence and power analytics platforms. Key Responsibilities: Design and develop high-performance, scalable data pipelines for batch and streaming processing. Implement data transformations and ETL workflows using Spark, Snowflake (Snowpark), Pig, Sqoop, and related tools. Manage large-scale data ingestion from various ...

Posted 3 months ago

Apply
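
Several postings on this page ask for hands-on Snowpark development alongside Spark and Hadoop. As a rough illustration of what that work looks like, here is a minimal Snowpark-for-Python sketch of a DataFrame transformation; the connection parameters, table names, and column names are hypothetical placeholders, not details taken from any listing.

    # Minimal Snowpark-for-Python sketch: read a table, transform it, write the result back.
    # All connection parameters and table/column names below are hypothetical.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    connection_parameters = {
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }

    session = Session.builder.configs(connection_parameters).create()

    # Filter a raw events table and aggregate daily purchase amounts.
    raw = session.table("RAW_EVENTS")  # hypothetical source table
    daily = (
        raw.filter(col("EVENT_TYPE") == "purchase")
        .group_by(col("EVENT_DATE"))
        .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    )

    # Persist the transformed result as a new table in Snowflake.
    daily.write.save_as_table("DAILY_PURCHASES", mode="overwrite")
    session.close()

The same pattern (read a source table, apply filters and aggregations, write the result back) is the core of the batch pipelines these roles describe.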

5.0 - 15.0 years

22 - 24 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

This role is for one of Weekday's clients. Salary range: Rs 2200000 - Rs 2400000 (i.e. INR 22-24 LPA). Min Experience: 5 years. Location: Bengaluru, Chennai, Gurgaon. Job Type: full-time. We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions. Requirements / Key Responsibilities: Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in...

Posted 3 months ago

Apply

6.0 - 11.0 years

5 - 15 Lacs

Ahmedabad, Mumbai (All Areas)

Work from Office

6 years of experience with AWS, Snowflake, Microsoft SQL Server, SSMS, Visual Studio, and Data Warehouse ETL processes. 4 years of programming experience with Python, C#, VB.NET, T-SQL. Minimum of 3 years of experience building end-to-end pipelines within the AWS stack. Required Candidate Profile: Strong collaborative, team-oriented style; impeccable customer service skills; experience with healthcare information systems and healthcare practice processes; experience with SaaS applications; good communication.

Posted 3 months ago

Apply

3.0 - 6.0 years

0 - 0 Lacs

Hyderabad

Work from Office

Snowflake Developer. Job Location: Hyderabad. Description: We are seeking a talented Snowflake ETL/ELT Engineer to join our growing Data Engineering team. The ideal candidate will have extensive experience designing, building, and maintaining scalable data integration solutions in Snowflake. Responsibilities: Design, develop, and implement data integration solutions using Snowflake's ELT features. Load and transform large data volumes from a variety of sources into Snowflake. Optimize data integration processes for performance and efficiency. Collaborate with other teams, such as Data Analytics and Business Intelligence, to ensure the integration of data into the data warehouse meets their needs ...

Posted 3 months ago

Apply

4.0 - 9.0 years

15 - 27 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Location: Kolkata, Hyderabad, Bangalore. Experience: 4 to 17 years. Band: 4B, 4C, 4D. Skill set: Snowflake, Horizon, Snowpark, Kafka for ETL.

Posted 3 months ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

Consultant Data Engineer. Tools & Technology: Snowflake, SnowSQL, AWS, DBT, Snowpark, Airflow, DWH, Unix, SQL, Shell Scripting, PySpark, Git, Visual Studio, ServiceNow. Duties and Responsibilities: Act as Consultant Data Engineer. Understand business requirements and design, develop & maintain scalable automated data pipelines & ETL processes to ensure efficient data processing and storage. Create a robust, extensible architecture to meet the client/business requirements. Build Snowflake objects with integration with AWS services and DBT. Involved in different types of data ingestion pipelines as per requirements. Development in DBT (Data Build Tool) for data transformation as per the requiremen...

Posted 3 months ago

Apply

4.0 - 8.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Job Location: Bangalore. Experience: 4+ years. Job Type: FTE. Note: Looking only for immediate to 1-week joiners. Must be comfortable with a video discussion. JD Key Skills required: Option 1: Bigdata Hadoop + Hive + HDFS; Python or Scala. OR Option 2: Snowflake with Bigdata knowledge (Snowpark preferred); Python/Scala. Contact Person: Amrita. Please share your updated profile to amrita.anandita@htcinc.com with the below-mentioned details: Full Name (as per Aadhaar card), Total Exp., Rel. Exp. (Bigdata Hadoop), Rel. Exp. (Python), Rel. Exp. (Scala), Rel. Exp. (Hive), Rel. Exp. (HDFS), or Rel. Exp. (Snowflake), Rel. Exp. (Snowpark), Highest Education (if has done B.Te...

Posted 3 months ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Big Data Engineer (Remote, Contract 6 Months+). We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem, including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more. #KeyResponsibilities Design, develop, and maintain scalable data pipelines and big data solutions. Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop. Process large data volumes from diverse sources using Hadoop eco...

Posted 3 months ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Big Data Engineer (Remote, Contract 6 Months+). Location: Remote | Contract Duration: 6+ Months | Domain: Big Data Stack. We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem, including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more. #KeyResponsibilities Design, develop, and maintain scalable data pipelines and big data solutions. Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop. Process large data volumes from diverse sources using Hadoop ecosy...

Posted 3 months ago

Apply

3 - 5 years

0 - 2 Lacs

Bengaluru

Hybrid

Demand 1: Mandatory Skill: 3.5-7 years (Bigdata - Adobe & Scala, Python, Linux). Demand 2: Mandatory Skill: 3.5-7 years (Bigdata - Snowflake (Snowpark) & Scala, Python, Linux). Specialist Software Engineer - Bigdata. Missions: We are seeking an experienced Big Data Senior Developer to lead our data engineering efforts. In this role, you will design, develop, and maintain large-scale data processing systems. You will work with cutting-edge technologies to deliver high-quality solutions for data ingestion, storage, processing, and analytics. Your expertise will be critical in driving our data strategy and ensuring the reliability and scalability of our big data infrastructure. Profile 3 ...

Posted 4 months ago

Apply

6.0 - 11.0 years

6 - 14 Lacs

Pune

Hybrid

Project Role Description: A Snowflake Developer will be responsible for designing and developing data solutions within the Snowflake cloud data platform using Snowpark, Apache Airflow, Data Build Tool (DBT) and Fivetran. Work location: Pune/Remote. Graduate or Post-Graduate in Computer Science/Information Technology/Engineering. Job Requirements: Must-Have Skills: 6 to 11 years of IT experience as a Snowflake Developer. Experience in the Telecom domain (BSS/OSS). Minimum 4+ years of experience on Snowflake is a MUST. Strong experience with Snowflake (data modeling, performance tuning, security). Proficiency in dbt (Data Build Tool) for data transformation is a MUST (model creation, Jinja templates, macro...

Posted Date not available

Apply
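
The listing above (and the one that follows) pairs Snowflake and dbt work with Apache Airflow and Fivetran orchestration. A minimal, hypothetical sketch of that pattern is shown below: an Airflow 2.x DAG that runs dbt models against Snowflake and then runs dbt tests. The DAG id, schedule, and dbt project path are assumptions made for illustration, not details from the posting.

    # Minimal Airflow 2.x DAG sketch: run dbt models against Snowflake, then dbt tests.
    # The DAG id, schedule, and dbt project path are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="snowflake_dbt_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models (assumes a dbt project and Snowflake profile exist at this path).
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/analytics_project && dbt run",
        )

        # Run dbt tests as basic data quality checks after the models are built.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/analytics_project && dbt test",
        )

        dbt_run >> dbt_test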

6.0 - 11.0 years

22 - 27 Lacs

Pune, Bengaluru

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented. Required Candidate Profile: Strong proficiency with SQL query/development skills; experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks; experience in the healthcare industry with PHI/PII.

Posted Date not available

Apply
Page 2 of 2