
90 Snowflake SQL Jobs - Page 4

JobPe aggregates listings so you can browse them in one place, but applications are submitted directly on the original job portal.

3 - 8 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing!

Role: Snowflake Developer
Experience required: 3 to 8 years
Work location: Bangalore/Hyderabad/Bhubaneswar/Pune
Required skills: Snowflake

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 3 months ago

5 - 10 years

15 - 25 Lacs

Bengaluru

Work from Office

About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Snowflake Developer
Qualification: Any Graduate or above
Relevant Experience: 5 to 10 years
Required Technical Skill Set: Snowflake, Azure Managed Services platforms

Must-Have:
- Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions).
- Design and implement Snowflake data warehousing solutions, including data modelling and schema design.
- Able to source data from APIs, data lakes, and on-premise systems into Snowflake.
- Process semi-structured data using Snowflake-specific features such as VARIANT and LATERAL FLATTEN (illustrated in the sketch after this listing).
- Experience using Snowpipe to load micro-batch data.
- Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scale-in/out vs. scale-up/down of warehouses.
- Ability to implement data pipelines that handle data retention and data redaction use cases.
- Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks.
- Strong hands-on experience in migration projects to Snowflake.
- Deep understanding of cloud-based data platforms and data integration techniques.
- Skilled in writing efficient SQL queries and optimizing database performance.
- Ability to develop and implement a real-time data streaming solution using Snowflake.

Location: PAN India
CTC Range: 25-40 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work From Office)

Pooja Singh KS
IT Staffing Analyst
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, India
Ipooja.singh@blackwhite.in | www.blackwhite.in
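For illustration only, a minimal Snowflake SQL sketch of the VARIANT / LATERAL FLATTEN pattern this posting names; the table, column, and JSON field names are hypothetical, not part of the listing:

```sql
-- Hypothetical raw table holding JSON events in a VARIANT column
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Flatten a nested array of line items into one row per item
SELECT
    payload:order_id::STRING    AS order_id,
    item.value:sku::STRING      AS sku,
    item.value:quantity::NUMBER AS quantity
FROM raw_events,
     LATERAL FLATTEN(input => payload:line_items) AS item;
```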

Posted 3 months ago

5 - 7 years

5 - 15 Lacs

Pune, Chennai, Bengaluru

Work from Office

Experience: 5+ years of relevant experience.

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed (see the Streams/Tasks sketch after this listing).
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing, applying PySpark best practices and performance-tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement-gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders, providing regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience in designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
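As a minimal sketch of the Streams and Tasks utilities named in this listing, under assumed table, warehouse, and task names (not from the posting):

```sql
-- Stream captures row-level changes on a staging table
CREATE OR REPLACE STREAM stg_orders_stream ON TABLE stg_orders;

-- Task periodically applies newly captured rows to the target table
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('STG_ORDERS_STREAM')
AS
  INSERT INTO dim_orders (order_id, status, updated_at)
  SELECT order_id, status, updated_at
  FROM stg_orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule
ALTER TASK load_orders_task RESUME;
```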

Posted 3 months ago

6 - 11 years

15 - 30 Lacs

Bengaluru

Hybrid

About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Snowflake Developer
Required skills and qualifications: Snowflake, SQL, Python
Qualification: Any Graduate or above
Relevant Experience: 4 to 12 years
Location: BLR/HYD/PUNE/BBSR
CTC Range: 10 to 35 LPA
Notice Period: Any
Mode of Interview: Virtual
Mode of Work: In Office

Nithyashree
Staffing Analyst - IT Recruiter
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, India
Nithyashree@blackwhite.in | www.blackwhite.in

Posted 3 months ago

3 - 8 years

6 - 16 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV with nandinis@suntechnologies.com ASAP.

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food).

Please let me know if any of your friends are looking for a job change, and kindly share their references.

Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2 (a minimal S3-ingestion sketch follows this listing)
Preferred skills: any ETL tools

Venue details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
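Purely as an illustration of the S3-based ingestion skills listed above, a minimal Snowflake SQL sketch; the bucket, storage integration, file format, and table names are assumptions, not details from the posting:

```sql
-- External stage over an S3 bucket (all names are hypothetical)
CREATE OR REPLACE STAGE sales_stage
  URL = 's3://example-bucket/sales/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk-load the staged CSV files into a target table
COPY INTO sales_raw
  FROM @sales_stage
  PATTERN = '.*[.]csv'
  ON_ERROR = 'CONTINUE';
```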

Posted 3 months ago

3 - 8 years

15 - 25 Lacs

Bhubaneshwar, Bengaluru, Hyderabad

Hybrid

Warm greetings from SP Staffing!

Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Bangalore/Bhubaneswar/Hyderabad
Required skills: Snowflake developer, Snowpipe, SQL

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.

Posted 3 months ago

3 - 8 years

15 - 25 Lacs

Bengaluru, Hyderabad, Noida

Hybrid

Warm greetings from SP Staffing!

Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi
Required skills: Snowflake developer, Snowpipe

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.

Posted 3 months ago

4 - 6 years

10 - 20 Lacs

Bengaluru

Hybrid

Primary Responsibilities:
- Interact with key stakeholders on a regular basis and identify new analytics business opportunities.
- Provide explanations and interpretations within the area of expertise by analyzing and investigating.
- Drive marketing analytics for the healthcare domain.
- Use pertinent data and facts to identify and solve a range of problems within the area of expertise.
- Manage a team of data/statistical analysts.
- Work exclusively within a specific knowledge area.
- Prioritize and organize own work to meet deadlines.
- Provide explanations and information to others on topics within the area of expertise.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree in Engineering, Business, Finance, a quantitative discipline, or a related field.
- 5+ years of experience in SQL and Tableau, with analysis, programming, and reporting experience and a solid understanding of data storage structures.
- 4+ years of business analytics work experience.
- 3+ years of experience working with visualization tools such as Tableau.
- 2+ years of experience with project methodology (requirements, design, development, test, and implementation).
- Working knowledge of relational databases and database structures.
- Demonstrated process improvement, workflow, benchmarking and/or evaluation of business processes.
- Demonstrated excellent communication, time/project management, problem-solving, organizational, and analytical skills.
- Demonstrated excellent verbal and written communication skills.

Preferred Qualification:
- 4+ years of experience with reporting and analyzing large amounts of data.

Posted Date not available

6 - 8 years

8 - 13 Lacs

Greater Noida

Remote

Role & Responsibilities:
- Develop, test, and maintain data pipelines using Snowflake and dbt (Data Build Tool); a minimal dbt model sketch follows this listing.
- Build and optimize ELT processes for large-scale data processing.
- Design and implement scalable data models for analytics and reporting.
- Collaborate with data engineers, analysts, and stakeholders to understand business needs and deliver data solutions.
- Optimize SQL queries and Snowflake resources for performance and cost efficiency.
- Implement and maintain data quality checks within dbt models.
- Ensure adherence to best practices in data governance, security, and compliance.
- Troubleshoot and resolve data issues in a timely manner.

Required Skills & Qualifications:
- Proven hands-on experience with Snowflake (warehouse architecture, performance tuning, security).
- Strong experience with dbt for data modeling and transformation.
- Proficiency in SQL and ELT/ETL concepts.
- Knowledge of data warehousing concepts, dimensional modeling, and analytics workflows.

Interested candidates, please drop your resume at riyanshi@etelligens.in.
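A minimal dbt-on-Snowflake sketch of the kind of incremental model this posting describes; the model name, source, and columns are hypothetical:

```sql
-- models/orders_daily.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- only process rows newer than what is already in the target table
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Data quality checks would typically be declared as dbt tests (for example, not_null and unique) in the model's accompanying YAML file.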

Posted Date not available

3 - 6 years

13 - 18 Lacs

Pune, Bengaluru, Delhi/NCR

Work from Office

Roles and Responsibilities:
- Design, develop, and optimize data pipelines using Snowflake.
- Implement data models, transformations, and ETL processes.
- Work with stakeholders to understand requirements and translate them into scalable data solutions.
- Ensure data quality, performance tuning, and security compliance.
- Integrate Snowflake with other cloud services and BI tools.

Required Skills:
- 3-6 years of experience in data engineering or development roles.
- Strong expertise in Snowflake (warehousing, performance tuning, query optimization).
- Proficiency in SQL and ETL tools (e.g., Informatica, Talend, dbt).
- Familiarity with cloud platforms (AWS/Azure/GCP).
- Good understanding of data modeling and data governance.
- BE/BTech compulsory, any field.

Nice to Have:
- Experience with Python or Spark for data processing.
- Knowledge of CI/CD pipelines for data workflows.

Posted Date not available

7 - 10 years

15 - 20 Lacs

Bengaluru

Work from Office

We are looking for a skilled Technology Lead with extensive experience in Snowflake, PL/SQL, and Azure to lead our cloud-based data solutions project. The ideal candidate will have a strong background in cloud data platform development, particularly with Snowflake, and a deep understanding of Azure cloud services. Experience in the financial industry and Agile environments is highly preferred.

Key Responsibilities:
- Lead the design and development of cloud-based data pipelines and data solutions, primarily using Snowflake and Azure.
- Build end-to-end data workflows, including data ingestion, transformation, and extract generation within Snowflake.
- Write and optimize complex SQL and PL/SQL queries to support business requirements (a procedural Snowflake Scripting sketch follows this listing).
- Monitor and tune Snowflake performance for efficient query execution and scalability.
- Troubleshoot data issues, perform root cause analysis, and provide production support.
- Collaborate with cross-functional teams to deliver solutions in an Agile development environment.
- Provide technical leadership and mentorship to the development team.
- Ensure adherence to best practices in cloud data engineering and security standards.

Required Skills:
- Minimum 6 years of IT experience, with at least 4 years working on cloud-based solutions.
- 4+ years of hands-on experience with Snowflake development.
- Strong proficiency in PL/SQL and writing complex SQL queries.
- Solid experience with Azure cloud services and infrastructure.
- Proven ability to design and build scalable data pipelines on cloud platforms.
- Experience optimizing Snowflake performance, including query tuning and scaling strategies.
- Strong problem-solving skills related to data quality and production support.
- Familiarity with Agile methodologies.
- Experience in the financial industry is preferred.

Preferred Qualifications:
- Certifications in Snowflake and/or Azure cloud services.
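Since the listing pairs PL/SQL with Snowflake, here is a minimal Snowflake Scripting sketch (Snowflake's SQL-based procedural language, the closest in-platform analogue to PL/SQL blocks); the procedure, tables, and columns are hypothetical:

```sql
CREATE OR REPLACE PROCEDURE refresh_daily_extract(run_date DATE)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  row_count INTEGER DEFAULT 0;
BEGIN
  -- Idempotent re-run: clear any rows already loaded for the day
  DELETE FROM daily_extract WHERE extract_date = :run_date;

  -- Rebuild the extract for the requested date
  INSERT INTO daily_extract (extract_date, customer_id, total_amount)
  SELECT :run_date, customer_id, SUM(amount)
  FROM transactions
  WHERE txn_date = :run_date
  GROUP BY customer_id;

  row_count := SQLROWCOUNT;
  RETURN 'Loaded ' || row_count || ' rows.';
END;
$$;
```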

Posted Date not available

6 - 11 years

6 - 14 Lacs

Pune

Hybrid

Project Role Description: A Snowflake Developer will be responsible for designing and developing data solutions within the Snowflake cloud data platform using Snowpark, Apache Airflow, dbt (Data Build Tool), and Fivetran.

Work Location: Pune/Remote
Education: Graduate or Post-Graduate in Computer Science/Information Technology/Engineering.

Job Requirements (Must-Have Skills):
- 6 to 11 years of IT experience as a Snowflake Developer.
- Experience in the telecom domain (BSS/OSS).
- Minimum 4+ years of experience on Snowflake is a must.
- Strong experience with Snowflake (data modeling, performance tuning, security).
- Proficiency in dbt (Data Build Tool) for data transformation is a must (model creation, Jinja templates, macros, and testing).
- Advanced SQL skills are a must: writing, debugging, and performance-tuning queries.
- Workflow orchestration proficiency with Apache Airflow is a must (developing, scheduling, and monitoring).
- Experience with the integration tool Fivetran is a must.
- Experience working with dataframes using Snowpark is a must.
- Experience automating data workflows and integrating with Azure DevOps CI/CD pipelines is a must.
- Strong Python and Java scripting for data transformation and automation.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures (a Time Travel sketch follows this listing).
- Managing sets of XML, JSON, and CSV files from different sources.
- Build, monitor, and optimize ETL and ELT processes with data models.
- Continually review and audit data models for enhancement.
- Hands-on experience with code updates, new code development, and reverse engineering.
- Ownership from start to finish for the allocated project work.
- Client-interaction experience is a must for demonstrating multiple data solutions.
- Snowflake SnowPro certified professionals are preferred.
- Regular engagement with teams for status reporting and routine activities.
- Implementation of data streaming solutions from different sources for data migration and transformation.

Soft Skills:
- Hands-on analytical, problem-solving, and debugging skills.
- Ability to work under pressure; flexible to work independently or in a team.
- Excellent communication skills and the ability to present results concisely to technical and non-technical stakeholders.
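A minimal sketch of the Time Travel feature named above, with a zero-copy clone shown as a common companion pattern; the table names and offset are illustrative, not from the posting:

```sql
-- Query the table as it was one hour ago (Time Travel, offset in seconds)
SELECT COUNT(*) FROM orders AT(OFFSET => -3600);

-- Zero-copy clone, e.g. for testing a migration or transformation safely
CREATE OR REPLACE TABLE orders_dev CLONE orders;
```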

Posted Date not available

4 - 9 years

0 - 1 Lacs

Gurugram, Bengaluru

Hybrid

We are prioritizing Snowflake SnowPro certified professionals with at least 4 years of experience, including 3+ years on Snowflake Data Cloud. Must-have skills include strong hands-on expertise with the Snowflake Cloud Platform, advanced SQL programming, and experience with ETL/ELT tools such as Azure Data Factory, AWS Glue, Informatica, Talend, or Qlik Replicate. Familiarity with workflow orchestration tools such as Apache Airflow, Control-M, or Tidal Automation is essential. Python experience, especially working with dataframes using Pandas, PySpark, or Snowpark, is a strong plus. (Share your resume directly at poonampal@kpmg.com.)

Mandatory Skills:
- Snowflake SnowPro certified professionals are the first priority (mandatory).
- At least 4 years of experience, with 3+ years on Snowflake Data Cloud.
- Strong hands-on experience with the Snowflake Cloud Platform.
- ETL/ELT tools: experience with one or more of Azure Data Factory, AWS Glue, Informatica, Talend, or Qlik Replicate.
- Workflow orchestration: proficiency with tools like Apache Airflow, Control-M, or Tidal Automation.
- Programming: advanced SQL; Python (including working with dataframes using Pandas, PySpark, or Snowpark).
- Data engineering concepts: strong knowledge of data pipelines, data wrangling, and optimization.

Preferred Candidate Profile:
- SQL scripting and procedural logic.
- Data modeling tools (e.g., Erwin, dbt).
- Integration tools such as Fivetran or Stitch.

Looking to connect with talented professionals! If you're interested in exciting opportunities, send your resume directly to poonampal@kpmg.com for a quicker response. Let's make great things happen together!

Posted Date not available

6 - 11 years

18 - 33 Lacs

Pune, Chennai, Bengaluru

Work from Office

Role & Responsibilities: Outline the day-to-day responsibilities for this role.

Preferred Candidate Profile: Specify required role expertise, previous job experience, or relevant certifications.

Posted Date not available

6 - 11 years

15 - 17 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & Responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe/SnowProc/SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool dbt for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.

Greetings from Mississippi Consultant LLP! We are a recruitment firm based in Pune with various clients globally. Presently we have an opening with one of our clients. Contact Soniya at soniya05.mississippiconsultants@gmail.com.

Posted Date not available
