
22 Snowflake Modeling Jobs

Filters
Experience: 0 - 25 years
Salary: ₹0 - ₹1,00,00,000
Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

15 - 25 Lacs

Chennai

Work from Office

Source: Naukri

Warm welcome from SP Staffing Services! We are reaching out regarding a permanent opportunity.
Job Description:
Experience: 5-12 years
Location: Chennai
Skill: Snowflake Developer
Desired skill set:
- Strong experience in Snowflake
- Strong experience in AWS and Python
- Experience in ETL tools such as Ab Initio and Teradata
Interested candidates can share their resume with sangeetha.spstaffing@gmail.com, including the following details inline:
Full Name as per PAN:
Mobile No:
Alt No / WhatsApp No:
Total Experience:
Relevant Experience in Snowflake Development:
Relevant Experience in AWS:
Relevant Experience in Python/Ab Initio/Teradata:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable) / Reason:
Date of Birth:
PAN Number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for an F2F interview on Saturday, 14th June, between 9 AM and 12 PM (please mention a time):
Current Residential Location:
Preferred Job Location:
Are your educational percentages in 10th, 12th, and UG all above 50%?
Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 1 week ago

Apply

5.0 - 7.0 years

10 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 17 Lacs

Ahmedabad

Work from Office

Source: Naukri

Key Responsibilities:
- Design, develop, and maintain interactive, user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources, including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Follow security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay current with Power BI updates, best practices, and industry trends.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience working with large and complex datasets.
- Experience with BigQuery, MySQL, or Looker Studio is a plus.
- E-commerce industry experience is an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Apps and Power Automate would be a plus.
Preferred Qualifications:
- Microsoft Power BI certification (PL-300 or equivalent) is a plus.
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities: We are looking for candidates with 5+ years of experience in Snowflake development who are willing to relocate to the preferred location. Opening locations: Chennai, Bangalore.

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Nagpur, Chennai, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities: We are looking for candidates with 5+ years of experience in Snowflake development who are willing to relocate to the preferred location. Opening locations: Chennai, Bangalore, and Nagpur.

Posted 1 week ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Source: Naukri

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer provides technical direction and leads a group of one or more developers toward a goal.
Job Description:
- Experience in the IT industry, including building productionized data ingestion and processing pipelines in Snowflake
- Strong understanding of Snowflake architecture; fully conversant with data warehousing concepts
- Excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems
- Able to build data pipelines for ETL/ELT
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
- Able to create high-level and low-level design documents based on requirements
- Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud
- Awareness of data visualisation tools and methodologies
- Able to work independently on business problems and generate meaningful insights
- Experience or knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory
- Experience implementing Snowflake best practices
- Snowflake SnowPro Core certification is an added advantage
Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, and working with offshore teams
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF)
- Good experience in Python/PySpark integration with Snowflake and the cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage
- Proficiency in Python, including data types, variables, functions, loops, conditionals, and other Python-specific concepts
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark
- Some experience with Snowflake RBAC and data security
- Good experience implementing CDC or SCD Type 2
- Good experience implementing Snowflake best practices
- In-depth understanding of data warehouse and ETL concepts and data modelling
- Experience in requirement gathering, analysis, design, development, and deployment
- Experience building data ingestion pipelines and optimizing and tuning them for performance and scalability
- Able to communicate with clients and lead a team
- Proficiency with Airflow or other workflow management tools for scheduling and managing ETL jobs
- Good to have: experience deploying with CI/CD tools and with repositories such as Azure Repos and GitHub
Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, Computer Engineering, or an equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts
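Illustrative aside (not part of the posting): the ingestion work this role describes, loading files from S3 into Snowflake via COPY INTO or Snowpipe, often reduces to a few statements like the sketch below. It is a minimal example assuming the snowflake-connector-python package; the account, stage, bucket, and table names are invented placeholders.

```python
# Minimal, illustrative sketch of batch ingestion into Snowflake from S3.
# All identifiers (account, user, warehouse, stage, table, bucket) are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="etl_user",           # placeholder service user
    password="***",            # use a secrets manager or key-pair auth in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # External stage over an S3 prefix; assumes a storage integration named s3_int exists.
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_orders_stage
          URL = 's3://example-bucket/orders/'
          STORAGE_INTEGRATION = s3_int
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Bulk-load any new files; Snowpipe would automate this same COPY on file arrival.
    cur.execute("COPY INTO raw_orders FROM @raw_orders_stage ON_ERROR = 'ABORT_STATEMENT'")
    for row in cur.fetchall():   # one result row per file loaded
        print(row)
finally:
    conn.close()
```

In production the same COPY is typically wrapped in Snowpipe or a Task/Stream pair rather than run ad hoc.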

Posted 1 week ago

Apply

6.0 - 11.0 years

18 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Greetings from Primus Global Technology! We are hiring for a Snowflake Administrator role with a leading MNC for locations including Bangalore, Chennai, and Hyderabad. This is a contract position (6 months to 1 year) with potential for extension based on performance. The selected candidate will be on the Primus Global payroll.
Experience required: 6+ years (4.5+ in Snowflake)
Salary: 1,50,000 to 1,80,000 per month
Contract duration: 6-12 months (extendable based on performance)
Payroll: Primus Global Technology
Note: Only candidates with experience as a Snowflake Administrator are eligible for this position. This opening is not for Snowflake Developers.
Key Responsibilities:
- Database management: Snowflake account/user management, performance tuning, backups
- Security: implement RBAC, encryption, and compliance policies
- Cost management: monitor and optimize Snowflake costs
- ETL & integration: support data pipelines and integration with other systems
- Performance tuning: improve query and system performance
- Support: troubleshooting and vendor escalation
- Collaboration: work with architects and stakeholders, provide system health reports
Apply now! Send your resume to npandya@primusglobal.com. Looking for immediate joiners.
Contact: Nidhi P Pandya, Sr. Associate - Talent Acquisition, Primus Global Technology Pvt. Ltd.
All the best, job seekers!
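Illustrative aside (not part of the posting): administration duties like RBAC and cost control usually come down to statements of the kind below. This is a hedged sketch using snowflake-connector-python; the role, warehouse, and quota names are invented, and resource monitors require the ACCOUNTADMIN role.

```python
# Illustrative Snowflake administration statements: RBAC grants plus a cost cap.
# Role, warehouse, database, and quota names are placeholders.
import snowflake.connector

ADMIN_STATEMENTS = [
    # Role-based access control: a read-only analyst role
    "CREATE ROLE IF NOT EXISTS analyst_ro",
    "GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_ro",
    "GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro",
    "GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro",
    # Cost management: cap monthly credits and suspend the warehouse at the limit
    """CREATE RESOURCE MONITOR IF NOT EXISTS monthly_cap
         WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
         TRIGGERS ON 90 PERCENT DO NOTIFY
                  ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_cap",
]

conn = snowflake.connector.connect(account="my_account", user="admin_user",
                                    password="***", role="ACCOUNTADMIN")
try:
    cur = conn.cursor()
    for stmt in ADMIN_STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```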

Posted 2 weeks ago

Apply

6.0 - 10.0 years

2 - 2 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

We currently have job openings for a Snowflake Developer (www.royalcyber.com).
Job Description:
- Design, develop, and maintain scalable data pipelines and Snowflake data warehouse models.
- Implement data ingestion processes using Snowflake and ETL/ELT tools (e.g., dbt, Informatica, Talend).
- Optimize Snowflake SQL queries and manage performance tuning and data modeling.
- Develop and maintain stored procedures, UDFs, and other Snowflake scripting components.
- Work with cross-functional teams to understand business requirements and translate them into technical solutions.
- Collaborate with BI developers to provide clean, transformed, and well-modeled data for analytics and reporting.
- Maintain data governance, security, and compliance within the Snowflake environment.
- Monitor data pipelines for reliability and troubleshoot issues as needed.
- Support integration of Snowflake with various data sources and analytics tools such as Tableau, Power BI, and Looker.
Required Skills and Qualifications:
- 6 to 8 years of experience in data engineering, data warehousing, or data analytics roles.
- Minimum 3+ years of hands-on experience with Snowflake (data modeling, performance tuning, schema design, etc.).
- Strong proficiency in SQL, with expertise in writing complex queries and stored procedures.
- Solid experience with ETL/ELT tools and frameworks.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data lake architectures.
- Experience integrating Snowflake with third-party BI tools and APIs.
- Strong understanding of data warehousing concepts, data lakes, and dimensional modeling.
- Working knowledge of version control (Git), CI/CD practices, and Agile methodologies.
- Excellent problem-solving skills and ability to work in a collaborative environment.
Work Location: Remote
If interested, please share your resume with sruthy.p@royalcyber.com
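Illustrative aside (not part of the posting): a typical ELT transform step of the kind described above is a MERGE that keeps a warehouse dimension current from a staging table. This is a minimal sketch assuming snowflake-connector-python; the table and column names are invented.

```python
# Illustrative ELT transform: upsert staged customer records into a dimension table.
# dim_customer / stg_customer and their columns are placeholder names for the example.
import snowflake.connector

MERGE_SQL = """
MERGE INTO dim_customer AS d
USING stg_customer AS s
  ON d.customer_id = s.customer_id
WHEN MATCHED THEN UPDATE SET
  d.customer_name = s.customer_name,
  d.city          = s.city,
  d.updated_at    = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (customer_id, customer_name, city, updated_at)
  VALUES (s.customer_id, s.customer_name, s.city, CURRENT_TIMESTAMP())
"""

conn = snowflake.connector.connect(account="my_account", user="etl_user", password="***",
                                    warehouse="TRANSFORM_WH", database="ANALYTICS",
                                    schema="MARTS")
try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    print(cur.fetchone())  # Snowflake reports rows inserted/updated by the MERGE
finally:
    conn.close()
```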

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Bhubaneswar, Odisha, India

On-site

Source: Foundit

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Snowflake Data Warehouse. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable and efficient solutions.
Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Snowflake Data Warehouse.
- Collaborate with cross-functional teams to analyze business requirements and develop scalable and efficient solutions.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure application quality by conducting unit testing, integration testing, and performance testing.
Professional & Technical Skills:
- Must have: experience with Snowflake Data Warehouse.
- Good to have: experience with other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics.
- Strong understanding of database concepts and SQL.
- Experience with ETL tools such as Talend, Informatica, or DataStage.
- Experience developing and maintaining technical documentation.
- Experience conducting unit, integration, and performance testing.
Additional Information: The candidate should have experience with Snowflake Data Warehouse, a strong educational background in computer science or a related field, and a proven track record of delivering impactful data-driven solutions.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text only).

Posted 3 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text only).

Posted 3 weeks ago

Apply

4.0 - 9.0 years

9 - 18 Lacs

Bengaluru

Hybrid

Source: Naukri

About Us: Shravas Technologies, founded in 2016, is an IT services company based in Bangalore, India. The company specializes in software QA and related services such as data mining, analytics, and visualization.
Job Title: Snowflake Developer (4 to 9 years' experience)
Location: Bangalore
Type: Full-time, Hybrid
Job Summary: We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate has hands-on expertise in building and optimizing scalable data pipelines and working with Snowflake data warehouse solutions. The role involves working closely with business analysts, data scientists, and other developers to deliver reliable, secure, and high-performance data solutions.
Key Responsibilities:
- Design, develop, and implement Snowflake-based data solutions.
- Create and maintain scalable ETL/ELT pipelines using tools such as SQL, Python, dbt, Airflow, or similar.
- Develop data models and schema designs optimized for performance and usability.
- Write and optimize complex SQL queries for data transformation and extraction.
- Integrate Snowflake with other systems such as MySQL, SQL Server, AWS (S3, Lambda), Azure, or GCP using APIs or connectors.
- Manage Snowflake security (roles, users, access control).
- Monitor data pipeline performance and troubleshoot issues.
- Participate in code reviews, unit testing, and documentation.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4 to 9 years of experience in data engineering or a related field.
- Proficiency in SQL and performance tuning.
- Experience with data pipeline and ETL tools (e.g., Informatica, Talend, dbt, Apache Airflow).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Understanding of data warehousing concepts and best practices.
- Knowledge of version control systems such as Git.
Preferred Skills:
- Experience with Python or Scala for data processing.
- Familiarity with tools such as Stitch, Fivetran, or Matillion.
- Exposure to CI/CD pipelines for data projects.
- Knowledge of data governance and security compliance.
- Understanding of financial and economic data trends is a plus.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and collaboratively in a team environment.
- Detail-oriented with a strong focus on quality and accuracy.
Reporting To: Lead Data Engineer / Chief Data Officer
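Illustrative aside (not part of the posting): the data-modelling work mentioned above, designing schemas optimized for performance and usability, often starts with a small star schema like the sketch below. It assumes snowflake-connector-python; all table, column, and connection names are invented placeholders.

```python
# Illustrative star-schema sketch: one fact table and two dimensions.
# Note: PRIMARY KEY / REFERENCES constraints are informational in Snowflake (not enforced).
import snowflake.connector

DDL = [
    """CREATE TABLE IF NOT EXISTS dim_date (
         date_key  NUMBER PRIMARY KEY,
         full_date DATE,
         year      NUMBER,
         month     NUMBER
       )""",
    """CREATE TABLE IF NOT EXISTS dim_product (
         product_key  NUMBER PRIMARY KEY,
         product_name STRING,
         category     STRING
       )""",
    """CREATE TABLE IF NOT EXISTS fact_sales (
         sale_id     NUMBER,
         date_key    NUMBER REFERENCES dim_date (date_key),
         product_key NUMBER REFERENCES dim_product (product_key),
         quantity    NUMBER,
         amount      NUMBER(18, 2)
       )""",
]

conn = snowflake.connector.connect(account="my_account", user="dev_user", password="***",
                                    warehouse="DEV_WH", database="ANALYTICS", schema="MARTS")
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```

A snowflake schema would further normalize the dimensions (e.g., splitting category out of dim_product); the star form is usually preferred for BI query simplicity.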

Posted 3 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 3 weeks ago

Apply

3 - 8 years

15 - 25 Lacs

Bhubaneshwar, Bengaluru, Hyderabad

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Bangalore/Bhubaneswar/Hyderabad
Required skills: Snowflake development, Snowpipe, SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.

Posted 1 month ago

Apply

3 - 8 years

15 - 25 Lacs

Bengaluru, Hyderabad, Noida

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 10 years
Work location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi
Required skills: Snowflake development, Snowpipe
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.

Posted 1 month ago

Apply

5 - 10 years

12 - 22 Lacs

Pune

Hybrid

Source: Naukri

Role & responsibilities:
- Grasp business data, processes, and use cases
- ETL pipeline implementation, tuning, and maintenance
- Implement dimension and fact tables (snowflake schema), ingestion, partitioning, and tuning
- Data engineering and scripting
- Work with BI teams
- Data governance
Required candidate profile:
- Experienced in SQL and Python
- AWS S3, RDS, EC2, VPC
- Snowflake, SnowSQL, Snowpipe
- Data migration: on-premise to cloud (Snowflake)
- Desirable: Databricks, PySpark, Apache big data tech stack, AWS Redshift, EMR
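Illustrative aside (not part of the posting): Snowflake has no user-managed partitions, so the "partitioning and tuning" work mentioned above is usually done with clustering keys on large fact tables. A minimal sketch assuming snowflake-connector-python; the table and column names are placeholders.

```python
# Illustrative tuning step: define a clustering key and inspect clustering quality.
# fact_sales / sale_date are invented names for the example.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="dev_user", password="***",
                                    warehouse="DEV_WH", database="ANALYTICS", schema="MARTS")
try:
    cur = conn.cursor()
    # Cluster the fact table on the column most queries filter by (often the date key).
    cur.execute("ALTER TABLE fact_sales CLUSTER BY (sale_date)")
    # Report how well micro-partitions line up with the chosen key.
    cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date)')")
    print(cur.fetchone()[0])
finally:
    conn.close()
```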

Posted 2 months ago

Apply

2 - 7 years

6 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!
This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV with nandinis@suntechnologies.com ASAP.
Shift timings: 2:00 PM - 11:00 PM (free cab drop facility + food)
Please let me know if any of your friends are looking for a job change, and kindly share references.
Please note: Work From Office only (no hybrid or work from home)
Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2
Preferred skills: any ETL tools
Venue details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com
Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 3 months ago

Apply

5 - 10 years

15 - 30 Lacs

Bengaluru, Hyderabad

Work from Office

Source: Naukri

Key areas:
- Data pipelines & ETL processes
- Snowflake data modeling
- Cloud & on-prem solutions: design and manage cloud-based data infrastructure (AWS, Azure, GCP)
- Real-time data processing

Posted 3 months ago

Apply

3 - 8 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience required: 3 to 8 years
Work location: Bangalore/Hyderabad/Bhubaneswar/Pune
Required skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 1 month ago

Apply

3 - 8 years

6 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from Sun Technology Integrators!
This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV with nandinis@suntechnologies.com ASAP.
Shift timings: 2:00 PM - 11:00 PM (free cab drop facility + food)
Please let me know if any of your friends are looking for a job change, and kindly share references.
Please note: Work From Office only (no hybrid or work from home)
Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2
Preferred skills: any ETL tools
Venue details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com
Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 1 month ago

Apply
