
83 Snowflake DB Jobs - Page 3

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
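As an illustration of the dimensional-modelling and Snowflake skills this role asks for, below is a minimal, hedged sketch of a banking-flavoured star schema deployed through the Snowflake Python connector. All names here (the RETAIL_DW database, the MODEL schema, the warehouse, and the credential placeholders) are hypothetical and are not taken from the posting.

```python
# Minimal sketch: creating a small star schema in Snowflake via the Python connector.
# Assumes the snowflake-connector-python package is installed; all object names and
# credentials below are hypothetical placeholders, not details from the job listing.
import snowflake.connector

DDL_STATEMENTS = [
    # Dimension: customer
    """
    CREATE TABLE IF NOT EXISTS RETAIL_DW.MODEL.DIM_CUSTOMER (
        CUSTOMER_KEY   NUMBER AUTOINCREMENT,
        CUSTOMER_ID    VARCHAR NOT NULL,
        FULL_NAME      VARCHAR,
        SEGMENT        VARCHAR,
        EFFECTIVE_FROM DATE,
        EFFECTIVE_TO   DATE,
        PRIMARY KEY (CUSTOMER_KEY)
    )
    """,
    # Dimension: calendar date
    """
    CREATE TABLE IF NOT EXISTS RETAIL_DW.MODEL.DIM_DATE (
        DATE_KEY  NUMBER,
        FULL_DATE DATE NOT NULL,
        YEAR      NUMBER,
        QUARTER   NUMBER,
        MONTH     NUMBER,
        PRIMARY KEY (DATE_KEY)
    )
    """,
    # Fact: one row per posted account transaction
    """
    CREATE TABLE IF NOT EXISTS RETAIL_DW.MODEL.FACT_TRANSACTION (
        TRANSACTION_ID VARCHAR NOT NULL,
        CUSTOMER_KEY   NUMBER,
        DATE_KEY       NUMBER,
        AMOUNT         NUMBER(18, 2),
        CURRENCY_CODE  VARCHAR(3),
        FOREIGN KEY (CUSTOMER_KEY) REFERENCES RETAIL_DW.MODEL.DIM_CUSTOMER (CUSTOMER_KEY),
        FOREIGN KEY (DATE_KEY) REFERENCES RETAIL_DW.MODEL.DIM_DATE (DATE_KEY)
    )
    """,
]

def main() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<user>",                   # placeholder
        password="<password>",           # placeholder
        warehouse="MODELLING_WH",        # hypothetical warehouse
        database="RETAIL_DW",
        schema="MODEL",
    )
    try:
        cur = conn.cursor()
        for ddl in DDL_STATEMENTS:
            cur.execute(ddl)
        print("Star schema objects created (or already present).")
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

Note that Snowflake records PRIMARY KEY and FOREIGN KEY constraints as metadata without enforcing them, which is one reason roles like this also stress SQL-based data-quality validation.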

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Nashik

Work from Office

Source: Naukri

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Pune

Work from Office

Source: Naukri

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Ludhiana

Work from Office

Source: Naukri

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Source: Naukri

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Lucknow

Work from Office

Source: Naukri

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement Dimensional Modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Role & Responsibilities: We are looking for candidates with 5+ years of experience in Snowflake development who are willing to relocate to the preferred location. Opening locations: Chennai, Bangalore.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Nagpur, Chennai, Bengaluru

Work from Office

Source: Naukri

Role & Responsibilities: We are looking for candidates with 5+ years of experience in Snowflake development who are willing to relocate to the preferred location. Opening locations: Chennai, Bangalore, and Nagpur.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

18 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Greetings from Primus Global Technology!

We are hiring for a Snowflake Administrator role with a leading MNC for locations including Bangalore, Chennai, and Hyderabad. This is a contract position (6 months to 1 year) with potential for extension based on performance. The selected candidate will be on the Primus Global payroll.

Experience Required: 6+ years (4.5+ in Snowflake)
Salary: 1,50,000 to 1,80,000 per month
Contract Duration: 6-12 months (extendable based on performance)
Payroll: Primus Global Technology

Note: Only candidates with experience as a Snowflake Administrator are eligible for this position. This opening is not for Snowflake Developers.

Key Responsibilities:
- Database Management: Snowflake account/user management, performance tuning, backups
- Security: Implement RBAC, encryption, and compliance policies
- Cost Management: Monitor and optimize Snowflake costs
- ETL & Integration: Support data pipelines and integration with other systems
- Performance Tuning: Improve query and system performance
- Support: Troubleshooting and vendor escalation
- Collaboration: Work with architects and stakeholders; provide system health reports

Apply now: send your resume to npandya@primusglobal.com. Looking for immediate joiners.
Contact: Nidhi P Pandya, Sr. Associate - Talent Acquisition, Primus Global Technology Pvt. Ltd.
All the best, job seekers!
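The "Security: Implement RBAC" responsibility usually comes down to scripted role and grant management. The sketch below is illustrative only and assumes the snowflake-connector-python package plus hypothetical role, warehouse, database, and user names.

```python
# Minimal RBAC sketch for a Snowflake administrator: create a read-only analyst role,
# grant it the usual usage/select privileges, and assign it to a user.
# All names and credentials are hypothetical placeholders.
import snowflake.connector

RBAC_STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS_DB.REPORTING TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.REPORTING TO ROLE ANALYST_RO",
    # Future tables should inherit the same read access.
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS_DB.REPORTING TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER JDOE",
]

def apply_rbac() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<admin_user>",             # placeholder
        password="<password>",           # placeholder
        role="SECURITYADMIN",            # role typically used for grants
    )
    try:
        cur = conn.cursor()
        for stmt in RBAC_STATEMENTS:
            cur.execute(stmt)
            print(f"OK: {stmt}")
    finally:
        conn.close()

if __name__ == "__main__":
    apply_rbac()
```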

Posted 3 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Source: Foundit

Job Description - Professional & Technical Skills:
- Must-have skills: Strong experience in Snowflake Data Warehouse.
- Good-to-have skills: Experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics.
- Experience in designing and developing applications using Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and best practices.
- Experience in working with cross-functional teams to deliver high-quality solutions.
- Excellent communication and interpersonal skills.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Hyderabad

Work from Office

Source: Naukri

Role & Responsibilities:
- Manage and maintain the Snowflake platform to ensure optimal performance and reliability.
- Collaborate with data engineers and analysts to design and implement data pipelines.
- Develop and optimize SQL queries for efficient data retrieval and manipulation.
- Create custom scripts and functions using JavaScript and Python to automate platform tasks.
- Troubleshoot platform issues and provide timely resolutions.
- Implement security best practices to protect data within the Snowflake platform.
- Stay updated on the latest Snowflake features and best practices to continuously improve platform performance.

Preferred Candidate Profile:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of nine years of experience in managing any database platform.
- Proficiency in SQL for data querying and manipulation.
- Strong programming skills in JavaScript and Python.
- Experience in optimizing and tuning Snowflake for performance.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
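The "custom scripts ... to automate platform tasks" duty can be pictured as small housekeeping jobs such as the one sketched below, which suspends warehouses left running. Credentials are placeholders, and the column handling assumes the cursor's result metadata exposes the SHOW WAREHOUSES column names; treat it as a starting point rather than this team's actual tooling.

```python
# Minimal automation sketch: suspend any warehouse that is currently running.
# Hypothetical credentials; intended as an illustration of scripted platform upkeep.
import snowflake.connector

def suspend_running_warehouses() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<admin_user>",             # placeholder
        password="<password>",           # placeholder
        role="SYSADMIN",
    )
    try:
        cur = conn.cursor()
        cur.execute("SHOW WAREHOUSES")
        columns = [col[0].lower() for col in cur.description]
        for row in cur.fetchall():
            wh = dict(zip(columns, row))
            if wh.get("state") == "STARTED":
                # Quoting the identifier guards against lower-case warehouse names.
                cur.execute(f'ALTER WAREHOUSE "{wh["name"]}" SUSPEND')
                print(f'Suspended warehouse {wh["name"]}')
    finally:
        conn.close()

if __name__ == "__main__":
    suspend_running_warehouses()
```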

Posted 3 weeks ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Gurugram, Bengaluru

Work from Office

Source: Naukri

Position Summary:
We are looking for a highly skilled Python Backend Developer with 6-12 years of backend development experience. The candidate will drive projects independently while ensuring high code quality and efficiency. The role requires expertise in Python frameworks (Django, Flask), cloud platforms (AWS), and database management (Snowflake), with a strong emphasis on software best practices, problem-solving, and stakeholder collaboration.

Experience and Required Skill Sets:
- 6-12 years of backend development experience with Python.
- Understanding of cloud platforms, particularly AWS.
- Proficiency in using Snowflake for database management and optimization.
- Experience working with data-intensive applications.
- Demonstrated ability to build dynamic and static reports using Python libraries such as Pandas, Matplotlib, or Plotly.
- Strong understanding of RESTful APIs and microservices architecture.
- Proficiency with Python frameworks like Django, Flask, or Tornado, including the basic skills required to develop and maintain applications using these frameworks.
- Knowledge of both relational and non-relational databases.
- Proficiency with version control systems, especially Git.

Responsibilities:
- Backend Development: Design, develop, and maintain scalable and resilient backend services using Python, ensuring optimal performance and reliability.
- Data-Intensive Applications: Develop and manage data-intensive applications, ensuring efficient data processing and handling.
- Report Generation: Create dynamic and static reports utilizing common Python libraries (e.g., Pandas, Matplotlib, Plotly) to deliver actionable insights.
- Python Frameworks: Utilize frameworks such as Django, Flask, or Tornado to build and maintain robust backend systems, ensuring best practices in application architecture.
- Cloud Platforms: Deploy and manage applications on cloud development platforms such as AWS and Beacon, leveraging their full capabilities to support our solutions.
- Database Management: Architect, implement, and optimize database solutions using Snowflake to ensure data integrity and performance.
- Stakeholder Collaboration: Engage directly with Tech Owners and Business Owners to gather requirements, provide progress updates, and ensure alignment with business objectives.
- Ownership & Initiative: Take full ownership of projects, driving them from conception through to completion with minimal supervision.
- Software Best Practices: Implement and uphold software development best practices, including version control, automated testing, code reviews, and CI/CD pipelines.
- GenAI Tools Utilization: Utilize GenAI tools such as GitHub Copilot to enhance coding efficiency, streamline workflows, and maintain high code quality.
- Problem-Solving: Proactively identify, troubleshoot, and resolve technical issues, ensuring timely delivery of solutions.
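For the report-generation requirement (Pandas/Matplotlib over Snowflake data), a minimal hedged sketch follows. The table, query, and credentials are hypothetical, and it assumes snowflake-connector-python is installed with its pandas extras so that fetch_pandas_all is available.

```python
# Minimal reporting sketch: pull an aggregate from Snowflake into pandas and
# render a static chart. Table, query, and credentials are hypothetical.
import matplotlib
matplotlib.use("Agg")  # headless rendering for a backend service
import matplotlib.pyplot as plt
import snowflake.connector

QUERY = """
    SELECT ORDER_MONTH, SUM(ORDER_AMOUNT) AS TOTAL_AMOUNT
    FROM ANALYTICS_DB.REPORTING.ORDERS
    GROUP BY ORDER_MONTH
    ORDER BY ORDER_MONTH
"""

def build_monthly_report(output_path: str = "monthly_orders.png") -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<service_user>",           # placeholder
        password="<password>",           # placeholder
        warehouse="REPORTING_WH",        # hypothetical warehouse
    )
    try:
        cur = conn.cursor()
        cur.execute(QUERY)
        df = cur.fetch_pandas_all()  # requires the connector's pandas extras
    finally:
        conn.close()

    ax = df.plot(x="ORDER_MONTH", y="TOTAL_AMOUNT", kind="bar", legend=False)
    ax.set_ylabel("Total order amount")
    ax.set_title("Orders by month")
    plt.tight_layout()
    plt.savefig(output_path)
    print(f"Report written to {output_path}")

if __name__ == "__main__":
    build_monthly_report()
```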

Posted 4 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience Required: 3 to 10 years
Work Location: Bangalore / Hyderabad / Bhubaneswar / Pune / Gurgaon / Noida / Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 4 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience Required: 3 to 10 years
Work Location: Bangalore / Hyderabad / Bhubaneswar / Pune / Gurgaon / Noida / Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 4 weeks ago

Apply

4.0 - 9.0 years

9 - 18 Lacs

Bengaluru

Hybrid

Source: Naukri

About Us: Shravas Technologies, founded in 2016, is an IT services company based out of Bangalore, India. The company specializes in software QA and related services such as data mining, analytics, and visualization.

Job Title: Snowflake Developer (4 to 9 years of experience)
Location: Bangalore
Type: Full-time, Hybrid

Job Summary: We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate should have hands-on expertise in building and optimizing scalable data pipelines and working with Snowflake data warehouse solutions. This role involves working closely with business analysts, data scientists, and other developers to deliver reliable, secure, and high-performance data solutions.

Key Responsibilities:
- Design, develop, and implement Snowflake-based data solutions.
- Create and maintain scalable ETL/ELT pipelines using tools such as SQL, Python, dbt, Airflow, or similar.
- Develop data models and schema designs optimized for performance and usability.
- Write and optimize complex SQL queries for data transformation and extraction.
- Integrate Snowflake with other systems like MySQL, SQL Server, AWS (S3, Lambda), Azure, or GCP using APIs or connectors.
- Manage Snowflake security (roles, users, access control).
- Monitor data pipeline performance and troubleshoot issues.
- Participate in code reviews, unit testing, and documentation.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4 to 9 years of experience in data engineering.
- Proficiency in SQL and performance tuning.
- Experience with data pipeline and ETL tools (e.g., Informatica, Talend, dbt, Apache Airflow).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Understanding of data warehousing concepts and best practices.
- Knowledge of version control systems like Git.

Preferred Skills:
- Experience with Python or Scala for data processing.
- Familiarity with tools like Stitch, Fivetran, or Matillion.
- Exposure to CI/CD pipelines for data projects.
- Knowledge of data governance and security compliance.
- Understanding of financial and economic data trends is a plus.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and collaboratively in a team environment.
- Detail-oriented with a strong focus on quality and accuracy.

Reporting To: Lead Data Engineer / Chief Data Officer
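To make the "scalable ETL/ELT pipelines using SQL, Python, dbt, Airflow, or similar" line concrete, here is a hedged sketch of a tiny Airflow 2.x DAG that loads a daily file into Snowflake with COPY INTO. The stage, table, schedule, and credentials are hypothetical placeholders.

```python
# Minimal ELT sketch: an Airflow DAG that loads a daily file from an external stage
# into a Snowflake table. Stage/table names and credentials are hypothetical, and
# real projects would normally keep credentials in an Airflow connection or secret store.
from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator

COPY_SQL = """
    COPY INTO ANALYTICS_DB.RAW.DAILY_SALES
    FROM @ANALYTICS_DB.RAW.SALES_STAGE/daily/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT'
"""

def load_daily_sales() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<etl_user>",               # placeholder
        password="<password>",           # placeholder
        warehouse="ETL_WH",
    )
    try:
        conn.cursor().execute(COPY_SQL)
    finally:
        conn.close()

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="copy_into_snowflake", python_callable=load_daily_sales)
```

In practice this step is often delegated to the Snowflake provider's operators or to dbt models triggered from the DAG; the plain PythonOperator simply keeps the sketch self-contained.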

Posted 4 weeks ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Chennai

Work from Office

Source: Naukri

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake SQL.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time.
- Develop complex queries to optimize database performance and troubleshoot issues.
- Implement star schema designs for efficient data modeling and querying.
- Participate in code reviews to ensure adherence to coding standards.
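As a companion to the "implement star schema designs" and "develop complex queries" bullets, the hedged example below rolls a fact table up by two dimensions. It reuses the hypothetical RETAIL_DW star schema sketched earlier on this page; none of the names come from the posting.

```python
# Minimal star-schema query sketch: join a fact table to two dimensions and roll up.
# All object names and credentials are hypothetical placeholders.
import snowflake.connector

ROLLUP_SQL = """
    SELECT
        d.YEAR,
        d.MONTH,
        c.SEGMENT,
        SUM(f.AMOUNT) AS TOTAL_AMOUNT,
        COUNT(*)      AS TXN_COUNT
    FROM RETAIL_DW.MODEL.FACT_TRANSACTION f
    JOIN RETAIL_DW.MODEL.DIM_DATE     d ON d.DATE_KEY     = f.DATE_KEY
    JOIN RETAIL_DW.MODEL.DIM_CUSTOMER c ON c.CUSTOMER_KEY = f.CUSTOMER_KEY
    GROUP BY d.YEAR, d.MONTH, c.SEGMENT
    ORDER BY d.YEAR, d.MONTH, c.SEGMENT
"""

def run_rollup() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<user>",                   # placeholder
        password="<password>",           # placeholder
        warehouse="REPORTING_WH",
    )
    try:
        cur = conn.cursor()
        cur.execute(ROLLUP_SQL)
        for year, month, segment, total, txn_count in cur.fetchall():
            print(year, month, segment, total, txn_count)
    finally:
        conn.close()

if __name__ == "__main__":
    run_rollup()
```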

Posted 4 weeks ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Chennai, Bengaluru

Work from Office

Source: Naukri

Job Description: We are seeking a highly skilled Snowflake Database Administrator with a proven track record of managing, optimizing, and securing Snowflake environments. The ideal candidate will have 5+ years of hands-on experience in Snowflake database administration, ensuring seamless performance, security, and scalability of enterprise-level data solutions. This role requires deep expertise in database architecture, performance tuning, access control, workload management, and cost optimization within Snowflake. The selected candidate will play a key role in monitoring database health, implementing security best practices, automating administrative tasks, and optimizing query performance.

Key Responsibilities:
- Administer and manage Snowflake databases, warehouses, and schemas.
- Optimize query performance, workload management, and storage efficiency.
- Implement and manage RBAC (Role-Based Access Control), authentication, and data security policies.
- Monitor and troubleshoot database performance issues and system health.
- Handle data loading, unloading, and integrations with various data sources.
- Automate administrative tasks using SQL, Python, or Shell scripting.
- Manage clustering, caching, and partitioning strategies for efficient data retrieval.
- Ensure high availability, backups, and disaster recovery planning.
- Maintain audit logs, access control, and compliance standards.

Required Skills & Qualifications:
- 5+ years of experience in Snowflake database administration.
- Strong expertise in SQL performance tuning and optimization.
- Experience with Role-Based Access Control (RBAC) and security best practices.
- Familiarity with data sharing, replication, and failover strategies.
- Experience working with cloud platforms (AWS, Azure, or GCP).
- Scripting experience with Python, Shell, or SQL for automation.
- Knowledge of Snowflake resource monitoring and cost optimization.
- Willingness to work in a 24/7 operational environment with rotational shifts.

Preferred Qualifications:
- Snowflake certifications (SnowPro Core/Advanced).
- Experience with data governance and compliance frameworks.
- Familiarity with ETL/ELT processes and data pipeline orchestration is a value-add.
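For the cost-optimization and resource-monitoring items, a common pattern is a credit quota with staged triggers plus a periodic look at metering history, as in the hedged sketch below. The quota, warehouse, and credentials are hypothetical, and SNOWFLAKE.ACCOUNT_USAGE views can lag by a few hours.

```python
# Minimal cost-control sketch: attach a resource monitor to a warehouse and
# report the last week's credit usage. Names, quota, and credentials are hypothetical.
import snowflake.connector

COST_STATEMENTS = [
    """
    CREATE OR REPLACE RESOURCE MONITOR MONTHLY_ETL_MONITOR
        WITH CREDIT_QUOTA = 100
        FREQUENCY = MONTHLY
        START_TIMESTAMP = IMMEDIATELY
        TRIGGERS ON 80 PERCENT DO NOTIFY
                 ON 100 PERCENT DO SUSPEND
    """,
    "ALTER WAREHOUSE ETL_WH SET RESOURCE_MONITOR = MONTHLY_ETL_MONITOR",
]

USAGE_SQL = """
    SELECT WAREHOUSE_NAME, SUM(CREDITS_USED) AS CREDITS_LAST_7_DAYS
    FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
    WHERE START_TIME >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY WAREHOUSE_NAME
    ORDER BY CREDITS_LAST_7_DAYS DESC
"""

def main() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<dba_user>",               # placeholder
        password="<password>",           # placeholder
        role="ACCOUNTADMIN",             # resource monitors require elevated privileges
    )
    try:
        cur = conn.cursor()
        for stmt in COST_STATEMENTS:
            cur.execute(stmt)
        cur.execute(USAGE_SQL)
        for warehouse, credits in cur.fetchall():
            print(f"{warehouse}: {credits} credits in the last 7 days")
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```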

Posted 1 month ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience Required: 3 to 10 years
Work Location: Bangalore / Hyderabad / Bhubaneswar / Pune / Gurgaon / Noida / Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 1 month ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Warm greetings from SP Staffing!
Role: Snowflake Developer
Experience Required: 3 to 10 years
Work Location: Bangalore / Hyderabad / Bhubaneswar / Pune / Gurgaon / Noida / Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Kochi, Chennai, Thiruvananthapuram

Work from Office

Source: Naukri

Snowflake Data Warehouse Development:
- Design, implement, and optimize data warehouses on the Snowflake cloud platform.
- Ensure effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing.

Data Pipeline Development:
- Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform.
- Design and maintain ETL workflows to enable seamless data processing across systems.

Data Transformation with PySpark:
- Leverage PySpark for data transformations within the Snowflake environment.
- Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.

Collaboration:
- Work closely with cross-functional teams to design data solutions aligned with business requirements.
- Engage with stakeholders to understand business needs and translate them into technical solutions.
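The PySpark-with-Snowflake responsibilities above typically mean cleansing a DataFrame and writing it back through the Spark-Snowflake connector. The sketch below is illustrative: it assumes the connector and the Snowflake JDBC driver are on the Spark classpath, and every connection option and table name is a placeholder.

```python
# Minimal PySpark sketch: read a raw table from Snowflake, apply simple cleansing
# and validation, and write the result back. Connection options are placeholders,
# and the Spark-Snowflake connector plus Snowflake JDBC driver must be on the classpath.
from pyspark.sql import SparkSession, functions as F

SNOWFLAKE_SOURCE = "net.snowflake.spark.snowflake"

sf_options = {
    "sfURL": "<account_identifier>.snowflakecomputing.com",  # placeholder
    "sfUser": "<etl_user>",                                   # placeholder
    "sfPassword": "<password>",                               # placeholder
    "sfDatabase": "ANALYTICS_DB",
    "sfSchema": "RAW",
    "sfWarehouse": "ETL_WH",
}

spark = SparkSession.builder.appName("snowflake-cleansing-sketch").getOrCreate()

raw = (
    spark.read.format(SNOWFLAKE_SOURCE)
    .options(**sf_options)
    .option("dbtable", "RAW_CUSTOMERS")
    .load()
)

cleansed = (
    raw.dropDuplicates(["CUSTOMER_ID"])                    # de-duplicate on the business key
    .withColumn("EMAIL", F.lower(F.trim(F.col("EMAIL"))))  # normalise email addresses
    .withColumn("SIGNUP_DATE", F.to_date("SIGNUP_TS"))     # derive a date column
    .filter(F.col("CUSTOMER_ID").isNotNull())              # basic validation rule
)

(
    cleansed.write.format(SNOWFLAKE_SOURCE)
    .options(**sf_options)
    .option("dbtable", "CURATED.CUSTOMERS")
    .mode("overwrite")
    .save()
)
```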

Posted 1 month ago

Apply

10.0 - 15.0 years

16 - 25 Lacs

Bengaluru

Hybrid

Source: Naukri

Experience: 10+ years
Location: Bangalore, Hybrid
Notice Period: Immediate joiners to 30 days
Mode of Interview: 2-3 rounds
Key Skills: Snowflake or Databricks, Python or Java, Cloud, Data Modeling

Primary Skills:
- Erwin tool
- Data Modeling (Logical, Physical) - 2+ years
- Snowflake - 2+ years
- Warehousing concepts
- Any cloud

Secondary Skills:
- dbt Cloud
- Airflow
- AWS

Job Description:
This is a hands-on technology position for a data technology leader with specialized business knowledge in the middle/front office areas. The candidate is someone with a proven record of technology project execution for data on the cloud, able to get hands-on when it comes to analysis, design, and development, and has the creativity and self-motivation to deliver on mission-critical projects.

These skills will help you succeed in this role:
- 10+ years of experience in an application development team with hands-on architecting, designing, developing, and deployment skills.
- Demonstrated ability to translate business requirements into a technical design and through to implementation.
- Experienced subject matter expert in designing and architecting Big Data platforms, services, and systems using Java/Python, SQL, Databricks, Snowflake, and cloud-native tools on Azure and AWS.
- Experience in event-driven architectures, message hubs, MQ, Kafka.
- Experience in Kubernetes, ETL tools, Data as a Service, star schema, dimension modeling, OLTP, ACID, and data structures is desired.
- Proven experience with cloud and Big Data platforms, building data processing applications utilizing Spark, Airflow, object storage, etc.
- Ability to work in an on-shore/off-shore model with development teams across continents.
- Use coding standards, secure application development, documentation, release and configuration management, and expertise in CI/CD.
- Well-versed in SDLC using Agile Scrum.
- Plan and execute the deployment of releases.
- Ability to work with application development, SQA, and infrastructure teams.
- Strong leadership and analytical problem-solving skills, along with the ability to learn and adapt quickly.
- Self-motivated, quick learner, and creative problem solver; organized and responsible for managing a team of development engineers.

Education & Preferred Qualifications:
- Bachelor's degree and 6 or more years of experience in Information Technology.
- Strong team ethics and team player.
- Cloud, Databricks, or Snowflake certification is a plus.
- Experience in evaluating software, estimating cost and delivery timelines, and managing financials.
- Experience leading agile delivery and adhering to SDLC processes is required.
- Work closely with business and IT stakeholders to manage delivery.

Additional Requirements:
- Ability to lead delivery, manage team members if required, and provide feedback.
- Ability to make effective decisions and manage change.
- Communicates effectively in a professional manner, both written and orally.
- Team player with a positive attitude, enthusiasm, initiative, and self-motivation.

Posted 1 month ago

Apply

13.0 - 23.0 years

25 - 35 Lacs

Hyderabad

Work from Office

Source: Naukri

Role: Snowflake Practice Lead / Architect / Solution Architect
Experience: 13+ years
Work Location: Hyderabad

Position Overview: We are seeking a highly skilled and experienced Snowflake Practice Lead to drive our data strategy, architecture, and implementation using Snowflake. This leadership role requires a deep understanding of Snowflake's cloud data platform, data engineering best practices, and enterprise data management. The ideal candidate will be responsible for defining best practices, leading a team of Snowflake professionals, and driving successful Snowflake implementations for clients.

Key Responsibilities:

Leadership & Strategy:
- Define and drive the Snowflake practice strategy, roadmap, and best practices.
- Act as the primary subject matter expert (SME) for Snowflake architecture, implementation, and optimization.
- Collaborate with stakeholders to understand business needs and align data strategies accordingly.

Technical Expertise & Solutioning:
- Design and implement scalable, high-performance data architectures using Snowflake.
- Develop best practices for data ingestion, transformation, modeling, and security within Snowflake.
- Guide clients on Snowflake migrations, ensuring a seamless transition from legacy systems.
- Optimize query performance, storage utilization, and cost efficiency in Snowflake environments.

Team Leadership & Mentorship:
- Lead and mentor a team of Snowflake developers, data engineers, and architects.
- Provide technical guidance, conduct code reviews, and establish best practices for Snowflake development.
- Train internal teams and clients on Snowflake capabilities, features, and emerging trends.

Client & Project Management:
- Engage with clients to understand business needs and design tailored Snowflake solutions.
- Lead end-to-end Snowflake implementation projects, ensuring quality and timely delivery.
- Work closely with data scientists, analysts, and business stakeholders to maximize data utilization.

Required Skills & Experience:
- 10+ years of experience in data engineering, data architecture, or cloud data platforms.
- 5+ years of hands-on experience with Snowflake in large-scale enterprise environments.
- Strong expertise in SQL, performance tuning, and cloud-based data solutions.
- Experience with ETL/ELT processes, data pipelines, and data integration tools (e.g., Talend, Matillion, dbt, Informatica).
- Proficiency in cloud platforms such as AWS, Azure, or GCP, particularly their integration with Snowflake.
- Knowledge of data security, governance, and compliance best practices.
- Strong leadership, communication, and client-facing skills.
- Experience in migrating from traditional data warehouses (Oracle, Teradata, SQL Server) to Snowflake.
- Familiarity with Python, Spark, or other big data technologies is a plus.

Preferred Qualifications:
- Snowflake SnowPro certification (e.g., SnowPro Core, Advanced Architect, Data Engineer).
- Experience in building data lakes, data marts, and real-time analytics solutions.
- Hands-on experience with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) in Snowflake environments.

Why Join Us?
- Opportunity to lead cutting-edge Snowflake implementations in a dynamic, fast-growing environment.
- Work with top-tier clients across industries, solving complex data challenges.
- Continuous learning and growth opportunities in cloud data technologies.
- Competitive compensation, benefits, and a collaborative work culture.

Posted 1 month ago

Apply

5 - 10 years

3 - 7 Lacs

Chennai

Work from Office

Source: Naukri

Snowflake Developer

Mandatory skills: Snowflake DB development + Python & Unix scripting + SQL queries
Location: Chennai
Notice Period: 0 to 30 days
Experience: 5 to 10 years

Skill set: Snowflake, Python, SQL, and PBI (Power BI) development.
- Understand and implement data security and data modelling.
- Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake.
- Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake and from Snowflake to other systems.
- Understand cloud architecture.
- Develop and design PBI dashboards, reports, and data visualizations.
- Communication skills.

Support responsibilities:
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.

Capability building:
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialist.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Performance parameters:
1. Process - number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team Management - productivity, efficiency, absenteeism.
3. Capability Development - triages completed, technical test performance.
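The "write JavaScript and Python stored procedure code in Snowflake" requirement can be illustrated by deploying a small JavaScript procedure from Python, as sketched below. Object names, the retention rule, and credentials are hypothetical; a real project would keep such DDL in version-controlled migration scripts.

```python
# Minimal sketch: create and call a JavaScript stored procedure in Snowflake from Python.
# Object names and credentials are hypothetical placeholders.
import snowflake.connector

CREATE_PROC_SQL = """
CREATE OR REPLACE PROCEDURE ANALYTICS_DB.UTIL.PURGE_OLD_ROWS(RETENTION_DAYS FLOAT)
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
    // Delete rows older than the retention window from a hypothetical audit table.
    var stmt = snowflake.createStatement({
        sqlText: "DELETE FROM ANALYTICS_DB.RAW.AUDIT_EVENTS " +
                 "WHERE EVENT_TS < DATEADD('day', ?, CURRENT_TIMESTAMP())",
        binds: [-RETENTION_DAYS]
    });
    stmt.execute();
    return "Purge completed";
$$
"""

def deploy_and_run() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<developer>",              # placeholder
        password="<password>",           # placeholder
        warehouse="ETL_WH",
    )
    try:
        cur = conn.cursor()
        cur.execute(CREATE_PROC_SQL)
        cur.execute("CALL ANALYTICS_DB.UTIL.PURGE_OLD_ROWS(90)")
        print(cur.fetchone()[0])
    finally:
        conn.close()

if __name__ == "__main__":
    deploy_and_run()
```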

Posted 1 month ago

Apply

4 - 8 years

7 - 14 Lacs

Pune

Hybrid

Source: Naukri

Role: Snowflake Developer
Experience: 4 to 6 years

Key Responsibilities:
- Perform development & support activities for the data warehousing domain.
- Understand high-level design and application interface design, and build low-level design.
- Perform application analysis and propose technical solutions for application enhancement or to resolve production issues.
- Perform development and deployment: should be able to code, unit test, and deploy.
- Create necessary documentation for all project deliverable phases.
- Handle production issues (Tier 2 support, weekend on-call rotation) to resolve production issues and ensure SLAs are met.

Technical Skills (mandatory):
- In-depth knowledge of SQL, Unix, and advanced Unix shell scripting.
- Very clear understanding of Snowflake architecture.
- At least 4+ years of hands-on experience on Snowflake: snowsql, COPY command, stored procedures, performance tuning, and other advanced features such as Snowpipe, semi-structured data loading, and table types.
- Hands-on experience with file transfer mechanisms (NDM, SFTP, Data Router, etc.).
- Knowledge of schedulers like TWS.

Good to have:
- Snowflake certification.
- Python.
- Loading AVRO and PARQUET files to Snowflake.
- Informatica.

Other details:
- Location: Magarpatta City, Pune (Hybrid, minimum 3 days work from office).
- Shift: 1 PM to 10 PM.
- Interview: 2 rounds.
- Notice period: Immediate joiners to 15 days (only candidates currently serving notice period).
- Excellent communication skills.

Interested candidates, share your resume at dipti.bhaisare@in.experis.com.
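Since this listing leans on the COPY command, Snowpipe, and semi-structured loads, the hedged sketch below stages JSON into a VARIANT column and wraps the same COPY in a pipe for continuous ingestion. Stage, table, and pipe names are hypothetical, and auto-ingest additionally requires cloud storage event notifications that are not shown here.

```python
# Minimal sketch: load semi-structured JSON into a VARIANT column with COPY INTO,
# then define a Snowpipe that reuses the same COPY for continuous ingestion.
# Stage, table, pipe names, and credentials are hypothetical placeholders.
import snowflake.connector

SETUP_STATEMENTS = [
    # Landing table with a single VARIANT column plus a load timestamp.
    """
    CREATE TABLE IF NOT EXISTS ANALYTICS_DB.RAW.EVENTS_JSON (
        V         VARIANT,
        LOADED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    )
    """,
    # One-off/backfill load from an existing external stage.
    """
    COPY INTO ANALYTICS_DB.RAW.EVENTS_JSON (V)
    FROM (SELECT $1 FROM @ANALYTICS_DB.RAW.EVENTS_STAGE/backfill/)
    FILE_FORMAT = (TYPE = 'JSON')
    ON_ERROR = 'CONTINUE'
    """,
    # Continuous ingestion: Snowpipe with auto-ingest from cloud storage notifications.
    """
    CREATE PIPE IF NOT EXISTS ANALYTICS_DB.RAW.EVENTS_PIPE
        AUTO_INGEST = TRUE
        AS COPY INTO ANALYTICS_DB.RAW.EVENTS_JSON (V)
           FROM (SELECT $1 FROM @ANALYTICS_DB.RAW.EVENTS_STAGE/stream/)
           FILE_FORMAT = (TYPE = 'JSON')
    """,
]

# Example of querying the semi-structured payload once it lands.
SAMPLE_QUERY = """
    SELECT V:event_type::STRING AS EVENT_TYPE, COUNT(*) AS EVENTS
    FROM ANALYTICS_DB.RAW.EVENTS_JSON
    GROUP BY EVENT_TYPE
"""

def main() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<etl_user>",               # placeholder
        password="<password>",           # placeholder
        warehouse="ETL_WH",
    )
    try:
        cur = conn.cursor()
        for stmt in SETUP_STATEMENTS:
            cur.execute(stmt)
        cur.execute(SAMPLE_QUERY)
        print(cur.fetchall())
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```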

Posted 1 month ago

Apply

6 - 10 years

15 - 27 Lacs

Pune, Chennai, Bengaluru

Work from Office

Source: Naukri

Snowflake Administration
Experience: 6-10 years

Key Responsibilities:
- Administer and manage the Snowflake data platform, including monitoring, configuration, and upgrades.
- Ensure the performance, scalability, and reliability of Snowflake databases and queries.
- Set up and manage user roles, access controls, and security policies to safeguard data integrity.
- Optimize database design and storage management to improve efficiency and reduce costs.
- Collaborate with data engineering and analytics teams to integrate data pipelines and support data workloads.
- Implement best practices for ETL/ELT processes, query optimization, and data warehouse management.
- Troubleshoot and resolve issues related to Snowflake platform operations.
- Monitor resource utilization and provide cost analysis for effective usage.
- Create and maintain documentation for Snowflake configurations, processes, and policies.

Skills and Qualifications:
- Proven experience in Snowflake administration and management.
- Strong understanding of Snowflake compute and storage management.
- Expertise in data governance, including column-level data security using secure views and dynamic data masking features.
- Proficiency in performing data definition language (DDL) operations.
- Ability to apply strategies for Snowflake performance tuning.
- Experience in designing and developing secure access controls using Role-Based Access Control (RBAC).
- Excellent troubleshooting and problem-solving skills.
- Strong collaboration and communication skills.
- Understanding of cost optimization approaches and their implementation on Snowflake.
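For the column-level security line (secure views and dynamic data masking), a hedged sketch follows. Role, table, and column names are hypothetical, and applying masking policies requires the appropriate governance privileges in the account.

```python
# Minimal column-level security sketch: a dynamic data masking policy on an email
# column plus a secure view over the same table. Names and credentials are hypothetical.
import snowflake.connector

SECURITY_STATEMENTS = [
    # Mask email addresses for everyone except a privileged role.
    """
    CREATE OR REPLACE MASKING POLICY ANALYTICS_DB.GOVERNANCE.EMAIL_MASK
        AS (VAL STRING) RETURNS STRING ->
        CASE
            WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN VAL
            ELSE '***MASKED***'
        END
    """,
    """
    ALTER TABLE ANALYTICS_DB.REPORTING.CUSTOMERS
        MODIFY COLUMN EMAIL
        SET MASKING POLICY ANALYTICS_DB.GOVERNANCE.EMAIL_MASK
    """,
    # Secure view: hides the view definition and underlying data from unauthorized roles.
    """
    CREATE OR REPLACE SECURE VIEW ANALYTICS_DB.REPORTING.CUSTOMERS_V AS
        SELECT CUSTOMER_ID, SEGMENT, EMAIL
        FROM ANALYTICS_DB.REPORTING.CUSTOMERS
    """,
]

def main() -> None:
    conn = snowflake.connector.connect(
        account="<account_identifier>",  # placeholder
        user="<governance_admin>",       # placeholder
        password="<password>",           # placeholder
        role="GOVERNANCE_ADMIN_ROLE",    # hypothetical role with masking privileges
    )
    try:
        cur = conn.cursor()
        for stmt in SECURITY_STATEMENTS:
            cur.execute(stmt)
        print("Masking policy and secure view deployed.")
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```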

Posted 1 month ago

Apply