
1714 Snowflake Jobs - Page 42

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6 - 10 years

12 - 18 Lacs

Hyderabad

Hybrid


Requirement is as follows:
• Design and implement scalable data storage solutions using Snowflake.
• Write SQL queries against Snowflake and develop scripts to extract, load, and transform data.
• Write, optimize, and troubleshoot complex SQL queries within Snowflake.
• Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Cloning, the query optimizer, metadata management, data sharing, stored procedures, and UDFs.
• Develop and maintain ETL processes using Informatica PowerCenter.
• Integrate Snowflake with various data sources and third-party applications.
• Experience in data lineage analysis, data profiling, ETL design and development, unit testing, production batch support, and UAT support.
• SQL performance tuning; root-causing failures, classifying them into distinct technical issues, and resolving them.
• In-depth understanding of data warehouse and ETL concepts and data modelling.
• Experience in requirement gathering, analysis, design, development, and deployment.
• Good working knowledge of an ETL tool (preferably Informatica PowerCenter or dbt).
• Proficiency in SQL.
• Experience in client-facing projects.
• Experience with Snowflake best practices.
• Experience with Unix shell scripting.
• Working experience in Python is good to have.

Expected skillset:
• 6-8 years of IT experience, including a minimum of 4 years designing and implementing a fully operational solution on the Snowflake Data Warehouse.
• Proven experience as a Snowflake and Informatica developer.
• Strong expertise in Snowflake architecture, design, and implementation.
• Proficiency in SQL, ETL tools, and data modelling concepts.
• Excellent leadership, communication, and problem-solving skills.
• Certifications in Snowflake and relevant cloud platforms are desirable.
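
For readers unfamiliar with the Snowflake utilities this listing names (Streams, Tasks, Time Travel, SnowSQL), here is a minimal sketch of the common Streams-plus-Tasks pattern for incremental ELT. All object names (raw_orders, orders_dim, transform_wh) are illustrative assumptions, not part of the posting.

    -- Capture row-level changes on a landing table (hypothetical names).
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- A scheduled task merges only new/changed rows into the curated table.
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      MERGE INTO orders_dim d
      USING raw_orders_stream s
        ON d.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET d.status = s.status, d.updated_at = s.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
        VALUES (s.order_id, s.status, s.updated_at);

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK load_orders_task RESUME;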

Posted 2 months ago

Apply

9 - 14 years

30 - 40 Lacs

Chennai, Pune, Bengaluru

Work from Office


Position: Integration Architect (DBT + Snowflake)
Location: Pune/Chennai/Nagpur/Bengaluru

Purpose of the Position: As a Senior Data Integration Developer/Architect (DBT), this role seeks candidates passionate about specialized skills in Snowflake technology and features. You will be instrumental in assisting our clients by developing models that help them use Snowflake effectively.

Key Result Areas and Activities:
1. Expertise and Knowledge Sharing: Develop and share expertise in DBT and Snowflake data modelling and development. Actively mine and disseminate organizational experience and expertise across teams and clients.
2. Support and Collaboration: Support Cloud and Data Engineering COE initiatives. Collaborate with management to understand and align with company objectives.
3. Real-Time Data and Performance: Ensure DBT solutions are correctly built for collecting real-time data. Perform and deliver effectively in large and complex environments.
4. Pipeline and Architecture Design: Design, build, test, and maintain Snowflake architectures and data pipelines.
5. Compliance and Security: Ensure compliance with data governance and security policies.

Must have:
• Expertise in Snowflake architecture, understanding of models, and integration of cloud platforms with Snowflake.
• ETL development, SQL scripting, and working knowledge of stored procedures.
• Proficiency in designing and maintaining data warehouses and data marts.
• Strong skills in ETL processes and tools (e.g., Informatica, Talend, SnapLogic).
• Strong problem-solving skills and the ability to work effectively in a collaborative team environment.
• Experience working on data warehouse/ETL projects.
• 4+ years of Snowflake ETL experience and 3+ years of DBT experience, or equivalent.
• Experience with cloud data platforms (e.g., AWS, Azure).

Posted 2 months ago

Apply

2 years

0 Lacs

Chennai

Work from Office


Job Description: Data Engineer

Position Details
Position Title: Data Engineer
Department: Data Engineering
Location: Chennai
Employment Type: Full-Time

About the Role
We are seeking a highly skilled and motivated Data Engineer with expertise in Snowflake to join our dynamic team. In this role, you will design, build, and optimize scalable data pipelines and cloud-based data infrastructure to ensure efficient data flow across systems. You will collaborate closely with data scientists, analysts, and business stakeholders to provide clean, accessible, and high-quality data for analytics and decision-making. The ideal candidate is passionate about cloud data platforms, data modeling, and performance optimization, with hands-on experience in Snowflake and modern data engineering tools.

Key Responsibilities
1. Data Pipeline Development & Optimization
• Design, develop, and maintain scalable ETL/ELT data pipelines using Snowflake, dbt, and Apache Airflow.
• Optimize Snowflake query performance, warehouse sizing, and cost efficiency.
• Automate data workflows to ensure seamless integration between structured and unstructured data sources.
2. Data Architecture & Integration
• Design and implement data models and schemas optimized for analytics and operational workloads.
• Manage Snowflake multi-cluster warehouses, role-based access control (RBAC), and security best practices.
• Integrate data from multiple sources, including APIs, relational databases, NoSQL databases, and third-party services.
3. Infrastructure & Performance Management
• Monitor and optimize Snowflake storage, query execution plans, and resource utilization.
• Implement data governance, security policies, and compliance within Snowflake.
• Troubleshoot and resolve performance bottlenecks in data pipelines and cloud storage solutions.
4. Collaboration & Continuous Improvement
• Work with cross-functional teams to define data requirements and ensure scalable solutions.
• Document technical designs, architecture, and processes for data pipelines and Snowflake implementations.
• Stay updated with the latest advancements in cloud data engineering and Snowflake best practices.

Qualifications
Education & Experience
• Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
• 2+ years of experience in data engineering, with a strong focus on cloud-based data platforms.
• Proven expertise in Snowflake, including performance tuning, cost management, and data sharing capabilities.
• Experience with cloud platforms (AWS, GCP, or Azure) and distributed computing frameworks (Spark, Hadoop, etc.).
Technical Skills
• Strong SQL skills for query optimization and data modeling in Snowflake.
• Experience with ETL tools such as Apache Airflow, dbt, Talend, Informatica, or Matillion.
• Proficiency in Python, Scala, or Java for data processing and automation.
• Familiarity with Kafka, Kinesis, or other streaming data solutions.
• Understanding of data warehousing concepts, partitioning, and indexing strategies.
Preferred Qualifications
• SnowPro Certification or an equivalent cloud data engineering certification.
• Experience with containerization (Docker, Kubernetes) and CI/CD for data workflows.
• Knowledge of machine learning pipelines and MLOps.

Benefits
• Competitive salary and performance-based bonuses.
• Health insurance.
• Flexible working hours and remote work options.
• Professional development opportunities, including Snowflake training, certifications, and conferences.
• Collaborative and inclusive work environment.

How to Apply
1. Submit your resume/CV, updated to highlight your relevant skills, experience, and achievements in data engineering.
2. Write a cover letter (optional but recommended) covering why you are interested in the role, your relevant experience and achievements in data engineering, and how your skills align with the job requirements.
3. Provide supporting documents (optional but recommended): links to GitHub repositories, research papers, or portfolio projects showcasing your work in data engineering. If you don't have links, you can attach files (e.g., PDFs) of your projects or research papers.
4. Email your resume, cover letter, and supporting documents to: krishnamoorthi.somasundaram@nulogic.io
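
Since this role emphasizes multi-cluster warehouses and role-based access control (RBAC), here is a minimal sketch of both. The warehouse, role, database, and user names are hypothetical; multi-cluster scaling also assumes a Snowflake edition that supports it.

    -- Multi-cluster warehouse: scales out under concurrency, suspends when idle.
    CREATE WAREHOUSE IF NOT EXISTS analytics_wh
      WAREHOUSE_SIZE    = 'MEDIUM'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3         -- requires Enterprise edition or higher
      SCALING_POLICY    = 'STANDARD'
      AUTO_SUSPEND      = 60        -- seconds idle before suspending (cost control)
      AUTO_RESUME       = TRUE;

    -- RBAC: grant privileges to a role, then grant the role to users.
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_role;
    GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA analytics_db.reporting TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.reporting TO ROLE analyst_role;
    GRANT ROLE analyst_role TO USER jane_doe;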

Posted 2 months ago

Apply

9 - 14 years

18 - 33 Lacs

Delhi NCR, Gurgaon, Noida

Work from Office


Role & Responsibilities

Job Summary: We are seeking an experienced Senior Business Analyst (Data Application & Integration) to drive key data and integration initiatives. The ideal candidate will have a strong business analysis background and a deep understanding of data applications, API integrations, and cloud-based platforms such as Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
• Gather, document, and analyze business requirements for data application and integration projects.
• Work closely with business stakeholders to translate business needs into technical solutions.
• Design and oversee API integrations to ensure seamless data flow across platforms.
• Collaborate with cross-functional teams including developers, data engineers, and architects.
• Define and maintain data integration strategies, ensuring high availability and security.
• Work on Salesforce, Informatica, and Snowflake to streamline data management and analytics.
• Develop use cases, process flows, and documentation to support business and technical teams.
• Ensure compliance with data governance and security best practices.
• Act as a liaison between business and technical teams, providing insights and recommendations.

Key Skills & Requirements:
• Strong expertise in business analysis methodologies and data-driven decision-making.
• Hands-on experience with API integration and data application management.
• Proficiency in Salesforce, Informatica, DBT, IICS, and Snowflake.
• Strong analytical and problem-solving skills.
• Ability to work in an Agile environment and collaborate with multi-functional teams.
• Excellent communication and stakeholder management skills.

Preferred candidate profile: Immediate joiner.

Posted 2 months ago

Apply

4 - 9 years

12 - 22 Lacs

Chennai, Bengaluru, Gurgaon

Hybrid


Role Overview: As a Senior Data Engineer, you will design, develop, and manage scalable data solutions using Databricks, Snowflake, and Azure Synapse. You will optimize data pipelines, ensure automation, and collaborate with cross-functional teams to enhance data accessibility and analytics.

Key Responsibilities:
• Lead end-to-end data engineering projects, ensuring best practices in scalability and performance.
• Develop and optimize ETL/ELT pipelines using Azure Data Factory, AWS Glue, and Apache Airflow.
• Design data architectures, schemas, and models for analytics, with DBT as a plus.
• Work with big data tools (Hive, Spark) and cloud-based platforms.
• Implement CI/CD pipelines and version control for data workflows (nice to have).
• Ingest data from APIs, ERP systems, and RDBMS for seamless integration.
• Ensure data quality, governance, and monitoring for accuracy and compliance.
• Collaborate with teams and mentor junior engineers.

Requirements:
• Expertise in Python and SQL, with data warehousing and ELT experience.
• Hands-on experience with Snowflake or Databricks (certification preferred).
• Strong knowledge of cloud-based ETL/ELT and orchestration tools.
• Experience with APIs, data ingestion, and integration.
• Strong problem-solving, troubleshooting, and communication skills.

Preferred Qualifications:
• Certification in Snowflake, Databricks, or Azure Data Engineering.
• Experience in real-time data processing (Kafka is a plus).

Posted 2 months ago

Apply

12 - 16 years

40 - 45 Lacs

Chennai, Mumbai, Bengaluru

Work from Office


Job Description: Cloud Data/Information Architect

Core skillset: implementing cloud data pipelines. Tools: AWS, Databricks, Snowflake, Python, Fivetran.

Requirements:
• Experience working on projects involving AWS, Databricks, and Python.
• AWS-native data architecture and services such as S3, Lambda, Glue, EMR, and Databricks Spark.
• Experience handling the AWS cloud platform.

Responsibilities:
• Identify and define foundational business data domains and data domain elements.
• Identify and collaborate with data product owners and stewards in business circles to capture data definitions.
• Drive data source/lineage reporting and identification of reference data needs.
• Recommend data extraction and replication patterns.
• Data migration from big data platforms to the AWS Cloud on S3, Snowflake, and Redshift.
• Understand where to obtain the information needed to make appropriate decisions.
• Break a problem down into manageable pieces and implement effective, timely solutions; identify the problem versus the symptom.
• Manage problems that require the involvement of others to solve; reach sound decisions quickly and carefully evaluate alternative risks and solutions before taking action.
• Optimize the use of all available resources.
• Develop solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills:
• Hands-on experience with AWS and Databricks, especially S3, Snowflake, and Python.
• Experience with shell scripting.
• Exceptionally strong analytical and problem-solving skills.
• Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
• Strong experience with relational databases and data access methods, especially SQL.
• Excellent collaboration and cross-functional leadership skills.
• Excellent communication skills, both written and verbal.
• Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform.
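
The migration work this posting describes (big data to AWS S3 and Snowflake) typically lands files in S3 and bulk-loads them with COPY INTO. A minimal sketch follows; the bucket, IAM role ARN, and table names are placeholders, not details from the listing.

    -- Trusted link between Snowflake and an S3 bucket (hypothetical ARN/bucket).
    CREATE OR REPLACE STORAGE INTEGRATION s3_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
      STORAGE_ALLOWED_LOCATIONS = ('s3://my-landing-bucket/exports/');

    CREATE OR REPLACE STAGE landing_stage
      URL = 's3://my-landing-bucket/exports/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = PARQUET);

    -- Bulk-load Parquet files, mapping columns by name.
    COPY INTO analytics.raw.customer
      FROM @landing_stage/customer/
      MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
      ON_ERROR = 'ABORT_STATEMENT';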

Posted 2 months ago

Apply

5 - 9 years

7 - 11 Lacs

Hyderabad

Work from Office


We are currently seeking an experienced Snowflake Engineer for our Data Analytics team. This role involves designing, building, and maintaining our Snowflake cloud data warehouse. Candidates should have strong Snowflake, SQL, and cloud data solutions experience.

Responsibilities:
• Design, develop, and maintain efficient and scalable data pipelines in Snowflake, encompassing data ingestion, transformation, and loading (ETL/ELT).
• Implement and manage Snowflake security, including role-based access control, network policies, and data encryption.
• Develop and maintain data models optimized for analytical reporting and business intelligence.
• Collaborate with data analysts, scientists, and stakeholders to understand data requirements and translate them into technical solutions.
• Monitor and troubleshoot Snowflake performance, identifying and resolving bottlenecks.
• Automate data engineering processes using scripting languages (e.g., Python, SQL) and orchestration tools (e.g., Airflow, dbt).
• Design, develop, and deploy APIs within Snowflake using stored procedures and user-defined functions (UDFs).
• Lead and mentor a team of data engineers and analysts, providing technical guidance, coaching, and professional development opportunities.
• Stay current with the latest Snowflake features and best practices.
• Contribute to the development of data engineering standards and best practices.
• Document data pipelines, data models, and other technical specifications.

Qualifications:
• Bachelor's degree or higher in Computer Science, Information Technology, or a related field.
• A minimum of 5 years of experience in data engineering and management, including over 3 years working with Snowflake.
• Strong understanding of data warehousing concepts, including dimensional modeling, star schemas, and snowflake schemas.
• Proficiency in SQL and experience with data transformation and manipulation.
• Experience with ETL/ELT tools and processes.
• Experience with Apache Iceberg.
• Strong analytical and problem-solving skills.
• Excellent communication and collaboration skills.

Preferred Qualifications:
• Snowflake certifications (e.g., SnowPro Core Certification).
• Experience with scripting languages (e.g., Python) and automation tools (e.g., Airflow, dbt).
• Experience with cloud platforms (e.g., AWS, Azure, GCP).
• Experience with data visualization tools (e.g., Tableau, Power BI).
• Experience with Agile development methodologies.
• Experience with Snowflake Cortex, including Cortex Analyst, Arctic TILT, and Snowflake AI & ML Studio.
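
This listing calls out building APIs inside Snowflake with stored procedures and UDFs. Here is a minimal sketch of each; the names and logic are hypothetical illustrations, not the employer's code.

    -- Scalar SQL UDF (hypothetical): label a date with its fiscal quarter.
    CREATE OR REPLACE FUNCTION fiscal_quarter(d DATE)
    RETURNS STRING
    AS
    $$
      'Q' || QUARTER(d) || '-' || YEAR(d)
    $$;

    -- Snowflake Scripting procedure (hypothetical): delete rows older than N days.
    CREATE OR REPLACE PROCEDURE purge_stale_rows(tbl STRING, days_old FLOAT)
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    DECLARE
      stmt STRING;
    BEGIN
      stmt := 'DELETE FROM ' || tbl ||
              ' WHERE updated_at < DATEADD(day, -' || days_old || ', CURRENT_TIMESTAMP())';
      EXECUTE IMMEDIATE :stmt;
      RETURN 'purged ' || tbl;
    END;
    $$;

    CALL purge_stale_rows('analytics.raw.events', 90);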

Posted 2 months ago

Apply

2 - 5 years

5 - 7 Lacs

Noida

Work from Office


About the role: You will work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling it on proprietary tools. You should be able to take up new initiatives independently, collaborate with external and internal stakeholders, be a strong team player, and be able to create and define SOPs and TATs for ongoing and upcoming projects.

What you will need:
• Graduate in any discipline (preferably via regular attendance) from a recognized educational institute with a good academic track record.
• At least 2 years of live hands-on experience with advanced analytical tools (Power BI, Tableau, SQL), with a solid understanding of SSIS (ETL) and strong SQL and PL/SQL: connecting to data sources, importing data, and transforming data for business intelligence.
• Expertise in DAX and visuals in Power BI, with live hands-on experience on an end-to-end project.
• Strong mathematical skills to help collect, measure, organize, and analyze data.
• Ability to interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports.
• Ability to identify, analyze, and interpret trends or patterns in complex data sets.
• Ability to communicate with technical and business resources at many levels in a manner that supports progress and success.
• Ability to understand, appreciate, and adapt to new business cultures and ways of working.
• Initiative and the ability to work independently with minimal supervision.

Posted 2 months ago

Apply

4 - 9 years

16 - 30 Lacs

Bengaluru

Hybrid


Job Title: Senior Database Administrator - Snowflake
Work Location: Bangalore (Bhartiya City)
Job Type: Full-Time/Permanent
Experience Level: 5+ Years

Position Overview: This position is responsible for the execution of standard tasks that fall under the database administration domain.

Responsibilities:
• Snowflake data warehouse administration: configuring warehouses, databases, and other Snowflake objects, and ensuring optimal configuration settings for performance and cost-efficiency.
• Time Travel, Fail-safe, clustering, cloning, and metadata management.
• Cloud platforms: AWS storage (S3), Azure Blob Storage and containers.
• Expertise in SQL and Snowflake stored procedures.
• Creation and management of Snowflake objects such as databases, schemas, tables, views, materialized views, pipes, streams, and tasks.
• Creation and management of account-level objects such as warehouses, resource monitors, network policies, integrations (storage, notification), and shares.
• Resolution of connectivity issues with the ODBC, JDBC, and Python connectors.
• Create replication and failover groups, and restore data from Fail-safe or Time Travel.
• Troubleshoot data loading, connection, and Snowpipe issues.
• User management and implementation of RBAC best practices; design and develop secure access to objects in Snowflake with role-based access control (RBAC).
• Access rights management, including experience with the privileged roles ACCOUNTADMIN, SYSADMIN, and SECURITYADMIN.
• Analyze and plan workloads; manage multi-cluster warehouses and scaling policies; create and manage application-specific warehouses for efficient compute consumption.
• Generate, manage, and analyze cost/credit consumption reports and identify cost-optimization areas.
• PII data management using dynamic data masking and secure views; manage account-level security using network policies.
• Secure data sharing using managed accounts: reader account provisioning and management of resource monitors, credit quotas, and IP whitelisting.
• In-depth knowledge of micro-partitions, caching, and table clustering; recommend cluster keys; analyze SQL using the query profile, identify gaps, patterns, and trends, and make recommendations to the BI team based on the analysis.
• Working experience writing procedures (PL/SQL, Python) and UDFs, and calling them from Unix shell scripts using SnowSQL to automate day-to-day DBA routine work.
• Experience with the Snowflake utilities SnowSQL and Snowpipe.
• Monitor and coordinate all data replication system operations, including security procedures, and liaise with the infrastructure, security, DevOps, data platform, and application teams.
• Ensure that necessary system backups are performed, and that storage and rotation of backups is accomplished.
• Manage and maintain the Snowflake environment, including provisioning resources, configuring databases, and monitoring performance to ensure optimal functionality with Qlik.
• Implement strategies to optimize the performance of Snowflake databases, including warehouse tuning, clustering, and data partitioning.
• Monitor Snowflake resource utilization and plan for capacity expansion to accommodate future growth and workload demands.
• Identify opportunities for cost optimization within the Snowflake environment, such as optimizing resource utilization, implementing auto-scaling policies, and leveraging cost-effective storage options.

Nothing in this job description restricts management's right to assign or reassign duties and responsibilities to this job at any time.

Required Knowledge, Skills, Abilities, and Experience:
• Undergraduate degree with 5 to 7 years of experience with a relational database platform; a minimum of 16 years of formal education is preferred.
• Minimum 2+ years of experience with Snowflake administration.
• Advanced knowledge of relational database management systems and principles, and database security.
• Advanced proficiency with SQL and SQL tuning.
• Solid knowledge of scripting and stored procedures.
• Demonstrated ability to acquire knowledge of new database technologies on the job; self-reliant and self-driven.
• Strong analytical and problem-solving skills, with meticulous attention to detail.
• Ability to meet goals and deadlines, quickly diagnose issues, and come up with solutions or workarounds.
• Ability to work independently or collaboratively and maintain a positive attitude.
• Strong verbal and written communication skills; honesty, integrity, and professionalism; self-control under trying or tough circumstances or pressure.
• Availability for 24x7 on-call on a rotation basis.

Nice to have: any replication technology (Qlik, Streams, etc.); Oracle or MongoDB experience.
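
Two of the administration duties above, credit-quota management and PII masking, look roughly like this in practice. A minimal sketch with hypothetical monitor, warehouse, policy, table, and role names:

    -- Resource monitor: notify at 80% of the monthly quota, suspend at 100%.
    CREATE OR REPLACE RESOURCE MONITOR monthly_quota
      WITH CREDIT_QUOTA = 500
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_quota;

    -- Dynamic data masking: only a privileged role sees full email addresses.
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val
           ELSE REGEXP_REPLACE(val, '.+@', '*****@')
      END;

    ALTER TABLE crm.contacts MODIFY COLUMN email SET MASKING POLICY email_mask;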

Posted 2 months ago

Apply

6 - 10 years

0 - 2 Lacs

Chennai, Delhi NCR, Bengaluru

Hybrid


Looking for a Data Engineer at Wipro.
Experience: 6+ years relevant
Location: PAN India
Mode: Hybrid
Notice period: immediate to 30 days only
Interested candidates can share their CV with Varalakshmi V at varalakshmiv@itsourceglobal.com

Posted 2 months ago

Apply

7 - 10 years

15 - 25 Lacs

Allahabad

Work from Office


As a Senior Data Engineer (Snowflake), you will work to solve some of the most complex and captivating data management problems that enable a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and data engineer as demanded by each project to define, design, and deliver actionable insights.

Typically, you might:
• Create solutions utilizing Snowflake's native capabilities and put cloud architectures and data platforms into practice.
• Work as a team to address complex issues; provide strategic planning, thought leadership, and direction for the Snowflake practice.
• Design and implement effective analytics solutions and models with Snowflake.
• Examine and identify data warehouse structural necessities by evaluating business requirements.
• Assess data warehouse implementation procedures to ensure they comply with internal and external regulations.
• Create detailed data warehouse design and architecture reports for management and executive teams, and make recommendations for improving new and existing data warehouse solutions.
• Educate staff members through training and individual support, and help by responding quickly to system problems.
• Understand and document data flows in and between different systems/applications, and guide developers in preparing functional/technical specs to define reporting requirements and the ETL process.

What do we expect? Skills that we'd love:
• 7-10 years of experience designing and implementing a full-scale data warehouse solution by building productionized data ingestion and processing pipelines.
• Adept understanding of public cloud architectures and good Snowflake implementation experience in DW/BI/analytics use cases.
• As a senior or lead developer on analytics solutions, real-world experience completing two Snowflake projects from start to finish.
• Enjoys providing technical data solutions for Snowflake business problems and helping with data migration from on-premises or another cloud database/data warehouse to Snowflake.
• Comprehensive knowledge of the delivery methodology, leading teams in implementing the solution according to the design/architecture.
• Profound knowledge of migration, DevOps, ETL/ELT, and BI is required; the ability to work with Snowflake design patterns and migration architectures is a major plus.
• Comprehensive knowledge of Snowflake capabilities such as Snowpipe and Streams, of Snowflake architecture, and of Snowflake roles and user security.
• Enthusiasm for integrating Snowflake with DBT and other modern data stack tools, as well as a strong understanding of Python for framework design.
• Takes accountability for individual and team deliverables.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
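
Snowpipe, one of the capabilities named above, continuously loads files as they arrive in a stage. A minimal sketch, assuming a stage named landing_stage already exists and that the bucket's event notifications are wired to the pipe's queue (SHOW PIPES exposes the notification channel):

    -- Hypothetical pipe: auto-load new JSON files from the stage.
    CREATE OR REPLACE PIPE clickstream_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO analytics.raw.clickstream
      FROM @landing_stage/clickstream/
      FILE_FORMAT = (TYPE = JSON);

    -- Inspect the pipe's state and backlog when troubleshooting loads.
    SELECT SYSTEM$PIPE_STATUS('clickstream_pipe');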

Posted 2 months ago

Apply

16 - 26 years

40 - 60 Lacs

Ghaziabad

Work from Office


On a typical day, you might:
• Develop and maintain the data layer (DM or DWH) for the aggregation, validation, and presentation of data for the reporting layer.
• Develop, or coach team members to develop, reports, dashboards, and related database views/stored procedures.
• Evangelize self-service BI and visual discovery.
• Lead multiple projects and provide solutions.
• Design and promote best BI practices by championing data quality, integrity, and reliability.
• Coach BI team members as a technology SME, and provide on-the-job training for new or less experienced team members.
• Partner with the sales and presales teams to build BI-related solutions and proposals.
• Ideate end-to-end architectures for a given problem statement.
• Interact and collaborate with multiple teams (Data Science, Consulting & Engineering) and various stakeholders to meet deadlines and bring analytical solutions to life.

What do we expect? Skills that we'd love:
• 15+ years of experience delivering comprehensive BI solutions (Power BI, Tableau, and Qlik), including 5+ years of experience in Power BI.
• Real-world experience working with OLAP and OLTP database models (dimensional models).
• Extensive knowledge and experience in Power BI capacity management, license recommendations, performance optimization, and end-to-end BI (enterprise BI and self-service BI).
• Real-world experience in migration projects.
• Comprehensive knowledge of SQL and modern DW environments (Synapse, Snowflake, Redshift, BigQuery, etc.).
• Effortless with relational database concepts, flat-file processing concepts, and the software development lifecycle.
• Adept understanding of any of the cloud services (Azure, AWS, or GCP) is preferred.
• Enthusiasm for collaborating with various stakeholders across the organization and taking complete ownership of deliverables.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Posted 2 months ago

Apply

10 - 18 years

30 - 35 Lacs

Coimbatore

Work from Office


We are looking for people who have experience in digital implementations on cloud platforms, leading architecture design and discussions.
• ETL SME; SQL, Snowflake, and data engineering skills.
• Alert monitoring, scheduling, and auditing knowledge.
• Nice to have: experience with agile, working in compliance-regulated environments, and exposure to manufacturing IIoT data.
• 8-10 years of relevant experience.

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Hyderabad

Hybrid


We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Please note: candidates who have interviewed with Nagarro within the last 6 months are not eligible to reapply. However, we would greatly appreciate any referrals you may have.

REQUIREMENTS:
• Total experience of 5+ years.
• Experience in data engineering and database management.
• Expert knowledge of PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP).
• Experience with the Snowflake Data Warehouse and strong SQL programming skills.
• Deep understanding of stored procedures, performance optimization, and handling large-scale data.
• Knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
• Strong understanding of index design and performance-tuning techniques.
• Familiarity with SQL security techniques, including data encryption, Transparent Data Encryption (TDE), signed stored procedures, and user permission assignments.
• Competence in data preparation and ETL tools to build and maintain data pipelines and flows.
• Experience in data integration by mapping various source platforms into Entity Relationship Models (ERMs).
• Exposure to source control systems like Git and Azure DevOps.
• Expertise in Python and machine learning (ML) model development.
• Experience with automated testing and test coverage tools.
• Hands-on experience with CI/CD automation tools.
• Programming experience in Golang.
• Understanding of Agile methodologies (Scrum, Kanban).
• Ability to collaborate with stakeholders across Executive, Product, Data, and Design teams.

RESPONSIBILITIES:
• Design and maintain an optimal data pipeline architecture.
• Assemble large, complex data sets to meet functional and non-functional business requirements.
• Develop pipelines for data extraction, transformation, and loading (ETL) using SQL and cloud database technologies.
• Prepare and optimize ML models to improve business insights.
• Support stakeholders by resolving data-related technical issues and enhancing data infrastructure.
• Ensure data security across multiple data centers and regions, maintaining compliance with national and international data laws.
• Collaborate with data and analytics teams to enhance data systems functionality.
• Conduct exploratory data analysis to support database and dashboard development.

Posted 2 months ago

Apply

2 - 4 years

3 - 6 Lacs

Bengaluru

Work from Office


Description / Position Overview: We are looking for an experienced data automation professional to work as the QA automation analyst in one of our leading squads and drive the core automation activities.
• Create end-to-end automation scripts covering all the touchpoints in the SIT/UAT environments as part of system and integration testing.
• Experience crafting an automation test strategy across the application for the squad's functionalities.
• Understand design patterns; write reusable object-oriented code, managing common frameworks across multiple teams.
• Work with business and dev partners, bridging the gap between functional and automation delivery.
• Lead and drive from the technical front, representing the squad's QA, and make sure that the squad's automation deliverables are met sprint to sprint.
• Participate in QA decision-making processes and share valuable inputs with team members.
• Provide recommendations on existing QA standards and methodologies.
• Store and maintain the regression repository.
• Extensive knowledge of build management and CI/CD tools such as Gradle, Maven, and TeamCity.
• Exposure to databases for data identification and creation.

Skill set (primary skills):
• 2-4 years of QA experience.
• Functional testing: 3 years minimum.
• Test automation experience in any scripting language.
• DB testing experience and strong experience in SQL: 1 year minimum.
• Able to test data ingestion/curation pipelines (Teradata to Snowflake, file to Snowflake).
• Able to reconcile data across the data platforms as part of testing by running SQL scripts, manual or automated (a sketch of such a reconciliation query follows below).
• Familiar with the Snowflake ecosystem (hands-on with writing SQL in the Snowflake query interface at least).
• Familiar with test artifacts (test strategy, test cases, defect logs, status reports).
• Familiar with the UNIX file system and file-manipulation commands.
• Familiar with DevOps processes from a testing perspective.
• Familiar with data warehouse and database concepts.
• Able to write non-functional and negative test cases.
• Familiar with testing batch jobs in an enterprise data warehouse setup.

Additional details:
Global Grade: B
Remote work possibility: Yes
Global Role Family: 60236 (P) Software Engineering
Local Role Name: 6504 Developer / Software Engineer
Local Skills: 5151 ETL Testing
Languages required: English
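
As a rough illustration of the reconciliation testing mentioned above, the query below compares a row count and a column checksum between a staged legacy copy and its curated Snowflake target. All table and column names are hypothetical.

    WITH src AS (
      SELECT COUNT(*) AS row_cnt, SUM(HASH(order_id, amount)) AS chk
      FROM legacy_copy.orders      -- data exported from Teradata, staged in Snowflake
    ),
    tgt AS (
      SELECT COUNT(*) AS row_cnt, SUM(HASH(order_id, amount)) AS chk
      FROM analytics.curated.orders
    )
    SELECT s.row_cnt = t.row_cnt AND s.chk = t.chk AS reconciled,
           s.row_cnt AS src_rows,
           t.row_cnt AS tgt_rows
    FROM src s CROSS JOIN tgt t;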

Posted 2 months ago

Apply

5 - 7 years

7 - 9 Lacs

Hyderabad

Remote


Role: Business Analyst
Location: Remote
• 5+ years of commercial analytics experience in the pharma/healthcare industry (must have).
• Excellent communication skills.
• Strong stakeholder and project management skills.
• Good proficiency in SQL (must have).
• Working knowledge of Snowflake (good to have).
• Knowledge of at least one BI tool; MicroStrategy preferred.
• Should have worked on commercial and call activity data.
• Exposure to pharma datasets from IMS, IQVIA, or other similar vendors.

Posted 2 months ago

Apply

4 - 5 years

6 - 7 Lacs

Hyderabad

Work from Office


Mid-level Snowflake engineers with 4-5 years of experience in data warehousing and 1-2 years in Snowflake. Good academic background, good communication skills, strength in SQL and Python, and good data warehousing and data modeling experience.

Posted 2 months ago

Apply

2 - 4 years

4 - 6 Lacs

Pune

Work from Office


Skills: SQL, SSIS

We are looking for a skilled SQL Developer who will be responsible for designing, developing, and optimizing SQL queries and database models. The ideal candidate should have a strong understanding of data integration, ETL processes, and data warehousing concepts. Experience with data visualization tools and knowledge of Snowflake would be a great addition.

Key Responsibilities:
• Develop and optimize complex SQL queries to ensure efficient data retrieval and processing.
• Design and implement data models and database schemas to meet business requirements.
• Perform data integration and ETL (Extract, Transform, Load) processes to ensure data accuracy and consistency across systems.
• Collaborate with stakeholders, including data analysts and business users, to understand and meet data-related requirements.
• Troubleshoot and resolve any data-related issues, ensuring high levels of performance and reliability.

Key Skills:
• Strong proficiency in SQL, with the ability to write optimized and complex queries.
• Experience with data modelling and implementing database schemas.
• Knowledge of ETL processes and tools for data extraction, transformation, and loading.
• Familiarity with data warehousing concepts and best practices.
• Knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus.
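
Since the listing pairs SQL query optimization with optional Snowflake knowledge, here is a small Snowflake-flavored sketch of checking and improving partition pruning on a large table. The table and column names are assumptions for illustration.

    -- How well is the table clustered on the common filter column?
    SELECT SYSTEM$CLUSTERING_INFORMATION('sales.orders', '(order_date)');

    -- Define a clustering key so frequent date filters prune micro-partitions.
    ALTER TABLE sales.orders CLUSTER BY (order_date);

    -- EXPLAIN shows the query plan (including pruning) without executing it.
    EXPLAIN
    SELECT SUM(amount) FROM sales.orders WHERE order_date >= '2024-01-01';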

Posted 2 months ago

Apply

10 - 14 years

32 - 40 Lacs

Hyderabad

Hybrid


We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000 experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Please note: candidates who have interviewed with Nagarro within the last 6 months are not eligible to reapply. However, we would greatly appreciate any referrals you may have.

REQUIREMENTS:
• Total experience of 10+ years.
• Experience in data engineering and database management.
• Expert knowledge of PostgreSQL (preferably cloud-hosted on AWS, Azure, or GCP).
• Experience with the Snowflake Data Warehouse and strong SQL programming skills.
• Deep understanding of stored procedures, performance optimization, and handling large-scale data.
• Knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
• Strong understanding of index design and performance-tuning techniques.
• Familiarity with SQL security techniques, including data encryption, Transparent Data Encryption (TDE), signed stored procedures, and user permission assignments.
• Competence in data preparation and ETL tools to build and maintain data pipelines and flows.
• Experience in data integration by mapping various source platforms into Entity Relationship Models (ERMs).
• Exposure to source control systems like Git and Azure DevOps.
• Expertise in Python and machine learning (ML) model development.
• Experience with automated testing and test coverage tools.
• Hands-on experience with CI/CD automation tools.
• Programming experience in Golang.
• Understanding of Agile methodologies (Scrum, Kanban).
• Ability to collaborate with stakeholders across Executive, Product, Data, and Design teams.

RESPONSIBILITIES:
• Design and maintain an optimal data pipeline architecture.
• Assemble large, complex data sets to meet functional and non-functional business requirements.
• Develop pipelines for data extraction, transformation, and loading (ETL) using SQL and cloud database technologies.
• Prepare and optimize ML models to improve business insights.
• Support stakeholders by resolving data-related technical issues and enhancing data infrastructure.
• Ensure data security across multiple data centers and regions, maintaining compliance with national and international data laws.
• Collaborate with data and analytics teams to enhance data systems functionality.
• Conduct exploratory data analysis to support database and dashboard development.

Posted 2 months ago

Apply

2 - 7 years

6 - 16 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


Exciting Snowflake Developer job opportunity at Infosys! We are looking for skilled Snowflake developers to join our dynamic team PAN India. If you have a passion for technology and a minimum of 2 to 9 years of hands-on experience in Snowflake application development, this is your chance to make an impact. At Infosys, we value innovation, collaboration, and diversity. We believe that a diverse workforce drives creativity and fosters a richer company culture; therefore, we strongly encourage applications from all genders and backgrounds. Ready to take your career to the next level? Join us in shaping the future of technology. Visit our careers page for more details on how to apply.

Posted 2 months ago

Apply

6 - 8 years

5 - 9 Lacs

Mumbai

Work from Office


• Minimum of 6 years of experience in Talend + Snowflake.
• Capacity to work with onsite resources.
• Ability to work in one-team mode (part of a team, with clients remote, in a multinational environment).
• Ability to work on multiple simultaneous projects.
• Clear communication with all stakeholders.
• Good experience with the Talend suite.
• Good experience with Snowflake stored procedures.
• Very good experience in SQL (Snowflake desired).
• Good knowledge of data modelling and data transformation (business layer).

Posted 2 months ago

Apply

6 - 11 years

15 - 25 Lacs

Noida

Work from Office


SUMMARY
This is a remote position.

About tsworks: tsworks is a leading technology innovator, providing transformative products and services designed for the digital-first world. Our mission is to provide domain expertise, innovative solutions, and thought leadership to drive exceptional user and customer experiences. Demonstrating this commitment, we have a proven track record of championing digital transformation for industries such as Banking, Travel and Hospitality, and Retail (including e-commerce and omnichannel), as well as Distribution and Supply Chain, delivering impactful solutions that drive efficiency and growth. We take pride in fostering a workplace where your skills, ideas, and attitude shape meaningful customer engagements.

About the Role: As a Presales Consultant specializing in Cloud Solutions at tsworks, you will work closely with our clients as a consulting professional, designing, building, and implementing cloud-based initiatives that enhance their business performance. You will be a key driver in translating business needs into technical cloud solutions, leveraging our expertise across Snowflake, AWS, and Azure. This role requires a strong blend of technical acumen, business understanding, and excellent communication skills.

Key Responsibilities:
• Requirements gathering & solution design: Gather business requirements related to client cloud initiatives, design processes, and define implementation strategies for leading cloud platforms (Snowflake, AWS, Azure). Develop business cases for cloud adoption, migration, and optimization projects. Conduct fit-gap analysis and map platform capabilities to business requirements.
• Client collaboration: Collaborate with clients to understand their business goals, target audience, and cloud objectives. Identify opportunities for leveraging cloud platforms to improve efficiency and effectiveness and achieve desired business outcomes.
• Cloud expertise & solutioning: Demonstrate a deep understanding of cloud computing concepts, architectures, and best practices, with a focus on Snowflake, AWS, and Azure. Develop tailored cloud-based solutions, including technical proposals, presentations, diagrams, solution architectures, and cost estimates.
• Value proposition & ROI: Clearly articulate the value and ROI of our cloud solutions to both technical and business stakeholders, focusing on business outcomes, cost savings, performance improvements, and security enhancements.
• Competitive analysis: Analyze the competitive landscape for cloud solutions and effectively position our offerings against alternatives, highlighting our unique differentiators, platform expertise, and value-added services.
• Presentations & communication: Deliver compelling presentations to both technical and business audiences, explaining complex cloud concepts and solutions in a clear and concise manner. Create high-quality proposals, technical documentation, SOWs, and responses to RFPs/RFIs.
• Collaboration & teamwork: Work closely with the sales team to qualify leads, develop proposals, and support the cloud sales cycle. Provide technical input to marketing materials, webinars, and content creation focused on cloud solutions. Liaise with the engineering/delivery team to ensure seamless project handoffs and successful cloud implementations.
• Market awareness: Monitor trends, analyze data, and stay up to date on the latest cloud services, features, and trends within Snowflake, AWS, and Azure.

Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
• Minimum 6+ years of experience in a presales or technical consulting role within the IT services industry, with a strong focus on cloud solutions.
• At least 3+ years of hands-on functional consulting experience with Snowflake, AWS, or Azure implementations.
• Deep understanding of and hands-on experience with at least two of the major cloud platforms (Snowflake, AWS, or Azure); experience with all three is highly desirable.
• Strong understanding of cloud architectures, infrastructure as code, cloud security best practices, and cloud cost-optimization strategies.
• Experience developing and delivering compelling technical and business presentations focused on cloud solutions.
• Excellent written and verbal communication, presentation, and interpersonal skills.
• Strong analytical, problem-solving, and cloud solution design skills.
• Understanding of business drivers and financial metrics, and of how cloud technology can solve business problems.
• Experience working with CRM systems (e.g., Salesforce) is a plus.
• Industry knowledge in [mention specific industries you target, e.g., finance, healthcare, retail].
• Certifications in relevant cloud platforms (Snowflake, AWS, Azure) are highly desirable.

To Apply: Please submit your resume, cover letter, and completed assignments to mohan.kumar@tsworks.io

Assignments:
(a) Cloud migration strategy & planning. Scenario: A financial services company is looking to migrate their core banking applications to the cloud. They have a complex legacy system and are concerned about minimizing downtime and ensuring data integrity during migration. Assignment: Outline a detailed migration strategy for this company, including (i) a phased approach to migration, (ii) key considerations for data migration and synchronization, (iii) strategies for minimizing downtime during the migration process, (iv) a testing and validation plan, (v) risk assessment and mitigation strategies, and (vi) a post-migration support and optimization plan.
(b) Cloud cost optimization & TCO analysis. Scenario: A startup has migrated their applications to the cloud but is experiencing unexpectedly high cloud costs and security concerns. They need to optimize their cloud spending without compromising performance. Assignment: Outline a strategy to analyze the cloud cost report and identify areas for cost optimization. Then develop a plan to reduce their cloud spending by 20% while maintaining or improving performance. Justify your recommendations.

Posted 2 months ago

Apply

10 - 16 years

20 - 35 Lacs

Pune, Bengaluru, Mohali

Hybrid


Designation: Lead Data Engineer
Experience: 10 to 16 years
Location: Bengaluru/Mohali/Gurugram/Pune (hybrid model)
Notice Period: up to 60 days
Mandatory skills: Snowflake (3+ years), Python (5+ years), Matillion (2+ years), SQL (5+ years), Tableau (3+ years)

Posted 2 months ago

Apply

6 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office


Your Job
The supervisor will be a part of a global team creating new solutions as well as improving existing solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Services India (KGSI) is being developed in India to extend its IT operations, as well as act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out KGSI over the next several years. Working closely with global colleagues will provide significant global exposure to the employees.

Our Team
The supervisor will be responsible for providing leadership to create valuable solutions for KGS and its customers, measuring outcomes and directing the applicable action. The delivery lead will be responsible for application solutions delivered from KGSI and will manage data engineers located there; the supervisor will also be responsible for the performance and development of talent in the KGSI data engineering team. The data team will work closely with their global counterparts on enterprise-wide delivery.

What You Will Do
People:
• Develop people in the team, helping them self-actualize and grow; lead by example in day-to-day work.
• Gather meaningful and timely feedback and coach employees.
• Apply Principle Based Management, RREs, contribution motivation, and connection to the vision (GP, KGSI, Koch).
• Understand pain points and career aspirations so you can help build plans/strategies to address them, and facilitate acquiring training.
• Ensure the compensation of each employee follows Koch's compensation philosophy; facilitate communication with HR based on the needs of the employees.
• Keep the DnA leader and functional leads updated on any market trends in India and foreseeable risks, e.g., attrition.
• Hire and retain the right talent, internal and consultants.

Delivery:
• Ensure timely delivery every time; proactively discuss delays and think of alternate solutions.
• Ensure appropriate delivery processes are followed and updates are sent.
• Pick up individual tasks on a need basis; these may have to be completed individually or with help from the team.

Knowledge sharing:
• Keep abreast of all the experiments/POCs happening across teams and share the learnings and applicability with the GP team (AWS).
• Identify best practices across different DnA groups and share them with GP (e.g., reusable components related to data engineering, cloud security best practices, etc.).
• Proactively provide insights/challenges about opportunities for innovation and synergy among the DnA sub-teams.
• Encourage the tech lead in each group to share learnings from other areas of KGSI and stay connected to the GP environment.

Who You Are (Basic Qualifications)
• Bachelor's/Master's degree in computer science/information technology with 12+ years of IT experience, including leading integration teams.
• Minimum 6+ years of data engineering experience with the skill set below.
• Deep experience with data engineering/ETL projects along with AWS cloud migration.
• At least 3+ years of experience leading and managing data engineers.
• Experience defining and executing appropriate operating and delivery models.
• Experience onboarding legacy applications to the AWS cloud based on standard checks, including vulnerability.
• Good experience developing people and managing their performance and aspirations.
• Strong customer focus, communication, collaboration, and problem-solving skills; capable of owning the delivery and getting the outcome delivered.
• Excellent verbal and written communication.
• Experience with AWS Lambda, Glue or any ETL, Snowflake, AWS S3, Redshift, Python, and SQL/PLSQL.
• Secondary skills: Spark scripting, Microsoft, GitLab, AWS Redshift, and BI (Power BI, Tableau, or any BI tool).

What Will Put You Ahead
• Good understanding of manufacturing business and services.
• Exposure to Scrum Master and project management metrics.

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office


Your Job
The Data Engineer will be part of an international team that designs, develops, and delivers new applications for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Services (KGS) is being developed in India as a shared-service operation, as well as a hub for innovation across functions. As KGS rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out KGS over the next several years. Working closely with global colleagues will provide significant international exposure to the employees.

Our Team
We are seeking a Data Engineer expert to join the KGS Analytics capability. We love passionate, forward-thinking individuals who are driven to innovate. You will have the opportunity to engage with business analysts, analytics consultants, and internal customers to implement ideas, optimize existing dashboards, and create visualization products using powerful, contemporary tools. This opportunity engages diverse types of business applications and data sets at a rapid pace, and our ideal candidate gets excited when faced with a challenge.

What You Will Do
If a candidate is entrepreneurial in the way they approach ideas, Koch is among the most fulfilling organizations they could join. We are growing an analytics capability and looking for entrepreneurially minded innovators who can help us further develop this service of exceptionally high value to our business. Due to the diversity of companies and work within Koch, we are frequently working in new and interesting global business spaces, with data and analytics applications that are unique relative to opportunities from other employers in the marketplace.
• Work with business partners to understand key business drivers and use that knowledge to experiment with and transform business intelligence and advanced analytics solutions to capture the value of potential business opportunities.
• Translate a business process/problem into a conceptual and logical data model and a proposed technical implementation plan.
• Assist in developing and implementing consistent processes for data modeling, mining, and production.
• Focus on implementing development processes and tools that allow for the collection of and access to metadata, done in a way that allows for widespread code reuse (e.g., utilization of ETL frameworks, generic metadata-driven tools, shared data dimensions, etc.), enabling impact analysis as well as source-to-target tracking and reporting.
• Improve data pipeline reliability, scalability, and security.

Who You Are (Basic Qualifications)
• 5+ years of industry professional experience, or a bachelor's degree in MIS, CS, or an industry equivalent.
• At least 4 years of data engineering experience (preferably AWS), with strong knowledge of SQL and of developing, deploying, and modelling DWHs and data pipelines on the AWS cloud or similar cloud environments.
• 3+ years of experience with business and technical requirements analysis, elicitation, data modeling, verification, and methodology development, with a good hold on communicating complex technical ideas to technical and non-technical team members.
• Demonstrated experience with Snowflake and AWS Lambda with Python development for provisioning and troubleshooting.
• Demonstrated experience using Git-based source control management platforms (GitLab, GitHub, DevOps, etc.).

What Will Put You Ahead
• 3+ years of experience with the Amazon Web Services stack, including S3, Athena, Redshift, Glue, or Lambda.
• 3+ years of experience with cloud data warehousing solutions, including Snowflake, with development in and implementation of dimensional modeling.
• 2+ years of experience with data visualization and statistical tools like Power BI, Python, etc.
• Experience with Git and CI/CD pipelines.
• Development experience with Docker and a Kubernetes environment (would be a plus).

Posted 2 months ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features; a short SQL sketch of Time Travel and cloning follows this list. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
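
For the Time Travel and cloning questions above, here is a minimal sketch of the syntax. Table names are hypothetical, and the offsets must fall within the account's data-retention window.

    -- Query a table as it looked 30 minutes ago.
    SELECT COUNT(*) FROM orders AT(OFFSET => -60*30);

    -- Recover a previously dropped table from Time Travel.
    UNDROP TABLE orders_archive;

    -- Zero-copy clone: an instant copy sharing micro-partitions, e.g. for testing.
    CREATE TABLE orders_dev CLONE orders AT(OFFSET => -60*30);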

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
