1123 Snowflake Jobs - Page 28

JobPe aggregates listings for easy access; applications are completed directly on the original job portal.

6.0 - 11.0 years

20 - 25 Lacs

Noida, Mumbai

Work from Office

Responsibilities:
- Act as the data domain expert for Snowflake in a collaborative environment, applying a demonstrated understanding of data management best practices and patterns.
- Design and implement robust data architectures that meet and support business requirements, leveraging Snowflake platform capabilities.
- Develop and enforce data modeling standards and best practices for Snowflake environments.
- Develop, optimize, and maintain Snowflake data warehouses.
- Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
- Ensure data architecture solutions meet performance, security, and scalability requirements.
- Collaborate with cross-functional teams to gather business requirements, translate them into effective Snowflake data solutions, and provide data-driven insights.
- Stay current with the latest trends and advancements in data architecture and Snowflake technologies, continually enhancing our data capabilities.
- Provide mentorship and guidance to junior data engineers and architects.
- Troubleshoot and resolve data architecture-related issues effectively.

Skills Required:
- 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
- Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
- Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
- Ability to design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
- Experience developing stored procedures and writing queries to analyze and transform data.
- Working experience with ETL tools such as Fivetran, dbt Labs, and MuleSoft.
- Expertise in Snowflake concepts such as resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and time travel, including automating them.
- Must-have expertise in AWS, Azure, and the Salesforce Platform-as-a-Service (PaaS) model, including its integration with Snowflake to load/unload data.
- Relevant certifications (e.g., SnowPro Core / Advanced) are a must-have.
- Excellent problem-solving skills, attention to detail, and effective communication; an exceptional team player.

Educational Qualification Required:
- Master's degree in Business Management (MBA/PGDM) or Bachelor's degree in Computer Science, Information Technology, or a related field.
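The automation work this listing describes (resource monitors, zero-copy clone, time travel) boils down to generating and running SnowSQL statements. A minimal Python sketch of composing such statements is below; all object names are hypothetical, and in practice the strings would be executed via SnowSQL or a Snowflake connector rather than printed:

```python
# Sketch: composing the kinds of SnowSQL statements mentioned above.
# Object names (etl_monitor, sales.orders) are illustrative only.

def resource_monitor_ddl(name: str, credit_quota: int) -> str:
    """CREATE RESOURCE MONITOR with a monthly credit quota and a suspend trigger."""
    return (
        f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {credit_quota} "
        f"FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY "
        f"TRIGGERS ON 90 PERCENT DO SUSPEND;"
    )

def zero_copy_clone_ddl(source_table: str, clone_name: str) -> str:
    """Zero-copy clone: a new table sharing the source's micro-partitions."""
    return f"CREATE TABLE {clone_name} CLONE {source_table};"

def time_travel_query(table: str, minutes_ago: int) -> str:
    """Query a table as of a past point in time using the AT(OFFSET => ...) clause."""
    return f"SELECT * FROM {table} AT(OFFSET => -60 * {minutes_ago});"

if __name__ == "__main__":
    print(resource_monitor_ddl("etl_monitor", 100))
    print(zero_copy_clone_ddl("sales.orders", "sales.orders_dev"))
    print(time_travel_query("sales.orders", 30))
```

A monitoring job of the kind the listing mentions would typically wrap such statements in a scheduler (Airflow, cron) and alert on the results.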

Posted 2 weeks ago

Apply

4.0 - 8.0 years

12 - 20 Lacs

Noida, Bengaluru

Hybrid

Job Overview
We are looking for a highly capable and motivated Data Engineer to join our growing data team. The ideal candidate will design and implement scalable data pipelines, enable efficient migration of large data workloads into Snowflake, and integrate with various AWS services. This role requires deep knowledge of SQL, cloud data platforms, and modern data engineering practices.

Key Responsibilities:
- Design and implement robust, scalable, and secure data pipelines for ingesting, transforming, and storing data in Snowflake.
- Execute data migration strategies from on-prem or legacy systems (e.g., SQL Server, Oracle, Teradata) to Snowflake.
- Integrate Snowflake with AWS components such as S3, Glue, Lambda, and Step Functions.
- Automate data ingestion using Snowpipe, Streams, and Tasks.
- Write clean, efficient, and reusable SQL for transformations and data quality validations.
- Monitor and tune Snowflake performance, including warehouse usage and query optimization.
- Implement and enforce data governance, access control, and security best practices.
- Collaborate with data analysts, architects, and business stakeholders to define data requirements.
- Support development of data models (star and snowflake schemas) and metadata documentation.

Required Skills & Experience:
- 3+ years of experience in a data engineering role.
- Strong hands-on experience with Snowflake in production environments.
- Proficiency in SQL (complex joins, CTEs, window functions, performance tuning).
- Solid experience with AWS services: S3, Glue, Lambda, IAM, Step Functions.
- Proven experience in data migration projects.
- Familiarity with ETL/ELT processes and data orchestration tools (e.g., Airflow, dbt, Informatica, Matillion).
- Strong understanding of data warehousing, data modeling, and big data concepts.
- Knowledge of version control (Git) and CI/CD pipelines for data workflows.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- Snowflake SnowPro certification.
- AWS Certified Data Analytics or Solutions Architect certification.
- Experience with scripting languages (e.g., Python, Shell).
- Exposure to BI/visualization tools (e.g., Tableau, Power BI).
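The "data quality validations" this listing asks for usually start with a few simple checks: null counts on required columns and uniqueness of keys. A plain-Python sketch over sample rows is below; in practice these checks would run as SQL inside Snowflake, and the column and table names here are illustrative:

```python
# Sketch of basic data-quality checks, in plain Python over sample rows.
from collections import Counter

def null_count(rows, column):
    """Number of rows where the column is missing or NULL."""
    return sum(1 for r in rows if r.get(column) is None)

def duplicate_keys(rows, key):
    """Key values that appear more than once (uniqueness check)."""
    counts = Counter(r[key] for r in rows)
    return sorted(k for k, c in counts.items() if c > 1)

orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": None},
    {"order_id": 2, "amount": 75.5},
]

print(null_count(orders, "amount"))        # one NULL amount
print(duplicate_keys(orders, "order_id"))  # order_id 2 repeats
```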

Posted 2 weeks ago

Apply

4.0 - 9.0 years

18 - 27 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Skill set: Snowflake with AWS or Azure, plus Cortex AI and/or Horizon Catalog.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, with at least 3 years working with Snowflake.
- Proven experience with Snowflake and Cortex AI / Horizon Catalog, focusing on data extraction, chatbot development, and conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency with dbt's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, with the ability to track and document data transformations using dbt's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply them to dbt development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD Type 2.
- Proficiency with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with repository tools such as GitHub, GitLab, or Azure Repos.
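The SCD Type 2 requirement above has a simple core: when a tracked attribute changes, expire the current record and append a new current one. A hedged plain-Python sketch of that logic follows; the record layout and dates are illustrative, not a specific Snowflake or dbt implementation (where this would typically be a MERGE or a dbt snapshot):

```python
# Sketch of SCD Type 2 handling: close the current row on change, append a new one.

def apply_scd2(history, key, new_attrs, as_of):
    """Return history with the current row for `key` expired and a new
    current row appended if the attributes changed."""
    out = []
    changed = True
    for row in history:
        if row["key"] == key and row["is_current"]:
            if row["attrs"] == new_attrs:
                changed = False  # no change: keep the current row as-is
                out.append(row)
            else:
                out.append({**row, "is_current": False, "valid_to": as_of})
        else:
            out.append(row)
    if changed:
        out.append({"key": key, "attrs": new_attrs,
                    "valid_from": as_of, "valid_to": None, "is_current": True})
    return out

history = [{"key": "C1", "attrs": {"city": "Pune"},
            "valid_from": "2024-01-01", "valid_to": None, "is_current": True}]
history = apply_scd2(history, "C1", {"city": "Mumbai"}, "2024-06-01")
print([r["attrs"]["city"] for r in history if r["is_current"]])
```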

Posted 2 weeks ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

We are hiring a Senior Data Management Specialist (Level 3) for a US-based IT company in Hyderabad. Candidates with experience in data management and Snowflake can apply.

Job Title: Senior Data Management Specialist Level 3
Location: Hyderabad
Experience: 7+ years
CTC: 18-20 LPA
Working shift: Day shift

Description:
We are looking for an experienced and highly skilled Data Management Specialist (Level 3) to contribute to enterprise-level data solutions, with an emphasis on cloud data platforms and modern data engineering tools. The ideal candidate will have hands-on expertise with Snowflake, combined with a solid foundation in data integration, modeling, and cloud-based database technologies. This role is a key part of a high-impact data team dedicated to ensuring the quality, availability, and governance of enterprise data assets. As a Level 3 specialist, the individual will lead and execute complex data management tasks while collaborating closely with data architects, analysts, and business stakeholders.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and integrations using Snowflake and other cloud data technologies.
- Handle structured and unstructured data to support analytics, reporting, and operational workloads.
- Develop and optimize complex SQL queries and data transformation logic.
- Collaborate with data stewards and governance teams to uphold data quality, consistency, and compliance.
- Perform data profiling, cleansing, and validation across multiple source systems.
- Support ETL/ELT development and data migration initiatives using tools like Informatica, Talend, or dbt.
- Design and maintain data models, including star and snowflake schemas.
- Ensure performance tuning, monitoring, and troubleshooting of Snowflake environments.
- Document data processes, data lineage, and metadata within the data governance framework.
- Act as a technical SME, offering guidance and support to junior team members.

Required Skills & Qualifications:
- Minimum 5 years of experience in data engineering, data management, or similar roles.
- Strong hands-on experience with Snowflake (development, administration, performance optimization).
- Proficiency in SQL, data modeling, and cloud-native data architectures.
- Experience on cloud platforms such as AWS, Azure, or Google Cloud (with Snowflake).
- Familiarity with ETL tools like Informatica, Talend, or dbt.
- Solid understanding of data governance, metadata management, and data quality best practices.
- Experience with Python or shell scripting for automation and data operations.
- Strong analytical and problem-solving abilities.
- Excellent communication and documentation skills.

For further assistance, contact/WhatsApp 9354909512 or write to pankhuri@gist.org.in
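The star and snowflake schemas this listing mentions pair a fact table with dimension lookups on surrogate keys. A small plain-Python sketch of that join-and-aggregate pattern is below (the SQL equivalent is a JOIN plus GROUP BY); table contents and key names are invented for illustration:

```python
# Sketch of a star-schema aggregation: fact rows joined to a dimension
# on a surrogate key, then grouped by a dimension attribute.

dim_product = {
    101: {"name": "Widget", "category": "Hardware"},
    102: {"name": "Gadget", "category": "Electronics"},
}

fact_sales = [
    {"product_key": 101, "qty": 3, "revenue": 30.0},
    {"product_key": 102, "qty": 1, "revenue": 99.0},
    {"product_key": 101, "qty": 2, "revenue": 20.0},
]

def revenue_by_category(facts, dim):
    """Aggregate fact revenue by the joined dimension attribute (GROUP BY in SQL)."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["revenue"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
```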

Posted 2 weeks ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Pune

Remote

At Codvo, software and people transformations go together. We are a global, empathy-led technology services company with a core DNA of product innovation and mature software engineering. We uphold the values of Respect, Fairness, Growth, Agility, and Inclusiveness in everything we do.

About the Role:
We are looking for a Data & BI Solution Architect to lead data analytics initiatives in the retail domain. The candidate should be skilled in data modeling, ETL, visualization, and big data technologies.

Responsibilities:
- Architect end-to-end data and BI solutions for retail analytics.
- Define data governance, security, and compliance frameworks.
- Work with stakeholders to design dashboards and reports for business insights.
- Implement data pipelines and integrate with cloud platforms.

Skills Required:
- Proficiency in SQL, Python, and Spark.
- Experience with ETL tools (Informatica, Talend, AWS Glue).
- Knowledge of Power BI, Tableau, and Looker.
- Hands-on experience with cloud data platforms (Snowflake, Redshift, BigQuery).

Posted 2 weeks ago

Apply

3.0 - 5.0 years

15 - 22 Lacs

Gurugram, Bengaluru

Work from Office

Exciting opportunity for a Senior Data Engineer to join a leading analytics-driven environment. You will work on data warehousing, visualizations, and collaborative requirement gathering to deliver impactful business insights.

Location: Gurgaon/Bangalore
Shift timing: 12:00 PM to 9:30 PM

Your Future Employer: A high-growth organization known for delivering cutting-edge analytics and data engineering solutions, with a people-first environment focused on innovation, collaboration, and continuous learning.

Responsibilities:
- Building and refining data pipelines, transformations, and curated views.
- Cleansing data to enable full analytics and reporting capabilities.
- Collaborating with cross-functional teams to gather and document data requirements.
- Developing dashboards and reports using Tableau or Sigma.
- Supporting sprint-based delivery with strong stakeholder interaction.
- Working with ERP data analytics and financial data sets.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2-5 years of experience as a Data Engineer (SQL, Oracle).
- Hands-on experience with Snowflake, dbt, SQL, and stored procedures.
- Experience with visualization tools like Tableau or Sigma.
- Proficiency in Agile methodology and tools like JIRA and Confluence.
- Excellent communication, documentation, and client interaction skills.

What's in it for you:
- Competitive compensation with performance-based rewards.
- Opportunity to work on advanced data platforms and visualization tools.
- Exposure to global stakeholders and cutting-edge analytics use cases.
- Supportive, inclusive, and growth-focused work culture.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (programming language), AWS Glue, AWS Lambda administration
Minimum 5 year(s) of experience is required
Educational Qualification: Graduate

Summary: As a Snowflake Data Warehouse Architect, you will lead the implementation of Infrastructure Services projects, leveraging our global delivery capability. Your typical day will involve working with Snowflake Data Warehouse, AWS Glue, AWS Lambda administration, and the Python programming language.

Roles & Responsibilities:
- Lead the design and implementation of Snowflake Data Warehouse solutions for Infrastructure Services projects.
- Collaborate with cross-functional teams to ensure successful delivery of projects, leveraging AWS Glue and AWS Lambda administration.
- Provide technical guidance and mentorship to junior team members.
- Stay updated with the latest advancements in Snowflake Data Warehouse and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have: strong experience in Snowflake Data Warehouse.
- Good-to-have: proficiency in the Python programming language, AWS Glue, and AWS Lambda administration.
- Experience leading the design and implementation of Snowflake Data Warehouse solutions.
- Strong understanding of data architecture principles and best practices.
- Experience in data modeling, data integration, and data warehousing.
- Experience in performance tuning and optimization of Snowflake Data Warehouse solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualifications: Graduate

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years or more of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
- Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure the quality of deliverables by conducting thorough testing and debugging of applications.

Professional & Technical Skills:
- Must-have: proficiency in Google BigQuery.
- Good-to-have: experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake.
- Strong understanding of SQL and database design principles.
- Experience with ETL tools and processes.
- Experience with programming languages such as Python or Java.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Qualifications: 15 years or more of full-time education

Posted 3 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Karnataka

Work from Office

Develops and manages Oracle data solutions, integrating warehouse management, OBIEE, and ODI for business intelligence.

Posted 3 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Karnataka

Work from Office

Focus on designing, developing, and maintaining Snowflake data environments. Responsible for data modeling, ETL pipelines, and query optimization to ensure efficient and secure data processing.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

The Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake domain.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

5+ years of experience in IICS, ETL methodology, Informatica Cloud, and PowerCenter. Hands-on with Informatica Intelligent Cloud Services. Strong database skills (Oracle, SQL). Business analysis, data modeling, and complex SQL. Nice to have: Snowflake, AI/ML.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

The Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake domain.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

We are looking for a highly skilled and forward-thinking AI Engineer with deep expertise in Snowflake's Cortex AI, Maya from Mendix, and strong working knowledge of large language models (LLMs). The ideal candidate has experience developing intelligent AI agents, integrating low-code AI capabilities, and building traditional machine learning solutions. This role offers the opportunity to work at the cutting edge of AI product development and enterprise AI integration.

Key Responsibilities:
- Design, develop, and deploy AI agents using both LLMs and traditional ML approaches.
- Leverage Cortex AI in Snowflake to build and integrate scalable AI/ML models into data pipelines and applications.
- Utilize Maya (Mendix AI) to embed AI features into low-code enterprise applications.
- Evaluate and integrate major LLMs (e.g., OpenAI GPT, Claude, Google Gemini, Mistral) for use cases including text summarization, classification, code generation, and chat interfaces.
- Fine-tune and customize foundation models as needed for enterprise tasks.
- Work closely with data engineering and software teams to operationalize AI models in production.
- Monitor model performance and implement improvements and retraining strategies.
- Ensure AI solutions meet security, compliance, and ethical standards.
- Stay current with the latest advancements in AI, ML, and LLM ecosystems.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
- 2+ years of experience in AI/ML development, with hands-on exposure to deploying production-grade models.
- Hands-on experience with Cortex AI in Snowflake, including model training and inference.
- Proficiency in Maya from Mendix or similar AI-assisted low-code platforms.
- Strong experience with Python, SQL, and AI/ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
- Experience working with and integrating APIs of LLM providers like OpenAI, Hugging Face, or Cohere.
- Proven ability to design and implement AI agents or copilots using tools like LangChain, Semantic Kernel, or RAG pipelines.
- Familiarity with MLOps principles, CI/CD pipelines, and cloud-based ML (AWS, Azure, or GCP).

Preferred Qualifications:
- Experience in data-centric AI and vector search (e.g., Pinecone, FAISS, Weaviate).
- Experience in prompt engineering and LLM evaluation.
- Knowledge of security and governance best practices for enterprise AI.
- Contributions to AI open-source projects or publications in the field.
- Certifications in Snowflake, Mendix, or cloud-based AI platforms.
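The retrieval step at the heart of the RAG pipelines and vector search mentioned above is just a nearest-neighbor ranking by cosine similarity. A toy plain-Python sketch follows; real systems would use an embedding model and a vector store (Pinecone, FAISS, Weaviate), and the vectors and document names here are hand-made for illustration:

```python
# Sketch of RAG-style retrieval: rank documents by cosine similarity
# between a query embedding and stored document embeddings.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

documents = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "warranty terms": [0.2, 0.1, 0.9],
}

def retrieve(query_vec, docs, top_k=1):
    """Return the top_k document names ranked by similarity to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:top_k]

print(retrieve([0.85, 0.05, 0.1], documents))  # closest to "refund policy"
```

The retrieved documents would then be stuffed into the LLM prompt as context, which is the "augmented generation" half of RAG.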

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Delhi / NCR

Hybrid

- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
- Experience with dbt for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
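A staple of the SQL skills this listing asks for is deduplicating to the latest record per key, typically written as ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) with a filter to row 1. The same logic emulated in plain Python, with invented sample data:

```python
# Sketch: keep the latest row per key, emulating the SQL window-function
# pattern ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) = 1.

def latest_per_key(rows, key, ts):
    """Keep one row per key value: the one with the greatest timestamp."""
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[ts] > best[k][ts]:
            best[k] = row
    return sorted(best.values(), key=lambda r: r[key])

events = [
    {"user": "a", "ts": 1, "status": "new"},
    {"user": "a", "ts": 3, "status": "active"},
    {"user": "b", "ts": 2, "status": "new"},
]
print(latest_per_key(events, "user", "ts"))
```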

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

The Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake domain.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Hyderabad

Remote

Data Analyst
Experience: 4-10 years
Salary: up to USD 3,000/month
Preferred notice period: within 30 days
Shift: 4:30 PM to 1:30 AM IST
Opportunity type: Remote
Placement type: Contractual, full-time, 3 months
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: a BI tool (Looker etc.), product analytics (Google Analytics etc.), Snowflake, data analysis, SQL
Good-to-have skills: AI/LLMs, Zendesk

Oyster (one of Uplers' clients) is looking for a Data Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview
Data Analyst - Contract Hire
Location: Remote
Duration: 3 months contractual, month-to-month contract, 4-5 days per week

About Us
We're a data-driven organisation committed to turning raw information into actionable insights. Our analytics team partners with stakeholders across the business to inform strategy, optimise operations, and unlock new growth opportunities.

The Role
Analysis & Reporting:
- Perform exploratory and ad-hoc analyses to uncover trends, outliers, and opportunities.
- Design, build, and maintain dashboards and scheduled reports in our BI platform.

Stakeholder Engagement:
- Gather requirements, present findings, and translate data insights into clear, actionable recommendations.
- Collaborate with product, revenue, and operations teams to prioritise analytics work.
- Upskill and support stakeholders on analytical tools and data literacy.
- Work closely with data engineering teams for project support.

Presentation:
- Deliver clear, compelling presentations of your analyses to both technical and non-technical audiences.

Experience:
- 2+ years of experience in a data-analysis role (or similar), ideally working with product teams.
- Strong SQL skills for querying and transforming large datasets.
- Hands-on experience with a BI tool (Looker, Power BI, Tableau, Qlik, etc.).
- Experience with product analytics (Google Analytics, Pendo, Amplitude, etc.).
- Excellent presentation skills: able to prepare and deliver concise, effective reports and slide decks.

Education & Certifications:
- Degree or diploma in Data Science, Statistics, Computer Science, or a related field (preferred).
- Looker LookML certification (nice to have).
- Snowflake certifications (nice to have).

Nice-to-Have / Advantages:
- Experience supporting Snowflake Cortex or similar AI-driven data transformations.
- Working with APIs to ingest or expose data.
- Hands-on Python scripting to automate data-prep steps.
- Familiarity with AI/LLMs and embedding-oriented data pipelines.
- Experience working with Zendesk data.

Why You'll Love Working Here:
- Impact: your dashboards and analyses will directly influence strategic decisions.
- Collaboration: work alongside data engineers, data scientists, and cross-functional teams.
- Opportunity to develop advanced analytics and ML/AI skills.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of being shortlisted and meet the client for the interview!

About Our Client: Our mission is to create a more equal world by making it possible for companies everywhere to hire people anywhere. We believe it should be easy for any company to hire any person, no matter where they are located in the world.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 3 weeks ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

We are hiring a Senior Business Intelligence Analyst (Level 3) for a US-based IT company in Hyderabad. Candidates with 7 years of experience as a Business Intelligence Analyst can apply.

Job Title: Senior Business Intelligence Analyst Level 3
Location: Hyderabad
Experience: 7+ years
CTC: up to 20 LPA
Working shift: Day shift

Job Description:
We are looking for a seasoned and analytical Senior Business Intelligence (BI) Analyst to join our data and analytics team. In this Level 3 role, you will play a vital part in turning complex data into meaningful insights that support strategic business decisions. We are seeking a candidate with deep knowledge of WebFOCUS, solid BI development expertise, and a strong background in the financial services industry. The selected candidate will design, develop, and deliver BI solutions using tools like WebFOCUS, Tableau, and Alteryx, while ensuring data quality, performance, and usability.

Key Responsibilities:
- Lead the design, development, and enhancement of reports, dashboards, and ad hoc analyses using WebFOCUS.
- Serve as a subject matter expert on WebFOCUS and enterprise reporting practices to ensure performance and standardization.
- Partner with business stakeholders to gather reporting requirements and translate them into effective BI solutions.
- Build and maintain interactive dashboards using Tableau and Power BI, and streamline workflows using Alteryx.
- Develop complex SQL queries to extract, validate, and transform data from multiple sources.
- Conduct data analysis to identify trends, insights, and opportunities for business growth.
- Uphold data governance practices, ensure data accuracy, and maintain proper documentation across BI platforms.
- Mentor junior team members and contribute to continuous improvement of BI processes and standards.
- Support regulatory and compliance reporting within the banking and financial domain.

Required Skills & Qualifications:
- 5+ years of experience in business intelligence, data analytics, or related roles.
- Proven hands-on expertise with WebFOCUS (InfoAssist, App Studio, BI Portal).
- Strong understanding of financial services data, KPIs, and reporting methodologies.
- Proficiency in Tableau, Alteryx, and SQL; experience with Power BI is a plus.
- Excellent communication skills, with the ability to collaborate effectively with both technical and non-technical teams.
- Strong analytical mindset, attention to detail, and ability to present data-driven narratives.
- Experience working in Agile or Scrum environments.

Preferred Qualifications:
- Familiarity with data governance, metadata management, and data cataloging tools.
- Experience with cloud-based data platforms (AWS, Azure, Snowflake).
- Knowledge of compliance and regulatory reporting in the financial sector.

For further assistance, contact/WhatsApp 9354909521 or write to priyanshi@gist.org.in

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai

Work from Office

The Digital: Python, Digital: Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Python and Snowflake domain.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

The IBM InfoSphere DataStage, Digital: Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the IBM InfoSphere DataStage and Snowflake domain.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

14 - 20 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Role: ETL Testing
Notice period: immediate joiners
Work mode: Remote
Interested candidates can share their CV at Devika_P@trigent.com

Responsibilities:
- Develop and execute comprehensive test plans and test cases for data solutions, including data pipelines, ETL processes, and data warehouses.
- Perform data validation and verification to ensure data accuracy, completeness, and consistency.
- Identify, document, and track defects and issues, and work with development teams to resolve them.
- Collaborate with data engineers, data scientists, and other stakeholders to understand data requirements and ensure that testing covers all necessary scenarios.
- Automate data testing processes using appropriate tools and frameworks.
- Conduct performance testing to ensure data solutions can handle expected workloads.
- Participate in code reviews and provide feedback on data quality and testing practices.
- Continuously improve testing processes and methodologies to enhance the efficiency and effectiveness of data testing.

Requirements and Experience:
- Proven experience in data testing and quality engineering.
- Strong understanding of data engineering practices, including ETL processes, data pipelines, and data warehousing.
- Knowledge of SSIS and SSAS.
- Proficiency in SQL and experience with database management systems (e.g., MS SQL Server).
- Experience with data testing tools and frameworks (e.g., pytest, dbt).
- Familiarity with cloud data platforms (e.g., Snowflake, Azure Data Factory).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
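A core ETL validation of the kind this role describes is source-to-target reconciliation: compare row counts and flag keys that went missing in transit. A minimal plain-Python sketch is below; the datasets are invented, and real checks would query the actual source and target systems:

```python
# Sketch of an ETL reconciliation check: count drift plus missing keys.

def reconcile(source_rows, target_rows, key):
    """Report row-count drift and keys present in source but missing in target."""
    source_keys = {r[key] for r in source_rows}
    target_keys = {r[key] for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(source_keys - target_keys),
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
print(reconcile(source, target, "id"))
```

In a pytest-based suite (which the listing mentions), each field of this report would become its own assertion so failures pinpoint the broken invariant.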

Posted 3 weeks ago


2.0 - 5.0 years

4 - 7 Lacs

Mumbai

Work from Office


The Digital: Snowflake role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within this domain.

Posted 3 weeks ago


3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


The Digital: Apache Spark / Kafka / Snowflake / PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within this domain.

Posted 3 weeks ago


6.0 - 8.0 years

15 - 25 Lacs

Hyderabad

Remote


Job Title: Data Engineer II
Experience: 6+ years
Location: Remote (India)
Job Type: Full-time

Job Description:
We are looking for a highly skilled Data Engineer II with 6+ years of experience, including at least 4 years in data engineering or software development. The ideal candidate will be well versed in building scalable data solutions using modern data ecosystems and cloud platforms.

Key responsibilities:
  • Design, build, and optimize scalable ETL pipelines.
  • Work extensively with big data technologies such as Snowflake and Databricks.
  • Write and optimize complex SQL queries for large datasets.
  • Define and manage SLAs, performance benchmarks, and monitoring systems.
  • Develop data solutions using the AWS data ecosystem, including S3, Lambda, and more.
  • Handle both relational (e.g., PostgreSQL) and NoSQL databases.
  • Work with programming languages such as Python, Java, and/or Scala.
  • Use Linux command-line tools for system and data operations.
  • Implement best practices in data lineage, data quality, data observability, and data discoverability.

Preferred (nice to have):
  • Experience with data mesh architecture or building distributed data products.
  • Prior exposure to data governance frameworks.
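The ETL responsibilities above can be sketched end to end. The snippet below is a minimal in-memory extract-transform-load pipeline, assuming invented sample data; the source list and the TARGET dictionary are stand-ins for real systems such as S3 objects and a Snowflake table.

```python
def extract():
    # Stand-in for reading raw events from a source such as S3.
    return [
        {"user": "a", "event": "click", "ts": 1},
        {"user": "b", "event": "view", "ts": 2},
        {"user": "a", "event": "click", "ts": 3},
    ]

def transform(events):
    # Aggregate clicks per user -- the reshaping a SQL GROUP BY would do.
    counts = {}
    for e in events:
        if e["event"] == "click":
            counts[e["user"]] = counts.get(e["user"], 0) + 1
    return counts

TARGET = {}  # stand-in for a warehouse table

def load(counts):
    TARGET.update(counts)

load(transform(extract()))
print(TARGET)  # {'a': 2}
```

In a production setting each stage would be a separately schedulable, monitored task (with the SLAs and observability hooks the posting mentions) rather than three in-process function calls.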

Posted 3 weeks ago


7.0 - 9.0 years

22 - 35 Lacs

New Delhi, Gurugram, Greater Noida

Work from Office


Qualifications:
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 7–9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark.
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse.
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices.
  • Solid grasp of data governance, metadata tagging, and role-based access control.
  • Proven ability to mentor and grow engineers in a matrixed or global environment.
  • Strong verbal and written communication skills, with the ability to operate cross-functionally.
  • Certifications in Azure, Databricks, or Snowflake are a plus.

Preferred skills:
  • Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
  • Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools.
  • Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
  • Hands-on experience with databases and tools such as Azure SQL DB, Snowflake, MySQL, Cosmos DB, Blob Storage, and Python/Unix shell scripting.
  • ADF, Databricks, and Azure certifications are a plus.

Technologies we use:
Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI

Responsibilities:
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms.
  • Lead the technical execution of non-domain-specific initiatives (e.g., reusable dimensions, TLOG standardization, enablement pipelines).
  • Architect data models and reusable layers consumed by multiple downstream pods.
  • Guide platform-wide patterns such as parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks.
  • Mentor and coach team members.
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence.
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines.
  • Own engineering best practices, sprint planning, and quality across the Enablement pod.
  • Contribute to platform discussions and architectural decisions across regions.

Posted 3 weeks ago


Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:
  • Entry level: INR 6–8 lakhs per annum
  • Mid level: INR 10–15 lakhs per annum
  • Experienced: INR 18–25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:
  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
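Several of the questions above concern Snowflake's semi-structured data handling. As a rough, language-agnostic illustration of what a LATERAL FLATTEN over a VARIANT column does, the Python sketch below explodes a nested JSON array into flat rows; the document shape and field names are invented for illustration.

```python
import json

# A VARIANT-style document: one order containing a nested array of items.
doc = json.loads("""
{"order_id": 7, "items": [
    {"sku": "A1", "qty": 2},
    {"sku": "B2", "qty": 1}
]}
""")

# Equivalent in spirit to Snowflake SQL such as:
#   SELECT raw:order_id, f.value:sku, f.value:qty
#   FROM orders, LATERAL FLATTEN(input => raw:items) f;
rows = [
    {"order_id": doc["order_id"], "sku": item["sku"], "qty": item["qty"]}
    for item in doc["items"]
]
print(rows)
```

Each element of the nested array becomes its own row, with the parent's scalar fields repeated — exactly the one-to-many expansion FLATTEN performs inside the warehouse.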

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
