
1082 Snowflake Jobs - Page 12

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

7.0 - 12.0 years

1 - 5 Lacs

Bengaluru

Work from Office


Req ID: 325298 We are currently seeking an AWS Redshift Administrator Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
• Administer and maintain scalable cloud environments and applications for the data organization.
• Understand the business objectives of the company and create cloud-based solutions to facilitate those objectives.
• Implement Infrastructure as Code and deploy code using Terraform and GitLab.
• Install and maintain software, services, and applications by identifying system requirements.
• Hands-on AWS services and DB and server troubleshooting experience.
• Extensive database experience with RDS, AWS Redshift, and MySQL.
• Maintain the environment by identifying system requirements, installing upgrades, and monitoring system performance.
• Knowledge of day-to-day database operations, deployments, and development.
• Experienced in Snowflake.
• Knowledge of SQL and performance tuning.
• Knowledge of Linux shell scripting or Python.
• Migrate systems from one AWS account to another.
• Maintain system performance by performing system monitoring, analysis, and performance tuning.
• Troubleshoot system hardware, software, and operating and system management systems.
• Secure web systems by developing system access, monitoring, control, and evaluation.
• Test disaster recovery policies and procedures; complete backups; maintain documentation.
• Upgrade systems and services; develop, test, evaluate, and install enhancements and new software.
• Communicate with internal teams, like EIMO, Operations, and the Cloud Architect.
• Communicate with stakeholders and build applications to meet project needs.

Minimum Skills Required:
• Bachelor's degree in computer science or engineering.
• Minimum of 7 years of experience in system, platform, and AWS cloud administration.
• Minimum of 5 to 7 years of database administration and AWS experience using the latest AWS technologies: AWS EC2, Redshift, VPC, S3, AWS RDS.
• Experience with Java, Python, Redshift, MySQL, or equivalent database tools.
• Experience with Agile software development using JIRA.
• Experience on multiple OS platforms, with a strong emphasis on Linux and Windows systems.
• Experience with OS-level scripting environments such as KSH shell and PowerShell.
• Experience with version management tools and CI/CD pipelines.
• In-depth knowledge of the TCP/IP protocol suite, security architecture, and securing and hardening operating systems, networks, databases, and applications.
• Advanced SQL knowledge and experience working with relational databases, query authoring (SQL), and query performance tuning.
• Experience supporting and optimizing data pipelines and data sets.
• Knowledge of the Incident Response life cycle.
• AWS Solution Architect certifications.
• Strong written and verbal communication skills.

Posted 1 week ago

Apply

18.0 - 23.0 years

15 - 19 Lacs

Pune

Work from Office


Req ID: 317103 We are currently seeking a Digital Consultant - Innovation Group to join our team in Pune, Maharashtra (IN-MH), India (IN).

Job Duties
We are seeking a highly skilled and experienced Digital Consultant to join our Innovation Group. The ideal candidate will have a strong background in Big Data, Cloud, and AI/ML projects, with a focus on the health insurance, retail, or manufacturing domains. This role involves engaging with clients for architecture and design, building accelerators for cloud migration, and developing innovative solutions using GenAI technologies.

Key Responsibilities:
• Engage with clients to understand their requirements and provide architectural and design solutions.
• Develop and implement accelerators to facilitate faster cloud migration.
• Create innovative use cases and solutions that solve day-to-day data engineering problems using AI and GenAI tools.
• Develop reference architectures for various use cases using modern cloud data platforms.
• Understanding of legacy toolsets (ETL, reporting, etc.) is needed.
• Create migration suites for cataloging, migrating, and verifying data from legacy systems to modern platforms like Databricks and Snowflake.

Minimum Skills Required
Qualifications:
• Education: B.E. in Electronics & Telecommunication or a related field.
• Experience: 18+ years in IT, with significant experience in Big Data, Cloud, and AI/ML projects.
• Technical Skills: Proficiency in Databricks, Snowflake, AWS, GenAI (RAG and GANs), Python, C/C++/C

Posted 1 week ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Req ID: 323775 We are currently seeking Data & AI Technical Solution Architects to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective of this role is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with CEOs, business owners, and CTOs/CDOs.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by improving current methodologies, processes, and tools.

Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI, and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions that meet the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Req ID: 321498 We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
• Work closely with the Lead Data Engineer to understand business requirements, and analyse and translate these requirements into technical specifications and solution design.
• Work closely with the Data Modeller to ensure data models support the solution design.
• Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures.
• Analyse the data and ETL for defects raised in service tickets (for solutions in production).
• Develop documentation and artefacts to support projects.

Minimum Skills Required:
• ADF
• Fivetran (orchestration & integration)
• SQL
• Snowflake DWH

Posted 1 week ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Bengaluru

Work from Office


We are currently seeking a Cloud Solution Delivery Lead Consultant to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Data Engineer Lead
• Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git, and CI/CD pipelines (mandatory)
• Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink
• Experienced with software support for applications written in Python and SQL
• Administration, configuration, and maintenance of Snowflake and dbt
• Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform, and GitHub
• Debugging issues, root cause analysis, and applying fixes
• Management and maintenance of ETL processes (bug fixing and batch job monitoring)

Training & Certification:
• Apache Kafka Administration
• Snowflake Fundamentals/Advanced Training

Experience:
• 8 years of experience in a technical role working with AWS
• At least 2 years in a leadership or management role
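For the Kafka administration and batch-job monitoring duties mentioned in this role, consumer lag (how far a consumer group trails the end of each partition's log) is the usual health signal. A minimal sketch of that computation in plain Python, with invented offset numbers; a real deployment would fetch these from the Kafka admin API rather than hard-code them:

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition lag: log-end offset minus last committed offset.

    Both arguments map partition id -> offset. A partition with no
    committed offset yet is treated as fully lagging.
    """
    return {
        p: end - committed_offsets.get(p, 0)
        for p, end in end_offsets.items()
    }

# Hypothetical numbers for a 3-partition topic.
end = {0: 1500, 1: 980, 2: 2010}
committed = {0: 1500, 1: 950}  # partition 2 has no commit yet

lag = consumer_lag(end, committed)
print(lag)             # {0: 0, 1: 30, 2: 2010}
print(sum(lag.values()))  # 2040
```

A monitoring job would alert when the total (or any single partition's lag) keeps growing across polls, since that indicates a stuck or under-provisioned consumer.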

Posted 1 week ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Pune

Work from Office


Req ID: 323754 We are currently seeking Data & AI Technical Solution Architects to join our team in Pune, Maharashtra (IN-MH), India (IN).

Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective of this role is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with CEOs, business owners, and CTOs/CDOs.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by improving current methodologies, processes, and tools.

Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI, and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions that meet the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 1 week ago

Apply

2.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Req ID: 310924 We are currently seeking a Technical Lead - Data Foundations Squad to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Technical Lead
We are looking for a highly experienced technical lead to run (jointly with a solution architect) a medium-sized squad delivering data-as-a-product integrations and flows. This squad will be the first to utilise new standards and tools as they are delivered by the squads working on the core platform, both delivering value and working with upstream teams to ensure that the baseline you are being asked to follow is high quality and fit for purpose. As a technical lead at NTT DATA UK, you will be expected to mentor and lead quality across the engineering activities with your team, working closely with the programme-wide engineering lead to ensure consistent standards across the platform. As this team will be delivering fully end-to-end data products, it is highly likely that unfamiliar technologies will need to be handled at times, and a successful candidate will need to be comfortable not just with quickly upskilling themselves as requirements evolve, but with bringing their team along with them.

Required skills:
• A deep understanding of data products and/or DAAP concepts
• A willingness and ability to quickly upskill yourself and others on new technologies
• Experience at the level of technical lead or lead engineer for 2+ years
• Significant expertise with Python, including design paradigms
• Experience with Kafka (or significant experience with another log-based streaming technology)
• Experience with Snowflake (or significant experience with another cloud-scale warehousing solution)
• Experience with Terraform (or significant experience with another IaC tool)
• Experience with unit testing within a data landscape
• Experience of an agile workflow
• Strong communication skills

Extra useful skills:
• Experience working in a multi-vendor environment
• A working understanding of cloud security
• An understanding of architectural patterns for real-time data
• A history of authoring formal documentation and low-level designs for a wide range of target audiences
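"Unit testing within a data landscape", as asked for here, usually means keeping transformations as pure functions so they can be tested without a live warehouse or broker. A hedged Python sketch; the function and field names are invented for illustration:

```python
def normalise_amount(record):
    """Pure transform: coerce a raw amount string to minor units (pence/cents).

    Keeping this free of I/O means it can be unit-tested without
    Snowflake, Kafka, or any other live dependency.
    """
    raw = record["amount"].strip().replace(",", "")
    return {**record, "amount_minor": int(round(float(raw) * 100))}

def test_normalise_amount():
    out = normalise_amount({"id": 1, "amount": " 1,234.50 "})
    assert out["amount_minor"] == 123450
    assert out["id"] == 1  # untouched fields pass through

test_normalise_amount()
print("ok")
```

The pipeline glue (reading from Kafka, writing to Snowflake) then stays thin and is covered by integration tests instead.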

Posted 1 week ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Hyderabad

Work from Office


Req ID: 323774 We are currently seeking Data & AI Technical Solution Architects to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective of this role is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to hold conversations with CEOs, business owners, and CTOs/CDOs.
• Break down intricate business challenges, devise effective solutions, and focus on client needs.
• Craft high-level, innovative solution approaches for complex business problems.
• Utilize best practices and creativity to address challenges.
• Leverage market research, formulate perspectives, and communicate insights to clients.
• Establish strong client relationships.
• Interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically.
• Excellent client service orientation.
• Ability to work in high-pressure situations.
• Ability to establish and manage processes and practices through collaboration and an understanding of the business.
• Ability to create new and repeat business for the organization.
• Ability to contribute information on relevant vertical markets.
• Ability to contribute to the improvement of internal effectiveness by improving current methodologies, processes, and tools.

Minimum Skills Required
Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
• Scaled Agile certification desirable.
• Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
• Very good understanding of Data, AI, Gen AI, and Agentic AI.
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
• Must be able to work on Data & AI RFP responses as a Solution Architect.
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a Technical Architect.
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions that meet the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs and RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools.
• Experience with large-scale consulting and program execution engagements in AI and data.
• Seasoned multi-technology infrastructure design experience.
• Seasoned, demonstrable expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management.

Additional Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization.
• Uses evaluation, judgment, and interpretation to select the right course of action.
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors.
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
• Works

Posted 1 week ago

Apply

5.0 - 10.0 years

13 - 23 Lacs

Hyderabad

Work from Office


Skill: Snowflake
Location: Hyderabad
Experience: 5 to 10 years

JD:
• IT development experience with a minimum of 3+ years of hands-on experience in Snowflake.
• Strong experience in building/designing data warehouses or data lakes and data marts, with end-to-end implementation experience focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers.
• Strong experience with building productionized data ingestion and data pipelines in Snowflake.
• Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities.
• Should have good experience with Snowflake RBAC and data security.
• Strong experience with Snowflake features, including new Snowflake features.
• Should have good experience in Python/PySpark.
• Should have experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF).
• Should have experience/knowledge of orchestration and scheduling tools like Airflow.
• Should have a good understanding of ETL or ELT processes and ETL tools.
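Two of the Snowflake features named in this JD, Zero-Copy Cloning and Time Travel, each boil down to a single SQL statement (syntax as documented by Snowflake: CREATE ... CLONE, and the AT(OFFSET => ...) clause). The tiny Python helpers below that assemble those statements are purely illustrative; table names are invented:

```python
def clone_statement(source, target):
    """Zero-Copy Cloning: create a metadata-only copy of a table.

    The clone shares the source's micro-partitions until either side
    is modified, so it is near-instant regardless of table size.
    """
    return f"CREATE TABLE {target} CLONE {source}"

def time_travel_query(table, seconds_back):
    """Time Travel: query a table as it existed N seconds ago."""
    return f"SELECT * FROM {table} AT(OFFSET => -{seconds_back})"

print(clone_statement("sales", "sales_dev"))
# CREATE TABLE sales_dev CLONE sales
print(time_travel_query("sales", 3600))
# SELECT * FROM sales AT(OFFSET => -3600)
```

Together these make cheap dev/test environments and point-in-time recovery routine, which is why interviewers ask about them.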

Posted 1 week ago

Apply

4.0 - 6.0 years

15 - 25 Lacs

Noida

Work from Office


We are looking for a highly experienced Senior Data Engineer with deep expertise in Snowflake to lead efforts in optimizing the performance of our data warehouse to enable faster, more reliable reporting. You will be responsible for improving query efficiency, data pipeline performance, and overall reporting speed by tuning Snowflake environments, optimizing data models, and collaborating with application development teams.

Roles and Responsibilities:
• Analyze and optimize Snowflake data warehouse performance to support high-volume, complex reporting workloads.
• Identify bottlenecks in SQL queries, ETL/ELT pipelines, and data models impacting report generation times.
• Implement performance tuning strategies including clustering keys, materialized views, result caching, micro-partitioning, and query optimization.
• Collaborate with BI teams and business analysts to understand reporting requirements and translate them into performant data solutions.
• Design and maintain efficient data models (star schema, snowflake schema) tailored for fast analytical querying.
• Develop and enhance ETL/ELT processes ensuring minimal latency and high throughput using Snowflake's native features.
• Monitor system performance and proactively recommend architectural improvements and capacity planning.
• Establish best practices for data ingestion, transformation, and storage aimed at improving report delivery times.
• Experience with Unistore will be an added advantage.
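Much of the clustering-key and micro-partitioning tuning this role describes comes down to partition pruning: Snowflake keeps min/max metadata for each micro-partition and skips partitions whose value range cannot match the query filter. A deliberately simplified pure-Python model of that idea (the metadata shape is invented; the real pruning happens inside Snowflake's query engine):

```python
def prune(partitions, lo, hi):
    """Keep only partitions whose [min, max] range overlaps [lo, hi].

    Each partition is (name, min_val, max_val). Well-clustered data
    yields tight, non-overlapping ranges, so more partitions are
    skipped and scans (hence reports) get faster.
    """
    return [name for name, pmin, pmax in partitions
            if pmin <= hi and pmax >= lo]

# Three hypothetical micro-partitions of an order-date column.
parts = [("p1", 1, 100), ("p2", 90, 200), ("p3", 500, 900)]
print(prune(parts, 120, 180))  # ['p2']; p1 and p3 are skipped entirely
```

Choosing a clustering key is choosing which column these ranges stay tight on; a badly clustered table has wide, overlapping ranges and prunes almost nothing.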

Posted 1 week ago

Apply

10.0 - 16.0 years

25 - 35 Lacs

Hyderabad

Work from Office


Required Skills & Qualifications:
• 10-12 years of experience in data architecture, data warehousing, and cloud technologies.
• Strong expertise in Snowflake architecture, data modeling, and optimization.
• Solid hands-on experience with cloud platforms: AWS, Azure, and GCP.
• In-depth knowledge of SQL, Python, PySpark, and related data engineering tools.
• Expertise in data modeling (both dimensional and normalized models).
• Strong experience with data integration, ETL processes, and pipeline development.
• Certification in Snowflake, AWS, Azure, or related cloud technologies.
• Experience working with large-scale data processing frameworks and platforms.
• Experience with data visualization tools and BI platforms (e.g., Tableau, Power BI).
• Experience with Agile methodologies and project management.
• Strong problem-solving skills with the ability to address complex technical challenges.
• Excellent communication skills and the ability to work collaboratively with cross-functional teams.

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Bengaluru

Work from Office


Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
• Oversee and support the process by reviewing daily transactions on performance parameters
• Review the performance dashboard and the scores for the team
• Support the team in improving performance parameters by providing technical support and process guidance
• Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
• Ensure standard processes and procedures are followed to resolve all client queries
• Resolve client queries as per the SLAs defined in the contract
• Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
• Document and analyze call logs to spot the most frequent trends to prevent future problems
• Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
• Ensure all product information and disclosures are given to clients before and after the call/email requests
• Avoid legal challenges by monitoring compliance with service agreements
• Handle technical escalations through effective diagnosis and troubleshooting of client queries
• Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
• If unable to resolve an issue, escalate it to TA & SES in a timely manner
• Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
• Troubleshoot all client queries in a user-friendly, courteous, and professional manner
• Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
• Organize ideas and effectively communicate oral messages appropriate to listeners and situations
• Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
• Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
• Mentor and guide Production Specialists on improving technical knowledge
• Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
• Develop and conduct trainings (triages) within products for Production Specialists as per target
• Inform the client about the triages being conducted
• Undertake product trainings to stay current with product features, changes, and updates
• Enroll in product-specific and any other trainings per client requirements/recommendations
• Identify and document the most common problems and recommend appropriate resolutions to the team
• Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance

Mandatory Skills: Snowflake.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid


Skill set: Snowflake, AWS, Cortex AI, Horizon Catalog; or Snowflake, AWS, (Cortex AI or Horizon Catalog); or Snowflake, Azure, Cortex AI, Horizon Catalog; or Snowflake, Azure, (Cortex AI or Horizon Catalog)

Preferred Qualifications:
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• Experience in data engineering, with at least 3 years of experience working with Snowflake.
• Proven experience in Snowflake and Cortex AI/Horizon Catalog, focusing on data extraction, chatbot development, and conversational AI.
• Strong proficiency in SQL, Python, and data modeling.
• Experience with data integration tools (e.g., Matillion, Talend, Informatica).
• Knowledge of cloud platforms such as AWS, Azure, or GCP.
• Excellent problem-solving skills, with a focus on data quality and performance optimization.
• Strong communication skills and the ability to work effectively in a cross-functional team.
• Proficiency in using dbt's testing and documentation features to ensure the accuracy and reliability of data transformations.
• Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using dbt's lineage capabilities.
• Understanding of software engineering best practices and the ability to apply these principles to dbt development, including version control, code reviews, and automated testing.
• Should have experience building data ingestion pipelines.
• Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
• Should have good experience implementing CDC or SCD Type 2.
• Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
• Good to have: experience with repository tools like GitHub/GitLab and Azure Repos.
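The SCD Type 2 requirement asked for here means keeping full history in a dimension: when an attribute changes, the current row is closed out and a new current row is inserted. A plain-Python sketch of that logic; the column names and boolean `current` flag are illustrative, and in practice this is typically done with a MERGE statement in Snowflake or a dbt snapshot:

```python
def apply_scd2(dim_rows, change, as_of):
    """Apply one change record to a Type 2 dimension table.

    dim_rows: list of dicts with key, attrs, valid_from, valid_to, current.
    Closes the open row for the key if its attributes differ, then
    appends a new current row; no-ops when nothing changed.
    """
    rows = [dict(r) for r in dim_rows]
    for r in rows:
        if r["key"] == change["key"] and r["current"]:
            if r["attrs"] == change["attrs"]:
                return rows              # no change: history untouched
            r["current"] = False         # close the old version
            r["valid_to"] = as_of
    rows.append({"key": change["key"], "attrs": change["attrs"],
                 "valid_from": as_of, "valid_to": None, "current": True})
    return rows

dim = [{"key": 42, "attrs": {"city": "Pune"},
        "valid_from": "2024-01-01", "valid_to": None, "current": True}]
dim = apply_scd2(dim, {"key": 42, "attrs": {"city": "Kolkata"}}, "2025-06-01")
print([r["attrs"]["city"] for r in dim if r["current"]])  # ['Kolkata']
print(len(dim))                                           # 2
```

The key property interviewers look for: the old row survives with a closed validity window, so point-in-time joins against facts still resolve to the attributes that were true at the time.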

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office


Role & responsibilities:

Senior Site Reliability Engineer - Data Platform

Role Summary
The primary responsibility of the Senior Site Reliability Engineer (SRE) is to ensure the reliability and performance of data systems while working on the development, automation, and testing of data pipelines, from extraction through to population of the consumption layer, for the GPN Lakehouse. This role performs tasks connected with data analytics, testing, and system architecture to provide reliable data pipelines that enable business solutions. SRE engineers will be expected to perform, at a minimum, the following tasks: ETL process management, data modeling, data warehouse/lake architecture, ETL tool implementation, data pipeline development, system monitoring, incident response, data lineage tracking, and ETL unit testing.

NOTICE PERIOD: Immediate joiners only

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 13 Lacs

Pune

Hybrid


When visionary companies need to know how their world-changing ideas will perform, they close the gap between design and reality with Ansys simulation. For more than 50 years, Ansys software has enabled innovators across industries to push boundaries by using the predictive power of simulation. From sustainable transportation to advanced semiconductors, from satellite systems to life-saving medical devices, the next great leaps in human advancement will be powered by Ansys. Innovate With Ansys, Power Your Career. Summary / Role Purpose The Business Operations Specialist II works with the Sales, Sales Ops, Legal, Accounting, Export Compliance, and other departments to process customer orders and generate license keys. This role is responsible for verifying and reviewing the accuracy of orders, as well as completing and maintaining associated records and preparing related reports. Little direction is required; the Business Operations Specialist II is able to handle some complex tasks and accomplish straightforward work without assistance. Key Duties and Responsibilities Processes software license orders and stock orders via multiple CRM systems and verifies license agreements in accordance with ANSYS, Inc. policies and procedures Generates timely, accurate license keys and software license entitlement information, and delivers them to sales channels and customers Assists customers attempting to enroll for the ANSYS, Inc. Customer Portal Utilizes CRM checks to maintain data integrity Acts as liaison to the ANSYS, Inc. 
sales channel by providing quality customer service and support and resolving customer issues Provides assistance to sales personnel for proper order submission and documentation Interfaces with legal, accounting, and sales departments to facilitate procedural and policy adherence Proactively seeks ways to improve workflow, including identification of better ways to provide value-added customer service Participates in department projects such as developing rollout plans for product delivery Minimum Education/Certification Requirements and Experience Associate's degree or a minimum of 4 years of experience in a billing, order processing, or customer service environment Excellent customer service skills and orientation Demonstrated organizational and analytical skills Experience working in a database environment, including report generation responsibilities Demonstrated ability and experience in a detail-oriented position Ability and willingness to perform in a fast-paced, rapidly changing environment Excellent communication and interpersonal skills Demonstrated ability to multi-task in a deadline-driven environment Microsoft Office experience required Preferred Qualifications and Skills Prior CRM experience preferred Bachelor's degree in Accounting or Business is preferred Previous experience servicing global customers is highly preferred Experience working with Salesforce, Snowflake, and Power BI Experience improving processes At Ansys, we know that changing the world takes vision, skill, and each other. We fuel new ideas, build relationships, and help each other realize our greatest potential in the knowledge that every day is an opportunity to observe, teach, inspire, and be inspired. Together as One Ansys, we are powering innovation that drives human advancement. 
Our Commitments: Amaze with innovative products and solutions Make our customers incredibly successful Act with integrity Ensure employees thrive and shareholders prosper Our Values: Adaptability: Be open, welcome what's next Courage: Be courageous, move forward passionately Generosity: Be generous, share, listen, serve Authenticity: Be you, make us stronger Our Actions: We commit to audacious goals We work seamlessly as a team We demonstrate mastery We deliver outstanding results OUR ONE ANSYS CULTURE HAS INCLUSION AT ITS CORE We believe diverse thinking leads to better outcomes. We are committed to creating and nurturing a workplace that fuels this by welcoming people, no matter their background, identity, or experience, to a workplace where they are valued and where diversity, inclusion, equity, and belonging thrive. At Ansys, you will find yourself among the sharpest minds and most visionary leaders across the globe. Collectively we strive to change the world with innovative technology and transformational solutions. With a prestigious reputation for working with well-known, world-class companies, standards at Ansys are high, met by those willing to rise to the occasion and meet those challenges head on. Our team is passionate about pushing the limits of world-class simulation technology, empowering our customers to turn their design concepts into successful, innovative products faster and at a lower cost. At Ansys, it's about the learning, the discovery, and the collaboration. It's about what is next as much as the mission accomplished. And it's about the melding of disciplined intellect with strategic direction and results that have, can, and do impact real people in real ways. All this is forged within a working environment built on respect, autonomy, and ethics. CREATING A PLACE WE'RE PROUD TO BE Ansys is an S&P 500 company and a member of the NASDAQ-100. 
We are proud to have been recognized for the following recent awards, although our list goes on: America's Most Loved Workplaces, Gold Stevie Award Winner, America's Most Responsible Companies, Fast Company World Changing Ideas, Great Place to Work Certified (China, Greece, France, India, Japan, Korea, Spain, Sweden, Taiwan, U.K.). For more information, please visit us at www.ansys.com. Ansys is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other protected characteristics. Ansys does not accept unsolicited referrals for vacancies, and any unsolicited referral will become the property of Ansys. Upon hire, no fee will be owed to the agency, person, or entity.

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


ETL testers with automation testing experience in DBT and Snowflake, plus experience in Talend and Snowflake.

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Pune

Hybrid


About This Role: We are looking for a talented and experienced Data Engineer / Tech Lead with hands-on expertise in an ETL tool, full knowledge of CI/CD practices, experience technically leading a team of more than 5, and a client-facing background, able to create Data Engineering and Data Quality frameworks. As a tech lead, you must build ETL jobs, Data Quality jobs, and Big Data jobs, perform performance optimization by understanding the requirements, create reusable assets, and be able to perform production deployments; experience with DWH appliances (Snowflake / Redshift / Synapse) is preferred. Responsibilities Work with a team of engineers in designing, developing, and maintaining scalable and efficient data solutions using any data integration tool (e.g., Talend or Informatica) and Big Data technologies. Design, develop, and maintain end-to-end data pipelines using any ETL data integration tool (e.g., Talend or Informatica) to ingest, process, and transform large volumes of data from heterogeneous sources. Have good experience in designing cloud pipelines using Azure Data Factory or AWS Glue/Lambda. Implement data integration end to end with any ETL technology. Implement database solutions for storing, processing, and querying large volumes of structured, semi-structured, and unstructured data. Implement migrations of ETL jobs from older versions to new versions. Implement and write advanced SQL scripts at a medium to expert level. Work with the client's technical team and provide guidance during technical challenges. Integrate and optimize data flows between various databases, data warehouses, and Big Data platforms. Collaborate with cross-functional teams to gather data requirements and translate them into scalable and efficient data solutions. Optimize ETL and data-load performance, scalability, and cost-effectiveness through optimization techniques. 
Interact with the client on a daily basis, provide technical progress updates, and respond to technical questions. Implement best practices for data integration. Implement complex ETL data pipelines or similar frameworks to process and analyze massive datasets. Ensure data quality, reliability, and security across all stages of the data pipeline. Troubleshoot and debug data-related issues in production systems and provide timely resolution. Stay current with emerging technologies and industry trends in data engineering and CI/CD, and incorporate them into our data architecture and processes. Optimize data processing workflows and infrastructure for performance, scalability, and cost-effectiveness. Provide technical guidance and foster a culture of continuous learning and improvement. Implement and automate CI/CD pipelines for data engineering workflows, including testing, deployment, and monitoring. Perform migration to production deployment from lower environments; test and validate. Must Have Skills Must be certified in an ETL tool, a database, and a cloud platform (Snowflake certification preferred). Must have implemented at least 3 end-to-end projects in Data Engineering. Must have worked on performance optimization and tuning for data loads, data processes, and data transformation in Big Data. Must be flexible to write code using Java, Scala, Python, etc. as required. Must have implemented CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline. Must have technically managed and guided a team of at least 5 members. Must have technical ownership capability of Data Engineering delivery. Strong communication capabilities in client-facing roles. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5 years of experience in software engineering or a related role, with a strong focus on ETL tools, databases, and integration. 
Proficiency in ETL tools like Talend, Informatica, etc. for data integration and for building and orchestrating data pipelines. Hands-on experience with relational databases such as MySQL, PostgreSQL, or Oracle, and NoSQL databases such as MongoDB, Cassandra, or Redis. Solid understanding of database design principles, data modeling, and SQL query optimization. Experience with data warehousing, Data Lake, and Delta Lake concepts and technologies, data modeling, and relational databases.
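
The Data Quality framework mentioned above can take many forms; one minimal, reusable shape (the `run_checks` runner and the check names are illustrative assumptions, not a specific tool's API) is a table of named predicates run over each batch:

```python
def run_checks(rows, checks):
    """Run named data-quality checks over a batch of rows.

    checks: dict mapping check name -> predicate over one row.
    Returns {check_name: failing_row_count} for checks that failed.
    """
    failures = {}
    for name, predicate in checks.items():
        bad = [r for r in rows if not predicate(r)]
        if bad:
            failures[name] = len(bad)
    return failures

# Hypothetical rule set for an orders feed.
checks = {
    "amount_not_null": lambda r: r.get("amount") is not None,
    "amount_non_negative": lambda r: r.get("amount") is None or r["amount"] >= 0,
}
```

Because the checks are plain data, the same runner can gate an ETL job (fail the load) or merely report metrics, which is the usual split in DQ frameworks.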

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai, Hyderabad, Bengaluru

Hybrid


Your day at NTT DATA The Software Applications Development Engineer is a seasoned subject matter expert, responsible for developing new applications and improving existing applications based on the needs of the internal organization and/or external clients. What you'll be doing Years of experience: 5. Data Engineer: Work closely with the Lead Data Engineer to understand business requirements, then analyse and translate these requirements into technical specifications and solution design. Work closely with the Data Modeller to ensure data models support the solution design. Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures. Analyse data and ETL code for defects and service tickets raised against the solution in production. Develop documentation and artefacts to support projects.

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams. What you'll be doing Key Responsibilities: Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment. Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, transform it, and integrate it into a unified data platform for GenAI model training and deployment. GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance. Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance. Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, and utilize libraries like Hugging Face Transformers, PyTorch, or TensorFlow. Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness. Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications. Client Collaboration: Collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services. 
Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services. Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team. Bachelor's degree in Computer Science, Engineering, or related fields (Master's recommended) Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or Cloud Native platforms) Proficiency in programming languages like SQL, Python, and PySpark Strong data architecture, data modeling, and data governance skills Experience with Big Data Platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), Data Warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi) Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions) Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras) Nice to have: Experience with containerization and orchestration tools like Docker and Kubernetes Experience integrating vector databases and implementing similarity search techniques, with a focus on GraphRAG, is a plus Familiarity with API gateway and service mesh architectures Experience with low latency/streaming, batch, and micro-batch processing Familiarity with Linux-based operating systems and REST APIs
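
The vector-database requirement above reduces to similarity search over dense embeddings; a dependency-free sketch of cosine-similarity top-k retrieval (in production, Pinecone, Weaviate, or Faiss would replace this in-memory index; `top_k` and its arguments are hypothetical):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, index, k=2):
    """Return the k doc ids whose embeddings are most similar to the query.

    index: dict mapping doc id -> embedding vector.
    """
    scored = sorted(index.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]
```

Real vector stores replace the linear scan with approximate-nearest-neighbour indexes (HNSW, IVF), but the ranking contract is the same.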

Posted 1 week ago

Apply

3.0 - 7.0 years

12 - 17 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA We are seeking an experienced Data Architect to join our team in designing and delivering innovative data solutions to clients. The successful candidate will be responsible for architecting, developing, and implementing data management solutions and data architectures for various industries. This role requires strong technical expertise, excellent problem-solving skills, and the ability to work effectively with clients and internal teams to design and deploy scalable, secure, and efficient data solutions. What you'll be doing Experience and Leadership: Proven experience in data architecture, with a recent role as a Lead Data Solutions Architect, or a similar senior position in the field. Proven experience in leading architectural design and strategy for complex data solutions and then overseeing their delivery. Experience in consulting roles, delivering custom data architecture solutions across various industries. Architectural Expertise: Strong expertise in designing and overseeing delivery of data streaming and event-driven architectures, with a focus on Kafka and Confluent platforms. In-depth knowledge in architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog. Proficiency in conceptualising and applying Data Mesh and Data Fabric architectural patterns. 
Experience in developing data product strategies, with a strong inclination towards a product-led approach in data solution architecture. Extensive familiarity with cloud data architecture on platforms such as AWS, Azure, GCP, and Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python or R is beneficial. Exposure to ETL/ ELT processes, SQL, NoSQL databases is a nice-to-have, providing a well-rounded background. Experience with data visualization tools and DevOps principles/tools is advantageous. Familiarity with machine learning and AI concepts, particularly in how they integrate into data architectures. Design and Lifecycle Management: Proven background in designing modern, scalable, and robust data architectures. Comprehensive grasp of the data architecture lifecycle, from concept to deployment and consumption. Data Management and Governance: Strong knowledge of data management principles and best practices, including data governance frameworks. Experience with data security and compliance regulations (GDPR, CCPA, HIPAA, etc.) Leadership and Communication: Exceptional leadership skills to manage and guide a team of architects and technical experts. Excellent communication and interpersonal skills, with a proven ability to influence architectural decisions with clients and guide best practices Project and Stakeholder Management: Experience with agile methodologies (e.g. SAFe, Scrum, Kanban) in the context of architectural projects. Ability to manage project budgets, timelines, and resources, maintaining focus on architectural deliverables. Location: Delhi or Bangalore Workplace type : Hybrid Working

Posted 1 week ago

Apply

9.0 - 14.0 years

10 - 20 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid


JD: Snowflake Implementer: Designing, implementing, and managing Snowflake data warehouse solutions, ensuring data integrity, and optimizing performance for clients or internal teams. Strong SQL skills: Expertise in writing, optimizing, and troubleshooting SQL queries. Experience with data warehousing: Understanding of data warehousing concepts, principles, and best practices. Knowledge of ETL/ELT technologies: Experience with tools and techniques for data extraction, transformation, and loading. Experience with data modeling: Ability to design and implement data models that meet business requirements. Familiarity with cloud platforms: Experience with cloud platforms like AWS, Azure, or GCP (depending on the specific Snowflake environment). Problem-solving and analytical skills: Ability to identify, diagnose, and resolve technical issues. Communication and collaboration skills: Ability to work effectively with cross-functional teams. Experience with Snowflake (preferred): Prior experience with Snowflake is highly desirable. Certifications (preferred): Snowflake certifications (e.g., Snowflake Data Engineer, Snowflake Database Administrator) can be a plus.

Posted 1 week ago

Apply

6.0 - 10.0 years

7 - 14 Lacs

Bengaluru

Hybrid


Roles and Responsibilities Architect and incorporate an effective data framework enabling an end-to-end data solution. Understand business needs, use cases, and drivers for insights, and translate them into detailed technical specifications. Create epics, features, and user stories with clear acceptance criteria for execution and delivery by the data engineering team. Create scalable and robust data solution designs that incorporate governance, security, and compliance aspects. Develop and maintain logical and physical data models and work closely with data engineers, data analysts, and data testers for their successful implementation. Analyze, assess, and design data integration strategies across various sources and platforms. Create project plans and timelines while monitoring and mitigating risks and controlling the progress of the project. Conduct daily scrum with the team with a clear focus on meeting sprint goals and timely resolution of impediments. Act as a liaison between technical teams and business stakeholders and ensure alignment. Guide and mentor the team on best practices for data solutions and delivery frameworks. Actively work with, facilitate, and support the stakeholders/clients to complete User Acceptance Testing, and ensure there is strong adoption of the data products after launch. Define and measure KPIs/KRAs for features, ensuring the data roadmap is verified through measurable outcomes. Prerequisites 5 to 8 years of professional, hands-on experience building end-to-end data solutions on cloud-based data platforms, including 2+ years working in a Data Architect role. Proven hands-on experience in building pipelines for Data Lakes, Data Lakehouses, Data Warehouses, and data visualization solutions. Sound understanding of modern data technologies like Databricks, Snowflake, Data Mesh, and Data Fabric. Experience in managing the data life cycle in a fast-paced, Agile/Scrum environment. 
Excellent spoken and written communication, receptive listening skills, and the ability to convey complex ideas in a clear, concise fashion to technical and non-technical audiences. Ability to collaborate and work effectively with cross-functional teams, project stakeholders, and end users to produce quality deliverables within stipulated timelines. Ability to manage, coach, and mentor a team of Data Engineers, Data Testers, and Data Analysts. Strong process driver with expertise in the Agile/Scrum framework on tools like Azure DevOps, Jira, or Confluence. Exposure to Machine Learning, Gen AI, and modern AI-based solutions. Experience: Technical Lead - Data Analytics with 6+ years of overall experience, of which 2+ years are in data architecture. Education: Engineering degree from a Tier 1 institute preferred. Compensation: The compensation structure will be as per industry standards.

Posted 1 week ago

Apply

6.0 - 10.0 years

3 - 8 Lacs

Noida

Work from Office


Position: Snowflake - Senior Technical Lead Experience: 8-11 years Location: Noida/ Bangalore Education: B.E./ B.Tech./ MCA Primary Skills: Snowflake, Snowpipe, SQL, Data Modelling, DV 2.0, Data Quality, AWS, Snowflake Security Good to have Skills: Snowpark, Data Build Tool, Finance Domain Preferred Skills Experience with Snowflake-specific features: Snowpipe, Streams & Tasks, Secure Data Sharing. Experience in data warehousing, with at least 2 years focused on Snowflake. Hands-on expertise in SQL, Snowflake scripting (JavaScript UDFs), and Snowflake administration. Proven experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Matillion) and orchestration frameworks. Deep knowledge of data modeling techniques (star schema, data vault) and performance tuning. Familiarity with data security, compliance requirements, and governance best practices. Experience in Python, Scala, or Java for Snowpark development. Strong understanding of cloud platforms (AWS, Azure, or GCP) and related services (S3, ADLS, IAM) Key Responsibilities Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost. Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe). Automate schema migrations, deployments, and pipeline orchestration (e.g., with dbt, Airflow, or Matillion). Monitor query performance and resource utilization; tune warehouses, caching, and clustering. Implement workload isolation (multi-cluster warehouses, resource monitors) for concurrent workloads. Define and enforce role-based access control (RBAC), masking policies, and object tagging. Ensure data encryption, compliance (e.g., GDPR, HIPAA), and audit logging are correctly configured. Establish best practices for dimensional modeling, data vault architecture, and data quality. Create and maintain data dictionaries, lineage documentation, and governance standards. 
Partner with business analysts and data scientists to understand requirements and deliver analytics-ready datasets. Stay current with Snowflake feature releases (e.g., Snowpark, Native Apps) and propose adoption strategies. Contribute to the long-term data platform roadmap and cloud cost-optimization initiatives.
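
The masking-policy responsibility above has a simple core: return the real value only to privileged roles. A plain-Python sketch of that logic (role names are hypothetical; in Snowflake this rule is expressed as a SQL MASKING POLICY using CURRENT_ROLE(), attached to the column):

```python
def mask_email(value, current_role):
    """Return the real email only for privileged roles, else a redacted form,
    mirroring CASE WHEN CURRENT_ROLE() IN (...) THEN val ELSE '***@...' END."""
    privileged = {"PII_ADMIN", "COMPLIANCE"}   # hypothetical role names
    if current_role in privileged:
        return value
    local, _, domain = value.partition("@")
    return "***@" + domain                      # keep the domain for analytics
```

Keeping the domain visible while redacting the local part is a common compromise: aggregate analysis still works, but no individual is identifiable.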

Posted 1 week ago

Apply

5.0 - 9.0 years

7 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Hiring for Snowflake Developer with experience range 2 years & above. Mandatory Skills: Snowflake Developer, Snowflake, SnowPro. Education: BE/B.Tech/MCA/M.Tech/MSc/MS. Interview Mode: F2F.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Chandigarh

Hybrid


Design, build, and maintain scalable and reliable data pipelines on Databricks, Snowflake, or equivalent cloud platforms. Ingest and process structured, semi-structured, and unstructured data from a variety of sources including APIs, RDBMS, and file systems. Perform data wrangling, cleansing, transformation, and enrichment using PySpark, Pandas, NumPy, or similar libraries. Optimize and manage large-scale data workflows for performance, scalability, and cost-efficiency. Write and optimize complex SQL queries for transformation, extraction, and reporting. Design and implement efficient data models and database schemas with appropriate partitioning and indexing strategies for Data Warehouse or Data Mart. Leverage cloud services (e.g., AWS S3, Glue, Kinesis, Lambda) for storage, processing, and orchestration. Use orchestration tools like Airflow, Temporal, or AWS Step Functions to manage end-to-end workflows. Build containerized solutions using Docker and manage deployment pipelines via CI/CD tools such as Azure DevOps, GitHub Actions, or Jenkins. Collaborate closely with data scientists, analysts, and business stakeholders to understand requirements and deliver data solutions.
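
The enrichment step listed above is essentially a keyed lookup join; a plain-Python sketch of the same shape you would express as a PySpark left join (table and column names are hypothetical):

```python
def enrich(orders, customers):
    """Left-join order rows to a customer dimension keyed by customer_id,
    defaulting the segment when the customer is missing from the dimension."""
    lookup = {c["customer_id"]: c for c in customers}   # build the hash side once
    out = []
    for o in orders:
        c = lookup.get(o["customer_id"], {})
        out.append({**o, "segment": c.get("segment", "UNKNOWN")})
    return out
```

Building the lookup dict once and probing it per row is the same broadcast-hash-join shape engines like Spark choose automatically when the dimension side is small.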

Posted 1 week ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
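
Several of the questions above (time travel, data retention) reward a concrete mental model. As a toy sketch only — Snowflake actually implements time travel over immutable micro-partitions and retention windows, not per-table snapshot lists — an AT(TIMESTAMP => ...) style read resolves like this:

```python
import bisect

class VersionedTable:
    """Append-only table snapshots; reads can target any past point in time."""

    def __init__(self):
        self._times, self._snapshots = [], []

    def commit(self, t, rows):
        """Record the full table contents as of time t (t must be increasing)."""
        self._times.append(t)
        self._snapshots.append(list(rows))

    def at(self, t):
        """Return contents as of time t, like SELECT ... AT(TIMESTAMP => t)."""
        i = bisect.bisect_right(self._times, t) - 1
        return self._snapshots[i] if i >= 0 else []
```

The interview-ready takeaway: because old versions are retained rather than overwritten, a past read is just "find the latest version at or before t," which is also why retention length directly drives storage cost.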

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
