
1123 Snowflake Jobs - Page 29

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

17 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Management Level: Ind & Func AI Decision Science Manager
Location: Gurgaon, Bangalore

Must-Have Skills:
- Market Mix Modeling (MMM) techniques
- Optimization algorithms for budget allocation and promotional channel optimization
- Statistical and probabilistic methods: SVM, decision trees
- Programming languages and tools: Python, NumPy, Pandas, Sklearn
- AI/ML model development and data pipeline management
- Data management within Snowflake (data layers, migration)
- Cloud platform experience (Azure, AWS, GCP)

Good-to-Have Skills:
- Experience with nonlinear optimization techniques
- Experience in data migration (cloud to Snowflake)
- Proficiency in SQL and cloud-based technologies
- Understanding of econometrics/statistical modeling (regression, time series, multivariate analysis)

Job Summary
We are seeking a skilled Ind & Func AI Decision Science Manager to join the Accenture Strategy & Consulting team in the Global Network Data & AI practice. The role focuses on Market Mix Modeling (MMM): developing AI/ML models, optimizing promotional channels, managing data pipelines, and scaling marketing mix models across cloud platforms. It offers an exciting opportunity to collaborate with leading financial clients and leverage cutting-edge technology to drive business impact and innovation.

Roles & Responsibilities

Engagement Execution
- Lead MMM engagements involving promotional strategy optimization, budget allocation, and marketing analytics solutions.
- Apply advanced statistical techniques and machine learning models to improve marketing effectiveness.
- Collaborate with clients to develop tailored market mix models, delivering data-driven insights to optimize their marketing budgets and strategies.
- Develop proofs of concept (PoCs) for clients, including scoping, staffing, and execution phases.

Practice Enablement
- Mentor and guide analysts, consultants, and managers to build their expertise in Market Mix Modeling and analytics.
- Contribute to the growth of the Analytics practice through knowledge sharing, staffing initiatives, and the development of new methodologies.
- Promote thought leadership in marketing analytics by publishing research and presenting at industry events.

Opportunity Development
- Identify business development opportunities in marketing analytics and develop compelling business cases for potential clients.
- Work closely with deal teams to provide subject-matter expertise in MMM, ensuring high-quality client proposals and responses to RFPs.

Client Relationship Development
- Build and maintain strong, trusted relationships with internal and external clients.
- Serve as a consultant to clients, offering strategic insights to optimize marketing spend and performance.

Professional & Technical Skills
- 5+ years of experience in Market Mix Modeling (MMM) and associated optimization techniques.
- Strong knowledge of nonlinear optimization, AI/ML models, and advanced statistical techniques for marketing.
- Proficiency in Python and libraries such as NumPy, Pandas, Sklearn, Seaborn, PyCaret, and Matplotlib.
- Experience with cloud platforms such as AWS, Azure, or GCP, and data migration to Snowflake.
- Familiarity with econometric/statistical modeling techniques (regression, hypothesis testing, time series, multivariate analysis).
- Hands-on experience in managing data pipelines and deploying scalable machine learning architectures.

Additional Information
- Master's degree in Statistics, Econometrics, Economics, or a related field from a reputed university; a Ph.D. or M.Tech is a plus.
- Excellent communication and interpersonal skills to collaborate effectively with global teams and clients.
- Willingness to travel up to 40% of the time.
- Work on impactful projects that help clients optimize their marketing strategies through advanced data-driven insights.

About Our Company | Accenture (do not remove the hyperlink)

Qualification

Experience:
- 5+ years of advanced experience in Market Mix Modeling (MMM) and related optimization techniques for promotional channels and budget allocation
- 2+ years (Analysts) or 4+ years (Consultants) of consulting/analytics experience with reputed organizations

Educational Qualification:
- Master's degree in Statistics, Econometrics, Economics, or a related field from a reputed institution
- Ph.D. or M.Tech in a relevant field is an advantage
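The MMM techniques this role centers on (budget allocation, promotional-channel optimization) typically rest on media-response transforms such as adstock (carryover) and saturation (diminishing returns). A minimal pure-Python sketch of both, where the `decay` and `alpha` values are illustrative assumptions, not parameters from the posting:

```python
import math

def adstock(spend, decay=0.5):
    """Geometric adstock: each period carries over a fraction of the
    previous period's accumulated media effect. `decay` is a hypothetical
    value; in practice it is estimated when fitting the MMM."""
    out, carry = [], 0.0
    for s in spend:
        carry = s + decay * carry
        out.append(carry)
    return out

def saturate(x, alpha=0.01):
    """Simple exponential saturation curve modeling diminishing returns
    to spend; `alpha` controls how quickly the response flattens."""
    return 1.0 - math.exp(-alpha * x)
```

For example, a one-off spend of 100 with decay 0.5 produces effects 100, 50, 25 over the next periods; an optimizer would allocate budget across channels using the saturated, adstocked response rather than raw spend.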

Posted 3 weeks ago

Apply

9.0 - 14.0 years

20 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Position: Power BI Lead Engineer
Experience: 9+ years
Mode: Hybrid
Shift Timings: 2 PM - 11 PM
Location: Bangalore, Chennai, Hyderabad, and Pune

Responsibilities
- Lead the offshore Power BI development team.
- Oversee the development and testing of Power BI reports and dashboards.
- Ensure adherence to project timelines and quality standards.
- Collaborate with the onshore architect and project manager.
- Provide technical guidance and support to the offshore team.
- Work with the data engineers to validate the data within Snowflake and Teradata.

Skills to Have
- Strong leadership and communication skills.
- Extensive experience with Power BI development.
- Proficiency in DAX, Power Query, and data modeling.
- Experience with Agile development methodologies.
- Ability to manage and mentor a team.
- Testing and validation of the user stories developed by engineers.
- Remove bottlenecks and technical impediments for the project.

Technologies (Must Have)
- Power BI (Desktop, Service)
- DAX, Power Query (M), Copilot
- Power BI Report Builder
- SQL
- Python
- MicroStrategy, WebFOCUS: good to have
- Snowflake: basic knowledge is a must

Technologies (Good to Have)
- Azure (or any other cloud platform)
- Exposure to Power BI REST APIs
- Power BI External Tools, custom visuals

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office


The Digital: Snowflake, Database Administration (DBMS) role involves working with the relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Snowflake database administration domain.

Posted 3 weeks ago

Apply

12.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office


Project Role: Responsible AI Tech Lead
Project Role Description: Ensure the ethical and responsible use of artificial intelligence (AI) technologies. Design and deploy Responsible AI solutions; align AI projects with ethical principles and regulatory requirements. Provide leadership, foster cross-functional collaboration, and advocate for ethical AI adoption.
Must-have skills: Data Modeling Techniques and Methodologies, Amazon Web Services (AWS), Snowflake Data Warehouse, Core Banking
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As a Responsible AI Tech Lead, you will ensure the ethical and responsible use of artificial intelligence technologies. Your typical day will involve designing and deploying Responsible AI solutions, aligning AI projects with ethical principles and regulatory requirements, and providing leadership to foster cross-functional collaboration. You will advocate for the adoption of ethical AI practices, ensuring that all AI initiatives are conducted with integrity and accountability, while also engaging with various stakeholders to promote a culture of responsible AI usage across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Facilitate training sessions to enhance team understanding of ethical AI practices.
- Monitor and evaluate the impact of AI solutions to ensure compliance with ethical standards.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Modeling Techniques and Methodologies, Core Banking, Amazon Web Services (AWS), and Snowflake Data Warehouse.
- Strong understanding of data governance frameworks and best practices.
- Experience with data integration and ETL processes.
- Familiarity with machine learning algorithms and their ethical implications.
- Ability to communicate complex technical concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool (DBT)
Good-to-have skills: Data Engineering
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
- Played a key role in DBT-related discussions with teams and clients to understand business problems and solution requirements.
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
- Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
- Strong experience in designing, architecting, and administering Snowflake solutions and deploying data analytics solutions in Snowflake.
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities.

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT data architect with thorough knowledge of the different services.
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT.
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
- Experience in working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects.
- Knowledge of the Jinja template language (macros) is an added advantage.
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important.
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
- Should have a good understanding of SQL and Python, and a clear grasp of Snowflake's architectural concepts.

Professional Attributes:
- Client management, stakeholder management, collaboration, and interpersonal and relationship-building skills.
- Ability to create innovative solutions for key business challenges.
- Eagerness to learn and develop oneself on an ongoing basis.
- Structured communication: written, verbal, and presentational.

Educational Qualification:
- MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.

Qualification: 15 years full-time education
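A core DBT idea behind the modeling and SQL-refactoring work described above is that models reference each other through Jinja expressions like `{{ ref('orders') }}`, which dbt resolves to concrete relation names at compile time. A toy pure-Python mimic of that substitution (this is an illustration of the concept, not dbt's actual implementation; the model SQL and relation names are hypothetical):

```python
import re

def render_refs(sql, relations):
    """Replace {{ ref('name') }} placeholders with mapped relation names,
    the way dbt compiles model references into fully qualified tables."""
    def sub(match):
        return relations[match.group(1)]
    return re.sub(r"\{\{\s*ref\('([^']+)'\)\s*\}\}", sub, sql)

model = "SELECT * FROM {{ ref('orders') }} WHERE amount > 0"
compiled = render_refs(model, {"orders": "analytics.orders"})
# compiled == "SELECT * FROM analytics.orders WHERE amount > 0"
```

Because references are resolved centrally, dbt can also build the dependency graph between models from them, which is what makes modular refactoring of monolithic SQL practical.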

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Engineering
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on any ETL tool, 3+ years on Snowflake, and 1-3 years on Fivetran.
- Played a key role in Fivetran-related discussions with teams and clients to understand business problems and solution requirements.
- As a Fivetran SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
- Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
- Strong experience in designing, architecting, and administering Snowflake solutions and deploying data analytics solutions in Snowflake.
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities.

Technical Experience:
- Strong experience working as a Fivetran data architect with thorough knowledge of the different services.
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using Fivetran.
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
- Experience in working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
- Fivetran end-to-end migration experience.
- Fivetran and any one cloud certification is good to have.

Professional Attributes:
- Project management, stakeholder management, collaboration, and interpersonal and relationship-building skills.
- Ability to create innovative solutions for key business challenges.
- Eagerness to learn and develop oneself on an ongoing basis.
- Structured communication: written, verbal, and presentational.

Educational Qualification:
- MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

12.0 - 15.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to analyze existing systems, propose improvements, and contribute to the strategic direction of the data platform, ensuring it meets both current and future requirements.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Educational Qualification:
- MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Spring Boot
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Implement the Snowflake cloud data warehouse and cloud-related architecture.
- Migrate data from various sources to Snowflake.
- Work with Snowflake capabilities such as Snowpipe, stages, SnowSQL, streams, and tasks.
- Implement advanced Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, and zero-copy cloning.
- In-depth knowledge of, and experience in, data migration from RDBMS to the Snowflake cloud data warehouse.
- Deploy Snowflake features such as data sharing, events, and lakehouse patterns.
- Implement incremental extraction loads, both batched and streaming.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with Spring Boot.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
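The batched incremental-extraction responsibility mentioned above is commonly implemented with a watermark pattern: each run pulls only rows newer than the last recorded high-water mark, then advances it. A minimal pure-Python sketch of the control logic (the row shape and timestamps are hypothetical; in a real pipeline the rows would come from a source query and the watermark from a state store):

```python
def extract_incremental(rows, last_watermark):
    """Return the rows added since last_watermark and the new watermark.

    rows: iterable of (updated_at, payload) tuples; updated_at must be
    a sortable value (an int here stands in for a real timestamp).
    """
    batch = [r for r in rows if r[0] > last_watermark]
    # If nothing new arrived, keep the old watermark unchanged.
    new_watermark = max((r[0] for r in batch), default=last_watermark)
    return batch, new_watermark
```

For example, with source rows stamped 1, 2, 3 and a stored watermark of 1, a run extracts only the rows stamped 2 and 3 and persists 3 as the next watermark.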

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Building Tool (DBT)
Good-to-have skills: Data Engineering
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
- Played a key role in DBT-related discussions with teams and clients to understand business problems and solution requirements.
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
- Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
- Strong experience in designing, architecting, and administering Snowflake solutions and deploying data analytics solutions in Snowflake.
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities.

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT data architect with thorough knowledge of the different services.
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT.
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
- Experience in working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects.
- Knowledge of the Jinja template language (macros) is an added advantage.
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important.
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
- Should have a good understanding of SQL and Python, and a clear grasp of Snowflake's architectural concepts.

Professional Attributes:
- Client management, stakeholder management, collaboration, and interpersonal and relationship-building skills.
- Ability to create innovative solutions for key business challenges.
- Eagerness to learn and develop oneself on an ongoing basis.

Educational Qualification:
- MBA (Technology/Data-related specializations), MCA, or an advanced degree in STEM.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

5.0 - 10.0 years

22 - 25 Lacs

Bengaluru

Work from Office


Hands-on experience with Snowflake and Python is a must. Hands-on experience with Apache Spark is a must. Hands-on experience with DBT is preferred. Experience with performance tuning of SQL queries, Spark jobs, and stored procedures. An understanding of E-R data models (conceptual, logical, and physical).
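The E-R modeling requirement above concerns entities, their attributes, and the relationships between them. A tiny sketch of a logical model expressed in Python (the Customer/Order entities and their fields are hypothetical, chosen only to show a one-to-many relationship resolved through a foreign key):

```python
from dataclasses import dataclass

# Hypothetical logical model: one Customer has many Orders,
# linked by the foreign key Order.customer_id -> Customer.customer_id.

@dataclass(frozen=True)
class Customer:
    customer_id: int
    name: str

@dataclass(frozen=True)
class Order:
    order_id: int
    customer_id: int  # foreign key to Customer
    amount: float

def orders_for(customer, orders):
    """Resolve the one-to-many relationship at the application level."""
    return [o for o in orders if o.customer_id == customer.customer_id]
```

In the physical model the same relationship would become a FOREIGN KEY constraint and, typically, an index on `customer_id`; the conceptual model records only "Customer places Order".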

Posted 3 weeks ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Good-to-have skills: PostgreSQL, Data Engineering, Snowflake Data Warehouse
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various stakeholders to gather requirements, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also participate in testing and debugging, ensuring that the applications meet quality standards and user expectations. Additionally, you will be involved in continuous improvement efforts, adapting applications to evolving business needs and technological advancements.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business objectives.

Professional & Technical Skills:
- Must-have skills: proficiency in Oracle PL/SQL.
- Good-to-have skills: experience with PostgreSQL, Data Engineering, and Snowflake Data Warehouse.
- Strong understanding of application development methodologies.
- Experience with database design and optimization techniques.
- Familiarity with the software development life cycle and agile practices.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Oracle PL/SQL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 14 Lacs

Bengaluru

Work from Office


Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution, including but not limited to deep learning, neural networks, chatbots, and image processing.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary
As an AI/ML Engineer, you will develop applications and systems utilizing AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the implementation of AI/ML models.
- Conduct research on emerging AI technologies.
- Optimize AI algorithms for performance and scalability.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of AI and ML concepts.
- Experience with cloud AI services.
- Knowledge of deep learning and neural networks.
- Familiarity with chatbots and image processing.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 3 weeks ago

Apply

7.0 - 11.0 years

25 - 30 Lacs

Pune, Gurugram

Hybrid


Job Description:
- 7+ years of experience as a Data Engineer
- Strong technical expertise in SQL
- Advanced SQL querying skills (joins, subqueries, CTEs, aggregation)
- Strong knowledge of joins and common table expressions (CTEs)
- Strong experience with Python
- Experience in Snowflake, ETL, SQL, and CI/CD
- Strong expertise in ETL processes and various data model concepts
- Knowledge of star schema and snowflake schema
- Good to know: AWS services such as S3, Athena, Glue, and EMR/Spark, with a major emphasis on S3 and Glue
- Experience with big data tools and technologies

Key Skills:
- Good understanding of data structures and data analysis using SQL or Python
- Knowledge of the insurance domain is an addition
- Designing and implementing ETL pipelines (extract, transform, load)
- Experience with ETL tools like Informatica, Talend, Apache NiFi, or Fivetran (excluding Azure Data Factory)
- Analyzing data using SQL; advanced SQL querying skills (joins, subqueries, CTEs, aggregation)
- Conducting end-to-end verification and validation for the entire application

Posted 3 weeks ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in ensuring that applications meet both functional and technical requirements, ultimately contributing to the overall success of the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must have: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with SQL and data querying techniques.
- Ability to analyze and optimize data workflows for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

4 - 9 Lacs

Bengaluru

Work from Office


Job Location: Bangalore
Experience: 4+ years
Job Type: FTE
Note: Looking only for immediate to one-week joiners. Must be comfortable with a video discussion.

Key skills required (either option):
Option 1: Big Data (Hadoop, Hive, HDFS) plus Python or Scala
Option 2: Snowflake with Big Data knowledge (Snowpark preferred) plus Python or Scala

Contact person: Amrita. Please share your updated profile to amrita.anandita@htcinc.com with the details below:
- Full name (as per Aadhaar card)
- Total experience
- Relevant experience in Big Data Hadoop, Python, Scala, Hive, and HDFS, or in Snowflake and Snowpark
- Highest education (specify if B.Tech/B.E.)
- Notice period (if serving notice or not working, mention your last working day as per your relieving letter)
- Current CTC
- Expected CTC
- Current location
- Preferred location

Posted 3 weeks ago

Apply

10.0 - 16.0 years

60 - 75 Lacs

Pune

Hybrid


Position Summary: As a Software Architect, you will provide technical leadership and architectural guidance to development teams, ensuring the design and implementation of scalable, robust, and maintainable software solutions. You will collaborate with stakeholders, including business leaders, project managers, and developers, to understand requirements, define architectural goals, and make informed decisions on technology selection, system design, and implementation strategies. Additionally, you will mentor and coach team members, promote best practices, and foster a culture of innovation and excellence within the organization. This role is based in Redaptive's Pune, India office.

Responsibilities and Duties (time spent performing duty):

System Design and Architecture: 40%
- Identify and propose technical solutions for complex problem statements.
- Provide an application-level perspective during design and implementation that accounts for cost constraints, testability, complexity, scalability, performance, and migrations.
- Provide technical leadership and guidance to development teams, mentoring engineers and fostering a culture of excellence and innovation.
- Review code and architectural designs to ensure adherence to coding standards, best practices, and architectural principles.
- Create and maintain architectural documentation, including architectural diagrams, design documents, and technical specifications, to ensure clarity and facilitate collaboration.

Software Design and Development: 50%
- Gather and analyze requirements from stakeholders, understanding business needs and translating them into technical specifications.
- Work alongside teams at all stages of design and development, augmenting and supporting teams as needed.
- Collaborate with product managers, stakeholders, and cross-functional teams to define project scope, requirements, and timelines, and ensure successful project execution.

Knowledge Sharing and Continuous Improvement: 10%
- Conduct presentations, workshops, and training sessions to educate stakeholders and development teams on architectural concepts, best practices, and technologies.
- Stay updated on emerging technologies, industry trends, and best practices in software architecture and development.
- Identify opportunities for process improvement, automation, and optimization in software development processes and methodologies.
- Share knowledge and expertise with team members through mentorship, training sessions, and community involvement.

Required Abilities and Skills:
- Strong analytical and troubleshooting skills.
- Excellent verbal and written communication skills; ability to communicate effectively with stakeholders, including business leaders and project managers, to understand requirements and constraints.
- Works effectively with cross-functional teams, including developers, QA, product managers, and operations.
- Ability to see the bigger picture and design systems that align with business goals, scalability requirements, and future growth.
- Ability to make tough decisions and take ownership of architectural choices, considering both short-term and long-term implications.
- Mastery of one or more programming languages commonly used in software development, such as Java, Python, or JavaScript.
- Expertise in SQL and NoSQL databases, including database design and optimization.
- Ability to quickly learn new technologies and adapt to changing requirements.
- Knowledge of techniques for designing scalable, high-performance web services, including load balancing, caching, and horizontal scaling.
- Knowledge of software design principles (e.g., object-oriented principles, data structures, and algorithms).
- Possesses a security mindset; drives adoption of best practices to design systems that are secure and resilient to security threats.
- Continuously learns and stays up to date with emerging technologies and best practices.
- Domain knowledge in energy efficiency, solar/storage, or electric utilities is a plus.

Education and Experience:
- 10+ years of software development experience.
- Proven track record of delivering high-quality software solutions within deadlines.
- Demonstrated technical leadership experience.
- Experience with data-heavy systems such as Databricks and DataOps.
- Experience with cloud (AWS) application development.
- Experience with Java and the Spring framework strongly preferred.
- Experience with distributed architectures, SOA, microservices, and containerization technologies (e.g., Docker, Kubernetes).
- Experience designing and developing web-based applications and backend services.

Travel: This role may require 1-2 annual international work visits to the US.

The Perks!
- Equity plan participation
- Medical and Personal Accident Insurance
- Support for hybrid working and relocation
- Flexible time off
- Continuous learning
- Annual bonus, subject to company and individual performance

The company is an Equal Opportunity Employer, a drug-free workplace, and complies with applicable labor laws. All duties and responsibilities are essential functions and requirements and are subject to possible modification to reasonably accommodate individuals with disabilities. The requirements listed in this document are the minimum levels of knowledge, skills, or abilities.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office


ABOUT EVERNORTH: Evernorth exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable, and simple health care, we solve the problems others don't, won't, or can't.

Position Overview: Cigna, a leading health services company, is looking for data engineers/developers in our Data & Analytics organization. The Full Stack Engineer is responsible for delivering a business need end to end, from understanding the requirements to deploying the software into production. This role requires fluency in some of the critical technologies, proficiency in others, and a hunger to learn on the job and add value to the business. Critical attributes of a Full Stack Engineer, among others, are ownership and accountability. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset, driving the adoption of CI/CD tools and supporting the improvement of the tool sets and processes.

Behaviors of a Full Stack Engineer: Full Stack Engineers are able to articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They take ownership of their work tasks, embrace interacting with all levels of the team, and raise challenges when necessary. We aim to be cutting-edge engineers, not institutionalized developers.

Responsibilities:
- Minimize meetings to get requirements; have direct business interactions.
- Write referenceable and modular code.
- Design and architect the solution independently.
- Be fluent in particular areas and proficient in many areas.
- Have a passion to learn.
- Take ownership and accountability.
- Understand when to automate and when not to.
- Have a desire to simplify.
- Be entrepreneurial and business-minded.
- Have a quality mindset: not just code quality, but ensuring ongoing data quality by monitoring data to identify problems before they have business impact.
- Take risks and champion new ideas.

Qualifications

Experience Required:
- 3-5 years being part of Agile teams
- 3-5 years of scripting
- 2+ years of hands-on AWS experience (S3, Lambda)
- 2+ years of experience with PySpark or Python
- 2+ years of experience with cloud technologies such as AWS
- 2+ years of hands-on SQL experience

Experience Desired:
- Experience with GitHub
- Teradata, AWS (Glue, Lambda), Databricks, Snowflake, Angular, REST APIs, Terraform, Jenkins (CloudBees, Jenkinsfile/Groovy, password vault)

Education and Training Required:
- Knowledge of and/or experience with health care information domains is a plus
- Computer science degree: good to have

Primary Skills: JavaScript, Python, PySpark, TDV, R, Ruby, Perl; Lambdas, S3, EC2; Databricks, Snowflake, Jenkins, Kafka, APIs, Angular, Selenium, AI and machine learning

Additional Skills:
- Excellent troubleshooting skills
- Strong communication skills
- Fluent in BDD and TDD development methodologies
- Work in an agile CI/CD environment (Jenkins experience a plus)

Location & Hours of Work: Full-time position, working 40 hours per week, with expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).

Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform, and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care, and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

11 - 16 Lacs

Hyderabad

Work from Office


Software Engineering Lead Analyst - Business Intelligence

Position Overview: We are looking for a Software Engineer to design, build, and QA business intelligence (BI) solutions using a variety of BI tools. Candidates must have solid experience designing, developing, testing, and implementing analytics reports using various technologies across traditional data center and cloud environments. This role works directly with business partners and other IT team members to understand requirements and deliver effective solutions within Agile methodology, and participates in all phases of the development and system support life cycle.

Responsibilities:
- Be comfortable working within a scrum team.
- Actively participate in scrum ceremonies (e.g., daily scrum, story refinement, sprint planning, Program Increment (PI) planning, retrospectives).
- Work with business and technical partners to understand the current-state system design and produce solutions for future capabilities.
- Design, develop, test, and triage data pipelines; this includes understanding source data and the transformation processes toward target states.
- Share your perspective and prior experiences for the betterment of the team.

Qualifications:
- Bachelor's degree in software engineering, computer science, or a related field, or equivalent work experience
- Understanding of agile development practices
- Understanding of cloud technologies; AWS preferred
- Experience building interactive dashboards and reports with a business intelligence tool
- Quality assurance experience

Required Technical Skills:
- 5-8 years of hands-on experience using SAP BusinessObjects 4.x.
- Experience in universe design and development using IDT.
- Experience with SAP BusinessObjects report development using universes, Web Intelligence, and Rich Client.
- Run unit testing and validation of BusinessObjects reports and universes.
- Experience in scheduling and distribution of reports.
- Design reports against large volumes of data; experience fine-tuning reports and queries.
- Demonstrated experience automating QA scripts within a CI/CD process.
- Proficiency in SQL and performance tuning on databases such as Teradata, Oracle, Postgres, and Snowflake.

Preferred Technical Skills:
- Knowledge of cloud databases: Databricks, Snowflake.
- Experience with other BI tools: Tableau, Power BI, Cognos.
- Knowledge of development languages such as Python and Scala.

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care, and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

3 - 8 Lacs

Pune

Work from Office


We are organizing a direct walk-in drive at our Pune location. Please find below the details and skills for the walk-in at TCS Pune on 7th June 2025.

Experience: 4 to 12 years

Skills:
1. .NET full-stack developer + PowerShell
2. Azure Data Engineer
3. .NET full stack + Angular/React
4. Snowflake Developer

Posted 3 weeks ago

Apply

10.0 - 20.0 years

20 - 27 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Hello, we are hiring for a Power BI Lead Engineer.

Job Title: Power BI Lead Engineer
Work Mode: Hybrid (in office 1 to 2 days per week as needed)
Shift: 2 to 11 PM IST (you can work from the office until 6 PM, head home, and log back in to finish your work)
Experience: 10+ years
Location: Bangalore, Chennai, Hyderabad, Pune
Notice Period: Immediate to 20 days (serving)

Responsibilities:
- Lead the offshore Power BI development team.
- Oversee the development and testing of Power BI reports and dashboards.
- Ensure adherence to project timelines and quality standards.
- Collaborate with the onshore architect and project manager.
- Provide technical guidance and support to the offshore team.
- Work with the data engineers to validate the data within Snowflake and Teradata.

Skills to Have:
- Strong leadership and communication skills.
- Extensive experience with Power BI development.
- Proficiency in DAX, Power Query, and data modeling.
- Experience with Agile development methodologies.
- Ability to manage and mentor a team.
- Testing and validation of the user stories developed by engineers.
- Removing bottlenecks and technical impediments for the project.

Technologies (Must Have):
- Power BI (Desktop, Service)
- Leadership experience (must have worked as a lead in past years)
- DAX, Power Query (M), Copilot
- Power BI Report Builder
- SQL
- Python
- MicroStrategy, WebFOCUS (good to have)
- Basic Snowflake knowledge is a must

Technologies (Good to Have):
- Azure (or any) cloud platform
- Exposure to Power BI REST APIs
- Power BI external tools
- Custom visuals

Note: If these skills are not a match for you, please share this opportunity with your colleagues.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

7 - 17 Lacs

Pune

Work from Office


vConstruct, a Pune-based construction technology company, is seeking a Data Engineer for its Data Science and Analytics team, a close-knit group of analysts and engineers supporting all data aspects of the business. You will be responsible for designing, developing, and maintaining our data infrastructure, ensuring data integrity, and supporting various data-driven projects. You will work closely with cross-functional teams to integrate, process, and manage data from various sources, enabling business insights and enhancing operational efficiency.

Responsibilities:
- Design, develop, and maintain robust, scalable data pipelines and ETL/ELT processes to efficiently ingest, transform, and store data from diverse sources.
- Collaborate with cross-functional teams to design, implement, and sustain data-driven solutions that optimize data flow and system integration.
- Develop and maintain pipelines that move data in real-time (streaming), on-demand, and batch modes, whether inbound to a central data warehouse, outbound to other systems, or point-to-point, focusing on security, reusability, and data quality.
- Implement pipelines with comprehensive error-handling mechanisms that are visible to both technical and functional teams.
- Ensure optimized pipeline performance with timely data delivery, including appropriate alerts and notifications.
- Adhere to data engineering best practices for code management and automated deployments, incorporating validation and test automation across all data engineering efforts.
- Perform debugging, application issue resolution, and root cause analysis, and assist in proactive/preventive maintenance.
- Collaborate with the extended data team to define and enforce standards, guidelines, and data models that ensure data quality and promote best practices.
- Write and execute complete testing plans, protocols, and documentation for assigned portions of the data system or components; identify defects and create solutions for issues with code and integration into the data system architecture.
- Work closely with data analysts, business users, and developers to ensure the accuracy, reliability, and performance of data solutions.
- Monitor data performance, troubleshoot issues, and optimize existing solutions.
- Create and maintain technical documentation related to data architecture, integration flows, and processes.
- Organize and lead discussions with business and operational data stakeholders to understand requirements and deliver solutions.
- Partner with analysts, developers, and business users to build data solutions that are scalable, maintainable, and aligned with business objectives.

Qualifications:
- 3 to 6 years of experience as a Data Engineer, with a focus on building scalable data solutions.
- 3+ years of experience in scripting languages such as Python for data processing, automation, and ETL development.
- 3+ years of hands-on experience working with Snowflake.
- 3+ years of experience with data integration tools such as Azure Data Factory, Fivetran, or Matillion.
- 3+ years of experience writing complex, highly optimized SQL queries on large datasets.
- Deep expertise in SQL, with a focus on database performance tuning and optimization.
- Experience working with data platforms such as Snowflake, Azure Synapse, or Microsoft Fabric.
- Proven experience integrating APIs and handling diverse data sources; ability to understand, consume, and utilize APIs, JSON, and web services for building data pipelines.
- Experience designing and implementing data pipelines using cloud platforms such as Azure or AWS.
- Familiarity with orchestration tools like Apache Airflow or equivalent.
- Experience with CI/CD practices and automation in data engineering workflows.
- Knowledge of dbt or similar tools for data transformation is a plus.
- Familiarity with Power BI or other data visualization tools is a plus.
- Strong problem-solving skills with the ability to troubleshoot complex data issues.
- Excellent communication skills and a collaborative mindset to work effectively in team environments.

Education: Bachelor's or master's degree in computer science, information technology, or a related field. Equivalent academic and work experience can be considered.

About vConstruct: vConstruct specializes in providing high-quality Building Information Modeling and construction technology services geared towards construction projects. vConstruct is a wholly owned subsidiary of DPR Construction. For more information, please visit www.vconstruct.com

About DPR Construction: DPR Construction is a national commercial general contractor and construction manager specializing in technically challenging and sustainable projects for the advanced technology, biopharmaceutical, corporate office, higher education, and healthcare markets. With the purpose of building great things (great teams, great buildings, great relationships), DPR is a truly great company. For more information, please visit www.dpr.com
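The error-handling requirement in the listing above (pipelines whose failures stay visible to both technical and functional teams) can be sketched in plain Python. The step and field names here are invented for illustration; a production pipeline would route captured errors to a monitored table or alerting channel rather than an in-memory list.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def run_step(name, fn, records, errors):
    """Run one pipeline step, routing per-record failures to an error
    list (visible to reporting) instead of aborting the whole batch."""
    ok = []
    for rec in records:
        try:
            ok.append(fn(rec))
        except Exception as exc:
            errors.append({"step": name, "record": rec, "error": str(exc)})
            log.warning("step=%s failed record=%r (%s)", name, rec, exc)
    return ok

def parse_amount(rec):
    # Illustrative transform: coerce a string field to float, failing loudly.
    return {**rec, "amount": float(rec["amount"])}

errors = []
raw = [{"id": 1, "amount": "42.5"}, {"id": 2, "amount": "n/a"}]
clean = run_step("parse_amount", parse_amount, raw, errors)
print(len(clean), len(errors))  # 1 1
```

The key design choice is that bad records are quarantined with enough context (step name, payload, error) for a functional reviewer to triage them without reading code.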

Posted 3 weeks ago

Apply

10.0 - 12.0 years

1 - 1 Lacs

Hyderabad

Hybrid


Role: Lead Data Engineer
Experience: 10+ years
Contract: 6+ months

Job Summary: We are seeking an experienced and results-oriented Lead Data Engineer to drive the design, development, and optimization of enterprise data solutions. This onsite role requires deep expertise in Fivetran, Snowflake, SQL, Python, and data modeling, as well as a demonstrated ability to lead teams and mentor both data engineers and BI engineers. The role plays a critical part in shaping the data architecture, improving analytics readiness, and enabling self-service business intelligence through scalable star schema designs.

Key Responsibilities:
- Lead end-to-end data engineering efforts, including architecture, ingestion, transformation, and delivery.
- Architect and implement Fivetran-based ingestion pipelines and Snowflake data models.
- Create optimized star schemas to support analytics, self-service BI, and KPI reporting.
- Analyze and interpret existing report documentation and KPIs to guide modeling and transformation strategies.
- Design and implement efficient, scalable data workflows using SQL and Python.
- Review and extend existing reusable data engineering templates and frameworks.
- Provide technical leadership and mentorship to data engineers and BI engineers, ensuring best practices in coding, modeling, performance tuning, and documentation.
- Collaborate with business stakeholders to gather requirements and translate them into scalable data solutions.
- Work closely with BI teams to enable robust reporting and dashboarding capabilities.

Required Skills:
- 7+ years of hands-on data engineering experience, with 2+ years in a technical leadership or lead role.
- Deep expertise in Fivetran, Snowflake, and SQL development.
- Proficiency in Python for data transformation and orchestration.
- Strong understanding of data warehousing principles, including star schema design and dimensional modeling.
- Experience analyzing business KPIs and reports to influence data model design.
- Demonstrated ability to mentor both data engineers and BI engineers and provide architectural guidance.
- Excellent problem-solving, communication, and stakeholder management skills.

Share your CV with: Careers@rwavesoftech.com
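The star schema this role centers on is one fact table joined to denormalized dimension tables. The toy sketch below shows the shape with plain Python dicts; all table and column names are illustrative, not from the posting, and in Snowflake the same join-then-aggregate pattern would be a SQL query over real tables.

```python
# Tiny star schema: one fact table plus two dimension tables (names illustrative).
dim_date = {1: {"month": "2024-01"}, 2: {"month": "2024-02"}}
dim_product = {10: {"category": "Hardware"}, 11: {"category": "Software"}}
fact_sales = [
    {"date_key": 1, "product_key": 10, "revenue": 100.0},
    {"date_key": 1, "product_key": 11, "revenue": 250.0},
    {"date_key": 2, "product_key": 11, "revenue": 300.0},
]

# A KPI query joins the fact to its dimensions, then aggregates.
kpi = {}
for row in fact_sales:
    key = (dim_date[row["date_key"]]["month"],
           dim_product[row["product_key"]]["category"])
    kpi[key] = kpi.get(key, 0.0) + row["revenue"]

print(kpi[("2024-01", "Software")])  # 250.0
```

The payoff of the star shape is exactly this: every KPI is a surrogate-key join from the fact table outward, which BI tools can generate automatically for self-service reporting.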

Posted 3 weeks ago

Apply

6.0 - 10.0 years

2 - 2 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


We currently have job openings for a Snowflake Developer (www.royalcyber.com).

Job Description:
- Design, develop, and maintain scalable data pipelines and Snowflake data warehouse models.
- Implement data ingestion processes using Snowflake and ETL/ELT tools (e.g., dbt, Informatica, Talend).
- Optimize Snowflake SQL queries and manage performance tuning and data modeling.
- Develop and maintain stored procedures, UDFs, and other Snowflake scripting components.
- Work with cross-functional teams to understand business requirements and translate them into technical solutions.
- Collaborate with BI developers to provide clean, transformed, and well-modeled data for analytics and reporting.
- Maintain data governance, security, and compliance within the Snowflake environment.
- Monitor data pipelines for reliability and troubleshoot issues as needed.
- Support integration of Snowflake with various data sources and analytics tools such as Tableau, Power BI, and Looker.

Required Skills and Qualifications:
- 6 to 8 years of experience in data engineering, data warehousing, or data analytics roles.
- 3+ years of hands-on experience with Snowflake (data modeling, performance tuning, schema design, etc.).
- Strong proficiency in SQL, with expertise in writing complex queries and stored procedures.
- Solid experience with ETL/ELT tools and frameworks.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data lake architectures.
- Experience integrating Snowflake with third-party BI tools and APIs.
- Strong understanding of data warehousing concepts, data lakes, and dimensional modeling.
- Working knowledge of version control (Git), CI/CD practices, and Agile methodologies.
- Excellent problem-solving skills and ability to work in a collaborative environment.

Work Location: Remote

If interested, please share your resume with sruthy.p@royalcyber.com
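On the UDF requirement: Snowflake supports user-defined functions written in SQL, JavaScript, Java, Scala, and Python, and the body of a scalar Python UDF is just an ordinary function. The masking helper below is a hedged sketch of such a body; the function name and logic are invented for illustration, and registering it in Snowflake would additionally require a `CREATE FUNCTION` statement or Snowpark registration.

```python
def mask_email(email: str) -> str:
    """Illustrative scalar UDF body: keep the first character of the
    local part and the domain, masking the rest for reporting layers."""
    local, _, domain = email.partition("@")
    if not domain:
        return "***"  # not an email; mask entirely
    return local[:1] + "***@" + domain

print(mask_email("jane.doe@example.com"))  # j***@example.com
```

Keeping such logic in a UDF (rather than duplicating it in every query) is one way the "well-modeled data for analytics" responsibility above gets enforced centrally.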

Posted 3 weeks ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Hyderabad, Pune, Delhi / NCR

Hybrid


Responsibilities:
- Create documentation and user stories.
- Work with engineering teams to review upcoming and backlog Jira tickets.
- Provide guidance on design decisions in credit and in technology areas including Snowflake and Streamlit.
- Develop reporting in Power BI.

Required Candidate Profile:
- 5+ years of experience as a business analyst, especially in alternative assets, credit, CLOs, real estate, etc.
- Experience creating complex dashboards in Power BI.
- Exposure to Snowflake and Streamlit.

Posted 3 weeks ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
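One of the questions above asks about Snowflake's time travel, which lets you query a table as it existed at an earlier point (e.g., `SELECT ... AT` with a timestamp or offset). The real feature works at the storage layer; the toy Python class below is only an analogy for the "as of" semantics, with invented names, that may help when explaining the concept in an interview.

```python
class TimeTravelTable:
    """Toy analogy for time travel: every write is kept with its
    timestamp, and reads can ask for the state as of an earlier time."""

    def __init__(self):
        self._versions = []  # (ts, snapshot) pairs, appended in ts order

    def write(self, ts, snapshot):
        self._versions.append((ts, dict(snapshot)))

    def as_of(self, ts):
        # Return the latest snapshot written at or before ts, else None.
        state = None
        for version_ts, snapshot in self._versions:
            if version_ts <= ts:
                state = snapshot
        return state

t = TimeTravelTable()
t.write(1, {"balance": 100})
t.write(5, {"balance": 70})
print(t.as_of(3))  # {'balance': 100}
print(t.as_of(9))  # {'balance': 70}
```

In Snowflake itself, retention is bounded (up to 90 days depending on edition and table settings), which is worth mentioning alongside the mechanism.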

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
