
1714 Snowflake Jobs - Page 49

JobPe aggregates listings for easy browsing, but applications are submitted directly on the original job portal.

7 - 9 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Teradata BI
Minimum experience: 7.5 years
Educational Qualification: Graduation

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Snowflake Data Warehouse. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable and efficient solutions.

Roles & Responsibilities:
- Design, develop, and maintain scalable and efficient applications using Snowflake Data Warehouse.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions that meet business process and application requirements.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure application quality through unit, integration, and performance testing.

Professional & Technical Skills:
- Must have: Strong experience in Snowflake Data Warehouse.
- Good to have: Experience in Teradata BI.
- Experience in designing, building, and configuring applications to meet business process and application requirements.
- Experience in developing and maintaining technical documentation.
- Experience in unit, integration, and performance testing.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science or a related field and a proven track record of delivering impactful solutions.
- This position is based at our Bengaluru office.
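For context on the day-to-day tooling a role like this involves, here is a minimal, illustrative sketch of querying Snowflake from Python with the snowflake-connector-python package. All connection values are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: connect to Snowflake and run a query.
# Every connection value below is a hypothetical placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="your_account_identifier",  # e.g. "abc12345.ap-south-1"
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")  # any Snowflake SQL can be issued here
    print(cur.fetchone())
finally:
    conn.close()
```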

Posted 2 months ago

Apply

7 - 11 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: dbt (Data Build Tool)
Minimum experience: 7.5 years
Educational Qualification: Graduate

Summary: As a Data Platform Engineer, you will assist with the blueprint and design of the data platform components, ensuring cohesive integration between systems and data models. Your typical day will involve working with Snowflake Data Warehouse and collaborating with Integration Architects and Data Architects.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components, ensuring cohesive integration between systems and data models.
- Collaborate with Integration Architects and Data Architects to ensure the successful implementation of the data platform blueprint.
- Develop and maintain data pipelines using Snowflake Data Warehouse and other relevant data build tools.
- Ensure data quality and integrity by implementing appropriate data validation and testing procedures (a small validation sketch follows this listing).
- Optimize data storage and retrieval processes for efficient and effective data management.

Professional & Technical Skills:
- Must have: Strong experience with Snowflake Data Warehouse.
- Good to have: Experience with dbt (Data Build Tool).
- Experience with data modeling and database design principles.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data validation and testing procedures.
- Proficiency in SQL and other relevant programming languages.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science, data engineering, or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.
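The data-validation responsibility above usually reduces to a few reusable checks. Below is a hedged sketch of row-count and null-key checks run through a standard DB-API cursor (for example, one obtained from the Snowflake connector); the table and column names are hypothetical.

```python
# Illustrative post-load data-quality checks.
# Table and column names are hypothetical; adapt to your own model.
def validate_load(cursor, table: str, key_column: str) -> None:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    total = cursor.fetchone()[0]
    if total == 0:
        raise ValueError(f"{table} is empty after load")

    cursor.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
    nulls = cursor.fetchone()[0]
    if nulls:
        raise ValueError(f"{table}.{key_column} has {nulls} NULL keys")
```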

Posted 2 months ago

Apply

7 - 12 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: dbt (Data Build Tool)
Minimum experience: 7.5 years
Educational Qualification: Graduate

Summary: As a Data Platform Engineer, you will assist with the blueprint and design of the data platform components, ensuring cohesive integration between systems and data models. Your typical day will involve working with Snowflake Data Warehouse and collaborating with Integration Architects and Data Architects.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components, ensuring cohesive integration between systems and data models.
- Collaborate with Integration Architects and Data Architects to ensure the successful implementation of the data platform blueprint.
- Develop and maintain data pipelines using Snowflake Data Warehouse and other relevant data build tools.
- Ensure data quality and integrity by implementing appropriate data validation and testing procedures.
- Optimize data storage and retrieval processes for efficient and effective data management.

Professional & Technical Skills:
- Must have: Strong experience with Snowflake Data Warehouse.
- Good to have: Experience with dbt (Data Build Tool).
- Experience with data modeling and database design principles.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data validation and testing procedures.
- Proficiency in SQL and other relevant programming languages.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science, data engineering, or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: dbt (Data Build Tool)
Minimum experience: 5 years
Educational Qualification: Graduate

Summary: As a Data Platform Engineer, you will assist with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models using Snowflake Data Warehouse and dbt (Data Build Tool).

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Snowflake Data Warehouse.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using dbt (Data Build Tool).
- Design and implement data security and access controls (see the sketch after this listing).
- Optimize and tune the performance of the data platform components.

Professional & Technical Skills:
- Must have: Experience with Snowflake Data Warehouse.
- Must have: Strong understanding of data modeling and database design principles.
- Good to have: Experience with dbt (Data Build Tool).
- Experience with data security and access controls.
- Experience with performance optimization and tuning of data platform components.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.
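As a rough illustration of the data security and access controls bullet, the sketch below provisions a read-only role in Snowflake through a Python cursor. Role, warehouse, and object names are invented for the example; real projects would follow their own naming and governance standards.

```python
# Sketch: provision a read-only Snowflake role from Python.
# All role/warehouse/database names are hypothetical.
READ_ONLY_GRANTS = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE COMPUTE_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
]

def apply_read_only_grants(cursor) -> None:
    for stmt in READ_ONLY_GRANTS:
        cursor.execute(stmt)  # IF NOT EXISTS keeps the role creation idempotent
```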

Posted 2 months ago

Apply

7 - 11 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: dbt (Data Build Tool)
Minimum experience: 7.5 years
Educational Qualification: Graduate

Summary: As a Data Platform Engineer, you will assist with the blueprint and design of the data platform components, ensuring cohesive integration between systems and data models. Your typical day will involve working with Snowflake Data Warehouse and collaborating with Integration Architects and Data Architects.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components, ensuring cohesive integration between systems and data models.
- Collaborate with Integration Architects and Data Architects to ensure the successful implementation of the data platform blueprint.
- Develop and maintain data pipelines using Snowflake Data Warehouse and other relevant data build tools.
- Ensure data quality and integrity by implementing appropriate data validation and testing procedures.
- Optimize data storage and retrieval processes for efficient and effective data management.

Professional & Technical Skills:
- Must have: Strong experience with Snowflake Data Warehouse.
- Good to have: Experience with dbt (Data Build Tool).
- Experience with data modeling and database design principles.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data validation and testing procedures.
- Proficiency in SQL and other relevant programming languages.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science, data engineering, or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 2 months ago

Apply

5 - 9 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: dbt (Data Build Tool)
Minimum experience: 5 years
Educational Qualification: Graduate

Summary: As a Data Platform Engineer, you will assist with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models using Snowflake Data Warehouse and dbt (Data Build Tool).

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Snowflake Data Warehouse.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines using dbt (Data Build Tool).
- Design and implement data security and access controls.
- Optimize and tune the performance of the data platform components.

Professional & Technical Skills:
- Must have: Experience in Snowflake Data Warehouse.
- Good to have: Experience in dbt (Data Build Tool).
- Strong understanding of data modeling and database design principles.
- Experience in designing and implementing data security and access controls.
- Experience in optimizing and tuning the performance of data platform components.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 2 months ago

Apply

4 - 8 years

10 - 20 Lacs

Ahmedabad

Work from Office

We are looking for an experienced Senior Data Engineer to join our team. The candidate will be a key member of our data engineering team, responsible for developing and optimizing scalable data pipelines, ensuring data integrity, and providing a robust infrastructure that serves as the foundation for all data-driven activities.

What we expect from a Senior Data Engineer:
- Design, develop, and maintain data pipelines to extract, transform, and load data from various sources into a centralized data store.
- Create and maintain scalable data infrastructure, including databases, data warehouses, and other storage and processing systems.
- Ensure data pipelines and infrastructure are optimized for efficiency and can handle large volumes of data without compromising performance.
- Enforce data validation, verification processes, and security measures such as encryption and access controls to ensure the accuracy, completeness, and security of data.
- Work closely with data scientists, analysts, and business stakeholders to understand data needs and develop solutions that support their analytical and operational needs.
- Provide technical guidance and mentorship to junior data engineers, fostering a collaborative learning environment.
- Create comprehensive documentation for data infrastructure and processes, ensuring they are understandable and maintainable by other team members.
- Continuously explore emerging technologies and best practices in data engineering to keep the team and the organization at the cutting edge.

Technical Skills:
- Experience with ETL tools and data pipeline frameworks (e.g., dbt, Apache Spark, Apache Kafka); a PySpark sketch follows this listing.
- Proficiency in SQL for data extraction and manipulation.
- Familiarity with cloud-based data platforms (e.g., Snowflake, AWS, Azure, Google BigQuery).
- Experience with data-replication tools and platforms (e.g., Stitch, Fivetran).
- Programming experience in Python or Scala.
- Strong problem-solving and analytical skills with a proven ability to optimize data architecture.
- Experience in developing solutions that support high-volume data processing and real-time analytics.
- Excellent communication and collaboration skills, capable of working with cross-functional teams in a dynamic environment.
- Ability to translate complex data engineering concepts into actionable insights for stakeholders.

Education: B.S. or M.S. in Computer Science, Data Engineering, or a related field, or equivalent experience.
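To make the pipeline expectations concrete, here is a minimal PySpark extract-transform-load sketch of the kind this role describes. The S3 paths and column names are hypothetical placeholders.

```python
# Minimal PySpark ETL sketch: read raw CSV, aggregate, write Parquet.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

daily_revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)
```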

Posted 2 months ago

Apply

6 - 11 years

8 - 14 Lacs

Pune

Work from Office

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow: informed and validated by science and data, superpowered by creativity and design, and all underpinned by technology created with purpose.

About The Role: Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

About The Role - Grade Specific: The role supports the team in building and maintaining data infrastructure and systems within an organization.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Spark Code Optimization, Shell Script, Snowflake, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 2 months ago

Apply

3 - 5 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems using Snowflake Data Warehouse and Oracle PL/SQL.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing using Snowflake Data Warehouse and Oracle PL/SQL.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems (an illustrative upsert follows this listing).
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain technical documentation for data solutions, including data models, data dictionaries, and ETL processes.

Professional & Technical Skills:
- Must have: Experience with Snowflake Data Warehouse.
- Good to have: Experience with Oracle PL/SQL.
- Strong understanding of ETL (extract, transform, load) processes.
- Experience with data modeling and data dictionary development.
- Experience with data quality and data governance best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science, information technology, or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Chennai office.
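Migrating and deploying data across systems usually comes down to idempotent upserts. Below is a hedged sketch of a Snowflake MERGE issued from Python; the database, schema, and column names are hypothetical.

```python
# Sketch: idempotent upsert from a staging table into a target table.
# Object and column names are hypothetical placeholders.
MERGE_SQL = """
MERGE INTO ANALYTICS.PUBLIC.CUSTOMERS AS tgt
USING ANALYTICS.STAGING.CUSTOMERS AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
    tgt.name = src.name,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_at)
    VALUES (src.customer_id, src.name, src.updated_at)
"""

def upsert_customers(cursor) -> None:
    cursor.execute(MERGE_SQL)  # safe to re-run: matched rows update, new rows insert
```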

Posted 2 months ago

Apply

6 - 11 years

10 - 20 Lacs

Pune, Bengaluru

Work from Office

Sr Solutions Engineer/Data Engineer, Aligned Automation. Based in the Pune, Maharashtra, India office.

About the job: A "Better Together" philosophy towards building a better world. Aligned Automation is a strategic service provider that partners with Fortune 500 leaders to digitize enterprise operations and enable business strategies. We believe we can create positive, lasting change in the way our clients work while advancing the global impact of their business solutions for a more optimistic and better world. We are passionate about building and sustaining an inclusive and equitable workplace where all people can develop and thrive. Enriched by our 4Cs (Care, Courage, Curiosity, and Collaboration), our culture supports solutions that empower the possible.

Mid-level position based out of Pune (6-18 years).

Senior Solutions Engineer/Data Engineer Job Description: We are looking for a Data Engineer strong in SQL, ETL, complex SQL, and cloud (Azure).

In this role, you will:
- Engage in all aspects of a project to design the Python-based solution end to end.
- Successfully translate functional designs into technical solutions using best practices.
- Create functional and technical specification documents by understanding customer requirements.
- Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated.
- Participate in quality reviews and implement quality norms.
- Ensure the project complies with software quality processes and adheres to the defined timelines.
- Be accountable for ensuring that the business and technical architecture of the delivered solution matches customer technical and functional requirements, and commit to customer success (realization of business benefit).

Preferred Qualifications:
- Around 5-8 years of experience working in Python to implement automation, API creation, ETLs, and visualization (a minimal ETL sketch follows this listing).
- Experience in ETL.
- Expertise in any database, writing complex SQL queries, stored procedures, views, and triggers.
- Experience in cloud technologies such as Azure, AWS, or GCP.
- Experience in DevOps and CI/CD pipelines.
- A superior aptitude for analytical concepts, oral and written communication skills, teamwork abilities, and integrity.

Good to have:
- Experience in big data technologies such as Hadoop and Hive.
- Experience in machine learning and NLP.

Thanks & Regards,
Durba Kumari, Sr. HR Associate
durba.kumari@alignedautomation.com
AlignedAutomation.com
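Since the role centers on Python-based automation and ETL, here is a small illustrative extract-and-transform step using requests and pandas. The API URL and fields are hypothetical, not part of this posting.

```python
# Illustrative ETL step: pull JSON from an API, tidy it with pandas,
# and stage it as CSV. URL and fields are hypothetical.
import pandas as pd
import requests

def extract_transform(url: str) -> pd.DataFrame:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    df = pd.DataFrame(resp.json())
    df.columns = [c.strip().lower() for c in df.columns]  # normalize headers
    return df.drop_duplicates()

if __name__ == "__main__":
    frame = extract_transform("https://api.example.com/v1/orders")
    frame.to_csv("orders_staged.csv", index=False)
```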

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Noida

Work from Office

Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support data analytics and machine learning initiatives.
- Utilize workflow management platforms to orchestrate and automate data engineering pipelines (a minimal example follows this listing).
- Implement data models and schemas to support data storage, retrieval, and analysis.
- Work closely with data scientists and analysts to understand data requirements and provide data engineering solutions.
- Collaborate with software engineers to integrate data pipelines with existing systems and applications.
- Implement data quality monitoring and validation processes to ensure data integrity and accuracy.
- Optimize data infrastructure and performance to meet scalability and performance requirements.
- Set up instrumentation and monitoring to track production health parameters and proactively alert in case of any performance degradation.

Qualifications:
- Over 5 years of experience in data engineering or a related role.
- Strong experience with the Google Cloud platform and its data services (BigQuery, Airflow).
- Strong experience working with relational databases such as SQL Server, as well as NoSQL databases such as MongoDB.
- Proficiency in Python for scripting and data manipulation.
- Solid understanding of data modeling concepts and database technologies (e.g., SQL, NoSQL, relational databases).
- Experience with monitoring tools such as Datadog is a plus.
- Experience with data warehouse solutions such as Snowflake, Redshift, or BigQuery.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
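The orchestration bullet above typically means a tool like Airflow, which the qualifications call out. Here is a minimal, hedged Airflow 2.x DAG sketch; the task logic and names are placeholders.

```python
# Minimal Airflow 2.x DAG: two sequential tasks on a daily schedule.
# Task bodies and IDs are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def load():
    print("load data into the warehouse (e.g., BigQuery)")

with DAG(
    dag_id="daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # run extract before load
```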

Posted 2 months ago

Apply

6 - 10 years

8 - 12 Lacs

Bengaluru

Work from Office

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 492,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at our website.

About Global Network: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, every business is a digital business. Digital is changing the way organizations engage with their employees, business partners, customers, and communities; how they manufacture and deliver products and services; and how they run their organizations. This is our unique differentiator. We seek people who recognize and understand the impact that digital and technology have on every industry and every sector, and who share our passion to shape unique strategies that allow our clients to succeed in this environment. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Global Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Approximately 10,000 consultants are part of this rapidly expanding network, providing specialized and strategic industry and functional consulting expertise from key locations around the world. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Practice Overview:
Skill/Operating Group: Technology Consulting
Level: Consultant
Location: Gurgaon/Mumbai/Bangalore/Kolkata/Pune
Travel Percentage: Expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
- Architect large-scale data lake, DW, and Delta Lake on cloud solutions using AWS, Azure, GCP, Ali Cloud, Snowflake, Hadoop, or Cloudera.
- Design Data Mesh strategy and architecture.
- Build the strategy and roadmap for data migration to cloud.
- Establish the Data Governance strategy and operating model.
- Implement programs and interventions that prepare the organization for the adoption of new business processes.
- Apply a deep understanding of data and analytics platforms and data integration with cloud.
- Provide thought leadership to the downstream teams for developing offerings and assets.
- Identify, assess, and solve complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
- Oversee the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models.
- Apply your solid understanding of data, data on cloud, and disruptive technologies.
- Drive enterprise business, application, and integration architecture.
- Help solve key business problems and challenges by enabling a cloud-based architecture transformation, painting a picture of, and charting a journey from, the current state to a "to-be" enterprise environment.
- Assist our clients in building the required capabilities for growth and innovation to sustain high performance.
- Manage multi-disciplinary teams to shape, sell, communicate, and implement programs.
- Participate in client presentations and orals for proposal defense.
- Effectively communicate the target state, architecture, and topology on cloud to clients.

Qualifications:
- Bachelor's degree; MBA from a Tier-1 college preferable.
- 6-10 years of large-scale consulting experience and/or experience working with high-tech companies in data architecture, data governance, data mesh, data security, and management.
- Certified in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics.

Experience: We are looking for experienced professionals with data strategy, data architecture, data on cloud, data modernization, data operating model, and data security experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, or Mining and Resources.

Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes:
- Data SME: Experience in deal shaping and strong presentation skills, proposal leadership, and customer orals; technical understanding of data platforms, data on cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling.
- Data on Cloud Architect: Technical understanding of data platform strategy for data on cloud migrations and big data technologies; experience in architecting large-scale data lake and DW on cloud solutions; experience with one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera.
- Data Strategy: Data capability maturity assessment, data & analytics/AI strategy, data operating model and governance, data hub enablement, data on cloud strategy, and data architecture strategy.
- Data Transformation Lead: Understanding of the data supply chain and data platforms on cloud; experience in conducting alignment workshops, building value-realization frameworks for data transformations, and program management.
- Exceptional interpersonal and presentation skills, with the ability to convey technology and business value propositions to senior stakeholders.
- Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.

Other desired skills:
- Strong desire to work in technology-driven business transformation.
- Strong knowledge of technology trends across IT and digital, and how they can be applied to companies to address real-world problems and opportunities.
- Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences.
- Experience leading proof-of-concept and/or pilot implementations and defining the plan to scale them across multiple technology domains.
- Flexibility to accommodate client travel requirements.
- Published thought leadership (whitepapers, POVs).

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Andhra Pradesh

Work from Office

JD:
- 7+ years of hands-on experience in Python, especially with Pandas and NumPy.
- Good hands-on experience in Spark, PySpark, and Spark SQL.
- Hands-on experience in Databricks: Unity Catalog, Delta Lake, Lakehouse Platform, and Medallion Architecture.
- Experience with Azure Data Factory and ADLS.
- Experience in dealing with Parquet and JSON file formats (a short pandas example follows this listing).
- Knowledge of Snowflake.
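As a quick illustration of the Parquet/JSON point, the sketch below converts newline-delimited JSON to Parquet with pandas (which needs pyarrow or fastparquet installed). File and column names are hypothetical.

```python
# Sketch: newline-delimited JSON in, Parquet out.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_json("events.json", lines=True)       # one JSON record per line
df["event_ts"] = pd.to_datetime(df["event_ts"])    # assumed timestamp column
df.to_parquet("events.parquet", index=False)       # requires pyarrow or fastparquet
```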

Posted 2 months ago

Apply

12 - 16 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Architecture
Good-to-have skills: NA
Minimum experience: 12 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will be responsible for leading the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve working with Microsoft Azure Architecture and collaborating with cross-functional teams to ensure successful project delivery.

Roles & Responsibilities:
- Lead the design, development, and deployment of applications using Microsoft Azure Architecture.
- Collaborate with cross-functional teams to ensure successful project delivery, including project planning, resource allocation, and risk management.
- Act as the primary point of contact for all application-related issues, providing technical guidance and support to team members.
- Ensure adherence to best practices and standards for application development, including code reviews, testing, and documentation.
- Stay updated with the latest advancements in Microsoft Azure Architecture and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Coordination with the local team (30%).
- Steering design and architecture.
- Data modelling, SQL, Data Factory, Snowflake.
- Experience with Agile development methodologies.
- Strong leadership and communication skills, with the ability to effectively collaborate with cross-functional teams.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Microsoft Azure Architecture.
- The ideal candidate will have a strong educational background in computer science or a related field and a proven track record of delivering impactful solutions.
- This position is based at our Bengaluru office.

Posted 2 months ago

Apply

5 - 9 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: Graduation

Summary: As an Application Lead for Data Engineering, you will be responsible for leading the design, build, and configuration of applications using Snowflake Data Warehouse. Your typical day will involve collaborating with cross-functional teams, ensuring data quality and integrity, and delivering impactful data-driven solutions.

Roles & Responsibilities:
- Lead the design, build, and configuration of applications using Snowflake Data Warehouse, acting as the primary point of contact.
- Collaborate with cross-functional teams to ensure data quality and integrity, utilizing strong communication and interpersonal skills.
- Develop and maintain data pipelines, ensuring efficient and effective data processing and storage.
- Implement and maintain data security and privacy measures, adhering to industry standards and best practices.
- Stay updated with the latest advancements in data engineering, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must have: Strong experience in Snowflake Data Warehouse.
- Good to have: Experience with other cloud-based data warehousing solutions such as AWS Redshift or Google BigQuery.
- Experience in designing and implementing data pipelines using ETL/ELT tools such as Apache Airflow or Talend.
- Strong understanding of data modeling and database design principles.
- Experience in implementing data security and privacy measures, adhering to industry standards and best practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will have a strong educational background in computer science, information technology, or a related field.
- This position is based at our Kolkata office.

Posted 2 months ago

Apply

5 - 8 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Python (programming language)
Good-to-have skills: Snowflake Data Warehouse
Minimum experience: 5 years
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Python. Your typical day will involve working with Snowflake Data Warehouse and utilizing your expertise in Python to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Python.
- Collaborate with cross-functional teams to identify and prioritize business requirements and translate them into technical solutions.
- Develop and maintain technical documentation related to application design, configuration, and testing.
- Perform unit testing and support system and integration testing to ensure quality deliverables.

Professional & Technical Skills:
- Must have: Proficiency in Python.
- Good to have: Experience with Snowflake Data Warehouse.
- Strong understanding of software engineering principles and best practices.
- Experience with application development frameworks such as Flask or Django (a minimal Flask sketch follows this listing).
- Experience with database technologies such as SQL and NoSQL.
- Solid grasp of software testing methodologies and tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python.
- The ideal candidate will have a strong educational background in computer science or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Hyderabad office.
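For the Flask/Django bullet, here is a minimal Flask sketch of the kind of service endpoint such roles involve. Routes and data are invented for illustration; a real service would query the SQL/NoSQL stores the posting mentions.

```python
# Minimal Flask sketch: a health check and one parameterized endpoint.
# Routes and payloads are hypothetical placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    return jsonify(status="ok")

@app.route("/customers/<int:customer_id>")
def get_customer(customer_id: int):
    # A real implementation would look this up in a database.
    return jsonify(customer_id=customer_id, name="example")

if __name__ == "__main__":
    app.run(debug=True)
```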

Posted 2 months ago

Apply

4 - 6 years

25 - 30 Lacs

Bengaluru

Work from Office

- 3+ years of work experience in Python programming for AI/ML, deep learning, and generative AI model development.
- Proficiency in TensorFlow/PyTorch, Hugging Face Transformers, and LangChain libraries (a small Transformers example follows this listing).
- Hands-on experience with NLP, LLM prompt design and fine-tuning, embeddings, vector databases, and agentic frameworks.
- Strong understanding of ML algorithms, probability, and optimization techniques.
- 6+ years of experience in deploying models with Docker, Kubernetes, and cloud services (AWS Bedrock, SageMaker, GCP Vertex AI) through APIs, and using MLOps and CI/CD pipelines.
- Familiarity with retrieval-augmented generation (RAG), cache-augmented generation (CAG), retrieval-integrated generation (RIG), and low-rank adaptation (LoRA) fine-tuning.
- Ability to write scalable, production-ready ML code and optimized model inference.
- Experience developing ML pipelines for text classification, summarization, and chat agents.
- Prior experience with SQL and NoSQL databases, and Snowflake/Databricks.
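As a small taste of the Hugging Face stack listed above, here is an illustrative text-generation snippet using the transformers pipeline API. The model name (gpt2) is just a compact public example, not a requirement of the role.

```python
# Illustrative Hugging Face usage: load a small public model and generate text.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # example model only
out = generator(
    "Snowflake is a cloud data platform that",
    max_new_tokens=40,
)
print(out[0]["generated_text"])
```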

Posted 2 months ago

Apply

3 - 5 years

5 - 8 Lacs

Hyderabad

Work from Office

Position Summary: The Data Engineering Senior Analyst demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence, and a product mindset. This individual will act as a key contributor on the team to design, build, test, and deliver large-scale software applications, systems, platforms, services, or technologies in the data engineering space, and will have the opportunity to work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery. The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements, develop, and implement. The individual must have strong analytical and technical skills coupled with the ability to positively influence the delivery of data engineering products. The applicant will work in a team that demands an innovation-driven, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence, and will work with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering, demonstrating very strong technical and communication skills.

Delivery: Intermediate delivery skills, including the ability to deliver work at a steady, predictable pace to achieve commitments, decompose work assignments into small batch releases, and contribute to tradeoff and negotiation discussions.

Domain Expertise: A demonstrated track record of domain expertise, including the ability to understand the technical concepts necessary to do the job effectively, willingness, cooperation, and concern for business issues, and in-depth knowledge of the immediate systems worked on.

Problem Solving: Proven problem-solving skills, including debugging skills that allow you to determine the source of issues in unfamiliar code or systems, the ability to recognize and solve repetitive problems rather than working around them, to recognize mistakes and use them as learning opportunities, and to break down large problems into smaller, more manageable ones.

Job Description & Responsibilities: The candidate will be responsible for delivering business needs end to end, from requirements through development into production. Through a hands-on engineering approach in the Databricks environment, this individual will deliver data engineering toolchains, platform capabilities, and reusable patterns. The applicant will follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset, ensure adherence to enterprise architecture direction and architectural standards, and should be able to collaborate in a high-performing team environment with an ability to influence and be influenced by others.

Experience Required:
- 3 to 5 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation.
- More than 3 years of experience in Databricks within an AWS environment.
- Data engineering experience.

Experience Desired:
- Expertise in Agile software development principles and patterns.
- Expertise in building streaming, batch, and event-driven architectures and data pipelines.

Primary Skills:
- Cloud-based security principles and protocols such as OAuth2, JWT, data encryption, hashing, and secret management.
- Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, and Glue.
- Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation (a streaming sketch follows this listing).
- Expertise in building cloud-native microservices, containers, Kubernetes, and platform-as-a-service technologies such as OpenShift and Cloud Foundry.
- Experience in multi-cloud software-as-a-service products such as Databricks and Snowflake.
- Experience in Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation.
- Experience in messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, and AWS SNS.
- Experience in API and microservices stacks such as Spring Boot and Quarkus.
- Expertise in cloud technologies such as AWS Glue, Lambda, S3, Elasticsearch, API Gateway, and CloudFront.
- Experience with one or more of the following programming and scripting languages: Python, Scala, JVM-based languages, or JavaScript, and the ability to pick up new languages.
- Experience in building CI/CD pipelines using Jenkins and GitHub Actions.
- Strong expertise with source code management and its best practices.
- Proficiency in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD).
- Knowledge of the Behavior-Driven Development (BDD) approach.

Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments.
- Strong oral and written communication skills.
- Ability to think strategically, implement iteratively, and estimate the financial impact of design/architecture alternatives.
- Continuous focus on ongoing learning and development.
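For the Kafka/Spark Structured Streaming bullet, here is a hedged sketch of a streaming read from Kafka into Parquet files. The broker, topic, and paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Sketch: Spark Structured Streaming from Kafka to Parquet.
# Broker, topic, and S3 paths are hypothetical; requires the
# spark-sql-kafka connector on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(F.col("value").cast("string").alias("payload"))  # raw message body
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/streams/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .start()
)
query.awaitTermination()
```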

Posted 2 months ago

Apply

7 - 9 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the smooth functioning of applications and meeting the needs of the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME); collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Collaborate with cross-functional teams to gather and analyze user requirements.
- Develop and implement software solutions using Snowflake Data Warehouse.
- Perform code reviews and ensure adherence to coding standards.
- Troubleshoot and debug applications to resolve issues.
- Ensure the smooth functioning of applications and address any performance or scalability concerns.

Professional & Technical Skills:
- Must have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of database concepts and SQL.
- Experience in designing and developing data models.
- Knowledge of ETL processes and data integration techniques.
- Experience with cloud-based data warehousing solutions.
- Good to have: Experience with AWS or Azure cloud platforms.
- Familiarity with data governance and security best practices.
- Experience with data migration and data transformation projects.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 months ago

Apply

5 - 10 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica Intelligent Cloud Services
Good-to-have skills: Informatica PowerCenter, Snowflake Data Warehouse
Minimum experience: 5 years
Educational Qualification: Must be a full-time graduate

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Informatica Intelligent Cloud Services. Your typical day will involve working with Informatica Intelligent Cloud Services, Informatica PowerCenter, and Snowflake Data Warehouse to develop and deploy applications that meet business needs.

Roles & Responsibilities:
- Design, build, and configure applications using Informatica Intelligent Cloud Services to meet business process and application requirements.
- Collaborate with cross-functional teams to understand business needs and translate them into technical requirements.
- Develop and deploy applications using Informatica PowerCenter and Snowflake Data Warehouse.
- Ensure the performance, security, and scalability of applications by conducting thorough testing and debugging.
- Stay updated with the latest advancements in Informatica Intelligent Cloud Services and related technologies to integrate innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must have: Experience with Informatica Intelligent Cloud Services.
- Good to have: Experience with Informatica PowerCenter and Snowflake Data Warehouse.
- Strong understanding of data engineering concepts and principles.
- Experience with data integration, data warehousing, and ETL processes.
- Experience with SQL and database management systems.
- Solid grasp of software development life cycle (SDLC) methodologies and agile development practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica Intelligent Cloud Services.
- The ideal candidate will have a strong educational background in computer science, information technology, or a related field and a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 2 months ago

Apply

4 - 7 years

5 - 13 Lacs

Bengaluru

Work from Office

Data Engineer. Please find the JD below for the ETL Snowflake developer role:
- 4-5 years of strong experience in advanced SQL on any database (preferably Snowflake, Oracle, or Teradata).
- Extensive experience in data integration and hands-on experience in any of the ETL tools, such as DataStage, Informatica, or SnapLogic.
- Able to transform technical requirements into data-collection queries.
- Capable of working with business and other IT teams and converting the requirements into queries.
- Good understanding of ETL architecture and design.
- Good knowledge of UNIX commands, databases, SQL, and PL/SQL.
- Good to have: experience with AWS Glue.
- Good to have: knowledge of the Qlik replication tool.

Posted 2 months ago

Apply

3 - 5 years

6 - 8 Lacs

Hyderabad

Work from Office

Position Summary: Cigna, a leading health services company, is looking for an exceptional engineer in our Data & Analytics Engineering organization. The Full Stack Engineer is responsible for the delivery of a business need, starting from understanding the requirements through deploying the software into production. This role requires fluency in some of the critical technologies, proficiency in others, and a hunger to learn on the job and add value to the business. Critical attributes of a Full Stack Engineer, among others, are ownership, eagerness to learn, and an open mindset. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset, driving the adoption of CI/CD tools and supporting the improvement of the tool sets and processes.

Job Description & Responsibilities:
- Design and architect the solution independently.
- Take ownership and accountability.
- Write referenceable and modular code.
- Be fluent in particular areas and proficient in many areas.
- Have a passion to learn.
- Have a quality mindset: not just code quality, but ensuring ongoing data quality by monitoring data to identify problems before they have business impact.
- Take risks and champion new ideas.

Experience Required:
- 3+ years of experience in the listed skills in a Data Engineering role.
- 3+ years of Python scripting experience.
- 3+ years of data management and SQL expertise; Teradata and Snowflake experience strongly preferred.
- 3+ years as part of Agile (Scrum) teams.

Experience Desired:
- Experience with version management tools; Git preferred.
- Experience with BDD and TDD development methodologies.
- Experience working in agile CI/CD environments; Jenkins experience preferred.
- Knowledge of and/or experience with healthcare information domains preferred.

Education and Training Required: Bachelor's degree (or equivalent) required.

Primary Skills:
- Expertise with big data technologies: Hadoop, HiveQL, Spark (Scala/Python).
- Expertise in cloud technologies: AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR).

Additional Skills:
- Experience working on analytical models and their deployment/production enablement via data and analytics pipelines.

Posted 2 months ago

Apply

6 - 8 years

10 - 12 Lacs

Hyderabad

Work from Office

Position Summary: The S.E. Lead Analyst demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence, and a product mindset. This individual will act as a key contributor on the team to design, build, test, and deliver large-scale software applications, systems, platforms, services, or technologies in the data engineering space, and will have the opportunity to work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery. The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements, develop, and implement. The individual must have strong analytical and technical skills coupled with the ability to positively influence the delivery of data engineering products. The applicant will work in a team that demands an innovation-driven, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence, and will work with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering, demonstrating very strong technical and communication skills.

Job Description & Responsibilities: The candidate will be responsible for delivering business needs end to end, from requirements through development into production. Through a hands-on engineering approach in the Databricks environment, this individual will deliver data engineering toolchains, platform capabilities, and reusable patterns. The applicant will follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset, ensure adherence to enterprise architecture direction and architectural standards, and should be able to collaborate in a high-performing team environment with an ability to influence and be influenced by others.

Experience Required:
- 6-8 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation.
- More than 3 years of experience in Databricks within an AWS environment.

Experience Desired:
- Expertise in Agile software development principles and patterns.
- Expertise in building streaming, batch, and event-driven architectures and data pipelines.

Primary Skills:
- Cloud-based security principles and protocols such as OAuth2, JWT, data encryption, hashing, and secret management.
- Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, and Glue.
- Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation.
- Expertise in building cloud-native microservices, containers, Kubernetes, and platform-as-a-service technologies such as OpenShift and Cloud Foundry.
- Experience in multi-cloud software-as-a-service products such as Databricks and Snowflake.
- Experience in Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation.
- Experience in messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, and AWS SNS.
- Experience in API and microservices stacks such as Spring Boot and Quarkus.
- Expertise in cloud technologies such as AWS Glue, Lambda, S3, Elasticsearch, API Gateway, and CloudFront.
- Experience with one or more of the following programming and scripting languages: Python, Scala, JVM-based languages, or JavaScript, and the ability to pick up new languages.
- Experience in building CI/CD pipelines using Jenkins and GitHub Actions.
- Strong expertise with source code management and its best practices.
- Proficiency in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD).
- Knowledge of the Behavior-Driven Development (BDD) approach.

Additional Skills:
- Ability to perform detailed analysis of business problems and technical environments.
- Strong oral and written communication skills.

Posted 2 months ago

Apply

6 - 11 years

15 - 25 Lacs

Pune, Hyderabad, Kolkata

Hybrid

Job Description: Snowflake Data Engineer
Total Positions: 6
Job Location: Pune, Hyderabad, Kolkata (hybrid)
Notice Period: Immediate to 1 month

Snowflake Data Engineer, overall experience and profile:
- 6+ years of experience in Snowflake and Python; knowledge of Power BI is an added advantage.
- 5+ years of experience in data preparation and BI projects: understanding business requirements in a BI context and understanding the data model to transform raw data into meaningful data using Snowflake and Python.
- Designing and creating data models that define the structure and relationships of various data elements within the organization, including conceptual, logical, and physical data models, which help ensure data accuracy, consistency, and integrity.
- Designing data integration solutions that allow different systems and applications to share and exchange data seamlessly. This may involve selecting appropriate integration technologies, developing ETL (extract, transform, load) processes, and ensuring data quality during integration.
- Creating and maintaining optimal data pipeline architecture.
- Good knowledge of cloud platforms like AWS, Azure, or GCP.
- Good hands-on knowledge of Snowflake is a must, including various data ingestion methods (Snowpipe and others), Time Travel, data sharing, and other Snowflake capabilities (a Time Travel sketch follows this listing).
- Good knowledge of Python/PySpark, including advanced features of Python.
- Support for business development efforts (proposals and client presentations).
- Knowledge of EBS modules like Finance, HCM, and Procurement is an added advantage.
- Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success.
- Excellent leadership and interpersonal skills; eagerness to contribute to a team-oriented environment.
- Strong prioritization and multi-tasking skills with a track record of meeting deadlines.
- Ability to be creative and analytical in a problem-solving environment.
- Effective verbal and written communication skills.
- Adaptability to new environments, people, technologies, and processes.
- Ability to manage ambiguity and solve undefined problems.
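Time Travel is one of the Snowflake capabilities this listing names. As a rough sketch, the query below reads a table as it stood an hour ago, issued through any Snowflake cursor; the table name and offset are hypothetical.

```python
# Sketch: Snowflake Time Travel, reading a table's state one hour ago.
# The table name is a hypothetical placeholder.
TIME_TRAVEL_SQL = """
SELECT *
FROM ANALYTICS.PUBLIC.ORDERS
AT (OFFSET => -3600)  -- table state 3600 seconds in the past
LIMIT 10
"""

def read_previous_state(cursor):
    cursor.execute(TIME_TRAVEL_SQL)
    return cursor.fetchall()
```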

Posted 2 months ago

Apply

5 - 9 years

18 - 20 Lacs

Navi Mumbai

Work from Office


Position Overview: We are looking for a skilled and visionary Data Engineering Lead to join our growing team. In this role, you will lead a team of data engineers in designing, developing, and maintaining robust data pipelines and infrastructure. You will work closely with cross-functional teams to support data-driven decision-making and ensure the availability, quality, and integrity of our data assets.

Role & responsibilities:
  • Build, develop, and maintain efficient, high-performance data pipelines across both cloud and on-premises environments.
  • Ensure the accuracy, adequacy, and legitimacy of data.
  • Prepare ETL pipelines to extract data from various sources and store it in a centralized location.
  • Analyse, interpret, and present results through effective visualization and reports; identify critical metrics.
  • Implement and instill best practices for effective data management.
  • Monitor the use of data systems and ensure the correctness, completeness, and availability of data services.
  • Optimize data infrastructure and processes for cost efficiency on AWS cloud and on-premises environments.
  • Utilize Apache Airflow, NiFi, or equivalent tools to build and manage data workflows and integrations (a minimal Airflow sketch follows this listing).
  • Implement best practices for data governance, security, and compliance.
  • Monitor and troubleshoot data pipeline issues to ensure timely resolution.
  • Stay current with industry trends and emerging technologies in data engineering and cloud computing.
  • Lead and mentor a team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement.
  • Define team objectives and performance metrics, conducting regular performance evaluations and providing constructive feedback.
  • Facilitate knowledge sharing and professional development within the team.

Preferred candidate profile:
  • 6-9 years of proven experience as a Data Engineering Lead or in a similar role.
  • Extensive hands-on experience with ETL and ELT processes.
  • Strong expertise in data integrity and quality assurance.
  • Proficiency in optimizing AWS cloud services and on-premises infrastructure for cost and performance.
  • Hands-on experience with Apache Airflow and NiFi.
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Experience with SQL and NoSQL databases.
  • Experience in building and maintaining a single source of truth.
  • Familiarity with data warehousing solutions such as Amazon Redshift, Snowflake, or BigQuery.
  • Strong problem-solving skills and the ability to work under pressure.
  • Hands-on experience with data visualization tools such as Tableau, Power BI, or Looker.
  • Experience in financial services is a must.

Skills Required:
  • Team leading and team management.
  • Strong analytical and problem-solving abilities.
  • Excellent communication and interpersonal skills.
  • Ability to work collaboratively in a fast-paced environment and manage multiple priorities.

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
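To ground the Airflow point, here is a minimal DAG sketch for a daily extract-and-load task using the Airflow 2.x API. The DAG id, schedule, and task body are illustrative assumptions, not details from this listing.

# Minimal Apache Airflow 2.x DAG sketch; dag_id, schedule, and task logic
# are illustrative placeholders, not details from the listing.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for real extract/load logic, e.g. pull from a source API
    # and write to a warehouse staging table.
    print("extract-and-load step ran")


with DAG(
    dag_id="daily_extract_load",       # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )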

Posted 2 months ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:
  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:
  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake (a short sketch follows this list). (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
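For the virtual warehouse question above, a minimal sketch: the statements below create, resize, and suspend an XSMALL warehouse, illustrating that compute is provisioned and scaled independently of storage. The warehouse name and sizes are placeholders, and cur is assumed to be a snowflake-connector-python cursor like the one in the earlier Snowflake sketch.

# Hedged sketch for the virtual warehouse question; warehouse name and sizes
# are placeholders. `cur` is a snowflake.connector cursor, obtained as in the
# earlier connection sketch.
def demo_virtual_warehouse(cur):
    # Creating a warehouse allocates suspendable compute only; it does not
    # touch stored data, which lives separately in Snowflake's storage layer.
    cur.execute("""
        CREATE WAREHOUSE IF NOT EXISTS demo_wh
        WITH WAREHOUSE_SIZE = 'XSMALL'
             AUTO_SUSPEND = 60      -- suspend after 60 idle seconds
             AUTO_RESUME = TRUE
    """)
    # Resizing changes compute capacity without moving or copying any data.
    cur.execute("ALTER WAREHOUSE demo_wh SET WAREHOUSE_SIZE = 'MEDIUM'")
    # Suspend explicitly to stop credit consumption when finished.
    cur.execute("ALTER WAREHOUSE demo_wh SUSPEND")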

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
