
1082 Snowflake Jobs - Page 40

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Analysis & Interpretation
Good-to-have skills: Snowflake Data Warehouse
Minimum experience: 5 years
Educational Qualification: Minimum 15 years of full-time education

Key Responsibilities:
1. Adopt Accenture standards and policies for working in a project and client environment.
2. Work with the Project Manager and Project Lead to get client user accounts created.
3. Lead the overall Snowflake transformation journey for the customer.
4. Design and develop the new solution in Snowflake Data Warehouse.
5. Prepare a test strategy and an implementation plan for the solution.
6. Act as an end-to-end Data Engineer.

Technical Experience:
1. 2 years of hands-on experience specifically in Snowflake Data Warehouse design and development projects.
2. 4 years of hands-on experience in SQL and PL/SQL programming.
3. 1 year of experience in JavaScript or other programming languages (Python, ReactJS, Angular).
4. Good understanding of cloud data warehouse, data warehousing, and dimensional modelling concepts.
5. 1 year of experience in ETL technologies (Informatica, DataStage, Talend, SAP BODS, Ab Initio, etc.).

Professional Attributes:
1. Fluent in English communication.
2. Has handled direct client interactions in the past.
3. Clear written communication.
4. Strong interpersonal skills.
5. Aware of European professional etiquette.

Additional Information: Exposure to AWS, Amazon S3, and other Amazon cloud hosting products related to analytics or databases.

Posted 1 month ago

Apply

7 - 12 years

3 - 7 Lacs

Bengaluru

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must-have skills: Cloud Data Architecture
Good-to-have skills: Cloud Infrastructure
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Data Architect
Kemper is seeking a Data Architect to join our team. You will work as part of a distributed team and with Infrastructure, Enterprise Data Services, and Application Development teams to coordinate the creation, enrichment, and movement of data throughout the enterprise. Your central responsibility as an architect will be improving the consistency, timeliness, quality, security, and delivery of data as part of Kemper's Data Governance framework. In addition, you will streamline data flows and optimize cost management in a hybrid cloud environment. Your duties may include assessing architectural models and supervising data migrations across IaaS, PaaS, SaaS, and on-premises systems, as well as data platform selection and onboarding of data management solutions that meet the technical and operational needs of the company. To succeed in this role, you should know how to examine new and legacy requirements, define cost-effective patterns to be implemented by other teams, and represent the required patterns during implementation projects. The ideal candidate will have proven experience in cloud (Snowflake, AWS, and Azure) architectural analysis and management.

Responsibilities:
- Define architectural standards and guidelines for data products and processes. Assess and document when and how to use existing and newly architected producers and consumers, the technologies to be used for various purposes, and models of selected entities and processes. The guidelines should encourage reuse of existing data products and address issues of security, timeliness, and quality.
- Work with Information & Insights, Data Governance, Business Data Stewards, and Implementation teams to define standard and ad-hoc data products and data product sets.
- Work with Enterprise Architecture, Security, and Implementation teams to define the transformation of data products throughout hybrid cloud environments, ensuring that both functional and non-functional requirements are addressed. This includes ownership, frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregation or calculations.
- Work with Data Governance and project teams to model and map data sources, including descriptions of the business meaning of the data, its uses, its quality, the applications that maintain it, and the technologies in which it is stored. Documentation of a data source must describe the semantics of the data so that occasional subtle differences in meaning are understood.
- Define integrative views of data to draw together data from across the enterprise. Some views will use stores of extracted data while others will bring together data in near real time. Solutions must consider data currency, availability, response times, and data volumes.
- Work with modeling and storage teams to define conceptual, logical, and physical data views, limiting technical debt as data flows through transformations.
- Investigate and lead participation in POCs of emerging technologies and practices.
- Leverage and evolve existing core data products and patterns.
- Communicate and lead understanding of data architectural services across the enterprise.
- Ensure a focus on data quality by working effectively with data and system stewards.

Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, or equivalent experience.
- A minimum of 3 years' experience in a similar role.
- Demonstrable knowledge of secure DevOps and SDLC processes.
- AWS or Azure experience required.
- Experience with Data Vault 2.0 required; Snowflake a plus.
- Familiarity with system concepts and tools within an enterprise architecture framework, including cataloging, MDM, RDM, data lakes, and storage patterns.
- Excellent organizational and analytical abilities.
- Outstanding problem solver.
- Good written and verbal communication skills.

Posted 1 month ago

Apply

3 - 5 years

0 - 2 Lacs

Bengaluru

Hybrid

Demand 1 - Mandatory Skill: 3.5-7 years (Big Data - Adobe, plus Scala, Python, Linux)
Demand 2 - Mandatory Skill: 3.5-7 years (Big Data - Snowflake (Snowpark), plus Scala, Python, Linux)

Specialist Software Engineer - Big Data

Missions: We are seeking an experienced Big Data senior developer to lead our data engineering efforts. In this role, you will design, develop, and maintain large-scale data processing systems. You will work with cutting-edge technologies to deliver high-quality solutions for data ingestion, storage, processing, and analytics. Your expertise will be critical in driving our data strategy and ensuring the reliability and scalability of our big data infrastructure.

Profile:
- 3 to 8 years of experience in application development with Spark/Scala
- Good hands-on experience working with the Hadoop ecosystem (HDFS, Hive, Spark)
- Good understanding of Hadoop file formats
- Good expertise in Hive/HDFS, PySpark, Spark, Jupyter Notebook, ELT Talend, Control-M, Unix/shell scripting, Python, CI/CD, Git/Jira, Hadoop, TOM, Oozie, and Snowflake
- Expertise in implementing data quality controls
- Ability to interpret the Spark UI, identify bottlenecks in Spark processes, and provide optimal solutions (an illustrative sketch follows this listing)

Tools:
- Ability to learn and work with tools such as IntelliJ, Git, Control-M, and SonarQube, and to onboard new frameworks into the project
- Should be able to handle projects independently

Agile:
- Good to have exposure to CI/CD processes
- Exposure to Agile methodology and processes

Others:
- Ability to understand complex business rules and translate them into technical specifications/designs
- Write highly efficient, optimized, and easily scalable code
- Adherence to coding, quality, and security standards
- Effective verbal and written communication to work closely with all stakeholders
- Should be able to convince stakeholders of proposed solutions
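To illustrate the Spark-tuning skill this listing asks for, here is a minimal PySpark sketch; the data and settings are illustrative placeholders, not part of the listing. explain() surfaces the same physical plan the Spark UI visualizes, and shuffle-partition tuning is one common remediation.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("plan-inspection-demo").getOrCreate()

    orders = spark.createDataFrame(
        [(1, "IN", 100.0), (2, "IN", 250.0), (3, "US", 75.0)],
        ["order_id", "country", "amount"],
    )

    # explain() prints the physical plan; Exchange (shuffle) nodes are the
    # usual bottleneck candidates you would also see in the Spark UI SQL tab.
    totals = orders.groupBy("country").agg(F.sum("amount").alias("total"))
    totals.explain(True)

    # One common remediation: right-size the shuffle partition count.
    spark.conf.set("spark.sql.shuffle.partitions", "64")
    totals.show()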

Posted 1 month ago

Apply

2 - 6 years

15 - 30 Lacs

Pune

Hybrid

We are on a mission to rid the world of bad customer service by mobilizing the way help is delivered. Today's consumers want an always-available customer service experience that leaves them feeling valued and respected. Helpshift helps B2B brands deliver this modern customer service experience through a mobile-first approach. We have changed how conversations take place, moving the conversation away from a slow, outdated email and desktop experience to an in-app chat experience that allows users to interact with brands in their own time. Through our market-leading AI-powered chatbots and automation, we help brands deliver instant and rapid resolutions. Because agents play a key role in delivering help, our platform gives agents superpowers with automation and AI that simply works. Companies such as Scopely, Supercell, Brex, EA, and Square, along with hundreds of other leading brands, use the Helpshift platform to mobilize customer service delivery. Over 900 million active monthly consumers are enabled on 2B+ devices worldwide with Helpshift.

Some numbers that illustrate our scale:
- 85k requests per second
- 30 ms response time
- 300 GB data transfer per hour
- 1,000 VMs deployed at peak

Role & Responsibilities:
- Build maintainable data pipelines, both for data ingestion and for operational analytics, on data collected from 2 billion devices and 900M monthly active users
- Build customer-facing analytics products that deliver actionable insights and data and easily detect anomalies
- Collaborate with data stakeholders to understand their data needs and be a part of the analysis process
- Write design specifications and test, deployment, and scaling plans for the data pipelines
- Mentor people in the team and organization

Preferred Candidate Profile:
- 3+ years of experience building and running data pipelines that scale to TBs of data
- Proficiency in a high-level object-oriented programming language (Python or Java) is a must
- Experience with cloud data platforms like Snowflake and AWS EMR/Athena is a must
- Experience building modern data lakehouse architectures using Snowflake and columnar formats like Apache Iceberg/Hudi, Parquet, etc.
- Proficiency in data modeling, SQL query profiling, and data warehousing is a must
- Experience with distributed data processing engines like Apache Spark, Apache Flink, Dataflow/Apache Beam, etc.
- Knowledge of workflow orchestrators like Airflow, Dagster, etc. is a plus
- Data visualization skills are a plus (Power BI, Metabase, Tableau, Hex, Sigma, etc.)
- Excellent verbal and written communication skills
- Bachelor's degree in Computer Science (or equivalent)

Perks and Benefits: Hybrid setup, worker's insurance, paid time off, and other employee benefits to be discussed by our Talent Acquisition team in India.

Helpshift embraces diversity. We are proud to be an equal opportunity workplace and do not discriminate on the basis of sex, race, color, age, sexual orientation, gender identity, religion, national origin, citizenship, marital status, veteran status, or disability status.

Privacy Notice: By providing your information in this application, you understand that we will collect and process your information in accordance with our Applicant Privacy Notice. For more information, please see our Applicant Privacy Notice at https://www.keywordsstudios.com/en/applicant-privacy-notice.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Snowflake Schema
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your day will involve creating innovative solutions to address business needs and ensuring applications are tailored to meet specific requirements.

Roles & Responsibilities:
- Implement Snowflake cloud data warehouse and cloud-related architecture, migrating from various sources to Snowflake.
- Work with Snowflake capabilities such as Snowpipe, stages, SnowSQL, streams, and tasks.
- Implement advanced Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, and zero-copy cloning (an illustrative sketch follows this listing).
- In-depth knowledge and experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deploy Snowflake features such as data sharing, events, and lakehouse patterns.
- Implement incremental extraction loads, both batched and streaming.
- Must have: Snowflake certification.

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse.
- Good-to-have: experience with Snowflake Schema.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management.
- Ability to troubleshoot and debug applications.

Additional Information: The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. This position is based at our Hyderabad office. 15 years of full-time education is required.
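As a flavour of the advanced concepts above, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, and object names are hypothetical placeholders, not part of the listing.

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account="my_account",   # placeholder credentials
        user="my_user",
        password="...",
        role="ACCOUNTADMIN",    # resource monitors need elevated privileges
    )
    cur = conn.cursor()

    # Resource monitor: cap monthly credits, suspend the warehouse at 100%.
    cur.execute("""
        CREATE OR REPLACE RESOURCE MONITOR dev_monitor
          WITH CREDIT_QUOTA = 100
          TRIGGERS ON 90 PERCENT DO NOTIFY
                   ON 100 PERCENT DO SUSPEND
    """)
    cur.execute("ALTER WAREHOUSE dev_wh SET RESOURCE_MONITOR = dev_monitor")

    # Zero-copy clone: a metadata-only copy; no storage duplicated upfront.
    cur.execute("CREATE DATABASE dev_db CLONE prod_db")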

Posted 1 month ago

Apply

6 - 8 years

9 - 14 Lacs

Pune

Work from Office

DESCRIPTION
The Scrum Master (Data) acts as a servant leader and coach to cross-functional agile teams, enabling the effective adoption and execution of Agile practices to deliver business value. This role supports the team in applying Agile principles and frameworks such as Scrum, Kanban, and SAFe while driving agile maturity, continuous improvement, and delivery excellence, particularly in data analytics and visualization domains.

Key Responsibilities:
- Act as a coach and mentor to the Agile team, fostering a collaborative, respectful, and transparent environment.
- Guide the team in implementing Agile practices (Scrum, Kanban, SAFe) and help close the gap between Agile theory and practice.
- Facilitate Agile ceremonies including sprint planning, daily stand-ups, retrospectives, and reviews.
- Provide training and coaching on Agile methodologies and best practices, promoting a lean-agile mindset across teams.
- Remove impediments and protect the team from external distractions to ensure steady progress.
- Support Product Owners in backlog refinement, prioritization, and definitions of ready/done.
- Establish and track agile metrics (burn-down/up charts, velocity, throughput) to monitor team performance and value delivery.
- Lead efforts to unify and enhance Jira usage, including dashboards and metrics for workflow transparency.
- Contribute subject matter expertise on team structures for optimal agile workflows.
- Ensure alignment with business goals and foster relationships through Business Relationship Management practices.
- Promote continuous delivery principles and enable a DevOps culture for on-demand releases.

RESPONSIBILITIES
Skills and Competencies:
- Lean-Agile and SAFe mindset: champions Agile Manifesto principles and SAFe practices.
- Agile systems thinking: applies holistic systems thinking to support scalable and sustainable delivery.
- Planning and managing ceremonies: leads cadence-based Agile ceremonies to maintain team rhythm and productivity.
- Release planning: coordinates and manages release activities using roadmap and resource alignment.
- Business insight: applies knowledge of market and business context to guide decisions and delivery.
- DevOps and continuous delivery pipeline: supports implementation of automation and release-on-demand capabilities.
- Building high-performance teams: applies lean-agile principles to foster empowered, self-organizing teams.
- Manages conflict and values differences: facilitates constructive conflict resolution and promotes inclusion.
- Communicates effectively: delivers clear, adaptive communication across stakeholders and teams.
- Global perspective: considers international and cross-functional implications in decision-making.

Experience:
- 6-8 years of relevant work experience is required.
- Prior experience in data: implementation of data warehouses, analytical layers, ETL, etc.
- Proven experience working in a software development organization and Agile environment as a Scrum Master.
- Demonstrated expertise in Lean methodologies and Agile practices with measurable impact.
- Experience managing or driving data analytics and visualization projects.
- Strong preference for familiarity with Snowflake and the Microsoft data stack.
- Prior experience with Agile management tools such as Jira is required, including creating dashboards and performance metrics.

QUALIFICATIONS
- Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent industry experience).
- Agile certifications such as Certified Scrum Master (CSM), Professional Scrum Master (PSM), or SAFe Scrum Master (SSM) are strongly preferred.
- This role may require licensing or certification to ensure compliance with applicable export control regulations.
- Data platform certifications (Snowflake, Informatica) are preferred.

Posted 1 month ago

Apply

5 - 7 years

7 - 10 Lacs

Bengaluru

Work from Office

Explore an Exciting Career at Accenture

Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch:
The Technology Strategy & Advisory Practice focuses on clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every person and every process. You will be part of our global team of experts who work on scalable solutions and services that help clients achieve their business objectives faster.

- Business Transformation: Assess Data & Analytics potential and develop use cases that can transform business.
- Transforming Businesses: Envision and design customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
- Formulation of Guiding Principles and Components: Assess the impact on the client's technology landscape/architecture and ensure formulation of relevant guiding principles and platform components.
- Products and Frameworks: Evaluate existing data and analytics products and frameworks and develop options for proposed solutions.

Bring your best skills forward to excel in the role:
- Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities.
- Interact with client stakeholders to understand their Data & Analytics problems and priority use cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client.
- Design and guide development of enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, metadata, and master data strategy.
- Establish a framework for effective data governance across multispeed implementations. Define data ownership, standards, policies, and associated processes.
- Define a Data & Analytics operating model to manage data across the organization. Establish processes around effective data management, ensuring data quality and governance standards as well as roles for data stewards.
- Benchmark against global research and leading industry peers to understand the current state and recommend Data & Analytics solutions.
- Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas.
- Develop and drive Data Capability Maturity Assessment, Data & Analytics operating model, and data governance exercises for clients.
- A fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, and large-scale data lake and DW-on-cloud solutions.
- Utilize strong expertise and certification in any of the Data & Analytics cloud platforms: Google, Azure, or AWS.
- Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations.
- Create expert content and use advanced presentation, public speaking, content creation, and communication skills for C-level discussions.
- Demonstrate a strong understanding of a specific industry, client, or technology and function as an expert advising senior leadership.
- Manage budgeting and forecasting activities and build financial proposals.

Qualifications - Your experience counts!
- MBA from a tier 1 institute
- 5-7 years of strategy consulting experience at a consulting firm
- 3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, and Data Architecture Strategy
- At least 2 years of experience architecting or designing solutions in any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog
- Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
- 3+ years of experience designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms like AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica, and Palantir
- Deep understanding of the data supply chain and building value realization frameworks for data transformations
- 3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as POCs
- Foundational understanding of data privacy is desired
- Mandatory knowledge of IT and enterprise architecture concepts through practical experience, and knowledge of technology trends (e.g., mobility, cloud, digital, collaboration)
- A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources, or equivalent domains
- CDMP certification from DAMA preferred
- Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential

Posted 1 month ago

Apply

5 - 8 years

6 - 16 Lacs

Bengaluru

Hybrid

ROLE SUMMARY
Reporting to the Head of Architecture & Engineering APAC, your role as a BI Analyst is to leverage business intelligence platforms to generate insights, reports, analytics, and dashboards. By collaborating with stakeholders within the client ecosystem, you will translate business needs into data-driven solutions that support informed decision-making. You will leverage your business acumen and act as a liaison between the technology team and various business units and stakeholders to understand their requirements, key objectives, and strategic goals and translate them into technical BI solutions, developing and maintaining data-driven reports and visualizations using your expertise with Power BI/Tableau/Sigma.

PRIMARY ROLE
- Collaborate with departments to understand reporting needs and recommend best practices.
- Analyze large data sets, develop interactive dashboards, and present insights to stakeholders.
- Carry out data ingestion, star schema data modelling, and visualization to create impactful reports.
- Develop comprehensive BI dashboards following best practices.
- Improve and optimize existing Power BI solutions to enhance performance and usability.
- Leverage SQL skills for advanced data querying and analysis.
- Conduct data profiling, cleansing, and validation activities to ensure data quality.

KEY WORKING RELATIONSHIPS
External: development and integration partners; technology and cybersecurity partners; cloud and application vendors; client portfolio companies.
Internal: CTO APAC; Heads of Architecture & Engineering, Workplace, and Service Delivery; DevOps engineers; internal business stakeholders.

WHAT YOU BRING TO THE ROLE
Required:
- 3-5 years' experience in data engineering and report development roles.
- Demonstrable experience in data extraction, manipulation, and visualization.
- Strong data mapping skills, including source-to-target mapping.
- Strong Power BI and report development skills and experience.
- Experience with Snowflake and/or Databricks.
- Proven experience partnering with business stakeholders, conceptualizing business objectives and processes, and documenting and translating business requirements into technical specifications.
- Skilled in producing data flows and models.
- Exceptional communication and stakeholder engagement skills.
- Attention to detail and the ability to analyze data from diverse application sources and write efficient, effective SQL to handle complex scenarios, providing required outputs under time pressure.
- Strong ETL development experience, drawing data from disparate systems.
Preferred:
- A degree in Computer Science, Mathematics, or Statistics.
- Experience and competency with Sigma or Tableau.
- Experience using cloud technologies such as Azure.
- Experience developing data governance and classification frameworks.
- Exposure to machine learning, AI, and big data.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

5-10 years of relevant experience in a database development environment, developing OLTP systems. Development and maintenance of data models for applications and systems. Experience in understanding business requirements and translating them into conceptual models.

Required Candidate Profile:
- Experience in an AWS cloud environment is a plus
- Experience in basic database administration activities
- Experience in a DWH/BI environment with dimensional modeling skills
- Knowledge of Snowflake is a big plus

Posted 1 month ago

Apply

8 - 13 years

6 - 11 Lacs

Gurugram

Work from Office

AHEAD is looking for a Senior Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Senior Data Engineer will be responsible for strategic planning and hands-on engineering of Big Data and cloud environments that support our clients' advanced analytics, data science, and other data platform initiatives. This consultant will design, build, and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. They will be expected to be hands-on technically, but also to present to leadership and lead projects.

The Senior Data Engineer will work on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools and architectures, as well as the design and integration of existing transactional processing systems. As a Senior Data Engineer, you will design and implement data pipelines to enable analytics and machine learning on rich datasets.

Roles and Responsibilities:
- Design, build, operationalize, secure, and monitor data processing systems.
- Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and leveraging the cloud-native toolset (an illustrative streaming sketch follows this listing).
- Implement custom applications using tools such as Kinesis, Lambda, and other cloud-native tools as required to address streaming use cases.
- Engineer and support data structures, including SQL and NoSQL databases.
- Engineer and maintain ELT processes for loading data lakes (Snowflake, cloud storage, Hadoop).
- Engineer APIs for returning data from these structures to enterprise applications.
- Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions.
- Respond to customer/team inquiries and assist in troubleshooting and resolving challenges.
- Work with other scrum team members to estimate and deliver work inside of a sprint.
- Research data questions, identify root causes, and interact closely with business users and technical resources.

Qualifications:
- 8+ years of professional technical experience
- 4+ years of hands-on data architecture and data modelling
- 4+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, and Snowflake
- 4+ years with programming languages such as Python
- 2+ years of experience working in cloud environments (AWS and/or Azure)
- Strong client-facing communication and facilitation skills

Key Skills: Python, Cloud, Linux, Windows, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Snowflake, SQL/RDBMS, OLAP, Data Engineering
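A minimal sketch of the streaming-ingestion pattern mentioned above, using boto3 to publish to Kinesis; the stream name and region are assumptions, and the Lambda consumer is only described in a comment.

    import json
    import boto3  # pip install boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")  # assumed region

    def publish_event(event: dict) -> None:
        # A downstream Lambda (not shown) would consume these records and
        # land them in the analytical platform (e.g., Snowflake or a data lake).
        kinesis.put_record(
            StreamName="events",  # hypothetical stream name
            Data=json.dumps(event).encode("utf-8"),
            PartitionKey=str(event.get("user_id", "anon")),
        )

    publish_event({"user_id": 42, "action": "page_view"})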

Posted 1 month ago

Apply

6 - 10 years

1 - 2 Lacs

Noida, Pune, Bengaluru

Hybrid

Role & Responsibilities:
- Business intelligence/data warehousing experience
- 3-4 years of hands-on experience with the Snowflake database
- Strong SQL, PL/SQL, and Snowflake functionality experience
- Strong exposure to other RDBMSs like Oracle, SQL Server, etc.
- Exposure to cloud storage services like AWS S3
- 2-3 years of Informatica PowerCenter and/or IDMC experience
- Decent exposure to dimensional data modeling techniques and implementation
- Decent exposure to data warehousing approaches
- Basic exposure to reporting/dashboards using Power BI or similar tools
- Basic exposure to Linux environments and shell scripting
- Exposure to Microsoft Fabric is a nice-to-have

Posted 1 month ago

Apply

10 - 12 years

30 - 35 Lacs

Noida, Hyderabad, Gurugram

Work from Office

Role & Responsibilities:
- Total experience of 10+ years, with hands-on experience in data engineering.
- Expert-level knowledge of PostgreSQL (cloud-hosted on AWS, Azure, or GCP) and Snowflake.
- Strong proficiency in SQL and database programming.
- Extensive experience in Python and working knowledge of machine learning models.
- Advanced understanding of database internals, stored procedures, performance tuning, and index optimization.
- Familiar with SQL security features, encryption methods, and user role management.
- Hands-on experience with source control tools like Git and DevOps platforms.
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Exposure to CI/CD automation processes.
- Programming experience in Golang.
- Working knowledge of Agile frameworks such as Scrum or Kanban.
- Strong problem-solving skills and a passion for continuous improvement.
- Strong communication skills and the ability to collaborate effectively with cross-functional teams.

Responsibilities:
- Writing and reviewing high-quality code.
- Understanding functional requirements thoroughly and analyzing the client's needs in the context of the project.
- Envisioning the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to realize it.
- Determining and implementing design methodologies and tool sets.
- Enabling application development by coordinating requirements, schedules, and activities.
- Leading and supporting UAT and production rollouts.
- Creating, understanding, and validating the WBS and estimated effort for a given module/task, and being able to justify it.
- Addressing issues promptly and responding positively to setbacks and challenges with a mindset of continuous improvement.
- Giving constructive feedback to team members and setting clear expectations.
- Helping the team troubleshoot and resolve complex bugs.
- Coming up with solutions to any issue raised during code/design review and being able to justify the decision taken.
- Carrying out POCs to make sure that suggested designs/technologies meet the requirements.

Posted 1 month ago

Apply

8 - 18 years

10 - 40 Lacs

Hyderabad, Pune, Delhi / NCR

Work from Office

Roles and Responsibilities:
- Lead the development of data warehousing solutions using Snowflake on the Microsoft Azure platform.
- Collaborate with cross-functional teams to design, develop, test, and deploy large-scale data pipelines.
- Ensure high-quality delivery of projects by providing technical guidance and mentorship to junior team members.
- Participate in code reviews and ensure adherence to coding standards.

Job Requirements:
- 8-18 years of experience building data warehouses using Snowflake on the Microsoft Azure platform.
- Strong expertise in developing complex SQL queries and query performance tuning.
- Proficiency in building efficient ETL processes using tools such as Data Build Tool (dbt).
- Experience working with big-data technologies like Hadoop, Spark, and Kafka.

Posted 1 month ago

Apply

8 - 13 years

13 - 18 Lacs

Pune

Work from Office

Position Summary
We are looking for a highly skilled and experienced Data Engineering Manager to lead our data engineering team. The ideal candidate will possess a strong technical background, strong project management abilities, and excellent client handling/stakeholder management skills. This role requires a strategic thinker who can drive the design, development, and implementation of data solutions that meet our clients' needs while ensuring the highest standards of quality and efficiency.

Job Responsibilities:
- Technology leadership: lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud-based data engineering / data warehousing project assignments.
- Solution architecture and review: expertise in conceptualizing solution architecture and low-level design in a range of data engineering (Matillion, Informatica, Talend, Python, dbt, Airflow, Apache Spark, Databricks, Redshift) and cloud hosting (AWS, Azure) technologies.
- Manage projects in a fast-paced agile ecosystem and ensure quality deliverables within stringent timelines.
- Responsible for risk management, maintaining risk documentation and mitigation plans.
- Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation, and deployments.
- Communication and logical thinking: demonstrate strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment; effectively present and defend team viewpoints while securing buy-in from both technical and client stakeholders.
- Client relationships: manage client relationships and expectations independently, deliver results back to the client independently, and demonstrate excellent communication skills.

Education: BE/B.Tech or Master of Computer Applications.

Work Experience:
- 8+ years of working experience and expertise in at least two ETL tools among Matillion, dbt, PySpark, Informatica, and Talend.
- Expertise and working experience in at least two databases among Databricks, Redshift, Snowflake, SQL Server, and Oracle.
- Strong data warehousing, data integration, and data modeling fundamentals: star schema, snowflake schema, dimension tables, and fact tables.
- Strong experience with SQL building blocks; creating complex SQL queries and procedures.
- Experience in AWS or Azure cloud and its service offerings.
- Aware of techniques such as data modelling, performance tuning, and regression testing.
- Willingness to learn and take ownership of tasks.
- Excellent written/verbal communication and problem-solving skills.
- Understanding of and working experience with pharma commercial data sets like IQVIA, Veeva, Symphony, Liquid Hub, Cegedim, etc. would be an advantage.
- Hands-on experience with scrum methodology (sprint planning, execution, and retrospection).

Behavioural Competencies: Teamwork & Leadership; Motivation to Learn and Grow; Ownership; Cultural Fit; Talent Management.
Technical Competencies: Problem Solving; Life Science Knowledge; Communication; Agile; PySpark; Data Modelling; Matillion; Designing Technical Architecture; AWS Data Pipeline.

Posted 1 month ago

Apply

5 - 9 years

13 - 18 Lacs

Bengaluru

Work from Office

Position Summary
We are looking for a Salesforce Data Cloud Engineer to design, implement, and manage data integrations and solutions using Salesforce Data Cloud (formerly Salesforce CDP). This role is essential for building a unified, 360-degree view of the customer by integrating and harmonizing data across platforms.

Job Responsibilities:
- Consolidate customer data to create a unified customer profile.
- Design and implement data ingestion pipelines into Salesforce Data Cloud from internal and third-party systems.
- Work with stakeholders to define Customer 360 data model requirements, identity resolution rules, and calculated insights.
- Configure and manage the Data Cloud environment, including data streams, data bundles, and harmonization.
- Implement identity resolution, micro-segmentation, and activation strategies.
- Collaborate with Salesforce Marketing Cloud to enable real-time personalization and journey orchestration.
- Ensure data governance and platform security.
- Monitor data quality, ingestion jobs, and overall platform performance.

Education: BE/B.Tech in Computer Science or IT, or Master of Computer Applications.

Work Experience:
- Overall experience of a minimum of 10 years in data management and data engineering roles, with a minimum of 3 years as a Salesforce Data Cloud data engineer.
- Hands-on experience with Salesforce Data Cloud (CDP), including data ingestion, harmonization, and segmentation.
- Proficient in working with large datasets, data modeling, and ETL/ELT processes.
- Understanding of Salesforce core clouds (Sales, Service, Marketing) and how they integrate with Data Cloud.
- Experience with Salesforce tools such as Marketing Cloud.
- Strong knowledge of SQL, JSON, Apache Iceberg, and data transformation logic.
- Familiarity with identity resolution and Customer 360 data unification concepts.
- Salesforce certifications (e.g., Salesforce Data Cloud Accredited Professional, Salesforce Administrator, Platform App Builder).
- Experience with CDP platforms other than Salesforce, e.g., Segment or Adobe Experience Platform (good to have).
- Experience with cloud data storage and processing tools (Azure, Snowflake, etc.).

Behavioural Competencies: Teamwork & Leadership; Motivation to Learn and Grow; Ownership; Cultural Fit; Talent Management.
Technical Competencies: Life Science Knowledge; Azure; SQL; Databricks.

Posted 1 month ago

Apply

4 - 9 years

14 - 18 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in developing data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, and Hive, with HBase or other NoSQL databases, on the Azure Cloud Data Platform or HDFS.
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies.
- Experience in developing streaming pipelines.
- Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 6-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering.
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on Azure.
- Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server.
- Good to excellent SQL skills.

Preferred technical and professional experience:
- Certification in Azure, and Databricks or Cloudera Spark certified developers.
- Knowledge or experience of Snowflake will be an added advantage.

Posted 1 month ago

Apply

5 - 8 years

4 - 7 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Credit Risk Modeling
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any graduation
Years of experience: 5 to 8 years

What would you do?
Credit risk modelling refers to the use of financial models to estimate the losses a firm might suffer in the event of a borrower's default.

What are we looking for?
Primary skills: banking and financial services, credit risk model development, market risk modelling, financial regulations, quantitative analysis.
Secondary skills: SQL, SAS, Python, R, Tableau, VBA, Qlik Sense, Snowflake, Matillion.
Soft skills: adaptable and flexible, commitment to quality, ability to work well in a team, agility for quick learning, written and verbal communication.

Roles and Responsibilities:
In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture, and you are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions you make impact your own work and may impact the work of others. You would be an individual contributor and/or oversee a small work effort and/or team.

Qualification: Any graduation

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Gurugram

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing innovative solutions to enhance business operations and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to design and deliver high-quality applications.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated with industry trends and technologies to enhance application development processes.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse.
- Strong understanding of database concepts and SQL.
- Experience in ETL processes and data modeling.
- Knowledge of cloud platforms like AWS or Azure.
- Hands-on experience in developing scalable and efficient applications.

Additional Information: The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse. This position is based at our Gurugram office. 15 years of full-time education is required.

Posted 1 month ago

Apply

7 - 11 years

4 - 7 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Credit Risk Modeling
Designation: I&F Decision Sci Practitioner Specialist
Qualifications: Any graduation
Years of experience: 7 to 11 years

What would you do?
Credit risk modelling refers to the use of financial models to estimate the losses a firm might suffer in the event of a borrower's default.

What are we looking for?
Primary skills: banking and financial services, credit risk model development, market risk modelling, financial regulations, quantitative analysis.
Secondary skills: SQL, SAS, Python, R, Tableau, VBA, Qlik Sense, Snowflake, Matillion.
Soft skills: adaptable and flexible, commitment to quality, ability to work well in a team, result oriented, problem-solving skills.

Roles and Responsibilities:
In this role you are required to analyze and solve moderately complex problems. You may create new solutions, leveraging and, where needed, adapting existing methods and procedures. This requires understanding the strategic direction set by senior management as it relates to team goals. Your primary upward interaction is with your direct supervisor, and you may interact with peers and/or management levels at a client and/or within Accenture. Guidance is provided when determining methods and procedures on new assignments. Decisions you make will often impact the team in which you reside. You may manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

Qualification: Any graduation

Posted 1 month ago

Apply

5 - 10 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure timely project delivery.
- Provide guidance and support to team members.

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes.
- Knowledge of cloud data platforms.
- Hands-on experience in SQL development.

Additional Information: The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse. This position is based at our Kolkata office. 15 years of full-time education is required.

Posted 1 month ago

Apply

7 - 10 years

15 - 20 Lacs

Mumbai

Work from Office

Position Overview:
The Databricks Data Engineering Lead role is ideal for a highly skilled Databricks data engineer who will architect and lead the implementation of scalable, high-performance data pipelines and platforms using the Databricks Lakehouse ecosystem. The role involves managing a team of data engineers, establishing best practices, and collaborating with cross-functional stakeholders to unlock advanced analytics, AI/ML, and real-time decision-making capabilities.

Key Responsibilities:
- Lead the design and development of modern data pipelines, data lakes, and lakehouse architectures using Databricks and Apache Spark (an illustrative Delta Lake sketch follows this listing).
- Manage and mentor a team of data engineers, providing technical leadership and fostering a culture of excellence.
- Architect scalable ETL/ELT workflows to process structured and unstructured data from various sources (cloud, on-prem, streaming).
- Build and maintain Delta Lake tables and optimize performance for analytics, machine learning, and BI use cases.
- Collaborate with data scientists, analysts, and business teams to deliver high-quality, trusted, and timely data products.
- Ensure best practices in data quality, governance, lineage, and security, including the use of Unity Catalog and access controls.
- Integrate Databricks with cloud platforms (AWS, Azure, or GCP) and data tools (Snowflake, Kafka, Tableau, Power BI, etc.).
- Implement CI/CD pipelines for data workflows using tools such as GitHub, Azure DevOps, or Jenkins.
- Stay current with Databricks innovations and provide recommendations on platform strategy and architecture improvements.

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering, including 3+ years working with Databricks and Apache Spark; proven leadership experience in managing and mentoring data engineering teams.
Skills:
- Proficiency in PySpark and SQL, with experience in Delta Lake, Databricks Workflows, and MLflow.
- Strong understanding of data modeling, distributed computing, and performance tuning.
- Familiarity with one or more major cloud platforms (Azure, AWS, GCP) and cloud-native services.
- Experience implementing data governance and security in large-scale environments.
- Experience with real-time data processing using Structured Streaming or Kafka.
- Knowledge of data privacy, security frameworks, and compliance standards (e.g., PCI DSS, GDPR).
- Exposure to machine learning pipelines, notebooks, and MLOps practices.
Certifications: Databricks Certified Data Engineer or equivalent certification.
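A minimal local sketch of the Delta Lake write/read cycle central to this role, assuming the delta-spark package is installed; the path and data are placeholders.

    from pyspark.sql import SparkSession

    # These two configs enable Delta Lake on a vanilla Spark session.
    spark = (
        SparkSession.builder.appName("delta-demo")
        .config("spark.sql.extensions",
                "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )

    events = spark.createDataFrame(
        [(1, "signup"), (2, "purchase")], ["user_id", "action"]
    )

    # Write a Delta table, then read it back; Delta adds ACID transactions
    # and time travel on top of Parquet files.
    events.write.format("delta").mode("overwrite").save("/tmp/events_delta")
    spark.read.format("delta").load("/tmp/events_delta").show()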

Posted 1 month ago

Apply

6 - 9 years

8 - 16 Lacs

Bengaluru

Work from Office

Experience: 6+ years
Location: Bangalore
Employment Type: Full-time (Hucon Innovations)
Notice Period: Immediate to 15 days

Job Description
We are seeking a Lead Developer with strong expertise in Oracle Master Data Management (MDM) and Customer Data Hub (CDH). The ideal candidate should have a solid background in SQL/PL-SQL development and data quality processes, and experience working with Oracle E-Business Suite R12.

Key Responsibilities:
- Act as a Lead Developer for Oracle MDM/CDH implementation and support projects.
- Develop and maintain complex SQL and PL/SQL scripts on Oracle databases.
- Design, implement, and manage Data Quality Management (DQM) solutions.
- Work with Oracle standard APIs and Trading Community Architecture (TCA).
- Collaborate with cross-functional teams to analyze and optimize customer data.
- Perform query tuning and performance optimization.
- Engage in end-to-end solutioning for master data and integration requirements.

Must-Have Skills:
- 6+ years of experience in Oracle MDM and CDH
- Strong SQL and PL/SQL programming skills
- Experience with the Oracle Applications R12 suite
- Deep understanding of TCA, DQM, MDM, and EDM concepts
- Experience working with Oracle standard APIs

Good-to-Have / Added Advantage:
- Exposure to Salesforce, Snowflake, or Oracle Data Integrator (ODI)
- Domain knowledge in ERP systems

Posted 1 month ago

Apply

6 - 11 years

10 - 16 Lacs

Bengaluru

Work from Office

Job Title: Lead Developer - Oracle MDM
Location: Bangalore
Employment Type: Full-Time
Experience Required: 6+ years
Budget: Up to 16 LPA
Notice Period: Immediate to 15 days (mandatory)
PF Compliance: Mandatory
Communication Skills: Excellent verbal and written communication skills required

Job Description:
We are seeking a highly skilled and experienced Lead Developer with strong expertise in Oracle Master Data Management (MDM), particularly in Customer Data Hub (CDH). The ideal candidate will be proficient in SQL and PL/SQL development, with a deep understanding of Oracle Applications and data quality frameworks.

Key Responsibilities:
- Lead MDM initiatives and manage technical delivery in Oracle Customer Data Hub (CDH)
- Develop and optimize complex SQL and PL/SQL scripts
- Ensure data quality through Data Quality Management (DQM) tools and processes
- Work with the Oracle Applications R12 suite and Trading Community Architecture (TCA)
- Utilize Oracle standard APIs to integrate and manage customer data
- Conduct performance tuning and debugging of SQL queries
- Collaborate with cross-functional teams to ensure efficient data flow and quality

Must-Have Skills:
- Strong hands-on experience in SQL and PL/SQL
- Expertise in Oracle MDM, specifically Customer Data Hub (CDH)
- Deep understanding of Data Quality Management (DQM)
- Experience with the Oracle Applications R12 suite
- Familiarity with Trading Community Architecture (TCA)
- Experience with Oracle standard APIs
- Strong query writing and performance tuning skills

Good-to-Have Skills:
- Exposure to Salesforce
- Experience with Snowflake
- Knowledge of Oracle Data Integrator (ODI)

Domain Knowledge Required: ERP systems

Posted 1 month ago

Apply

3 - 6 years

9 - 13 Lacs

Mohali, Gurugram, Bengaluru

Work from Office

Job Title: Data Engineer - Snowflake & Python

About the Role:
We are seeking a skilled and proactive data developer with 3-5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools like Matillion and Fivetran. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar).
- Develop data applications and dashboards using Python and Streamlit (an illustrative sketch follows this listing).
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Integrate REST APIs for data access and process automation.
- Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity.
- Design and implement scalable and efficient data models aligned with business requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions.
- Implement best practices in data governance, security, and compliance.

Required Skills and Qualifications:
- Experience with HR data and databases is a must.
- 3-5 years of professional experience in a data engineering or development role.
- Strong expertise in Snowflake, including performance tuning and warehouse optimization.
- Proficient in Python, including data manipulation with libraries like Pandas.
- Experience building web-based data tools using Streamlit.
- Solid understanding of and experience with RESTful APIs and JSON data structures.
- Strong SQL skills and experience with advanced data transformation logic.
- Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
- Hands-on experience in data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques.
- Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
- Experience with Azure Data Factory.
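A minimal sketch combining the stack this listing names (Snowflake, Python, Streamlit, REST); the endpoint, credentials, and table names are all hypothetical placeholders. Saved as app.py, it would run with "streamlit run app.py".

    import pandas as pd
    import requests
    import snowflake.connector
    import streamlit as st

    # Reference data from a hypothetical REST endpoint.
    resp = requests.get("https://api.example.com/departments", timeout=10)
    departments = pd.DataFrame(resp.json())

    # Headcount from Snowflake; connection values are placeholders.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",
        warehouse="ANALYTICS_WH", database="HR", schema="CORE",
    )
    headcount = pd.read_sql(
        "SELECT department_id, COUNT(*) AS employees "
        "FROM employees GROUP BY department_id",
        conn,
    )
    # Snowflake uppercases unquoted identifiers; normalize for the merge.
    headcount.columns = headcount.columns.str.lower()

    # Join and render as an interactive dashboard.
    st.title("Headcount by Department")
    st.dataframe(departments.merge(headcount, on="department_id"))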

Posted 1 month ago

Apply

5 - 9 years

20 - 25 Lacs

Pune, Gurugram, Bengaluru

Hybrid

We're looking for a motivated and detail-oriented Senior Snowflake Developer with strong SQL querying skills and a willingness to learn and grow with our team. As a Senior Snowflake Developer, you will play a key role in developing and maintaining our Snowflake data platform, working closely with our data engineering and analytics teams.

Responsibilities:
- Write optimized, performant ingestion pipelines (an illustrative bulk-load sketch follows this listing)
- Manage a team of junior software developers
- Write efficient and scalable SQL queries to support data analytics and reporting
- Collaborate with data engineers, architects, and analysts to design and implement data pipelines and workflows
- Troubleshoot and resolve data-related issues and errors
- Conduct code reviews and contribute to the improvement of our Snowflake development standards
- Stay up-to-date with the latest Snowflake features and best practices

Requirements:
- 5+ years of experience with Snowflake
- Strong SQL querying skills, including data modeling, data warehousing, and ETL/ELT design
- Advanced understanding of data engineering principles and practices
- Familiarity with Informatica Intelligent Cloud Services (IICS) or similar data integration tools is a plus
- Excellent problem-solving skills, attention to detail, and an analytical mindset
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams

Nice to Have:
- Experience using Snowflake Streamlit and Cortex
- Knowledge of data governance, data quality, and data security best practices
- Familiarity with Agile development methodologies and version control systems like Git
- Certification in Snowflake or a related data platform is a plus
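A minimal sketch of an optimized ingestion step of the kind described above: staging a file and bulk-loading it with COPY INTO rather than row-by-row inserts. File paths and object names are placeholders; assumes the snowflake-connector-python package.

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="loader", password="...",
        warehouse="LOAD_WH", database="RAW", schema="SALES",
    )
    cur = conn.cursor()

    # PUT uploads the local file to the table's internal stage (@%orders);
    # COPY INTO then bulk-loads it, which is far faster than INSERTs.
    cur.execute("PUT file:///tmp/orders.csv @%orders AUTO_COMPRESS=TRUE")
    cur.execute("""
        COPY INTO orders
        FROM @%orders
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    print(cur.fetchall())  # one status row per loaded file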

Posted 1 month ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium) (A sketch follows this list.)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
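As referenced in the time travel question above, here is a minimal sketch from Python using the snowflake-connector-python package; the table name, offset, and credentials are hypothetical placeholders.

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",
        warehouse="ANALYTICS_WH", database="DEMO", schema="PUBLIC",
    )
    cur = conn.cursor()

    # Query the table as it looked one hour ago (within the retention window).
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    print(cur.fetchone())

    # Time travel also backs UNDROP, which restores a dropped object.
    cur.execute("UNDROP TABLE orders")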

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
