1714 Snowflake Jobs - Page 9

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 8.0 years

5 - 15 Lacs

Kolkata

Work from Office


Skills and Qualifications:
• Bachelor's and/or Master's degree in Computer Science, or equivalent experience.
• 3+ years of total IT experience, including experience in data warehouse/ETL projects.
• Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
• Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
• Deep understanding of Star and Snowflake dimensional modeling.
• Strong knowledge of data management principles.
• Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
• Hands-on experience in SQL and Spark (PySpark).
• Experience building ETL / data warehouse transformation processes.
• Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
• Experience working with structured and unstructured data, including imaging and geospatial data.
• Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.
• Databricks Certified Data Engineer Associate/Professional certification (desirable).
• Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
• Experience working in Agile methodology.
• Strong verbal and written communication skills.
• Strong analytical and problem-solving skills with high attention to detail.

Required Skills: Snowflake, SQL, ADF
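
For candidates brushing up on the Snowflake utilities this role names, here is a minimal sketch of continuous loading with a stage and Snowpipe. All object and bucket names are hypothetical, and a real deployment would also need a storage integration and S3 event notifications:

```sql
-- Hypothetical names; a real setup also needs a STORAGE INTEGRATION for
-- S3 credentials and bucket event notifications to drive AUTO_INGEST.
CREATE STAGE IF NOT EXISTS raw_db.public.landing_stage
  URL = 's3://example-bucket/landing/'
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Snowpipe: continuously COPY newly arrived files into a target table.
CREATE PIPE IF NOT EXISTS raw_db.public.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.public.orders
  FROM @raw_db.public.landing_stage/orders/;
```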

Posted 1 week ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Hyderabad

Work from Office


Dear Candidate,

We are pleased to invite you to participate in the EY GDS face-to-face hiring event for the position of AWS Data Engineer.

Role: AWS Data Engineer
Experience Required: 5-8 years
Location: Hyderabad
Mode of Interview: Face to face

JD - Technical Skills:
• Strong experience in AWS data services: Glue, Lambda, EventBridge, Kinesis, S3/EMR, Redshift, RDS, Step Functions, Airflow, and PySpark.
• Strong exposure to IAM, CloudTrail, cluster optimization, Python, and SQL.
• Expertise in data design, STTM, understanding of data models, data component design, automated testing, code coverage, UAT support, deployment, and go-live.
• Experience with version control systems such as SVN and Git.
• Create and manage AWS Glue crawlers and jobs to automate data cataloging and ingestion across various structured and unstructured data sources.
• Strong experience with AWS Glue: building ETL pipelines, managing crawlers, and working with the Glue Data Catalog.
• Proficiency in AWS Redshift: designing and managing Redshift clusters, writing complex SQL queries, and optimizing query performance.
• Enable data consumption from reporting and analytics business applications using AWS services (e.g., QuickSight, SageMaker, JDBC/ODBC connectivity).

Kindly confirm your availability by applying to this job.

Posted 1 week ago

Apply

1.0 - 6.0 years

2 - 7 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office


Position Name: Data Engineer
Total Experience: 3-5 years
Notice Period: Immediate joiners
Work Location: Mumbai (Kandivali)
Work Type: Work from Office

Job Description

Must have:
• Data Engineer with 3 to 5 years of experience.
• Should be an individual contributor, delivering each feature/story within the given time and at the expected quality.
• Should be well versed in the Agile process.
• Should be strong in programming and SQL queries.
• Should be able to learn new tools and technologies to scale up on data engineering.
• Should have good communication and client-interaction skills.

Technical Skills (must have):
• Data engineering using Java/Python, Spark/PySpark, big data (Hadoop, Hive, YARN, Oozie, etc.), a cloud warehouse (Snowflake), and cloud services (AWS EMR, S3, Lambda, RDS/Aurora).
• Unit testing frameworks: JUnit/Mockito/PowerMock.
• Strong experience with SQL queries (MySQL/SQL Server/Oracle/Hadoop/Snowflake).
• Source control: GitHub.
• Project management tool: VSTS.
• Build management tools: Maven/Gradle.
• CI/CD: Azure DevOps.

Added advantage (good to have): shell scripting, Linux commands.

Posted 1 week ago

Apply

4.0 - 7.0 years

0 - 0 Lacs

Noida, Pune, Bengaluru

Hybrid


Role & Responsibilities (Developer; data modeling and ingestion involved):
• Design, develop, and manage scalable database solutions using MongoDB.
• Write robust, effective, and scalable queries and operations for MongoDB-based applications.
• Integrate third-party services, tools, and APIs with MongoDB for data management and processing.
• Collaborate with developers, data engineers, and stakeholders to ensure seamless integration of MongoDB with applications and systems.
• Run unit, integration, and performance tests to ensure the stability and functionality of MongoDB implementations.
• Conduct code and database reviews, ensuring adherence to security, scalability, and best practices in MongoDB development.
• Preferred: Snowflake experience.

Preferred candidate profile:
• 4-7 years of experience.
• 3 days work from office.
• Immediate joiners (within 2 weeks).
• Strong in data modeling and data ingestion.

Posted 1 week ago

Apply

2.0 - 7.0 years

2 - 7 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Responsibilities

A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing change through multiple communication mechanisms. You will also coach the team and create a vision for it, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition of high performance. You would be a key contributor to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the organization's guidelines and processes. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase business profitability.
• Good knowledge of software configuration management systems.
• Awareness of the latest technologies and industry trends.
• Logical thinking and problem-solving skills, along with an ability to collaborate.
• Understanding of the financial processes for various project types and the available pricing models.
• Ability to assess current processes, identify improvement areas, and suggest technology solutions.
• Knowledge of one or two industry domains.
• Client interfacing skills.
• Project and team management.

Educational Requirements: BTech/BE, MTech/ME, BSc/MSc, BCA/MCA
Location: PAN India

Posted 1 week ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Responsibilities

A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, so that our clients are satisfied with high levels of service in the technology domain.
• You will gather requirements and specifications, understand client needs in detail, and translate them into system requirements.
• You will play a key role in estimating work requirements and providing accurate project-estimation information to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs and systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred Skills:
• Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture.
• Understanding of performance engineering.
• Knowledge of quality processes and estimation techniques.
• Basic understanding of the project domain.
• Ability to translate functional and nonfunctional requirements into system requirements.
• Ability to design and code complex programs.
• Ability to write test cases and scenarios based on the specifications.
• Good understanding of SDLC and agile methodologies.
• Awareness of the latest technologies and trends.
• Logical thinking and problem-solving skills, along with an ability to collaborate.

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech
Location: PAN India

Posted 1 week ago

Apply

2.0 - 7.0 years

5 - 8 Lacs

Hyderabad, Pune, Chennai

Hybrid


Looking for a Snowflake Developer to design and implement data solutions using the Snowflake cloud data platform. Strong skills in SQL, data warehousing, ETL processes, and cloud platforms (AWS/Azure/GCP) are required. You will optimize data pipelines and ensure scalability within Agile teams. Join Infosys to drive impactful projects, grow your career, and excel in a diverse environment focused on innovation. Location: Hyderabad.
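
As an illustration of the pipeline-optimization work such roles involve, here is a small sketch (table and column names hypothetical) that adds a clustering key so Snowflake can prune micro-partitions on a common filter column:

```sql
-- Cluster a large fact table on its most common filter column
-- (hypothetical table/column) so queries scan fewer micro-partitions.
ALTER TABLE sales.fact_orders CLUSTER BY (order_date);

-- Check how well the table is clustered for that key.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales.fact_orders', '(order_date)');
```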

Posted 1 week ago

Apply

6.0 - 9.0 years

15 - 25 Lacs

Pune, Chennai, Bengaluru

Hybrid


Experience: 6-9 years
Work Location: Pune/Bangalore/Chennai
Work Mode: Hybrid
Primary Skills: Snowflake + DBT + Python

Kindly share the following details:
• Updated CV
• Relevant skills
• Total experience
• Current company
• Current CTC
• Expected CTC
• Notice period
• Current location
• Preferred location

Posted 1 week ago

Apply

6.0 - 10.0 years

5 - 9 Lacs

Greater Noida

Work from Office


Responsibilities:
• Design, develop, and maintain high-performance SQL and PL/SQL procedures, packages, and functions in Snowflake or other cloud database technologies.
• Apply advanced performance-tuning techniques to optimize database objects, queries, indexing strategies, and resource usage.
• Develop code based on business and functional requirements, following the Agile process.
• Produce high-quality code, meet all project deadlines, and ensure the functionality matches the requirements.
• Analyze and resolve issues found during the testing or pre-production phases of the software delivery lifecycle, coordinating changes with project team leaders and cross-team members.
• Provide technical support to project team members and respond to inquiries regarding errors or questions about programs.
• Interact with architects, technical leads, team members, and project managers as required to address technical and schedule issues.
• Suggest and implement process improvements for estimating, development, and testing.
• Support the development of automated, repeatable processes for ETL/ELT, data integration, and data transformation using industry best practices.
• Support cloud migration and modernization initiatives, including re-platforming or refactoring legacy database objects for cloud-native platforms.

Qualifications:
• BS degree in Computer Science, Information Technology, Electrical/Electronic Engineering, or a related field, or equivalent.
• A minimum of 7 years of experience in an application and database development organization, with deep expertise in Oracle PL/SQL or SQL Server T-SQL; must demonstrate experience delivering systems and projects from inception through implementation.
• Proven experience writing and optimizing complex stored procedures, functions, and packages in relational databases such as Oracle, MySQL, and SQL Server.
• Strong knowledge of performance tuning, including query optimization, indexing, statistics, execution plans, and partitioning.
• Understanding of data integration pipelines, ETL tools, and batch-processing techniques.
• Solid software development and programming skills, with an understanding of design patterns and software development best practices.
• Experience with Snowflake, Python scripting, and data transformation frameworks like dbt is a plus.
• Work experience developing web applications with Java, JavaScript, HTML, and JSPs; experience with MVC frameworks such as Spring and Angular.
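
For reference, here is a minimal sketch of a Snowflake Scripting procedure, the Snowflake analogue of the Oracle PL/SQL work this role describes. The procedure name, table, and retention logic are hypothetical:

```sql
-- Hypothetical cleanup procedure written in Snowflake Scripting.
CREATE OR REPLACE PROCEDURE purge_stale_rows(retention_days NUMBER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  -- Delete staging rows older than the requested retention window.
  DELETE FROM analytics.stg_events
   WHERE load_ts < DATEADD(day, -:retention_days, CURRENT_TIMESTAMP());
  -- SQLROWCOUNT holds the row count of the last DML statement.
  RETURN 'Deleted ' || SQLROWCOUNT || ' rows';
END;
$$;

CALL purge_stale_rows(30);
```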

Posted 1 week ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Responsibilities:
• Design, develop, and maintain high-performance SQL and PL/SQL procedures, packages, and functions in Snowflake or other cloud database technologies.
• Apply advanced performance-tuning techniques to optimize database objects, queries, indexing strategies, and resource usage.
• Develop code based on business and functional requirements, following the Agile process.
• Produce high-quality code, meet all project deadlines, and ensure the functionality matches the requirements.
• Analyze and resolve issues found during the testing or pre-production phases of the software delivery lifecycle, coordinating changes with project team leaders and cross-team members.
• Provide technical support to project team members and respond to inquiries regarding errors or questions about programs.
• Interact with architects, technical leads, team members, and project managers as required to address technical and schedule issues.
• Suggest and implement process improvements for estimating, development, and testing.
• Support the development of automated, repeatable processes for ETL/ELT, data integration, and data transformation using industry best practices.
• Support cloud migration and modernization initiatives, including re-platforming or refactoring legacy database objects for cloud-native platforms.

Qualifications:
• BS degree in Computer Science, Information Technology, Electrical/Electronic Engineering, or a related field, or equivalent.
• A minimum of 7 years of experience in an application and database development organization, with deep expertise in Oracle PL/SQL or SQL Server T-SQL; must demonstrate experience delivering systems and projects from inception through implementation.
• Proven experience writing and optimizing complex stored procedures, functions, and packages in relational databases such as Oracle, MySQL, and SQL Server.
• Strong knowledge of performance tuning, including query optimization, indexing, statistics, execution plans, and partitioning.
• Understanding of data integration pipelines, ETL tools, and batch-processing techniques.
• Solid software development and programming skills, with an understanding of design patterns and software development best practices.
• Experience with Snowflake, Python scripting, and data transformation frameworks like dbt is a plus.
• Work experience developing web applications with Java, JavaScript, HTML, and JSPs; experience with MVC frameworks such as Spring and Angular.

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Hyderabad

Work from Office


ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 45 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description: We are seeking a motivated and detail-oriented QA & Test Automation Support Engineer (Test Automation Engineering) to join our growing data quality assurance team. In this role, you will work closely with and under the guidance of a Senior QA & Test Automation Engineer to support the testing and validation of our enterprise data platforms, with a focus on data quality, semantic layer validation, and API testing. You will contribute to ensuring data integrity across complex data pipelines, assist in validating the business logic behind data transformations, and participate in both manual and automated testing processes. This is a hands-on, growth-oriented position ideal for someone looking to deepen their skills in data quality engineering, API testing, and test automation.

Roles & Responsibilities:
• Execute automated test suites across various layers, including data pipelines, APIs, and semantic layers.
• Analyze test automation results, identify failures or inconsistencies, and assist in root cause analysis.
• Generate and share test execution reports with stakeholders, summarizing pass/fail rates and key issues.
• Collaborate with the Senior QA Engineer to triage automation failures and escalate critical issues.
• Assist in test data preparation and test environment setup.
• Perform manual validation as needed to cover automation gaps or verify edge cases.
• Log and track defects in JIRA (or similar tools), and follow up on resolutions with relevant teams.
• Maintain test documentation, including test case updates, runbooks, and regression packs.
• Contribute to test automation scripting, framework maintenance, and CI/CD integration.
• Validate data transformations and flows across platforms like AWS, Databricks, and Snowflake.
• Participate in testing of semantic layers and GraphQL APIs, including schema validation.
• Work closely with QA, data engineers, and analysts to ensure data quality.

Must-Have Skills:
• Hands-on experience executing and analyzing automated test suites.
• 1+ years of experience specializing in test automation; 3 to 5 years of overall QA and test automation experience expected.
• Strong understanding of test result analysis and defect tracking (JIRA or similar).
• Basic knowledge of test automation scripting (Python, Java, or similar).
• Proficiency in SQL for data validation.
• Familiarity with ETL/ELT pipelines and data testing concepts.
• Experience with API testing (REST and GraphQL) and schema validation.
• Exposure to cloud data platforms like AWS, Databricks, or Snowflake.
• Understanding of CI/CD tools (e.g., Jenkins, GitLab CI).
• Good communication and collaboration skills.
• Strong attention to detail and a problem-solving mindset.

Good-to-Have Skills:
• Experience with data governance tools such as Apache Atlas, Collibra, or Alation.
• Contributions to internal quality dashboards or data observability systems.
• Awareness of metadata-driven testing approaches and lineage-based validations.
• Experience with agile testing methodologies such as Scaled Agile.
• Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest.

Education and Professional Certifications:
• Bachelor's/Master's degree in computer science and engineering preferred.

Soft Skills:
• Excellent analytical and troubleshooting skills.
• Strong verbal and written communication skills.
• Ability to work effectively with global, virtual teams.
• High degree of initiative and self-motivation.
• Ability to manage multiple priorities successfully.
• Team-oriented, with a focus on achieving team goals.
• Strong presentation and public speaking skills.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 1 week ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Hyderabad

Work from Office


Data Platform Engineer

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

What you will do - Roles & Responsibilities:
• Work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting functional areas such as Manufacturing, Commercial, and Research and Development.
• Work closely with the Enterprise Data Lake delivery and platform teams to ensure that applications align with the overall architectural and development guidelines.
• Research and evaluate technical solutions, including Databricks and AWS services, NoSQL databases, data science packages, platforms, and tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, maintainability, and cost management.
• Assist in building and maintaining relationships with internal and external business stakeholders.
• Develop a basic understanding of core business problems and identify opportunities to use advanced analytics.
• Assist in reviewing third-party providers for new feature/function/technical fit with the department's data management needs.
• Work closely with the Enterprise Data Lake ecosystem leads to identify and evaluate emerging providers of data management and processing components that could be incorporated into the data platform.
• Work with platform stakeholders to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management.
• Operate comfortably in an Agile development environment, including its terminology and ceremonies.
• Embrace new responsibilities, face challenges, and master new technologies.

What we expect of you

Basic Qualifications and Experience:
• Master's degree in a computer science or engineering field and 1 to 3 years of relevant experience, OR
• Bachelor's degree in a computer science or engineering field and 3 to 5 years of relevant experience, OR
• Diploma and a minimum of 8+ years of relevant work experience.

Must-Have Skills:
• Experience with Databricks (or Snowflake), including cluster setup, execution, and tuning.
• Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy.
• Experience with UI frameworks (Angular.js or React.js).
• Experience with data lake, data fabric, and data mesh concepts.
• Experience with data modeling, performance tuning, and relational databases.
• Experience building ETL or ELT pipelines; hands-on experience with SQL/NoSQL.
• Programming skills in one or more languages: SQL, Python, Java.
• Experience with software engineering best practices, including version control (Git, GitLab), CI/CD (GitLab, Jenkins, etc.), automated unit testing, and DevOps.
• Exposure to Jira or Jira Align.

Good-to-Have Skills:
• Knowledge of the R language.
• Experience with cloud technologies, AWS preferred.
• Cloud certifications: AWS, Databricks, Microsoft.
• Familiarity with AI tools for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent.
• Knowledge of Agile and DevOps practices.
• Skills in disaster recovery planning.
• Familiarity with load testing tools (JMeter, Gatling).
• Basic understanding of AI/ML for monitoring.
• Knowledge of distributed systems and microservices.
• Data visualization skills (Tableau, Power BI).
• Strong communication and leadership skills.
• Understanding of compliance and auditing requirements.

Soft Skills:
• Excellent analytical and problem-solving skills.
• Excellent written and verbal communication skills (English), translating technology content into business language at various levels.
• Ability to work effectively with global, virtual teams.
• High degree of initiative and self-motivation.
• Ability to manage multiple priorities successfully.
• Team-oriented, with a focus on achieving team goals.
• Strong time and task management skills to estimate and successfully meet project timelines, bringing consistency and quality assurance across projects.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

12.0 - 16.0 years

40 - 45 Lacs

Gurugram

Work from Office


Overview

Enterprise Data Operations Assoc Manager

Job Overview: As Data Modelling Assoc Manager, you will be the key technical expert overseeing data modeling and will drive a strong vision for how data modelling can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data modelers who create data models for deployment in the Data Foundation layer, ingest data from various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data modelling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products in areas like revenue management, supply chain, manufacturing, and logistics. You will independently analyze project data needs, identify data storage and integration needs and issues, and drive opportunities for data model reuse while satisfying project requirements. The role advocates Enterprise Architecture, Data Design, and D&A standards and best practices. You will be a key technical expert performing all aspects of data modelling, working closely with the Data Governance, Data Engineering, and Data Architecture teams, and providing technical guidance to junior members of the team as needed. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy that supports future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
• Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
• Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and the construction of database objects for baseline and investment-funded projects, as assigned.
• Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications and reporting.
• Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
• Advocate existing Enterprise Data Design standards; assist in establishing and documenting new standards.
• Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
• Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
• Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
• Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design (PII management), all linked across fundamental identity foundations.
• Drive collaborative reviews of design, code, data, and security-feature implementation performed by data engineers to advance data product development.
• Assist with data planning, sourcing, collection, profiling, and transformation.
• Create source-to-target mappings for ETL and BI developers.
• Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); and data in transit.
• Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
• Partner with the data science team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
• Support data lineage and the mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications:
• 12+ years of overall technology experience, including at least 6+ years of data modelling and systems architecture.
• 6+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
• 6+ years of experience developing enterprise data models.
• 6+ years of cloud data engineering experience in at least one cloud (Azure, AWS, GCP).
• 6+ years of experience building solutions in the retail or supply chain space.
• Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models).
• Fluency with Azure cloud services; Azure certification is a plus.
• Experience scaling and managing a team of 5+ data modelers.
• Experience integrating multi-cloud services with on-premises technologies.
• Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
• Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
• Experience with version control systems like GitHub and with deployment and CI tools.
• Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
• Experience with metadata management, data lineage, and data glossaries is a plus.
• Working knowledge of agile development, including DevOps and DataOps concepts.
• Familiarity with business intelligence tools (such as Power BI).

Skills, Abilities, Knowledge:
• Excellent communication skills, both verbal and written, with the ability to influence and communicate confidently with senior-level management.
• Proven track record of leading, mentoring, hiring, and scaling data teams.
• Strong change manager; comfortable with change, especially that which arises through company growth.
• Ability to understand and translate business requirements into data and technical requirements.
• High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
• Positive and flexible attitude, adjusting to different needs in an ever-changing environment.
• Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
• Fosters a team culture of accountability, communication, and self-management.
• Proactively drives impact and engagement while bringing others along.
• Consistently attains or exceeds individual and team goals.
• Ability to lead others without direct authority in a matrixed environment.

Differentiating Competencies Required:
• Ability to work with virtual teams (remote work locations); lead a team of technical resources (employees and contractors) based in multiple locations across geographies.
• Lead technical discussions, driving clarity on complex issues and requirements to build robust solutions.
• Strong communication skills to meet with the business, understand sometimes-ambiguous needs, and translate them into clear, aligned requirements.
• Able to work independently with business partners to understand requirements quickly, perform analysis, and lead design review sessions.
• Highly influential, with the ability to educate challenging stakeholders on the role of data and its purpose in the business.
• Places the user at the center of decision making; teams up and collaborates for speed, agility, and innovation.
• Experience with, and embrace of, agile methodologies.
• Strong negotiation and decision-making skills.
• Experience managing and working with globally distributed teams.

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office


Overview

As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, and you will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products in areas like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users, in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities:
• Be a founding member of the data engineering team; help attract talent by networking with peers, representing PepsiCo HBS at conferences and other events, and discussing our values and best practices when interviewing candidates.
• Own data pipeline development end to end, spanning data modeling, testing, scalability, operability, and ongoing metrics.
• Ensure we build high-quality software by reviewing peer code check-ins.
• Define best practices for product development, engineering, and coding as part of a world-class engineering team.
• Collaborate in architecture discussions and architectural decision making as part of continually improving and expanding these platforms.
• Lead feature development in collaboration with other engineers; validate requirements and stories, assess current system capabilities, and decompose feature requirements into engineering tasks.
• Focus on delivering high-quality data pipelines and tools through careful analysis of system capabilities and feature requests, peer reviews, test automation, and collaboration with other engineers.
• Develop software in short iterations to quickly add business value.
• Introduce new tools and practices to improve data and code quality; this includes researching and sourcing third-party tools and libraries, as well as developing tools in-house to improve workflow and quality for all data engineers.
• Support data pipelines developed by your team through good exception handling, monitoring, and, when needed, debugging production issues.

Qualifications:
• 6-9 years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
• 4+ years of experience in SQL optimization and performance tuning.
• Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
• Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
• Experience with data profiling and data quality tools like Apache Griffin, Deequ, or Great Expectations.
• Current skills in the following technologies: Python; orchestration platforms (Airflow, Luigi, Databricks, or similar); relational databases (Postgres, MySQL, or equivalents); MPP data systems (Snowflake, Redshift, Synapse, or similar); cloud platforms (AWS, Azure, or similar); version control (e.g., GitHub) and deployment/CI-CD tools.
• Fluency with Agile processes and tools such as Jira or Pivotal Tracker.
• Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes is a plus.
• Understanding of metadata management, data lineage, and data glossaries is a plus.

Posted 1 week ago

Apply

4.0 - 7.0 years

15 - 27 Lacs

Chennai

Work from Office


We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

REQUIREMENTS:
• Total experience of 3+ years.
• Excellent knowledge of and experience in big data engineering.
• Strong experience with AWS services, especially S3, Glue, Athena, and EMR.
• Hands-on programming experience in Python, Spark, PySpark, and SQL.
• Proficiency in working with data warehouses such as Amazon Redshift or Snowflake.
• Experience handling structured and semi-structured data.
• Strong understanding of ETL/ELT processes and data transformation techniques.
• Proven experience in cross-functional collaboration with technical and business teams.
• Familiarity with data modeling, data warehousing, and building distributed systems.
• Expertise in Spanner for high-availability, scalable database solutions.
• Knowledge of data governance and security practices in cloud-based environments.
• A problem-solving mindset with the ability to tackle complex data engineering challenges.
• Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
• Experience creating technical documentation and solution designs.

RESPONSIBILITIES:
• Writing and reviewing great-quality code.
• Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
• Mapping decisions to requirements and translating them for developers.
• Identifying different solutions and narrowing down the best option that meets the client's requirements.
• Defining guidelines and benchmarks for NFR considerations during project implementation.
• Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for developers.
• Reviewing architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
• Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
• Understanding and relating technology integration scenarios and applying these learnings in projects.
• Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken.
• Carrying out POCs to make sure that suggested designs and technologies meet the requirements.

Posted 1 week ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Pune, Bengaluru

Hybrid


Key Responsibilities:
• Develop, maintain, and optimize complex SQL queries and DBT models for business analytics and reporting.
• Analyze large datasets stored in Snowflake to extract actionable insights and support data-driven decision-making.
• Design and implement robust data pipelines using Python, ensuring data quality, integrity, and availability.
• Collaborate with cross-functional teams to gather business requirements and translate them into technical solutions.
• Leverage tools like Fivetran and Airflow to orchestrate and automate data workflows.
• Contribute to version control and CI/CD processes using Git and Jenkins.
• Support data infrastructure hosted on AWS, ensuring scalability and security.
• Document data models, processes, and best practices using tools such as SQL DBM.

Required Skills and Qualifications:

Primary skills:
• Proficiency in Snowflake, Python, and SQL for data analysis and transformation.
• Experience with DBT for building scalable and modular analytics workflows.

Secondary skills:
• Familiarity with Fivetran for data ingestion and Airflow for workflow orchestration.
• Knowledge of Git and Jenkins for version control and automation.
• Experience with AWS cloud services for data storage and compute.
• Understanding of SQL DBM or similar tools for data modeling and documentation.

A Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field is required.
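
To illustrate the DBT-on-Snowflake workflow this role centers on, here is a small hypothetical incremental dbt model (the model and source names are invented):

```sql
-- models/fct_daily_orders.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='order_date') }}

SELECT
    order_date,
    COUNT(*)    AS order_count,
    SUM(amount) AS total_amount
FROM {{ ref('stg_orders') }}
{% if is_incremental() %}
-- On incremental runs, only reprocess days newer than the target's max.
WHERE order_date > (SELECT MAX(order_date) FROM {{ this }})
{% endif %}
GROUP BY order_date
```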

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office


About the Role: We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. You will be responsible for analyzing existing Alteryx workflows, documenting their logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The ideal candidate has solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.

Key Responsibilities:
• Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
• Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
• Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance.
• Write advanced SQL queries and stored procedures, using Snowflake-specific features like Streams, Tasks, Time Travel, and Zero-Copy Cloning.
• Implement data ingestion strategies using Snowpipe, stages, and external tables.
• Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
• Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
• Handle data cleansing, enrichment, aggregation, and business-logic implementation within Snowflake.
• Suggest improvement and automation opportunities during migration.
• Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
• Maintain version control, documentation, and an audit trail for all converted workflows.

Required Skills:
• Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
• At least 4 years of hands-on experience designing and developing scalable data solutions on the Snowflake Data Cloud platform.
• Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
• 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
• Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
• Python programming experience focused on data engineering.
• Experience with data APIs and batch/stream processing.
• Solid understanding of data transformation logic: joins, unions, filters, formulas, aggregations, pivots, and transpositions.
• Experience in performance tuning and optimization of SQL queries in Snowflake.
• Familiarity with Snowflake features like CTEs, window functions, Tasks, Streams, Stages, and External Tables.
• Exposure to migration or modernization projects from ETL tools (like Alteryx or Informatica) to SQL-based cloud platforms.
• Strong documentation skills and attention to detail.
• Experience working in Agile/Scrum development environments.
• Good communication and collaboration skills.
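
Since this role calls out Streams, Tasks, and Time Travel explicitly, here is a compact sketch of those features working together; all object names are hypothetical:

```sql
-- Stream: captures inserts/updates on the source table since last read.
CREATE OR REPLACE STREAM stg.orders_stream ON TABLE stg.orders;

-- Task: periodically merges the captured changes into the target.
CREATE OR REPLACE TASK stg.merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
AS
  MERGE INTO marts.orders t
  USING stg.orders_stream s ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount)
                        VALUES (s.order_id, s.amount);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK stg.merge_orders_task RESUME;

-- Time Travel: query the target as it looked one hour ago.
SELECT COUNT(*) FROM marts.orders AT(OFFSET => -3600);
```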

Posted 1 week ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Requirements:
• Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
• A minimum of 6 years of relevant IT experience.
• Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
• Proficiency in DBT (Data Build Tool) for data transformation and modelling.
• Experience with ETL/ELT processes and integrating data from multiple sources.
• Experience designing Tableau dashboards, data visualizations, and reports.
• Familiarity with data warehousing concepts and best practices.
• Strong problem-solving skills and the ability to work in cross-functional teams.

Posted 1 week ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Experience: 5+ years
Job Title: Platform Administrator / Data Platform Administrator

Job Summary: We are seeking a highly skilled Platform Administrator with expertise in cloud solutions, data platform management, and data pipeline orchestration. The ideal candidate will have a strong background in AWS architecture, experience with tools like Airflow, Informatica IDMC, and Snowflake, and a proven ability to design, implement, and maintain robust data platforms that support scalable, secure, and cost-effective operations.

Key Responsibilities:
• Administer and manage cloud-based data platforms, ensuring high availability, scalability, and performance optimization.
• Architect and implement solutions on AWS to support data integration, transformation, and analytics needs.
• Leverage AWS services (such as EC2, S3, Lambda, RDS, and Redshift) to design and deploy secure and efficient cloud infrastructures.
• Manage and optimize data pipelines using Apache Airflow, ensuring reliable scheduling and orchestration of ETL processes.
• Oversee data integration and management processes using Informatica IDMC for data governance, quality, and privacy in the cloud.
• Administer and optimize Snowflake data warehousing solutions for querying, storage, and data analysis.
• Collaborate with cross-functional teams to ensure seamless data flow, system integration, and business intelligence capabilities.
• Ensure compliance with industry standards and best practices for cloud security, data privacy, and cost management.

Required Qualifications:
• AWS Certified Solutions Architect - Professional or equivalent.
• Strong experience in platform administration and cloud architecture.
• Hands-on experience with Airflow, Informatica IDMC, and Snowflake.
• Proficiency in designing, deploying, and managing cloud-based solutions in AWS.
• Familiarity with data integration, ETL processes, and data governance best practices.
• Knowledge of scripting languages (Python, Shell, etc.) for automation and workflow management.
• Excellent troubleshooting, problem-solving, and performance-optimization skills.
• Strong communication skills and the ability to collaborate with teams across technical and non-technical disciplines.

Preferred Qualifications:
• Experience in multi-cloud environments.
• Knowledge of containerization technologies (e.g., Docker, Kubernetes).
• Familiarity with data visualization tools and business intelligence platforms.
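
One concrete example of the cost-control side of Snowflake platform administration is a resource monitor; a minimal sketch follows, with the quota and warehouse name invented (creating monitors requires the ACCOUNTADMIN role):

```sql
-- Cap monthly credit consumption and suspend the warehouse at the limit.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a (hypothetical) warehouse.
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;
```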

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
• Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
• Develop data architecture frameworks, standards, and principles, covering modeling, metadata, security, and reference data.
• Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
• Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
• Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
• Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
• 4+ years of experience in data architecture, data engineering, or a related field.
• Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
• Must be strong in SQL.
• Proven track record of contributing to data projects and working in complex environments.
• Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
• Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Job Summary: We are seeking an experienced Data Engineer with expertise in Snowflake and PL/SQL to design, develop, and optimize scalable data solutions. The ideal candidate will be responsible for building robust data pipelines, managing integrations, and ensuring efficient data processing within the Snowflake environment. This role requires a strong background in SQL, data modeling, and ETL processes, along with the ability to troubleshoot performance issues and collaborate with cross-functional teams.

Responsibilities:
• Design, develop, and maintain data pipelines in Snowflake to support business analytics and reporting.
• Write optimized PL/SQL queries, stored procedures, and scripts for efficient data processing and transformation.
• Integrate and manage data from various structured and unstructured sources into the Snowflake data platform.
• Optimize Snowflake performance by tuning queries, managing workloads, and implementing best practices.
• Collaborate with data architects, analysts, and business teams to develop scalable, high-performing data solutions.
• Ensure data security, integrity, and governance while handling large-scale datasets.
• Automate and streamline ETL/ELT workflows for improved efficiency and data consistency.
• Monitor, troubleshoot, and resolve data quality issues, performance bottlenecks, and system failures.
• Stay current on Snowflake advancements, best practices, and industry trends to enhance data engineering capabilities.

Required Skills:
• Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
• Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
• Hands-on expertise in PL/SQL, including writing and optimizing complex queries, stored procedures, and functions.
• Proven ability to work with large datasets, data warehousing concepts, and cloud-based data management.
• Proficiency in SQL, data modeling, and database performance tuning.
• Experience with ETL/ELT processes and integrating data from multiple sources.
• Familiarity with cloud platforms such as AWS, Azure, or GCP is an added advantage.
• Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced) are a plus.
• Strong analytical skills, problem-solving abilities, and attention to detail.
• Excellent communication skills and the ability to work effectively in a collaborative environment.
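
As one example of the performance-tuning work described above, here is a hypothetical query against Snowflake's INFORMATION_SCHEMA QUERY_HISTORY table function to surface tuning candidates (ACCOUNT_USAGE offers a longer retention window):

```sql
-- Find the ten slowest queries from the last 24 hours.
SELECT query_id,
       total_elapsed_time / 1000 AS elapsed_seconds,
       query_text
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(
       END_TIME_RANGE_START => DATEADD('day', -1, CURRENT_TIMESTAMP())))
ORDER BY total_elapsed_time DESC
LIMIT 10;
```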

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Job Description

Job Title: ETL Tester

Job Responsibilities:
• Design and execute test cases for ETL processes to validate data accuracy and integrity.
• Collaborate with data engineers and developers to understand ETL workflows and data transformations.
• Use Tableau to create visualizations and dashboards that support data analysis and reporting.
• Work with Snowflake to test and validate data stored in the cloud data warehouse.
• Identify, document, and track defects and issues in the ETL process.
• Perform data profiling and data quality assessments.
• Create and maintain test documentation, including test plans, test scripts, and test results.
• Exposure to Salesforce and proficiency in developing SQL queries.

The ideal candidate will have a strong background in ETL processes and data validation, along with experience in Tableau and Snowflake. You will be responsible for ensuring the quality and accuracy of data as it moves through the ETL pipeline.
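
For context, ETL validation of this kind often reduces to reconciliation queries; a minimal example comparing a staging table with its warehouse target (table and column names hypothetical):

```sql
-- Row-count reconciliation between source staging and target tables.
SELECT (SELECT COUNT(*) FROM stg.customers) AS source_rows,
       (SELECT COUNT(*) FROM dw.customers)  AS target_rows;

-- Rows loaded into staging but missing from the target.
SELECT customer_id, email FROM stg.customers
MINUS
SELECT customer_id, email FROM dw.customers;
```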

Posted 1 week ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
• Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
• Develop data architecture frameworks, standards, and principles, covering modeling, metadata, security, and reference data.
• Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
• Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
• Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
• Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
• 6+ years of experience in data architecture, data engineering, or a related field.
• Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
• Must have hands-on exposure to Airflow.
• Proven track record of contributing to data projects and working in complex environments.
• Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
• Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.

Posted 1 week ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office


JD: QA resource with experience in Salesforce Marketing Cloud (SFMC) and some programming experience. The work is primarily testing, potentially including building some email templates. Testing will cover tables and data in Snowflake as well as SFMC.

Posted 1 week ago

Apply

9.0 - 14.0 years

22 - 35 Lacs

Gurugram, Bengaluru, Delhi / NCR

Work from Office


Requirement: Data Architect & Business Intelligence
Experience: 9+ years
Location: Gurgaon (Remote)
Preferred: Immediate joiners

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
• Design and implement scalable and efficient data architecture solutions for enterprise applications.
• Develop and maintain robust data models that support business intelligence and analytics.
• Build data warehouses to support structured and unstructured data storage needs.
• Optimize data pipelines, ETL processes, and real-time data processing.
• Work with business stakeholders to define data strategies that support analytics and reporting.
• Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
• Establish and enforce data governance, security policies, and best practices.
• Conduct performance tuning and optimization for large-scale databases and data processing systems.
• Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
• Strong experience in data architecture, data warehousing, and data modeling.
• Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
• Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
• Experience designing scalable, high-performance, and secure data environments.
• Ability to work with big data frameworks and BI tools for reporting and visualization.
• Strong analytical, problem-solving, and communication skills.
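
Among the Snowflake capabilities an architect in this space leans on is secure data sharing, which publishes data to a consumer account without copying it. A minimal provider-side sketch, with the database, schema, table, and account locator all invented:

```sql
-- Publish one table to a consumer account without copying any data.
CREATE SHARE IF NOT EXISTS sales_share;
GRANT USAGE ON DATABASE analytics TO SHARE sales_share;
GRANT USAGE ON SCHEMA analytics.marts TO SHARE sales_share;
GRANT SELECT ON TABLE analytics.marts.orders TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;  -- hypothetical consumer account locator
```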

Posted 1 week ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may progress through roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
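
When preparing answers on virtual warehouses and Time Travel, two recurring topics above, it can help to have a small runnable sketch in hand; the warehouse and table names below are hypothetical:

```sql
-- Virtual warehouse: independent compute, billed only while running.
CREATE WAREHOUSE IF NOT EXISTS demo_wh
  WAREHOUSE_SIZE = XSMALL
  AUTO_SUSPEND = 60      -- suspend after 60 s idle to save credits
  AUTO_RESUME = TRUE;

-- Time Travel: recover a table to its state ten minutes ago via
-- a zero-copy clone.
CREATE TABLE orders_restored CLONE orders AT(OFFSET => -600);
```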

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
