Home
Jobs

1082 Snowflake Jobs - Page 11

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

9.0 - 14.0 years

11 - 16 Lacs

Hyderabad

Work from Office

Role Description: We are seeking a seasoned Solution Architect to drive the architecture, development, and implementation of data solutions for Amgen functional groups. The ideal candidate is able to work on large-scale data analytics initiatives and engage with Business, Program Management, Data Engineering, and Analytics Engineering teams, championing the enterprise data analytics strategy, data architecture blueprints, and architectural guidelines. As a Solution Architect, you will play a crucial role in designing, building, and optimizing data solutions for Amgen functional groups such as R&D, Operations, and GCO.

Roles & Responsibilities:
• Implement and manage large-scale data analytics solutions for Amgen functional groups that align with the Amgen data strategy
• Collaborate with Business, Program Management, Data Engineering, and Analytics Engineering teams to deliver data solutions
• Design, develop, optimize, deliver, and support data solutions on AWS and Databricks architecture
• Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
• Provide expert guidance and mentorship to team members, fostering a culture of innovation and best practices
• Be passionate and hands-on; quickly experiment with new data-related technologies
• Define guidelines, standards, strategies, security policies, and change management policies to support the Enterprise Data Platform
• Collaborate and align with EARB, Cloud Infrastructure, Security, and other technology leaders on Enterprise Data Architecture changes
• Work with different project and application groups to drive growth of the Enterprise Data Platform using effective written/verbal communication skills, and lead demos at roadmap sessions
• Manage the Enterprise Data Platform on the AWS environment to ensure that service delivery is cost-effective and business SLAs around uptime, performance, and capacity are met
• Ensure scalability, reliability, and performance of data platforms by implementing best practices for architecture, cloud resource optimization, and system tuning
• Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible
• Maintain knowledge of market trends and developments in data integration, data management, and analytics software/tools
• Work as part of a team in a SAFe Agile/Scrum model

Basic Qualifications and Experience: Master's degree with 6-8 years of experience in Computer Science, IT, or a related field, OR Bachelor's degree with 9-12 years of experience in Computer Science, IT, or a related field.

Functional Skills (Must-Have):
• 7+ years of hands-on experience in data integration, data management, and the BI technology stack
• Strong experience with one or more data management tools such as AWS data lake, Snowflake, or Azure Data Fabric
• Expert-level proficiency with Databricks and experience optimizing data pipelines and workflows in Databricks environments
• Strong experience with Python, PySpark, and SQL for building scalable data workflows and pipelines
• Experience with Apache Spark, Delta Lake, and other relevant technologies for large-scale data processing
• Familiarity with BI tools including Tableau and Power BI
• Demonstrated ability to enhance cost-efficiency, scalability, and performance of data solutions
• Strong analytical and problem-solving skills to address complex data problems

Good-to-Have Skills:
• Experience in life sciences, tech, or consultative solution architecture roles
• Experience working with agile development methodologies such as Scaled Agile

Professional Certifications:
• AWS Certified Data Engineer (preferred)
• Databricks certification (preferred)

Soft Skills:
• Excellent analytical and troubleshooting skills
• Strong verbal and written communication skills
• Ability to work effectively with global, virtual teams
• High degree of initiative and self-motivation
• Ability to manage multiple priorities successfully
• Team-oriented, with a focus on achieving team goals
• Strong presentation and public speaking skills
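
As a rough illustration of the Databricks/PySpark pipeline work this role describes, a minimal cleansing job might look like the sketch below. This is not code from the employer: the bucket paths and column names are hypothetical placeholders, and the Delta Lake package is assumed to be available on the cluster (it is pre-installed on Databricks).

```python
# Illustrative PySpark job: read raw JSON, de-duplicate, write a Delta table.
# Paths and columns are placeholders, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-ingest").getOrCreate()

# Read raw landing-zone files (placeholder path).
raw = spark.read.json("s3://example-bucket/landing/sales/")

clean = (
    raw.dropDuplicates(["order_id"])                        # drop replayed records
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize timestamps
)

# Delta Lake gives ACID tables suitable for downstream analytics.
clean.write.format("delta").mode("overwrite").save("s3://example-bucket/curated/sales/")
```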

Posted 1 week ago

Apply

6.0 - 10.0 years

18 - 25 Lacs

Bengaluru

Work from Office

Key Responsibilities:
• Analyzes and solves problems using technical experience, judgment, and precedents
• Provides informal guidance to new team members
• Explains complex information to others in straightforward situations

1. Data Engineering and Modelling:
• Design & Develop Scalable Data Pipelines: Leverage AWS technologies to design, develop, and manage end-to-end data pipelines with services like ETL, Kafka, DMS, Glue, Lambda, and Step Functions.
• Orchestrate Workflows: Use Apache Airflow to build, deploy, and manage automated workflows, ensuring smooth and efficient data processing and orchestration.
• Snowflake Data Warehouse: Design, implement, and maintain Snowflake data warehouses, ensuring optimal performance, scalability, and seamless data availability.
• Infrastructure Automation: Utilize Terraform and CloudFormation to automate cloud infrastructure provisioning, ensuring efficiency, scalability, and adherence to security best practices.
• Logical & Physical Data Models: Design and implement high-performance logical and physical data models using Star and Snowflake schemas that meet both technical and business requirements.
• Data Modeling Tools: Utilize Erwin or similar modeling tools to create, maintain, and optimize data models, ensuring they align with evolving business needs.
• Continuous Optimization: Actively monitor and improve data models to ensure they deliver the best performance, scalability, and security.

2. Collaboration, Communication, and Continuous Improvement:
• Cross-Functional Collaboration: Work closely with data scientists, analysts, and business stakeholders to gather requirements and deliver tailored data solutions that meet business objectives.
• Data Security Expertise: Provide guidance on data security best practices and ensure team members follow secure coding and data handling procedures.
• Innovation & Learning: Stay abreast of emerging trends in data engineering, cloud computing, and data security to recommend and implement innovative solutions.
• Optimization & Automation: Proactively identify opportunities to optimize system performance, enhance data security, and automate manual workflows.

Key Skills & Expertise:
• Snowflake Data Warehousing: Hands-on experience with Snowflake, including performance tuning, role-based access controls, dynamic masking, data sharing, encryption, and row/column-level security.
• Data Modeling: Expertise in physical and logical data modeling, specifically with Star and Snowflake schemas, using tools like Erwin or similar.
• AWS Services Proficiency: In-depth knowledge of AWS services like ETL, DMS, Glue, Step Functions, Airflow, Lambda, CloudFormation, S3, IAM, EKS, and Terraform.
• Programming & Scripting: Strong working knowledge of Python, R, Scala, PySpark, and SQL (including stored procedures).
• DevOps & CI/CD: Solid understanding of CI/CD pipelines, DevOps principles, and infrastructure-as-code practices using tools like Terraform, JFrog, Jenkins, and CloudFormation.
• Analytical & Troubleshooting Skills: Proven ability to solve complex data engineering issues and optimize data workflows.
• Excellent Communication: Strong interpersonal and communication skills, with the ability to work across teams and with stakeholders to drive data-centric projects.

Qualifications & Experience:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 7-8 years of experience designing and implementing large-scale Data Lake/Warehouse integrations with diverse data storage solutions.
• Certifications: AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect (preferred). Snowflake Advanced Architect and/or Snowflake Core certification (required).
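
The Airflow orchestration described above can be pictured with a minimal DAG. This is an illustrative sketch only, not the employer's code: the DAG id, task, and load logic are hypothetical, and a real pipeline would use a Snowflake operator or connector with managed credentials.

```python
# Minimal Airflow 2.x DAG sketch: a daily S3-to-Snowflake load step.
# All names here (dag_id, task_id, the load function) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_snowflake():
    # Placeholder: a real task would copy staged S3 files into Snowflake,
    # e.g. via snowflake-connector-python and a COPY INTO statement.
    print("Loading staged files into Snowflake...")


with DAG(
    dag_id="s3_to_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
```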

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

SQL, Snowflake, Tableau
SQL, Snowflake, DBT, Data Warehousing
SQL, Snowflake, Python, DBT, Data Warehousing
SQL, Snowflake, Data Warehousing, any ETL tool (preferred: Matillion)

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Hybrid

We are looking for a Snowflake Developer with deep expertise in Snowflake and DBT or SQL to help us build and scale our modern data platform.

Key Responsibilities:
• Design and build scalable ELT pipelines in Snowflake using DBT/SQL.
• Develop efficient, well-tested DBT models (staging, intermediate, and marts layers).
• Implement data quality, testing, and monitoring frameworks to ensure data reliability and accuracy.
• Optimize Snowflake queries, storage, and compute resources for performance and cost-efficiency.
• Collaborate with cross-functional teams to gather data requirements and deliver data solutions.

Required Qualifications:
• 5+ years of experience as a Data Engineer, with at least 4 years working with Snowflake.
• Proficient with DBT (Data Build Tool), including Jinja templating, macros, and model dependency management.
• Strong understanding of ELT patterns and modern data stack principles.
• Advanced SQL skills and experience with performance tuning in Snowflake.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
• Candidate's name
• Email and alternate email ID
• Contact and alternate contact no.
• Total experience
• Relevant experience
• Current organization
• Notice period
• CCTC
• ECTC
• Current location
• Preferred location
• PAN card no.
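
To make the staging-layer responsibilities above concrete, here is a hedged sketch of the kind of ELT transform a DBT staging model typically encapsulates, executed here directly through the Snowflake Python connector. The account details and table names are invented placeholders, not part of the posting.

```python
# Sketch of a staging-layer ELT transform run against Snowflake.
# Credentials and object names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="...",         # placeholder
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# De-duplicate raw orders and normalize the timestamp: the kind of logic
# a dbt model named stg_orders would hold as templated SQL.
conn.cursor().execute("""
    CREATE OR REPLACE TABLE stg_orders AS
    SELECT order_id,
           customer_id,
           TRY_TO_TIMESTAMP(order_ts) AS order_ts
    FROM raw.orders
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY order_ts DESC) = 1
""")
conn.close()
```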

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 18 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Key Responsibilities
• Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
• Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
• Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
• Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
• Ensure data quality and consistency by implementing validation and governance practices.
• Work on data security best practices in compliance with organizational policies and regulations.
• Automate repetitive data engineering tasks using Python scripts and frameworks.
• Leverage CI/CD pipelines for deployment of data workflows on AWS.

Required Skills and Qualifications
• Professional Experience: 5+ years of experience in data engineering or a related field.
• Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
• AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
  • AWS Glue for ETL/ELT and S3 for storage
  • Redshift or Athena for data warehousing and querying
  • Lambda for serverless compute
  • Kinesis or SNS/SQS for data streaming
  • IAM roles for security
• Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
• Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
• DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
• Version Control: Proficient with Git-based workflows.
• Problem Solving: Excellent analytical and debugging skills.

Optional Skills
• Knowledge of data modeling and data warehouse design principles.
• Experience with data visualization tools (e.g., Tableau, Power BI).
• Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
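
For flavor, a minimal boto3 sketch of the automation work described above: kicking off an AWS Glue ETL job and polling until it finishes. The job name and region are hypothetical, and boto3 is assumed to find AWS credentials in the environment.

```python
# Trigger an AWS Glue job run and wait for a terminal state (sketch).
# "orders-etl" and the region are placeholders, not from the posting.
import time

import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run_id = glue.start_job_run(JobName="orders-etl")["JobRunId"]

while True:
    state = glue.get_job_run(JobName="orders-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # Glue jobs are slow to start; poll gently

print(f"Job finished with state: {state}")
```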

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 17 Lacs

Pune

Remote

We're Hiring! | Senior Data Engineer (Remote)
Location: Remote | Shift: US CST | Department: Data Engineering

Are you a data powerhouse who thrives on solving complex data challenges? Do you love working with Python, AWS, and cutting-edge data tools? If yes, Atidiv wants YOU! We're looking for a Senior Data Engineer to build and scale data pipelines, transform how we manage data lakes and warehouses, and power real-time data experiences across our products.

What You'll Do:
• Architect and develop robust, scalable data pipelines using Python & PySpark
• Drive real-time & batch data ingestion from diverse data sources
• Build and manage data lakes and data warehouses using AWS (S3, Glue, Redshift, EMR, Lambda, Kinesis)
• Write high-performance SQL queries and optimize ETL/ELT jobs
• Collaborate with data scientists, analysts, and engineers to ensure high data quality and availability
• Implement monitoring, logging & alerting for workflows
• Ensure top-tier data security, compliance & governance

What We're Looking For:
• 5+ years of hands-on experience in Data Engineering
• Strong skills in Python, DBT, SQL, and working with Snowflake
• Proven experience with Airflow, Kafka/Kinesis, and the AWS ecosystem
• Deep understanding of CI/CD practices
• Passion for clean code, automation, and scalable systems

Why Join Atidiv?
• 100% Remote | Flexible Work Culture
• Opportunity to work with cutting-edge technologies
• Collaborative, supportive team that values innovation and ownership
• Work on high-impact, global projects

Ready to transform data into impact? Send your resume to: nitish.pati@atidiv.com

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Noida

Work from Office

Hi All, we are urgently hiring a "Snowflake Developer" with a reputed client for the Noida location.

Experience: 6-10 years
### Looking for IMMEDIATE JOINERS, up to the 3rd week of June ###

Mission: Snowflake developer
• Python
• SQL
• DBT

Interested candidates, kindly share your resume at gayatri.pat@peoplefy.com

Posted 1 week ago

Apply

4.0 - 6.0 years

14 - 24 Lacs

Hyderabad

Hybrid

Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
• Design, develop, test, and maintain scalable ETL data pipelines using Python.
• Work extensively on Google Cloud Platform (GCP) services such as:
  • Dataflow for real-time and batch data processing
  • Cloud Functions for lightweight serverless compute
  • BigQuery for data warehousing and analytics
  • Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  • Google Cloud Storage (GCS) for managing data at scale
  • IAM for access control and security
  • Cloud Run for containerized applications
• Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
• Implement and enforce data quality checks, validation rules, and monitoring.
• Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
• Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
• Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
• Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
• 4-6 years of hands-on experience in Python for backend or data engineering projects.
• Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
• Solid understanding of data pipeline architecture, data integration, and transformation techniques.
• Experience with version control systems like GitHub and knowledge of CI/CD practices.
• Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
• Experience working with the Snowflake cloud data platform.
• Hands-on knowledge of Databricks for big data processing and analytics.
• Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:
• Excellent problem-solving and analytical skills.
• Strong communication skills and ability to collaborate in a team environment.
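
As a hedged illustration of the GCS-to-BigQuery ingestion described above, a minimal load job with the google-cloud-bigquery client might look like the sketch below. The bucket, project, and table names are invented, and authentication is assumed to use application-default credentials.

```python
# Load a CSV from Cloud Storage into a BigQuery table (sketch).
# URI and table id are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # header row
    autodetect=True,      # infer schema for the sketch
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/orders/2024-01-01.csv",  # placeholder source
    "example-project.analytics.orders",           # placeholder destination
    job_config=job_config,
)
load_job.result()  # block until the load completes

print(f"Loaded {client.get_table('example-project.analytics.orders').num_rows} rows")
```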

Posted 1 week ago

Apply

12.0 - 20.0 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid

We are looking for a skilled Data Engineer to join our growing data team. The ideal candidate will be responsible for designing, building, and maintaining data pipelines and infrastructure to support data analytics and business intelligence needs. A strong foundation in cloud data platforms, data transformation tools, and programming is essential.

Key Responsibilities:
• Design and implement scalable data pipelines using Azure Data Lake and dbt.
• Ingest and transform data from various sources including databases, APIs, flat files, JSON, and XML.
• Work with Snowflake to optimize storage, processing, and query performance.
• Collaborate with analysts and business users to understand data requirements.
• Integrate workflows using Apache Airflow for orchestration.
• Support and contribute to the development of reporting and visualization tools using Power BI (good to have).
• Write clean, efficient, and maintainable code using Python.
• Ensure data quality, integrity, and governance across pipelines and platforms.

Must-Have Skills:
• Strong experience with Azure Data Lake.
• Proficient in dbt for data transformation and modeling.
• Excellent programming skills in Python.
• Experience working with data from diverse sources: databases, APIs, flat files, JSON, and XML.

Should-Have Skills:
• Good working knowledge of Snowflake, beyond basic querying.
• Familiarity with Apache Airflow for job orchestration and scheduling.

Good to Have:
• Exposure to Power BI for creating dashboards and visual reports

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.

Do
1. Instrumental in understanding the requirements and design of the product/software
• Develop software solutions by studying information needs, systems flow, data usage, and work processes
• Investigate problem areas throughout the software development life cycle
• Facilitate root cause analysis of system issues and problem statements
• Identify ideas to improve system performance and availability
• Analyze client requirements and convert requirements to feasible design
• Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
• Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development
• Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
• Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
• Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
• Analyze information to recommend and plan the installation of new systems or modifications of an existing system
• Ensure that code is error-free, with no bugs or test failures
• Prepare reports on programming project specifications, activities, and status
• Ensure all codes are raised as per the norm defined for the project/program/account, with clear descriptions and replication patterns
• Compile timely, comprehensive, and accurate documentation and reports as requested
• Coordinate with the team on daily project status and progress, and document it
• Provide feedback on usability and serviceability, trace results to quality risk, and report to concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
• Capture all requirements and clarifications from the client for better-quality work
• Take feedback on a regular basis to ensure smooth and on-time delivery
• Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
• Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
• Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
• Document necessary details and reports in a formal way for proper understanding of the software, from client proposal to implementation
• Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
• Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
1. Continuous Integration, Deployment & Monitoring of Software: 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & Reporting: 100% on-time MIS & report generation

Mandatory Skills: Snowflake.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Hybrid

About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: ETL Test Lead

Required skills and qualifications:
• ETL testing
• Data warehousing
• SQL
• Snowflake
• Python/PySpark (knowledge)

Qualification: Any Graduate or above
Relevant Experience: 7 to 12 years
Location: Hyderabad/Bangalore/Chennai
CTC Range: 20 to 25 LPA (fixed)
Notice Period: Immediate joiners/currently serving
Mode of Interview: Virtual
Mode of Work: Hybrid

Gayatri G
Staffing Analyst - IT Recruiter
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA
gayatri@blackwhite.in | www.blackwhite.in
+91 8067432472

Posted 1 week ago

Apply

9.0 - 14.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Job Description:
• SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
• ETL/ELT Tools: Experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines; extensive experience with ETL tools and data pipelines.
• Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
• Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
• Data Warehousing: Experience managing large datasets and data marts, and optimizing databases for performance.
• Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.

Role & Responsibilities:
• Build data pipelines for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
• Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
• Work with data and analytics experts to strive for greater functionality in our data systems.
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
• Unit-test databases and perform bug fixes.
• Develop best practices for database design and development activities.
• Take on technical leadership responsibilities for database projects across various scrum teams.
• Manage exploratory data analysis to support dashboard development (desirable).

Required Skills:
• Strong experience in SQL, with expertise in a relational database (PostgreSQL preferable, cloud-hosted in AWS/Azure/GCP) or any cloud-based data warehouse (like Snowflake or Azure Synapse).
• Competence in data preparation and/or ETL/ELT tools like SnapLogic, StreamSets, DBT, etc. (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows handling large volumes of data.
• Understanding of data modelling techniques and working knowledge of OLAP systems.
• Deep knowledge of databases, data marts, data warehouse enterprise systems, and handling of large datasets.
• In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
• Ability to fine-tune report-generating queries.
• Solid understanding of normalization and denormalization of data, database exception handling, profiling queries, performance counters, debugging, and database & query optimization techniques.
• Understanding of index design and performance-tuning techniques.
• Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
• Experience in understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
• Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions.
• Exposure to source control tools like Git and Azure DevOps.
• Understanding of Agile methodologies (Scrum, Kanban).
• Experience with NoSQL databases and migrating data into other types of databases with real-time replication (desirable).
• Experience with CI/CD automation tools (desirable).
• Programming experience in Golang, Python, or any programming language, plus visualization tools (Power BI/Tableau) (desirable).
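
One concrete slice of the query-tuning work listed above, sketched with psycopg2 against a hypothetical PostgreSQL table; the connection string, table, and column are placeholders, not from the posting.

```python
# Inspect a slow query's plan, then add a supporting index (sketch).
# Connection string and object names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=analytics user=report password=...")  # placeholder DSN
cur = conn.cursor()

# EXPLAIN ANALYZE runs the query and reports the actual plan and timings.
cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,))
for (line,) in cur.fetchall():
    print(line)

# If the plan shows a sequential scan on a selective predicate,
# an index on the filter column is the usual first fix.
cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id)")
conn.commit()
conn.close()
```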

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 20 Lacs

Pune, Chennai, Bengaluru

Hybrid

JD: Snowflake Implementer
Designing, implementing, and managing Snowflake data warehouse solutions, ensuring data integrity, and optimizing performance for clients or internal teams.
• Strong SQL skills: Expertise in writing, optimizing, and troubleshooting SQL queries.
• Experience with data warehousing: Understanding of data warehousing concepts, principles, and best practices.
• Knowledge of ETL/ELT technologies: Experience with tools and techniques for data extraction, transformation, and loading.
• Experience with data modeling: Ability to design and implement data models that meet business requirements.
• Familiarity with cloud platforms: Experience with cloud platforms like AWS, Azure, or GCP (depending on the specific Snowflake environment).
• Problem-solving and analytical skills: Ability to identify, diagnose, and resolve technical issues.
• Communication and collaboration skills: Ability to work effectively with cross-functional teams.
• Experience with Snowflake (preferred): Prior experience with Snowflake is highly desirable.
• Certifications (preferred): Snowflake certifications (e.g., Snowflake Data Engineer, Snowflake Database Administrator) can be a plus.

Posted 1 week ago

Apply

10.0 - 12.0 years

13 - 18 Lacs

Noida, Indore, Bengaluru

Work from Office

Primary Tool & Expertise: Power BI and semantic modelling
Key Project Focus: Leading the migration of legacy reporting systems (Cognos or MicroStrategy) to Power BI solutions.

Core Responsibilities:
• Build and develop optimized semantic models, metrics, and complex reports and dashboards
• Work closely with the business analysts/BI teams to help business teams drive improvement in key business metrics and customer experience
• Responsible for timely, quality, and successful deliveries
• Share knowledge and experience within the team and other groups in the organization
• Lead teams of BI engineers: translate designed solutions to them, review their work, and provide guidance
• Manage client communications and deliverables

Roles and Responsibilities
Core BI Skills:
• Power BI (semantic modelling, DAX, Power Query, Power BI Service)
• Data Warehousing
• Data Modeling
• Data Visualization
• SQL

Data Warehouse, Database & Querying:
• Strong skills in databases (Oracle/MySQL/DB2/Postgres) and expertise in writing SQL queries.
• Experience with cloud-based data intelligence platforms like Databricks or Snowflake.
• Strong understanding of data warehousing and data modelling concepts and principles.
• Strong skills and experience in creating semantic models in Power BI or similar tools.

Additional BI & Data Skills (Good to Have):
• Certifications in Power BI and any data platform
• Experience in other tools like MicroStrategy and Cognos
• Proven experience in migrating existing BI solutions to Power BI or other modern BI platforms
• Experience with the broader Power Platform (Power Automate, Power Apps) to create integrated solutions
• Knowledge and experience of Power BI admin features such as versioning, usage reports, capacity planning, and creation of deployment pipelines
• Sound knowledge of various forms of data analysis and presentation methodologies
• Experience in formal project management methodologies
• Exposure to multiple BI tools is desirable
• Experience with Generative BI implementation
• Working knowledge of scripting languages like Perl, Shell, or Python is desirable
• Exposure to one of the cloud providers: AWS/Azure/GCP

Soft Skills & Business Acumen:
• Exposure to multiple business domains (e.g., Insurance, Reinsurance, Retail, BFSI, Healthcare, Telecom) is desirable
• Exposure to the complete SDLC
• Out-of-the-box thinker, not limited to the work done in the projects
• Capable of working as an individual contributor and within a team
• Good communication, problem-solving, and interpersonal skills
• Self-starter and resourceful, skilled in identifying and mitigating risks

Posted 1 week ago

Apply

9.0 - 12.0 years

25 - 35 Lacs

Hyderabad

Hybrid

Experience: 12 years only
Notice period: Immediate/15 days
Location: Hyderabad
Client: Tech Star Group

Please highlight the mandatory skills in your resume.

Client feedback: In short, the client is primarily looking for a candidate with strong expertise in data-related skills, including:
• SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
• ETL/ELT Tools: Experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines; extensive experience with ETL tools and data pipelines.
• Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
• Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
• Data Warehousing: Experience managing large datasets, data marts, and optimizing databases for performance.
• Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.

Important: The candidate should have a strong data engineering background, with hands-on experience in handling large volumes of data, data pipelines, and cloud-based data systems.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Kolkata, Chennai, Bengaluru

Work from Office

Hiring for Snowflake Developer with experience range 2 years & above.
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc/MS
Location: PAN India

Posted 1 week ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters:
• Review the performance dashboard and the scores for the team
• Support the team in improving performance parameters by providing technical support and process guidance
• Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
• Ensure standard processes and procedures are followed to resolve all client queries
• Resolve client queries as per the SLAs defined in the contract
• Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
• Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
• Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
• Ensure all product information and disclosures are given to clients before and after the call/email requests
• Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries:
• Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
• If unable to resolve the issues, escalate them to TA & SES in a timely manner
• Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
• Troubleshoot all client queries in a user-friendly, courteous, and professional manner
• Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
• Organize ideas and effectively communicate oral messages appropriate to listeners and situations
• Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
• Mentor and guide Production Specialists on improving technical knowledge
• Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
• Develop and conduct trainings (triages) within products for Production Specialists as per target
• Inform the client about the triages being conducted
• Undertake product trainings to stay current with product features, changes, and updates
• Enroll in product-specific and any other trainings per client requirements/recommendations
• Identify and document the most common problems and recommend appropriate resolutions to the team
• Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, technical test performance

Mandatory Skills: Snowflake.

Posted 1 week ago

Apply

6.0 - 8.0 years

12 - 20 Lacs

Hyderabad

Hybrid

Experience: 6+ years
Shift: 2 PM to 11 PM

JD for SDET-1:
• 6+ years of hands-on experience in ETL & BI test automation and scripting.
• Hands-on experience in data integration testing across different sources using Snowflake and SQL Server.
• Develop and maintain UI automation test scripts using frameworks like WebdriverIO and Playwright with JavaScript.
• Experience in SQL is mandatory, with hands-on experience validating data accuracy using SQL and source-system comparisons.
• GenAI experience in test case design and script automation is required.
• Assess the performance and scalability of BI reports and dashboards.
• Collaborate with stakeholders across teams.
• Experience with Agile practices, test management, and defect management.
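
A hedged sketch of the source-vs-target validation mentioned above: comparing row counts between a SQL Server source and a Snowflake target. All connection details and table names are invented placeholders, and a real suite would also compare checksums or sampled rows, not just counts.

```python
# Minimal data-reconciliation check: source (SQL Server) vs target (Snowflake).
# Every credential and object name below is a placeholder.
import pyodbc
import snowflake.connector

src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-host;DATABASE=sales;UID=user;PWD=..."
)
src_count = src.cursor().execute("SELECT COUNT(*) FROM dbo.orders").fetchone()[0]

tgt = snowflake.connector.connect(account="my_account", user="me", password="...")
tgt_count = tgt.cursor().execute(
    "SELECT COUNT(*) FROM analytics.public.orders"
).fetchone()[0]

# Fail loudly on drift; a real framework would log and open a defect instead.
assert src_count == tgt_count, f"Row count mismatch: source={src_count}, target={tgt_count}"
print(f"Reconciled {src_count} rows")
```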

Posted 1 week ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Title: AWS, SQL, Snowflake, ControlM, ServiceNow - Operational Engineer (Weekend On-Call)
Req ID: 325686

We are currently seeking an AWS, SQL, Snowflake, ControlM, ServiceNow - Operational Engineer (Weekend on call) to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Minimum experience on key skills: 5 to 10 years

We are looking for an operational engineer who is ready to work weekends on call as the primary criterion. Skills we look for: AWS cloud (SQS, SNS, DynamoDB, EKS), SQL (PostgreSQL, Cassandra), Snowflake, ControlM/Autosys/Airflow, ServiceNow, Datadog, Splunk, Grafana, and Python/shell scripting.

Posted 1 week ago

Apply

12.0 - 15.0 years

13 - 17 Lacs

Pune

Work from Office

Title: Data & AI Technical Solution Architect
Req ID: 323749

We are currently seeking a Data & AI Technical Solution Architect to join our team in Pune, Maharashtra (IN-MH), India (IN).

Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to have conversations with the CEO, business owners, and CTO/CDO
• Break down intricate business challenges, devise effective solutions, and focus on client needs
• Craft high-level innovative solution approaches for complex business problems
• Utilize best practices and creativity to address challenges
• Leverage market research, formulate perspectives, and communicate insights to clients
• Establish strong client relationships
• Interact at appropriate levels to ensure client satisfaction

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically
• Excellent client service orientation
• Ability to work in high-pressure situations
• Ability to establish and manage processes and practices through collaboration and the understanding of business
• Ability to create new and repeat business for the organization
• Ability to contribute information on relevant vertical markets
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes, and tools

Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field
• Scaled Agile certification desirable
• Relevant consulting and technical certifications preferred, for example TOGAF

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment
• Very good understanding of Data, AI, Gen AI, and Agentic AI
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design
• Must be able to work on Data & AI RFP responses as a Solution Architect
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML & Gen AI as a Technical Architect
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs/RFQs; contribute to and participate in presentations to customers regarding proposed solutions
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools
• Experience with large-scale consulting and program execution engagements in AI and data
• Seasoned multi-technology infrastructure design experience
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management

Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization
• Uses evaluation, judgment, and interpretation to select the right course of action
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 14 Lacs

Chennai

Work from Office

Req ID: 324664
We are currently seeking a Data Architect to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities:
• Develop and articulate long-term strategic goals for the data architecture vision and establish data standards for enterprise systems.
• Utilize various cloud technologies, including Azure, AWS, and GCP, and data platforms like Databricks and Snowflake.
• Conceptualize and create an end-to-end vision outlining the seamless flow of data through successive stages.
• Institute processes for governing the identification, collection, and utilization of corporate metadata, ensuring accuracy and validity.
• Implement methods and procedures for tracking data quality, completeness, redundancy, compliance, and continuous improvement.
• Evaluate and determine governance, stewardship, and frameworks for effective data management across the enterprise.
• Develop comprehensive strategies and plans for data capacity planning, data security, life-cycle data management, scalability, backup, disaster recovery, business continuity, and archiving.
• Identify potential areas for policy and procedure enhancements, initiating changes where required for optimal data management.
• Formulate and maintain data models and establish policies and procedures for functional design.
• Offer technical recommendations to senior managers and technical staff in the development and implementation of databases and documentation.
• Stay informed about upgrades and emerging database technologies through continuous research.
• Collaborate with project managers and business leaders on all projects involving enterprise data.
• Document the data architecture and environment to ensure a current and accurate understanding of the overall data landscape.
• Design and implement data solutions tailored to meet customer needs and specific use cases.
• Provide thought leadership by recommending the most suitable technologies and solutions for various use cases, spanning from the application layer to infrastructure.

Basic Qualifications:
• 8+ years of hands-on experience with various database technologies
• 6+ years of experience with cloud-based systems and Enterprise Data Architecture, driving end-to-end technology solutions
• Experience with Azure, Databricks, and Snowflake
• Knowledgeable on concepts of GenAI
• Ability to travel at least 25%

Preferred Skills:
• Certifications in AWS, Azure, and GCP to complement extensive hands-on experience
• Demonstrated expertise with certifications in Snowflake
• Valuable "Big 4" management consulting experience or exposure to multiple industries
• Undergraduate or graduate degree preferred

Posted 1 week ago

Apply

12.0 - 17.0 years

7 - 11 Lacs

Chennai

Work from Office

Req ID: 303369
We are currently seeking an Enterprise Resource Planning Advisor to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Has more than 12 years of relevant experience with Oracle ERP Cloud data migration and a minimum of 4 end-to-end ERP Cloud implementations.
• Analyze Data and Mapping: Work with functional teams to understand data requirements and develop mappings to enable smooth migration using FBDI (File-Based Data Import) and ADFdi.
• Develop and Manage FBDI Templates: Design, customize, and manage FBDI templates to facilitate data import into Oracle SaaS applications, ensuring data structure compatibility and completeness.
• Configure ADFdi for Data Uploads: Use ADFdi (Application Development Framework Desktop Integration) for interactive data uploads, enabling users to manipulate and validate data directly within Excel.
• Data Validation and Quality Checks: Implement data validation rules and perform pre- and post-load checks to maintain data integrity and quality, reducing errors during migration.
• Execute and Troubleshoot Data Loads: Run data loads, monitor progress, troubleshoot errors, and optimize performance during the data migration process.
• Collaborate with Stakeholders: Coordinate with cross-functional teams, including project managers and business analysts, to align on timelines, resolve data issues, and provide migration status updates.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Req ID: 306668
We are currently seeking a Cloud Solution Delivery Sr Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Position Overview
We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, leading teams, and directing engineering workloads. This role requires a deep understanding of data engineering and cloud services, and the ability to implement high-quality solutions.

Key Responsibilities
Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test, and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD, and IaC for both our own tooling and as templates for downstream teams
- Using DBT projects to define re-usable pipelines

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5+ years of experience in data engineering
- 2+ years of experience in leading a team of data engineers
- Experience in AWS cloud services
- Expertise with Python and SQL
- Experience using Git/GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification

Posted 1 week ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Req ID: 321505
We are currently seeking a Test Analyst to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
• Understand business requirements and develop test cases.
• Work with the tech team and client to validate and finalise test cases.
• Use Jira or an equivalent test management tool to record test cases, expected results, and outcomes, and to assign defects.
• Run the testing phases: SIT and UAT.
• Test reporting & documentation.
• Basic knowledge of Snowflake, SQL, ADF (optional), and Fivetran (optional).

Minimum Skills Required:
• Test case development
• Jira knowledge (to record test cases, expected results, and outcomes, and assign defects)
• Test reporting & documentation
• Basic knowledge of Snowflake, SQL, ADF (optional), and Fivetran (optional)

Posted 1 week ago

Apply

12.0 - 15.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Req ID: 323777
We are currently seeking a Data Architect Sr. Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties
The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
• Ability and experience to have conversations with the CEO, business owners, and CTO/CDO
• Break down intricate business challenges, devise effective solutions, and focus on client needs
• Craft high-level innovative solution approaches for complex business problems
• Utilize best practices and creativity to address challenges
• Leverage market research, formulate perspectives, and communicate insights to clients
• Establish strong client relationships
• Interact at appropriate levels to ensure client satisfaction

Knowledge and Attributes:
• Ability to focus on detail with an understanding of how it impacts the business strategically
• Excellent client service orientation
• Ability to work in high-pressure situations
• Ability to establish and manage processes and practices through collaboration and the understanding of business
• Ability to create new and repeat business for the organization
• Ability to contribute information on relevant vertical markets
• Ability to contribute to the improvement of internal effectiveness by contributing to the improvement of current methodologies, processes, and tools

Academic Qualifications and Certifications:
• BE/BTech or equivalent in Information Technology and/or Business Management or a related field
• Scaled Agile certification desirable
• Relevant consulting and technical certifications preferred, for example TOGAF

Required Experience: 12-15 years
• Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment
• Very good understanding of Data, AI, Gen AI, and Agentic AI
• Must have Data Architecture and Solutioning experience; capable of end-to-end Data Architecture and GenAI solution design
• Must be able to work on Data & AI RFP responses as a Solution Architect
• 10+ years of experience in solution architecting of Data & Analytics, AI/ML & Gen AI as a Technical Architect
• Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions meeting the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs/RFQs; contribute to and participate in presentations to customers regarding proposed solutions
• Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools
• Experience with large-scale consulting and program execution engagements in AI and data
• Seasoned multi-technology infrastructure design experience
• Seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management

Career Level Description:
Knowledge and application:
• Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization
• Uses evaluation, judgment, and interpretation to select the right course of action
Problem solving:
• Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors
• Resolves and assesses a wide range of issues in creative ways and suggests variations in approach
Interaction:
• Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion

Posted 1 week ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience levels:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium) (see the sketch after this list)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
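
As a study aid for two of the questions above (virtual warehouses and time travel), here is a hedged sketch using the Snowflake Python connector. The account, credentials, and table are invented placeholders, not tied to any employer in the listings.

```python
# Illustrating virtual warehouses and time travel (placeholders throughout).
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="me", password="...")
cur = conn.cursor()

# A virtual warehouse is an independent compute cluster; its size and
# auto-suspend are set per workload, separately from storage.
cur.execute(
    "CREATE WAREHOUSE IF NOT EXISTS ADHOC_WH "
    "WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60"
)
cur.execute("USE WAREHOUSE ADHOC_WH")

# Time travel: query the table as it existed five minutes ago.
cur.execute("SELECT COUNT(*) FROM analytics.public.orders AT(OFFSET => -60*5)")
print(cur.fetchone()[0])
conn.close()
```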

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
