1123 Snowflake Jobs - Page 33

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

6.0 - 8.0 years

18 - 25 Lacs

Hyderabad, Pune

Work from Office

Snowflake + Python + SQL (C2C / full time). Requires at least 6 years of experience in Snowflake, along with Python programming knowledge and SQL skills.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

10 - 18 Lacs

Chennai

Hybrid

Skills and attributes for success:
- Delivery of testing needs for BI & DWH projects.
- Ability to communicate effectively with team members across geographies.
- Perform unstructured data / big data testing on both on-premise and cloud platforms.
- Thorough understanding of requirements and ability to provide feedback on them.
- Develop test strategy for BI & DWH projects covering ETL testing, report testing (front-end and back-end), integration testing, and UAT as needed.
- Provide inputs for test planning aligned with the test strategy.
- Perform test case design and identify opportunities for test automation.
- Develop test cases, both manual and automation scripts, as required.
- Ensure test readiness (test environment, test data, tool licenses, etc.).
- Perform test execution and report progress.
- Report defects and liaise with development and other relevant teams for defect resolution.
- Prepare test reports and provide inputs to the test lead for test sign-off/closure.
- Provide support in project meetings/calls with the client for status reporting.
- Provide inputs on test metrics to the test lead.
- Support analysis of metric trends and implement improvement actions as necessary.
- Handle changes and conduct regression testing; generate test summary reports.
- Coordinate test team members and the development team.
- Interact with client-side stakeholders to resolve issues and report status.
- Actively take part in providing analytics and advanced analytics testing training within the company.

To qualify for the role, you must have:
- BE/BTech/MCA/M.Sc.
- Overall 2 to 9 years of experience in testing data warehousing / business intelligence solutions, with a minimum of 2 years of experience in testing BI & DWH technologies and analytics applications.
- Experience in big data testing with the Hadoop/Spark framework and exposure to predictive analytics testing.
- Very good understanding of business intelligence concepts, architecture, and building blocks across ETL processing, data warehouses, dashboards, and analytics.
- Experience in cloud (AWS/Azure) infrastructure testing is desirable.
- Knowledge of Python data processing is desirable.
- Testing experience in more than one of these areas: data quality, ETL, OLAP, reports.
- Good working experience with SQL Server or Oracle databases and proficiency with SQL scripting.
- Experience in back-end testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies.
- Experience in ETL testing using commercial ETL tools is desirable.
- Knowledge/experience in SSRS (SQL Server Reporting Services), Spotfire, and SSIS is desirable.
- Experience/knowledge in data transformation projects, database design concepts, and white-box testing is desirable.

Ideally, you'll also have:
- Ability to contribute as an individual contributor and, when required, lead a small team.
- Ability to create test strategies and test plans for testing BI & DWH applications/solutions that are moderately complex or high-risk systems.
- Ability to design test cases and test data and perform test execution and reporting.
- Ability to perform test management for small projects as and when required.
- Participation in defect triaging and tracking defects to resolution/closure.
- Experience/exposure to test automation; scripting experience in Perl and shell is desirable.
- Experience with test management and defect management tools, preferably HP ALM.
- Good communication skills (both written and verbal).
- Good understanding of the SDLC, and the test process in particular.
- Good analytical, problem-solving, and troubleshooting skills.
- Good understanding of the project life cycle and test life cycle.
- Exposure to CMMi and process improvement frameworks is a plus.
- Excellent communication skills with the ability to articulate concisely and clearly.
- Readiness to take on both individual contributor and team leader roles.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Hyderabad, Pune

Hybrid

Job Title: Data Engineer
Work Location: India, Pune / Hyderabad (Hybrid)

Responsibilities include:
- Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Develop data pipelines to extract and transform data in near real time using cloud-native technologies.
- Implement data validation and quality checks to ensure accuracy and consistency.
- Monitor system performance, troubleshoot issues, and implement optimizations to enhance reliability and efficiency.
- Collaborate with business users, analysts, and other stakeholders to understand data requirements and deliver tailored solutions.
- Document technical designs, workflows, and best practices to facilitate knowledge sharing and maintain system documentation.
- Provide technical guidance and support to team members and stakeholders as needed.

Desirable Competencies:
- 8+ years of work experience
- Proficiency in writing complex SQL queries on MPP systems (Snowflake/Redshift)
- Experience with Databricks and Delta tables
- Data engineering experience with Spark/Scala/Python
- Experience with the Microsoft Azure stack (Azure Storage Accounts, Data Factory, and Databricks)
- Experience with Azure DevOps and CI/CD pipelines
- Working knowledge of Python
- Comfortable participating in 2-week sprint development cycles

About Us: Founded in 1956, Williams-Sonoma Inc. is the premier specialty retailer of high-quality products for the kitchen and home in the United States. Today, Williams-Sonoma, Inc. is one of the United States' largest e-commerce retailers, with some of the best-known and most beloved brands in home furnishings. Our family of brands includes Williams-Sonoma, Pottery Barn, Pottery Barn Kids, Pottery Barn Teens, West Elm, Williams-Sonoma Home, Rejuvenation, GreenRow, and Mark and Graham. We currently operate retail stores globally, and our products are also available to customers through our catalogues and online worldwide. Williams-Sonoma has established a technology center in Pune, India to enhance its global operations. The India Technology Center serves as a critical hub for innovation and focuses on developing cutting-edge solutions in areas such as e-commerce, supply chain optimization, and customer experience management. By integrating advanced technologies like artificial intelligence, data analytics, and machine learning, the India Technology Center plays a crucial role in accelerating Williams-Sonoma's growth and maintaining its competitive edge in the global market.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 10 yrs. Work location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi. Required skills: Snowflake. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 3 weeks ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 10 yrs. Work location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi. Required skills: Snowflake. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE
Role Description: The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. It involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with data architects, business SMEs, and data scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimates on technical implementation

Basic Qualifications and Experience: Master's degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Functional Skills (Must-Have):
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
- Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps

Good-to-Have Skills:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Experience with the Anaplan platform, including building, managing, and optimizing models and workflows, including scalable data integrations
- Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications: AWS Certified Data Engineer (preferred); Databricks Certified (preferred)

Soft Skills: Excellent critical-thinking and problem-solving skills; strong communication and collaboration skills; demonstrated awareness of how to function in a team setting; demonstrated presentation skills.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Location: India, Bangalore | Type: Full time | Posted 4 days ago | Job requisition ID: JR0270485

About The Role: We seek an experienced Business Systems Analyst to join our Supply Chain IT team. The primary focus of this position is to enable, transform, and deliver Supply Planning data solutions for Intel's key business groups. In this position, you will collaborate with stakeholders from various business domains, including the business operations team, Master Data, Supply Planning, Finance, and other IT teams. The ideal candidate should possess a combination of business process knowledge, data and analytics skills, and the acumen to enable process transformation by leveraging technology.

Responsibilities include but are not limited to:
- Collaborate with stakeholders to establish, prioritize, implement, maintain, improve, and discontinue process capabilities.
- Develop detailed functional specifications and work closely with business stakeholders and the Blue Yonder team.
- Design new data pipelines and maintain existing ones between SAP, the data warehouse, the Planning Data Hub, and the Blue Yonder landscape.
- Identify business requirements and system specifications that meet user data needs, map them to system capabilities, and recommend technical solutions.
- Partner with SAP Master Data, Order to Cash (O2C), Procure to Pay (P2P), and Supply Planning teams to understand data needs and capture them as requirements for implementing pipelines in Snowflake.
- Participate in all phases of product testing, from unit testing to user acceptance testing on the IT front.
- Ensure alignment of transformation efforts with relevant enterprise-level initiatives.
- Maintain and build stakeholder relationships while effectively communicating across teams.
- Estimate effort and schedules for major projects, driving the team to meet timelines while ensuring quality.

Qualifications. Minimum Qualifications: Bachelor's and/or master's degree and 9+ years of experience in:
- Supply Planning - SOP and SOE processes.
- Inventory Management or Production Planning business processes.
- Designing and implementing data solutions for enterprise planning software such as Blue Yonder ESP, IBP, or equivalent.
- A background in semiconductor manufacturing and high-level SQL knowledge.

Preferred Qualifications: Designing data solutions on Snowflake, Azure Databricks, or similar environments. Knowledge of Order to Cash and Procure to Pay end-to-end processes.

Job Type: Experienced Hire. Shift: Shift 1 (India). Primary Location: India, Bangalore. Additional Locations:

Business group: Intel's Information Technology Group (IT) designs, deploys, and supports the information technology architecture and hardware/software applications for Intel. This includes the LAN, WAN, telephony, data centers, client PCs, backup and restore, and enterprise applications. IT is also responsible for e-Commerce development, data hosting, and delivery of Web content and services.

Posting Statement: All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.

Position of Trust: N/A. Work Model for this Role: This role is eligible for our hybrid work model, which allows employees to split their time between working on-site at their assigned Intel site and off-site.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

9 - 18 Lacs

Bengaluru

Hybrid

About Us: Shravas Technologies, founded in 2016, is an IT services company based out of Bangalore, India. The company specializes in Software QA and related services such as Data Mining, Analytics, and Visualization.

Job Title: Snowflake Developer (4 to 9 Years Experience)
Location: Bangalore
Type: Full-time, Hybrid

Job Summary: We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate should have hands-on expertise in building and optimizing scalable data pipelines and working with Snowflake data warehouse solutions. This role involves working closely with business analysts, data scientists, and other developers to deliver reliable, secure, and high-performance data solutions.

Key Responsibilities
- Design, develop, and implement Snowflake-based data solutions.
- Create and maintain scalable ETL/ELT pipelines using tools such as SQL, Python, dbt, Airflow, or similar.
- Develop data models and schema designs optimized for performance and usability.
- Write and optimize complex SQL queries for data transformation and extraction.
- Integrate Snowflake with other systems like MySQL, SQL Server, AWS (S3, Lambda), Azure, or GCP using APIs or connectors.
- Manage Snowflake security (roles, users, access control).
- Monitor data pipeline performance and troubleshoot issues.
- Participate in code reviews, unit testing, and documentation.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4 to 9 years of experience in data engineering.
- Proficiency in SQL and performance tuning.
- Experience with data pipeline and ETL tools (e.g., Informatica, Talend, dbt, Apache Airflow).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Understanding of data warehousing concepts and best practices.
- Knowledge of version control systems like Git.

Preferred Skills
- Experience with Python or Scala for data processing.
- Familiarity with tools like Stitch, Fivetran, or Matillion.
- Exposure to CI/CD pipelines for data projects.
- Knowledge of data governance and security compliance.
- Understanding of financial and economic data trends is a plus.

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and collaboratively in a team environment.
- Detail-oriented with a strong focus on quality and accuracy.

Reporting To: Lead Data Engineer / Chief Data Officer
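
The responsibilities above include managing Snowflake security (roles, users, access control). As a rough illustration of that pattern, here is a minimal role-based access sketch in Snowflake SQL; the database, schema, role, user, and warehouse names are invented placeholders, not details from the posting.

```sql
-- Read-only functional role (all object names below are illustrative placeholders)
CREATE ROLE IF NOT EXISTS reporting_ro;

-- Let the role see the database and schema
GRANT USAGE ON DATABASE analytics_db TO ROLE reporting_ro;
GRANT USAGE ON SCHEMA analytics_db.marts TO ROLE reporting_ro;

-- Allow reads on existing and future tables in the schema
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.marts TO ROLE reporting_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics_db.marts TO ROLE reporting_ro;

-- Create a user and attach the role and a default warehouse
CREATE USER IF NOT EXISTS report_user
  PASSWORD = 'choose-a-strong-password'
  DEFAULT_ROLE = reporting_ro
  DEFAULT_WAREHOUSE = reporting_wh;

GRANT ROLE reporting_ro TO USER report_user;
```

Granting on future tables keeps new objects in the schema readable without re-running grants, which is the usual way such reporting roles are kept low-maintenance.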

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

Pune

Work from Office

We are seeking a highly skilled and experienced MSTR (MicroStrategy) Developer to join our Business Intelligence team. In this role, you will be responsible for the design, development, implementation, and maintenance of robust and scalable BI solutions using the MicroStrategy platform. Your primary focus will be on leveraging your deep understanding of MicroStrategy architecture and strong SQL skills to deliver insightful and actionable data to our stakeholders. This is an excellent opportunity to contribute to critical business decisions by providing high-quality BI solutions.

Responsibilities
- Design, develop, and deploy MicroStrategy objects including reports, dashboards, cubes (Intelligent Cubes, OLAP Cubes), documents, and visualizations.
- Utilize various MicroStrategy features and functionalities such as Freeform SQL, Query Builder, MDX connectivity, and data blending.
- Optimize MicroStrategy schema objects (attributes, facts, hierarchies) for performance and usability.
- Implement security models within MicroStrategy, including user and group management, object-level security, and data-level security.
- Perform performance tuning and optimization of MicroStrategy reports and dashboards.
- Participate in the administration and maintenance of the MicroStrategy environment, including metadata management, project configuration, and user support.
- Troubleshoot and resolve issues related to MicroStrategy reports, dashboards, and the overall platform.
- Write complex and efficient SQL queries to extract, transform, and load data from various data sources.
- Understand database schema design and data modeling principles.
- Optimize SQL queries for performance within the MicroStrategy environment.
- Work with different database platforms (e.g., Oracle, SQL Server, Teradata, Snowflake) and understand their specific SQL dialects.
- Develop and maintain database views and stored procedures to support MicroStrategy development.
- Collaborate with business analysts and end-users to understand their reporting and analytical requirements.
- Translate business requirements into technical specifications for MicroStrategy development.
- Participate in the design and prototyping of BI solutions.
- Develop and execute unit tests and integration tests for MicroStrategy objects.
- Participate in user acceptance testing (UAT) and provide support to end-users during the testing phase.
- Ensure the accuracy and reliability of data presented in MicroStrategy reports and dashboards.
- Create and maintain technical documentation for MicroStrategy solutions, including design documents, user guides, and deployment instructions.
- Provide training and support to end-users on how to effectively use MicroStrategy reports and dashboards.
- Adhere to MicroStrategy best practices and development standards.
- Stay updated with the latest MicroStrategy features and functionalities.
- Proactively identify opportunities to improve existing MicroStrategy solutions and processes.

Required Skills and Expertise
- Strong proficiency in MicroStrategy development (5+ years of hands-on experience is essential). This includes a deep understanding of the MicroStrategy architecture, object creation, report development, dashboard design, and administration.
- Excellent SQL skills (5+ years of experience writing complex queries, optimizing performance, and working with various database systems).
- Experience in data modeling and understanding of dimensional modeling concepts (e.g., star schema, snowflake schema).
- Solid understanding of BI concepts, data warehousing principles, and ETL processes.
- Experience in performance tuning and optimization of MicroStrategy reports and SQL queries.
- Ability to gather and analyze business requirements and translate them into technical specifications.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills, with the ability to work effectively with both technical and business stakeholders.
- Experience with version control systems (e.g., Git).
- Ability to work independently and as part of a team.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Big Data Engineer (Remote, Contract 6 Months+)
Location: Remote | Contract Duration: 6+ Months | Domain: Big Data Stack

We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem, including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and big data solutions.
- Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop.
- Process large data volumes from diverse sources using Hadoop ecosystem tools.
- Build end-to-end data workflows for batch and streaming pipelines.
- Optimize data storage and retrieval processes in HBase, Hive, and other NoSQL databases.
- Collaborate with data scientists and business stakeholders to design robust data infrastructure.
- Ensure data integrity, consistency, and security in line with organizational policies.
- Troubleshoot and tune performance for distributed systems and applications.

Must-Have Skills (Data Engineering / Big Data)
- Tools: Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase
- Data ingestion & ETL, data pipeline design, distributed computing
- Strong understanding of Big Data architectures and performance tuning
- Hands-on experience with large-scale data storage and query optimization

Nice to Have
- Apache Airflow / Oozie experience
- Knowledge of cloud platforms (AWS, Azure, or GCP)
- Proficiency in Python or Scala
- CI/CD and DevOps exposure

Contract Details
Role: Senior Big Data Engineer
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Duration: 6+ Months (Contract)
Apply via Email: navaneeta@suzva.com
Contact: 9032956160

How to Apply: Send your updated resume with the subject "Application for Remote Big Data Engineer Contract Role". Include in your email: updated resume, current CTC, expected CTC, current location, and notice period / availability.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Navi Mumbai

Work from Office

Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate a seamless flow of information.

Responsibilities:
- Experience in data architecture and engineering
- Proven expertise with the Snowflake data platform
- Strong understanding of ETL/ELT processes and data integration
- Experience with data modeling and data warehousing concepts
- Familiarity with performance tuning and optimization techniques
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise:
- Cloud & Data Architecture: AWS, Snowflake
- ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
- Big Data & Analytics: Athena, Presto, Hadoop
- Database & Storage: SQL, SnowSQL
- Security & Compliance: IAM, KMS, Data Masking

Preferred technical and professional experience:
- Cloud Data Warehousing: Snowflake (data modeling, query optimization)
- Data Transformation: dbt (Data Build Tool) for ELT pipeline management
- Metadata & Data Governance: Alation (data catalog, lineage, governance)
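
The expertise list above pairs Snowflake with security and compliance items such as data masking. A minimal, hedged sketch of Snowflake dynamic data masking (an Enterprise Edition feature) is shown below; the table, column, and role names are assumptions made for illustration only.

```sql
-- Masking policy: only the PII_READER role sees raw email addresses (names are illustrative)
CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column; other roles now see the masked value
ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;
```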

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Chennai

Work from Office

Roles and Responsibilities Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake SQL. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time. Develop complex queries to optimize database performance and troubleshoot issues. Implement star schema designs for efficient data modeling and querying. Participate in code reviews to ensure adherence to coding standards.
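
The responsibilities above call for star schema designs in Snowflake. As a brief, illustrative sketch (the table and column names are invented, and Snowflake records but does not enforce primary/foreign key constraints), a minimal star schema might look like this:

```sql
-- Dimension tables (illustrative names; PK/FK constraints are informational in Snowflake)
CREATE TABLE dim_customer (
    customer_key  INTEGER AUTOINCREMENT PRIMARY KEY,
    customer_id   STRING,
    customer_name STRING,
    region        STRING
);

CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- e.g. 20250131
    full_date  DATE,
    year       SMALLINT,
    month      SMALLINT
);

-- Fact table at the grain of one row per sale, referencing the dimensions
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key     INTEGER REFERENCES dim_date (date_key),
    quantity     NUMBER(10,0),
    amount       NUMBER(12,2)
);
```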

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. Grade Specific: the role supports the team in building and maintaining data infrastructure and systems within an organization. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management.

Posted 3 weeks ago

Apply

10.0 - 16.0 years

25 - 27 Lacs

Chennai

Work from Office

We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai with a shift from 2.00 pm to 11.00 pm IST.

Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schema
- Experience in designing and building data pipelines on the Azure Cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premise transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, Microservices, and Integration components

Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- SSIS or Informatica (one of these tools is mandatory)
- BI: Power BI or Tableau (one of these tools is mandatory)

Posted 3 weeks ago

Apply

4.0 - 6.0 years

25 - 30 Lacs

Bengaluru

Work from Office

- 3+ years of work experience in Python programming for AI/ML, deep learning, and Generative AI model development
- Proficiency in TensorFlow/PyTorch, Hugging Face Transformers, and LangChain libraries
- Hands-on experience with NLP, LLM prompt design and fine-tuning, embeddings, vector databases, and agentic frameworks
- Strong understanding of ML algorithms, probability, and optimization techniques
- 6+ years of experience in deploying models with Docker, Kubernetes, and cloud services (AWS Bedrock, SageMaker, GCP Vertex AI) through APIs, and using MLOps and CI/CD pipelines
- Familiarity with retrieval-augmented generation (RAG), cache-augmented generation (CAG), retrieval-integrated generation (RIG), and low-rank adaptation (LoRA) fine-tuning
- Ability to write scalable, production-ready ML code and optimized model inference
- Experience with developing ML pipelines for text classification, summarization, and chat agents
- Prior experience with SQL and NoSQL databases, and Snowflake/Databricks

Posted 3 weeks ago

Apply

4.0 - 9.0 years

11 - 20 Lacs

Hyderabad, Pune, Delhi / NCR

Work from Office

Skills & Experience: Strong understanding of data governance principles and enterprise data management. Knowledge of data compliance standards (e.g., GDPR, CCPA, HIPAA). Leadership, cross-functional collaboration, and change management skills. Familiarity with data architecture and analytics tools. Experience with cloud platforms (AWS, Azure, GCP). Expertise in data modeling, warehousing, and lakehouse architectures. Knowledge of ETL/ELT pipelines and data integration tools. Familiarity with big data technologies (e.g., Spark, Kafka, Hadoop). Detail-oriented with knowledge of data profiling, cleansing, and validation. Experience with data catalog and metadata management tools (e.g., Collibra, Alation). Ability to document data definitions, lineage, and business glossaries. Familiarity with data lakes, warehouses (e.g., Snowflake, Redshift), and real-time processing. Understanding of data quality and performance optimization. Experience managing data or IT transformation projects.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

16 - 22 Lacs

Bengaluru

Work from Office

- 5+ years of experience as a Software Engineer or Data Engineer
- Experience developing end-to-end technical solutions and sustaining solutions in production, ensuring performance, security, scalability, and robust data integration
- Experience in cloud data platforms: Snowflake or AWS
- Designed and implemented CI/CD pipelines for data engineering projects
- Demonstrated experience working within Agile methodologies
- Ability to write, debug, and optimize SQL queries
- User-facing written and verbal communication skills and experience
- Strong business acumen and experience engaging with finance stakeholders
- Automated data transformation and data curation using tools such as dbt or Informatica
- Previous programming expertise in Java, Python, or Scala
- Excellent communication and presentation skills, with the ability to explain complex data to non-technical audiences
- Collaborate with stakeholders across the business to understand their needs and challenges, translating them into clear, concise, and technically feasible data analysis requests
- Strong analytical skills and the ability to combine data from different sources
- Bachelor's degree in Computer Science, MIS, or a related field
- Experience in SAP BW & SAP HANA is a plus

Posted 3 weeks ago

Apply

5.0 - 9.0 years

12 - 20 Lacs

Chennai

Remote

USXI is looking for Big Data Developers who will work on collecting, storing, processing, and analyzing huge sets of data. The Data Developers must also have exceptional analytical skills, showing fluency in the use of tools such as MySQL and strong Python, Shell, Java, PHP, and T-SQL programming skills. The candidate must also be technologically adept, demonstrating strong computer skills. Additionally, you must be capable of developing databases using SSIS packages, T-SQL, MSSQL, and MySQL scripts. The candidate will also have the ability to design, build, and maintain the business's ETL pipeline and data warehouse, and will demonstrate expertise in data modeling and query performance tuning on SQL Server, MySQL, Redshift, Postgres, or similar platforms.

Key responsibilities will include:
- Develop and maintain data pipelines
- Design and implement ETL processes
- Hands-on data modeling: design conceptual, logical, and physical data models with Type 1 and Type 2 dimensions
- Platform expertise: leverage Microsoft Fabric, Snowflake, and Databricks to optimize data storage, transformation, and retrieval processes
- Knowledge to move the ETL code base from on-premise to cloud architecture
- Understand data lineage and governance for different data sources
- Maintain clean and consistent access to all our data sources
- Hands-on experience deploying code using CI/CD pipelines
- Assemble large and complex data sets strategically to meet business requirements
- Enable business users to bring data-driven insights into their business decisions through reports and dashboards

Required Qualifications:
- Hands-on experience in big data technologies including Scala or Spark (Azure Databricks preferable), Hadoop, Hive, HDFS, Python, Java & SQL
- Knowledge of Microsoft's Azure Cloud
- Experience and commitment to development and testing best practices
- DevOps experience with continuous integration/delivery best practices, technologies, and tools
- Experience deploying Azure SQL Database and Azure Data Factory, and well-acquainted with other Azure services including Azure Data Lake and Azure ML
- Experience implementing REST API calls and authentication
- Experience working with agile project management methodologies
- Computer Science Degree/Diploma
- Microsoft Certified: DP-203 Azure Data Engineer Associate
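
The listing asks for data models with Type 1 and Type 2 dimensions. As one common way to maintain a Type 2 slowly changing dimension in Snowflake, here is a hedged two-step sketch; the staging and dimension table names and columns are assumptions for illustration only.

```sql
-- Step 1: close out current rows whose tracked attributes changed (names are illustrative)
UPDATE dim_customer d
SET    is_current = FALSE,
       valid_to   = CURRENT_TIMESTAMP()
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = TRUE
  AND  (d.customer_name <> s.customer_name OR d.region <> s.region);

-- Step 2: insert a new current version for customers that are new or were just closed out
INSERT INTO dim_customer (customer_id, customer_name, region, valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, s.region, CURRENT_TIMESTAMP(), NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.is_current  = TRUE
WHERE  d.customer_id IS NULL;
```

A Type 1 dimension, by contrast, would simply overwrite the changed attributes in place with a single UPDATE or MERGE, keeping no history.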

Posted 3 weeks ago

Apply

9.0 - 13.0 years

25 - 35 Lacs

Hyderabad

Hybrid

Senior Data Engineer. You are familiar with AWS and Azure Cloud. You have extensive knowledge of Snowflake; SnowPro Core certification is a must-have. You have used dbt in at least one project to deploy models in production. You have configured and deployed Airflow and integrated various operators in Airflow (especially dbt and Snowflake). You can design and build release pipelines and understand the Azure DevOps ecosystem. You have an excellent understanding of Python (especially PySpark) and are able to write metadata-driven programs. You are familiar with Data Vault (Raw, Business) as well as concepts like Point in Time and Semantic Layer. You are resilient in ambiguous situations and can clearly articulate the problem in a business-friendly way. You believe in documenting processes, managing the artifacts, and evolving them over time.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

6 - 11 Lacs

Pune

Hybrid

What’s the role all about? As a BI Developer, you’ll be a key contributor to developing reports in a multi-region, multi-tenant SaaS product. You’ll collaborate with the core R&D team to build high-performance reports that serve the use cases of several applications in the suite.

How will you make an impact?
- Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
- Ensure that architectural concepts are consistently implemented across the product.
- Act as a product expert within R&D, understanding the product’s requirements and its market positioning.
- Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery.
- Design and build reports for given requirements.
- Create design documents and test cases for the reports.
- Develop SQL to address ad hoc report requirements and conduct analyses.
- Create visualizations and reports as per the requirements.
- Execute unit testing, functional and performance testing, and document the results.
- Conduct peer reviews and ensure quality is met at all stages.

Have you got what it takes?
- Bachelor/Master of Engineering degree in Computer Science, Electronic Engineering, or equivalent from a reputed institute.
- 2-4 years of BI report development experience.
- Expertise in SQL and cloud-based databases; able to work with any database to write SQL for business needs.
- Experience in BI tools like Tableau, Power BI, MicroStrategy, etc.
- Experience working in an enterprise data warehouse / data lake system.
- Strong knowledge of analytical databases and schemas.
- Development experience building solutions that leverage SQL and NoSQL databases.
- Experience/knowledge of Snowflake is an advantage.
- In-depth understanding of database management systems, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks.
- Experience in functional testing, performance testing, etc.
- Experience with public cloud infrastructure and technologies such as AWS/Azure/GCP.
- Experience working with Continuous Integration and Delivery practices using industry-standard tools such as Jenkins.
- Experience working in an Agile methodology development environment and using work item management tools like JIRA.

What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Are you a seasoned data engineer with a passion for hands-on technical work? Do you thrive in an environment that values innovation, collaboration, and cutting-edge technologies? We are looking for a seasoned Integration Engineer to join our team, someone who is passionate about building and maintaining scalable data pipelines and integrations. The ideal candidate will have a strong foundation in Python programming, experience with Snowflake for data warehousing, proficiency in AWS and Kubernetes (EKS) for cloud services management, and expertise in CI/CD practices, Apache Airflow, DBT, and API development. This role is critical to enhancing our data integration capabilities and supporting our data-driven initiatives.

Role and Responsibilities: As the Technical Data Integration Engineer, you will play a pivotal role in shaping the future of our data integration engineering initiatives. You will work alongside talented data integration engineers while remaining actively involved in the technical aspects of the projects. Your responsibilities will include:
- Hands-On Contribution: Continue to be hands-on with data integration engineering tasks, including data pipeline development, EL processes, and data integration. Be the go-to expert for complex technical challenges.
- Integrations Architecture: Design and implement scalable and efficient data integration architectures that meet business requirements. Ensure data integrity, quality, scalability, and security throughout the pipeline.
- Tool Proficiency: Leverage your expertise in Snowflake, SQL, Apache Airflow, AWS, API, and Python to architect, develop, and optimize data solutions. Stay current with emerging technologies and industry best practices.
- Data Quality: Monitor data quality and integrity, implementing data governance policies as needed.
- Cross-Functional Collaboration: Collaborate with data science, data warehousing, analytics, and other cross-functional teams to understand data requirements and deliver actionable insights.
- Performance Optimization: Identify and address performance bottlenecks within the data infrastructure. Optimize data pipelines for speed, reliability, and efficiency.

Qualifications
- Minimum Bachelor's degree in Computer Science, Engineering, or a related field. An advanced degree is a plus.
- 5 years of hands-on experience in data engineering.
- Familiarity with cloud platforms, such as AWS or Azure.
- Expertise in Apache Airflow, Snowflake, SQL, Python, shell scripting, API gateways, and web services setup.
- Strong experience in full-stack development, AWS, Linux administration, data lake construction, data quality assurance, and integration metrics.
- Excellent analytical, problem-solving, and decision-making abilities.
- Strong communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
- A collaborative mindset, with a focus on team success.

If you are a results-oriented Data Integration Engineer with a strong background in Apache Airflow, Snowflake, SQL, Python, and APIs, we encourage you to apply. Join us in building data solutions that drive business success and innovation.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Job Summary: We are looking for an experienced and highly skilled Senior Python Developer with strong hands-on expertise in Snowflake to join our growing data engineering team. The ideal candidate will have a solid background in building scalable data pipelines, data modeling, and integrating Python-based solutions with Snowflake.

Roles and Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines using Python and Snowflake.
- Collaborate with data architects and analysts to understand data requirements and translate them into technical solutions.
- Write complex SQL queries and stored procedures in Snowflake.
- Optimize Snowflake performance using best practices for data modeling, partitioning, and caching.
- Develop and deploy Python-based ETL/ELT processes.
- Integrate Snowflake with other data sources, APIs, or BI tools.
- Implement and maintain CI/CD pipelines for data solutions.
- Ensure data quality, governance, and security standards are maintained.

Required Skills and Qualifications:
- Strong programming skills in Python with a focus on data processing and automation.
- Hands-on experience with Snowflake, including SnowSQL, Snowpipe, data sharing, and performance tuning.
- Proficiency in SQL and working with large, complex datasets.
- Experience in designing and implementing ETL/ELT pipelines.
- Strong understanding of data warehousing concepts and data modeling (star/snowflake schema).
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Experience with version control (e.g., Git) and CI/CD tools.
- Excellent problem-solving skills and attention to detail.

Preferred Qualifications:
- Experience with Apache Airflow, DBT, or other workflow orchestration tools.
- Knowledge of data security and compliance standards.
- Experience integrating Snowflake with BI tools (Tableau, Power BI, etc.).
- Certification in Snowflake or relevant cloud platforms is a plus.
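
The responsibilities above include writing stored procedures in Snowflake and scheduling ELT steps around them. As a hedged illustration of the Snowflake side of such a pipeline (the procedure, table, and warehouse names are placeholders), a procedure plus a scheduled task might look like this:

```sql
-- A small Snowflake Scripting procedure that rebuilds a reporting table (illustrative names)
CREATE OR REPLACE PROCEDURE refresh_daily_sales()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    CREATE OR REPLACE TABLE rpt.daily_sales AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM   raw.orders
        GROUP BY order_date;
    RETURN 'daily_sales refreshed';
END;
$$;

-- Schedule the procedure with a task (assumed warehouse name, runs daily at 02:00 UTC)
CREATE OR REPLACE TASK refresh_daily_sales_task
  WAREHOUSE = etl_wh
  SCHEDULE  = 'USING CRON 0 2 * * * UTC'
AS
  CALL refresh_daily_sales();

ALTER TASK refresh_daily_sales_task RESUME;
```

In practice the same refresh is often orchestrated from Airflow or another scheduler instead of a native task; the snippet only sketches the in-database option.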

Posted 3 weeks ago

Apply

4.0 - 7.0 years

5 - 14 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

We are looking for an experienced Data Engineer to design, develop, and maintain our data pipelines, primarily focused on ingesting data into our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake and practical experience with AWS services, particularly using S3 as a landing zone and an entry point to the Snowflake environment. You will be responsible for building efficient, reliable, and scalable data pipelines that are critical for our data-driven decision-making processes.

Role & responsibilities:
1. Design, develop, implement, and maintain scalable and robust data pipelines to ingest data from various sources into the Snowflake data platform.
2. Utilize AWS S3 as a primary landing zone for data, ensuring efficient data transfer and integration with Snowflake.
3. Develop and manage ETL/ELT processes, focusing on data transformation, cleansing, and loading within the Snowflake and AWS ecosystem.
4. Write complex SQL queries and stored procedures in Snowflake for data manipulation, transformation, and performance optimization.
5. Monitor, troubleshoot, and optimize data pipelines for performance, reliability, and scalability.
6. Collaborate with data architects, data analysts, data scientists, and business stakeholders to understand data requirements and deliver effective solutions.
7. Ensure data quality, integrity, and governance across all data pipelines and within the Snowflake platform.
8. Implement data security best practices in AWS and Snowflake.
9. Develop and maintain comprehensive documentation for data pipelines, processes, and architectures.
10. Stay up-to-date with emerging technologies and best practices in data engineering, particularly related to Snowflake and AWS.
11. Participate in Agile/Scrum development processes, including sprint planning, daily stand-ups, and retrospectives.

Preferred candidate profile:
1. Strong, hands-on proficiency with Snowflake: in-depth knowledge of Snowflake architecture and features (e.g., Snowpipe, Tasks, Streams, Time Travel, Zero-Copy Cloning); experience in designing and implementing Snowflake data models (schemas, tables, views); expertise in writing and optimizing complex SQL queries in Snowflake; experience with data loading and unloading techniques in Snowflake.
2. Solid experience with AWS Cloud services: proficiency in using AWS S3 for data storage, staging, and as a landing zone for Snowflake; experience with other relevant AWS services (e.g., IAM for security, Lambda for serverless processing, Glue for ETL, if applicable).
3. Strong experience in designing and building ETL/ELT data pipelines; proficiency in at least one programming language commonly used in data engineering (e.g., Python, Scala, Java). Python is highly preferred.
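
Since the role is specifically about landing files in S3 and ingesting them into Snowflake (Snowpipe, Tasks, and Streams are named above), the following is a minimal ingestion sketch; the bucket, storage integration, and table names are placeholders, and auto-ingest additionally assumes S3 event notifications have been configured for the pipe.

```sql
-- External stage over the S3 landing zone (bucket and integration names are placeholders)
CREATE OR REPLACE STAGE landing_stage
  URL = 's3://example-landing-bucket/orders/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- One-off backfill of files already sitting in the landing zone
COPY INTO raw.orders
FROM @landing_stage
ON_ERROR = 'CONTINUE';

-- Continuous ingestion: Snowpipe loads new files as S3 event notifications arrive
CREATE OR REPLACE PIPE raw.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.orders
  FROM @landing_stage;
```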

Posted 3 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Chennai

Remote

Experience with modern data warehouses (Snowflake, BigQuery, Redshift) and graph databases. Experience designing and building efficient data pipelines for the ingestion and transformation of data into a data warehouse. Proficiency in Python, dbt, Git, SQL, AWS, and Snowflake.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

25 - 35 Lacs

Chennai

Remote

Snowflake database administration: designing data architecture, database networking, maintaining data security, configuring user access. Python, Git, SQL, AWS. Snowflake data pipelines, data structures, data platforms, data integration, data governance.

Posted 3 weeks ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium) (a short SQL sketch follows this list)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
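
Two of the questions above (virtual warehouses and Time Travel) lend themselves to a short, illustrative snippet; the warehouse and table names are placeholders, and the available Time Travel retention window depends on the account edition and settings.

```sql
-- Virtual warehouse: an independently sized, auto-suspending unit of compute
CREATE WAREHOUSE IF NOT EXISTS demo_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND   = 60      -- suspend after 60 seconds of inactivity
  AUTO_RESUME    = TRUE;

-- Time Travel: query a table as it looked 30 minutes ago, or recover a dropped table
SELECT * FROM raw.orders AT (OFFSET => -60 * 30);
UNDROP TABLE raw.orders;
```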

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
