
66 Snowflake SQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are an experienced Senior Data Analyst with a minimum of 7-8 years of experience in data analysis roles, with significant exposure to Snowflake.

Responsibilities:
- Query and analyze data stored in Snowflake databases to derive meaningful insights that support business decision-making (see the example query after this listing).
- Develop and maintain data models and schema designs within Snowflake to enable efficient data analysis.
- Create and maintain data visualizations and dashboards using tools like Tableau or Power BI, with Snowflake as the underlying data source.
- Collaborate with business stakeholders to understand data requirements and translate them into analytical solutions.
- Perform data validation, quality assurance, and data cleansing activities within Snowflake databases.
- Support the implementation and enhancement of ETL processes and data pipelines to ensure data accuracy and completeness.

Qualifications:
- A Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Information Systems, or a related field is required.
- Certifications in data analytics, data visualization, or cloud platforms are desirable but not mandatory.

Primary skills:
- Strong proficiency in querying and analyzing data using Snowflake SQL and DBT.
- Solid understanding of data modeling and schema design within Snowflake environments.
- Experience with data visualization and reporting tools such as Power BI, Tableau, or Looker for analyzing and presenting insights derived from Snowflake.
- Familiarity with ETL processes and data pipeline development, along with a proven track record of using Snowflake for complex data analysis and reporting tasks.
- Strong problem-solving and analytical skills, including the ability to derive actionable insights from data.
- Experience with programming languages like Python or R for data manipulation and analysis is a plus.

Secondary skills:
- Knowledge of cloud platforms and services such as AWS, Azure, or GCP.
- Excellent communication and presentation skills, strong attention to detail, and a proactive approach to problem-solving.
- Ability to work collaboratively in a team environment.

This role is for a Senior Data Analyst specializing in Snowflake, based in either Trivandrum or Bangalore. The working hours are 8 hours per day, from 12:00 PM to 9:00 PM, with a few hours of overlap with the EST time zone for mandatory meetings. The close date for applications is 18-04-2025.
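As a hedged illustration of the analysis work this listing describes: a minimal Snowflake SQL sketch of a dashboard-feeding query, a monthly revenue trend with a moving average. The table and columns (analytics.sales.orders, region, order_date, order_amount) are invented for illustration only.

-- Monthly revenue per region with a 3-month moving average (hypothetical schema).
SELECT
    region,
    DATE_TRUNC('month', order_date) AS order_month,
    SUM(order_amount)               AS monthly_revenue,
    AVG(SUM(order_amount)) OVER (       -- window function over the grouped result
        PARTITION BY region
        ORDER BY DATE_TRUNC('month', order_date)
        ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
    )                               AS three_month_moving_avg
FROM analytics.sales.orders
GROUP BY region, DATE_TRUNC('month', order_date)
ORDER BY region, order_month;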

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Maharashtra

On-site

As a skilled Snowflake Developer with over 7 years of experience, you will be responsible for designing, developing, and optimizing Snowflake data solutions. Your expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration will be crucial in building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Key responsibilities:
- Design and develop Snowflake databases, schemas, tables, and views following best practices.
- Write complex SQL queries, stored procedures, and UDFs for data transformation.
- Optimize query performance using clustering, partitioning, and materialized views.
- Implement Snowflake features such as Time Travel, Zero-Copy Cloning, and Streams & Tasks (a small sketch of these follows this listing).
- Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark.
- Integrate Snowflake with cloud storage (S3, Blob) and data ingestion tools (Snowpipe).
- Develop CDC (Change Data Capture) and real-time data processing solutions.
- Design star schema, snowflake schema, and data vault models in Snowflake.
- Implement data sharing, secure views, and dynamic data masking.
- Ensure data quality, consistency, and governance across Snowflake environments.
- Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage).
- Troubleshoot data pipeline failures, latency issues, and query bottlenecks.
- Collaborate with data analysts, BI teams, and business stakeholders to deliver data solutions.
- Document data flows, architecture, and technical specifications.
- Mentor junior developers on Snowflake best practices.

Required skills & qualifications:
- 7+ years in database development, data warehousing, or ETL.
- 4+ years of hands-on Snowflake development experience.
- Strong SQL or Python skills for data processing.
- Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark).
- Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, DBT).
- Certifications: SnowPro Core Certification (preferred).

Preferred skills:
- Familiarity with data governance and metadata management.
- Familiarity with DBT, Airflow, SSIS & IICS.
- Knowledge of CI/CD pipelines (Azure DevOps).
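A minimal hedged sketch of three of the features named above (Time Travel, Zero-Copy Cloning, Streams & Tasks); all object names (sales.orders, transform_wh, etc.) are invented, and the offsets and schedule are illustrative only.

-- Time Travel: query the table as it existed one hour ago.
SELECT COUNT(*) FROM sales.orders AT(OFFSET => -60*60);

-- Zero-Copy Cloning: snapshot the table as of that point without copying data.
CREATE TABLE sales.orders_backup CLONE sales.orders AT(OFFSET => -60*60);

-- Streams & Tasks: capture changes and process them on a schedule.
CREATE OR REPLACE STREAM sales.orders_stream ON TABLE sales.orders;

CREATE OR REPLACE TASK sales.process_orders
    WAREHOUSE = transform_wh
    SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('SALES.ORDERS_STREAM')
AS
    INSERT INTO sales.orders_processed   -- assumed to exist with a matching layout
    SELECT * FROM sales.orders_stream;   -- streams add METADATA$ columns; SELECT * kept for brevity

ALTER TASK sales.process_orders RESUME;  -- tasks are created suspended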

Posted 2 days ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Chennai

Remote

Job title: Sr. Python Data Engineer
Location: Chennai & Bangalore (remote)
Job type: Permanent employee
Experience: 8 to 12 years
Shift: 2-11 PM

Responsibilities:
- Design and develop data pipelines and ETL processes.
- Collaborate with data scientists and analysts to understand data needs.
- Maintain and optimize data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and implement data validation and cleansing routines.
- Work with large datasets from various sources.
- Automate repetitive data tasks and processes.
- Monitor data systems and troubleshoot issues as they arise.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role (minimum 6+ years as a Data Engineer).
- Strong proficiency in Python and PySpark.
- Excellent problem-solving abilities.
- Strong communication skills to collaborate with team members and stakeholders.
- Individual contributor.

Technical skills required:
- Expert: Python, PySpark, and SQL/Snowflake.
- Advanced: data warehousing and data pipeline design.
- Advanced: data quality, data validation, and data cleansing.
- Intermediate/basic: Microsoft Fabric, ADF, Databricks, Master Data Management/Data Governance, Data Mesh, Data Lake/Lakehouse architecture.

Posted 4 days ago

Apply

5.0 - 8.0 years

10 - 16 Lacs

Pune, Chennai, Bengaluru

Hybrid

Snowflake Developer

Mandatory skills: Snowflake, SQL

Responsibilities:
- Design, develop, and maintain Snowflake databases and data warehouse solutions.
- Build and optimize SQL queries for data processing and reporting.
- Work with Snowflake features such as Snowpipe, Time Travel, and Cloning (a Snowpipe sketch follows this listing).
- Collaborate with cross-functional teams to implement data models and pipelines.
- Ensure data security, quality, and compliance in Snowflake environments.
- Monitor and troubleshoot Snowflake and SQL systems for performance issues.

Experience: 5-8 years
Location: Mumbai/Bangalore/Pune/Chennai/Hyderabad/Indore/Kolkata/Noida/Coimbatore/Bhubaneswar
Notice period: 0-30 days

Interested candidates, share your CV at Muktai.S@alphacom.in
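A hedged sketch of the Snowpipe pattern mentioned above: an external stage plus a pipe that continuously loads newly arriving files. The bucket, format, and table names are placeholders, and a real setup also needs a storage integration or credentials, plus cloud event notifications for AUTO_INGEST.

-- Hypothetical landing stage over cloud storage (credentials omitted).
CREATE OR REPLACE STAGE raw.landing_stage
    URL = 's3://example-bucket/events/'
    FILE_FORMAT = (TYPE = 'JSON');

-- Pipe that auto-loads new files into a VARIANT column.
CREATE OR REPLACE PIPE raw.events_pipe
    AUTO_INGEST = TRUE
AS
    COPY INTO raw.events (payload)            -- raw.events assumed to have a VARIANT column
    FROM (SELECT $1 FROM @raw.landing_stage);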

Posted 4 days ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Responsibilities - a day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with the high levels of service in the technology domain.
- You will gather requirements and specifications to understand client needs in detail and translate them into system requirements.
- You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to Technology Leads and Project Managers.
- You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and professional requirements:
- Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred skills:
- Technology->Data on Cloud-DataStore->Snowflake

Posted 1 week ago

Apply

7.0 - 10.0 years

4 - 8 Lacs

Pune, Maharashtra, India

On-site

Responsibilities:
- Data profiling to identify primary keys and issues with the data (see the sketch after this listing).
- ETL to bring data onto the Cambia Data Platform, de-duplicate data, create or update dimensional data structures, and produce use-case-specific output.
- Unit testing, functional testing, and performance testing and tuning.
- Interacting with the Product team to understand and refine requirements.
- Interacting with QA to address reported findings.
- Working individually and as a team to achieve our goals.
- Taking initiative to take on additional work if the present work stream slows down.
- Other similar or related activities.

Top 3-5 requirements (you don't want to see candidates without these):
- Expert-level knowledge of the Git CLI and managing Git-based repositories.
- Previous CI/CD experience working with GitLab Runners, GitHub Actions, CircleCI, or Jenkins, and configuring them in GitLab repositories.
- Intermediate to expert knowledge of Snowflake-related technologies.
- Intermediate experience in developing and managing Python code and Python-based web services.

Top 3-5 desirements
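A hedged sketch of the profiling and de-duplication work described above, assuming a hypothetical staging.members table with a member_id key and an updated_at timestamp.

-- Profile a candidate primary key: any rows returned mean it is not unique.
SELECT member_id, COUNT(*) AS dup_count
FROM staging.members
GROUP BY member_id
HAVING COUNT(*) > 1;

-- De-duplicate, keeping the most recent record per key.
CREATE OR REPLACE TABLE curated.members AS
SELECT *
FROM staging.members
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY member_id
    ORDER BY updated_at DESC
) = 1;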

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Nagpur, Pune

Work from Office

JOB DESCRIPTION
Off-shore contract data engineering role that MUST work out of an approved clean-room facility. The role is part of an Agile team supporting Financial Crimes data platforms and strategies, including but not limited to their use of SAS Grid and Snowflake.

JOB SUMMARY
Handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition. Required to know and understand the ins and outs of the industry, such as data mining practices, algorithms, and how data can be used.

Primary responsibilities:
- Design, construct, install, test, and maintain data management systems.
- Build high-performance algorithms, predictive models, and prototypes.
- Ensure that all systems meet business/company requirements as well as industry practices.
- Integrate up-and-coming data management and software engineering technologies into existing data structures.
- Develop set processes for data mining, data modeling, and data production.
- Create custom software components and analytics applications.
- Research new uses for existing data.
- Employ an array of technological languages and tools to connect systems together.
- Collaborate with members of your team (e.g., data architects, the IT team, data scientists) on project goals.
- Install and update disaster recovery procedures.
- Recommend different ways to constantly improve data reliability and quality.
- Maintain up-to-date knowledge, support, and training documentation.

QUALIFICATIONS
- Technical degree or related work experience.
- Proficiency and technical skills relating to SQL, MySQL, DBT, Snowflake, and SAS.
- Exposure to and experience with ETL (DataStage), scripting (Python, JavaScript, etc.), version control (Git), and highly regulated environments (banking, health care, etc.).

Posted 1 week ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Mumbai, Mumbai (All Areas)

Work from Office

We are seeking a highly skilled Data Engineer with a strong background in Snowflake and Azure Data Factory (ADF), and solid experience in Python and SQL. The ideal candidate will play a critical role in designing and building robust, scalable data pipelines, enabling modern cloud-based data platforms including data warehouses and data lakes.

Key responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines using Snowflake, ADF, and Python to support data warehouse and data lake architectures (an incremental-merge sketch follows this listing).
- Build and automate data ingestion pipelines from various structured and semi-structured sources (APIs, flat files, cloud storage, databases) into Snowflake-based data lakes and data warehouses.
- Perform full-cycle data migration from on-premise and cloud databases (e.g., Oracle, SQL Server, Redshift, MySQL) to Snowflake.
- Optimize Snowflake workloads: schema design, clustering, partitioning, materialized views, and query performance tuning.
- Develop and orchestrate data workflows using Azure Data Factory pipelines, triggers, and dataflows.
- Implement data quality checks, validation processes, and monitoring mechanisms for production pipelines.
- Collaborate with cross-functional teams, including data scientists, analysts, and DevOps, to support diverse data needs.
- Ensure data integrity, security, and governance throughout the data lifecycle.
- Maintain comprehensive documentation on pipeline design, schema changes, and architectural decisions.

Required skills & qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2+ years of hands-on experience with Snowflake, including Snowflake SQL, SnowSQL, Snowpipe, Streams, Tasks, and performance optimization.
- 1+ year of experience with Azure Data Factory (ADF): pipeline design, linked services, datasets, triggers, and integration runtime.
- Strong Python skills for scripting, automation, and data manipulation.
- Advanced SQL skills: ability to write efficient, complex queries, procedures, and analytical expressions.
- Experience designing and implementing data lakes and data warehouses on cloud platforms.
- Familiarity with Azure cloud services, including Azure Data Lake Storage (ADLS), Blob Storage, Azure SQL, and Azure DevOps.
- Experience with orchestration tools such as Airflow, DBT, or Prefect is a plus.
- Understanding of data modeling, data warehousing principles, and ETL/ELT best practices.
- Experience building scalable data architectures for analytics and business intelligence use cases.

Preferred qualifications (nice to have):
- Experience with CI/CD pipelines for data engineering (e.g., Azure DevOps, GitHub Actions).
- Familiarity with Delta Lake, Parquet, or other big data formats.
- Knowledge of data security and governance tools like Purview or Informatica.
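One common shape for the ELT pipelines this posting describes, as a hedged sketch: an incremental MERGE from a staging table (loaded by ADF or Snowpipe) into a warehouse table. All object and column names are invented.

-- Upsert newly landed rows into the warehouse table (hypothetical names).
MERGE INTO dw.customers AS tgt
USING stg.customers_delta AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
    tgt.email      = src.email,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, src.updated_at);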

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Pune, Chennai, Bengaluru

Work from Office

Responsibilities A day in the life of an Infoscion • As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. • You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. • You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. • You would be a key contributor to building efficient programs/ systems and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: • Knowledge of design principles and fundamentals of architecture • Understanding of performance engineering • Knowledge of quality processes and estimation techniques • Basic understanding of project domain • Ability to translate functional / nonfunctional requirements to systems requirements • Ability to design and code complex programs • Ability to write test cases and scenarios based on the specifications • Good understanding of SDLC and agile methodologies • Awareness of latest technologies and trends • Logical thinking and problem solving skills along with an ability to collaborate Technical and Professional Requirements: • Primary skills: Technology->Data on Cloud-DataStore->Snowflake Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 - 1 Lacs

Gurugram, Bengaluru

Hybrid

Job Responsibilities: Exp : 4-9 Year Snowflake SnowPro Certified professionals are First priority. (Mandatory) -At least 4 years of experience with 4+ years on Snowflake Data Cloud. Must-Have Skills: Snowflake Cloud Platform strong hands-on experience ETL/ELT Tools – experience with one or more tools such as: Azure Data Factory AWS Glue Informatica Talend Qlik Replicate Workflow Orchestration – proficiency with tools like: Apache Airflow Control-M Tidal Automation Programming: Advanced SQL Python (including working with dataframes using Pandas, PySpark, or Snowpark) Data Engineering Concepts: Strong knowledge of data pipelines, data wrangling, and optimization Good-to-Have Skills: SQL scripting and procedural logic Data modeling tools (e.g., Erwin, dbt) Integration tools like Fivetran, Stitch. Please Share your updated your resume at poonampal@kpmg.com

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 3+ years of experience in designing, developing, and administering Snowflake data warehouse solutions, with a strong focus on scalability and performance.

Responsibilities:
- Write and optimize complex Snowflake SQL queries and scripts to ensure efficient data extraction, transformation, and loading (ETL/ELT).
- Develop and implement robust ETL/ELT pipelines using Snowflake and associated tools.
- Apply design patterns and best practices in data pipeline and system design.
- Work extensively with cloud platforms, preferably Azure, to integrate Snowflake solutions.
- Tune Snowflake warehouses for optimal query performance, including sizing, clustering, and partitioning strategies.
- Collaborate with the DataOps Live platform to orchestrate, automate, and monitor data workflows and pipelines.
- Review and interpret design documents, including UML diagrams, to ensure alignment with technical solutions.
- Implement data security measures such as masking policies, role-based access control, and compliance standards within Snowflake and Azure environments (see the sketch after this listing).
- Use version control systems like Git and participate in DevOps practices for continuous integration and deployment.
- Engage actively in Agile methodologies and collaborate effectively with cross-functional teams.
- Communicate clearly and professionally with clients and team members to ensure project alignment and success.

About Virtusa:
Virtusa values teamwork, quality of life, and professional and personal development. When you join Virtusa, you become part of a global team of 27,000 people who care about your growth. Virtusa aims to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with the company. At Virtusa, great minds and great potential come together. The company values collaboration and a team environment, and seeks to provide dynamic opportunities for great minds to nurture new ideas and foster excellence.
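A hedged sketch of the masking-policy and role-based access control measures listed above; role, schema, table, and policy names are all invented.

-- Dynamic data masking: only a privileged role sees raw values.
CREATE OR REPLACE MASKING POLICY pii.email_mask AS (val STRING) RETURNS STRING ->
    CASE
        WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
        ELSE '***MASKED***'
    END;

ALTER TABLE crm.contacts MODIFY COLUMN email
    SET MASKING POLICY pii.email_mask;

-- Role-based access control: a read-only reporting role
-- (database-level USAGE grant omitted for brevity).
CREATE ROLE IF NOT EXISTS reporting_reader;
GRANT USAGE ON SCHEMA crm TO ROLE reporting_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA crm TO ROLE reporting_reader;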

Posted 2 weeks ago

Apply

2.0 - 5.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Key responsibilities:
- Design and implement scalable data models using Snowflake to support business intelligence and analytics solutions.
- Implement ETL/ELT solutions that involve complex business transformations.
- Handle end-to-end data warehousing solutions.
- Migrate data from legacy systems to Snowflake.
- Write complex SQL queries for extracting, transforming, and loading data, ensuring high performance and accuracy.
- Optimize SnowSQL queries for better processing speeds.
- Integrate Snowflake with third-party applications.
- Use any ETL/ELT technology.
- Implement data security policies, including user access control and data masking, to maintain compliance with organizational standards.
- Document solutions and data flows.

Skills & qualifications:
- Experience: 2+ years in data engineering, with a focus on Snowflake.
- Proficient in SQL and Snowflake-specific SQL functions.
- Experience with ETL/ELT tools and cloud data integrations.

Technical skills:
- Strong understanding of Snowflake architecture, features, and best practices.
- Experience using Snowpark, Snowpipe, and Streamlit.
- Experience using Dynamic Tables is good to have (see the sketch after this listing).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and other cloud-based data technologies.
- Experience with data modeling concepts like star schema, snowflake schema, and data partitioning.
- Experience with Snowflake's Time Travel, Streams, and Tasks features.
- Experience in data pipeline orchestration.
- Knowledge of Python or Java for scripting and automation.
- Knowledge of Snowflake pipelines is good to have.
- Knowledge of data governance practices, including security, compliance, and data lineage.
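Dynamic Tables, flagged above as good to have, declare a transformation and let Snowflake keep the result refreshed. A minimal hedged sketch, with invented names and an illustrative lag:

CREATE OR REPLACE DYNAMIC TABLE analytics.daily_sales
    TARGET_LAG = '1 hour'       -- how stale the result is allowed to get
    WAREHOUSE  = transform_wh   -- warehouse used for refreshes
AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM raw.orders
    GROUP BY order_date;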

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

- DBT: designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data; ensure data quality in a big data environment.
- Very strong in PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform extract, load, and transform operations.
- Talend knowledge and hands-on experience is good to have; candidates who have worked in production support are preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures (a data-sharing sketch follows this listing).
- Perform data analysis, troubleshoot data issues, and provide technical support to end users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend/Snowflake certification is desirable.
- Excellent SQL coding skills, and excellent communication and documentation skills.
- Familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated; work effectively within a global team environment.
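A hedged sketch of the data-sharing feature listed above: exposing a secure view to a consumer account via a share. Database, view, and account names are placeholders.

-- Secure view over the data to be shared.
CREATE OR REPLACE SECURE VIEW reporting.public.v_orders AS
    SELECT order_id, order_date, amount
    FROM dw.public.orders;

-- Share the view with another Snowflake account.
CREATE SHARE IF NOT EXISTS orders_share;
GRANT USAGE ON DATABASE reporting TO SHARE orders_share;
GRANT USAGE ON SCHEMA reporting.public TO SHARE orders_share;
GRANT SELECT ON VIEW reporting.public.v_orders TO SHARE orders_share;
ALTER SHARE orders_share ADD ACCOUNTS = my_org.partner_account;  -- placeholder account identifier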

Posted 2 weeks ago

Apply

5.0 - 8.0 years

12 - 14 Lacs

Noida, Hyderabad, Bengaluru

Work from Office

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience with Snowflake data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe, SnowProc, and SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore traveling.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.

Contact: Soniya, soniya05.mississippiconsultants@gmail.com

Posted 2 weeks ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

Mumbai

Work from Office

Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers reference, market, and other critical data points to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24*7. With increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation, and it is committed to providing self-serve tools to our internal customers. The position is based in the Mumbai, India office.

Responsibilities
- Build and maintain ETL pipelines for Snowflake.
- Manage Snowflake objects and data models.
- Integrate data from various sources.
- Optimize performance and query efficiency.
- Automate and schedule data workflows.
- Ensure data quality and reliability.
- Collaborate with cross-functional teams.
- Document processes and data flows.

Qualifications
- Self-motivated, collaborative individual with a passion for excellence.
- B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with databases.
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool.
- Good working knowledge of Snowflake, YAML, and Python.
- Experience managing Snowflake databases, schemas, tables, and other objects.
- Proficient in Snowflake SQL, including CTEs, window functions, and stored procedures (see the sketch after this listing).
- Familiar with Snowflake performance tuning and cost optimization tools.
- Skilled in building ETL/ELT pipelines using dbt, Airflow, or Python.
- Able to work with various data sources, including RDBMS, APIs, and cloud storage.
- Understanding of incremental loads, error handling, and scheduling best practices.
- Strong SQL skills and intermediate Python proficiency for data processing.
- Familiar with Git for version control and collaboration.
- Basic knowledge of the Azure or GCP cloud platforms.
- Capable of integrating Snowflake with APIs and cloud-native services.

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An actively nurtured environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: we are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
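As a hedged illustration of the "CTEs, window functions" qualification above, a minimal Snowflake SQL sketch computing daily returns; the market.prices table and its columns are invented.

WITH daily AS (  -- CTE isolating the raw inputs
    SELECT instrument_id, price_date, close_price
    FROM market.prices
)
SELECT
    instrument_id,
    price_date,
    close_price,
    close_price / LAG(close_price) OVER (      -- window function: prior day's close
        PARTITION BY instrument_id
        ORDER BY price_date
    ) - 1 AS daily_return                      -- NULL on each instrument's first day
FROM daily;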

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

We are seeking a Lead Snowflake Engineer. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with DBT (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

7 - 17 Lacs

Gurugram

Work from Office

We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational qualifications: Bachelor of Engineering, BCS, BBA, BCom, MCA, MSc
Service line: Data & Analytics Unit

Responsibilities:
Good knowledge of Snowflake architecture, including:
- Virtual warehouses: multi-cluster warehouses and autoscaling (see the sketch after this listing)
- Metadata and system objects: query history, grants to users, grants to roles, users
- Micro-partitions, table clustering, auto-reclustering
- Materialized views and their benefits
- Data protection with Time Travel in Snowflake (extremely important)
- Analyzing queries using Query Profile (extremely important; explain plan)
- Cache architecture
- Named stages, direct loading, Snowpipe, data sharing, Streams, JavaScript procedures, and Tasks

Additional expectations:
- Strong ability to design and develop workflows in Snowflake in at least one cloud technology (preferably AWS).
- Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system.
- Preferably, knowledge of ETL activities like data processing from multiple source systems.
- Extensive knowledge of query performance tuning.
- Apply knowledge of BI tools.
- Manage time effectively: accurately estimate effort for tasks, meet agreed-upon deadlines, and effectively juggle ad-hoc requests and longer-term projects.

Snowflake performance specialist skills:
- Familiar with zero-copy cloning and using Time Travel features to clone a table.
- Familiar with understanding a Snowflake query profile, what each step does, and identifying performance bottlenecks from the query profile.
- Understanding of when a table needs to be clustered, and choosing the right cluster key as part of table design to help query optimization.
- Working with materialized views and the benefits-vs-cost scenario.
- How Snowflake micro-partitions are maintained and the performance implications with respect to micro-partitions, pruning, etc.
- Horizontal vs. vertical scaling and when to do what; the concept of multi-cluster warehouses and autoscaling.
- Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization.

Additional responsibilities:
- Domain: data warehousing, business intelligence
- Precise work location: Bhubaneswar, Bangalore, Hyderabad, Pune

Technical and professional requirements:
- Mandatory skills: Snowflake
- Desired skills: Teradata/Python (not mandatory)

Preferred skills: Cloud Platform-Snowflake, Technology-OpenSystem-Python-OpenSystem
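A hedged sketch of two of the concepts listed above, a multi-cluster warehouse with autoscaling plus a table clustering key; the sizes, counts, and names are illustrative only.

-- Multi-cluster warehouse that scales out horizontally under concurrency.
CREATE OR REPLACE WAREHOUSE reporting_wh
    WAREHOUSE_SIZE    = 'MEDIUM'
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 4        -- horizontal scaling for concurrent users
    SCALING_POLICY    = 'STANDARD'
    AUTO_SUSPEND      = 300      -- seconds of inactivity before suspending
    AUTO_RESUME       = TRUE;

-- Clustering: choose a key aligned with common filter predicates.
ALTER TABLE dw.fact_transactions CLUSTER BY (transaction_date, region);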

Posted 3 weeks ago

Apply

3.0 - 8.0 years

25 - 30 Lacs

Pune, Gurugram

Hybrid

Job title: Snowflake Developer
Company: Xebia
Location: Gurgaon / Pune (hybrid/on-site as applicable)
Job type: Full-time / contract
Department: Data & AI Center of Excellence (COE)
Notice period: immediate to 2 weeks max only

Join Xebia as a Snowflake Developer! Are you passionate about solving complex data problems using modern cloud data platforms? We're hiring a skilled and passionate Snowflake Developer to join our growing Data & AI COE team at Xebia! In this role, you'll design and implement robust Snowflake-based data solutions, engage in client-facing projects, and contribute to reusable frameworks, accelerators, and technical innovations.

Key responsibilities:

Project delivery (50%):
- Develop scalable data pipelines using Snowflake.
- Write and optimize advanced SQL, UDFs, views, and stored procedures (a UDF sketch follows this listing).
- Work with stakeholders to deliver performant, production-ready data solutions.

COE contributions (30%):
- Build accelerators, reusable components, and solution templates.
- Research and integrate Snowflake with AWS, Azure, Databricks, etc.
- Participate in internal training and documentation efforts.

Support & collaboration (20%):
- Troubleshoot Snowflake dev and prod environments.
- Collaborate across DevOps, QA, product, and architecture teams.
- Contribute to pre-sales discussions, PoCs, and code reviews.

What you bring:
- 3-6 years in data engineering, with 2+ years in Snowflake.
- Strong expertise in SQL, Snowflake scripting, and performance tuning.
- Experience with ETL/ELT pipelines, Snowpipe, Streams & Tasks, Snowpark, Time Travel, etc.
- Knowledge of AWS (S3, Glue, Lambda) or the Azure ecosystem.
- Familiarity with Airflow, dbt, Jenkins, Python, and JSON/XML handling.
- Understanding of data governance, security policies, and access controls.
- Great team collaboration and communication skills.

Nice to have:
- Snowflake certifications (Developer/Architect).
- BI tools like Power BI or Tableau.
- Experience with Git, DataOps, or data cataloging.

Why join Xebia? At Xebia, innovation meets execution. Join a passionate team driving data-led transformations across industries. Explore the latest in cloud, AI, data engineering, and automation, while being part of a culture that celebrates continuous learning, trust, and growth.

How to apply: Only candidates who can join immediately or within 15 days will be considered. To apply, please share the following details along with your updated resume: updated CV, current CTC, expected CTC, notice period, current location, preferred location, total experience, relevant experience in Snowflake, experience with AWS/Azure (please specify), and reason for job change.

Send all the details to: Vijay.S@xebia.com
Subject line: Snowflake Developer - [Your Name]
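A hedged sketch of the "advanced SQL, UDFs" item above: a small SQL UDF. The function, its April-start fiscal-year convention, and all names are invented for illustration.

-- Fiscal quarter for an April-March fiscal year (hypothetical convention).
CREATE OR REPLACE FUNCTION util.fiscal_quarter(d DATE)
RETURNS VARCHAR
AS
$$
    'FY-Q' || QUARTER(DATEADD(month, -3, d))::VARCHAR
$$;

-- Usage:
SELECT util.fiscal_quarter('2025-04-15'::DATE);  -- returns 'FY-Q1'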

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Work location: Bangalore, Chennai, Hyderabad, Pune, Bhubaneswar, Kochi
Experience: 5-10 years

Job description:
- Hands-on experience in Snowflake.
- Experience in Snowpipe and SnowSQL.
- Strong data warehouse experience.

Please share your updated profile to suganya@spstaffing.in if you are actively looking for a change.

Posted 4 weeks ago

Apply

6.0 - 10.0 years

13 - 23 Lacs

Hyderabad, Bengaluru

Work from Office

Senior Snowflake Developer
Bangalore/Hyderabad, 2nd shift (2-11 PM)
NOTE: Looking for someone who can start.

Required experience:
- A minimum of 10 years of hands-on experience in the IT industry.
- At least 5 years of experience in client invoicing and automation processes.
- Strong communication skills, both verbal and written.
- Proficient in using Jira for task tracking and project coordination.
- Demonstrated project management experience.

Technical expertise (minimum of 7 years of hands-on experience in the following areas):
- Snowflake cloud data platform.
- SQL development (including Snowflake SQL and SQL Server).
- Data modeling and stored procedures.
- DBT (Data Build Tool) for data transformation.
- Apache Airflow for workflow orchestration.
- Google Cloud Platform (GCP) services.
- Strong understanding of business intelligence (BI) tools, especially Power BI.
- HVR and Fivetran for data replication.
- Apache Kafka for real-time data streaming.
- Octopus Deploy and TeamCity for CI/CD and deployment automation.

Posted 1 month ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Snowflake Developer - reputed US-based IT MNC. If you are a Snowflake Matillion Developer, email your CV to jagannaath@kamms.net.

- Experience: 5+ years (must be 100% real-time experience to apply)
- Role: Snowflake Developer
- Preferred: Snowflake certifications (SnowPro Core/Advanced)
- Position type: full-time/permanent
- Location: Hyderabad, Bengaluru, and Chennai (hybrid; local candidates)
- Notice period: immediate to 15 days
- Salary: as per your experience

Responsibilities:
- 5+ years of experience in data engineering, ETL, and Snowflake development.
- Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts.
- Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture.
- Proficiency in SQL, Python, or scripting languages for automation and transformation.
- Experience with API integrations and data ingestion frameworks.
- Understanding of data governance, security policies, and access control in Snowflake.
- Excellent communication skills; ability to interact with business and technical stakeholders.
- Self-starter who can work independently and drive projects to completion.

Posted 1 month ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Snowflake Data Engineer
Mandatory skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: up to 50 LPA
Notice: immediate to 30 days (serving notice)
Experience: 6-11 years

Key responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-have skills:
- Strong experience with Azure cloud platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse (see the sketch after this listing).
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-have skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
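Since DBT is a must-have above, here is a hedged sketch of a dbt incremental model as it might run against Snowflake. The file path, source, and column names are invented, and the source is assumed to be declared in the project's sources YAML.

-- models/stg_orders.sql (hypothetical dbt model; the Jinja compiles to Snowflake SQL)
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    order_id,
    customer_id,
    order_ts,
    amount
FROM {{ source('raw', 'orders') }}
{% if is_incremental() %}
  -- On incremental runs, only pull rows newer than what is already loaded.
WHERE order_ts > (SELECT MAX(order_ts) FROM {{ this }})
{% endif %}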

Posted 1 month ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

- Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork.
- Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers.
- Collaborate with back-end web developers and programmers to improve usability.
- Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.
- Convert jobs from Talend ETL to Python, and convert Lead SQLs to Snowflake.
- Developers with Python and SQL skills: developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience with Snowflake data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe, SnowProc, and SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ETL/ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore traveling.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.

Contact: Soniya, soniya05.mississippiconsultants@gmail.com

Posted 1 month ago

Apply