
32 Snowflake Development Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

Salary not disclosed

Kochi, Kerala

On-site

The Snowflake Developer will play a crucial role in designing, developing, and implementing data solutions on Snowflake's cloud data platform. You will be responsible for writing efficient procedures with Spark or SQL to support data processing, transformation, and analysis. Strong Python/PySpark and SQL skills are required, along with some experience in data pipelines or other data engineering work. Knowledge of the AWS platform is important, as is a willingness to upskill and keep learning. You should have good expertise in SDLC/Agile and experience with SQL, complex queries, and query optimization. Experience with the Spark ecosystem, MongoDB data loads, Snowflake, and the AWS platform (EMR, Glue, S3) is desired. Hands-on experience writing advanced SQL queries and familiarity with a variety of databases are important. You should also have experience handling end-to-end data testing for complex big data projects, including writing and executing test cases, performing data validations, system testing, and performance checks.

Key Skills: Snowflake development, Python, PySpark, AWS
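For context on the "writing efficient procedures with SQL" requirement, a minimal sketch of a Snowflake SQL (Snowflake Scripting) stored procedure that merges staged rows into a reporting table; all database, table, and column names here are hypothetical, not taken from the posting.

```sql
-- Hypothetical sketch: merge newly staged orders into a reporting table.
-- Database, table, and column names are illustrative only.
CREATE OR REPLACE PROCEDURE load_orders_incremental()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    MERGE INTO analytics.orders AS tgt
    USING staging.orders_raw AS src
        ON tgt.order_id = src.order_id
    WHEN MATCHED THEN UPDATE SET
        status     = src.status,
        updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
        VALUES (src.order_id, src.status, src.updated_at);
    RETURN 'load complete';
END;
$$;

-- Invoke the procedure (for example, from a scheduled task or orchestrator).
CALL load_orders_incremental();
```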

Posted 6 days ago

Apply

7.0 - 12.0 years

22 - 27 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Work from Office

Job Description - Snowflake Developer
Experience: 7+ years | Location: India, Hybrid | Employment Type: Full-time

Job Summary
We are looking for a Snowflake Developer with 7+ years of experience to design, develop, and maintain our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake SQL, data modeling, and ETL/ELT processes to build efficient and scalable data solutions.

Key Responsibilities
1. Snowflake Development & Implementation
- Design and develop Snowflake databases, schemas, tables, and views
- Write and optimize complex SQL queries, stored procedures, and UDFs
- Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks)
- Manage virtual warehouses, resource monitors, and cost optimization
2. Data Pipeline & Integration
- Build and maintain ETL/ELT pipelines using Snowflake and tools like Snowpark, Python, or Spark
- Integrate Snowflake with cloud storage (S3, Blob Storage) and data sources (APIs)
- Develop data ingestion processes (batch and real-time) using Snowpipe
3. Performance Tuning & Optimization
- Optimize query performance through clustering, partitioning, and indexing
- Monitor and troubleshoot data pipelines and warehouse performance
- Implement caching strategies and materialized views for faster analytics
4. Data Modeling & Governance
- Design star schema, snowflake schema, and normalized data models
- Implement data security (RBAC, dynamic data masking, row-level security)
- Ensure data quality, documentation, and metadata management
5. Collaboration & Support
- Work with analysts, BI teams, and business users to deliver data solutions
- Document technical specifications and data flows
- Provide support and troubleshooting for Snowflake-related issues

Required Skills & Qualifications
- 7+ years in database development, data warehousing, or ETL
- 3+ years of hands-on Snowflake development experience
- Strong SQL and scripting (Python, Bash) skills
- Experience with Snowflake utilities (SnowSQL, Snowsight)
- Knowledge of cloud platforms (AWS, Azure) and data integration tools
- SnowPro Core Certification (preferred but not required)
- Experience with Coalesce, dbt, Airflow, or other data orchestration tools
- Familiarity with CI/CD pipelines and DevOps practices
- Knowledge of data visualization tools (Power BI, Tableau)
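As a quick illustration of the Streams & Tasks feature named in the responsibilities above, a minimal hypothetical sketch in Snowflake SQL; the database, table, warehouse, and column names are assumptions, not part of the posting.

```sql
-- Hypothetical sketch: a stream records row-level changes on a source table,
-- and a scheduled task applies them to a history table. Names are illustrative.
CREATE OR REPLACE STREAM raw.customer_changes ON TABLE raw.customers;

CREATE OR REPLACE TASK raw.apply_customer_changes
    WAREHOUSE = transform_wh
    SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.customer_changes')
AS
    INSERT INTO analytics.customer_history (customer_id, email, change_type, loaded_at)
    SELECT customer_id, email, METADATA$ACTION, CURRENT_TIMESTAMP()
    FROM raw.customer_changes;

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK raw.apply_customer_changes RESUME;
```

The WHEN clause keeps the task from spinning up the warehouse unless the stream actually has new data, which is one of the standard cost-control patterns the posting alludes to.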

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Chennai

Work from Office

Key Skills: Snowflake development, DBT (CLI & Cloud), ELT pipeline design, SQL scripting, data modeling, GitHub CI/CD integration, Snowpipe, performance tuning, data governance, troubleshooting, and strong communication skills.

Roles and Responsibilities:
- Design, develop, and maintain scalable data pipelines and ELT workflows using Snowflake SQL and DBT.
- Utilize SnowSQL CLI and Snowpipe for real-time and batch data loading, including the creation of custom functions and stored procedures.
- Implement Snowflake task orchestration and schema modeling, and perform system performance tuning for large-scale data environments.
- Build, deploy, and manage robust data models within Snowflake to support reporting and analytical solutions.
- Leverage DBT (CLI and Cloud) to script and manage complex ELT logic, applying best practices for version control using GitHub.
- Independently design and execute innovative ETL and reporting solutions that align with business and operational goals.
- Conduct issue triaging, pipeline debugging, and optimization to address data quality and processing gaps.
- Ensure technical designs adhere to data governance policies, security standards, and non-functional requirements (e.g., reliability, scalability, performance).
- Provide expert guidance on Snowflake features, optimization, security best practices, and cross-environment data movement strategies.
- Create and maintain comprehensive documentation for database objects, ETL processes, and data workflows.
- Collaborate with DevOps teams to implement CI/CD pipelines involving GitHub, DBT, and Snowflake integrations.
- Troubleshoot post-deployment production issues and deliver timely resolutions.

Experience Requirements:
- 5-8 years of experience in data engineering, with a strong focus on Snowflake and modern data architecture.
- Hands-on experience with Snowflake's architecture, including SnowSQL, Snowpipe, stored procedures, schema design, and workload optimization.
- Extensive experience with DBT (CLI and Cloud), including scripting, transformation logic, and integration with GitHub for version control.
- Successfully built and deployed large-scale ELT pipelines using Snowflake and DBT, optimizing for performance and data quality.
- Proven track record in troubleshooting complex production data issues and resolving them with minimal downtime.
- Experience aligning data engineering practices with data governance and compliance standards.
- Familiarity with CI/CD pipelines in a cloud data environment, including deploying updates to production using GitHub Actions and DBT integrations.
- Strong ability to communicate technical details clearly across teams and stakeholders.

Education: Any Post Graduation, Any Graduation.
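To illustrate the Snowpipe-based loading called out above, a minimal hypothetical sketch; it assumes an external stage and a named file format already exist, and all object names are illustrative rather than from the posting.

```sql
-- Hypothetical sketch of continuous loading with Snowpipe.
-- Assumes the external stage @raw.s3_events_stage and the file format
-- raw.csv_format already exist; all names are illustrative.
CREATE OR REPLACE PIPE raw.events_pipe
    AUTO_INGEST = TRUE
AS
COPY INTO raw.events
FROM @raw.s3_events_stage
FILE_FORMAT = (FORMAT_NAME = 'raw.csv_format');

-- Check recent load activity for the target table (last 24 hours).
SELECT *
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
    TABLE_NAME => 'RAW.EVENTS',
    START_TIME => DATEADD('hour', -24, CURRENT_TIMESTAMP())
));
```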

Posted 1 week ago

Apply

7.0 - 9.0 years

7 - 9 Lacs

Hyderabad, Telangana, India

On-site

- Create, test, and implement enterprise-level applications using Snowflake
- Design and implement features for identity and access management
- Build authorization frameworks for enhanced access control
- Optimize client queries and ensure major security competencies like encryption
- Address performance bottlenecks and ensure system scalability
- Manage transactions with distributed data processing algorithms
- Take full ownership of deliverables from start to completion
- Migrate solutions from on-premises to cloud platforms
- Apply modern delivery approaches in line with current data architecture
- Document and track project progress based on user requirements
- Perform data integration with third-party tools across all SDLC phases
- Architect, design, code, and test data pipelines
- Manage and document data models, architecture, and maintenance workflows
- Review and audit data models for ongoing enhancement
- Perform performance tuning and support user acceptance testing
- Provide application support and maintain data confidentiality
- Execute risk assessment, mitigation, and management plans
- Coordinate regularly with teams for updates and status reporting
- Handle database migration activities across environments
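As a sketch of the access-control work described above, a hypothetical Snowflake row access policy that limits rows by the querying role's region; the mapping table, roles, and column names are assumptions for illustration only.

```sql
-- Hypothetical sketch: restrict rows to regions mapped to the current role.
-- security.role_region_map, ADMIN_ROLE, and sales.orders are illustrative names.
CREATE OR REPLACE ROW ACCESS POLICY security.region_policy
AS (region_arg VARCHAR) RETURNS BOOLEAN ->
    CURRENT_ROLE() = 'ADMIN_ROLE'
    OR EXISTS (
        SELECT 1
        FROM security.role_region_map m
        WHERE m.role_name = CURRENT_ROLE()
          AND m.region    = region_arg
    );

-- Attach the policy to the column it should filter on.
ALTER TABLE sales.orders
    ADD ROW ACCESS POLICY security.region_policy ON (region);
```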

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 11 Lacs

Hyderabad, Bengaluru

Hybrid

Use data mappings and models provided by the data modeling team to build robust pipelines in Snowflake. Design and implement data pipelines with proper 2NF/3NF normalization standards. Expert-level SQL and experience with data transformation are required.

Required Candidate Profile: Expert-level SQL experience with data transformation; data architecture and normalization techniques (2NF/3NF); experience with cloud-based data platforms and pipeline design; experience with AWS data services; familiarity with the Carrier CAB process.
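To make the 2NF/3NF requirement above concrete, a hypothetical sketch that splits a denormalized staging feed into normalized Snowflake tables; all names are illustrative, and note that Snowflake treats primary/foreign key constraints as informational rather than enforced.

```sql
-- Hypothetical sketch: a flat feed repeating customer attributes on every order row
-- is split so customer data lives in one place (3NF), avoiding update anomalies.
-- Constraints below are documented but not enforced by Snowflake.
CREATE TABLE IF NOT EXISTS core.customers (
    customer_id   NUMBER PRIMARY KEY,
    customer_name VARCHAR,
    customer_city VARCHAR
);

CREATE TABLE IF NOT EXISTS core.orders (
    order_id    NUMBER PRIMARY KEY,
    customer_id NUMBER REFERENCES core.customers (customer_id),
    order_date  DATE,
    amount      NUMBER(12, 2)
);

-- Populate both tables from the denormalized staging feed.
INSERT INTO core.customers
SELECT DISTINCT customer_id, customer_name, customer_city
FROM staging.orders_flat;

INSERT INTO core.orders
SELECT order_id, customer_id, order_date, amount
FROM staging.orders_flat;
```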

Posted 2 weeks ago

Apply

7.0 - 12.0 years

19 - 22 Lacs

Bengaluru

Hybrid

Role & responsibilities: We are looking for a Sr. Snowflake Developer for Bangalore - Hybrid (2 days WFO), with 7+ years of experience in Snowflake, stored procedures, Python, and cloud.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Responsibilities - A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Responsibilities - A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 3 weeks ago

Apply

3.0 - 8.0 years

6 - 16 Lacs

Hyderabad, Chennai, Delhi / NCR

Hybrid

Hiring for Snowflake Developer with experience range 2 years & above
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN INDIA

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Mumbai, Chennai, Bengaluru

Hybrid

Important Points:
- Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies
- Monitor active ETL jobs in production
- Build out data lineage artifacts to ensure all current and future systems are properly documented
- Assist with build-out design/mapping documentation to ensure development is clear and testable for QA and UAT purposes
- Assess current and future data transformation needs to recommend, develop, and train new data integration tool technologies
- Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations
- Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs
- Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence and MDM solutions, including Data Lakes/Data Vaults

Supervisory Responsibilities: This job has no supervisory responsibilities.

Qualifications:
- Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering work
- 3-5 years of experience with strong proficiency in SQL query/development skills
- Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory)
- Experience working in the healthcare industry with PHI/PII
- Creative, lateral, and critical thinker
- Excellent communicator
- Well-developed interpersonal skills
- Good at prioritizing tasks and time management
- Ability to describe, create, and implement new solutions
- Experience with related or complementary open source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
- Knowledge / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)
- Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)
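As an illustration of the dbt transformation work described above, a minimal hypothetical incremental dbt model (SQL plus Jinja); the source, model, and column names are assumptions rather than anything from the posting.

```sql
-- models/staging/stg_orders.sql (hypothetical dbt model)
-- Incremental materialization: on repeat runs, only rows newer than the
-- latest already-loaded timestamp are processed.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_status,
    order_total,
    _loaded_at
from {{ source('fivetran_raw', 'orders') }}

{% if is_incremental() %}
where _loaded_at > (select max(_loaded_at) from {{ this }})
{% endif %}
```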

Posted 1 month ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Hyderabad, Bengaluru

Hybrid

Essential Responsibilities:

Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost.

Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events.
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.

Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally.

Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts.
- Continuously analyze query performance and storage utilization to drive down TCO.

Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.

Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.

Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proof-of-concepts, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

What You'll Bring to Our Team - Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years of hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, able to translate technical solutions for non-technical stakeholders.
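To illustrate the partitioning/clustering and authorized-view patterns named above, a brief hypothetical BigQuery SQL sketch; dataset, table, and column names are illustrative only.

```sql
-- Hypothetical sketch: partition IoT telemetry by event date and cluster by device
-- to reduce scanned bytes, then expose a curated view for external consumers.
CREATE TABLE IF NOT EXISTS telemetry.device_events (
    device_id   STRING,
    event_ts    TIMESTAMP,
    metric_name STRING,
    metric_val  FLOAT64
)
PARTITION BY DATE(event_ts)
CLUSTER BY device_id, metric_name;

-- Curated summary view; granting a consumer dataset access to this view
-- (an "authorized view") lets consumers query it without reading the raw table.
CREATE OR REPLACE VIEW data_exchange.daily_device_summary AS
SELECT
    device_id,
    DATE(event_ts) AS event_date,
    AVG(metric_val) AS avg_value
FROM telemetry.device_events
GROUP BY device_id, event_date;
```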

Posted 1 month ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

The Snowflake Data Specialist will manage projects in Data Warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Snowflake Developer
Experience: 5+ years
Location: Pan India
Work Mode: WFO

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Responsibilities - A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Chandigarh, Pune, Bengaluru

Work from Office

Responsibilities - A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 12 Lacs

Kolkata, Hyderabad, Bengaluru

Work from Office

Hiring for Snowflake Developer with experience range 2 years & above
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN INDIA

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 15 Lacs

Kolkata, Pune, Bengaluru

Work from Office

Hiring for Snowflake Developer with experience range 2 years & above
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN INDIA

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Kolkata, Chennai, Bengaluru

Work from Office

Hiring for Snowflake Developer with experience range 2 years & above
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN INDIA

Posted 1 month ago

Apply

3.0 - 8.0 years

6 - 16 Lacs

Pune, Chennai, Bengaluru

Hybrid

Hiring for Snowflake Developer with experience range 2 years & above
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN INDIA

Posted 1 month ago

Apply

3.0 - 8.0 years

6 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Hiring for Snowflake Developer with experience range 2 years & above
Mandatory Skills: Snowflake
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: PAN INDIA

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Pune, Bengaluru

Work from Office

Mandatory Skills: Snowflake Administrator

- Administer and manage Snowflake environments: oversee user access, security, and performance tuning.
- Develop and optimize SQL queries: create and refine complex SQL queries for data extraction, transformation, and loading (ETL) processes.
- Implement and maintain data pipelines: use Python and integrate pipelines with Snowflake.
- Monitor and troubleshoot: ensure the smooth operation of Snowflake environments, identifying and resolving issues promptly.
- Collaborate with data engineers: work closely with data engineers to provide optimized solutions and best practices.
- Review roles hierarchy: provide recommendations for best practices in role hierarchy and security.

Experience: Minimum of 3 years as a Snowflake Administrator, with a total of 5+ years in database administration or data engineering.
Technical Skills: Proficiency in SQL and Python, and experience with performance tuning and optimization.
Cloud Services: Experience with cloud platforms such as Azure.
Data Warehousing: Strong understanding of data warehousing concepts and ETL processes.
Problem-Solving: Excellent analytical and problem-solving skills.
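As an illustration of the role-hierarchy review mentioned above, a hypothetical Snowflake RBAC sketch in which a read role rolls up to a functional role and then to SYSADMIN; all role and object names are assumptions.

```sql
-- Hypothetical sketch: access role -> functional role -> SYSADMIN.
CREATE ROLE IF NOT EXISTS analytics_read;
CREATE ROLE IF NOT EXISTS analytics_engineer;

-- Read role: usage on the database/schemas plus SELECT on tables.
GRANT USAGE  ON DATABASE analytics                TO ROLE analytics_read;
GRANT USAGE  ON ALL SCHEMAS IN DATABASE analytics TO ROLE analytics_read;
GRANT SELECT ON ALL TABLES  IN DATABASE analytics TO ROLE analytics_read;

-- Engineers inherit read access plus the ability to create objects.
GRANT ROLE analytics_read TO ROLE analytics_engineer;
GRANT CREATE TABLE ON SCHEMA analytics.marts TO ROLE analytics_engineer;

-- Keep the hierarchy rooted under SYSADMIN for central administration.
GRANT ROLE analytics_engineer TO ROLE SYSADMIN;
```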

Posted 1 month ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Responsibilities - A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech
Location: PAN INDIA

Posted 1 month ago

Apply

8.0 - 13.0 years

30 - 45 Lacs

Bengaluru

Hybrid

Role & responsibilities

1. Senior Snowflake Developer
Experience: 8+ years | Location: Bangalore - Mahadevapura (Hybrid, UK shift, 3 days office) | Notice Period: immediate to 15 days | CTC: 37 Lakhs

JD Summary: ThoughtFocus is looking for a senior Snowflake developer for our NYC and London based financial services client operating in Public/Private Loans, CLOs, and Long/Short Credit. You will play a pivotal role in the successful delivery of our strategic initiatives and will be responsible for developing solutions using technologies like Snowflake, Coalesce & Fivetran. Location: Bengaluru, India.

Requirements:
- IT experience of 8+ years with a minimum of 3+ years of experience as a Snowflake Developer.
- Design, develop, and optimize Snowflake objects such as databases, schemas, tables, views, and stored procedures.
- Expertise in Snowflake utilities and features such as SnowSQL, Snowpipe, stages, tables, Zero-Copy Clone, Streams and Tasks, Time Travel, data sharing, data governance, and row access policies.
- Experience in migrating data from Azure Cloud to Snowflake, ensuring data integrity, performance optimization, and minimal disruption to business operations.
- Experience with Snowpipe for continuous loading and unloading of data into Snowflake tables.
- Experience in using the COPY, PUT, LIST, GET, and REMOVE commands.
- Experience in Azure integration and data loading (batch and bulk loading).
- Experience in creating system roles, custom roles, and role hierarchies in Snowflake.
- Expertise in masking policies and network policies in Snowflake.
- Responsible for designing and maintaining ETL tools (Coalesce & Fivetran), including extracting data from the MS SQL Server database and transforming it as per business requirements.
- Extensive experience in writing complex SQL queries, stored procedures, views, functions, triggers, indexes, and exception handling using MS SQL Server (T-SQL).
- Effective communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have some prior experience or a high-level understanding of hedge funds, private debt, and private equity.

What's on offer: Competitive and above-market salary; hybrid work schedule; opportunity to gain exposure and technology experience in global financial markets.

Education: Bachelor's degree in Computer Science / IT / Finance / Economics or equivalent.

2. Lead Snowflake Developer
Location: Bangalore (UK shift, 3 days work from office) | CTC: 45 Lakhs

- 13+ years of IT experience, with a proven track record of successfully leading a development team to deliver complex SQL and Snowflake projects.
- Strong communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving and decision-making abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have some prior experience or a high-level understanding of hedge funds, private debt, and private equity.
- SQL and Snowflake development expertise across all aspects of analysis: understanding the business requirement, taking an optimized approach to developing code, and ensuring data quality in the outputs presented.
- Advanced SQL to create, optimize, and performance-tune stored procedures and functions.
- Analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards.
- 5+ years of experience in MS SQL and Snowflake.
- 3+ years of experience in teams where SQL outputs were consumed via Power BI / Tableau / SSRS and similar tools.
- Should be able to define and enforce best practices.
- Good communication skills to discuss and deliver requirements effectively with the client.

Preferred candidate profile
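To give a flavor of several features listed in these postings (dynamic data masking, Zero-Copy Cloning, Time Travel), a brief hypothetical Snowflake SQL sketch; policy, database, and column names are illustrative only.

```sql
-- Hypothetical sketch: mask email addresses for all but a privileged role.
CREATE OR REPLACE MASKING POLICY security.mask_email
AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ROLE') THEN val
         ELSE '***MASKED***'
    END;

ALTER TABLE crm.contacts
    MODIFY COLUMN email SET MASKING POLICY security.mask_email;

-- Zero-Copy Clone: an instant, storage-efficient copy for dev or testing.
CREATE DATABASE crm_dev CLONE crm;

-- Time Travel: query the table as it was one hour ago.
SELECT COUNT(*) FROM crm.contacts AT (OFFSET => -3600);
```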

Posted 2 months ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 2 months ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 2 months ago

Apply
Page 1 of 2