
197 Snowpipe Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 6.0 years

11 - 14 Lacs

Bengaluru

Work from Office

Software Engineer - Snowflake (Bangalore Pool)
Job Date: Jul 21, 2025 | Job Requisition Id: 61649 | Location: Bangalore, KA, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Snowflake professionals in the following areas. JD for Senior Snowflake Developer: Snowflake, SnowSQL, PL/SQL, any ETL tool.

Job Description: 4-5 years of IT experience in analysis, design, development and unit testing of data warehousing applications using industry-accepted methodologies and procedures. Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting. Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing. Creating and managing automated data pipelines for both batch and streaming data using DBT. Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake. Writing and optimizing SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization while ensuring data quality. Should have experience in writing functions and stored procedures. Strong understanding of data warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications and user stories into technical specifications. Good to have: design and development experience in any ETL tool. Good interpersonal skills and experience handling communication and interactions between different teams.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and ethical corporate culture.
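The Snowpipe experience this posting asks for can be illustrated with a short, hedged sketch. All object names below (the S3 bucket, storage integration, stage, table and pipe) are hypothetical placeholders, not details from the listing; a minimal continuous-ingestion setup, run through the Snowflake Python connector, might look like this:

```python
# Minimal sketch (assumed names): continuously load JSON files landing in S3 into a
# Snowflake table with Snowpipe. A storage integration is assumed to exist already.
import snowflake.connector

STATEMENTS = [
    # External stage pointing at the (hypothetical) landing bucket.
    """CREATE STAGE IF NOT EXISTS raw_events_stage
         URL = 's3://example-landing-bucket/events/'
         STORAGE_INTEGRATION = s3_events_int
         FILE_FORMAT = (TYPE = 'JSON')""",
    # Target table; the payload is kept as semi-structured VARIANT.
    """CREATE TABLE IF NOT EXISTS raw_events (
         payload   VARIANT,
         loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())""",
    # Snowpipe: auto-ingest new files as S3 event notifications arrive.
    """CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
         COPY INTO raw_events (payload)
         FROM @raw_events_stage""",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account locator
    user="ETL_USER", password="***",
    role="SYSADMIN", warehouse="ETL_WH",
    database="ANALYTICS", schema="RAW",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```

With AUTO_INGEST enabled, new files trigger the pipe via bucket event notifications; the same COPY logic can also be run manually for backfills.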

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc. Service Line: Data & Analytics Unit. Responsibilities - A day in the life of an Infoscion: • As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. • You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. • You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. • You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: • Knowledge of design principles and fundamentals of architecture • Understanding of performance engineering • Knowledge of quality processes and estimation techniques • Basic understanding of project domain • Ability to translate functional/nonfunctional requirements to systems requirements • Ability to design and code complex programs • Ability to write test cases and scenarios based on the specifications • Good understanding of SDLC and agile methodologies • Awareness of latest technologies and trends • Logical thinking and problem-solving skills along with an ability to collaborate. Technical and Professional Requirements: • Primary skills: Technology->Data on Cloud-DataStore->Snowflake. Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 2 weeks ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Kolkata, Pune, Bengaluru

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc. Service Line: Data & Analytics Unit. Responsibilities - A day in the life of an Infoscion: • As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. • You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. • You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. • You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: • Knowledge of design principles and fundamentals of architecture • Understanding of performance engineering • Knowledge of quality processes and estimation techniques • Basic understanding of project domain • Ability to translate functional/nonfunctional requirements to systems requirements • Ability to design and code complex programs • Ability to write test cases and scenarios based on the specifications • Good understanding of SDLC and agile methodologies • Awareness of latest technologies and trends • Logical thinking and problem-solving skills along with an ability to collaborate. Technical and Professional Requirements: • Primary skills: Technology->Data on Cloud-DataStore->Snowflake. Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Snowflake SQL Developer, you will be responsible for writing SQL queries against Snowflake and developing scripts in Unix, Python, and other languages to facilitate the Extract, Load, and Transform (ELT) process for data. Your role will involve hands-on experience with various Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures. Your primary objective will be to design and implement scalable and performant data pipelines that ingest, process, and transform data from diverse sources into Snowflake. You should have proven experience in configuring and managing Fivetran connectors for data integration, and familiarity with DBT is considered a plus. To excel in this role, you must possess excellent SQL coding skills, along with strong communication and documentation abilities. Your complex problem-solving capabilities, coupled with a continuous-improvement mindset, will be crucial in delivering high-quality solutions. Analytical thinking, creativity, and self-motivation are key attributes that will drive your success in this position. Collaboration is essential in our global team environment, and your ability to work effectively with colleagues worldwide will be valued. While a Snowflake certification is preferred, familiarity with Agile delivery processes and outstanding communication skills are also highly desirable traits for this role.
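Several of the utilities named in this posting (Streams, Tasks, scheduled ELT) fit together into a simple incremental-load pattern. The sketch below is illustrative only; the table, stream, task and warehouse names are assumptions, not part of the listing:

```python
# Illustrative sketch (assumed object names): capture changes on a raw table with a
# Stream and apply them on a schedule with a Task. The curated target table is assumed
# to exist already.
import snowflake.connector

STATEMENTS = [
    # The stream records inserts/updates/deletes on the raw table since the last consume.
    "CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events",
    # The task wakes every 5 minutes and runs only when the stream has new rows.
    """CREATE OR REPLACE TASK load_curated_events
         WAREHOUSE = ETL_WH
         SCHEDULE = '5 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
       AS
         INSERT INTO curated_events (event_id, event_type, event_ts)
         SELECT payload:id::NUMBER,
                payload:type::STRING,
                payload:ts::TIMESTAMP_NTZ
         FROM raw_events_stream""",
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK load_curated_events RESUME",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ETL_USER", password="***",
    role="SYSADMIN", warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```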

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Snowflake Lead Engineer, you will be responsible for managing projects in data warehousing with a focus on Snowflake. With at least 8 years of industry experience, including a minimum of 4 years working directly with Snowflake, you will play a crucial role in designing and implementing data solutions. Your primary skills should include a strong foundation in data modelling fundamentals, data warehousing, ETL processes, and modern data platform concepts. Proficiency in tools such as PL/SQL, Python, Snowpipe, SnowSQL, and SQL (basic and advanced) is essential. You should also have experience with Time Travel and Fail-safe mechanisms. In addition to your core skills, familiarity with Matillion and DBT (Data Build Tool) would be beneficial in this role. Your specialization as a Senior Data Engineer in Snowflake engineering requires an in-depth understanding of relational and NoSQL data stores, dimensional modeling techniques, and data ingestion strategies. You will be responsible for developing and maintaining data pipelines, writing stored procedures, and implementing DWH and ETL processes within the Snowflake environment. A successful candidate will have a Bachelor's degree in Computer Science, Engineering, or related fields, or equivalent practical experience. You must be able to collaborate effectively with cross-functional teams, lead project requirements for data integration processes, and keep abreast of best practices in Snowflake data modeling. Your expertise will extend to Snowflake-specific features such as resource monitors, RBAC controls, virtual warehouses, and query performance tuning. You will be expected to drive continuous improvements in modeling principles and processes to ensure alignment with business needs. Overall, as a Snowflake Lead Engineer, you will be at the forefront of implementing cloud-based enterprise data warehouse solutions, leveraging Snowflake's capabilities to optimize data movement strategies and drive business outcomes effectively.
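The governance features this role calls out (resource monitors, RBAC, warehouse management) can be sketched roughly as below. The warehouse, monitor, role, database and schema names are placeholders assumed for illustration:

```python
# Rough sketch (assumed names): cap warehouse spend with a resource monitor and grant
# read-only access through a dedicated role (RBAC). Requires ACCOUNTADMIN privileges.
import snowflake.connector

STATEMENTS = [
    # Suspend the warehouse when 100 credits are consumed in the monthly cycle.
    """CREATE OR REPLACE RESOURCE MONITOR analytics_rm
         WITH CREDIT_QUOTA = 100
         TRIGGERS ON 80 PERCENT DO NOTIFY
                  ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = analytics_rm",
    # Read-only reporting role.
    "CREATE ROLE IF NOT EXISTS reporting_reader",
    "GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE reporting_reader",
    "GRANT USAGE ON DATABASE analytics TO ROLE reporting_reader",
    "GRANT USAGE ON SCHEMA analytics.marts TO ROLE reporting_reader",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE reporting_reader",
    "GRANT ROLE reporting_reader TO ROLE SYSADMIN",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ADMIN_USER", password="***",
    role="ACCOUNTADMIN", warehouse="ADMIN_WH",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```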

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As an Expert Engineer in the GPS Technology department located in Gurugram, India, reporting to the Project Manager, you will be an integral part of the Technology function providing IT services to Fidelity International business globally. Your role involves developing and supporting business applications that are crucial for revenue, operational efficiency, compliance, finance, legal, customer service, and marketing functions. Your primary responsibility will be to understand system requirements, analyze, design, develop, and test application systems following defined standards. Your expertise in software design, programming, engineering, and problem-solving will be crucial in delivering value to the business efficiently and with high quality. Your essential skills will include working on data ingestion, transformation and distribution using AWS or Snowflake, experience with SnowSQL, Snowpipe, and ETL/ELT tools, and hands-on knowledge of AWS services like EC2, Lambda, ECS/EKS, DynamoDB, and VPCs. You will be familiar with building data pipelines leveraging Snowflake's capabilities and integrating technologies that work with Snowflake. Moreover, you will design data ingestion and orchestration pipelines using AWS and Control-M, establish strategies for data extraction, ingestion, transformation, automation, and consumption, and ensure data quality and code coverage. Your ability to experiment with new technologies, passion for technology, problem-solving, and effective collaboration skills will be essential for success in this role. To qualify for this position, you should hold a B.E./B.Tech. or M.C.A. in Computer Science from a reputed university with a total of 7 to 10 years of relevant experience. Personal characteristics such as good interpersonal and communication skills, being a strong team player, strategic thinking, self-motivation, and problem-solving abilities will be highly valued. Join us in our mission to build better financial futures for our clients and be a part of a team that values your well-being, supports your development, and offers a flexible work environment. Visit careers.fidelityinternational.com to explore more about our work culture and how you can contribute to our team.
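A compact, hedged sketch of the AWS-to-Snowflake batch ingestion this role describes (external stage over S3, then a bulk COPY). The bucket, integration and table names are assumed, and a storage integration is presumed to exist already:

```python
# Sketch (assumed names): batch-load CSV files from an S3 prefix into a staging table.
import snowflake.connector

STATEMENTS = [
    """CREATE FILE FORMAT IF NOT EXISTS csv_std
         TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"'""",
    """CREATE STAGE IF NOT EXISTS trades_stage
         URL = 's3://example-bucket/trades/'
         STORAGE_INTEGRATION = s3_trades_int
         FILE_FORMAT = csv_std""",
    # Bulk load; ON_ERROR controls whether one bad file aborts or is skipped.
    """COPY INTO staging.trades
         FROM @trades_stage
         PATTERN = '.*[.]csv'
         ON_ERROR = 'ABORT_STATEMENT'""",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ETL_USER", password="***",
    role="SYSADMIN", warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```

In practice such a COPY step would be triggered by an orchestrator (Control-M, Airflow, or a scheduled Task) rather than run interactively.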

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

About Position: We are looking for experts who have hands-on experience in Snowflake development, with good experience in PL/SQL. Role: Snowflake Developer. Location: All Persistent locations. Experience: 5+ years. Job Type: Full-time employment. What You'll Do: Design, develop, and optimize data solutions using Snowflake as the primary platform. Utilize your expertise in Snowflake features, such as Snowpipe, Time Travel, and Secure Data Sharing, to implement scalable and secure solutions. Collaborate with stakeholders to understand business needs and translate them into robust technical designs using Snowflake. Development experience in SQL and PL/SQL. Ensure data security, governance, and compliance in Snowflake environments. Troubleshoot and resolve performance issues, focusing on Snowflake architecture and operations. Expertise You'll Bring: Advanced proficiency in the Snowflake platform, including architecture, data modeling, and query performance optimization. Additional Skills: Strong hands-on experience with advanced SQL, supporting ETL processes, data analysis, and query optimization. Familiarity with cloud platforms like AWS, Azure, or Google Cloud, and integration tools. Knowledge of programming languages like Python or Java is advantageous. Excellent problem-solving and communication skills. Benefits: Competitive salary and benefits package. Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents. Values-Driven, People-Centric & Inclusive Work Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We support hybrid work and flexible hours to fit diverse lifestyles. Our office is accessibility-friendly, with ergonomic setups and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. Let's unleash your full potential at Persistent - persistent.com/careers. Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.
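Two of the Snowflake features named here, Time Travel and Secure Data Sharing, are short enough to sketch. Everything below (table, share and consumer account names, the offset used) is an assumed placeholder, not something taken from the posting:

```python
# Sketch (assumed names): query a table as of an earlier point in time, and publish a
# table to another account via a secure share. Requires appropriate privileges.
import snowflake.connector

STATEMENTS = [
    # Time Travel: read the table as it looked one hour ago (within the retention window).
    "SELECT COUNT(*) FROM orders AT (OFFSET => -60*60)",
    # Secure Data Sharing: expose read-only access to a consumer account.
    "CREATE SHARE sales_share",
    "GRANT USAGE ON DATABASE sales TO SHARE sales_share",
    "GRANT USAGE ON SCHEMA sales.public TO SHARE sales_share",
    "GRANT SELECT ON TABLE sales.public.orders TO SHARE sales_share",
    # 'partner_acct' is a hypothetical consumer account identifier.
    "ALTER SHARE sales_share ADD ACCOUNTS = partner_acct",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ADMIN_USER", password="***",
    role="ACCOUNTADMIN", warehouse="ADMIN_WH", database="SALES", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```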

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role. Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, Gen AI, Fivetran. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will design and implement scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs including GenAI features. Roles & Responsibilities: Project Role: Snowflake Architect. Project Role Description: Lead design and implementation of scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Contributes to CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Lead Snowflake solutions across data ingestion, storage, transformation, and access. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Familiar with Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. - Collaborate with AI/ML teams to enable GenAI use cases leveraging Snowflake's native capabilities (e.g., Cortex AI, Streamlit apps, LLM models, Agentic-AI framework). - Understanding of LLM functions, Cortex Analyst (text to SQL), Streamlit apps (for bots), Document AI, and Cortex Search is expected. - Implementation experience with Gen-AI features within Snowflake, such as creating POCs/POVs, MVPs, etc. Professional & Technical Skills: Must have skills: Snowflake Data Warehouse with GenAI. Good to have skills: dbt, Fivetran. - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification. - Experience or knowledge of building or integrating Generative AI applications and experience with Snowflake Cortex. - Strong communication skills. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Overall Experience: 8-10 years. Snowflake Experience: Minimum 3 years. Qualification: 15 years full time education
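As a rough illustration of two of the advanced features this posting mentions (dynamic tables and column masking), here is a hedged sketch; all table, policy, role and warehouse names are assumptions made for the example:

```python
# Sketch (assumed names): mask an email column for non-privileged roles, and maintain a
# derived aggregate with a dynamic table instead of a hand-rolled Stream/Task pipeline.
import snowflake.connector

STATEMENTS = [
    # Column masking policy: only PII_READER sees the raw value.
    """CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
         CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Dynamic table: Snowflake keeps it refreshed within the target lag.
    """CREATE OR REPLACE DYNAMIC TABLE daily_order_counts
         TARGET_LAG = '30 minutes'
         WAREHOUSE = ETL_WH
       AS
         SELECT order_date, COUNT(*) AS order_count
         FROM raw_orders
         GROUP BY order_date""",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ETL_USER", password="***",
    role="SYSADMIN", warehouse="ETL_WH", database="ANALYTICS", schema="MARTS",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```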

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role. Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, Fivetran. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will design and implement scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs. Roles & Responsibilities: Project Role: Snowflake Architect. Project Role Description: Lead design and implementation of scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Contributes to CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Lead Snowflake solutions across data ingestion, storage, transformation, and access. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Familiar with Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. - Good to have skills in DBT and Fivetran. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification (must). Good to have skills: dbt, Fivetran, GenAI features in Snowflake. - Deep expertise in Data Engineering. - Strong communication skills. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Overall Experience: 8-10 years. Snowflake Experience: Minimum 3 years. Qualification: 15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

13 - 18 Lacs

Bengaluru

Work from Office

About The Role. Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Snowflake Data Warehouse, Data Engineering. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will be responsible for designing robust, secure, and high-performing Snowflake environments. You will assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Your typical day will involve collaborating with cross-functional teams, analyzing requirements, designing application architecture, and providing technical guidance to ensure the successful delivery of projects. As a core contributor to the CoE, you will also collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs. Roles & Responsibilities: - Project Role: Snowflake Senior Solution Architect. - Project Role Description: Architects end-to-end Snowflake solutions, including modelling, optimization, and security. Leads CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. Snowflake Experience: Minimum 4 years. Certifications: Any SnowPro Advanced Certification (must). - Expected to be an SME in Snowflake with deep knowledge and experience. Collaborate with vendors to align solutions, drive CoE initiatives. - Design and implement enterprise-grade Snowflake solutions across data ingestion, storage, transformation, and access. - Create and maintain reference architectures, accelerators, design patterns, and solution blueprints for repeatability. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Possesses in-depth knowledge of Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, any SnowPro Advanced Certification (must). Good to have skills: dbt, Fivetran, GenAI features in Snowflake. - Deep expertise in Data Engineering. - Strong communication and solution architecture skills with the ability to bridge technical and business discussions. - Should have influencing and advisory skills. - Responsible for team decisions. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Qualification: 15 years full time education

Posted 2 weeks ago

Apply

11.0 - 20.0 years

30 - 45 Lacs

Pune, Chennai, Bengaluru

Work from Office

Data Architecture experience in Data Warehouse, Snowflake + DBT; Snowflake with Snowflake advanced certification. Oversees and designs the information architecture for the data warehouse, including all information structures, i.e. staging area, data warehouse, data marts, and operational data stores; oversees standardization of data definitions; oversees development of physical and logical modelling. Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modelling, Star and Snowflake schema design, Reference DW Architectures, ETL architecture, ETL (Extract, Transform, Load), Data Analysis, Data Conversion, Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support. Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical and dimensional models). Maintain in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics model platforms and ETL tools. Please share your resume at parul@mounttalent.com
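The dimensional-modelling skills this role emphasizes (fact and dimension tables in a star schema) can be illustrated with a small, hedged sketch. The retail-style table and column names are invented for the example, and note that Snowflake records but does not enforce PRIMARY KEY / FOREIGN KEY constraints:

```python
# Sketch (assumed names): a minimal star schema with two dimensions and one fact table.
import snowflake.connector

STATEMENTS = [
    """CREATE TABLE IF NOT EXISTS dim_customer (
         customer_key  NUMBER AUTOINCREMENT PRIMARY KEY,
         customer_id   STRING,
         customer_name STRING,
         city          STRING)""",
    """CREATE TABLE IF NOT EXISTS dim_date (
         date_key  NUMBER PRIMARY KEY,     -- e.g. 20250721
         full_date DATE,
         year NUMBER, month NUMBER, day NUMBER)""",
    """CREATE TABLE IF NOT EXISTS fact_sales (
         sale_id      NUMBER AUTOINCREMENT PRIMARY KEY,
         customer_key NUMBER REFERENCES dim_customer (customer_key),
         date_key     NUMBER REFERENCES dim_date (date_key),
         quantity     NUMBER,
         amount       NUMBER(12, 2))""",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ETL_USER", password="***",
    role="SYSADMIN", warehouse="ETL_WH", database="ANALYTICS", schema="MARTS",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```

A snowflake-schema variant would further normalize dim_customer (for example, splitting city into its own dimension) at the cost of extra joins.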

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

CitiusTech is hiring Snowflake + ETL Developers, and we are conducting a virtual drive this weekend (30th Aug-25). Below are the required details. Required skills: Snowflake + ETL and SQL. Total experience: 5 to 12 years. Relevant experience: minimum 3 years. Work location: Chennai, Bengaluru, Mumbai, Pune. Work mode: Hybrid. Interview date: 30th Aug-25. Interview mode: Virtual. Interested candidates, kindly share your updated resume to gopinath.r@citiustech.com. Regards, Gopinath R.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Bengaluru

Hybrid

Role & responsibilities: Experience with Snowflake utilities, SnowSQL, Snowpipe, and developing stored procedures. Experience in AWS and Python/shell scripting languages. Experience working with tools to automate CI/CD pipelines. Experience in data analysis, data migration, data validation, data cleansing, data verification, identifying data mismatches, data import, and data export using multiple ETL tools such as Informatica, DataStage, Teradata, and Talend. Good in Snowflake advanced concepts like setting up Resource Monitors, Role-Based Access Controls, Data Sharing, cross-platform database replication, virtual warehouse sizing, query performance tuning, Snowpipe, Tasks, Streams, zero-copy cloning, etc. Performance tuning of the databases to ensure an optimal reporting user experience. Design and develop end-to-end ETL processes from various source systems to the staging area, and from staging to data marts, including data loads.
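Two of the advanced concepts listed above (zero-copy cloning and virtual warehouse sizing) are easy to show in a short, hedged sketch; the database and warehouse names are placeholders, not details from the posting:

```python
# Sketch (assumed names): spin up a dev copy of a database via zero-copy cloning and
# resize / auto-suspend a warehouse as a basic performance-vs-cost lever.
import snowflake.connector

STATEMENTS = [
    # Zero-copy clone: a metadata-only copy; no data is physically duplicated at creation.
    "CREATE DATABASE IF NOT EXISTS analytics_dev CLONE analytics",
    # Warehouse sizing: bump compute for a heavy load window...
    "ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE'",
    # ...and make sure idle compute suspends quickly to control credits.
    "ALTER WAREHOUSE etl_wh SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
]

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ADMIN_USER", password="***",
    role="SYSADMIN", warehouse="ETL_WH",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```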

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, Fivetran. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Lead, you will build scalable, secure, and high-performance Snowflake pipelines across multiple projects and teams. Your typical day will involve collaborating with Team Leads to ensure alignment with the design of the architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will support delivery teams and contribute to the execution of POVs and POCs. Roles & Responsibilities: Project Role: Snowflake Lead. Project Role Description: Build scalable, secure, and high-performance Snowflake pipelines in projects. Contributes to CoE initiatives related to creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Build complex data pipelines, implement ETL/ELT processes, and work with semi-structured and structured data to support analytics and reporting. - Collaborate closely with architects, data engineers, and business teams to translate requirements into scalable solutions, ensure data quality, and maintain security best practices. - Experience with Snowflake features: performance tuning, writing advanced SQL queries, and leveraging Snowflake features like Snowpipe, Streams, Tasks, and Stored Procedures. - Stay abreast of evolving Snowflake and GenAI capabilities. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification (must). - Proven expertise in Data Engineering. - Strong communication skills. - Good to have skills: dbt, Fivetran. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Qualification: 15 years full time education

Posted 3 weeks ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Snowflake Data Warehouse, Data Engineering. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will be responsible for designing robust, secure, and high-performing Snowflake environments. You will assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Your typical day will involve collaborating with cross-functional teams, analyzing requirements, designing application architecture, and providing technical guidance to ensure the successful delivery of projects. As a core contributor to the CoE, you will also collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs. Roles & Responsibilities: - Project Role: Snowflake Senior Solution Architect. - Project Role Description: Architects end-to-end Snowflake solutions, including modelling, optimization, and security. Leads CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. Snowflake Experience: Minimum 4 years. Certifications: Any SnowPro Advanced Certification (must). - Expected to be an SME in Snowflake with deep knowledge and experience. Collaborate with vendors to align solutions, drive CoE initiatives. - Design and implement enterprise-grade Snowflake solutions across data ingestion, storage, transformation, and access. - Create and maintain reference architectures, accelerators, design patterns, and solution blueprints for repeatability. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Possesses in-depth knowledge of Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, any SnowPro Advanced Certification (must). Good to have skills: dbt, Fivetran, GenAI features in Snowflake. - Deep expertise in Data Engineering. - Strong communication and solution architecture skills with the ability to bridge technical and business discussions. - Should have influencing and advisory skills. - Responsible for team decisions. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Qualification: 15 years full time education

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, Fivetran. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Lead, you will build scalable, secure, and high-performance Snowflake pipelines across multiple projects and teams. Your typical day will involve collaborating with Team Leads to ensure alignment with the design of the architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will support delivery teams and contribute to the execution of POVs and POCs. Roles & Responsibilities: Project Role: Snowflake Lead. Project Role Description: Build scalable, secure, and high-performance Snowflake pipelines in projects. Contributes to CoE initiatives related to creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Build complex data pipelines, implement ETL/ELT processes, and work with semi-structured and structured data to support analytics and reporting. - Collaborate closely with architects, data engineers, and business teams to translate requirements into scalable solutions, ensure data quality, and maintain security best practices. - Experience with Snowflake features: performance tuning, writing advanced SQL queries, and leveraging Snowflake features like Snowpipe, Streams, Tasks, and Stored Procedures. - Stay abreast of evolving Snowflake and GenAI capabilities. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification (must). - Proven expertise in Data Engineering. - Strong communication skills. - Good to have skills: dbt, Fivetran. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Qualification: 15 years full time education

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse, Data Engineering. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Sr. Data Engineer, you will build scalable, secure, and high-performance Snowflake pipelines across multiple projects and teams. Your typical day will involve collaborating with Team Leads to ensure alignment with the design of the architecture. From a CoE perspective, you will support delivery teams and contribute to the execution of POVs and POCs. Roles & Responsibilities: Project Role: Snowflake Sr. Data Engineer. Project Role Description: Build scalable, secure, and high-performance Snowflake pipelines in projects. Contributes to CoE initiatives including POCs, accelerators, etc. - Build data pipelines, implement ETL/ELT processes, and work with semi-structured and structured data to support analytics and reporting. - Collaborate closely with project leads to translate requirements into scalable solutions, ensure data quality, and maintain security best practices. - Experience with Snowflake features: performance tuning, writing advanced SQL queries, and leveraging Snowflake features like Snowpipe, Streams, Tasks, and Stored Procedures. - Stay abreast of evolving Snowflake and GenAI capabilities. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification (must). - Experience in Data Engineering. - Strong communication skills. - Good to have skills: dbt, Fivetran, Python, GenAI features in Snowflake. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Qualification: 15 years full time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, Fivetran. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will design and implement scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs. Roles & Responsibilities: Project Role: Snowflake Architect. Project Role Description: Lead design and implementation of scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Contributes to CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Lead Snowflake solutions across data ingestion, storage, transformation, and access. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Familiar with Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. - Good to have skills in DBT and Fivetran. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification (must). Good to have skills: dbt, Fivetran, GenAI features in Snowflake. - Deep expertise in Data Engineering. - Strong communication skills. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Overall Experience: 8-10 years. Snowflake Experience: Minimum 3 years. Qualification: 15 years full time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, Gen AI, Fivetran. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Level 8: Snowflake + GenAI. About The Role. Project Role: Snowflake Architect. Project Role Description: Lead design and implementation of scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Contributes to CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. Must have skills: Snowflake Data Warehouse with GenAI. Good to have skills: dbt, Fivetran. Overall Experience: 8-10 years. Snowflake Experience: Minimum 3 years. Certifications: SnowPro Core Certified (must). Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will design and implement scalable, secure, and high-performance Snowflake architectures across multiple projects and teams. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs including GenAI features. Roles & Responsibilities: - Lead Snowflake solutions across data ingestion, storage, transformation, and access. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Familiar with Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. - Collaborate with AI/ML teams to enable GenAI use cases leveraging Snowflake's native capabilities (e.g., Cortex AI, Streamlit apps, LLM models, Agentic-AI framework). - Understanding of LLM functions, Cortex Analyst (text to SQL), Streamlit apps (for bots), Document AI, and Cortex Search is expected. - Implementation experience with Gen-AI features within Snowflake, such as creating POCs/POVs, MVPs, etc. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification. - Experience or knowledge of building or integrating Generative AI applications and experience with Snowflake Cortex. - Strong communication skills. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is .
Qualification: 15 years full time education

Posted 3 weeks ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills: Snowflake Data Warehouse, Data Engineering, Gen AI. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Architect, you will be responsible for designing robust, secure, and high-performing Snowflake environments while enabling and optimizing real-time analytics pipelines. You will assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Your typical day will involve collaborating with cross-functional teams, analyzing requirements, designing application architecture, and providing technical guidance to ensure the successful delivery of projects. As a core contributor to the CoE, you will collaborate closely on presales, support delivery teams, and lead the design and execution of POVs and POCs including GenAI features. Roles & Responsibilities: Project Role: Snowflake Senior Solution Architect. Project Role Description: Architects end-to-end Snowflake solutions, including modeling, optimization, and security. Leads CoE initiatives by creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Expected to be an SME in Snowflake with deep knowledge and experience. Collaborate with vendors to align solutions, drive CoE initiatives. - Design and implement enterprise-grade Snowflake solutions across data ingestion, storage, transformation, and access. - Create and maintain reference architectures, accelerators, design patterns, and solution blueprints for repeatability. - Track and evangelize Snowflake's latest features, and assess their potential for client impact and internal innovation. - Lead capability growth initiatives by mentoring team members, driving POCs, and contributing to knowledge repositories. - Provide pre-sales support, including proposal creation, estimations, solutioning, and client presentations. - Define data modeling standards, physical schema designs, and best practices for structured and semi-structured data. - Establish data governance, security models, RBAC, and masking policies in line with compliance standards (e.g., SOC2, HIPAA, GDPR). - Possesses in-depth knowledge of Snowflake's advanced features, including Snowpark, Iceberg tables, dynamic tables, Snowpipe Streaming, SPCS, authentication & authorization methods, alerts, performance tuning, row-level security, masking, tagging, etc. - Stay abreast of evolving Snowflake and GenAI capabilities; recommend and pilot new tools and frameworks. - Collaborate with AI/ML teams to enable GenAI use cases leveraging Snowflake's native capabilities (e.g., Cortex AI, Streamlit apps, LLM models, Agentic-AI framework). - Design and solution Gen-AI based workloads that are cost-efficient and performant. - Have a complete understanding of Cortex Analyst, Document AI, Cortex Search, LLM functions and their key use cases.
Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse, any SnowPro Advanced Certification (must). Must have skills: Snowflake Data Warehouse with GenAI. Good to have skills: dbt, Fivetran. - Experience or knowledge of building or integrating Generative AI applications and experience with Snowflake Cortex. - Strong communication and solution architecture skills with the ability to bridge technical and business discussions. - Should have influencing and advisory skills. - Responsible for team decisions. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Qualification: 15 years full time education
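For the Cortex capabilities this role references (LLM functions such as summarization and sentiment scoring), a minimal, hedged sketch is below. It assumes a hypothetical support_tickets table and that Cortex LLM functions are available in the account's region and edition:

```python
# Sketch (assumed table and availability): call Snowflake Cortex LLM functions from SQL
# to summarize and score free-text support tickets.
import snowflake.connector

QUERY = """
    SELECT ticket_id,
           SNOWFLAKE.CORTEX.SUMMARIZE(ticket_text) AS ticket_summary,
           SNOWFLAKE.CORTEX.SENTIMENT(ticket_text) AS sentiment_score
    FROM support_tickets
    LIMIT 10
"""

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1", user="ANALYST_USER", password="***",
    role="ANALYST", warehouse="ANALYTICS_WH", database="SUPPORT", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    for ticket_id, summary, sentiment in cur.execute(QUERY):
        print(ticket_id, round(sentiment, 2), summary[:80])
finally:
    conn.close()
```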

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse. Good to have skills: Data Engineering, Data Building Tool, SAP SD, Fivetran, Gen AI. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Snowflake Lead, you will build scalable, secure, and high-performance Snowflake pipelines across multiple projects and teams. Your typical day will involve collaborating with Team Leads to ensure alignment with the design of the architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. From a CoE perspective, you will support delivery teams and contribute to the execution of POVs and POCs including GenAI. Roles & Responsibilities: Project Role: Snowflake Lead. Project Role Description: Build scalable, secure, and high-performance Snowflake pipelines in projects. Contributes to CoE initiatives related to creating guidelines, frameworks, and reference architectures to ensure consistency and scalability across projects. - Build complex data pipelines, implement ETL/ELT processes, and work with semi-structured and structured data to support analytics and reporting. - Collaborate closely with architects, data engineers, and business teams to translate requirements into scalable solutions, ensure data quality, and maintain security best practices. - Experience with Snowflake features: performance tuning, writing advanced SQL queries, and leveraging Snowflake features like Snowpipe, Streams, Tasks, and Stored Procedures. - Stay abreast of evolving Snowflake and GenAI capabilities. - Contribute to GenAI initiatives led by the CoE. Professional & Technical Skills: - Must have skills: Snowflake Data Warehouse with knowledge of GenAI features in Snowflake. - Good to have skills: dbt, Fivetran, Python. - Must To Have Skills: Proficiency in Snowflake Data Warehouse, SnowPro Core Certification (must). - Proven expertise in Data Engineering including GenAI, covering Cortex LLM functions and a basic understanding of Cortex Analyst, Cortex Search, and Cortex Agents. - Strong communication skills. Additional Information: - The candidate should have a minimum of 15 years of experience in Snowflake Data Warehouse. - This position is . Overall Experience: 5-7 years. Snowflake Experience: Minimum 2 years. Qualification: 15 years full time education

Posted 3 weeks ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Inviting applications for the role of Lead Consultant- Snowflake Data Engineer( Snowflake+Python+Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and lead a group of one or more developer to address a goal. Job Description: Experience in IT industry Working experience with building productionized data ingestion and processing data pipelines in Snowflake Strong understanding on Snowflake Architecture Fully well-versed with data warehousing concepts. Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing. Able to create the data pipeline for ETL/ELT Excellent presentation and communication skills, both written and verbal Ability to problem solve and architect in an environment with unclear requirements. Able to create the high level and low-level design document based on requirement. Hands on experience in configuration, troubleshooting, testing and managing data platforms, on premises or in the cloud. Awareness on data visualisation tools and methodologies Work independently on business problems and generate meaningful insights Good to have some experience/knowledge on Snowpark or Streamlit or GenAI but not mandatory. Should have experience on implementing Snowflake Best Practices Snowflake SnowPro Core Certification will be added an advantage Roles and Responsibilities: Requirement gathering, creating design document, providing solutions to customer, work with offshore team etc. Writing SQL queries against Snowflake, developing scripts to do Extract, Load, and Transform data. Hands-on experience with Snowflake utilities such as SnowSQL, Bulk copy, Snowpipe, Tasks, Streams, Time travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, Steamlit Have experience with Snowflake cloud data warehouse and AWS S3 bucket or Azure blob storage container for integrating data from multiple source system. Should have have some exp on AWS services (S3, Glue, Lambda) or Azure services ( Blob Storage, ADLS gen2, ADF) Should have good experience in Python/Pyspark.integration with Snowflake and cloud (AWS/Azure) with ability to leverage cloud services for data processing and storage. Proficiency in Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. Knowledge of ETL (Extract, Transform, Load) processes and tools, and ability to design and develop efficient ETL jobs using Python or Pyspark. Should have some experience on Snowflake RBAC and data security. Should have good experience in implementing CDC or SCD type-2. Should have good experience in implementing Snowflake Best Practices In-depth understanding of Data Warehouse, ETL concepts and Data Modelling Experience in requirement gathering, analysis, designing, development, and deployment. Should Have experience building data ingestion pipeline Optimize and tune data pipelines for performance and scalability Able to communicate with clients and lead team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have experience in deployment using CI/CD tools and exp in repositories like Azure repo , Github etc. Qualifications we seek in you! Minimum qualifications B.E./ Masters in Computer Science, Information technology, or Computer engineering or any equivalent degree with good IT experience and relevant as Snowflake Data Engineer. 
Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and Data Warehousing concepts
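The SCD Type 2 requirement mentioned above is commonly met with a two-step load: expire the current rows that changed, then insert new versions. A minimal sketch follows; the dim_customer and stg_customer tables and their columns are hypothetical, not part of the job description.

    -- Hypothetical SCD Type 2 load.
    -- dim_customer(customer_id, name, city, effective_from, effective_to, is_current)
    -- stg_customer(customer_id, name, city) holds the latest snapshot from staging.

    -- Step 1: close out current rows whose attributes changed.
    UPDATE dim_customer d
       SET effective_to = CURRENT_TIMESTAMP(), is_current = FALSE
      FROM stg_customer s
     WHERE d.customer_id = s.customer_id
       AND d.is_current
       AND (d.name <> s.name OR d.city <> s.city);

    -- Step 2: insert new versions for changed keys and rows for brand-new keys.
    INSERT INTO dim_customer (customer_id, name, city, effective_from, effective_to, is_current)
    SELECT s.customer_id, s.name, s.city, CURRENT_TIMESTAMP(), NULL, TRUE
      FROM stg_customer s
      LEFT JOIN dim_customer d
        ON d.customer_id = s.customer_id AND d.is_current
     WHERE d.customer_id IS NULL;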

Posted 3 weeks ago

Apply

6.0 - 11.0 years

0 Lacs

pune, chennai, bengaluru

Hybrid

Hello folks, we are hiring a "Snowflake Developer / Data Engineer" for a service-based company.
Job type: Permanent / Full Time

1) Skills: Snowflake Development + Data Modelling OR Snowflake + AWS
Experience: G6 - 8-12 yrs
Location: Pune / Chennai / Bangalore / Mumbai / Noida / Ahmedabad (Hybrid)
Notice period: Immediate to September (serving notice) only
Interview process: L1 & L2 - Virtual
- 10+ years of experience delivering data and data analytics solutions
- Participate in design discussions with enterprise architects and recommend design improvements
- Develop and maintain conceptual, logical, and physical data models with their corresponding metadata
- Work closely with all squads and product owners to document and implement data strategies and best practices
- Well versed in data modelling and database design and development
- Working experience in complex XML data integration
- Experience writing complex queries and meeting reporting needs
- Very strong expertise in data integration, data analysis, technical analysis, and mappings
- Develop proof-of-concept projects to validate proposed solutions
- Well versed in Snowflake; strong experience in ETL tools such as Ab Initio
- Good experience with reports, dashboards, and tools such as Tableau

2) Skills: Snowflake, Python OR PySpark, ETL, SQL, SnowPro certified (preferred), SnowSQL
Experience: 6-14 yrs
Location: Pune / Chennai / Bangalore / Mumbai / Noida (Ahmedabad/Coimbatore - 2nd priority) (Hybrid)
Notice period: Immediate to September (serving notice) only
Interview process: L1 & L2 - Virtual
- At least 5+ years of experience in data warehouse, ETL, and BI projects
- Strong hold on Python/PySpark
- Experience with the Snowflake database
- Experience implementing complex stored procedures and standard DWH and ETL concepts (an illustrative stored-procedure sketch follows this listing)
- Proficient in the Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Experience with JIRA
- Basic experience with AWS cloud - S3, Lambda functions
- Experience using GitHub and Jenkins
- Good to have: experience with Kafka
- Good communication and analytical skills
- Additionally, basic knowledge of Netezza, Informatica, and Talend would be good to have

Any references would be greatly appreciated!
Cheers & warm regards,
Kajal Gupta
Lead - Talent Acquisition Specialist
kajal.gupta@prodcon.com
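As an illustration of the stored-procedure skill asked for above, here is a minimal Snowflake Scripting sketch, not the employer's actual code. The procedure name and the rpt_daily_sales and fct_sales tables are hypothetical placeholders.

    -- Hypothetical Snowflake Scripting procedure: rebuild a reporting table
    -- and return how many rows were loaded.
    CREATE OR REPLACE PROCEDURE refresh_daily_sales()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    DECLARE
      rows_loaded INTEGER;
    BEGIN
      TRUNCATE TABLE rpt_daily_sales;
      INSERT INTO rpt_daily_sales (sale_date, total_amount)
        SELECT sale_date, SUM(amount)
          FROM fct_sales
         GROUP BY sale_date;
      rows_loaded := SQLROWCOUNT;   -- rows affected by the last DML statement
      RETURN 'Loaded ' || rows_loaded || ' rows';
    END;
    $$;

    CALL refresh_daily_sales();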

Posted 3 weeks ago

Apply

10.0 - 15.0 years

15 - 25 Lacs

nagpur

Work from Office

Description: Healthcare

Requirements: We are looking for a highly skilled Snowflake and ETL expert with 12 to 15 years of experience. The ideal candidate will be responsible for designing, developing, and managing our data warehouse on the Snowflake platform and will be a key player in ensuring our data infrastructure is efficient, scalable, and secure.

Technical Skills:
- Proficiency in SQL, with a deep understanding of advanced functions and query optimization
- Strong knowledge of Snowflake-specific features such as Snowpipe, Time Travel, Zero-Copy Cloning, and Streams (an illustrative sketch of these features follows this listing)
- Experience with ETL/ELT tools (e.g., Fivetran, Informatica) and orchestration tools (e.g., Airflow)
- Proficiency in a scripting language such as Python
- Proficiency with cloud platforms (AWS, Azure, or GCP)

Soft Skills:
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities
- Detail-oriented, with a commitment to producing high-quality work

Preferred Qualifications:
- Certifications: Snowflake Certified Advanced Administrator or Snowflake Certified Data Engineer
- Tools: Experience with dbt (data build tool) for data transformation
- Big Data: Knowledge of big data technologies (e.g., Hadoop, Spark)
- Strong communication and interpersonal skills, with the ability to interact directly and effectively with business stakeholders at all levels
- Ability to work independently and collaboratively in a fast-paced environment
- This role requires an on-site presence in Australia, so a visa is necessary

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: Proven experience (12+ years) working with data warehousing solutions, with at least 3-5 years of hands-on experience on the Snowflake Data Cloud.

Job Responsibilities:
- Design and Development: Lead the design and implementation of data warehousing solutions on Snowflake, including creating and optimizing schemas, tables, and views
- ETL/ELT Processes: Develop and manage complex ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) pipelines to ingest data from various sources into Snowflake
- Performance Tuning: Monitor, troubleshoot, and optimize query and data-load performance, including fine-tuning virtual warehouses, optimizing SQL queries, and managing clustering keys
- Security and Governance: Implement and maintain data security policies, roles, and access controls within Snowflake; ensure compliance with data governance standards
- Data Modeling: Collaborate with business stakeholders and data analysts to understand data requirements and translate them into effective data models
- Automation: Automate data pipelines and administrative tasks using scripting languages like Python or tools like dbt (data build tool)
- Documentation: Create and maintain comprehensive documentation for all Snowflake-related processes, data models, and configurations

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with your colleagues over a game, plus discounts at popular stores and restaurants!
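For reference, here is a brief illustrative sketch of the Snowflake features named in the technical skills above (Time Travel, Zero-Copy Cloning, Streams, clustering keys, and access controls). It is a generic example, not part of the posting; all object names and the query ID are placeholders.

    -- Time Travel: query a table as it looked one hour ago, or before a given statement.
    SELECT * FROM orders AT (OFFSET => -3600);
    SELECT * FROM orders BEFORE (STATEMENT => '01a2b3c4-0000-0000-0000-000000000000');  -- placeholder query ID

    -- Zero-Copy Cloning: create an instant copy for development or testing.
    CREATE OR REPLACE TABLE orders_dev CLONE orders;

    -- Streams: capture changes on a table for incremental processing.
    CREATE OR REPLACE STREAM orders_changes ON TABLE orders;

    -- Performance tuning: define a clustering key and resize a virtual warehouse.
    ALTER TABLE orders CLUSTER BY (order_date);
    ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE';

    -- Security and governance: role-based access control.
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA sales_db.public TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE analyst_role;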

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies