
197 Snowpipe Jobs - Page 8

JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

4 - 9 years

8 - 18 Lacs

Bhubaneswar, Pune, Bengaluru

Work from Office

Required Skillset: Snowflake architecture, Snowpipe, data cloning, time travel, SQL, data modeling.

Must Have: Deep understanding of cloud data warehousing and expertise in designing, developing, and implementing data solutions using Snowflake.

Responsibilities:
- Design, develop, and optimize data models and ETL workflows for Snowflake.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Implement data integration solutions using Snowflake, including data ingestion from various sources.
- Write efficient SQL queries to analyze large datasets and improve performance.
- Monitor and troubleshoot Snowflake performance issues, offering solutions and enhancements.
- Ensure data quality, consistency, and governance by implementing best practices.
- Develop and maintain documentation for data architecture, processes, and data governance.
- Stay current with Snowflake features and industry trends to suggest improvements and innovations.

Nice to Have:
- Snowflake certification (SnowPro Core or Advanced).
- Knowledge of data visualization tools (e.g., Tableau, Power BI).
- Experience with Agile methodologies and working in an Agile team environment.

Expectations from the Role:
- Customer-centricity: work closely with client teams to understand project requirements and translate them into technical design; experience working in Scrum or with Scrum teams.
- Internal collaboration: work with project teams to guide the end-to-end project lifecycle and resolve technical queries; work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
- Soft skills: good communication skills; ability to interact with various internal groups and CoEs.
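For applicants brushing up on the features this role names (Snowpipe, data cloning, and time travel), a minimal Snowflake SQL sketch — all object names here are illustrative, not taken from the posting:

```sql
-- Snowpipe: auto-ingest files as they land in an external stage
CREATE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.orders
  FROM @raw.orders_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Zero-copy clone: instant, storage-free copy for dev/test work
CREATE TABLE raw.orders_dev CLONE raw.orders;

-- Time Travel: query the table as it looked one hour ago
SELECT COUNT(*) FROM raw.orders AT (OFFSET => -3600);
```

Interviews for roles like this commonly probe exactly these three statements, since together they cover ingestion, environment management, and point-in-time recovery.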

Posted 4 months ago

Apply

8 - 13 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly confirm your interest and share your updated CV to nandinis@suntechnologies.com ASAP with the following details: current CTC, expected CTC, notice period, current location, whether you are serving notice/immediate, experience in Snowflake, and experience in Matillion.

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Only serving/immediate candidates can apply. Please also share references if any of your friends are looking for a job change.

Interview process: 1 round (virtual) + final round (F2F).
Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehouse, Python, Matillion, AWS S3, EC2.
Preferred skills: SSIR, SSIS, Informatica, shell scripting.

Venue: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com

Posted 4 months ago

Apply

3 - 8 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly confirm your interest and share your updated CV to nandinis@suntechnologies.com ASAP with the following details: current CTC, expected CTC, notice period, current location, whether you are serving notice/immediate, experience in Snowflake, and experience in Matillion.

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Only serving/immediate candidates can apply. Please also share references if any of your friends are looking for a job change.

Interview process: 2 rounds (virtual) + final round (F2F).
Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehouse, Python, Matillion, AWS S3, EC2.
Preferred skills: SSIR, SSIS, Informatica, shell scripting.

Venue: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com

Posted 4 months ago

Apply

3 - 8 years

6 - 16 Lacs

Bengaluru

Work from Office

Hi, Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly confirm your interest and share your updated CV to nandinis@suntechnologies.com ASAP.

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please also share references if any of your friends are looking for a job change.
Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, data ingestion, data modeling, data warehouse, AWS S3, EC2.
Preferred skills: any ETL tools.

Venue: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com

Posted 4 months ago

Apply

3 - 8 years

15 - 25 Lacs

Bhubaneswar, Bengaluru, Hyderabad

Hybrid

Warm greetings from SP Staffing!

Role: Snowflake Developer
Experience required: 3 to 10 yrs
Work location: Bangalore/Bhubaneswar/Hyderabad
Required skills: Snowflake development, Snowpipe, SQL

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp).

Posted 4 months ago

Apply

3 - 8 years

15 - 25 Lacs

Bengaluru, Hyderabad, Noida

Hybrid

Warm greetings from SP Staffing!

Role: Snowflake Developer
Experience required: 3 to 10 yrs
Work location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi
Required skills: Snowflake development, Snowpipe

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp).

Posted 4 months ago

Apply

6 - 10 years

9 - 13 Lacs

Mumbai, Bengaluru

Work from Office

Job Title: Snowflake Developer with Oracle GoldenGate / Data Engineer

About Oracle FSGIU - Finergy: The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Responsibilities:
- Snowflake data modeling & architecture: design and implement scalable Snowflake data models using best practices such as the Snowflake Data Vault methodology.
- Real-time data replication & ingestion: use Oracle GoldenGate for Big Data to manage real-time data streaming and optimize Snowpipe for automated data ingestion.
- Cloud integration & management: work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions.
- Data sharing & security: implement SnowShare for data sharing and enforce security measures such as role-based access control (RBAC), data masking, and encryption.
- CI/CD implementation: develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes.
- Collaboration & troubleshooting: partner with cross-functional teams to address data-related challenges and optimize performance.
- Documentation & best practices: maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations.
- Performance optimization: continuously monitor and enhance the efficiency of Snowflake queries and data pipelines.

Mandatory Skills:
- 4 years of experience as a Data Engineer.
- Strong expertise in Snowflake architecture, data modeling, and query optimization.
- Proficiency in SQL for writing and optimizing complex queries.
- Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication.
- Knowledge of Snowpipe for automated data ingestion.
- Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake.
- Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows.
- Working knowledge of the Snowflake Data Vault methodology.

Good to Have Skills:
- Exposure to Databricks for data processing and analytics.
- Knowledge of Python or Scala for data engineering tasks.
- Familiarity with Terraform or CloudFormation for infrastructure as code (IaC).
- Experience in data governance and compliance best practices.
- Understanding of ML and AI integration with data pipelines.

Self-Test Questions:
- Do I have hands-on experience in designing and optimizing Snowflake data models?
- Can I confidently set up and manage real-time data replication using Oracle GoldenGate?
- Have I worked with Snowpipe to automate data ingestion processes?
- Am I proficient in SQL and capable of writing optimized queries in Snowflake?
- Do I have experience integrating Snowflake with AWS cloud services?
- Have I implemented CI/CD pipelines for Snowflake development?
- Can I troubleshoot performance issues in Snowflake and optimize queries effectively?
- Have I documented data engineering processes and best practices for team collaboration?
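The security measures this role lists (RBAC, data masking) reduce to a handful of Snowflake SQL statements. A minimal sketch — role, schema, and column names are illustrative:

```sql
-- RBAC: grant read access through a role rather than to users directly
CREATE ROLE analyst;
GRANT USAGE ON DATABASE fin TO ROLE analyst;
GRANT USAGE ON SCHEMA fin.core TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA fin.core TO ROLE analyst;

-- Dynamic data masking: hide emails from all but privileged roles
CREATE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

ALTER TABLE fin.core.customers
  MODIFY COLUMN email SET MASKING POLICY mask_email;
```

The masking policy is evaluated per query, so the same table serves masked and unmasked results depending on the caller's active role.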

Posted Date not available

Apply

3 - 6 years

13 - 18 Lacs

Pune, Bengaluru, Delhi/NCR

Work from Office

Roles and Responsibilities:
- Design, develop, and optimize data pipelines using Snowflake.
- Implement data models, transformations, and ETL processes.
- Work with stakeholders to understand requirements and translate them into scalable data solutions.
- Ensure data quality, performance tuning, and security compliance.
- Integrate Snowflake with other cloud services and BI tools.

Required Skills:
- 3-6 years of experience in data engineering or development roles.
- Strong expertise in Snowflake (warehousing, performance tuning, query optimization).
- Proficiency in SQL and ETL tools (e.g., Informatica, Talend, dbt).
- Familiarity with cloud platforms (AWS/Azure/GCP).
- Good understanding of data modeling and data governance.
- BE/BTech (any field) compulsory.

Nice to Have:
- Experience with Python or Spark for data processing.
- Knowledge of CI/CD pipelines for data workflows.

Posted Date not available

Apply

6 - 11 years

20 - 27 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Snowflake Developer
Experience: 4+ years

The contractors will be required to:
- Assist in migrating existing Snowflake code (converted by SnowConvert) into dbt (data build tool).
- Test and validate the migrated code in Snowflake and dbt for functionality and performance.
- Support the implementation of re-engineered dbt pipelines.
- Migrate data transformation code for downstream reports to dbt.
- Migrate OLAP cubes from SQL Server to Snowflake/dbt.
- Support migration of downstream system integrations to Snowflake/dbt.
- Perform testing and validation for all migrated objects and integrations.

Desired Skills & Experience:
- Strong SQL and Python proficiency.
- Experience with API integrations.
- Experience with Azure Data Factory.
- Comprehensive understanding of ELT (Extract, Load, Transform) processes and medallion architecture on big data platforms such as Snowflake, Google BigQuery, and Amazon Redshift.
- Familiarity with dbt and best practices for data modeling and transformation.
- Familiarity with CI/CD and DevOps for data engineering development.
- Experience with dashboarding tools such as Power BI, Looker, Qlik, and QuickSight.
- Knowledge of data warehouse migrations and legacy system transitions.

Advantage Zensar: We are a technology consulting and services company with 11,800+ associates in 33 global locations. More than 130 leading enterprises depend on our expertise to be more disruptive, agile, and competitive. We focus on conceptualizing, designing, engineering, marketing, and managing digital products and experiences for high-growth companies looking to disrupt through innovation and velocity.

Zensar Technologies is an Equal Employment Opportunity (EEO) and Affirmative Action employer, encouraging diversity in the workplace. Please be assured that we will consider all qualified applicants fairly, regardless of race, creed, color, ancestry, religion, sex, national origin, citizen status, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status. Zensar is a place where you are free to express yourself in an environment that values individuality, nurtures development, and is mindful of wellbeing. We put our people and customers at the center of everything we do. Our core values include: putting people first, client-centricity, and collaboration.

Posted Date not available

Apply

6 - 11 years

17 - 30 Lacs

Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture; fully versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and the integration of Snowflake with other data processing systems.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have experience/knowledge of Snowpark, Streamlit, or GenAI (not mandatory).
- Experience implementing Snowflake best practices; Snowflake SnowPro Core Certification is an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, and working with offshore teams.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in Python, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD Type 2.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment, including building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have deployment experience with CI/CD tools and repositories such as Azure Repos and GitHub.

Qualifications we seek in you! Minimum qualifications:
- B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with good IT experience relevant to the Snowflake Data Engineer role.
- Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.
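The Streams-and-Tasks combination this listing names is the standard Snowflake pattern for CDC-style incremental loads. A minimal sketch — object names, schedule, and columns are illustrative:

```sql
-- Stream: captures row-level changes (CDC) on a staging table
CREATE STREAM stg.orders_stream ON TABLE stg.orders;

-- Task: runs on a schedule, but only when the stream has data,
-- merging captured changes into the downstream table
CREATE TASK stg.merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('stg.orders_stream')
AS
  MERGE INTO core.orders t
  USING stg.orders_stream s ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK stg.merge_orders RESUME;
```

Consuming the stream inside the MERGE advances its offset automatically, so each run processes only the changes since the previous run.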

Posted Date not available

Apply

6 - 10 years

4 - 7 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Contract duration: 6 months
Locations: Pune/Bangalore/Hyderabad/Indore

Responsibilities:
- Must have experience working in Snowflake administration/development in data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and database.
- Job conductor, scheduler, and monitoring.
- GIT repository: creating users and roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
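The admin-side expertise the ad lists — resource monitors, warehouse sizing — also reduces to a few statements. A sketch with illustrative names and quotas:

```sql
-- Resource monitor: notify at 90% of a monthly credit quota, suspend at 100%
CREATE RESOURCE MONITOR monthly_cap WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Right-size a virtual warehouse and let it sleep when idle
CREATE WAREHOUSE etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;

-- Attach the monitor so the warehouse is governed by the quota
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;
```

AUTO_SUSPEND is in seconds; pairing it with AUTO_RESUME is the usual cost-control default, since Snowflake bills per second of warehouse uptime.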

Posted Date not available

Apply

4 - 8 years

10 - 16 Lacs

Bengaluru

Hybrid

Job Requirements

Responsibilities:
- Manage and optimize Snowflake databases, ensuring data integrity and efficient data retrieval.
- Extract, transform, and load data from various sources into data warehouses and data lakes for analysis and reporting.
- Ensure data quality and integrity by implementing data validation and testing procedures.
- Ensure the security of backend systems, implementing encryption, authentication, and authorization measures.
- Optimize server-side performance for fast response times and efficient resource utilization.
- Conduct code reviews and mentor junior developers to maintain high-quality coding standards.
- Implement and maintain version control using Git for efficient code management.
- Troubleshoot and resolve issues related to backend functionality and performance.
- Design and develop data integration and transformation using ETL tools such as dbt with Snowflake.

Competencies:
- Proven experience as a backend developer with a focus on Snowflake; should be an expert in Snowflake.
- Strong knowledge of database design, optimization, and administration.

Good to have:
- Knowledge of data warehousing concepts, such as dimensional modeling and star and snowflake schemas.
- Familiarity with cloud platforms and services, such as AWS and Azure, and their data-related offerings (e.g., S3).
- Proficiency with version control systems, especially Git.
- Collaboration with business users: gathering requirements, data analysis, data mapping, and documentation.
- Understanding of data modeling concepts and familiarity with Snowflake's data modeling tools and techniques, plus one of the ETL/ELT tools.
- Experience with Agile solution development.

Candidate Profile:
- Bachelor's degree in computer science, information technology, or a related field.
- 3-5 years of overall experience, with a minimum of 4 years in backend development.

Posted Date not available

Apply

6 - 11 years

6 - 14 Lacs

Pune

Hybrid

Project Role Description: A Snowflake Developer will be responsible for designing and developing data solutions within the Snowflake cloud data platform using Snowpark, Apache Airflow, Data Build Tool (dbt), and Fivetran.

Work location: Pune/Remote.
Education: Graduate or post-graduate in Computer Science/Information Technology/Engineering.

Job Requirements - Must-Have Skills:
- 6 to 11 years of IT experience as a Snowflake Developer.
- Experience in the telecom domain (BSS/OSS).
- Minimum 4+ years of experience on Snowflake is a MUST.
- Strong experience with Snowflake (data modeling, performance tuning, security).
- Proficiency in dbt (Data Build Tool) for data transformation is a MUST (model creation, Jinja templates, macros, and testing).
- Advanced SQL skills are a MUST: writing, debugging, and performance-tuning queries.
- Workflow orchestration proficiency with Apache Airflow is a MUST (developing, scheduling, and monitoring).
- Experience with the integration tool Fivetran is a MUST.
- Experience working with dataframes using Snowpark is a MUST.
- Experience automating data workflows and integrating with Azure DevOps CI/CD pipelines is a MUST.
- Strong Python and Java scripting for data transformation and automation.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the optimizer, Metadata Manager, data sharing, and stored procedures.
- Managing sets of XML, JSON, and CSV from different sources.
- Build, monitor, and optimize ETL and ELT processes with data models; continually review and audit data models for enhancement.
- Hands-on experience with code updates, new code development, and reverse engineering.
- Ownership from start to finish for the allocated project work.
- Client-interaction experience is a must for demonstrating multiple data solutions.
- Snowflake SnowPro certified professionals preferred.
- Regular engagement with teams for status reporting and routine activities.
- Implementation of data streaming solutions from different sources for data migration and transformation.

Soft Skills:
- Hands-on analytical, problem-solving, and debugging skills.
- Ability to work under pressure; flexible to work independently or in a team.
- Excellent communication skills and the ability to present results concisely to technical and non-technical stakeholders.
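For context on the dbt requirement (model creation, Jinja templates, testing), a dbt model is just Jinja-templated Snowflake SQL compiled into a table or view. A minimal sketch — the model and the `stg_orders` source it references are illustrative:

```sql
-- models/daily_revenue.sql: dbt compiles this into a Snowflake table
{{ config(materialized='table') }}

SELECT
    order_date,
    SUM(amount) AS revenue
FROM {{ ref('stg_orders') }}
GROUP BY order_date
```

The `ref()` call is what lets dbt infer the dependency graph and run models in order; column-level tests (e.g. `not_null`, `unique`) are then declared alongside the model in a schema YAML file.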

Posted Date not available

Apply


5 - 10 years

15 - 27 Lacs

Chennai, Bengaluru

Hybrid

Job Summary: We are looking for a highly experienced Senior Lead Data Engineer / Architect specializing in Snowflake data warehousing, with strong skills in Power BI. The ideal candidate will have 8–10 years of experience in data engineering and analytics, with proven leadership in designing, implementing, and optimizing large-scale Snowflake architectures and developing high-impact business intelligence solutions using Power BI.
Key Responsibilities:
Lead the end-to-end architecture, design, and implementation of enterprise-grade data solutions on Snowflake.
Build and maintain complex data pipelines and advanced ETL/ELT frameworks, leveraging Snowflake's native features for scalability and performance.
Optimize Snowflake databases, warehouses, secure data sharing, and storage structures for cost and performance efficiency.
Collaborate with data architects and engineers to establish best practices for data governance, data security (RBAC, masking), and lifecycle management in Snowflake.
Leverage advanced Snowflake capabilities: Streams, Tasks, Time Travel, Zero-Copy Cloning, Data Sharing, and Materialized Views.
Write highly optimized SQL queries, user-defined functions (UDFs), and scripts for large-scale analytics workloads.
Mentor junior developers in Snowflake best practices, performance tuning, and troubleshooting.
Drive integration between Power BI and Snowflake using native connectors and optimal data modeling.
Support data security implementation, including Row-Level Security (RLS) and Column-Level Security (CLS) in both Snowflake and Power BI.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
8–10 years of hands-on experience with enterprise data warehousing and analytics.
Minimum 4 years of proven experience implementing, optimizing, and maintaining Snowflake environments.
Deep expertise in Snowflake architecture, performance tuning, data modeling, security, and advanced features.
Expert-level SQL and dbt skills, with the ability to author and optimize complex transformations and queries in Snowflake.
Experience with Power BI for large-scale reporting, including DAX scripting and Power Query.
Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt) and scripting languages (Python preferred) is highly desirable.
Effective communication skills and the ability to lead technical discussions with stakeholders and mentor team members.
Strong analytical and troubleshooting abilities, with a track record of delivering data solutions in complex, large-scale environments.
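The Streams and Tasks pattern this role calls for can be sketched as Snowflake DDL. A minimal sketch, composed in Python; all object names (raw_orders, orders, ETL_WH) are hypothetical examples, not from the listing:

```python
# Sketch of a Snowflake Streams + Tasks incremental-load pattern.
# All object names (raw_orders, orders_stream, ETL_WH) are hypothetical.

def change_capture_ddl(source_table: str, target_table: str,
                       warehouse: str = "ETL_WH") -> list[str]:
    """Build DDL for a stream on `source_table` and a task that
    appends captured changes to `target_table` every 5 minutes."""
    stream = f"{source_table}_stream"
    return [
        # A stream records inserts/updates/deletes on the source table.
        f"CREATE OR REPLACE STREAM {stream} ON TABLE {source_table};",
        # The task only runs when the stream actually has data.
        f"""CREATE OR REPLACE TASK load_{target_table}
  WAREHOUSE = {warehouse}
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('{stream.upper()}')
AS
  INSERT INTO {target_table} SELECT * FROM {stream};""",
        # Tasks are created suspended; resume to activate.
        f"ALTER TASK load_{target_table} RESUME;",
    ]

for stmt in change_capture_ddl("raw_orders", "orders"):
    print(stmt)
```

Consuming the stream inside the task advances the stream's offset, so each change is processed once; a production version would typically use MERGE rather than a plain INSERT.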

Posted Date not available

Apply

6.0 - 11.0 years

18 - 33 Lacs

Pune, Chennai, Bengaluru

Work from Office


Posted Date not available

Apply

4.0 - 9.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Sr. Snowflake Administrator
Primary skills: Snowflake administration, ETL fundamentals, SQL (basic and advanced), data warehousing, Snowpipe, SnowSQL, PL/SQL, stored procedures.
Job Summary: We are looking for an experienced Snowflake Administrator to join our team. The ideal candidate will be responsible for managing and optimizing our Snowflake environment, ensuring high performance, security, and reliability. This role involves collaboration with data engineering, BI, and infrastructure teams to support data-driven initiatives across the organization.
Key Responsibilities:
Administer and maintain the Snowflake data platform, including user access, roles, resource monitors, and security policies.
Monitor system performance, usage, and storage; tune and optimize Snowflake performance.
Implement and manage Snowflake features such as Snowpipe, Streams, Tasks, and Materialized Views.
Handle data replication, sharing, and integration across environments and external systems.
Automate routine maintenance tasks and monitor alerts using scripting or orchestration tools.
Coordinate with DevOps and Infrastructure teams for patching, upgrades, and performance tuning.
Ensure compliance with data governance, privacy, and security standards.
Troubleshoot and resolve issues related to Snowflake and data access.
Support ETL/ELT pipelines and collaborate with data engineering teams for efficient data workflows.
Provide technical guidance and support to development and analytics teams using Snowflake.
Required Skills & Qualifications:
Experience as a Snowflake Administrator or in a similar role.
Strong understanding of Snowflake architecture, security, and administration.
Strong SQL and performance-tuning skills.
Experience with cloud platforms (AWS, Azure, or GCP).
Familiarity with data integration tools (e.g., Fivetran) is a plus.
Experience in managing Snowflake accounts, warehouses, and multi-cluster configurations.
Solid understanding of data governance, security policies, and compliance requirements.
Strong communication and collaboration skills.
Snowflake certification (preferred but not mandatory).
Nice to Have:
Experience in CI/CD for data infrastructure.
Exposure to BI tools such as Power BI or Tableau.
Experience working in Agile environments.
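The resource-monitor administration this role covers can be sketched as DDL built in Python. The monitor name, quota, and thresholds below are hypothetical examples:

```python
# Sketch of Snowflake resource-monitor administration.
# Monitor/warehouse names and the credit quota are hypothetical.

def resource_monitor_ddl(name: str, credit_quota: int,
                         notify_pct: int = 75, suspend_pct: int = 100) -> str:
    """Build a CREATE RESOURCE MONITOR statement that notifies at
    `notify_pct`% of the monthly credit quota and suspends at `suspend_pct`%."""
    return (
        f"CREATE OR REPLACE RESOURCE MONITOR {name}\n"
        f"  WITH CREDIT_QUOTA = {credit_quota}\n"
        f"  FREQUENCY = MONTHLY\n"
        f"  START_TIMESTAMP = IMMEDIATELY\n"
        f"  TRIGGERS ON {notify_pct} PERCENT DO NOTIFY\n"
        f"           ON {suspend_pct} PERCENT DO SUSPEND;"
    )

print(resource_monitor_ddl("etl_monitor", credit_quota=500))
# The monitor is then attached to a warehouse, e.g.:
# ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor;
```

Suspending at 100% caps spend hard; a gentler variant uses DO SUSPEND_IMMEDIATE only for emergencies since it cancels running queries.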

Posted Date not available

Apply

6.0 - 11.0 years

15 - 17 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe, SnowSQL, and Snowflake stored procedures.
3. Technical lead with a strong development background and 2–3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and varied data formats, including error handling and monitoring. Good working knowledge of the ELT transformation tool dbt.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.
Greetings from Mississippi Consultant LLP! We are a recruitment firm based in Pune with various clients globally. We presently have an opening with one of our clients. Contact: Soniya, soniya05.mississippiconsultants@gmail.com
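The Snowpipe experience in item 2 above centers on a pipe wrapping a COPY INTO statement. A minimal sketch, built in Python; the pipe, table, and stage names are illustrative assumptions:

```python
# Sketch of a Snowpipe auto-ingest definition.
# Pipe, table, and stage names are hypothetical examples.

def snowpipe_ddl(pipe: str, table: str, stage: str,
                 file_format: str = "(TYPE = 'CSV' SKIP_HEADER = 1)") -> str:
    """Build a CREATE PIPE statement that auto-ingests files landing
    on an external stage (e.g. S3, via cloud event notifications)."""
    return (
        f"CREATE OR REPLACE PIPE {pipe}\n"
        f"  AUTO_INGEST = TRUE\n"
        f"AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = {file_format};"
    )

print(snowpipe_ddl("orders_pipe", "raw_orders", "orders_stage"))
```

With AUTO_INGEST = TRUE, Snowflake loads new files as the cloud provider's event notifications arrive, instead of relying on a scheduled batch COPY.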

Posted Date not available

Apply

7.0 - 12.0 years

17 - 32 Lacs

Chennai, Bengaluru

Hybrid

SQL and PL/SQL features such as built-in functions, analytical functions, cursors and cursor variables, native dynamic SQL, and bulk binding techniques. Snowflake cloud data platform, including utilities such as SnowSQL and Snowpipe, data loading, AWS cloud, and ETL.

Posted Date not available

Apply

10.0 - 15.0 years

15 - 30 Lacs

Chennai, Bengaluru

Hybrid

PL/SQL experience in creating database objects such as tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions, and user-defined data types; built-in functions, analytical functions, cursors, cursor variables, and native dynamic SQL; RDBMS and the Snowflake cloud platform.

Posted Date not available

Apply

10.0 - 12.0 years

30 - 40 Lacs

Hyderabad

Work from Office

Job Title: Snowflake Architect
Location: Hyderabad
Shift: 10:00 AM – 7:00 PM IST
Job Summary: We are seeking a seasoned and strategic Snowflake Architect to lead the design and evolution of our enterprise data platform. This is an onsite role based in Hyderabad, requiring deep technical expertise in Snowflake, cloud ecosystems, and modern data engineering. You will play a pivotal role in defining architectural standards, guiding implementation, and ensuring performance, scalability, and governance across data platforms.
Key Responsibilities:
Own the end-to-end Snowflake architecture, ensuring scalability, security, cost-efficiency, and performance across the enterprise data platform.
Define the data platform strategy, including architectural blueprints, best practices, data models, and integration standards.
Design and oversee implementation of complex data pipelines, ELT frameworks, and real-time/near-real-time ingestion solutions using Snowflake and the modern data stack.
Collaborate with global stakeholders across engineering, product, analytics, and business teams to translate requirements into robust, scalable architectures.
Conduct architecture reviews, performance assessments, and proofs of concept (POCs) for data initiatives and new tools.
Lead Snowflake optimization efforts, including warehouse sizing, query tuning, partitioning strategies, materialized views, and data lifecycle management.
Ensure governance and compliance by enforcing architectural and security standards, including access controls, encryption, audit trails, and data retention.
Guide and mentor engineers and analysts across teams, fostering a strong architectural mindset and technical rigor.
Evaluate emerging tools, frameworks, and trends in the Snowflake and cloud data ecosystem to stay ahead of the curve.
Required Skills:
Expert-level proficiency in Snowflake, including advanced features: multi-cluster warehouses, Snowpipe, external tables, data sharing, materialized views, and RBAC/ABAC.
Strong experience in enterprise data architecture, including dimensional, normalized, and data vault modeling.
Deep understanding of data engineering pipelines (ETL/ELT), streaming ingestion, and orchestration tools such as Airflow, dbt, or Dagster.
Hands-on skills in SQL and Python, with proven ability to design performant, production-ready solutions.
Experience with cloud-native platforms (AWS, Azure, or GCP), especially around cost governance, networking, and data security in Snowflake environments.
Familiarity with IaC tools like Terraform, and building/maintaining CI/CD pipelines for data workflows.
Knowledge of data governance frameworks, data cataloging, lineage tracking, and compliance standards such as HIPAA, GDPR, and SOC 2.
Proven leadership in architecture reviews, capacity planning, performance tuning, and mentoring delivery teams.
Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
10+ years of experience in data engineering or architecture roles, including 3–5 years in an architect-level role.
Relevant certifications in Snowflake, AWS, Azure, dbt, or other modern data technologies preferred.
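The access-control and masking governance this architect role describes can be sketched as a Snowflake masking policy, composed in Python. The policy, role, table, and column names are hypothetical examples:

```python
# Sketch of column-level masking governed by role (RBAC).
# Policy, role, table, and column names are hypothetical.

def masking_policy_ddl(policy: str, privileged_role: str) -> list[str]:
    """Build a masking policy that reveals a string column only to
    `privileged_role`, plus a statement applying it to a column."""
    return [
        f"""CREATE OR REPLACE MASKING POLICY {policy} AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = '{privileged_role}' THEN val
    ELSE '***MASKED***'
  END;""",
        # Attach the policy to a specific column.
        f"ALTER TABLE customers MODIFY COLUMN email "
        f"SET MASKING POLICY {policy};",
    ]

for stmt in masking_policy_ddl("email_mask", "PII_ADMIN"):
    print(stmt)
```

Because the policy is attached to the column rather than baked into views, every query path (BI tools, ad-hoc SQL, data sharing) sees the same masked result unless the session role is privileged.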

Posted Date not available

Apply