174 SnowSQL Jobs - Page 7

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the source job portal.

3 - 8 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 8 yrs. Work location: Bangalore/Hyderabad/Bhubaneswar/Pune. Required skills: Snowflake. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 4 months ago

Apply

3 - 6 years

10 - 20 Lacs

Hyderabad, Bangalore/Bengaluru

Work from Office

Snowflake Developer

Posted 4 months ago

Apply

5 - 7 years

5 - 15 Lacs

Pune, Chennai, Bengaluru

Work from Office

Experience: 5+ years of relevant experience.

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery: Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions. Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards. Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment. Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise: Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes. Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation. Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed. Ensure data security and implement appropriate access controls within the Snowflake environment. Monitor and optimize the performance of Snowflake queries and data pipelines. Integrate PySpark with Snowflake for data ingestion and processing, applying PySpark best practices and performance tuning techniques, with experience across Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing: Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques. Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake. Work with both structured and semi-structured data, including JSON and XML. Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification: Actively participate in requirement-gathering sessions with business users and stakeholders. Translate business requirements into clear and concise technical specifications and design documents. Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs. Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication: Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems. Communicate effectively with both technical and non-technical stakeholders, and provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement: Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes. Stay up to date with the latest Snowflake features and industry trends. Identify opportunities for process improvement and optimization.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake. Strong proficiency in SQL and experience working with large datasets. Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas). Experience in designing and developing ETL or ELT pipelines. Proven ability to gather and document business and technical requirements. Excellent communication, interpersonal, and problem-solving skills. Snowflake certifications (e.g., SnowPro Core) are a plus.
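For context on the Stream/Task pattern this role references, here is a minimal Snowflake SQL sketch; all object names (raw_orders, orders_dim, transform_wh) are illustrative, not from the posting.

```sql
-- A stream tracks row-level changes on a staging table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- A task runs on a schedule, but only when the stream has new data,
-- and merges the changes into a downstream dimension table.
CREATE OR REPLACE TASK load_orders_dim
  WAREHOUSE = transform_wh          -- hypothetical warehouse
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  MERGE INTO orders_dim d
  USING (
    SELECT * FROM raw_orders_stream
    WHERE METADATA$ACTION = 'INSERT'  -- consume only inserted rows here
  ) s
  ON d.order_id = s.order_id
  WHEN MATCHED THEN UPDATE
    SET d.amount = s.amount, d.updated_at = CURRENT_TIMESTAMP()
  WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
    VALUES (s.order_id, s.amount, CURRENT_TIMESTAMP());

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_orders_dim RESUME;
```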

Posted 4 months ago

Apply

8 - 13 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV with nandinis@suntechnologies.com ASAP, along with the following details: Current CTC, Expected CTC, Notice Period, Current Location, Are you serving notice/immediate, Experience in Snowflake, Experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving/immediate candidates can apply. Interview process: 1 round (virtual) + final round (F2F). Please note: WFO (Work From Office; no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2. Preferred skills: SSIR, SSIS, Informatica, Shell Scripting. Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd., nandinis@suntechnologies.com, www.suntechnologies.com

Posted 4 months ago

Apply

3 - 8 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV with nandinis@suntechnologies.com ASAP, along with the following details: Current CTC, Expected CTC, Notice Period, Current Location, Are you serving notice/immediate, Experience in Snowflake, Experience in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving/immediate candidates can apply. Interview process: 2 rounds (virtual) + final round (F2F). Please note: WFO (Work From Office; no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2. Preferred skills: SSIR, SSIS, Informatica, Shell Scripting. Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd., nandinis@suntechnologies.com, www.suntechnologies.com

Posted 4 months ago

Apply

3 - 8 years

6 - 16 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV with nandinis@suntechnologies.com ASAP. Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food). Please let me know if any of your friends are looking for a job change, and kindly share references. Please note: WFO (Work From Office; no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2. Preferred skills: any ETL tools. Venue details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd., nandinis@suntechnologies.com, www.suntechnologies.com

Posted 4 months ago

Apply

4 - 8 years

0 - 1 Lacs

Mohali

Work from Office

Job Title: Snowflake Developer (4+ years' experience). Location: F, 384, Sector 91 Rd, Phase 8B, Industrial Area, Sector 91, Sahibzada Ajit Singh Nagar, Punjab 160055. Job Type: Full-time (in-house).

Job Overview: We are looking for an experienced Snowflake Developer with 4+ years of hands-on experience in Snowflake Data Warehouse and related tools. You will be responsible for building, managing, and optimizing Snowflake data pipelines, assisting in data integration, and contributing to the overall data architecture. The ideal candidate should have a strong understanding of data modeling and ETL processes, and experience working with cloud-based data platforms.

Responsibilities: Design, develop, and maintain Snowflake data warehouses. Create and manage Snowflake schemas, tables, views, and materialized views. Implement ETL processes to integrate data from various sources into Snowflake. Optimize query performance and data storage in Snowflake. Work with stakeholders to define data requirements and provide technical solutions. Collaborate with data engineers, data scientists, and analysts to build efficient data pipelines. Monitor and troubleshoot performance issues in Snowflake environments. Automate repetitive data processes and report-generation tasks. Ensure data integrity, security, and compliance with data governance policies. Assist in data migration and platform upgrades.

Required Skills: 4+ years of experience working with Snowflake Data Warehouse. Proficient in SQL, SnowSQL, and ETL processes. Strong experience in data modeling and schema design in Snowflake. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with data pipelines, data lakes, and data integration tools. Experience in query optimization and performance tuning in Snowflake. Understanding of data governance and best practices. Strong knowledge of data security and privacy policies in a cloud environment. Experience with tools like dbt, Airflow, or similar orchestration tools is a plus.

Salary: No bar for deserving candidates. Location: Mohali, Punjab (work from office). Shift: Night shift. Other benefits: 5-day work week, US-based work culture and environment, indoor and outdoor events, paid leaves, health insurance, employee engagement activities (month-end and festival celebrations, team outings, birthday celebrations), gaming and sports area. Please comment/DM to know more. You may also e-mail your resume to me at priyankaaggarwal@sourcemash.com
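As a rough illustration of the schema, ingestion, and materialized-view work described above, here is a hedged Snowflake SQL sketch. The stage URL and all object names are placeholders; a real S3 stage would also need credentials or a storage integration, and materialized views assume Enterprise Edition.

```sql
-- Placeholder external stage; credentials/storage integration omitted.
CREATE OR REPLACE STAGE sales_stage
  URL = 's3://example-bucket/sales/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

CREATE OR REPLACE TABLE sales_raw (
  sale_id NUMBER,
  region  STRING,
  amount  NUMBER(12, 2),
  sold_at TIMESTAMP_NTZ
);

-- Bulk-load whatever files are staged; bad files are skipped, not fatal.
COPY INTO sales_raw FROM @sales_stage ON_ERROR = 'SKIP_FILE';

-- A materialized view precomputes a common aggregate for BI queries.
CREATE OR REPLACE MATERIALIZED VIEW sales_by_region AS
  SELECT region, SUM(amount) AS total_amount
  FROM sales_raw
  GROUP BY region;
```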

Posted 4 months ago

Apply

3 - 8 years

15 - 25 Lacs

Bhubaneshwar, Bengaluru, Hyderabad

Hybrid

Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 10 yrs. Work location: Bangalore/Bhubaneswar/Hyderabad. Required skills: Snowflake developer, Snowpipe, SQL. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.

Posted 4 months ago

Apply

3 - 8 years

15 - 25 Lacs

Bengaluru, Hyderabad, Noida

Hybrid

Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 10 yrs. Work location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi. Required skills: Snowflake developer, Snowpipe. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.

Posted 4 months ago

Apply

6 - 8 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Must-Have Skills: 6+ years of experience in designing and implementing fully operational, production-grade, large-scale data solutions on Snowflake Data Warehouse. Expertise in Snowflake data modeling, ELT using Snowpipe, implementing stored procedures, and standard DWH and ETL concepts. Experience with data security and data access controls and design in Snowflake. Expertise in Snowflake concepts like setting up resource monitors, RBAC controls, scalable virtual warehouses, and SQL performance tuning. Experience in managing Snowflake environments, including user roles, security, and access control. Experience in handling backup, failover, and disaster recovery strategies. Experience in maintaining a Snowflake repository and version control for production and test environments. Good to have: Snowflake Cloud Data Warehouse Architect certification; experience building solutions on Snowflake using a combination of Python and PySpark with SnowSQL; AWS Cloud experience.

Professional Skills: Solid written, verbal, and presentation communication skills. Strong team and individual player. Maintains composure in all types of situations and is collaborative by nature. High standards of professionalism, consistently producing high-quality results. Self-sufficient and independent, requiring very little supervision or intervention. Demonstrates flexibility and openness in bringing creative solutions to address issues.
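A minimal sketch of the resource-monitor, multi-cluster warehouse, and RBAC setup this posting highlights; the quota, sizes, and names are invented for illustration, and multi-cluster warehouses assume Enterprise Edition.

```sql
-- Cap monthly credit spend; suspend assigned warehouses at the limit.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100               -- illustrative quota
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- A scalable multi-cluster virtual warehouse for bursty workloads.
CREATE OR REPLACE WAREHOUSE transform_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3
  AUTO_SUSPEND      = 60
  AUTO_RESUME       = TRUE
  RESOURCE_MONITOR  = monthly_cap;

-- Minimal RBAC: a functional role with read access to one schema.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE  ON WAREHOUSE transform_wh     TO ROLE analyst_role;
GRANT USAGE  ON DATABASE analytics         TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role;
```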

Posted Date not available

Apply

3 - 6 years

13 - 18 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

Roles and Responsibilities: Design, develop, and optimize data pipelines using Snowflake. Implement data models, transformations, and ETL processes. Work with stakeholders to understand requirements and translate them into scalable data solutions. Ensure data quality, performance tuning, and security compliance. Integrate Snowflake with other cloud services and BI tools.

Required Skills: 3-6 years of experience in data engineering or development roles. Strong expertise in Snowflake (warehousing, performance tuning, query optimization). Proficiency in SQL and ETL tools (e.g., Informatica, Talend, dbt). Familiarity with cloud platforms (AWS/Azure/GCP). Good understanding of data modeling and data governance. BE/BTech compulsory, any field.

Nice to Have: Experience with Python or Spark for data processing. Knowledge of CI/CD pipelines for data workflows.
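For a concrete flavor of the modeling and transformation work listed above, a hedged sketch of a star-schema fact load; all table and column names are hypothetical.

```sql
-- Illustrative star-schema load: build a fact table by resolving
-- surrogate keys from conformed dimension tables.
CREATE OR REPLACE TABLE fact_sales AS
SELECT
  s.sale_id,
  d.date_key,
  r.region_key,
  s.amount
FROM sales_raw  s
JOIN dim_date   d ON d.calendar_date = CAST(s.sold_at AS DATE)
JOIN dim_region r ON r.region_name   = s.region;
```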

Posted Date not available

Apply

5 - 8 years

12 - 15 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake; experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe/SnowProc/SnowSQL.
3. Technical lead with a strong development background and 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring; good working knowledge of ETL/ELT tools.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions; work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices; should be willing to work on implementation and support projects; flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of the requirement; ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills; Snowflake certification is preferable.
Contact: Kareem, kareem.mississippiconsultants@gmail.com
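Since the role leans on Snowpipe, a minimal auto-ingest pipe sketch follows; the pipe, table, and stage names are assumptions, and AUTO_INGEST additionally requires S3 event notifications wired to the pipe's queue.

```sql
-- Hypothetical pipe that loads files as they land in an external stage.
CREATE OR REPLACE PIPE sales_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO sales_raw
  FROM @sales_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Useful while troubleshooting: inspect the pipe's current state.
SELECT SYSTEM$PIPE_STATUS('SALES_PIPE');
```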

Posted Date not available

Apply

6 - 11 years

15 - 25 Lacs

Chennai

Hybrid

Lead Snowflake Data Engineer: medallion architecture, data ingestion, duplicate data, Snowsight, SnowSQL. A minimum of 5 years of hands-on Snowflake experience. Proven expertise in query and performance optimization. Strong background in medallion architecture and star schema design. Demonstrated experience building scalable data warehouses (not limited to ingesting data from flat files). SnowPro Core Certification is required; SnowPro Advanced certifications in Data Engineering, Data Analysis, or Architecture are highly desirable.

Snowflake Data Engineer: data ingestion, duplicate data. Should have used ADF as their main ELT tool and be able to demonstrate knowledge of how to transfer data using SnowSQL, materialized views, and tasks. dbt is a bonus.
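One common way to handle the duplicate-data concern in a medallion silver layer is a keep-latest-per-key rule; a hedged sketch with invented bronze/silver table names:

```sql
-- Deduplicate: keep only the most recently loaded row per business key.
CREATE OR REPLACE TABLE silver_orders AS
SELECT *
FROM bronze_orders
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY order_id
  ORDER BY loaded_at DESC
) = 1;
```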

Posted Date not available

Apply

6 - 11 years

20 - 27 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Snowflake Developer. Experience: 4+ years.

The contractors will be required to: Assist in migrating existing Snowflake code (converted by SnowConvert) into dbt (data build tool). Test and validate the migrated code in Snowflake and dbt for functionality and performance. Support the implementation of re-engineered dbt pipelines. Migrate data transformation code for downstream reports to dbt. Migrate OLAP cubes from SQL Server to Snowflake/dbt. Support migration of downstream system integrations to Snowflake/dbt. Perform testing and validation for all migrated objects and integrations.

Desired Skills & Experience: Strong SQL and Python proficiency. Experience with API integrations. Experience with Azure Data Factory. Comprehensive understanding of ELT (Extract, Load, Transform) processes and medallion architecture on big data platforms such as Snowflake, Google BigQuery, Amazon Redshift, etc. Familiarity with dbt and best practices for data modeling and transformation. Familiarity with CI/CD and DevOps for data engineering development. Experience with dashboarding tools such as Power BI, Looker, Qlik, QuickSight, etc. Knowledge of data warehouse migrations and legacy system transitions.

Advantage Zensar: We are a technology consulting and services company with 11,800+ associates in 33 global locations. More than 130 leading enterprises depend on our expertise to be more disruptive, agile, and competitive. We focus on conceptualizing, designing, engineering, marketing, and managing digital products and experiences for high-growth companies looking to disrupt through innovation and velocity. Zensar Technologies is an Equal Employment Opportunity (EEO) and Affirmative Action Employer, encouraging diversity in the workplace. Please be assured that we will consider all qualified applicants fairly, regardless of race, creed, color, ancestry, religion, sex, national origin, citizen status, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status. Zensar is a place where you are free to express yourself in an environment that values individuality, nurtures development, and is mindful of wellbeing. We put our people and customers at the center of everything we do. Our core values include: putting people first, client-centricity, collaboration.
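For readers unfamiliar with the dbt migration work described, here is a sketch of what a migrated transformation might look like as a dbt incremental model; the model and column names are hypothetical, and the {{ ... }} blocks are dbt Jinja that compiles to plain Snowflake SQL.

```sql
-- models/marts/fct_orders.sql (hypothetical dbt incremental model)
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
  order_id,
  customer_id,
  amount,
  updated_at
FROM {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- On incremental runs, process only rows newer than what is loaded.
  WHERE updated_at > (SELECT MAX(updated_at) FROM {{ this }})
{% endif %}
```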

Posted Date not available

Apply

6 - 11 years

17 - 30 Lacs

Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description: Experience in the IT industry. Working experience building productionized data ingestion and processing pipelines in Snowflake. Strong understanding of Snowflake architecture. Fully well-versed in data warehousing concepts. Expertise and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems. Able to create data pipelines for ETL/ELT. Excellent presentation and communication skills, both written and verbal. Ability to problem-solve and architect in an environment with unclear requirements. Able to create high-level and low-level design documents based on requirements. Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud. Awareness of data visualisation tools and methodologies. Works independently on business problems and generates meaningful insights. Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory. Should have experience implementing Snowflake best practices. Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities: Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc. Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit. Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems. Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF). Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage. Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark. Should have some experience with Snowflake RBAC and data security. Should have good experience implementing CDC or SCD type-2. Should have good experience implementing Snowflake best practices. In-depth understanding of data warehouse and ETL concepts and data modelling. Experience in requirement gathering, analysis, design, development, and deployment. Should have experience building data ingestion pipelines. Optimize and tune data pipelines for performance and scalability. Able to communicate with clients and lead a team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you! Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.
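The CDC/SCD type-2 requirement above is usually implemented as an expire-then-insert pattern; a minimal hedged sketch with invented dimension and staging names:

```sql
-- Step 1: expire the current version when a tracked attribute changes.
MERGE INTO dim_customer d
USING stg_customer s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.address <> s.address THEN UPDATE
  SET d.is_current = FALSE,
      d.valid_to   = CURRENT_TIMESTAMP();

-- Step 2: open a new current version for new keys and changed rows
-- (changed rows no longer have a current version after step 1).
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL;
```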

Posted Date not available

Apply

6 - 10 years

4 - 7 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Contract duration: 6 months. Locations: Pune/Bangalore/Hyderabad/Indore.

Responsibilities:
- Must have experience working in Snowflake administration/development in data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning.
- Zero-copy clone and time travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big-data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob Storage.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data-pipeline-related problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and databases.
- Job conductor, scheduler, and monitoring.
- Git repository: creating users and roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
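The zero-copy clone and time travel features named above look like this in practice; a brief hedged sketch with placeholder object names:

```sql
-- Time Travel: query a table as it was 30 minutes (1800 seconds) ago.
SELECT COUNT(*) FROM orders AT(OFFSET => -1800);

-- Recover a dropped table within the Time Travel retention window.
UNDROP TABLE orders_backup;

-- Zero-copy clone: an instant, storage-light copy for dev/test work.
CREATE OR REPLACE DATABASE analytics_dev CLONE analytics;
```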

Posted Date not available

Apply

5 - 10 years

15 - 27 Lacs

Chennai, Bengaluru

Hybrid

Job Summary: We are looking for a highly experienced Senior Lead Data Engineer / Architect specializing in Snowflake data warehousing, with strong skills in Power BI. The ideal candidate will have 8-10 years of experience in data engineering and analytics, with proven leadership in designing, implementing, and optimizing large-scale Snowflake architectures and developing high-impact business intelligence solutions using Power BI.

Key Responsibilities: Lead the end-to-end architecture, design, and implementation of enterprise-grade data solutions on Snowflake. Build and maintain complex data pipelines and advanced ETL/ELT frameworks, leveraging Snowflake's native features for scalability and performance. Optimize Snowflake databases, warehouses, secure data sharing, and storage structures for cost and performance efficiency. Collaborate with data architects and engineers to establish best practices for data governance, data security (RBAC, masking), and lifecycle management in Snowflake. Leverage advanced Snowflake capabilities: Streams, Tasks, Time Travel, Zero-Copy Cloning, Data Sharing, and Materialized Views. Write highly optimized SQL queries, UDFs (user-defined functions), and scripts for large-scale analytics workloads. Mentor junior developers in Snowflake best practices, performance tuning, and troubleshooting. Drive integration between Power BI and Snowflake using native connectors and optimal data modeling. Support data security implementation, including Row-Level Security (RLS) and Column-Level Security (CLS), in both Snowflake and Power BI.

Qualifications: Bachelor's or master's degree in Computer Science, Information Systems, or a related field. 8-10 years of hands-on experience with enterprise data warehousing and analytics. Minimum 4 years of proven experience implementing, optimizing, and maintaining Snowflake environments. Deep expertise in Snowflake architecture, performance tuning, data modeling, security, and advanced features. Expert-level SQL and dbt skills, with the ability to author and optimize complex transformations and queries in Snowflake. Experience with Power BI for large-scale reporting, including DAX scripting and Power Query. Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt) and scripting languages (Python preferred) is highly desirable. Effective communication skills and the ability to lead technical discussions with stakeholders and mentor team members. Strong analytical and troubleshooting abilities, with a track record of delivering data solutions in complex, large-scale environments.
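A hedged sketch of the Snowflake-side RLS and CLS mentioned above; the mapping table, role names, and columns are assumptions for illustration only.

```sql
-- Row-Level Security: a row access policy keyed on a mapping table.
CREATE OR REPLACE ROW ACCESS POLICY region_rls
AS (region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'SYSADMIN'
  OR EXISTS (
    SELECT 1
    FROM security.region_map m          -- hypothetical role-to-region map
    WHERE m.role_name = CURRENT_ROLE()
      AND m.region    = region
  );

ALTER TABLE fact_sales ADD ROW ACCESS POLICY region_rls ON (region);

-- Column-Level Security: mask a sensitive column for most roles.
CREATE OR REPLACE MASKING POLICY email_mask
AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'SYSADMIN' THEN val ELSE '***MASKED***' END;

ALTER TABLE dim_customer MODIFY COLUMN email SET MASKING POLICY email_mask;
```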

Posted Date not available

Apply

6 - 11 years

18 - 33 Lacs

Pune, Chennai, Bengaluru

Work from Office

Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.

Posted Date not available

Apply

4 - 9 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Sr. Snowflake Administrator. Primary skills: Snowflake administration, ETL fundamentals, SQL (basic + advanced), data warehousing, Snowpipe, SnowSQL, PL/SQL, stored procedures.

Job Summary: We are looking for an experienced Snowflake Administrator to join our team. The ideal candidate will be responsible for managing and optimizing our Snowflake environment, ensuring high performance, security, and reliability. This role involves collaboration with data engineering, BI, and infrastructure teams to support data-driven initiatives across the organization.

Key Responsibilities: Administer and maintain the Snowflake data platform, including user access, roles, resource monitors, and security policies. Monitor system performance, usage, and storage; tune and optimize Snowflake performance. Implement and manage Snowflake features such as Snowpipes, Streams, Tasks, and Materialized Views. Handle data replication, sharing, and integration across environments and external systems. Automate routine maintenance tasks and monitor alerts using scripting or orchestration tools. Coordinate with DevOps and infrastructure teams for patching, upgrades, and performance tuning. Ensure compliance with data governance, privacy, and security standards. Troubleshoot and resolve issues related to Snowflake and data access. Support ETL/ELT pipelines and collaborate with data engineering teams on efficient data workflows. Provide technical guidance and support to development and analytics teams using Snowflake.

Required Skills & Qualifications: Experience as a Snowflake Administrator or in a similar role. Strong understanding of Snowflake architecture, security, and administration. Strong SQL and performance-tuning skills. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with data integration tools (e.g., Fivetran) is a plus. Experience in managing Snowflake accounts, warehouses, and multi-cluster configurations. Solid understanding of data governance, security policies, and compliance requirements. Strong communication and collaboration skills. Snowflake certification (preferred but not mandatory).

Nice to Have: Experience in CI/CD for data infrastructure. Exposure to BI tools such as Power BI or Tableau. Experience working in Agile environments.
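Day-to-day administration of the kind described often starts from the shared ACCOUNT_USAGE views; two illustrative monitoring queries follow (note that this data lags real time by up to a couple of hours):

```sql
-- Top credit-consuming warehouses over the last 7 days.
SELECT warehouse_name, SUM(credits_used) AS credits
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY credits DESC;

-- Longest-running queries in the last day: a first stop for triage.
SELECT query_id, user_name, warehouse_name,
       total_elapsed_time / 1000 AS elapsed_seconds
FROM snowflake.account_usage.query_history
WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 20;
```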

Posted Date not available

Apply

7 - 12 years

17 - 32 Lacs

Chennai, Bengaluru

Hybrid

SQL/PL-SQL features like built-in functions, analytical functions, cursors, cursor variables, native dynamic SQL, and bulk binding techniques. Snowflake cloud data platform, including Snowflake utilities like SnowSQL and Snowpipe; data loading; cloud (AWS); data ETL.

Posted Date not available

Apply

10 - 15 years

15 - 30 Lacs

Chennai, Bengaluru

Hybrid

PL/SQL experience in creating database objects like tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions & user-defined data types; built-in functions, analytical functions, cursors, cursor variables, native dynamic SQL; RDBMS; Snowflake cloud.

Posted Date not available

Apply

10 - 12 years

30 - 40 Lacs

Hyderabad

Work from Office

Job Title: Snowflake Architect. Location: Hyderabad. Shift: 10:00 AM-7:00 PM IST.

Job Summary: We are seeking a seasoned and strategic Snowflake Architect to lead the design and evolution of our enterprise data platform. This is an onsite role based in Hyderabad, requiring deep technical expertise in Snowflake, cloud ecosystems, and modern data engineering. You will play a pivotal role in defining architectural standards, guiding implementation, and ensuring performance, scalability, and governance across data platforms.

Key Responsibilities: Own the end-to-end Snowflake architecture, ensuring scalability, security, cost-efficiency, and performance across the enterprise data platform. Define the data platform strategy, including architectural blueprints, best practices, data models, and integration standards. Design and oversee implementation of complex data pipelines, ELT frameworks, and real-time/near-real-time ingestion solutions using Snowflake and the modern data stack. Collaborate with global stakeholders across engineering, product, analytics, and business teams to translate requirements into robust, scalable architectures. Conduct architecture reviews, performance assessments, and proofs of concept (POCs) for data initiatives and new tools. Lead Snowflake optimization efforts, including warehouse sizing, query tuning, partitioning strategies, materialized views, and data lifecycle management. Ensure governance and compliance by enforcing architectural and security standards, including access controls, encryption, audit trails, and data retention. Guide and mentor engineers and analysts across teams, fostering a strong architectural mindset and technical rigor. Evaluate emerging tools, frameworks, and trends in the Snowflake and cloud data ecosystem to ensure we remain ahead of the curve.

Required Skills: Expert-level proficiency in Snowflake, including advanced features: multi-cluster warehouses, Snowpipe, external tables, data sharing, materialized views, and RBAC/ABAC. Strong experience in enterprise data architecture, including dimensional, normalized, and data vault modeling. Deep understanding of data engineering pipelines (ETL/ELT), streaming ingestion, and orchestration tools like Airflow, dbt, Dagster, or equivalent. Hands-on skills in SQL and Python, with a proven ability to design performant, production-ready solutions. Experience with cloud-native platforms (AWS, Azure, or GCP), especially around cost governance, networking, and data security in Snowflake environments. Familiarity with IaC tools like Terraform, and with building and maintaining CI/CD pipelines for data workflows. Knowledge of data governance frameworks, data cataloging, lineage tracking, and compliance standards such as HIPAA, GDPR, and SOC 2. Proven leadership in architecture reviews, capacity planning, performance tuning, and mentoring delivery teams.

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 10+ years of experience in data engineering or architecture roles, including 3-5 years in an architect-level role. Relevant certifications in Snowflake, AWS, Azure, dbt, or other modern data technologies preferred.
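The partitioning-strategy and query-tuning duties above map to Snowflake clustering; a short hedged sketch on a hypothetical fact table:

```sql
-- Cluster a large fact table on its most common filter columns
-- so micro-partition pruning can skip unrelated data.
ALTER TABLE fact_sales CLUSTER BY (sold_date, region);

-- Check how well the table is clustered for those keys.
SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_SALES', '(sold_date, region)');
```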

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies