174 SnowSQL Jobs - Page 5

JobPe aggregates results for easy application access; you apply directly on the original job portal.

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Job Description: Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will design, implement, and maintain data solutions on Snowflake's cloud data platform, working closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.

Requirements:
- 7+ years of experience in the design and development of data warehouse and data integration projects (SSE / TL level).
- Experience working in an Azure environment.
- Developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake.
- Good understanding of database design concepts: transactional, data mart, data warehouse, etc.
- Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed designs; will also analyze vast data stores and uncover insights.
- Snowflake data engineers will be responsible for architecting and implementing large-scale data intelligence solutions around Snowflake Data Warehouse.
- Solid experience architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is a must.
- Very good articulation skills; flexible and ready to learn new skills.

Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
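For orientation, this is roughly what the "ETL pipelines using Python and Snowflake's SnowSQL" requirement above looks like in practice: a minimal batch-load sketch runnable via the SnowSQL CLI. All stage, format, and table names are hypothetical.

```sql
-- Minimal batch-load sketch (hypothetical names); run e.g. `snowsql -f load_orders.sql`.
CREATE OR REPLACE FILE FORMAT csv_fmt
  TYPE = CSV
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  SKIP_HEADER = 1;

-- External stage on an Azure container (this JD mentions an Azure environment);
-- the storage integration / credentials are omitted here for brevity.
CREATE OR REPLACE STAGE raw_orders_stage
  URL = 'azure://myaccount.blob.core.windows.net/raw/orders/'
  FILE_FORMAT = csv_fmt;

-- Bulk-load the staged files into a target table.
COPY INTO analytics.staging.orders
FROM @raw_orders_stage
ON_ERROR = 'ABORT_STATEMENT';
```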

Posted 2 months ago

Apply

5.0 - 10.0 years

0 - 3 Lacs

Noida

Work from Office

• Act as the data domain expert for Snowflake in a collaborative environment, demonstrating understanding of data management best practices and patterns.
• Design and implement robust data architectures to meet and support business requirements, leveraging Snowflake platform capabilities.
• Develop and enforce data modelling standards and best practices for Snowflake environments.
• Develop, optimize, and maintain Snowflake data warehouses.
• Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
• Ensure data architecture solutions meet performance, security, and scalability requirements.
• Stay current with the latest developments and features in Snowflake and related technologies, continually enhancing our data capabilities.
• Collaborate with cross-functional teams to gather business requirements, translate them into effective data solutions in Snowflake, and provide data-driven insights.
• Stay updated with the latest trends and advancements in data architecture and Snowflake technologies.
• Provide mentorship and guidance to junior data engineers and architects.
• Troubleshoot and resolve data architecture-related issues effectively.

Skills Requirement:
• 5+ years of proven experience as a Data Engineer, with 3+ years as a Data Architect.
• Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
• Experience designing and building manual or auto-ingestion data pipelines using Snowpipe (a minimal sketch follows this listing).
• Design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
• SnowSQL experience in developing stored procedures and writing queries to analyse and transform data.
• Working experience with ETL tools like Fivetran, dbt Labs, and MuleSoft.
• Expertise in Snowflake concepts like setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and Time Travel, and automating them.
• Excellent problem-solving skills and attention to detail.
• Effective communication and collaboration abilities.
• Relevant certifications (e.g., SnowPro Core / Advanced) are a must-have.
• Must have expertise in the AWS, Azure, and Salesforce Platform as a Service (PaaS) models and their integration with Snowflake to load/unload data.
• Strong communication and an exceptional team player with effective problem-solving skills.

Educational Qualification Required:
• Master's degree in Business Management (MBA / PGDM) / Bachelor's degree in Computer Science, Information Technology, or a related field.
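The Snowpipe requirement above typically reduces to a pipe wrapping a COPY statement. A minimal auto-ingest sketch, with hypothetical object names:

```sql
-- Minimal auto-ingest Snowpipe sketch; all object names are hypothetical.
CREATE OR REPLACE PIPE analytics.ingest.orders_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO analytics.staging.orders
FROM @analytics.ingest.raw_orders_stage
FILE_FORMAT = (TYPE = JSON);

-- Cloud storage event notifications (e.g., S3 -> SQS) must be pointed at the
-- queue shown in the pipe's notification_channel column:
SHOW PIPES LIKE 'orders_pipe';
```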

Posted 2 months ago

Apply

8.0 - 13.0 years

12 - 22 Lacs

Hyderabad, Bengaluru

Hybrid

Skills: Snowflake, AWS, SQL, PL/SQL / T-SQL, DWH, Python, PySpark
• Experience with Snowflake utilities, SnowSQL, and Snowpipe; able to administer and monitor the Snowflake computing platform
• Good in cloud computing (AWS)
Notice period: Immediate. Email: sachin@assertivebs.com

Posted 2 months ago

Apply

8.0 - 14.0 years

8 - 14 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Roles & Responsibilities:
- Analyze requirements: Collaborate with stakeholders to scrutinize requirements for new software applications or enhancements, ensuring that the developed software fulfills customer needs.
- Design software solutions: Contribute to detailed design specifications for data models, schemas, views, and stored procedures using Snowflake features such as Time Travel, zero-copy cloning, and secure data sharing (sketched after this listing), steering the development process so the resulting software meets functional and technical demands.
- Develop and deploy scalable software: Write clean, maintainable, well-documented data pipelines using Snowflake SQL, Snowpipe, and other tools to ingest, transform, and deliver data from various sources, ensuring scalability and efficiency, and lead deployment of that code across multiple environments.
- Integrate software components: Seamlessly integrate software components into a fully functional system, ensuring compatibility and interoperability with existing systems for smooth communication and data exchange.
- Perform unit testing: Conduct thorough unit testing of developed queries and components, ensuring data quality and accuracy by implementing data validation, testing, and monitoring frameworks and tools that adhere to quality standards and expected performance levels.
- Debug and troubleshoot: Skillfully debug and troubleshoot software applications, swiftly identifying and resolving issues encountered during development or deployment to ensure uninterrupted operation and minimal downtime for end-users.
- Provide technical support: Offer expert technical support and guidance to end-users by applying Snowflake best practices such as partitioning, clustering, caching, and compression, empowering them to use the software effectively and troubleshoot any issues they encounter.
- Stay updated with technology: Remain abreast of emerging technologies, trends, and best practices in Snowflake and the data domain, integrating relevant advancements into our software solutions.
- Collaborate with the team: Foster effective communication and coordination throughout the software development lifecycle by collaborating with IT team members, data engineers, project managers, and end-users, ensuring a collaborative work environment and successful project delivery.
- Mentor and lead junior developers.
- Document processes: Document processes, procedures, and technical specifications related to software development and deployment, facilitating knowledge sharing within the team and streamlining future development efforts.

Experience Requirements:
- 8-14 years of experience with software development tools, including integrated development environments (IDEs), version control systems (e.g., Git), issue tracking systems (e.g., Jira), DevOps principles, and CI/CD pipelines.
- Experience providing technical support and guidance to end-users during the implementation and deployment of software applications.
- Strong analytical thinking skills to understand complex requirements and design software solutions accordingly.
- Ability to read and understand other developers' code.
- Proficiency in industry-standard testing methodologies and debugging techniques to ensure software quality and to identify and resolve issues.
- Ability to document processes, procedures, and technical specifications related to software development and deployments.
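As a hedged illustration of the Time Travel and zero-copy cloning features named above (all table names hypothetical):

```sql
-- Minimal sketch of zero-copy cloning and Time Travel; names are hypothetical.

-- Instant, storage-efficient copy of a production table for a dev fix:
CREATE TABLE analytics.dev.orders_clone CLONE analytics.prod.orders;

-- Query the table as it looked one hour ago (Time Travel):
SELECT COUNT(*) FROM analytics.prod.orders AT (OFFSET => -3600);

-- Restore an accidentally dropped table within the retention window:
UNDROP TABLE analytics.prod.orders;
```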

Posted 2 months ago

Apply

3.0 - 5.0 years

0 Lacs

, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Snowflake Professionals in the following areas: Snowflake, SnowSQL, PL/SQL, and any ETL tool.

Requirements:
- 3+ years of IT experience in analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures.
- Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting.
- Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing; create and manage automated data pipelines for both batch and streaming data using DBT (a minimal model is sketched after this listing).
- Design and implement data models and schemas to support data warehousing and analytics within Snowflake.
- Write and optimize SQL queries for efficient data retrieval and analysis.
- Deliver robust solutions through query optimization, ensuring data quality.
- Experience writing functions and stored procedures.
- Strong understanding of Data Warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling.
- Analyse and translate functional specifications and user stories into technical specifications.
- Good to have: design/development experience in any ETL tool.
- Good interpersonal skills and experience handling communication and interactions between different teams.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
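A minimal sketch of the DBT pipeline work mentioned above: a dbt incremental model targeting Snowflake. The model, source, and column names are hypothetical.

```sql
-- models/staging/stg_orders.sql — minimal dbt incremental model on Snowflake
-- (model, source, and column names are hypothetical).
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    order_id,
    customer_id,
    order_total,
    updated_at
FROM {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only pull rows newer than what is already loaded.
  WHERE updated_at > (SELECT MAX(updated_at) FROM {{ this }})
{% endif %}
```

Run with `dbt run --select stg_orders`; for streaming-style sources, the same model can sit behind a Snowpipe-fed raw table.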

Posted 2 months ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Snowflake Professionals in the following areas: Snowflake, SnowSQL, PL/SQL, and any ETL tool.

Requirements:
- Minimum 6-7 years of IT experience in analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures.
- Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting.
- Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing; create and manage automated data pipelines for both batch and streaming data using DBT.
- Design and implement data models and schemas to support data warehousing and analytics within Snowflake.
- Write and optimize SQL queries for efficient data retrieval and analysis.
- Deliver robust solutions through query optimization, ensuring data quality.
- Experience writing functions and stored procedures.
- Strong understanding of Data Warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling.
- Analyse and translate functional specifications and user stories into technical specifications.
- Good to have: design/development experience in any ETL tool, like SnapLogic, Informatica, or DataStage.
- Good interpersonal skills and experience handling communication and interactions between different teams.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.

Posted 2 months ago

Apply

2.0 - 5.0 years

3 - 6 Lacs

Ahmedabad

Work from Office

Roles and Responsibilities:
1. Database and Data Warehouse Expertise: Demonstrate excellent understanding of database and data warehouse concepts, with strong proficiency in writing SQL queries.
2. Snowflake Cloud Data Warehouse: Design and implement the Snowflake cloud data warehouse; develop and implement cloud-related architecture and data modeling.
3. Migration Projects: Manage migration projects, specifically migrating from on-prem to Snowflake.
4. Snowflake Capabilities: Utilize comprehensive knowledge of Snowflake capabilities such as Snowpipe, stages, SnowSQL, Streams, and Tasks.
5. Advanced Snowflake Concepts: Implement advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, and zero-copy clone (a minimal sketch follows this listing).
6. Data Migration Expertise: In-depth knowledge of and experience in data migration from RDBMS to the Snowflake cloud data warehouse.
7. Snowflake Feature Deployment: Deploy Snowflake features such as data sharing, events, and lakehouse patterns.
8. Incremental Extraction Loads: Execute incremental extraction loads, both batched and streaming.
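Item 5 above bundles several admin features together; here is a minimal sketch of a resource monitor, warehouse sizing, and basic RBAC grants, with hypothetical names throughout:

```sql
-- Minimal admin sketch: resource monitor, warehouse sizing, RBAC (hypothetical names).
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

CREATE OR REPLACE WAREHOUSE etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60          -- seconds of inactivity before suspending
  AUTO_RESUME = TRUE
  RESOURCE_MONITOR = monthly_cap;

-- Basic RBAC: a read-only role for analysts.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE ON WAREHOUSE etl_wh TO ROLE analyst_ro;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
```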

Posted 2 months ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

The Snowflake Data Specialist will manage projects in Data Warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Snowflake Developer
Experience: 5+ years
Location: Pan India
Work Mode: Work from Office (WFO)

Posted 2 months ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Job Summary
Role & responsibilities: Outline the day-to-day responsibilities for this role.
Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Role: Security Engineer / Analyst
Industry Type: IT Services & Consulting
Department: IT & Information Security
Employment Type: Full Time, Permanent
Role Category: IT Security

Posted 3 months ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Ahmedabad

Hybrid

Key Responsibilities:
- Lead the end-to-end Snowflake platform implementation, including architecture, design, data modeling, and governance.
- Oversee the migration of data and pipelines from legacy platforms to Snowflake, ensuring quality, reliability, and business continuity.
- Design and optimize Snowflake-specific data models, including use of clustering keys, materialized views, Streams, and Tasks.
- Build and manage scalable ELT/ETL pipelines using modern tools and best practices.
- Define and implement standards for Snowflake development, testing, and deployment, including CI/CD automation.
- Collaborate with cross-functional teams including data engineering, analytics, DevOps, and business stakeholders.
- Establish and enforce data security, privacy, and governance policies using Snowflake's native capabilities (see the masking sketch after this listing).
- Monitor and tune system performance and cost efficiency through appropriate warehouse sizing and usage patterns.
- Lead code reviews, technical mentoring, and documentation for Snowflake-related processes.

Required Snowflake Expertise:
- Snowflake Architecture: deep understanding of virtual warehouses, data sharing, multi-cluster warehouses, and zero-copy cloning; ability to enhance the architecture and implement solutions that conform to it.
- Performance Optimization: proficient in tuning queries, clustering, caching, and workload management.
- Data Engineering: experience processing batch and real-time data using Snowflake features such as Snowpipe, Streams & Tasks, stored procedures, and common data ingestion patterns.
- Data Security & Governance: strong experience with RBAC, dynamic data masking, row-level security, and tagging; experience enabling these capabilities in Snowflake and in at least one enterprise product solution.
- Advanced SQL: expertise in writing, analyzing, and performance-optimizing complex SQL queries and transformations, including semi-structured data handling (JSON, XML).
- Cloud Integration: experience with at least one major cloud platform (AWS/GCP/Azure) and services like S3, Lambda, Step Functions, etc.
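On the dynamic data masking point above, a minimal sketch of a masking policy tied to RBAC; the policy, role, table, and column names are hypothetical:

```sql
-- Minimal dynamic data masking sketch; names are hypothetical.
CREATE OR REPLACE MASKING POLICY pii_email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column; non-privileged roles see masked values.
ALTER TABLE analytics.marts.customers
  MODIFY COLUMN email SET MASKING POLICY pii_email_mask;
```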

Posted 3 months ago

Apply

5.0 - 10.0 years

0 - 2 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid

Role & responsibilities: Snowflake Developer
Skill set: Snowflake, IICS, ETL, Cloud
Experience: 5 - 12 years
Location: Pune / Mumbai / Chennai / Bangalore / Hyderabad / Delhi
Notice period: Immediate to 30 days
If all the above criteria match your profile, please share your updated CV with the following details: total experience, relevant experience, current CTC, expected CTC, notice period (if serving, what is your LWD?), PAN card number (mandatory), and a passport-size photo (attachment mandatory).
Please share the above details at sneha.joshi@alikethoughts.com

Posted 3 months ago

Apply

7.0 - 12.0 years

0 Lacs

Kochi

Work from Office

Greetings from the TCS Recruitment Team!

Role: Snowflake Lead / Snowflake Solution Architect / Snowflake ML Engineer
Years of experience: 7 to 18 years
Walk-in drive location: Kochi
Walk-in location details: Tata Consultancy Services, TCS Centre SEZ Unit, Infopark Kochi Phase 1, Infopark Kochi P.O., Kakkanad, Kochi - 682042, Kerala, India
Drive time: 9 am to 1:00 pm
Date: 21-Jun-25

Must have:
- Deep knowledge of Snowflake's architecture, SnowSQL, Snowpipe, Streams, Tasks, and stored procedures (a stored-procedure sketch follows this listing).
- Strong understanding of cloud platforms (AWS, Azure, GCP).
- Proficiency in SQL, Python, or scripting languages for data operations.
- Experience with ETL/ELT tools, data integration, and performance tuning.
- Familiarity with data security, governance, and compliance standards (GDPR, HIPAA, SOC 2).
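A minimal Snowflake Scripting stored procedure of the kind the must-have list refers to; the procedure, table, and column names are hypothetical:

```sql
-- Minimal Snowflake Scripting stored procedure sketch; names are hypothetical.
CREATE OR REPLACE PROCEDURE analytics.ops.purge_old_rows(days_to_keep INTEGER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  deleted INTEGER;
BEGIN
  -- Remove staging rows older than the retention window.
  DELETE FROM analytics.staging.orders
   WHERE load_ts < DATEADD(day, -:days_to_keep, CURRENT_TIMESTAMP());
  deleted := SQLROWCOUNT;   -- rows affected by the last DML statement
  RETURN 'Deleted ' || deleted || ' rows';
END;
$$;

CALL analytics.ops.purge_old_rows(90);
```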

Posted 3 months ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.

Job Description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture; fully versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have (not mandatory): experience/knowledge of Snowpark, Streamlit, or GenAI.
- Experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification is an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to the customer, working with the offshore team, etc.
- Writing SQL queries against Snowflake and developing scripts to Extract, Load, and Transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD Type-2 (a minimal SCD Type-2 sketch follows this listing).
- Good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience deploying with CI/CD tools and with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you (minimum): B.E. / Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.

Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and Data Warehousing concepts
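For the SCD Type-2 point above, here is a minimal two-step sketch using MERGE; all table and column names are hypothetical, and real implementations track more attributes and use hash-based change detection:

```sql
-- Minimal SCD Type-2 sketch; table and column names are hypothetical.
-- Step 1: close out current rows whose source attributes changed.
MERGE INTO dim_customer d
USING stg_customer s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.address <> s.address THEN UPDATE SET
  d.is_current = FALSE,
  d.valid_to   = CURRENT_TIMESTAMP();

-- Step 2: insert a fresh "current" row for new and just-changed customers.
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL;   -- no current row: either new or closed in step 1
```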

Posted 3 months ago

Apply

7.0 - 12.0 years

15 - 22 Lacs

Hyderabad, Pune

Work from Office

Role & responsibilities: Outline the day-to-day responsibilities for this role.
Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Perks and benefits: Mention available facilities and benefits the company is offering with this job.

Posted 3 months ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Pune, Bengaluru

Hybrid

Job Title: Snowflake Developer
Experience: 5+ years
Location: Pune and Hyderabad (Hybrid)
Job Type: Full-time

About Us: We're seeking an experienced Snowflake Developer to join our team in Pune and Hyderabad. As a Snowflake Developer, you will be responsible for designing, developing, and implementing data warehousing solutions using Snowflake. You will work closely with cross-functional teams to ensure seamless data integration and analytics.

Key Responsibilities:
- Design, develop, and deploy Snowflake-based data warehousing solutions
- Collaborate with stakeholders to understand data requirements and develop data models
- Optimize Snowflake performance, scalability, and security
- Develop and maintain Snowflake SQL scripts, stored procedures, and user-defined functions (a minimal UDF sketch follows this listing)
- Troubleshoot data integration and analytics issues
- Ensure data quality, integrity, and compliance with organizational standards
- Work with data engineers, analysts, and scientists to ensure seamless data integration and analytics
- Stay up to date with Snowflake features and best practices

Requirements:
- 5+ years of experience in Snowflake development and administration
- Strong expertise in Snowflake architecture, data modeling, and SQL
- Experience with data integration tools (e.g., Informatica PowerCenter, Talend)
- Proficiency in Snowflake security features and access control
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Experience working in hybrid or remote teams
- Bachelor's degree in Computer Science, Engineering, or a related field

Nice to Have:
- Experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of data governance and data quality frameworks
- Experience with ETL/ELT tools (e.g., Informatica PowerCenter, Talend, Microsoft SSIS)
- Familiarity with data visualization tools (e.g., Tableau, Power BI)
- Experience working with agile methodologies
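A minimal sketch of the user-defined function work mentioned above: a scalar SQL UDF with an expression body. Function, schema, and column names are hypothetical.

```sql
-- Minimal SQL UDF sketch; names are hypothetical.
CREATE OR REPLACE FUNCTION analytics.util.fiscal_quarter(d DATE)
RETURNS STRING
AS
$$
  'FY' || TO_CHAR(YEAR(d)) || '-Q' || TO_CHAR(QUARTER(d))
$$;

-- Usage: label each order with its fiscal quarter.
SELECT order_id, analytics.util.fiscal_quarter(order_date) AS fq
FROM analytics.marts.orders;
```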

Posted 3 months ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description: Snowflake (mandatory); validate ETL. Essential skills: Informatica, ETL tools, Snowflake (mandatory).

Posted 3 months ago

Apply

15.0 - 25.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Apache Spark
Good-to-have skills: PySpark, Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will be responsible for designing, building, and configuring applications. Acting as the primary point of contact, you will lead the development team, oversee the delivery process, and ensure successful project execution.

Roles & Responsibilities:
- Act as a Subject Matter Expert (SME) in application development
- Lead and manage a development team to achieve performance goals
- Make key technical and architectural decisions
- Collaborate with cross-functional teams and stakeholders
- Provide technical solutions to complex problems across multiple teams
- Oversee the complete application development lifecycle
- Gather and analyze requirements in coordination with stakeholders
- Ensure timely and high-quality delivery of projects

Professional & Technical Skills:
- Must-have skills: proficiency in Apache Spark; strong understanding of big data processing; experience with data streaming technologies; hands-on experience building scalable, high-performance applications; knowledge of cloud computing platforms
- Must-have additional skills: PySpark, Spark SQL / SQL, AWS

Additional Information:
- This is a full-time, on-site role based in Gurugram
- Candidates must have a minimum of 5 years of hands-on experience with Apache Spark
- A minimum of 15 years of full-time formal education is mandatory

Posted 3 months ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Chennai, Tamil Nadu, India

On-site

- DBT: designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment.
- Very strong in PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations.
- Talend knowledge and hands-on experience is good to have; candidates who have worked in PROD support are preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the optimizer, Metadata Manager, data sharing, and stored procedures (a Streams-and-Tasks sketch follows this listing).
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend / Snowflake certification is desirable.
- Excellent SQL coding, communication, and documentation skills; familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated; works effectively within a global team environment.
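On the Streams and Tasks point above, a minimal change-driven pipeline sketch; the stream, task, warehouse, and table names are hypothetical:

```sql
-- Minimal Stream + Task pipeline sketch; object names are hypothetical.
CREATE OR REPLACE STREAM orders_changes ON TABLE analytics.staging.orders;

CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('orders_changes')
AS
  INSERT INTO analytics.marts.orders_latest
  SELECT order_id, customer_id, order_total, updated_at
  FROM orders_changes
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK merge_orders RESUME;   -- tasks are created in a suspended state
```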

Posted 3 months ago

Apply

9.0 - 14.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Job Description:
- SQL & Database Management: deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
- ETL/ELT Tools: experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines; extensive experience building data pipelines in ETL tools.
- Data Modeling & Optimization: strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
- Cloud & Security: familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
- Data Warehousing: experience managing large datasets and data marts and optimizing databases for performance.
- Agile & CI/CD: knowledge of Agile methodologies and CI/CD automation tools.

Role & responsibilities:
- Build the data pipeline for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code (a clustering/tuning sketch follows this listing).
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
- Unit-test databases and perform bug fixes.
- Develop best practices for database design and development activities.
- Take on technical leadership of database projects across various scrum teams.
- Manage exploratory data analysis to support dashboard development (desirable).

Required Skills:
- Strong experience in SQL, with expertise in a relational database (PostgreSQL, preferably cloud-hosted in AWS/Azure/GCP) or any cloud-based data warehouse (like Snowflake or Azure Synapse).
- Competence in data preparation and/or ETL/ELT tools like SnapLogic, StreamSets, DBT, etc. (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows handling large volumes of data.
- Understanding of data modelling techniques and working knowledge of OLAP systems.
- Deep knowledge of databases, data marts, and enterprise data warehouse systems, and of handling large datasets.
- In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
- Ability to fine-tune report-generating queries.
- Solid understanding of normalization and denormalization of data, database exception handling, query profiling, performance counters, debugging, and database and query optimization techniques.
- Understanding of index design and performance-tuning techniques.
- Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
- Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
- Adherence to standards for all databases, e.g., data models, data architecture, and naming conventions.
- Exposure to source control such as Git and Azure DevOps.
- Understanding of Agile methodologies (Scrum, Kanban).
- Experience with NoSQL databases for migrating data into other types of databases with real-time replication (desirable).
- Experience with CI/CD automation tools (desirable).
- Programming experience in Golang, Python, or any other language, and with visualization tools (Power BI/Tableau) (desirable).
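On the Snowflake side of the tuning work described above, a minimal sketch of clustering and pruning diagnostics; table, column, and key names are hypothetical:

```sql
-- Minimal Snowflake tuning sketch; names are hypothetical.
ALTER TABLE analytics.marts.events CLUSTER BY (event_date, account_id);

-- Inspect how well-clustered the table is on those keys:
SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.marts.events',
                                     '(event_date, account_id)');

-- Find slow queries and check micro-partition pruning effectiveness:
SELECT query_id, total_elapsed_time, partitions_scanned, partitions_total
FROM snowflake.account_usage.query_history
ORDER BY total_elapsed_time DESC
LIMIT 10;
```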

Posted 3 months ago

Apply

5.0 - 15.0 years

22 - 24 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

This role is for one of Weekday's clients.
Salary range: Rs 2,200,000 - Rs 2,400,000 (i.e., INR 22-24 LPA)
Min experience: 5 years
Location: Bengaluru, Chennai, Gurgaon
Job type: Full-time

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Key Responsibilities:
- Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence.
- Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and DBT to ingest and transform data from multiple sources.
- SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics.
- Data Modeling & ELT: Implement advanced data modeling practices including SCD Type-2 (a dbt snapshot sketch follows this listing), and build high-performance ELT workflows using DBT.
- Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions.
- Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems.
- Collaboration & Documentation: Collaborate with engineering and business teams; develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications:
- Expertise in Snowflake for large-scale data warehousing and ELT operations.
- Strong SQL skills with the ability to create and manage complex queries and procedures.
- Proven experience with Informatica PowerCenter for ETL development.
- Proficiency with Power BI for data visualization and reporting.
- Hands-on experience with Fivetran for automated data integration.
- Familiarity with DBT, Sigma Computing, Tableau, and Oracle.
- Solid understanding of data analysis, requirement gathering, and source-to-target mapping.
- Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus.
- Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi.
- Proficiency in Python for scripting and data processing (Java or Scala is a plus).
- Bachelor's or graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies: Snowflake, SnowSQL, Snowpark, SQL, Informatica, Power BI, DBT, Python, Fivetran, Sigma Computing, Tableau, Airflow, Azkaban, Azure, Databricks, ADF
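Since this listing pairs DBT with SCD Type-2 modeling, one common way to get Type-2 history in dbt is a snapshot. A minimal sketch follows; the snapshot, source, and column names are hypothetical.

```sql
-- snapshots/customers_snapshot.sql — minimal dbt snapshot giving SCD Type-2
-- history (hypothetical names; dbt adds dbt_valid_from / dbt_valid_to columns).
{% snapshot customers_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

SELECT customer_id, name, address, updated_at
FROM {{ source('raw', 'customers') }}

{% endsnapshot %}
```

Running `dbt snapshot` then keeps one row per customer per version; downstream models can filter on `dbt_valid_to IS NULL` for the current view.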

Posted 3 months ago

Apply

1.0 - 4.0 years

4 - 7 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Locations: Pune, Bangalore, Hyderabad, Indore
Contract duration: 6 months

Responsibilities:
- Must have experience working in Snowflake administration/development on Data Warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end data warehouse implementations on-premises, preferably on Oracle/SQL Server.
- Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and Time Travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing (a minimal sketch follows this listing).
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob Storage.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data-pipeline-related problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and databases.
- Job Conductor, scheduler, and monitoring.
- Git repository; creating users and roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Effort estimation based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
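A minimal sketch of the secure data sharing feature mentioned above; the share, database, and consumer account names are hypothetical:

```sql
-- Minimal secure data sharing sketch; names and accounts are hypothetical.
CREATE SHARE IF NOT EXISTS sales_share;

GRANT USAGE ON DATABASE analytics TO SHARE sales_share;
GRANT USAGE ON SCHEMA analytics.marts TO SHARE sales_share;
GRANT SELECT ON TABLE analytics.marts.orders TO SHARE sales_share;

-- Make the share visible to a consumer account in the same region:
ALTER SHARE sales_share ADD ACCOUNTS = myorg.consumer_acct;
```

The consumer then creates a read-only database from the share; no data is copied, so the provider pays for storage and the consumer pays for compute.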

Posted 3 months ago

Apply