
174 SnowSQL Jobs


4.0 - 9.0 years

9 - 18 Lacs

hyderabad, chennai, bengaluru

Work from Office

We have an opportunity for an AWS Snowflake DBT Developer at PwC AC.

Position: AWS Snowflake DBT Developer
Experience Required: 4-8 Years
Notice Period: Immediate to 60 Days
Locations: Bangalore, Hyderabad, Kolkata, Chennai, Pune, Gurgaon & Mumbai
Work Mode: Hybrid

Must Have Skills:
1) AWS
2) Databricks
3) Snowflake data warehousing, including SQL and Snowpipe
4) SnowSQL, Snowpipe
5) Data Build Tool (dbt)

Must Have:
Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
Strong expertise in the end-to-end implementation of cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modelling techniques using Python/Java
Experience in loading disparate data sets and translating complex functional and technical requirements into detailed design
Awareness of deploying Snowflake features such as data sharing, events, and lake-house patterns
Experience with data security and data access controls and design
Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling)
Proficient in Lambda and Kappa architectures
Strong AWS hands-on expertise with a programming background, preferably Python/Scala
Good knowledge of Big Data frameworks and related technologies; experience in Hadoop and Spark is mandatory
Strong experience in AWS compute services such as AWS EMR, Glue, and SageMaker, and storage services such as S3, Redshift, and DynamoDB
Good experience with at least one of the AWS streaming services: AWS Kinesis, AWS SQS, or AWS MSK
Troubleshooting and performance tuning experience in the Spark framework: Spark Core, Spark SQL, and Spark Streaming
Experience in one of the flow tools: Airflow, NiFi, or Luigi
Good knowledge of application DevOps tools (Git, CI/CD frameworks); experience in Jenkins or GitLab, with rich experience in source code management tools such as CodePipeline, CodeBuild, and CodeCommit
Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules
Strong understanding of cloud data migration processes, methods, and project lifecycle
Good analytical and problem-solving skills
Good communication and presentation skills

Desired Knowledge / Skills:
Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
Experience in Big Data ML toolkits such as Mahout, SparkML, or H2O
Knowledge of Python
Worked in offshore/onsite engagements
Experience in AWS services like STEP & Lambda

Interested candidates kindly share profiles at dhamma.b.bhawsagar@pwc.com, MB 8459621589.
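Several of these postings ask for hands-on Snowpipe experience, i.e. continuous, auto-ingest loading of files from cloud storage into Snowflake. The sketch below is purely illustrative; the stage, table, pipe, and storage-integration names are placeholders and not taken from the posting:

```sql
-- File format and external stage for incoming CSV files (all names are placeholders)
CREATE OR REPLACE FILE FORMAT csv_ff TYPE = CSV SKIP_HEADER = 1;

CREATE OR REPLACE STAGE raw_orders_stage
  URL = 's3://example-bucket/orders/'
  STORAGE_INTEGRATION = s3_int                -- assumes a storage integration already exists
  FILE_FORMAT = (FORMAT_NAME = 'csv_ff');

CREATE OR REPLACE TABLE raw_orders (
  order_id    NUMBER,
  customer_id NUMBER,
  amount      NUMBER(12,2),
  order_ts    TIMESTAMP_NTZ
);

-- Snowpipe: continuously load new files as they land in the bucket
CREATE OR REPLACE PIPE raw_orders_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw_orders
FROM @raw_orders_stage
FILE_FORMAT = (FORMAT_NAME = 'csv_ff');

-- Inspect recent load activity for the target table
SELECT *
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'RAW_ORDERS',
  START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));
```

With AUTO_INGEST enabled, cloud storage event notifications trigger the COPY statement embedded in the pipe, so new files are loaded without a scheduled job.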

Posted 9 hours ago

Apply

4.0 - 8.0 years

0 Lacs

maharashtra

On-site

Role Overview: As an AWS Senior Developer at PwC - AC, you will collaborate with the Offshore Manager and Onsite Business Analyst to comprehend the requirements and take charge of implementing end-to-end cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS. Your role will involve showcasing strong proficiency in AWS cloud technology, exceptional planning and organizational skills, and the ability to work as a cloud developer/lead in an agile team to deliver automated cloud solutions.

Key Responsibilities:
- Proficient in Azure Data Services, Databricks, and Snowflake
- Deep understanding of traditional and modern data architecture and processing concepts
- Proficient in Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and ETL data pipelines using Python/Java
- Building stream-processing systems with solutions like Storm or Spark Streaming
- Designing and implementing scalable ETL/ELT pipelines using Databricks and Apache Spark
- Optimizing data workflows and ensuring efficient data processing
- Understanding big data use cases and Lambda/Kappa design patterns
- Implementing Big Data solutions using the Microsoft Data Platform and Azure Data Services
- Exposure to open-source technologies such as Apache Spark, Hadoop, NoSQL, Kafka, Solr/Elasticsearch
- Driving adoption and rollout of Power BI dashboards for finance stakeholders
- Well-versed with quality processes and implementation
- Experience in application DevOps tools such as Git, CI/CD frameworks, Jenkins, or GitLab
- Guiding the assessment and implementation of finance data marketplaces
- Good communication and presentation skills

Qualifications Required:
- BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
- Certification in AWS Architecture desirable
- Experience in building stream-processing systems with Storm or Spark Streaming
- Experience in Big Data ML toolkits like Mahout, SparkML, or H2O
- Knowledge of Python
- Worked in offshore/onsite engagements
- Experience in AWS services like STEP & Lambda
- Good project management skills with consulting experience in complex program delivery
- Good to have: knowledge of cloud platforms - AWS, GCP, Informatica Cloud, Oracle Cloud, and cloud DW - Snowflake & dbt

Additional Details:
Travel Requirements: Travel to client locations may be required as per project requirements.
Line of Service: Advisory
Horizontal: Technology Consulting
Designation: Senior Associate
Location: Anywhere in India

Apply now if you believe you can contribute to PwC's innovative and collaborative environment and be a part of a globally recognized firm dedicated to excellence and diversity.

Posted 3 days ago

Apply

8.0 - 12.0 years

0 Lacs

bangalore, karnataka

On-site

Role Overview: You will be responsible for architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions. Your role will involve designing scalable data architectures with Snowflake, integrating cloud technologies such as AWS, Azure, and GCP, and ETL/ELT tools like dbt. Additionally, you will guide teams in proper data modeling, transformation, security, and performance optimization.

Key Responsibilities:
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Design scalable data architectures with Snowflake and integrate cloud technologies like AWS, Azure, GCP, and ETL/ELT tools such as dbt
- Guide teams in proper data modeling (star, snowflake schemas), transformation, security, and performance optimization
- Load data from disparate data sets and translate complex functional and technical requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lake-house patterns
- Implement data security and data access controls and design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling)
- Utilize AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Implement Lambda and Kappa architectures
- Utilize Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Utilize AWS compute services like AWS EMR, Glue, and SageMaker, as well as storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshoot and perform performance tuning in the Spark framework - Spark Core, Spark SQL, and Spark Streaming
- Experience with flow tools like Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab with rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Qualifications Required:
- 8-12 years of relevant experience
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modelling techniques using Python/Java
- Strong expertise in the end-to-end implementation of cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS
- Proficiency in AWS, Databricks, and Snowflake data warehousing, including SQL and Snowpipe
- Experience in data security, data access controls, and design
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Good experience with AWS compute services like AWS EMR, Glue, and SageMaker, and storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL, and Spark Streaming
- Experience in one of the flow tools: Airflow, NiFi, or Luigi
- Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab with rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Kindly share your profile at dhamma.b.bhawsagar@pwc.com if you are interested in this opportunity.
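The architect roles above repeatedly mention deploying Snowflake Secure Data Sharing. As a rough sketch of the provider-side setup only (the database, schema, table, and consumer account names below are invented placeholders):

```sql
-- Hypothetical provider-side setup for Snowflake Secure Data Sharing
CREATE OR REPLACE SHARE sales_share;

GRANT USAGE  ON DATABASE analytics_db            TO SHARE sales_share;
GRANT USAGE  ON SCHEMA   analytics_db.marts      TO SHARE sales_share;
GRANT SELECT ON TABLE    analytics_db.marts.daily_sales TO SHARE sales_share;

-- Make the share visible to a consumer account (placeholder account identifier)
ALTER SHARE sales_share ADD ACCOUNTS = consumer_acct1;
```

The consumer account then creates a read-only database from the share; no data is physically copied between accounts.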

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Snowflake Lead with additional expertise in ETL tools and dbt, you will play a crucial role in designing, implementing, and managing Snowflake solutions to facilitate efficient data management and analytics. Your leadership will be pivotal in guiding the team to deliver high-quality, scalable, and optimized data solutions. Your proficiency in ETL and dbt will further enhance the organization's data processing capabilities.

**Roles & Responsibilities:**
- Perform analysis on the existing data storage system and develop data solutions in Snowflake.
- Design the data warehouse and provide guidance to the team for implementation using Snowflake SnowSQL.
- Define and communicate best practices, coding standards, and architectural guidelines for Snowflake development.
- Collaborate with stakeholders to understand business requirements and translate them into effective Snowflake data solutions.
- Architect Snowflake data models, schemas, and pipelines aligned with the organization's data strategy and industry best practices.
- Lead the design and development of complex ETL processes and data transformations using Snowflake, as well as additional ETL tools.
- Ensure seamless data integration from various sources into Snowflake while maintaining data quality and integrity.
- Use dbt (Data Build Tool) to transform raw data into well-structured, business-ready data models.
- Design and manage dbt workflows, ensuring consistent and reliable data transformations.
- Identify and address performance bottlenecks in Snowflake queries, data loading, and transformations.
- Implement optimization techniques such as caching, indexing, and partitioning to enhance query performance.
- Streamline data movement, transformation, and loading processes beyond Snowflake using expertise in ETL tools.
- Evaluate, select, and integrate ETL tools that complement the Snowflake ecosystem and enhance overall data processing efficiency.
- Work closely with cross-functional teams to understand their data requirements and deliver optimal solutions.
- Facilitate communication between technical and non-technical teams to ensure successful project outcomes.
- Hands-on experience in Python.
- Good hands-on experience in converting Source Independent Load, Post Load Process, Stored Procedure, and SQL workloads to Snowflake.
- Experience in migrating data from on-premises databases and files to Snowflake.
- Strong understanding of ELT/ETL and integration concepts and design best practices.
- Experience in performance tuning of Snowflake pipelines and the ability to troubleshoot issues quickly.
- Experience in SnowSQL, Snowpipe, and Snowpark.

**Other Qualifications:**
- Experience in data engineering, data warehousing, and analytics.
- Strong hands-on experience with Snowflake, including architecture design, ETL development, and performance optimization.
- Proficiency in ETL tools and experience with dbt.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Proven leadership experience, including managing technical teams and projects.
- Excellent problem-solving skills and ability to analyze complex technical challenges.
- Effective communication skills and a collaborative approach to working with diverse teams.

Please share your CV at parul@mounttalent.com. Location: Bangalore, Chennai, Noida, Pune, Mumbai, and Hyderabad.
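Since dbt on Snowflake is central to this role, a small, hypothetical dbt model is sketched below for orientation; the model and source names (fct_orders, stg_orders) are illustrative only, not taken from the posting:

```sql
-- models/marts/fct_orders.sql — a minimal, hypothetical incremental dbt model for Snowflake
{{ config(materialized='incremental', unique_key='order_id') }}

with source_orders as (
    select * from {{ ref('stg_orders') }}   -- assumes a staging model named stg_orders exists
)

select
    order_id,
    customer_id,
    order_ts,
    amount,
    current_timestamp() as loaded_at
from source_orders
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is already in the target table
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

Running `dbt run` would compile this into a MERGE/INSERT against Snowflake, with the incremental filter applied only after the first full build.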

Posted 3 days ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

ahmedabad, chennai, bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional / nonfunctional requirements to systems requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills: Technology -> Data on Cloud - DataStore -> Snowflake

Posted 4 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

pune, gurugram, bengaluru

Work from Office

Exp: 5 - 12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, SQL queries, Snowpipe, SnowSQL, performance tuning, query optimization, stored procedures, DBT, Matillion and ETL.

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Snowflake utilities, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities:
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and DBT.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
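The posting calls out SCD Type-2 modeling with dbt. In dbt this is usually handled with a snapshot; the sketch below is illustrative only, and the snapshot, source, and column names are assumptions rather than anything specified in the job:

```sql
-- snapshots/customers_snapshot.sql — hypothetical dbt snapshot producing SCD Type-2 history
{% snapshot customers_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

-- assumes a declared source named crm.customers with an updated_at column
select * from {{ source('crm', 'customers') }}

{% endsnapshot %}
```

Each `dbt snapshot` run compares the tracked `updated_at` value per key, closes out the previous row, and opens a new one; dbt maintains the `dbt_valid_from` / `dbt_valid_to` columns that give the table its Type-2 shape.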

Posted 5 days ago

Apply

6.0 - 10.0 years

8 - 13 Lacs

hyderabad

Work from Office

Position Title: Data Engineer (Snowflake Lead)
Experience: 7+ Years
Shift Schedule: Rotational Shifts
Location: Hyderabad

Role Overview
We are seeking an experienced Data Engineer with strong expertise in Snowflake to join the Snowflake Managed Services team. This role involves data platform development, enhancements, and production support across multiple clients. You will be responsible for ensuring stability, performance, and continuous improvement of Snowflake environments.

Key Responsibilities
- Design, build, and optimize Snowflake data pipelines, data models, and transformations.
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations.
- Troubleshoot job failures, resolve incidents, and perform root cause analysis (RCA).
- Monitor warehouses, tune queries, and optimize Snowflake performance and costs.
- Manage service requests such as user provisioning, access control, and role management.
- Create and maintain documentation, runbooks, and standard operating procedures.

Required Skills & Experience
- 5+ years of hands-on experience in Snowflake development and support.
- Strong expertise in SQL, data modeling, and query performance tuning.
- Experience with ETL/ELT development and orchestration tools (e.g., Azure Data Factory).
- Familiarity with CI/CD pipelines and scripting (Python or PySpark).
- Strong troubleshooting and incident resolution skills.

Preferred Skills
- SnowPro Core Certification.
- Experience with ticketing systems (ServiceNow, Jira).
- Hands-on experience with Azure cloud services.
- Knowledge of ITIL processes.
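Monitoring warehouses and controlling cost, as described in this role, is commonly done against Snowflake's ACCOUNT_USAGE views. A hedged example of the kind of checks involved; the warehouse name is a placeholder and the thresholds are arbitrary:

```sql
-- Longest-running queries in the last day (ACCOUNT_USAGE has some ingestion latency)
SELECT query_id, user_name, warehouse_name,
       total_elapsed_time / 1000 AS elapsed_s,
       bytes_spilled_to_local_storage
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time >= DATEADD(day, -1, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 20;

-- Credit consumption per warehouse over the last week
SELECT warehouse_name, SUM(credits_used) AS credits_last_7_days
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY credits_last_7_days DESC;

-- Reduce idle cost on a development warehouse (name is hypothetical)
ALTER WAREHOUSE dev_wh SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;
```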

Posted 6 days ago

Apply

9.0 - 14.0 years

18 - 25 Lacs

pune, chennai, bengaluru

Work from Office

Job Description
Position/Role: Snowflake Developer + SQL
Experience: 9+ years
Location: Chennai/Bangalore/Mumbai/Pune
Candidates only from Tier 1 companies will be considered.
Client: Hexaware (permanent)
Notice: Immediate, or currently serving notice ending by the first week of October.

Posted 6 days ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

bengaluru

Work from Office

Role & responsibilities
- Minimum 5 years of experience in Data & Analytics with a strong focus on data transformation and modeling.
- Hands-on expertise in dbt (Core and Cloud), including model development, testing, and deployment.
- Proficiency in Snowflake, including data warehousing concepts, performance tuning, and integration with dbt.
- Design, develop, and maintain dbt models for data transformation across cloud and on-premise environments.
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.
- Optimize and refactor existing dbt pipelines for performance, maintainability, and scalability.
- Integrate dbt workflows with modern data platforms, particularly Snowflake.
- Ensure data quality, lineage, and governance across all transformation layers.
- Participate in code reviews, testing, and deployment of dbt models.
- Excellent problem-solving skills and ability to work independently and collaboratively.
- Strong communication skills to work effectively with stakeholders.

Preferred candidate profile: Immediate joiner
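dbt model testing, listed among the responsibilities above, can be as lightweight as a singular test: a SELECT that must return zero rows to pass. An illustrative example follows; the referenced model name fct_orders is an assumption, not something named in the posting:

```sql
-- tests/assert_no_negative_amounts.sql — hypothetical dbt singular test
-- The test fails if any row is returned, i.e. if an order has a negative amount.
select
    order_id,
    amount
from {{ ref('fct_orders') }}
where amount < 0
```

Running `dbt test` compiles and executes this query against the Snowflake target and reports a failure whenever rows come back.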

Posted 6 days ago

Apply

15.0 - 25.0 years

10 - 14 Lacs

gurugram

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Apache Spark
Good to have skills: PySpark, Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary
As an Application Lead, you will be responsible for designing, building, and configuring applications. Acting as the primary point of contact, you will lead the development team, oversee the delivery process, and ensure successful project execution.

Roles & Responsibilities
- Act as a Subject Matter Expert (SME) in application development
- Lead and manage a development team to achieve performance goals
- Make key technical and architectural decisions
- Collaborate with cross-functional teams and stakeholders
- Provide technical solutions to complex problems across multiple teams
- Oversee the complete application development lifecycle
- Gather and analyze requirements in coordination with stakeholders
- Ensure timely and high-quality delivery of projects

Professional & Technical Skills
Must-Have Skills:
- Proficiency in Apache Spark
- Strong understanding of big data processing
- Experience with data streaming technologies
- Hands-on experience in building scalable, high-performance applications
- Knowledge of cloud computing platforms

Must-Have Additional Skills:
- PySpark
- Spark SQL / SQL
- AWS

Additional Information
- This is a full-time, on-site role based in Gurugram
- Candidates must have a minimum of 5 years of hands-on experience with Apache Spark
- A minimum of 15 years of full-time formal education is mandatory

Qualification: 15 years full time education

Posted 1 week ago

Apply

6.0 - 11.0 years

7 - 17 Lacs

gurugram

Work from Office

We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.

Posted 1 week ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

bengaluru

Work from Office

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF.
- Provide solutions that are forward-thinking in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop a data model to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate the data pipelines in the scheduler via Airflow.

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Deep understanding of star and snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging & geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Mandatory Skills: Snowflake / Azure Data Factory
Required Skills: Snowflake AND SQL
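Snowflake security expertise, listed as mandatory above, in practice centers on role-based access control. A rough, hypothetical grant setup is sketched below; the role, warehouse, database, schema, and user names are all invented for illustration:

```sql
-- Hypothetical role-based access setup for a reporting schema
CREATE ROLE IF NOT EXISTS analyst_role;

GRANT USAGE  ON WAREHOUSE reporting_wh            TO ROLE analyst_role;
GRANT USAGE  ON DATABASE  analytics_db            TO ROLE analyst_role;
GRANT USAGE  ON SCHEMA    analytics_db.marts      TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES    IN SCHEMA analytics_db.marts TO ROLE analyst_role;
-- Cover tables created later as well
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics_db.marts TO ROLE analyst_role;

-- Assign the role to an individual user
GRANT ROLE analyst_role TO USER jane_doe;
```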

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

noida

Remote

Job Title: Snowflake Solution Architect
Experience: 7+ Years
Location: 100% Remote
Job Type: Contract

Experience with Agile-based development, strong problem-solving skills, and proficiency in writing performant SQL queries/scripts to generate business insights and drive better organizational decision-making.

Job Summary:
We are seeking a skilled and detail-oriented Snowflake Developer to design, develop, and maintain scalable data solutions using the Snowflake platform. The ideal candidate will have experience in data warehousing, ETL/ELT processes, and cloud-based data architecture.

Key Responsibilities:
• Design and implement data pipelines using Snowflake, SQL, and ETL tools.
• Develop and optimize complex SQL queries for data extraction and transformation.
• Create and manage Snowflake objects such as databases, schemas, tables, views, and stored procedures.
• Integrate Snowflake with various data sources and third-party tools.
• Monitor and troubleshoot performance issues in Snowflake environments.
• Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
• Ensure data quality, security, and governance standards are met.
• Automate data workflows and implement best practices for data management.

Required Skills and Qualifications:
• Proficiency in Snowflake SQL and Snowflake architecture.
• Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion).
• Strong knowledge of cloud platforms (AWS, Azure, or GCP).
• Familiarity with data modeling and data warehousing concepts.
• Experience with Python, Java, or shell scripting is a plus.
• Understanding of data security, role-based access control, and data sharing in Snowflake.
• Excellent problem-solving and communication skills.

Preferred Qualifications:
• Snowflake certification (e.g., SnowPro Core).
• Experience with CI/CD pipelines and DevOps practices.
• Knowledge of BI tools like Tableau, Power BI, or Looker.

Share resume to: Vishal Kumar, Vishal@akaasa.com
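Stored procedures are one of the Snowflake objects this role creates and manages. A minimal Snowflake Scripting sketch is shown below, assuming a hypothetical audit table and retention policy; nothing here comes from the posting itself:

```sql
-- Minimal Snowflake Scripting procedure; table name and retention value are illustrative
CREATE OR REPLACE PROCEDURE purge_old_rows(retention_days NUMBER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
  cutoff TIMESTAMP_LTZ;
  deleted_count NUMBER DEFAULT 0;
BEGIN
  -- compute the cutoff timestamp from the procedure argument
  cutoff := DATEADD(day, -retention_days, CURRENT_TIMESTAMP());
  DELETE FROM analytics_db.audit.event_log WHERE event_ts < :cutoff;
  deleted_count := SQLROWCOUNT;   -- rows affected by the last DML statement
  RETURN 'Deleted ' || deleted_count || ' rows';
END;
$$;

-- Example invocation: keep only the last 90 days
CALL purge_old_rows(90);
```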

Posted 1 week ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

bhubaneswar, kolkata, nagpur

Work from Office

Job Description - Snowflake Developer
The Snowflake Developer will develop and maintain data warehouse solutions using the Snowflake cloud data platform.

Responsibilities and Qualifications
- Develop the solution in accordance with best-practice guidelines and policies.
- Bachelor's degree in Computer Science, Business Administration, or equivalent educational or professional experience and/or qualifications.
- Minimum of 7-8 years of in-depth experience in data management and data warehouse solution design, with delivery of enterprise-level architecture.
- 3 years of strong experience in Snowflake feature development and best practices.
- In-depth understanding of Snowflake features, including SnowSQL, Snowpipe, compute, and storage.
- Experience in Python to create migration pipelines and scripting.
- Proficiency in SQL, data structures, and database design principles.
- Experience in collaborating with cross-functional teams on delivery aspects associated with architecture, design, and technology.
- Strong analytical, problem-solving, and multitasking skills, as well as communication and interpersonal skills.
- Experience in technical delivery using Agile methodologies.
- Experience in performance tuning, troubleshooting, and resolving issues related to Snowflake.

Mandatory Skills: Snowflake - Data Science, Snowpark Container Services

Posted 1 week ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

noida, uttar pradesh, india

On-site

Databricks Snowflake Data Engineer
- Design, develop, and maintain scalable ETL pipelines using Databricks to process, transform, and load large datasets into Snowflake or other data stores.
- Collaborate with cross-functional teams, including data architects, analysts, and business stakeholders, to gather data requirements and deliver efficient data solutions.
- Exposure to the Snowflake warehouse and to SnowSQL.
- Experience in loading data into the Snowflake data model using PySpark pipelines.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline problems to guarantee consistency and availability of the data.

Key Skill Sets Required
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Databricks, Snowflake, SnowSQL, and PySpark is required.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is desirable.
- Experience developing security models.

Posted 1 week ago

Apply

6.0 - 11.0 years

22 - 35 Lacs

new delhi, pune, bengaluru

Work from Office

Job Description
We are looking for an experienced Data Engineer / Technical Lead with mandatory expertise in Snowflake and dbt (data build tool), along with strong skills in Data Warehousing, Data Modeling, and Data Engineering. The ideal candidate will have hands-on experience in AWS data services (Aurora, RDS, DMS, DynamoDB), Oracle, PL/SQL, and SQL, coupled with leadership experience in customer-facing roles for requirement gathering and solution design.

Roles & Responsibilities

Technical Leadership & Data Engineering:
- Lead end-to-end data warehousing projects using Snowflake and dbt (mandatory skills)
- Design and implement scalable data models, ETL/ELT pipelines, and data architecture in Snowflake
- Develop and optimize dbt models, transformations, and data workflows for efficient analytics
- Work with AWS data services - Aurora, RDS, DMS, DynamoDB - for data integration and migration
- Optimize SQL and PL/SQL queries and database performance (Oracle, PostgreSQL, etc.)
- Implement CDC (Change Data Capture) and real-time data processing solutions
- Ensure data governance, quality, and best practices in data engineering

Requirement Analysis & Customer Handling:
- Collaborate with stakeholders to gather, analyze, and document business requirements
- Handle Change Requests (CRs) and provide technical solutions aligned with business needs
- Act as a bridge between business teams and technical teams for data-driven decision-making

Team Leadership & Mentorship:
- Guide and mentor junior data engineers in Snowflake, dbt, and modern data stack best practices
- Ensure adherence to data security, compliance, and scalability principles

Mandatory Skills & Experience:
- 6-13 years of hands-on experience in Data Engineering, Data Warehousing, and Data Modeling
- Expert-level proficiency in Snowflake and dbt (data build tool) - mandatory
- Strong expertise in: AWS Data Services (Aurora, RDS, DMS, DynamoDB); Oracle, PL/SQL, SQL, and database optimization; ETL/ELT pipelines and data integration; data architecture and dimensional modeling (star schema, Kimball, etc.)
- Experience in customer handling, requirement gathering, and solution design

Preferred Qualifications:
- Certification in Snowflake, dbt, or AWS data technologies
- Experience with Python, PySpark, or other scripting languages
- Knowledge of CI/CD pipelines for data workflows
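CDC inside Snowflake, as required above, is commonly built from a stream on a landing table plus a scheduled task that merges the captured changes downstream. A hedged sketch of that pattern; the table, stream, task, and warehouse names are placeholders:

```sql
-- Capture inserts/updates/deletes on the landing table
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- Scheduled task that merges pending changes into the reporting table
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh           -- assumed warehouse name
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO analytics_db.marts.orders AS tgt
  USING orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET
    tgt.amount   = src.amount,
    tgt.order_ts = src.order_ts
  WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount, order_ts)
    VALUES (src.order_id, src.customer_id, src.amount, src.order_ts);

ALTER TASK merge_orders_task RESUME;  -- tasks are created in a suspended state
```

Consuming the stream inside the MERGE advances its offset, so each task run only sees changes that arrived since the previous run.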

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

bengaluru

Work from Office

Technical / Behavioral
- You must have a proven track record of working in collaborative teams to deliver high-quality data solutions in a multi-developer agile environment, following design and coding best practices.
- You must be an expert in using SQL queries, with UNIX shell / Python scripting skills.
- You should have experience using AWS services like EC2, S3, and EMR to move data onto a cloud platform; experience with an enterprise data lake strategy and use of Snowflake (AWS) is a big plus.
- Knowledge of developing ELT/ETL pipelines to move data to and from the Snowflake data store using a combination of Python and Snowflake SnowSQL is a big plus.
- You should have experience in SQL performance optimization for large data volumes.
- You should have proven analytical and problem-solving skills.
- You should be strong in database and data warehousing concepts.
- You must be able to work independently in a globally distributed environment.
- You must have superior SQL and data modeling skills and experience performing deep data analysis on multiple database platforms like Oracle or Snowflake.
- You should have experience working with DevOps tools for code migrations.
- Nice to have: working experience with Control-M or similar scheduling tools.
- Nice to have: adequate knowledge of DevOps, JIRA, and Agile practices.

How Your Work Impacts the Organization
Cloud enablement and a data model ready for analytics.
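Moving data to and from Snowflake, as mentioned above, ultimately comes down to COPY INTO in both directions. A rough illustration under assumed stage and table names (none of which come from the posting):

```sql
-- Bulk load Parquet files staged in S3 into a landing table
COPY INTO staging.raw_events
FROM @s3_events_stage/2024/
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- Unload an aggregated result back to the stage for downstream consumers
COPY INTO @s3_export_stage/daily_summary/
FROM (
  SELECT event_date, COUNT(*) AS events
  FROM staging.raw_events
  GROUP BY event_date
)
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
OVERWRITE = TRUE;
```

The same statements can be issued from the SnowSQL CLI or wrapped in a Python script using the Snowflake connector, which is the combination this posting describes.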

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

hyderabad, chennai

Hybrid

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Snowflake.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Optimize performance of Snowflake data models and queries.
- Implement data warehouse solutions and integrate data from multiple sources.
- Create and manage Snowflake objects such as tables, views, schemas, stages, and file formats.
- Monitor and manage Snowflake compute resources and storage usage.
- Collaborate with data analysts, engineers, and business teams to understand data requirements.
- Ensure data quality, integrity, and security across all layers.
- Participate in code reviews and follow Snowflake best practices.

Required Skills:
- 7+ years of experience as a Data Engineer or Snowflake Developer.
- Strong hands-on experience with the Snowflake cloud data platform.
- Hands-on experience with Matillion ETL.
- Expert-level knowledge of SQL (joins, subqueries, CTEs, window functions, performance tuning).
- Proficient in data modeling and warehousing concepts (star/snowflake schema, normalization, etc.).
- Experience with ETL tools (e.g., Informatica, Talend, Matillion, or custom scripts).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with version control tools (e.g., Git).
- Good understanding of data governance and data security best practices.
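Expert-level SQL with CTEs and window functions is the core skill here. A small illustrative query against an assumed orders table (the schema and column names are invented for the example):

```sql
-- Per-customer totals plus each customer's three most recent orders
WITH customer_orders AS (
    SELECT customer_id, order_id, amount, order_ts
    FROM analytics_db.marts.orders
)
SELECT
    customer_id,
    order_id,
    amount,
    SUM(amount)  OVER (PARTITION BY customer_id)                        AS customer_total,
    ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS recency_rank
FROM customer_orders
-- QUALIFY filters on window-function results, a Snowflake-specific convenience
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) <= 3;
```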

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a DBT professional, you will be responsible for designing, developing, and defining technical architecture for data pipelines and performance scaling in a big data environment. Your expertise in PL/SQL, including queries, procedures, and JOINs, will be crucial for the integration of Talend data and ensuring data quality. You will also be proficient in Snowflake SQL, writing SQL queries against Snowflake, and developing scripts in Unix, Python, etc., to facilitate Extract, Load, and Transform operations. It would be advantageous to have hands-on experience and knowledge of Talend. Candidates with previous experience in PROD support will be given preference.

Your role will involve working with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. You will be responsible for data analysis, troubleshooting data issues, and providing technical support to end-users.

In this position, you will develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Your problem-solving skills will be put to the test, and you will be expected to have a continuous improvement approach. Possessing a Talend/Snowflake certification would be considered desirable. Excellent SQL coding skills, effective communication, and documentation skills are essential. Knowledge of the Agile delivery process is preferred. You must be analytical, creative, and self-motivated to excel in this role.

Collaboration within a global team environment is key, necessitating excellent communication skills. Your contribution to Virtusa will be valued, as teamwork, quality of life, and professional development are the core values the company upholds. By joining a global team of 27,000 professionals, you will have access to exciting projects and opportunities to work with cutting-edge technologies throughout your career. Virtusa provides an environment that nurtures new ideas, fosters excellence, and encourages personal and professional growth.
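Time Travel, one of the Snowflake utilities named above, lets you query or restore earlier versions of a table within its retention window. An illustrative sketch; the table name and the statement ID are placeholders, not values from the posting:

```sql
-- Query the table as it looked 30 minutes ago (offset is in seconds)
SELECT * FROM analytics_db.marts.orders AT(OFFSET => -60*30);

-- Query the table as of just before a specific statement ran (hypothetical query ID)
SELECT * FROM analytics_db.marts.orders
BEFORE(STATEMENT => '01a2b3c4-0000-1111-2222-333344445555');

-- Restore an accidentally dropped table while it is still within the retention window
UNDROP TABLE analytics_db.marts.orders;
```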

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for designing and implementing scalable data models using Snowflake to support business intelligence and analytics solutions. This will involve implementing ETL/ELT solutions with complex business transformations and handling end-to-end data warehousing solutions. Additionally, you will be tasked with migrating data from legacy systems to Snowflake, writing complex SQL queries to extract, transform, and load data with a focus on high performance and accuracy, optimizing SnowSQL queries for better processing speeds, and integrating Snowflake with third-party applications.

To excel in this role, you should have a strong understanding of Snowflake architecture, features, and best practices. Experience in using Snowpipe and Snowpark/Streamlit, as well as familiarity with cloud platforms such as AWS, Azure, or GCP and other cloud-based data technologies, will be beneficial. Knowledge of data modeling concepts like star schema, snowflake schema, and data partitioning is essential. Experience with tools like dbt, Matillion, or Airbyte for data transformation and automation is preferred, along with familiarity with Snowflake's Time Travel, Streams, and Tasks features. Proficiency in data pipeline orchestration using tools like Airflow or Prefect, as well as scripting and automation skills in Python or Java, are required. Additionally, experience with data visualization tools like Tableau, Power BI, QlikView/QlikSense, or Looker will be advantageous.
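Star-schema modeling and "partitioning" on Snowflake, as required above, usually translate into dimension and fact tables plus a clustering key on the fact table, since Snowflake manages micro-partitions automatically rather than exposing user-defined partitions. A hypothetical fragment (all names invented):

```sql
-- Dimension table; note that PK/FK constraints are informational in Snowflake, not enforced
CREATE OR REPLACE TABLE dim_customer (
    customer_key  NUMBER AUTOINCREMENT PRIMARY KEY,
    customer_id   VARCHAR,
    customer_name VARCHAR,
    region        VARCHAR
);

-- Fact table clustered on the date column to help micro-partition pruning
CREATE OR REPLACE TABLE fct_sales (
    sale_id      NUMBER,
    customer_key NUMBER REFERENCES dim_customer(customer_key),
    sale_date    DATE,
    amount       NUMBER(12,2)
)
CLUSTER BY (sale_date);
```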

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a skilled Snowflake Data Engineer with over 4 years of experience, you will be responsible for designing and implementing scalable data solutions using Snowflake as a cloud data platform. Your expertise in data modelling, ETL processes, and performance optimization will be crucial for the success of our projects.

Your key responsibilities will include developing and optimizing data pipelines on Snowflake, managing databases and schemas, and integrating data from various sources using ETL/ELT tools like Talend, Informatica, or Matillion. You will also be expected to monitor and optimize query performance, design data models based on business requirements, and collaborate with data analysts and other stakeholders to deliver effective data solutions.

To excel in this role, you must have at least 2 years of experience working specifically with Snowflake, strong SQL skills, and knowledge of data warehousing concepts. Your familiarity with cloud platforms like AWS, Azure, or GCP, hands-on experience with Snowflake utilities, and proficiency in ETL tools and scripting languages will be essential.

Preferred qualifications for this position include SnowPro Certification, experience in Agile/Scrum development methodologies, and knowledge of data governance and compliance standards such as GDPR and HIPAA. Strong problem-solving, analytical, communication, and teamwork skills are also required to succeed in this role.

If you are passionate about data engineering, have a proven track record in Snowflake development, and possess the technical and soft skills necessary to thrive in a collaborative environment, we encourage you to apply for this full-time, permanent position with us. The work schedule is day shift, Monday to Friday, and the location is remote.

Kindly respond to the following application questions:
1. Do you have SnowPro Certification or equivalent credentials?
2. How many years of experience do you have in Agile/Scrum development methodologies?
3. How many years of experience do you have as a Snowflake Data Engineer?
4. What is your current monthly salary?
5. What is your expected salary?
6. What is your notice period?

Join us in revolutionizing data solutions and making a real impact in the industry!

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

punjab

On-site

As a Snowflake Data Engineer in Sydney, your main responsibility will be to design and implement cloud data platforms and cloud-related architectures. You will be utilizing Snowflake's native capabilities to design solutions and collaborating within a project team to solve complex problems. Your role will also involve providing thought leadership, strategy, and direction for the Snowflake practice. The ideal candidate for this position must have hands-on experience in Snowflake, including proficiency in SnowSQL and Snowflake stored procedures. Additionally, experience in data modeling, both logical and physical, is required. You should also possess advanced working knowledge of SQL and have experience working with relational databases, including query authoring (SQL) and familiarity with various databases. If you are passionate about working with data and cloud technologies, this role offers an exciting opportunity to contribute to the development of cutting-edge data solutions using Snowflake.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You are invited to apply for the position of Snowflake Developer with a minimum of 3+ years of experience. This is an immediate-joiner position based onsite in Navi Mumbai. As a Snowflake Developer, you will be responsible for handling SnowSQL, SQL, and Snowflake-related tasks. If you are interested in this role, please send your resume to abdul.sabur@programming.com or directly message Diksha Sharma, Saloni Rai, or Abdul Sabur for further consideration. We look forward to hearing from you and potentially welcoming you to our team in Navi Mumbai.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

kolkata, ahmedabad, bengaluru

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional / nonfunctional requirements to systems requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills: Technology -> Data on Cloud - DataStore -> Snowflake

Posted 2 weeks ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

kolkata, pune, bengaluru

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional / nonfunctional requirements to systems requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake

Preferred Skills: Technology -> Data on Cloud - DataStore -> Snowflake

Posted 2 weeks ago

Apply