
197 Snowpipe Jobs - Page 7

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 10.0 years

15 - 25 Lacs

Chennai

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Exp: 5-12 yrs
Location: Chennai
Skill: Snowflake Developer

Desired Skill Sets:
- Strong experience in Snowflake
- Strong experience in AWS and Python
- Experience in ETL tools like Ab Initio, Teradata

Interested candidates can share resumes to sangeetha.spstaffing@gmail.com with the below inline details:
Full Name as per PAN:
Mobile No:
Alt No/WhatsApp No:
Total Exp:
Relevant Exp in Snowflake Development:
Rel Exp in AWS:
Rel Exp in Python/Ab Initio/Teradata:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN Number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for F2F interview on 14th June, Saturday, between 9 AM-12 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 3 months ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

About the Opportunity
Job Type: Application
Closing date: 23 June 2025
Title: Expert Engineer
Department: GPS Technology
Location: Gurugram, India
Reports To: Project Manager
Level: Grade 4

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About your team
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.

About your role
An expert engineer is a seasoned technology expert with highly developed programming, engineering and problem-solving skills, able to deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional and operational requirements most of the time without defects or incidents. So, if a relentless focus on technical and engineering excellence, along with adding value to the business, excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, we are looking for you. You will understand system requirements, and analyse, design, develop and test application systems following the defined standards. The candidate is expected to display professional ethics in their approach to work and exhibit a high level of ownership within a demanding working environment.

About you
Essential Skills
- Excellent software design, programming, engineering, and problem-solving skills.
- Strong experience working on data ingestion, transformation and distribution using AWS or Snowflake.
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion/dbt.
- Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB, VPCs.
- Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality).
- Experience designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic.
- Designing data ingestion and orchestration pipelines using AWS, Control-M.
- Establishing strategies for data extraction, ingestion, transformation, automation, and consumption.
- Experience with data lake concepts covering structured, semi-structured and unstructured data.
- Experience creating CI/CD processes for Snowflake.
- Experience with strategies for data testing, data quality, code quality, and code coverage.
- Ability, willingness and openness to experiment with, evaluate and adopt new technologies.
- Passion for technology, problem solving and teamwork.
- Go-getter, with the ability to navigate across roles, functions and business units to collaborate and drive agreements and changes from drawing board to live systems.
- Lifelong learner who can bring contemporary practices, technologies and ways of working to the organisation.
- Effective collaborator adept at using all effective modes of communication and collaboration tools.
- Experience delivering on data-related non-functional requirements, such as:
  - Hands-on experience dealing with large volumes of historical data across markets/geographies.
  - Manipulating, processing, and extracting value from large, disconnected datasets.
  - Building water-tight data quality gates on investment management data.
  - Generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors, etc.

Experience and Qualification
- B.E./B.Tech. or M.C.A. in Computer Science from a reputed university.
- Total 7 to 10 years of relevant experience.

Personal Characteristics
- Good interpersonal and communication skills; strong team player.
- Ability to work at a strategic and tactical level.
- Ability to convey strong messages in a polite but firm manner.
- Self-motivation is essential; should demonstrate commitment to high-quality design and development.
- Ability to develop and maintain working relationships with several stakeholders.
- Flexibility and an open attitude to change.
- Problem-solving skills with the ability to think laterally, and to think with a medium- and long-term perspective.
- Ability to learn and quickly get familiar with a complex business and technology environment.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.

Posted 3 months ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Kolkata

Work from Office

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience.
- Must have 3+ years of total IT experience, with experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience building ETL/data warehouse transformation processes.
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Required Skills: Snowflake, SQL, ADF
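For readers unfamiliar with the Snowpipe utility the listing above asks for, a minimal auto-ingest setup looks roughly like the sketch below. All object names and the S3 URL are illustrative, not taken from the posting; it assumes the cloud storage integration and event notifications are already wired up.

```sql
-- Hypothetical stage over external cloud storage.
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/events/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

CREATE OR REPLACE TABLE raw_events (
  event_id NUMBER,
  event_ts TIMESTAMP_NTZ,
  payload  STRING
);

-- Snowpipe: continuously loads files as they land in the stage.
-- AUTO_INGEST relies on S3 event notifications pointed at the pipe's queue.
CREATE OR REPLACE PIPE raw_events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events
  FROM @raw_stage;
```

Once the pipe exists, new files dropped into the bucket are loaded without a manually scheduled COPY job.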

Posted 3 months ago

Apply

2.0 - 7.0 years

2 - 7 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role is to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development and deployment. You will review proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing change using multiple communication mechanisms. You will also coach the team and create a vision for it, provide subject matter training in your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You will be a key contributor to unit-level and organizational initiatives aimed at providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Educational Requirements: B.Tech/BE, M.Tech/ME, B.Sc/M.Sc, BCA/MCA
Location: PAN India

Posted 3 months ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Responsibilities
A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation and support activities, so that our clients are satisfied with high levels of service in the technology domain.
- You will gather requirements and specifications to understand client needs in detail and translate them into system requirements.
- You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to technology leads and project managers.
- You will be a key contributor to building efficient programs/systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
- Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/non-functional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech
Location: PAN India

Posted 3 months ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About the Role:
We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The candidate should have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.

Key Responsibilities:
- Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
- Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
- Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance.
- Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Time Travel, and Zero-Copy Cloning.
- Implement data ingestion strategies using Snowpipe, stages, and external tables.
- Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
- Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
- Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
- Suggest improvements and automation opportunities during migration.
- Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
- Maintain version control, documentation, and an audit trail for all converted workflows.

Required Skills:
- Bachelor's or master's degree in Computer Science, Information Technology, Data Science, or a related field.
- At least 4 years of hands-on experience designing and developing scalable data solutions using the Snowflake Data Cloud platform.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
- Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
- Python programming experience focused on data engineering.
- Experience with data APIs and batch/stream processing.
- Solid understanding of data transformation logic such as joins, unions, filters, formulas, aggregations, pivots, and transpositions.
- Experience in performance tuning and optimization of SQL queries in Snowflake.
- Familiarity with Snowflake features like CTEs, Window Functions, Tasks, Streams, Stages, and External Tables.
- Exposure to migration or modernization projects from ETL tools (like Alteryx/Informatica) to SQL-based cloud platforms.
- Strong documentation skills and attention to detail.
- Experience working in Agile/Scrum development environments.
- Good communication and collaboration skills.
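The Streams and Tasks features this posting names combine into an incremental-processing pattern. As a hedged sketch (warehouse, table, and column names are all invented for illustration), change capture plus scheduled merging looks like this:

```sql
-- Stream: tracks inserts/updates/deletes on the source table
-- since the stream was last consumed.
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Task: runs on a schedule, but only does work when the stream
-- actually contains new change records.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO orders_dim d
  USING orders_stream s ON d.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET d.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);

-- Tasks are created suspended; they must be resumed to start running.
ALTER TASK merge_orders_task RESUME;
```

Consuming the stream inside the MERGE advances its offset, so each task run processes only the changes that arrived since the previous run.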

Posted 3 months ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Job Information
Job Opening ID: ZR_2393_JOB
Date Opened: 09/11/2024
Industry: IT Services
Work Experience: 6-10 years
Job Title: Snowflake Engineer - Database Administration
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Locations: Pune, Bangalore, Hyderabad, Indore
Contract duration: 6 months

Responsibilities
- Must have experience working as a Snowflake admin/developer in data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling).
- Experience with data security, data access controls, and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, database.
- Job conductor, scheduler and monitoring.
- Git repository: creating users and roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on the requirement.
- Strong written communication skills; effective and persuasive in both written and oral communication.
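Three of the admin-side features this listing expects (resource monitors, zero-copy clone, time travel) can be sketched in a few statements. Warehouse, database, and table names here are hypothetical, and the quota is an arbitrary example:

```sql
-- Resource monitor: suspend the warehouse once 100 credits are used this month.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;

-- Zero-copy clone: an instant copy that shares storage with the source
-- until either side changes data. Useful for dev/test environments.
CREATE DATABASE dev_db CLONE prod_db;

-- Time Travel: query a table as of an earlier point, or recover a drop.
SELECT * FROM orders AT (OFFSET => -60*60);  -- table state one hour ago
UNDROP TABLE orders;                          -- restore within the retention window
```

The retention window for Time Travel depends on the table's DATA_RETENTION_TIME_IN_DAYS setting and the account edition.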

Posted 3 months ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Job Information
Job Opening ID: ZR_2384_JOB
Date Opened: 23/10/2024
Industry: IT Services
Work Experience: 6-10 years
Job Title: Snowflake DBA
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Contract duration: 6 months
Locations: Pune, Bangalore, Hyderabad, Indore

Responsibilities
- Must have experience working as a Snowflake admin/developer in data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling).
- Experience with data security, data access controls, and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, database.
- Job conductor, scheduler and monitoring.
- Git repository: creating users and roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on the requirement.
- Strong written communication skills; effective and persuasive in both written and oral communication.

Posted 3 months ago

Apply

5.0 - 10.0 years

17 - 30 Lacs

Pune, Bengaluru

Hybrid

Key Responsibilities:
- Data modeling and design: Create data models and designs for data warehousing solutions.
- ETL/ELT development: Develop ETL/ELT pipelines using Snowflake's data loading and transformation capabilities.
- Data quality and integrity: Ensure data quality and integrity by implementing data validation, data cleansing, and data normalization techniques.
- Team leadership: Lead a team of developers, provide technical guidance, and ensure timely delivery of projects.
- Collaboration with stakeholders: Collaborate with stakeholders to understand business requirements, provide technical solutions, and ensure that solutions meet business needs.
- Performance optimization: Optimize performance of data warehousing solutions by implementing best practices, indexing, and caching.
- Security and governance: Ensure security and governance of data warehousing solutions by implementing access controls, auditing, and data masking.

Requirements:
- 5+ years of experience in data warehousing, ETL/ELT development, and data modeling.
- Snowflake experience: Strong experience in Snowflake, including data loading, data transformation, and data querying.
- Data modeling skills: Strong data modeling skills, including experience with tools such as Erwin or PowerDesigner.
- ETL/ELT development skills: Strong ETL/ELT development skills, including experience with tools such as Informatica or Talend.
- Leadership skills: Strong leadership skills, including experience leading teams and managing projects.
- Communication skills: Strong communication skills, including experience collaborating with stakeholders and communicating technical solutions.
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 3 months ago

Apply

5.0 - 10.0 years

0 - 1 Lacs

Ahmedabad, Chennai, Bengaluru

Hybrid

Job Summary:
We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics.

Key Responsibilities:
* Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake.
* Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools.
* Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching.
* Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives.
* Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions.
* Manage Snowflake environments, including security (roles, users, privileges), performance tuning, and resource monitoring.
* Integrate data from multiple sources, including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data.
* Ensure data quality, reliability, and governance through testing and validation strategies.
* Document data flows, definitions, processes, and architecture.

Required Skills and Qualifications:
* 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems.
* 2+ years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel.
* Strong experience in SQL and performance tuning for complex queries and large datasets.
* Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts.
* Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.).
* Experience with cloud platforms (AWS, Azure, or GCP), particularly services like S3, Redshift, Lambda, Azure Data Factory, etc.
* Familiarity with Python, Java, or Scala for data manipulation and pipeline development.
* Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps.
* Knowledge of data governance, data quality, and data security best practices.
* Bachelor's degree in Computer Science, Information Systems, or a related field.

Preferred Qualifications:
* Snowflake SnowPro Core Certification or Advanced Architect Certification.
* Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake.
* Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.).
* Knowledge of Data Vault 2.0 or other advanced data modeling methodologies.
* Experience with data cataloging and metadata management tools (e.g., Alation, Collibra).
* Exposure to machine learning pipelines and data science workflows is a plus.
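Managing Snowflake security with roles, users, and privileges, as this listing requires, typically follows a role-based access control pattern. A minimal sketch, with all role, user, and schema names invented for illustration:

```sql
-- Access role that owns read privileges on the analytics marts.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_role;
-- FUTURE grant: tables created later in the schema are covered automatically.
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_role;

-- Grant the role to a user rather than granting privileges directly.
CREATE USER IF NOT EXISTS jdoe DEFAULT_ROLE = analyst_role;
GRANT ROLE analyst_role TO USER jdoe;
```

Granting privileges only to roles, never directly to users, keeps access auditable and makes onboarding/offboarding a single GRANT or REVOKE.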

Posted 3 months ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Warm greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 3 months ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).

Posted 3 months ago

Apply

4.0 - 9.0 years

9 - 18 Lacs

Bengaluru

Hybrid

About Us
Shravas Technologies, founded in 2016, is an IT services company based in Bangalore, India. The company specializes in software QA and related services such as data mining, analytics, and visualization.

Job Title: Snowflake Developer (4 to 9 Years Experience)
Location: Bangalore
Type: Full-time, Hybrid

Job Summary
We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate should have hands-on expertise in building and optimizing scalable data pipelines and working with Snowflake data warehouse solutions. This role involves working closely with business analysts, data scientists, and other developers to deliver reliable, secure, and high-performance data solutions.

Key Responsibilities
- Design, develop, and implement Snowflake-based data solutions.
- Create and maintain scalable ETL/ELT pipelines using tools such as SQL, Python, dbt, Airflow, or similar.
- Develop data models and schema designs optimized for performance and usability.
- Write and optimize complex SQL queries for data transformation and extraction.
- Integrate Snowflake with other systems like MySQL, SQL Server, AWS (S3, Lambda), Azure, or GCP using APIs or connectors.
- Manage Snowflake security (roles, users, access control).
- Monitor data pipeline performance and troubleshoot issues.
- Participate in code reviews, unit testing, and documentation.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4 to 9 years of experience in data engineering.
- Proficiency in SQL and performance tuning.
- Experience with data pipeline and ETL tools (e.g., Informatica, Talend, dbt, Apache Airflow).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Understanding of data warehousing concepts and best practices.
- Knowledge of version control systems like Git.

Preferred Skills
- Experience with Python or Scala for data processing.
- Familiarity with tools like Stitch, Fivetran, or Matillion.
- Exposure to CI/CD pipelines for data projects.
- Knowledge of data governance and security compliance.
- Understanding of financial and economic data trends is a plus.

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and collaboratively in a team environment.
- Detail-oriented with a strong focus on quality and accuracy.

Reporting To: Lead Data Engineer / Chief Data Officer

Posted 3 months ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Chennai

Work from Office

Roles and Responsibilities
- Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake SQL.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time.
- Develop complex queries to optimize database performance and troubleshoot issues.
- Implement star schema designs for efficient data modeling and querying.
- Participate in code reviews to ensure adherence to coding standards.

Posted 3 months ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Noida, Bhubaneswar, Gurugram

Hybrid

Warm greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 3 months ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 3 months ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Kochi, Chennai, Thiruvananthapuram

Work from Office

Snowflake Data Warehouse Development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing.

Data Pipeline Development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems.

Data Transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.

Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.

Posted 3 months ago

Apply

4 - 8 years

7 - 14 Lacs

Pune

Hybrid

Role: Snowflake Developer
Experience: 4 to 6 years

Key responsibilities:
- Perform development and support activities for the data warehousing domain.
- Understand high-level design and application interface design, and build low-level design.
- Perform application analysis and propose technical solutions for application enhancement, or resolve production issues.
- Perform development and deployment: should be able to code, unit test, and deploy.
- Create necessary documentation for all project deliverable phases.
- Handle production issues (Tier 2 support, weekend on-call rotation) to resolve production issues and ensure SLAs are met.

Technical Skills:
Mandatory
- In-depth knowledge of SQL, Unix, and advanced Unix shell scripting.
- Very clear understanding of Snowflake architecture.
- At least 4 years of hands-on experience with Snowflake: snowsql, copy command, stored procedures, performance tuning, and other advanced features like snowpipe, semi-structured data load, and types of tables.
- Hands-on experience with file transfer mechanisms (NDM, SFTP, Data Router, etc.).
- Knowledge of schedulers like TWS.
- Snowflake certification.

Good to have
- Python.
- Experience loading AVRO and PARQUET files into Snowflake.
- Informatica.

Logistics:
- Location: Magarpatta City, Pune (hybrid, minimum 3 days work from office).
- Shift: 1 PM to 10 PM.
- 2 rounds of interview.
- NP: immediate joiners to 15 days (only candidates serving notice period).
- Excellent communication skills required.

Interested candidates, share your resume at dipti.bhaisare@in.experis.com
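The semi-structured load this role asks about (AVRO/PARQUET into Snowflake) usually lands data as VARIANT and projects typed columns afterwards. A minimal sketch, with all names invented and the stage assumed to already contain Parquet files:

```sql
-- File format and stage for Parquet files (names illustrative).
CREATE OR REPLACE FILE FORMAT parquet_fmt TYPE = PARQUET;
CREATE OR REPLACE STAGE parquet_stage FILE_FORMAT = parquet_fmt;

-- Each Parquet record lands as a single VARIANT value.
CREATE OR REPLACE TABLE trades_raw (v VARIANT);
COPY INTO trades_raw FROM @parquet_stage;

-- Project typed columns out of the VARIANT using path + cast syntax.
SELECT v:symbol::STRING        AS symbol,
       v:price::NUMBER(12, 4)  AS price
FROM trades_raw;
```

The same pattern works for AVRO and JSON by changing the file format TYPE; MATCH_BY_COLUMN_NAME on COPY INTO is an alternative when loading straight into a typed table.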

Posted 4 months ago

Apply

8 - 10 years

10 - 16 Lacs

Hyderabad

Work from Office

Required Skills: 8+ years of overall IT experience, with 4+ years of Snowflake development. Strong experience with Azure Data Platform tools, including Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, and Azure Functions. Proficient in Snowflake architecture, virtual warehouses, Snowpipe, and Streams & Tasks. Solid experience with SQL, Python, and performance tuning techniques. Understanding of data warehousing concepts, dimensional modeling, and metadata management. Experience integrating Snowflake with Power BI, Tableau, or similar BI tools. Familiar with CI/CD tools (Azure DevOps, Git) for pipeline deployments. Key Responsibilities: Design, develop, and optimize Snowflake-based data pipelines and data warehouse solutions. Implement data ingestion processes from diverse sources using Azure Data Factory (ADF), Data Lake, and Event Hub. Develop scalable ELT/ETL workflows with Snowflake and Azure components. Create and optimize SQL scripts, stored procedures, and user-defined functions within Snowflake. Develop and maintain data models (dimensional and normalized) supporting business needs. Implement role-based access controls, data masking, and governance within Snowflake. Monitor performance and troubleshoot issues in Snowflake and Azure data pipelines. Work closely with business analysts, data scientists, and engineering teams to ensure data quality and delivery. Use DevOps practices for version control, CI/CD, and deployment automation.
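The Streams & Tasks pattern this listing asks for can be sketched like this (the warehouse, table, and task names are hypothetical):

```sql
-- Track inserts/updates on a raw table with a stream.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- A task that merges captured changes every five minutes,
-- but only fires when the stream actually has data.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO dim_orders d
  USING orders_stream s
    ON d.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET d.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status)
    VALUES (s.order_id, s.status);

-- Tasks are created suspended; resume to activate.
ALTER TASK merge_orders_task RESUME;
```

Consuming the stream inside the MERGE advances its offset, so each run processes only new changes.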

Posted 4 months ago

Apply

3 - 6 years

5 - 12 Lacs

Hyderabad

Work from Office

Role and Responsibilities: Establish, configure, and manage Git repositories to support version control and collaboration. Develop and troubleshoot procedures, views, and complex PL/SQL queries, ensuring effective integration and functionality within Git environments. Experience with tools like SQL Developer, TOAD, or similar. Develop complex SQL queries, scripts, and stored procedures to support application and reporting needs. Write SQL queries to extract, manipulate, and analyze data from databases. Optimize queries to improve performance and reduce execution time. Create and maintain database tables, views, indexes, and stored procedures. Design, implement, and optimize relational database schemas, tables, and indexes. Create and maintain database triggers, functions, and packages using PL/SQL. Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Comprehensive expertise in SQL, Snowflake, and version control tools such as Git and SVN. Minimum of 3 years of experience in application support and maintenance. Proven ability to communicate complex technical concepts effectively. Demonstrated ability to exhibit client empathy and ensure customer satisfaction with issue resolution. Strong written and verbal communication skills. Adept at presenting intricate technical information in an accessible manner to varied audiences. Ability to thrive in a fast-paced, dynamic environment with high levels of ambiguity. Practical problem-solving skills focused on resolving immediate customer issues while planning for long-term solutions. Highly organized and process-oriented, with a proven track record of driving issue resolution by collaborating across multiple teams. Strong interpersonal skills with a customer-centric approach, maintaining patience and composure under pressure during real-time issue resolution. Working knowledge of DSP/SSP platforms is an added advantage.
Open to work in night shifts in a 24/7 project.
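As a small illustration of the query-development work this role describes (object names are invented for the example), a reporting view built on a window function might look like:

```sql
-- Rank each customer's orders by recency; rn = 1 is the latest order.
CREATE OR REPLACE VIEW recent_orders_v AS
SELECT order_id,
       customer_id,
       order_date,
       ROW_NUMBER() OVER (
         PARTITION BY customer_id
         ORDER BY order_date DESC
       ) AS rn
FROM orders;

-- Latest order per customer.
SELECT * FROM recent_orders_v WHERE rn = 1;
```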

Posted 4 months ago

Apply

3 - 8 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm Greetings from SP Staffing!! Role: Snowflake Developer. Experience Required: 3 to 8 yrs. Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune. Required Skills: Snowflake. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.

Posted 4 months ago

Apply

5 - 10 years

15 - 25 Lacs

Bengaluru

Work from Office

About Client: Hiring for one of the most prestigious multinational corporations! Job Title: Snowflake Developer. Qualification: Any Graduate or above. Relevant Experience: 5 to 10 years. Required Technical Skill Set: Snowflake, Azure Managed Services platforms. Must-Have: Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions). Design and implement Snowflake data warehousing solutions, including data modelling and schema design in Snowflake. Able to source data from APIs, data lakes, and on-premise systems into Snowflake. Process semi-structured data using Snowflake-specific features such as VARIANT and LATERAL FLATTEN. Experience using Snowpipe to load micro-batch data. Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scaling warehouses in/out vs. up/down. Ability to implement data pipelines handling data retention and data redaction use cases. Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks. Strong hands-on experience in migration projects to Snowflake. Deep understanding of cloud-based data platforms and data integration techniques. Skilled in writing efficient SQL queries and optimizing database performance. Ability to develop and implement a real-time data streaming solution using Snowflake. Location: PAN India. CTC Range: 25 LPA - 40 LPA. Notice period: Any. Shift Timing: N/A. Mode of Interview: Virtual. Mode of Work: WFO (Work From Office). Pooja Singh KS, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. pooja.singh@blackwhite.in | www.blackwhite.in
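The VARIANT and LATERAL FLATTEN features called out above can be sketched as follows, assuming a hypothetical raw_events table with a VARIANT payload column holding JSON:

```sql
-- Explode a JSON array of line items stored in a VARIANT column.
SELECT
  e.payload:user_id::STRING AS user_id,
  f.value:sku::STRING       AS sku,
  f.value:qty::NUMBER       AS qty
FROM raw_events e,
     LATERAL FLATTEN(INPUT => e.payload:items) f;
```

Each element of the payload:items array becomes one output row, with the element exposed as f.value.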

Posted 4 months ago

Apply

5 - 7 years

5 - 15 Lacs

Pune, Chennai, Bengaluru

Work from Office

Experience: 5+ years of relevant experience. We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs. Responsibilities: Leadership & Delivery: Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions. Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards. Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment. Contribute to project planning, estimation, and risk management activities. Snowflake Expertise: Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes. Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation. Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed. Ensure data security and implement appropriate access controls within the Snowflake environment. Monitor and optimize the performance of Snowflake queries and data pipelines. Integrate PySpark with Snowflake for data ingestion and processing. Understand and apply PySpark best practices and performance tuning techniques. Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames). ETL & Data Warehousing: Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling, star/snowflake schemas), and data integration techniques.
Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake. Work with both structured and semi-structured data, including JSON and XML. Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake. Requirements Gathering & Clarification: Actively participate in requirement gathering sessions with business users and stakeholders. Translate business requirements into clear and concise technical specifications and design documents. Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs. Validate proposed solutions with users to ensure they meet expectations. Collaboration & Communication: Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems. Communicate effectively with both technical and non-technical stakeholders. Provide regular updates on progress and any potential roadblocks. Best Practices & Continuous Improvement: Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes. Stay up-to-date with the latest Snowflake features and industry trends. Identify opportunities for process improvement and optimization. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake. Strong proficiency in SQL and experience working with large datasets. Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas). Experience in designing and developing ETL or ELT pipelines. Proven ability to gather and document business and technical requirements.
Excellent communication, interpersonal, and problem-solving skills. Snowflake certifications (e.g., SnowPro Core) are a plus.
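Two of the Snowflake utilities listed above, Time Travel and zero-copy Cloning, can be sketched briefly (table names are hypothetical):

```sql
-- Time Travel: query the table as it looked one hour ago.
SELECT COUNT(*) FROM dim_orders AT (OFFSET => -3600);

-- Zero-copy clone: an instant copy for dev/testing that shares
-- storage with the source until either side changes.
CREATE TABLE dim_orders_dev CLONE dim_orders;
```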

Posted 4 months ago

Apply

6 - 11 years

15 - 30 Lacs

Bengaluru

Hybrid

About Client: Hiring for one of the most prestigious multinational corporations! Job Title: Snowflake Developer. Required skills and qualifications: Snowflake, SQL, Python. Qualification: Any Graduate or above. Relevant Experience: 4 to 12 yrs. Location: BLR/HYD/PUNE/BBSR. CTC Range: 10 to 35 LPA. Notice period: Any. Mode of Interview: Virtual. Mode of Work: In Office. Nithyashree, Staffing Analyst - IT Recruiter, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. Nithyashree@blackwhite.in | www.blackwhite.in

Posted 4 months ago

Apply

2 - 6 years

6 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional Requirements: Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake. Preferred Skills: Technology -> Data on Cloud - DataStore -> Snowflake. Additional Responsibilities: Knowledge of design principles and fundamentals of architecture. Understanding of performance engineering. Knowledge of quality processes and estimation techniques. Basic understanding of the project domain. Ability to translate functional/nonfunctional requirements into system requirements. Ability to design and code complex programs. Ability to write test cases and scenarios based on the specifications. Good understanding of SDLC and agile methodologies. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech.

Posted 4 months ago

Apply