Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Key Responsibilities:
Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines.
Design and implement pipelines adhering to 2NF/3NF normalization standards.
Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs.
Raise CAB requests via Carrier's change process and manage production deployments.
Provide UAT support and ensure smooth transition of finalized pipelines to support teams.
Create and maintain comprehensive technical documentation for traceability and handover.
Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration.
Optimize complex SQL queries, perform performance tuning, and ensure DataOps best practices.
Requirements:
Strong hands-on experience with Snowflake
Expert-level SQL skills and a deep understanding of data transformation
Solid grasp of data architecture and 2NF/3NF normalization techniques
Experience with cloud-based data platforms and modern data pipeline design
Exposure to AWS data services such as S3, Glue, Lambda, and Step Functions (preferred)
Proficiency with ETL tools and experience working in Agile environments
Familiarity with the Carrier CAB process or similar structured deployment frameworks
Proven ability to debug complex pipeline issues and enhance pipeline scalability
Strong communication and collaboration skills
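The 2NF/3NF requirement above boils down to removing repeated attribute groups from flat source extracts. A minimal sketch in plain Python (the table and column names are hypothetical, purely for illustration):

```python
# Hypothetical denormalized ERP extract: customer details repeat on every order row.
orders_flat = [
    {"order_id": 1, "customer_id": "C1", "customer_name": "Acme", "city": "Pune", "amount": 120.0},
    {"order_id": 2, "customer_id": "C1", "customer_name": "Acme", "city": "Pune", "amount": 75.5},
    {"order_id": 3, "customer_id": "C2", "customer_name": "Globex", "city": "Chennai", "amount": 300.0},
]

# 3NF: customer_name and city depend only on customer_id, not on the order,
# so they move to a customers table; orders keeps just the foreign key.
customers = {}
orders = []
for row in orders_flat:
    customers[row["customer_id"]] = {"customer_name": row["customer_name"], "city": row["city"]}
    orders.append({"order_id": row["order_id"], "customer_id": row["customer_id"], "amount": row["amount"]})

print(len(customers))  # 2 distinct customers
print(orders[0])       # order row with no repeated customer attributes
```

In a real pipeline the same split would be expressed as `CREATE TABLE ... AS SELECT DISTINCT` statements in Snowflake SQL rather than Python.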
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Chennai
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!
Job Description:
Exp: 5-12 yrs
Location: Chennai
Skill: Snowflake Developer
Desired Skill Sets:
Strong experience in Snowflake
Strong experience in AWS and Python
Experience in ETL tools such as Ab Initio; experience with Teradata
Interested candidates can share resumes to sangeetha.spstaffing@gmail.com with the details below inline:
Full Name as per PAN:
Mobile No:
Alt No/WhatsApp No:
Total Exp:
Relevant Exp in Snowflake Development:
Rel Exp in AWS:
Rel Exp in Python/Ab Initio/Teradata:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN Number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for F2F interview on 14th June, Saturday, between 9 AM and 12 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, and UG is all above 50%?
Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 1 week ago
5.0 - 7.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Posted 1 week ago
4.0 - 7.0 years
7 - 17 Lacs
Gurugram
Hybrid
Job Title: Snowflake Data Engineer
Location: Gurgaon
Notice Period: Immediate to 30 Days
Job Description & Summary:
We are seeking a Snowflake Data Engineer to join our team and enhance our data solutions. The ideal candidate will be responsible for designing and maintaining efficient data structures, optimizing data storage and retrieval within Snowflake, and ensuring data integrity across various data sources. This role involves collaboration with cross-functional teams to deliver high-quality data solutions that support analytical and operational requirements. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities:
Build and maintain robust ETL pipelines to integrate data from multiple sources into Snowflake, ensuring data integrity and consistency.
Develop efficient stored procedures and understand complex business logic to meet system requirements and enhance functionality effectively.
Implement best practices in data security, role-based access control, and data masking within Snowflake to maintain compliance and data governance standards.
Extensive knowledge of data warehousing concepts, strategies, and methodologies.
Ability to work independently with minimal supervision, deliver projects within committed timelines, and excel in ambiguous environments with a strong problem-solving approach.
Capable of performing root cause analysis (RCA) and implementing changes to address issues and improve processes effectively.
Develop comprehensive documentation for data structures, ETL workflows, and system processes to ensure transparency and knowledge sharing within the team.
Experienced in automating and monitoring Snowflake tasks, identifying issues, troubleshooting errors, and implementing fixes to ensure seamless operations.
Mandatory skill sets: Snowflake, SQL (Intermediate/Advanced), Python (Elementary/Intermediate)
Good-to-have skill sets: Azure Data Factory (Elementary/Intermediate), Power BI, JIRA
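The data-masking responsibility above is implemented natively in Snowflake via masking policies; a rough sketch of the idea in plain Python (the role names and masking rule here are hypothetical assumptions, not the client's actual policy):

```python
# Minimal masking sketch: return real PII only to privileged roles.
def mask_email(value: str, role: str) -> str:
    """Return the full value for privileged roles, a masked form otherwise."""
    if role in {"PII_ADMIN", "COMPLIANCE"}:  # hypothetical privileged roles
        return value
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain        # keep first char + domain

print(mask_email("jane.doe@example.com", "ANALYST"))    # j***@example.com
print(mask_email("jane.doe@example.com", "PII_ADMIN"))  # full address
```

In Snowflake itself the same rule would live in a `CREATE MASKING POLICY` statement keyed on `CURRENT_ROLE()`, applied to the column rather than enforced in application code.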
Posted 1 week ago
5.0 - 10.0 years
18 - 22 Lacs
Bengaluru
Hybrid
We are looking for a candidate seasoned in handling data warehousing challenges; someone who enjoys learning new technologies and does not hesitate to bring his/her perspective to the table. We are looking for someone who is enthusiastic about working in a team and can own and deliver long-term projects to completion.
Responsibilities:
• Contribute to the team's vision and articulate strategies to have fundamental impact at our massive scale.
• You will need a product-focused mindset. It is essential for you to understand business requirements and architect systems that will scale and extend to accommodate those needs.
• Diagnose and solve complex problems in distributed systems, develop and document technical solutions, and sequence work to make fast, iterative deliveries and improvements.
• Build and maintain high-performance, fault-tolerant, and scalable distributed systems that can handle our massive scale.
• Provide solid leadership within your own problem space through a data-driven approach, robust software designs, and effective delegation.
• Participate in, or spearhead, design reviews with peers and stakeholders to adopt what's best suited amongst available technologies.
• Review code developed by other developers and provide feedback to ensure best practices (e.g., checking code in, accuracy, testability, and efficiency).
• Automate cloud infrastructure, services, and observability.
• Develop CI/CD pipelines and testing automation (nice to have).
• Establish and uphold best engineering practices through thorough code and design reviews and improved processes and tools.
• Groom junior engineers through mentoring and delegation.
• Drive a culture of trust, respect, and inclusion within your team.
Minimum Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent training, fellowship, or work experience.
• Minimum 5 years of experience curating data and hands-on experience working on ETL/ELT tools.
• Strong overall programming skills; able to write modular, maintainable code, preferably in Python and SQL.
• Strong data warehousing concepts and SQL skills, including an understanding of dimensional modelling and at least one relational database.
• Experience with AWS.
• Exposure to Snowflake and ingesting data into it, or exposure to similar tools.
• Humble, collaborative team player, willing to step up and support your colleagues.
• Effective communication, problem-solving, and interpersonal skills.
• Commitment to growing deeper in the knowledge and understanding of how to improve our existing applications.
Preferred Qualifications:
• Experience with the following tools: DBT, Fivetran, Airflow.
• Knowledge of and experience in Spark, Hadoop 2.0, and its ecosystem.
• Experience with automation frameworks/tools such as Git and Jenkins.
Primary Skills: Snowflake, Python, SQL, DBT
Secondary Skills: Fivetran, Airflow, Git, Jenkins, AWS, SQL DBM
Posted 1 week ago
4.0 - 8.0 years
5 - 15 Lacs
Kolkata
Work from Office
Skills and Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 3+ years of total IT experience, including experience in data warehouse/ETL projects. Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects. Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors. Deep understanding of star and snowflake dimensional modeling. Strong knowledge of data management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Should have hands-on experience in SQL and Spark (PySpark). Experience in building ETL / data warehouse transformation processes. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent ongoing projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail. Required Skills: Snowflake, SQL, ADF
Posted 1 week ago
6.0 - 11.0 years
18 - 22 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Greetings from Primus Global Technology!!! We are hiring for a Snowflake Administrator role with a leading MNC for locations including Bangalore, Chennai & Hyderabad. This is a contract position (6 months to 1 year) with potential for extension based on performance. The selected candidate will be on Primus Global payroll. Experience Required: 6+ years (4.5+ in Snowflake) Salary: 1,50,000 to 1,80,000 per month Contract Duration: 6-12 months (extendable based on performance) Payroll: Primus Global Technology Note: Only candidates with experience as a Snowflake Administrator are eligible for this position. This opening is not for Snowflake Developers. Key Responsibilities: Database Management: Snowflake account/user management, performance tuning, backups Security: Implement RBAC, encryption, and compliance policies Cost Management: Monitor and optimize Snowflake costs ETL & Integration: Support data pipelines and integration with other systems Performance Tuning: Improve query and system performance Support: Troubleshooting and vendor escalation Collaboration: Work with architects and stakeholders, provide system health reports Apply Now! Send your resume to: npandya@primusglobal.com Looking for immediate joiners. Contact: Nidhi P Pandya, Sr. Associate - Talent Acquisition, Primus Global Technology Pvt. Ltd. All the best, job seekers!
Posted 2 weeks ago
4.0 - 6.0 years
7 - 14 Lacs
Udaipur, Kolkata, Jaipur
Hybrid
Senior Data Engineer Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency. Role: Senior Data Engineer Experience: 4-6 Yrs Location: Udaipur, Jaipur, Kolkata Job Description: We are looking for a highly skilled and experienced Data Engineer with 4-6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.
Key Responsibilities: Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python Collaborate with data analysts, data scientists, and product teams to understand data needs Optimize queries and data models for performance and reliability Integrate data from various sources, including APIs, internal databases, and third-party systems Monitor and troubleshoot data pipelines to ensure data quality and integrity Document processes, data flows, and system architecture Participate in code reviews and contribute to a culture of continuous improvement Required Skills: 4-6 years of experience in data engineering, data architecture, or backend development with a focus on data Strong command of SQL for data transformation and performance tuning Experience with Python (e.g., pandas, Spark, ADF) Solid understanding of ETL/ELT processes and data pipeline orchestration Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server) Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes) Basic programming skills Excellent problem-solving skills and a passion for clean, efficient data systems Preferred Skills: Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc. Exposure to enterprise solutions (e.g., Databricks, Synapse) Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop) Background in real-time data streaming and event-driven architectures Understanding of data governance, security, and compliance best practices Prior experience working in an agile development environment Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Visit us: https://kadellabs.com/ https://in.linkedin.com/company/kadel-labs https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
Posted 2 weeks ago
7.0 - 8.0 years
8 - 18 Lacs
Pune
Hybrid
Warm Greetings From Dataceria !!! --------------------------------------------------------------------------------- As a Senior SQL Quality Assurance Tester / Senior ETL Tester (immediate joiners: send your resume to careers@dataceria.com), ------------------------------------------------------------------------------- you will be at the forefront of ensuring the quality and reliability of our data systems. You will play a critical role in analysing raw data, building test frameworks, and validating data products using Python. Collaborating closely with data analytics experts and stakeholders, you will contribute to the stability and functionality of our data pipelines. This role offers an exciting opportunity to work with cutting-edge technologies and make a significant impact on our data engineering processes. Responsibilities: Analyse and organise raw data to meet business needs and objectives. Develop, update, and maintain SQL scripts and test cases as applications and business rules evolve, identifying areas for improvement. Delegate tasks effectively, ensuring timely and accurate completion of deliverables. Partner with stakeholders, including Product, Data, and Design teams, to address technical issues and support data engineering needs. Perform root cause analysis of existing models and propose effective solutions for improvement. Serve as a point of contact for cross-functional teams, ensuring the smooth integration of quality assurance practices into workflows. Demonstrate strong time management skills. Lead and mentor a team of SQL testers and data professionals, fostering a collaborative and high-performing environment. What we're looking for in our applicants: 7+ years of relevant experience in data engineering and testing roles, including team management. Proven experience leading and mentoring teams, with strong organizational and interpersonal skills. Proficiency in SQL testing, with a focus on Snowflake, and experience with Microsoft SQL Server.
Advanced skills in writing complex SQL queries. At least intermediate proficiency in Python programming. Experienced with Python libraries for testing and ensuring data quality. Hands-on experience with the Git version control system (VCS). Working knowledge of cloud computing architecture (Azure DevOps). Experience with data pipeline and workflow management tools like Airflow. Ability to perform root cause analysis of existing models and propose effective solutions. Strong interpersonal skills and a collaborative team player. -------------------------------------------------------------------------------------------------------- Nice to have: 1. ETL testing broader knowledge and experience 2. Confluence 3. Strong SQL queries 4. Data warehouse 5. Snowflake 6. Cloud platform (Azure DevOps) ------------------------------------------------------------------------------------------------------- Joining: Immediate Work location: Pune (hybrid) Open Positions: 1 - Senior SQL Quality Assurance Tester If interested, please share your updated resume to careers@dataceria.com. -------------------------------------------------------------------------------------------------------- Dataceria Software Solutions Pvt Ltd Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/ Email: careers@dataceria.com
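The Python-based data-quality validation this role describes typically reduces to small reusable rule functions run over query results. A minimal stdlib-only sketch (the rows and rule names are hypothetical; a real suite would execute the checks against result sets fetched from Snowflake or SQL Server):

```python
# Row-level data-quality checks of the kind an SQL QA tester automates.
rows = [
    {"id": 1, "email": "a@x.com", "amount": 10.0},
    {"id": 2, "email": "b@y.com", "amount": 25.0},
]

def check_not_null(rows, column):
    """Return rows where the column is missing or empty."""
    return [r for r in rows if r.get(column) in (None, "")]

def check_unique(rows, column):
    """Return rows whose column value duplicates an earlier row."""
    seen, dupes = set(), []
    for r in rows:
        if r[column] in seen:
            dupes.append(r)
        seen.add(r[column])
    return dupes

failures = check_not_null(rows, "email") + check_unique(rows, "id")
print("FAIL" if failures else "PASS")
```

The same pattern plugs naturally into pytest or an Airflow task, with each rule emitting the offending rows for root cause analysis.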
Posted 2 weeks ago
8.0 - 13.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Role & responsibilities 1. Senior Snowflake Developer - Experience: 8+ years. Location: Bangalore - Mahadevapura (hybrid, UK shift; 3 days in office). Notice period: immediate to 15 days. CTC: 37 Lakhs. JD: Summary: ThoughtFocus is looking for a Senior Snowflake Developer for our NYC- and London-based financial services client operating in public/private loans, CLOs, and long/short credit. You will play a pivotal role in the successful delivery of our strategic initiatives and will be responsible for developing solutions using technologies like Snowflake, Coalesce, and Fivetran. Location: Bengaluru, India. Requirements: IT experience of 8+ years, with a minimum of 3+ years as a Snowflake Developer. Design, develop, and optimize Snowflake objects such as databases, schemas, tables, views, and stored procedures. Expertise in Snowflake utilities and features such as SnowSQL, Snowpipe, stages, tables, zero-copy cloning, streams and tasks, Time Travel, data sharing, data governance, and row access policies. Experience migrating data from Azure Cloud to Snowflake, ensuring data integrity, performance optimization, and minimal disruption to business operations. Experience with Snowpipe for continuous loading and unloading of data into Snowflake tables. Experience using the COPY, PUT, LIST, GET, and REMOVE commands. Experience in Azure integration and data loading (batch and bulk loading). Experience creating system roles, custom roles, and role hierarchies in Snowflake. Expertise in masking policies and network policies in Snowflake. Responsible for designing and maintaining ETL pipelines in Coalesce and Fivetran, including extracting data from the MS SQL Server database and transforming it per business requirements. Extensive experience writing complex SQL queries, stored procedures, views, functions, triggers, indexes, and exception handling using MS SQL Server (T-SQL). Effective communication and interpersonal skills.
Ability to influence and collaborate effectively with cross-functional teams. Exceptional problem-solving abilities. Experience working in an agile development environment. Experience working in a fast-paced, dynamic environment. Good to have: some prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity. What's on offer: Competitive and above-market salary. Hybrid work schedule. Opportunity to gain exposure and technology experience in global financial markets. Education: Bachelor's degree in Computer Science / IT / Finance / Economics or equivalent. 2. Please find below the Lead Snowflake JD. Location: Bangalore (UK shift), 3 days work from office. CTC: 45 Lakhs. 13+ years of IT experience and a proven track record of successfully leading a development team to deliver complex SQL and Snowflake projects. • Strong communication and interpersonal skills. • Ability to influence and collaborate effectively with cross-functional teams. • Exceptional problem-solving and decision-making abilities. • Experience working in an agile development environment. • Experience working in a fast-paced, dynamic environment. • Good to have: some prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.
SQL and Snowflake development expertise across all aspects: analysis, understanding the business requirement, taking an optimized approach to developing code, and ensuring data quality in the outputs presented. Advanced SQL to create, optimize, and performance-tune stored procedures and functions. An analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards. 5+ years of experience in MS SQL and Snowflake. 3+ years of experience in teams where SQL outputs were consumed via Power BI / Tableau / SSRS and similar tools. Should be able to define and enforce best practices. Good communication skills, to discuss and deliver requirements effectively with the client.
Posted 2 weeks ago
5.0 - 9.0 years
5 - 9 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Job description Professional & Technical Skills: - Must-Have Skills: Strong experience in Snowflake Data Warehouse. - Good-to-Have Skills: Experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics. - Experience in designing and developing applications using Snowflake Data Warehouse. - Strong understanding of data warehousing concepts and best practices. - Experience working with cross-functional teams to deliver high-quality solutions. - Excellent communication and interpersonal skills.
Posted 2 weeks ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics. --- Key Responsibilities: * Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake. * Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools. * Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching. * Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives. * Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions. * Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring. * Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data. * Ensure data quality, reliability, and governance through testing and validation strategies. * Document data flows, definitions, processes, and architecture. --- Required Skills and Qualifications: * 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems. * 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel. * Strong experience in SQL and performance tuning for complex queries and large datasets. * Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts. * Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.). 
* Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc. * Familiarity with Python or Java or Scala for data manipulation and pipeline development. * Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps. * Knowledge of data governance, data quality, and data security best practices. * Bachelor's degree in Computer Science, Information Systems, or a related field. --- Preferred Qualifications: * Snowflake SnowPro Core Certification or Advanced Architect Certification. * Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake. * Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.). * Knowledge of Data Vault 2.0 or other advanced data modeling methodologies. * Experience with data cataloging and metadata management tools (e.g., Alation, Collibra). * Exposure to machine learning pipelines and data science workflows is a plus.
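The dimensional/star-schema modeling the posting above calls for separates facts (measures keyed by foreign keys) from dimensions (descriptive attributes). A small illustrative sketch in plain Python (the tables, keys, and figures are hypothetical):

```python
# Star-schema sketch: one fact table with foreign keys into dimension tables.
dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Hardware"}}
dim_date = {20240101: {"year": 2024, "month": 1}}

fact_sales = [
    {"date_key": 20240101, "product_key": 1, "qty": 3, "revenue": 30.0},
    {"date_key": 20240101, "product_key": 2, "qty": 1, "revenue": 45.0},
]

# A typical BI query: revenue by category (a join from fact to dimension).
revenue_by_category = {}
for f in fact_sales:
    cat = dim_product[f["product_key"]]["category"]
    revenue_by_category[cat] = revenue_by_category.get(cat, 0.0) + f["revenue"]

print(revenue_by_category)
```

In Snowflake the same aggregation would be a `JOIN` from the fact table to the dimension followed by `GROUP BY category`; the point of the star shape is that such joins stay one hop deep.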
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Pune, Chennai
Hybrid
Snowflake + SQL | 5 to 15 yrs | Pune/Chennai. If shortlisted, the candidate should be available for an F2F interview in Pune or Chennai.
Posted 2 weeks ago
7.0 - 8.0 years
8 - 18 Lacs
Bengaluru
Hybrid
Warm Greetings From Dataceria !!! --------------------------------------------------------------------------------- As a Senior SQL Quality Assurance Tester / Senior ETL Tester, ------------------------------------------------------------------------------- you will be at the forefront of ensuring the quality and reliability of our data systems. You will play a critical role in analysing raw data, building test frameworks, and validating data products using Python. Collaborating closely with data analytics experts and stakeholders, you will contribute to the stability and functionality of our data pipelines. This role offers an exciting opportunity to work with cutting-edge technologies and make a significant impact on our data engineering processes. Responsibilities: Analyse and organise raw data to meet business needs and objectives. Develop, update, and maintain SQL scripts and test cases as applications and business rules evolve, identifying areas for improvement. Delegate tasks effectively, ensuring timely and accurate completion of deliverables. Partner with stakeholders, including Product, Data, and Design teams, to address technical issues and support data engineering needs. Perform root cause analysis of existing models and propose effective solutions for improvement. Serve as a point of contact for cross-functional teams, ensuring the smooth integration of quality assurance practices into workflows. Demonstrate strong time management skills. Lead and mentor a team of SQL testers and data professionals, fostering a collaborative and high-performing environment. What we're looking for in our applicants: 7+ years of relevant experience in data engineering and testing roles, including team management. Proven experience leading and mentoring teams, with strong organizational and interpersonal skills. Proficiency in SQL testing, with a focus on Snowflake, and experience with Microsoft SQL Server. Advanced skills in writing complex SQL queries.
At least intermediate proficiency in Python programming. Experienced with Python libraries for testing and ensuring data quality. Hands-on experience with the Git version control system (VCS). Working knowledge of cloud computing architecture (Azure DevOps). Experience with data pipeline and workflow management tools like Airflow. Ability to perform root cause analysis of existing models and propose effective solutions. Strong interpersonal skills and a collaborative team player. -------------------------------------------------------------------------------------------------------- Nice to have: 1. ETL testing broader knowledge and experience 2. Confluence 3. Strong SQL queries 4. Data warehouse 5. Snowflake 6. Cloud platform (Azure DevOps) ------------------------------------------------------------------------------------------------------- Joining: Immediate Work location: Bangalore (hybrid) Open Positions: 1 - Senior SQL Quality Assurance Tester If interested, please share your updated resume to careers@dataceria.com. Candidates from Chennai may also apply. --------------------------------------------------------------------------------------------------------
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Noida, Bhubaneswar, Gurugram
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer Experience Required: 3 to 10 yrs Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi Required Skills: Snowflake Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer Experience Required: 3 to 10 yrs Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi Required Skills: Snowflake Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).
Posted 3 weeks ago
5.0 - 10.0 years
9 - 19 Lacs
Chennai
Work from Office
Roles and Responsibilities Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake SQL. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time. Develop complex queries to optimize database performance and troubleshoot issues. Implement star schema designs for efficient data modeling and querying. Participate in code reviews to ensure adherence to coding standards.
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Noida, Bhubaneswar, Gurugram
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer Experience Required: 3 to 10 yrs Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi Required Skills: Snowflake Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer Experience Required: 3 to 10 yrs Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi Required Skills: Snowflake Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 3 weeks ago
10.0 - 12.0 years
20 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role: Snowflake + SQL
Location:
Job Description: We are seeking a skilled Snowflake Developer to design, develop, and manage scalable data solutions using the Snowflake cloud data platform. The ideal candidate will have deep experience in data warehousing, SQL development, ETL processes, and cloud data architecture. Understanding of Control-M and Tableau will be an added advantage.
1. Snowflake (Cloud Data Warehouse):
1. Good understanding of the Snowflake ecosystem
2. Good experience in data modeling and dimensional modeling techniques; able to drive technical discussions with IT, business, and architects/data modelers
3. Guide the team and provide technical solutions
4. Prepare technical solutions and architectures as part of project requirements
5. Virtual Warehouse (Compute): good understanding of warehouse creation and management
6. Data Modeling & Storage: strong knowledge of LDM/PDM design
7. Data loading/unloading and data sharing: good knowledge
8. SnowSQL (CLI): expertise and excellent understanding of Snowflake internals and integration
9. Strong hands-on experience with SnowSQL queries, stored procedures, and performance tuning techniques
10. Good knowledge of SnowSQL script preparation for data validation and audits
11. Snowpipe: good knowledge of Snowpipe implementation
12. Expertise and excellent understanding of S3 internal data copy/movement
13. Good knowledge of security and reader/consumer accounts
14. Good knowledge of, and hands-on experience with, query performance tuning techniques
2. SQL Knowledge:
1. Advanced SQL knowledge and hands-on experience writing complex queries using analytical functions
2. Strong knowledge of stored procedures
3. Troubleshooting, problem solving, and performance tuning of SQL queries accessing the data warehouse
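The "analytical functions" requirement refers to SQL window functions. What `SUM(amount) OVER (PARTITION BY acct ORDER BY ts)` computes can be sketched in plain Python (hypothetical account/timestamp/amount data, purely illustrative):

```python
# Running total within each partition, the way a SQL window function computes it.
rows = [("A", 1, 100), ("A", 2, 50), ("B", 1, 70), ("A", 3, 25)]

rows.sort(key=lambda r: (r[0], r[1]))        # PARTITION BY acct ORDER BY ts
running, totals = {}, []
for acct, ts, amount in rows:
    running[acct] = running.get(acct, 0) + amount   # cumulative sum per account
    totals.append((acct, ts, running[acct]))

print(totals)  # each row keeps its identity but gains a running total
```

Unlike `GROUP BY`, the window form returns one output row per input row, which is why analytical functions are the tool of choice for running totals, rankings, and deduplication in complex queries.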
Posted 3 weeks ago
3 - 8 years
15 - 25 Lacs
Bhubaneshwar, Bengaluru, Hyderabad
Hybrid
Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 10 yrs. Work location: Bangalore/Bhubaneswar/Hyderabad. Required skills: Snowflake developer, Snowpipe, SQL. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 1 month ago
3 - 8 years
15 - 25 Lacs
Bengaluru, Hyderabad, Noida
Hybrid
Warm greetings from SP Staffing! Role: Snowflake Developer. Experience required: 3 to 10 yrs. Work location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi. Required skills: Snowflake developer, Snowpipe. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 1 month ago
5 - 10 years
0 Lacs
Mysore, Bengaluru, Kochi
Hybrid
Open & Direct Walk-in Drive | Hexaware Technologies - Snowflake & Python Data Engineer/Architect in Bangalore, Karnataka on 12th April (Saturday) 2025 - Snowflake/Python/SQL & PySpark
Dear Candidate,
We are thrilled to announce an exciting opportunity for talented professionals to join our team as a Data Engineer/Architect. We are hosting an open walk-in drive in Bangalore, Karnataka on 12th April (Saturday) 2025, and we believe your skills in Snowflake/Snowpark/Python/SQL & PySpark align with what we are seeking.
Details of the walk-in drive:
Date: 12th April (Saturday) 2025
Experience: 4 to 12 years
Time: 9.00 AM to 5.00 PM
Venue: Hotel Grand Mercure Bangalore, 12th Main Rd, 3rd Block, Koramangala, Bengaluru, Karnataka 560034
Point of contact: Azhagu Kumaran Mohan / +91-9789518386
Work location: open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)
Key skills and experience: as a Data Engineer, we are looking for candidates with expertise in Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, and DWH.
Roles and responsibilities:
4 to 15 years of total IT experience with any ETL/Snowflake cloud tool
Minimum 3 years of experience in Snowflake
Minimum 3 years of experience querying and processing data using Python
Strong SQL, with experience using analytical functions, materialized views, and stored procedures
Experience with Snowflake data-loading features such as stages, streams, tasks, and Snowpipe
Working knowledge of processing semi-structured data
What to bring: updated resume, photo ID, and a passport-size photo.
How to register: to express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.
This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you. Note: candidates with less than 4 years of total experience will not be shortlisted to attend the interview.
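The responsibilities above mention processing semi-structured data, which in Snowflake typically means storing JSON in a VARIANT column and exploding nested arrays with LATERAL FLATTEN. As a hedged, stand-alone illustration (plain Python, hypothetical field names, not Snowflake's actual API), the row-per-array-element shape that FLATTEN produces looks like this:

```python
import json

def flatten(record, array_field):
    """Explode one nested array into one row per element,
    roughly what Snowflake's LATERAL FLATTEN does for a VARIANT column."""
    base = {k: v for k, v in record.items() if k != array_field}
    return [{**base, array_field: item} for item in record.get(array_field, [])]

# A nested JSON document: one order containing two line items.
raw = json.loads('{"order_id": 7, "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]}')
rows = flatten(raw, "items")
```

Each output row repeats the parent fields (`order_id`) alongside one array element, which is exactly the relational shape downstream SQL expects.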
Posted 2 months ago
5 - 10 years
0 Lacs
Pune, Nagpur, Mumbai (All Areas)
Hybrid
Open & Direct Walk-in Drive | Hexaware Technologies - Snowflake & Snowpark Data Engineer/Architect in Pune, Maharashtra on 5th April (Saturday) 2025 - Snowflake/Snowpark/SQL & PySpark
Dear Candidate,
We are thrilled to announce an exciting opportunity for talented professionals to join our team as a Data Engineer/Architect. We are hosting an open walk-in drive in Pune, Maharashtra on 5th April (Saturday) 2025, and we believe your skills in Snowflake/Snowpark/Python/SQL & PySpark align with what we are seeking.
Details of the walk-in drive:
Date: 5th April (Saturday) 2025
Experience: 4 to 12 years
Time: 9.00 AM to 5.00 PM
Venue: Hexaware Technologies Limited, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pimpri-Chinchwad, Pune, Maharashtra 411057
Point of contact: Azhagu Kumaran Mohan / +91-9789518386
Work location: open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)
Key skills and experience: as a Data Engineer, we are looking for candidates with expertise in Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, and DWH.
Roles and responsibilities:
4 to 15 years of total IT experience with any ETL/Snowflake cloud tool
Minimum 3 years of experience in Snowflake
Minimum 3 years of experience querying and processing data using Python
Strong SQL, with experience using analytical functions, materialized views, and stored procedures
Experience with Snowflake data-loading features such as stages, streams, tasks, and Snowpipe
Working knowledge of processing semi-structured data
What to bring: updated resume, photo ID, and a passport-size photo.
How to register: to express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.
This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you. Note: candidates with less than 4 years of total experience will not be shortlisted to attend the interview.
Posted 2 months ago
2 - 7 years
6 - 16 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Exciting Snowflake Developer job opportunity at Infosys! We are looking for skilled Snowflake developers to join our dynamic team PAN India. If you have a passion for technology and 2 to 9 years of hands-on experience in Snowflake application development, this is your chance to make an impact. At Infosys, we value innovation, collaboration, and diversity. We believe a diverse workforce drives creativity and fosters a richer company culture, so we strongly encourage applications from all genders and backgrounds. Ready to take your career to the next level? Join us in shaping the future of technology. Visit our careers page for details on how to apply.
Posted 2 months ago