7.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Key Responsibilities:
- Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines.
- Design and implement pipelines adhering to 2NF/3NF normalization standards.
- Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
- Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs.
- Raise CAB requests via the Carrier change process and manage production deployments.
- Provide UAT support and ensure smooth transition of finalized pipelines to support teams.
- Create and maintain comprehensive technical documentation for traceability and handover.
- Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration.
- Optimize complex SQL queries, perform performance tuning, and ensure data-ops best practices.
Requirements:
- Strong hands-on experience with Snowflake
- Expert-level SQL skills and deep understanding of data transformation
- Solid grasp of data architecture and 2NF/3NF normalization techniques
- Experience with cloud-based data platforms and modern data pipeline design
- Exposure to AWS data services like S3, Glue, Lambda, Step Functions (preferred)
- Proficiency with ETL tools and working in Agile environments
- Familiarity with the Carrier CAB process or similar structured deployment frameworks
- Proven ability to debug complex pipeline issues and enhance pipeline scalability
- Strong communication and collaboration skills
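The 2NF/3NF requirement above can be sketched in miniature. In this hypothetical denormalized feed, customer_name depends only on customer_id (a transitive dependency that violates 3NF), so it is split into its own table before loading; all table, column, and data values here are invented for illustration, not taken from the posting.

```python
# Hypothetical denormalized feed; customer_name repeats per order.
raw_rows = [
    {"order_id": 1, "customer_id": "C1", "customer_name": "Acme", "item": "bolt", "qty": 5},
    {"order_id": 2, "customer_id": "C1", "customer_name": "Acme", "item": "nut", "qty": 9},
    {"order_id": 3, "customer_id": "C2", "customer_name": "Zenith", "item": "bolt", "qty": 2},
]

def normalize(rows):
    # customer_name depends only on customer_id, so it moves to a
    # separate customers table; orders keeps only the foreign key (3NF).
    customers, orders = {}, []
    for r in rows:
        customers[r["customer_id"]] = {
            "customer_id": r["customer_id"],
            "customer_name": r["customer_name"],
        }
        orders.append({"order_id": r["order_id"], "customer_id": r["customer_id"],
                       "item": r["item"], "qty": r["qty"]})
    return list(customers.values()), orders

customers, orders = normalize(raw_rows)
```

The two resulting row sets map onto two Snowflake tables joined on customer_id.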
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Chennai
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!!
Job Description:
Exp: 5-12 yrs
Location: Chennai
Skill: Snowflake Developer
Desired Skill Sets:
- Strong experience in Snowflake
- Strong experience in AWS and Python
- Experience in ETL tools like Abinitio, Teradata
Interested candidates can share your resume to sangeetha.spstaffing@gmail.com with the below inline details:
Full Name as per PAN:
Mobile No:
Alt No/WhatsApp No:
Total Exp:
Relevant Exp in Snowflake Development:
Rel Exp in AWS:
Rel Exp in Python/Abinitio/Teradata:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for F2F interview on 14th June, Saturday, between 9 AM and 12 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in between your education or career? If so, please mention the duration in months/years:
Posted 1 week ago
5.0 - 7.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Posted 1 week ago
4.0 - 7.0 years
7 - 17 Lacs
Gurugram
Hybrid
Job Title: Snowflake Data Engineer
Location: Gurgaon
Notice Period: Immediate to 30 Days
Job Description & Summary:
We are seeking a Snowflake Data Engineer to join our team and enhance our data solutions. The ideal candidate will be responsible for designing and maintaining efficient data structures, optimizing data storage and retrieval within Snowflake, and ensuring data integrity across various data sources. This role involves collaboration with cross-functional teams to deliver high-quality data solutions that support analytical and operational requirements. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organization in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities:
- Build and maintain robust ETL pipelines to integrate data from multiple sources into Snowflake, ensuring data integrity and consistency.
- Develop efficient stored procedures and understand complex business logic to meet system requirements and enhance functionality effectively.
- Implement best practices in data security, role-based access control, and data masking within Snowflake to maintain compliance and data governance standards.
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Ability to work independently with minimal supervision, deliver projects within committed timelines, and excel in ambiguous environments with a strong problem-solving approach.
- Capable of performing root cause analysis (RCA) and implementing changes to address issues and improve processes effectively.
- Develop comprehensive documentation for data structures, ETL workflows, and system processes to ensure transparency and knowledge sharing within the team.
- Experienced in automating and monitoring Snowflake tasks, identifying issues, troubleshooting errors, and implementing fixes to ensure seamless operations.
Mandatory skill sets: Snowflake, SQL (Intermediate/Advanced), Python (Elementary/Intermediate)
Good to have skill sets: Azure Data Factory (Elementary/Intermediate), Power BI, JIRA
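The data-masking responsibility above boils down to one idea, which a Snowflake masking policy enforces server-side: unprivileged roles see redacted values, privileged roles see the original. Here is a minimal Python sketch of that behavior only; the role names and the email-redaction rule are invented for the example, not Snowflake's API.

```python
# Roles permitted to see unmasked values (illustrative names).
PRIVILEGED_ROLES = {"PII_READER", "ACCOUNTADMIN"}

def mask_email(value: str, current_role: str) -> str:
    """Return the raw email for privileged roles, a redacted form otherwise."""
    if current_role in PRIVILEGED_ROLES:
        return value
    _local, _, domain = value.partition("@")
    return "***@" + domain  # keep the domain, hide the local part

print(mask_email("user@example.com", "ANALYST"))     # redacted
print(mask_email("user@example.com", "PII_READER"))  # unmasked
```

In Snowflake itself the equivalent logic would live in a `CREATE MASKING POLICY` body keyed on `CURRENT_ROLE()`.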
Posted 1 week ago
5.0 - 10.0 years
18 - 22 Lacs
Bengaluru
Hybrid
We are looking for a candidate seasoned in handling Data Warehousing challenges: someone who enjoys learning new technologies and does not hesitate to bring his/her perspective to the table. We are looking for someone who is enthusiastic about working in a team and can own and deliver long-term projects to completion.
Responsibilities:
• Contribute to the team's vision and articulate strategies to have fundamental impact at our massive scale.
• You will need a product-focused mindset. It is essential for you to understand business requirements and architect systems that will scale and extend to accommodate those needs.
• Diagnose and solve complex problems in distributed systems, develop and document technical solutions, and sequence work to make fast, iterative deliveries and improvements.
• Build and maintain high-performance, fault-tolerant, and scalable distributed systems that can handle our massive scale.
• Provide solid leadership within your own problem space through a data-driven approach, robust software designs, and effective delegation.
• Participate in, or spearhead, design reviews with peers and stakeholders to adopt what's best suited amongst available technologies.
• Review code developed by other developers and provide feedback to ensure best practices (e.g., checking code in, accuracy, testability, and efficiency).
• Automate cloud infrastructure, services, and observability.
• Develop CI/CD pipelines and testing automation (nice to have).
• Establish and uphold best engineering practices through thorough code and design reviews and improved processes and tools.
• Groom junior engineers through mentoring and delegation.
• Drive a culture of trust, respect, and inclusion within your team.
Minimum Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent training, fellowship, or work experience.
• Minimum 5 years of experience curating data and hands-on experience working on ETL/ELT tools.
• Strong overall programming skills; able to write modular, maintainable code, preferably in Python and SQL.
• Strong data warehousing concepts and SQL skills; understanding of SQL, dimensional modelling, and at least one relational database.
• Experience with AWS.
• Exposure to Snowflake and ingesting data into it, or exposure to similar tools.
• Humble, collaborative team player, willing to step up and support your colleagues.
• Effective communication, problem-solving, and interpersonal skills.
• Commitment to growing deeper in the knowledge and understanding of how to improve our existing applications.
Preferred Qualifications:
• Experience with the following tools: DBT, Fivetran, Airflow.
• Knowledge of and experience in Spark, Hadoop 2.0, and its ecosystem.
• Experience with automation frameworks/tools like Git, Jenkins.
Primary Skills: Snowflake, Python, SQL, DBT
Secondary Skills: Fivetran, Airflow, Git, Jenkins, AWS, SQL DBM
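The dimensional-modelling and SQL skills the posting asks for reduce to a recurring pattern: join a fact table to its dimensions and aggregate. As a self-contained stand-in for a warehouse query, the sketch below runs the pattern against an in-memory SQLite database via Python's standard library; the schema and values are invented for the example.

```python
import sqlite3

# In-memory database standing in for a warehouse; schema is illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'tools'), (2, 'fasteners');
INSERT INTO fact_sales VALUES (1, 10.0), (1, 15.0), (2, 4.0);
""")

# Classic star-schema rollup: fact joined to dimension, grouped by attribute.
rows = con.execute("""
    SELECT d.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
```

The same statement shape ports to Snowflake SQL with only dialect-level changes.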
Posted 1 week ago
4.0 - 8.0 years
5 - 15 Lacs
Kolkata
Work from Office
Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience.
- Must have 3+ years of total IT experience, with experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL/data warehouse transformation processes.
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging & geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.
Required Skills: Snowflake, SQL, ADF
Posted 1 week ago
6.0 - 11.0 years
18 - 22 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Greetings from Primus Global Technology!!!
We are hiring for a Snowflake Administrator role with a leading MNC for locations including Bangalore, Chennai & Hyderabad. This is a contract position (6 months to 1 year) with potential for extension based on performance. The selected candidate will be on Primus Global payroll.
Experience Required: 6+ years (4.5+ in Snowflake)
Salary: 1,50,000 to 1,80,000 per month
Contract Duration: 6-12 months (extendable based on performance)
Payroll: Primus Global Technology
Note: Only candidates with experience as a Snowflake Administrator are eligible for this position. This opening is not for Snowflake Developers.
Key Responsibilities:
- Database Management: Snowflake account/user management, performance tuning, backups
- Security: Implement RBAC, encryption, and compliance policies
- Cost Management: Monitor and optimize Snowflake costs
- ETL & Integration: Support data pipelines and integration with other systems
- Performance Tuning: Improve query and system performance
- Support: Troubleshooting and vendor escalation
- Collaboration: Work with architects and stakeholders, provide system health reports
Apply Now! Send your resume to npandya@primusglobal.com (looking for immediate joiners).
Contact: Nidhi P Pandya, Sr. Associate - Talent Acquisition, Primus Global Technology Pvt. Ltd.
All the best, job seekers!
Posted 2 weeks ago
4.0 - 6.0 years
7 - 14 Lacs
Udaipur, Kolkata, Jaipur
Hybrid
Senior Data Engineer
Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.
Role: Senior Data Engineer
Experience: 4-6 Yrs
Location: Udaipur, Jaipur, Kolkata
Job Description:
We are looking for a highly skilled and experienced Data Engineer with 4-6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
- Collaborate with data analysts, data scientists, and product teams to understand data needs
- Optimize queries and data models for performance and reliability
- Integrate data from various sources, including APIs, internal databases, and third-party systems
- Monitor and troubleshoot data pipelines to ensure data quality and integrity
- Document processes, data flows, and system architecture
- Participate in code reviews and contribute to a culture of continuous improvement
Required Skills:
- 4-6 years of experience in data engineering, data architecture, or backend development with a focus on data
- Strong command of SQL for data transformation and performance tuning
- Experience with Python (e.g., pandas, Spark, ADF)
- Solid understanding of ETL/ELT processes and data pipeline orchestration
- Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
- Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
- Basic programming skills
- Excellent problem-solving skills and a passion for clean, efficient data systems
Preferred Skills:
- Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
- Exposure to enterprise solutions (e.g., Databricks, Synapse)
- Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
- Background in real-time data streaming and event-driven architectures
- Understanding of data governance, security, and compliance best practices
- Prior experience working in an agile development environment
Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
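The ETL/ELT pipeline shape this role centers on can be shown end to end in a few lines. This is a toy sketch only: the in-memory lists stand in for real sources and a real warehouse, and every name and value is invented for illustration.

```python
def extract():
    # Stand-in for pulling rows from an API or source database.
    return [{"id": 1, "amt": "10.5"}, {"id": 2, "amt": "3.2"}, {"id": 2, "amt": "3.2"}]

def transform(rows):
    # Cast string amounts to floats and de-duplicate on id,
    # keeping the first occurrence of each key.
    seen, out = set(), []
    for r in rows:
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        out.append({"id": r["id"], "amt": float(r["amt"])})
    return out

def load(rows, target):
    # Stand-in for an INSERT/COPY into the warehouse; returns rows loaded.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In a real pipeline each stage would be a separate task so an orchestrator can retry and monitor them independently.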
Posted 2 weeks ago
7.0 - 8.0 years
8 - 18 Lacs
Pune
Hybrid
Warm Greetings From Dataceria!!!
Senior SQL Quality Assurance Tester / Senior ETL Tester
Immediate joiners: send your resume to careers@dataceria.com.
You will be at the forefront of ensuring the quality and reliability of our data systems. You will play a critical role in analysing raw data, building test frameworks, and validating data products using Python. Collaborating closely with data analytics experts and stakeholders, you will contribute to the stability and functionality of our data pipelines. This role offers an exciting opportunity to work with cutting-edge technologies and make a significant impact on our data engineering processes.
Responsibilities:
- Analyse and organise raw data to meet business needs and objectives.
- Develop, update, and maintain SQL scripts and test cases as applications and business rules evolve, identifying areas for improvement.
- Delegate tasks effectively, ensuring timely and accurate completion of deliverables.
- Partner with stakeholders, including Product, Data, and Design teams, to address technical issues and support data engineering needs.
- Perform root cause analysis of existing models and propose effective solutions for improvement.
- Serve as a point of contact for cross-functional teams, ensuring the smooth integration of quality assurance practices into workflows.
- Demonstrate strong time management skills.
- Lead and mentor a team of SQL testers and data professionals, fostering a collaborative and high-performing environment.
What we're looking for in our applicants:
- 7+ years of relevant experience in data engineering and testing roles, including team management.
- Proven experience leading and mentoring teams, with strong organizational and interpersonal skills.
- Proficiency in SQL testing, with a focus on Snowflake, and experience with Microsoft SQL Server.
- Advanced skills in writing complex SQL queries.
- At least intermediate-level proficiency in Python programming.
- Experience with Python libraries for testing and ensuring data quality.
- Hands-on experience with the Git version control system (VCS).
- Working knowledge of cloud computing architecture (Azure ADO).
- Experience with data pipeline and workflow management tools like Airflow.
- Ability to perform root cause analysis of existing models and propose effective solutions.
- Strong interpersonal skills and a collaborative team player.
Nice to have:
1. Broader ETL testing knowledge and experience
2. Confluence
3. Strong SQL queries
4. Data warehouse
5. Snowflake
6. Cloud platform (Azure ADO)
Joining: Immediate
Work location: Pune (hybrid)
Open Positions: 1 - Senior SQL Quality Assurance Tester
If interested, please share your updated resume to careers@dataceria.com.
Dataceria Software Solutions Pvt Ltd
Follow our LinkedIn for more job openings: https://www.linkedin.com/company/dataceria/
Email: careers@dataceria.com
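The validation work described above — checking pipeline output with Python — usually starts with a couple of mechanical rules such as row-count reconciliation and not-null constraints. The helper below is a minimal sketch of that idea; the function name, columns, and sample rows are all invented for the example.

```python
def check_quality(source_rows, target_rows, not_null_cols):
    """Return a list of human-readable data-quality failures (empty = pass)."""
    failures = []
    # Rule 1: source and target row counts must reconcile.
    if len(source_rows) != len(target_rows):
        failures.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Rule 2: mandatory columns must not be NULL in the target.
    for i, row in enumerate(target_rows):
        for col in not_null_cols:
            if row.get(col) is None:
                failures.append(f"row {i}: {col} is NULL")
    return failures

src = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
tgt = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
issues = check_quality(src, tgt, not_null_cols=["id", "name"])
```

In practice each rule would become an assertion in a pytest suite run against query results rather than literals.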
Posted 2 weeks ago
8.0 - 13.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Role & responsibilities

1. Senior Snowflake Developer
Experience: 8+ years
Location: Bangalore - Mahadevapura (Hybrid, UK shift, 3 days in office)
Notice Period: Immediate to 15 days
CTC: 37 Lakhs
Summary:
ThoughtFocus is looking for a Senior Snowflake Developer for our NYC- and London-based financial services client operating in Public/Private Loans, CLOs, and Long/Short Credit. You will play a pivotal role in the successful delivery of our strategic initiatives. You will be responsible for developing solutions using technologies like Snowflake, Coalesce & Fivetran.
Location: Bengaluru, India
Requirements:
- IT experience of 8+ years, with a minimum of 3+ years of experience as a Snowflake Developer.
- Design, develop, and optimize Snowflake objects such as databases, schemas, tables, views, and stored procedures.
- Expertise in Snowflake utilities such as SnowSQL, Snowpipe, Stages, Tables, Zero Copy Clone, Streams and Tasks, Time Travel, data sharing, data governance, and row access policies.
- Experience in migrating data from Azure Cloud to Snowflake, ensuring data integrity, performance optimization, and minimal disruption to business operations.
- Experience with Snowpipe for continuous loading and unloading of data into Snowflake tables.
- Experience using the COPY, PUT, LIST, GET, and REMOVE commands.
- Experience in Azure integration and data loading (batch and bulk loading).
- Experience creating system roles & custom roles and a role hierarchy in Snowflake.
- Expertise in masking policies and network policies in Snowflake.
- Responsible for designing and maintaining ETL tools (Coalesce & Fivetran), including extracting data from the MS SQL Server database and transforming it per business requirements.
- Extensive experience writing complex SQL queries, stored procedures, views, functions, triggers, indexes, and exception handling using MS SQL Server (T-SQL).
- Effective communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have: some prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.
What's on offer:
- Competitive and above-market salary.
- Hybrid work schedule.
- Opportunity to gain exposure and technology experience in global financial markets.
Education: Bachelor's degree in Computer Science / IT / Finance / Economics or equivalent.

2. Lead Snowflake
Location: Bangalore (UK shift, 3 days work from office)
CTC: 45 Lakhs
- 13+ years of IT experience, with a proven track record of successfully leading a development team to deliver complex SQL and Snowflake projects.
- Strong communication and interpersonal skills.
- Ability to influence and collaborate effectively with cross-functional teams.
- Exceptional problem-solving and decision-making abilities.
- Experience working in an agile development environment.
- Experience working in a fast-paced, dynamic environment.
- Good to have: some prior experience with, or a high-level understanding of, hedge funds, private debt, and private equity.
- SQL and Snowflake development expertise across all aspects of analysis: understanding the business requirement, taking an optimized approach to developing code, and ensuring data quality in the outputs presented.
- Advanced SQL to create, optimize, and performance-tune stored procedures and functions.
- Analytical approach to translating data into last-mile SQL objects for consumption in reports and dashboards.
- 5+ years of experience in MS SQL and Snowflake.
- 3+ years of experience in teams where SQL outputs were consumed via Power BI / Tableau / SSRS and similar tools.
- Able to define and enforce best practices.
- Good communication skills to discuss and deliver requirements effectively with the client.
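The role-hierarchy requirement mentioned above rests on one mechanic: a Snowflake role inherits every privilege granted to roles beneath it in the hierarchy. The sketch below models just that inheritance in plain Python; the role names, grants, and privileges are invented for illustration and are not Snowflake defaults.

```python
# Which roles are granted TO each role (parent -> children), and the
# privileges granted directly to each role. All names are illustrative.
grants = {"SYSADMIN": {"ANALYST"}, "ANALYST": {"READER"}, "READER": set()}
privs = {"READER": {"SELECT"}, "ANALYST": {"INSERT"}, "SYSADMIN": {"CREATE TABLE"}}

def effective_privileges(role):
    """Union of a role's own privileges and those of all roles under it."""
    out = set(privs.get(role, set()))
    for child in grants.get(role, set()):
        out |= effective_privileges(child)
    return out
```

In Snowflake the same structure is built with `GRANT ROLE child TO ROLE parent`, and the optimizer resolves the effective set at query time.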
Posted 2 weeks ago
5.0 - 9.0 years
5 - 9 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Job description
Professional & Technical Skills:
- Must-have skills: Strong experience in Snowflake Data Warehouse.
- Good-to-have skills: Experience in other data warehousing technologies such as Redshift, BigQuery, or Azure Synapse Analytics.
- Experience in designing and developing applications using Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and best practices.
- Experience working with cross-functional teams to deliver high-quality solutions.
- Excellent communication and interpersonal skills.
Posted 2 weeks ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary:
We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics.
Key Responsibilities:
* Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake.
* Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools.
* Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching.
* Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives.
* Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions.
* Manage Snowflake environments, including security (roles, users, privileges), performance tuning, and resource monitoring.
* Integrate data from multiple sources, including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data.
* Ensure data quality, reliability, and governance through testing and validation strategies.
* Document data flows, definitions, processes, and architecture.
Required Skills and Qualifications:
* 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems.
* 2+ years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel.
* Strong experience in SQL and performance tuning for complex queries and large datasets.
* Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts.
* Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.).
* Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc.
* Familiarity with Python, Java, or Scala for data manipulation and pipeline development.
* Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps.
* Knowledge of data governance, data quality, and data security best practices.
* Bachelor's degree in Computer Science, Information Systems, or a related field.
Preferred Qualifications:
* Snowflake SnowPro Core Certification or Advanced Architect Certification.
* Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake.
* Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.).
* Knowledge of Data Vault 2.0 or other advanced data modeling methodologies.
* Experience with data cataloging and metadata management tools (e.g., Alation, Collibra).
* Exposure to machine learning pipelines and data science workflows is a plus.
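Pipeline orchestration with tools such as dbt or Apache Airflow, mentioned above, fundamentally means running tasks in dependency order (a DAG). The sketch below shows only that ordering idea using the Python standard library's graphlib (available since Python 3.9); the task names and dependencies are invented for the example.

```python
from graphlib import TopologicalSorter

# Map each task to the tasks it depends on (illustrative names).
deps = {
    "load_raw": set(),
    "stg_orders": {"load_raw"},
    "dim_customer": {"stg_orders"},
    "fact_sales": {"stg_orders", "dim_customer"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(deps).static_order())
```

An orchestrator adds scheduling, retries, and parallelism on top, but the execution order it computes is exactly this topological sort.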
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Pune, Chennai
Hybrid
Snowflake + SQL | 5 to 15 Yrs | Pune/Chennai. If shortlisted, the candidate should be available for a F2F interview in Pune or Chennai.
Posted 2 weeks ago
7.0 - 8.0 years
8 - 18 Lacs
Bengaluru
Hybrid
Warm Greetings From Dataceria!!!
Senior SQL Quality Assurance Tester / Senior ETL Tester
You will be at the forefront of ensuring the quality and reliability of our data systems. You will play a critical role in analysing raw data, building test frameworks, and validating data products using Python. Collaborating closely with data analytics experts and stakeholders, you will contribute to the stability and functionality of our data pipelines. This role offers an exciting opportunity to work with cutting-edge technologies and make a significant impact on our data engineering processes.
Responsibilities:
- Analyse and organise raw data to meet business needs and objectives.
- Develop, update, and maintain SQL scripts and test cases as applications and business rules evolve, identifying areas for improvement.
- Delegate tasks effectively, ensuring timely and accurate completion of deliverables.
- Partner with stakeholders, including Product, Data, and Design teams, to address technical issues and support data engineering needs.
- Perform root cause analysis of existing models and propose effective solutions for improvement.
- Serve as a point of contact for cross-functional teams, ensuring the smooth integration of quality assurance practices into workflows.
- Demonstrate strong time management skills.
- Lead and mentor a team of SQL testers and data professionals, fostering a collaborative and high-performing environment.
What we're looking for in our applicants:
- 7+ years of relevant experience in data engineering and testing roles, including team management.
- Proven experience leading and mentoring teams, with strong organizational and interpersonal skills.
- Proficiency in SQL testing, with a focus on Snowflake, and experience with Microsoft SQL Server.
- Advanced skills in writing complex SQL queries.
- At least intermediate-level proficiency in Python programming.
- Experience with Python libraries for testing and ensuring data quality.
- Hands-on experience with the Git version control system (VCS).
- Working knowledge of cloud computing architecture (Azure ADO).
- Experience with data pipeline and workflow management tools like Airflow.
- Ability to perform root cause analysis of existing models and propose effective solutions.
- Strong interpersonal skills and a collaborative team player.
Nice to have:
1. Broader ETL testing knowledge and experience
2. Confluence
3. Strong SQL queries
4. Data warehouse
5. Snowflake
6. Cloud platform (Azure ADO)
Joining: Immediate
Work location: Bangalore (hybrid); candidates from Chennai may also apply.
Open Positions: 1 - Senior SQL Quality Assurance Tester
If interested, please share your updated resume to careers@dataceria.com.
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Noida, Bhubaneswar, Gurugram
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).
Posted 3 weeks ago
5.0 - 10.0 years
9 - 19 Lacs
Chennai
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake SQL.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time.
- Develop complex queries to optimize database performance and troubleshoot issues.
- Implement star schema designs for efficient data modeling and querying.
- Participate in code reviews to ensure adherence to coding standards.
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Noida, Bhubaneswar, Gurugram
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 3 weeks ago
3.0 - 8.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 3 weeks ago
10.0 - 12.0 years
20 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role: Snowflake + SQL
Location:
Job Description:
We are seeking a skilled Snowflake Developer to design, develop, and manage scalable data solutions using the Snowflake cloud data platform. The ideal candidate will have deep experience in data warehousing, SQL development, ETL processes, and cloud data architecture. Understanding of Control-M and Tableau will be an added advantage.
1. Snowflake (Cloud Data Warehouse):
1. Good understanding of the Snowflake ecosystem.
2. Good experience in data modeling and dimensional modeling techniques; able to drive technical discussions with IT & Business and Architects/Data Modelers.
3. Guide the team and provide technical solutions.
4. Prepare technical solutions and architectures as part of project requirements.
5. Virtual Warehouse (Compute): good understanding of warehouse creation & management.
6. Data Modeling & Storage: strong knowledge of LDM/PDM design.
7. Data Loading/Unloading and Data Sharing: good knowledge required.
8. SnowSQL (CLI): expertise and excellent understanding of Snowflake internals and integration.
9. Strong hands-on experience with SnowSQL queries, stored procedures, and performance tuning techniques.
10. Good knowledge of preparing SnowSQL scripts for data validation and audits.
11. Snowpipe: good knowledge of Snowpipe implementation.
12. Expertise and excellent understanding of S3 internal data copy/movement.
13. Good knowledge of security and reader/consumer accounts.
14. Good knowledge of, and hands-on experience with, query performance tuning implementation techniques.
2. SQL Knowledge:
1. Advanced SQL knowledge and hands-on experience writing complex queries with analytical functions.
2. Strong knowledge of stored procedures.
3. Troubleshooting, problem solving, and performance tuning of SQL queries accessing the data warehouse.
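A common instance of the "complex queries with analytical functions" item above is latest-row-per-key via ROW_NUMBER. The sketch below runs it against an in-memory SQLite database (window functions require SQLite 3.25+, bundled with recent Python builds) as a portable stand-in for Snowflake SQL; the schema and values are invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE txn (acct TEXT, ts INTEGER, amount REAL);
INSERT INTO txn VALUES ('A', 1, 10), ('A', 2, 20), ('B', 1, 5);
""")

# Latest transaction per account: rank rows within each account by
# timestamp descending, then keep rank 1.
latest = con.execute("""
    SELECT acct, amount FROM (
        SELECT acct, amount,
               ROW_NUMBER() OVER (PARTITION BY acct ORDER BY ts DESC) AS rn
        FROM txn
    ) WHERE rn = 1
    ORDER BY acct
""").fetchall()
```

The identical query runs unchanged on Snowflake, which shares this window-function syntax.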
Posted 3 weeks ago
4 - 8 years
7 - 14 Lacs
Pune
Hybrid
Role: Snowflake Developer
Experience: 4 to 6 years
Key responsibilities:
- Perform development and support activities for the data warehousing domain
- Understand high-level design and application interface design, and build low-level design
- Perform application analysis and propose technical solutions for application enhancements or production issues
- Perform development and deployment; should be able to code, unit test, and deploy
- Create necessary documentation for all project deliverable phases
- Handle production issues (Tier 2 support, weekend on-call rotation) to resolve them and ensure SLAs are met
Technical Skills (Mandatory):
- In-depth knowledge of SQL, Unix, and advanced Unix shell scripting
- Very clear understanding of Snowflake architecture
- At least 4 years of hands-on experience with Snowflake: SnowSQL, the COPY command, stored procedures, performance tuning, and other advanced features such as Snowpipe, semi-structured data loading, and table types
- Hands-on experience with file transfer mechanisms (NDM, SFTP, Data Router, etc.)
- Knowledge of schedulers like TWS
- Certification for Snowflake
Good to have:
- Python
- Experience loading AVRO and PARQUET files into Snowflake
- Informatica
Logistics:
- Location: Magarpatta City, Pune; hybrid with a minimum of 3 days work from office
- Shift: 1 PM to 10 PM
- 2 rounds of interview
- Notice period: immediate joiners to 15 days (only candidates serving notice)
- Excellent communication skills
Interested candidates can share resumes at dipti.bhaisare@in.experis.com
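The COPY command and Snowpipe mentioned above can be sketched as follows; the stage, path, and table names are hypothetical, used only to show the two loading styles side by side:

```sql
-- Bulk load staged CSV files into a table with the COPY command
COPY INTO raw_sales
FROM @my_stage/sales/
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'CONTINUE';

-- Continuous micro-batch ingestion of the same files via Snowpipe
CREATE OR REPLACE PIPE sales_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw_sales
FROM @my_stage/sales/
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

The difference is operational: COPY runs on demand against a warehouse you manage, while a pipe with AUTO_INGEST reacts to new-file notifications and bills per-second serverless compute.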
Posted 1 month ago
3 - 6 years
5 - 12 Lacs
Hyderabad
Work from Office
Role and Responsibilities:
- Establish, configure, and manage Git repositories to support version control and collaboration.
- Develop and troubleshoot procedures, views, and complex PL/SQL queries, ensuring effective integration and functionality within Git environments.
- Experience with tools like SQL Developer, TOAD, or similar.
- Develop complex SQL queries, scripts, and stored procedures to support application and reporting needs.
- Write SQL queries to extract, manipulate, and analyze data from databases.
- Optimize queries to improve performance and reduce execution time.
- Create and maintain database tables, views, indexes, and stored procedures.
- Design, implement, and optimize relational database schemas, tables, and indexes.
- Create and maintain database triggers, functions, and packages using PL/SQL.
Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Comprehensive expertise in SQL, Snowflake, and version control tools such as Git and SVN.
- Minimum of 3 years of experience in application support and maintenance.
- Proven ability to communicate complex technical concepts effectively.
- Demonstrated client empathy and a track record of ensuring customer satisfaction with issue resolution.
- Strong written and verbal communication skills; adept at presenting intricate technical information to varied audiences.
- Ability to thrive in a fast-paced, dynamic environment with high levels of ambiguity.
- Practical problem-solving skills focused on resolving immediate customer issues while planning long-term solutions.
- Highly organized and process-oriented, with a proven track record of driving issue resolution by collaborating across multiple teams.
- Strong interpersonal skills with a customer-centric approach, maintaining patience and composure under pressure during real-time issue resolution.
- Working knowledge of DSP/SSP platforms is an added advantage.
- Open to working night shifts in a 24/7 project.
Posted 1 month ago
3 - 8 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 8 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843
Posted 1 month ago
5 - 10 years
15 - 25 Lacs
Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!
Job Title: Snowflake Developer
Qualification: Any Graduate or above
Relevant Experience: 5 to 10 years
Required Technical Skill Set: Snowflake, Azure managed services platforms
Must-Have:
- Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions)
- Design and implement Snowflake data warehousing solutions, including data modeling and schema design
- Able to source data from APIs, data lakes, and on-premise systems into Snowflake
- Process semi-structured data using Snowflake-specific features such as VARIANT and LATERAL FLATTEN
- Experience using Snowpipe to load micro-batch data
- Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scale in/out vs. scale up/down of warehouses
- Ability to implement data pipelines that handle data retention and data redaction use cases
- Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks
- Strong hands-on experience in migration projects to Snowflake
- Deep understanding of cloud-based data platforms and data integration techniques
- Skilled in writing efficient SQL queries and optimizing database performance
- Ability to develop and implement a real-time data streaming solution using Snowflake
Location: PAN India
CTC Range: 25 LPA - 40 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work From Office)
Pooja Singh KS
IT Staffing Analyst
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, India
pooja.singh@blackwhite.in | www.blackwhite.in
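The semi-structured features named in this posting (VARIANT and LATERAL FLATTEN) can be sketched as below; the `raw_events` table and the JSON field names are hypothetical illustrations:

```sql
-- A VARIANT column holds raw JSON documents as loaded
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Flatten a nested "items" array into one row per element,
-- casting JSON fields to typed columns with the :: operator
SELECT
    e.payload:user_id::STRING AS user_id,
    item.value:sku::STRING    AS sku,
    item.value:qty::NUMBER    AS qty
FROM raw_events e,
     LATERAL FLATTEN(input => e.payload:items) item;
```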
Posted 1 month ago
5 - 7 years
5 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Experience: 5+ years of relevant experience
We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.
Responsibilities:
Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.
Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing, applying PySpark best practices and performance-tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).
ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.
Requirements Gathering & Clarification:
- Actively participate in requirement-gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.
Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders, providing regular updates on progress and potential roadblocks.
Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
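The Streams and Tasks utilities listed in this role can be combined into a simple change-data-capture sketch; the warehouse, table, and task names here are hypothetical:

```sql
-- A stream records row-level changes (inserts, updates, deletes) on a table
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- A scheduled task consumes the stream only when it actually has data
CREATE OR REPLACE TASK archive_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
INSERT INTO orders_history
SELECT * FROM orders_stream;

-- Tasks are created suspended; resume to start the schedule
ALTER TASK archive_orders RESUME;
```

Reading from the stream inside the task advances its offset, so each batch of changes is processed exactly once.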
Posted 1 month ago