8.0 - 13.0 years
35 - 45 Lacs
Noida, Pune, Bengaluru
Hybrid
Role & responsibilities:
• Implement data management solutions as a certified Snowflake cloud data warehouse architect.
• Deep understanding of star and snowflake schemas and dimensional modelling.
• Experience in the design and implementation of data pipelines and ETL processes using Snowflake.
• Optimize data models for performance and scalability.
• Collaborate with technical and business stakeholders to define data requirements.
• Ensure data quality and governance best practices are followed.
• Experience with data security and data access controls in Snowflake.
• Expertise in complex SQL, Python scripting, and performance tuning.
• Expertise in Snowflake concepts such as resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy cloning, and Time Travel, and in automating them (a sketch follows this posting).
• Experience handling semi-structured data (JSON, XML) and columnar Parquet using the VARIANT type in Snowflake.
• Experience re-clustering data in Snowflake, with a good understanding of micro-partitions.
• Experience with migration to Snowflake from on-premises database environments.
• Experience designing and building manual or auto-ingest data pipelines using Snowpipe.
• SnowSQL experience developing stored procedures and writing queries to analyze and transform data.
Must-have skills: Certified Snowflake Architect, Snowflake architecture, Snowpipe, SnowSQL, SQL, CI/CD, and Python.
Perks and benefits: Competitive compensation package; opportunity to work with industry leaders; collaborative and innovative work environment; professional growth and development opportunities.
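For illustration, a minimal Snowflake SQL sketch of three of the features named above: zero-copy cloning, Time Travel, and resource monitors. All object names and quota values are assumptions, not taken from the posting.

```sql
-- Zero-copy clone: instant, metadata-only copy of a table
CREATE TABLE sales_dev CLONE sales;

-- Time Travel: query the table as it was one hour ago
SELECT * FROM sales AT (OFFSET => -3600);

-- Resource monitor (requires ACCOUNTADMIN): cap monthly credit spend
-- and suspend the attached warehouse when the quota is exhausted
CREATE RESOURCE MONITOR monthly_quota WITH CREDIT_QUOTA = 100
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_quota;
```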
Posted 3 weeks ago
3.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Educational qualification: Bachelor of Engineering, BCS, BBA, BCom, MCA, MSc. Service line: Data & Analytics.
Responsibilities:
• Good knowledge of Snowflake architecture: virtual warehouses (multi-cluster warehouses, autoscaling), metadata and system objects (query history, grants to users and roles), micro-partitions, table clustering and auto-reclustering, and materialized views and their benefits (a sketch follows this posting).
• Data protection with Time Travel in Snowflake (extremely important).
• Analyzing queries using Query Profile (extremely important), including explain plans and the cache architecture.
• Virtual warehouses, named stages, direct loading, Snowpipe, data sharing, Streams, JavaScript procedures, and Tasks.
• Strong ability to design and develop workflows in Snowflake on at least one cloud platform (preferably AWS).
• Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system.
• Preferably, knowledge of ETL activities such as processing data from multiple source systems.
• Extensive knowledge of query performance tuning; working knowledge of BI tools.
• Manage time effectively: accurately estimate effort for tasks, meet agreed-upon deadlines, and juggle ad-hoc requests alongside longer-term projects.
Snowflake performance specialist:
• Familiar with zero-copy cloning and with using Time Travel features to clone a table.
• Able to read a Snowflake query profile, understand what each step does, and identify performance bottlenecks from it.
• Understands when a table needs to be clustered and how to choose the right cluster key as part of table design to help query optimization.
• Works with materialized views, weighing benefits against cost.
• Understands how Snowflake micro-partitions are maintained and their performance implications for pruning.
• Horizontal vs. vertical scaling and when to use each; the concept of multi-cluster warehouses and autoscaling.
• Advanced SQL knowledge, including window functions and recursive queries, with the ability to understand and rewrite complex SQL as part of performance optimization.
Additional responsibilities: Domain: Data Warehousing, Business Intelligence. Work locations: Bhubaneswar, Bangalore, Hyderabad, Pune.
Technical and professional requirements: Mandatory skills: Snowflake. Desired skills: Teradata/Python (not mandatory). Preferred skills: Cloud Platform-Snowflake; Technology-OpenSystem-Python.
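As a rough illustration of the multi-cluster autoscaling and clustering concepts this posting emphasizes (warehouse, table, and column names are assumptions; multi-cluster warehouses require Snowflake Enterprise Edition):

```sql
-- Multi-cluster warehouse that autoscales between 1 and 4 clusters
-- under concurrency pressure, and suspends itself when idle
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;

-- Define a clustering key, then inspect clustering health
ALTER TABLE events CLUSTER BY (event_date, account_id);
SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(event_date, account_id)');
```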
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
About the Role: We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting their logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The ideal candidate has solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.
Key Responsibilities:
• Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
• Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
• Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance.
• Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Time Travel, and Zero-Copy Cloning (a Streams-and-Tasks sketch follows this posting).
• Implement data ingestion strategies using Snowpipe, stages, and external tables.
• Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
• Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
• Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
• Suggest improvements and automation opportunities during migration.
• Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
• Maintain version control, documentation, and an audit trail for all converted workflows.
Required Skills:
• Bachelor's or master's degree in computer science, information technology, data science, or a related field.
• At least 4 years of hands-on experience designing and developing scalable data solutions on the Snowflake Data Cloud platform.
• Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
• 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
• Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
• Python programming experience focused on data engineering.
• Experience with data APIs and batch/stream processing.
• Solid understanding of data transformation logic: joins, unions, filters, formulas, aggregations, pivots, and transpositions.
• Experience in performance tuning and optimization of SQL queries in Snowflake.
• Familiarity with Snowflake features like CTEs, window functions, Tasks, Streams, stages, and external tables.
• Exposure to migration or modernization projects from ETL tools (like Alteryx/Informatica) to SQL-based cloud platforms.
• Strong documentation skills and attention to detail.
• Experience working in Agile/Scrum development environments.
• Good communication and collaboration skills.
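As referenced in the responsibilities above, a minimal sketch of incremental processing with Streams and Tasks; all table, warehouse, and column names are assumptions:

```sql
-- Stream captures row-level changes on the landing table
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- Task runs on a 5-minute schedule, but only when the stream has data;
-- the MERGE consumes the stream and advances its offset
CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO orders t
  USING orders_stream s ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK merge_orders RESUME;
```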
Posted 3 weeks ago
2.0 - 7.0 years
5 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Responsibilities: A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with high levels of service in the technology domain.
• You will gather requirements and specifications, understand client needs in detail, and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to technology leads and project managers.
• You will be a key contributor to building efficient programs and systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional/nonfunctional requirements into system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake
Posted 3 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Responsibilities: A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with high levels of service in the technology domain.
• You will gather requirements and specifications, understand client needs in detail, and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to technology leads and project managers.
• You will be a key contributor to building efficient programs and systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional/nonfunctional requirements into system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake
Posted 3 weeks ago
10.0 - 17.0 years
35 - 45 Lacs
Pune, Bangalore Rural, Chennai
Work from Office
We are a search and staffing firm catering to the hiring needs of top IT giants, product companies, and captives. One such client, a leading IT services company in the healthcare domain, is looking for:
Job Summary: We are looking for an experienced Snowflake Architect to lead the design and implementation of scalable, high-performance data platforms on Snowflake. The ideal candidate has a strong background in data architecture, data modeling, cloud platforms, and data warehousing, with deep expertise in Snowflake.
Key Responsibilities:
• Architect and design Snowflake-based data warehouse solutions for large-scale enterprise environments.
• Collaborate with business and data teams to understand data requirements and translate them into scalable architecture.
• Define best practices for data loading, transformation (ELT/ETL), and performance tuning in Snowflake.
• Lead the migration of existing data warehouses (e.g., Teradata, Oracle, Netezza) to Snowflake.
• Implement security and governance practices including role-based access control, masking, and data auditing (a sketch follows this posting).
• Optimize Snowflake storage and compute usage to ensure cost-effective solutions.
• Develop reusable frameworks and automation for data ingestion and transformation using tools like DBT, Airflow, or custom scripts.
• Guide and mentor development teams on Snowflake usage, query optimization, and design standards.
• Evaluate emerging data technologies and provide recommendations for adoption.
Required Skills and Experience:
• 10+ years in data architecture, data engineering, or related roles.
• 3+ years of hands-on experience on the Snowflake data platform.
• Expertise in SQL, performance tuning, and data modeling (3NF, dimensional, etc.).
• Experience with ELT tools (e.g., Informatica, Talend, DBT, Matillion).
• Strong understanding of cloud platforms (AWS, Azure, or GCP) and native integration with Snowflake.
• Familiarity with Python or scripting for automation and orchestration.
• Knowledge of CI/CD pipelines and version control using Git.
• Excellent communication and stakeholder management skills.
Please note: We are looking for candidates who can join on an immediate basis. Please be assured your resume will be kept strictly confidential and will only be taken forward with your consent.
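A minimal sketch of the RBAC and dynamic data masking practices mentioned above; role, table, and column names are assumptions:

```sql
-- Dynamic data masking: non-privileged roles see a redacted value
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Role-based access control: grant read access through a role,
-- never directly to individual users
CREATE ROLE analyst;
GRANT USAGE ON DATABASE sales_db TO ROLE analyst;
GRANT USAGE ON SCHEMA sales_db.public TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE analyst;
```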
Posted 3 weeks ago
10.0 - 15.0 years
15 - 20 Lacs
Pune
Work from Office
Notice Period: Immediate
About the role: We are hiring a Senior Snowflake Data Engineer with 10+ years of experience in cloud data warehousing and deep expertise on the Snowflake platform. The ideal candidate has strong skills in SQL, ETL/ELT, data modeling, and performance tuning, along with a solid understanding of Snowflake architecture, security, and cost optimization.
Roles & Responsibilities:
• Collaborate with data engineers, product owners, and QA teams to translate business needs into efficient Snowflake-based data models and pipelines.
• Design, build, and optimize data solutions leveraging Snowflake features such as virtual warehouses, data sharing, cloning, and Time Travel (a data-sharing sketch follows this posting).
• Develop and maintain robust ETL/ELT pipelines using tools like Talend, Snowpipe, Streams, Tasks, and Python.
• Ensure optimal performance of SQL queries, warehouse sizing, and cost-efficient design strategies.
• Implement best practices for data quality, security, and governance, including RBAC, network policies, and masking.
• Contribute to code reviews and development standards to ensure high-quality deliverables.
• Support analytics and BI teams with data exploration and visualization using tools like Tableau or Power BI.
• Maintain version control using Git and follow Agile development practices.
Required Skills:
• Snowflake expertise: deep knowledge of Snowflake architecture and core features.
• SQL development: advanced proficiency in writing and optimizing complex SQL queries.
• ETL/ELT: hands-on experience with ETL/ELT design using Snowflake tools and scripting (Python).
• Data modeling: proficient in dimensional modeling, data vault, and best practices within Snowflake.
• Automation & scripting: Python or a similar scripting language for data workflows.
• Cloud integration: familiarity with Azure and its services integrated with Snowflake.
• BI & visualization: exposure to Tableau, Power BI, or similar platforms.
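As referenced above, a minimal sketch of Snowflake secure data sharing; the share, database, and consumer account identifier are assumptions:

```sql
-- Secure data sharing: expose read-only objects to a consumer account
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- 'ab12345' is a placeholder consumer account locator
ALTER SHARE sales_share ADD ACCOUNTS = ab12345;
```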
Posted 3 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Work Location: Bangalore, Chennai, Hyderabad, Pune, Bhubaneshwar, Kochi
Experience: 5-10 years
Job Description:
• Hands-on experience in Snowflake
• Experience in Snowpipe and SnowSQL
• Strong data warehouse experience
Please share your updated profile to suganya@spstaffing.in if you are actively looking for a change.
Posted 3 weeks ago
8.0 - 13.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & responsibilities (Snowflake JD):
• Must have 8-15 years of experience in data warehouse, ETL, and BI projects
• Must have at least 5+ years of experience in Snowflake and 3+ years in DBT
• Expertise in Snowflake architecture is a must
• Must have at least 3+ years of experience and a strong command of Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts (a stored-procedure sketch follows this posting)
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have experience with AWS services and creating DevOps templates for various AWS services
• Experience using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable
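A minimal sketch of a Snowflake Scripting stored procedure of the kind this posting asks for; the procedure, table, and column names are assumptions:

```sql
-- Purge rows older than a caller-supplied retention window
CREATE OR REPLACE PROCEDURE purge_stale_rows(retention_days NUMBER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  cutoff TIMESTAMP_LTZ;
  deleted NUMBER;
BEGIN
  -- expressions use variables directly; SQL statements bind them with ':'
  cutoff := DATEADD(day, -1 * retention_days, CURRENT_TIMESTAMP());
  DELETE FROM staging.events WHERE load_ts < :cutoff;
  deleted := SQLROWCOUNT;  -- row count of the preceding DML statement
  RETURN 'Deleted ' || deleted || ' rows';
END;
$$;

CALL purge_stale_rows(30);
```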
Posted 4 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Work from Office
About Client: Hiring for one of the top MNCs!
Job Title: Snowflake Developer / Snowflake Data Engineer
Qualification: Any graduate or above
Relevant Experience: 4 to 12 years
Skills: Snowflake, Python/PySpark, SQL, AWS services
Role descriptions / expectations from the role:
• Strong experience in building and designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focused on large enterprise-scale Snowflake implementations on any of the hyperscalers.
• Strong experience building productionized data ingestion and data pipelines in Snowflake.
• Good experience with Snowflake RBAC and data security.
• Strong experience with Snowflake features, including new Snowflake features.
• Good experience in Python/PySpark.
• Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF).
• Experience/knowledge of orchestration and scheduling tools such as Airflow.
• Good understanding of ETL processes and ETL tools.
Location: Hyderabad | CTC Range: 20 LPA to 30 LPA | Notice period: Any | Shift timing: N/A | Mode of interview: Virtual | Mode of work: Work from office
Vardhani, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, INDIA. 8686127477 | vardhani@blackwhite.in | www.blackwhite.in
Posted 1 month ago
6.0 - 11.0 years
35 - 50 Lacs
Pune, Gurugram, Delhi / NCR
Hybrid
Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA | Notice: Immediate to 30 days serving notice | Experience: 6-11 years
Key Responsibilities:
• Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
• Build and maintain data integration workflows from various data sources to Snowflake.
• Write efficient and optimized SQL queries for data extraction and transformation (a sketch follows this posting).
• Work with stakeholders to understand business requirements and translate them into technical solutions.
• Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
• Maintain and enforce data quality, governance, and documentation standards.
• Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
Must-Have Skills:
• Strong experience with Azure cloud platform services.
• Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
• Proficiency in SQL for data analysis and transformation.
• Hands-on experience with Snowflake and SnowSQL for data warehousing.
• Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
• Experience working in cloud-based data environments with large-scale datasets.
Good-to-Have Skills:
• Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
• Familiarity with Python or PySpark for custom data transformations.
• Understanding of CI/CD pipelines and DevOps for data workflows.
• Exposure to data governance, metadata management, or data catalog tools.
• Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Qualifications:
• Bachelor's or master's degree in computer science, data engineering, information systems, or a related field.
• 5+ years of experience in data engineering roles using Azure and Snowflake.
• Strong problem-solving, communication, and collaboration skills.
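A minimal sketch of the kind of optimized transformation query the posting describes, using Snowflake's QUALIFY clause to keep only the latest record per business key; table and column names are assumptions:

```sql
-- Typical ELT transform: deduplicate a raw feed down to the
-- most recent row per customer
CREATE OR REPLACE TABLE analytics.customers_current AS
SELECT customer_id, email, address, updated_at
FROM raw.customers
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY customer_id ORDER BY updated_at DESC
) = 1;
```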
Posted 1 month ago
6.0 - 11.0 years
35 - 50 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA | Notice: Immediate to 30 days serving notice | Experience: 6-11 years
Key Responsibilities:
• Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
• Build and maintain data integration workflows from various data sources to Snowflake.
• Write efficient and optimized SQL queries for data extraction and transformation.
• Work with stakeholders to understand business requirements and translate them into technical solutions.
• Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
• Maintain and enforce data quality, governance, and documentation standards.
• Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.
Must-Have Skills:
• Strong experience with Azure cloud platform services.
• Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
• Proficiency in SQL for data analysis and transformation.
• Hands-on experience with Snowflake and SnowSQL for data warehousing.
• Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
• Experience working in cloud-based data environments with large-scale datasets.
Good-to-Have Skills:
• Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
• Familiarity with Python or PySpark for custom data transformations.
• Understanding of CI/CD pipelines and DevOps for data workflows.
• Exposure to data governance, metadata management, or data catalog tools.
• Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.
Qualifications:
• Bachelor's or master's degree in computer science, data engineering, information systems, or a related field.
• 5+ years of experience in data engineering roles using Azure and Snowflake.
• Strong problem-solving, communication, and collaboration skills.
Posted 1 month ago
5.0 - 10.0 years
12 - 18 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Work from Office
Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe/SnowProc/SnowSQL.
3. Technical lead with a strong development background and 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of the ELT tool DBT for transformation.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work on implementation and support projects. Flexible for onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements. Ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills. Snowflake certification is preferable.
Contact: Soniya, soniya05.mississippiconsultants@gmail.com
Posted 1 month ago
2.0 - 5.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Job Description: Tietoevry Create is seeking a skilled Snowflake Developer to join our team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and maintaining data solutions using Snowflake's cloud data platform. You will work closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business value.
• 7+ years of experience in the design and development of data warehouse and data integration projects (SSE/TL level).
• Experience working in an Azure environment.
• Developing ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL; writing SQL queries against Snowflake.
• Good understanding of database design concepts: transactional, data mart, data warehouse, etc.
• Expertise in loading from disparate data sets and translating complex functional and technical requirements into detailed designs; will also perform analysis of vast data stores and uncover insights.
• Snowflake data engineers will be responsible for architecting and implementing large-scale data intelligence solutions around Snowflake Data Warehouse.
• Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is a must.
• Very good articulation skills; flexible and ready to learn new skills.
Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity.
Posted 1 month ago
5.0 - 10.0 years
0 - 3 Lacs
Noida
Work from Office
• Act as a data domain expert for Snowflake in a collaborative environment, demonstrating understanding of data management best practices and patterns.
• Design and implement robust data architectures to meet and support business requirements, leveraging Snowflake platform capabilities.
• Develop and enforce data modelling standards and best practices for Snowflake environments.
• Develop, optimize, and maintain Snowflake data warehouses.
• Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
• Ensure data architecture solutions meet performance, security, and scalability requirements.
• Stay current with the latest developments and features in Snowflake and related technologies, continually enhancing our data capabilities.
• Collaborate with cross-functional teams to gather business requirements, translate them into effective data solutions in Snowflake, and provide data-driven insights.
• Stay updated with the latest trends and advancements in data architecture and Snowflake technologies.
• Provide mentorship and guidance to junior data engineers and architects.
• Troubleshoot and resolve data architecture-related issues effectively.
Skills Requirement:
• 5+ years of proven experience as a data engineer, with 3+ years as a data architect.
• Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
• Experience designing and building manual or auto-ingest data pipelines using Snowpipe (a sketch follows this posting).
• Design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
• SnowSQL experience developing stored procedures and writing queries to analyze and transform data.
• Working experience with ETL tools like Fivetran, dbt Labs, and MuleSoft.
• Expertise in Snowflake concepts such as resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy cloning, and Time Travel, and automating them.
• Excellent problem-solving skills and attention to detail.
• Effective communication and collaboration abilities.
• Relevant certifications (e.g., SnowPro Core/Advanced) are a must-have.
• Must have expertise in AWS, Azure, and the Salesforce Platform as a Service (PaaS) model and its integration with Snowflake to load/unload data.
• Strong communicator and exceptional team player with effective problem-solving skills.
Educational Qualification Required: Master's degree in business management (MBA/PGDM) or bachelor's degree in computer science, information technology, or a related field.
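A minimal sketch of an auto-ingest Snowpipe pipeline as referenced above; the stage URL, storage integration, and table names are assumptions, and the target table is assumed to have a single VARIANT column:

```sql
-- External stage over cloud storage
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/events/'
  STORAGE_INTEGRATION = s3_int;

-- Auto-ingest pipe: loads new files as cloud event notifications arrive
CREATE OR REPLACE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.events
  FROM @raw_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Check the pipe's execution state and backlog
SELECT SYSTEM$PIPE_STATUS('EVENTS_PIPE');
```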
Posted 1 month ago
5.0 - 10.0 years
8 - 18 Lacs
Hyderabad, Chennai, Coimbatore
Hybrid
Job Requirements: We are seeking a skilled professional with expertise in Snowflake and SQL to join our team. The candidate will be responsible for managing, optimizing, and scaling our Snowflake data warehouse solutions while ensuring seamless integration with SQL-based systems.
Key Responsibilities:
• Design, develop, and maintain Snowflake databases and data warehouse solutions.
• Build and optimize SQL queries for data processing and reporting.
• Collaborate with cross-functional teams to implement data models and pipelines.
• Ensure data security, quality, and compliance in Snowflake environments.
• Monitor and troubleshoot Snowflake and SQL systems for performance issues.
• Create documentation and best practices for Snowflake and SQL usage.
Qualifications:
• Proven experience in Snowflake database management and development.
• Strong proficiency in SQL programming and query optimization.
• Knowledge of ETL processes and data warehousing concepts.
• Familiarity with cloud platforms and integration tools is a plus.
• Excellent problem-solving skills and attention to detail.
Preferred Skills:
• Experience with Snowflake tools such as Snowpipe, Time Travel, and Cloning.
• Understanding of schema design and database architecture.
• Ability to work with large datasets and implement efficient storage solutions.
• Familiarity with data visualization tools for reporting purposes.
• Technology engineering expertise; able to deal with a diverse set of stakeholders; proficient in articulation, communication, and presentation; high integrity; problem-solving skills and a learning attitude; team player.
Posted 1 month ago
5.0 - 8.0 years
16 - 22 Lacs
Hyderabad
Work from Office
Job Title: EY-GDS Consulting - AI and Data - Snowflake Data Engineer - Senior
Business Unit Description: The Information Technology group delivers secure, reliable technology solutions that enable DTCC to be the trusted infrastructure of the global capital markets. The team delivers high-quality information through activities that include development of essential infrastructure capabilities to meet client needs and implementing data standards and governance.
Position Summary: Provides technical expertise and may coordinate some day-to-day deliverables for a team. Assists in the technical design of large business systems; builds applications and interfaces between applications; understands data security, retention, and recovery. Can research technologies independently and recommend appropriate solutions. Contributes to technology-specific best practices and standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability, and scalability; contributes expertise on significant application components, vendor products, programming languages, databases, operating systems, etc., and guides less experienced staff during the build and test phases.
Specific Responsibilities:
• Act as a technical guide on one or more applications utilized by DTCC.
• Work with the business system analyst to ensure designs satisfy functional requirements.
• Work with large, complex data sets and high-throughput data pipelines that meet business requirements.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
• Build data and analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
• Work with internal and external stakeholders to assist with data-related technical issues and support data infrastructure needs.
• Collaborate with data scientists and architects on several projects.
• Solve various complex problems.
Key skills:
• 5+ years of experience as a data engineer or in a similar role.
• 4+ years of cloud data warehouse (Snowflake) experience with Streams, Snowpipe, Tasks, Snowpark, etc.
• 4+ years of Python development experience is necessary.
• Experience with distributed processing frameworks and patterns such as Spark, Databricks, Apache Iceberg, and data lakehouse architectures.
• Experience in a cloud-based environment.
• Experience with asynchronous processing using Python.
• Hands-on experience with database technologies (e.g., SQL and NoSQL) with performance tuning.
• Technical expertise with data technologies and/or machine learning techniques.
• Great numerical and analytical skills.
• Ability to write reusable code components.
• Open-minded toward new technologies and frameworks.
Qualifications: Minimum of 6 years of related experience. Bachelor's degree preferred, or equivalent experience.
With 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry. From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency, and driving efficiency for thousands of broker/dealers, custodian banks, and asset managers. Industry owned and governed, the firm simplifies the complexities of clearing, settlement, asset servicing, data management, data reporting, and information services across asset classes, bringing increased security and soundness to financial markets. In 2022, DTCC's subsidiaries processed securities transactions valued at U.S. $2.5 quadrillion. Its depository provides custody and asset servicing for securities issues from over 150 countries and territories valued at U.S. $72 trillion. DTCC's Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 17.5 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn, Twitter, YouTube, and Facebook.
Interested candidates, kindly apply at the link below: https://careers.ey.com/job-invite/1586345/
Posted 1 month ago
8.0 - 13.0 years
12 - 22 Lacs
Hyderabad, Bengaluru
Hybrid
Skills: Snowflake, AWS, SQL, PL/SQL / T-SQL, DWH, Python, PySpark
• Experience with Snowflake utilities, SnowSQL, and Snowpipe; able to administer and monitor the Snowflake computing platform
• Good in cloud computing (AWS)
Notice Period: Immediate
Email: sachin@assertivebs.com
Posted 1 month ago
6.0 - 11.0 years
20 - 30 Lacs
Mumbai, Pune, Bengaluru
Hybrid
Key Responsibilities:
• Support Snowflake Native Apps built by developers across the enterprise.
• Review the design, development, and optimization of native apps using Snowflake and Snowpark Container Services.
• Troubleshoot complex SQL queries, UDFs, and Python scripts for data processing in client environments (a UDF sketch follows this posting).
• Engage directly with clients to understand business needs, present technical designs, and resolve application use issues.
• Ensure data quality, security, and performance across all stages of the data lifecycle.
• Coordinate with support engineers and provide a bridge to data engineering.
• Collaborate with cross-functional teams including Product, Analytics, and DevOps.
Required Skills & Experience:
• 6+ years of experience in data engineering or a related field.
• 3+ years of experience in client-facing roles, including requirement gathering, solutioning, and demos.
• Hands-on expertise with Snowflake (warehouse management, resource monitoring, Snowpipe, etc.).
• Strong SQL programming and performance tuning skills.
• Proficiency in Python, including creating and managing UDFs.
• Experience building or supporting Snowflake native applications.
• Familiarity with Snowpark Container Services and deploying containerized workloads in Snowflake.
Good-to-Have Skills:
• Strong understanding of data modelling, ETL/ELT processes, and cloud data architecture.
• Excellent problem-solving, communication, and leadership skills.
Preferred Qualifications:
• SnowPro certifications.
• Experience with CI/CD pipelines.
• Exposure to Tableau, Power BI, or other visualisation tools (nice to have).
• Leadership of a client-support or escalation team.
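A minimal sketch of a scalar SQL UDF of the kind mentioned above; the function name, logic, and values are assumptions chosen for illustration:

```sql
-- Derive a net amount from a gross value and a tax rate
CREATE OR REPLACE FUNCTION net_amount(gross FLOAT, tax_rate FLOAT)
RETURNS FLOAT
AS
$$
  gross / (1 + tax_rate)
$$;

SELECT net_amount(118, 0.18);  -- returns 100
```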
Posted 1 month ago
9.0 - 14.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
The Snowflake Data Specialist will manage projects in Data Warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Ahmedabad
Hybrid
Key Responsibilities:
• Lead the end-to-end Snowflake platform implementation, including architecture, design, data modeling, and governance.
• Oversee the migration of data and pipelines from legacy platforms to Snowflake, ensuring quality, reliability, and business continuity.
• Design and optimize Snowflake-specific data models, including the use of clustering keys, materialized views, Streams, and Tasks.
• Build and manage scalable ELT/ETL pipelines using modern tools and best practices.
• Define and implement standards for Snowflake development, testing, and deployment, including CI/CD automation.
• Collaborate with cross-functional teams including data engineering, analytics, DevOps, and business stakeholders.
• Establish and enforce data security, privacy, and governance policies using Snowflake's native capabilities (a row-level security sketch follows this posting).
• Monitor and tune system performance and cost efficiency through appropriate warehouse sizing and usage patterns.
• Lead code reviews, technical mentoring, and documentation for Snowflake-related processes.
Required Snowflake Expertise:
• Snowflake architecture: deep understanding of virtual warehouses, data sharing, multi-cluster warehouses, and zero-copy cloning; ability to enhance the architecture and implement solutions accordingly.
• Performance optimization: proficient in tuning queries, clustering, caching, and workload management.
• Data engineering: experience processing batch and real-time data using Snowflake features such as Snowpipe, Streams & Tasks, stored procedures, and common data ingestion patterns.
• Data security & governance: strong experience with RBAC, dynamic data masking, row-level security, and tagging; experience enabling these capabilities in Snowflake and in at least one enterprise product solution.
• Advanced SQL: expertise in writing, analyzing, and performance-optimizing complex SQL queries and transformations, and in handling semi-structured data (JSON, XML).
• Cloud integration: experience with at least one major cloud platform (AWS/GCP/Azure) and services like S3, Lambda, Step Functions, etc.
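A minimal sketch of the row-level security capability referenced above, using a row access policy backed by a role-to-region mapping table; the policy, table, and role names are assumptions:

```sql
-- Rows are visible only to roles mapped to the row's region,
-- with one privileged role exempt from filtering
CREATE OR REPLACE ROW ACCESS POLICY region_filter AS (sales_region STRING)
RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'GLOBAL_ADMIN'
  OR EXISTS (
    SELECT 1 FROM security.region_map m
    WHERE m.role_name = CURRENT_ROLE() AND m.region = sales_region
  );

ALTER TABLE sales ADD ROW ACCESS POLICY region_filter ON (sales_region);
```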
Posted 1 month ago
5.0 - 10.0 years
0 - 2 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Role & responsibilities: Snowflake Developer
Skill Set: Snowflake, IICS, ETL, Cloud
Experience: 5-12 years
Location: Pune/Mumbai/Chennai/Bangalore/Hyderabad/Delhi
Notice Period: Immediate to 30 days
If all the above criteria match your profile, please share your updated CV with the details below to sneha.joshi@alikethoughts.com:
• Total experience?
• Relevant experience?
• Current CTC?
• Expected CTC?
• Notice period? If serving, what is your last working day?
• PAN card number (mandatory)
• Passport-size photo (please attach; mandatory)
Posted 1 month ago
7.0 - 12.0 years
0 Lacs
Kochi
Work from Office
Greetings from the TCS Recruitment Team!
Role: Snowflake Lead / Snowflake Solution Architect / Snowflake ML Engineer
Years of experience: 7 to 18 years
Walk-in drive location: Kochi
Walk-in location details: Tata Consultancy Services, TCS Centre SEZ Unit, Infopark Kochi Phase 1, Infopark Kochi P.O., Kakkanad, Kochi - 682042, Kerala, India
Drive time: 9 am to 1:00 pm | Date: 21-Jun-25
Must have:
• Deep knowledge of Snowflake's architecture, SnowSQL, Snowpipe, Streams, Tasks, and stored procedures.
• Strong understanding of cloud platforms (AWS, Azure, GCP).
• Proficiency in SQL, Python, or scripting languages for data operations.
• Experience with ETL/ELT tools, data integration, and performance tuning.
• Familiarity with data security, governance, and compliance standards (GDPR, HIPAA, SOC 2).
Posted 1 month ago
6.0 - 11.0 years
17 - 30 Lacs
Kolkata, Hyderabad/Secunderabad, Bangalore/Bengaluru
Hybrid
Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.
Job Description:
• Experience in the IT industry.
• Working experience building productionized data ingestion and processing pipelines in Snowflake.
• Strong understanding of Snowflake architecture; fully well-versed in data warehousing concepts.
• Expertise and excellent understanding of Snowflake features and the integration of Snowflake with other data processing tools.
• Able to create data pipelines for ETL/ELT.
• Excellent presentation and communication skills, both written and verbal.
• Ability to problem-solve and architect in an environment with unclear requirements.
• Able to create high-level and low-level design documents based on requirements.
• Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
• Awareness of data visualisation tools and methodologies.
• Work independently on business problems and generate meaningful insights.
• Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
• Should have experience implementing Snowflake best practices.
• Snowflake SnowPro Core certification will be an added advantage.
Roles and Responsibilities:
• Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
• Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
• Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
• Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
• Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
• Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
• Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
• Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
• Some experience with Snowflake RBAC and data security.
• Good experience implementing CDC or SCD Type-2 (a sketch follows this posting).
• Good experience implementing Snowflake best practices.
• In-depth understanding of data warehouse and ETL concepts and data modelling.
• Experience in requirement gathering, analysis, design, development, and deployment.
• Experience building data ingestion pipelines; able to optimize and tune data pipelines for performance and scalability.
• Able to communicate with clients and lead a team.
• Proficiency working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
• Good to have experience with deployment using CI/CD tools and repositories like Azure Repos, GitHub, etc.
Qualifications we seek in you! Minimum qualifications: B.E./Master's in computer science, information technology, or computer engineering, or any equivalent degree with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts
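A minimal sketch of an SCD Type-2 load as referenced above, in two steps: close out current rows whose tracked attribute changed, then insert new current versions. All table and column names are assumptions:

```sql
-- Step 1: expire the current row for customers whose address changed
MERGE INTO dim_customer d
USING stg_customer s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.address <> s.address THEN
  UPDATE SET d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP();

-- Step 2: insert a fresh current row for changed and brand-new customers
-- (changed customers no longer have a current row after step 1)
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP(), NULL, TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL;
```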
Posted 1 month ago