5 - 10 years
15 - 25 Lacs
Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Snowflake Developer
Qualification: Any Graduate or above
Relevant Experience: 5 to 10 years
Required Technical Skill Set: Snowflake, Azure Managed Services platforms

Must-Have:
- Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions)
- Design and implement Snowflake data warehousing solutions, including data modelling and schema design
- Able to source data from APIs, data lakes, and on-premise systems into Snowflake
- Process semi-structured data using Snowflake-specific features such as VARIANT and LATERAL FLATTEN
- Experience using Snowpipe to load micro-batch data
- Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scale in/out vs. scale up/down of warehouses
- Ability to implement data pipelines that handle data retention and data redaction use cases
- Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks
- Strong hands-on experience in migration projects to Snowflake
- Deep understanding of cloud-based data platforms and data integration techniques
- Skilled in writing efficient SQL queries and optimizing database performance
- Ability to design and implement a real-time data streaming solution using Snowflake

Location: PAN India
CTC Range: 25 LPA - 40 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work From Office)

Pooja Singh KS
IT Staffing Analyst
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, India
pooja.singh@blackwhite.in | www.blackwhite.in
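The posting above calls out processing semi-structured VARIANT data with LATERAL FLATTEN. In Snowflake that is done in SQL; as a language-neutral illustration, here is a small Python sketch (the function and field names are invented for this sketch, not a Snowflake API) of what flattening a nested JSON array into one output row per element looks like conceptually:

```python
import json

# A VARIANT-style column: each record holds a nested JSON document as text.
raw_rows = [
    {"id": 1, "payload": json.dumps({"user": "a", "events": [{"t": "click"}, {"t": "view"}]})},
    {"id": 2, "payload": json.dumps({"user": "b", "events": [{"t": "click"}]})},
]

def lateral_flatten(rows, path):
    """Emit one output row per element of the array at `path`, loosely
    mimicking Snowflake's LATERAL FLATTEN(input => payload:events)."""
    out = []
    for row in rows:
        doc = json.loads(row["payload"])
        for idx, element in enumerate(doc.get(path, [])):
            # Each flattened row keeps the parent keys plus the array element.
            out.append({"id": row["id"], "user": doc["user"],
                        "index": idx, "value": element})
    return out

flat = lateral_flatten(raw_rows, "events")
# Three flattened rows: two events for id 1, one for id 2.
```

In Snowflake the same shape comes from joining the table to `LATERAL FLATTEN` and selecting `value` and `index`; the sketch only shows the row-multiplication semantics.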
Posted 1 month ago
5 - 7 years
5 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing, applying PySpark best practices and performance-tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement-gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders, providing regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience in designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
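The lead role above leans on Snowflake Streams for incremental processing. A stream surfaces the changes to a table since its last offset; as a rough, Snowflake-free illustration (the function and keys below are invented for this sketch), a stream's output can be thought of as the delta between two table snapshots:

```python
def stream_delta(old_snapshot, new_snapshot):
    """Compute insert/delete change rows between two snapshots keyed by id,
    loosely mimicking the METADATA$ACTION column of a Snowflake stream.
    An update appears as a DELETE of the old row plus an INSERT of the new."""
    changes = []
    for key, row in new_snapshot.items():
        if key not in old_snapshot:
            changes.append({"action": "INSERT", "row": row})
        elif old_snapshot[key] != row:
            changes.append({"action": "DELETE", "row": old_snapshot[key]})
            changes.append({"action": "INSERT", "row": row})
    for key, row in old_snapshot.items():
        if key not in new_snapshot:
            changes.append({"action": "DELETE", "row": row})
    return changes

before = {1: {"name": "a"}, 2: {"name": "b"}}
after = {1: {"name": "a2"}, 3: {"name": "c"}}
delta = stream_delta(before, after)
# Four change rows: an update to id 1 (delete+insert), an insert of id 3,
# and a delete of id 2.
```

In practice a Task would consume rows like these from the stream and merge them downstream; the sketch only shows the change-set semantics, not Snowflake's storage-level implementation.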
Posted 1 month ago
6 - 11 years
15 - 30 Lacs
Bengaluru
Hybrid
About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Snowflake Developer
Required skills and qualifications: Snowflake, SQL, Python
Qualification: Any Graduate or above
Relevant Experience: 4 to 12 years
Location: BLR/HYD/PUNE/BBSR
CTC Range: 10 to 35 LPA
Notice Period: Any
Mode of Interview: Virtual
Mode of Work: In Office

Nithyashree
Staffing Analyst - IT Recruiter
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, India
Nithyashree@blackwhite.in | www.blackwhite.in
Posted 1 month ago
2 - 6 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Responsibilities

A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred Skills:
• Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional/nonfunctional requirements into system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills, along with an ability to collaborate

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech
Posted 1 month ago
4 - 9 years
8 - 18 Lacs
Bhubaneswar, Pune, Bengaluru
Work from Office
Required Skillset: Snowflake architecture, Snowpipe, data cloning and time travel, SQL, data modeling.

Must Have:
- Deep understanding of cloud data warehousing and expertise in designing, developing, and implementing data solutions using Snowflake.
- Design, develop, and optimize data models and ETL workflows for Snowflake.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Implement data integration solutions using Snowflake, including data ingestion from various sources.
- Write efficient SQL queries to analyze large datasets and improve performance.
- Monitor and troubleshoot Snowflake performance issues, offering solutions and enhancements.
- Ensure data quality, consistency, and governance by implementing best practices.
- Develop and maintain documentation related to data architecture, processes, and data governance.
- Stay current with Snowflake features, functionalities, and industry trends to suggest improvements and innovations.
- Snowflake certification (SnowPro Core or Advanced) is a plus.
- Knowledge of data visualization tools (e.g., Tableau, Power BI) is advantageous.
- Experience in Agile methodologies and working in an Agile team environment.

Responsibility of / Expectations from the Role:
- Customer centric: work closely with client teams to understand project requirements and translate them into technical design.
- Experience working in scrum or with scrum teams.
- Internal collaboration: work with project teams and guide the end-to-end project lifecycle; resolve technical queries.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.

Soft Skills:
- Good communication skills
- Ability to interact with various internal groups and CoEs
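The skillset above spans Snowflake performance topics such as micro-partitions and clustering. Snowflake prunes micro-partitions using per-partition min/max metadata, so a well-clustered table lets a range filter skip most of the table. A hypothetical Python sketch of that pruning idea (not Snowflake code; the structures are invented for illustration):

```python
# Each micro-partition records min/max metadata for the clustering column.
partitions = [
    {"min": 0,  "max": 9,  "rows": list(range(0, 10))},
    {"min": 10, "max": 19, "rows": list(range(10, 20))},
    {"min": 20, "max": 29, "rows": list(range(20, 30))},
]

def pruned_scan(partitions, lo, hi):
    """Scan only partitions whose [min, max] range overlaps the filter,
    mimicking metadata-based partition pruning."""
    scanned = 0
    hits = []
    for p in partitions:
        if p["max"] < lo or p["min"] > hi:
            continue  # pruned: metadata proves the partition has no matches
        scanned += 1
        hits.extend(r for r in p["rows"] if lo <= r <= hi)
    return scanned, hits

scanned, hits = pruned_scan(partitions, 12, 15)
# Only the middle partition is scanned for the filter 12..15.
```

When data is poorly clustered, each partition's min/max range is wide and overlapping, so far fewer partitions can be pruned; that is the intuition behind clustering depth as a health metric.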
Posted 1 month ago
8 - 13 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi,

Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- Current CTC
- Expected CTC
- Notice period
- Current location
- Are you serving notice period / immediate?
- Experience in Snowflake
- Experience in Matillion

Shift timings: 2:00 PM - 11:00 PM (free cab facility for drop) + food.

Please let me know if any of your friends are looking for a job change; kindly share references. Only serving / immediate candidates can apply.

Interview Process: 1 round (virtual) + final round (F2F)

Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSIR, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara, towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 1 month ago
3 - 8 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi,

Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- Current CTC
- Expected CTC
- Notice period
- Current location
- Are you serving notice period / immediate?
- Experience in Snowflake
- Experience in Matillion

Shift timings: 2:00 PM - 11:00 PM (free cab facility for drop) + food.

Please let me know if any of your friends are looking for a job change; kindly share references. Only serving / immediate candidates can apply.

Interview Process: 2 rounds (virtual) + final round (F2F)

Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSIR, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara, towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 1 month ago
3 - 8 years
6 - 16 Lacs
Bengaluru
Work from Office
Hi,

Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP.

Shift timings: 2:00 PM - 11:00 PM (free cab facility for drop) + food.

Please let me know if any of your friends are looking for a job change; kindly share references.

Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2
Preferred skills: Any ETL tools

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara, towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 1 month ago
3 - 8 years
15 - 25 Lacs
Bhubaneshwar, Bengaluru, Hyderabad
Hybrid
Warm Greetings from SP Staffing!

Role: Snowflake Developer
Experience Required: 3 to 10 years
Work Location: Bangalore/Bhubaneswar/Hyderabad
Required Skills: Snowflake developer, Snowpipe, SQL

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp).
Posted 1 month ago
3 - 8 years
15 - 25 Lacs
Bengaluru, Hyderabad, Noida
Hybrid
Warm Greetings from SP Staffing!

Role: Snowflake Developer
Experience Required: 3 to 10 years
Work Location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi
Required Skills: Snowflake developer, Snowpipe

Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on 8148043843 (WhatsApp).
Posted 1 month ago
5 - 10 years
0 Lacs
Mysore, Bengaluru, Kochi
Hybrid
Open & Direct Walk-in Drive | Hexaware Technologies - Snowflake & Python Data Engineer/Architect in Bangalore, Karnataka on 12th April [Saturday] 2025 - Snowflake/Python/SQL & PySpark

Dear Candidate,

I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Bangalore, Karnataka on 12th April [Saturday] 2025, and we believe your skills in Snowflake/Snowpark/Python/SQL & PySpark align perfectly with what we are seeking.

Details of the Walk-in Drive:
- Date: 12th April [Saturday] 2025
- Experience: 4 years to 12 years
- Time: 9.00 AM to 5 PM
- Venue: Hotel Grand Mercure Bangalore, 12th Main Rd, 3rd Block, Koramangala, Bengaluru, Karnataka 560034
- Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
- Work Location: Open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)

Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in the following: Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, DWH.

Roles and Responsibilities:
- 4-15 years of total IT experience on any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 3 years of experience in querying and processing data using Python
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with data loading features of Snowflake such as Stages, Streams, Tasks, and Snowpipe
- Working knowledge of processing semi-structured data

What to Bring: updated resume; photo ID; passport-size photo.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.

This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: candidates with less than 4 years of total experience will not be shortlisted for the interview.
Posted 2 months ago
5 - 10 years
0 Lacs
Pune, Nagpur, Mumbai (All Areas)
Hybrid
Open & Direct Walk-in Drive | Hexaware Technologies - Snowflake & Snowpark Data Engineer/Architect in Pune, Maharashtra on 5th April [Saturday] 2025 - Snowflake/Snowpark/SQL & PySpark

Dear Candidate,

I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Pune, Maharashtra on 5th April [Saturday] 2025, and we believe your skills in Snowflake/Snowpark/Python/SQL & PySpark align perfectly with what we are seeking.

Details of the Walk-in Drive:
- Date: 5th April [Saturday] 2025
- Experience: 4 years to 12 years
- Time: 9.00 AM to 5 PM
- Venue: Hexaware Technologies Limited, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pimpri-Chinchwad, Pune, Maharashtra 411057
- Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
- Work Location: Open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)

Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in the following: Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, DWH.

Roles and Responsibilities:
- 4-15 years of total IT experience on any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 3 years of experience in querying and processing data using Python
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with data loading features of Snowflake such as Stages, Streams, Tasks, and Snowpipe
- Working knowledge of processing semi-structured data

What to Bring: updated resume; photo ID; passport-size photo.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.

This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: candidates with less than 4 years of total experience will not be shortlisted for the interview.
Posted 2 months ago
5 - 10 years
13 - 23 Lacs
Bengaluru
Work from Office
Hi,

Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- Current CTC
- Expected CTC
- Notice period
- Current location
- Are you serving notice period / immediate?
- Experience in Snowflake

Shift timings: 2:00 PM - 11:00 PM (free cab facility for drop) + food.

Please let me know if any of your friends are looking for a job change; kindly share references. Only serving / immediate candidates can apply.

Interview Process: 2 rounds (virtual) + final round (F2F)

Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2
Preferred skills: Any ETL tools

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara, towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 2 months ago
5 - 10 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi,

Greetings from Sun Technology Integrators!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- Current CTC
- Expected CTC
- Notice period
- Current location
- Are you serving notice period / immediate?
- Experience in Snowflake
- Experience in Matillion

Shift timings: 2:00 PM - 11:00 PM (free cab facility for drop) + food.

Please let me know if any of your friends are looking for a job change; kindly share references. Only serving / immediate candidates can apply.

Interview Process: 2 rounds (virtual) + final round (F2F)

Please note: WFO - Work From Office (no hybrid or work from home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSIR, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara, towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 2 months ago
4 - 9 years
10 - 20 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Hexaware Technologies is conducting a walk-in drive on 5th April 2025 in Pune; the required details are below. Interested candidates, please walk in to the given venue with your updated CV.

- Required skills: Snowflake, Python, SQL
- Total experience: 4 to 9 years (relevant experience: minimum 4 years)
- Work location: Chennai, Bangalore, Mumbai, Pune
- Work mode: Hybrid
- Interview date: 5th April 2025
- Interview mode: Face to face
- Interview timing: 9 AM to 12 PM
- Venue: Phase 3, Hinjawadi Rajiv Gandhi Infotech Park, Hinjawadi, Pune, Pimpri-Chinchwad, Maharashtra 411057
- Point of contact: Gopinath R

Regards,
Gopinath R
Posted 2 months ago
4 - 8 years
8 - 15 Lacs
Hyderabad
Work from Office
Hi,

Greetings of the day. We have an ongoing requirement for a "Snowflake Developer" at our Hyderabad location. Requesting you to revert with your updated profile if you're interested. Immediate joiners preferred.

Mode of Work: Work From Office only (Hyderabad)

Interested candidates can share their updated resume to skeramadha@adaequare.com.

Designation: Snowflake Developer / Data Engineer
Experience: 4-7 years

Job Description:
- Should have 2+ years of experience in Snowflake development
- Should have 2+ years of experience in SQL
- Good to have 1+ years of experience in Python development
- Good to have 1+ years of experience with Snowflake Streamlit (UI)

Thanks & Regards,
Saikumar Eramadha
ADAEQUARE | TAXILLA | UDYOG
Designation: Sr. Executive, Talent Acquisition
Email: Skeramadha@adaequare.com
Address: North East Block, First Floor, Display Building, NAC Campus, Cyberabad, Kondapur (Post), Hyderabad 500 084
URL: www.adaequare.com | www.taxilla.com | www.udyogsoftware.com
ISO 27001 | Microsoft Gold Certified | SAP Hybris Partner | SAP Integration Partner
Posted 2 months ago
7 - 12 years
15 - 25 Lacs
Delhi NCR, Bengaluru, Hyderabad
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry
- Working experience building productionized data ingestion and processing pipelines in Snowflake
- Strong understanding of Snowflake architecture; fully well-versed in data warehousing concepts
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools
- Able to create data pipelines for ETL/ELT
- Good to have dbt experience
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
- Able to create high-level and low-level design documents based on requirements
- Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud
- Awareness of data visualisation tools and methodologies
- Works independently on business problems and generates meaningful insights
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI (not mandatory)
- Should have experience implementing Snowflake best practices
- Snowflake SnowPro Core certification will be an added advantage

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with the offshore team, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type 2.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines, and optimizing and tuning data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools, and experience with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Senior Snowflake Data Engineer.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, Data Modeling & Data Warehousing concepts.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.
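The responsibilities above include implementing CDC and SCD Type 2. In Snowflake this is typically a MERGE statement over the dimension table; as a hedged, library-free sketch (all names and date strings are illustrative), here is the core SCD Type 2 rule in plain Python: expire the current version of a changed row, then append a new current version:

```python
def scd2_apply(dim_rows, incoming, as_of):
    """Apply one incoming record to an SCD Type 2 dimension table.
    Each dimension row carries valid_from / valid_to / is_current flags."""
    key = incoming["id"]
    for row in dim_rows:
        if row["id"] == key and row["is_current"]:
            if row["attrs"] == incoming["attrs"]:
                return dim_rows  # no attribute change, nothing to do
            # Expire the old version instead of overwriting it.
            row["valid_to"] = as_of
            row["is_current"] = False
            break
    dim_rows.append({"id": key, "attrs": incoming["attrs"],
                     "valid_from": as_of, "valid_to": None,
                     "is_current": True})
    return dim_rows

dim = [{"id": 1, "attrs": {"city": "Pune"},
        "valid_from": "2024-01-01", "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, {"id": 1, "attrs": {"city": "Mumbai"}}, "2025-04-01")
# The dimension now holds two versions of id 1; only the newest is current.
```

A Snowflake MERGE expressing the same logic would match on the business key and `is_current`, update-to-expire on change, and insert the new version; the Python version only shows the row-versioning semantics.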
Posted 2 months ago
5 - 10 years
10 - 20 Lacs
Chennai, Pune, Noida
Work from Office
Interested candidates can share resumes at deepali.rawat@rsystems.com.

Must Have:
- SQL
- dbt
- Python
- Data quality & data modelling

Good to Have:
- Snowflake DB, Snowpipe, Fivetran

The resource should be an expert in dbt and SQL, and should be able to develop and maintain dbt models, understand data flow, and perform data quality checks and testing of data using dbt.
Posted 2 months ago
4 - 9 years
0 Lacs
Mysore, Bengaluru, Hyderabad
Hybrid
Open & Direct Walk-in Drive event | Hexaware technologies SNOWFLAKE & SNOWPARK Data Engineer/Architect in Bangalore, Karnataka on 29th March [Saturday] 2025 - Snowflake/ Snowpark/ SQL & Pyspark Dear Candidate, I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Bangalore, Karnataka on 29th March [Saturday] 2025 , and we believe your skills in Snowflake/ SNOWPARK / Python/ SQL & Pyspark align perfectly with what we are seeking. Experience Level: 4 years to 12 years Details of the Walk-in Drive: Date: 29th March [Saturday] 2025 Experience: 5 years to 15 years Time: 9.30 AM to 4PM Point of Contact: Azhagu Kumaran Mohan/+91-9789518386 Venue: Hexaware Technologies Ltd, Shanti Niketan, 11th Floor, Crescent - 2 Prestige, Whitefield Main Rd, Mahadevapura, Bengaluru, Karnataka 560048 Work Location: Open (Hyderabad/Bangalore / Pune/ Mumbai/ Noida/ Dehradun/ Chennai/ Coimbatore) Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in the following: SNOWFLAKE Python Fivetran SNOWPARK & SNOWPIPE SQL Pyspark/Spark DWH Roles and Responsibilities: As a part of our dynamic team, you will be responsible for: 4 - 15 years of Total IT experience on any ETL/Snowflake cloud tool. Min 3 years of experience in Snowflake Min 3 year of experience in query and processing data using python. Strong SQL with experience in using Analytical functions, Materialized views, and Stored Procedures Experience in Data loading features of Snowflake like Stages, Streams, Tasks, and SNOWPIPE. Working knowledge on Processing Semi-Structured data What to Bring: Updated resume Photo ID, Passport size photo How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. 
This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: candidates with less than 4 years of total experience will not be shortlisted for the interview.
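The posting above asks for experience processing semi-structured data in Snowflake (the VARIANT type, typically expanded with LATERAL FLATTEN). As a rough illustration of what that operation does, here is a minimal pure-Python analogue; the sample payload and field names are invented for the example.

```python
import json

def flatten(records, array_key):
    """Expand each element of a nested array into its own row --
    roughly what Snowflake's LATERAL FLATTEN does to a VARIANT column."""
    rows = []
    for rec in records:
        for item in rec.get(array_key, []):
            row = {k: v for k, v in rec.items() if k != array_key}
            row[array_key] = item
            rows.append(row)
    return rows

# Hypothetical VARIANT-style payload, e.g. parsed from a staged JSON file
raw = json.loads('[{"order_id": 1, "items": ["pen", "book"]}]')
print(flatten(raw, "items"))
# Each array element becomes one output row
```

In Snowflake itself the same expansion would be a `SELECT ... FROM t, LATERAL FLATTEN(input => t.col)` query; the Python version is just to make the row-multiplication behavior concrete.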
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Hyderabad
Remote
Key Responsibilities:
Snowflake Architecture & Setup: Design and implement Snowflake environments, ensuring best practices in RBAC, network security policies, and external access integrations.
Iceberg Catalog Implementation: Configure and manage Apache Iceberg catalogs within Snowflake and integrate with Azure ADLS Gen2 for external storage.
External Storage & Access: Set up external tables, storage integrations, and access policies for ADLS Gen2, AWS S3, and GCS.
Data Ingestion & Streaming: Implement Snowpipe, Dynamic Tables, and batch/streaming ETL pipelines for real-time and scheduled data processing.
CI/CD & Automation: Develop CI/CD pipelines for Snowflake schema changes, security updates, and data workflows using Terraform, dbt, GitHub Actions, or Azure DevOps.
Snowflake Notebooks & Snowpark: Utilize Snowflake Notebooks for analytics and data exploration, and develop Snowpark applications for machine learning and complex data transformations using Python, Java, or Scala.
Security & Compliance: Implement RBAC, Okta SSO authentication, OAuth, network security policies, and governance frameworks for Snowflake environments.
Notification & Monitoring Integration: Set up event-driven notifications and alerting using Azure Event Grid, SNS, or cloud-native services.
Performance & Cost Optimization: Continuously monitor query performance, warehouse utilization, cost estimates, and optimizations to improve efficiency.
Documentation & Best Practices: Define best practices for Snowflake architecture, automation, security, and performance tuning.

Required Skills & Experience:
7+ years of experience in data architecture and engineering, specializing in Snowflake
Expertise in SQL, Python, and Snowpark APIs
Hands-on experience with Iceberg catalogs, Snowflake Notebooks, and external storage (Azure ADLS Gen2, S3, GCS)
Strong understanding of CI/CD for Snowflake, including automation with Terraform, dbt, and DevOps tools
Experience with Snowpipe, Dynamic Tables, and real-time/batch ingestion pipelines
Proven ability to analyze and optimize Snowflake performance, storage costs, and compute efficiency
Knowledge of Okta SSO, OAuth, federated authentication, and network security in Snowflake
Cloud experience in Azure, AWS, or GCP, including cloud networking and security configurations

Additional Details: This is a contractual position for a duration of 6-12 months, and it is a completely remote opportunity.
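The architect role above calls out warehouse cost estimation and optimization. A back-of-the-envelope model is possible because standard warehouse credit consumption doubles with each size step (XS = 1 credit/hour, S = 2, M = 4, and so on); the dollar price per credit below is an assumed figure, as actual rates depend on edition and region.

```python
# Credits per hour double with each standard warehouse size step
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def monthly_cost(size, hours_per_day, price_per_credit=3.0):
    """Rough monthly warehouse cost estimate over a 30-day month.
    price_per_credit is an assumed rate, not a published price."""
    return CREDITS_PER_HOUR[size] * hours_per_day * 30 * price_per_credit

# Effect of right-sizing: M running 8h/day vs S running 4h/day after tuning
print(monthly_cost("M", 8))  # 2880.0
print(monthly_cost("S", 4))  # 720.0
```

Sketches like this are useful for comparing scale-up versus scale-out options before checking the real numbers in Snowflake's usage views.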
Posted 2 months ago
4 - 9 years
5 - 15 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
Job Title: Snowflake Developer with PL/SQL
Experience: 4+ years
Location: Bangalore/Hyderabad/Pune

Job Description: We are looking for a highly skilled Snowflake Developer with strong expertise in PL/SQL to join our team. In this role, you will design, implement, and maintain data pipelines and solutions on the Snowflake data platform. You will work closely with other teams to ensure seamless data integration, optimize query performance, and create efficient ETL processes using Snowflake and PL/SQL.

Requirements:
Proven experience with the Snowflake data platform and SQL-based development
Strong expertise in PL/SQL, with experience writing complex stored procedures and functions
Hands-on experience in developing and optimizing ETL processes
Familiarity with data modeling, data warehousing, and data integration best practices
Knowledge of cloud data platforms, data migration, and data transformation techniques
Excellent problem-solving skills and the ability to work in a collaborative environment
Strong communication skills to work effectively with cross-functional teams

Preferred Qualifications:
Experience with Snowflake data sharing and Snowflake-specific features (e.g., Snowpipe, Streams, Tasks)
Familiarity with cloud computing platforms like AWS, Azure, or Google Cloud is an added advantage

Interested candidates can share their resume with saritha.yennapally@relanto.ai, or refer someone who is looking for a job change.

Regards,
TA Team
Posted 2 months ago
4 - 8 years
5 - 12 Lacs
Mumbai Suburbs
Work from Office
4+ years of experience working with Snowflake and cloud data platforms
Strong expertise in SQL
Experience with ETL/ELT tools
Familiarity with cloud platforms (AWS, Azure, or GCP)
Immediate joiner (within 2 weeks)
Posted 2 months ago
5 - 8 years
12 - 18 Lacs
Chennai, Noida
Hybrid
Greetings from Sopra Steria! Hiring for Snowflake Developer
Location: Noida & Chennai
Experience: 5 - 8 Years
Work Type: Permanent
Work Mode: Hybrid

Job Description: We are seeking a skilled Data Transformation Specialist to join our team and play a key role in implementing data solutions. Our partner, RSMB, will be designing a range of data solutions throughout the delivery phase.

Key Responsibilities:
Develop and implement data transformation solutions based on the designs provided by RSMB
Write, optimize, and maintain complex SQL queries to ensure efficient data processing
Work within the Snowflake environment, ensuring best practices for data storage, processing, and retrieval
Collaborate with stakeholders, including data architects and analysts, to ensure smooth execution of data solutions
Troubleshoot and resolve performance issues in SQL queries and Snowflake environments
Leverage Python for scripting and automation of data processing tasks
Ensure data integrity, quality, and compliance with industry standards

Interested candidates should send their resume to gokulakrishnan.b@soprasteria.com for further rounds of the interview process.

Regards,
Gokulakrishnan B
Talent Acquisition
Posted 2 months ago
6 - 10 years
15 - 30 Lacs
Chennai, Hyderabad, Kolkata
Work from Office
About Client: Hiring for one of our multinational corporations!

Job Title: Snowflake Developer
Qualification: Graduate
Relevant Experience: 6 to 8 years

Must-Have Skills:
Snowflake
Python
SQL

Roles and Responsibilities:
Design, develop, and optimize Snowflake-based data solutions
Write and maintain Python scripts for data processing and automation
Work with cross-functional teams to implement scalable data pipelines
Ensure data security and performance tuning in Snowflake
Debug and troubleshoot database and data processing issues

Location: Kolkata, Hyderabad, Chennai, Mumbai
Notice Period: Up to 60 days
Mode of Work: On-site

Thanks & Regards,
Nushiba Taniya M
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India
Direct Number: 08067432408 | Nushiba@blackwhite.in | www.blackwhite.in
Posted 2 months ago
3 - 6 years
18 - 20 Lacs
Chennai, Bengaluru
Work from Office
Role & Responsibilities:
Manages and optimizes the performance, security, and reliability of databases on the Snowflake Data Cloud platform, covering the areas listed below.
Experience in cloud (Azure or GCP) and automation platforms (Terraform, Azure DevOps, Python or shell scripting) is a must.
Good to have: DBA experience in other database platforms such as PostgreSQL, MySQL, and MongoDB.
Intermediate-level understanding of Snowflake architecture and database concepts.

Areas of responsibility:
Security and compliance
Infrastructure
Database administration
Tools
User management
Storage
Cost control
Updates
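The DBA posting above pairs Python/shell automation with cost control. One routine check of that kind is scanning warehouse settings for overly long (or disabled) auto-suspend timers, which keep compute billing after queries finish. A minimal sketch, using invented rows shaped like the output a DBA script might collect from `SHOW WAREHOUSES`:

```python
def flag_idle_wasters(warehouses, max_suspend_secs=300):
    """Return names of warehouses whose auto-suspend exceeds the policy
    threshold, or is disabled entirely (None) -- a common cost-control check."""
    return [
        w["name"]
        for w in warehouses
        if w["auto_suspend"] is None or w["auto_suspend"] > max_suspend_secs
    ]

# Hypothetical rows; real values would come from SHOW WAREHOUSES
rows = [
    {"name": "ETL_WH", "auto_suspend": 60},
    {"name": "BI_WH", "auto_suspend": 3600},
    {"name": "ADHOC_WH", "auto_suspend": None},  # never suspends
]
print(flag_idle_wasters(rows))  # -> ['BI_WH', 'ADHOC_WH']
```

In practice such a check would run on a schedule and feed alerting, but the filtering logic is the whole idea.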
Posted 3 months ago