Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
5.0 - 10.0 years
20 - 27 Lacs
Kochi, Chennai, Thiruvananthapuram
Work from Office
Snowflake Data Warehouse Development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure effective use of Snowflake's features for scalable, efficient, and high-performance data storage and processing.
Data Pipeline Development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems.
Data Transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.
Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.
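The cleansing and validation work described above can be sketched in Snowflake SQL (an equivalent transform could be written with PySpark/Snowpark DataFrames); the database, table, and column names here are hypothetical:

```sql
-- Hypothetical cleansing/validation step: raw.customers -> clean.customers
CREATE OR REPLACE TABLE clean.customers AS
SELECT
    TRIM(customer_name)           AS customer_name,
    TRY_TO_DATE(signup_date_raw)  AS signup_date,        -- NULL when unparseable
    NULLIF(TRIM(email), '')       AS email               -- empty strings become NULL
FROM raw.customers
WHERE TRY_TO_DATE(signup_date_raw) IS NOT NULL           -- validation gate: drop bad dates
  AND email ILIKE '%@%';                                 -- crude sanity check on email
```

`TRY_TO_DATE` returns NULL instead of erroring on malformed input, which is what makes it useful as a validation gate in loads like this.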
Posted 3 weeks ago
5 - 10 years
0 - 1 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
JD for Snowflake Admin
Key Responsibilities: Administer and manage Snowflake environments, including user roles, access control, and resource monitoring. Develop, test, and deploy ELT/ETL pipelines using Snowflake SQL and other tools (e.g., Informatica, dbt, Matillion). Monitor query performance and storage utilization; implement performance tuning and optimization strategies. Manage and automate tasks such as warehouse scaling, Snowpipe ingestion, and task scheduling. Work with semi-structured data formats (JSON, XML, Avro, Parquet) using VARIANT and related functions. Set up and manage data sharing, replication, and failover across Snowflake accounts. Implement and manage security best practices, including RBAC, masking policies, and object-level permissions. Collaborate with Data Engineers, Architects, and BI teams to support analytics use cases.
Required Skills: Strong hands-on experience with Snowflake architecture, SQL, and performance tuning. Experience with Snowflake features such as Streams, Tasks, Time Travel, Cloning, and External Tables. Proficiency in working with SnowSQL and managing CLI-based operations. Knowledge of cloud platforms (AWS/Azure/GCP) and integration with Snowflake. Experience with data ingestion tools and scripting languages (Python, Shell, etc.). Good understanding of CI/CD pipelines and version control (Git).
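Two of the admin tasks listed above, querying semi-structured data via VARIANT and applying a masking policy, might look roughly like this (all object and role names are illustrative):

```sql
-- Unnest a JSON array held in a VARIANT column
SELECT payload:order_id::NUMBER AS order_id,
       item.value:sku::STRING   AS sku
FROM raw.orders,
     LATERAL FLATTEN(input => payload:items) item;

-- Column-level masking tied to a role, per the RBAC/masking duties above
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PII_READER' THEN val ELSE '***MASKED***' END;

ALTER TABLE raw.orders MODIFY COLUMN customer_email
  SET MASKING POLICY email_mask;
```

The same `FLATTEN` pattern applies to Avro and Parquet data once it has been loaded into VARIANT columns.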
Posted 1 month ago
5 - 10 years
3 - 7 Lacs
Chennai
Work from Office
Snowflake Developer
Mandatory skills: Snowflake DB developer + Python & Unix scripting + SQL queries
Location: Chennai | NP: 0 to 30 days | Exp: 5 to 10 years
Skill set: Snowflake, Python, SQL, and PBI developer. Understand and implement Data Security and Data Modelling. Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake. Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake and from Snowflake to other systems. Understand cloud architecture. Can develop and design PBI dashboards, reports, and data visualizations. Communication skills.
Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous, and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes, and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliver (Performance Parameter - Measure):
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management - Productivity, efficiency, absenteeism
3. Capability Development - Triages completed, Technical Test performance
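As a rough sketch of the JavaScript stored-procedure work this posting calls for, a minimal Snowflake procedure (the procedure name and retention logic are invented for illustration) could be:

```sql
-- Illustrative JavaScript stored procedure: delete rows older than N days
CREATE OR REPLACE PROCEDURE purge_stale_rows(TABLE_NAME STRING, DAYS FLOAT)
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  // NOTE: TABLE_NAME is interpolated for brevity; validate it in real code
  var stmt = snowflake.createStatement({
    sqlText: "DELETE FROM " + TABLE_NAME +
             " WHERE updated_at < DATEADD(day, -" + DAYS + ", CURRENT_TIMESTAMP())"
  });
  stmt.execute();
  return "purged rows older than " + DAYS + " days from " + TABLE_NAME;
$$;

CALL purge_stale_rows('STAGING.EVENTS', 30);
```

Snowflake also supports Python-language procedures via Snowpark, with the same CREATE PROCEDURE surface but a Python handler instead of the JavaScript body.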
Posted 1 month ago
4 - 8 years
7 - 14 Lacs
Pune
Hybrid
Role: Snowflake Developer
Experience: 4 to 6 years
Key responsibilities: Perform development & support activities for the data warehousing domain. Understand High Level Design and Application Interface Design, and build Low Level Design. Perform application analysis and propose technical solutions for application enhancements or to resolve production issues. Perform development & deployment: should be able to code, unit test, and deploy. Create necessary documentation for all project deliverable phases. Handle production issues (Tier 2 support, weekend on-call rotation) to resolve production issues and ensure SLAs are met.
Technical Skills (Mandatory): In-depth knowledge of SQL, Unix & advanced Unix shell scripting. Should have a very clear understanding of Snowflake architecture. At least 4+ years' hands-on experience on Snowflake: SnowSQL, COPY command, stored procedures, performance tuning, and other advanced features like Snowpipe, semi-structured data load, and types of tables. Hands-on experience with file transfer mechanisms (NDM, SFTP, Data Router, etc.). Knowledge of schedulers like TWS. Snowflake certification.
Good to have: Python; experience loading AVRO/PARQUET files to Snowflake; Informatica.
Pune, hybrid (min 3 days work from office). Shift: 1 PM to 10 PM. 2 rounds of interview. NP: immediate joiners to 15 days (only NP-serving candidates). Location: Magarpatta City, Pune (Hybrid). Excellent communication skills. Interested candidates, share your resume at dipti.bhaisare@in.experis.com
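The COPY-command and semi-structured-load experience asked for above can be illustrated with a short SnowSQL fragment (the stage URL and table names are placeholders; a real external stage also needs credentials or a storage integration):

```sql
-- Bulk load Parquet files from an external stage with COPY
CREATE OR REPLACE STAGE landing_stage
  URL = 's3://example-bucket/landing/'          -- placeholder bucket
  FILE_FORMAT = (TYPE = PARQUET);

COPY INTO analytics.trades
  FROM @landing_stage
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE       -- map Parquet fields to columns by name
  ON_ERROR = 'SKIP_FILE';
```

Pointing a Snowpipe at the same stage turns this one-off COPY into continuous ingestion.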
Posted 1 month ago
3 - 6 years
5 - 12 Lacs
Hyderabad
Work from Office
Role and Responsibilities: Establish, configure, and manage Git repositories to support version control and collaboration. Develop and troubleshoot procedures, views, and complex PL/SQL queries, ensuring effective integration and functionality within Git environments. Experience with tools like SQL Developer, TOAD, or similar. Develop complex SQL queries, scripts, and stored procedures to support application and reporting needs. Write SQL queries to extract, manipulate, and analyze data from databases. Optimize queries to improve performance and reduce execution time. Create and maintain database tables, views, indexes, and stored procedures. Design, implement, and optimize relational database schemas, tables, and indexes. Create and maintain database triggers, functions, and packages using PL/SQL.
Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Comprehensive expertise in SQL, Snowflake, and version control tools such as Git and SVN. Minimum of 3 years of experience in application support and maintenance. Proven ability to communicate complex technical concepts effectively. Demonstrated ability to exhibit client empathy and ensure customer satisfaction with issue resolution. Strong written and verbal communication skills. Adept at presenting intricate technical information in an accessible manner to varied audiences. Ability to thrive in a fast-paced, dynamic environment with high levels of ambiguity. Practical problem-solving skills focused on resolving immediate customer issues while planning for long-term solutions. Highly organized and process-oriented, with a proven track record of driving issue resolution by collaborating across multiple teams. Strong interpersonal skills with a customer-centric approach, maintaining patience and composure under pressure during real-time issue resolution. Working knowledge of DSP/SSP platforms is an added advantage.
Open to work in night shifts in a 24/7 project.
Posted 1 month ago
6 - 10 years
8 - 18 Lacs
Kolhapur, Hyderabad, Chennai
Work from Office
Relevant Exp: 5+ yrs. Mandatory Skills: Snowflake architecture, Matillion, SQL, Python, SnowSQL, any cloud exp. Night shift (6 PM to 3 AM). Complete WFO - 5 days. Email Id: anusha@akshayaitsolutions.com. Loc: Hyd/Ban/Chennai/Kolhapur
Posted 1 month ago
3 - 8 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer. Experience Required: 3 to 8 yrs. Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune. Required Skills: Snowflake. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 1 month ago
3 - 6 years
10 - 20 Lacs
Hyderabad, Bangalore/Bengaluru
Work from Office
Snowflake Developer
Posted 1 month ago
5 - 7 years
5 - 15 Lacs
Pune, Chennai, Bengaluru
Work from Office
Experience: 5+ years of relevant experience.
We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.
Responsibilities:
Leadership & Delivery: Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions. Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards. Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment. Contribute to project planning, estimation, and risk management activities.
Snowflake Expertise: Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes. Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation. Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed. Ensure data security and implement appropriate access controls within the Snowflake environment. Monitor and optimize the performance of Snowflake queries and data pipelines. Integrate PySpark with Snowflake for data ingestion and processing. Understand and apply PySpark best practices and performance tuning techniques. Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).
ETL & Data Warehousing: Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques. Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake. Work with both structured and semi-structured data, including JSON and XML. Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.
Requirements Gathering & Clarification: Actively participate in requirement gathering sessions with business users and stakeholders. Translate business requirements into clear and concise technical specifications and design documents. Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs. Validate proposed solutions with users to ensure they meet expectations.
Collaboration & Communication: Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems. Communicate effectively with both technical and non-technical stakeholders. Provide regular updates on progress and any potential roadblocks.
Best Practices & Continuous Improvement: Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes. Stay up-to-date with the latest Snowflake features and industry trends. Identify opportunities for process improvement and optimization.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake. Strong proficiency in SQL and experience working with large datasets. Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas). Experience in designing and developing ETL or ELT pipelines. Proven ability to gather and document business and technical requirements.
Excellent communication, interpersonal, and problem-solving skills. Snowflake certifications (e.g., SnowPro Core) are a plus.
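The Snowflake utilities this role names, Streams, Tasks, Time Travel, and Cloning, fit together roughly as below (warehouse, table, and column names are examples only):

```sql
-- Stream + Task: incremental merge of changed rows every five minutes
CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO dw.orders t
  USING orders_stream s ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);

ALTER TASK merge_orders RESUME;   -- tasks are created suspended

-- Time Travel and zero-copy cloning
SELECT COUNT(*) FROM dw.orders AT (OFFSET => -3600);  -- table as of one hour ago
CREATE TABLE dw.orders_backup CLONE dw.orders;
```

Consuming the stream inside the MERGE advances its offset, so each task run processes only rows changed since the previous run.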
Posted 1 month ago
8 - 13 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP: C.CTC, E.CTC, Notice Period, Current location, Are you serving notice period/immediate, Exp in Snowflake, Exp in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab facility - drop) + food. Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply. Interview Process: 1 Round (Virtual) + Final Round (F2F). Please Note: WFO - Work From Office (no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2. Preferred skills: SSIR, SSIS, Informatica, Shell Scripting. Venue Details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and Regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd. nandinis@suntechnologies.com www.suntechnologies.com
Posted 1 month ago
3 - 8 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP: C.CTC, E.CTC, Notice Period, Current location, Are you serving notice period/immediate, Exp in Snowflake, Exp in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab facility - drop) + food. Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply. Interview Process: 2 Rounds (Virtual) + Final Round (F2F). Please Note: WFO - Work From Office (no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2. Preferred skills: SSIR, SSIS, Informatica, Shell Scripting. Venue Details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and Regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd. nandinis@suntechnologies.com www.suntechnologies.com
Posted 1 month ago
3 - 8 years
6 - 16 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP. Shift timings: 2:00 PM-11:00 PM (free cab facility - drop) + food. Please let me know if any of your friends are looking for a job change; kindly share references. Please Note: WFO - Work From Office (no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2. Preferred skills: any ETL tools. Venue Details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and Regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd. nandinis@suntechnologies.com www.suntechnologies.com
Posted 1 month ago
4 - 8 years
0 - 1 Lacs
Mohali
Work from Office
Job Title: Snowflake Developer (4+ years' experience)
Location: F, 384, Sector 91 Rd, Phase 8B, Industrial Area, Sector 91, Sahibzada Ajit Singh Nagar, Punjab 160055.
Job Type: Full-time (in-house)
Job Overview: We are looking for an experienced Snowflake Developer with 4+ years of hands-on experience in Snowflake Data Warehouse and related tools. You will be responsible for building, managing, and optimizing Snowflake data pipelines, assisting in data integration, and contributing to the overall data architecture. The ideal candidate should have a strong understanding of data modeling and ETL processes, and experience working with cloud-based data platforms.
Responsibilities: Design, develop, and maintain Snowflake data warehouses. Create and manage Snowflake schemas, tables, views, and materialized views. Implement ETL processes to integrate data from various sources into Snowflake. Optimize query performance and data storage in Snowflake. Work with stakeholders to define data requirements and provide technical solutions. Collaborate with Data Engineers, Data Scientists, and Analysts to build efficient data pipelines. Monitor and troubleshoot performance issues in Snowflake environments. Automate repetitive data processes and report generation tasks. Ensure data integrity, security, and compliance with data governance policies. Assist in data migration and platform upgrades.
Required Skills: 4+ years of experience working with Snowflake Data Warehouse. Proficient in SQL, SnowSQL, and ETL processes. Strong experience in data modeling and schema design in Snowflake. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with data pipelines, data lakes, and data integration tools. Experience in query optimization and performance tuning in Snowflake. Understanding of data governance and best practices. Strong knowledge of data security and privacy policies in a cloud environment. Experience using tools like dbt, Airflow, or similar orchestration tools is a plus.
Salary: No bar for deserving candidates.
Location: Mohali, Punjab (work from office)
Shift: Night shift
Other Benefits: 5 days working; US-based work culture and environment; indoor and outdoor events; paid leaves; health insurance; employee engagement activities like month-end & festival celebrations, team outings, and birthday celebrations; gaming and sports area.
Please comment/DM to know more. You may also e-mail your resume to me at priyankaaggarwal@sourcemash.com
Posted 1 month ago
3 - 8 years
15 - 25 Lacs
Bhubaneshwar, Bengaluru, Hyderabad
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer. Experience Required: 3 to 10 yrs. Work Location: Bangalore/Bhubaneswar/Hyderabad. Required Skills: Snowflake developer, Snowpipe, SQL. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.
Posted 1 month ago
3 - 8 years
15 - 25 Lacs
Bengaluru, Hyderabad, Noida
Hybrid
Warm Greetings from SP Staffing!! Role: Snowflake Developer. Experience Required: 3 to 10 yrs. Work Location: Noida/Gurgaon/Pune/Bangalore/Bhubaneswar/Kochi. Required Skills: Snowflake developer, Snowpipe. Interested candidates can send resumes to nandhini.spstaffing@gmail.com or ping me on WhatsApp at 8148043843.
Posted 1 month ago
5 - 10 years
0 Lacs
Mysore, Bengaluru, Kochi
Hybrid
Open & Direct Walk-in Drive event | Hexaware Technologies | Snowflake & Python Data Engineer/Architect in Bangalore, Karnataka on 12th April [Saturday] 2025 - Snowflake/Python/SQL & PySpark
Dear Candidate, I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Bangalore, Karnataka on 12th April [Saturday] 2025, and we believe your skills in Snowflake/SNOWPARK/Python/SQL & PySpark align perfectly with what we are seeking.
Details of the Walk-in Drive:
Date: 12th April [Saturday] 2025
Experience: 4 years to 12 years
Time: 9.00 AM to 5 PM
Venue: Hotel Grand Mercure Bangalore, 12th Main Rd, 3rd Block, Koramangala, Bengaluru, Karnataka 560034
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
Work Location: Open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)
Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in the following: SNOWFLAKE, Python, Fivetran, SNOWPARK & SNOWPIPE, SQL, PySpark/Spark, DWH.
Roles and Responsibilities: 4-15 years of total IT experience on any ETL/Snowflake cloud tool. Min 3 years of experience in Snowflake. Min 3 years of experience in querying and processing data using Python. Strong SQL, with experience in using analytical functions, materialized views, and stored procedures. Experience in data loading features of Snowflake like Stages, Streams, Tasks, and SNOWPIPE. Working knowledge of processing semi-structured data.
What to Bring: Updated resume, photo ID, passport-size photo.
How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.
This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team. Note: candidates with less than 4 years of total experience will not be screen-selected for the interview.
Posted 2 months ago
5 - 10 years
0 Lacs
Pune, Nagpur, Mumbai (All Areas)
Hybrid
Open & Direct Walk-in Drive event | Hexaware Technologies | SNOWFLAKE & SNOWPARK Data Engineer/Architect in Pune, Maharashtra on 5th April [Saturday] 2025 - Snowflake/Snowpark/SQL & PySpark
Dear Candidate, I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Pune, Maharashtra on 5th April [Saturday] 2025, and we believe your skills in Snowflake/SNOWPARK/Python/SQL & PySpark align perfectly with what we are seeking.
Details of the Walk-in Drive:
Date: 5th April [Saturday] 2025
Experience: 4 years to 12 years
Time: 9.00 AM to 5 PM
Venue: Hexaware Technologies Limited, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pimpri-Chinchwad, Pune, Maharashtra 411057
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
Work Location: Open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)
Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in the following: SNOWFLAKE, Python, Fivetran, SNOWPARK & SNOWPIPE, SQL, PySpark/Spark, DWH.
Roles and Responsibilities: 4-15 years of total IT experience on any ETL/Snowflake cloud tool. Min 3 years of experience in Snowflake. Min 3 years of experience in querying and processing data using Python. Strong SQL, with experience in using analytical functions, materialized views, and stored procedures. Experience in data loading features of Snowflake like Stages, Streams, Tasks, and SNOWPIPE. Working knowledge of processing semi-structured data.
What to Bring: Updated resume, photo ID, passport-size photo.
How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.
This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team. Note: candidates with less than 4 years of total experience will not be screen-selected for the interview.
Posted 2 months ago
5 - 10 years
13 - 23 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP: C.CTC, E.CTC, Notice Period, Current location, Are you serving notice period/immediate, Exp in Snowflake. Shift timings: 2:00 PM-11:00 PM (free cab facility - drop) + food. Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply. Interview Process: 2 Rounds (Virtual) + Final Round (F2F). Please Note: WFO - Work From Office (no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2. Preferred skills: any ETL tools. Venue Details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and Regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd. nandinis@suntechnologies.com www.suntechnologies.com
Posted 2 months ago
5 - 10 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP: C.CTC, E.CTC, Notice Period, Current location, Are you serving notice period/immediate, Exp in Snowflake, Exp in Matillion. Shift timings: 2:00 PM-11:00 PM (free cab facility - drop) + food. Please let me know if any of your friends are looking for a job change; kindly share references. Only serving/immediate candidates can apply. Interview Process: 2 Rounds (Virtual) + Final Round (F2F). Please Note: WFO - Work From Office (no hybrid or work from home). Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2. Preferred skills: SSIR, SSIS, Informatica, Shell Scripting. Venue Details: Sun Technology Integrators Pvt Ltd, No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram), Bangalore 560043. Company URL: www.suntechnologies.com. Thanks and Regards, Nandini S | Sr. Technical Recruiter, Sun Technology Integrators Pvt. Ltd. nandinis@suntechnologies.com www.suntechnologies.com
Posted 2 months ago
6 - 11 years
12 - 22 Lacs
Bengaluru, Kochi, Hyderabad
Work from Office
About Client: Hiring for one of our multinational corporations!
Job Title: Snowflake Developer / Snowflake Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 5 to 10 years
Must-Have Skills: Snowflake, Python, PySpark, SQL, AWS/Azure
Roles and Responsibilities: Strong experience in building/designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focusing on large enterprise scale and Snowflake implementations on any of the hyperscalers. Strong experience with building productionized data ingestion and data pipelines in Snowflake. Good experience with Snowflake RBAC and data security. Strong experience with Snowflake features, including new Snowflake features. Good experience in Python/PySpark. Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF). Experience/knowledge of orchestration and scheduling tools such as Airflow. Good understanding of ETL processes and ETL tools.
Location: Bangalore, Kochi, Hyderabad, Nagpur
CTC Range: Up to 30 LPA (Lakhs Per Annum)
Notice period: 90 days
Mode of Interview: Virtual
Mode of Work: Work From Office
Thanks & Regards, SHRIVIDYA, Black and White Business Solutions Pvt. Ltd., Bangalore, Karnataka, India. Direct Number: 08067432410. shrividya@blackwhite.in | www.blackwhite.in
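The Snowflake RBAC experience listed above typically involves grant chains like the following sketch (the role, warehouse, and schema names are examples):

```sql
-- Read-only analyst role wired into the standard role hierarchy
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE  ON WAREHOUSE analytics_wh    TO ROLE analyst_ro;
GRANT USAGE  ON DATABASE  analytics       TO ROLE analyst_ro;
GRANT USAGE  ON SCHEMA    analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES    IN SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT ROLE analyst_ro TO ROLE SYSADMIN;   -- so admins inherit the role
```

The FUTURE TABLES grant is what keeps newly created tables visible to the role without re-running grants after every deployment.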
Posted 2 months ago
4 - 9 years
10 - 20 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Hexaware Technologies is conducting a walk-in drive on 5th April '25 in Pune; below are the required details. Interested candidates, walk in to the venue below with your updated CV. Required skills: Snowflake, Python, SQL. Total experience: 4 to 9 years. Relevant exp: minimum 4 years. Work location: Chennai, Bangalore, Mumbai, Pune. Work mode: Hybrid. Interview date: 5th April '25. Interview mode: face to face. Interview timing: 9 AM to 12 PM. Venue: Phase 3, Hinjawadi Rajiv Gandhi Infotech Park, Hinjawadi, Pune, Pimpri-Chinchwad, Maharashtra 411057. Point of Contact: Gopinath R. Regards, Gopinath R.
Posted 2 months ago
5 - 8 years
18 - 20 Lacs
Pune
Work from Office
Client: Persistent
Role: C2H
Location: PAN India (Hybrid)
POC: Bhajan
Job Description:
Must-have skills: IBM DataStage, SnowSQL
Skill description: ETL, data warehousing, IBM DataStage, testing, SnowSQL
Posted 2 months ago
7 - 12 years
15 - 25 Lacs
Delhi NCR, Bengaluru, Hyderabad
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers toward a goal.
Job Description:
Experience in the IT industry.
Working experience building productionized data ingestion and processing pipelines in Snowflake.
Strong understanding of Snowflake architecture.
Well-versed in data warehousing concepts.
Expertise in and excellent understanding of Snowflake features and the integration of Snowflake with other data processing systems.
Able to create data pipelines for ETL/ELT.
DBT experience is good to have.
Excellent presentation and communication skills, both written and verbal.
Ability to problem-solve and architect in an environment with unclear requirements.
Able to create high-level and low-level design documents based on requirements.
Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
Awareness of data visualization tools and methodologies.
Able to work independently on business problems and generate meaningful insights.
Some experience/knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory.
Should have experience implementing Snowflake best practices.
Snowflake SnowPro Core Certification will be an added advantage.
Roles and Responsibilities:
Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
Some experience with Snowflake RBAC and data security.
Good experience implementing CDC or SCD type-2.
In-depth understanding of data warehouse and ETL concepts and data modeling.
Experience in requirement gathering, analysis, design, development, and deployment.
Experience building data ingestion pipelines.
Ability to optimize and tune data pipelines for performance and scalability.
Able to communicate with clients and lead a team.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
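The SCD type-2 requirement above is typically implemented in Snowflake with a MERGE driven by a Stream; as a platform-neutral illustration of the type-2 logic itself, here is a minimal Python sketch (the `customer_id`/`address` columns are hypothetical):

```python
from datetime import date

def apply_scd2(dimension: list[dict], incoming: dict, today: date) -> list[dict]:
    """Apply one changed record to a type-2 dimension.

    Expires the current row for the business key when its tracked
    attribute changed, then appends a new current row. If nothing
    changed, history is left untouched.
    """
    key = incoming["customer_id"]
    for row in dimension:
        if row["customer_id"] == key and row["is_current"]:
            if row["address"] == incoming["address"]:
                return dimension  # no change: keep history as-is
            row["is_current"] = False  # expire the previous version
            row["valid_to"] = today
            break
    dimension.append({
        "customer_id": key,
        "address": incoming["address"],
        "valid_from": today,
        "valid_to": None,  # open-ended current row
        "is_current": True,
    })
    return dimension
```

In Snowflake itself, the same effect is usually achieved with a `MERGE` plus an `INSERT`, keyed on the business key and an `is_current` flag; the in-memory version above just makes the row lifecycle explicit.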
Good to have: experience with deployment using CI/CD tools and with repositories such as Azure Repos, GitHub, etc.
Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, Computer Engineering, or any equivalent degree, with good IT experience relevant to a Senior Snowflake Data Engineer role.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, data modeling and data warehousing concepts.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 months ago
4 - 9 years
0 Lacs
Mysore, Bengaluru, Hyderabad
Hybrid
Open & Direct Walk-in Drive | Hexaware Technologies: Snowflake & Snowpark Data Engineer/Architect in Bangalore, Karnataka on 29th March (Saturday) 2025 - Snowflake/Snowpark/SQL & PySpark
Dear Candidate,
I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an open walk-in drive in Bangalore, Karnataka on 29th March (Saturday) 2025, and we believe your skills in Snowflake/Snowpark/Python/SQL & PySpark align perfectly with what we are seeking.
Details of the walk-in drive:
Date: 29th March (Saturday) 2025
Experience: 4 to 15 years
Time: 9.30 AM to 4 PM
Point of contact: Azhagu Kumaran Mohan / +91-9789518386
Venue: Hexaware Technologies Ltd, Shanti Niketan, 11th Floor, Crescent - 2 Prestige, Whitefield Main Rd, Mahadevapura, Bengaluru, Karnataka 560048
Work location: Open (Hyderabad/Bangalore/Pune/Mumbai/Noida/Dehradun/Chennai/Coimbatore)
Key skills and experience: As a Data Engineer, we are looking for candidates with expertise in the following: Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, DWH.
Roles and responsibilities: As part of our dynamic team, you will need:
4 to 15 years of total IT experience with any ETL/Snowflake cloud tool.
Minimum 3 years of experience in Snowflake.
Minimum 3 years of experience querying and processing data using Python.
Strong SQL, with experience using analytical functions, materialized views, and stored procedures.
Experience with data-loading features of Snowflake such as stages, Streams, Tasks, and Snowpipe.
Working knowledge of processing semi-structured data.
What to bring: updated resume, photo ID, passport-size photo.
How to register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event.
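"Processing semi-structured data," as required above, usually means flattening nested JSON; Snowflake does this with the VARIANT type and LATERAL FLATTEN. As a rough pure-Python equivalent (the order/item field names are hypothetical):

```python
import json

def flatten_orders(payload: str) -> list[dict]:
    """Explode a nested JSON order document into one row per line item,
    roughly what LATERAL FLATTEN does over a VARIANT column."""
    doc = json.loads(payload)
    return [
        {
            "order_id": doc["order_id"],  # parent attribute repeated per row
            "sku": item["sku"],
            "qty": item["qty"],
        }
        for item in doc.get("items", [])
    ]
```

The Snowflake version would be a `SELECT ... FROM orders, LATERAL FLATTEN(input => raw:items)` over a staged VARIANT column; the Python sketch just shows the same explode-and-project shape.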
This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com or +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.
Note: candidates with less than 4 years of total experience will not be shortlisted for the interview.
Posted 2 months ago
4 - 9 years
5 - 15 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
Job Title: Snowflake Developer with PL/SQL
Experience: 4+ years
Location: Bangalore/Hyderabad/Pune
Job Description: We are looking for a highly skilled Snowflake developer with strong expertise in PL/SQL to join our team. In this role, you will design, implement, and maintain data pipelines and solutions on the Snowflake data platform. You will work closely with other teams to ensure seamless data integration, optimize query performance, and create efficient ETL processes using Snowflake and PL/SQL.
Requirements:
Proven experience with the Snowflake data platform and SQL-based development.
Strong expertise in PL/SQL, with experience writing complex stored procedures and functions.
Hands-on experience developing and optimizing ETL processes.
Familiarity with data modeling, data warehousing, and data integration best practices.
Knowledge of cloud data platforms, data migration, and data transformation techniques.
Excellent problem-solving skills and the ability to work in a collaborative environment.
Strong communication skills to work effectively with cross-functional teams.
Preferred qualifications:
Experience with Snowflake data sharing and Snowflake-specific features (e.g., Snowpipe, Streams, Tasks).
Familiarity with cloud computing platforms like AWS, Azure, or Google Cloud is an added advantage.
Interested candidates can share their resumes to saritha.yennapally@relanto.ai, or you can refer someone who is looking for a job change.
Regards,
TA Team
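The stored-procedure work this role describes often boils down to an upsert: a MERGE keyed on a primary key, updating matched rows and inserting new ones. A minimal, platform-neutral sketch of that logic in Python (table shape and key name are hypothetical):

```python
def merge_rows(target: dict[int, dict], source: list[dict]) -> dict[int, dict]:
    """Upsert source rows into target keyed by "id": update when matched,
    insert when not, mirroring the two branches of a SQL MERGE."""
    for row in source:
        key = row["id"]
        if key in target:
            target[key].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            target[key] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return target
```

In Snowflake or Oracle PL/SQL this would be a single `MERGE INTO target USING source ON target.id = source.id ...` statement; the dictionary version just makes the matched/not-matched branching explicit.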
Posted 2 months ago