5.0 - 10.0 years
15 - 25 Lacs
Chennai
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!!
Job Description:
Exp: 5-12 yrs
Location: Chennai
Skill: Snowflake Developer
Desired Skill Sets:
- Should have strong experience in Snowflake
- Strong experience in AWS and Python
- Experience in ETL tools like Ab Initio, Teradata
Interested candidates can share your resume to sangeetha.spstaffing@gmail.com with the below inline details.
Full Name as per PAN:
Mobile No:
Alt No/ WhatsApp No:
Total Exp:
Relevant Exp in Snowflake Development:
Rel Exp in AWS:
Rel Exp in Python/Ab Initio/Teradata:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for F2F interview on 14th June, Saturday, between 9 AM - 12 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in between your education or career? If having a gap, please mention the duration in months/years:
Posted 3 months ago
5.0 - 7.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Role & responsibilities: Outline the day-to-day responsibilities for this role. Preferred candidate profile: Specify required role expertise, previous job experience, or relevant certifications.
Posted 3 months ago
4.0 - 8.0 years
7 - 17 Lacs
Kolkata, Hyderabad, Pune
Work from Office
Role: Snowflake Developer
Exp: 4+ yrs
Location: PAN INDIA
Posted 3 months ago
3.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
About the Opportunity
Job Type: Application
23 June 2025
Title: Expert Engineer
Department: GPS Technology
Location: Gurugram, India
Reports To: Project Manager
Level: Grade 4
We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.
About your team
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.
About your role
An Expert Engineer is a seasoned technology expert who is highly skilled in programming, engineering and problem-solving. They can deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional and operational requirements most of the time without defects and incidents. So, if a relentless focus and drive towards technical and engineering excellence, along with adding value to the business, excites you, this is absolutely a role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, we are looking for you. Understand system requirements; analyse, design, develop and test the application systems following the defined standards. The candidate is expected to display professional ethics in his/her approach to work and exhibit a high level of ownership within a demanding working environment.
About you
Essential Skills
- Excellent software designing, programming, engineering, and problem-solving skills.
- Strong experience working on data ingestion, transformation and distribution using AWS or Snowflake.
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion / DBT (a brief Snowpipe sketch follows this listing).
- Hands-on working knowledge around EC2, Lambda, ECS/EKS, DynamoDB, VPCs.
- Familiar with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality).
- Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic.
- Designing data ingestion and orchestration pipelines using AWS and Control-M.
- Establish strategies for data extraction, ingestion, transformation, automation, and consumption.
- Experience in Data Lake concepts with structured, semi-structured and unstructured data.
- Experience in creating CI/CD processes for Snowflake.
- Experience in strategies for data testing, data quality, code quality and code coverage.
- Ability, willingness & openness to experiment with, evaluate and adopt new technologies.
- Passion for technology, problem solving and team working.
- Go-getter; ability to navigate across roles, functions and business units to collaborate, drive agreements and changes from drawing board to live systems.
- Lifelong learner who can bring contemporary practices, technologies and ways of working to the organization.
- Effective collaborator, adept at using all effective modes of communication and collaboration tools.
- Experience delivering on data-related non-functional requirements, such as:
  - Hands-on experience dealing with large volumes of historical data across markets/geographies.
  - Manipulating, processing, and extracting value from large, disconnected datasets.
  - Building water-tight data quality gates on investment management data.
  - Generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors, etc.
Experience and Qualification
- B.E./B.Tech. or M.C.A. in Computer Science from a reputed University.
- Total 7 to 10 years of relevant experience.
Personal Characteristics
- Good interpersonal and communication skills; strong team player.
- Ability to work at a strategic and tactical level.
- Ability to convey strong messages in a polite but firm manner.
- Self-motivation is essential; should demonstrate commitment to high-quality design and development.
- Ability to develop & maintain working relationships with several stakeholders.
- Flexibility and an open attitude to change.
- Problem-solving skills, with the ability to think laterally and with a medium-term and long-term perspective.
- Ability to learn and quickly get familiar with a complex business and technology environment.
Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
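Several of the essential skills above (SnowSQL, Snowpipe, CI/CD-friendly ingestion) centre on automated loading into Snowflake. The following is a minimal, illustrative sketch only, assuming the snowflake-connector-python package, placeholder credentials, and hypothetical object names (RAW_DB, trades_stage, trades_raw); it is not part of the posting or the employer's actual setup.

import snowflake.connector

# Placeholder connection details; real credentials would come from a secrets store.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="INGEST_WH",
    database="RAW_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# The pipe wraps a COPY statement; AUTO_INGEST relies on cloud storage event notifications.
cur.execute("""
    CREATE PIPE IF NOT EXISTS trades_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO trades_raw
      FROM @trades_stage
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# SHOW PIPES confirms creation and exposes the notification channel to wire up on the cloud side.
cur.execute("SHOW PIPES LIKE 'trades_pipe'")
print(cur.fetchall())

cur.close()
conn.close()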
Posted 3 months ago
4.0 - 8.0 years
5 - 15 Lacs
Kolkata
Work from Office
Skills and Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- Must have total 3+ yrs. of IT experience and experience in Data Warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture.
- Should have hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging & geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning, troubleshooting and query optimization.
- Databricks Certified Data Engineer Associate/Professional Certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Should have experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with a high attention to detail.
Required Skills: Snowflake, SQL, ADF
Posted 3 months ago
6.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2393_JOB
Date Opened: 09/11/2024
Industry: IT Services
Work Experience: 6-10 years
Job Title: Snowflake Engineer - Database Administration
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Locations: Pune, Bangalore, Hyderabad, Indore
Contract duration: 6 months
Responsibilities:
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, BI projects.
- Must have prior experience with end-to-end implementation of Snowflake cloud data warehouse and end-to-end data warehouse implementations on-premise, preferably on Oracle/SQL Server.
- Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone and time travel, and understanding how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data modelling techniques using Python.
- Experience in data migration from RDBMS to Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface.
- Must have experience of Agile development methodologies.
Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, database.
- Job conductor, scheduler and monitoring.
- Git repository, creating users & roles and providing access to them.
- Agile methodology and 24/7 Admin and Platform support.
- Estimation of effort based on the requirement.
- Strong written communication skills. Is effective and persuasive in both written and oral communication.
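As a rough illustration of the admin features listed above (resource monitors, RBAC-governed warehouses, warehouse sizing, zero-copy clone, Time Travel), here is a hedged sketch using the Python connector. The account details, warehouse ETL_WH and tables ORDERS / ORDERS_CLONE are hypothetical, not from the posting.

import snowflake.connector

# Placeholder connection; resource monitors generally require the ACCOUNTADMIN role.
conn = snowflake.connector.connect(account="my_account", user="admin_user",
                                    password="***", role="ACCOUNTADMIN")
cur = conn.cursor()

# Resource monitor capping monthly credits, then attached to a warehouse.
cur.execute("""
    CREATE RESOURCE MONITOR IF NOT EXISTS etl_monitor
      WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 90 PERCENT DO SUSPEND
""")
cur.execute("ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor")

# Virtual warehouse sizing.
cur.execute("ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'MEDIUM'")

# Zero-copy clone, then Time Travel: read the source table as it was two hours ago.
cur.execute("CREATE TABLE IF NOT EXISTS orders_clone CLONE orders")
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -7200)")
print(cur.fetchone())

cur.close()
conn.close()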
Posted 3 months ago
6.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2384_JOB
Date Opened: 23/10/2024
Industry: IT Services
Work Experience: 6-10 years
Job Title: Snowflake DBA
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Contract duration: 6 months
Locations: Pune, Bangalore, Hyderabad, Indore
Responsibilities:
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, BI projects.
- Must have prior experience with end-to-end implementation of Snowflake cloud data warehouse and end-to-end data warehouse implementations on-premise, preferably on Oracle/SQL Server.
- Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone and time travel, and understanding how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data modelling techniques using Python.
- Experience in data migration from RDBMS to Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling).
- Experience with data security and data access controls and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface.
- Must have experience of Agile development methodologies.
Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, database.
- Job conductor, scheduler and monitoring.
- Git repository, creating users & roles and providing access to them.
- Agile methodology and 24/7 Admin and Platform support.
- Estimation of effort based on the requirement.
- Strong written communication skills. Is effective and persuasive in both written and oral communication.
Posted 3 months ago
5.0 - 10.0 years
17 - 30 Lacs
Pune, Bengaluru
Hybrid
Key Responsibilities:
- Data modeling and design: Create data models and designs for data warehousing solutions.
- ETL/ELT development: Develop ETL/ELT pipelines using Snowflake's data loading and transformation capabilities.
- Data quality and integrity: Ensure data quality and integrity by implementing data validation, data cleansing, and data normalization techniques.
- Team leadership: Lead a team of developers, provide technical guidance, and ensure timely delivery of projects.
- Collaboration with stakeholders: Collaborate with stakeholders to understand business requirements, provide technical solutions, and ensure that solutions meet business needs.
- Performance optimization: Optimize performance of data warehousing solutions by implementing best practices, indexing, and caching.
- Security and governance: Ensure security and governance of data warehousing solutions by implementing access controls, auditing, and data masking.
Requirements:
- 5+ years of experience in data warehousing, ETL/ELT development, and data modeling.
- Snowflake experience: Strong experience in Snowflake, including data loading, data transformation, and data querying.
- Data modeling skills: Strong data modeling skills, including experience with data modeling tools such as Erwin or PowerDesigner.
- ETL/ELT development skills: Strong ETL/ELT development skills, including experience with ETL/ELT tools such as Informatica or Talend.
- Leadership skills: Strong leadership skills, including experience in leading teams and managing projects.
- Communication skills: Strong communication skills, including experience in collaborating with stakeholders and communicating technical solutions.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
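On the performance-optimization responsibility: Snowflake does not use conventional indexes, so tuning usually revolves around clustering keys, warehouse sizing and reviewing query history. The sketch below is a hedged illustration with hypothetical names (FACT_SALES, SALE_DATE, ANALYTICS_WH), not a standard prescribed by this employer.

import snowflake.connector

# Placeholder connection details.
conn = snowflake.connector.connect(account="my_account", user="dev_user", password="***",
                                    warehouse="ANALYTICS_WH", database="EDW", schema="MART")
cur = conn.cursor()

# Clustering key on the most common filter column of a large fact table.
cur.execute("ALTER TABLE fact_sales CLUSTER BY (sale_date)")

# Check how well the table is clustered on that key.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date)')")
print(cur.fetchone()[0])

# Identify the slowest recent queries as tuning candidates.
cur.execute("""
    SELECT query_id, total_elapsed_time, query_text
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_id, elapsed_ms, _text in cur.fetchall():
    print(query_id, elapsed_ms)

cur.close()
conn.close()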
Posted 3 months ago
10.0 - 15.0 years
8 - 15 Lacs
Hyderabad
Hybrid
Job Description:
This person will help bring rigor and discipline to day-to-day operations & production support. Ability to work in a fast-paced, high-energy environment and bring a sense of urgency & attention to detail to the table. Coordinates closely with other BI team members to help ensure meaningful prioritization. Escalates potential issues in a timely fashion and seeks paths for resolution. Excellent communication skills and ability to manage expectations.
Required skills/experience:
- 10+ years of progressive experience in Snowflake and BI-relevant cloud technologies, with extensive experience in Extraction, Modelling & Reporting.
- Worked on Implementation, Enhancement and Support projects.
- Conduct workshops with stakeholders to understand and analyze business requirements, problem statements and design gaps in existing processes, to provide scope & solutions aligned with the organization's IT architectural landscape and tools.
- Familiar with the concepts of SDLC, with proficiency in mapping business requirements, technical documentation, application design, development and troubleshooting for information systems management.
- Expertise in Power BI and dashboarding skills.
- Production support: experience in process chain management, monitoring and scheduling of jobs.
KEY RESPONSIBILITIES
Good to have:
- Experience in Informatica (IICS/IDMC) is a plus.
- Experienced in upgrade projects for warehousing, ETL and reporting applications.
- Hands-on experience in SQL Server and/or Oracle design and development; SAP functional knowledge and advanced analytics are a plus.
PROFESSIONAL EXPERIENCE/QUALIFICATIONS
- 10+ years of progressive experience in Snowflake and BI-relevant cloud technologies, with extensive experience in Extraction, Modelling & Reporting.
- Worked on Implementation, Enhancement and Support projects.
- Bachelor's or Master's or similar educational qualification.
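For the production-support duties above (monitoring and scheduling of jobs), one common check on the Snowflake side is scanning recent task runs for failures and escalating anything that missed its window. The sketch below is illustrative only, with hypothetical warehouse and database names; the actual environment may schedule and monitor jobs through entirely different tooling.

import snowflake.connector

# Placeholder connection.
conn = snowflake.connector.connect(account="my_account", user="support_user", password="***",
                                    warehouse="BI_WH", database="EDW", schema="PUBLIC")
cur = conn.cursor()

# Task runs from the last 24 hours that did not succeed.
cur.execute("""
    SELECT name, state, scheduled_time, error_message
    FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(
                 SCHEDULED_TIME_RANGE_START => DATEADD('hour', -24, CURRENT_TIMESTAMP())))
    WHERE state NOT IN ('SUCCEEDED', 'SCHEDULED')
    ORDER BY scheduled_time DESC
""")
for name, state, scheduled_at, error in cur.fetchall():
    # Anything printed here would be escalated per the SLA process described above.
    print(f"task {name} ended {state} at {scheduled_at}: {error}")

cur.close()
conn.close()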
Posted 3 months ago
5.0 - 8.0 years
12 - 16 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics and development.
2. Detailed knowledge and hands-on working experience in Snowpipe / SnowProc / SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of ETL/ELT tools; for transformation, DBT experience is good to have.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work in implementation & support projects. Flexible for onsite & offshore travelling.
7. Collaborate with other team members to ensure the proper delivery of the requirement. Ability to think strategically about the broader market and influence company direction.
8. Should have good communication skills, be a team player & have good analytical skills. Snowflake certification is preferable.
Soniya - soniya05.mississippiconsultants@gmail.com
We are a recruitment firm based in Pune, having various clients globally.
Posted 3 months ago
4.0 - 9.0 years
4 - 9 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Snowflake Data Warehouse. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable and efficient solutions.
Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Snowflake Data Warehouse.
- Collaborate with cross-functional teams to analyze business requirements and develop scalable and efficient solutions.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Ensure the quality of the application by conducting unit testing, integration testing, and performance testing.
Professional & Technical Skills:
- Must Have Skills: Experience in Snowflake Data Warehouse.
- Good To Have Skills: Experience in other data warehousing technologies like Redshift, BigQuery, or Azure Synapse Analytics.
- Strong understanding of database concepts and SQL.
- Experience in ETL tools like Talend, Informatica, or DataStage.
- Experience in developing and maintaining technical documentation.
- Experience in conducting unit testing, integration testing, and performance testing.
Additional Information:
The candidate should have experience in Snowflake Data Warehouse. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
Posted 3 months ago
11.0 - 21.0 years
25 - 37 Lacs
Noida, Pune, Bengaluru
Hybrid
- Oversees and designs the information architecture for the data warehouse, including all information structures, i.e. staging area, data warehouse, data marts and operational data stores; oversees standardization of data definitions; oversees development of physical and logical modelling.
- Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modelling, Star & Snowflake schema design, Reference DW Architectures, ETL architecture, ETL (Extract, Transform, Load), Data Analysis, Data Conversion, Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support.
- Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical & dimensional models).
- Maintain in-depth and current knowledge of cloud architecture, data lake, data warehouse, BI platforms, analytics model platforms and ETL tools.
- Cloud knowledge, especially AWS knowledge, is necessary.
- Well-versed with best practices around Data Governance, Data Stewardship and overall Data Quality initiatives.
- Inventories existing data design, including data flows and systems; designs a data model that integrates new and existing environments.
- Conducts architecture meetings with team leads and solution architects to communicate complex ideas, issues and system processes, along with architecture discussions in their current projects.
- Design the data warehouse and provide guidance to the team in implementation using Snowflake SnowSQL.
- Good hands-on experience in converting Source Independent Load, Post Load Process, Stored Procedure and SQL to Snowflake.
- Strong understanding of ELT/ETL and integration concepts and design best practices.
- Experience in performance tuning of Snowflake pipelines and ability to troubleshoot issues quickly.
- 1 yr work experience in DBT.
- Data modeling and data integration.
- Advanced SQL skills for analysis and standardizing queries.
Mandatory Skillset: Snowflake, DBT, Data Architecture design experience in Data Warehouse.
Posted 3 months ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics. --- Key Responsibilities: * Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake. * Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools. * Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching. * Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives. * Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions. * Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring. * Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data. * Ensure data quality, reliability, and governance through testing and validation strategies. * Document data flows, definitions, processes, and architecture. --- Required Skills and Qualifications: * 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems. * 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel. * Strong experience in SQL and performance tuning for complex queries and large datasets. * Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts. * Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.). * Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc. * Familiarity with Python or Java or Scala for data manipulation and pipeline development. * Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps. * Knowledge of data governance, data quality, and data security best practices. * Bachelor's degree in Computer Science, Information Systems, or a related field. --- Preferred Qualifications: * Snowflake SnowPro Core Certification or Advanced Architect Certification. * Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake. * Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.). * Knowledge of Data Vault 2.0 or other advanced data modeling methodologies. * Experience with data cataloging and metadata management tools (e.g., Alation, Collibra). * Exposure to machine learning pipelines and data science workflows is a plus.
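Since the posting above calls out Snowpipe, Streams, Tasks and Time Travel, here is a minimal sketch of the common Streams-plus-Tasks pattern for incremental processing: a stream captures changes on a raw table and a scheduled task merges them downstream. The object names (orders_raw, curated.orders, ELT_WH) and credentials are hypothetical; this illustrates the pattern, not this employer's pipeline.

import snowflake.connector

# Placeholder connection details.
conn = snowflake.connector.connect(account="my_account", user="de_user", password="***",
                                    warehouse="ELT_WH", database="EDW", schema="STAGING")
cur = conn.cursor()

# Stream tracks inserts/updates/deletes on the raw table since the last consumption.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE orders_raw")

# Task runs every 15 minutes, but only when the stream actually has new data.
cur.execute("""
    CREATE TASK IF NOT EXISTS merge_orders
      WAREHOUSE = ELT_WH
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO curated.orders t
      USING orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status)
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK merge_orders RESUME")

cur.close()
conn.close()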
Posted 3 months ago
3.0 - 8.0 years
12 - 22 Lacs
Noida, Bhubaneswar, Gurugram
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).
Posted 3 months ago
3.0 - 8.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843 (please text).
Posted 3 months ago
5.0 - 10.0 years
9 - 19 Lacs
Chennai
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake SQL.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time.
- Develop complex queries to optimize database performance and troubleshoot issues.
- Implement star schema designs for efficient data modeling and querying.
- Participate in code reviews to ensure adherence to coding standards.
Posted 3 months ago
3.0 - 8.0 years
12 - 22 Lacs
Noida, Bhubaneswar, Gurugram
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 3 months ago
3.0 - 8.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm Greetings from SP Staffing!!
Role: Snowflake Developer
Experience Required: 3 to 10 yrs
Work Location: Bangalore/Hyderabad/Bhubaneswar/Pune/Gurgaon/Noida/Kochi
Required Skills: Snowflake
Interested candidates can send resumes to nandhini.spstaffing@gmail.com or WhatsApp 8148043843.
Posted 3 months ago
6.0 - 8.0 years
1 - 4 Lacs
Chennai
Work from Office
Job Title: Snowflake Developer
Experience: 6-8 Years
Location: Chennai - Hybrid
- 3+ years of experience as a Snowflake Developer or Data Engineer.
- Strong knowledge of SQL, SnowSQL, and Snowflake schema design.
- Experience with ETL tools and data pipeline automation.
- Basic understanding of US healthcare data (claims, eligibility, providers, payers).
- Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP).
- Familiarity with data governance, security, and compliance (HIPAA, HITECH).
Posted 3 months ago
5.0 - 10.0 years
20 - 27 Lacs
Kochi, Chennai, Thiruvananthapuram
Work from Office
Snowflake Data Warehouse Development: Design, implement, and optimize data warehouses on the Snowflake cloud platform. Ensure the effective utilization of Snowflake's features for scalable, efficient, and high-performance data storage and processing.
Data Pipeline Development: Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform. Design and maintain ETL workflows to enable seamless data processing across systems.
Data Transformation with PySpark: Leverage PySpark for data transformations within the Snowflake environment. Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.
Collaboration: Work closely with cross-functional teams to design data solutions aligned with business requirements. Engage with stakeholders to understand business needs and translate them into technical solutions.
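A small illustration of the PySpark cleansing/validation step described above. The input path, column names and the 5% tolerance are assumptions, not part of the posting; loading the result into Snowflake is typically done with the Spark-Snowflake connector or by staging files and running COPY INTO, which is only noted in a comment here.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleanse").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/orders/")   # hypothetical source path

cleansed = (
    raw.dropDuplicates(["order_id"])                                   # de-duplicate on the business key
       .filter(F.col("order_id").isNotNull())                          # reject rows missing the key
       .withColumn("amount", F.coalesce(F.col("amount"), F.lit(0.0)))  # default missing amounts
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Simple validation gate: fail the job if too many rows were dropped.
dropped = raw.count() - cleansed.count()
if dropped > 0.05 * raw.count():
    raise ValueError(f"Dropped {dropped} rows, above the 5% tolerance")

# Stage the curated output; a COPY INTO from this location (or the Spark-Snowflake
# connector) would take it the rest of the way into Snowflake.
cleansed.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")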
Posted 3 months ago
5 - 10 years
0 - 1 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
JD for Snowflake Admin
Key Responsibilities:
- Administer and manage Snowflake environments including user roles, access control, and resource monitoring.
- Develop, test, and deploy ELT/ETL pipelines using Snowflake SQL and other tools (e.g., Informatica, DBT, Matillion).
- Monitor query performance and storage utilization; implement performance tuning and optimization strategies.
- Manage and automate tasks such as warehouse scaling, Snowpipe ingestion, and task scheduling.
- Work with semi-structured data formats (JSON, XML, Avro, Parquet) using VARIANT and related functions.
- Set up and manage data sharing, replication, and failover across Snowflake accounts.
- Implement and manage security best practices including RBAC, masking policies, and object-level permissions.
- Collaborate with Data Engineers, Architects, and BI teams to support analytics use cases.
Required Skills:
- Strong hands-on experience with Snowflake architecture, SQL, and performance tuning.
- Experience with Snowflake features such as Streams, Tasks, Time Travel, Cloning, and External Tables.
- Proficiency in working with SnowSQL and managing CLI-based operations.
- Knowledge of cloud platforms (AWS / Azure / GCP) and integration with Snowflake.
- Experience with data ingestion tools and scripting languages (Python, Shell, etc.).
- Good understanding of CI/CD pipelines and version control (Git).
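To ground a few of the admin items above (RBAC grants, masking policies, VARIANT queries over semi-structured data), here is a hedged sketch via the Python connector. Every object name (analyst_role, customers.email, events_raw.payload) is hypothetical and introduced only for illustration.

import snowflake.connector

# Placeholder connection; policy and grant statements typically need an elevated role.
conn = snowflake.connector.connect(account="my_account", user="admin_user", password="***",
                                    role="ACCOUNTADMIN", warehouse="ADMIN_WH",
                                    database="EDW", schema="SECURE")
cur = conn.cursor()

# RBAC: grant read access on a schema to an analyst role.
cur.execute("CREATE ROLE IF NOT EXISTS analyst_role")
cur.execute("GRANT USAGE ON DATABASE edw TO ROLE analyst_role")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA edw.secure TO ROLE analyst_role")

# Dynamic data masking on an email column.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ACCOUNTADMIN') THEN val ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask")

# Semi-structured data: pull nested fields out of a VARIANT column.
cur.execute("SELECT payload:customer.id::STRING, payload:items[0].sku::STRING FROM events_raw LIMIT 5")
print(cur.fetchall())

cur.close()
conn.close()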
Posted 4 months ago
5 - 10 years
3 - 7 Lacs
Chennai
Work from Office
Snowflake Developer
Mandatory skills: Snowflake DB developer + Python & Unix scripting + SQL queries
Location: Chennai
NP: 0 to 30 days
Exp: 5 to 10 years
Skill set: Snowflake, Python, SQL and PBI developer.
- Understand and implement Data Security and Data Modelling.
- Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake.
- Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake and from Snowflake to other systems.
- Understand cloud architecture.
- Can develop and design PBI dashboards, reports, and data visualizations.
- Communication skills.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance to contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialist.
- Develop and conduct trainings (triages) within products for Production Specialists as per target; inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, Technical Test performance
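The skill set above combines Python, Unix scripting and SQL. The following is a minimal, hedged sketch of that combination: a parameterized Snowflake query run from a Python script that reads credentials from environment variables so it could be scheduled from Unix cron. The table and column names are illustrative only.

import os
import snowflake.connector

# Read secrets from the environment rather than hard-coding them in the script.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="REPORTING_WH",
    database="EDW",
    schema="MART",
)
cur = conn.cursor()

# Bind variables (%s with the connector's default pyformat paramstyle) keep the SQL injection-safe.
cur.execute(
    "SELECT region, SUM(amount) FROM sales WHERE sale_date >= %s GROUP BY region",
    ("2024-01-01",),
)
for region, total in cur.fetchall():
    print(f"{region}\t{total}")

cur.close()
conn.close()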
Posted 4 months ago
4 - 8 years
7 - 14 Lacs
Pune
Hybrid
Role: Snowflake Developer
Experience: 4 to 6 years
Key responsibilities:
- Perform development & support activities for the Data Warehousing domain.
- Understand High Level Design and Application Interface Design & build Low Level Design.
- Perform application analysis & propose technical solutions for application enhancement or resolve production issues.
- Perform development & deployment: should be able to code, unit test & deploy.
- Create necessary documentation for all project deliverable phases.
- Handle production issues (Tier 2 support, weekend on-call rotation) to resolve production issues & ensure SLAs are met.
Technical Skills (Mandatory):
- In-depth knowledge of SQL, Unix & advanced Unix shell scripting.
- Should have a very clear understanding of Snowflake architecture.
- At least 4+ years hands-on experience on Snowflake: SnowSQL, COPY command, stored procedures, performance tuning and other advanced features like Snowpipe, semi-structured data load, types of tables.
- Hands-on experience with file transfer mechanisms (NDM, SFTP, Data Router, etc.).
- Knowledge of schedulers like TWS.
Good to have:
- Certification for Snowflake.
- Python.
- Experience loading AVRO and PARQUET files into Snowflake.
- Informatica.
Other details:
- Location: Magarpatta City, Pune (Hybrid, minimum 3 days work from office).
- Shift: 1 PM to 10 PM.
- 2 rounds of interview.
- NP: Immediate joiners to 15 days (only candidates serving notice period).
- Excellent communication skills.
Interested candidates, share resume at dipti.bhaisare@in.experis.com
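For the COPY command and Parquet/Avro loading mentioned above, a hedged sketch follows; the stage, file format and table names are assumptions, and the ON_ERROR behaviour would follow the project's own standards rather than this example.

import snowflake.connector

# Placeholder connection details.
conn = snowflake.connector.connect(account="my_account", user="etl_user", password="***",
                                    warehouse="LOAD_WH", database="EDW", schema="STAGING")
cur = conn.cursor()

# Named file format for Parquet files.
cur.execute("CREATE FILE FORMAT IF NOT EXISTS parquet_ff TYPE = PARQUET")

# Load staged Parquet files, matching columns by name rather than position.
cur.execute("""
    COPY INTO positions
    FROM @positions_stage
    FILE_FORMAT = (FORMAT_NAME = 'parquet_ff')
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    ON_ERROR = 'ABORT_STATEMENT'
""")
print(cur.fetchall())   # COPY returns per-file load status rows

cur.close()
conn.close()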
Posted 4 months ago
3 - 6 years
5 - 12 Lacs
Hyderabad
Work from Office
Role and Responsibilities:
- Establish, configure, and manage Git repositories to support version control and collaboration.
- Develop and troubleshoot procedures, views, and complex PL/SQL queries, ensuring effective integration and functionality within Git environments. Experience with tools like SQL Developer, TOAD, or similar.
- Develop complex SQL queries, scripts, and stored procedures to support application and reporting needs.
- Write SQL queries to extract, manipulate, and analyze data from databases; optimize queries to improve performance and reduce execution time.
- Create and maintain database tables, views, indexes, and stored procedures.
- Design, implement, and optimize relational database schemas, tables, and indexes.
- Create and maintain database triggers, functions, and packages using PL/SQL.
Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Comprehensive expertise in SQL, Snowflake and version control tools such as Git and SVN.
- Minimum of 3 years of experience in application support and maintenance.
- Proven ability to communicate complex technical concepts effectively.
- Demonstrated ability to exhibit client empathy and ensure customer satisfaction with issue resolution.
- Strong written and verbal communication skills; adept at presenting intricate technical information in an accessible manner to varied audiences.
- Ability to thrive in a fast-paced, dynamic environment with high levels of ambiguity.
- Practical problem-solving skills focused on resolving immediate customer issues while planning for long-term solutions.
- Highly organized and process-oriented, with a proven track record of driving issue resolution by collaborating across multiple teams.
- Strong interpersonal skills with a customer-centric approach, maintaining patience and composure under pressure during real-time issue resolution.
- Working knowledge of DSP/SSP platforms is an added advantage.
- Open to working night shifts in a 24/7 project.
Posted 4 months ago
6 - 10 years
8 - 18 Lacs
Kolhapur, Hyderabad, Chennai
Work from Office
Relevant Exp:5+ Yrs Mandatory Skills: Snowflake architecture, Matillion, SQL, Python, SnowSQL, any cloud Exp Night shift (6 PM to 3 AM) Complete WFO - 5 Days Email Id: anusha@akshayaitsolutions.com Loc: Hyd/ Ban/Chennai/Kolhapur
Posted 4 months ago