1714 Snowflake Jobs - Page 47

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 7 years

4 - 8 Lacs

Bengaluru

Work from Office

Description:
1. 4 to 7 years of experience as a Python Developer, with a strong portfolio demonstrating your expertise
2. Good hands-on knowledge of database technologies such as SQL and NoSQL
3. Familiarity with Snowflake
4. Good at troubleshooting, debugging, maintaining, and improving existing software
5. Excellent problem-solving abilities with strong communication and collaboration skills
6. Collaborate with cross-functional teams to design and implement innovative solutions
7. Attention to detail and commitment to producing high-quality work
8. Working knowledge of agile development practices, including Continuous Integration and Scrum
9. Bengaluru location is a must

Additional Details:
Global Grade: C
Level: LEVEL 3 - SENIOR, 6-9 Years Experience
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: Python; SQL; troubleshooting
Languages Required: ENGLISH
Role Rarity: To Be Defined

Posted 2 months ago

Apply

5 - 9 years

5 - 9 Lacs

Bengaluru

Work from Office

Description:
1. 3 to 5 years of experience as a Python Developer, with a strong portfolio demonstrating your expertise
2. Good hands-on knowledge of database technologies such as SQL and NoSQL
3. Familiarity with Snowflake
4. Good at troubleshooting, debugging, maintaining, and improving existing software
5. Excellent problem-solving abilities with strong communication and collaboration skills
6. Collaborate with cross-functional teams to design and implement innovative solutions
7. Attention to detail and commitment to producing high-quality work
8. Working knowledge of agile development practices, including Continuous Integration and Scrum

Additional Details:
Global Grade: B
Level: LEVEL 2 - PROFESSIONAL, 3-6 Years Experience
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: Python; SQL; troubleshooting; debugging
Languages Required: ENGLISH
Role Rarity: To Be Defined

Posted 2 months ago

Apply

5 - 9 years

3 - 7 Lacs

Thane

Work from Office

Description: Tableau Admin

Job Overview: The Tableau Server/Site administrator monitors overall server health, including server usage patterns, process status (up/down/failover), job status (success/failure), disk drive space, stale content, license provisioning, Tableau Bridge activity, and space usage. You will interact with technologies such as Snowflake, Oracle, SQL Database, Airflow, Python, and AWS to ensure Tableau runs smoothly and efficiently.

Key Responsibilities:
- Monitor Tableau Server infrastructure and resource utilization (processor, memory, disk) or Tableau Bridge pool availability and activity.
- Monitor Tableau Cloud application-level metrics and measure content metrics in Stage, UAT, and Production by enabling Admin Insights.
- Develop and maintain Python scripts for automation and monitoring tasks (see the sketch following this listing).
- Analyze performance metrics and create dashboards to visualize data health.
- Monitor client software installation, including versions and database drivers, for Tableau Desktop, Tableau Reader, and Tableau Bridge.
- Collaborate with data engineering and analytics teams to troubleshoot and resolve data extract issues.
- Implement best practices for production monitoring, including alerting and incident management.
- Conduct root cause analysis on production incidents and develop solutions to prevent recurrence.
- Document processes, workflows, and troubleshooting guides for internal use.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Proven experience with Tableau CMT, Tableau Desktop, Tableau Server, and Python.
- Knowledge of Tableau Pulse, Tableau Agent, Snowflake, SQL, and Airflow is desirable.
- Knowledge of data governance and data security best practices.
- Strong analytical skills with a focus on data quality and performance.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a fast-paced environment.
- Strong communication skills, both written and verbal.
- Self-motivated team player, eager to learn and share knowledge.

Additional Details:
Global Grade: B
Level: To Be Defined
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: acceptance testing; Python; automation
Languages Required: ENGLISH
Role Rarity: To Be Defined
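
A minimal sketch of the kind of Python monitoring script this role describes - here, a "stale content" check - using the tableauserverclient library. The server URL, token details, and the 90-day staleness threshold are placeholder assumptions, not values from the posting.

```python
# A stale-content check against Tableau Server, assuming the tableauserverclient
# package (pip install tableauserverclient); server URL and token are placeholders.
from datetime import datetime, timedelta, timezone
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("monitor-token", "token-secret", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)
STALE_AFTER = timedelta(days=90)  # assumed threshold for "stale"

with server.auth.sign_in(auth):
    now = datetime.now(timezone.utc)
    # Pager transparently walks every page of the workbooks endpoint.
    for wb in TSC.Pager(server.workbooks):
        if wb.updated_at and now - wb.updated_at > STALE_AFTER:
            print(f"Stale workbook: {wb.name} (last updated {wb.updated_at:%Y-%m-%d})")
```

The same pattern extends to other endpoints (datasources, jobs) for the broader health checks the role lists.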

Posted 2 months ago

Apply

6 - 10 years

11 - 15 Lacs

Bengaluru

Work from Office

Description: India Template - R2D2 Global Interfaces

Additional Details:
Global Grade: D
Level: To Be Defined
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: data architecture; AWS
Languages Required: ENGLISH
Role Rarity: To Be Defined

Posted 2 months ago

Apply

4 - 9 years

7 - 17 Lacs

Hyderabad

Work from Office

Hiring Talend Engineers!
Experience: 4-10 years
Location: Hyderabad (Permanent, Hybrid Mode)

Role & responsibilities: We are seeking a Data Engineer responsible for developing and implementing the IT technology strategy for the Mortgage Servicing team. This role involves collaborating with business and product teams to design, build, and deliver processes that drive our business and technology objectives within the Mortgage industry. The Data Engineer will focus on executing key initiatives and consolidating research and technologies into a unified technical roadmap to create innovative solutions for both the organization and its customers.

Skills required:
- 4+ years of experience in IT
- Talend engineering, preferably with Snowflake experience
- Experience in ETL processes and data modelling
- Understanding of IT concepts and methodologies
- Communication: strong communication skills to collaborate with cross-functional teams
- Problem-solving: excellent problem-solving skills and attention to detail

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Pune, Coimbatore, Mumbai (All Areas)

Hybrid

We are hosting an Open Walk-in Drive in Bangalore on 29th March (Saturday) 2025.

Details of the Walk-in Drive:
Date: 29th March (Saturday) 2025
Experience: 5 to 10 years
Time: 9:30 AM to 4:00 PM
Point of Contact: Aishwarya G / aishwaryag5@hexaware.com
Venue: Hexaware Technologies Ltd, Shantiniketan, 11th Floor, Crescent - 2 Prestige, Whitefield Main Rd, Mahadevapura, Bengaluru, Karnataka 560048

Key Skills and Experience:
- Must have 5-10 years of experience in Data warehouse, ETL, and BI projects
- Must have at least 4+ years of experience in Snowflake; expertise in Snowflake architecture is a must
- Must have at least 3+ years of experience and a strong hold in Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have: experience with AWS services and creating DevOps templates for various AWS services
- Experience in using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

What to Bring: Updated resume; photo ID; passport-size photo. Mention "Aishwarya G" at the top of your resume.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at aishwaryag5@hexaware.com. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: Candidates with less than 4 years of total experience will not be screen-selected to attend the interview.

Posted 2 months ago

Apply

5 - 10 years

10 - 20 Lacs

Hyderabad

Work from Office

Role: Snowflake Developer
Required Technical Skill Set: Snowflake

Desired Competencies (Technical/Behavioral):
- At least 5+ years of relevant work experience in any data warehouse technologies
- At least 2+ years of experience in designing, implementing, and migrating data/enterprise/engineering workloads to the Snowflake DWH
- Experience with PL/SQL and SQL is a must
- Able to take requirements from the business and coordinate with business and IT teams on clarifications, dependencies, and status reporting
- As an individual contributor, able to create, test, and implement business solutions in Snowflake
- Experience in implementing DevOps/CI-CD using Azure DevOps or GitLab Actions is preferred
- Hands-on experience in data modeling
- Expert in SQL and query performance-tuning techniques
- Experience with ingestion techniques using ETL tools (IICS) and Snowflake's COPY, Snowpipe, and Streamlit utilities
- Strong in writing Snowflake stored procedures, views, UDFs, etc.
- Good exposure to handling CDC using Streams and Time Travel (see the sketch following this listing)
- Proficient in working with Snowflake Tasks, Data Sharing, and data replication
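
A minimal sketch of the Streams-and-Tasks CDC pattern this role calls for, run through the snowflake-connector-python package. The account, credentials, warehouse, and table names are placeholder assumptions.

```python
# CDC with a Stream (change capture) and a Task (scheduled merge) in Snowflake,
# assuming the snowflake-connector-python package; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # hypothetical account identifier
    user="ETL_USER",
    password="***",
    warehouse="ETL_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# A stream records row-level changes (inserts/updates/deletes) on the source table.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# A task periodically merges pending changes; SYSTEM$STREAM_HAS_DATA skips empty runs.
cur.execute("""
CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = ETL_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO curated_orders t
  USING orders_stream s ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status)
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks are created suspended
conn.close()
```

Consuming the stream inside the MERGE advances its offset, so each run processes only new changes.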

Posted 2 months ago

Apply

2 - 5 years

4 - 7 Lacs

Pune

Work from Office

Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment, performance, capability, and availability. Assist in defining technical requirements and developing solutions. Carry out effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Tableau Desktop Specialist; strong understanding of SQL for querying databases
- Good to have: Python, Snowflake, statistics, ETL experience
- Extensive knowledge of creating impactful visualizations using Tableau
- Thorough understanding of SQL and advanced SQL (joins and relationships)
- Experience working with different databases, and with blending and creating relationships in Tableau
- Extensive knowledge of creating Custom SQL to pull desired data from databases
- Troubleshooting capabilities to debug data controls

Preferred technical and professional experience:
- Capable of converting business requirements into a workable model
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude

Posted 2 months ago

Apply

7 - 11 years

12 - 16 Lacs

Bengaluru

Work from Office

About us: Working at Target means helping all families discover the joy of everyday life. We bring that vision to life through our values and culture.

About the Role: As a Lead Data Engineer, you will serve as the technical anchor for the engineering team, responsible for designing and developing scalable, high-performance data solutions. You will own and drive data architecture that supports both functional and non-functional business needs, ensuring reliability, efficiency, and scalability. Your expertise in big data technologies, distributed systems, and cloud platforms will help shape the engineering roadmap and best practices for data processing, analytics, and real-time data serving. You will play a key role in architecting and optimizing data pipelines using Hadoop, Spark, Scala/Java, and cloud technologies to support enterprise-wide data initiatives. Additionally, experience with API development for serving low-latency data and Customer Data Platforms (CDP) will be a strong plus.

Key Responsibilities:
- Architect and build scalable, high-performance data pipelines and distributed data processing solutions using Hadoop, Spark, Scala/Java, and cloud platforms (AWS/GCP/Azure).
- Design and implement real-time and batch data processing solutions, ensuring data is efficiently processed and made available for analytical and operational use.
- Develop APIs and data services to expose low-latency, high-throughput data for downstream applications, enabling real-time decision-making.
- Optimize and enhance data models, workflows, and processing frameworks to improve performance, scalability, and cost-efficiency.
- Drive data governance, security, and compliance best practices.
- Collaborate with data scientists, product teams, and business stakeholders to understand requirements and deliver data-driven solutions.
- Lead the design, implementation, and lifecycle management of data services and solutions.
- Stay up to date with emerging technologies and drive adoption of best practices in big data engineering, cloud computing, and API development.
- Provide technical leadership and mentorship to engineering teams, promoting best practices in data engineering and API design.

About You:
- 7+ years of experience in data engineering, software development, or distributed systems.
- Expertise in big data technologies such as Hadoop, Spark, and distributed processing frameworks.
- Strong programming skills in Scala and/or Java (Python is a plus).
- Experience with cloud platforms (AWS, GCP, or Azure) and their data ecosystems (e.g., S3, BigQuery, Databricks, EMR, Snowflake, etc.).
- Proficiency in API development using REST, GraphQL, or gRPC to serve real-time and batch data.
- Experience with real-time and streaming data architectures (Kafka, Flink, Kinesis, etc.).
- Strong knowledge of data modeling, ETL pipeline design, and performance optimization.
- Understanding of data governance, security, and compliance in large-scale data environments.
- Experience with Customer Data Platforms (CDP) or customer-centric data processing is a strong plus.
- Strong problem-solving skills and ability to work in complex, unstructured environments.
- Excellent communication and collaboration skills, with experience working in cross-functional teams.

Why Join Us?
- Work with cutting-edge big data, API, and cloud technologies in a fast-paced, collaborative environment.
- Influence and shape the future of data architecture and real-time data services at Target.
- Solve high-impact business problems using scalable, low-latency data solutions.
- Be part of a culture that values innovation, learning, and growth.

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Tamil Nadu

Work from Office

Responsibilities:
- Developing responsive user-facing applications using Vue.js
- Developing backend code for API connectivity and Snowflake connectivity using Node.js
- Building modular and reusable components and libraries
- Optimizing applications for performance
- Implementing automated testing integrated into development and maintenance workflows
- Staying up to date with all recent developments in the JavaScript and Vue.js space
- Keeping track of security updates and issues found with Vue.js and all project dependencies
- Proposing any upgrades and updates necessary for keeping up with modern security and development best practices

Skills:
- Highly proficient with TypeScript/JavaScript and Node.js, including modern syntax and features
- Highly proficient with the Vue.js framework and its core principles, such as components, reactivity, and the virtual DOM
- Familiarity with the Vue.js ecosystem, including Vue CLI, Vuex, Vue Router, and Nuxt.js
- Good understanding of HTML5 and CSS3, including Sass
- Understanding of server-side rendering and its benefits and use cases
- Knowledge of functional programming and object-oriented programming paradigms
- Ability to write efficient, secure, well-documented, and clean JavaScript code
- Familiarity with automated JavaScript testing
- Experience with both consuming and designing RESTful APIs

Additional Details:
Global Grade: B
Level: To Be Defined
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: Node.js; Vue.js; JavaScript
Languages Required: ENGLISH
Role Rarity: To Be Defined

Posted 2 months ago

Apply

8 - 11 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Title: SNOWFLAKE ARCHITECT

Responsibilities: A day in the life of an Infoscion - as part of the Infosys delivery team, your primary role would be to provide best-fit architectural solutions for one or more projects. You would also provide technology consultation and assist in defining the scope and sizing of work. You would implement solutions, create technology differentiation, and leverage partner technologies. Additionally, you would participate in competency development with the objective of ensuring best-fit, high-quality technical solutions. You would be a key contributor to creating thought leadership within your area of technology specialization, in compliance with the guidelines, policies, and norms of Infosys. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Knowledge of architectural design patterns, performance tuning, and database and functional designs
- Hands-on experience in Service-Oriented Architecture
- Ability to lead solution development and delivery for the designed solutions
- Experience in designing high-level and low-level documents is a plus
- Good understanding of SDLC is a prerequisite
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements

Posted 2 months ago

Apply

2 - 7 years

12 - 17 Lacs

Bengaluru

Work from Office

Job Title: IT Consulting

Responsibilities: A day in the life of an Infoscion - as part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; identify any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology->Data on Cloud-DataStore->Snowflake
Preferred skills: Technology->Data on Cloud-DataStore->Snowflake

Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements

Posted 2 months ago

Apply

4 - 9 years

0 Lacs

Mysore, Bengaluru, Hyderabad

Hybrid

Open & Direct Walk-in Drive | Hexaware Technologies | SNOWFLAKE & SNOWPARK Data Engineer/Architect in Bangalore, Karnataka on 29th March (Saturday) 2025 - Snowflake / Snowpark / SQL & PySpark

Dear Candidate,

I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Bangalore, Karnataka on 29th March (Saturday) 2025, and we believe your skills in Snowflake, Snowpark, Python, SQL, and PySpark align perfectly with what we are seeking.

Experience Level: 4 years to 12 years

Details of the Walk-in Drive:
Date: 29th March (Saturday) 2025
Experience: 5 years to 15 years
Time: 9:30 AM to 4:00 PM
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
Venue: Hexaware Technologies Ltd, Shanti Niketan, 11th Floor, Crescent - 2 Prestige, Whitefield Main Rd, Mahadevapura, Bengaluru, Karnataka 560048
Work Location: Open (Hyderabad / Bangalore / Pune / Mumbai / Noida / Dehradun / Chennai / Coimbatore)

Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in the following: Snowflake, Python, Fivetran, Snowpark & Snowpipe, SQL, PySpark/Spark, DWH.

Roles and Responsibilities:
- 4-15 years of total IT experience with any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 3 years of experience querying and processing data using Python
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with Snowflake data-loading features such as Stages, Streams, Tasks, and Snowpipe
- Working knowledge of processing semi-structured data

What to Bring: Updated resume; photo ID; passport-size photo.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: Candidates with less than 4 years of total experience will not be screen-selected to attend the interview.

Posted 2 months ago

Apply

5 - 10 years

15 - 30 Lacs

Hyderabad

Work from Office

Lead Data Engineer - Data Management

Company Overview: Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics): Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets ranging from Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad

Role Overview: Accordion is looking for a Lead Data Engineer, who will be responsible for the design, development, configuration/deployment, and maintenance of the above technology stack. They must have an in-depth understanding of the various tools and technologies in this domain to design and implement robust and scalable solutions that address current and future client requirements at optimal cost. The Lead Data Engineer should be able to evaluate existing architectures and recommend ways to upgrade and improve their performance, for both on-premises and cloud-based solutions. A successful Lead Data Engineer should possess strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments, and should have strong organizational, critical-thinking, and communication skills.

What You will do:
- Partner with clients to understand their business and create comprehensive business requirements.
- Develop an end-to-end Business Intelligence framework based on requirements, including recommending an appropriate architecture (on-premises or cloud), analytics, and reporting.
- Work closely with the business and technology teams to guide solution development and implementation.
- Work closely with the business teams to arrive at methodologies to develop KPIs and metrics.
- Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Conduct training programs and knowledge transfer sessions for junior developers when needed.
- Recommend improvements to provide optimal reporting solutions.
- Bring curiosity to learn new tools and technologies to provide futuristic solutions for clients.

Ideally, you have:
- An undergraduate degree (B.E/B.Tech.); tier-1/tier-2 colleges preferred.
- More than 5 years of experience in a related field.
- Proven expertise in SSIS, SSAS, and SSRS (MSBI suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and data warehouses (any one of Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- Good understanding of Azure or AWS: Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) or AWS (Glue, Aurora Database, DynamoDB, Redshift, QuickSight).
- Proven ability to take initiative and be innovative.
- An analytical mind with a problem-solving attitude.

Why Explore a Career at Accordion:
- High growth environment: semi-annual performance management and promotion cycles, coupled with a strong meritocratic culture, enable a fast track to leadership responsibility.
- Cross-domain exposure: interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.
- Entrepreneurial environment: intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: a non-bureaucratic and fun working environment, with a strong peer environment that will challenge you and accelerate your learning curve.
- Other benefits for full-time employees: health and wellness programs, including employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps, and discounted health services (including vision and dental) for employees and family members, free doctor consultations, counsellors, etc.; corporate meal card options for ease of use and tax benefits; team lunches, company-sponsored team outings, and celebrations; cab reimbursement for women employees beyond a certain time of day; a robust leave policy to support work-life balance, with a specially designed leave structure to support women employees for maternity and related requests; a reward and recognition platform to celebrate professional and personal milestones; and a positive, transparent work environment, with various employee engagement and employee benefit initiatives to support personal and professional learning and development.

Posted 2 months ago

Apply

2 - 6 years

0 - 1 Lacs

Bengaluru

Work from Office

We are looking for a highly skilled Full Stack Developer with expertise in Snowflake, AI-driven development, and modern database architectures.

Required Candidate profile: Experienced Data Engineer & AI Developer skilled in Snowflake, MySQL, MongoDB, vector search, embeddings, Node.js, Python, APIs, Kubernetes, and microservices for scalable, secure systems.

Posted 2 months ago

Apply

1 - 5 years

2 - 5 Lacs

Mumbai

Work from Office

The Data Warehousing Engineer role at IndusInd Bank involves overseeing key operations, ensuring compliance, and driving business growth. Responsibilities include managing customer interactions, improving service efficiency, and coordinating with various teams to achieve operational excellence. The ideal candidate should possess strong analytical skills, excellent communication, and a proactive approach to problem-solving. Prior experience in a similar role is preferred. Candidates must demonstrate leadership qualities and adaptability to dynamic banking environments. This position offers a great opportunity to grow within the banking sector.

Posted 2 months ago

Apply

8 - 11 years

18 - 30 Lacs

Pune

Hybrid

What's the role all about? As a Specialist BI Developer, you'll be a key contributor to developing reports in a multi-region, multi-tenant SaaS product. You'll collaborate with the core R&D team to build high-performance reports to serve the use cases of several applications in the suite.

How will you make an impact?
- Take ownership of the software development lifecycle, including design, development, unit testing, and deployment, working closely with QA teams.
- Ensure that architectural concepts are consistently implemented across the product.
- Act as a product expert within R&D, understanding the product's requirements and its market positioning.
- Work closely with cross-functional teams (Product Managers, Sales, Customer Support, and Services) to ensure successful product delivery.
- Translate business needs into technical specifications.
- Analyze requirements and work with the Product team to freeze requirements in accordance with reporting application capabilities.
- Manage the project in JIRA.
- Conduct Agile ceremonies in the absence of the Scrum Master.
- Conduct design and code reviews.

Have you got what it takes?
- Bachelor's/Master's of Engineering degree in Computer Science, Electronic Engineering, or equivalent from a reputed institute.
- 8-11 years of BI report development experience.
- Expertise in SQL and any cloud-based databases; expertise in Snowflake is an advantage.
- Expertise in any BI tool, such as Tableau, Power BI, or MicroStrategy.
- Experience working in an enterprise data warehouse / data lake system.
- Strong knowledge of analytical databases and schemas.
- Expertise in optimizing the data extraction process and queries.
- Experience managing projects in JIRA and conducting Agile ceremonies.
- Experience working with data modelers and governance teams.
- Experience in database management systems, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks.
- Experience in functional testing, performance testing, etc.
- Experience with performance test script generation (JMeter, Gatling, etc.).
- Experience automating the testing process for E2E and regression cases.
- Experience with Java / web services is an added advantage.
- Experience with public cloud infrastructure and technologies such as AWS, Azure, or GCP.

What's in it for you? Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 6620
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 2 months ago

Apply

2 - 5 years

15 - 25 Lacs

Chandigarh

Work from Office

Company Profile: Priority Technology Holdings, Inc. (NASDAQ: PRTH) is headquartered in Alpharetta, Georgia, USA. Our India office is located in Chandigarh, where our dynamic team builds state-of-the-art, sophisticated Fintech products and solutions. We are an emerging payments powerhouse offering a single unified platform for banking and payments that powers modern commerce. Priority offers a unique family of products which integrate into SMB Payments, B2B Payments, and Enterprise Payments to help businesses thrive. We are on a mission to offer an industry-agnostic platform that enables businesses to collect, store, and send money using various new-age payment methods. Priority is an employee-first organisation, and we continually strive to ensure our employees' professional and personal success, supported by employee-friendly policies and a positive work environment built on mutual respect and professionalism. We offer a dynamic work environment with continuous growth and learning opportunities. We believe in growing together, and our people are the driving force behind our success.

Job Description - Background: The Data team at PRTH is responsible for building next-generation data and information solutions across our Payment and Banking solutions. We have an incredible staff of Data Engineers, Quality Engineers, Data Analysts, and Data Scientists that align to execute on our vision.

Responsibilities:
- Design and develop complex data processes in coordination with business stakeholders to solve critical financial and operational problems.
- Design and develop ETL/ELT pipelines against traditional databases and distributed systems, and flexibly produce data back to the business and analytics teams for analysis (see the sketch following this listing).
- Work in an agile, fail-fast environment directly with business stakeholders and analysts, while recognising data reconciliation and validation requirements.
- Develop data solutions in coordination with development teams across a variety of products and technologies.
- Build processes that analyse and monitor data to help maintain controls: correctness, completeness, and latency.
- Participate in design reviews and code reviews.
- Work with colleagues across global locations.
- Troubleshoot and resolve production issues.
- Deliver performance enhancements.

Required Skills & Qualifications:
- Programming skills: Python / PySpark / Scala
- Database skills: analytical databases like Snowflake / SQL
- Good to have: Elasticsearch, Kafka, NiFi, Jupyter Notebooks
- Good to have: knowledge of AWS services like S3 / Glue / Athena / EMR / Lambda
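
A minimal sketch of the kind of PySpark ELT pipeline described above, with a simple completeness control. The S3 paths, column names, and validation rule are placeholder assumptions.

```python
# A small extract-transform-load pass in PySpark; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("payments-elt-sketch").getOrCreate()

# Extract: read raw transaction events (hypothetical bucket/path).
raw = spark.read.json("s3a://example-bucket/raw/transactions/")

# Transform: normalize types, drop rows missing a key, and stamp the load time.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("transaction_id").isNotNull())
       .withColumn("load_ts", F.current_timestamp())
)

# Completeness control: count and report rows rejected by validation.
rejected = raw.count() - clean.count()
print(f"rows rejected by validation: {rejected}")

# Load: write partitioned Parquet for downstream analytics
# (assumes a transaction_date column exists in the source data).
clean.write.mode("append").partitionBy("transaction_date").parquet(
    "s3a://example-bucket/curated/transactions/"
)
```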

Posted 2 months ago

Apply

8 - 13 years

40 - 50 Lacs

Bengaluru

Hybrid

Senior Data Engineer - JD

The Economist Intelligence Unit (EIU) is a world leader in global business intelligence. We help businesses, the financial sector, and governments to understand how the world is changing and how that creates opportunities to be seized and risks to be managed. At our heart is a 50-year forward look: a global forecast covering the majority of the world's economies. We seek to analyse the future and deliver that insight through multiple channels, allowing our clients to make better trading, investment, and policy decisions. We're changing, embedding alternative data sources such as GPS and satellite data into our forecasting; products will increasingly be tailored to individual clients, driven by some of the most innovative data in the market. A highly collaborative team of Product Managers, Customer Experience, and Product Engineering is being created, with a focus on creating business and customer value driven by real-time analytics alongside our traditional products. We are transitioning to operating with agile product teams and adopting cloud-native engineering practices, and we need your help. As a back-end developer, you will work with an amazing team of developers, designers, and product managers to design and build applications to serve Economist Intelligence Unit (EIU) clients.

How you will contribute:
- Build data pipelines: architect, create, and maintain data pipelines and ETL processes in AWS (see the sketch following this listing).
- Support and transition: support and optimise our current desktop data tool set and Excel analysis pipeline toward a transformative, cloud-based, highly scalable architecture.
- Work in an agile environment: within a collaborative, agile, cross-functional product team using Scrum and Kanban.
- Collaborate across departments: work closely with data science teams and with business (economists/data) analysts in refining their data requirements for various initiatives and data consumption needs.
- Educate and train: train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.
- Participate in ensuring compliance and governance during data use: ensure that data users and consumers use the data provisioned to them responsibly, through data governance and compliance initiatives.
- Work within, and encourage, a DevOps culture and Continuous Delivery process.

The ideal skills for this role include:
- Experience programming in Python, Spark, and SQL
- Prior experience with AWS services (such as Lambda, Glue, Step Functions, CloudFormation, CDK)
- Knowledge of building bespoke ETL solutions
- Data modelling and T-SQL for managing business data and reporting
- Capability for technical deep-dives into code and architecture
- Ability to design, build, and manage data pipelines encompassing data transformation, data models, schemas, metadata, and workload management
- Experience working with data science teams in refining and optimizing data science and machine learning models and algorithms
- Effective communication skills
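
A minimal sketch of one common AWS pipeline trigger of the sort listed above: a Lambda handler that starts a Glue job via boto3 when a new object lands in S3. The Glue job name and arguments are placeholder assumptions.

```python
# S3-event-driven Glue trigger; job name, bucket wiring, and arguments are placeholders.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Invoked by an S3 put event; starts a (hypothetical) Glue job per new object."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName="forecast-ingest-job",  # hypothetical Glue job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue JobRun {response['JobRunId']} for s3://{bucket}/{key}")
```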

Posted 2 months ago

Apply

5 - 8 years

12 - 20 Lacs

Bengaluru

Work from Office

Location: Bangalore
Experience: 5-8 Years
Notice Period: Immediate to 15 Days

Overview: The Supply Chain Analytics BA will play a key role in analyzing and optimizing supply chain processes using SAP ERP (P2P, MM, SD), Power BI, SQL, and Snowflake. You will be responsible for data analysis, requirement gathering, FIT-GAP analysis, and stakeholder alignment to drive business improvements. This position requires strong expertise in supply chain reporting and analytics, ensuring efficient data-driven decision-making for operations.

Responsibilities:
- Analyze and optimize supply chain & operations data to improve business efficiency.
- Work with SAP ERP modules (P2P, MM, SD) to extract insights and enhance processes.
- Perform data discovery, data mining, data management, and analytics.
- Conduct FIT-GAP analysis, requirement gathering, and business process mapping.
- Develop business requirement documents (BRDs) and presentations for stakeholders.
- Facilitate workshops, stakeholder coordination, and alignment with business units.
- Collaborate with cross-functional teams to drive supply chain process improvements.
- Utilize Power BI, SQL, and Snowflake to build reports and dashboards for decision-making.
- Work with non-SAP ERP systems for data discovery and analytics.
- Stay updated with AI/ML trends and explore potential applications in supply chain analytics.

Requirements:
- 5+ years of experience in reporting and analytics within supply chain & operations.
- 4-5 years of hands-on experience with SAP ERP (P2P, MM, SD) processes.
- Strong expertise in data analysis, data discovery, and data management.
- Experience in requirement gathering, business documentation, and stakeholder collaboration.
- Ability to conduct FIT-GAP sessions, workshops, and process alignment discussions.
- Proficiency in Power BI, SQL, and Snowflake for reporting and data analytics.
- Familiarity with AI/ML concepts (good to have).
- Excellent problem-solving, analytical, and communication skills.

Tech Stack: SAP ERP (P2P, MM, SD), Power BI, SQL, Snowflake, AI/ML (good to have)

If you are an experienced Supply Chain Analytics BA looking to drive impactful analytics-driven business improvements, we invite you to join our dynamic team!

Posted 2 months ago

Apply

4 - 5 years

9 - 12 Lacs

Ernakulam, Kochi

Work from Office

TNP is looking for an extraordinary Data Engineer who loves to push boundaries to solve complex business problems using creative solutions. As a Data Engineer, you will work in the Technology team that helps deliver our Data Engineering offerings on a large scale to clients worldwide.

Role Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures for batch and real-time processing.
- Build and optimize data integration workflows, ETL/ELT processes, and data transformation pipelines.
- Implement data modeling, schema design, and data governance strategies to ensure data quality and consistency.
- Work with relational and NoSQL databases, data lakes, and distributed systems to manage and store structured and unstructured data.
- Develop, test, and deploy custom data solutions using programming languages such as Python and SQL.
- Collaborate with cross-functional teams to identify data requirements and deliver solutions that meet business needs.
- Monitor data pipelines for performance, reliability, and scalability, and troubleshoot issues as they arise.
- Ensure data security and compliance with company policies and industry standards.
- Document processes, tools, and systems for knowledge sharing and scalability.

Must-Have Skills:
- Expertise in SQL and relational database systems (e.g., PostgreSQL, MySQL, Oracle).
- Proficiency in programming languages like Python, Java, or Scala.
- Hands-on experience with ETL tools.
- Experience with Big Data frameworks such as Apache Spark, Hadoop, or Kafka.
- Knowledge of cloud platforms (AWS, Azure, GCP) and tools like Redshift, Snowflake, or BigQuery.
- Proficiency in working with data lakes, data warehouses, and real-time streaming architectures.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving, analytical, and communication skills.

Good to Have:
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of machine learning pipelines and collaboration with Data Scientists.
- Exposure to containerization technologies like Docker and orchestration tools like Kubernetes.
- Understanding of DevOps practices and Infrastructure as Code (IaC) tools such as Terraform.
- Certifications in cloud platforms (AWS, Azure, GCP) or data engineering tools.

Posted 2 months ago

Apply

7 - 12 years

9 - 17 Lacs

Noida

Work from Office

About The Role:

Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do:

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas throughout the software development life cycle.
- Facilitate root cause analysis of system issues and the problem statement.
- Identify ideas to improve system performance and availability.
- Analyze client requirements and convert requirements into feasible designs.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system.
- Ensure that code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders.

3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take feedback on a regular basis to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Formally document all necessary details and reports for a proper understanding of the software, from client proposal to implementation.
- Ensure good quality of interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver:
No. | Performance Parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS & report generation

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Responsibilities:
- Develop, operate, optimize, test, and maintain the business's data warehouse, and drive the full life cycle of its back-end development, including ETL processes, cube development for database and performance administration, and dimensional design of table structures.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Define data retention policies.
- Build analytical tools to utilize the data pipeline, providing actionable insights into key business performance metrics, including operational efficiency and customer acquisition.
- Choose and integrate tools for monitoring, managing, alerting, and improving customer database performance.
- Develop and implement automated processes that ensure uninterrupted updating and correction of database vulnerabilities.
- Assemble large, complex sets of data that meet non-functional and functional business requirements.

Requirements:
- 5+ years of experience, or over 5 completed projects
- Knowledge of data management fundamentals (data modeling, ELT/ETL, data quality, metadata management, data warehouse/lake patterns, distributed systems)
- AWS/Azure cloud; Snowflake and Databricks are must-haves
- Strong proficiency with SQL and its variations among popular databases
- Experience with some modern relational databases
- Knowledge of the Python programming language

Posted 2 months ago

Apply

3 - 5 years

5 - 7 Lacs

Bengaluru

Work from Office

We are seeking an experienced Data Engineer to join our Platform Modernization squad, focusing on Snowflake implementation and optimization. This role will be instrumental in building and maintaining our modern data infrastructure while ensuring optimal performance, security, and reliability.

Key Responsibilities:
- Design, implement, and manage Snowflake data warehouses and compute resources
- Develop and maintain robust ETL/ELT pipelines using Snowflake best practices
- Implement data security protocols and access controls within the Snowflake environment
- Perform performance tuning and optimization of queries and warehouses
- Create and maintain documentation for data processes and architectures
- Collaborate with cross-functional teams to understand data requirements and implement solutions
- Monitor and optimize warehouse costs and resource utilization (see the sketch following this listing)
- Implement data governance policies and ensure compliance

Preferred Qualifications:
- Snowflake SnowPro certification
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of data visualization tools (Tableau, Power BI, or similar)
- Experience with version control systems (Git)
- Familiarity with CI/CD practices
- Experience with data governance and security frameworks

Additional Details:
Global Grade: C
Remote work possibility: Yes
Global Role Family: 60236 (P) Software Engineering
Local Role Name: 6504 Developer / Software Engineer
Local Skills: 57676 Snowflake
Languages Required: English
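
A minimal sketch of the warehouse cost controls this role mentions (auto-suspend plus a resource monitor), run through snowflake-connector-python. The account, names, sizes, and credit quota are placeholder assumptions.

```python
# Warehouse cost-control setup in Snowflake; identifiers and quota are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # hypothetical account identifier
    user="ADMIN_USER",
    password="***",
    role="SYSADMIN",
)
cur = conn.cursor()

# Auto-suspend after 60 idle seconds so the warehouse stops billing when unused.
cur.execute("""
CREATE WAREHOUSE IF NOT EXISTS reporting_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE
""")

# A resource monitor caps monthly credit spend and suspends the warehouse at quota.
cur.execute("USE ROLE ACCOUNTADMIN")  # creating resource monitors needs ACCOUNTADMIN
cur.execute("""
CREATE RESOURCE MONITOR IF NOT EXISTS reporting_rm
  WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND
""")
cur.execute("ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = reporting_rm")
conn.close()
```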

Posted 2 months ago

Apply

6 - 10 years

15 - 20 Lacs

Hyderabad

Work from Office

Key Skills and Knowledge:
1. Strong understanding of the Boomi platform and its functionalities.
2. Experience with Boomi cloud configuration and troubleshooting.
3. Experience in Molecule/Atom installations.
4. Experience in handling high-volume configurations.

Role & responsibilities:
* Boomi API Management
* Boomi MDH
* Boomi Flow
* Boomi Event Streams (plus point)
* Atom Queues (plus point)
* Postman (plus point)
* EDI and JSON knowledge
* Communication protocols (Kafka an added advantage)
* MS SQL
* Java & Groovy scripting
* Snowflake (an added advantage)

Posted 2 months ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:
  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may progress through roles such as:
  1. Junior Snowflake Developer
  2. Snowflake Developer
  3. Senior Snowflake Developer
  4. Snowflake Architect
  5. Snowflake Consultant
  6. Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
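
For hands-on preparation, here is a minimal sketch touching several of the topics above (virtual warehouses, semi-structured JSON handling, and Time Travel), using the snowflake-connector-python package. The account, credentials, warehouse, and table are placeholder assumptions, not any specific employer's setup.

```python
# Practice sketch for common Snowflake interview topics; identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # hypothetical account identifier
    user="DEV_USER",
    password="***",
    warehouse="DEV_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Virtual warehouses: compute is decoupled from storage and can be resized on the fly.
cur.execute("ALTER WAREHOUSE DEV_WH SET WAREHOUSE_SIZE = 'SMALL'")

# Semi-structured data: a VARIANT column stores JSON, queried with path syntax.
cur.execute("CREATE OR REPLACE TABLE events (payload VARIANT)")
cur.execute("""INSERT INTO events SELECT PARSE_JSON('{"user": "a1", "action": "login"}')""")
insert_qid = cur.sfqid  # query ID of the INSERT we just ran
cur.execute("SELECT payload:user::string, payload:action::string FROM events")
print(cur.fetchall())  # [('a1', 'login')]

# Time Travel: read the table as it existed before that INSERT statement.
cur.execute(f"SELECT COUNT(*) FROM events BEFORE(STATEMENT => '{insert_qid}')")
print(cur.fetchone())  # (0,) - the table was empty before the insert
conn.close()
```

Being able to explain each of these statements, and the separation of storage and compute they rely on, covers several of the basic and medium questions in the list.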

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
