
1082 Snowflake Jobs - Page 15

JobPe aggregates listings for easy application access, but you apply directly on the original job portal.

6.0 - 11.0 years

11 - 21 Lacs

Bengaluru

Work from Office


Job Title: Chennai Career Event - Applications invited for Engagement Lead

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

ExxonMobil is organizing scheduled in-person interviews at Chennai on 5th & 6th July 2025 for Engagement Lead. Work location: Bengaluru (last date to apply is 27th June 2025). Note: Shortlisted candidates will receive an interview invitation letter from the recruiting team.

What role you will play in the team
Globally support the Procurement organization by proposing, developing and maintaining dashboards that create business value. Lead and mentor internal team members to deliver business results. Job location: Bengaluru, Karnataka, India.

What you will do
Lead the execution of dashboard and insights-related components within the annual value delivery plan and strategic initiatives. Act as the liaison between stakeholders and the internal analytics team. Manage data-side system migrations and updates, and ensure timely incorporation of changes into analytics solutions. Provide insightful data analysis to support senior stakeholders on a local, regional or global basis, and communicate complex data analysis in a concise but meaningful manner. Actively guide and mentor other members of the team in fulfilling their duties by coaching them on more advanced data analysis or on how to develop a clear opportunity set. Work closely with the Procurement Manager, Data Engineering team, and other procurement professionals to plan, design, and deliver dashboards that meet the analytical needs of the Procurement organization. Review existing dashboards and visualizations to assess their effectiveness and identify opportunities to improve or better utilize them. Proactively identify data-related challenges, outline clear action plans, share progress with stakeholders, and highlight insights and opportunities within the data. Maintain existing analytics deliverables while driving continuous improvements to boost adoption and impact through effective change management.

About you
Qualifications & skills: Bachelor's degree in Computer Science/Engineering with a minimum 6 CGPA, and a minimum of 6 years of experience in SQL and Snowflake databases and visualization tools such as Tableau or Power BI. At least 3 years' experience in procurement processes and systems. Experience in SAP Analytics Cloud and SAP S/4HANA. Project management knowledge is an added advantage, to effectively plan, coordinate, and deliver analytics solutions within defined timelines and expectations. Analytical mindset with the ability to work with large datasets and extract valuable insights. Excellent communication and presentation skills, capable of working effectively in a diverse team environment. Minimum 3 years' experience leading a team. Self-motivated and able to work with minimal supervision.

Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you: competitive compensation; medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits; retirement benefits; global networking and cross-functional opportunities; annual vacations and holidays; day care assistance program; training and development program; tuition assistance program; workplace flexibility policy; relocation program; transportation facility. Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India: visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Posted 1 week ago

Apply

4.0 - 9.0 years

3 - 8 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Work from Office


We are seeking a proactive and technically sound Snowflake Data Architect to design, implement, and optimize scalable data solutions using the Snowflake cloud data platform. The ideal candidate will have strong experience in data modeling, performance tuning, and governance, along with the ability to collaborate with cross-functional teams and contribute to long-term data strategy and architecture.

Key Responsibilities: Design, implement, and manage end-to-end data solutions using Snowflake. Define and execute robust data strategies aligned with business needs. Optimize Snowflake environments for performance, cost-efficiency, and scalability. Implement data governance, security controls, and role-based access. Understand and apply modern data management technologies and architectures. Oversee and maintain the organization's data inventory, ensuring accuracy and accessibility. Collaborate with engineering, analytics, and business teams to understand requirements and translate them into technical solutions. Stay up to date with industry trends and continuously improve the organization's data management systems.

Key Skills & Technologies: Snowflake (warehousing, Snowpipe, Streams & Tasks, virtual warehouses). SQL (advanced query development and performance optimization). ETL/ELT tools: dbt, Azure Data Factory, Talend, Informatica, Matillion (any). Data modeling: star/snowflake schema, SCDs, normalization. Cloud platforms: AWS (S3, IAM), Azure (Blob, ADF), or GCP. Data governance and security: RBAC, data cataloging, encryption. Scripting (preferred): Python or Shell. Version control (Git) and CI/CD familiarity is a plus. A sketch of the Streams & Tasks pattern follows below.

Please contact nandhana.suresh@deliverycentric.com for additional details.
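To make the Streams & Tasks requirement concrete, here is a minimal sketch in Python using the snowflake-connector-python library: it captures changes on a raw table with a stream and schedules a task that merges them into a curated table. All object names (raw.orders, analytics.orders_clean, TRANSFORM_WH) and credentials are hypothetical placeholders, not details from this posting.

import os
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection; real credentials would come from a secrets store.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="DEMO_DB",
)
cur = conn.cursor()

# A stream records row-level changes (inserts/updates/deletes) on the table.
cur.execute("CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders")

# A task runs on a schedule, but only when the stream actually has data.
cur.execute("""
    CREATE OR REPLACE TASK raw.merge_orders
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      MERGE INTO analytics.orders_clean t
      USING raw.orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount)
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK raw.merge_orders RESUME")

Consuming the stream inside the MERGE advances its offset, so each run processes only new changes; this is the usual pattern behind the "Streams & Tasks" line in Snowflake job requirements.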

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 10 Lacs

Bengaluru

Work from Office


4+ years of testing experience, with at least 2 years in ETL testing and automation. Experience automating ETL flows and developing automation frameworks for ETL. Good coding skills in Python and Pytest. Expert at test data analysis and test design. Good at database analytics (ETL or BigQuery). Snowflake knowledge is a plus. Good communication skills with customers and other stakeholders. Capable of working independently or with little supervision.
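For a concrete flavor of the Python/Pytest ETL automation this posting asks for, here is a minimal, self-contained sketch. An in-memory SQLite database stands in for the real source and target systems, and the table names are hypothetical; a real suite would point the same assertions at the warehouse.

import sqlite3

import pytest

@pytest.fixture
def db():
    # In-memory stand-in for the real source/target systems.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE source_orders (id INTEGER, amount REAL);
        CREATE TABLE target_orders (id INTEGER, amount REAL);
        INSERT INTO source_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO target_orders VALUES (1, 10.0), (2, 20.5);
    """)
    yield conn
    conn.close()

def scalar(conn, sql):
    # Return the first column of the first row of a query.
    return conn.execute(sql).fetchone()[0]

def test_row_counts_match(db):
    # The load should be complete: no dropped or duplicated rows.
    assert scalar(db, "SELECT COUNT(*) FROM source_orders") == \
           scalar(db, "SELECT COUNT(*) FROM target_orders")

def test_no_null_keys(db):
    # Primary keys must survive the transformation intact.
    assert scalar(db, "SELECT COUNT(*) FROM target_orders WHERE id IS NULL") == 0

Running pytest on a file containing these tests executes both checks; the same pattern extends to null-rate, duplicate, and transformation-logic assertions.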

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Skills required: Big Data workflows (ETL/ELT), hands-on Python, hands-on SQL, any cloud (GCP BigQuery preferred), Airflow (good knowledge of Airflow features, operators, scheduling, etc.).

NOTE: Candidates will take a coding test (Python and SQL) during the interview process. This will be conducted through CoderPad; the panel will set it up at run time.
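As a reference point for the Airflow features mentioned (operators, scheduling, dependencies), here is a minimal sketch of a daily two-task DAG. The dag_id and the stubbed callables are hypothetical; in a BigQuery-centric stack the operators would typically be swapped for the Google provider's BigQuery operators.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull rows from the source system (stubbed out for the sketch).
    print("extracting")

def load():
    # Write transformed rows to the warehouse (stubbed out for the sketch).
    print("loading")

with DAG(
    dag_id="example_daily_etl",       # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",       # cron strings such as "0 6 * * *" also work
    catchup=False,                    # skip backfilling past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task         # load runs only after extract succeeds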

Posted 1 week ago

Apply

2.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Experience in designing and developing data pipelines in a modern data stack (Snowflake, AWS, Airflow, dbt, etc.). Strong experience in Python. 2+ years of experience in Snowflake and dbt. Able to work the afternoon shift and front-end the customer independently, so strong communication skills are essential. Strong knowledge of Python, dbt, Snowflake, and Airflow. Ability to manage both structured and unstructured data. Work with multiple data sources (APIs, databases, S3, etc.). Own design, documentation, and lifecycle management of data pipelines. Help implement CI/CD processes and release engineering for the organization's data pipelines. Experience in designing and developing CI/CD processes and managing releases for data pipelines. Proficient in Python, SQL, Airflow, AWS, and Bitbucket, and in working with APIs and other types of data sources. Knowledge of Salesforce is good to have. Primary skills: AWS Cloud, Snowflake DW, Azure SQL, SQL, Python (must have), dbt (must have).

Posted 1 week ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Chennai

Work from Office


Core requirements: solid SQL language skills; basic knowledge of data modeling; working knowledge of Snowflake on Azure and of CI/CD processes (with any tooling).
Nice to have: Azure ADF, ETL/ELT frameworks, ER/Studio.
Really nice to have: healthcare/life sciences experience, GxP processes.
Sr DW Engineer (in addition to the above): overseeing engineers while also performing the same work himself/herself; conducting design reviews, code reviews, and deployment reviews with engineers; solid data modeling, preferably using ER/Studio (or an equivalent tool); solid Snowflake SQL optimization (recognizing and fixing poor-performing statements); familiarity with medallion architecture (raw, refined, published, or similar terminology).
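To illustrate the "recognizing poor-performing statements" requirement, here is a minimal sketch (Python with snowflake-connector-python, hypothetical credentials) that surfaces the slowest statements of the past week from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view as tuning candidates.

import os
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical credentials; ACCOUNT_USAGE views also require elevated privileges.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

# QUERY_HISTORY lags slightly but keeps a year of history; sorting by
# elapsed time surfaces candidates for tuning.
rows = conn.cursor().execute("""
    SELECT query_id,
           total_elapsed_time / 1000 AS elapsed_s,
           bytes_spilled_to_local_storage,
           query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""").fetchall()

for query_id, elapsed_s, spilled, text in rows:
    # Bytes spilled to local storage often indicate an undersized
    # warehouse or an exploding join.
    print(f"{query_id}: {elapsed_s:.1f}s, spilled={spilled}, {text[:80]}")

Long elapsed times and heavy spilling usually point at undersized warehouses, missing clustering, or join explosions, which are the typical starting points for the optimization work described above.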

Posted 1 week ago

Apply

3.0 - 8.0 years

12 - 16 Lacs

Mangaluru, Hyderabad, Bengaluru

Work from Office


We're looking for a Senior Backend Developer who thrives at the intersection of software engineering and data engineering. This role involves architecting and optimizing complex, high-throughput backend systems that power data-driven products at scale. If you have deep backend chops, strong database expertise across RDBMS platforms, and hands-on experience with large-scale data workflows, we'd love to hear from you.

Key Responsibilities
1. Leadership & Project Delivery: Lead backend development teams, ensuring adherence to Agile practices and development best practices. Collaborate across product, frontend, DevOps, and data teams to design, build, and deploy robust features and services. Drive code quality through reviews, mentoring, and enforcing design principles.
2. Research & Innovation: Conduct feasibility studies on emerging technologies, frameworks, and methodologies. Design and propose innovative solutions for complex technical challenges using data-centric approaches. Continuously improve system design with a forward-thinking mindset.
3. System Architecture & Optimization: Design scalable, distributed, and secure system architectures. Optimize and refactor legacy systems to improve performance, maintainability, and scalability. Define best practices around observability, logging, and resiliency.
4. Database & Data Engineering: Design, implement, and optimize relational databases (PostgreSQL, MySQL, SQL Server, etc.). Develop efficient SQL queries, stored procedures, indexes, and schema migrations. Collaborate with data engineering teams on ETL/ELT pipelines, data ingestion, transformation, and warehousing. Work with large datasets, batch processing, and streaming data (e.g., Kafka, Spark, Airflow). Ensure data integrity, consistency, and security across backend and analytics pipelines.

Must-Have Skills
Backend development: TypeScript, Node.js (or an equivalent backend framework), REST/GraphQL API design. Databases & storage: strong proficiency in PostgreSQL, plus experience with other RDBMS like MySQL, SQL Server, or Oracle; familiarity with NoSQL (e.g., Redis, MongoDB) and columnar/OLAP stores (e.g., ClickHouse, Redshift). Data engineering awareness: hands-on work with data ingestion, transformation, pipelines, and data orchestration tools; exposure to tools like Apache Airflow, Kafka, Spark, or dbt. Cloud & infrastructure: proficiency with AWS (Lambda, EC2, RDS, S3, IAM, CloudWatch). DevOps & CI/CD: experience with Docker, Kubernetes, GitHub Actions, or similar CI/CD pipelines. Architecture: experience designing secure, scalable, and fault-tolerant backend systems. Agile & SDLC: strong understanding of Agile workflows, SDLC best practices, and version control (Git).

Nice-to-Have Skills
Experience with event-driven architectures or microservices. Exposure to data warehouse environments (e.g., Snowflake, BigQuery). Knowledge of backend-for-frontend collaboration (especially with React.js). Familiarity with data cataloging, data governance, and lineage tools.

Preferred Qualifications
Bachelor's or Master's in Computer Science, Software Engineering, or a related technical field. Proven experience leading backend/data projects in enterprise or startup environments. Strong system design, analytical, and problem-solving skills. Awareness of cybersecurity best practices in cloud and backend development.

Posted 1 week ago

Apply

3.0 - 6.0 years

8 - 12 Lacs

Gurugram

Work from Office


We're looking for a skilled Node.js Developer with a strong foundation in data engineering to join our engineering team. You'll be responsible for building scalable backend systems using modern Node.js frameworks and tools, while also designing and maintaining robust data pipelines and integrations.

Primary Responsibilities: Build and maintain performant APIs and backend services using Node.js and frameworks like Express.js, NestJS, or Fastify. Develop and manage ETL/ELT pipelines, data models, schemas, and data transformation logic for analytics and operational use. Ensure data quality, integrity, and consistency through validation, monitoring, and logging. Work with database technologies (MySQL, PostgreSQL, MongoDB, Redis) to store and manage application and analytical data. Implement integrations with third-party APIs and internal microservices. Use ORMs like Sequelize, TypeORM, or Prisma for data modeling and interaction. Write unit, integration, and E2E tests using frameworks such as Jest, Mocha, or Supertest. Collaborate with frontend, DevOps, and data engineering teams to ship end-to-end features. Monitor and optimize system performance, logging (e.g., Winston, Pino), and error handling. Contribute to CI/CD workflows and infrastructure automation using tools like PM2, Docker, and Jenkins.

Required Skills: 3+ years of experience in backend development using Node.js. Hands-on experience with Express.js, NestJS, or other Node.js frameworks. Understanding of data modeling, partitioning, indexing, and query optimization. Experience in building and maintaining data pipelines, preferably using custom Node.js scripts. Familiarity with stream processing and messaging systems (e.g., Kafka, RabbitMQ, or Redis Streams). Solid understanding of SQL and NoSQL data stores and schema design. Strong knowledge of JavaScript and preferably TypeScript. Familiarity with cloud platforms (AWS/GCP/Azure) and services like S3, Lambda, or Cloud Functions. Experience with containerized environments (Docker) and CI/CD. Experience with data warehouses (e.g., BigQuery, Snowflake, Redshift).

Nice to Have: Cloud certification in AWS or GCP. Experience with distributed processing tools (e.g., Spark, Trino/Presto). Experience with data transformation tools (e.g., dbt, SQLMesh) and data orchestration (e.g., Apache Airflow, Kestra). Familiarity with serverless architectures and tools like Vercel/Netlify for deployment.

Posted 1 week ago

Apply

3.0 - 6.0 years

10 - 15 Lacs

Gurugram, Bengaluru

Work from Office


3+ years of experience in data science roles, working with tabular data in large-scale projects. Experience in feature engineering and working with methods such as XGBoost, LightGBM, factorization machines, and similar algorithms. Experience in the adtech or fintech industries is a plus. Familiarity with clickstream data, predictive modeling for user engagement, or bidding optimization is highly advantageous. MS or PhD in mathematics, computer science, physics, statistics, electrical engineering, or a related field. Proficiency in Python (3.9+), with experience in scientific computing and machine learning tools (e.g., NumPy, Pandas, SciPy, scikit-learn, matplotlib). Familiarity with deep learning frameworks (such as TensorFlow or PyTorch) is a plus. Strong expertise in applied statistical methods, A/B testing frameworks, advanced experiment design, and interpreting complex experimental results. Experience querying and processing data using SQL and working with distributed data storage solutions (e.g., AWS Redshift, Snowflake, BigQuery, Athena, Presto, MinIO). Experience in budget allocation optimization, lookalike modeling, LTV prediction, or churn analysis is a plus. Ability to manage multiple projects, prioritize tasks effectively, and maintain a structured approach to complex problem-solving. Excellent communication and collaboration skills to work effectively with both technical and business teams.
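To ground the tabular-modeling stack this posting names, here is a minimal sketch that trains and evaluates a gradient-boosted classifier with scikit-learn and XGBoost on synthetic data; in a real adtech setting the features would come from clickstream or bid-request data.

from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier  # pip install xgboost

# Synthetic stand-in for a real tabular dataset of engineered features.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = XGBClassifier(
    n_estimators=300,
    learning_rate=0.05,
    max_depth=6,
    subsample=0.8,
    eval_metric="auc",
)
model.fit(X_train, y_train)

# AUC is a common metric for engagement or bid-response models,
# where positive labels (clicks, conversions) are heavily imbalanced.
val_auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation AUC: {val_auc:.3f}")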

Posted 1 week ago

Apply

3.0 - 8.0 years

8 - 18 Lacs

Bengaluru

Work from Office


Role & responsibilities
Observability: Ensure end-to-end monitoring of pipelines, data services, infrastructure, and databases. Proactively detect and resolve issues to maintain system health.
FinOps: Track and optimize platform usage and cost. Understand cost drivers, perform forecasting, and automate cost control measures.
User Management: Manage onboarding, offboarding, and role-based access controls across tools including Tableau, Snowflake, and AWS.
Privileged Access Management: Oversee and audit elevated access to critical systems in compliance with security policies.
Application Maintenance: Perform regular maintenance, updates, and health checks on platform components to ensure operational stability.
Service Desk Management: Triage and resolve incidents, service requests (SRs), and problems. Maintain the BAU roster and collaborate with cross-functional teams.
Minor Enhancements: Address low-effort business enhancements (2-3 days) through a structured request process.
Business Continuity Planning: Maintain and test Business Continuity Plans (e.g., Tableau DR) to ensure platform resilience.
Deployment Services: Support production deployments, bug fixes, and enhancements in line with CI/CD pipelines.
Data Load Fixes: Resolve failures in data ingestion due to scheduling, connectivity, infrastructure, or secret rotation issues.
Transformations/Data Model Support: Provide Level 1 triage for issues arising from schema changes, malformed data, or source inconsistencies.
Functional Data Questions: Perform initial triage for data requests or quality issues, and coordinate with domain-specific data analysts as needed.
Project Support: Offer support for projects needing platform team involvement.
License Review: Participate in quarterly Tableau license reviews and ensure license compliance.
Documentation: Maintain procedures, work instructions, and knowledge base (KB) articles for operational consistency and knowledge transfer.

Preferred candidate profile
3+ years of experience in a Data Platform Support, DevOps, or Operations role. Hands-on experience with tools like Tableau, Snowflake, AWS, and Informatica Cloud. Familiarity with ITSM practices (e.g., incident, problem, change management). Proficiency with Jira, CI/CD workflows, and monitoring tools. Strong documentation, communication, and stakeholder management skills.

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Chennai

Hybrid


Job Title: Data Engineer. Location: Chennai, Bangalore, Pune, Hyderabad (Hybrid). Job Type: Permanent Employee.

Responsibilities
Skillsets required: Application and API development. SQL data modeling and automation. Experience working with GIS map services and spatial databases. Experience creating GIS map services. Data and application architecture. Handling of legacy data. Familiarity with client work processes and data is a plus.

Platforms: Databricks, Snowflake, ESRI ArcGIS / ArcSDE, and a new GenAI app being developed.

Tasks that will need to be done: Combining and integrating spatial databases from different sources to be used with the new GenAI application. Building map services with associated metadata to support questions from geoscience users. Setting up the necessary update cycles for databases and map services to ensure evergreen results. Helping construct APIs for these databases and map services to structure the best possible workflows for users. Assisting with data and application architecture. Helping with legacy data, such as links to existing applications, databases, and services. Ensuring that IT requirements are met as we build the project, including integration, data tiers, access control, and status monitoring.

Posted 1 week ago

Apply

8.0 - 12.0 years

14 - 24 Lacs

Bengaluru

Work from Office


Job Description: Total experience: 8+ years in IT. Relevant experience: 5+ years. Snowflake with Python, day shift. More than 8 years of IT experience, specifically in the data engineering stream. Should possess development skills in Snowflake, basic IBM DataStage or any other ETL tool, SQL (expert), and basics of Python/PySpark and AWS, along with high proficiency in Oracle SQL. Hands-on experience in handling databases, along with experience in a scheduling tool such as Control-M. Excellent customer service, interpersonal, communication, and team collaboration skills. Excellent debugging skills in databases; should have played a key member role in earlier projects. Excellent SQL and PL/SQL coding (development) skills. Ability to identify and implement process and/or application improvements. Must be able to work on multiple simultaneous tasks with limited supervision. Able to follow change management procedures and internal guidelines. Any relevant technical certification in DataStage is a plus.

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 20 Lacs

Hyderabad

Work from Office


Role: Technical Project Manager. Location: Gachibowli, Hyderabad. Duration: Full time. Timings: 5:30pm - 2:00am IST. Note: Looking for immediate joiners only (15-30 days' notice).

Job Summary: We are seeking a Technical Project Manager with a strong data engineering background to lead and manage end-to-end delivery of data platform initiatives. The ideal candidate will have hands-on exposure to AWS, ETL pipelines, Snowflake, and DBT, and must be adept at stakeholder communication, agile methodologies, and cross-functional coordination across engineering, data, and business teams.

Key Responsibilities: Plan, execute, and deliver data engineering and cloud-based projects within scope, budget, and timeline. Work closely with data architects, engineers, and analysts to manage deliverables involving ETL pipelines, the Snowflake data warehouse, and DBT models. Lead Agile/Scrum ceremonies: sprint planning, backlog grooming, stand-ups, and retrospectives. Monitor and report project status, risks, and issues to stakeholders and leadership. Coordinate cross-functional teams across data, cloud infrastructure, and product. Ensure adherence to data governance, security, and compliance standards throughout the lifecycle. Manage third-party vendors or consultants as required for data platform implementations. Own project documentation including project charters, timelines, RACI matrix, risk registers, and post-implementation reviews.

Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field (Master's preferred). 8+ years in IT, with 3-5 years as a Project Manager in data-focused environments. Hands-on understanding of AWS services (e.g., S3, Glue, Lambda, Redshift), ETL/ELT frameworks and orchestration, the Snowflake Data Warehouse, and DBT (Data Build Tool) for data modeling. Familiar with SQL, data pipelines, and data quality frameworks. Experience using project management tools like JIRA, Confluence, MS Project, and Smartsheet. PMP, CSM, or SAFe certifications preferred. Excellent communication, presentation, and stakeholder management skills.

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Noida

Hybrid


QA Automation Engineer

As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

Responsibilities
Develop and Implement Automation Frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
Test Strategy and Execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
Data Validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
Performance Testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load testing strategies.
Collaborate with Teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
Continuous Integration: Integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
Defect Tracking and Reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
Test Data Management: Develop strategies for handling large volumes of test data while maintaining data security and privacy.
Tool and Technology Evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.

Job Qualifications - Requirements and skills
At least 4+ years of experience and a solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.). Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing; a sketch of this kind of test follows below. Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations. Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes. Performance testing. Experience with version control systems like Git. Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues. Strong communication and collaboration skills. Attention to detail and a passion for delivering high-quality solutions. Ability to work in a fast-paced environment and manage multiple priorities. Enthusiastic about learning new technologies and frameworks.

Experience with the following tools and technologies is desired: Qlik Replicate, Matillion ETL, Snowflake, Data Vault warehouse design, Power BI, Azure Cloud (including Logic Apps, Azure Functions, ADF).
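As one concrete example of the automated warehouse checks described above, here is a minimal Pytest sketch that validates star-schema referential integrity plus a simple data-quality rule. SQLite stands in for the warehouse and all table names are hypothetical; against Snowflake, the fixture would simply return a warehouse connection instead.

import sqlite3

import pytest

@pytest.fixture
def warehouse():
    # In-memory stand-in for the warehouse under test.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY);
        CREATE TABLE fact_sales (sale_id INTEGER, customer_key INTEGER, amount REAL);
        INSERT INTO dim_customer VALUES (1), (2);
        INSERT INTO fact_sales VALUES (100, 1, 9.99), (101, 2, 24.50);
    """)
    yield conn
    conn.close()

def test_fact_has_no_orphan_keys(warehouse):
    # Every fact row must join to a dimension row (star-schema integrity).
    orphans = warehouse.execute("""
        SELECT COUNT(*) FROM fact_sales f
        LEFT JOIN dim_customer d USING (customer_key)
        WHERE d.customer_key IS NULL
    """).fetchone()[0]
    assert orphans == 0

def test_amounts_are_non_negative(warehouse):
    # A cheap data-quality gate on the measure column.
    bad = warehouse.execute(
        "SELECT COUNT(*) FROM fact_sales WHERE amount < 0"
    ).fetchone()[0]
    assert bad == 0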

Posted 1 week ago

Apply

7.0 - 12.0 years

17 - 30 Lacs

Hyderabad/ Secunderabad, Ahmedabad, Chennai

Work from Office


Applicants who require a UK work visa are considered. Software Engineers with any of the below skillsets are welcome to apply: Cloud Platforms (Azure/AWS/GCP) | DevOps Engineer | Microsoft 365 | Microsoft Dynamics 365 | Power Technologies (PowerApps, Power Automate & Power BI) | SharePoint SME | Test Engineer | Frontend Development | Fullstack Developer | DBA Admin | SAP | Salesforce | BigData | Data Engineer | AI Engineer | Hadoop | Snowflake | Java / JavaScript / React JS / REST API / C# / ASP.Net / SQL Server / PySpark / Node JS | Terraform | Kubernetes | Docker | Site Resilience Engineer | Scrum Master | Business Analyst | Human Resource

As a Software Engineer, you will work in the product team and be a core contributor. You will collaborate with other engineers, defining and delivering solutions that expand on product offerings and new capabilities and support continued growth. Use a modularized approach, be data-driven, and measure our results. Continually innovate and improve, strive to learn and grow, and have a standard of excellence, a strong sense of ownership, and excellent technical skills in agile environments.

NOTE: This job provides an initial 3 years of visa sponsorship under the UK Skilled Worker visa category, which involves processing charges. Anyone enthusiastic about relocating to the UK and building a bright future there can apply.

Good to have any of the below skillsets: Frontend development skills. JavaScript, React JS, REST APIs, TypeScript, NodeJS, HTML, CSS. Significant commercial C# experience specifically. Microsoft Office 365, AWS, Azure cloud platforms. Ability to collaborate in a development team. Excellent communication skills with team leads/line managers. Data Modelling, Data Analytics, Data Governance, Machine Learning, B2B Customer Operations, Master Data Management, Data Interoperability, Azure Key Vault, Data Integration, Azure Data Lake, Data Science, Digital Transformation, Cloud Migration, Data Architecture, Data Migrations, Data Marts, Agile Delivery, ETL/ELT, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, ARM/Terraform, Azure PowerShell, Data Catalogue.

Key Accountabilities: Deliver high-quality software in acceptable timescales. Take ownership of a significant and key area within the solution. Suggest estimates of the expected time to complete work. Design and implement services using C#, .NET Core, Azure, SQL Server, JavaScript, Angular.js, and NodeJS. Design and implement web APIs using .NET Core and C#. Work well within a team environment. Abide by design and coding guidelines. Be proactive and self-sufficient, with excellent attention to detail.

Location: London, UK. Duration: Full Time. Start Date: ASAP. Rate: 30 Lakhs per annum. Competitive holiday. Website: https://saitpoint.com/ Employment Business: Saitpoint Private Limited (India) and Saitpoint Limited (UK). Contact: hr@saitpoint.com

Posted 1 week ago

Apply

4.0 - 8.0 years

7 - 17 Lacs

Kolkata, Hyderabad, Pune

Work from Office


Role: Snowflake Developer. Experience: 4+ years. Location: Pan India.

Posted 1 week ago

Apply

11.0 - 14.0 years

16 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Roles and Responsibilities
Design, develop, and maintain large-scale data pipelines using Azure Data Factory (ADF) to extract, transform, and load data from various sources into the Snowflake Data Warehouse. Develop complex SQL queries to optimize database performance and troubleshoot issues in Snowflake tables. Collaborate with cross-functional teams to gather requirements for reporting needs and design scalable solutions using Power BI. Ensure high-quality data modeling by creating logical and physical models for large datasets. Troubleshoot technical issues related to ETL processes, data quality, and performance tuning.

Posted 1 week ago

Apply

8.0 - 13.0 years

17 - 25 Lacs

Bangalore Rural, Bengaluru

Work from Office


Call: 7738402343. Mail: divyani@contactxindia.com

Role & responsibilities
Snowflake with Python, day shift. More than 8 years of IT experience, specifically in the data engineering stream. Should possess development skills in Snowflake, basic IBM DataStage or any other ETL tool, SQL (expert), and basics of Python/PySpark and AWS, along with high proficiency in Oracle SQL. Hands-on experience in handling databases, along with experience in a scheduling tool such as Control-M. Excellent customer service, interpersonal, communication, and team collaboration skills. Excellent debugging skills in databases; should have played a key member role in earlier projects. Excellent SQL and PL/SQL coding (development) skills. Ability to identify and implement process and/or application improvements. Must be able to work on multiple simultaneous tasks with limited supervision. Able to follow change management procedures and internal guidelines. Any relevant technical certification in DataStage is a plus.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Senior Data Analyst - Procurement. Bengaluru, India.

We are seeking a skilled and highly motivated Senior Data Analyst to join our Procurement team. In this role, you will analyze various data management tools, optimize software license usage, and provide key insights to stakeholders regarding software license performance and cost-effectiveness. The ideal candidate will possess a strong technical background and a keen analytical mindset, with a focus on improving the overall reporting landscape and helping drive efficiency. Candidates must be available to work and attend meetings during Pacific Standard Time (PST).

Key Responsibilities: This position is responsible for the data analysis and reporting needs of the Procurement team (SS&P) in support of Business Services. The role covers data needs related, but not limited, to Snowflake, Coupa, ServiceNow, and financial metrics supporting strategic priorities, and participates in a variety of special projects involving data analysis, business operations, and data management. Managing Procurement reports (e.g., Excel, Tableau, Power BI) with precision and effectiveness, based on data coming from a variety of business systems and data warehouses (Coupa, Snowflake, SCOUT, ServiceNow). Providing first-line resolution and support for data requests, as well as partnering with the IT and Sec-Ops teams accordingly. Growing the analytics and business intelligence related to software contracts, metrics, cost, and budget supported by Procurement and Sourcing to assist in building a data-driven culture:
Software Asset Inventory: Maintain an accurate and up-to-date inventory of software licenses, versions, and deployments throughout the organisation.
License Compliance: Ensure adherence to software licensing agreements, monitor license usage, and take necessary measures to address any non-compliance issues.
License Optimization: Analyse software usage patterns, identify opportunities for optimization, and implement strategies to optimize license allocation and reduce costs (e.g., co-terming agreements).
Cost Optimization: Focus on software license cost management, tracking, and forecasting.
Collaboration: Work with cross-functional teams (e.g., IT, Security, business units) to improve data quality and reporting.
Renewals: Provide up-to-date and reliable information to ensure software contracts are renewed on time.
Design BI dashboards, scorecards, charts/graphs, drill-downs, and dynamic reports to meet new information needs. Lead the creation of a catalog of key performance indicators and the documentation of their supporting business requirements, data models, calculation rules, and metadata.

Qualifications: Bachelor's degree in Computer Science, Information Systems Management, or a related field, or an equivalent combination of education and/or experience. Advanced Excel skills required; ServiceNow or BI tool certifications are a plus. Experience: 5+ years of experience as a Data Analyst, with hands-on experience in data analysis and creating visualizations using BI tools. Expertise with SQL, Snowflake, Power BI, and Tableau. Strong understanding of Procurement/Sourcing processes, especially in SaaS and Cloud Services. Strong knowledge of ServiceNow ITAM and/or ITSM. Experience with cloud-based analytics or data governance. Proficient in PowerPoint, with the ability to influence decision-making using data-driven insights. Excellent problem-solving and data analysis skills, with a focus on cloud performance metrics and cost optimization. Strong written and verbal communication skills, with the ability to explain complex technical concepts to both technical and non-technical stakeholders. Ability to manage multiple priorities and projects in a fast-paced environment. Excited to work in a fast-growing global environment and able to thrive with both autonomy and collaboration.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Senior Data Engineer - Enterprise Data Platform

Get to know Data Engineering
Okta's Business Operations team is on a mission to accelerate Okta's scale and growth. We bring world-class business acumen and technology expertise to every interaction. We also drive cross-functional collaboration and are focused on delivering measurable business outcomes. Business Operations strives to deliver amazing technology experiences for our employees and to ensure that our offices have all the technology that is needed for the future of work. The Data Engineering team is focused on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks.

The Senior Data Engineer Opportunity
A Senior Data Engineer is responsible for designing, building, and maintaining scalable solutions. This role involves collaborating with data engineers, analysts, scientists, and other engineers to ensure data availability, integrity, and security. The ideal candidate will have a strong background in cloud platforms, data warehousing, infrastructure as code, and continuous integration/continuous deployment (CI/CD) practices.

What you'll be doing: Design, develop, and maintain scalable data platforms using AWS, Snowflake, dbt, and Databricks. Use Terraform to manage infrastructure as code, ensuring consistent and reproducible environments. Develop and maintain CI/CD pipelines for data platform applications using GitHub and GitLab. Troubleshoot and resolve issues related to data infrastructure and workflows. Containerize applications and services using Docker to ensure portability and scalability. Conduct vulnerability scans and apply necessary patches to ensure the security and integrity of the data platform. Work with data engineers to design and implement Secure Development Lifecycle practices and security tooling (DAST, SAST, SCA, secret scanning) in automated CI/CD pipelines. Ensure data security and compliance with industry standards and regulations. Stay updated with the latest trends and technologies in data engineering and cloud platforms.

What we are looking for: BS in Computer Science, Engineering, or another quantitative field of study. 5+ years in a data engineering role. 5+ years of experience working with SQL, ETL tools such as Airflow and dbt, and relational and columnar MPP databases like Snowflake or Redshift, with hands-on experience with AWS (e.g., S3, Lambda, EMR, EC2, EKS). 2+ years of experience managing CI/CD infrastructures, with strong proficiency in tools like GitHub Actions, Jenkins, ArgoCD, GitLab, or any CI/CD tool to streamline deployment pipelines and ensure efficient software delivery. 2+ years of experience with Java, Python, Go, or similar backend languages. Experience with Terraform for infrastructure as code. Experience with Docker and containerization technologies. Experience working with lakehouse architectures such as Databricks and file formats like Iceberg and Delta. Experience in designing, building, and managing complex deployment pipelines.

Posted 1 week ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Bengaluru

Work from Office


As a Senior People Data Ops Product Manager, you will own enterprise data products and assets, such as curated data sets, semantic layers, and the foundational data pipelines of the HR pyramid. You will drive product strategy, discovery, delivery, and evolution of these data products to ensure they meet the analytical, operational, and compliance needs of Target's diverse user base.

About the Role
As a Senior People Data Ops Product Manager, you will work in Target's product model and partner closely with engineers, data scientists, UX designers, governance and privacy experts, and business stakeholders to build and scale data products that deliver measurable outcomes. You will be accountable for understanding customer needs and business objectives, and translating them into a clear roadmap of capabilities that drive adoption and impact.

You will: Define the vision, strategy, and roadmap for one or more data products, aligning with enterprise data and business priorities. Deeply understand your users (analysts, data scientists, engineers, and business leaders) and their data needs. Translate complex requirements into clear user stories, acceptance criteria, and product specifications. Drive decisions about data sourcing, quality, access, and governance in partnership with engineering, privacy, and legal teams. Prioritize work in a unified backlog across discovery, design, data modeling, engineering, and testing. Ensure high-quality, reliable, and trusted data is accessible and usable for a variety of analytical and operational use cases. Evangelize the value of your data product across the enterprise and support enablement and adoption efforts. Use data to make decisions about your product's performance, identify improvements, and evaluate new opportunities.

About You
Must have a minimum 3-year college degree in computer science or information technology. A total of 9+ years of experience, of which 5+ years is product management experience, ideally with a focus on data products, platforms, or analytical tooling. Deep understanding of data concepts: data modeling, governance, quality, privacy, and lifecycle management. Experience delivering products in agile environments (e.g., user stories, iterative development, scrum teams). Ability to translate business needs into technical requirements and communicate effectively across roles. Demonstrated success in building products that support data consumers like analysts, engineers, and business users. Experience working with modern data technologies (e.g., Snowflake, Hadoop, Airflow, GCP) is a plus. Strategic thinker with strong analytical and problem-solving skills. Strong leadership, collaboration, and communication skills. Willing to coach and mentor team members.

Posted 1 week ago

Apply

2.0 - 6.0 years

3 - 8 Lacs

Pune, Sangli

Work from Office


We are looking for a Data Science Engineer with strong experience in ETL development and Talend to join our data and analytics team. The ideal candidate will be responsible for designing robust data pipelines, enabling analytics and AI solutions, and working on scalable data science projects that drive business value.

Key Responsibilities: Design, build, and maintain ETL pipelines using Talend Data Integration. Extract data from multiple sources (databases, APIs, flat files) and load it into data warehouses or lakes. Ensure data integrity, quality, and performance tuning in ETL workflows. Implement job scheduling, logging, and exception handling using Talend and orchestration tools. Prepare and transform large datasets for analytics and machine learning use cases. Build and deploy data pipelines that feed predictive models and business intelligence platforms. Collaborate with data scientists to operationalize ML models and ensure they run efficiently at scale. Assist in feature engineering, data labeling, and model monitoring processes.

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. 3+ years of experience in ETL development, with at least 2 years using Talend. Proficiency in SQL and Python (for data transformation or automation). Hands-on experience with data integration, data modeling, and data warehousing. Must have strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with big data tools like Spark, Hadoop, or Kafka is a plus.
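The posting centers on Talend for ETL, but the Python transformation and automation side it mentions might look like this minimal pandas sketch (file, table, and column names are hypothetical): extract a raw CSV, clean and de-duplicate it, and load it into a staging table.

import sqlite3

import pandas as pd

# Extract: read a raw source file (hypothetical name and columns).
raw = pd.read_csv("customers_raw.csv")

# Transform: standardize emails and drop rows without a business key.
raw["email"] = raw["email"].str.strip().str.lower()
clean = raw.dropna(subset=["customer_id"]).drop_duplicates(subset=["customer_id"])

# Load: write to a staging table; SQLite stands in for the warehouse here.
conn = sqlite3.connect("warehouse.db")
clean.to_sql("stg_customers", conn, if_exists="replace", index=False)
conn.close()

In a Talend-centered stack, a script like this would typically run as a tSystem/child-job step or be replaced by equivalent Talend components; the extract-transform-load shape stays the same.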

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office


About the Opportunity
Job Type: Application, 23 June 2025. Title: Expert Engineer. Department: GPS Technology. Location: Gurugram, India. Reports To: Project Manager. Level: Grade 4.

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About your team
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.

About your role
An Expert Engineer is a seasoned technology expert, highly skilled in programming, engineering and problem-solving. They can deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional and operational requirements most of the time without defects and incidents. So, if a relentless focus and drive towards technical and engineering excellence, along with adding value to the business, excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, come, we are looking for you. Understand system requirements; analyse, design, develop and test application systems following the defined standards. The candidate is expected to display professional ethics in his/her approach to work and exhibit a high level of ownership within a demanding working environment.

About you
Essential skills
You have excellent software designing, programming, engineering, and problem-solving skills. Strong experience working on data ingestion, transformation and distribution using AWS or Snowflake. Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion or dbt. Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB and VPCs. Familiar with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality). Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic. Designing data ingestion and orchestration pipelines using AWS and Control-M. Establish strategies for data extraction, ingestion, transformation, automation, and consumption. Experience in data lake concepts with structured, semi-structured and unstructured data. Experience in creating CI/CD processes for Snowflake. Experience in strategies for data testing, data quality, code quality and code coverage. Ability, willingness and openness to experiment with, evaluate and adopt new technologies. Passion for technology, problem solving and teamwork. Go-getter, with the ability to navigate across roles, functions and business units to collaborate and to drive agreements and changes from drawing board to live systems. Lifelong learner who can bring contemporary practices, technologies and ways of working to the organisation. Effective collaborator, adept at using all effective modes of communication and collaboration tools.

Experience delivering on data-related non-functional requirements, such as: hands-on experience dealing with large volumes of historical data across markets/geographies; manipulating, processing, and extracting value from large, disconnected datasets; building water-tight data quality gates on investment management data; generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors, etc.

Experience and Qualifications
B.E./B.Tech. or M.C.A. in Computer Science from a reputed university. Total 7 to 10 years of relevant experience.

Personal Characteristics
Good interpersonal and communication skills. Strong team player. Ability to work at a strategic and tactical level. Ability to convey strong messages in a polite but firm manner. Self-motivation is essential; should demonstrate commitment to high-quality design and development. Ability to develop and maintain working relationships with several stakeholders. Flexibility and an open attitude to change. Problem-solving skills, with the ability to think laterally and with a medium-term and long-term perspective. Ability to learn and quickly get familiar with a complex business and technology environment.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.

Posted 1 week ago

Apply

4.0 - 7.0 years

8 - 12 Lacs

Gurugram

Work from Office


About the Opportunity
Job Type: Application, 21 June 2025. Title: Senior Analyst - Data Scientist. Department: Data Value. Location: Gurgaon. Reports To: Suman Kaur. Level: 3.

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our Data Value team and feel like you're part of something bigger.

About your team
The Data Value team drives the renewed focus of extracting value from Fidelity's data for business and client insights, working as one voice with the business, technology, and data teams. The team's vision is to create measurable business impact by leveraging technology and utilising the skills to generate valuable insights and streamline engagements. The Data Science function within Data Value supports Fidelity International's Sales, Marketing, Propositions, Risk, Finance, Customer Service and HR teams across the globe. The key objectives of the function are: to develop deep customer insights for our businesses, helping them segment and target customers more effectively; to develop a fact-based understanding of sales trends and identify actionable sales growth opportunities for each of our sales channels; to understand customer preferences in terms of products, service attributes and marketing activity to help refine each of these; to help develop new service lines, e.g. customer analytics for key IFAs, DC clients, and individual clients; and to develop market and competitive intelligence in our key markets to help shape our business planning in those markets. The function works directly with business heads and other senior stakeholders to identify areas of analytics, define problem statements and develop key insights.

About your role
You will be expected to take a leading role in developing the Data Science and Advanced Analytics solutions for our business. This will involve: engaging with the key stakeholders to understand Fidelity's sales, marketing, client services and propositions context; implementing advanced analytics solutions on on-premises/cloud platforms, developing proof-of-concepts and engaging with the internal and external ecosystem to progress the proofs of concept to production; engaging and collaborating with other internal teams like data engineering, DevOps, and technology for development of new tools, capabilities, and solutions; and maximizing adoption of cloud-based advanced analytics solutions, building out sandbox analytics environments using Snowflake, AWS, Adobe, and Salesforce.

About you
Key Responsibilities
Developing and delivering Data Science solutions for business (40%): Partner with the internal (FIL teams) and external ecosystem to design and deliver advanced analytics enabled Data Science solutions. Create advanced analytics solutions on quantitative and text data using Artificial Intelligence, Machine Learning and NLP techniques. Create compelling visualisations that enable the smooth consumption of predictions and insights for customer benefit.
Stakeholder management (30%): Work with channel heads/stakeholders and other sponsors to understand the business problem and translate it into the appropriate analytics solution. Engage with key stakeholders for smooth execution, delivery, and implementation of solutions.
Adoption of cloud-enabled Data Science solutions (20%): Maximize adoption of cloud-based advanced analytics solutions. Build out sandbox analytics environments using Snowflake, AWS, Adobe, and Salesforce. Deploy solutions in production while adhering to best practices involving model explainability, MLOps, feature stores, model management, responsible AI, etc.
Collaboration and ownership (10%): Share knowledge and best practices with the team, including coaching or training in some of the deep learning/machine learning methodologies. Provide mentoring, coaching, and consulting advice and guidance to staff, e.g. analytic methodologies and data recommendations. Take complete independent ownership of projects and initiatives in the team with minimal support.

Experience and Qualifications Required
Qualifications: Engineer from IIT, or Master's in a field related to Data Science/Economics/Mathematics (Tier 1 institutions like ISI or Delhi School of Economics), or M.B.A. from a Tier 1 institution.
Must-have skills and experience: Overall, 8+ years of experience in Data Science and Analytics. 5+ years of hands-on experience in statistical modelling, machine learning techniques, natural language processing and deep learning. 5+ years of experience in Python, machine learning and deep learning. Excellent problem-solving skills. Should be able to run analytics applications such as Python and SAS and interpret statistical results. Implementation of models with clear, measurable outcomes.
Good-to-have skills and experience: Ability to engage in discussion with senior stakeholders on defining business problems, designing analysis projects, and articulating analytical insights to stakeholders. Experience on Spark/Hadoop/Big Data platforms is a plus. Experience with unstructured data and big data. Experience with secondary data and knowledge of primary market research is a plus. Ability to independently own and manage projects with minimal support. Excellent analytical skills and a strong sense for structure and logic. Ability to develop, test and validate hypotheses.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.

Posted 1 week ago


9.0 - 14.0 years

13 - 17 Lacs

Gurugram

Work from Office


About the Opportunity

Job Type: Application (20 June 2025)
Title: Data Scientist, Risk Data Analytics
Department: Data Value
Location: Gurgaon
Reports To: Associate Director, Risk Data Analytics
Level: 5

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our Data Value team and feel like you're part of something bigger.

About Global Risk

The Global Risk team in Fidelity covers the management oversight of Fidelity's risk profile, including key risk frameworks, policies and procedures, and oversight and challenge processes. The team partners with the businesses to ensure Fidelity manages its risk profile within the defined risk appetite. The team comprises risk specialists covering all facets of risk management, including investment, financial, non-financial and strategic risk. As part of the broader General Counsel team, the Risk team collaborates closely with Compliance, Legal, Tax and Corporate Sustainability colleagues.

About the Risk Data Analytics Hub

The vision of the Risk Data Analytics Hub is to establish a data-centric risk function that is forward-thinking, resilient, and proactive. The hub's mission is to enhance risk management processes and unlock innovative opportunities in the ever-changing risk and business landscape. The hub has made significant strides in Investment Risk, delivering prominent contributions such as Fund Performance Monitoring, Fund Aggregate Exposures, Fund Market Risk, Fund Liquidity Risk, and other comprehensive monitoring and reporting dashboards. These tools have been crucial in supporting risk oversight and regulatory submissions. The hub's goal is to scale this capability across Global Risk, using data-driven insights to uncover hidden patterns and predict emerging risks. This will enable decision-makers to prioritise actions that align with business objectives. The approach is to dismantle silos and foster collaboration across the global risk team, introducing new tools, techniques, and innovation themes to enhance agility.

About your role

You will be expected to take a leading role in developing the Data Science and Advanced Analytics solutions for our business. This will involve:

  • Engaging with key stakeholders to understand the various subject areas within the Global Risk team, including Investment Risk, Non-Financial Risk, Enterprise Risk, Model Risk, Enterprise Resilience etc.
  • Implementing advanced analytics solutions on on-premises/cloud platforms, developing proof-of-concepts, and engaging with the internal and external ecosystem to progress proof-of-concepts to production
  • Engaging and collaborating with other internal teams, such as Data Lake, Data Engineering, DevOps/MLOps and Technology, on the development of new tools, capabilities, and solutions
  • Maximising adoption of cloud-based advanced analytics solutions: building out sandbox analytics environments using Snowflake, AWS, Adobe and Salesforce
  • Supporting delivered models and infrastructure on AWS, including data changes and model tuning

About you

Key Responsibilities

Developing and delivering Data Science solutions for business (40%)

  • Partner with the internal (FIL teams) and external ecosystem to design and deliver advanced-analytics-enabled Data Science solutions
  • Create advanced analytics solutions on quantitative and text data using Artificial Intelligence, Machine Learning and NLP techniques
  • Create compelling visualisations that enable the smooth consumption of predictions and insights for customer benefit

Stakeholder management (30%)

  • Work with Risk SMEs, managers, stakeholders and sponsors to understand the business problem and translate it into an appropriate analytics solution
  • Engage with key stakeholders for smooth execution, delivery, implementation and maintenance of solutions

Adoption of cloud-enabled Data Science solutions (20%)

  • Maximise adoption of cloud-based advanced analytics solutions
  • Build out sandbox analytics environments using Snowflake, AWS, Adobe and Salesforce
  • Deploy solutions in production while adhering to best practices involving model explainability, MLOps, feature stores, model management, Responsible AI etc.

Collaboration and ownership (10%)

  • Share knowledge and best practices with the team, including coaching or training in some of the deep learning/machine learning methodologies
  • Provide mentoring, coaching, and consulting advice and guidance to staff, e.g. on analytic methodologies and data recommendations
  • Take complete, independent ownership of the team's projects and initiatives with minimal support

Experience and Qualifications Required

Qualifications:

  • An engineering degree from an IIT, a Master's in a field related to Data Science, Economics or Mathematics from a Tier-1 institution (such as ISI or the Delhi School of Economics), or an M.B.A. from a Tier-1 institution

Must-have skills and experience:

  • Overall, 9+ years of experience in Data Science and Analytics
  • 5+ years of hands-on experience in statistical modelling, machine learning techniques, natural language processing and/or deep learning
  • 5+ years of experience in Python/Machine Learning/Deep Learning
  • Excellent problem-solving skills
  • Able to run analytics applications such as Python and SAS and interpret statistical results
  • Implementation of models with clear, measurable outcomes

Good-to-have skills and experience:

  • Ability to engage senior stakeholders in defining business problems, designing analysis projects, and articulating analytical insights
  • Experience with Spark/Hadoop/big data platforms is a plus
  • Experience with unstructured data and big data
  • Experience with secondary data and knowledge of primary market research is a plus
  • Ability to independently own and manage projects with minimal support
  • Excellent analytical skills and a strong sense for structure and logic
  • Ability to develop, test and validate hypotheses

Posted 1 week ago


Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management
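
Several of these skills come together in even the simplest day-to-day task: running SQL on Snowflake from Python. The sketch below is a minimal, illustrative example using the snowflake-connector-python package; the account identifier, credentials, warehouse (ANALYTICS_WH), database (SALES_DB), and the ORDERS table are all hypothetical placeholders, not values taken from any listing above.

    # Minimal sketch: querying Snowflake from Python.
    # Assumes: pip install snowflake-connector-python
    # All connection parameters and table names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account_identifier",  # placeholder account identifier
        user="your_user",
        password="your_password",
        warehouse="ANALYTICS_WH",  # virtual warehouse supplying the compute
        database="SALES_DB",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Standard SQL runs unchanged; ORDERS is a hypothetical table.
        cur.execute(
            "SELECT region, SUM(amount) AS total_sales "
            "FROM orders GROUP BY region ORDER BY total_sales DESC"
        )
        for region, total_sales in cur.fetchall():
            print(region, total_sales)
    finally:
        conn.close()

Note that the warehouse is set separately from the database and schema on the connection: this reflects Snowflake's separation of compute from storage, a point that recurs throughout the interview questions below.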

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium; illustrated in the code sketch after this list)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium; see the sketch after this list)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium; see the sketch after this list)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
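
Three of the concepts above - virtual warehouses, semi-structured JSON handling, and time travel - are easiest to revise with concrete statements in front of you. The sketch below issues standard Snowflake SQL through the same Python connector as the earlier example; DEMO_WH, RAW_EVENTS, the JSON field paths, and the connection values are hypothetical placeholders, and the sketch assumes a role with the relevant CREATE privileges.

    # Minimal sketch of three Snowflake features, with placeholder names.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account_identifier",  # placeholder
        user="your_user",
        password="your_password",
        database="SALES_DB",  # placeholder context for CREATE TABLE below
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # 1. Virtual warehouses: independently sized compute clusters that can
    #    auto-suspend when idle and auto-resume on the next query.
    cur.execute(
        "CREATE WAREHOUSE IF NOT EXISTS DEMO_WH "
        "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE"
    )
    cur.execute("USE WAREHOUSE DEMO_WH")

    # 2. Semi-structured data: a VARIANT column holds raw JSON, queried with
    #    colon path notation and LATERAL FLATTEN for nested arrays.
    cur.execute("CREATE TABLE IF NOT EXISTS RAW_EVENTS (payload VARIANT)")
    cur.execute(
        "SELECT payload:user:id::STRING AS user_id, "
        "f.value:sku::STRING AS sku "
        "FROM RAW_EVENTS, LATERAL FLATTEN(input => payload:items) f"
    )

    # 3. Time travel: read a table as it was in the past (here, one hour ago),
    #    provided that moment falls within DATA_RETENTION_TIME_IN_DAYS.
    cur.execute("SELECT COUNT(*) FROM RAW_EVENTS AT(OFFSET => -3600)")
    print(cur.fetchone())

    conn.close()

A useful thread when answering these questions is that compute (warehouses) and storage (tables, including their retained history) are provisioned and billed separately, which is what makes features like per-team warehouses and time travel practical to use.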

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
