
1714 Snowflake Jobs - Page 7

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

27 - 35 Lacs

Hyderabad, Bengaluru, India

Hybrid

Source: Naukri

Job Title: Data & Analytics Cloud Platform Manager (AWS & Snowflake)

Job Summary: We are seeking an experienced Data & Analytics Cloud Platform Manager to oversee and optimize a Snowflake-based data platform on AWS. The ideal candidate will have deep expertise in operations management, automation, support governance, SLI optimization, and change management to ensure reliability and performance. This role requires strong leadership skills to drive efficiency, automation, and operational excellence in a cloud-native data environment.

Key Responsibilities:
- Cloud Platform Management: Oversee and optimize the AWS-based data & analytics infrastructure, ensuring performance, security, and scalability.
- Incident Management: Lead incident resolution using structured problem management practices, ensuring minimal business impact.
- Support Governance: Establish robust support frameworks to drive consistency in monitoring, issue resolution, and operations.
- SLI/SLO Management: Define, track, and optimize Service Level Indicators (SLIs) and Service Level Objectives (SLOs) for platform reliability.
- End-to-End Automation: Implement automation for incident response, monitoring, alerting, and self-healing systems to streamline support operations.
- Change Management: Manage platform updates, deployments, and data governance changes with minimal disruption.
- Security & Compliance: Ensure data security, access controls, and regulatory compliance across AWS and Snowflake environments.
- Collaboration & Leadership: Work closely with engineering, analytics, DevOps, and business teams to drive efficient platform operations.

Required Skills & Qualifications:
- Experience managing cloud-based data platforms, especially AWS and Snowflake.
- Strong expertise in incident management, support governance, and SLI/SLO optimization.
- Hands-on experience automating data operations and support processes.
- Deep understanding of AWS cloud services, Snowflake architecture, and data pipeline optimization.
- Solid knowledge of ITIL frameworks, DevOps, and cloud-native operations.
- Proficiency in SQL, Python, Spark, and data orchestration tools.
- Experience with monitoring tools such as AWS CloudWatch, the Snowflake query performance dashboard, and Application Insights.
- Excellent problem-solving, analytical, and communication skills.

Preferred Qualifications:
- Exposure to CI/CD pipelines for data applications.
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform or CloudFormation.
- Knowledge of machine learning integration for optimizing platform reliability.
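The SLI/SLO management responsibility above can be made concrete with a small availability calculation. This is a minimal sketch; the request counts and the 99.5% SLO target are illustrative assumptions, not values from the posting.

```python
# Minimal SLI/SLO sketch: an availability SLI over a window of request
# outcomes, checked against an assumed 99.5% SLO target.

def availability_sli(outcomes):
    """SLI = successful requests / total requests (0.0 to 1.0)."""
    if not outcomes:
        return 1.0  # no traffic: treat the objective as met
    return sum(1 for ok in outcomes if ok) / len(outcomes)

def error_budget_remaining(sli, slo=0.995):
    """Fraction of the error budget still unspent (can go negative)."""
    budget = 1.0 - slo
    burned = 1.0 - sli
    return (budget - burned) / budget

outcomes = [True] * 997 + [False] * 3  # 3 failures in 1000 requests
sli = availability_sli(outcomes)
print(round(sli, 3))                          # 0.997
print(round(error_budget_remaining(sli), 1))  # 0.4 -> 40% of budget left
```

Alerting on the rate at which the error budget burns, rather than on raw error counts, is one common way teams operationalize SLOs.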

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Job Location: Chennai, Bangalore, Hyderabad
Experience: 5 - 15 yrs
Job Type: FTE
Shift Timing: 2 PM IST to 11 PM IST
Note: Looking only for immediate to 1-week joiners. Must be comfortable with a video discussion.

Key Skills: Data Analyst, SQL, Datastage, Snowflake, AWS, Data Warehouse

We are looking for a Sr. Data Analyst experienced in ETL/data warehousing technologies:
- Experienced data analyst in ETL/data warehousing
- Experience with SQL, Datastage, Autosys, Snowflake, AWS
- Knowledge of Agile execution/delivery processes and tools, including Confluence, JIRA, SharePoint, and ServiceNow

The candidate should have the following:
- Minimum of 5+ years of data analyst experience
- Experienced in ETL (Datastage preferable, but other tools are fine as well)
- Experienced in data analysis (data cleansing, data validation, data mapping & solutioning, ETL QA)
- Experienced in SQL (any of Snowflake, SQL Server, Oracle SQL/PL SQL)
- Capable of client-facing work on a day-to-day basis
- Knowledge of investment banking / asset management / compliance regulatory reporting (any of these) is good to have

Contact Person: Amrita
Please share your updated profile to amrita.anandita@htcinc.com with the details below:
- Full Name (as per Aadhaar card)
- Total Exp.
- Rel. Exp. (Data Analyst)
- Rel. Exp. (Data Warehouse)
- Rel. Exp. (SQL)
- Rel. Exp. (Datastage)
- Rel. Exp. (Snowflake)
- Rel. Exp. (AWS)
- Highest Education (if B.Tech/B.E., please specify)
- Notice Period (if serving notice or not working, mention your last working day as per your relieving letter)
- CCTC
- ECTC
- Current Location
- Preferred Location
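The data-cleansing and data-validation work this role describes can be illustrated with a small rule-based quality check, the kind of gate an ETL QA step applies before loading. The column names and rules below are hypothetical examples, not from the posting.

```python
# Minimal data-validation sketch: apply simple quality rules to records
# before loading. Column names and rules are illustrative assumptions.

def validate_row(row):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not row.get("customer_id"):
        errors.append("customer_id missing")
    if row.get("amount") is not None and row["amount"] < 0:
        errors.append("amount negative")
    if row.get("currency") not in {"USD", "EUR", "INR"}:
        errors.append("currency not in reference list")
    return errors

rows = [
    {"customer_id": "C1", "amount": 120.0, "currency": "USD"},
    {"customer_id": "",   "amount": -5.0,  "currency": "XXX"},
]
clean = [r for r in rows if not validate_row(r)]
print(len(clean))  # 1 row passes all rules
```

In practice the rejected rows and their violation lists would be routed to a quarantine table for the mapping-and-solutioning work the posting mentions.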

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Chennai

Work from Office

Source: Naukri

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.

Job Description:
Exp: 5 - 12 yrs
Location: Chennai
Skill: Snowflake Developer

Desired Skill Sets:
- Strong experience in Snowflake
- Strong experience in AWS and Python
- Experience in ETL tools like Ab Initio, Teradata

If interested, please share your resume to sangeetha.spstaffing@gmail.com with the details below:
- Full Name as per PAN
- Mobile No
- Alt No / WhatsApp No
- Total Exp
- Relevant Exp in Snowflake Development
- Rel Exp in AWS
- Rel Exp in Python/Ab Initio/Teradata
- Current CTC
- Expected CTC
- Notice Period (Official)
- Notice Period (Negotiable)/Reason
- Date of Birth
- PAN Number
- Reason for Job Change
- Offer in Pipeline (Current Status)
- Availability for F2F interview on Saturday, 14th June, between 9 AM and 12 PM (please mention a time)
- Current Res Location
- Preferred Job Location
- Whether educational % in 10th std, 12th std, and UG are all above 50%
- Any gaps in your education or career? If so, please mention the duration in months/years

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Bengaluru

Work from Office

Source: Naukri

RSM is looking for an experienced hands-on Technical Manager with expertise in big data technologies and multi-cloud platforms to lead our technical team for the financial services industry. The ideal candidate will possess a strong background in big data architecture and cloud computing, and a deep understanding of the financial services industry. As a Technical Manager, you will be responsible for leading technical projects, hands-on development, delivery management, and sales, and for ensuring the successful implementation of data solutions across multiple cloud platforms. This role requires a unique blend of technical proficiency, sales acumen, and presales experience to drive business growth and deliver innovative data solutions to our clients.

Responsibilities:
- Provide technical expertise and guidance on the selection, hands-on implementation, and optimization of big data platforms, tools, and technologies across multiple cloud environments (e.g., AWS, Azure, GCP, Snowflake).
- Architect and build scalable and secure data pipelines, data lakes, and data warehouses to support the storage, processing, and analysis of large volumes of structured and unstructured data.
- Lead and mentor a team of technical professionals in the design, development, and implementation of big data solutions and data analytics projects within the financial services domain.
- Stay abreast of emerging trends, technologies, and industry developments in big data, cloud computing, and financial services, and assess their potential impact on the organization.
- Develop and maintain best practices, standards, and guidelines for data management, data governance, and data security in alignment with regulatory requirements and industry standards.
- Collaborate with the sales and business development teams to identify customer needs, develop solution proposals, and present technical demonstrations and presentations to prospective clients.
- Collaborate with cross-functional teams, including data scientists, engineers, business analysts, and stakeholders, to define project requirements, objectives, and timelines.

Basic Qualifications:
- Bachelor's degree or higher in Computer Science, Information Technology, Business Administration, Engineering, or a related field.
- Minimum of ten years of overall technical experience in solution architecture, design, and hands-on development with a focus on big data technologies and multi-cloud platforms, with at least 5 years of experience specifically in financial services.
- Strong understanding of the financial services industry: capital markets, retail and business banking, asset management, insurance, etc.
- In-depth knowledge of big data technologies such as Hadoop, Spark, and Kafka, and cloud platforms such as AWS, Azure, GCP, Snowflake, and Databricks.
- Experience with SQL, Python, PySpark, or other programming languages used for data transformation, analysis, and automation.
- Excellent communication, presentation, and interpersonal skills, with the ability to articulate technical concepts to both technical and non-technical audiences.
- Hands-on experience extracting (ETL using CDC, transaction logs, incremental loads) and processing large data sets for streaming and batch data loads.
- Ability to work from our Hyderabad, India office at least twice a week.

Preferred Qualifications:
- Professional certifications in cloud computing (e.g., AWS Certified Solutions Architect, Microsoft Certified Azure Solutions Architect, Azure Data Engineer, SnowPro Core) and/or big data technologies.
- Experience with Power BI, Tableau, or other reporting and data visualization tools.
- Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code tools.

Education/Experience: Bachelor's degree in MIS, CS, Engineering, or an equivalent field. A Master's degree in CS or an MBA is preferred. Advanced data and cloud certifications are a plus.
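The incremental-extraction experience the qualifications mention commonly follows a high-water-mark pattern: pull only rows changed since the last recorded watermark. A minimal sketch, using SQLite in place of a warehouse; the table and column names are illustrative assumptions.

```python
# Minimal incremental-extraction sketch: fetch only rows modified after
# the stored high-water mark, then advance the mark. Table and column
# names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "2024-01-01"), (2, "2024-02-15"), (3, "2024-03-10")])

def extract_incremental(conn, watermark):
    """Fetch rows changed after the watermark; return them and the new mark."""
    rows = conn.execute(
        "SELECT id, updated_at FROM orders WHERE updated_at > ? "
        "ORDER BY updated_at", (watermark,)).fetchall()
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

rows, wm = extract_incremental(conn, "2024-02-01")
print([r[0] for r in rows], wm)  # [2, 3] 2024-03-10
```

CDC and transaction-log readers achieve the same goal without relying on an `updated_at` column, which matters when rows can be hard-deleted.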

Posted 1 week ago

Apply

6.0 - 10.0 years

10 - 17 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Source: Naukri

Job Description: We are looking for a skilled Data/Analytics Engineer with hands-on experience in vector databases and search optimization techniques. You will help build scalable, high-performance infrastructure to support AI-powered applications such as semantic search, recommendation systems, and RAG pipelines.

Key Responsibilities:
- Optimize vector search algorithms for performance and scalability.
- Build pipelines to process high-dimensional embeddings (e.g., BERT, CLIP, OpenAI).
- Implement ANN indexing techniques such as HNSW, IVF, and PQ.
- Integrate vector search with data platforms and APIs.
- Collaborate with cross-functional teams (data scientists, engineers, product).
- Monitor and resolve latency, throughput, and scaling issues.

Must-Have Skills:
- Python
- AWS
- Vector databases (e.g., Elasticsearch, FAISS, Pinecone)
- Vector search / similarity search
- ANN search algorithms: HNSW, IVF, PQ
- Snowflake / Databricks
- Embedding models: BERT, CLIP, OpenAI
- Kafka / Flink for real-time data pipelines
- REST APIs, GraphQL, or gRPC for integration

Good to Have:
- Knowledge of semantic caching and hybrid retrieval
- Experience with distributed systems and high-performance computing
- Familiarity with RAG (Retrieval-Augmented Generation) workflows

Apply now if you:
- Enjoy solving performance bottlenecks in AI infrastructure
- Love working with cutting-edge ML models and search technologies
- Thrive in collaborative, fast-paced environments
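The similarity search at the core of this role can be sketched with exact cosine-similarity top-k over a few toy embeddings; ANN indexes like HNSW, IVF, and PQ approximate exactly this query to stay fast at scale. The vectors and document names below are illustrative assumptions.

```python
# Minimal similarity-search sketch: exact cosine top-k over toy
# embeddings. ANN indexes (HNSW, IVF, PQ) approximate this same ranking
# without scanning every vector.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

corpus = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.0, 1.0, 0.1],
    "doc_c": [0.8, 0.2, 0.1],
}

def top_k(query, corpus, k=2):
    """Rank documents by cosine similarity to the query vector."""
    scored = sorted(corpus.items(),
                    key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [name for name, _ in scored[:k]]

print(top_k([1.0, 0.0, 0.0], corpus))  # ['doc_a', 'doc_c']
```

This brute-force scan is O(n) per query; HNSW trades a small recall loss for sub-linear query time, which is the performance bottleneck the posting asks candidates to work on.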

Posted 1 week ago

Apply

4.0 - 9.0 years

0 Lacs

Bengaluru

Work from Office

Source: Naukri

Required skill set:
- Experience in a Data Platform Support, DevOps, or Operations role
- Experience in Tableau, Snowflake, AWS, and Informatica Cloud
- Familiarity with ITSM practices
- Proficiency with Jira, CI/CD workflows, and monitoring tools

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Source: Naukri

The data architect is responsible for designing, creating, and managing an organization's data architecture. This role is critical in establishing a solid foundation for data management within an organization, ensuring that data is organized, accessible, secure, and aligned with business objectives. The data architect designs data models, warehouses, file systems, and databases, and defines how data will be collected and organized.

Responsibilities:
- Interprets and delivers impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps
- Designs the structure and layout of data systems, including databases, warehouses, and lakes
- Selects and designs database management systems that meet the organization's needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures
- Defines and implements the long-term technology strategy and innovation roadmaps across analytics, data engineering, and data platforms
- Designs ETL processes for moving data from various sources into the organization's data systems
- Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards
- Manages senior business stakeholders to secure strong engagement and ensures that project delivery aligns with longer-term strategic roadmaps
- Simplifies the existing data architecture, delivering reusable services and cost-saving opportunities in line with the policies and standards of the company
- Leads and participates in the peer review and quality assurance of project architectural artifacts across the EA group through governance forums
- Defines and manages standards, guidelines, and processes to ensure data quality
- Works with IT teams, business analysts, and data analytics teams to understand data consumers' needs and develop solutions
- Evaluates and recommends emerging technologies for data management, storage, and analytics
- Designs, creates, and implements logical and physical data models for both IT and business solutions to capture the structure, relationships, and constraints of relevant datasets
- Builds and operationalizes complex data solutions, corrects problems, applies transformations, and recommends data cleansing/quality solutions
- Collaborates and communicates effectively with various stakeholders to understand data and business requirements and translate them into data models
- Creates entity-relationship diagrams (ERDs), data flow diagrams, and other visualizations to represent data models
- Collaborates with database administrators and software engineers to implement and maintain data models in databases, data warehouses, and data lakes
- Develops data modeling best practices, and uses these standards to identify and resolve data modeling issues and conflicts
- Conducts performance tuning and optimization of data models for efficient data access and retrieval
- Incorporates core data management competencies, including data governance, data security, and data quality

Job Requirements

Education: A bachelor's degree in computer science, data science, engineering, or a related field.

Experience:
- At least five years of relevant experience in the design and implementation of data models for enterprise data warehouse initiatives
- Experience leading projects involving data warehousing, data modeling, and data analysis
- Design experience in Azure Databricks, PySpark, and Power BI/Tableau

Skills:
- Ability in programming languages such as Java, Python, and C/C++
- Ability in data science languages/tools such as SQL, R, SAS, or Excel
- Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks)
- Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata
- Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques
- Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture
- Ability to assess traditional and modern data architecture components based on business needs
- Experience with business intelligence tools and technologies such as ETL, Power BI, and Tableau
- Ability to regularly learn and adopt new technology, especially in the ML/AI realm
- Strong analytical and problem-solving skills
- Ability to synthesize and clearly communicate large volumes of complex information to senior management with varying levels of technical understanding
- Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders
- Ability to guide solution design and architecture to meet business needs
- Expert knowledge of data modeling concepts, methodologies, and best practices
- Proficiency in data modeling tools such as Erwin or ER/Studio
- Knowledge of relational databases and database design principles
- Familiarity with dimensional modeling and data warehousing concepts
- Strong SQL skills for data querying, manipulation, and optimization, and knowledge of other data science languages, including JavaScript, Python, and R
- Ability to collaborate with cross-functional teams and stakeholders to gather requirements and align on data models
- Excellent analytical and problem-solving skills to identify and resolve data modeling issues
- Strong communication and documentation skills to effectively convey complex data modeling concepts to technical and business stakeholders
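The dimensional modeling and warehousing concepts this role calls for can be sketched as a tiny star schema: one fact table keyed to one dimension table, queried by joining and aggregating. SQLite stands in for a warehouse here, and all table and column names are illustrative assumptions.

```python
# Minimal dimensional-modeling sketch: a one-fact, one-dimension star
# schema of the kind a data architect's ERD would describe. Names are
# illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        category    TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
        quantity    INTEGER NOT NULL,
        amount      REAL NOT NULL
    );
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(10, 1, 2, 40.0), (11, 1, 1, 20.0)])

# Typical star-schema query: aggregate facts grouped by a dimension.
row = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchone()
print(row)  # ('Hardware', 60.0)
```

Keeping descriptive attributes in the dimension and numeric measures in the fact table is what lets the warehouse answer grouped aggregations without wide scans, which is the trade-off dimensional modeling exists to make.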

Posted 1 week ago

Apply

3.0 - 6.0 years

6 - 16 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Chennai Career Event - Applications invited for Solution Analyst

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.
ExxonMobil is organizing scheduled in-person interviews in Chennai on 5th & 6th July 2025 for Solution Analyst.
Work Location: Bengaluru (last date to apply is 27th June 2025)
Note: Shortlisted candidates will receive an interview invitation letter from the recruiting team.

What role you will play in the team
Globally support the Procurement organization by helping derive valuable business insights and proposing solutions for strategic business planning through visualization, by developing and maintaining dashboards. Develop, maintain, and enhance analytical tools and visualizations using systems such as SQL Server, the Snowflake database, Tableau and Power BI.

Job Location: Bangalore, Karnataka, India

What you will do
- Develop, maintain, and enhance dashboards and visualizations for the Procurement department capturing the key KPIs and metrics, using analytical tools like Tableau, Power BI, SQL and the Snowflake database.
- Develop logic and calculations for metrics, and build customized insights to support procurement stakeholders on a local, regional and global basis.
- Collaborate effectively with Engagement Leads, Procurement commercial and operations teams, Data Engineering and Procurement Management to understand the requirements and deliver on the analytical needs.
- Regularly engage stakeholders to provide status updates and address any potential opportunities with the metrics or data sets.
- Troubleshoot issues related to data and dashboards as and when needed.
- Prepare technical and functional documentation; train end users on new dashboards and reports.

About you
Requirements:
- Bachelor's degree in computer science/engineering with a minimum 6 CGPA and 3-6 years of relevant work experience.
- Practical experience in analytics and visualization tools (Tableau or Power BI), databases (SQL and Snowflake) and the process mining tool Celonis.
- Knowledge of data modelling and statistical analysis, and the ability to apply machine learning algorithms to predict outcomes and trends.
- Understanding of Procurement (Source to Pay) processes is preferred.
- Knowledge of Procurement systems and applications like SAP Ariba, SAP Analytics Cloud, S4 HANA and SAP MM would be an added advantage.
- An analytical mindset, to be able to work with massive amounts of data.
- Ability to use storytelling techniques to present data and insights in a way that drives decision-making and inspires action.
- Excellent communication and presentation skills, and the ability to work in a diverse team environment.
- Self-motivated and able to work with minimal supervision.

Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
- Competitive compensation
- Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
- Retirement benefits
- Global networking & cross-functional opportunities
- Annual vacations & holidays
- Day care assistance program
- Training and development program
- Tuition assistance program
- Workplace flexibility policy
- Relocation program
- Transportation facility

Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India, visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.).
Follow the LINK to understand more about recruitment scams in the name of ExxonMobil. Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 13 Lacs

Pune

Work from Office

Source: Naukri

We are staffing small, self-contained development teams with people who love solving problems and building high-quality products and services. We use a wide range of technologies and are building a next-generation microservices platform that can make our learning tools and content available to all our customers. If you want to make a difference in the lives of students and teachers and understand what it takes to deliver high-quality software, we would love to talk to you about this opportunity.

Technology Stack
You'll work with technologies such as Java, Spring Boot, Kafka, Aurora, Mesos and Jenkins. This will be a hands-on coding role working as part of a cross-functional team alongside other developers, designers and quality engineers, within an agile development environment. We're working on the development of our next-generation learning platform and solutions utilizing the latest in server and web technologies.

Responsibilities:
- Build high-quality, clean, scalable, and reusable code by enforcing best practices around software engineering architecture and processes (code reviews, unit testing, etc.) on the team.
- Work with the product owners to understand detailed requirements and own your code from design, implementation, and test automation through delivery of a high-quality product to our users.
- Drive the design, prototyping, implementation, and scaling of cloud data platforms to tackle business needs.
- Identify ways to improve data reliability, efficiency, and quality.
- Plan and perform development tasks from design specifications.
- Provide accurate time estimates for development tasks.
- Construct and verify (unit test) software components to meet design specifications.
- Perform quality assurance functions by collaborating with cross-team members to identify and resolve software defects.
- Provide mentoring on software design, construction, development methodologies, and best practices.
- Participate in production support and the on-call rotation for the services owned by the team.
- Mentor less experienced engineers in understanding the big picture of company objectives, constraints, inter-team dependencies, etc.
- Participate in creating standards, such as security patterns and logging patterns, and ensure team members adhere to them.
- Collaborate with project architects and cross-functional team members/vendors in different geographical locations, and assist team members in proving the validity of new software technologies.
- Promote Agile processes among development and the business, including facilitation of scrums.
- Take ownership of the things you build; help shape the product and technical vision, direction, and how we iterate.
- Work closely with your product and design teammates for improved stability, reliability, and quality.
- Perform other duties as assigned to ensure the success of the team and the entire organization.
- Run numerous experiments in a fast-paced, analytical culture so you can quickly learn and adapt your work.
- Promote a positive engineering culture through teamwork, engagement, and empowerment.
- Function as the tech lead for various features and initiatives on the team.
- Build and maintain CI/CD pipelines for services owned by the team, following secure development practices.

Skills & Experience:
- 5 to 8 years' experience in a relevant software development role
- Excellent object-oriented design and programming skills, including the application of design patterns and avoidance of anti-patterns
- Strong cloud platform skills: AWS Lambda, Terraform, SNS, SQS, RDS, Kinesis, DynamoDB, etc.
- Experience building large-scale, enterprise applications with ReactJS/AngularJS
- Proficiency with front-end technologies such as HTML, CSS, and JavaScript preferred
- Experience working in a collaborative team of application developers with shared source code repositories
- Deep knowledge of more than one programming language, such as Node.js/Java
- Demonstrable knowledge of AWS and data platform experience: Lambda, DynamoDB, RDS, S3, Kinesis, Snowflake
- Demonstrated ability to follow through on all tasks, promises, and commitments
- Ability to communicate and work effectively within priorities
- Ability to advocate ideas and to objectively participate in design critiques
- Ability to work under tight timelines in a fast-paced environment
- Advanced understanding of software design concepts
- Understanding of software development methodologies and principles
- Ability to solve large-scale, complex problems
- Ability to architect, design, implement, and maintain large-scale systems
- Strong technical leadership and mentorship ability
- Working experience with modern Agile software development methodologies (e.g., Kanban, Scrum, Test-Driven Development)

Posted 1 week ago

Apply

6.0 - 11.0 years

11 - 21 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Chennai Career Event - Applications invited for Engagement Lead

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.
ExxonMobil is organizing scheduled in-person interviews in Chennai on 5th & 6th July 2025 for Engagement Lead.
Work Location: Bengaluru (last date to apply is 27th June 2025)
Note: Shortlisted candidates will receive an interview invitation letter from the recruiting team.

What role you will play in the team
Globally support the Procurement organization by proposing, developing and maintaining dashboards that create business value. Lead and mentor internal team members in delivering business results.

Job Location: Bangalore, Karnataka, India

What you will do
- Lead the execution of dashboard and insights-related components within the annual value delivery plan and strategic initiatives.
- Act as liaison between stakeholders and the internal analytics team.
- Manage data-side system migrations and updates, and ensure timely incorporation of changes into analytics solutions.
- Provide insightful data analysis to support senior stakeholders on a local, regional or global basis, and communicate complex data analysis in a concise but meaningful manner.
- Actively guide and mentor other members of the team in fulfilling their duties by coaching them on more advanced data analysis or on how to develop a clear opportunity set.
- Work closely with the Procurement Manager, the Data Engineering team, and other procurement professionals to plan, design, and deliver dashboards that meet the analytical needs of the Procurement organization.
- Review existing dashboards and visualizations to assess their effectiveness and identify opportunities to improve or better utilize them.
- Proactively identify data-related challenges, outline clear action plans, share progress with stakeholders, and highlight insights and opportunities within the data.
Maintain existing analytics deliverables while driving continuous improvements to boost adoption and impact through effective change management. About you - Qualifications & Skills: Bachelor's degree in Computer Science/Engineering with a minimum 6 CGPA, and a minimum of 6 years of experience in SQL and Snowflake databases and visualization tools like Tableau or Power BI. At least 3 years' experience in procurement processes and systems. Experience in SAP Analytics Cloud and SAP S/4HANA. Project management knowledge is an added advantage, to effectively plan, coordinate, and deliver analytics solutions within defined timelines and expectations. Analytical mindset with the ability to work with large datasets and extract valuable insights. Excellent communication and presentation skills, capable of working effectively in a diverse team environment. Minimum 3 years' experience leading a team. Self-motivated and able to work with minimal supervision. Your benefits: An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you: Competitive compensation; medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits; retirement benefits; global networking & cross-functional opportunities; annual vacations & holidays; day care assistance program; training and development program; tuition assistance program; workplace flexibility policy; relocation program; transportation facility. Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines. Stay connected with us: Learn more about ExxonMobil in India, visit ExxonMobil India and Energy Factor India.
Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube. EEO Statement: ExxonMobil is an Equal Opportunity Employer: all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status. Business solicitation and recruiting scams: ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil. Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Posted 1 week ago

4.0 - 9.0 years

3 - 8 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Work from Office


We are seeking a proactive and technically sound Snowflake Data Architect to design, implement, and optimize scalable data solutions using the Snowflake cloud data platform. The ideal candidate will have strong experience in data modeling, performance tuning, and governance, along with the ability to collaborate with cross-functional teams and contribute to long-term data strategy and architecture. Key Responsibilities: Design, implement, and manage end-to-end data solutions using Snowflake. Define and execute robust data strategies aligned with business needs. Optimize Snowflake environments for performance, cost-efficiency, and scalability. Implement data governance, security controls, and role-based access. Understand and apply modern data management technologies and architectures. Oversee and maintain the organization's data inventory, ensuring accuracy and accessibility. Collaborate with engineering, analytics, and business teams to understand requirements and translate them into technical solutions. Stay up to date with industry trends and continuously improve the organization's data management systems. Key Skills & Technologies: Snowflake: warehousing, Snowpipe, Streams & Tasks, virtual warehouses. SQL: advanced query development and performance optimization. ETL/ELT tools: dbt, Azure Data Factory, Talend, Informatica, Matillion (any). Data modeling: star/snowflake schema, SCDs, normalization. Cloud platforms: AWS (S3, IAM), Azure (Blob, ADF), or GCP. Data governance & security: RBAC, data cataloging, encryption. Scripting (preferred): Python or shell. Version control (Git) and CI/CD familiarity is a plus. Please contact me at nandhana.suresh@deliverycentric.com for additional details.

Posted 1 week ago

4.0 - 6.0 years

5 - 10 Lacs

Bengaluru

Work from Office


4+ years of testing experience, with at least 2 years in ETL testing and automation. Experience automating ETL flows. Experience developing an automation framework for ETL. Good coding skills in Python and Pytest. Expert at test data analysis and test design. Good at database analytics (ETL or BigQuery). Snowflake knowledge is a plus. Good communication skills with customers and other stakeholders. Capable of working independently or with little supervision.
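The core of ETL test automation described above is asserting that a transformation moved data completely and correctly. A minimal sketch of such a check, using an in-memory SQLite database as a hypothetical stand-in for the real source and target systems (table names and the uppercasing transform are illustrative, not from the listing):

```python
import sqlite3

# SQLite stands in for the real source/target warehouse; names are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO source VALUES (?, ?)",
                 [(1, "alice"), (2, "bob"), (3, "carol")])

# The ETL step under test: copy rows while uppercasing names.
conn.execute("INSERT INTO target SELECT id, UPPER(name) FROM source")

def test_row_counts_match():
    # Completeness: no rows dropped or duplicated by the load.
    src = conn.execute("SELECT COUNT(*) FROM source").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
    assert src == tgt, f"row count mismatch: {src} vs {tgt}"

def test_transformation_applied():
    # Correctness: the transformation logic actually ran.
    names = [r[0] for r in conn.execute("SELECT name FROM target ORDER BY id")]
    assert names == ["ALICE", "BOB", "CAROL"]

test_row_counts_match()
test_transformation_applied()
```

In a real framework these functions would live in a Pytest suite and the connection would point at the actual databases; the assertion pattern stays the same.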

Posted 1 week ago

3.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Skills required: Big Data workflows (ETL/ELT), hands-on Python, hands-on SQL, any cloud (GCP BigQuery preferred), Airflow (good knowledge of Airflow features, operators, scheduling, etc.). Note: candidates will take a coding test (Python and SQL) during the interview process. This will be done through coders-pad; the panel will set it at run-time.
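An illustrative exercise of the kind such a Python-and-SQL coding round might pose: find the top spender per city with a window function. SQLite stands in here for BigQuery, and the table and data are invented for the sketch:

```python
import sqlite3

# Toy data; SQLite here stands in for BigQuery.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (city TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("Pune", "A", 120.0), ("Pune", "B", 80.0),
    ("Delhi", "C", 50.0), ("Delhi", "D", 90.0),
])

# Top spender per city via RANK() over a per-city partition.
rows = conn.execute("""
    SELECT city, customer, amount FROM (
        SELECT city, customer, amount,
               RANK() OVER (PARTITION BY city ORDER BY amount DESC) AS rnk
        FROM orders) AS ranked
    WHERE rnk = 1
    ORDER BY city
""").fetchall()
print(rows)  # [('Delhi', 'D', 90.0), ('Pune', 'A', 120.0)]
```

The same query runs nearly verbatim on BigQuery; only the connection layer changes.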

Posted 1 week ago

2.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Experience designing and developing data pipelines in a modern data stack (Snowflake, AWS, Airflow, dbt, etc.). Strong experience in Python. 2+ years of experience in Snowflake and dbt. Able to work an afternoon shift and front-end the customer independently, so strong communication skills are essential. Strong knowledge of Python, dbt, Snowflake, Airflow. Ability to manage both structured and unstructured data. Work with multiple data sources (APIs, databases, S3, etc.). Own design, documentation, and lifecycle management of data pipelines. Help implement CI/CD processes and release engineering for the organization's data pipelines. Experience designing and developing CI/CD processes and managing release management for data pipelines. Proficient in Python, SQL, Airflow, AWS, Bitbucket, and working with APIs and other data sources. Good to have: knowledge of Salesforce. Primary skills: AWS Cloud, Snowflake DW, Azure SQL, SQL, Python (must have), dbt (must have).

Posted 1 week ago

4.0 - 9.0 years

12 - 16 Lacs

Chennai

Work from Office


Core requirements: Solid SQL language skills. Basic knowledge of data modeling. Working knowledge of Snowflake on Azure and the CI/CD process (with any tooling). Nice to have: Azure ADF, ETL/ELT frameworks, ER/Studio. Really nice to have: Healthcare/life sciences experience, GxP processes. Sr DW Engineer (in addition to the above): Overseeing engineers while also performing the same work themselves. Conducting design reviews, code reviews, and deployment reviews with engineers. Solid data modeling, preferably using ER/Studio (an equivalent tool is fine). Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements). Familiarity with medallion architecture (raw, refined, published, or similar terminology).

Posted 1 week ago

3.0 - 8.0 years

12 - 16 Lacs

Mangaluru, Hyderabad, Bengaluru

Work from Office


We're looking for a Senior Backend Developer who thrives at the intersection of software engineering and data engineering. This role involves architecting and optimizing complex, high-throughput backend systems that power data-driven products at scale. If you have deep backend chops, strong database expertise across RDBMS platforms, and hands-on experience with large-scale data workflows, we'd love to hear from you. Key Responsibilities: 1. Leadership & Project Delivery: Lead backend development teams, ensuring adherence to Agile practices and development best practices. Collaborate across product, frontend, DevOps, and data teams to design, build, and deploy robust features and services. Drive code quality through reviews, mentoring, and enforcing design principles. 2. Research & Innovation: Conduct feasibility studies on emerging technologies, frameworks, and methodologies. Design and propose innovative solutions for complex technical challenges using data-centric approaches. Continuously improve system design with a forward-thinking mindset. 3. System Architecture & Optimization: Design scalable, distributed, and secure system architectures. Optimize and refactor legacy systems to improve performance, maintainability, and scalability. Define best practices around observability, logging, and resiliency. 4. Database & Data Engineering: Design, implement, and optimize relational databases (PostgreSQL, MySQL, SQL Server, etc.). Develop efficient SQL queries, stored procedures, indexes, and schema migrations. Collaborate with data engineering teams on ETL/ELT pipelines, data ingestion, transformation, and warehousing. Work with large datasets, batch processing, and streaming data (e.g., Kafka, Spark, Airflow). Ensure data integrity, consistency, and security across backend and analytics pipelines. Must-Have Skills: Backend Development: TypeScript, Node.js (or equivalent backend framework), REST/GraphQL API design.
Databases & Storage: Strong proficiency in PostgreSQL, plus experience with other RDBMS like MySQL, SQL Server, or Oracle. Familiarity with NoSQL (e.g., Redis, MongoDB) and columnar/OLAP stores (e.g., ClickHouse, Redshift). Data Engineering Awareness: Hands-on work with data ingestion, transformation, pipelines, and data orchestration tools. Exposure to tools like Apache Airflow, Kafka, Spark, or dbt. Cloud Infrastructure: Proficiency with AWS (Lambda, EC2, RDS, S3, IAM, CloudWatch). DevOps & CI/CD: Experience with Docker, Kubernetes, GitHub Actions or similar CI/CD pipelines. Architecture: Experience designing secure, scalable, and fault-tolerant backend systems. Agile & SDLC: Strong understanding of Agile workflows, SDLC best practices, and version control (Git). Nice-to-Have Skills: Experience with event-driven architectures or microservices. Exposure to data warehouse environments (e.g., Snowflake, BigQuery). Knowledge of backend-for-frontend collaboration (especially with React.js). Familiarity with data cataloging, data governance, and lineage tools. Preferred Qualifications: Bachelor's or Master's in Computer Science, Software Engineering, or a related technical field. Proven experience leading backend/data projects in enterprise or startup environments. Strong system design, analytical, and problem-solving skills. Awareness of cybersecurity best practices in cloud and backend development.

Posted 1 week ago

3.0 - 6.0 years

8 - 12 Lacs

Gurugram

Work from Office


We're looking for a skilled Node.js Developer with a strong foundation in data engineering to join our engineering team. You'll be responsible for building scalable backend systems using modern Node.js frameworks and tools, while also designing and maintaining robust data pipelines and integrations. Primary Responsibilities: Build and maintain performant APIs and backend services using Node.js and frameworks like Express.js, NestJS, or Fastify. Develop and manage ETL/ELT pipelines, data models, schemas, and data transformation logic for analytics and operational use. Ensure data quality, integrity, and consistency through validation, monitoring, and logging. Work with database technologies (MySQL, PostgreSQL, MongoDB, Redis) to store and manage application and analytical data. Implement integrations with third-party APIs and internal microservices. Use ORMs like Sequelize, TypeORM, or Prisma for data modeling and interaction. Write unit, integration, and E2E tests using frameworks such as Jest, Mocha, or Supertest. Collaborate with frontend, DevOps, and data engineering teams to ship end-to-end features. Monitor and optimize system performance, logging (e.g., Winston, Pino), and error handling. Contribute to CI/CD workflows and infrastructure automation using tools like PM2, Docker and Jenkins. Required Skills: 3+ years of experience in backend development using Node.js. Hands-on experience with Express.js, NestJS, or other Node.js frameworks. Understanding of data modelling, partitioning, indexing, and query optimization. Experience in building and maintaining data pipelines, preferably using custom Node.js scripts. Familiarity with stream processing and messaging systems (e.g., Kafka, RabbitMQ, or Redis Streams). Solid understanding of SQL and NoSQL data stores and schema design. Strong knowledge of JavaScript and preferably TypeScript. Familiarity with cloud platforms (AWS/GCP/Azure) and services like S3, Lambda, or Cloud Functions.
Experience with containerized environments (Docker) and CI/CD. Experience with data warehouses (e.g., BigQuery, Snowflake, Redshift). Nice to Have: Cloud certification in AWS or GCP. Experience with distributed processing tools (e.g., Spark, Trino/Presto). Experience with data transformation tools (e.g., dbt, SQLMesh) and data orchestration (e.g., Apache Airflow, Kestra). Familiarity with serverless architectures and tools like Vercel/Netlify for deployment.

Posted 1 week ago

3.0 - 6.0 years

10 - 15 Lacs

Gurugram, Bengaluru

Work from Office


3+ years of experience in data science roles, working with tabular data in large-scale projects. Experience in feature engineering and working with methods such as XGBoost, LightGBM, factorization machines, and similar algorithms. Experience in the adtech or fintech industries is a plus. Familiarity with clickstream data, predictive modeling for user engagement, or bidding optimization is highly advantageous. MS or PhD in mathematics, computer science, physics, statistics, electrical engineering, or a related field. Proficiency in Python (3.9+), with experience in scientific computing and machine learning tools (e.g., NumPy, Pandas, SciPy, scikit-learn, matplotlib). Familiarity with deep learning frameworks (such as TensorFlow or PyTorch) is a plus. Strong expertise in applied statistical methods, A/B testing frameworks, advanced experiment design, and interpreting complex experimental results. Experience querying and processing data using SQL and working with distributed data storage solutions (e.g., AWS Redshift, Snowflake, BigQuery, Athena, Presto, MinIO). Experience in budget allocation optimization, lookalike modeling, LTV prediction, or churn analysis is a plus. Ability to manage multiple projects, prioritize tasks effectively, and maintain a structured approach to complex problem-solving. Excellent communication and collaboration skills to work effectively with both technical and business teams.

Posted 1 week ago

3.0 - 8.0 years

8 - 18 Lacs

Bengaluru

Work from Office


Role & responsibilities: Observability: Ensure end-to-end monitoring of pipelines, data services, infrastructure, and databases. Proactively detect and resolve issues to maintain system health. FinOps: Track and optimize platform usage and cost. Understand cost drivers, perform forecasting, and automate cost control measures. User Management: Manage onboarding, offboarding, and role-based access controls across tools including Tableau, Snowflake, and AWS. Privileged Access Management: Oversee and audit elevated access to critical systems in compliance with security policies. Application Maintenance: Perform regular maintenance, updates, and health checks on platform components to ensure operational stability. Service Desk Management: Triage and resolve incidents, service requests (SRs), and problems. Maintain the BAU roster and collaborate with cross-functional teams. Minor Enhancements: Address low-effort business enhancements (2-3 days) through a structured request process. Business Continuity Planning: Maintain and test Business Continuity Plans (e.g., Tableau DR) to ensure platform resilience. Deployment Services: Support production deployments, bug fixes, and enhancements in line with CI/CD pipelines. Data Load Fixes: Resolve failures in data ingestion due to scheduling, connectivity, infrastructure, or secret rotation issues. Transformations/Data Model Support: Provide Level 1 triage for issues arising from schema changes, malformed data, or source inconsistencies. Functional Data Questions: Perform initial triage for data requests or quality issues, and coordinate with domain-specific data analysts as needed. Project Support: Offer support for projects needing platform team involvement. License Review: Participate in quarterly Tableau license reviews and ensure license compliance. Documentation: Maintain procedures, work instructions, and knowledge base (KB) articles for operational consistency and knowledge transfer.
Preferred candidate profile: 3+ years of experience in a Data Platform Support, DevOps, or Operations role. Hands-on experience with tools like Tableau, Snowflake, AWS, Informatica Cloud. Familiarity with ITSM practices (e.g., incident, problem, change management). Proficiency with Jira, CI/CD workflows, and monitoring tools. Strong documentation, communication, and stakeholder management skills.

Posted 1 week ago

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Chennai

Hybrid


Job Title: Data Engineer. Location: Chennai, Bangalore, Pune, Hyderabad (Hybrid). Job Type: Permanent Employee. Responsibilities and skillsets required: Application and API development. SQL data modeling and automation. Experience working with GIS map services and spatial databases. Experience creating GIS map services. Data and application architecture. Handling of legacy data. Familiarity with Client work processes and data is a plus. Platforms: Databricks, Snowflake, ESRI ArcGIS/ArcSDE, and a new GenAI app being developed. Tasks that will need to be done: Combining and integrating spatial databases from different sources to be used with the new GenAI application. Building map services with associated metadata to support questions from geoscience users. Setting up the necessary update cycles for databases and map services to ensure evergreen results. Helping construct APIs for these databases and map services to structure the best possible workflows for users. Assisting with data and application architecture. Helping handle legacy data, such as links to existing applications, databases, and services. Ensuring that IT requirements are met as the project is built, including integration, data tiers, access control and status monitoring.

Posted 1 week ago

8.0 - 12.0 years

14 - 24 Lacs

Bengaluru

Work from Office


Job Description: Total Experience: 8+ years in IT. Relevant Experience: 5+ years. Snowflake with Python, day shift. More than 8 years of IT experience, specifically in the data engineering stream. Should possess development skills in Snowflake, basic IBM DataStage or any other ETL tool, SQL (expert), and basics of Python/PySpark and AWS, along with high proficiency in Oracle SQL. Hands-on experience handling databases, along with experience in a scheduling tool like Control-M. Excellent customer service, interpersonal, communication and team collaboration skills. Excellent debugging skills in databases; should have played a key member role in earlier projects. Excellent SQL and PL/SQL coding (development) skills. Ability to identify and implement process and/or application improvements. Must be able to work on multiple simultaneous tasks with limited supervision. Able to follow change management procedures and internal guidelines. Any relevant technical certification in DataStage is a plus.

Posted 1 week ago

8.0 - 13.0 years

15 - 20 Lacs

Hyderabad

Work from Office


Role: Technical Project Manager. Location: Gachibowli, Hyderabad. Duration: Full time. Timings: 5:30 pm - 2:00 am IST. Note: Looking for immediate joiners only (15-30 days notice). Job Summary: We are seeking a Technical Project Manager with a strong data engineering background to lead and manage end-to-end delivery of data platform initiatives. The ideal candidate will have hands-on exposure to AWS, ETL pipelines, Snowflake, and DBT, and must be adept at stakeholder communication, agile methodologies, and cross-functional coordination across engineering, data, and business teams. Key Responsibilities: Plan, execute, and deliver data engineering and cloud-based projects within scope, budget, and timeline. Work closely with data architects, engineers, and analysts to manage deliverables involving ETL pipelines, the Snowflake data warehouse, and DBT models. Lead Agile/Scrum ceremonies: sprint planning, backlog grooming, stand-ups, and retrospectives. Monitor and report project status, risks, and issues to stakeholders and leadership. Coordinate cross-functional teams across data, cloud infrastructure, and product. Ensure adherence to data governance, security, and compliance standards throughout the lifecycle. Manage third-party vendors or consultants as required for data platform implementations. Own project documentation including project charters, timelines, RACI matrix, risk registers, and post-implementation reviews. Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field (Master's preferred). 8+ years in IT with 3-5 years as a Project Manager in data-focused environments. Hands-on understanding of: AWS services (e.g., S3, Glue, Lambda, Redshift); ETL/ELT frameworks and orchestration; Snowflake Data Warehouse; DBT (Data Build Tool) for data modeling. Familiar with SQL, data pipelines, and data quality frameworks. Experience using project management tools like JIRA, Confluence, MS Project, Smartsheet.
PMP, CSM, or SAFe certifications preferred. Excellent communication, presentation, and stakeholder management skills.

Posted 1 week ago

4.0 - 7.0 years

7 - 11 Lacs

Noida

Hybrid


QA Automation Engineer: As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems. Responsibilities: Develop and Implement Automation Frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments. Test Strategy and Execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources. Data Validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic. Performance Testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load testing strategies. Collaborate with Teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly. Continuous Integration: Integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process. Defect Tracking and Reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner. Test Data Management: Develop strategies for handling large volumes of test data while maintaining data security and privacy. Tool and Technology Evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.
Job Qualifications - Requirements and skills: • At least 4+ years' experience. • Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.). • Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing. • Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations. • Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes. • Performance testing. • Experience with version control systems like Git. • Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues. • Strong communication and collaboration skills. • Attention to detail and a passion for delivering high-quality solutions. • Ability to work in a fast-paced environment and manage multiple priorities. • Enthusiastic about learning new technologies and frameworks. Experience with the following tools and technologies is desired: QLIK Replicate, Matillion ETL, Snowflake, Data Vault warehouse design, Power BI, Azure Cloud (including Logic Apps, Azure Functions, ADF).
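The "complex queries to validate data" requirement above usually means null and duplicate-key gates run on every load. A hypothetical sketch of two such checks, with SQLite standing in for the warehouse and all table/column names invented for illustration:

```python
import sqlite3

# SQLite stands in for the warehouse; dim_customer and its columns are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER, email TEXT)")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                 [(1, "a@example.com"), (2, "b@example.com"), (3, None)])

# Check 1: completeness - rows missing a required attribute.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM dim_customer WHERE email IS NULL").fetchone()[0]

# Check 2: uniqueness - business keys that appear more than once.
duplicate_keys = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT customer_id FROM dim_customer
        GROUP BY customer_id HAVING COUNT(*) > 1) AS dupes
""").fetchone()[0]

# A CI gate would fail the build when either count is non-zero.
assert duplicate_keys == 0, f"{duplicate_keys} duplicated customer_id values"
print(f"NULL emails found: {null_emails}")
```

Wired into a CI pipeline, these queries run after each load and turn data-quality rules into hard pass/fail signals rather than manual spot checks.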

Posted 1 week ago

7.0 - 12.0 years

17 - 30 Lacs

Hyderabad/Secunderabad, Ahmedabad, Chennai

Work from Office


Applicants who require a UK work visa are considered. Software Engineers with any of the below skillsets are welcome to apply: Cloud Platforms (Azure/AWS/GCP) | DevOps Engineer | Microsoft 365 | Microsoft Dynamics 365 | Power Technologies (PowerApps, Power Automate & Power BI) | SharePoint SME | Test Engineer | Frontend Development | Fullstack Developer | DBA Admin | SAP | Salesforce | BigData | Data Engineer | AI Engineer | Hadoop | Snowflake | Java / JavaScript / React JS / REST API / C# / ASP.Net / SQL Server / PySpark / Node JS | Terraform | Kubernetes | Docker | Site Resilience Engineer | Scrum Master | Business Analyst | Human Resource. As a Software Engineer, you will work in the product team and be a core contributor. You will collaborate with other engineers, defining and delivering solutions that expand on product offerings and new capabilities and support continued growth. Use a modularized approach, be data-driven, and measure results. Continually innovate and improve, strive to learn and grow, and maintain a standard of excellence, a strong sense of ownership, and excellent technical skills in agile environments. NOTE: This job provides an initial 3 years of visa sponsorship under the UK Skilled Worker visa category, which involves processing charges. Candidates enthusiastic about relocating to the UK and building a bright future there are welcome to apply. Good to have any of the below skillsets: Frontend development skills. JavaScript, React JS, REST APIs, TypeScript, NodeJS, HTML, CSS. Significant commercial C# experience specifically. Microsoft Office 365, AWS, Azure cloud platforms. Ability to collaborate in a development team. Excellent communication skills with team leads/line managers.
Data Modelling, Data Analytics, Data Governance, Machine Learning, B2B Customer Operations, Master Data Management, Data Interoperability, Azure Key Vault, Data Integration, Azure Data Lake, Data Science, Digital Transformation, Cloud Migration, Data Architecture, Data Migrations, Data Marts, Agile Delivery, ETL/ELT, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, ARM/Terraform, Azure PowerShell, Data Catalogue. Key Accountabilities: Deliver high-quality software in acceptable timescales. Take ownership of a significant and key area within the solution. Suggest estimates of the expected time to complete work. Design and implement services using C#, .NET Core, Azure, SQL Server, JavaScript, Angular.js, NodeJS. Design and implement web APIs using .NET Core, C#. Work well within a team environment. Abide by design and coding guidelines. Be proactive and self-sufficient with excellent attention to detail. Location: London, UK. Duration: Full Time. Start Date: ASAP. Rate: 30 Lakhs per annum. Competitive holiday. Website: https://saitpoint.com/ Employment Business: Saitpoint Private Limited (India) and Saitpoint Limited (UK). Contact: hr@saitpoint.com

Posted 1 week ago

4.0 - 8.0 years

7 - 17 Lacs

Kolkata, Hyderabad, Pune

Work from Office


Role: Snowflake Developer. Experience: 4+ years. Location: PAN India.

Posted 1 week ago

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
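Several of the questions above (virtual warehouses, time travel) come down to knowing specific Snowflake SQL syntax. A small sketch, with the statements held as Python strings for illustration; the warehouse and table names are hypothetical:

```python
# Illustrative Snowflake SQL for two interview topics; object names are made up.

# Virtual warehouses: an independent compute cluster that auto-suspends when
# idle and auto-resumes on the next query, decoupling compute cost from storage.
create_wh = (
    "CREATE WAREHOUSE IF NOT EXISTS analytics_wh "
    "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE"
)

# Time travel: query a table as it existed one hour ago, or restore a
# dropped table within the retention period.
hour_ago = "SELECT * FROM orders AT(OFFSET => -3600)"
restore = "UNDROP TABLE orders"

for stmt in (create_wh, hour_ago, restore):
    print(stmt)
```

Being able to reel off `AT(OFFSET => ...)`, `AT(TIMESTAMP => ...)`, and `UNDROP` is usually what the time-travel question is probing for.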

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
