1123 Snowflake Jobs - Page 27

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

15.0 - 20.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: BlueYonder Order Management
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
Role Overview: We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.

Key Responsibilities:
Functional Expertise
- Must-have skill: Blue Yonder (BY) Order Promising modules (formerly JDA)
- Knowledge of ATP (Available to Promise), CTP (Capable to Promise), and Order Fulfillment logic
- Experience with S&OP, Demand Planning, and Inventory Availability functions
- Ability to design and interpret supply-demand match rules, sourcing policies, and allocation strategies
Technical Acumen
- Strong grasp of BY architecture, workflows, and configuration capabilities
- Proficiency in tools like BY Platform Manager, BY Studio, and BY Workbench
- Understanding of data modeling, integration frameworks (REST, SOAP APIs, flat file interfaces), and middleware platforms
- Familiarity with PL/SQL, Java, and batch job orchestration for customizations and enhancements
Integration & Ecosystem Knowledge
- Integration experience with OMS, ERP (e.g., SAP, Oracle), WMS, and TMS
- Experience in real-time inventory visibility, order brokering, and global ATP engines
- Exposure to microservices architecture and cloud deployments (BY Luminate Platform)
Implementation & Support Experience
- Proven experience in end-to-end BY Order Promising implementations
- Ability to conduct solution design workshops, fit-gap analysis, and UAT management
- Experience in post-go-live support, performance tuning, and issue triage/resolution
Soft Skills & Project Leadership
- Ability to act as a bridge between business and technical teams
- Strong stakeholder communication, requirement gathering, and documentation skills
- Excellent problem-solving and troubleshooting capabilities
- Familiarity with Agile and Waterfall project methodologies
Preferred Certifications
- Blue Yonder Functional/Technical Certification in Order Promising or Fulfillment
- Supply chain certifications like APICS CSCP (desirable)

Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Demand Planning.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques and a cloud-first, mobile-first mindset to provide vision to Application Development Teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must have skills: BlueYonder Enterprise Supply Planning
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.

Roles & Responsibilities:
- Architect and implement end-to-end integration solutions for Blue Yonder (WMS, TMS, ESP, etc.) with enterprise systems (ERP, CRM, legacy).
- Design integration flows using cloud-native middleware platforms (Azure Integration Services, AWS Glue, GCP Dataflow, etc.).
- Enable real-time and batch data ingestion into cloud-based data lakes (e.g., AWS S3, Azure Data Lake, Google Cloud Storage) and downstream to Snowflake.
- Develop scalable data pipelines to support analytics, reporting, and operational insights from Blue Yonder and other systems.
- Integrate Snowflake as an enterprise data platform for unified reporting and machine learning use cases.

Professional & Technical Skills:
- Leverage Generative AI (e.g., OpenAI, Azure OpenAI) for auto-generating integration mapping specs and documentation.
- Enhance data quality and reconciliation with intelligent agents.
- Develop copilots for integration teams to speed up development and troubleshooting.
- Ensure integration architecture adheres to security, performance, and compliance standards.
- Collaborate with enterprise architects, functional consultants, data engineers, and business stakeholders.
- Lead troubleshooting, performance tuning, and hypercare support post-deployment.

Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Enterprise Supply Planning.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
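Editor's note: a recurring requirement in this listing is batch ingestion from a cloud data lake downstream into Snowflake. Purely as an illustration (not part of the posting), the sketch below loads files landed in an S3 path into a Snowflake table through the Python connector; the account, stage, storage integration, and table names are hypothetical placeholders.

```python
# Illustrative sketch: batch-load files from an S3 data-lake landing path into Snowflake.
# Assumes snowflake-connector-python and a pre-configured storage integration;
# all object names (stage, table, integration, bucket) are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",      # placeholders
    warehouse="LOAD_WH", database="SUPPLY_CHAIN", schema="RAW",
)
cur = conn.cursor()
try:
    # One-time setup: an external stage pointing at the data-lake landing path.
    cur.execute("""
        CREATE STAGE IF NOT EXISTS BY_ORDERS_STAGE
        URL = 's3://my-data-lake/blueyonder/orders/'
        STORAGE_INTEGRATION = MY_S3_INTEGRATION
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Batch ingestion: copy any new files from the stage into the raw table.
    cur.execute("""
        COPY INTO RAW.BY_ORDERS
        FROM @BY_ORDERS_STAGE
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    for row in cur.fetchall():   # COPY INTO returns one status row per loaded file
        print(row)
finally:
    cur.close()
    conn.close()
```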

Posted 2 weeks ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
a) 9+ years of experience in database and data warehousing, with at least 4+ years on Snowflake
b) Played a key role in data-related discussions with teams and clients to understand business problems and solutioning requirements
c) Liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
d) Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake

Technical Expertise:
a) Strong experience working as a Snowflake on Cloud Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines
c) Good process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features
e) Ability to suggest innovative solutions based on new technologies and latest trends
f) SnowPro Core certification
g) Certified in any one cloud
h) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
i) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
j) Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots
k) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
l) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
m) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
n) Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Additional Information:
- Any past consulting experience is good to have
- Good to have: DBT, Python, knowledge of Snowpark, and any advanced certification

Educational Qualification:
- MBA (Technology/Data-related specializations) / MCA / advanced degrees in STEM

Qualification: 15 years full time education
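Editor's note: item h) above mentions integrating LLMs via Snowflake Cortex AI. As a hedged illustration only (not taken from the posting), Cortex exposes SQL functions such as SNOWFLAKE.CORTEX.COMPLETE that can be invoked from Python; the connection parameters, model name, and prompt below are placeholders.

```python
# Illustrative sketch: calling a Snowflake Cortex LLM function from Python.
# Assumes snowflake-connector-python and a role with Cortex privileges;
# account/user/warehouse values and the prompt are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",            # placeholder
    user="my_user",                  # placeholder
    password="***",                  # placeholder
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="GENAI",
)

prompt = "Summarize yesterday's data-load failures in two sentences."
try:
    cur = conn.cursor()
    # SNOWFLAKE.CORTEX.COMPLETE(model, prompt) returns the model's text completion.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)",
        ("mistral-large", prompt),
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```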

Posted 2 weeks ago

Apply

12.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

Project Role: Responsible AI Tech Lead
Project Role Description: Ensure the ethical and responsible use of artificial intelligence (AI) technologies. Design and deploy Responsible AI solutions; align AI projects with ethical principles and regulatory requirements. Provide leadership, foster cross-functional collaboration, and advocate for ethical AI adoption.
Must have skills: Data Architecture Principles, Amazon Web Services (AWS), Snowflake Data Warehouse, Core Banking
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Responsible AI Tech Lead, you will ensure the ethical and responsible use of artificial intelligence technologies. Your typical day will involve designing and deploying Responsible AI solutions, aligning AI projects with ethical principles and regulatory requirements, and providing leadership to foster cross-functional collaboration. You will advocate for the adoption of ethical AI practices, ensuring that all AI initiatives are conducted with integrity and accountability, while also engaging with various stakeholders to promote a culture of responsible AI usage across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate training sessions to enhance team understanding of Responsible AI principles.
- Monitor and evaluate the impact of AI solutions to ensure compliance with ethical standards.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Architecture Principles, Core Banking, Amazon Web Services (AWS), Snowflake Data Warehouse.
- Strong understanding of data governance frameworks and best practices.
- Experience with data modeling and architecture design.
- Familiarity with machine learning algorithms and their ethical implications.
- Ability to communicate complex technical concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Building Tool
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
d) Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling experience (.sql or .py files), and dbt job scheduling on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis
d) Structured communication: written, verbal and presentational

Educational Qualification:
- MBA (Technology/Data-related specializations) / MCA / advanced degrees in STEM

Qualification: 15 years full time education
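Editor's note: the posting above calls out DBT modeling in .sql or .py files on Snowflake. As a rough illustration only (not from the listing), a dbt Python model follows the convention sketched below; the upstream model name and columns are hypothetical, and dbt-core 1.3+ with the Snowflake (Snowpark) adapter is assumed.

```python
# models/marts/fct_daily_orders.py -- illustrative dbt Python model (dbt-core 1.3+ on Snowflake).
# On Snowflake, `session` is a Snowpark session and dbt.ref() returns a Snowpark DataFrame.
# The upstream model name "stg_orders" and the columns used are hypothetical placeholders.
import snowflake.snowpark.functions as F


def model(dbt, session):
    # Materialize the result as a table; dbt generates the CREATE TABLE AS for us.
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")  # upstream staging model (placeholder name)

    # Aggregate order amounts per day -- the kind of modular transformation
    # that would otherwise live in a refactored .sql model.
    daily = (
        orders.group_by(F.col("order_date"))
        .agg(
            F.count(F.col("order_id")).alias("order_count"),
            F.sum(F.col("order_amount")).alias("total_amount"),
        )
    )
    return daily  # dbt writes the returned DataFrame to the target schema
```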

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Building Tool
Good to have skills: Data Engineering
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with various teams to ensure that the integration between systems and data models is seamless and efficient. You will engage in discussions to refine the architecture and design, ensuring that the data platform meets the needs of the organization while adhering to best practices. Additionally, you will be involved in problem-solving sessions, where you will provide insights and solutions to enhance the overall data strategy.

Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
d) Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling experience (.sql or .py files), and dbt job scheduling on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis

Educational Qualification:
- MBA (Technology/Data-related specializations) / MCA / advanced degrees in STEM

Qualification: 15 years full time education
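Editor's note: for the dbt job-scheduling experience requested above, lightweight orchestration can also be driven programmatically. This sketch assumes dbt-core 1.5+ (which ships dbtRunner); the project path and selector are hypothetical placeholders, not values from the posting.

```python
# Illustrative sketch: invoking dbt programmatically for lightweight scheduling/orchestration.
# Assumes dbt-core >= 1.5; the project directory and model selector are placeholders.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Equivalent to: dbt run --select marts.daily --project-dir /srv/analytics/dbt
res: dbtRunnerResult = runner.invoke(
    ["run", "--select", "marts.daily", "--project-dir", "/srv/analytics/dbt"]
)

if not res.success:
    # res.exception carries the underlying error when the invocation itself failed.
    raise SystemExit(f"dbt run failed: {res.exception}")

# Each result reports a node's status and can be logged or pushed to monitoring.
for node_result in res.result:
    print(node_result.node.name, node_result.status)
```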

Posted 2 weeks ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
a) 9+ years of data experience, with at least 4+ years on Snowflake and 1-3 years on Fivetran
b) Played a key role in data-related discussions with teams and clients to understand business problems and solutioning requirements
c) Liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
d) Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake

Technical Expertise:
a) Strong experience working as a Snowflake on Cloud Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using Fivetran
c) Should have process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features
e) Ability to suggest innovative solutions based on new technologies and latest trends
f) Fivetran end-to-end migration experience
g) Fivetran and cloud certification is good to have

Professional Attributes:
a) Project management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis

Educational Qualification:
- MBA (Technology/Data-related specializations) / MCA / advanced degrees in STEM

Qualification: 15 years full time education
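Editor's note: administering Snowflake, as item e) of the responsibilities asks, typically starts with warehouse and role setup for the schemas that tools like Fivetran load into. A minimal sketch, assuming snowflake-connector-python; every object name below is a hypothetical placeholder, not something specified in the posting.

```python
# Illustrative Snowflake admin sketch: create a load warehouse, a reader role, and grants.
# Assumes snowflake-connector-python and a user with sufficient (SYSADMIN/SECURITYADMIN) rights;
# all object names are hypothetical placeholders.
import snowflake.connector

ddl_statements = [
    "CREATE WAREHOUSE IF NOT EXISTS FIVETRAN_LOAD_WH "
    "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
    "CREATE ROLE IF NOT EXISTS ANALYTICS_READER",
    "GRANT USAGE ON WAREHOUSE FIVETRAN_LOAD_WH TO ROLE ANALYTICS_READER",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYTICS_READER",
    "GRANT USAGE ON ALL SCHEMAS IN DATABASE ANALYTICS TO ROLE ANALYTICS_READER",
    "GRANT SELECT ON ALL TABLES IN DATABASE ANALYTICS TO ROLE ANALYTICS_READER",
]

conn = snowflake.connector.connect(
    account="my_account", user="my_admin", password="***", role="SYSADMIN"  # placeholders
)
try:
    cur = conn.cursor()
    for stmt in ddl_statements:
        cur.execute(stmt)  # each DDL statement runs in its own implicit transaction
finally:
    conn.close()
```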

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring alignment with business objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with data integration tools.
- Strong understanding of data modeling and ETL processes.
- Familiarity with cloud computing concepts and services.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
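Editor's note: the performance-tuning requirement above usually begins with finding the slowest recent queries. A rough sketch, assuming access to the SNOWFLAKE.ACCOUNT_USAGE share and snowflake-connector-python; the connection values and thresholds are placeholders.

```python
# Illustrative sketch: list the slowest successful queries of the last 7 days as a
# starting point for performance tuning. Connection parameters are placeholders.
import snowflake.connector

SLOW_QUERY_SQL = """
SELECT query_id,
       user_name,
       warehouse_name,
       total_elapsed_time / 1000 AS elapsed_s,
       LEFT(query_text, 120)     AS query_snippet
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
  AND execution_status = 'SUCCESS'
ORDER BY total_elapsed_time DESC
LIMIT 20
"""

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***"  # placeholders
)
try:
    for row in conn.cursor().execute(SLOW_QUERY_SQL):
        print(row)
finally:
    conn.close()
```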

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing project progress, coordinating with teams, and ensuring successful application development.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the application development process
- Coordinate with stakeholders to gather requirements
- Ensure timely delivery of projects

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse
- Strong understanding of data warehousing concepts
- Experience in ETL processes and data modeling
- Knowledge of cloud data platforms like AWS or Azure
- Hands-on experience in SQL and database management

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Analytics
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Position Summary:
The Data Analyst will focus on collecting, cleaning, and analyzing data to support business decisions.

Key Responsibilities:
- Gather, process, and analyze data to identify trends and insights.
- Develop dashboards and reports to communicate findings.
- Collaborate with stakeholders to understand data needs.
- Ensure data accuracy and quality in all analyses.
- Prepare and clean datasets for analysis to ensure accuracy and usability.
- Generate reports and dashboards to communicate key performance metrics.
- Support data-driven decision-making by identifying actionable insights.
- Monitor data pipelines and troubleshoot issues to ensure smooth operation.
- Collaborate with cross-functional teams to understand and meet data needs.

Qualifications:
- Bachelor's degree in a relevant field (e.g., Data Science, Statistics, Computer Science).
- 2-4 years of experience in data analytics.
- Proficiency in tools like Power BI, Tableau, and SQL.
- Strong analytical and problem-solving skills.
- Effective communication and teamwork abilities.

Additional Information:
- The candidate should have minimum 5 years of experience in Data Analytics.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
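Editor's note: preparing and cleaning datasets, as the responsibilities above describe, is often a few lines of pandas before anything reaches a dashboard. A small illustration only; the file name and column names ("order_date", "region", "revenue") are hypothetical.

```python
# Illustrative data-preparation sketch: basic cleaning and a daily KPI roll-up with pandas.
# The CSV path and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Basic quality checks and cleaning: drop exact duplicates, report and drop missing revenue.
df = df.drop_duplicates()
missing_revenue = df["revenue"].isna().sum()
print(f"rows with missing revenue: {missing_revenue}")
df = df.dropna(subset=["revenue"])

# Daily revenue per region -- the kind of aggregate a dashboard or report would plot.
daily_kpi = (
    df.groupby([df["order_date"].dt.date, "region"], as_index=False)["revenue"]
    .sum()
    .rename(columns={"order_date": "day", "revenue": "total_revenue"})
)
print(daily_kpi.head())
```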

Posted 2 weeks ago

Apply

6.0 - 11.0 years

0 - 1 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Databricks, Python, PySpark, Azure Databricks, SQL, Azure Data Factory, any cloud (AWS/Azure), any data modelling tool (MSBI, Snowflake, Google BigQuery, AWS Redshift)

Shift timings (Client 1): 6:00 AM IST - 2:00 PM IST, or
Shift timings (Client 2): 2:00 PM IST - 11:00 PM IST

Role & responsibilities
Preferred candidate profile

Posted 2 weeks ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Job Title: SRE Engineer - Bangalore

About Us: Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO: You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: SRE Engineer - Bangalore
Key skills: SRE, Grafana, Python/scripting, Unix, SQL
Location: Bangalore (hybrid - 3 days WFO)
Shift timings: 12:30 pm - 9:30 pm
Looking only for immediate joiners.

Job Summary: Capco is looking for a Bengaluru-based developer with prior experience in developing operational tooling. Candidates should have a technical background in Grafana, Python, and DBMS, and be prepared to support development at an Enterprise Command Center with 60+ India-based SRE and Production Support engineers.

Technical Requirements:
- Design, build and maintain core infrastructure that enables our client to support 8,000+ users
- Work on performance enhancements/developments as prioritized by the business and collaborate with Production Support for implementation
- Support debugging of production issues across services
- Lead initiatives to improve system stability and efficiency
- Identify automation opportunities throughout daily processes to streamline tasks and increase operational efficiency
- Collaborate across teams such as operations, IT, and business stakeholders to improve operational tools

Desired Experience / Skills:
- Bachelor's degree, preferably in Computer Science, Engineering, or other relevant technical fields
- Expert on Unix/Linux using Python
- Expert in a database management system (DBMS)
- In-depth knowledge of and experience with Grafana
- Experience in developing Grafana dashboards
- Build monitoring solutions that alert on symptoms rather than on outages
- Experience working within SRE, DevOps and Production Support
- Ability to operate with a managed-services mentality and utilize Agile methodologies to accomplish all tasks
- Financial services industry experience preferable
- Experience with knowledge management systems, preferably Confluence and SharePoint
- Knowledge of cloud (AWS, Azure) and Snowflake would be considered a big plus

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
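Editor's note: Grafana dashboard development, as the skills list above mentions, is commonly automated through Grafana's HTTP API rather than the UI. A hedged sketch, assuming the `requests` package and a service-account/API token with editor rights; the URL, token, and panel definition are placeholders.

```python
# Illustrative sketch: create/update a minimal Grafana dashboard via the HTTP API.
# The Grafana URL, token, and panel definition are hypothetical placeholders.
import requests

GRAFANA_URL = "https://grafana.example.internal"   # placeholder
API_TOKEN = "***"                                   # placeholder

payload = {
    "dashboard": {
        "id": None,                 # None lets Grafana create a new dashboard
        "uid": None,
        "title": "ECC Operational Tooling - Overview",
        "panels": [
            {
                "type": "timeseries",
                "title": "Job failures per hour",
                "gridPos": {"h": 8, "w": 12, "x": 0, "y": 0},
            }
        ],
    },
    "overwrite": True,              # replace an existing dashboard with the same uid/title
}

resp = requests.post(
    f"{GRAFANA_URL}/api/dashboards/db",
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # on success, includes the stored dashboard's uid and url
```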

Posted 2 weeks ago

Apply

8.0 - 10.0 years

40 - 50 Lacs

Pune

Remote

Position Summary: We are seeking a strategic and hands-on Director of Business Intelligence to lead our BI function, enabling data-driven decision-making across the organization. This individual will be responsible for developing and executing the BI roadmap, aligning key metrics across departments, and ensuring the accuracy, accessibility, and value of our data assets. The ideal candidate is a strong leader with excellent business acumen, technical fluency, and a track record of transforming data into actionable insights.

Key Responsibilities:
Leadership & Strategy
- Own and execute the company's BI strategy in alignment with broader business objectives.
- Serve as the central liaison across business functions to gather requirements and align reporting priorities.
- Develop and maintain a BI roadmap, including priorities for tools, platforms, and capabilities.
Insights & Analytics
- Oversee the creation of dashboards, reports, and analytical models that support executive decision-making and operational performance.
- Drive a culture of data literacy and self-service reporting across departments.
- Partner with FP&A, marketing, operations, and product teams to identify and track key performance indicators (KPIs).
Governance & Quality
- Ensure data definitions and calculations are consistent and aligned across all business units.
- Support and enforce data governance policies in collaboration with data engineering and compliance teams.
- Maintain high standards for data accuracy, timeliness, and completeness.
Technology & Tools
- Evaluate and manage BI tools (e.g., Qlik Sense, Tableau, Power BI, Snowflake, etc.).
- Oversee ETL processes, data warehousing strategy, and the integration of various data sources.
- Collaborate with IT and engineering to ensure scalable and secure infrastructure.

Qualifications:
- Bachelor's degree in Business, Computer Science, Data Science, or a related field (Master's preferred).
- 8+ years of experience in analytics, BI, or data strategy roles; at least 3 years in a leadership role.
- Expertise in BI tools, SQL, data visualization, and modern data stacks.
- Strong project management and stakeholder communication skills.
- Experience aligning KPIs and building cross-functional reporting frameworks.

Preferred Attributes:
- Familiarity with data governance and regulatory compliance.
- Ability to thrive in a fast-paced, ambiguous environment.
- Strong business judgment with the ability to balance detail and big-picture thinking.

What you'll get if you join us:
- Fully work-from-home opportunity
- Competitive salary plus bonus
- Health insurance, personal accidental & life insurance benefits
- Innovative culture with an open and collaborative environment
- Many opportunities to develop core and new skill sets and have a stake in your own success
- Freedom to create your best work and make a visible impact on the organization

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Are you ready to play a key role in transforming Thomson Reuters into a truly data-driven company? Join our Data & Analytics (D&A) function and be part of the strategic ambition to build, embed, and mature a data-driven culture across the entire organization. The Data Architecture organization within the Data and Analytics division is responsible for designing and implementing a unified data strategy that enables the efficient, secure, and governed use of data across the organization. We aim to create a trusted and customer-centric data ecosystem, built on a foundation of data quality, security, and openness, and guided by the Thomson Reuters Trust Principles. Our team is dedicated to developing innovative data solutions that drive business value while upholding the highest standards of data management and ethics.

About the Role
In this opportunity as a Data Architect, you will:
- Architect and Lead Data Platform Evolution: Spearhead the conceptual, logical, and physical architecture design for our enterprise Data Platform (encompassing areas like our data lake, data warehouse, streaming services, and master data management systems). You will define and enforce data modeling standards, data flow patterns, and integration strategies to serve a diverse audience from data engineers to AI/ML practitioners and BI analysts.
- Technical Standards and Best Practices: Research and recommend technical standards, ensuring the architecture aligns with overall technology and product strategy. Be hands-on in implementing core components reusable across applications.
- Hands-on Prototyping and Framework Development: While a strategic role, maintain a hands-on approach by designing and implementing proof-of-concepts and core reusable components/frameworks for the data platform. This includes developing best practices and templates for data pipelines, particularly leveraging dbt for transformations, and ensuring efficient data processing and quality.
- Champion Data Ingestion Strategies: Design and oversee the implementation of robust, scalable, and automated cloud data ingestion pipelines from a variety of sources (e.g., APIs, databases, streaming feeds) into our AWS-based data platform, utilizing services such as AWS Glue, Kinesis, Lambda, S3, and potentially third-party ETL/ELT tools.
- Design and optimize solutions utilizing our core cloud data stack, including deep expertise in Snowflake (e.g., architecture, performance tuning, security, data sharing, Snowpipe, Streams, Tasks) and a broad range of AWS data services (e.g., S3, Glue, EMR, Kinesis, Lambda, Redshift, DynamoDB, Athena, Step Functions, MWAA/Managed Airflow) to build and automate end-to-end analytics and data science workflows. (An illustrative Snowpipe/Stream/Task sketch follows at the end of this posting.)
- Data-Driven Decision-Making: Make quick and effective data-driven decisions, demonstrating strong problem-solving and analytical skills. Align strategies with company goals.
- Stakeholder Collaboration: Collaborate closely with external and internal stakeholders, including business teams and product managers. Define roadmaps, understand functional requirements, and lead the team through the end-to-end development process.
- Team Collaboration: Work in a collaborative, team-oriented environment, sharing information and diverse ideas, and partnering with cross-functional and remote teams.
- Quality and Continuous Improvement: Focus on quality, continuous improvement, and technical standards. Keep the service focus on reliability, performance, and scalability while adhering to industry best practices.
- Technology Advancement: Continuously update yourself with next-generation technology and development tools. Contribute to process development practices.

About You
You're a fit for the role of Data Architect, Data Platform if your background includes:
- Educational Background: Bachelor's degree in information technology.
- Experience: 10+ years of IT experience with at least 5 years in a lead design or architectural capacity.
- Technical Expertise: Broad knowledge and experience with cloud-native software design, microservices architecture, and data warehousing, and proficiency in Snowflake.
- Cloud and Data Skills: Experience with building and automating end-to-end analytics pipelines on AWS; familiarity with NoSQL databases.
- Data Pipeline and Ingestion Mastery: Extensive experience in designing, building, and automating robust and scalable cloud data ingestion frameworks and end-to-end data pipelines on AWS. This includes experience with various ingestion patterns (batch, streaming, CDC) and tools.
- Data Modeling: Proficient with concepts of data modeling and the data development lifecycle.
- Advanced Data Modeling: Demonstrable expertise in designing and implementing various data models (e.g., relational, dimensional, Data Vault, NoSQL schemas) for transactional, analytical, and operational workloads. Strong understanding of the data development lifecycle, from requirements gathering to deployment and maintenance.
- Leadership: Proven ability to lead architectural discussions, influence technical direction, and mentor data engineers, effectively balancing complex technical decisions with user needs and overarching business constraints.
- Programming Skills: Strong programming skills in languages such as Python or Java for data manipulation, automation, and API development.
- Data Governance and Security Acumen: Deep understanding and practical experience in designing and implementing solutions compliant with robust data governance principles, data security best practices (e.g., encryption, access controls, masking), and relevant privacy regulations (e.g., GDPR, CCPA).
- Containerization and Orchestration: Experience with containerization technologies like Docker and orchestration tools like Kubernetes.

#LI-VN1

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
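Editor's note: the posting above asks for hands-on Snowflake ingestion expertise (Snowpipe, Streams, Tasks). The sketch below is a minimal illustration of how those three primitives fit together, issued as DDL through the Python connector; every object name, schedule, and column is a hypothetical placeholder, not part of the posting.

```python
# Illustrative sketch of Snowflake ingestion primitives: a Snowpipe for continuous loads,
# a Stream to capture new rows, and a Task to process them on a schedule.
# All object names are hypothetical placeholders.
import snowflake.connector

DDL = [
    # Continuous ingestion: Snowpipe auto-loads files arriving on the external stage.
    """CREATE PIPE IF NOT EXISTS RAW.EVENTS_PIPE AUTO_INGEST = TRUE AS
       COPY INTO RAW.EVENTS FROM @RAW.EVENTS_STAGE FILE_FORMAT = (TYPE = JSON)""",
    # Change capture: the stream records rows added to RAW.EVENTS since last consumption.
    "CREATE STREAM IF NOT EXISTS RAW.EVENTS_STREAM ON TABLE RAW.EVENTS",
    # Scheduled processing: the task runs every 5 minutes, but only when the stream has data.
    """CREATE TASK IF NOT EXISTS CURATED.LOAD_EVENTS
       WAREHOUSE = TRANSFORM_WH
       SCHEDULE = '5 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('RAW.EVENTS_STREAM')
       AS INSERT INTO CURATED.EVENTS (payload, loaded_at)
          SELECT payload, loaded_at
          FROM RAW.EVENTS_STREAM
          WHERE METADATA$ACTION = 'INSERT'""",
    "ALTER TASK CURATED.LOAD_EVENTS RESUME",  # tasks are created suspended by default
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***"  # placeholders
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```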

Posted 2 weeks ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a skilled Associate Informatica Developer with 2-4 years of experience in designing, developing, and maintaining ETL processes using Informatica PowerCenter. The ideal candidate should have strong SQL knowledge, data warehousing concepts, and hands-on experience in data integration, transformation, and loading.

About the Role
In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Design (LLD)
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows
- Apply mapping optimization and performance tuning techniques to ensure efficient ETL processes
- Conduct peer code reviews and suggest improvements for reliability and performance
- Prepare and execute comprehensive unit test cases and support system/integration testing
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs
- Collaborate with cross-functional teams to support UAT, deployments, and production issues

About You
You are a fit for this position if your background includes:
- 2-4 years of strong hands-on experience with Informatica PowerCenter
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions
- Solid experience with performance tuning techniques and best practices in ETL processes
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization
- Strong skills in Unix/Linux scripting for job automation
- Experience in converting HLDs into LLDs and defining unit test cases
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks

Good to Have
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions)
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL
- Experience with job schedulers like Control-M, ESP, or equivalent
- Agile methodology experience and tools such as JIRA, Confluence, and Git
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration
- Experience with Python scripting for data manipulation, automation, or integration tasks

#LI-SM1

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

We are seeking a Senior Customer Data Analyst to join our Customer Data Audience Operations Team within the Marketing & Data organization at Thomson Reuters. Based in Hyderabad, India, this role will support our Customer Data Platform (CDP) operations, helping marketers leverage customer data effectively through audience segmentation and activation.

About the Role
In this role as a Senior Customer Data Analyst, you will:
- Develop a comprehensive understanding of the CDP data structure, data models, tables, and available data types.
- Process and fulfill audience segmentation requests from marketing teams through our Workfront ticketing system.
- Create audience segments in Treasure Data CDP and push them to appropriate activation channels.
- Collaborate with marketing teams to understand their segmentation needs and provide data-driven solutions.
- Maintain documentation of segment creation processes and audience definitions.
- Monitor segment performance and provide recommendations for optimization.
- Stay current with AI capabilities within Treasure Data's AI Foundry to enhance segmentation strategies.
- Assist in troubleshooting data issues and ensuring data quality within segments.

About You
You're a fit for the role of Senior Customer Data Analyst if your background includes:
- Bachelor's degree in Computer Science, Information Technology, Engineering, Statistics, Mathematics, or a related field.
- 6-8 years of experience in data analysis, data management, or related roles.
- Proficiency in SQL query writing and data manipulation.
- Basic understanding of marketing technology platforms (CDP, CRM, Marketing Automation).
- Ability to translate business requirements into technical specifications.
- Strong attention to detail and problem-solving skills.
- Excellent communication skills in English, both written and verbal.
- Experience with Treasure Data/Snowflake/Eloqua/Salesforce.
- Knowledge of AI/ML concepts and applications in marketing.
- Understanding of data privacy regulations and compliance requirements.
- Experience with data visualization tools.
- Basic programming skills (Python, R, etc.).
- Data analysis and interpretation.
- Familiarity with cloud-based data platforms.
- Understanding of relational databases.
- Microsoft Office Suite (especially Excel).
- Curiosity and eagerness to learn, a detail-oriented approach, the ability to work in a fast-paced environment, team collaboration, and time management and prioritization abilities.

Shift timings: 2 PM to 11 PM (IST). Work from office 2 days a week (mandatory).

#LI-GS2

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Develop/enhance data warehousing functionality, including the use and management of the Snowflake data warehouse and the surrounding entitlements, pipelines, and monitoring, in partnership with Data Analysts and Architects and with guidance from the lead Data Engineer.

About the Role
In this opportunity as Data Engineer, you will:
- Develop/enhance data warehousing functionality, including the use and management of the Snowflake data warehouse and the surrounding entitlements, pipelines, and monitoring, in partnership with Data Analysts and Architects and with guidance from the lead Data Engineer.
- Innovate with new approaches to meeting data management requirements.
- Effectively communicate and liaise with other data management teams embedded across the organization and with data consumers in data science and business analytics teams.
- Analyze existing data pipelines and assist in enhancing and re-engineering the pipelines as per business requirements.
- Bachelor's degree or equivalent required; Computer Science or related technical degree preferred.

About You
You’re a fit for the role if your background includes:
- Mandatory skills: Data Warehousing, data models, data processing (good to have), SQL, Power BI / Tableau, Snowflake (good to have), Python.
- 3.5+ years of relevant experience in implementation of data warehouses and data management technologies for large-scale organizations.
- Experience in building and maintaining optimized and highly available data pipelines that facilitate deeper analysis and reporting (a minimal ingestion sketch follows this posting).
- Experience analyzing data pipelines.
- Knowledge of Data Warehousing, including data models and data processing.
- Broad understanding of the technologies used to build and operate data and analytic systems.
- Excellent critical thinking, communication, presentation, documentation, troubleshooting, and collaborative problem-solving skills.
- Beginner to intermediate knowledge of AWS, Snowflake, and Python.
- Hands-on experience with programming and scripting languages.
- Knowledge of and hands-on experience with Data Vault 2.0 is a plus.

You should also have experience in and comfort with some of the following skills/concepts:
- Writing SQL and performance tuning.
- Data integration tools like dbt, Informatica, etc.
- Intermediate skills in a programming language such as Python, PySpark, Java, or JavaScript.
- AWS services and management, including serverless, container, queueing, and monitoring services.
- Consuming and building APIs.

#LI-SM1
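As an illustration of the pipeline work mentioned above, here is a minimal sketch of a Snowflake ingestion step: an external stage, a COPY INTO load, and a scheduled task. The object names (raw_sales, etl_wh, the S3 path) are illustrative assumptions, and storage credentials are omitted.

```sql
-- Define how incoming files are parsed
CREATE OR REPLACE FILE FORMAT ff_csv TYPE = CSV SKIP_HEADER = 1;

-- External stage pointing at the landing location (storage integration omitted)
CREATE OR REPLACE STAGE raw_sales_stage
  URL = 's3://example-bucket/sales/'
  FILE_FORMAT = ff_csv;

-- One-off load of any files currently in the stage
COPY INTO raw_sales
FROM @raw_sales_stage
ON_ERROR = 'ABORT_STATEMENT';

-- Schedule the load as a task so the pipeline runs without manual intervention
CREATE OR REPLACE TASK load_raw_sales
  WAREHOUSE = etl_wh
  SCHEDULE = '60 MINUTE'
AS
  COPY INTO raw_sales FROM @raw_sales_stage ON_ERROR = 'ABORT_STATEMENT';

ALTER TASK load_raw_sales RESUME;
```

A production setup would typically add monitoring on COPY history and alerting on task failures, which is the "surrounding pipelines and monitoring" the posting refers to.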

Posted 2 weeks ago

Apply

4.0 - 8.0 years

3 - 8 Lacs

Chennai

Work from Office

Naukri logo

We are organizing a direct walk-in drive at our Chennai location. Please find below the details and skills for which we have a walk-in at TCS - Chennai on 7th June 2025.

Experience: 4 - 8 years

Skills:
(1) Informatica IICS
(2) Azure .NET Developer
(3) AWS Data Engineer
(4) Azure Data Engineer
(5) Azure DevOps (Terraform, Kubernetes)
(6) Big Data (PySpark, Hive)
(7) Java Spring Boot & Microservices
(8) ReactJS
(9) Snowflake

Posted 2 weeks ago

Apply

4.0 - 9.0 years

15 - 27 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Naukri logo

Location: Kolkata, Hyderabad, Bangalore
Experience: 4 to 17 years
Band: 4B, 4C, 4D
Skill set: Snowflake, Horizon, Snowpark, Kafka for ETL

Posted 2 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Chennai, Delhi / NCR, Bengaluru

Work from Office

Naukri logo

As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.

As a Data Engineer you'll be:
- Developing end-to-end ETL/ELT pipelines, working with Data Analysts of the business function.
- Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture.
- Mentoring other junior engineers in the team.
- Being a go-to expert for data technologies and solutions.
- Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges.
- Troubleshooting and resolving technical issues as they arise.
- Looking for ways of improving both what and how data pipelines are delivered by the department.
- Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports.
- Owning the delivery of data models and reports end to end.
- Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future.
- Working with Data Analysts to ensure that all data feeds are optimised and available at the required times. This can include Change Data Capture (CDC), change data control, and other delta loading approaches (a dbt incremental model sketch follows this posting).
- Discovering, transforming, testing, deploying, and documenting data sources.
- Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review.
- Building Looker dashboards for use cases if required.

What makes you a great fit:
- 3+ years of extensive development experience using Snowflake or a similar data warehouse technology.
- Working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker.
- Experience in agile processes, such as Scrum.
- Extensive experience in writing advanced SQL statements and performance tuning them.
- Experience in data ingestion techniques using custom or SaaS tools like Fivetran.
- Experience in data modelling and the ability to optimise existing/new data models.
- Experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets.

Additional Information:
- Maximum official notice period acceptable for this role is 30 days.
- This is a remote opportunity.
- Looker/Power BI, dbt, SQL, and Snowflake are mandatory for this role.
- Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
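As a concrete illustration of the delta-loading approach referenced above, here is a minimal sketch of a dbt incremental model on Snowflake. The model and column names (stg_orders, order_id, updated_at) are illustrative assumptions, not taken from this posting.

```sql
-- models/marts/fct_orders.sql
-- Minimal dbt incremental model: only new or changed rows are processed on each run.
{{
    config(
        materialized='incremental',
        unique_key='order_id'
    )
}}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- on incremental runs, pick up only rows changed since the last load of this model
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; on subsequent runs it merges in only the delta, which is what keeps feeds available at the required times without full reloads.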

Posted 2 weeks ago

Apply

3.0 - 5.0 years

12 - 18 Lacs

Pune

Work from Office

Naukri logo

We, at Jet2 (the UK’s third largest airline and second largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. We are setting up a brand-new Data Engineering program with the aim of making our data assets high quality and available for scientific study, leading to higher monetization in terms of increased revenue, improved operational efficiency, and high-quality service. We are looking for a Test Engineer to join our Data Engineering practice and help us fulfil our aspiration and journey to become a data-first organization.

The successful candidate will play a big role in the success of the new Data Engineering program. The incumbent will work closely with the Data Engineering team to build automated testing frameworks for the Data Lake and Data Warehouse environments of Jet2, ensuring high-quality, certified data is available in the platform for consumption. The incumbent will work under the supervision of a Senior Test Engineer and the Data Engineering team to deliver testing projects.

Roles and Responsibilities
The successful candidate will work independently on test engineering projects with zero or minimal guidance and mentor/guide junior members of the team. The incumbent is expected to operate out of the Pune location and collaborate with various stakeholders in Pune, Leeds, and Sheffield.
- Partner with the Data Engineering team to define quality and ensure that data products meet or exceed quality standards.
- Review requirements, specifications, and technical design documents to provide timely and meaningful feedback.
- Develop software quality assurance (SQA) test plans, write test cases for the given functional requirements, and determine product quality or release readiness.
- Develop and maintain testing frameworks and automation for Data Engineering pipeline testing and Data Warehouse model testing (a sketch of typical automated checks follows this posting).
- Develop, prioritize, plan, and coordinate testing activities.
- Design, develop, and execute automation scripts.
- Lead and drive Quality Engineering initiatives.
- Identify, record, and document bugs and maintain the issue tracking system.
- Perform functional and integration testing to assure data quality.
- Design and apply data quality assurance practices, processes, and standards.
- Design and develop actionable reports and dashboards for continuous improvement.
- Ensure tracking, reporting, and resolution of issues in a timely manner.
- Coordinate and collaborate with the broader Data team, including Data Architects, Data Scientists, other Data Engineering teams, and Business Intelligence and Visualisation teams.

Technical Skills & Knowledge
- Strong understanding of testing methodologies.
- Work experience with testing RDBMS databases.
- Expert knowledge of SQL is desired.
- Hands-on experience with ETL testing.
- Experience with automated testing tools and frameworks.
- Hands-on test engineering experience in a cloud environment (GCP preferred; AWS, Azure).
- Good understanding of DevOps/CI-CD processes and tools.
- Test case creation and management experience.
- Strong understanding of APIs and working knowledge of testing APIs.
- Experience with non-functional testing, including security and performance testing processes and tools (desirable).
- Experience with source control tools such as Git/TFS.
- Testing and quality assurance experience of Data Warehouse environments (Snowflake Cloud Data Warehouse desirable).

Soft Skills
- Good communication skills, written and verbal.
- Ability to build strong relationships with people across teams.
- Experience of working with people from different geographies, particularly the UK and US.
- Ability to negotiate on project timelines, efforts, and resources for a win-win situation.
- Exceptional presentation and storytelling skills to create maximum impact for a proposal or final solution.

Leadership & Organizational Skills
- Collaborate with the broader Data team, including Data Architects, Data Scientists, other Data Engineering teams, and Business Intelligence and Visualisation teams.
- Work closely with the Senior Test Engineer and Data Engineering Team Lead to help them build a Test Engineering Centre of Excellence.
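As referenced above, automated warehouse testing usually boils down to a small set of SQL assertions run after each load. The sketch below is illustrative only; the table names (src_bookings, dwh.fact_bookings) and thresholds are assumptions, not Jet2's actual schema.

```sql
-- 1. Row-count reconciliation between the source extract and the warehouse load
SELECT
    (SELECT COUNT(*) FROM src_bookings)       AS source_rows,
    (SELECT COUNT(*) FROM dwh.fact_bookings)  AS target_rows;

-- 2. Duplicate business keys in the target (a passing run returns zero rows)
SELECT booking_id, COUNT(*) AS occurrences
FROM dwh.fact_bookings
GROUP BY booking_id
HAVING COUNT(*) > 1;

-- 3. Mandatory columns must not be null (a passing run returns zero)
SELECT COUNT(*) AS null_customer_keys
FROM dwh.fact_bookings
WHERE customer_key IS NULL;
```

In an automated framework these queries would be executed by the CI/CD pipeline after each load, with any non-empty or non-zero result failing the build and raising a defect.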

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

Naukri logo

Consultant Data Engineer
Tools & Technology: Snowflake, SnowSQL, AWS, dbt, Snowpark, Airflow, DWH, Unix, SQL, shell scripting, PySpark, Git, Visual Studio, ServiceNow.

Duties and Responsibilities
- Act as Consultant Data Engineer.
- Understand business requirements and design, develop, and maintain scalable, automated data pipelines and ETL processes to ensure efficient data processing and storage.
- Create a robust, extensible architecture to meet client/business requirements using Snowflake objects integrated with AWS services and dbt.
- Work on different types of data ingestion pipelines as per requirements.
- Develop in dbt (Data Build Tool) for data transformation as per requirements.
- Work on integration of multiple AWS services with Snowflake.
- Work with integration of structured and semi-structured data sets.
- Work on performance tuning and cost optimization.
- Implement CDC or SCD Type 2 (a Snowflake stream and MERGE sketch follows this posting).
- Design and build solutions for near real-time stream as well as batch processing.
- Implement best practices for data management, data quality, and data governance.
- Take responsibility for data collection, data cleaning, and pre-processing using Snowflake and dbt.
- Investigate production issues and fine-tune our data pipelines.
- Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery.
- Coordinate with and support software developers, database architects, data analysts, and data scientists on data initiatives.
- Orchestrate pipelines using Airflow.
- Suggest improvements to processes, products, and services.
- Interact with users, management, and technical personnel to clarify business issues, identify problems, and suggest changes/solutions to business and developers.
- Create technical documentation on Confluence to aid knowledge sharing.

Associate Data Engineer
Tools & Technology: Snowflake, dbt, AWS, Airflow, ETL, data warehouse, shell scripting, SQL, Git, Confluence, Python.

Duties and Responsibilities
- Act as offshore Data Engineer for enhancement and testing.
- Design and build solutions for near real-time stream processing as well as batch processing.
- Develop Snowflake objects using their unique features.
- Implement data integration and transformation workflows using dbt.
- Integrate AWS services with Snowflake.
- Participate in implementation planning and respond to production issues.
- Take responsibility for data collection, data cleaning, and pre-processing.
- Develop UDFs, Snowflake procedures, streams, and tasks.
- Troubleshoot customer data issues, perform manual loads if any data is missed, and check and handle data duplication with RCA.
- Investigate production job failures, including investigation through to root cause (RCA).
- Develop ETL processes and data integration solutions.
- Understand the business needs of the client and provide technical solutions.
- Monitor the overall functioning of processes, identify improvement areas, and implement them with the help of scripting.
- Handle major outages effectively, along with effective communication to business, users, and development partners.
- Define and create run book entries and knowledge articles based on incidents experienced in production.

Associate Engineer
Tools and Technology: UNIX, Oracle, shell scripting, ETL, Hadoop, Spark, Sqoop, Hive, Control-M, Techtia, SQL, Jira, HDFS, Snowflake, dbt, AWS.

Duties and Responsibilities
- Worked as a Senior Production/Application Support Engineer.
- Worked as a production support member for loading, processing, and reporting of files and generating reports.
- Monitored multiple batches, jobs, and processes; analysed issues related to job failures; and handled FTP failures and connectivity issues behind batch/job failures.
- Performed data analysis on files, generated files, and sent files to the destination server depending on the functionality of the job.
- Created shell scripts to automate daily tasks or tasks requested by the Service Owner.
- Involved in tuning jobs to improve performance and performing daily checks.
- Coordinated with Middleware, DWH, CRM, and most other teams in case of any issue or CRQ.
- Monitored the overall functioning of processes, identified improvement areas, and implemented them with the help of scripting.
- Involved in tuning jobs to improve performance and raising PBIs after approval from the Service Owner.
- Involved in performance improvement and automation activities to decrease manual workload.
- Performed data ingestion from RDBMS systems to HDFS/Hive through Sqoop.
- Understand customer problems and provide appropriate technical solutions.
- Handle major outages effectively, along with effective communication to business, users, and development partners.
- Coordinate with the client and on-site persons and join bridge calls for any issues.
- Handle daily issues based on application and job performance.
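The CDC and SCD Type 2 responsibility called out above can be sketched with a Snowflake stream feeding a MERGE. This is a minimal, illustrative example; the table and column names (customer_src, dim_customer, is_current) are assumptions, and a complete SCD Type 2 load would also re-insert the new current version of each changed key in a follow-up statement.

```sql
-- Capture changes on the source table
CREATE OR REPLACE STREAM customer_src_stream ON TABLE customer_src;

-- Expire the current dimension row when a changed version arrives; insert brand-new keys
MERGE INTO dim_customer AS tgt
USING (
    SELECT customer_id, name, address
    FROM customer_src_stream
    WHERE METADATA$ACTION = 'INSERT'            -- new or changed source rows
) AS src
ON tgt.customer_id = src.customer_id
   AND tgt.is_current = TRUE
WHEN MATCHED AND (tgt.name <> src.name OR tgt.address <> src.address) THEN
    UPDATE SET tgt.is_current = FALSE,          -- close out the old version
               tgt.valid_to   = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, address, is_current, valid_from, valid_to)
    VALUES (src.customer_id, src.name, src.address, TRUE, CURRENT_TIMESTAMP(), NULL);
-- A follow-up INSERT (not shown) adds the new current row for the keys expired above.
```

Wrapping the MERGE in a scheduled task keeps the stream consumed and the dimension continuously up to date.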

Posted 2 weeks ago

Apply

5.0 - 8.0 years

30 - 40 Lacs

Pune

Work from Office

Naukri logo

We, at Jet2 (the UK’s third largest airline and largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. We are recruiting for an experienced and passionate Lead Test Engineer to join our growing Data Engineering program, which is focused on making data assets from across our business available in the cloud for advanced analytics, data science, and automated decision making. These data assets drive mission-critical processes across our commercial and operational teams, contributing towards profitable growth, competitive advantage, and exceptional customer satisfaction. This is an exciting opportunity to help us become a leading data-driven organisation through the effective use of data.

The successful candidate will play a pivotal role in the success of our Data Engineering & Analytics program, working closely with colleagues across our business and technical teams. You will be testing a wide variety of data sources and will ensure data is cleansed, well structured, and appropriately transformed to support a wide variety of use cases, ranging from self-service analytics through to data warehousing and data science. We are looking for a thought-leader in the space of Cloud Data Engineering, to not only enable us to deliver solutions efficiently but also evolve our Data Engineering processes and practices, ensuring we continue to align with industry best practice and take advantage of new developments in the Data Engineering landscape.

Key Responsibilities:
Our Test Lead's priority is to plan, monitor, and direct testing items, activities, and tasks to deliver high-quality, well-modelled, clean, and trustworthy data assets for use across the business.
- Champion our data team's best practices and industry standards, drawing on solid experience testing data products and applications.
- Lead test delivery across several multi-disciplinary (on-prem, cloud, data platforms, warehouse, Tableau dashboards, etc.) data delivery teams.
- Assess the appropriate level of testing for each piece of work, and support the planning, execution, and documentation of testing carried out.
- Measure and improve our data test practices and processes.
- Help develop Agile data testing methodologies and automated data testing processes.
- Provide direct line management for Test Engineers within the teams you are supporting, ensuring that all your direct reports have learning and development plans in place and that these are continually reviewed.
- Lead new initiatives in test automation and strategy/framework as required, focusing specifically on maximizing reusability for regression.
- Build test plans, test scenarios, and test data to support development projects, project requirements, and design documents.
- Develop and oversee onboarding QA training for new hires and an advanced training program for experienced resources.

Roles and Responsibilities
Technical Skills & Knowledge:
The Test Lead will ideally have experience working in a fast-paced Agile delivery environment with a focus on rapid delivery of business value through iterative development and testing approaches. You will also have the following skills.
- Advanced SQL knowledge, with experience using a wide variety of source systems including Microsoft SQL Server.
- Prior test leadership and line management experience, or experience in a senior test role with significant mentoring experience.
- Testing certification such as ISTQB.
- Experience of working in an Agile delivery team using automated build, deployment, and testing (CI/CD, DevOps, DataOps).
- Automate tests for data validation, schema verification, data transformation, and data quality checks across different stages of the pipeline (a dbt singular test sketch follows this posting).
- Write scripts and programs in languages such as Python, Java, or Scala to automate test execution.
- Use advanced features in dbt to write generic and singular test cases.
- Experience in developing and implementing best practices across the QA team regarding data validation, functional testing, and regression testing.
- Experience in data testing on one or more of the following technology platforms (cloud data platform experience preferred): Microsoft SQL Server (T-SQL, SSIS, SSRS, SSAS); Google Cloud Platform (GCS, Google Cloud Composer, GKE, Cloud Functions, BigQuery); Snowflake Cloud Data Platform; dbt data transformation framework; Tableau data visualisation platform.
- Experience in automated data testing or the creation of automated data testing frameworks would be highly beneficial across any of the above technology platforms.

Relevant Experience:
9+ years' experience in a data-related role, with recent experience in the field of cloud data engineering. Experience of cloud data analytics and data warehousing would be an advantage.

Soft Skills:
- Excellent communication skills, verbal and written.
- Strong team and stakeholder management skills; you should have the ability to build strong relationships with people across a wide variety of teams and backgrounds.
- Experience working with people across different geographical regions, particularly the UK and US.
- Excellent planning and time management skills to ensure projects are delivered on time and to requirement, with issues being escalated and addressed effectively.
- Exceptional presentation skills; you will be responsible for documenting solution designs for presentation and sign-off as well as delivering internal presentations to support the continual development of your team.

Leadership & Organisation Skills:
- Lead a team of test engineers working across multiple projects (Senior, Mid, Junior, and Graduate level Test Engineers).
- Set individual development goals, monitor and manage performance, provide timely feedback, and take responsibility for the professional development of team members.
- Collaborate with the wider data team, including Solution Architects, Specialists, Enterprise Architects, Data Scientists, Data Visualisation Specialists, Analytics Engineers, and Test Engineers.

Qualification & Certification:
B.E./B.Tech/M.Tech in IT or Computer Science from a reputed institute (preferred), or a master's degree in quantitative subjects, e.g. Mathematics, Statistics, and Economics.
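To illustrate the dbt singular tests mentioned above, here is a minimal sketch. A singular test is simply a SQL file under tests/ that fails when it returns any rows; the model names (fct_bookings, dim_customer) are illustrative assumptions.

```sql
-- tests/assert_bookings_have_valid_customers.sql
-- Fails the dbt test run if any booking references a customer that does not exist.
select
    b.booking_id
from {{ ref('fct_bookings') }} as b
left join {{ ref('dim_customer') }} as c
    on b.customer_key = c.customer_key
where c.customer_key is null    -- orphaned bookings break referential integrity
```

Generic tests (not_null, unique, relationships) declared in a schema.yml file cover the same ground declaratively; singular tests like this one handle checks that do not fit the built-in patterns.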

Posted 2 weeks ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or medallion architecture) to enable reliable and reusable data assets (a staging-model sketch follows this posting).
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.

Required Qualifications
- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt.
- Experience building and deploying dbt models in a production environment.
- Expert-level SQL and a strong understanding of ELT principles.
- Strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
- Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc.
- Experience with Git, CI/CD, and deployment workflows in a team setting.
- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.

Core Competencies
- Data Engineering and ELT Development: Building robust and modular data pipelines using dbt. Writing efficient SQL for data transformation and performance tuning in Snowflake. Managing environments, sources, and deployment pipelines in dbt.
- Cloud Data Platform Expertise: Strong proficiency with Snowflake, including warehouse sizing, query profiling, data loading, and performance optimization. Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.

Technical Toolset
- Languages & Frameworks: Python for data transformation, notebook development, and automation. SQL: strong grasp of SQL for querying and performance tuning.

Best Practices and Standards
- Knowledge of modern data architecture concepts, including layered architecture (e.g., staging → intermediate → marts, medallion architecture).
- Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
Security & Governance
- Access and Permissions: Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling. Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.

Deployment & Monitoring
- DevOps and Automation: Version control using Git and experience with CI/CD practices in a data context. Monitoring and logging of pipeline executions, with alerting on failures.

Soft Skills
- Communication & Collaboration: Ability to present solutions and handle client demos/discussions. Work closely with onshore and offshore teams of analysts, data scientists, and architects. Ability to document pipelines and transformations clearly. Basic Agile/Scrum familiarity, working in sprints and logging tasks. Comfort with ambiguity, competing priorities, and fast-changing client environments.

Education
- Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
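For the layered architecture described in this posting, the staging layer is usually a thin dbt model that renames and types raw columns before intermediate and mart models build on it. Below is a minimal sketch; the source name ('erp', 'orders') and columns are illustrative assumptions.

```sql
-- models/staging/stg_orders.sql
-- Staging-layer model: light cleanup only, so downstream marts stay reusable.
{{ config(materialized='view') }}

with source as (

    select * from {{ source('erp', 'orders') }}

),

renamed as (

    select
        order_id,
        cast(order_date as date)       as order_date,
        upper(trim(currency_code))     as currency_code,
        amount::number(18, 2)          as order_amount
    from source

)

select * from renamed
```

A companion schema.yml would typically declare unique and not_null tests on order_id, giving the dbt test suite something to validate on every deployment.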

Posted 2 weeks ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Hyderabad

Work from Office

Naukri logo

Key Responsibilities
- Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
- Design layered data models (e.g., staging, intermediate, and mart layers / medallion architecture) aligned with dbt best practices.
- Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
- Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring (a cost/performance sketch follows this posting).
- Apply advanced dbt capabilities including macros, packages, custom tests, sources, exposures, and documentation using dbt docs.
- Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
- Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
- Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
- Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
- Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.

Required Qualifications
- 5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.

Technical Skills
- Cloud Data Warehouse & Transformation Stack: Expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management. Experience in dbt development: modular model design, macros, tests, documentation, and version control using Git.
- Orchestration and Integration: Proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory. Comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
- Data Modelling and Architecture: Dimensional modelling (star/snowflake schemas) and slowly changing dimensions. Knowledge of modern data warehousing principles. Experience implementing medallion architecture (Bronze/Silver/Gold layers). Experience working with Parquet, JSON, CSV, or other data formats.
- Programming Languages: Python for data transformation, notebook development, and automation. SQL: strong grasp of SQL for querying and performance tuning. Jinja (nice to have): exposure to Jinja for advanced dbt development.
- Data Engineering & Analytical Skills: ETL/ELT pipeline design and optimization. Exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have). Exposure to data quality and validation frameworks.
- Security & Governance: Experience implementing data quality checks using dbt tests. Data encryption, secure key management, and security best practices for Snowflake and dbt.

Soft Skills & Leadership
- Ability to thrive in client-facing roles with competing/changing priorities and fast-paced delivery cycles.
- Stakeholder Communication: Collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
- Project Ownership: End-to-end delivery including design, implementation, and monitoring.
- Mentorship: Guide junior engineers, establish best practices, and build new skills in the team.
- Agile Practices: Work in sprints and participate in scrum ceremonies and story estimation.

Education
- Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
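The cost and performance levers mentioned in the responsibilities above map to a handful of Snowflake statements. The sketch below is illustrative; warehouse names, sizes, and clustering columns are assumptions that depend entirely on the workload.

```sql
-- Right-size a warehouse and stop paying for idle time
CREATE OR REPLACE WAREHOUSE transform_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND   = 60          -- suspend after 60 seconds of inactivity
  AUTO_RESUME    = TRUE;

-- Cluster a large table on its most common filter columns to improve pruning
ALTER TABLE sales.fct_orders CLUSTER BY (order_date, region);

-- Profile the most expensive recent queries on this warehouse
SELECT query_id,
       total_elapsed_time / 1000 AS elapsed_seconds,
       bytes_scanned
FROM snowflake.account_usage.query_history
WHERE warehouse_name = 'TRANSFORM_WH'
ORDER BY total_elapsed_time DESC
LIMIT 20;
```

Credit monitoring would additionally use resource monitors and the warehouse metering views, with the findings feeding back into warehouse sizing and model materialization choices.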

Posted 2 weeks ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:
  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:
  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium) (see the worked SQL example after this list)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
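Several of the questions above (virtual warehouses, Time Travel, data sharing) are easiest to answer with a short worked example. The sketch below is illustrative only; all object and account names are hypothetical.

```sql
-- Virtual warehouse: an independent compute cluster that can be resized without moving data
CREATE WAREHOUSE analytics_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60;

-- Time Travel: query a table as it looked 30 minutes ago (1800 seconds)
SELECT COUNT(*) FROM orders AT (OFFSET => -1800);

-- Secure data sharing: expose a table to a consumer account without copying the data
CREATE SHARE orders_share;
GRANT USAGE ON DATABASE sales_db TO SHARE orders_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE orders_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE orders_share;
ALTER SHARE orders_share ADD ACCOUNTS = consumer_org.consumer_account;
```

Being able to explain why each statement works the way it does (separation of storage and compute, retained table versions, metadata-only sharing) is usually more important in an interview than memorising the exact syntax.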

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
