
1123 Snowflake Jobs - Page 19

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Minimum 6 years of relevant IT experience.
- Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
- Proficiency in dbt (Data Build Tool) for data transformation and modeling.
- Experience with ETL/ELT processes and integrating data from multiple sources.
- Experience designing Tableau dashboards, data visualizations, and reports.
- Familiarity with data warehousing concepts and best practices.
- Strong problem-solving skills and the ability to work in cross-functional teams.
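The dbt-style transformation work this posting describes boils down to repeatable staging logic: deduplicate on a key, normalize values, derive columns. A minimal plain-Python sketch of that pattern (all table, column, and row values below are invented for illustration, not from the posting):

```python
# Hypothetical sketch of the kind of staging transformation a dbt model
# performs over raw warehouse rows: keep the latest row per key (mirroring
# a QUALIFY ROW_NUMBER() = 1 pattern), normalize text, derive a flag.

def stage_orders(raw_rows):
    """Keep the latest row per order_id, normalize status, derive a flag."""
    latest = {}
    for row in raw_rows:
        key = row["order_id"]
        # Later loaded_at wins, so reloads are idempotent.
        if key not in latest or row["loaded_at"] > latest[key]["loaded_at"]:
            latest[key] = row
    return [
        {
            "order_id": r["order_id"],
            "status": r["status"].strip().lower(),
            "is_open": r["status"].strip().lower() not in {"shipped", "cancelled"},
        }
        for r in sorted(latest.values(), key=lambda r: r["order_id"])
    ]

raw = [
    {"order_id": 1, "status": " Shipped ", "loaded_at": 1},
    {"order_id": 1, "status": "shipped", "loaded_at": 2},
    {"order_id": 2, "status": "PENDING", "loaded_at": 1},
]
staged = stage_orders(raw)
```

In a real dbt project the same logic would live in a SQL model; the point is that staging transformations should be deterministic and re-runnable.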

Posted 2 weeks ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies aligned with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 4+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Strong SQL skills are a must.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
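The ingestion-and-loading responsibility described above usually centers on idempotent upserts, which in Snowflake is typically a MERGE statement. A hedged sketch that only assembles the SQL text (the analytics.orders / staging.orders names and columns are hypothetical):

```python
# Minimal sketch of the idempotent-load pattern common in Snowflake work:
# build a MERGE that upserts staged rows into a target table.
# All table and column names here are invented for illustration.

def build_merge_sql(target, staging, key, columns):
    """Return a MERGE statement upserting `staging` into `target` on `key`."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge_sql("analytics.orders", "staging.orders", "order_id",
                      ["status", "amount"])
```

Because MERGE is keyed on the business identifier, re-running the same load does not duplicate rows, which is why it is preferred over plain INSERT for recurring batches.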

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Job Summary: We are seeking an experienced Data Engineer with expertise in Snowflake and PL/SQL to design, develop, and optimize scalable data solutions. The ideal candidate will be responsible for building robust data pipelines, managing integrations, and ensuring efficient data processing within the Snowflake environment. This role requires a strong background in SQL, data modeling, and ETL processes, along with the ability to troubleshoot performance issues and collaborate with cross-functional teams.

Responsibilities:
- Design, develop, and maintain data pipelines in Snowflake to support business analytics and reporting.
- Write optimized PL/SQL queries, stored procedures, and scripts for efficient data processing and transformation.
- Integrate and manage data from various structured and unstructured sources into the Snowflake data platform.
- Optimize Snowflake performance by tuning queries, managing workloads, and implementing best practices.
- Collaborate with data architects, analysts, and business teams to develop scalable, high-performing data solutions.
- Ensure data security, integrity, and governance while handling large-scale datasets.
- Automate and streamline ETL/ELT workflows for improved efficiency and data consistency.
- Monitor, troubleshoot, and resolve data quality issues, performance bottlenecks, and system failures.
- Stay current on Snowflake advancements, best practices, and industry trends to enhance data engineering capabilities.

Required Skills:
- Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
- Hands-on expertise in PL/SQL, including writing and optimizing complex queries, stored procedures, and functions.
- Proven ability to work with large datasets, data warehousing concepts, and cloud-based data management.
- Proficiency in SQL, data modeling, and database performance tuning.
- Experience with ETL/ELT processes and integrating data from multiple sources.
- Familiarity with cloud platforms such as AWS, Azure, or GCP is an added advantage.
- Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced) are a plus.
- Strong analytical skills, problem-solving abilities, and attention to detail.
- Excellent communication skills and the ability to work effectively in a collaborative environment.
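The ETL/ELT workflow automation mentioned above commonly relies on batching, so each load statement stays a manageable size. A minimal, generic sketch (the batch size and row source are arbitrary assumptions):

```python
# Toy sketch of the batching pattern used when loading large datasets in
# chunks, so each INSERT or file upload stays bounded in size.

def batched(rows, size):
    """Yield successive lists of at most `size` rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

chunks = list(batched(range(10), 4))
```

Inside a stored procedure the same idea appears as looped, bounded commits; the goal in both cases is predictable memory use and restartable loads.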

Posted 2 weeks ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Job Title: ETL Tester

Job Responsibilities:
- Design and execute test cases for ETL processes to validate data accuracy and integrity.
- Collaborate with data engineers and developers to understand ETL workflows and data transformations.
- Use Tableau to create visualizations and dashboards that support data analysis and reporting.
- Work with Snowflake to test and validate data stored in the cloud data warehouse.
- Identify, document, and track defects and issues in the ETL process.
- Perform data profiling and data quality assessments.
- Create and maintain test documentation, including test plans, test scripts, and test results.
- Exposure to Salesforce and proficiency in developing SQL queries.

The ideal candidate will have a strong background in ETL processes and data validation, with experience in Tableau and Snowflake. You will be responsible for ensuring the quality and accuracy of data as it moves through the ETL pipeline.
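The data-validation duty at the heart of this role is source-to-target reconciliation. One minimal sketch, assuming a simple row-count and column-sum comparison over in-memory rows (the rows and the `amount` column are made up):

```python
# Toy sketch of the source-vs-target reconciliation an ETL tester automates:
# compare row counts and a column checksum between the system of record and
# the warehouse copy. Real checks would run as SQL against both systems.

def reconcile(source_rows, target_rows, amount_key="amount"):
    """Return a dict of mismatches; an empty dict means the load reconciles."""
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    src_sum = sum(r[amount_key] for r in source_rows)
    tgt_sum = sum(r[amount_key] for r in target_rows)
    if src_sum != tgt_sum:
        issues["amount_sum"] = (src_sum, tgt_sum)
    return issues

src = [{"amount": 10}, {"amount": 20}]
tgt = [{"amount": 10}, {"amount": 25}]
result = reconcile(src, tgt)
```

Returning a structured mismatch report, rather than a bare pass/fail, makes the defect easy to log and track.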

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies aligned with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 6+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Must have working exposure to Airflow.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
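Airflow, required above, models a pipeline as a DAG of dependent tasks. The ordering idea can be sketched with the standard library's graphlib; the task names are hypothetical, and a real DAG would of course be declared with Airflow operators:

```python
# Toy illustration of DAG-based scheduling: each task lists its
# predecessors, and a topological sort gives a valid execution order.
from graphlib import TopologicalSorter

dag = {
    "load_raw": set(),
    "stage_orders": {"load_raw"},
    "build_marts": {"stage_orders"},
    "refresh_dashboards": {"build_marts"},
}
order = list(TopologicalSorter(dag).static_order())
```

Because this example is a simple chain, only one execution order is valid; with branching dependencies Airflow additionally runs independent tasks in parallel.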

Posted 2 weeks ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office


JD: QA resource with experience in Salesforce Marketing Cloud (SFMC) and some programming experience, primarily for testing needs and potentially building some email templates. Testing will cover tables and data in Snowflake as well as SFMC.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

22 - 35 Lacs

Gurugram, Bengaluru, Delhi / NCR

Work from Office


Requirement: Data Architect & Business Intelligence
Experience: 9+ Years
Location: Gurgaon (Remote)
Preferred: Immediate Joiners

Job Summary: We are looking for a Data Architect & Business Intelligence Expert responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Build data integrations and data models to support the analytical needs of this project.
- Translate business requirements into technical requirements as needed.
- Design and develop automated scripts for data pipelines to process and transform data per the requirements, and monitor them.
- Produce artifacts such as data flow diagrams, designs, and data models, along with Git code, as deliverables.
- Use tools and programming languages such as SQL, Python, Snowflake, Airflow, dbt, and Salesforce Data Cloud.
- Ensure data accuracy, timeliness, and reliability throughout the pipeline.
- Complete QA and data profiling to ensure data is ready for UAT per the requirements.
- Collaborate with business stakeholders and the visualization team, and support enhancements.
- Provide timely updates on sprint boards and tasks; the team lead provides timely updates on all projects.
- Project experience with version control systems and CI/CD such as Git, GitFlow, Bitbucket, Jenkins, etc.
- Participate in UAT to resolve findings and plan Go-Live/production deployment.

Milestones:
- Data integration plan into Data Cloud for structured and unstructured data/RAG needs for the Sales AI use cases.
- Design data models and semantic layer on Salesforce AI.
- Agentforce prompt integration; write Agentforce prompts and refine as needed.
- Data quality and sourcing enhancements.
- Assist decision scientists with data needs.
- Collaborate with the EA team and participate in design reviews.
- Performance tuning and optimization of data pipelines.
- Hypercare after deployment.
- Project review and knowledge transfer.
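The QA and data-profiling step called out above can be as simple as per-column null and distinct counts run before UAT sign-off. A minimal sketch over in-memory rows (the column names and values are invented):

```python
# Toy data-profiling pass: for each column, count nulls and distinct
# non-null values. In practice the same aggregates would be SQL queries.

def profile(rows):
    """Return {column: {"nulls": n, "distinct": d}} for a list of dicts."""
    columns = rows[0].keys() if rows else []
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

rows = [{"id": 1, "region": "IN"}, {"id": 2, "region": None},
        {"id": 3, "region": "IN"}]
report = profile(rows)
```

A profile like this quickly surfaces unexpected nulls or collapsed cardinality before business users ever see the data.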

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office


Requirement: Senior Business Analyst (Data Application & Integration)
Experience: 10+ Years
Location: Gurgaon (Hybrid)
Budget Max: 35 LPA
Preferred: Immediate Joiners

Job Summary: We are seeking an experienced Senior Business Analyst (Data Application & Integration) to drive key data and integration initiatives. The ideal candidate will have a strong business analysis background and a deep understanding of data applications, API integrations, and cloud-based platforms like Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Gather, document, and analyze business requirements for data application and integration projects.
- Work closely with business stakeholders to translate business needs into technical solutions.
- Design and oversee API integrations to ensure seamless data flow across platforms.
- Collaborate with cross-functional teams, including developers, data engineers, and architects.
- Define and maintain data integration strategies, ensuring high availability and security.
- Work on Salesforce, Informatica, and Snowflake to streamline data management and analytics.
- Develop use cases, process flows, and documentation to support business and technical teams.
- Ensure compliance with data governance and security best practices.
- Act as a liaison between business and technical teams, providing insights and recommendations.

Key Skills & Requirements:
- Strong expertise in business analysis methodologies and data-driven decision-making.
- Hands-on experience with API integration and data application management.
- Proficiency in Salesforce, Informatica, DBT, IICS, and Snowflake.
- Strong analytical and problem-solving skills.
- Ability to work in an Agile environment and collaborate with multi-functional teams.
- Excellent communication and stakeholder management skills.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Role & responsibilities: Data Architect / Snowflake Database Architect / Data Modeler with 15+ years of experience in AWS and Snowflake. Responsible for the overall architecture and delivery of the solution, including risk mitigation, issue resolution, and coordination with internal and external teams.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Work from Office


Career Area: Technology, Digital and Data

Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do, but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.

Job Summary: Caterpillar is seeking a skilled Senior Data Scientist to join our team to act as a champion for our Digital Finance - Disruptive Technology Team within the Global Finance Services Division. The incumbent will leverage internal data and advanced analytics to solve business problems. The preference for this role is to be based out of the Whitefield office, Bangalore, India.

What you will do: As a Senior Data Scientist, you will work in a team environment with a group of data scientists and software developers to support the development, integration, and enhancement of Caterpillar's analytics, AI, and Generative AI projects and products.
- Gain an understanding of Caterpillar's business and accounting practices, systems, and procedures.
- Develop ML/AI/GenAI models and solutions.
- Perform data gathering, data mining, and data processing on large volumes of data; create appropriate data models.
- Demonstrate a breadth of knowledge in the application of ML/AI/GenAI, statistical methods, and/or digital methods to solve business problems.
- Conduct research on data model optimization and algorithms to improve the effectiveness and accuracy of data analyses.
- Participate in 1-3 primary projects concurrently.
- Maintain a strong focus on continual learning in the analytics field.
- Report and present team results.
The Data Scientist will develop good networks within the technical community to collaborate on technical solutions, obtain needed resources and cooperation, and remove roadblocks to ensure the success of their projects.

What you will have:
- Typically a Bachelor's, Master's, or Ph.D. degree, preferably in statistics, economics, mathematics, or a similar field with quantitative coursework.
- 5+ years of professional experience, with the ability to deliver data science products with almost no technical guidance and to provide technical guidance to less experienced data science team members.
- Experience with data analysis and statistical methods such as regression, hypothesis testing, ANOVA, and statistical process control.
- Practical applications of machine learning techniques such as clustering, logistic regression, CART, random forests, SVM, or neural networks.
- Technical and problem-solving skills with evidence of continuous learning in the analytics field.
- Breadth and depth of knowledge in the application of statistical and/or digital methods to solve business problems.
- Experience with projects connecting data from various business domains and providing forecasts and insights to run the business profitably.
- Working knowledge of cloud technologies (AWS, Azure, Google Cloud, etc.).
- Working knowledge of version control/repositories such as GitHub.
- Experience operating in an Agile environment.
- Initiative to research and apply new methods and digital technologies to exceed customer expectations.
- Strong initiative, interpersonal skills, and the ability to communicate effectively.
- Comfort working in MS Office and industry-standard statistics, analytics, and data visualization packages.
- Experience mechanizing manual data collection processes using Snowflake to enhance indicative data and generate quicker, more granular insights; tables/processes could be leveraged by the Forecasting and Financial Analysis Reporting groups.
- Experience writing fast, memory-efficient code and with various types of parallel processing: threading, multi-core/forking, multiple sessions, and cluster computing.

Skills desired:
- Analytical Thinking: extensive knowledge of techniques and tools for effective analysis; ability to determine the root cause of problems and create alternative solutions that resolve these issues.
- GenAI/AI/Machine Learning: extensive experience and knowledge of the principles, technologies, and algorithms of GenAI, AI, and machine learning; ability to develop, implement, and deliver related systems, products, and services.
- Statistics: extensive knowledge of statistical tools, processes, and practices to describe results on measurable scales; ability to use statistical tools and processes to assist in making business decisions.
- Programming Languages: extensive experience applying Python (NumPy, SciPy, Pandas, etc.) to solve business challenges; ability to use tools, techniques, and platforms to write and modify programs.
- Information Processing: extensive experience processing large quantities of detailed information with high levels of accuracy.
- Databases: extensive knowledge of writing, debugging, and implementing complex queries involving multiple tables or databases.
- Presentation Skills: strong presentation and communication skills.

What you will get:
- Work-Life Harmony: earned and medical leave; flexible work arrangements; relocation assistance.
- Holistic Development: personal and professional development through Caterpillar's employee resource groups across the globe; career development opportunities with global prospects.
- Health and Wellness: medical, life, and personal accident coverage; employee mental wellness assistance program.
- Financial Wellness: employee investment plan; pay for performance (annual incentive bonus plan).

Additional Information: Caterpillar is not currently hiring individuals for this position who now or in the future require sponsorship for an employment visa; however, as a global company, Caterpillar offers many job opportunities outside of the U.S., which can be found through our employment website at www.caterpillar.com/careers

Posting Dates: January 16, 2025 - January 29, 2025

Caterpillar is an Equal Opportunity Employer (EEO). Not ready to apply? Join our Talent Community.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office


Career Area: Technology, Digital and Data

Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do, but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here; we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.

Job Summary: We are seeking a skilled and experienced Digital Product Owner to join Digital Product Management - AIMS Data - Global Finance Shared Division. The Global Finance Strategy team is looking for a skilled Digital Product Owner to collaborate with business stakeholders and engineering teams in supporting the Global Finance Data Platform. This role will focus on enabling multiple business use cases across the Global Finance organization. The preference for this role is to be based out of the Whitefield office, Bangalore, India.

What you will do (Key Responsibilities):
- Partner with cross-functional teams to drive the implementation and evolution of the Global Finance Data Platform.
- Translate business needs into clear product requirements and user stories.
- Support data analysis, reporting, and system integration efforts.
- Support User Acceptance Testing.

What you will have:
- Minimum 4+ years of experience in business and data analysis, particularly in large-scale data platform implementations and system integrations.
- 3+ years of hands-on experience with SQL, specifically in Snowflake, for data analysis and report development.
- Experience working with SAP ERP systems, or Finance/Accounting functional experience.
- Proficiency in creating UX wireframes and mockups.
- Strong understanding of Software Development Life Cycle (SDLC) methodologies, including both Waterfall and Agile.
- Experience developing business process flows and data flow diagrams.
- Experience conducting functional testing and coordinating UAT.
- Excellent written and verbal communication skills for effective stakeholder engagement and requirements gathering.
- Strong analytical thinking and problem-solving capabilities.
- Domain knowledge in Accounting and Finance.
- Proficiency in Microsoft Excel and Power BI.
- Experience with tools such as JIRA, Azure DevOps, and Confluence.

Additional Information: This position requires the selected candidate to work full-time, on a five-day-a-week schedule, in the Whitefield, Bangalore, Karnataka office. Domestic relocation is available.

Skills desired:
- Business Analysis (Working Knowledge): knowledge of business analysis and the set of tasks, techniques, and tools required to identify business needs; ability to recommend solutions that deliver value to stakeholders. Analyzes the value of a business and its functions through the value estimation of assets; applies the prerequisites to a project before starting the business analysis process; collaborates with stakeholders, development teams, and testing teams to deliver business solutions; documents the business case to justify the time and resource requirements of a project; utilizes diverse analysis tools and methodologies to group different business activities based on shared characteristics or similarities.
- Decision Making and Critical Thinking (Extensive Experience): knowledge of the decision-making process and associated tools and techniques; ability to accurately analyze situations and reach productive decisions based on informed judgment. Differentiates assumptions, perspectives, and historical frameworks; evaluates past decisions for insights to improve the decision-making process; assesses and validates decision options and predicts their potential impact; advises others in analyzing and synthesizing relevant data and assessing alternatives; uses effective decision-making approaches such as consultative, command, or consensus; ensures that assumptions and received wisdom are objectively analyzed in decisions.
- Effective Communications (Extensive Experience): understanding of effective communication concepts, tools, and techniques; ability to effectively transmit, receive, and accurately interpret ideas, information, and needs through appropriate communication behaviors. Reviews others' writing or presentations and provides feedback and coaching; adapts documents and presentations for the intended audience; demonstrates both empathy and assertiveness when communicating a need or defending a position; communicates well downward, upward, and outward; employs appropriate methods of persuasion when soliciting agreement; maintains focus on the topic at hand.
- Software Change Request Management (Extensive Experience): knowledge of software change request management; ability to manage software product change requests from customers, prospective customers, vendors, and internal staff. Describes methods for estimating costs of request fulfillment; defines responses for non-standard or unsupported change requests; contributes to the design and development of request process flows and templates; clarifies description, components affected, need, cost estimate, risk, resources, and status; manages all aspects of the change request process; researches new tools and techniques for monitoring product efficacy.
- Software Engineering (Working Knowledge): knowledge of software engineering; ability to deliver new or enhanced fee-based software products. Identifies considerations for product integration with multiple platforms and systems; works with development or delivery of a software package or component; describes phases, activities, deliverables, and processes for a specific methodology; works with structured documents for developing features, functions, plans, and schedules; describes software design practices, technologies, and considerations.
- Software Problem Management (Working Knowledge): knowledge of strategies, practices, and tools for resolving software problems; ability to manage software problems in installed software products. Documents resolution progress and provides feedback to customers; describes issues and considerations for resolving problems involving other products or vendors; works with tracking and resolving common types of problems for a product or product group; describes actions, tools, and procedures for problem reporting, solving, and resolution; cites examples of unusual problems and follows proper notification and escalation procedures.
- Software Product Business Knowledge (Extensive Experience): knowledge of and experience with the business aspects and operation of a software product; ability to manage the install base, current uses, future plans, and product vision. Participates in enhancing the sales process and expanding sales opportunities; collects, documents, and maintains product functional requirements and makes recommendations; supports and participates in major installations and customizations; maintains and disseminates information on customer use and experiences; has knowledge of all advanced business features and functions of the product; relates experiences with unusual or non-traditional uses; assesses opportunities and challenges.
- User Acceptance Testing (UAT) (Extensive Experience): knowledge of UAT activities, tasks, tools, and techniques; ability to design, implement, and evaluate acceptance tests for end users. Compares and contrasts features and benefits of major acceptance-testing frameworks; critiques user acceptance plans for appropriateness and completeness; develops approaches for acceptance testing that follow legal or contractual agreements; monitors end users in defining the testing environment and acceptance criteria, and explains the importance of being actively involved in test designs and other testing phases; applies user acceptance testing in typical software development scenarios; consults on test strategies, components, processes, plans, and approaches during the user acceptance testing process.

What you will get:
- Work-Life Harmony: earned and medical leave; relocation assistance.
- Holistic Development: personal and professional development through Caterpillar's employee resource groups across the globe; career development opportunities with global prospects.
- Health and Wellness: medical, life, and personal accident coverage; employee mental wellness assistance program.
- Financial Wellness: employee investment plan; pay for performance (annual incentive bonus plan).

Additional Information: Caterpillar is not currently hiring individuals for this position who now or in the future require sponsorship for an employment visa; however, as a global company, Caterpillar offers many job opportunities outside of the U.S., which can be found through our employment website at www.caterpillar.com/careers

Posting Dates: June 3, 2025 - June 16, 2025

Caterpillar is an Equal Opportunity Employer. Not ready to apply? Join our Talent Community.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

India, Bengaluru

Work from Office


Senior Data Engineer India, Bengaluru Get to know Okta Okta is The World s Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we re looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We re building a world where Identity belongs to you. Senior Data Engineer - Enterprise Data Platform Get to know Data Engineering Okta s Business Operations team is on a mission to accelerate Okta s scale and growth. We bring world-class business acumen and technology expertise to every interaction. We also drive cross-functional collaboration and are focused on delivering measurable business outcomes. Business Operations strives to deliver amazing technology experiences for our employees, and ensure that our offices have all the technology that is needed for the future of work. The Data Engineering team is focused on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks. The Senior Data Engineer Opportunity A Senior Data Engineer is responsible for designing, building, and maintaining scalable solutions. This role involves collaborating with data engineers, analysts, scientists and other engineers to ensure data availability, integrity, and security. 
The ideal candidate will have a strong background in cloud platforms, data warehousing, infrastructure as code, and continuous integration/continuous deployment (CI/CD) practices.

What you'll be doing:
- Design, develop, and maintain scalable data platforms using AWS, Snowflake, dbt, and Databricks.
- Use Terraform to manage infrastructure as code, ensuring consistent and reproducible environments.
- Develop and maintain CI/CD pipelines for data platform applications using GitHub and GitLab.
- Troubleshoot and resolve issues related to data infrastructure and workflows.
- Containerize applications and services using Docker to ensure portability and scalability.
- Conduct vulnerability scans and apply necessary patches to ensure the security and integrity of the data platform.
- Work with data engineers to design and implement Secure Development Lifecycle practices and security tooling (DAST, SAST, SCA, secret scanning) in automated CI/CD pipelines.
- Ensure data security and compliance with industry standards and regulations.
- Stay updated with the latest trends and technologies in data engineering and cloud platforms.

What we are looking for:
- BS in Computer Science, Engineering, or another quantitative field of study
- 5+ years in a data engineering role
- 5+ years of experience working with SQL and ETL tools such as Airflow and dbt, with relational and columnar MPP databases like Snowflake or Redshift, and hands-on experience with AWS (e.g., S3, Lambda, EMR, EC2, EKS)
- 2+ years of experience managing CI/CD infrastructure, with strong proficiency in tools like GitHub Actions, Jenkins, ArgoCD, or GitLab to streamline deployment pipelines and ensure efficient software delivery
- 2+ years of experience with Java, Python, Go, or similar backend languages
- Experience with Terraform for infrastructure as code
- Experience with Docker and containerization technologies
- Experience working with lakehouse architectures such as Databricks and file formats like Iceberg and Delta
- Experience in designing, building, and managing complex deployment pipelines

This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment.

What you can look forward to as a Full-Time Okta employee: Amazing Benefits, Making Social Impact, and Developing Talent and Fostering Connection + Community at Okta. Okta cultivates a dynamic work environment, providing the best tools, technology, and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/ . Some roles may require travel to one of our office locations for in-person onboarding.

Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/ . U.S.
Equal Opportunity Employment Information: Individuals seeking employment at this company are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, gender identity, or sexual orientation. When submitting your application above, you are being given the opportunity to provide information about your race/ethnicity, gender, and veteran status. Completion of the form is entirely voluntary. Whatever your decision, it will not be considered in the hiring process or thereafter. Any information that you do provide will be recorded and maintained in a confidential file. If you believe you belong to any of the categories of protected veterans listed below, please indicate by making the appropriate selection. As a government contractor subject to the Vietnam Era Veterans' Readjustment Assistance Act (VEVRAA), we request this information in order to measure the effectiveness of the outreach and positive recruitment efforts we undertake pursuant to VEVRAA. Classification of protected categories is as follows: A "disabled veteran" is one of the following: a veteran of the U.S. military, ground, naval or air service who is entitled to compensation (or who but for the receipt of military retired pay would be entitled to compensation) under laws administered by the Secretary of Veterans Affairs; or a person who was discharged or released from active duty because of a service-connected disability. A "recently separated veteran" means any veteran during the three-year period beginning on the date of such veteran's discharge or release from active duty in the U.S. military, ground, naval, or air service. An "active duty wartime or campaign badge veteran" means a veteran who served on active duty in the U.S.
military, ground, naval or air service during a war, or in a campaign or expedition for which a campaign badge has been authorized under the laws administered by the Department of Defense. An "Armed forces service medal veteran" means a veteran who, while serving on active duty in the U.S. military, ground, naval or air service, participated in a United States military operation for which an Armed Forces service medal was awarded pursuant to Executive Order 12985.

Pay Transparency: Okta complies with all applicable federal, state, and local pay transparency rules. For additional information about the federal requirements, click here.

Voluntary Self-Identification of Disability (Form CC-305, Page 1 of 1, OMB Control Number 1250-0005, Expires 04/30/2026)

Why are you being asked to complete this form? We are a federal contractor or subcontractor. The law requires us to provide equal employment opportunity to qualified people with disabilities. We have a goal of having at least 7% of our workers as people with disabilities. The law says we must measure our progress towards this goal. To do this, we must ask applicants and employees if they have a disability or have ever had one. People can become disabled, so we need to ask this question at least every five years. Completing this form is voluntary, and we hope that you will choose to do so. Your answer is confidential. No one who makes hiring decisions will see it. Your decision to complete the form and your answer will not harm you in any way. If you want to learn more about the law or this form, visit the U.S. Department of Labor's Office of Federal Contract Compliance Programs (OFCCP) website at www.dol.gov/agencies/ofccp.

How do you know if you have a disability? A disability is a condition that substantially limits one or more of your major life activities. If you have or have ever had such a condition, you are a person with a disability. Disabilities include, but are not limited to:
- Alcohol or other substance use disorder (not currently using drugs illegally)
- Autoimmune disorder, for example, lupus, fibromyalgia, rheumatoid arthritis, HIV/AIDS
- Blind or low vision
- Cancer (past or present)
- Cardiovascular or heart disease
- Celiac disease
- Cerebral palsy
- Deaf or serious difficulty hearing
- Diabetes
- Disfigurement, for example, disfigurement caused by burns, wounds, accidents, or congenital disorders
- Epilepsy or other seizure disorder
- Gastrointestinal disorders, for example, Crohn's Disease, irritable bowel syndrome
- Intellectual or developmental disability
- Mental health conditions, for example, depression, bipolar disorder, anxiety disorder, schizophrenia, PTSD
- Missing limbs or partially missing limbs
- Mobility impairment, benefiting from the use of a wheelchair, scooter, walker, leg brace(s) and/or other supports
- Nervous system condition, for example, migraine headaches, Parkinson's disease, multiple sclerosis (MS)
- Neurodivergence, for example, attention-deficit/hyperactivity disorder (ADHD), autism spectrum disorder, dyslexia, dyspraxia, other learning disabilities
- Partial or complete paralysis (any cause)
- Pulmonary or respiratory conditions, for example, tuberculosis, asthma, emphysema
- Short stature (dwarfism)
- Traumatic brain injury

PUBLIC BURDEN STATEMENT: According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. This survey should take about 5 minutes to complete.
Okta: The foundation for secure connections between people and technology. Okta is the leading independent provider of identity for the enterprise. The Okta Identity Cloud enables organizations to securely connect the right people to the right technologies at the right time. With over 7,000 pre-built integrations to applications and infrastructure providers, Okta customers can easily and securely use the best technologies for their business. More than 19,300 organizations, including JetBlue, Nordstrom, Slack, T-Mobile, Takeda, Teach for America, and Twilio, trust Okta to help protect the identities of their workforces and customers.
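The Senior Data Engineer posting above pairs Snowflake with CI/CD automation. One common way those two meet (an illustration, not Okta's actual tooling) is a CI job that creates an ephemeral zero-copy clone of a database so integration tests run against realistic data without copying it. A minimal sketch that only builds the SQL statements such a job would issue; the database and build names are hypothetical:

```python
# Hypothetical sketch: SQL a CI job might run to create and tear down an
# ephemeral zero-copy clone of a Snowflake database for integration tests.

def clone_statements(source_db, build_id):
    """Return (create, drop) statements for a per-build clone database."""
    clone_db = f"{source_db}_CI_{build_id}".upper()
    return (
        # A zero-copy clone shares the source's storage, so creating it is
        # cheap regardless of how large the source database is.
        f"CREATE DATABASE {clone_db} CLONE {source_db};",
        f"DROP DATABASE IF EXISTS {clone_db};",
    )

create_sql, drop_sql = clone_statements("ANALYTICS", "build42")
```

The CI pipeline would run the first statement before the test suite and the second in an always-executed cleanup step, so a failed build does not leak clones.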

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

India, Bengaluru

Work from Office


Senior Full Stack Engineer India, Bengaluru

Get to know Okta: Okta is The World's Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box; we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you.

The Data Engineering Team: Our focus is on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks.
The Full Stack Engineer Opportunity: This role is responsible for designing and developing scalable solutions for our Business Intelligence stacks in a fast-paced, Agile environment.
- Experience with frontend development and UX design
- Experience with containerization and orchestration (Docker, Kubernetes)
- Knowledge of DevOps practices and tools
- Previous experience in Agile/Scrum development methodologies

What you'll be doing:
- Building and maintaining user-facing applications, ensuring they are performant, responsive, and user-friendly.
- Collaborating with UX/UI designers to implement visually appealing and intuitive designs, and with backend developers to showcase data-intensive webpages.
- Conducting thorough testing and debugging to ensure the quality and performance of the software.
- Implementing seamless integration of microservices with the front end.
- Implementing security best practices to protect sensitive data and ensure system integrity.
- Deploying applications to production environments and managing the deployment process.
- Monitoring system performance and addressing any issues promptly to ensure a smooth user experience.
- Collaborating with cross-functional teams, including product managers, designers, and other developers, to deliver end-to-end solutions.

What you'll bring to the role:
- BS in Computer Science, Engineering, or another quantitative field of study
- 3+ years of experience with frontend development
- Strong proficiency in developing UI components using a client-side framework such as ReactJS
- Extensive experience in web fundamentals like HTML5 and CSS3
- Solid understanding of the full web technology stack: RESTful services, client-side frameworks, data persistence technologies, and security
- Expert in relational database design, implementation, queries, and reporting (DDL, SQL)
- Experience working with SQL and ETL tools such as Airflow, with relational and columnar MPP databases like Snowflake, Athena, or Redshift
- Excellent oral and written communication skills with both technical and non-technical audiences

And extra credit if you have experience in any of the following!
- Strong knowledge of cloud computing platforms like AWS, including serverless services and infrastructure as code using Terraform
- Experience with cloud infrastructure/platforms (AWS, Azure, Google Cloud Platform) and data lake development
- Experience with packaging and distributing containerized applications using Docker and Kubernetes
- Experience developing microservices

What you can look forward to as a Full-Time Okta employee: Amazing Benefits, Making Social Impact, and Developing Talent and Fostering Connection + Community at Okta. Find your place at Okta today! https://www.okta.com/company/careers/ . Some roles may require travel to one of our office locations for in-person onboarding. Okta is an Equal Opportunity Employer.
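The Senior Full Stack Engineer posting above pairs React frontends with RESTful services over MPP warehouses like Snowflake. A recurring design choice for the "data-intensive webpages" it mentions is keyset (seek) pagination, which filters on the last key seen instead of using OFFSET that the warehouse must scan past. A hedged sketch that only builds the query; the table and column names are hypothetical:

```python
# Illustrative keyset-pagination query builder. Each page request passes the
# last event_id the client saw; the next page starts strictly after it.

def page_query(last_seen_id=None, page_size=50):
    """Build a keyset-paginated SELECT; None means the first page."""
    where = f"WHERE event_id > {int(last_seen_id)} " if last_seen_id is not None else ""
    return (
        "SELECT event_id, event_name, occurred_at "
        "FROM analytics.events "
        f"{where}"
        "ORDER BY event_id "
        f"LIMIT {int(page_size)}"
    )
```

The frontend then treats the last `event_id` of each page as an opaque cursor, which also keeps results stable when new rows arrive between requests.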

Posted 2 weeks ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Job Information: Job Opening ID ZR_2335_JOB | Date Opened 01/08/2024 | Industry IT Services | Work Experience 5-8 years | Job Title Snowflake Developer | City Bangalore South | Province Karnataka | Country India | Postal Code 560066 | Number of Positions 1 | Contract duration 6 months

Experience: 5+ years. Location: WFH (should have a good internet connection).
- Snowflake knowledge (must have)
- Autonomous person
- SQL knowledge (must have)
- Data modeling (must have)
- Data warehouse concepts and DW design best practices (must have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Informatica IDMC (good to have)
- Good communication skills; team player, self-motivated, strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with the US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)

I'm interested
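The Snowflake Developer posting above lists data modeling and DW design best practices as must-haves. One canonical pattern behind those phrases is the Type 2 slowly changing dimension, which keeps history by closing the old row and opening a new current one when an attribute changes. A minimal, framework-free sketch under assumed column names (`customer_id`, `city`, `is_current`, etc. are illustrative, not from the posting):

```python
# Illustrative SCD Type 2 merge over lists of dicts (stand-ins for table rows).

def apply_scd2(current, incoming, load_date):
    """Return the dimension history after merging incoming snapshot rows."""
    incoming_by_key = {row["customer_id"]: row for row in incoming}
    result = []
    for row in current:
        new = incoming_by_key.get(row["customer_id"])
        if row["is_current"] and new and new["city"] != row["city"]:
            # Attribute changed: close out the old version.
            result.append({**row, "is_current": False, "valid_to": load_date})
        else:
            result.append(row)
    for key, new in incoming_by_key.items():
        changed = any(
            r["customer_id"] == key and r["is_current"] and r["city"] != new["city"]
            for r in current
        )
        exists = any(r["customer_id"] == key for r in current)
        if changed or not exists:
            # Open a new current version (or insert a brand-new member).
            result.append({
                "customer_id": key, "city": new["city"],
                "is_current": True, "valid_from": load_date, "valid_to": None,
            })
    return result

history = apply_scd2(
    [{"customer_id": 1, "city": "Pune", "is_current": True,
      "valid_from": "2024-01-01", "valid_to": None}],
    [{"customer_id": 1, "city": "Bengaluru"}],
    "2024-08-01",
)
```

In Snowflake itself this is typically one `MERGE` statement, but the row-level logic is the same.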

Posted 2 weeks ago

Apply

10.0 - 12.0 years

40 Lacs

Hyderabad

Work from Office


Location: Hyderabad (5 days work from office)

About the Team: At DAZN, the Analytics Engineering team sits at the core of the business, transforming hundreds of data points into actionable insights that drive strategic decisions. From content strategy and product engagement to marketing optimization and revenue analytics, we enable scalable, accurate, and accessible data solutions across the organization.

The Role: We're looking for a Lead Analytics Engineer to take ownership of our analytics data pipelines and play a critical role in designing and scaling DAZN's modern data stack. This is a hands-on technical leadership role where you'll shape robust data models using dbt and Snowflake, orchestrate workflows via Airflow, and ensure high-quality, trusted data is delivered for analytical and reporting needs.

Key Responsibilities:
- Lead the development and governance of semantic data models to support consistent, reusable metrics.
- Architect scalable data transformations on Snowflake using SQL and dbt, applying data warehousing best practices.
- Manage and enhance pipeline orchestration with Airflow, ensuring timely and reliable data processing.
- Collaborate closely with teams across Product, Finance, Marketing, and Tech to translate business needs into technical data models.
- Establish and maintain best practices around version control, testing, and CI/CD for analytics workflows.
- Mentor junior engineers and promote a culture of technical excellence and peer learning.
- Champion data quality, documentation, and observability across the analytics stack.

What You'll Need to Have:
- 10+ years of experience in data or analytics engineering, with 2+ years in a leadership or mentoring role.
- Strong hands-on experience with SQL, dbt, and Snowflake.
- Experience with cloud platforms (AWS, GCP, or Azure).
- Proven expertise in pipeline orchestration tools such as Apache Airflow, Prefect, or Luigi.
- Deep understanding of dimensional modeling, ELT patterns, and data governance.
- Ability to navigate both technical deep dives and high-level stakeholder collaboration.
- Excellent communication skills with both technical and non-technical teams.

Nice to Have:
- Background in media, OTT, or sports tech domains.
- Familiarity with BI tools like Looker, Power BI, or Tableau.
- Exposure to data testing frameworks such as dbt tests or Great Expectations.
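The dbt-plus-Airflow stack in the posting above boils down to one idea: models declare their upstream refs, and the tool derives a safe execution order from that dependency graph. A small sketch using Python's stdlib `graphlib`; the model names are hypothetical, not DAZN's actual project:

```python
# Sketch of dbt/Airflow-style run ordering: each model maps to the models it
# refs (its predecessors), and a topological sort yields a valid run order.
from graphlib import TopologicalSorter

model_refs = {
    "stg_subscriptions": [],
    "stg_streams": [],
    "fct_engagement": ["stg_subscriptions", "stg_streams"],
    "metric_daily_watch_time": ["fct_engagement"],
}

# static_order() guarantees every model appears after all of its refs.
run_order = list(TopologicalSorter(model_refs).static_order())
```

Models with no dependency between them (the two staging models here) can run in parallel, which is exactly what dbt threads or Airflow task concurrency exploit.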

Posted 2 weeks ago

Apply

6.0 - 10.0 years

2 - 5 Lacs

Chennai

Work from Office


Job Information: Job Opening ID ZR_1999_JOB | Date Opened 17/06/2023 | Industry Technology | Work Experience 6-10 years | Job Title ETL Tester | City Chennai | Province Tamil Nadu | Country India | Postal Code 600001 | Number of Positions 1

- Create test case documents/plans for testing the data pipelines.
- Check the mapping for the fields that support data staging and data marts, and the data type constraints of the fields present in Snowflake.
- Verify non-null fields are populated.
- Verify business requirements and confirm the correct logic is implemented in the transformation layer of the ETL process.
- Verify stored procedure calculations and data mappings.
- Verify data transformations are correct based on the business rules.
- Verify successful execution of data loading workflows.

I'm interested

Posted 2 weeks ago

Apply

6.0 - 10.0 years

1 - 5 Lacs

Pune

Work from Office


Job Information: Job Opening ID ZR_1661_JOB | Date Opened 17/12/2022 | Industry Technology | Work Experience 6-10 years | Job Title Informatica TDM Developer | City Pune | Province Maharashtra | Country India | Postal Code 411001 | Number of Positions 4 | Location: Pune, Bangalore, Hyderabad

Informatica TDM:
1. Data discovery, data subsetting, and data masking
2. Data generation
3. Complex masking and generation rule creation
4. Performance tuning for Informatica mappings
5. Debugging with Informatica PowerCenter
6. Data migration

Skills and Knowledge:
1. Informatica TDM development experience in data masking, discovery, data subsetting, and data generation is a must
2. Should have experience working with flat files, MS SQL, Oracle, and Snowflake
3. Debugging using Informatica PowerCenter
4. Experience in Tableau will be an added advantage
5. Should have basic knowledge of IICS
6. Must-have and good-to-have skills: Informatica TDM, SQL, Informatica PowerCenter, GDPR

I'm interested

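The data masking work above hinges on one property that GDPR-driven masking rules typically need: determinism, so the same real value always masks to the same substitute and joins across masked tables still line up. This is not Informatica TDM code, just a generic sketch of the idea; the salt and output format are made up:

```python
# Generic deterministic-masking sketch (not Informatica TDM itself): the same
# input always yields the same masked value, preserving referential integrity.
import hashlib

SALT = "demo-salt"  # in real use this would be a managed secret, not a literal

def mask_email(value):
    """Map an email to a stable pseudonymous address."""
    digest = hashlib.sha256((SALT + value.lower()).encode()).hexdigest()[:12]
    return f"user_{digest}@masked.example"
```

Because the function is keyed on a salt, masked values cannot be reversed by hashing guessed inputs without it, yet every table that masks the same column stays joinable.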

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Job Information: Job Opening ID ZR_2334_JOB | Date Opened 01/08/2024 | Industry IT Services | Work Experience 5-8 years | Job Title Informatica ETL Developer | City Bangalore South | Province Karnataka | Country India | Postal Code 560066 | Number of Positions 1 | Contract duration 6 months

Experience: 5+ years. Location: WFH (should have a good internet connection).
Informatica ETL role:
- Informatica IDMC (must have)
- SQL knowledge (must have)
- Data warehouse concepts and ETL design best practices (must have)
- Data modeling (must have)
- Snowflake knowledge (good to have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Good communication skills; team player, self-motivated, strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with the US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)

I'm interested
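Among the "ETL design best practices" the posting above asks for, the incremental-load (high-watermark) pattern is one of the most common: each run extracts only rows newer than the last successfully loaded timestamp, then advances the watermark. A tool-agnostic sketch (the row shape and timestamps are illustrative, not from the posting):

```python
# Sketch of the high-watermark incremental-load pattern.

def incremental_extract(rows, last_watermark):
    """Return (rows newer than the watermark, the new watermark)."""
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    # If nothing is new, the watermark must not move backwards.
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": "2024-05-01T10:00"},
    {"id": 2, "updated_at": "2024-05-02T09:00"},
    {"id": 3, "updated_at": "2024-05-03T08:00"},
]
fresh, watermark = incremental_extract(source, "2024-05-01T23:59")
```

The crucial detail is persisting the watermark only after the load commits; otherwise a failed run silently skips rows on the retry.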

Posted 2 weeks ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Job Information: Job Opening ID ZR_2393_JOB | Date Opened 09/11/2024 | Industry IT Services | Work Experience 6-10 years | Job Title Snowflake Engineer - Database Administration | City Bangalore South | Province Karnataka | Country India | Postal Code 560066 | Number of Positions 1 | Locations: Pune, Bangalore, Hyderabad, Indore | Contract duration 6 months

Responsibilities:
- Must have experience working in Snowflake administration/development on data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts: setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security, data access controls, and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob Storage.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and databases.
- Job Conductor, scheduler, and monitoring.
- GIT repository; creating users and roles and providing access to them.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.

I'm interested
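Two of the Snowflake admin features named in the posting above, resource monitors (cost guardrails on warehouse credits) and time travel (querying a table as it was in the past), are both plain SQL. A hedged sketch that only builds the statements; the monitor name, quota, and table are hypothetical, and the syntax follows Snowflake's documented forms:

```python
# Illustrative statement builders for two Snowflake admin features.

def resource_monitor_sql(name, credit_quota):
    """Monitor that notifies at 90% of the credit quota and suspends at 100%."""
    return (
        f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {int(credit_quota)} "
        "TRIGGERS ON 90 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND;"
    )

def time_travel_sql(table, seconds_ago):
    # AT (OFFSET => ...) takes a negative offset in seconds relative to now.
    return f"SELECT * FROM {table} AT (OFFSET => -{int(seconds_ago)});"
```

The time-travel form is what makes "undo" cheap for an admin: a bad DML run can be inspected (or the table cloned) as of a point just before the mistake, within the retention period.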

Posted 2 weeks ago


7.0 - 9.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Job Information
Job Opening ID: ZR_2162_JOB
Date Opened: 15/03/2024
Industry: Technology
Work Experience: 7-9 years
Job Title: Sr Data Engineer
City: Bangalore
Province: Karnataka
Country: India
Postal Code: 560004
Number of Positions: 5

Mandatory Skills: Microsoft Azure, Hadoop, Spark, Databricks, Airflow, Kafka, PySpark

Requirements:
- Experience working with distributed technology tools for developing batch and streaming pipelines using SQL, Spark, Python, Airflow, Scala, and Kafka.
- Experience in cloud computing, e.g., AWS, GCP, Azure.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Strong skills building positive relationships across Product and Engineering.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.
- Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components).
- Experience working in Agile and Scrum development processes.
- Experience in EMR/EC2, Databricks, etc.
- Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.
- Experience architecting data products on streaming, serverless, and microservices architectures and platforms.

Posted 2 weeks ago


6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Job Information
Job Opening ID: ZR_2384_JOB
Date Opened: 23/10/2024
Industry: IT Services
Work Experience: 6-10 years
Job Title: Snowflake DBA
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Contract duration: 6 months
Locations: Pune, Bangalore, Hyderabad, Indore

Responsibilities:
- Must have experience working as a Snowflake admin/developer in data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of a Snowflake cloud data warehouse, as well as end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and understanding how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to a Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security, data access controls, and their design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob Storage.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolutions to an extensive range of complicated data-pipeline-related problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and database.
- Job conductor, scheduler, and monitoring.
- Git repository: creating users and roles and granting them access.
- Agile methodology and 24/7 admin and platform support.
- Effort estimation based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.

Posted 2 weeks ago


5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Job Information
Job Opening ID: ZR_1835_JOB
Date Opened: 03/04/2023
Industry: Technology
Work Experience: 5-8 years
Job Title: SQL Database Developer
City: Hyderabad
Province: Telangana
Country: India
Postal Code: 500081
Number of Positions: 1

Requirements:
- Bachelor's degree plus at least 5-7 years of experience, with a minimum of 3+ years in SQL development.
- Strong working knowledge of advanced SQL capabilities such as analytics and windowing functions.
- 3+ years of working knowledge of some RDBMS database is a must.
- Exposure to shell scripts for invoking SQL calls.
- Exposure to ETL tools is good to have.
- Working knowledge of Snowflake is good to have.

Location: Hyderabad, Pune, Bangalore

Posted 2 weeks ago


5.0 - 8.0 years

2 - 5 Lacs

Chennai

Work from Office


Job Information
Job Opening ID: ZR_2168_JOB
Date Opened: 10/04/2024
Industry: Technology
Work Experience: 5-8 years
Job Title: AWS Data Engineer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600002
Number of Positions: 4

Mandatory Skills: AWS, Python, SQL, Spark, Airflow, Snowflake

Responsibilities:
- Create and manage cloud resources in AWS.
- Ingest data from different sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems.
- Implement data ingestion and processing with the help of big data technologies.
- Process and transform data using technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform.
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations.
- Develop infrastructure to collect, transform, combine, and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights, and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible.
- Identify and interpret trends and patterns from complex data sets.
- Construct a framework utilizing data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders.
- Be a key participant in regular Scrum ceremonies with the agile teams.
- Be proficient at developing queries, writing reports, and presenting findings.
- Mentor junior members and bring best industry practices.

Posted 2 weeks ago


5.0 - 8.0 years

2 - 6 Lacs

Chennai

Work from Office


Job Information
Job Opening ID: ZR_1668_JOB
Date Opened: 19/12/2022
Industry: Technology
Work Experience: 5-8 years
Job Title: Sr. AWS Developer
City: Chennai
Province: Tamil Nadu
Country: India
Postal Code: 600001
Number of Positions: 4

Tech stack: AWS Lambda, Glue, Kafka/Kinesis, RDBMS (Oracle, MySQL, Redshift, PostgreSQL, Snowflake), Gateway, CloudFormation/Terraform, Step Functions, CloudWatch, Python, PySpark

Job role & responsibilities:
- Looking for a Software Engineer/Senior Software Engineer with hands-on experience in ETL projects and extensive knowledge of building data processing systems with Python, PySpark, and cloud technologies (AWS).
- Experience in development on AWS Cloud (S3, Redshift, Aurora, Glue, Lambda, Hive, Kinesis, Spark, Hadoop/EMR).

Required Skills:
- Amazon Kinesis, Amazon Aurora, Data Warehouse, SQL, AWS Lambda, Spark, AWS QuickSight
- Advanced Python skills
- Data engineering, ETL, and ELT skills
- Experience with cloud platforms (AWS, GCP, or Azure)

Mandatory skills: Data warehouse, ETL, SQL, Python, AWS Lambda, Glue, AWS Redshift.

Posted 2 weeks ago


Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management
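To make the SQL and ETL skills above concrete, here is a minimal sketch of a typical Snowflake bulk load. All object names (`raw_db`, `orders`, `my_stage`) are illustrative, not taken from any listing on this page:

```sql
-- Illustrative target table for loading order data.
CREATE TABLE IF NOT EXISTS raw_db.public.orders (
    order_id   NUMBER,
    customer   VARCHAR,
    amount     NUMBER(10, 2),
    order_date DATE
);

-- Bulk-load CSV files from a named stage into the table.
-- ON_ERROR = 'CONTINUE' skips bad rows instead of aborting the load.
COPY INTO raw_db.public.orders
FROM @my_stage/orders/
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'CONTINUE';
```

For continuous (rather than batch) ingestion, the same `COPY INTO` statement is typically wrapped in a Snowpipe definition so files are loaded as they arrive in the stage.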

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
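Several of the questions above (virtual warehouses, zero-copy cloning, Time Travel, semi-structured data) are easiest to prepare for hands-on. A minimal sketch, assuming a trial Snowflake account; all object names are hypothetical:

```sql
-- Virtual warehouse: an independent compute cluster, billed only while running.
CREATE WAREHOUSE IF NOT EXISTS demo_wh
    WAREHOUSE_SIZE = 'XSMALL'
    AUTO_SUSPEND   = 60    -- suspend after 60 seconds idle
    AUTO_RESUME    = TRUE;

-- Zero-copy clone: duplicates only metadata; no data is physically copied.
CREATE TABLE orders_dev CLONE orders;

-- Time Travel: query the table as it looked five minutes ago.
SELECT COUNT(*) FROM orders AT (OFFSET => -60 * 5);

-- Semi-structured data: query a VARIANT column with path notation and a cast.
SELECT payload:customer.name::STRING AS customer_name
FROM raw_events
WHERE payload:event_type = 'purchase';
```

Being able to explain why each of these works (separation of storage and compute, metadata-only cloning, retained micro-partition versions) is usually worth more in an interview than reciting the syntax.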

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
