
1082 Snowflake Jobs - Page 36

JobPe aggregates results for easy access; applications are submitted directly on the originating job portal.

1 - 3 years

8 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating the analysis, problem definition, requirements and proposed software
- Develop and automate processes for software validation by setting up, designing and executing test cases/scenarios/usage cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Raise all code as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks and report them to the concerned stakeholders

3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document all necessary details and reports formally so the software is properly understood from client proposal to implementation
- Ensure good-quality customer interaction with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests on time, with no instances of complaints either internally or externally

Deliver

No. | Performance Parameter | Measure
1 | Continuous integration, deployment and monitoring of software | 100% error-free onboarding and implementation; throughput %; adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery; software management; query troubleshooting; customer experience; completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS and report generation

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
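The role above calls for automating software validation by "setting up and designing test cases/scenarios/usage cases, and executing these cases". A minimal sketch of that flow using Python's standard unittest module; the function under test (`normalize_amount`) is a hypothetical stand-in, not anything from the job description:

```python
import unittest

# Hypothetical function under test, invented so the validation flow has
# something concrete to exercise.
def normalize_amount(value: str) -> float:
    """Parse a currency-like string such as '1,250.50' into a float."""
    return float(value.replace(",", ""))

class NormalizeAmountCases(unittest.TestCase):
    """Test cases designed up front, then executed automatically."""

    def test_plain_number(self):
        self.assertEqual(normalize_amount("42"), 42.0)

    def test_thousands_separator(self):
        self.assertEqual(normalize_amount("1,250.50"), 1250.50)

    def test_invalid_input_is_rejected(self):
        with self.assertRaises(ValueError):
            normalize_amount("not-a-number")

# Execute the designed cases and collect the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeAmountCases)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In practice the same suite would run automatically in a CI pipeline on every commit, which is what turns designed test cases into a repeatable validation process.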

Posted 1 month ago

Apply

5 - 8 years

9 - 14 Lacs

Bengaluru

Work from Office


About The Role

Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do

1. Oversee and support the process by reviewing daily transactions on performance parameters:
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot recurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

2. Handle technical escalations through effective diagnosis and troubleshooting of client queries:
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

3. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver

No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability development | Triages completed, technical test performance

Mandatory Skills: Snowflake. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

8 - 10 years

15 - 20 Lacs

Pune

Work from Office


About The Role

Role Purpose: The purpose of the role is to create exceptional architectural solution designs and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

1. Develop architectural solutions for new deals/major change requests in existing deals:
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
- Provide solutioning for RFPs received from clients and ensure overall design assurance
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services and applications, to better match business outcome objectives
- Analyze the technology environment, enterprise specifics and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership for the design, development and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current-state solutions, and identify improvements, options and tradeoffs to define target-state solutions
- Clearly articulate, document and sell architectural targets, recommendations and reusable patterns, and accordingly propose investment roadmaps
- Evaluate and recommend solutions to integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance and manage issues through to resolution
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail
- Validate the solution/prototype from a technology, cost-structure and customer-differentiation point of view
- Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant solutions
- Collaborate with sales, program/project and consulting teams to reconcile solutions to architecture
- Track industry and application trends and relate these to planning current and future IT needs
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture
- Identify implementation risks and potential impacts

2. Enable delivery teams by providing optimal delivery solutions/frameworks:
- Build and maintain relationships with executives, technical leaders, product owners, peer architects and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business process and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process and structural risks, and prepare a risk mitigation plan for all projects
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams
- Recommend tools for reuse and automation for improved productivity and reduced cycle times
- Lead the development and maintenance of enterprise frameworks and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects
- Ensure optimal client engagement: support the pre-sales team while presenting the entire solution design and its principles to the client; negotiate, manage and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

3. Competency building and branding:
- Ensure completion of necessary trainings and certifications
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through highest analyst rankings, client testimonials and partner credits
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
- Mentor developers, designers and junior architects in the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management:
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance management: set goals for the team, conduct timely performance reviews and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Snowflake. Experience: 8-10 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

3 - 5 years

5 - 9 Lacs

Hyderabad

Work from Office


About The Role

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating the analysis, problem definition, requirements and proposed software
- Develop and automate processes for software validation by setting up, designing and executing test cases/scenarios/usage cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to existing ones
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Raise all code as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks and report them to the concerned stakeholders

3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback regularly to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document all necessary details and reports formally so the software is properly understood from client proposal to implementation
- Ensure good-quality customer interaction with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests on time, with no instances of complaints either internally or externally

Deliver

No. | Performance Parameter | Measure
1 | Continuous integration, deployment and monitoring of software | 100% error-free onboarding and implementation; throughput %; adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery; software management; query troubleshooting; customer experience; completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS and report generation

Mandatory Skills: Snowflake. Experience: 3-5 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

5 - 10 years

3 - 7 Lacs

Chennai

Work from Office


Snowflake Developer

Mandatory skills: Snowflake DB development, Python & Unix scripting, SQL queries
Location: Chennai
Notice period: 0 to 30 days
Experience: 5 to 10 years

Skill set: Snowflake, Python, SQL, and Power BI (PBI) development.
- Understand and implement data security and data modelling
- Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake
- Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake, and from Snowflake to other systems
- Understand cloud architecture
- Develop and design PBI dashboards, reports, and data visualizations
- Communication skills
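The developer role above centers on running parameterized SQL against Snowflake from Python. Snowflake's Python connector follows the standard DB-API connect/cursor/execute pattern; the sketch below uses the stdlib sqlite3 module as a stand-in so it is self-contained, and the table and column names are invented for illustration:

```python
import sqlite3

# sqlite3 stands in for snowflake.connector here; the cursor.execute()
# flow is the same DB-API shape used against Snowflake.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Load a small staging table, as an ETL step might after extraction.
cur.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
rows = [(1, "APAC", 120.0), (2, "EMEA", 75.5), (3, "APAC", 40.0)]
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# Parameterized aggregate query, never string-concatenated values.
cur.execute(
    "SELECT region, SUM(amount) FROM orders WHERE region = ? GROUP BY region",
    ("APAC",),
)
region, total = cur.fetchone()
print(region, total)  # APAC 160.0
conn.close()
```

With the real connector the only changes would be the connect call (account, warehouse, credentials) and Snowflake's `%s` parameter style.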

Posted 1 month ago

Apply

4 - 6 years

30 - 34 Lacs

Bengaluru

Work from Office


Overview

Annalect is seeking a hands-on Data QA Manager to lead and elevate data quality assurance practices across our growing suite of software and data products. This is a technical leadership role embedded within our Technology teams, focused on establishing best-in-class data quality processes that enable trusted, scalable, and high-performance data solutions.

As a Data QA Manager, you will drive the design, implementation, and continuous improvement of end-to-end data quality frameworks, with a strong focus on automation, validation, and governance. You will work closely with data engineering, product, and analytics teams to ensure data integrity, accuracy, and compliance across complex data pipelines, platforms, and architectures, including Data Mesh and modern cloud-based ecosystems.

This role requires deep technical expertise in SQL, Python, data testing frameworks like Great Expectations, data orchestration tools (Airbyte, dbt, Trino, Starburst), and cloud platforms (AWS, Azure, GCP). You will lead a team of Data QA Engineers while remaining actively involved in solution design, tool selection, and hands-on QA execution.

Key Responsibilities:
- Develop and implement a comprehensive data quality strategy aligned with organizational goals and product development initiatives.
- Define and enforce data quality standards, frameworks, and best practices, including data validation, profiling, cleansing, and monitoring processes.
- Establish data quality checks and automated controls to ensure the accuracy, completeness, consistency, and timeliness of data across systems.
- Collaborate with Data Engineering, Product, and other teams to design and implement scalable data quality solutions integrated within data pipelines and platforms.
- Define and track key performance indicators (KPIs) to measure data quality and the effectiveness of QA processes, enabling actionable insights for continuous improvement.
- Generate and communicate regular reports on data quality metrics, issues, and trends to stakeholders, highlighting opportunities for improvement and mitigation plans.
- Maintain comprehensive documentation of data quality processes, procedures, standards, issues, resolutions, and improvements to support organizational knowledge-sharing.
- Provide training and guidance to cross-functional teams on data quality best practices, fostering a strong data quality mindset across the organization.
- Lead, mentor, and develop a team of Data QA Analysts/Engineers, promoting a high-performance, collaborative, and innovative culture.
- Provide thought leadership and subject matter expertise on data quality, influencing technical and business stakeholders toward quality-focused solutions.
- Continuously evaluate and adopt emerging tools, technologies, and methodologies to advance data quality assurance capabilities and automation.
- Stay current with industry trends, innovations, and evolving best practices in data quality, data engineering, and analytics to ensure cutting-edge solutions.

Qualifications / Required Skills:
- 11+ years of hands-on experience in data quality assurance, data test automation, data comparison, and validation across large-scale datasets and platforms.
- Strong proficiency in SQL for complex data querying, data validation, and data quality investigations across relational and distributed databases.
- Deep knowledge of data structures, relational and non-relational databases, stored procedures, packages, functions, and advanced data manipulation techniques.
- Practical experience with leading data quality tools such as Great Expectations, dbt tests, and data profiling and monitoring solutions.
- Experience with data mesh and distributed data architecture principles for enabling decentralized data quality frameworks.
- Hands-on experience with modern query engines and data platforms, including Trino/Presto, Starburst, and Snowflake.
- Experience working with data integration and ETL/ELT tools such as Airbyte, AWS Glue, and dbt for managing and validating data pipelines.
- Strong working knowledge of Python and related data libraries (e.g., pandas, NumPy, SQLAlchemy) for building data quality tests and automation scripts.
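The listing above describes automated checks for accuracy, completeness, and consistency. A library-free sketch of the idea that tools like Great Expectations generalize: each "expectation" validates one quality dimension and reports which rows fail. The dataset and thresholds are invented for illustration:

```python
# Toy dataset standing in for a pipeline's output table.
records = [
    {"id": 1, "email": "a@example.com", "amount": 10.0},
    {"id": 2, "email": None,            "amount": 25.5},
    {"id": 3, "email": "c@example.com", "amount": -4.0},
]

def expect_not_null(rows, column):
    """Completeness check: no NULLs allowed in the column."""
    failures = [r["id"] for r in rows if r[column] is None]
    return {"check": f"{column} not null", "passed": not failures,
            "failing_ids": failures}

def expect_between(rows, column, low, high):
    """Validity check: values must fall inside an expected range."""
    failures = [r["id"] for r in rows if not (low <= r[column] <= high)]
    return {"check": f"{column} in [{low}, {high}]", "passed": not failures,
            "failing_ids": failures}

results = [
    expect_not_null(records, "email"),
    expect_between(records, "amount", 0, 1_000_000),
]
for r in results:
    print(r)
```

A real framework adds the same kinds of checks as declarative suites, runs them inside the pipeline, and feeds the pass/fail results into the reporting and KPI tracking the role describes.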

Posted 1 month ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office


About The Role
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

About The Role - Grade Specific
The primary focus is to help organizations design, develop, and optimize their data infrastructure and systems, enhance data processes, and leverage data effectively to drive business outcomes.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, CentOS, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Google Big Table, Google Data Proc, Greenplum, HQL, IBM Data Stage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, SAS, Scala Spark, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management.

Posted 1 month ago

Apply

3 - 5 years

5 - 10 Lacs

Chennai, Bengaluru

Work from Office


Integration Design and Development:
- Develop integration solutions using SnapLogic to automate data workflows between Snowflake, APIs, Oracle, and other data sources.
- Design, implement, and maintain data pipelines to ensure reliable and timely data flow across systems.
- Develop API integrations to facilitate seamless data exchange with internal master data management systems.
- Monitor and optimize data integration processes to ensure high performance and reliability.
- Provide support for existing integrations, troubleshoot issues, and suggest improvements to streamline operations.
- Work closely with cross-functional teams, including data analysts, data scientists, and IT, to understand integration needs and develop solutions.
- Maintain detailed documentation of integration processes and workflows.

Experience:
- 3-4 years of proven experience as a SnapLogic Integration Engineer.
- Experience with the Snowflake cloud data platform is preferred.
- Experience in API integration and development; familiarity with RESTful API design and integration.
- Strong understanding of ETL/ELT processes.
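A core step in the API integrations this role describes is paginating through a REST endpoint until the source is exhausted. The sketch below shows that extract loop; `fetch_page` is a stub standing in for a real HTTP call, and the payload shape (`items`/`next`) is an assumed convention, not a documented API:

```python
# Canned responses standing in for a paginated REST API.
PAGES = {
    None: {"items": [{"id": 1}, {"id": 2}], "next": "p2"},
    "p2": {"items": [{"id": 3}], "next": None},
}

def fetch_page(cursor=None):
    """Stub for an HTTP GET; a real pipeline would call the API here."""
    return PAGES[cursor]

def extract_all():
    """Follow the pagination cursor until the source reports no next page."""
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)
        items.extend(page["items"])
        cursor = page["next"]
        if cursor is None:
            return items

records = extract_all()
print(len(records))  # 3
```

In a SnapLogic pipeline the same loop is expressed as a REST Snap with pagination settings; the extracted records would then flow to a transform stage and a Snowflake load.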

Posted 1 month ago

Apply

5 - 8 years

22 - 30 Lacs

Pune, Chennai

Work from Office


Experience:
- Minimum of 5 years of experience in data engineering, with a strong focus on data pipeline development.
- At least 2 years of experience leading teams or projects in the healthcare, life sciences, or related domains.
- Proficiency in Python, with experience in data manipulation libraries.
- Hands-on experience with AWS Glue, AWS Lambda, S3, Redshift, and other relevant AWS data services.
- Familiarity with data integration tools, ETL (Extract, Transform, Load) frameworks, and data warehousing solutions.
- Proven experience working in an onsite-offshore model, managing distributed teams, and coordinating development across multiple time zones.

Posted 1 month ago

Apply

6 - 11 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Role & responsibilities
We are looking for a Senior Snowflake Database Engineer who excels in developing complex queries and stored procedures. The ideal candidate has a deep understanding of Snowflake architecture and performance tuning techniques, and will work closely with application engineers to integrate database solutions seamlessly into applications, ensuring optimal performance and reliability.
- Strong expertise in Snowflake, including data modeling, query optimization, and performance tuning.
- Proficiency in writing complex SQL queries, stored procedures, and functions.
- Experience with database performance tuning techniques, including indexing and query profiling.
- Familiarity with integrating database solutions into application code and workflows.
- Knowledge of data governance and data quality best practices is a plus.
- Strong analytical and problem-solving skills, along with excellent communication skills to collaborate effectively.
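As one illustration of the Snowflake-specific SQL this role writes: deduplicating to the latest row per key with `QUALIFY` and `ROW_NUMBER()`, a pattern common in Snowflake warehouses. The table and column names are hypothetical; the Python helper only assembles the statement text for reuse across tables:

```python
def latest_row_per_key(table: str, key: str, order_col: str) -> str:
    """Build a Snowflake dedupe query: keep the newest row per key.

    QUALIFY filters on the window function result directly, which in
    other dialects needs a wrapping subquery.
    """
    return (
        f"SELECT * FROM {table} "
        f"QUALIFY ROW_NUMBER() OVER "
        f"(PARTITION BY {key} ORDER BY {order_col} DESC) = 1"
    )

sql = latest_row_per_key("raw.orders", "order_id", "updated_at")
print(sql)
```

Templating the pattern like this is only a sketch; in production such statements would typically live in dbt models or stored procedures rather than be built from f-strings.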

Posted 1 month ago

Apply

7 - 12 years

19 - 25 Lacs

Hyderabad

Hybrid


Data Engineer Consultant Position Overview: We are seeking a highly skilled and experienced Data Engineering to join our team. The ideal candidate will have a strong background in programming, data management, and cloud infrastructure, with a focus on designing and implementing efficient data solutions. This role requires a minimum of 5+ years of experience and a deep understanding of Azure services and infrastructure and ETL/ELT solutions. Key Responsibilities: Azure Infrastructure Management: Own and maintain all aspects of Azure infrastructure, recommending modifications to enhance reliability, availability, and scalability. Security Management: Manage security aspects of Azure infrastructure, including network, firewall, private endpoints, encryption, PIM, and permissions management using Azure RBAC and Databricks roles. Technical Troubleshooting: Diagnose and troubleshoot technical issues in a timely manner, identifying root causes and providing effective solutions. Infrastructure as Code: Create and maintain Azure Infrastructure as Code using Terraform and GitHub Actions. CI/CD Pipelines: Configure and maintain CI/CD pipelines using GitHub Actions for various Azure services such as ADF, Databricks, Storage, and Key Vault. Programming Expertise: Utilize your expertise in programming languages such as Python to develop and maintain data engineering solutions. Generative AI and Language Models: Knowledge of Language Models (LLMs) and Generative AI is a plus, enabling the integration of advanced AI capabilities into data workflows. Real-Time Data Streaming: Use Kafka for real-time data streaming and integration, ensuring efficient data flow and processing. Data Management: Proficiency in Snowflake for data wrangling and management, optimizing data structures for analysis. DBT Utilization: Build and maintain data marts and views using DBT, ensuring data is structured for optimal analysis. 
ETL/ELT Solutions: Design ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, leveraging methodologies to acquire data from various structured or semi-structured source systems.
Communication: Strong communication skills to explain technical issues and solutions clearly to the Engineering Lead and key stakeholders (as required).

Qualifications:
Minimum of 5 years of experience in designing ETL/ELT solutions using tools like Azure Data Factory, Azure Databricks, or Snowflake.
Expertise in programming languages such as Python.
Experience with Kafka for real-time data streaming and integration.
Proficiency in Snowflake for data wrangling and management.
Proven ability to use dbt to build and maintain data marts and views.
Experience in creating and maintaining Azure infrastructure as code using Terraform and GitHub Actions.
Ability to configure, set up, and maintain GitHub for various code repositories.
Experience in creating and configuring CI/CD pipelines using GitHub Actions for various Azure services.
In-depth understanding of managing security aspects of Azure infrastructure.
Strong problem-solving skills and ability to diagnose and troubleshoot technical issues.
Excellent communication skills for explaining technical issues and solutions.
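Much of the ETL/ELT work described in roles like this begins with flattening semi-structured source payloads into relational rows before loading them into Snowflake or a Databricks table. A minimal sketch in plain Python; the record shape and field names are hypothetical, for illustration only:

```python
import json

def flatten_order(record: dict) -> list:
    """Flatten one semi-structured order record into rows for a staging table."""
    rows = []
    for item in record.get("items", []):
        rows.append({
            "order_id": record["order_id"],
            "customer": record["customer"],
            "sku": item["sku"],
            "qty": item["qty"],
        })
    return rows

# One nested JSON payload becomes two flat, load-ready rows.
raw = json.loads(
    '{"order_id": 1, "customer": "acme",'
    ' "items": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]}'
)
rows = flatten_order(raw)
```

In a real pipeline the flattened rows would be staged to blob storage and bulk-loaded, rather than built in memory.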

Posted 1 month ago

Apply

2 - 7 years

6 - 10 Lacs

Bengaluru

Work from Office


Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a Data Engineer (AWS, Confluent & SnapLogic).

Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching.
Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading.
Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval.
Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making.
Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing.
Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing.
ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading.
Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance.

You'd describe yourself as:
Experience: 3+ years of relevant experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
Technical Skills: Proficiency in AWS services, particularly AWS Glue. Experience with Iceberg tables and Snowflake. Knowledge of Confluent Kafka for real-time data streaming. Familiarity with SnapLogic for ETL processes. Experience with Apache Airflow for workflow management. Understanding of Splunk for monitoring and logging.
Programming Skills: Proficiency in Python, SQL, and other relevant programming languages.
Data Modeling: Experience with data modeling and database design.
Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.

Preferred Qualities:
Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality.
Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams.
Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment.
Team Player: Strong team player with a collaborative mindset.
Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
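The Iceberg tables mentioned in this posting support MERGE-style upserts; the key-based, last-write-wins semantics behind such a merge can be sketched in plain Python. This is a conceptual sketch only, not the Glue or Iceberg API, and the table contents and key names are invented:

```python
def merge_upsert(target: dict, incoming: list, key: str) -> dict:
    """Merge incoming change rows into the target table image, keyed by `key`.

    New keys are inserted; existing keys are overwritten (last write wins),
    mirroring what a MERGE INTO on an Iceberg table achieves.
    """
    merged = dict(target)
    for row in incoming:
        merged[row[key]] = row
    return merged

# Current table image, keyed by id, plus a batch of change rows.
current = {1: {"id": 1, "status": "new"}}
changes = [{"id": 1, "status": "shipped"}, {"id": 2, "status": "new"}]
result = merge_upsert(current, changes, "id")
```

A real Glue job would express the same logic as a MERGE statement or a Spark DataFrame join rather than an in-memory dict.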

Posted 1 month ago

Apply

3 - 5 years

15 - 20 Lacs

Pune

Work from Office


Hello eager tech expert! To create a better future, you need to think outside the box. That's why we at Siemens need innovators who aren't afraid to push boundaries to join our diverse team of tech gurus. Got what it takes? Then help us create lasting, positive impact! Working for Siemens Financial Services Information Technology (SFS IT), you will work on the continuous enhancement of our Siemens Credit Warehouse solution by translating business requirements into IT solutions and working hand in hand on the implementation of these with our interdisciplinary and international team of IT experts. The Siemens Credit Warehouse is a business-critical IT application that provides credit rating information and credit limits of our customers to all Siemens entities worldwide. We are looking for an experienced Release Manager to become part of our Siemens Financial Services Information Technology (SFS IT) Data Management team. You will have a pivotal role in the moderation of all aspects related to release management of our Data Platform, liaising between stakeholders that range from senior management to our citizen developer community. Through your strong communication and presentation skills, coupled with your solid technical background and critical thinking, you're able to convey technical topics to a non-technical/management audience, leading the topics under your responsibility towards a positive outcome based on your natural constructive approach.

You'll break new ground by:
Leading topics across multiple stakeholders from different units in our organization (IT and Business).
Actively listening to issues and problems faced by technical and non-technical members.
Producing outstanding technical articles for documentation purposes.

You're excited to build on your existing expertise, including:
A university degree in computer science, business information systems or a similar area of knowledge.
At least 3 to 5 years' experience in a release manager role.
Strong technical background with a proven track record in:
Data engineering and data warehousing, especially with Snowflake and dbt (open source, ideally dbt Cloud), allowing you to champion CI/CD processes end to end.
Setup and development of release management processes (CI/CD) and concepts.
Azure DevOps (esp. CI/CD, project setup optimization), GitHub and GitLab, including Git Bash.
Reading YAML code for Azure DevOps pipelines and error handling.
Very good programming skills in SQL (esp. DDL and DML statements).
Generally good understanding of the Azure Cloud tech stack (Azure Portal, Logic Apps, Synapse, Blob Containers, Kafka, Clusters and Streaming).
A proven track record with AWS is a big plus. Experience with Terraform is a big plus.

Create a better #TomorrowWithUs! We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Protecting the environment, conserving our natural resources, fostering the health and performance of our people as well as safeguarding their working conditions are core to our social and business commitment at Siemens. This role is based in Pune/Mumbai. You'll also get to visit other locations in India and beyond, so you'll need to go where this journey takes you. In return, you'll get the chance to work with an international team on global topics.
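Release pipelines for a Snowflake/dbt platform often template repetitive DML such as MERGE statements rather than hand-writing one per table. A hedged sketch of such a generator; the table and column names are hypothetical, and while the emitted statement follows standard Snowflake MERGE syntax, it has not been run against a live warehouse:

```python
def build_merge(target: str, source: str, key: str, cols: list) -> str:
    """Render a Snowflake-style MERGE (upsert) from a staging table."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    all_cols = [key] + cols
    col_list = ", ".join(all_cols)
    val_list = ", ".join(f"s.{c}" for c in all_cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON t.{key} = s.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )

sql = build_merge("dim_customer", "stg_customer", "id", ["name", "city"])
```

dbt's built-in incremental materializations generate comparable statements for you; a custom generator like this is only worthwhile for patterns dbt does not cover.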

Posted 1 month ago

Apply

3 - 5 years

6 - 10 Lacs

Bengaluru

Work from Office


Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations.

We are looking for a skilled Data Architect/Engineer with strong expertise in AWS and data lake solutions. If you're passionate about building scalable data platforms, this role is for you. Your responsibilities will include:
Architect & Design: Build scalable and efficient data solutions using AWS services like Glue, Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, Glue Streaming ETL, and EMR.
Real-Time Data Integration: Integrate real-time data from multiple Siemens orgs into our central data lake.
Data Lake Management: Design and manage large-scale data lakes using S3, Glue, and Lake Formation.
Data Transformation: Apply transformations to ensure high-quality, analysis-ready data.
Snowflake Integration: Build and manage pipelines for Snowflake, using Iceberg tables for best performance and flexibility.
Performance Tuning: Optimize pipelines for speed, scalability, and cost-effectiveness.
Security & Compliance: Ensure all data solutions meet security standards and compliance guidelines.
Team Collaboration: Work closely with data engineers, scientists, and app developers to deliver full-stack data solutions.
Monitoring & Troubleshooting: Set up monitoring tools and quickly resolve pipeline issues when needed.

You'd describe yourself as:
Experience: 3+ years of experience in data engineering or cloud solutioning, with a focus on AWS services.
Technical Skills: Proficiency in AWS services such as AWS API, AWS Glue, Amazon Redshift, S3, Apache Kafka and Lake Formation.
Experience with real-time data processing and streaming architectures.
Big Data Querying Tools: Solid understanding of big data querying tools (e.g., Hive, PySpark).
Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems.
Problem-Solving: Excellent problem-solving skills and the ability to troubleshoot complex issues.
Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
Certifications: AWS certifications are a plus.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
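Data lakes on S3, as described in this role, conventionally use Hive-style partition paths (`year=.../month=.../day=...`) so that Glue and query engines such as Athena can prune partitions. A small sketch of generating such keys; the bucket, table, and file names are invented for illustration:

```python
from datetime import date

def partition_key(prefix: str, table: str, d: date, filename: str) -> str:
    """Build a Hive-style partitioned object key for a data-lake write."""
    return (
        f"{prefix}/{table}/"
        f"year={d.year}/month={d.month:02d}/day={d.day:02d}/{filename}"
    )

key = partition_key("s3://data-lake/raw", "orders", date(2024, 5, 7), "part-000.parquet")
```

Partitioning on columns that queries actually filter by (typically an event date) is what makes the layout pay off; over-partitioning on high-cardinality columns has the opposite effect.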

Posted 1 month ago

Apply

2 - 5 years

11 - 15 Lacs

Bengaluru

Work from Office


Job Responsibilities:
You design, develop and test dashboards/apps in Power BI/Qlik Cloud/PowerApps.
You prepare and transform data from the Data Cloud using SQL queries in Snowflake.
You create and maintain technical design documentation (e.g. data models, user instructions etc.).
[Optional] You design, develop and test Robotics Process Automation (RPA) solutions with Power Automate and/or for SAP.
[Optional] You design, develop and test ML models in Snowflake/Azure Databricks or similar.
You maintain solutions that are live with future enhancements and support them in the event of errors.

Profile:
A degree in Computer Science, Business Informatics, or a similar technical field with a specialization in Information Technology.
At least 3 years of combined work experience in front-end BI development (Power BI and/or Qlik Cloud) and/or RPA development (Python, SAP GUI Scripting, PowerApps, UiPath or a similar low-code platform).
Proficiency in writing SQL queries and general relational database proficiency.
Experienced in Data Management (Extract, Transform and Load processes); preferably experienced in using Snowflake.
Experienced with Data Science.
[Optional] Knowledge of Python or similar (e.g. Java, C++).
Knowledge of SAP.
Strong analytical and problem-solving skills.
Good collaboration and communication skills (English).
Ideally, experience in the Supply Chain Management environment.
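Preparing data for a dashboard, as this role describes, usually reduces to an aggregation like SQL's GROUP BY run in Snowflake before the result lands in Power BI or Qlik. The same shape in plain Python, with invented sample rows, purely to illustrate the transform:

```python
from collections import defaultdict

def revenue_by_region(rows: list) -> dict:
    """Aggregate raw rows into the per-region totals a dashboard tile consumes."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

rows = [
    {"region": "EMEA", "amount": 100.0},
    {"region": "APAC", "amount": 40.0},
    {"region": "EMEA", "amount": 60.0},
]
summary = revenue_by_region(rows)
```

In practice this aggregation belongs in a Snowflake view or query (`SELECT region, SUM(amount) ... GROUP BY region`) so the BI tool only fetches the small result set.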

Posted 1 month ago

Apply

4 - 9 years

14 - 18 Lacs

Noida

Work from Office


Who We Are: Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.

What you need
* BS in an Engineering or Science discipline, or equivalent experience
* 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role
* Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL)
* Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments
* Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW
* Experience working in larger initiatives building and rationalizing large-scale data environments with a large variety of data pipelines, possibly with internal and external partner integrations, would be a plus
* Willingness to experiment and learn new approaches and technology applications
* Knowledge and experience with various relational databases and demonstrable proficiency in SQL and supporting analytics uses and users
* Knowledge of software engineering and agile development best practices
* Excellent written and verbal communication skills

The Brightly culture
We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive.
Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.

Posted 1 month ago

Apply

3 - 5 years

11 - 15 Lacs

Hyderabad

Work from Office


Overview
As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
Assist with data planning, sourcing, collection, profiling, and transformation.
Create source-to-target mappings for ETL and BI developers.
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data str/cleansing.
Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications
8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture.
3+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools.
4+ years of experience developing enterprise data models.
Experience in building solutions in the retail or supply chain space.
Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
Experience with integration of multi-cloud services (Azure) with on-premises technologies.
Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake.
Experience with version control systems like GitHub and deployment & CI tools.
Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
Experience with metadata management, data lineage, and data glossaries is a plus.
Working knowledge of agile development, including DevOps and DataOps concepts.
Familiarity with business intelligence tools (such as Power BI).
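The logical-to-physical modeling work this role describes often means splitting a denormalized feed into dimension and fact tables joined by surrogate keys, the core of a star schema. An illustrative sketch in plain Python; the field names and sample rows are hypothetical:

```python
def to_star_schema(flat_rows: list):
    """Split denormalized rows into a product dimension and a sales fact table."""
    dim, fact = {}, []
    for row in flat_rows:
        name = row["product"]
        if name not in dim:
            dim[name] = len(dim) + 1  # assign a surrogate key on first sight
        fact.append({"product_key": dim[name], "qty": row["qty"]})
    return dim, fact

flat = [
    {"product": "widget", "qty": 1},
    {"product": "gadget", "qty": 2},
    {"product": "widget", "qty": 3},
]
dim, fact = to_star_schema(flat)
```

The payoff of the surrogate key is that descriptive attributes live once in the dimension, while the (much larger) fact table stores only the compact key.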

Posted 1 month ago

Apply

7 - 12 years

15 - 30 Lacs

Hyderabad

Hybrid


Role & responsibilities:
Candidates should be immediate joiners. Lead-level experience is a must. Informatica PowerCenter, Snowflake, Oracle, and Unix are mandatory skills. Hyderabad (GAR-Kokapet) location only (hybrid mode).

Key Responsibilities:
• Design and implement scalable data storage solutions using Snowflake.
• Write SQL queries against Snowflake and develop scripts to extract, load, and transform data.
• Write, optimize, and troubleshoot complex SQL queries within Snowflake.
• Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, Cloning, the query optimizer, Metadata Manager, data sharing, stored procedures and UDFs.
• Develop and maintain ETL processes using Informatica PowerCenter.
• Integrate Snowflake with various data sources and third-party applications.
• Experience in data lineage analysis, data profiling, ETL design and development, unit testing, production batch support and UAT support.
• SQL performance tuning: root-causing failures, bifurcating them into distinct technical issues, and resolving them.
• In-depth understanding of Data Warehouse and ETL concepts and data modelling.
• Experience in requirement gathering, analysis, design, development, and deployment.
• Good working knowledge of any ETL tool (preferably Informatica PowerCenter or dbt).
• Proficiency in SQL.
• Experience in client-facing projects.
• Experience with Snowflake best practices.
• Experience working with Unix shell scripting.
• Good to have: working experience in Python.
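Snowflake Streams, listed among the utilities above, expose the changes applied to a table since the last consumed offset. The core idea can be simulated in plain Python by diffing two table snapshots; this is a conceptual sketch, not the Snowflake API, and real Streams also surface deletes and METADATA$ columns:

```python
def diff_changes(before: dict, after: dict) -> list:
    """Approximate what a Stream exposes: inserts and updates since the last offset."""
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append(("INSERT", key, row))
        elif before[key] != row:
            changes.append(("UPDATE", key, row))
    return changes

# Snapshot at the last consumed offset vs. the current table state.
before = {1: {"qty": 5}, 2: {"qty": 3}}
after = {1: {"qty": 5}, 2: {"qty": 4}, 3: {"qty": 1}}
changes = diff_changes(before, after)
```

In Snowflake itself you would simply `SELECT * FROM my_stream` and let a Task consume it; the diff bookkeeping is handled by the platform.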

Posted 1 month ago

Apply

6 - 10 years

15 - 27 Lacs

Pune, Chennai, Bengaluru

Work from Office


Snowflake Administration
Experience: 6-10 years

Key Responsibilities
Administer and manage the Snowflake data platform, including monitoring, configuration, and upgrades.
Ensure the performance, scalability, and reliability of Snowflake databases and queries.
Set up and manage user roles, access controls, and security policies to safeguard data integrity.
Optimize database design and storage management to improve efficiency and reduce costs.
Collaborate with data engineering and analytics teams to integrate data pipelines and support data workloads.
Implement best practices for ETL/ELT processes, query optimization, and data warehouse management.
Troubleshoot and resolve issues related to Snowflake platform operations.
Monitor resource utilization and provide cost analysis for effective usage.
Create and maintain documentation for Snowflake configurations, processes, and policies.

Skills and Qualifications
Proven experience in Snowflake administration and management.
Strong understanding of Snowflake compute and storage management.
Expertise in data governance, including column-level data security using secure views and dynamic data masking features.
Proficiency in performing data definition language (DDL) operations.
Ability to apply strategies for Snowflake performance tuning.
Experience in designing and developing secure access controls using role-based access control (RBAC).
Excellent troubleshooting and problem-solving skills.
Strong collaboration and communication skills.
Understanding of cost optimization approaches and their implementation on Snowflake.
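Dynamic data masking, mentioned under data governance above, returns full column values only to privileged roles. In Snowflake this is defined with a SQL masking policy attached to a column, but the role-dependent behavior can be sketched in Python; the role names here are hypothetical:

```python
# Roles allowed to see unmasked PII; invented for this sketch.
PRIVILEGED_ROLES = {"SECURITY_ADMIN", "PII_READER"}

def mask_email(value: str, role: str) -> str:
    """Mimic a masking policy: full value for privileged roles, masked otherwise."""
    if role in PRIVILEGED_ROLES:
        return value
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain  # keep first character and domain
```

The equivalent Snowflake object would be created with `CREATE MASKING POLICY ... AS (val STRING) RETURNS STRING -> CASE WHEN CURRENT_ROLE() IN (...) THEN val ELSE ... END`, evaluated at query time so the raw data never leaves the warehouse unmasked.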

Posted 1 month ago

Apply

5 - 10 years

20 - 30 Lacs

Hyderabad

Hybrid


Experience: 5 to 10 Years
Location: Hyderabad
Notice Period: Immediate to 30 Days

Skills Required:
5+ years of experience as a Data Engineer or in a similar role working with large data sets and ELT/ETL processes
7+ years of industry experience in software development
Knowledge and practical use of a wide variety of RDBMS technologies such as MySQL, Postgres, SQL Server or Oracle
Use of cloud-based data warehouse technologies including Snowflake and AWS Redshift
Strong SQL experience with an emphasis on analytic queries and performance
Experience with various NoSQL technologies such as MongoDB or Elasticsearch
Familiarity with either native database or external change-data-capture technologies
Practical use of various data formats such as CSV, XML, JSON, and Parquet
Use of data flow and transformation tools such as Apache NiFi or Talend
Implementation of ELT processes in languages such as Java, Python or NodeJS
Use of large, shared data stores such as Amazon S3 or Hadoop File System
Thorough and practical use of various data warehouse schemas (snowflake, star)

If interested, please share your updated resume to arampally@jaggaer.com with the below details:
Total Years of Experience:
Years of Experience as Data Engineer:
Years of experience in MySQL:
Years of Experience in Snowflake, AWS Redshift:
Current CTC:
Expected CTC:
Notice Period:
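The ELT pattern this posting asks for loads raw data first and then transforms it with SQL inside the engine. A self-contained sketch using SQLite (Python's stdlib) as a stand-in for a cloud warehouse such as Snowflake or Redshift; the table names and rows are invented:

```python
import sqlite3

# Extract/Load: land the raw rows untouched, then Transform with SQL in-engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "paid", 10.0), (2, "void", 5.0), (3, "paid", 7.5)],
)

# Transform: derive a clean table from the raw landing table.
conn.execute(
    "CREATE TABLE orders_clean AS "
    "SELECT id, amount FROM raw_orders WHERE status = 'paid'"
)
total = conn.execute("SELECT SUM(amount) FROM orders_clean").fetchone()[0]
```

Keeping the raw table intact is the point of ELT: the transform can be re-run or revised without re-extracting from the source system.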

Posted 1 month ago

Apply

15 - 19 years

15 - 30 Lacs

Noida, Chennai, Bengaluru

Hybrid


Job Description
Experience - 12 to 16 Years
Primary Skill - Delivery management with Data Warehouse background
Notice Period - Immediate to 30 Days
Work Location - Chennai, Noida & Bangalore

Role & Responsibilities
Required Skills:
12+ years of experience in managing delivery of Data Warehouse projects (development & modernization/migration).
Strong delivery background with experience in managing large, complex Data Warehouse engagements.
Good to have experience with Snowflake, Matillion, DBT, Netezza/DataStage and Oracle.
Healthcare Payer industry experience.
Extensive experience in Program/Project Management and Iterative, Waterfall and Agile methodologies.
Ability to track and manage complex program budgets.
Experience in managing the delivery of complex programs to meet the needs and the required timelines set for the defined programs.
Communicate program review results to various stakeholders.
Experience in building the team, providing guidance, and education as needed to ensure the success of priority programs and promote cross-training within the department.
Experience in developing and managing integrated program plans that incorporate both technical and business deliverables.
Verify that critical decision gates are well defined, communicated and monitored for executive approval throughout the program.
Verify that work supports the corporate strategic direction.
Review resulting vendor proposals and estimates to ensure they satisfy both our functional requirements and technology strategies.
Project management methodologies, processes, and tools.
Knowledge of the project development life cycle.
Establish and maintain strong working relationships with various stakeholders including team members, IT resources, resources in other areas of the business and upper management.
Strong business acumen and political savvy.
Ability to collaborate while dealing with complex situations.
Ability to think creatively and to drive innovation.
Ability to motivate, lead and inspire a diverse group to a common goal/solution with multiple stakeholders.
Ability to convert business strategy into action-oriented objectives and measurable results.
Strong negotiating, influencing, and consensus-building skills.
Ability to mentor, coach and provide guidance to others.

Responsibilities:
Responsible for the end-to-end delivery of the Application Development and Support services for the client.
Coordinate with the Enterprise Program Management Office to execute programs following defined standards and governance structure to ensure alignment to the approved project development life cycle (PDLC).
Interface regularly with key senior business leaders to enable a smooth transition from strategy development to program identification and execution.
Facilitate meetings with task groups or functional areas as required for EPMO-supported initiatives and/or to resolve issues.
Proactively engage other members of the organization with specific subject knowledge to resolve issues or provide assistance.
Lead post-implementation reviews of major initiatives to provide lessons learned and continuous improvement.
Develop accurate and timely summary reports for executive management that provide consolidated, clear, and concise assessments of strategic initiatives' implementation status.
Collaborate with business owners to develop divisional business plans that support the overall strategic direction.
Support the budget allocation process through ongoing financial tracking reports.
Develop & maintain service plans considering the customer requirements.
Track and monitor to ensure adherence to SLAs/KPIs.
Identify opportunities for improvement to the service delivery process.
Address service delivery issues/escalations/complaints; first point of escalation for customer escalations.
Oversee shift management for various tracks.
Responsible for publishing production support reports & metrics.

Posted 1 month ago

Apply

2 - 4 years

8 - 12 Lacs

Pune

Hybrid


So, what’s the role all about?
At NICE, as a Java Developer you will be responsible for designing, developing, testing, and maintaining scalable and efficient Java-based applications that meet business requirements. You will collaborate closely with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality software solutions. Your role involves writing clean, well-structured, and maintainable code following best practices and coding standards. Additionally, you will debug and troubleshoot application issues, ensuring optimal performance and user experience.

How will you make an impact?
Develop quality, proficient and enterprise-grade solutions.
Test your code using unit/system tests and automation.
Work as a member of an agile team to enhance and improve software written in Java.
Develop according to specific requirements with awareness of scalability, hardware capabilities, cross-environment, and platform implications.
Design and present projects to improve the current process by researching new knowledge and collaborating on solutions, suggesting process improvements and best practices.
Demonstrate the ability to write efficient code for handling inter-process communications.
Fix bugs and care about enterprise-grade quality.
Work as part of the development team on the application under aggressive deadlines.
Implement software features according to design.
Work and collaborate in multi-disciplinary agile teams, adopting agile spirit, methodology and tools.

Have you got what it takes?
Degree in Computer Science or a related discipline (BE/BTech/MTech/MCA).

Must:
3-6 years of hands-on software development experience with Java.
Software development experience in Java, Java-Spring, Hibernate, Linux, Maven, Git.
Strong knowledge of Java and Spring Boot microservices.
Strong knowledge of working with and developing microservices.
Experience with Docker containers running on Kubernetes.
Good hands-on experience in SQL.
Excellent communication skills.
Excellent problem-solving skills.
Hands-on experience with AWS cloud technologies.
Open to learning new tech stacks as needed.
Working knowledge of unit testing.
Working knowledge of object-oriented software design.
Desire to work in a fast-paced environment.
Experience with implementation of data structures and algorithms.
Excellent spoken/written English.
Self-driven with a strong sense of ownership.
Friendly disposition; works effectively as a team player.
Adheres and contributes to software engineering best practices.
Self-motivated and fast learner with a strong sense of ownership and drive.

Bonus Experience:
Experience with Jira.
Experience in Snowflake.
Experience with automation/testing tools.
Experience with/knowledge of agile development processes.
Experience as a technical or team lead, or equivalent experience.
Experience in the contact center domain.
Experience working in a CI/CD environment.

What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 6873
Reporting into: Tech Manager
Role Type: Individual Contributor

Posted 1 month ago


4 - 9 years

9 - 14 Lacs

Bengaluru

Work from Office


Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together.

Role description:

The Global Business Intelligence & Analytics team plays a key role in this change. They focus on delivering insights to enable informed business decision making through the development and roll-out of standardized performance reporting, and on providing insight into the success of the business transformation and the benefits and improvements we aim to achieve, such as increased billability and increased project margins.

Role accountabilities:
  • Support implementation of the Global BI & Analytics scope with a technical focus.
  • Take responsibility for design, development, and testing across the OBIA/OAC technology stack (RPD/ODI/BICC/OAC/OBIA/PL-SQL).
  • Take on a hands-on role covering design and development activities, plus internal client-facing functional discussions to improve the design of our BI and Analytics platform, which consists of the Oracle BI reporting repository/model and the development of KPIs, metrics, dashboards, etc. Extensive Oracle BI tool experience is mandatory.
  • Apply hands-on experience in repository (RPD) development (Physical, Logical, and Presentation layers) using the OBIEE Admin tool.
  • Prepare conceptual, logical, and physical data models.
  • Assist in designing the test strategy, test plan, and test cases.
  • Conduct architecture and design reviews to ensure quality software engineering processes (DevOps) and methodologies are followed for OBIA RPD/ODI/data warehouse designs.
  • Participate in sprint planning and Agile framework processes and methodologies.
  • Support effort estimation, strategic planning, project scheduling, and the development and implementation of processes.
Qualifications & Experience:
  • A bachelor's degree (or equivalent) in a technical/data discipline.
  • A minimum of 4 years of experience in Oracle Business Intelligence Applications (OBIA)/Oracle BI Apps, with expertise in gathering user requirements, design, development, and support functions.
  • Experience in dimensional modeling and designing data marts; star and snowflake schemas are essential.
  • End-to-end experience in Oracle BI (OBIA) framework design and implementation projects.
  • Experience developing reports and dashboards using OBIEE Analysis, interactive dashboards, and a data visualization tool.
  • Good written and spoken English communication.
  • Enthusiastic, positive, committed, and driven attitude.
  • Strong analytical and data skills and attention to detail.
  • Contribute to the overall success of a project/deliverable and achieve the SLA and KPI targets set for the team.
  • Contribute to process improvement initiatives and other administrative initiatives as part of the team's strategy.
  • Power BI reporting and data modeling/engineering skills are a plus.
  • Bring direct hands-on contribution to the creation and development of comprehensive content (dashboards, reports, processes, training material) to meet the requirements of various reporting tasks and projects, while ensuring adherence to global reporting standards.
  • Data engineering skills such as creating variables, sequences, user functions, scenarios, procedures, interfaces, and packages in ODI.
  • Any certification in ODI, OAC, OBIEE, or OBIA is an added advantage.

Why Arcadis?

We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet.
Make your mark, on your career, your colleagues, your clients, your life, and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy.

Our Commitment to Equality, Diversity, Inclusion & Belonging

We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people. At Arcadis, you will have the opportunity to build the career that is right for you. Because each Arcadian has their own motivations and their own career goals. And, as a people-first business, that is why we will take the time to listen, to understand what you want from your time here, and to provide the support you need to achieve your ambitions. #JoinArcadis #CreateALegacy #Hybrid

Posted 1 month ago


2 - 5 years

4 - 8 Lacs

Gurugram

Work from Office


ESSENTIAL FUNCTIONS
  • 3 to 8 years of hands-on experience in the design, development, and enhancement of OAS/OBIEE 11g reports and dashboards, and their best practices.
  • Strong experience in Oracle SQL.
  • Good to have: experience in the design, development, and enhancement of BI Publisher 11g reports.
  • Understanding of OBIEE/OAS security implementations and the permissions framework.
  • Experience with OBIEE/OAS deployment, configuration, and general administrative activities.
  • Providing the first line of technical support for business-critical OBIEE/OBIA applications and reports; strong debugging skills to identify issues.
  • Strong knowledge of data warehouse concepts such as star schema, snowflake schema, and dimension and fact tables.
  • Able to perform unit testing and integration testing.
  • Able to work in RPD and MUDE work environments.
  • Good knowledge of performance tuning of dashboards and reports.
  • Strong organizational skills and the ability to accomplish multiple tasks within agreed timeframes through effective prioritization of duties in a fast-paced environment.

EDUCATION AND EXPERIENCE:
BE/BTech/MCA with 3 to 8 years of relevant experience. Preferably from a services organization background with prior experience in OBIEE/OAS environments.
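The star and snowflake schema concepts asked for above can be illustrated with a toy example. This sketch uses Python's built-in sqlite3 as a stand-in for a real warehouse; all table and column names are hypothetical, not from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse (hypothetical schema).
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Star schema: one central fact table joined to denormalized dimensions.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY,
                          product_name TEXT, category_name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
""")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO dim_date VALUES (20240101, 2024)")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240101, 99.5)")

# A snowflake schema would normalize the dimension further, e.g. split
# category_name out into its own dim_category table joined by category_id.

# Typical star-schema query: join the fact to its dimensions and aggregate.
cur.execute("""
SELECT p.category_name, d.year, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_date d    ON d.date_id = f.date_id
GROUP BY p.category_name, d.year
""")
print(cur.fetchall())  # -> [('Hardware', 2024, 99.5)]
```

The trade-off is the same one interviewers probe: the star form duplicates category text in the dimension but needs fewer joins, while the snowflake form normalizes it at the cost of extra joins.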

Posted 1 month ago


6 - 10 years

10 - 20 Lacs

Hyderabad

Work from Office


We're looking for a Data Engineer to join our team. We need someone who's great at building data pipelines and understands how data works. You'll be using tools like DBT and Snowflake a lot. The most important thing for us is that you've worked with all sorts of data sources, not just files: think different cloud systems, other companies' databases, and various online tools.

What you'll do:
  • Build and manage how data flows into our system, using DBT and storing it in Snowflake.
  • Design how our data is organized so it's easy to use for reports and analysis.
  • Fix any data problems that come up.
  • Connect to and get data from many different places, such as: cloud apps (e.g., Salesforce, marketing tools); various databases (SQL Server, Oracle, etc.); streaming data; different file types (CSV, JSON, etc.); other business systems.
  • Help us improve our data setup.

What you need:
  • Experience as a Data Engineer.
  • Strong skills with DBT (Data Build Tool).
  • Solid experience with Snowflake.
  • Must-have: experience working with many different types of data sources, especially cloud systems and company databases, not just files.
  • Good at data modeling (organizing data).
  • Comfortable with SQL.
  • Good at solving problems.
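The dbt workflow described above boils down to layered SQL transformations: raw source tables are cleaned by staging models, which feed business-facing mart models. The sketch below mimics that layering with Python's built-in sqlite3 standing in for Snowflake; the table, view, and column names are hypothetical, and real dbt would define each view as a separate .sql model file.

```python
import sqlite3

# Toy illustration of dbt-style layered modeling:
# raw source -> staging view (typing/renaming) -> mart view (aggregation).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE raw_orders (id INTEGER, amt TEXT, status TEXT);
INSERT INTO raw_orders VALUES (1, '10.0', 'complete'),
                              (2, '5.5',  'complete'),
                              (3, '7.0',  'cancelled');

-- staging layer: cast types and standardize names (like a dbt stg_ model)
CREATE VIEW stg_orders AS
SELECT id AS order_id, CAST(amt AS REAL) AS amount, status
FROM raw_orders;

-- mart layer: business-facing aggregate built on staging (a dbt mart model)
CREATE VIEW mart_revenue AS
SELECT status, SUM(amount) AS revenue
FROM stg_orders
GROUP BY status;
""")
print(con.execute("SELECT * FROM mart_revenue ORDER BY status").fetchall())
# -> [('cancelled', 7.0), ('complete', 15.5)]
```

Keeping raw, staging, and mart layers separate is what lets a pipeline absorb many heterogeneous sources: only the staging layer changes when a new source is wired in.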

Posted 1 month ago


Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
