
1082 Snowflake Jobs - Page 35

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

5 - 15 Lacs

Jaipur

Hybrid

Roles and Responsibilities

Role: Argus and our other data products are growing, and we need to ensure data accuracy across MS SQL, SSIS, and Snowflake platforms. This role is central to maintaining reliable data quality and reporting.

Responsibilities
- Develop and implement automated testing strategies to ensure the quality and accuracy of data across MS SQL, SSIS, and Snowflake platforms (a small example follows this posting).
- Collaborate with Data and Analytics developers and stakeholders to understand data requirements and design comprehensive test plans.
- Execute test cases, analyze results, and identify and report defects to ensure timely resolution.
- Establish testing processes and procedures, including test data management and version control, to maintain consistency and reliability.
- Monitor and evaluate the performance of products and processes, recommending improvements to enhance data quality and efficiency.
- Stay current on emerging trends and best practices in data testing and automation, incorporating new technologies and methodologies as appropriate.
- Work in compliance with the Hydro Quality system, HSE regulations, policies, and standardized operating procedures.
- Perform all other tasks assigned by the supervisor in charge that are necessitated by the operations of the related unit and do not conflict with applicable laws, statutory provisions, or company rules.
- Comply with area-specific customer requirements.

Required Qualifications and Skills
- 5+ years of experience as a Manual/Automated Tester or Quality Assurance Analyst in a Business Intelligence environment.
- Proven track record of designing and executing automated test scripts using industry-standard tools and frameworks (e.g., Selenium, JUnit, TestNG).
- Experience testing data pipelines, ETL processes, and data warehousing solutions across multiple platforms such as MS SQL Server, SSIS, and Snowflake.
- Strong analytical skills, with the ability to identify data anomalies and discrepancies and troubleshoot issues effectively.
- Demonstrated ability to work collaboratively with cross-functional teams and communicate effectively with technical and non-technical stakeholders.
- Experience with Scrum/Agile methodology.
- Experience in the manufacturing domain is a plus.
- BE/B.Tech, MCA, or a Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related field.
- Fluent English.
- Demonstrated capability to solve complex analytical problems through the internalization of domain knowledge and the application of technical expertise.
- Excellent communication and interpersonal skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to work independently, manage multiple projects simultaneously, and deliver high-quality results within tight deadlines.

What we offer you
- Working at the world's only fully integrated aluminum and leading renewable energy company
- Diverse, global teams
- Flexible work environment/home office
- The freedom to be creative and to learn from experts
- The possibility to grow with the company and gain new certifications
- An attractive benefits package
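A minimal sketch of the sort of cross-platform data check this posting describes: comparing row counts between a SQL Server source and a Snowflake target. All connection values and the `orders` table names are illustrative placeholders, not details from the listing.

```python
# A row-count reconciliation test between SQL Server and Snowflake.
# Connection details and table names are illustrative placeholders.
import pyodbc
import snowflake.connector

def sqlserver_count(table: str) -> int:
    conn = pyodbc.connect("DSN=mssql_dw;UID=qa_user;PWD=***")  # hypothetical DSN
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()

def snowflake_count(table: str) -> int:
    conn = snowflake.connector.connect(
        account="my_account", user="qa_user", password="***",  # hypothetical
        warehouse="QA_WH", database="ANALYTICS", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()

def test_row_counts_match():
    # Fails the test run if the load dropped or duplicated rows.
    assert sqlserver_count("dbo.orders") == snowflake_count("ORDERS")
```

In practice, a row count is only the first gate; column-level checksums or set-difference queries usually follow it.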

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Jaipur

Remote

Role & responsibilities Senior DevOps Engineer requires expertise in AWS and Databricks, with a preference for broad cloud experience. The role involves designing, implementing and optimizing CI/CD pipelines, Infrastructure as Code (IaC) and cloud automation solutions. Strong skills in monitoring, security, and scalability within cloud environments are essential. Strong skills in monitoring, security, and scalability within cloud environments area essential. Candidates should have experience in containerization, scripting, and cloud native services, ensuring efficient deployment and management of data and analytics workloads. Years of Experience: 5+ Minimum Preferred candidate profile Strong experience with AWS services (EC2, S3, Lambda, IAM, RDS, etc.) and cloud infrastructure automation. Expertise in Databricks deployment, optimization, and security best practices. Proficiency in CI/CD pipelines using tools like GitHub Actions, Jenkins, or AWS CodePipeline. Experience with Infrastructure as Code (IaC) using Terraform or CloudFormation. Strong scripting skills in Python, Bash, or PowerShell for automation. Familiarity with containerization and orchestration (Docker, Kubernetes, EKS). Knowledge of cloud security, monitoring, and logging (AWS Security Hub, CloudWatch, Datadog). Broad understanding of multi-cloud and hybrid cloud architectures is a plus.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

11 - 21 Lacs

Pune, Chennai, Coimbatore

Work from Office

Responsibilities and Qualifications:
- Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
- Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.; a type-2 sketch follows this list).
- Performs source system analysis as required.
- Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed.
- Implements best practices and tunes SQL code for optimization.
- Loads data from SF Power Exchange to a relational database using Informatica.
- Works with XMLs, XML parser, Java, and HTTP transformation within Informatica.
- Works with Informatica Data Quality (Analyst and Developer).
- Primary skill is Informatica PowerCenter.
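As background on the "type-2 dimensions" mentioned above, here is a hedged T-SQL sketch (executed via pyodbc) of one common SCD2 load pattern: expire the current row when a tracked attribute changes, then insert the new version. The `dim_customer`/`stg_customer` tables, the tracked `city` column, and the DSN are all invented for illustration; Informatica mappings implement the same logic graphically.

```python
# Type-2 slowly changing dimension load, illustrative T-SQL batch.
import pyodbc

SCD2_SQL = """
-- Expire the current row for customers whose tracked attribute changed.
UPDATE d
SET d.end_date = CAST(GETDATE() AS DATE), d.is_current = 0
FROM dim_customer d
JOIN stg_customer s ON s.customer_id = d.customer_id
WHERE d.is_current = 1 AND s.city <> d.city;

-- Insert a fresh current row for changed and brand-new customers.
INSERT INTO dim_customer (customer_id, city, start_date, end_date, is_current)
SELECT s.customer_id, s.city, CAST(GETDATE() AS DATE), NULL, 1
FROM stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE d.customer_id IS NULL;
"""

def load_scd2(conn_str: str = "DSN=edw;UID=etl;PWD=***"):  # hypothetical DSN
    with pyodbc.connect(conn_str, autocommit=False) as conn:
        conn.execute(SCD2_SQL)
        conn.commit()
```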

Posted 3 weeks ago

Apply

10.0 - 15.0 years

15 - 19 Lacs

Noida

Work from Office

DE Architect

Objective 1: Develop and implement a metadata-driven framework for Medallion Architecture (a toy sketch follows this list)
- Strong in data modeling and pipeline design
- Experience with metadata-driven frameworks and governance practices
- Strong analytical skills to identify and reduce redundancies
- Knowledge of Snowflake and Medallion Architecture

Objective 2: Optimize data pipeline performance and reliability
- Expertise in data pipeline optimization and performance tuning
- Experience with indexing and efficient orchestration techniques
- Ability to identify and implement cost-saving measures
- Knowledge of monitoring tools and processes

Objective 3: Enhance data modeling and reusability
- Strong communication and training skills
- Experience in data modeling and reusable asset creation
- Able to identify and train subject matter experts
- Proficiency in gathering and analyzing stakeholder feedback

Objective 4: Strengthen DevOps practices and documentation
- Knowledge of version control and release processes
- Experience with DevOps processes and CI/CD pipelines
- Ability to establish and maintain data asset frameworks
- Strong documentation skills

Objective 5: Lead and develop the data engineering team
- Leadership and team management skills
- Experience in conducting performance reviews and skill development plans
- Ability to establish and lead a Center of Excellence (CoE)
- Proficiency in tasking and estimation tools like Jira and DevOps
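To make "metadata-driven framework" concrete, a toy Python sketch under assumed conventions: a list of table specs drives one generic promotion template (here Snowflake SQL with QUALIFY) instead of a hand-written job per table. The metadata shape, the `load_ts` column, and the table names are invented.

```python
# A toy metadata-driven medallion load: each entry tells a generic runner
# how to promote one table from bronze to silver.
PIPELINE_METADATA = [
    {"source": "bronze.orders", "target": "silver.orders",
     "keys": ["order_id"], "dedupe": True},
    {"source": "bronze.customers", "target": "silver.customers",
     "keys": ["customer_id"], "dedupe": False},
]

def build_promotion_sql(spec: dict) -> str:
    # One template instead of one hand-written job per table -- the
    # redundancy reduction a metadata-driven framework buys you.
    if spec["dedupe"]:
        key_list = ", ".join(spec["keys"])
        return (
            f"CREATE OR REPLACE TABLE {spec['target']} AS "
            f"SELECT * FROM {spec['source']} "
            f"QUALIFY ROW_NUMBER() OVER (PARTITION BY {key_list} "
            f"ORDER BY load_ts DESC) = 1"
        )
    return f"CREATE OR REPLACE TABLE {spec['target']} AS SELECT * FROM {spec['source']}"

for spec in PIPELINE_METADATA:
    print(build_promotion_sql(spec))  # in practice, execute via a Snowflake session
```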

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 27 Lacs

Hyderabad

Work from Office

Job Title: SAP Data Engineer
Location: Hyderabad
Job Type: Full-time
Experience Level: Mid-Senior
Department: Enterprise Data & Analytics

Job Summary
We are seeking an experienced SAP Data Engineer with deep expertise in SAP Datasphere (formerly SAP Data Warehouse Cloud) and SAP S/4HANA Public Cloud integration. The ideal candidate will have a strategic mindset towards data replication and virtualization, enabling efficient, scalable, real-time data access across enterprise SAP systems. In this role, you will collaborate with business analysts and cross-functional teams to support our Enterprise Analytics platform, ensuring timely delivery of critical insights into financial, sales, and operational metrics.

Key Responsibilities
- Manage and Optimize Data Flows: Ensure efficient, real-time data integration across SAP and cloud systems, supporting optimized reporting pipelines.
- SAP Datasphere Administration: Lead space management within SAP Datasphere, including configuration of Connections, Data Builder, Business Builder, and Analytic Models.
- Integration and Pipeline Development: Integrate SAP Datasphere with SAP S/4HANA Public Cloud, ABAP CDS Views, Snowflake, Azure Data Lake, and other data platforms.
- ETL/ELT Development: Design, develop, and maintain robust ETL/ELT processes ensuring accurate data migration, governance, and policy compliance.
- Data Modeling: Design and optimize scalable, high-performance data models for both batch and streaming workloads across structured and unstructured data sources.
- SAP Analytics Cloud (SAC) Enablement: Implement SAC live connections, build and optimize dashboards, and enhance performance for large-scale analytics.
- Reporting & Visualization: Utilize SAP SAC, Power BI, and Analysis for Office to deliver actionable business insights.
- AI/ML Integration: Collaborate with data scientists to embed predictive analytics and AI/ML models into enterprise reporting workflows.
- Data Virtualization Strategy: Design and implement data virtualization solutions to reduce physical data movement and simplify architecture, improving scalability and cost-efficiency.
- Data Security & Governance: Ensure data quality, security, and compliance across all layers of the data pipeline, adhering to enterprise standards.
- Advanced Data Transformation: Leverage programming tools such as Python, Java, Apache Spark, and SQL for complex data integration and transformation workflows.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- 8+ years of experience in SAP data engineering roles.
- Strong expertise in SAP Datasphere, SAP S/4HANA, and SAP Analytics Cloud (SAC).
- Hands-on experience with data replication vs. virtualization strategies.
- Familiarity with Snowflake, Azure Data Lake, and other modern data platforms.
- Proficiency in scripting and programming languages: Python, Java, Spark, SQL.
- Experience with data governance, security, and compliance frameworks.
- Excellent communication and collaboration skills, with the ability to translate business requirements into technical solutions.

Preferred Qualifications
- Experience with ABAP CDS Views and SAP BW/4HANA.
- Certification in SAP Datasphere or SAP Analytics Cloud is a plus.
- Exposure to AI/ML integration in enterprise data environments.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Role & responsibilities

Urgent hiring for a reputed MNC.
Experience: 5+ years
Location: Pan India
Immediate joiners only

Skills: Snowflake developer, PySpark, Python, API, CI/CD, cloud services, Azure, Azure DevOps

Position: TMNA Snowflake position. Please share profiles for Snowflake developers with strong PySpark experience.

Job Description:
- Strong hands-on experience in Snowflake development, including Streams, Tasks, and Time Travel
- Deep understanding of Snowpark for Python and its application to data engineering workflows (a minimal sketch follows below)
- Proficient in PySpark, Spark SQL, and distributed data processing
- Experience with API development
- Proficiency in cloud services (preferably Azure, but AWS/GCP also acceptable)
- Solid understanding of CI/CD practices and tools like Azure DevOps, GitHub Actions, GitLab, or Jenkins for Snowflake
- Knowledge of Delta Lake, Data Lakehouse principles, and schema evolution is a plus
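A minimal Snowpark-for-Python sketch of the workflow named in this description: read a raw table, filter and project it, and persist the result back to Snowflake. Connection parameters and table names are placeholders.

```python
# Snowpark for Python: read, transform, and persist a Snowflake table.
# All connection values and table names are illustrative.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

params = {
    "account": "my_account", "user": "dev_user", "password": "***",
    "warehouse": "DEV_WH", "database": "ANALYTICS", "schema": "RAW",
}
session = Session.builder.configs(params).create()

events = session.table("RAW.EVENTS")
recent = events.filter(col("EVENT_TS") >= "2024-01-01").select(
    "EVENT_ID", "EVENT_TS", "PAYLOAD"
)
recent.write.mode("overwrite").save_as_table("CURATED.RECENT_EVENTS")
session.close()
```

The transformation runs inside Snowflake's engine; the Python client only builds and submits the query plan, which is the main draw of Snowpark for pipelines like this.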

Posted 3 weeks ago

Apply

5.0 - 8.0 years

20 - 35 Lacs

Mumbai, Pune

Hybrid

About Company: Freestone Infotech is a global IT solutions company providing innovative, best-in-class turnkey solutions to enterprises worldwide. Freestone Infotech addresses the enterprise-wide, end-to-end needs of organizations with its expertise in Big Data solutions, data analysis, machine learning, business intelligence, R&D, product development, and mobile application development.

Job Overview: As a Senior Software Engineer at Freestone Infotech Pvt. Ltd., you should be able to work independently with little supervision, and you should have excellent organization and problem-solving skills. If you also have hands-on experience in software development and agile methodologies, we'd like to meet you.

Job Title: Senior Software Engineer
Experience: 5 to 8 years

Your experience should include:
- 5+ years of experience in Java and related technologies.
- 5+ years of experience in software development.
- Experience with k8s/Docker deployment.
- Experience with Maven.
- Experience with SQL queries.
- Experience with Linux and bash scripting.
- Knowledge of version control (Git, etc.).
- Experience with Jenkins and CI/CD pipelines.
- Experience with JUnit/Mockito for testing.
- Familiarity with RESTful API development.
- Experience in Java multi-threading development.

Nice to have:
- Experience with Apache Ranger and the data access/governance domain.
- Experience with microservices, Python, Scala.
- Experience with OpenTelemetry for monitoring and metrics.
- Experience with Grafana for visualization and monitoring.
- Experience with a Python testing framework, i.e., pytest.
- Experience with the performance testing tool Locust.
- Experience with cloud services: ADLS, S3, GCP.
- Experience with big data technologies such as Apache Spark, Apache Hive, EMR.
- Experience with Snowflake/Databricks/Lake Formation.

Education: Bachelor's/Master's degree in computer science, information technology, or a related field.

Posted 3 weeks ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office

Role & responsibilities

Requirements:
- Data Modeling (Conceptual, Logical, Physical) - minimum 5 years
- Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - minimum 5 years
- Cloud Platforms (AWS, Azure, GCP) - minimum 3 years
- ETL Tools (Informatica, Talend, Apache NiFi) - minimum 3 years
- Big Data Technologies (Hadoop, Spark, Kafka) - minimum 5 years
- Data Governance & Compliance (GDPR, HIPAA) - minimum 3 years
- Master Data Management (MDM) - minimum 3 years
- Data Warehousing (Snowflake, Redshift, BigQuery) - minimum 3 years
- API Integration & Data Pipelines - good to have
- Performance Tuning & Optimization - minimum 3 years
- Business Intelligence (Power BI, Tableau) - minimum 3 years

Job Description:
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/... Experience and deep knowledge of at least one of these platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.

Key Responsibilities:

1. Data Governance & Management
- Establish and maintain a data usage hierarchy to ensure structured data access.
- Define data policies, standards, and governance frameworks to ensure consistency and compliance.
- Implement data quality management practices to improve accuracy, completeness, and reliability.
- Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms.

2. Data Architecture & Migration
- Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
- Design scalable, high-performance data architectures that support business intelligence and analytics.
- Collaborate with IT and engineering teams to ensure efficient data pipeline development.

3. Advanced Analytics & Machine Learning
- Identify and define use cases for advanced analytics that align with business objectives.
- Design and develop machine learning models to drive data-driven decision-making.
- Work with data scientists to operationalize ML models and ensure real-world applicability.

Required Qualifications:
- Proven experience as a Data Architect or in a similar data management and analytics role.
- Strong knowledge of data governance frameworks, data quality management, and metadata management.
- Hands-on experience with Microsoft Fabric and data migration from legacy systems.
- Expertise in advanced analytics, machine learning models, and AI-driven insights.
- Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
- Strong communication skills with the ability to translate complex data concepts into business insights.

Preferred candidate profile: Immediate joiner

Posted 3 weeks ago

Apply

9.0 - 14.0 years

50 - 85 Lacs

Noida

Work from Office

About the Role
We are looking for a Staff Engineer specialized in Master Data Management to design and develop our next-generation MDM platform. This role is ideal for engineers who have created, or contributed significantly to, MDM solutions. You'll lead the architecture and development of our core MDM engine, focusing on data modeling, matching algorithms, and governance workflows that enable our customers to achieve a trusted, 360-degree view of their critical business data.

A Day in the Life
- Collaborate with data scientists, product managers, and engineering teams to define system architecture and design.
- Architect and develop scalable, fault-tolerant MDM platform components that handle various data domains.
- Design and implement sophisticated entity matching and merging algorithms to create golden records across disparate data sources (a toy example follows this posting).
- Develop or integrate flexible data modeling frameworks that can adapt to different industries and use cases.
- Create robust data governance workflows, including approval processes, audit trails, and role-based access controls.
- Build data quality monitoring and remediation capabilities into the MDM platform.
- Collaborate with product managers, solution architects, and customers to understand industry-specific MDM requirements.
- Develop REST APIs and integration patterns for connecting the MDM platform with various enterprise systems.
- Mentor junior engineers and promote best practices in MDM solution development.
- Lead technical design reviews and contribute to the product roadmap.

What You Need
- 8+ years of software engineering experience, with at least 5 years focused on developing master data management solutions or components.
- Proven experience creating, or significantly contributing to, commercial MDM platforms, data integration tools, or similar enterprise data management solutions.
- Deep understanding of MDM concepts including data modeling, matching/merging algorithms, data governance, and data quality management.
- Strong expertise in at least one major programming language such as Java, Scala, Python, or Go.
- Experience with database technologies, both relational (Snowflake, Databricks, PostgreSQL) and NoSQL (MongoDB, Elasticsearch).
- Knowledge of data integration patterns and ETL/ELT processes.
- Experience designing and implementing RESTful APIs and service-oriented architectures.
- Understanding of cloud-native development and deployment on AWS or Azure.
- Familiarity with containerization (Docker) and orchestration tools (Kubernetes).
- Experience with event-driven architectures and messaging systems (Kafka, RabbitMQ).
- Strong understanding of data security and privacy considerations, especially for sensitive master data.
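A toy illustration of the matching-and-merging step described above: score candidate record pairs with a string-similarity blend, then apply a most-recent-wins survivorship rule to build the golden record. The weights, the 0.7 threshold, and the record fields are invented; production MDM engines use far richer blocking, scoring, and survivorship logic.

```python
# Minimal entity match-and-merge sketch for a "golden record".
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    # Weighted blend of name and address similarity.
    return 0.6 * similarity(rec_a["name"], rec_b["name"]) + \
           0.4 * similarity(rec_a["address"], rec_b["address"])

def merge(rec_a: dict, rec_b: dict) -> dict:
    # Survivorship rule: prefer the most recently updated value per field.
    newer, older = sorted((rec_a, rec_b), key=lambda r: r["updated"], reverse=True)
    return {k: newer.get(k) or older.get(k) for k in {*newer, *older}}

a = {"name": "Acme Corp.", "address": "1 Main St", "updated": "2024-05-01"}
b = {"name": "ACME Corporation", "address": "1 Main Street", "updated": "2024-03-10"}
if match_score(a, b) > 0.7:   # illustrative threshold
    print(merge(a, b))
```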

Posted 3 weeks ago

Apply

6.0 - 11.0 years

19 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are looking for an "AWS Databricks Data Engineer" with a minimum of 6 years' experience.
Contact: Atchaya (95001 64554)

Required candidate profile:
Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS.
Must have: AWS Databricks, PySpark, Snowflake, Talend.

Posted 4 weeks ago

Apply

8.0 - 13.0 years

19 - 25 Lacs

Bengaluru

Remote

Quality Assurance (QA) Analyst / Sr QA Analyst - Guidewire Data Migration
Job mode: Remote (EST hours)
Notice: 30 days

Job Description:
Preferred certifications (any one): Las Lenas, Mammoth, Kufri, Jasper, Hakuba, Garmisch, Flaine, Elysian, Dobson, Cortina, Banff, Aspen, Guidewire V7/8/9/10.

Quality assurance (QA) analysts, also known as testing analysts or testers, are responsible for developing and executing test plans to ensure the delivered solution meets project requirements and handles data in a predictable, error-free way. QA analysts may play a different role (e.g., business analyst, SME, etc.) earlier in the project. QA analysts are usually responsible for the following tasks:
- Work closely with business analysts and developers to identify, script, and execute functional test cases
- Verify the data mapping and compare the migrated data in the legacy system against the target system, i.e., Guidewire applications and databases (a sketch of one such check follows this posting)
- Create and execute regression test cases
- Document and track issues identified in testing
- Develop an integrated end-to-end functional test plan
- Develop a release user acceptance test plan
- Proficient with SQL

QA resources will also have the following skills:
- Experience in testing complex enterprise software solutions
- Experience writing and executing test plans
- Experience in tracking and resolving issues encountered in testing
- Functional understanding of the relevant domain (underwriting, claims, billing, etc.)
- Ease and comfort working with technologies such as Jira, SQL, and Snowflake

Candidates must be confident, proactive, and able to work independently when needed. Previous Guidewire application knowledge is very beneficial, as is previous Guidewire data migration knowledge.
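One hedged way to implement the legacy-vs-target comparison this posting mentions: a set-difference query that surfaces legacy rows missing or altered in the migrated schema. The table and column names are illustrative, and both schemas are assumed reachable from one Snowflake database.

```python
# Set-difference migration check: rows present in legacy but not in the
# migrated target indicate dropped or altered records. Names are placeholders.
import snowflake.connector

DIFF_SQL = """
SELECT policy_number, premium, status
FROM legacy.policies
EXCEPT
SELECT policy_number, premium, status
FROM migrated.policies
"""

conn = snowflake.connector.connect(
    account="my_account", user="qa_user", password="***",  # placeholders
    warehouse="QA_WH", database="MIGRATION",
)
rows = conn.cursor().execute(DIFF_SQL).fetchall()
assert not rows, f"{len(rows)} legacy rows failed to migrate cleanly"
```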

Posted 4 weeks ago

Apply

3.0 - 8.0 years

1 - 5 Lacs

Hyderabad

Work from Office

Country: India
Location: Building No. 12C, Floors 9-11, and Building No. 12B, Stilt floor, Raheja Mindspace, Cyberabad, Madhapur, Hyderabad - 500081, Telangana, India
Job Title: Logistics Analytics
Preferred Location: Hyderabad/Gurgaon, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Description:
We are seeking a highly analytical and detail-oriented Global Logistics Analytics Specialist to join our logistics COE team. This role is pivotal in driving cost optimization, operational efficiency, and strategic insights across various logistics functions: warehouse, transportation, network, OTR, ocean, and air. The ideal candidate will leverage advanced data analytics, AI/ML tools, and business intelligence platforms such as Snowflake, Power BI, and Teradata to provide actionable insights on logistics performance, spend analytics, contract KPIs, asset utilization, delivery cycle times, and more. The role is also responsible for enhancing logistics data governance and formats, supporting data-driven decision-making, and managing global logistics reporting frameworks.

Responsibilities:

Stakeholder Engagement and Management
- Collaborate closely with stakeholders to understand project-specific needs and ensure timely updates on activity statuses.
- Collaborate with regional and global logistics teams and external 3PL partners.
- Act as a trusted advisor for logistics performance, bringing insights that drive alignment between tactical execution and strategic priorities.
- Provide clear, data-backed recommendations to senior stakeholders on route optimization, mode selection, contract terms, and currency risk.

Reporting and Governance
- Develop and maintain standardized logistics dashboards covering KPIs such as delivery cycle time, on-time delivery, asset utilization, claims/returns, and cost vs. budget performance.
- Oversee centralized reporting frameworks to ensure consistency across business units and geographies.
- Lead governance of logistics data, ensuring accuracy, consistency, and currency-adjusted spend tracking.
- Manage the reporting of key metrics, ensuring data integrity and accuracy.

Project Coordination and Process Optimization
- Drive cross-functional initiatives focusing on: cost reduction, payment term optimization, and mode efficiency; spend analytics and "should-cost" modeling; claim and return trends across channels and carriers.
- Support logistics transformation projects involving AI/ML capabilities for demand-sensing, predictive routing, and exception management.
- Collaborate with data teams to ensure timely delivery of automation and analytics solutions.

Technical Competencies & Service Delivery Requirements
- Advanced skills in data modeling, querying, and visualization using: Snowflake for cloud data warehousing; Power BI for dashboard development and insight storytelling; SQL and Teradata for large-scale data analysis.
- Strong understanding of logistics metrics and how they relate to cost efficiency, delivery reliability, and service quality.
- Experience in managing and optimizing logistics data structures and formats, ensuring scalability for future needs.
- Capability to run spend analytics on logistics and warehousing spends and sourcing.

Automation and Centralization
- Identify opportunities for automation within reporting functions to streamline processes.
- Capability to handle large and complex data sets.
- Focus on the centralization of dashboards and reports to improve overall efficiency.
- Lead the automation of recurring analytics and reporting processes, freeing up bandwidth for deep-dive analysis.
- Drive centralization of logistics data and KPIs for cross-country/cross-business visibility.
- Promote self-service analytics models and upskill end users to utilize dashboards and insights independently.

Basic Qualifications & Experience:
- Bachelor's or Master's degree in Supply Chain, Logistics, Data Analytics, Engineering, or a related field.
- 6-11 years of experience in logistics analytics, preferably in a global or multinational environment.
- Advanced Excel modelling skills; VBA and macros are preferred.
- Knowledge of SQL and MS Access.
- Strong exposure to Power BI and DAX preferred.
- Strong experience with logistics KPIs, spend and cost analysis, route/mode optimization, warehouse efficiency, and return analytics.
- Proficiency in Power BI, Teradata, and SQL-based querying.
- Familiarity with AI/ML concepts and tools used in logistics and operations research.
- Excellent communication skills with a proven track record of working with cross-functional stakeholders.
- Highly organized with a strategic mindset and the ability to execute tactically.
- Proficiency in Microsoft Office (Excel, Word, PowerPoint) for analytics and presentations is mandatory.
- Strong attention to detail, with the ability to identify issues accurately and articulate observations effectively.

Benefits
We are committed to offering competitive benefits programs for all of our employees and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive your career forward through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way.

Join us and make a difference. Apply now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Posted 4 weeks ago

Apply

4.0 - 9.0 years

5 - 12 Lacs

Bengaluru

Hybrid

Please share quality profiles. Snowflake data warehousing experience with good SQL is the primary requirement.
Experience: 7+ years. Relevant experience: 5+ years.
Location: Any NTT location (Hybrid).
Please share the below skill matrix while sharing profiles.

Posted 4 weeks ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role: Snowflake + SQL
Location:

Job Description:
We are seeking a skilled Snowflake Developer to design, develop, and manage scalable data solutions using the Snowflake cloud data platform. The ideal candidate will have deep experience in data warehousing, SQL development, ETL processes, and cloud data architecture. An understanding of Control-M and Tableau will be an added advantage.

1. Snowflake (Cloud Data Warehouse):
- Good understanding of the Snowflake ecosystem
- Good experience with data modeling and dimensional modeling techniques; able to drive technical discussions with IT, business, and architects/data modelers
- Able to guide the team and provide technical solutions
- Able to prepare technical solutions and architectures as part of project requirements
- Virtual Warehouse (Compute): good understanding of warehouse creation and management
- Data Modeling & Storage: strong knowledge of LDM/PDM design
- Data Loading/Unloading and Data Sharing: good knowledge
- SnowSQL (CLI): expertise and excellent understanding of Snowflake internals and integration
- Strong hands-on experience with SnowSQL queries, stored procedures, and performance tuning techniques
- Good knowledge of preparing SnowSQL scripts for data validation and audits
- Snowpipe: good knowledge of Snowpipe implementation
- Expertise and excellent understanding of S3 internal data copy/movement
- Good knowledge of security and reader/consumer accounts
- Good knowledge of, and hands-on experience with, query performance tuning techniques

2. SQL Knowledge:
- Advanced SQL knowledge and hands-on experience writing complex queries with analytic functions (see the example after this posting)
- Strong knowledge of stored procedures
- Troubleshooting, problem solving, and performance tuning of SQL queries accessing the data warehouse
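An example of the "complex queries with analytic functions" this role expects, run through the Snowflake Python connector: rank each customer's orders by value and keep the top three per customer. Schema, warehouse, and credential values are placeholders.

```python
# Window functions in Snowflake SQL, executed via the Python connector.
import snowflake.connector

QUERY = """
SELECT customer_id,
       order_id,
       amount,
       RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank,
       SUM(amount)  OVER (PARTITION BY customer_id) AS customer_total
FROM sales.orders
QUALIFY RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) <= 3
"""

conn = snowflake.connector.connect(
    account="my_account", user="dev", password="***",  # placeholders
    warehouse="ANALYTICS_WH", database="EDW", schema="SALES",
)
for customer_id, order_id, amount, rank, total in conn.cursor().execute(QUERY):
    print(customer_id, order_id, amount, rank, total)
```

QUALIFY is Snowflake's filter over window-function results, saving the subquery that standard SQL would need here.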

Posted 4 weeks ago

Apply

10 - 15 years

25 - 40 Lacs

Pune

Hybrid

Description
- BS/MS degree in Computer Science or equivalent
- 10-15 years of experience building products on distributed systems, preferably in the data security domain
- Working knowledge of the security domain: ransomware protection, anomaly detection, data classification, and compliance of unstructured data
- Strong knowledge of cloud platforms, APIs, containers, Kubernetes, and Snowflake
- Knowledge of building microservice-based applications
- Hands-on development in either Golang or Python
- Strong development experience on Linux/Unix OS platforms

Posted 1 month ago

Apply

5 - 10 years

35 - 50 Lacs

Bengaluru

Work from Office

Position summary:
We are seeking a Senior Software Development Engineer (Data Engineering) with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable, data-driven solutions.

Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake (a minimal sketch follows this posting).
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.

Basic Qualifications
- Bachelor's or Master's degree in Computer Science or Data Science
- 5-8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (warehouses, Streams, Tasks, Snowpipe, query optimization).
- Experience with Databricks (Delta Lake, MLflow, SQL Analytics, Photon engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience with workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.

Preferred Qualifications
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience with BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.
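A compact PySpark sketch of the batch side of this role, assuming a Spark session with the Delta Lake extension configured: read raw JSON, deduplicate and standardize it, and write a partitioned Delta table. Paths and column names are placeholders.

```python
# Batch ETL: raw JSON -> cleaned, partitioned Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3://raw-bucket/orders/")           # hypothetical path
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("s3://lake-bucket/silver/orders/"))            # hypothetical path
```

Partitioning by the derived `order_date` is one common choice for date-bounded queries; the right partition key depends on the workload.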

Posted 1 month ago

Apply

4 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office

Oracle EBS Technical Consultant
Full-time
Department: ASPIRE Managed Services

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners, including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.

As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
Working as part of a team of managed services consultants, the primary role will be the technical support of Oracle e-Business Suite applications across a wide range of EBS modules and client-specific customisations. You will also be experienced in identifying process improvements that translate into Value Level Agreements and tangible business benefits for the customer.
- Provide day-to-day support and quality assurance
- Provide hands-on technical and development support for implemented ERP modules
- Gather and document business requirements on IT incidents and change work
- Document and manage technical specifications and software packages
- Assist in defining and optimizing simple yet effective business processes, and drive change within the organization through negotiation and consensus-building
- Help ensure that ERP initiatives follow the proper planning, scheduling, and management processes
- Manage on-time change delivery and business expectations, and ensure internal customer satisfaction
- Provide hands-on analysis, design, testing, implementation, and post-implementation support, utilizing prescribed software design lifecycle techniques and system documentation techniques (AIMS/OUM)
- Manage incidents within the set-out SLAs, ensuring incidents are updated in a timely manner with the expected quality of data and information
- Complete development tasks based on requirements submitted by business partners and best practices
- Fusion reporting and OIC: there are a few interfaces for which we need to provide support from time to time

Qualifications
- 8 years and above of Oracle E-Business support experience
- Knowledge of Oracle R11i/12 ideal, with functional knowledge of core Oracle EBS modules including Purchasing and Financials, or Supply Chain including Order Management, Inventory, and Discrete Manufacturing (WIP & BOM)
- Strong knowledge of Oracle development tools, some e-Business application functionality, system administration, database structure, online patching, and knowledge of a multi-org architecture
- Technical requirements: SQL, PL/SQL, Unix scripting, XML Publisher, JDeveloper, Oracle Reports/Forms, Oracle Workflow, Oracle BI Publisher, Web ADI, AME, and core Java skills
- OAF customization and OIC integrations
- Must have good experience translating business requirements and designs into technical solutions
- ITIL process knowledge
- Ability to research, learn, troubleshoot, and support complex system customisations
- Willingness to operate and progress in areas outside of previous experience
- Ability to multi-task and prioritise across concurrent workloads
- Excellent written and verbal communication

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.

Posted 1 month ago

Apply

3 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

Data Visualisation Engineer
Full-time
Department: Digital, Data and Cloud

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners, including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.

As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
We are looking for a passionate, dynamic, hardworking expert in all things data visualisation, using design-led approaches and best practice to deliver business outcomes for our clients and solve complex business problems. This role requires exceptional interpersonal skills, as you will engage with technical and business stakeholders equally to help understand and define business problems, design solutions and roadmaps, and drive projects from inception (or impasse!) to successful implementation and/or live service.

What will you be doing?
- Set up Power BI workspaces with appropriate licensing.
- Support integration of Power BI with Databricks workspaces.
- Build Power BI reports with the required refresh frequency.
- Challenge customer or project requirements to ensure we are delivering business outcomes to our clients.

Qualifications
What skills and experience do you need?
- 4 to 6 years of experience.
- Demonstrable experience designing high-quality data visualisation artefacts and taking solutions to production.
- Ability to apply knowledge of current industry trends and techniques to formulate solutions within the context of projects.
- Proficient in writing SQL, stored procedures, and views; creating and optimising complex queries; analysing query performance.
- Some experience designing and implementing solid DevOps principles for data visualisation projects.
- Familiar with agile methodologies.
- Knowledge of at least one enterprise cloud, preferably Azure.
- Full project lifecycle experience, from initial concept through to deployment and support.

What technical skills will you have?
- Microsoft Power BI enterprise approaches and rollout methodologies (preferred).
- Tableau enterprise approaches and rollout methodologies (desirable).
- Business intelligence development using the Microsoft BI stack.
- Familiar with and experienced in data modelling methodologies such as Kimball.
- Expert in DAX and Power Query (M).
- Experienced with Analysis Services cubes in global-scale visualisation deployments.
- SQL skills for querying and creating views/tables, functions, and stored procedures.
- Design and build concepts for dashboards, reports, and cubes using SQL, MDX, DAX, Power BI, or other visualisation tools.
- Document design solutions and processes.
- Microsoft certifications desirable but not essential.

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.

Posted 1 month ago

Apply

5 - 10 years

11 - 15 Lacs

Bengaluru

Work from Office

Technical Architect
Full-time
Department: Digital, Data and Cloud

Company Description
Version 1 has celebrated over 29 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners, including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023.

As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
We are seeking a highly skilled and experienced Senior Java Developer to join our dynamic team. The ideal candidate will have a strong background in Java development, extensive experience with AWS and database management, and, preferably, work experience in batch processing. As a Senior Java Developer, you will play a crucial role in designing, developing, and maintaining high-performance applications that meet our business needs.

Key Responsibilities:
- Design and Development: Architect and lead the design and development of robust, scalable, and efficient Java applications.
- AWS Integration: Utilize AWS services to build and deploy cloud-based solutions, ensuring high availability and scalability.
- Database Management: Design, implement, and maintain database schemas, write complex SQL queries, and optimize database performance.
- Batch Processing: Develop and manage batch processing systems to handle large volumes of data efficiently.
- Code Quality: Ensure code quality through code reviews, unit testing, and adherence to best practices.
- Collaboration: Work closely with cross-functional teams, including product managers, QA engineers, and other developers, to deliver high-quality software solutions.
- Troubleshooting: Identify and resolve performance bottlenecks, bugs, and other technical issues.
- Mentorship: Provide guidance and mentorship to junior developers, fostering a culture of continuous learning and improvement.

Qualifications
- Education: Bachelor's or Master's degree in computer science, engineering, or a related field.
- Experience: Proven experience of 10+ years working as a Java Developer in a product development or services environment.
- AWS skills: Minimum 3+ years of work experience with AWS services such as EC2, S3, Lambda functions, Step Functions, event bus, etc.
- Database skills: Minimum 5+ years of work experience with relational databases (e.g., Oracle, MySQL, PostgreSQL), the ability to write complex joins, performance troubleshooting experience, etc.
- Technical skills: Proficiency in Java, Spring Framework, Hibernate, and RESTful APIs.
- Problem solving: Excellent analytical and problem-solving skills.
- Communication: Strong verbal and written communication skills.
- Team player: Ability to work effectively in a collaborative team environment.

Preferred Qualifications:
- Batch processing experience: Hands-on experience with batch processing frameworks and tools.
- Python experience: A minimum of 2 years is nice to have.

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.

Posted 1 month ago

Apply

9 - 11 years

37 - 40 Lacs

Ahmedabad, Bengaluru, Mumbai (All Areas)

Work from Office

Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. This role is ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt (a minimal Airflow sketch follows this posting)
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake)
- Automate data quality checks and monitoring
- Collaborate with analysts, data scientists, and backend teams
- Optimize data flows for performance and cost

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark)
- Experience with cloud data platforms (AWS, GCP, or Azure)
- Strong understanding of data modeling and warehousing principles
- Bonus: experience with Kafka, Parquet/Avro, or real-time streaming

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
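A minimal sketch of the Airflow orchestration mentioned in the responsibilities, written for recent Airflow 2.x: one daily DAG with an extract task feeding a load task. The DAG id and task bodies are placeholders.

```python
# One daily DAG: extract -> load. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from the source system")   # placeholder

def load():
    print("write curated rows to the warehouse")      # placeholder

with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> \
        PythonOperator(task_id="load", python_callable=load)
```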

Posted 1 month ago

Apply

10 - 20 years

30 - 40 Lacs

Chennai, Bengaluru

Hybrid

Title: Sr Data and MLOps Engineer
Location: Hybrid (Bangalore/Chennai/Trichy)

Description:
• Experience within the Azure ecosystem, including Azure AI Search, Azure Storage Blob, and Azure Postgres, with expertise in leveraging these tools for data processing, storage, and analytics tasks.
• Proficiency in preprocessing and cleaning large datasets efficiently using Azure tools, Python, and other data manipulation tools.
• Strong background in Data Science/MLOps, with hands-on experience in DevOps, CI/CD, Azure cloud computing, and model monitoring.
• Expertise in healthcare data standards, such as HIPAA and FHIR, with a deep understanding of sensitive data handling and data masking techniques to protect PII and PHI.
• In-depth knowledge of search algorithms, indexing techniques, and retrieval models for effective information retrieval tasks. Experience with chunking techniques and working with vectors and vector databases like Pinecone (a toy chunking sketch follows this posting).
• Ability to design, develop, and maintain scalable data pipelines for processing and transforming large volumes of structured and unstructured data, ensuring performance and scalability.
• Implement best practices for data storage, retrieval, and access control to maintain data integrity, security, and compliance with regulatory requirements.
• Implement efficient data processing workflows to support the training and evaluation of solutions using large language models (LLMs), ensuring that models are reliable, scalable, and performant.
• Proactively identify and resolve data quality issues, pipeline failures, or resource contention to minimize disruption to systems.
• Experience with large language model frameworks, such as LangChain, and the ability to integrate them into data pipelines for natural language processing tasks.
• Familiarity with Snowflake for data management and analytics, with the ability to work within the Snowflake ecosystem to support data processes.
• Knowledge of cloud computing principles and hands-on experience with deploying, scaling, and monitoring AI solutions on platforms like Azure, AWS, and Snowflake.
• Ability to communicate complex technical concepts effectively to both technical and non-technical stakeholders, and to collaborate with cross-functional teams.
• Analytical mindset with attention to detail, coupled with the ability to solve complex problems efficiently and effectively.
• Knowledge of cloud cost management principles and best practices to optimize cloud resource usage and minimize costs.
• Experience with ML model deployment, including testing, validation, and integration of machine learning models into production systems.
• Knowledge of model versioning and management tools, such as MLflow, DVC, or Azure Machine Learning, for tracking experiments, versions, and deployments.
• Model monitoring and performance optimization, including tracking model drift and addressing performance issues to ensure models remain accurate and reliable.
• Automation of ML workflows through CI/CD pipelines, enabling smooth model training, testing, validation, and deployment.
• Monitoring and logging of AI/ML systems post-deployment to ensure consistent reliability, scalability, and performance.
• Collaboration with data scientists and engineering teams to facilitate model retraining, fine-tuning, and updating.
• Familiarity with containerization technologies, like Docker and Kubernetes, for deploying and scaling machine learning models in production environments.
• Ability to implement model governance practices to ensure compliance and auditability of AI/ML systems.
• Understanding of model explainability and the use of tools and techniques to provide transparent insights into model behavior.

Must Have:
• A minimum of 10 years' experience as a data engineer
• Hands-on experience with the Azure cloud ecosystem
• Hands-on experience using Python for data manipulation
• Deep understanding of vectors and vector databases
• Hands-on experience scaling a POC to production
• Hands-on experience using tools such as Document Intelligence, Snowflake, Function Apps, and Azure AI Search
• Experience working with PII/PHI
• Hands-on experience working with unstructured data
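To make the chunking point concrete, a toy fixed-size chunker with overlap: the step that typically precedes embedding documents into a vector index such as Pinecone or Azure AI Search. The sizes are arbitrary; real chunkers usually split on sentence or token boundaries rather than raw characters.

```python
# Fixed-size character chunking with overlap, a minimal pre-embedding step.
def chunk(text: str, size: int = 500, overlap: int = 50):
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    step = size - overlap
    # Overlap preserves context that would otherwise be cut at chunk edges.
    return [text[i : i + size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "Patient presents with ... " * 200   # stand-in for a clinical note
pieces = chunk(doc)
print(len(pieces), len(pieces[0]))
```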

Posted 1 month ago

Apply

5 - 8 years

9 - 14 Lacs

Bengaluru

Work from Office

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
1. Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

2. Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

3. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. | Performance Parameter | Measure
1 | Process | Number of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Snowflake
Experience: 5-8 years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

3 - 5 years

3 - 7 Lacs

Chennai

Work from Office

Naukri logo

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Mandatory Skills: SQL Server. Experience: 3-5 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Gurugram

Work from Office

Naukri logo

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role

Role Purpose: design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.

Do

1. Be instrumental in understanding the requirements and design of the product/software:
• Develop software solutions by studying information needs, systems flow, data usage, and work processes
• Investigate problem areas and follow the software development life cycle
• Facilitate root-cause analysis of system issues and problem statements
• Identify ideas to improve system performance and impact availability
• Analyze client requirements and convert them into feasible designs
• Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
• Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
• Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software
• Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them
• Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
• Analyze information to recommend and plan the installation of new systems or modifications to an existing system
• Ensure that code is error-free, with no bugs or test failures
• Prepare reports on programming project specifications, activities, and status
• Ensure all codes are raised per the norm defined for the project/program/account, with clear descriptions and replication patterns
• Compile timely, comprehensive, and accurate documentation and reports as requested
• Coordinate with the team on daily project status and progress, and document it
• Provide feedback on usability and serviceability, trace results to quality risk, and report them to the concerned stakeholders

3. Provide ongoing status reporting and customer focus with respect to the project and its execution:
• Capture all requirements and clarifications from the client for better-quality work
• Take feedback regularly to ensure smooth and on-time delivery
• Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
• Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
• Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
• Document necessary details and reports formally so the software is properly understood from client proposal to implementation
• Ensure good-quality interaction with the customer in e-mail content, fault report tracking, voice calls, business etiquette, etc.
• Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
1. Continuous Integration, Deployment & Monitoring of Software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: on-time delivery, software management, query troubleshooting, customer experience, completion of assigned certifications for skill upgradation
3. MIS & Reporting: 100% on-time MIS and report generation

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

7 - 12 years

30 - 45 Lacs

Hyderabad

Hybrid

Naukri logo

What is the AI Engineer team responsible for?

As a Senior AI Engineer, you'll be a key member of the Data & AI team. This team is responsible for designing and delivering data engineering, analytics, and generative AI solutions that drive meaningful business impact. We're looking for a pragmatic, results-driven problem solver who thrives in a fast-paced environment and is passionate about building solutions at scale. The ideal candidate has a strong technical foundation, a collaborative mindset, and the ability to navigate complex challenges. You should be comfortable working in a fast-moving, startup-like environment within an established enterprise, and bring the strong skill set needed to adopt new solutions quickly. You will play a crucial role in integrating AI into our existing digital solutions, optimizing our data infrastructure, and enabling insights through data.

What is a Digital & AI/ML Lead Engineer (Senior AI Engineer) responsible for?
• Serve as a hands-on technical lead, driving project execution and delivery in our growing AI team based in the Hyderabad office.
• Collaborate closely with the U.S.-based team and cross-functional stakeholders to understand business needs and deliver scalable, AI-powered solutions.
• Design and build AI applications leveraging the best available solutions.
• Rapidly prototype and evaluate AI/ML solutions aligned with business objectives.
• Stay current with emerging trends in AI and machine learning, and help implement best practices within the team.
• Mentor and support junior engineers, fostering a culture of learning and technical excellence.
• Manage unstructured data and generate embeddings that can be leveraged in AI products.

What qualifications, skills & experience would help someone be successful?
• Bachelor's or master's degree in computer science, data science, engineering, or a related field from a premium institute.
• 7+ years of experience in engineering, software engineering, data science, or machine learning, including 3+ years in a technical leadership role.
• Strong understanding of data pipelines, the Snowflake ecosystem, and master data management.
• Proficiency in Python.
• Experience working with unstructured data, large language models (LLMs), embeddings, and building generative AI prototypes.
• Self-starter with a passion for learning new tools and technologies.
• Strong communication skills and a collaborative, ownership-driven mindset.

Work Shift Timings: 2:00 PM - 11:00 PM IST
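Since the role calls for working with unstructured data and embeddings, here is a minimal, hedged sketch (illustrative rather than part of the posting) of an embedding-based semantic search prototype; the documents, query, and model checkpoint are all invented for demonstration:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Snippets of unstructured text standing in for real documents.
docs = [
    "Quarterly revenue grew 12% on strong cloud adoption.",
    "The data pipeline failed overnight due to a schema change.",
    "Customers reported slow dashboard load times after the release.",
]

# Public checkpoint chosen purely for illustration.
model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(docs, normalize_embeddings=True)

query = "Why are reports loading slowly?"
q_vec = model.encode([query], normalize_embeddings=True)[0]

# With normalized vectors, cosine similarity reduces to a dot product.
scores = doc_vecs @ q_vec
print(docs[int(np.argmax(scores))])  # best-matching document
```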

Posted 1 month ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience levels:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium; a short code sketch follows this list)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
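Several of the questions above (virtual warehouses, semi-structured data, time travel) can be made concrete with a short sketch. The following uses the snowflake-connector-python library; all connection parameters, the raw_events table, and its payload column are hypothetical placeholders, and the time-travel query assumes the offset falls within the table's data retention window:

```python
import snowflake.connector

# All connection parameters here are placeholders.
conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    warehouse="ANALYTICS_WH",  # a virtual warehouse: compute billed independently of storage
    database="DEMO_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Semi-structured data: a VARIANT column holds JSON, queried with path notation.
cur.execute("""
    SELECT payload:customer.name::STRING AS customer,
           payload:items[0].sku::STRING  AS first_sku
    FROM raw_events              -- hypothetical table with a VARIANT column 'payload'
    LIMIT 10
""")
print(cur.fetchall())

# Time travel: read the table as it existed one hour ago (within retention).
cur.execute("SELECT COUNT(*) FROM raw_events AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```

The AT(OFFSET => -3600) clause queries the table as of one hour in the past; AT and BEFORE clauses also accept timestamps and statement IDs, which is the mechanism behind Snowflake's time travel feature.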

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
