
1123 Snowflake Jobs - Page 32

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

12 - 17 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

The ETL Developer is responsible for the design, development and maintenance of various ETL processes, including processes for various types of data, potentially large datasets, and disparate data sources that require transformation and cleansing to become usable data sets. The candidate should be able to find creative solutions to complex and diverse business requirements, and should have a solid working knowledge of programming languages, data analysis, design, and ETL tool sets. The ideal candidate must have a solid background in data engineering technologies and excellent written and verbal communication skills, with the ability to collaborate effectively with business and technical experts on the team.

Primary Responsibility
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• Graduate degree or equivalent experience
• 6+ years of development, administration and migration experience in Azure Databricks and Snowflake
• 6+ years of experience with data design patterns: data warehousing, dimensional modeling and lakehouse medallion architecture (illustrated in the sketch below)
• 5+ years of experience working with Azure Data Factory
• 5+ years of experience setting up, maintaining and using Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, etc.
• 5+ years of experience working with Python and PySpark
• 3+ years of experience with Kafka
• Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
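For context on the lakehouse medallion pattern this posting names, here is a minimal PySpark sketch of a bronze-to-silver cleansing step. It assumes a Databricks-style environment with Delta Lake configured; the storage path, column names, and table names are invented for illustration, not taken from the posting.

```python
# Minimal PySpark sketch: promote raw "bronze" records to a cleansed "silver"
# table. All paths, columns, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw, append-only landing data (assumed Delta format on ADLS Gen2).
bronze = spark.read.format("delta").load(
    "abfss://lake@account.dfs.core.windows.net/bronze/claims"
)

# Silver: deduplicated, typed, quality-checked records.
silver = (
    bronze
    .dropDuplicates(["claim_id"])                        # drop replayed events
    .withColumn("service_date", F.to_date("service_date"))
    .filter(F.col("claim_id").isNotNull())               # basic quality gate
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims")
```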

Posted 3 weeks ago

Apply

6.0 - 10.0 years

19 - 25 Lacs

Gurugram

Work from Office


Primary Responsibilities
• Lead technology solution design and delivery; create and maintain optimal data solutions architecture and AI models
• Work with business partners to document complex company-wide acceptance test plans
• Work concurrently on several projects, each with specific instructions that may differ from one project to the next
• Assemble large, complex data sets that meet functional and non-functional business requirements
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud 'big data' technologies
• Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
• Work with stakeholders, including the Executive, Product, Data and Design teams, to assist with business-critical data insights, technical issues and data infrastructure needs
• Keep our data separated and secure across national boundaries through multiple data centers and cloud regions
• Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
• Work with data and analytics experts to strive for greater functionality in our data systems
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Troubleshoot production support issues post release deployment and propose solutions
• Explain, socialize and vet designs with internal and external stakeholders

Required Qualifications
• Undergraduate degree in Engineering or equivalent experience
• 7+ years of experience in Data Engineering and Advanced Analytics
• Strong experience building Generative AI based solutions for data management (data pipelines, data standardization, data quality) and data analytics
• Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases
• Experience building and optimizing 'big data' data pipelines, architectures and data sets
• Experience in cloud technologies and Snowflake
• Experience in Kafka development (see the consumer sketch below)
• Experience in Python/Java programming
• Experience creating business data models
• Experience in report development and dashboarding
• Strong experience driving customer experience
• Experience working with agile teams
• Experience in healthcare clinical domains
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Strong analytic skills related to working with unstructured datasets
• A successful history of manipulating, processing and extracting value from large, disconnected datasets
• Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores
• Strong project management and organizational skills
• Experience supporting and working with cross-functional teams in a dynamic environment

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices, or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high performance teams against sophisticated challenges that matter. Optum: incredible ideas in one incredible company and a singular opportunity to do your life's best work.SM

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
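As context for the Kafka development requirement above, here is a minimal consumption-loop sketch using the kafka-python client. The topic name, broker address, and event fields are assumptions for illustration, not details from the posting, and a running broker is required.

```python
# Minimal Kafka consumer sketch (kafka-python client); the topic name and
# broker address are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "member-events",                          # assumed topic
    bootstrap_servers="localhost:9092",       # assumed broker
    group_id="analytics-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream, records like this would be transformed and loaded into the
    # warehouse; here we only show the consumption loop itself.
    print(event.get("event_type"), event.get("member_id"))
```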

Posted 3 weeks ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
• Lead the migration of ETLs from the on-premises SQL Server based data warehouse to Azure Cloud, Databricks and Snowflake (see the sketch after this listing)
• Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
• Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
• Implement DevOps practices and CI/CD pipelines using GitHub Actions
• Collaborate with cross-functional teams to ensure seamless integration and data flow
• Optimize and troubleshoot data pipelines and workflows
• Ensure data security and compliance with industry standards
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• 6+ years of experience as a Cloud Data Engineer
• Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
• Solid experience in ETL development using on-premises databases and ETL technologies
• Experience with Python or other scripting languages for data processing
• Experience with Agile methodologies
• Proficiency in DevOps and CI/CD practices using GitHub Actions
• Proven excellent problem-solving skills and ability to work independently
• Proven solid communication and collaboration skills
• Proven solid analytical skills and attention to detail
• Proven ability to adapt to new technologies and learn quickly

Preferred Qualifications
• Certification in Azure or Databricks
• Experience with data modeling and database design
• Experience with development in Snowflake for data engineering and analytics workloads
• Knowledge of data governance and data quality best practices
• Familiarity with other cloud platforms (e.g., AWS, Google Cloud)
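One common pattern in the SQL Server-to-cloud migration work this posting describes is pulling a table that an SSIS package used to load over JDBC and landing it in the lake as Delta. The sketch below assumes a Databricks/PySpark environment with the SQL Server JDBC driver available; the host, table, credentials, and lake path are placeholders, and credentials would come from a key vault in practice.

```python
# Sketch of one migration pattern: read an on-prem SQL Server table over
# JDBC and land it in ADLS as Delta. All connection details are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver_to_lake").getOrCreate()

source = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=EDW")
    .option("dbtable", "dbo.FactClaims")      # assumed source table
    .option("user", "etl_user")
    .option("password", "<secret>")           # in practice, from a key vault
    .load()
)

source.write.format("delta").mode("overwrite") \
    .save("abfss://lake@account.dfs.core.windows.net/raw/FactClaims")
```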

Posted 3 weeks ago

Apply

5.0 - 9.0 years

15 - 20 Lacs

Noida

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
• Oversee the Analytics team, ensuring the development of analytical solutions for 30+ markets in the US
• Build and manage a global team of 30+ data analysts and scientists
• Handle team building, goal setting, development plans, and team restructuring
• Provide direction and oversight for the architecture, technical design, development, quality assurance, and support readiness of analytics projects
• Manage the planning and roadmap definition for all analytics solutions within the analytics projects
• Collaborate with senior leadership, stakeholders, and infrastructure teams to define technical architecture, policies, processes, and budget management
• Drive innovation and build strategies for technology modernization, including infrastructure modernization and cloud adoption
• Foster positive relationships with internal and external customers to ensure profitability and customer satisfaction
• Utilize the OKR (Objectives and Key Results) approach to define and track team goals and accomplishments
• Ensure that all standard requirements have been met and take part in performing the technical analysis
• Assist the project manager by compiling information from current systems, analyzing program requirements, and ensuring they meet the specified time requirements
• Resolve moderate problems associated with the designed programs and provide technical guidance on complex programming
• Develop and implement analytics solutions for 31 states using structured and unstructured data stored in Snowflake
• Apply advanced statistics, data science methodologies, and AI practices, including generative AI skills leveraging GPT models
• Perform data mining, cleaning, and aggregation processes to prepare data for analysis
• Query and manage data in Snowflake, ensuring data quality and integrity (see the sketch below)
• Communicate complex data insights through storytelling with data and visualizations using Power BI
• Collaborate with non-technical and functional stakeholders from health plans to provide prescriptive analytics and help them find answers to data questions
• Lead and mentor junior data scientists, providing guidance on best practices and methodologies
• Ensure expertise with MLOps for deploying and managing machine learning models in production
• Work in an agile environment, contributing to sprint planning, reviews, and retrospectives
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• 8+ years of experience in data science, with a solid background in traditional ML modeling, statistics, and data science methodologies
• Experience with MLOps and deploying machine learning models in production
• Proficiency in Microsoft Fabric, Azure Stack, Azure ML, and Snowflake querying
• Expertise in AI practices, including generative AI skills leveraging GPT models
• Solid skills in storytelling with data and creating visualizations using Power BI
• Familiarity with embeddings and other advanced data science techniques
• Proven excellent communication and collaboration skills to work with non-technical stakeholders
• Ability to work in an agile environment and contribute to team success

Preferred Qualifications
• Master's degree in Data Science, Statistics, Computer Science, or a related field
• Experience with healthcare data and knowledge of the healthcare industry

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
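For the Snowflake querying skill the posting names, here is a minimal sketch using the official snowflake-connector-python package; the account, credentials, warehouse, and table are hypothetical placeholders.

```python
# Minimal Snowflake query sketch (snowflake-connector-python). Account,
# credentials, and the table queried are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="analyst",
    password="<secret>",           # in practice, from a secrets manager
    warehouse="ANALYTICS_WH",
    database="HEALTHPLANS",
    schema="PUBLIC",
)

cur = conn.cursor()
cur.execute("""
    SELECT state, COUNT(*) AS member_count
    FROM members
    GROUP BY state
    ORDER BY member_count DESC
""")
for state, member_count in cur.fetchall():
    print(state, member_count)
cur.close()
conn.close()
```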

Posted 3 weeks ago

Apply

5.0 - 9.0 years

13 - 18 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and experienced Technical Delivery Lead to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for managing and leading the migration of an on-premises Enterprise Data Warehouse (SQL Server) to a modern cloud-based data platform utilizing Azure Cloud data tools and Snowflake. This platform will enable offshore (non-US) resources to build and develop Reporting, Analytics, and Data Science solutions.

Primary Responsibilities
• Manage and lead the migration of the on-premises SQL Server Enterprise Data Warehouse to Azure Cloud and Snowflake
• Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, Databricks, and Snowflake
• Manage and guide the development of cloud-native ETLs and data pipelines using modern technologies on Azure Cloud, Databricks, and Snowflake
• Implement and oversee DevOps practices and CI/CD pipelines using GitHub Actions
• Collaborate with cross-functional teams to ensure seamless integration and data flow
• Optimize and troubleshoot data pipelines and workflows
• Ensure data security and compliance with industry standards
• Provide technical leadership and mentorship to the engineering team
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• 8+ years of experience in a Cloud Data Engineering role, with 3+ years in a leadership or technical delivery role
• Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage), Databricks, and Snowflake
• Experience with Python or other scripting languages for data processing
• Experience with Agile methodologies and project management tools
• Solid experience developing cloud-native ETLs and data pipelines using modern technologies on Azure Cloud, Databricks, and Snowflake
• Proficiency in DevOps and CI/CD practices using GitHub Actions
• Proven excellent problem-solving skills and ability to work independently
• Proven solid communication and collaboration skills
• Solid analytical skills and attention to detail
• Proven track record of successful project delivery in a cloud environment

Preferred Qualifications
• Certification in Azure or Snowflake
• Experience working with automated ETL conversion tools used during cloud migrations (SnowConvert, BladeBridge, etc.)
• Experience with data modeling and database design
• Knowledge of data governance and data quality best practices
• Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

Posted 3 weeks ago

Apply

5.0 - 9.0 years

11 - 15 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

As a Senior Data Engineer at Optum you'll help us streamline the flow of information and deliver insights to manage our various Data Analytics web applications, which serve internal and external customers. This specific team works on features such as OpenAI API integrations, working with customers to integrate disparate data sources into usable datasets, and configuring databases for our web application needs. Your work will contribute to lowering the overall cost of healthcare for our consumers and helping people live healthier lives.

Primary Responsibilities
• Data Pipeline Development: Develop and maintain data pipelines that extract, transform, and load (ETL) data from various sources into a centralized data storage system, such as a data warehouse or data lake. Ensure the smooth flow of data from source systems to destination systems while adhering to data quality and integrity standards
• Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting
• Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks
• Maintain and enhance existing application databases to support our many Data Analytics web applications, and work with our web developers on new requirements and applications
• Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines
• Implement data governance in line with company standards
• Partner with Data Analytics and Product leaders to design best practices and standards for developing production analytic pipelines
• Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others)
• Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

You will be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in.

Required Qualifications
• Extensive hands-on experience developing data pipelines that demonstrates a solid understanding of software engineering principles
• Proficiency in Python across multiple general-purpose use cases, including but not limited to developing data APIs and pipelines
• Solid understanding of software engineering principles (micro-services applications and ecosystems)
• Fluent in SQL (Snowflake/SQL Server), with experience using window functions and more advanced features (see the sketch below)
• Understanding of DevOps tools, Git workflow and building CI/CD pipelines
• Solid understanding of Airflow
• Proficiency in design and implementation of pipelines and stored procedures in SQL Server and Snowflake
• Demonstrated ability to work with business and technical audiences in business requirement meetings, technical whiteboarding exercises, and SQL coding or debugging sessions

Preferred Qualifications
• Bachelor's Degree or higher in Database Management, Information Technology, Computer Science or similar
• Experience with Azure Data Factory or Apache Airflow
• Experience with Azure Databricks or Snowflake
• Experience working on projects with agile/scrum methodologies
• Experience with shell scripting languages
• Experience working with Apache Kafka, building appropriate producer or consumer apps
• Experience with production quality ML and/or AI model development and deployment
• Experience working with Kubernetes and Docker, and knowledge of cloud infrastructure automation and management (e.g., Terraform)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
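The window-function fluency this posting asks for looks like the following. The sketch is self-contained and runnable against an in-memory SQLite database (SQLite 3.25+ supports window functions); in practice the same SQL would run on Snowflake or SQL Server, and the table and columns here are made up.

```python
# Self-contained SQL window-function illustration; the claims table and its
# columns are invented. Requires SQLite 3.25+ (bundled with modern Python).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (member_id TEXT, service_date TEXT, amount REAL);
    INSERT INTO claims VALUES
        ('M1', '2024-01-05', 120.0),
        ('M1', '2024-02-11',  80.0),
        ('M2', '2024-01-20', 200.0);
""")

# Running total per member, ordered by service date -- a typical
# window-function pattern in pipeline and reporting SQL.
rows = conn.execute("""
    SELECT member_id,
           service_date,
           SUM(amount) OVER (
               PARTITION BY member_id
               ORDER BY service_date
           ) AS running_total
    FROM claims
    ORDER BY member_id, service_date
""").fetchall()

for row in rows:
    print(row)
```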

Posted 3 weeks ago

Apply

3.0 - 8.0 years

8 - 13 Lacs

Noida

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
• Cloud Migration Planning and Execution: Assist in developing and implementing strategies for migrating ETL processes to cloud platforms like Azure; participate in assessing the current infrastructure and creating a detailed migration roadmap
• ETL Development and Optimization: Design, develop, and optimize DataStage ETL jobs for cloud environments; ensure data integrity and performance during the migration process
• Unix Scripting and Automation: Utilize Unix shell scripting to automate data processing tasks and manage ETL workflows; implement and maintain scripts for data extraction, transformation, and loading
• Collaboration and Coordination: Work closely with cloud architects, senior data engineers, and other stakeholders to ensure seamless integration and migration; coordinate with IT security teams to ensure compliance with data privacy and security regulations
• Technical Support and Troubleshooting: Provide technical support during and after the migration to resolve any issues; conduct testing and validation to ensure the accuracy and performance of migrated data
• Documentation and Training: Maintain comprehensive documentation of the migration process, including data mappings, ETL workflows, and system configurations; assist in training team members and end-users on new cloud-based ETL processes and tools
• Performance Monitoring and Optimization: Monitor the performance of ETL processes in the cloud and make necessary adjustments to optimize efficiency; implement best practices for cloud resource management and cost optimization
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• Engineering graduate or equivalent experience
• 3+ years of relevant DataStage development experience
• 2+ years of experience in development/coding on Spark/Scala, Python or PySpark
• 1+ years of experience working on Microsoft Azure Databricks
• Relevant experience with databases like Teradata and Snowflake
• Hands-on development experience in UNIX scripting
• Experience working on data warehousing projects
• Experience with Test Driven Development and Agile methodologies
• Sound knowledge of SQL programming and SQL query skills
• Proven ability to apply knowledge of principles and techniques to solve technical problems and write code based on technical design
• Proficient in learning and adopting new technologies and using them to execute use cases for business problem solving
• Exposure to job schedulers like Airflow and the ability to create and modify DAGs (see the DAG sketch below)
• Proven solid communication skills, both written and verbal
• Proven ability to understand the existing application codebase, perform impact analysis and update the code when required based on the business logic or for optimization
• Proven exposure to DevOps methodology and creating CI/CD deployment pipelines
• Proven solid analytical skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
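For the Airflow requirement above, here is a minimal two-task DAG sketch in the Airflow 2 style. The DAG id, schedule, and task bodies are illustrative placeholders, not a real pipeline from this project.

```python
# Minimal Airflow 2 DAG sketch: two dependent tasks on a daily schedule.
# Names, schedule, and task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")

def load():
    print("load data to target")

with DAG(
    dag_id="datastage_replacement_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load      # load runs only after extract succeeds
```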

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Gurugram

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and experienced Senior Cloud Data Engineer to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for migrating our on-premises Enterprise Data Warehouse (SQL Server) to a modern cloud-based data platform utilizing Azure Cloud data tools, Delta Lake and Snowflake.

Primary Responsibilities
• Lead the migration of ETLs from the on-premises SQL Server based data warehouse to Azure Cloud, Databricks and Snowflake
• Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
• Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
• Implement DevOps practices and CI/CD pipelines using GitHub Actions
• Collaborate with cross-functional teams to ensure seamless integration and data flow
• Optimize and troubleshoot data pipelines and workflows
• Ensure data security and compliance with industry standards
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• 6+ years of experience as a Cloud Data Engineer
• Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
• Solid experience in ETL development using on-premises databases and ETL technologies
• Experience with Python or other scripting languages for data processing
• Experience with Agile methodologies
• Proficiency in DevOps and CI/CD practices using GitHub Actions
• Proven excellent problem-solving skills and ability to work independently
• Solid communication and collaboration skills
• Solid analytical skills and attention to detail
• Ability to adapt to new technologies and learn quickly

Preferred Qualifications
• Certification in Azure or Databricks
• Experience with data modeling and database design
• Experience with development in Snowflake for data engineering and analytics workloads
• Knowledge of data governance and data quality best practices
• Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

10 - 15 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Test Planning & Automation Lead - Cloud Data Modernization

Position Overview: We are seeking a highly skilled and experienced Test Planning & Automation Lead to join our team for a Cloud Data Modernization project. This role involves leading the data validation testing efforts for the migration of an on-premises Enterprise Data Warehouse (SQL Server) to a target cloud tech stack comprising Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage, etc.) and Snowflake. The primary goal is to ensure data consistency between the on-premises and cloud environments.

Primary Responsibilities
• Lead Data Validation Testing: Oversee and manage the data validation testing process to ensure data consistency between the on-premises SQL Server and the target cloud environment
• Tool Identification and Automation: Identify and implement appropriate tools to automate the testing process, reducing reliance on manual methods such as Excel or manual file comparisons (a minimal sketch of this pattern follows below)
• Testing Plan Development: Define and develop a comprehensive testing plan that addresses validations for all data within the data warehouse
• Collaboration: Work closely with data engineers, cloud architects, and other stakeholders to ensure seamless integration and validation of data
• Quality Assurance: Establish and maintain quality assurance standards and best practices for data validation and testing
• Reporting: Generate detailed reports on testing outcomes, data inconsistencies, and corrective actions
• Continuous Improvement: Continuously evaluate and improve testing processes and tools to enhance efficiency and effectiveness
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• Bachelor's degree or above
• Leadership Experience: 6+ years as a testing lead in Data Warehousing or Cloud Data Migration projects
• Automation Tools: Experience with data validation through custom-built Python frameworks and testing automation tools
• Testing Methodologies: Proficiency in defining and implementing testing methodologies and frameworks for data validation
• Technical Expertise: Solid knowledge of Python, SQL Server, Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage), Databricks, and Snowflake
• Analytical Skills: Proven excellent analytical and problem-solving skills to identify and resolve data inconsistencies
• Communication: Proven solid communication skills to collaborate effectively with cross-functional teams
• Project Management: Demonstrated ability to manage multiple tasks and projects simultaneously, ensuring timely delivery of testing outcomes

Preferred Qualifications
• Experience leading data validation testing efforts in cloud migration projects
• Familiarity with Agile methodologies and project management tools

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
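As a sketch of the kind of automated validation framework this role would build, here is a minimal per-table row-count comparison. Assumptions: the stubbed query callables stand in for real pyodbc and Snowflake connections, and real frameworks would also compare checksums and column-level aggregates.

```python
# Minimal source-vs-target validation sketch: compare row counts per table
# between an on-prem SQL Server and Snowflake. Table names are invented.
def table_counts(run_query, tables):
    """run_query: callable executing SQL and returning a scalar count."""
    return {t: run_query(f"SELECT COUNT(*) FROM {t}") for t in tables}

def compare(source_counts, target_counts):
    """Return {table: (source, target)} for every mismatched table."""
    return {
        t: (source_counts[t], target_counts.get(t))
        for t in source_counts
        if source_counts[t] != target_counts.get(t)
    }

# Stubbed query functions stand in for real database connections:
src = table_counts(lambda q: 1000, ["dbo.Claims"])
tgt = table_counts(lambda q: 998, ["dbo.Claims"])
print(compare(src, tgt))   # -> {'dbo.Claims': (1000, 998)}
```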

Posted 3 weeks ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Noida

Work from Office


Primary Responsibilities
• Create Platform Support/Monitoring portal applications to assist in monitoring inventories, API performance and data trends
• Troubleshoot and resolve production defects at all levels
• Make critical architecture decisions to guide the development of robust and scalable software solutions
• Define and enforce engineering policies and best practices for the design, development, and delivery of software
• Collaborate cross-functionally to ensure technical alignment with business goals
• Stay up to date with the latest industry trends and technologies

Required Qualifications
• 10+ years of software engineering experience
• 3+ years of hands-on experience with DevOps principles, CI/CD pipelines and automation tools (Git, GitHub Actions, Jenkins, GitHub version control, etc.)
• 2+ years of experience with Angular, .NET technologies, databases, Kubernetes and cloud technologies
• Ability to understand database structures and to manipulate, extract, and update data
• Proficient in SQL; experienced in developing stored procedures, views, triggers and indexes

Preferred Qualifications
• Bachelor's/Master's degree in Computer Science, Software Engineering or a related field
• 3+ years of experience with Aha, Rally, ServiceNow, and GitHub, along with an understanding of DevOps
• 2+ years of Snowflake cloud data warehouse/platform experience
• Solid technical and platform knowledge, including some or all of: Splunk, Kafka, queueing, Elastic APM, Red Gate, Grafana, and Airflow

#LETSGROW

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Primary Responsibility
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
• Graduate degree or equivalent experience
• 8+ years of development, administration and migration experience in Azure Databricks and Snowflake
• 8+ years of experience with data design patterns: data warehousing, dimensional modeling and lakehouse medallion architecture
• 5+ years of experience working with Azure Data Factory
• 5+ years of experience setting up, maintaining and using Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, etc.
• 5+ years of experience working with Python and PySpark
• 3+ years of experience with Kafka
• Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders

Posted 3 weeks ago

Apply

2.0 - 6.0 years

8 - 12 Lacs

Gurugram

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and motivated Data Analyst to join our team. This role involves close collaboration with stakeholders to understand data requirements and deliver actionable insights through advanced analytics, data modeling, and business intelligence solutions. The ideal candidate will have solid expertise in advanced Python, cloud platforms, and SQL, along with a passion for solving complex problems and driving data-driven decision-making.

Primary Responsibilities
• Collaborate with stakeholders to gather and understand data and reporting requirements
• Design and develop business intelligence solutions using Agile methodologies
• Build tools and models such as segmentation, dashboards, data visualizations, and decision aids
• Deliver activity and value analytics to internal and external stakeholders
• Perform descriptive and regression-based analytics, incorporating SME input into algorithm design (see the sketch below)
• Manage and manipulate structured and semi-structured data from various sources
• Conduct data normalization, quality assurance, and transformation processes
• Create and maintain data pipelines, ETL workflows, and reporting models
• Develop and maintain dashboards and reports using Power BI and other visualization tools
• Provide technical support for database and infrastructure-related queries
• Ensure data quality and integrity throughout the data lifecycle
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• Hands-on experience with Data Warehousing and Business Intelligence (DWBI)
• Experience with ETL tools such as SSIS and data pipeline orchestration
• Experience with cloud platforms (e.g., Azure, AWS, or GCP) for data storage, processing, and analytics
• Experience in data mining, statistical analysis, and developing predictive models
• Solid understanding of database management, performance tuning, and software development fundamentals
• Solid command of advanced Python for data analysis, automation, and scripting
• Proficiency in SQL and T-SQL, including writing complex queries, stored procedures, and performance tuning
• Proven excellent problem-solving, documentation, and communication skills

Preferred Qualifications
• Bachelor's Degree in MIS, Statistics, Mathematics, Computer Science, or a related field
• Experience in the healthcare industry
• Understanding of AI/ML
• Familiarity with data validation routines, data governance, and compliance standards
• Familiarity with Snowflake or other cloud-based data warehouses
• Power BI exposure for creating dashboards, scorecards, and visual analytics
• Proven ability to work independently and manage multiple priorities in a fast-paced environment

#GEN
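To illustrate the descriptive and regression-based analytics named above, here is a small self-contained pandas/numpy sketch; the dataset and the cost-per-visit framing are invented for illustration.

```python
# Minimal descriptive + regression analytics sketch; data is made up.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "visits": [1, 2, 3, 4, 5, 6],
    "cost":   [110, 195, 310, 405, 480, 620],
})

# Descriptive statistics of the kind reported to stakeholders.
print(df.describe())

# Simple least-squares fit: cost as a linear function of visits.
slope, intercept = np.polyfit(df["visits"], df["cost"], deg=1)
print(f"estimated cost per visit: {slope:.1f}, baseline: {intercept:.1f}")
```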

Posted 3 weeks ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Noida

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Shift Timing: 3.30 pm to 1.00 am

Primary Responsibilities
• Independently analyze and investigate data and complex issues using business intelligence tools to produce findings that may be shared with others
• Independently create business intelligence solutions and models to support existing analyses, perform new analyses, interpret results, develop actionable insights and present recommendations for use across the company
• Independently develop tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis
• Work and collaborate closely with US-based leaders in a matrixed team environment
• Partner directly with business stakeholders to understand business problems, gather requirements, and demonstrate tools, solutions and analysis
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Positions in this function are responsible for the management and manipulation of mostly structured data, with a focus on building business intelligence tools, conducting analysis to distinguish patterns and recognize trends, performing normalization operations and assuring data quality. Depending on the specific role and business line, example responsibilities in this function could include creating specifications to bring data into a common structure, creating product specifications and models, developing data solutions to support analyses, performing analysis, interpreting results, developing actionable insights and presenting recommendations for use across the company. Roles in this function could partner with stakeholders to understand data requirements and develop tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis to support the organization. Other roles could include producing and managing the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic and statistical software to build solutions, perform analysis and interpret data. Positions in this function work on predominantly descriptive and regression-based analytics and tend to leverage subject matter expert views in the design of their analytics and algorithms. This function is not intended for employees performing the following work: production of standard or self-service operational reporting, causal inference led (healthcare analytics) or data pattern recognition (data science) analysis, and/or image or unstructured data analysis using sophisticated theoretical frameworks. Generally work is self-directed and not prescribed.

Required Qualifications
• 5+ years of experience in business/finance, including analysis experience with a solid understanding of data visualization
• 5+ years of experience in analysis of business process and workflow, providing evaluation, benchmarking and/or process improvement recommendations
• 5+ years of experience with SQL
• 4+ years of experience with Alteryx, SAS or other ETL tools
• Experience with Agile methodology
• Proven ability to work in ambiguity

Preferred Qualifications
• Bachelor's degree in Business, Finance, Health Administration or a related field
• 3+ years of total experience in Power BI or Tableau
• Snowflake experience
• Experience with Agile and scrum methodologies
• Healthcare experience
• Knowledge of Azure fundamentals
• Knowledge of DBT

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

13 - 18 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are looking for a Software Engineering Manager with the ability to build robust, high-quality designs for the prescribed capability and program; someone who can take our journey of innovation to the next level and deliver a rich, engaging and unmatched customer experience. In this role, you will ensure flawless, cutting-edge experiences.

Primary Responsibilities
• Run the production environment by monitoring availability and taking a holistic view of system health
• Build software and systems to manage platform infrastructure and applications
• Improve reliability, quality, and time-to-market of our suite of software solutions
• Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve
• Provide primary operational support and engineering for multiple large distributed software applications
• Estimate: Create, understand and validate designs and estimated effort for a given module/task, and be able to justify them
• Mentor: Coach a high-performing engineering team to build and deliver healthcare products to market
• Operations: Possess or acquire solid troubleshooting skills and an interest in troubleshooting issues across disparate technologies and environments
• Thought Leadership: Propose and implement best-in-class architectural solutions for big and complex systems
• Engineering: Implement and adhere to best engineering practices such as design, unit testing, functional test automation, and continuous integration and delivery
• Stakeholder Management: Excellent communication skills, clarity of thought and the ability to take decisions based on available information
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
• B.Tech/MCA/MSc/MTech (16+ years of formal education; correspondence courses are not relevant)
• 12+ years of work experience in product companies
• Experience with distributed storage technologies like NFS, HDFS, Ceph and S3, as well as dynamic resource management frameworks (Mesos, Kubernetes, Yarn)
• Solid experience in Core Java, Spring, Struts, and Hibernate/Spring Data JPA
• Exposure to agile development processes
• Proven ability to program (structured and OO) with one or more high-level languages, such as Python, Java, C/C++, Ruby, and JavaScript
• Proven proactive approach to spotting problems, areas for improvement, and performance bottlenecks

Preferred Qualifications
• Work experience in Agile/Scrum methodology
• Work experience in product engineering
• Knowledge of Snowflake and big data
• Knowledge of SAFe
• Knowledge of the US healthcare domain in general and Payment Integrity in particular

Posted 3 weeks ago

Apply

8.0 - 11.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

Job Summary
Synechron is seeking a skilled Data Modeler with intermediate-level experience to join our team. This role is pivotal for translating business needs into strategic data models that support our objectives, particularly within the BFSI and Wealth Management domains. The Data Modeler will contribute to our business success by developing, evaluating, and maintaining robust data models that ensure consistency and accuracy in our data management.
Software Requirements
Required: Proficiency in SQL and hands-on experience with data modeling tools such as PowerDesigner or Erwin; experience with one or more databases like Teradata, SQL Server, Hadoop Hive, or Snowflake
Preferred: Familiarity with Python and Power BI; Unix scripting skills
Overall Responsibilities
Design, implement, and maintain conceptual, logical, and physical data models.
Translate business needs into data models that support strategic solutions.
Develop and enforce best practices for naming conventions and coding practices.
Perform reverse engineering of physical data models from various databases.
Analyze data-related system integration challenges and propose solutions.
Guide System Analysts, Engineers, and Programmers on project capabilities and limitations.
Review software modifications to enhance efficiency and performance.
Technical Skills (By Category)
Programming Languages - Required: SQL; Preferred: Python
Databases/Data Management - Required: Experience with Teradata, SQL Server, Hadoop Hive, Snowflake
Frameworks and Libraries - Preferred: PowerDesigner, Erwin
Development Tools and Methodologies - Preferred: Agile methodologies
Security Protocols - Preferred: Knowledge of data security best practices
Experience Requirements
4 to 6 years of data modeling experience.
Experience in the BFSI domain and Wealth Management business.
Experience in working with larger agile teams, fleets, and squads.
Day-to-Day Activities
Create and maintain data models and corresponding metadata.
Collaborate with cross-functional teams to support database-related initiatives.
Participate in regular meetings to align on project goals and progress.
Analyze and propose solutions for system integration challenges.
Evaluate data models and databases for discrepancies and propose corrections.
Qualifications
Required: Bachelor’s degree in Computer Science or a related field.
Preferred: Certifications in data modeling or database management.
Commitment to continuous professional development and staying current with industry trends.
Professional Competencies
Strong critical thinking and problem-solving capabilities.
Effective leadership and teamwork abilities.
Excellent communication and stakeholder management skills.
Adaptability and a continuous learning orientation.
Innovative mindset and ability to manage time and priorities effectively.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Role Summary
We are looking for a skilled Talend Developer to join our Digital Delivery team in Hyderabad. The ideal candidate will have 3-5 years of experience in building robust ETL pipelines, integrating diverse data sources, and ensuring high data quality using the Talend ecosystem. You will play a key role in delivering scalable and secure data integration solutions for enterprise clients.
Key Responsibilities
Design, develop, and deploy ETL workflows using Talend Open Studio or Talend Data Integration
Extract, transform, and load data from multiple sources including databases, APIs, and flat files
Implement data transformation logic, cleansing rules, and validation checks as per business requirements
Optimize Talend jobs for performance, reliability, and scalability
Integrate Talend solutions with cloud data warehouses and third-party systems (e.g., Snowflake, Azure, Salesforce)
Troubleshoot and resolve issues in production and test environments
Document ETL designs, data mappings, and integration specifications
Required Skills & Experience
• 3-5 years of hands-on experience in ETL development, with at least 2 years using Talend
• Proficiency in SQL, relational databases, and data modeling concepts
• Experience with REST/SOAP APIs and JSON/XML/CSV file formats
• Strong understanding of data quality, error handling, and job scheduling
• Hands-on knowledge of Git, basic scripting (Shell/Python), and CI/CD for ETL pipelines

Posted 3 weeks ago

Apply

12.0 - 18.0 years

0 - 1 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

Naukri logo

Greetings from 3i Infotech! Please find below the JD for the Senior Technical Manager - Solutions Architect position (Navi Mumbai/Mumbai).
We are seeking a highly motivated and experienced Data & AI Leader to join our team. The ideal candidate will be responsible for leading and managing the delivery of multiple projects within the Data & AI domain. This role requires in-depth expertise in Azure data services, as well as the ability to effectively lead a team of data professionals.
Key Responsibilities:
Lead a team of data engineers, data scientists, and business analysts in the successful execution of Data & AI projects.
Own the end-to-end delivery process, ensuring that projects are completed on time and within budget while maintaining high-quality standards.
Collaborate with cross-functional teams, including business stakeholders, to gather requirements, define project scope, and set clear objectives.
Design robust and scalable data solutions utilizing Power BI, Tableau, and Azure data services.
Provide technical guidance and mentorship to team members, fostering a culture of continuous learning and development.
Apply project management skills to plan, execute, and close projects, managing timelines, scope, and resources.
Lead and coordinate cross-functional teams, facilitating communication and collaboration to achieve project goals.
Client Liaison: Act as the primary point of contact for clients, addressing their needs and resolving any issues that arise.
Ensure project deliverables meet quality standards and align with client requirements.
Provide regular project updates and status reports to stakeholders and senior management.
Stay up to date with industry trends and emerging technologies in the Data & AI space and apply this knowledge to drive innovation within the team.
Key Skills:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience of 15+ years in total across Data, BI and Analytics, including 5+ years leading and managing Data & AI projects, with a track record of successful project delivery.
Expertise in Azure Data Fabric and Snowflake.
Extensive experience with Azure data services, including but not limited to Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics.
Strong analytical and problem-solving skills, with the ability to design and implement complex data solutions.
Excellent communication and leadership skills, with the ability to effectively collaborate with cross-functional teams.
Proven ability to mentor and develop team members, fostering a culture of continuous improvement.
Nice to Have: Microsoft Azure certifications
Please share your resume at silamkoti.saikiran@3i-infotech.com along with the below details:
C.CTC:
E.CTC:
Notice Period:
Note: We are looking for candidates who can join on short notice. If your profile is not suitable, we request you to share some references.
Regards,
Kiran
HRBP, 3i Infotech

Posted 3 weeks ago

Apply

2.0 - 5.0 years

15 - 19 Lacs

Mumbai

Work from Office

Naukri logo

Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.
Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.
Qualifications
Core Java, Spring Boot, Apache Spark, Spring Batch, Python.
Exposure to SQL databases like Oracle, MySQL, or Microsoft SQL Server is a must.
Any experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have.
Exposure to NoSQL databases like Neo4j or document databases is also good to have.
What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
Flexible working arrangements, advanced technology, and collaborative workspaces.
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.
MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 3 weeks ago

Apply

5.0 - 8.0 years

5 - 15 Lacs

Bengaluru

Remote

Naukri logo

Key Responsibilities: Design, develop, and optimize relational (PostgreSQL, SQL Server, MySQL, Oracle) and NoSQL (MongoDB, Cassandra, Redis) databases. Write and optimize complex SQL queries, stored procedures, triggers, and functions. Develop and maintain ETL pipelines for data integration. Ensure database security, backups, and high-availability solutions. Collaborate with teams to support application development and troubleshoot performance issues. Maintain technical documentation and stay updated on database best practices. Required Skills: 5+ years of experience in database development. Strong expertise in PostgreSQL and proficiency in SQL Server, MySQL, or Oracle. Experience with query optimization, indexing, and partitioning. Familiarity with NoSQL databases and cloud DB solutions (AWS RDS, Azure SQL, etc.). Hands-on experience with ETL tools, data warehousing, and scripting (Python, PowerShell, Bash). Strong problem-solving and communication skills.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

9 - 14 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Naukri logo

Key Responsibilities:
Design and develop data models to support business intelligence and analytics solutions.
Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
Implement Dimensional Modelling techniques for data warehousing and reporting.
Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
Ensure data integrity, consistency, and compliance with Banking domain standards.
Work with Snowflake to develop and optimize cloud-based data models.
Write and execute complex SQL queries for data analysis and validation.
Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
7+ years of experience in Data Modelling and Data Analysis.
Strong expertise in Erwin or Erwin Studio for data modeling.
Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
Proficiency in Snowflake and SQL for data management and querying.
Previous experience in the Banking domain is mandatory.
Strong analytical and problem-solving skills.
Ability to work independently in a remote environment.
Excellent verbal and written communication skills.
Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 3 weeks ago

Apply

4.0 - 8.0 years

20 - 30 Lacs

Bengaluru

Hybrid

Naukri logo

We are seeking a highly skilled Data Engineer with 5 to 8 years of experience to join our growing data team. The ideal candidate will have strong expertise in Azure technologies and advanced Python programming skills. This role involves designing, building, and optimizing data pipelines, ensuring data security, and enabling analytical capabilities across the organization.
In this Role, Your Responsibilities Will Be:
Develop, test, and maintain high-quality software products using cutting-edge technologies and best programming practices.
Design, develop, and maintain scalable and high-performance data pipelines using Azure Data Factory, Azure Synapse Analytics, and other Azure services.
Develop and maintain ETL/ELT processes to ensure the quality, consistency, and accuracy of data.
Implement data integration solutions across on-premise, hybrid, and cloud environments.
Ensure the security, availability, and performance of enterprise data platforms.
Work with relational (SQL Server, Azure SQL) and non-relational databases (Azure Cosmos DB, etc.).
Build, test, and deploy data solutions using Azure DevOps, version control, and CI/CD pipelines.
Develop and enforce data governance policies and practices to ensure data integrity and security.
Perform code reviews and ensure the quality of deliverables from other developers meets the standards.
Collaborate with multi-functional teams, including designers, developers, and quality assurance engineers, to build, refine, and enhance software products.
Optimize and troubleshoot large datasets using Python and Azure cloud-native technologies.
Create and maintain comprehensive technical documentation.
Who You Are:
A competent professional characterized by being action-oriented, consistently pursuing self-development to enhance skills and knowledge. You possess a drive for achieving results, demonstrating a commitment to excellence in your endeavors. A tech-savvy professional adept at leveraging technological tools to streamline processes and enhance productivity. You excel in collaborative efforts, recognizing the value of teamwork and effectively contributing to collective goals.
For This Role, You Will Need:
5-8 years of experience as a Data Engineer, focusing on Azure cloud services
Bachelor's degree in Computer Science, Information Technology, or a related field
Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Storage
Strong SQL skills, including experience with data modeling, complex queries, and performance optimization
Ability to work independently and manage multiple tasks simultaneously
Familiarity with version control systems (e.g., Git) and CI/CD pipelines (Azure DevOps)
Knowledge of Data Lake Architecture, Data Warehousing, and Data Modeling principles
Experience with RESTful APIs, Data APIs, and event-driven architecture
Familiarity with data governance, lineage, security, and privacy best practices
Strong problem-solving, communication, and collaboration skills
Microsoft Certified: Azure Data Engineer Associate
Preferred Qualifications that Set You Apart:
Bachelor's degree in Computer Science or a related field
Strong analytical, communication, and teamwork skills
Preferred Azure Data Engineer certifications

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 18 Lacs

Bangalore Rural, Bengaluru

Hybrid

Naukri logo

4+ years of data engineering experience
Understanding of modern data platforms, including data lakes and data warehouses, with good knowledge of the underlying architecture, preferably in Snowflake
Experience in assembling large, complex sets of data that meet non-functional and functional business requirements
Working experience of scripting, data science, and analytics (SQL, Python, PowerShell, JavaScript)
Experience working with cloud-based systems - Azure & Snowflake data warehouses
Working knowledge of CI/CD
Working knowledge of building data integrity checks as part of delivery of applications
Experience working with Kafka technologies preferred

Posted 3 weeks ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Gurugram

Remote

Naukri logo

Role: Data Engineer
Experience: 8+ years
Mandatory skills: SQL, ETL tools, data modeling, IICS and Snowflake
Location: Remote
Job description:
- Experience in Data/SQL
- Experience in Data Vault modelling and DBT
- Good hold on Python
- Experience in ETL tools
- Should be able to design and build data programs in SQL and ETL tools
- Good experience with a combination of solution/design and data modeling
- Experience in cloud (preferably GCP) and Azure
- Experience in database migration and cloud migration

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Hyderabad

Work from Office

Naukri logo

Position: Experienced Data Engineer
We are seeking a skilled and experienced Data Engineer to join our fast-paced and innovative Data Science team. This role involves building and maintaining data pipelines across multiple cloud-based data platforms.
Requirements: A minimum of 5 years of total experience, with at least 3-4 years specifically in Data Engineering on a cloud platform.
Key Skills & Experience:
Proficiency with AWS services such as Glue, Redshift, S3, Lambda, RDS, Amazon Aurora, DynamoDB, EMR, Athena, Data Pipeline, and Batch.
Strong expertise in SQL and Python; DBT and Snowflake; OpenSearch, Apache NiFi, and Apache Kafka.
In-depth knowledge of ETL data patterns and Spark-based ETL pipelines.
Advanced skills in infrastructure provisioning using Terraform and other Infrastructure-as-Code (IaC) tools.
Hands-on experience with cloud-native delivery models, including PaaS, IaaS, and SaaS.
Proficiency in Kubernetes, container orchestration, and CI/CD pipelines.
Familiarity with GitHub Actions, GitLab, and other leading DevOps and CI/CD solutions.
Experience with orchestration tools such as Apache Airflow and serverless/FaaS services.
Exposure to NoSQL databases is a plus.

Posted 3 weeks ago

Apply

12.0 - 20.0 years

30 - 35 Lacs

Navi Mumbai

Work from Office

Naukri logo

Job Title: Big Data Developer and Project Support & Mentorship
Location: Mumbai
Employment Type: Full-Time/Contract
Department: Engineering & Delivery
Position Overview:
We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.
Key Responsibilities:
Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
Lead by example in object-oriented development, particularly using Scala and Java.
Translate complex requirements into clear, actionable technical tasks for the team.
Contribute to the development of ETL processes for integrating data from various sources.
Document technical approaches, best practices, and workflows for knowledge sharing within the team.
Required Skills and Qualifications:
8+ years of professional experience in Big Data development and engineering.
Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
Solid object-oriented development experience with Scala and Java.
Strong SQL skills with experience working with large data sets.
Practical experience designing, installing, configuring, and supporting Big Data clusters.
Deep understanding of ETL processes and data integration strategies.
Proven experience mentoring or supporting junior engineers in a team setting.
Strong problem-solving, troubleshooting, and analytical skills.
Excellent communication and interpersonal skills.
Preferred Qualifications:
Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
Exposure to Agile or DevOps practices in Big Data project environments.
What We Offer:
Opportunity to work on challenging, high-impact Big Data projects.
Leadership role in shaping and mentoring the next generation of engineers.
Supportive and collaborative team culture.
Flexible working environment.
Competitive compensation and professional growth opportunities.

Posted 3 weeks ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:
  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:
  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge in the following areas (a short example bringing several of them together appears after this list):
  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management
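
To see how SQL, Python, and Snowflake knowledge combine in day-to-day work, here is a minimal sketch of running a query against Snowflake from Python using the snowflake-connector-python package. Every connection parameter and the orders table below are placeholders for illustration, not references to any real environment.

```python
# A minimal sketch, not production code: connect to Snowflake and run
# a simple aggregation. All names here are hypothetical placeholders.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="my_user",            # placeholder user
    password=os.environ["SNOWFLAKE_PASSWORD"],  # keep secrets out of code
    warehouse="COMPUTE_WH",    # virtual warehouse that executes the query
    database="ANALYTICS",      # placeholder database
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # A simple aggregation: order counts per region.
    cur.execute(
        "SELECT region, COUNT(*) AS order_count "
        "FROM orders GROUP BY region ORDER BY order_count DESC"
    )
    for region, order_count in cur.fetchall():
        print(region, order_count)
finally:
    conn.close()
```

For analytics work, the same connector can also hand results to pandas (for example via the cursor's fetch_pandas_all method, which requires pyarrow), which is a common pattern in the ETL and data science roles listed above.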

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
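
Two of the topics above, virtual warehouses and time travel, lend themselves to a concrete example. The sketch below issues documented Snowflake SQL from Python; the credentials and all object names (demo_wh, my_db, orders) are hypothetical placeholders, and the offset of one hour is arbitrary.

```python
# A minimal sketch illustrating two interview topics: creating a
# virtual warehouse and running a time-travel query.
import os

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account",  # placeholder
    user="my_user",        # placeholder
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
cur = conn.cursor()

# Virtual warehouse: an independently sized compute cluster.
# AUTO_SUSPEND pauses it after 60 idle seconds so it stops consuming
# credits; AUTO_RESUME restarts it on the next query.
cur.execute(
    "CREATE WAREHOUSE IF NOT EXISTS demo_wh "
    "WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE"
)
cur.execute("USE WAREHOUSE demo_wh")

# Time travel: query the table as it existed one hour ago, useful for
# auditing a change or recovering from a bad load.
cur.execute(
    "SELECT COUNT(*) FROM my_db.public.orders AT(OFFSET => -3600)"
)
print("row count one hour ago:", cur.fetchone()[0])

conn.close()
```

Being able to explain the trade-offs behind such snippets (warehouse sizing versus credit cost, time-travel retention windows) tends to matter more in interviews than the syntax itself.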

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies