
174 Oracle ADF Jobs - Page 7

JobPe aggregates results for easy browsing, but applications are completed directly on the original job portal.

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Lead the migration of ETLs from the on-premises SQL Server-based data warehouse to Azure Cloud, Databricks, and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
- Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
- Implement DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- 6+ years of experience as a Cloud Data Engineer
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
- Solid experience in ETL development using on-premises databases and ETL technologies
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven problem-solving skills and the ability to work independently
- Solid communication and collaboration skills
- Solid analytical skills and attention to detail
- Ability to adapt to new technologies and learn quickly

Preferred Qualifications
- Certification in Azure or Databricks
- Experience with data modeling and database design
- Experience with development in Snowflake for data engineering and analytics workloads
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)
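The ETL migration work this role describes follows the classic extract-transform-load pattern. As a rough, illustrative sketch only (plain Python standing in for the PySpark/Databricks code the posting implies; the table and column names are hypothetical, not from the posting):

```python
# Minimal extract-transform-load sketch. In a real migration this logic would
# run in Databricks (PySpark), reading from SQL Server and writing to
# Delta/Snowflake; "member_id" and "claim_amount" are invented field names.

def extract(source_rows):
    """Simulate reading rows from the on-premises warehouse."""
    return list(source_rows)

def transform(rows):
    """Normalize types and drop records that fail a basic quality check."""
    cleaned = []
    for row in rows:
        if row.get("member_id") is None:
            continue  # data-quality rule: skip rows without a key
        cleaned.append({
            "member_id": int(row["member_id"]),
            "claim_amount": round(float(row["claim_amount"]), 2),
        })
    return cleaned

def load(rows, target):
    """Append transformed rows to the target store (a plain list here)."""
    target.extend(rows)
    return len(rows)

source = [
    {"member_id": "101", "claim_amount": "250.456"},
    {"member_id": None, "claim_amount": "99.0"},  # rejected by transform
    {"member_id": "102", "claim_amount": "10"},
]
target = []
loaded = load(transform(extract(source)), target)
print(loaded)   # 2
```

The same three-stage shape carries over when each stage becomes an ADF activity or a Databricks notebook task.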

Posted 2 months ago


5.0 - 9.0 years

13 - 18 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and experienced Technical Delivery Lead to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for managing and leading the migration of an on-premises Enterprise Data Warehouse (SQL Server) to a modern cloud-based data platform utilizing Azure Cloud data tools and Snowflake. This platform will enable offshore (non-US) resources to build and develop Reporting, Analytics, and Data Science solutions.

Primary Responsibilities
- Manage and lead the migration of the on-premises SQL Server Enterprise Data Warehouse to Azure Cloud and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, Databricks, and Snowflake
- Manage and guide the development of cloud-native ETLs and data pipelines using modern technologies on Azure Cloud, Databricks, and Snowflake
- Implement and oversee DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Provide technical leadership and mentorship to the engineering team
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- 8+ years of experience in a Cloud Data Engineering role, with 3+ years in a leadership or technical delivery role
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage), Databricks, and Snowflake
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies and project management tools
- Solid experience in developing cloud-native ETLs and data pipelines using modern technologies on Azure Cloud, Databricks, and Snowflake
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven problem-solving skills and the ability to work independently
- Solid communication and collaboration skills
- Solid analytical skills and attention to detail
- Proven track record of successful project delivery in a cloud environment

Preferred Qualifications
- Certification in Azure or Snowflake
- Experience with automated ETL conversion tools used during cloud migrations (SnowConvert, BladeBridge, etc.)
- Experience with data modeling and database design
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

Posted 2 months ago


3.0 - 7.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Design, develop, and implement BI applications using Microsoft Azure, including Azure SQL Database, Azure Data Lake Storage, Azure Databricks, and Azure Blob Storage
- Manage the entire software development life cycle, encompassing requirements gathering, design, coding, testing, deployment, and support
- Collaborate with cross-functional teams to define, design, and release new features
- Utilize CI/CD pipelines to automate deployment using Azure and DevOps tools
- Monitor application performance, identify bottlenecks, and devise solutions to address them
- Foster a positive team environment and skill development
- Write clean, maintainable, and efficient code that adheres to company standards and best practices
- Participate in code reviews to ensure code quality and share knowledge
- Troubleshoot complex software issues and provide timely solutions
- Engage in Agile/Scrum development processes and meetings
- Stay updated with the latest and emerging technologies in software development and incorporate new technologies into solution design as appropriate
- Proactively identify areas for improvement or enhancement in the current architecture
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's or Master's degree in CS, IT, or a related field, with 4+ years of experience in software development
- 3+ years of experience writing advanced-level SQL and PySpark code
- 3+ years of experience with Azure Databricks and Azure SQL
- 3+ years of experience with Azure Data Factory (ADF)
- Knowledge of advanced SQL, ETL, and visualization tools, along with knowledge of data warehouse concepts
- Proficient in building enterprise-level data warehouse projects using Azure Databricks and ADF
- Proficient with code versioning tools such as GitHub
- Proven understanding of Agile methodologies
- Solid problem-solving skills with the ability to work independently and manage multiple tasks simultaneously
- Excellent interpersonal, written, and verbal communication skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 months ago


3.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities

Technical Leadership
- Technical Guidance: Provide technical direction and guidance to the development team, ensuring that best practices are followed in coding standards, architecture, and design patterns
- Architecture Design: Design and oversee the architecture of software solutions to ensure they are scalable, reliable, and performant
- Technology Stack: Make informed decisions on the technology stack (.NET for backend services, React for frontend development) to ensure it aligns with project requirements
- Code Reviews: Conduct regular code reviews to maintain code quality and provide constructive feedback to team members
- Hands-on Development: Engage in hands-on coding and development tasks, particularly in complex or critical areas of the project

Project Management
- Task Planning: Break down project requirements into manageable tasks and assign them to team members while tracking progress
- Milestone Tracking: Monitor project milestones and deliverables to ensure timely completion of projects

Data Pipeline & ETL Management
- Data Pipeline Design: Design robust data pipelines that can handle large volumes of data efficiently using appropriate technologies (e.g., Apache Kafka)
- ETL Processes: Develop efficient ETL processes to extract, transform, and load data from various sources into the analytics platform

Product Development
- Feature Development: Lead the development of new features from concept through implementation while ensuring they meet user requirements
- Integration Testing: Ensure thorough testing (unit tests, integration tests) is conducted for all features before deployment

Collaboration
- Cross-functional Collaboration: Collaborate closely with product managers, UX/UI designers, QA engineers, and other stakeholders to deliver high-quality products
- Stakeholder Communication: Communicate effectively with stakeholders regarding project status updates, technical challenges, and proposed solutions

Quality Assurance
- Performance Optimization: Identify performance bottlenecks within applications or data pipelines and implement optimizations
- Bug Resolution: Triage bugs reported by users or QA teams promptly and ensure timely resolution

Innovation & Continuous Improvement
- Stay Updated with Trends: Keep abreast of emerging technologies in .NET, React, and data pipeline/ETL tools (like Apache Kafka or Azure Data Factory) that could benefit the product
- Process Improvement: Continuously seek ways to improve engineering processes for increased efficiency and productivity within the team

Mentorship & Team Development
- Mentorship: Mentor junior developers by providing guidance on their technical growth as well as career development opportunities
- Team Building Activities: Foster a positive team environment through regular meetings (stand-ups) and brainstorming sessions/workshops focused on problem-solving techniques for our tech stack (.NET/React/data pipelines)

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

These responsibilities collectively ensure that the Lead Software Engineer not only contributes technically but also plays a crucial role in guiding the team toward successful delivery of advanced data analytics products built on .NET backend services and React frontend interfaces, connected by robustly engineered pipelines whose efficient ETL processes power insightful analytics for end users.

Required Qualifications
- Bachelor's Degree: A Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field
- Professional Experience: 8+ years of experience in software development, with significant time spent on both backend (.NET) and frontend (React) technologies
- Leadership Experience: Proven experience in a technical leadership role where you have led projects or teams
- Technical Expertise: Extensive experience with the .NET framework (C#) for backend development; proficiency with React for frontend development; solid knowledge of and hands-on experience with data pipeline technologies (e.g., Apache Kafka); solid understanding of ETL processes and tools such as Databricks, ADF, and Scala/Spark

Technical Skills
- Architectural Knowledge: Experience designing scalable, high-performance architectures
- Cloud Services: Experience with cloud platforms such as Azure, AWS, or Google Cloud Platform
- Software Development Lifecycle: Comprehensive understanding of the software development lifecycle (SDLC), including Agile methodologies
- Database Management: Proficiency with SQL and NoSQL databases (e.g., SQL Server, MongoDB)
- Leadership Abilities: Proven leadership skills with the ability to inspire and motivate teams
- Communication Skills: Superior verbal and written communication skills for effective collaboration with cross-functional teams and stakeholders
- Problem-Solving Abilities: Solid analytical and problem-solving skills

Preferred Qualification
- Advanced Degree (Optional): A Master's degree in a relevant field

Posted 2 months ago


5.0 - 10.0 years

8 - 13 Lacs

Gurugram

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a highly skilled and experienced Senior Cloud Data Engineer to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for migrating our on-premises Enterprise Data Warehouse (SQL Server) to a modern cloud-based data platform utilizing Azure Cloud data tools, Delta Lake, and Snowflake.

Primary Responsibilities
- Lead the migration of ETLs from the on-premises SQL Server-based data warehouse to Azure Cloud, Databricks, and Snowflake
- Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (PySpark)
- Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL
- Implement DevOps practices and CI/CD pipelines using GitHub Actions
- Collaborate with cross-functional teams to ensure seamless integration and data flow
- Optimize and troubleshoot data pipelines and workflows
- Ensure data security and compliance with industry standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- 6+ years of experience as a Cloud Data Engineer
- Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks
- Solid experience in ETL development using on-premises databases and ETL technologies
- Experience with Python or other scripting languages for data processing
- Experience with Agile methodologies
- Proficiency in DevOps and CI/CD practices using GitHub Actions
- Proven problem-solving skills and the ability to work independently
- Solid communication and collaboration skills
- Solid analytical skills and attention to detail
- Ability to adapt to new technologies and learn quickly

Preferred Qualifications
- Certification in Azure or Databricks
- Experience with data modeling and database design
- Experience with development in Snowflake for data engineering and analytics workloads
- Knowledge of data governance and data quality best practices
- Familiarity with other cloud platforms (e.g., AWS, Google Cloud)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 months ago


3.0 - 7.0 years

10 - 15 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Test Planning & Automation Lead - Cloud Data Modernization

Position Overview: We are seeking a highly skilled and experienced Test Planning & Automation Lead to join our team for a Cloud Data Modernization project. This role involves leading the data validation testing efforts for the migration of an on-premises Enterprise Data Warehouse (SQL Server) to a target cloud tech stack comprising Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage, etc.) and Snowflake. The primary goal is to ensure data consistency between the on-premises and cloud environments.

Primary Responsibilities
- Lead Data Validation Testing: Oversee and manage the data validation testing process to ensure data consistency between the on-premises SQL Server and the target cloud environment
- Tool Identification and Automation: Identify and implement appropriate tools to automate the testing process, reducing reliance on manual methods such as Excel or manual file comparisons
- Testing Plan Development: Define and develop a comprehensive testing plan that addresses validations for all data within the data warehouse
- Collaboration: Work closely with data engineers, cloud architects, and other stakeholders to ensure seamless integration and validation of data
- Quality Assurance: Establish and maintain quality assurance standards and best practices for data validation and testing
- Reporting: Generate detailed reports on testing outcomes, data inconsistencies, and corrective actions
- Continuous Improvement: Continuously evaluate and improve testing processes and tools to enhance efficiency and effectiveness
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree or higher
- Leadership Experience: 6+ years as a testing lead in Data Warehousing or Cloud Data Migration projects
- Automation Tools: Experience with data validation through custom-built Python frameworks and testing automation tools
- Testing Methodologies: Proficiency in defining and implementing testing methodologies and frameworks for data validation
- Technical Expertise: Solid knowledge of Python, SQL Server, Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage), Databricks, and Snowflake
- Analytical Skills: Excellent analytical and problem-solving skills to identify and resolve data inconsistencies
- Communication: Solid communication skills to collaborate effectively with cross-functional teams
- Project Management: Demonstrated ability to manage multiple tasks and projects simultaneously, ensuring timely delivery of testing outcomes

Preferred Qualifications
- Experience leading data validation testing efforts in cloud migration projects
- Familiarity with Agile methodologies and project management tools

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
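Automated source-to-target validation of the kind this role describes typically starts with row counts and order-insensitive column checksums. A minimal sketch, assuming toy in-memory tables rather than real SQL Server/Snowflake queries (the data and function names are illustrative, not the team's actual framework):

```python
import hashlib

# Toy source/target extracts; in practice these rows would come from
# queries against the on-premises and cloud warehouses.
source_rows = [(1, "alice", 250.0), (2, "bob", 99.5), (3, "carol", 10.0)]
target_rows = [(1, "alice", 250.0), (2, "bob", 99.5), (3, "carol", 10.0)]

def column_checksum(rows, index):
    """Order-insensitive checksum of one column, comparable across systems."""
    digest = hashlib.sha256()
    for value in sorted(str(r[index]) for r in rows):
        digest.update(value.encode())
    return digest.hexdigest()

def validate(source, target, n_columns):
    """Return a list of human-readable inconsistencies (empty means consistent)."""
    issues = []
    if len(source) != len(target):
        issues.append("row count mismatch")
    for i in range(n_columns):
        if column_checksum(source, i) != column_checksum(target, i):
            issues.append(f"checksum mismatch in column {i}")
    return issues

print(validate(source_rows, target_rows, 3))  # []
```

Real frameworks layer sampling, per-partition comparison, and tolerance rules for floats on top of this basic count-and-checksum core.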

Posted 2 months ago


5.0 - 10.0 years

15 - 30 Lacs

Ahmedabad

Work from Office

Role & responsibilities - Senior Data Engineer

GRUBBRR is seeking a mid/senior-level data engineer to help build our next-generation analytical and big data solutions. We strive to build cloud-native, consumer-first, UX-friendly kiosks and online applications across a variety of verticals supporting enterprise clients and small businesses. Behind our consumer applications, we integrate and interact with a deep stack of payment, loyalty, and POS systems. In addition, we provide actionable insights that enable our customers to make informed decisions. Our challenge and goal is to provide a frictionless experience for our end consumers and easy-to-use, smart management capabilities that help our customers maximize their ROI.

Responsibilities:
- Develop and maintain data pipelines
- Ensure data quality and accuracy
- Design, develop, and maintain large, complex sets of data that meet non-functional and functional business requirements
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using cloud technologies
- Build analytical tools that utilize the data pipelines

Skills:
- Solid experience with SQL and NoSQL
- Strong data modeling skills for data lakes, data warehouses, and data marts, including dimensional modeling and star schemas
- Proficient with Azure Data Factory data integration technology
- Knowledge of Hadoop or similar big data technology
- Knowledge of Apache Kafka, Spark, Hive, or equivalent
- Knowledge of Azure or AWS analytics technologies

Qualifications:
- BS in Computer Science, Applied Mathematics, or related fields (MS preferred)
- At least 8 years of experience working with OLAP systems
- Microsoft Azure or AWS data engineer certification a plus
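The dimensional-modeling skills this posting asks for center on star schemas: a central fact table referencing dimension tables by key. A toy illustration in plain Python (the table names and values are invented; a warehouse would express the same query in SQL):

```python
# A tiny star schema: one fact table referencing two dimensions by key.
dim_store = {1: {"city": "Ahmedabad"}, 2: {"city": "Pune"}}
dim_item = {10: {"name": "burger"}, 11: {"name": "fries"}}

fact_sales = [
    {"store_id": 1, "item_id": 10, "qty": 2},
    {"store_id": 1, "item_id": 11, "qty": 3},
    {"store_id": 2, "item_id": 10, "qty": 1},
]

# Join facts to the store dimension and aggregate quantity per city --
# the same shape of query a star schema serves via SQL GROUP BY.
sales_by_city = {}
for row in fact_sales:
    city = dim_store[row["store_id"]]["city"]
    sales_by_city[city] = sales_by_city.get(city, 0) + row["qty"]

print(sales_by_city)  # {'Ahmedabad': 5, 'Pune': 1}
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes such aggregations cheap at warehouse scale.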

Posted 2 months ago


2.0 - 4.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Overview

We are PepsiCo. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI Overview

With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS also leads, facilitates, and collaborates with the larger DS community in PepsiCo, provides the talent for the development and support of DS components and their life cycle within DA&AI products, and supports pre-engagement activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad

The role will work on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.

Responsibilities
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Use big data technologies to help process data and build scaled data pipelines (batch to real time)
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
- Automate ML model deployments

Qualifications
- Minimum 3 years of hands-on work experience in data science / machine learning
- Minimum 3 years of SQL experience
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers
- BE/BS in Computer Science, Math, Physics, or other technical fields
- Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models
- Programming skills: hands-on experience in statistical programming languages like Python and database query languages like SQL
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Cloud: experience with Databricks and ADF is desirable
- Familiarity with Spark, Hive, and Pig is an added advantage
- Model deployment experience is a plus
- Experience with version control systems like GitHub and CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps/DevOps and deploying ML models is required
- Experience using MLflow, Kubeflow, etc. is preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
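The qualifications above mention maximum likelihood estimators. As a minimal worked illustration (plain Python, invented sample data): for Bernoulli observations x_1..x_n in {0, 1}, the log-likelihood sum(x_i)*log(p) + (n - sum(x_i))*log(1 - p) is maximized at the sample mean, so the MLE is just the observed proportion of ones.

```python
# Maximum likelihood estimation for a Bernoulli parameter p.
# Setting the derivative of the log-likelihood to zero gives
# p_hat = mean(x): the sample proportion of successes.

def bernoulli_mle(xs):
    """MLE of p for i.i.d. Bernoulli observations in {0, 1}."""
    return sum(xs) / len(xs)

observations = [1, 0, 1, 1, 0, 1, 1, 0]  # illustrative data
p_hat = bernoulli_mle(observations)
print(p_hat)  # 0.625
```

The same recipe (write the likelihood, maximize it in the parameter) underlies the regression estimators the posting also lists.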

Posted 2 months ago


2.0 - 4.0 years

9 - 14 Lacs

Hyderabad, Gurugram

Work from Office

Overview We are PepsiCo We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Pep+ Positive . For more information on PepsiCo and the opportunities it holds, visitwww.pepsico.com. PepsiCo Data Analytics & AI Overview With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCos leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise. The Data Science Pillar in DA&AI will be the organization where Data Scientist and ML Engineers report to in the broader D+A Organization. Also DS will lead, facilitate and collaborate on the larger DS community in PepsiCo. DS will provide the talent for the development and support of DS component and its life cycle within DA&AI Products. And will support pre-engagement activities as requested and validated by the prioritization framework of DA&AI. Data Scientist-Gurugram and Hyderabad The role will work in developing Machine Learning (ML) and Artificial Intelligence (AI) projects. Specific scope of this role is to develop ML solution in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools/Spark/Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines. 
Responsibilities Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards Use big data technologies to help process data and build scaled data pipelines (batch to real time) Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure Automate ML model deployments Qualifications Minimum 3 years of hands-on work experience in data science / machine learning Minimum 3 years of SQL experience Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers BE/BS in Computer Science, Math, Physics, or other technical fields Data Science: Hands-on experience and strong knowledge of building machine learning models (supervised and unsupervised) Programming Skills: Hands-on experience in statistical programming languages like Python and database query languages like SQL Statistics: Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators Any Cloud: Experience in Databricks and ADF is desirable Familiarity with Spark, Hive, and Pig is an added advantage Model deployment experience is a plus Experience with version control systems like GitHub and CI/CD tools Experience in Exploratory Data Analysis Knowledge of MLOps/DevOps and deploying ML models is required Experience using MLflow, Kubeflow, etc. is preferred Experience executing and contributing to MLOps automation infrastructure is good to have Exceptional analytical and problem-solving skills
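The statistics bullet above asks for knowledge of maximum likelihood estimators. As a minimal illustration (not part of the posting; the function name and sample data are invented), the closed-form MLE for a normal distribution reduces to the sample mean and the biased sample variance:

```python
def normal_mle(samples):
    """Closed-form maximum likelihood estimates (mu, sigma^2) for a
    normal distribution: the sample mean and the *biased* variance
    (divide by n, not n - 1), which is what maximising the likelihood yields."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

# Demo on a small invented sample: mean 5.0, biased variance 4.0.
mu, var = normal_mle([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

Note the divide-by-n variance: the MLE is biased, which is exactly the kind of distinction the statistics requirement is probing for.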

Posted 2 months ago

Apply

9.0 - 12.0 years

7 - 11 Lacs

Chennai, Bengaluru

Work from Office

Job Title: Power BI Lead Experience: 9-12 Years Location: Chennai / Bengaluru Role and Responsibilities: Talk to client stakeholders and understand the requirements for building their Business Intelligence dashboards and reports. Design, develop, and maintain Power BI reports and dashboards for business users. Translate business requirements into effective visualizations using various data sources. Create data models, DAX calculations, and custom measures to support business analytics needs. Optimize performance and ensure data accuracy in Power BI reports. Troubleshoot and resolve issues related to transformations and visualizations. Train end-users on using Power BI for self-service analytics. Skills Required: Proficiency in Power BI Desktop and Power BI Service. Good understanding of Power BI Copilot. Strong understanding of data modelling concepts and the DAX language. Strong understanding of semantic data modelling concepts. Experience with data visualization best practices. Experience in working with streaming data as well as batch data. Knowledge of ADF would be an added advantage. Knowledge of SAS would be an added advantage.
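The DAX measures this role builds evaluate an aggregation inside a filter context. Purely as an illustrative sketch in Python (the row data and measure name are invented; real work here would be written in DAX):

```python
def total_sales(rows, region=None):
    """Mimics a simple DAX-style measure: SUM of amount, evaluated
    under an optional filter context (here, a region filter)."""
    return sum(r["amount"] for r in rows
               if region is None or r["region"] == region)

# Invented fact rows standing in for a Power BI table.
rows = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "North", "amount": 50},
]
```

Calling `total_sales(rows)` aggregates everything, while `total_sales(rows, "North")` narrows the filter context, the same way a slicer narrows a DAX measure.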

Posted 2 months ago

Apply

5 - 8 years

9 - 14 Lacs

Hyderabad

Work from Office

About The Role Role Purpose The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do Oversee and support the process by reviewing daily transactions on performance parameters Review the performance dashboard and the scores for the team Support the team in improving performance parameters by providing technical support and process guidance Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions Ensure standard processes and procedures are followed to resolve all client queries Resolve client queries as per the SLAs defined in the contract Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting Document and analyze call logs to spot the most frequently occurring trends to prevent future problems Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution Ensure all product information and disclosures are given to clients before and after the call/email requests Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries Manage and resolve technical roadblocks/escalations as per SLA and quality requirements If unable to resolve an issue, escalate it to TA & SES in a timely manner Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions Troubleshoot all client queries in a user-friendly, courteous, and professional manner Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business Organize ideas and effectively communicate oral messages appropriate to listeners and situations Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client Mentor and guide Production Specialists on improving technical knowledge Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists Develop and conduct trainings (triages) within products for Production Specialists as per target Inform the client about the triages being conducted Undertake product trainings to stay current with product features, changes, and updates Enroll in product-specific and any other trainings per client requirements/recommendations Identify and document the most common problems and recommend appropriate resolutions to the team Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Deliver
No. / Performance Parameter / Measure
1 / Process / No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 / Team Management / Productivity, efficiency, absenteeism
3 / Capability Development / Triages completed, technical test performance
Mandatory Skills: Oracle PaaS Cloud Architect. Experience: 5-8 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 months ago

Apply

2 - 5 years

11 - 16 Lacs

Bengaluru

Work from Office

Technical Expert BSI We are looking for a Technical Expert to be part of our Business Solutions Integrations team in the Analytics, Data and Integration stream. Position Snapshot Location: Bengaluru Type of Contract: Permanent Analytics, Data and Integration Type of work: Hybrid Work Language: Fluent Business English The role The Integration Technical Expert will work in the Business Solution Integration team, focused on Product Engineering and Operations related to the Data Integration, Digital Integration, and Process Integration products in Business Solution Integration and the initiatives where these products are used. Will work together with the Product Manager and Product Owners, as well as various other counterparts, in the evolution of the DI, PI, and Digital products. Will work with architects on orchestrating the design of integration solutions. Will also act as the first point of contact for project teams to manage demand and will help drive the transition from engineering to sustain as per the BSI standards. Will work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms. What you'll do Work with architects to understand and orchestrate the design choices between the different Data, Process, and Digital Integration patterns for fulfilling data needs. Translate the various requirements into deliverables for the development and implementation of Process, Data, and Digital Integration solutions, following up the requests to get the work done. Design, develop, and implement integration solutions using ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent. Work with the Operations Managers and Sustain teams to address performance and operational issues. We offer you We offer more than just a job. We put people first and inspire you to become the best version of yourself.
Great benefits including competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc. Personal and professional growth through ongoing training and constant career opportunities, reflecting our conviction that people are our most important asset. Minimum qualifications Minimum of 7 years of industry experience in software delivery projects Experience in project and product management, agile methodologies, and solution delivery at scale Skilled and experienced Technical Integration Expert with experience in various integration platforms and tools, including ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals Fluency in English with excellent oral and written communication skills Experience in working with cultural diversity: respect for various cultures and understanding of how to work with a variety of cultures in the most effective way Bonus Points If You Have Experience with the Azure platform (especially with Data Factory) Experience with Azure DevOps and with ServiceNow Experience with Power Apps and Power BI About the IT Hub We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges in the biggest health, nutrition and wellness company of the world. We innovate every day through forward-looking technologies to create opportunities for Nestlé's digital challenges with our consumers, customers and at the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services to create tangible business value. About Nestlé We are Nestlé, the largest food and beverage company.
We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at www.nestle.com. We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability. Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action - make it count. Join IT Hub Nestlé #beaforceforgood How we will proceed You send us your CV → We contact relevant applicants → Interviews → Feedback → Job offer communication to the finalist → First working day

Posted 2 months ago

Apply

3 - 4 years

8 - 9 Lacs

Gurugram, Delhi / NCR

Work from Office

3-4 years of exp in Oracle ADF development. Should have hands-on exp in Oracle ADF, Spring Boot, & Oracle DB. Exp in Oracle Database with SQL, PL/SQL, & database optimization techniques. Oracle ADF components such as JSF, EJBs, ADF BC, ADF Faces & Task Flows. Required Candidate profile Familiarity with Java, J2EE, RESTful services, & modern software development practices. Design, develop & maintain web-based applications using Oracle ADF. Build robust, scalable, & secure ADF applications.

Posted 2 months ago

Apply

3 - 4 years

5 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Good experience in the Oracle ADF framework and its life cycle. Experience in development, deployment, code migration, troubleshooting, and code tuning of ADF applications. Hands-on experience in Oracle SQL and PL/SQL. Integrated Oracle ADF applications with third-party applications and web services. Implemented Oracle ADF security, web services, and Oracle ADF rich client components. Experience in the creation/maintenance/tuning of all database objects is an added advantage. Knowledge of other J2EE frameworks and RESTful and SOAP web services is an added advantage. Create well-designed, reusable processes/objects. Good communication skills and strong work ethic. Good written and verbal communication skills in English. Documentation skills.

Posted 2 months ago

Apply

1 - 3 years

3 - 6 Lacs

Chennai

Work from Office

About VSM Software VSM Software (P) Ltd is an ISO-certified company catering to the global needs of the Pharma and Banking industries. In both these verticals, we offer solutions and services in specific areas. VSM has: a strong founding team based in India and the US; a great leadership team who come with high levels of educational qualifications and relevant industry experience; and skilled and trained IT and subject matter professionals. We have a local presence in 5 countries and are further expanding our delivery reach. About the Team The team is responsible for developing custom solutions tailored to client requirements, ensuring optimal performance, and adhering to best practices in coding and application architecture. With a strong focus on innovation, the Oracle ADF Developer team collaborates with stakeholders to translate business needs into technical solutions, troubleshoot application issues, and enhance existing systems. Their contributions are instrumental in driving digital transformation and achieving organizational goals. Responsibilities Develop, design, and deploy Oracle ADF applications Develop database designs and write PL/SQL code Create web applications using Oracle ADF Debug and troubleshoot ADF applications Maintain existing code and develop new code to meet business requirements

Posted 2 months ago

Apply

8 - 12 years

13 - 17 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

Max NP: 15 Days. Data Engineer - Azure a. Data Engineering and SQL b. Python c. PySpark d. Azure Data Lake and ADF e. Databricks f. CI/CD g. Strong communication Minimum Qualifications / Job Requirements: Bachelor's degree in CS. 8 years of hands-on experience in designing and developing distributed data pipelines. 5 years of hands-on experience in Azure data service technologies. 5 years of hands-on experience in Python, SQL, object-oriented programming, ETL, and unit testing. Experience with data integration with APIs, web services, and queues. Experience with Azure DevOps and CI/CD, as well as agile tools and processes including JIRA and Confluence. Excellent communication skills. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Bengaluru, Mumbai, Delhi, India
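This posting pairs ETL development with unit testing. A toy Python transform of the kind such tests would cover (the field names and rules are invented for illustration) might look like:

```python
def transform(records):
    """Minimal ETL-style transform: reject rows missing an id,
    normalise names, and cast amounts to float."""
    out = []
    for r in records:
        if not r.get("id"):
            continue  # drop rows without a primary key
        out.append({
            "id": r["id"],
            "name": r.get("name", "").strip().title(),
            "amount": float(r.get("amount", 0)),
        })
    return out

# Invented raw extract: one valid row, one row missing its key.
raw = [
    {"id": 1, "name": "  alice ", "amount": "10.5"},
    {"id": None, "name": "bob"},
]
clean = transform(raw)
```

Keeping the transform a pure function of its input is what makes it unit-testable: a test feeds in a small `raw` list and asserts on the exact `clean` output.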

Posted 2 months ago

Apply

5 - 10 years

6 - 12 Lacs

Udaipur

Work from Office

Expertise in Oracle EBS environments, proactive problem-solving, and strong collaboration with technical teams, Required Candidate profile Oracle Database Administration, Oracle E-Business Suite (EBS), Performance Tuning, Backup & Recovery, Data Security, Patching & Cloning Perks and benefits As per industry

Posted 2 months ago

Apply

3 - 6 years

9 - 13 Lacs

Mohali, Gurugram, Bengaluru

Work from Office

Job Title: Data Engineer – Snowflake & Python About the Role: We are seeking a skilled and proactive data developer with 3-5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools like Matillion, Fivetran, etc. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation. Key Responsibilities: Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar). Develop data applications and dashboards using Python and Streamlit. Create and optimize complex SQL queries for data extraction, transformation, and loading. Integrate REST APIs for data access and process automation. Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity. Design and implement scalable and efficient data models aligned with business requirements. Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions. Implement best practices in data governance, security, and compliance. Required Skills and Qualifications: Experience in HR data and databases is a must. 3-5 years of professional experience in a data engineering or development role. Strong expertise in Snowflake, including performance tuning and warehouse optimization. Proficient in Python, including data manipulation with libraries like Pandas. Experience building web-based data tools using Streamlit. Solid understanding and experience with RESTful APIs and JSON data structures. Strong SQL skills and experience with advanced data transformation logic. Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
Hands-on experience in data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques. Familiarity with version control (e.g., Git) and CI/CD processes is a plus. Preferred Qualifications: Experience working in cloud environments (AWS, Azure, or GCP). Knowledge of data governance and cataloging tools. Experience with agile methodologies and working in cross-functional teams. Experience in HR data and databases. Experience in Azure Data Factory.
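Since the role combines REST API consumption (JSON payloads) with data profiling, here is a small sketch using only the standard library. The payload shape is invented, and a real pipeline would fetch it over HTTP with a client library rather than hard-coding it:

```python
import json

# Invented JSON payload standing in for a REST API response body.
SAMPLE = '{"employees": [{"id": 1, "dept": "HR"}, {"id": 2, "dept": null}]}'

def null_counts(records):
    """Basic data profiling: count missing (null) values per field."""
    counts = {}
    for rec in records:
        for key, val in rec.items():
            counts[key] = counts.get(key, 0) + (val is None)
    return counts

records = json.loads(SAMPLE)["employees"]
profile = null_counts(records)
```

A profile like this is typically the first quality check run after landing API data, before it is modeled into the warehouse.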

Posted 2 months ago

Apply

1 - 4 years

3 - 7 Lacs

Bengaluru

Work from Office

Req ID: 321498 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties Work closely with the Lead Data Engineer to understand business requirements, and analyse and translate these requirements into technical specifications and solution design. Work closely with the data modeller to ensure data models support the solution design. Develop, test, and fix ETL code using Snowflake, Fivetran, SQL, and stored procedures. Analyse the data and ETL for defects raised in service tickets (for solutions in production). Develop documentation and artefacts to support projects. Minimum Skills Required ADF Fivetran (orchestration & integration) SQL Snowflake DWH About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 2 months ago

Apply

1 - 4 years

1 - 5 Lacs

Bengaluru

Work from Office

Req ID: 321505 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Test Analyst to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties Understand business requirements and develop test cases. Work with the tech team and client to validate and finalise test cases. Use Jira or an equivalent test management tool to record test cases, expected results, and outcomes, and to assign defects. Run tests in the testing phases (SIT and UAT). Test reporting and documentation. Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional). Minimum Skills Required Test case development Jira knowledge (to record test cases, expected results, and outcomes, and to assign defects) Test reporting and documentation Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional).
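The Jira-tracked test cases this role develops pair concrete inputs with expected results. Sketched with Python's built-in unittest framework (the system under test, `apply_discount`, is entirely hypothetical):

```python
import unittest

def apply_discount(price, pct):
    """Hypothetical system under test: apply a percentage discount."""
    if not 0 <= pct <= 100:
        raise ValueError("discount must be between 0 and 100")
    return round(price * (1 - pct / 100), 2)

class DiscountTests(unittest.TestCase):
    """Each method is one recorded test case: inputs, expected result."""

    def test_typical_discount(self):
        # 25% off 200.0 should yield 150.0
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_invalid_discount_rejected(self):
        # An out-of-range discount is a defect if it is accepted
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite programmatically, as a CI step in SIT/UAT might.
suite_result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests))
```

The pass/fail outcome of each case is what would be recorded against the test case and, on failure, raised as a defect in Jira.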

Posted 2 months ago

Apply

3 - 8 years

10 - 20 Lacs

Hyderabad

Work from Office

Experience in Oracle ADF development. Experience in Spring Boot. Experience in Java (Java Core, J2EE, JSF, JSP, Servlets, etc.). Responsibilities Developing ADF Faces applications Developing ADF Task Flows Developing ADF Business Components Following Oracle-recommended best practices for ADF Developing RESTful web services using Spring Boot Developing data access components using Spring Data JPA Managing dependencies using Spring Boot starter dependencies Configuring applications using application.properties or application.yml files Creating deployment artifacts as executable JAR files using the spring-boot-maven-plugin Qualifications Experience working with JDeveloper 12c, VS Code, and Eclipse. Experience working with Oracle SQL & PL/SQL. Experience with development tools (Toad, SQL Developer). Experience creating and tuning different Oracle database objects (tables, indexes, triggers, packages, functions, procedures). Experience working with Spring Boot. Good understanding of object-oriented programming. Experience with build scripts: Maven, Gradle, and CI with Jenkins. Experience with Oracle Fusion Middleware and WebLogic.

Posted 2 months ago

Apply

1 - 3 years

7 - 11 Lacs

Hyderabad

Work from Office

Overview We are PepsiCo PepsiCo is one of the world's leading food and beverage companies with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Purpose. For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com. The Data Science Team works on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Machine Learning Services and Pipelines. PepsiCo Data Analytics & AI Overview With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale.
This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise. The Data Science Pillar in DA&AI is the organization that Data Scientists and ML Engineers report to within the broader D+A organization. DS will also lead, facilitate and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support pre-engagement activities as requested and validated by the prioritization framework of DA&AI. Data Scientist - Hyderabad and Gurugram You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users. This will give you the right visibility into, and understanding of, the criticality of your developments. Responsibilities Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope Active contributor to code and development in projects and services Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption Partner with ML engineers working on industrialization Communicate with business stakeholders in the process of service design, training and knowledge transfer Support large-scale experimentation and build data-driven models Refine requirements into modelling problems Influence product teams through data-based recommendations Research state-of-the-art methodologies Create documentation for learnings and knowledge transfer Create reusable packages or libraries
Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards Leverage big data technologies to help process data and build scaled data pipelines (batch to real time) Implement the end-to-end ML lifecycle with Azure Machine Learning and Azure Pipelines Automate ML model deployments Qualifications BE/B.Tech in Computer Science, Maths, or other technical fields. Overall 5+ years of experience working as a Data Scientist. 4+ years of experience building solutions in the commercial or supply chain space. 4+ years working in a team to deliver production-level analytic solutions. Fluent in git (version control). Understanding of Jenkins and Docker is a plus. Fluent in SQL syntax. 4+ years of experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems. 4+ years of experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development. Skills, Abilities, Knowledge Data Science: Hands-on experience and strong knowledge of building machine learning models (supervised and unsupervised). Knowledge of time series/demand forecast models is a plus. Programming Skills: Hands-on experience in statistical programming languages like Python and PySpark and database query languages like SQL. Statistics: Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators. Cloud (Azure): Experience in Databricks and ADF is desirable. Familiarity with Spark, Hive, and Pig is an added advantage. Business storytelling and communicating data insights in a business-consumable format. Fluent in one visualization tool. Strong communication and organizational skills with the ability to deal with ambiguity while juggling multiple priorities. Experience with Agile methodology for teamwork and analytics product creation.
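Among the statistical techniques listed, simple linear regression has a particularly compact closed form. A pure-Python ordinary least squares fit, illustrative only, using made-up points that lie exactly on the line y = 1 + 2x:

```python
def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x in closed form:
    b = cov(x, y) / var(x), a = mean(y) - b * mean(x)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Demo on invented points lying exactly on y = 1 + 2x.
a, b = ols_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

The same slope/intercept formulas are what library implementations (e.g. in scikit-learn or statsmodels) generalise to the multivariate case.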

Posted 2 months ago

Apply

8 - 12 years

16 - 16 Lacs

Chennai

Work from Office

We are looking for an experienced database developer to join our engineering team to build enterprise applications for our global customers. If you are a technology enthusiast and have a passion for developing enterprise cloud products with quality, security, and performance, we are eager to discuss the potential role with you. Responsibilities Database Development Utilize expertise in MS SQL Server and PostgreSQL to design and develop efficient and scalable database solutions. Collaborate with development stakeholders to understand and implement database requirements. Write and optimize complex SQL queries, stored procedures, and functions. Database tuning and configuration of servers. Knowledge of both cloud and on-premises databases. Knowledge of SaaS-based application development. ETL Integration Leverage experience with ETL tools such as ADF and SSIS to facilitate seamless data migration. Design and implement data extraction, transformation, and loading processes. Ensure data integrity during the ETL process and troubleshoot any issues that may arise. Reporting Develop and maintain SSRS reports based on customer needs. Collaborate with stakeholders to understand reporting requirements and implement effective solutions. Performance Tuning Database performance analysis using Dynatrace, New Relic, or similar tools. Analyze query performance and implement tuning strategies to optimize database performance. Conduct impact analysis and resolve production issues within specified SLAs. Version Control and Collaboration Utilize Git and SVN for version control of database scripts and configurations. Collaborate with cross-functional teams using tools such as JIRA for story mapping, tracking, and issue resolution. Documentation Document database architecture, processes, and configurations. Provide detailed RCA (Root Cause Analysis) for any database-related issues. Requirements 6 - 9 years of hands-on experience in software development.
Must have extensive experience in stored procedure development and performance fine tuning. Proficient in SQL, MS SQL Server, SSRS, and SSIS. Working knowledge of C# ASP.NET web application development. Ability to grasp new concepts and facilitate continuous learning. Strong sense of responsibility and accountability. Benefits We’re on a Mission In 2005, we disrupted the life sciences industry by introducing the world’s first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations. The Team You’ll Join Our customers’ success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity’s quality of life, and we honor that mission. We work together. We communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done. We think big. Innovation is the heart of ValGenesis. That spirit drives product development as well as personal growth. We never stop aiming upward. We’re in it to win it. We’re on a path to becoming the number one intelligent validation platform in the market, and we won’t settle for anything less than being a market leader. How We Work Our Chennai and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration fosters creativity, and a sense of community, and is critical to our future success as a company. ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. 
Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.

Posted 2 months ago

Apply

8 - 10 years

25 - 30 Lacs

Pune

Work from Office

Oracle ADF, Java, Spring Boot, PL/SQL, AWS Cloud
Design, develop, and implement Oracle Application Development Framework (ADF) applications.
Customize ADF components based on business needs and requirements.
Test, debug, and optimize ADF applications for performance and functionality.
Collaborate with teams to integrate ADF solutions with other systems.
Document development processes and ensure adherence to best practices.

Posted 2 months ago

Apply