
148 Oracle ADF Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

5 - 9 Lacs

Mumbai

Work from Office


Job Information:
- Job Opening ID: ZR_1624_JOB
- Date Opened: 08/12/2022
- Industry: Technology
- Work Experience: 5-8 years
- Job Title: Azure ADF & Power BI Developer
- City: Mumbai | Province: Maharashtra | Country: India | Postal Code: 400001
- Number of Positions: 4

Roles & Responsibilities:
- 5+ years of hands-on experience in Azure Cloud development (ADF + Databricks) - mandatory
- Strong in Azure SQL; knowledge of Synapse/Analytics is good to have
- Experience working on Agile projects and familiarity with Scrum/SAFe ceremonies
- Good written and verbal communication skills; can work directly with the customer
- Ready to work in the 2nd shift; flexible
- Defines, designs, develops, and tests software components/applications using Microsoft Azure: Databricks, ADF, ADL, Hive, Python, Spark SQL, PySpark
- Expertise in Azure Databricks, ADF, ADL, Hive, Python, Spark, PySpark
- Strong T-SQL skills with experience in Azure SQL DW
- Experience handling structured and unstructured datasets
- Experience in data modeling and advanced SQL techniques
- Experience implementing Azure Data Factory pipelines using the latest technologies and techniques
- Good exposure to application development
- The candidate should work independently with minimal supervision
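For context, here is a minimal, hypothetical PySpark sketch of the kind of ADF-orchestrated Databricks transform this posting describes; the storage account, container paths, and column names are assumptions, not details from the job.

```python
# Hypothetical PySpark job: read raw CSVs from ADLS Gen2, clean them,
# and write a Delta table. All paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-clean").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

cleaned = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0))

(cleaned.write
 .format("delta")
 .mode("overwrite")
 .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In an ADF pipeline this would typically run as a Databricks notebook or JAR activity, with the paths passed in as pipeline parameters.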

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: OneStream Extensive Finance SmartCPM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: BE

Summary: As an Application Designer, you will be responsible for assisting in defining requirements and designing applications to meet business process and application requirements using OneStream Extensive Finance SmartCPM. Your typical day will involve collaborating with cross-functional teams and ensuring the delivery of high-quality solutions.

Roles & Responsibilities:
- Collaborate with cross-functional teams to define requirements and design applications using OneStream Extensive Finance SmartCPM.
- Ensure the delivery of high-quality solutions that meet business process and application requirements.
- Provide technical guidance and support to team members.
- Stay updated with the latest advancements in OneStream Extensive Finance SmartCPM and related technologies.

Professional & Technical Skills:
- Must-have: Extensive experience in OneStream Finance SmartCPM.
- Good-to-have: Experience in related technologies such as Hyperion, SAP BPC, or Oracle EPM.
- Strong understanding of financial planning and analysis processes.
- Experience in designing and implementing financial consolidation and reporting solutions.
- Experience in designing and implementing budgeting and forecasting solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in OneStream Extensive Finance SmartCPM.
- The ideal candidate will possess a strong educational background in finance, accounting, or a related field, along with a proven track record of delivering impactful solutions.
- This position is based at our Bengaluru office.

Qualifications: BE

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Bengaluru

Work from Office


We are looking for a Microsoft BI & Data Warehouse Lead to design, develop, and maintain robust data warehouse and ETL solutions using the Microsoft technology stack. The ideal candidate will have extensive expertise in SQL Server development and Azure Data Factory (ADF). Benefits: health insurance, provident fund.

Posted 1 week ago

Apply

7.0 - 12.0 years

3 - 7 Lacs

Gurugram

Work from Office


AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation.

At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.

AHEAD is looking for a Sr. Data Engineer (L3 support) to work closely with our dynamic project teams (both on-site and remote). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. The Data Engineer will work on a variety of data projects, including orchestrating pipelines using modern data engineering tools/architectures as well as designing and integrating existing transactional processing systems. The appropriate candidate must be a subject matter expert in managing data platforms.

Responsibilities:
- Build, operationalize, and monitor data processing systems
- Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and a cloud-native toolset
- Implement custom applications using tools such as Event Hubs, ADF, and other cloud-native tools as required to address streaming use cases
- Engineer and maintain ELT processes for loading the data lake (Cloud Storage, Data Lake Gen2)
- Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
- Respond to customer/team inquiries and escalations and assist in troubleshooting and resolving challenges
- Work with other scrum team members to estimate and deliver work inside of a sprint
- Research data questions, identify root causes, and interact closely with business users and technical resources
- Possess the ownership and leadership skills to collaborate effectively with Level 1 and Level 2 teams
- Raise tickets with Microsoft and engage with them to address any service or tool outages in production

Qualifications:
- 7+ years of professional technical experience
- 5+ years of hands-on data architecture and data modelling at SME level
- 5+ years of experience building highly scalable data solutions using Azure Data Factory, Spark, Databricks, and Python
- 5+ years of experience working in cloud environments (AWS and/or Azure)
- 3+ years with programming languages such as Python, Spark, and Spark SQL
- Strong knowledge of the architecture of ADF and Databricks
- Able to work with Level 1 and Level 2 teams to resolve platform outages in production environments
- Strong client-facing communication and facilitation skills
- Strong sense of urgency; ability to set priorities and perform the job with little guidance
- Excellent written and verbal interpersonal skills and the ability to build and maintain collaborative and positive working relationships at all levels
- Strong interpersonal and communication skills (written and oral) required
- Should be able to work in shifts
- Should have knowledge of the Azure DevOps process

Key Skills: Azure Data Factory, Azure Databricks, Python, ETL/ELT, Spark, Data Lake, Data Engineering, Event Hubs, Azure Delta, Spark Streaming

Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning.

USA Employment Benefits include: Medical, Dental, and Vision Insurance; 401(k); paid company holidays; paid time off; paid parental and caregiver leave; plus more! See https://www.aheadbenefits.com/ for additional details. The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
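As an illustration of the Event Hubs streaming work listed above, here is a minimal consumer sketch using the azure-eventhub Python package; the connection string, hub name, and handler logic are placeholders, not project specifics.

```python
# Hypothetical Event Hubs consumer: print each event and checkpoint it.
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Process one event, then checkpoint so a restart resumes from here.
    print(partition_context.partition_id, event.body_as_str())
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    conn_str="<EVENT_HUBS_CONNECTION_STRING>",  # placeholder
    consumer_group="$Default",
    eventhub_name="telemetry",                  # placeholder hub name
)

with client:
    client.receive(on_event=on_event, starting_position="-1")  # from start
```

In production the consumer would normally be paired with a blob checkpoint store so that checkpoints survive restarts.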

Posted 1 week ago

Apply

3.0 - 6.0 years

2 - 6 Lacs

Pune, Greater Noida

Work from Office


The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Apex Fund Services
Position: SQL Developer
Employment Type: Full Time
Location: Pune, India
Salary: TBC

Work Experience: Applicants for this position should have 4+ years working as a SQL developer.

Project Overview: The project will use a number of Microsoft SQL Server technologies and include development and maintenance of reports, APIs, and other integrations with external financial systems. The successful applicant will liaise with other members of the team and will be expected to work on projects where they are the sole developer as well as part of a team on larger projects. The applicant will report to the SQL Development Manager.

Responsibilities:
- Understand requirements clearly and communicate technical ideas to both technical stakeholders and business end users.
- Investigate and resolve issues quickly.
- Communicate with end users.
- Work closely with other team members to understand business requirements.
- Complete structural analysis and systematic testing of the data.

Skills:
- Microsoft SQL Server 2016-2022.
- T-SQL programming (4+ years) experience.
- Query/stored procedure performance tuning.
- SQL Server Integration Services.
- SQL Server Reporting Services.
- Experience in database design.
- Experience with source control.
- Knowledge of the software engineering life cycle.
- Previous experience in designing, developing, testing, implementing, and supporting software.
- Third-level IT qualification.
- SQL MCSA or MCSE preferable.
- Knowledge of data technologies such as Snowflake, Airflow, and ADF desirable.

Other skills:
- Ability to work on own initiative and as part of a team.
- Excellent time management and decision-making skills.
- Excellent communication skills in English, both written and verbal.
- Background in the financial industry preferable.

Academic Qualification: Any graduation or post-graduation; any specialization in IT.

Company Website: www.apexfundservices.com

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model, and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
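The role centres on T-SQL, SSIS, and SSRS; as a sketch of the API/integration side of the work, here is a hypothetical parameterized stored-procedure call from Python via pyodbc (the server, database, and procedure names are invented).

```python
# Hypothetical call to a tuned stored procedure on SQL Server via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-sql;DATABASE=Funds;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Parameterized execution avoids SQL injection and allows plan reuse.
cursor.execute("EXEC dbo.usp_GetNavReport @FundId = ?, @AsOfDate = ?",
               (42, "2024-03-31"))
for row in cursor.fetchall():
    print(row)

conn.close()
```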

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Chennai

Work from Office


Job Category: IT
Job Type: Full Time
Job Location: Bangalore, Chennai, Hyderabad, Pune
Experience: 8+ years

6-10+ years in Oracle ADF, Java/J2EE, Spring, Struts, SQL, WebLogic 12c, Bitbucket, Git, JIRA; exposure to AMS production support projects. Good at multi-tasking and flexible to stretch when the situation demands.

Responsibilities:
- Develop and support web applications using a variety of Oracle ADF/Fusion 11g technologies
- Administer WebLogic server applications, including general database development and administration
- Perform detailed application debugging and root cause corrective actions
- Able to explain software functionality from a user's or customer's perspective
- May provide guidance, assistance, and technical leadership to lower-level software engineers on more complex/large projects

Qualifications for ADF developer:
- Minimum 5 years' experience in the IT/Technology industry
- 5+ years' experience using the Oracle ADF 11g framework for application development
- 2 to 3 years of basic SQL and/or PL/SQL programming
- Experience with React (React.js/ReactJS), the open-source JavaScript framework
- Technical skills in Oracle ADF and Java
- 5+ years of relevant development experience with Oracle tools such as SQL, PL/SQL, Reports, Forms, and Workflow

Kind note: Please apply or share your resume only if it matches the above criteria.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the application development process
- Implement best practices for application design and development
- Conduct code reviews and ensure code quality standards are met

Professional & Technical Skills:
- Must-have: Proficiency in Microsoft Azure Data Services, ADF, ADB, PySpark
- Strong understanding of cloud computing principles
- Experience with Azure DevOps for continuous integration and deployment
- Knowledge of Azure SQL Database and Azure Cosmos DB
- Hands-on experience with Azure Functions and Logic Apps

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services
- This position is based at our Pune office (Kharadi); 3 days WFO mandatory
- 15 years of full-time education is required

Qualification: 15 years full time education
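To illustrate one of the listed skills, here is a minimal Azure Functions HTTP trigger in Python (v2 programming model); the route and response logic are illustrative only, not from the posting.

```python
# Hypothetical HTTP-triggered Azure Function (Python v2 model).
import azure.functions as func

app = func.FunctionApp()

@app.route(route="ping", auth_level=func.AuthLevel.ANONYMOUS)
def ping(req: func.HttpRequest) -> func.HttpResponse:
    # Echo a query parameter back to the caller.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"hello {name}", status_code=200)
```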

Posted 2 weeks ago

Apply

5.0 - 8.0 years

5 - 10 Lacs

Kolkata

Work from Office


Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation / 12th / PUC / HSC
Years of Experience: 5 to 8 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.

What would you do: In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Proven experience (5+ years) as an Azure Data Factory Support Engineer II
- Expertise in ADF with a deep understanding of its data-related libraries
- Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments
- Proficiency in SQL and experience with SQL database design
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Experience with ADF pipelines
- Excellent problem-solving and troubleshooting skills
- Experience in code review and debugging in a collaborative project setting
- Excellent verbal and written communication skills
- Ability to work in a fast-paced, team-oriented environment
- Strong understanding of the business and a passion for the mission of Service Supply Chain
- Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have

Roles and Responsibilities: Innovate. Collaborate. Build. Create. Solve.
- Support ADF and associated systems; ensure systems meet business requirements and industry practices
- Integrate new data management technologies and software engineering tools into existing structures
- Recommend ways to improve data reliability, efficiency, and quality
- Use large data sets to address business issues; use data to discover tasks that can be automated
- Fix bugs to ensure a robust and sustainable codebase
- Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance
- Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively
- Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines
- Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives
- Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure
- Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy
- Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance
- Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement
- Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems
- Flexible work hours to include US time zones; this position may require working a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises
- Participate in the Demand Management and Change Management processes
- Work in partnership with internal business, external 3rd-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO)

Qualification: Any Graduation, 12th/PUC/HSC
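As a sketch of the pipeline monitoring this support role describes, here is a hypothetical query for recent failed ADF pipeline runs using the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, and factory names are placeholders.

```python
# Hypothetical ADF run monitor: list pipeline runs that failed in the
# last 24 hours. All resource names are placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(DefaultAzureCredential(),
                                     "<SUBSCRIPTION_ID>")

now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    resource_group_name="rg-data",
    factory_name="adf-prod",
    filter_parameters=RunFilterParameters(
        last_updated_after=now - timedelta(hours=24),
        last_updated_before=now,
    ),
)

for run in runs.value:
    if run.status == "Failed":
        print(run.pipeline_name, run.run_id, run.message)
```

A script like this is a common starting point for the proactive monitoring and alerting the listing asks for.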

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to your team members. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will require you to balance technical expertise with effective communication, fostering a collaborative environment that encourages innovation and problem-solving.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Facilitate knowledge sharing sessions to enhance team capabilities
- Monitor project progress and implement necessary adjustments to meet deadlines

Professional & Technical Skills:
- Must-have: Proficiency in Python (Programming Language)
- Strong understanding of application development frameworks
- Experience with database management and integration
- Familiarity with cloud computing platforms and services
- Ability to write clean, maintainable, and efficient code

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (Programming Language)
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Key Responsibilities:
- Good understanding of data engineering concepts
- Working experience with the Java language
- Working experience with the Apache Beam framework (using Java)
- Working experience with Azure (ADF) or Google Cloud Platform (GCP); GCP preferred:
  - GCP Dataflow (using Java)
  - GCP BigQuery
  - GCP Workflows
  - GCP Cloud Run (using Java)
  - GCP Cloud Storage
  - GCP Cloud Functions
- Infrastructure as Code (IaC): Terraform

Required Qualifications To Be Successful In This Role:
- Working experience with GitLab and GitLab CI/CD
- Working experience with the Maven build tool
- Working experience with Docker containers
- Working experience with Oracle Database
- Good understanding of shell scripting
- Good understanding of Redis
- Good communication skills
- Good understanding of Agile methodologies

Additional Information:
Job Type: Full Time
Work Profile: Hybrid (Work from Office)
Years of Experience: 6-10 years
Location: Bangalore

Benefits - What We Offer:
- Competitive salaries and comprehensive health benefits
- Flexible work hours and remote work options
- Professional development and training opportunities
- A supportive and inclusive work environment
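The posting asks for Apache Beam in Java; as a compact illustration of the same pipeline shape, here is a minimal sketch in Beam's Python SDK (the input file and keying logic are assumptions), runnable locally with the DirectRunner.

```python
# Hypothetical Beam pipeline: count events per key from a text file.
import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | "Read" >> beam.io.ReadFromText("events.txt")
     | "Pair" >> beam.Map(lambda line: (line.split(",")[0], 1))
     | "Count" >> beam.CombinePerKey(sum)
     | "Format" >> beam.MapTuple(lambda key, n: f"{key}\t{n}")
     | "Write" >> beam.io.WriteToText("event_counts"))
```

The same transforms map one-to-one onto the Java SDK, and the pipeline runs on GCP Dataflow by switching the runner.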

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: OneStream Extensive Finance SmartCPM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: BE

Summary: As an Application Designer, you will be responsible for assisting in defining requirements and designing applications to meet business process and application requirements using OneStream Extensive Finance SmartCPM. Your typical day will involve collaborating with cross-functional teams and ensuring the delivery of high-quality solutions.

Roles & Responsibilities:
- Collaborate with cross-functional teams to define requirements and design applications using OneStream Extensive Finance SmartCPM.
- Ensure the delivery of high-quality solutions that meet business process and application requirements.
- Develop and maintain technical documentation, including design specifications, test plans, and user manuals.
- Provide technical guidance and support to team members, ensuring adherence to best practices and standards.
- Stay updated with the latest advancements in OneStream Extensive Finance SmartCPM and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have: Extensive experience in OneStream Finance SmartCPM.
- Good-to-have: Experience in related technologies such as Hyperion, SAP BPC, or Oracle EPM.
- Strong understanding of financial planning and analysis processes.
- Experience in designing and implementing financial consolidation and reporting solutions.
- Solid grasp of database concepts and SQL.
- Excellent problem-solving and analytical skills.

Additional Information:
- The candidate should have a minimum of 5 years of experience in OneStream Finance SmartCPM.
- The ideal candidate will possess a strong educational background in finance, accounting, or a related field, along with a proven track record of delivering impactful solutions.
- This position is based at our Hyderabad office.

Qualifications: BE

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office


The Oracle Application Development Framework (ADF) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Oracle Application Development Framework (ADF) domain.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad

Hybrid


ADF/Spring Boot Developer (Senior)

Job Description

Requirements:
- Degree in Computer Science, Engineering, or a related subject
- Experience in Oracle ADF development
- Experience in Spring Boot
- Experience in Java (Java Core, J2EE, JSF, JSP, Servlets, ...)

Responsibilities:
- Developing ADF Faces applications
- Developing ADF Task Flows
- Developing ADF Business Components
- Following Oracle-recommended best practices for ADF
- Developing RESTful web services using Spring Boot
- Developing data access components using Spring Data JPA
- Managing dependencies using Spring Boot starter dependencies
- Configuring applications using application.properties or application.yml files
- Creating deployment artifacts as executable JAR files using the spring-boot-maven-plugin

Qualifications:
- Experience working with JDeveloper 12c, VS Code, Eclipse
- Experience working with Oracle SQL and PL/SQL
- Experience with development tools (Toad, SQL Developer)
- Experience creating and tuning different Oracle database objects (tables, indexes, triggers, packages, functions, procedures, ...)
- Experience working with Spring Boot
- Good understanding of object-oriented programming
- Experience with build scripts: Maven, Gradle; CI: Jenkins
- Experience with Oracle Fusion Middleware and WebLogic
- Familiarity with version control systems such as SVN and Git
- Significant experience writing, utilizing, and securing RESTful API services
- Experience with web development: CSS, HTML, JSON
- Experience with Agile/Scrum methodologies and project management tools

Posted 2 weeks ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Job Description: The potential candidate must efficiently execute business strategies that are in line with organisational objectives. He/she also coordinates with channel partners to identify regional market opportunities and builds working relationships amongst a diverse range of customers. Must have experience in the field of sales and marketing of façade systems, including specifying products and handling projects with big customers, builders, consultants, and architects. The potential candidate must have a proven record of preparing and executing a business plan in a similar capacity to achieve the desired target. Must have knowledge of the development cycle of a façade system and be technically sound on different façade systems.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

8 - 12 Lacs

Bengaluru

Work from Office


š" We're HiringADF Developer! š" We are looking for an experienced ADF Developer to join our dynamic team in Bangalore Urban The ideal candidate will possess strong technical skills and a passion for developing innovative solutions using Oracle ADF technologies You will play a key role in designing, coding, and implementing applications that meet our business needs. “ LocationBangalore Urban, India Work ModeWork From Office ’ RoleADF Developer What You'll Do Design and develop high-quality applications using Oracle ADF. › Collaborate with cross-functional teams to gather requirements and deliver solutions. ”„ Perform application testing, debugging, and troubleshooting as needed. “ˆ Optimize application performance and ensure scalability. “ Document development processes and maintain technical documentation. Provide support during the deployment of applications to production environments. What Were Looking For Minimum 2 years of experience in Oracle ADF development. Strong understanding of Java, XML, SQL, and web services. Excellent problem-solving skills and attention to detail. Ability to work independently as well as in a team environment. Strong communication skills to effectively collaborate with stakeholders. Ready to elevate your career Apply now and be part of our success story! Show more Show less

Posted 2 weeks ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Lead the migration of the ETLs from on-premises SQLServer based data warehouse to Azure Cloud, Databricks and Snowflake Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (Pyspark) Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL Implement DevOps practices and CI/CD pipelines using GitActions Collaborate with cross-functional teams to ensure seamless integration and data flow Optimize and troubleshoot data pipelines and workflows Ensure data security and compliance with industry standards Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications 6+ years of experience as a Cloud Data Engineer Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks Solid experience in ETL development using on-premises databases and ETL technologies Experience with Python or other scripting languages for data processing Experience with Agile methodologies Proficiency in DevOps and CI/CD practices using GitActions Proven excellent problem-solving skills and ability to work independently Proven solid communication and collaboration skills Proven solid analytical skills and attention to detail Proven ability to adapt to new technologies and learn quickly Preferred Qualifications Certification in Azure or Databricks Experience with data modeling and database design Experience with development in Snowflake for data engineering and analytics workloads Knowledge of data governance and data quality best practices Familiarity with other cloud platforms (e.g., AWS, Google Cloud)
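As one hypothetical step in the migration described above, here is a sketch that lifts a SQL Server table into Delta on ADLS using Spark's JDBC reader; the hostnames, credentials, and table names are placeholders, not project details.

```python
# Hypothetical lift-and-load: SQL Server table -> bronze Delta table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dw-migration").getOrCreate()

claims = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://onprem-sql:1433;databaseName=EDW")
          .option("dbtable", "dbo.Claims")
          .option("user", "etl_user")
          .option("password", "<SECRET>")  # in practice, a secret scope
          .load())

(claims.write.format("delta")
 .mode("overwrite")
 .save("abfss://bronze@examplelake.dfs.core.windows.net/claims/"))
```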

Posted 2 weeks ago

Apply

5.0 - 9.0 years

13 - 18 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. We are seeking a highly skilled and experienced Technical Delivery Lead to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for managing and leading the migration of an on-premises Enterprise Data Warehouse (SQLServer) to a modern cloud-based data platform utilizing Azure Cloud data tools and Snowflake. This platform will enable offshore (non-US) resources to build and develop Reporting, Analytics, and Data Science solutions. Primary Responsibilities Manage and lead the migration of the on-premises SQLServer Enterprise Data Warehouse to Azure Cloud and Snowflake Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, Databricks, and Snowflake Manage and guide the development of cloud-native ETLs and data pipelines using modern technologies on Azure Cloud, Databricks, and Snowflake Implement and oversee DevOps practices and CI/CD pipelines using GitActions Collaborate with cross-functional teams to ensure seamless integration and data flow Optimize and troubleshoot data pipelines and workflows Ensure data security and compliance with industry standards Provide technical leadership and mentorship to the engineering team Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications 8+ years of experience in a Cloud Data Engineering role, with 3+ years in a leadership or technical delivery role Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage), Databricks, and Snowflake Experience with Python or other scripting languages for data processing Experience with Agile methodologies and project management tools Solid experience in developing cloud-native ETLs and data pipelines using modern technologies on Azure Cloud, Databricks, and Snowflake Proficiency in DevOps and CI/CD practices using GitActions Proven excellent problem-solving skills and ability to work independently Proven solid communication and collaboration skills. Solid analytical skills and attention to detail Proven track record of successful project delivery in a cloud environment Preferred Qualifications Certification in Azure or Snowflake Experience working with automated ETL conversion tools used during cloud migrations (SnowConvert, BladeBridge, etc.) 
Experience with data modeling and database design Knowledge of data governance and data quality best practices Familiarity with other cloud platforms (e.g., AWS, Google Cloud)
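For flavour, here is a minimal Snowflake check using the snowflake-connector-python package; the account, credentials, and object names are invented for illustration.

```python
# Hypothetical Snowflake connectivity/validation check.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="ETL_SVC",
    password="<SECRET>",
    warehouse="LOAD_WH",
    database="EDW",
    schema="CORE",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM FACT_CLAIMS")
    print("row count:", cur.fetchone()[0])
finally:
    conn.close()
```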

Posted 2 weeks ago

Apply

3.0 - 7.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Design, develop, and implement BI applications using Microsoft Azure, including Azure SQL Database, Azure Data Lake Storage, Azure Databricks, and Azure Blob Storage Manage the entire software development life cycle, encompassing requirements gathering, designing, coding, testing, deployment, and support Collaborate with cross-functional teams to define, design, and release new features Utilize CI/CD pipelines to automate deployment using Azure and DevOps tools Monitor application performance, identify bottlenecks, and devise solutions to address these issues Foster a positive team environment and skill development Write clean, maintainable, and efficient code that adheres to company standards and best practices Participate in code reviews to ensure code quality and share knowledge Troubleshoot complex software issues and provide timely solutions Engage in Agile/Scrum development processes and meetings Stay updated with the latest and emerging technologies in software development and incorporate new technologies into solution design as appropriate Proactively identify areas for improvement or enhancement in current architecture Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications Bachelor's or Master's degree in CS, IT, or a related field with 4+ years of experience in software development 3+ years of experience in writing advanced SQL and PySpark code 3+ years of experience in Azure Databricks and Azure SQL 3+ years of experience in Azure Data Factory (ADF) Knowledge of advanced SQL, ETL, and visualization tools, along with knowledge of data warehouse concepts Proficient in building enterprise-level data warehouse projects using Azure Databricks and ADF Proficient in code versioning tools (GitHub) Proven excellent understanding of Agile methodologies Proven solid problem-solving skills with the ability to work independently and manage multiple tasks simultaneously Proven excellent interpersonal, written, and verbal communication skills At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life.
Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities

Technical Leadership:
- Technical Guidance: Provide technical direction and guidance to the development team, ensuring that best practices are followed in coding standards, architecture, and design patterns
- Architecture Design: Design and oversee the architecture of software solutions to ensure they are scalable, reliable, and performant
- Technology Stack: Make informed decisions on the technology stack (.Net for backend services, React for frontend development) to ensure it aligns with project requirements
- Code Reviews: Conduct regular code reviews to maintain code quality and provide constructive feedback to team members
- Hands-on Development: Engage in hands-on coding and development tasks, particularly in complex or critical areas of the project

Project Management:
- Task Planning: Break down project requirements into manageable tasks and assign them to team members while tracking progress
- Milestone Tracking: Monitor project milestones and deliverables to ensure timely completion of projects

Data Pipeline & ETL Management:
- Data Pipeline Design: Design robust data pipelines that can handle large volumes of data efficiently using appropriate technologies (e.g., Apache Kafka)
- ETL Processes: Develop efficient ETL processes to extract, transform, and load data from various sources into the analytics platform

Product Development:
- Feature Development: Lead the development of new features from concept through implementation while ensuring they meet user requirements
- Integration Testing: Ensure thorough testing (unit tests, integration tests) is conducted for all features before deployment

Collaboration:
- Cross-functional Collaboration: Collaborate closely with product managers, UX/UI designers, QA engineers, and other stakeholders to deliver high-quality products
- Stakeholder Communication: Communicate effectively with stakeholders regarding project status updates, technical challenges, and proposed solutions

Quality Assurance:
- Performance Optimization: Identify performance bottlenecks within applications or data pipelines and implement optimizations
- Bug Resolution: Triage bugs reported by users or QA teams promptly and ensure timely resolution

Innovation & Continuous Improvement:
- Stay Updated with Trends: Keep abreast of emerging technologies in .Net, React, and data pipeline/ETL tools (like Apache Kafka or Azure Data Factory) that could benefit the product
- Process Improvement: Continuously seek ways to improve engineering processes for increased efficiency and productivity within the team

Mentorship & Team Development:
- Mentorship: Mentor junior developers by providing guidance on their technical growth as well as career development opportunities
- Team Building Activities: Foster a positive team environment through regular meetings (stand-ups) and brainstorming sessions/workshops focusing on problem-solving techniques related to our tech stack (.Net/React/data pipeline)

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

These responsibilities collectively ensure that the Lead Software Engineer not only contributes technically but also guides the team toward successful delivery of advanced data analytics products: .Net backend services, React frontend interfaces, and robustly engineered pipelines supporting the ETL processes that power insightful analytics for end users.

Required Qualifications:
- Bachelor's Degree: A Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field
- Professional Experience: 8+ years of experience in software development with significant time spent on both backend (.Net) and frontend (React) technologies
- Leadership Experience: Proven experience in a technical leadership role where you have led projects or teams
- Technical Expertise:
  - Extensive experience with the .Net framework (C#) for backend development
  - Proficiency with React for frontend development
  - Solid knowledge and hands-on experience with data pipeline technologies (e.g., Apache Kafka)
  - Solid understanding of ETL processes and tools such as Databricks, ADF, Scala/Spark

Technical Skills:
- Architectural Knowledge: Experience designing scalable and high-performance architectures
- Cloud Services: Experience with cloud platforms such as Azure, AWS, or Google Cloud Platform
- Software Development Lifecycle: Comprehensive understanding of the software development lifecycle (SDLC), including Agile methodologies
- Database Management: Proficiency with SQL and NoSQL databases (e.g., SQL Server, MongoDB)
- Leadership Abilities: Proven solid leadership skills with the ability to inspire and motivate teams
- Communication Skills: Proven superior verbal and written communication skills for effective collaboration with cross-functional teams and stakeholders
- Problem-Solving Abilities: Proven solid analytical and problem-solving skills

Preferred Qualification:
- Advanced Degree (Optional): A Master's degree in a relevant field
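The stack here is .Net and React, but as a language-neutral sketch of the Kafka production step in such a data pipeline, here is a minimal confluent-kafka producer; the broker address and topic are placeholders.

```python
# Hypothetical Kafka producer for an analytics event stream.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker:9092"})

def on_delivery(err, msg):
    # Invoked asynchronously once the broker acks (or rejects) the record.
    print("failed:" if err else "delivered to:", err or msg.topic())

producer.produce(
    "analytics-events",
    value=json.dumps({"user": "u1", "action": "click"}),
    callback=on_delivery,
)
producer.flush()  # block until all queued messages are delivered
```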

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Gurugram

Work from Office


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. We are seeking a highly skilled and experienced Senior Cloud Data Engineer to join our team for a Cloud Data Modernization project. The successful candidate will be responsible for migrating our on-premises Enterprise Data Warehouse (SQLServer) to a modern cloud-based data platform utilizing Azure Cloud data tools, Delta lake and Snowflake. Primary Responsibilities Lead the migration of the ETLs from on-premises SQLServer based data warehouse to Azure Cloud, Databricks and Snowflake Design, develop, and implement data platform solutions using Azure Data Factory (ADF), Self-hosted Integration Runtime (SHIR), Logic Apps, Azure Data Lake Storage Gen2 (ADLS Gen2), Blob Storage, and Databricks (Pyspark) Review and analyze existing on-premises ETL processes developed in SSIS and T-SQL Implement DevOps practices and CI/CD pipelines using GitActions Collaborate with cross-functional teams to ensure seamless integration and data flow Optimize and troubleshoot data pipelines and workflows Ensure data security and compliance with industry standards Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications 6+ years of experience as a Cloud Data Engineer Hands-on experience with Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage) and Databricks Solid experience in ETL development using on-premises databases and ETL technologies Experience with Python or other scripting languages for data processing Experience with Agile methodologies Proficiency in DevOps and CI/CD practices using GitActions Proven excellent problem-solving skills and ability to work independently Solid communication and collaboration skills Solid analytical skills and attention to detail Ability to adapt to new technologies and learn quickly Preferred Qualifications Certification in Azure or Databricks Experience with data modeling and database design Experience with development in Snowflake for data engineering and analytics workloads Knowledge of data governance and data quality best practices Familiarity with other cloud platforms (e.g., AWS, Google Cloud) At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. 
Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

10 - 15 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Test Planning & Automation Lead - Cloud Data Modernization

Position Overview: We are seeking a highly skilled and experienced Test Planning & Automation Lead to join our team for a Cloud Data Modernization project. This role involves leading the data validation testing efforts for the migration of an on-premises Enterprise Data Warehouse (SQL Server) to a target cloud tech stack comprising Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage, etc.) and Snowflake. The primary goal is to ensure data consistency between the on-premises and cloud environments.

Primary Responsibilities:
- Lead Data Validation Testing: Oversee and manage the data validation testing process to ensure data consistency between the on-premises SQL Server and the target cloud environment
- Tool Identification and Automation: Identify and implement appropriate tools to automate the testing process, reducing reliance on manual methods such as Excel or manual file comparisons
- Testing Plan Development: Define and develop a comprehensive testing plan that addresses validations for all data within the data warehouse
- Collaboration: Work closely with data engineers, cloud architects, and other stakeholders to ensure seamless integration and validation of data
- Quality Assurance: Establish and maintain quality assurance standards and best practices for data validation and testing
- Reporting: Generate detailed reports on testing outcomes, data inconsistencies, and corrective actions
- Continuous Improvement: Continuously evaluate and improve testing processes and tools to enhance efficiency and effectiveness
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Bachelor's degree or above
- Leadership Experience: 6+ years as a testing lead in Data Warehousing or Cloud Data Migration projects
- Automation Tools: Experience with data validation through custom-built Python frameworks and testing automation tools
- Testing Methodologies: Proficiency in defining and implementing testing methodologies and frameworks for data validation
- Technical Expertise: Solid knowledge of Python, SQL Server, Azure Cloud data tools (ADF, SHIR, Logic Apps, ADLS Gen2, Blob Storage), Databricks, and Snowflake
- Analytical Skills: Proven excellent analytical and problem-solving skills to identify and resolve data inconsistencies
- Communication: Proven solid communication skills to collaborate effectively with cross-functional teams
- Project Management: Demonstrated ability to manage multiple tasks and projects simultaneously, ensuring timely delivery of testing outcomes

Preferred Qualifications:
- Experience in leading data validation testing efforts in cloud migration projects
- Familiarity with Agile methodologies and project management tools

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
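A minimal sketch of the kind of automated parity check this role would own: comparing per-table row counts between the source SQL Server and target Snowflake. The connection details and table list are assumptions; a real framework would also compare checksums and column-level aggregates.

```python
# Hypothetical source-vs-target row-count validation.
import pyodbc
import snowflake.connector

TABLES = ["dbo.Claims", "dbo.Members"]  # placeholder table list

mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-sql;DATABASE=EDW;Trusted_Connection=yes;"
)
snow = snowflake.connector.connect(
    account="example-account", user="QA_SVC", password="<SECRET>",
    database="EDW", schema="CORE", warehouse="QA_WH",
)

for table in TABLES:
    src = mssql.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    cur = snow.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table.split('.')[-1].upper()}")
    tgt = cur.fetchone()[0]
    print(f"{table}: source={src} target={tgt} "
          f"{'OK' if src == tgt else 'MISMATCH'}")
```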

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Ahmedabad

Work from Office


Role & responsibilities

Senior Data Engineer Job Description

GRUBBRR is seeking a mid/senior-level data engineer to help build our next-generation analytical and big data solutions. We strive to build cloud-native, consumer-first, UX-friendly kiosks and online applications across a variety of verticals, supporting enterprise clients and small businesses. Behind our consumer applications, we integrate and interact with a deep stack of payment, loyalty, and POS systems. In addition, we provide actionable insights that enable our customers to make informed decisions. Our challenge and goal is to provide a frictionless experience for our end consumers and easy-to-use, smart management capabilities for our customers to maximize their ROIs.

Responsibilities:
- Develop and maintain data pipelines
- Ensure data quality and accuracy
- Design, develop and maintain large, complex sets of data that meet non-functional and functional business requirements
- Build the infrastructure required for optimal extraction, transformation and loading of data from various data sources using cloud technologies
- Build analytical tools that utilize the data pipelines

Skills:
- Solid experience with SQL & NoSQL
- Strong data modeling skills for data lakes, data warehouses and data marts, including dimensional modeling and star schemas
- Proficiency with Azure Data Factory data integration technology
- Knowledge of Hadoop or similar big data technology
- Knowledge of Apache Kafka, Spark, Hive or equivalent
- Knowledge of Azure or AWS analytics technologies

Qualifications:
- BS in Computer Science, Applied Mathematics or related fields (MS preferred)
- At least 8 years of experience working with OLAP systems
- Microsoft Azure or AWS data engineer certification a plus
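For a flavor of the pipeline work above, here is a minimal PySpark sketch that stages raw order events from a lake landing zone into a star-schema-style fact table. The storage paths, column names, and schema are assumptions for illustration, not GRUBBRR's actual design.

```python
# Sketch: load raw order events and shape them into a fact table keyed by a
# date dimension. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_fact_load").getOrCreate()

# Assumed ADLS Gen2 landing path for raw JSON order events.
raw = spark.read.json("abfss://raw@datalake.dfs.core.windows.net/orders/")

fact_orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_time"))
       .withColumn("date_key", F.date_format("order_ts", "yyyyMMdd").cast("int"))
       .select(
           "order_id", "store_id", "date_key",
           F.col("total_amount").cast("decimal(12,2)").alias("amount"),
       )
)

# Append into the curated zone, partitioned by date for downstream marts.
(fact_orders.write.mode("append")
    .partitionBy("date_key")
    .parquet("abfss://curated@datalake.dfs.core.windows.net/fact_orders/"))
```

In practice, an ADF pipeline would orchestrate a job like this alongside the dimension loads it depends on.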

Posted 3 weeks ago

Apply

2.0 - 4.0 years

9 - 14 Lacs

Hyderabad, Gurugram

Work from Office


Overview

We are PepsiCo. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI Overview

With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization that Data Scientists and ML Engineers report to within the broader D+A organization. DS will also lead, facilitate and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support pre-engagement activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad

The role involves developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end ML cycle with Machine Learning services and pipelines.

Responsibilities:
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards
- Use big data technologies to help process data and build scaled data pipelines (batch to real time)
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure
- Automate ML model deployments

Qualifications:
- Minimum 3 years of hands-on work experience in data science / machine learning
- Minimum 3 years of SQL experience
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers
- BE/BS in Computer Science, Math, Physics, or other technical fields
- Data Science: Hands-on experience and strong knowledge of building supervised and unsupervised machine learning models
- Programming Skills: Hands-on experience in statistical programming languages like Python and database query languages like SQL
- Statistics: Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators
- Cloud: Experience in Databricks and ADF is desirable
- Familiarity with Spark, Hive and Pig is an added advantage
- Model deployment experience is a plus
- Experience with version control systems like GitHub and with CI/CD tools
- Experience in exploratory data analysis
- Knowledge of MLOps / DevOps and deploying ML models is required
- Experience using MLflow, Kubeflow, etc. is preferred
- Experience executing and contributing to MLOps automation infrastructure is good to have
- Exceptional analytical and problem-solving skills
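Since MLflow is named explicitly, here is a minimal sketch of MLflow experiment tracking, one slice of the MLOps lifecycle the role covers. The experiment name, model choice, and synthetic data are illustrative assumptions.

```python
# Sketch: track parameters, a metric, and a model artifact with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demand-forecast-poc")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")  # artifact for later deployment
```

Deployment automation would then hang off the tracked run, for example by registering the logged model and promoting it through CI/CD.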

Posted 3 weeks ago

Apply

9.0 - 12.0 years

7 - 11 Lacs

Chennai, Bengaluru

Work from Office


Job Title: Power BI Lead
Experience: 9-12 Years
Location: Chennai / Bengaluru

Role and Responsibilities:
- Talk to client stakeholders and understand the requirements for building their Business Intelligence dashboards and reports
- Design, develop, and maintain Power BI reports and dashboards for business users
- Translate business requirements into effective visualizations using various data sources
- Create data models, DAX calculations, and custom measures to support business analytics needs
- Optimize performance and ensure data accuracy in Power BI reports
- Troubleshoot and resolve issues related to transformations and visualizations
- Train end users on using Power BI for self-service analytics

Skills Required:
- Proficiency in Power BI Desktop and Power BI Service
- Good understanding of Power BI Copilot
- Strong understanding of data modelling concepts and the DAX language
- Strong understanding of semantic data modelling concepts
- Experience with data visualization best practices
- Experience working with streaming data as well as batch data
- Knowledge of ADF would be an added advantage
- Knowledge of SAS would be an added advantage

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies