
1432 ADF Jobs - Page 49

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Manager - MSM (Microsoft Sustainability Manager) Architect

As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. You will leverage your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
- Oversee the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools and responsibilities are in place to deliver excellence, and that risks are identified, managed and mitigated.
- Analyse the chosen technologies against the implied target state and leverage operational knowledge to identify technical and business gaps.
- Provide innovative and practical designs for the integration of new and existing solutions, which could span one or more functions of the enterprise, applying advanced technical capabilities.
- Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances and others to drive an integrated solution development and activation plan.
- Create sales and delivery collateral, online knowledge communities and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts.
- Act as an intermediary between the business/client community and the technical community, working with the business to understand and solve complex problems and presenting solutions and options in a simplified manner for clients.

Microsoft Sustainability Manager configuration and customization:
- Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics.
- Configure and customize Microsoft Sustainability Manager to meet specific data needs and reporting requirements.
- Develop automation routines and workflows for data ingestion, processing, and transformation.
- Integrate Sustainability Manager with other relevant data platforms and tools.
- Stay up to date on evolving ESG regulations, frameworks, and reporting standards.

Power BI skills:
- Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics.
- Collaborate with stakeholders to identify data and reporting needs.
- Develop interactive reports and storytelling narratives to effectively communicate ESG performance.

Designing and implementing data models:
- Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.).
- Ensure the data model aligns with relevant ESG frameworks and reporting standards.
- Create clear documentation and maintain data lineage for transparency and traceability.
- Analyse and interpret large datasets relating to environmental, social, and governance performance.

KPI (Key Performance Indicators) modelling and analysis (a minimal pandas sketch follows this listing):
- Define and develop relevant KPIs for tracking progress towards ESG goals.
- Perform data analysis to identify trends, patterns, and insights related to ESG performance.
- Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
- 3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
- Subject matter expertise in sustainability; relevant experience preferred (across any industry or competency).
- Experience managing large, complex change management programs with multiple global stakeholders (required).
- Strong knowledge of Power Platform (Core), Power Apps (Canvas & Model-driven), and Power Automate.
- At least 6+ years of relevant experience on Power Platform Core (Dataverse/CDS, Canvas Apps, Model-driven apps, Power Portals/Power Pages) and Dynamics CRM/365.
- Strong, proven experience on Power Automate with an efficiency/performance-driven solution approach.
- Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
- Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
- Prior experience in go-to-market efforts.
- Strong understanding of data modelling concepts and methodologies.
- Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
- Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
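The KPI modelling described above can be illustrated with a short, hypothetical sketch. This example uses pandas rather than the Microsoft Sustainability Manager or DAX tooling the posting names, and the emissions data, column names, and intensity KPI are invented for illustration only.

```python
import pandas as pd

# Hypothetical emissions extract; in practice this data would come from
# Microsoft Sustainability Manager / Dataverse rather than a local frame.
emissions = pd.DataFrame({
    "facility": ["Plant A", "Plant A", "Plant B", "Plant B"],
    "scope":    ["Scope 1", "Scope 2", "Scope 1", "Scope 2"],
    "tco2e":    [1200.0, 450.0, 800.0, 300.0],   # tonnes CO2-equivalent
    "output_units": [10000, 10000, 6000, 6000],  # production volume
})

# Example KPI: emissions intensity (tCO2e per 1,000 units produced).
by_facility = emissions.groupby("facility").agg(
    total_tco2e=("tco2e", "sum"),
    output_units=("output_units", "max"),
)
by_facility["intensity_per_1k_units"] = (
    by_facility["total_tco2e"] / by_facility["output_units"] * 1000
)
print(by_facility)
```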

Posted 1 month ago

Apply

4.0 - 8.0 years

14 - 24 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

4 years of hands-on experience in .NET, C#, MVC, SQL, and Web API development. Familiarity with Function Apps, Cosmos DB, Durable Functions, Event Grid, Azure Data Factory, Logic Apps, Service Bus, and Storage Accounts is essential. CTC up to 24 LPA.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency (see the freshness-check sketch after this listing).
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
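One concrete form the data-observability work above can take is a freshness check on a pipeline's output. The sketch below is a minimal, hypothetical Python example; the table, column name, and four-hour threshold are assumptions, and it presumes naive UTC timestamps.

```python
from datetime import timedelta

import pandas as pd

def check_freshness(df: pd.DataFrame, ts_col: str, max_lag: timedelta) -> dict:
    """Flag a pipeline output whose newest record is older than max_lag."""
    newest = pd.to_datetime(df[ts_col]).max()
    # Assumes the source timestamps are naive UTC; localize before comparing.
    lag = pd.Timestamp.now(tz="UTC") - newest.tz_localize("UTC")
    return {"newest_record": newest, "lag": lag, "stale": lag > max_lag}

# Hypothetical usage against a pipeline's output batch:
batch = pd.DataFrame({"loaded_at": ["2024-01-01T00:00:00", "2024-01-01T06:00:00"]})
print(check_freshness(batch, "loaded_at", max_lag=timedelta(hours=4)))
```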

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, 6-Month Contract)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. A minimal PySpark pipeline sketch follows this listing.

#KeyResponsibilities
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

#MustHaveSkills
- Experience: 6+ years in Data Engineering
- Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, Containerization (Docker), Clean coding practices

#GoodToHaveSkills
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
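A minimal, hypothetical sketch of the ADLS-to-Delta PySpark pipeline this listing describes. The storage account, container paths, and column names are invented; in Databricks the ADLS credentials would typically come from a Key Vault-backed secret scope rather than appear in code.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Hypothetical paths: abfss://<container>@<account>.dfs.core.windows.net/...
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders_clean/"

# Extract: read raw CSV files landed by ADF.
orders = spark.read.option("header", True).csv(raw_path)

# Transform: type the columns, drop bad rows, stamp a load date.
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull())
    .withColumn("load_date", F.current_date())
)

# Load: write as Delta, partitioned for downstream queries.
clean.write.format("delta").mode("overwrite").partitionBy("load_date").save(curated_path)
```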

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Overview

We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world, and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Purpose. For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

The Data Science Team develops Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Machine Learning Services and Pipelines.

PepsiCo Data Analytics & AI Overview: With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise. The Data Science Pillar in DA&AI is the organization that Data Scientists and ML Engineers report to in the broader D+A organization. DS will also lead, facilitate and collaborate on the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support "pre-engagement" activities as requested and validated by the prioritization framework of DA&AI.

Data Scientist: Hyderabad and Gurugram

You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners and final business users. This will give you the visibility and understanding of the criticality of your developments.

Responsibilities
- Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope.
- Active contributor to code and development in projects and services.
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption.
- Partner with ML engineers working on industrialization.
- Communicate with business stakeholders in the process of service design, training and knowledge transfer.
- Support large-scale experimentation and build data-driven models.
- Refine requirements into modelling problems.
- Influence product teams through data-based recommendations.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer.
- Create reusable packages or libraries.
- Ensure on-time and on-budget delivery that satisfies project requirements, while adhering to enterprise architecture standards.
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time).
- Implement the end-to-end ML lifecycle with Azure Machine Learning and Azure Pipelines.
- Automate ML model deployments.

Qualifications
- BE/B.Tech in Computer Science, Maths, or technical fields.
- Overall 5+ years of experience working as a Data Scientist.
- 4+ years' experience building solutions in the commercial or supply chain space.
- 4+ years working in a team to deliver production-level analytic solutions.
- Fluent in git (version control); understanding of Jenkins and Docker is a plus.
- Fluent in SQL syntax.
- 4+ years' experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems (a minimal sketch follows this listing).
- 4+ years' experience in developing business-problem-related statistical/ML modelling with industry tools, with a primary focus on Python or PySpark development.

Skills, Abilities, Knowledge
- Data Science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models; knowledge of time series/demand forecast models is a plus.
- Programming: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage.
- Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool.
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics "product" creation.
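To make the supervised-modelling requirement concrete, here is a minimal scikit-learn sketch of the regression case; the synthetic "demand" data and features are invented for illustration and are not part of the posting.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical demand data: weekly features -> units sold.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))          # e.g., price, promo flag, seasonality terms
y = 100 + 20 * X[:, 0] - 15 * X[:, 1] + rng.normal(scale=5, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("MAE:", mean_absolute_error(y_test, preds))
```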

Posted 1 month ago

Apply

12.0 - 20.0 years

22 - 37 Lacs

Bengaluru

Hybrid

12+ yrs of experience in Data Architecture. Strong in Azure Data Services & Databricks, including Delta Lake & Unity Catalog. Experience in Azure Synapse, Purview, ADF, DBT, Apache Spark, DWH, Data Lakes, NoSQL, OLTP. Notice period: immediate. Contact: sachin@assertivebs.com

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components. The following technology skills are required:
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience with ADF and Dataflow.
- Experience with big data tools like Delta Lake and Azure Databricks (a minimal Delta Lake sketch follows this listing).
- Experience with Synapse.
- Skills in designing an Azure data solution.
- Ability to assemble large, complex data sets that meet functional/non-functional business requirements.
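A minimal sketch of the Delta Lake work mentioned above: an incremental upsert (MERGE) into a curated table. It assumes a Databricks runtime or a Spark session configured with the delta-spark package, and the path and column names are hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-upsert").getOrCreate()

# Hypothetical incremental batch extracted by ADF/Dataflow.
updates = spark.createDataFrame(
    [(1, "shipped"), (2, "new")], ["order_id", "status"]
)

target = DeltaTable.forPath(spark, "/mnt/lake/curated/orders")

# Upsert: update matching orders, insert the rest (classic incremental load).
(
    target.alias("t")
    .merge(updates.alias("u"), "t.order_id = u.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```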

Posted 1 month ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Location Name: Pune Corporate Office - HO

Job Purpose
Effectively handle development and support in PostgreSQL/Oracle/SQL database technology. Interact with cross-functional business teams to understand business needs and prioritize requirements.

Duties And Responsibilities
- Develop end-to-end functionality/modules, write complex code, perform UAT and production deployment, and manage support.
- Manage a team of 5-6 developers.
- Define project milestones and drive partner/development teams to ensure on-time delivery.
- Responsible for delivery schedule, change process management, project monitoring and status reporting.
- Work with internal IT teams to ensure delivery of the agreed solutions.
- Test new builds for all scenarios and production outcomes.

Key Decisions / Dimensions
- Decisions in solutions through technology and innovation.
- Tackling production issues.

Major Challenges
- Managing delivery and support as an added responsibility.
- Managing support teams.
- Fulfilling the entire requirement within restricted timelines.

Required Qualifications And Experience
- Qualifications: minimum qualification required is graduation. Good negotiation and communication skills.
- Work experience: relevant work experience of 8 to 12 years.

Skills Keywords
- Oracle SQL and PL/SQL: procedures, packages, cursors, triggers, functions, complex SQL queries and PL/SQL code, partitioning techniques, data loading mechanisms, indexes, and other knowledge and experience in database design.
- PostgreSQL 12.0, MSSQL 2019, Oracle 11g, Oracle 12c.
- Oracle SQL Developer or PL/SQL Developer.
- ADF 2.0.
- Knowledge of GitHub and DevOps using Azure Pipelines.
- Hands-on experience in query tuning and other optimization knowledge will be an added advantage.

A hedged Python sketch of parameterized Oracle access follows this listing.
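As a small illustration of the database development this posting covers, here is a hedged Python sketch using the python-oracledb driver; the DSN, credentials, and table are invented. Bind variables (rather than concatenated literals) let Oracle reuse the parsed statement, which is the starting point for the query-tuning skills the posting asks for.

```python
import oracledb

# Hypothetical connection details; use a vault for credentials in practice.
conn = oracledb.connect(user="app_user", password="secret",
                        dsn="dbhost.example.com/ORCLPDB1")

with conn.cursor() as cur:
    # Bind variables let Oracle cache and reuse the parsed statement.
    cur.execute(
        "SELECT order_id, status FROM orders WHERE customer_id = :cust_id",
        cust_id=42,
    )
    for order_id, status in cur:
        print(order_id, status)

conn.close()
```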

Posted 1 month ago

Apply

10.0 - 12.0 years

25 - 27 Lacs

Indore, Hyderabad, Pune

Work from Office

We are seeking a skilled Lead Data Engineer with extensive experience in Snowflake, ADF, SQL, and other relevant data technologies to join our team. As a key member of our data engineering team, you will play an instrumental role in designing, developing, and managing data pipelines, working closely with cross-functional teams to drive the success of our data initiatives.

Key Responsibilities:
- Design, implement, and maintain data solutions using Snowflake, ADF, and SQL Server to ensure data integrity, scalability, and high performance.
- Lead and contribute to the development of data pipelines, ETL processes, and data integration solutions, ensuring the smooth extraction, transformation, and loading of data from diverse sources.
- Work with MSBI, SSIS, and Azure Data Lake Storage to optimize data flows and storage solutions.
- Collaborate with business and technical teams to identify project needs, estimate tasks, and set intermediate milestones to achieve final outcomes.
- Implement industry best practices related to Business Intelligence and Data Management, ensuring adherence to usability, design, and development standards.
- Perform in-depth data analysis to resolve data issues and improve overall data quality.
- Mentor and guide junior data engineers, providing technical expertise and supporting the development of their skills.
- Collaborate effectively with geographically distributed teams to ensure project goals are met in a timely manner.

Required Technical Skills:
- T-SQL, SQL Server, MSBI (SQL Server Integration Services, Reporting Services), Snowflake, Azure Data Factory (ADF), SSIS, Azure Data Lake Storage.
- Proficiency in designing and developing data pipelines, data integration, and data management workflows.
- Strong understanding of cloud data solutions, with a focus on Azure-based tools and technologies.

Nice to Have:
- Experience with Power BI for data visualization and reporting.
- Familiarity with Azure Databricks for data processing and advanced analytics.

Mandatory Key Skills: Azure Data Lake Storage, Business Intelligence, Data Management, T-SQL, Power BI, Azure Databricks, Cloud Data Solutions, Snowflake, ADF, SQL Server, MSBI, SSIS

A minimal Snowflake connector sketch follows this listing.
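A minimal, hypothetical sketch of Snowflake access from Python using the snowflake-connector-python package; the account identifier, credentials, stage, and table names are invented, and in practice credentials would come from a secret store rather than source code.

```python
import snowflake.connector

# Hypothetical account and credentials.
conn = snowflake.connector.connect(
    account="xy12345.central-india.azure",
    user="ETL_USER",
    password="secret",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Typical ELT step: bulk-load a staged file into a table (stage is hypothetical).
    cur.execute(
        "COPY INTO staging.orders FROM @orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("SELECT COUNT(*) FROM staging.orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```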

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineer - Data Solutions Delivery + Data Catalog & Quality Engineer

About Advanced Energy
Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy's employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary: We are seeking a highly skilled Data Engineer to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate should have extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. Additionally, the candidate should be well versed in data engineering, data product, and data platform concepts, data mesh, medallion architecture, and establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. The candidate should also have proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.

Key Responsibilities:
- Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
- Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data integrity, quality, and security across all data platforms.
- Provide expertise in data engineering, data product, and data platform concepts.
- Implement data mesh principles and medallion architecture to build scalable data platforms.
- Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Implement data quality practices using tools like Great Expectations, Deequ, etc. (see the sketch after this listing).
- Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
- Develop and maintain documentation for data solutions, data flows, and data models.
- Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field.
- Proven experience as a Data Engineer or similar role.
- In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
- Strong data warehousing skills, including ETL processes, data modelling, and reporting.
- Familiarity with manufacturing and supply chain domains.
- Proficiency in data engineering, data product, and data platform concepts, data mesh, and medallion architecture.
- Experience in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Master's degree in a related field.
- Experience with cloud-based data platforms and tools.
- Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate.

Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities. We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
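For the data-quality requirement, here is a minimal sketch using Great Expectations' classic pandas-dataset API (available in older, pre-1.0 releases; newer versions use a data context and validator instead). The batch and expectations are invented for illustration.

```python
import great_expectations as ge
import pandas as pd

# Hypothetical batch of supply-chain records to validate.
batch = pd.DataFrame({
    "part_id": ["P-100", "P-101", None],
    "qty": [5, -2, 10],
})

# Classic pandas-dataset API: wrap the frame, then assert expectations.
gdf = ge.from_pandas(batch)
print(gdf.expect_column_values_to_not_be_null("part_id"))
print(gdf.expect_column_values_to_be_between("qty", min_value=0, max_value=10000))
```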

Posted 1 month ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Us
At ANZ, we're applying new ways technology and data can be harnessed as we work towards a common goal: to improve the financial wellbeing and sustainability of our millions of customers.

About The Role
At ANZ our purpose is to shape a world where people and communities thrive. We're making this happen by improving our customers' financial wellbeing so they can achieve incredible things, be it buying their home, building a business or saving for things big or small.

Role Type: Permanent
Work Location: Bengaluru
Work Schedule: Regular shifts (hybrid/blended)

Role Purpose: To use analytical and communication skills to elicit, understand, validate and document requirements for business issues, functions and processes, from high-level business needs through to detailed solution requirements. The role plays a key part in supporting the delivery of solutions through the SDLC and in Agile delivery methodologies. The Business Analyst may work independently or in a scrum environment on small/low-complexity changes, and with guidance as part of a BA team on low-to-medium-complexity components of larger initiatives.

What will your day look like?
Analysis:
- Elicitation: define and refine requirements
- Analysis and problem solving
- Process mapping: current and target state
- Identify dependencies, risks and issues
- Use of stories/acceptance criteria
- Building domain/application knowledge
Deliver:
- Self-managing: deliver to outcomes as agreed
- Good communication and documentation
- Responds to and invites feedback
- Facilitation skills: team based
- Learning ANZ release processes/ADF
- Learning Agile ways of working
Connect: Support team/colleagues | Develop relationships | Attend and learn in Guilds/CoPs
Lead: Provide and receive feedback | Share knowledge | Support colleagues' wellbeing

What will you bring?
- Must have 8+ years of relevant experience in the banking domain.
- Establish and drive business and technical requirements gathering.
- Work closely with developers, testers, project managers and the business to define the technical requirements.
- Create documentation, i.e. use cases, prototypes, technical specifications.
- Facilitate workshops with stakeholders and internally.
- Enjoy working in a cross-functional team, taking on a variety of analyst roles, with the ability to flex between business and technical.
- Analyse business and stakeholder requirements to define detailed functional and non-functional solution requirements at the level of detail and rigor sufficient to support development or make a solution decision.

Detailed Description
- Collaborate in Agile/Scrum environments, participating in ceremonies such as sprint planning, backlog grooming, and retrospectives.
- Translate business requirements into actionable insights and system capabilities, ensuring alignment with enterprise architecture and banking regulations.
- Demonstrate a strong understanding of banking products, services, and the regulatory framework.
- Proactively identify gaps, dependencies, and risks, offering recommendations to optimize processes or technical solutions.
- Utilize data analysis tools (e.g., SQL, Excel, BI tools) to support data-driven decision-making and validate solution outcomes.
- Act as a bridge between business and IT, ensuring solutions meet user expectations and business goals.
- Ensure traceability of requirements through test case creation, validation, and UAT support.
- Engage in continuous improvement by identifying opportunities to streamline business processes and reduce manual effort.
- Possess strong stakeholder management and communication skills to influence and gain buy-in from senior business leaders.

So why join us?
ANZ is a place where big things happen as we work together to provide banking and financial services across more than 30 markets. With more than 7,500 people, our Bengaluru team is the bank's largest technology, data and operations centre outside Australia. In operation for over 33 years, the centre is critical in delivering the bank's strategy and making an impact for our millions of customers around the world. Our Bengaluru team not only drives the transformation initiatives of the bank, it also drives a culture that makes ANZ a great place to be. We're proud that people feel they can be themselves at ANZ, and 90 percent of our people feel they belong.

We know our people need different things to be great in their role, so we offer a range of flexible working options, including hybrid work (where the role allows it). Our people also enjoy a range of benefits, including access to health and wellbeing services.

We want to continue building a diverse workplace and welcome applications from everyone. Please talk to us about any adjustments you may require to our recruitment process or the role itself. If you are a candidate with a disability or access requirements, let us know how we can provide you with additional support.

To find out more about working at ANZ visit https://www.anz.com/careers/. You can apply for this role by visiting ANZ Careers and searching for reference number 97323.

Job Posting End Date: 03/06/2025, 11.59pm (Melbourne, Australia)

Posted 1 month ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description: Senior Data Engineer

Job Summary
We are seeking an experienced and highly motivated Senior Azure Data Engineer to join a Data & Analytics team. The ideal candidate will be a hands-on technical leader responsible for designing, developing, implementing, and managing scalable, robust, and secure data solutions on the Microsoft Azure platform. This role involves leading a team of data engineers, setting technical direction, ensuring the quality and efficiency of data pipelines, and collaborating closely with data scientists, analysts, and business stakeholders to meet data requirements.

Key Responsibilities
- Lead, mentor, and provide technical guidance to a team of Azure Data Engineers.
- Design, architect, and implement end-to-end data solutions on Azure, including data ingestion, transformation, storage (lakes/warehouses), and serving layers.
- Oversee and actively participate in the development, testing, and deployment of robust ETL/ELT pipelines using key Azure services.
- Establish and enforce data engineering best practices, coding standards, data quality checks, and monitoring frameworks.
- Ensure data solutions are optimized for performance, cost, scalability, security, and reliability.
- Collaborate effectively with data scientists, analysts, and business stakeholders to understand requirements and deliver effective data solutions.
- Manage, monitor, and troubleshoot Azure data platform components and pipelines.
- Contribute to the strategic technical roadmap for the data platform.

Qualifications & Experience
- Minimum 6-8+ years of overall experience in data engineering roles.
- Minimum 3-4+ years of hands-on experience designing, implementing, and managing data solutions specifically on the Microsoft Azure cloud platform.
- Proven experience (1-2+ years) in a lead or senior engineering role, demonstrating mentorship and technical guidance capabilities.
- Education: bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field (or equivalent practical experience).

Technical Skills
- Core Azure Data Services: deep expertise in Azure Data Factory (ADF), Azure Synapse Analytics (SQL Pools, Spark Pools), Azure Databricks, Azure Data Lake Storage (ADLS Gen2).
- Data Processing & Programming: strong proficiency with Spark (using PySpark or Scala) and expert-level SQL skills. Proficiency in Python is highly desired.
- Data Architecture & Modelling: solid understanding of data warehousing principles (e.g., Kimball), dimensional modelling, ETL/ELT patterns, and data lake design.
- Databases: experience with relational databases (e.g., Azure SQL Database); familiarity with NoSQL concepts/databases is beneficial.
- Version Control: proficiency with Git for code management.
- Leadership & Soft Skills: excellent leadership, mentoring, problem-solving, and communication skills, with the ability to collaborate effectively across various teams.

Azure Component Proficiency
1. Azure Synapse Analytics - High
2. Azure Data Factory - High
3. Azure SQL - High
4. ADLS Storage - High
5. Azure DevOps (CI/CD) - High
6. Azure Databricks - Medium to High
7. Azure Logic Apps - Medium to High
8. Azure Fabric - Good to have, not mandatory
9. Azure Functions - Good to have, not mandatory
10. Azure Purview - Good to have, not mandatory

Additional requirements:
- Good experience in data extraction patterns via ADF: API, files, databases.
- Data masking in Synapse; RBAC.
- Experience in data warehousing: Kimball modelling (a hedged sketch of a Kimball-style dimension build follows this listing).
- Good communication and collaboration skills.
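Since the posting stresses Kimball-style dimensional modelling on Spark, here is a minimal, hypothetical PySpark sketch of building a dimension table with surrogate keys; the table and column names are invented, and a production version would persist to Delta and handle slowly changing dimensions.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("dim-customer").getOrCreate()

# Hypothetical staged source rows.
staged = spark.createDataFrame(
    [("C001", "Asha", "Chennai"), ("C002", "Ravi", "Pune")],
    ["customer_nk", "name", "city"],
)

# Kimball-style dimension: deduplicate on the natural key and assign a
# surrogate key independent of the source system's identifiers.
# (A non-partitioned window is fine for a small dimension build.)
w = Window.orderBy("customer_nk")
dim_customer = (
    staged.dropDuplicates(["customer_nk"])
          .withColumn("customer_sk", F.row_number().over(w))
          .select("customer_sk", "customer_nk", "name", "city")
)
dim_customer.show()
```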

Posted 1 month ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
We are looking for a technical resource for an Oracle Apps R12 financial-modules-based application. The main responsibilities are:
- Development activity on the Oracle R12.2 release
- Interact with business users and BAs/SAs to understand the requirements
- Prepare the technical specification documents
- Develop new interfaces, conversions and reports
- Develop/customize/personalize new and existing Oracle Forms and OAF pages
- Perform impact analysis on possible code changes
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree in Computer Science/Engineering
- 3+ years of Oracle EBS (technical) experience with the R12 release
- Development experience in the EBS environment in Reports, Interfaces, Conversions, Extensions, Workflow (RICEW) and Forms deliverables
- Experience in P2P, Oracle General Ledger (GL), Accounts Payable (AP), Receivables (AR), Cash Management (CM), Sub-ledger Accounting (SLA), and System Administrator modules
- Experience of end-user interaction for requirements gathering, understanding customer needs and working with multiple groups to coordinate and carry out technical activities, including new development, maintenance and production support
- Good knowledge of the R12 financial table structure
- Good knowledge of Agile methodologies
- Good hands-on knowledge of SQL, PL/SQL, Oracle Reports, Oracle Forms, OAF/ADF, BI Publisher reports, shell scripting and web services (Integrated SOA Gateway)
- Oracle APEX knowledge
- Knowledge of web services using Integrated SOA Gateway
- Proven analytical, performance tuning and debugging skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 1 month ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer - Azure Data Platform
Location: Padi, Chennai
Job Type: Full-Time

Role Overview:
We are looking for an experienced Data Engineer to join our Azure Data Platform team. The ideal candidate will have a deep understanding of Azure's data engineering and cloud technology stack. This role is pivotal in driving data-driven decision-making, operational analytics, and advanced manufacturing intelligence initiatives.

Key Responsibilities:
- Lead the design and implementation of data architectures that support operational analytics and advanced manufacturing intelligence, ensuring scalability and flexibility to handle increasing data volumes.
- Design, implement, and maintain scalable data and analytics platforms using Microsoft Azure services, such as Azure Data Factory (ADF), Azure Data Lake Storage Gen2, and Azure Synapse Analytics.
- Develop and manage ETL processes, data pipelines, and batch jobs to ensure efficient data flow and transformation, optimizing pipeline runs and monitoring compute and storage usage (see the incremental-load sketch after this listing).
- Implement metadata management solutions to ensure data quality and governance, leading to consistent data quality and integrity.
- Integrate data from key sources such as SAP, SQL Server, cloud databases, IoT, and other live streaming data into centralized data structures to support analytics and decision-making.
- Provide expertise on data ingestion (SAP, SQL), data transformation, and the automation of data pipelines in a manufacturing context.
- Ensure the data platform supports dashboarding and advanced analytics, enabling business users to independently create and evolve dashboards.
- Implement manufacturing-specific analytics solutions, including leadership and operational dashboards and other analytics solutions across our value chain, leveraging Azure's comprehensive toolset.
- Define and monitor KPIs, ensuring data quality and the accuracy of insights delivered to business stakeholders.
- Identify and manage project risks related to data security, system integration, and scalability.
- Independently maintain the data platform, ensuring its reliability and performance, and implementing best practices for data security and compliance.
- Advise the Data Platform project manager and leadership team on best practices for data management and scaling needs, providing guidance on integrating data from IoT and other SaaS platforms, as well as newer systems as they come into the digital landscape.
- Work closely with data scientists to ensure data is available in the required format for their analyses, and collaborate with Power BI developers to support dashboarding and reporting needs.
- Create data marts for business users to facilitate self-service analytics.
- Mentor and train junior engineers, fostering their professional growth and development, and providing guidance and support on best practices and technical challenges.

Qualifications & Experience:
- Education: Bachelor's degree in Engineering, Computer Science, or a related field.
- Experience: 8-10 years of experience, with a minimum of 5 years working on core data engineering responsibilities on a cloud platform. Project management experience is a big plus.
- Proven track record of implementing data-driven solutions in areas such as plant automation, operational analytics, quality control, and supply chain optimization.
- Technical proficiency: expertise in cloud-based data platforms, particularly within the Azure ecosystem (Azure Data Factory, Synapse Analytics, Databricks). Familiarity with SAP as a data source.
- Proficiency in programming languages such as SQL, Python, and R for analytics and reporting.
- Soft skills: strong analytical mindset with the ability to translate manufacturing challenges into data-driven insights and solutions. Excellent communication and organizational skills.

What We Offer:
- The opportunity to work on transformative data analytics projects that drive innovation and operational excellence in manufacturing.
- A collaborative and dynamic work environment focused on professional growth and career development.
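The "optimizing pipeline runs" theme above usually starts with watermark-based incremental loads. Below is a hedged Python sketch using pyodbc against SQL Server; the connection string, watermark table, and source table are invented (in ADF the same pattern is typically a Lookup activity feeding a parameterized Copy activity).

```python
import pyodbc

# Hypothetical connection string and tables.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=sqlhost;DATABASE=mfg;"
    "UID=etl_user;PWD=secret"
)
cur = conn.cursor()

# 1. Read the high-water mark recorded by the previous run.
cur.execute("SELECT last_loaded_at FROM etl.watermarks WHERE table_name = ?",
            "sensor_readings")
last_loaded_at = cur.fetchone()[0]

# 2. Pull only rows newer than the watermark (incremental, not full reload).
cur.execute(
    "SELECT reading_id, machine_id, value, recorded_at "
    "FROM dbo.sensor_readings WHERE recorded_at > ? ORDER BY recorded_at",
    last_loaded_at,
)
rows = cur.fetchall()

# 3. ...load `rows` into the lake/warehouse, then advance the watermark.
if rows:
    cur.execute(
        "UPDATE etl.watermarks SET last_loaded_at = ? WHERE table_name = ?",
        rows[-1].recorded_at, "sensor_readings",
    )
    conn.commit()
```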

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
We are looking for a skilled DevOps Engineer with solid experience in cloud infrastructure management, CI/CD pipelines, and deployment on Kubernetes. The ideal candidate will have a strong background in Azure cloud platforms, with proficiency in configuring, automating, and optimizing cloud deployments to ensure the scalability, reliability, and security of our systems.

Responsibilities:
- Design, implement, and manage CI/CD pipelines using Azure DevOps, GitHub, and Jenkins for automated deployments of applications and infrastructure changes.
- Architect and deploy solutions on Kubernetes clusters (EKS and AKS) to support containerized applications and microservices architecture.
- Collaborate with development teams to streamline code deployments, releases, and continuous integration processes across multiple environments.
- Configure and manage Azure services including Azure Synapse Analytics, Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), and other data services for efficient data processing and analytics workflows.
- Utilize AWS services such as Amazon EMR, Amazon Redshift, Amazon S3, Amazon Aurora, IAM policies, and Azure Monitor for data management, warehousing, and governance.
- Implement infrastructure as code (IaC) using tools like Terraform or CloudFormation to automate provisioning and management of cloud resources.
- Ensure high availability, performance monitoring, and disaster recovery strategies for cloud-based applications and services.
- Develop and enforce security best practices and compliance policies, including IAM policies, encryption, and access controls across Azure environments.
- Collaborate with cross-functional teams to troubleshoot production issues, conduct root cause analysis, and implement solutions to prevent recurrence.
- Stay current with industry trends, best practices, and evolving technologies in cloud computing, DevOps, and containerization.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 5+ years of experience as a DevOps Engineer or similar role, with hands-on expertise in AWS and Azure cloud environments.
- Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation.
- Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms.
- Deep understanding of cloud-native architectures, microservices, and serverless computing.
- Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics.
- Solid grasp of infrastructure as code (IaC) tools like Terraform, CloudFormation, or ARM templates.
- Experience with monitoring tools (e.g., Prometheus, Grafana) and logging solutions for cloud-based applications (a minimal custom-metric sketch follows this listing).
- Excellent troubleshooting skills and ability to resolve complex technical issues in production environments.

(ref:hirist.tech)
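As a small illustration of the monitoring requirement, here is a minimal sketch that exposes a custom metric with the prometheus_client Python library; the metric name, port, and simulated pipeline are invented, and Prometheus would be configured separately to scrape the endpoint.

```python
import random
import time

from prometheus_client import Gauge, start_http_server

# Hypothetical custom metric for a deployment pipeline.
PIPELINE_DURATION = Gauge(
    "pipeline_run_duration_seconds",
    "Duration of the most recent pipeline run",
    ["pipeline"],
)

def record_run(pipeline: str) -> None:
    # Stand-in for a real pipeline run; here we just simulate the duration.
    duration = random.uniform(30, 120)
    PIPELINE_DURATION.labels(pipeline=pipeline).set(duration)

if __name__ == "__main__":
    start_http_server(9100)  # exposes /metrics for Prometheus to scrape
    while True:
        record_run("nightly-etl")
        time.sleep(60)
```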

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer you will:
- Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights that build comprehensive Risk & Controls monitoring mechanisms.
- Create executive reporting, proposals, and marketing material, ensuring the highest quality deliverables.
- Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
- Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
- Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models, vector databases, comprehensive DevOps services and full-stack application development.
- Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
- Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
- Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting) to complex problems (a minimal sketch follows this listing).
- Ensure adherence to ethical AI guidelines and data governance policies.
- Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
- Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
- Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the organization remains at the forefront of innovation and maintains a competitive edge in the industry.
- Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
- Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution that can then be delivered.
- Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
- Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have
- 4+ years of working experience in large-scale AI/ML models and data science.
- Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
- Proficiency in AI/ML programming languages like Python, R, and SQL.
- Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
- Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and deep learning algorithms).
- Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
- Strong programming skills in Python, including machine learning libraries such as scikit-learn, pandas, NumPy, etc.
- Experience with tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Ability to automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse, ML, AI, ADF, etc.
- Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps & SharePoint.
- Strong understanding of large enterprise applications like SAP, Oracle ERP & Microsoft Dynamics.

Ideally, you'll also have
- A bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines; relevant certifications are considered a plus.
- A self-driven, creative problem-solver mindset, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
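To ground the modelling techniques the posting lists, here is a minimal scikit-learn sketch of gradient boosting on a synthetic, imbalanced classification task; the data and hyperparameters are invented for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical risk-flag classification task (synthetic stand-in data).
X, y = make_classification(n_samples=1000, n_features=10,
                           weights=[0.9, 0.1], random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=1)

clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, random_state=1)
clf.fit(X_train, y_train)

scores = clf.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, scores))
```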

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will: Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build a comprehensive Risk & Controls monitoring mechanism. Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables. Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines. Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and vector databases, comprehensive DevOps services, and full-stack application development. Lead the design and development of bespoke machine learning algorithms to achieve business objectives. Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes. Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques. Apply advanced machine learning techniques (such as SVM, decision trees, neural networks, random forests, and gradient boosting) to complex problems. Ensure adherence to ethical AI guidelines and data governance policies. Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform. Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services. Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the firm remains at the forefront of innovation and maintains a competitive edge in the industry. Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results. Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution that can then be delivered. Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives. Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role, you must have:
4+ years of working experience with large-scale AI/ML models and data science. A deep understanding of statistics, AI/ML algorithms, and predictive modeling. Proficiency in AI/ML programming languages such as Python, R, and SQL. Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras. Expertise in machine learning algorithms and data mining techniques (such as SVM, decision trees, and deep learning algorithms). Experience implementing monitoring and logging tools to ensure AI model performance and reliability. Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, and NumPy. Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc. A good understanding of Git, JIRA, change/release management, build/deploy, CI/CD with Azure DevOps, and SharePoint. Strong understanding of large enterprise applications such as SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have: A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or a related discipline. Relevant certifications are considered a plus. A self-driven and creative problem-solver mindset, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
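The "monitoring and logging tools" requirement above can be pictured with a small Python sketch; the baseline value, threshold, and metric choice below are illustrative assumptions, not a prescribed approach:

```python
# Illustrative model-monitoring sketch: log per-batch prediction statistics
# and flag drift against an assumed training-time baseline.
import logging
import numpy as np

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("model-monitor")

BASELINE_MEAN = 0.42    # assumed mean predicted score observed at training time
DRIFT_THRESHOLD = 0.10  # assumed tolerance before raising an alert

def monitor_batch(scores: np.ndarray) -> None:
    """Log summary statistics for a batch of model scores and warn on drift."""
    mean = float(scores.mean())
    log.info("batch_size=%d mean=%.3f p95=%.3f", len(scores), mean,
             float(np.percentile(scores, 95)))
    if abs(mean - BASELINE_MEAN) > DRIFT_THRESHOLD:
        log.warning("score drift detected: mean=%.3f baseline=%.3f", mean, BASELINE_MEAN)

monitor_batch(np.random.default_rng(0).uniform(0, 1, size=256))
```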

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will: Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build a comprehensive Risk & Controls monitoring mechanism. Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables. Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines. Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and vector databases, comprehensive DevOps services, and full-stack application development. Lead the design and development of bespoke machine learning algorithms to achieve business objectives. Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes. Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques. Apply advanced machine learning techniques (such as SVM, decision trees, neural networks, random forests, and gradient boosting) to complex problems. Ensure adherence to ethical AI guidelines and data governance policies. Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform. Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services. Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the firm remains at the forefront of innovation and maintains a competitive edge in the industry. Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results. Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution that can then be delivered. Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives. Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role, you must have:
4+ years of working experience with large-scale AI/ML models and data science. A deep understanding of statistics, AI/ML algorithms, and predictive modeling. Proficiency in AI/ML programming languages such as Python, R, and SQL. Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras. Expertise in machine learning algorithms and data mining techniques (such as SVM, decision trees, and deep learning algorithms). Experience implementing monitoring and logging tools to ensure AI model performance and reliability. Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, and NumPy. Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc. A good understanding of Git, JIRA, change/release management, build/deploy, CI/CD with Azure DevOps, and SharePoint. Strong understanding of large enterprise applications such as SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have: A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or a related discipline. Relevant certifications are considered a plus. A self-driven and creative problem-solver mindset, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
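For the deep-learning-framework requirement (TensorFlow, PyTorch, or Keras), a minimal PyTorch training loop looks roughly like the sketch below; the architecture and synthetic data are assumptions for demonstration only:

```python
# Minimal PyTorch training-loop sketch: a tiny MLP on synthetic binary data.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

X = torch.randn(256, 10)                # synthetic features (assumption)
y = (X[:, 0] > 0).float().unsqueeze(1)  # toy binary target

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    print(f"epoch={epoch} loss={loss.item():.4f}")
```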

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will: Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build a comprehensive Risk & Controls monitoring mechanism. Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables. Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines. Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and vector databases, comprehensive DevOps services, and full-stack application development. Lead the design and development of bespoke machine learning algorithms to achieve business objectives. Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes. Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques. Apply advanced machine learning techniques (such as SVM, decision trees, neural networks, random forests, and gradient boosting) to complex problems. Ensure adherence to ethical AI guidelines and data governance policies. Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform. Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services. Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the firm remains at the forefront of innovation and maintains a competitive edge in the industry. Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results. Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution that can then be delivered. Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives. Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role, you must have:
4+ years of working experience with large-scale AI/ML models and data science. A deep understanding of statistics, AI/ML algorithms, and predictive modeling. Proficiency in AI/ML programming languages such as Python, R, and SQL. Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras. Expertise in machine learning algorithms and data mining techniques (such as SVM, decision trees, and deep learning algorithms). Experience implementing monitoring and logging tools to ensure AI model performance and reliability. Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, and NumPy. Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc. A good understanding of Git, JIRA, change/release management, build/deploy, CI/CD with Azure DevOps, and SharePoint. Strong understanding of large enterprise applications such as SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have: A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or a related discipline. Relevant certifications are considered a plus. A self-driven and creative problem-solver mindset, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
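As an illustration of the gradient-boosting and predictive-modeling items above, here is a hedged Scikit-learn sketch with cross-validated evaluation on synthetic data (all settings are assumptions, not a recommended configuration):

```python
# Cross-validated evaluation of a gradient-boosting classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"mean AUC={scores.mean():.3f} +/- {scores.std():.3f}")
```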

Posted 1 month ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will: Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build a comprehensive Risk & Controls monitoring mechanism. Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables. Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines. Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and vector databases, comprehensive DevOps services, and full-stack application development. Lead the design and development of bespoke machine learning algorithms to achieve business objectives. Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes. Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques. Apply advanced machine learning techniques (such as SVM, decision trees, neural networks, random forests, and gradient boosting) to complex problems. Ensure adherence to ethical AI guidelines and data governance policies. Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform. Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services. Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the firm remains at the forefront of innovation and maintains a competitive edge in the industry. Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results. Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution that can then be delivered. Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives. Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role, you must have:
4+ years of working experience with large-scale AI/ML models and data science. A deep understanding of statistics, AI/ML algorithms, and predictive modeling. Proficiency in AI/ML programming languages such as Python, R, and SQL. Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras. Expertise in machine learning algorithms and data mining techniques (such as SVM, decision trees, and deep learning algorithms). Experience implementing monitoring and logging tools to ensure AI model performance and reliability. Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, and NumPy. Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc. A good understanding of Git, JIRA, change/release management, build/deploy, CI/CD with Azure DevOps, and SharePoint. Strong understanding of large enterprise applications such as SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have: A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or a related discipline. Relevant certifications are considered a plus. A self-driven and creative problem-solver mindset, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
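The Docker/Kubernetes pipeline requirement usually starts with wrapping a model in a service before containerizing it; a hypothetical sketch using FastAPI is shown below. The endpoint name, request schema, and the stand-in "model" are assumptions only:

```python
# Hypothetical model-serving sketch: a minimal HTTP scoring endpoint that
# could be packaged into a Docker image and deployed on Kubernetes.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Features(BaseModel):
    f0: float
    f1: float

@app.post("/score")
def score(features: Features) -> dict:
    # Placeholder "model": a fixed linear rule standing in for a real artifact.
    raw = 0.7 * features.f0 + 0.3 * features.f1
    return {"score": raw, "label": int(raw > 0.5)}

# Run locally with: uvicorn app:app --host 0.0.0.0 --port 8000
```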

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About ProcDNA ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For
You’ll be driving the adoption of the latest technologies in our solutions, bringing in thought leadership to guide clients on complex data management problems, and driving business performance. You will work with the leadership team to bring subject matter expertise in areas such as Big Data, ETL, Reporting, CRM, Data Warehousing, MDM, DevOps, Software Development, etc. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.

What You’ll Do
Lead end-to-end data management solution projects for multiple clients across data engineering and BI technologies. Create the project management plan and ensure adherence to project timelines. Integrate multiple data sources into one visualization to tell a story. Interact with customers to understand their business problems and provide best-in-class analytics solutions. Interact with Data Platform leaders and understand the data flows that feed Tableau/analytics. Understand data governance, quality, and security, and integrate analytics with these enterprise platforms. Interact with global UX/UI functions and design best-in-class visualizations for customers, harnessing all product capabilities.

Must have
7-10 years of experience in data warehousing and data engineering. Experience interacting with Life Science clients directly, discussing requirements, and managing stakeholders. Experience in requirement gathering and designing enterprise warehouse solutions from scratch. Hands-on experience with ETL tools like ADF, Databricks, and Informatica; experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow; experience with data warehouses: SQL/NoSQL, Amazon Redshift, Snowflake, Apache Hive, HDFS, etc. BI tools knowledge and experience in leading the implementation of dashboards. Deep understanding of data governance and data quality management frameworks. Strong communication and presentation skills with a strong problem-solving attitude. Excellent analytical, problem-solving, and debugging skills, with a strong ability to quickly learn and comprehend business processes and problems in order to develop effective technical solutions.

Skills: MDM, SQL, HDFS, data warehousing, big data, DevOps, cloud, Amazon Redshift, Snowflake, pharmaceutical consulting, data management, Apache Hive, Azure, reporting, problem-solving, Luigi, Informatica, analytical skills, presentation skills, data governance, ADF, data engineering, CRM, Databricks, BI technologies, Airflow, team management, business technology, AWS, Azkaban, software development, ETL, client management, data quality management, life science
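For readers unfamiliar with the workflow-management tools named above (Azkaban, Luigi, Airflow), a minimal Airflow DAG sketch follows; the DAG id, schedule, and task bodies are placeholders, not ProcDNA's actual pipeline:

```python
# Minimal two-task ETL DAG for Apache Airflow 2.x.
# On Airflow < 2.4, use schedule_interval instead of schedule.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system (placeholder)")

def load():
    print("write rows to the warehouse (placeholder)")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # load runs only after extract succeeds
```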

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Pune, Gurugram

Work from Office

Job Description About the Company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Azure Data Engineer. Experience: 4+ years. Skill Set: Azure Synapse, PySpark, ADF, and SQL. Location: Pune, Hyderabad, Gurgaon.

5+ years of experience in software development, technical operations, and running large-scale applications. 4+ years of experience in developing or supporting Azure Data Factory (API/APIM), Azure Databricks, Azure DevOps, Azure Data Lake Storage (ADLS), SQL and Synapse data warehouse, and Azure Cosmos DB. 2+ years of experience working in data engineering. Any experience with data virtualization products like Denodo is desirable. An Azure Data Engineer or Solutions Architect certification is desirable. Should have a good understanding of container platforms like Docker and Kubernetes. Should be able to assess the application/platform from time to time for architectural improvements and provide inputs to the relevant teams. Very good troubleshooting skills (quick identification of application issues and quick resolutions with no or minimal user/business impact). Hands-on experience working with high-volume, mission-critical applications. Deep appreciation of IT tools, techniques, systems, and solutions. Excellent communication skills, along with experience driving triage calls involving different technical stakeholders. Creative problem-solving skills for cross-functional issues amidst changing priorities. Should be flexible and resourceful enough to swiftly manage changing operational goals and demands. Good experience in handling escalations, taking complete responsibility and ownership of all critical issues through to a technical/logical closure. Good understanding of the IT Infrastructure Library (ITIL) framework and the various IT Service Management (ITSM) tools available in the marketplace.
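A rough PySpark sketch of the Synapse/Databricks skill set named above is shown below; the ADLS container names, column names, and aggregation are placeholders, not GSPANN's environment:

```python
# Batch transformation sketch: read raw orders from ADLS, aggregate daily
# revenue per region, and write a curated Parquet dataset. Paths are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-rollup").getOrCreate()

orders = spark.read.parquet(
    "abfss://raw@<storage>.dfs.core.windows.net/orders/"  # placeholder path
)
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@<storage>.dfs.core.windows.net/daily_sales/"  # placeholder
)
```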

Posted 1 month ago

Apply

4.0 - 7.0 years

8 - 15 Lacs

Hyderabad

Hybrid

We are seeking a highly motivated Senior Data Engineer or Data Engineer within Envoy Global's tech team to join us on a full-time, permanent basis. This role is responsible for designing, developing, and documenting data pipelines and ETL jobs to enable data migration, data integration, and data warehousing. That includes ETL jobs, reports, dashboards, and data pipelines. The person in this role will work closely with the Data Architect, the BI & Analytics team, and Engineering teams to deliver data assets for Data Security, DW, and Analytics.

As our Senior Data Engineer or Data Engineer, you will be required to:
Design, build, test, and maintain cloud-based data pipelines to acquire, profile, cleanse, consolidate, transform, and integrate data. Design and develop ETL processes for the Data Warehouse lifecycle (staging of data, ODS data integration, EDW, and data marts) and Data Security (data archival, data obfuscation, etc.). Build complex SQL queries on large datasets and performance-tune them as needed. Design and develop data pipelines and ETL jobs using SSIS and Azure Data Factory. Maintain ETL packages and supporting data objects for our growing BI infrastructure. Carry out monitoring, tuning, and database performance analysis. Facilitate integration of our application with other systems by developing data pipelines. Prepare key documentation to support the technical design in technical specifications. Collaborate and work alongside other technical professionals (BI report developers, data analysts, the Architect). Communicate clearly and effectively with stakeholders.

To apply for this role, you should possess the following skills, experience, and qualifications:
Design, develop, and document data pipelines and ETL jobs: create and maintain robust data pipelines and ETL (Extract, Transform, Load) processes to support data migration, integration, and warehousing. Data assets delivery: collaborate with Data Architects, BI & Analytics teams, and Engineering teams to deliver high-quality data assets for data security, data warehousing (DW), and analytics. ETL jobs, reports, dashboards, and data pipelines: develop and manage ETL jobs, generate reports, create dashboards, and ensure the smooth operation of data pipelines. 3+ years of experience as an SSIS ETL developer, Data Engineer, or a related role. 2+ years of experience using Azure Data Factory. Knowledgeable in data modelling and data warehouse concepts. Experience working with the Azure stack. Demonstrated ability to write SQL/T-SQL queries to retrieve and modify data. Knowledge and know-how to troubleshoot potential issues, and experience with best practices around database operations. Ability to work in an Agile environment.

Should you have a deep passion for technology and a desire to thrive in a rapidly evolving and creative environment, we would be delighted to receive your application. Please provide your updated resume, highlighting your relevant experience and the reasons you believe you would be a valuable member of our team. We look forward to reviewing your submission.
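The "complex SQL queries on large datasets" item above might look like the following Python/pyodbc sketch; the connection string, table, and column names are assumptions for illustration only:

```python
# Parameterized T-SQL query against SQL Server via pyodbc.
# Connection details are placeholders; real values would come from config.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;"
    "Trusted_Connection=yes;"  # placeholder connection string
)
cursor = conn.cursor()
cursor.execute(
    """
    SELECT TOP (100) customer_id, SUM(amount) AS total
    FROM dbo.Orders
    WHERE order_date >= ?
    GROUP BY customer_id
    ORDER BY total DESC
    """,
    "2024-01-01",  # bound parameter, never string-interpolated
)
for row in cursor.fetchall():
    print(row.customer_id, row.total)
conn.close()
```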

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Microsoft. Management Level: Senior Associate.

Job Description & Summary
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
We are seeking a developer to design, develop, and maintain data ingestion processes for a data platform built on Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills. Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. A B.Tech degree and 5+ years of ETL development experience in the Microsoft data track are required, along with demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.

Mandatory Skill Sets: ETL Development. Preferred Skill Sets: Microsoft Stack. Years of Experience Required: 4+. Education Qualification: B.Tech/B.E./MCA. Degrees/Field of Study required: Bachelor of Engineering. Required Skills: ETL Development. Optional Skills: Microsoft Stack.
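A stripped-down illustration of a single ETL step of the kind described above follows, assuming hypothetical file paths and cleansing rules:

```python
# Extract a CSV, cleanse and conform it, and load it as Parquet for staging.
# to_parquet requires pyarrow (or fastparquet) to be installed.
import pandas as pd

raw = pd.read_csv("landing/customers.csv")  # extract (assumed path)
clean = (
    raw
    .dropna(subset=["customer_id"])         # reject rows missing the key
    .assign(email=lambda df: df["email"].str.lower().str.strip())
    .drop_duplicates(subset=["customer_id"])
)
clean.to_parquet("staging/customers.parquet", index=False)  # load (assumed path)
print(f"loaded {len(clean)} rows")
```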

Posted 1 month ago

Apply

3.0 - 10.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services. Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics. Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications. Develop best practices, including reusable code, libraries, patterns, and consumable frameworks, for cloud-based data warehousing and ETL. Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards. Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks. Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained. Work with other members of the project team to support delivery of additional project components (API interfaces). Evaluate the performance and applicability of multiple tools against customer requirements. Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints. Integrate Databricks with other technologies (ingestion tools, visualization tools).

Requirements
Proven experience working as a data engineer. Highly proficient in using the Spark framework (Python and/or Scala). Extensive knowledge of data warehousing concepts, strategies, and methodologies. Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks). Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics. Experience in designing and hands-on development of cloud-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Design and building of data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Thorough understanding of Azure cloud infrastructure offerings. Strong experience in common data warehouse modeling principles, including Kimball. Working knowledge of Python is desirable. Experience developing security models. A Databricks & Azure Big Data Architecture certification would be a plus. Must be team-oriented with strong collaboration, prioritization, and adaptability skills.

Mandatory Skill Sets: ADE, ADB, ADF. Preferred Skill Sets: ADE, ADB, ADF. Years of Experience Required: 3-10 years. Education Qualification: BE, B.Tech, MCA, M.Tech. Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Bachelor of Technology. Required Skills: Azure Data Factory, Data Engineering, Microsoft Azure Databricks.
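To illustrate the "streaming ingestion" method mentioned above, here is a minimal Spark Structured Streaming sketch using the built-in rate source so it runs anywhere; a real pipeline would read from Event Hubs or Kafka instead (that substitution is an assumption):

```python
# Windowed count over a synthetic event stream with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

# The rate source emits (timestamp, value) rows at a fixed pace.
events = spark.readStream.format("rate").option("rowsPerSecond", 5).load()
counts = (
    events
    .withWatermark("timestamp", "30 seconds")
    .groupBy(F.window("timestamp", "10 seconds"))
    .count()
)
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination(30)  # run briefly for the demo, then return
```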

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies