
1647 ADF Jobs - Page 22

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Senior Data Engineer
Experience: 7+ years
Employment Type: Full-time

Job Summary
We are seeking a skilled Senior Data Engineer with strong experience building scalable data pipelines and transforming complex data across enterprise platforms. The ideal candidate has hands-on expertise in Databricks, PySpark, SQL, and ETL/ELT tools such as Informatica, AWS Glue, or Dataproc. Experience with cloud data warehouses such as Snowflake, BigQuery, or Delta Lake, and a strong understanding of data security, compliance, and DevOps, is essential. Domain knowledge in banking, financial services, or cybersecurity is highly desirable.

Key Responsibilities
- Design, build, and optimize secure data pipelines for large-scale data processing.
- Develop ETL/ELT jobs and implement Data Quality (DQ) rules within Databricks and Aurora platforms.
- Collaborate with Data Architects, DQ Analysts, and Cyber SMEs in Agile POD teams.
- Manage data modeling, performance tuning, and infrastructure cost optimization.
- Support data governance, DQ controls (e.g., BCBS 239, DUSE, DMOVE), and compliance reporting.
- Document architecture and test strategies; ensure code quality and scalability.

Required Skills
- Strong proficiency in Databricks, PySpark, and SQL
- Experience with ETL tools (e.g., Glue, Dataproc, ADF, Informatica)
- Cloud experience with AWS, Azure, or GCP
- Hands-on data modeling, DQ implementation, and performance tuning
- Understanding of data security, encryption, and risk controls
- Excellent communication and stakeholder collaboration skills

Preferred Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in banking, financial services, or cybersecurity domains
- Familiarity with DUSE/DMOVE frameworks and cybersecurity metrics reporting
- Certification in cloud or data engineering tools is a plus

Skills: Databricks, PySpark, SQL, ETL

Posted 1 month ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Greetings from TCS! TCS is hiring for Azure Data Engineer.

Job Role: Azure Data Engineer
Experience Range: 8+ years
Job Location: Noida / Chennai
Interview Mode: Virtual (MS Teams)

Responsibilities / Expectations:
- Develop, manage, and optimize robust, reliable data pipelines using Azure-native capabilities.
- Implement ADF workflows that perform data ingestion, data integration/ETL, statistical model execution, etc.
- Create architectures for data solutions with high performance characteristics.
- Bring data and analytics products to production.
- Implement CI/CD pipelines for data solutions.
- Build dashboards for data stewards and business reporting.
- Design and build RDBMS data models.

Added Advantage:
- Python
- Azure data engineer certification

TCS Eligibility Criteria:
- BE/B.Tech/MCA/M.Sc./MS with a minimum of 3 years of relevant IT experience post qualification.
- Only full-time courses will be considered.

Referrals are always welcome! Kindly do not apply if you have already attended an interview within the last month.

Thanks & Regards,
Jerin L Varghese

Posted 1 month ago

Apply

4.0 - 9.0 years

13 - 23 Lacs

Pune, Chennai, Bengaluru

Hybrid

Skills: ADF, Snowflake, SQL

Interested candidates, please share your resume at juisagars@hexaware.com with the following details:
- Total experience
- Relevant experience
- Current company
- Current CTC
- Expected CTC
- Notice period / LWD

Posted 1 month ago

Apply

12.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Sr. Manager – Azure Data Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The Opportunity
We're looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm and a growing Data and Analytics team.

Your Key Responsibilities
- Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance, and data management.
- Interact with senior client technology leaders to understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
- Define and develop client-specific best practices around data management within a cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse.
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.
- Manage teams, with experience in end-to-end delivery.
- Build technical capability and teams to deliver.

Skills and Attributes for Success
- Strong understanding of and familiarity with all cloud ecosystem components.
- Strong understanding of underlying cloud architectural concepts and distributed computing paradigms.
- Experience in the development of large-scale data processing.
- Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, and SQL.
- Hands-on expertise in cloud services such as AWS and/or the Microsoft Azure ecosystem.
- Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance.
- Experience with BI and data analytics databases.
- Experience converting business problems/challenges into technical solutions considering security, performance, scalability, etc.
- Experience in enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Strong stakeholder, client, team, process, and delivery management skills.

To Qualify for the Role, You Must Have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude; you enjoy working in a cooperative and collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Minimum 12 years of hands-on experience in one or more of the above areas.
- Minimum 14 years of industry experience.

Ideally, You'll Also Have
- Project management skills
- Client management skills
- Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working at EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

14.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while optimizing processes for efficiency and client satisfaction. The role requires a deep understanding of IT services, operational excellence, and client-centric solutions.

Job Requirements and Preferences
- Minimum Degree Required: Bachelor's degree in Information Technology, Data Science, Computer Science, Statistics, or a related field (Master's degree preferred).
- Minimum Years of Experience: 14 years, with at least 3 years in a managerial or leadership role.
- Proven experience managing data analytics services for external clients, preferably within a managed services or consulting environment.
- Technical Skills: Experience and know-how with a combination/subset of the tools and technologies listed below:
- Proficiency in data analytics tools (e.g., Power BI, Tableau, QlikView), data integration tools (ETL, Informatica, Talend, Snowflake, etc.), and programming languages (e.g., Python, R, SAS, SQL).
- Strong understanding of Data & Analytics cloud platform services (e.g., AWS, Azure, GCP), such as AWS Glue, EMR, ADF, Redshift, Synapse, and BigQuery, and big data technologies (e.g., Hadoop, Spark).
- Familiarity with traditional data warehousing tools such as Teradata and Netezza.
- Familiarity with machine learning, AI, and automation in data analytics.
- Certification in data-related disciplines preferred.
- Leadership: Demonstrated ability to lead teams, manage complex projects, and deliver results.
- Communication: Excellent verbal and written communication skills, with the ability to present complex information to non-technical stakeholders.

Roles & Responsibilities
Demonstrates intimate abilities and/or a proven record of success as a team leader, emphasizing the following:

Client Relationship Management
- Serve as the focal point for client interactions, maintaining strong relationships.
- Manage client escalations and ensure timely resolution of issues.
- Act as the face of the team for strategic client discussions, governance, and regular cadence with the client.

Service Delivery Management
- Lead end-to-end delivery of managed data analytics services to clients, ensuring projects meet business requirements, timelines, and quality standards.
- Deliver minor enhancements and bug fixes aligned to the client's service delivery model.
- Set up Incident Management and Problem Management processes for the engagement.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to deliver end-to-end solutions.
- Monitor, manage, and report service-level agreements (SLAs) and key performance indicators (KPIs).
- Apply solid financial acumen and budget management experience.
- Exercise problem-solving and decision-making skills, with the ability to think strategically.

Operational Excellence & Practice Growth
- Implement and oversee standardized processes, workflows, and best practices to ensure efficient operations.
- Utilize tools and systems for service monitoring, reporting, and automation to improve service delivery.
- Drive innovation and automation in data integration, processing, analysis, and reporting workflows.
- Keep up to date with industry trends, emerging technologies, and regulatory requirements impacting managed services.

Risk and Compliance
- Ensure data security, privacy, and compliance with relevant standards and regulations.
- Ensure all managed services are delivered in compliance with relevant regulatory requirements and industry standards.
- Proactively identify and mitigate operational risks that could affect service delivery.

Team Leadership & Development
- Lead and mentor a team of service managers and technical professionals to ensure high performance and continuous development.
- Foster a culture of collaboration, accountability, and excellence within the team.
- Ensure the team is trained on the latest industry best practices, tools, and methodologies.
- Manage capacity, with practice development experience and a strong understanding of agile practices, cloud platforms, and infrastructure management.

Pre-Sales Experience
- Collaborate with sales teams to identify opportunities for growth and expansion of services.
- Experience solutioning responses and operating models, including estimation frameworks, content contribution, and solution architecture in responding to RFPs.

Posted 1 month ago

Apply

8.0 - 12.0 years

25 - 40 Lacs

Hyderabad

Remote

Job Title: SQL Database Architect (SQL Server, Azure SQL Server, SSIS, SSRS, Data Migration, Azure Data Factory, Power BI)

Job Summary
We are seeking a highly skilled SQL Database Architect with expertise in SQL Server, Azure SQL Server, SSIS, SSRS, data migration, and Power BI to design, develop, and maintain scalable database solutions. The ideal candidate will have experience in database architecture, data integration, ETL processes, cloud-based solutions, and business intelligence reporting. Excellent communication and documentation skills are essential for collaborating with cross-functional teams and maintaining structured database records.

Key Responsibilities
- Database Design & Architecture: Develop highly available, scalable, and secure database solutions using Azure SQL Server.
- ETL & Data Integration: Design and implement SSIS packages for data movement, transformation, and automation.
- Data Migration: Oversee database migration projects, including on-premises-to-cloud and cloud-to-cloud transitions, data conversion, and validation processes.
- Azure Data Factory (ADF): Build and manage data pipelines, integrating various data sources and orchestrating ETL workflows in cloud environments.
- Reporting & Business Intelligence: Develop SSRS reports and leverage Power BI to create interactive dashboards and data visualizations.
- Performance Optimization: Analyze and optimize query performance, indexing strategies, and database configurations.
- Cloud Integration: Architect Azure-based database solutions, including Azure SQL Database, Managed Instances, and Synapse Analytics.
- Security & Compliance: Ensure data security, encryption, and compliance with industry standards.
- Backup & Disaster Recovery: Design and implement backup strategies, high availability, and disaster recovery solutions.
- Automation & Monitoring: Utilize Azure Monitor, SQL Profiler, and other tools to automate and monitor database performance.
- Collaboration & Communication: Work closely with developers, BI teams, DevOps, and business stakeholders, explaining complex database concepts clearly and concisely.
- Documentation & Best Practices: Maintain comprehensive database documentation, including design specifications, technical workflows, and troubleshooting guides.

Required Skills & Qualifications
- Expertise in SQL Server and Azure SQL Database.
- Experience with SSIS for ETL processes, data transformations, and automation.
- Proficiency in SSRS for creating, deploying, and managing reports.
- Strong expertise in data migration, including cloud and on-premises database transitions.
- Power BI skills for developing dashboards, reports, and data visualizations.
- Database modeling, indexing, and query optimization expertise.
- Knowledge of cloud-based architecture, including Azure SQL Managed Instance.
- Proficiency in T-SQL, stored procedures, triggers, and database scripting.
- Understanding of security best practices, including role-based access control (RBAC).
- Excellent communication skills to explain database solutions to technical and non-technical stakeholders.
- Strong documentation skills to create and maintain database design specs, process documents, and reports.

Preferred Qualifications
- Knowledge of CI/CD pipelines for database deployments.
- Familiarity with Power BI and other data visualization tools.
- Experience with Azure Data Factory and Synapse Analytics for advanced data engineering workflows.

Qualifications
Bachelor's or Master's degree in Business, Computer Science, Engineering, or a related field.

Posted 1 month ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance, and delivery risk. Your work will involve continuous improvement and optimisation of managed services processes, tools, and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership, and consistently deliver quality work that drives value for our clients and success as a team.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect on, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g., specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Role: Specialist
Tower: Data Analytics & Insights Managed Service
Experience: 1 - 3 years
Key Skills: Data Engineering
Educational Qualification: Bachelor's degree in Computer Science/IT or a relevant field
Work Location: Bangalore, India

Job Description
As a Specialist, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include, but are not limited to:
- Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
- Be flexible to work in stretch opportunities/assignments.
- Demonstrate critical thinking and the ability to bring order to unstructured problems.
- Review ticket quality and deliverables; provide status reporting for the project.
- Adhere to SLAs; experience in incident management, change management, and problem management.
- Review your work and that of others for quality, accuracy, and relevance.
- Know how and when to use the tools available for a given situation, and explain the reasons for that choice.
- Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
- Use straightforward, structured communication when influencing and connecting with others.
- Read situations and modify behavior to build quality relationships.
- Uphold the firm's code of ethics and business conduct.
- Demonstrate leadership capabilities by working with clients directly and leading the engagement.
- Work in a team environment that includes client interactions, workstream management, and cross-team collaboration; be a good team player.
- Take up cross-competency work and contribute to COE activities.
- Handle escalation and risk management.

Position Requirements
Required Skills:
- Primary Skills: ETL/ELT, SQL, Informatica, Python
- Secondary Skills: Azure/AWS/GCP, Talend, DataStage, etc.

Data Engineer
- Minimum 1 year of Operate/Managed Services/Production Support experience.
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
- Design and implement data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
- Experience building efficient ETL/ELT processes using industry-leading tools such as Informatica, Talend, SSIS, SSRS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, and Python.
- Hands-on experience with data analytics tools such as Informatica, Collibra, Hadoop, Spark, and Snowflake.
- Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage.
- Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data.
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Scale and optimize schemas, and performance-tune SQL and ETL pipelines in data lake and data warehouse environments.
- Experience with ITIL processes such as incident management, problem management, knowledge management, release management, and data DevOps.
- Strong communication, problem-solving, quantitative, and analytical abilities.

Nice to Have
- Certifications in cloud technology are an added advantage.
- Experience with visualization tools such as Power BI, Tableau, and Qlik.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide the Data, Analytics & Insights Managed Service, where we focus on the evolution of our clients' data, analytics, insights, and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a fast-paced work environment and are capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements, from not only a technical perspective but also a relationship perspective.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Sr. Manager – Azure Data Architect As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for Managers (Big Data Architects) with strong technology and data understanding having proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. 
Your Key Responsibilities Develop standardized practices for delivering new products and capabilities using Big Data & cloud technologies, including data acquisition, transformation, analysis, Modelling, Governance & Data management skills Interact with senior client technology leaders, understand their business goals, create, propose solution, estimate effort, build architectures, develop and deliver technology solutions Define and develop client specific best practices around data management within a cloud environment Recommend design alternatives for data ingestion, processing and provisioning layers Design and develop data ingestion programs to process large data sets in Batch mode using ADB, ADF, PySpark, Python, Snypase Develop data ingestion programs to ingest real-time data from LIVE sources using Apache Kafka, Spark Streaming and related technologies Have managed team and have experience in end to end delivery Have experience of building technical capability and teams to deliver Skills And Attributes For Success Strong understanding & familiarity with all Cloud Ecosystem components Strong understanding of underlying Cloud Architectural concepts and distributed computing paradigms Experience in the development of large scale data processing. Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, SQL Hands-on expertise in cloud services like AWS, and/or Microsoft Azure eco system Solid understanding of ETL methodologies in a multi-tiered stack with Data Modelling & Data Governance Experience with BI, and data analytics databases Experience in converting business problems/challenges to technical solutions considering security, performance, scalability etc. Experience in Enterprise grade solution implementations. 
Experience in performance benchmarking of enterprise applications
Strong stakeholder, client, team, process, and delivery management skills

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent communication skills (written and verbal, formal and informal)
Ability to multi-task under pressure and work independently with minimal supervision
A team-player mindset and enjoyment of working in a cooperative, collaborative environment
Adaptability to new technologies and standards
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
Minimum 12 years of hands-on experience in one or more of the above areas
Minimum 14 years of industry experience

Ideally, you’ll also have
Project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
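The batch-ingestion responsibility above is typically implemented in PySpark on ADB, orchestrated by ADF. As a minimal illustration of the same read-validate-deduplicate pattern using only the Python standard library (the data, column names, and DQ rule here are hypothetical, not from the posting):

```python
import csv
import io

# Hypothetical raw extract: in a real pipeline this would be a file in a data
# lake read by PySpark; here it is an in-memory CSV for illustration only.
RAW = """account_id,balance,country
A1,100.50,IN
A2,,US
A1,100.50,IN
A3,75.25,UK
"""

def ingest(raw_csv):
    """Read rows, apply a DQ rule (balance must be present), and deduplicate."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    valid = [r for r in rows if r["balance"].strip()]  # DQ rule: non-empty balance
    seen, deduped = set(), []
    for r in valid:                                    # dedupe on the full row
        key = tuple(r.values())
        if key not in seen:
            seen.add(key)
            deduped.append(r)
    return deduped

clean = ingest(RAW)
print(len(clean))  # 2 valid, distinct rows survive
```

In a Databricks job the same steps would be expressed as DataFrame filters and `dropDuplicates()`, with the schedule and dependencies handled by an ADF pipeline.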

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Before applying to a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity at one of the world's 500 largest organizations. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, drive, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking a Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB.
The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure.

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks
Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB)
Data Modeling: NoSQL data modeling, data warehousing concepts
Performance Optimization: Data pipeline performance tuning and cost optimization
Programming Languages: Python, SQL, PySpark

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation
Security and Compliance: Implementing data security and governance standards
Agile Methodologies: Experience in Agile/Scrum environments

Soft Skills
Strong problem-solving abilities and attention to detail
Excellent communication skills, both verbal and written
Effective time management and organizational capabilities
Ability to work independently and within a collaborative team environment
Strong interpersonal skills to engage with cross-functional teams

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
Relevant certifications in Azure and Data Engineering, such as:
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Solutions Architect Expert
Databricks Certified Data Engineer Associate or Professional

Employee Type
Permanent

At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking a Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives.
The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure.

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks
Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB)
Data Modeling: NoSQL data modeling, data warehousing concepts
Performance Optimization: Data pipeline performance tuning and cost optimization
Programming Languages: Python, SQL, PySpark

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation
Security and Compliance: Implementing data security and governance standards
Agile Methodologies: Experience in Agile/Scrum environments

Soft Skills
Strong problem-solving abilities and attention to detail
Excellent communication skills, both verbal and written
Effective time management and organizational capabilities
Ability to work independently and within a collaborative team environment
Strong interpersonal skills to engage with cross-functional teams

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
Relevant certifications in Azure and Data Engineering, such as:
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Solutions Architect Expert
Databricks Certified Data Engineer Associate or Professional

Employee Type
Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
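The "NoSQL data modeling" skill above usually means denormalizing for Cosmos DB: embedding related items in one document and choosing a partition key that co-locates the data you read together. A hedged sketch (all field names and the partition-key choice are illustrative assumptions, not the actual UPS schema):

```python
import json

# Hypothetical shipment document for a Cosmos DB container partitioned on
# /customerId: the one-to-few "items" relationship is embedded rather than
# joined, trading some duplication for single-read access to a shipment.
shipment = {
    "id": "SHP-1001",
    "customerId": "CUST-42",   # partition key: groups a customer's shipments together
    "status": "IN_TRANSIT",
    "items": [                 # embedded child records, no separate table
        {"sku": "BOX-S", "qty": 2},
        {"sku": "BOX-L", "qty": 1},
    ],
}

# Cosmos DB stores items as JSON; round-trip to confirm the shape serializes cleanly.
doc = json.dumps(shipment)
restored = json.loads(doc)
print(restored["customerId"], len(restored["items"]))  # CUST-42 2
```

Embedding suits bounded, read-together relationships; unbounded or independently updated relationships would instead be referenced by id across documents.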

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Greetings from Tata Consultancy Services!

Job Openings at TCS
Skill: Azure Data Engineer
Experience range: 4 to 10 years
Role: Permanent
Preferred location: Kolkata
Interview date: 12-July-2025

Please find the job description below.
Role: Azure Data Engineer
Required technical skillset: Azure Databricks, PySpark, Azure Data Lake Storage Gen2/SQL DB, Azure Data Factory, Storage Account/Key Vault, Azure DevOps, SQL; Azure Data PaaS components (ADF, ADB, Data Lake); PL/SQL. Hands-on experience with Azure data technologies and PySpark.

Good-to-Have
1. Good communication skills
2. Azure DevOps
3. SQL Server

If you are interested in the above opportunity, kindly share your updated resume to sivabhavani.t@tcs.com immediately with the details below (mandatory):
Name:
Contact No.:
Email id:
Total exp:
Relevant exp:
Full-time highest qualification (year of completion with percentage scored):
Current organization details (payroll company):
Current CTC:
Expected CTC:
Notice period:
Current location:
Any gaps in your education or career (if yes, please specify the duration):
Available for a face-to-face interview on 12-July-2025 in Kolkata (Yes/No):

Posted 1 month ago

Apply

40.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

The Fusion Supply Chain / Manufacturing Support Team is expanding to support our rapidly increasing customer base in the Cloud (SaaS), as well as growing numbers of on-premise accounts. The team partners with Oracle Development in supporting early adopters and many other new customers. This is a unique opportunity to be part of the future of Oracle Support and help shape the product and the organization to benefit our customers and our employees. This position is for supporting Fusion Applications, particularly the Fusion SCM modules: Fusion SCM Planning, Fusion SCM Manufacturing, and Fusion SCM Maintenance.
Research, resolve, and respond to complex issues across the Application product lines and product boundaries in accordance with current standards
Demonstrate strong follow-through and consistently keep commitments to customers and employees
Ensure that each and every customer is handled with a consummately professional attitude and the highest possible level of service
Take ownership and responsibility for priority customer cases where and when required
Review urgent and critical incidents for quality
Hold queue reviews with analysts to ensure quality and efficiency of support
Report high-visibility cases, escalations, and customer trends to management
Act as an information resource to the management team
Contribute to an environment that encourages information sharing, team-based resolution activity, cross-training, and an absolute focus on resolving customer cases as quickly and effectively as possible
Participate in projects that enhance the quality or efficiency of support
Participate in system and release testing, as needed
Act as a role model and mentor for other analysts
Perform detailed technical analysis and troubleshooting using SQL, PL/SQL, Java, ADF, Redwood, VBCS, SOA, and REST APIs
Participate in after-hours support as required
Work with Oracle Development/Support Development on product-related issues
Demonstrate core competencies (employ sound business judgment, find creative and innovative ways to solve problems, show a strong work ethic, and do whatever it takes to get the job done)

Business process and functional knowledge required for our support organization for the Maintenance Module
Asset Management: Oversee the entire lifecycle of physical assets to optimize utilization and visibility into maintenance operations. Track and manage enterprise-owned and customer-owned assets, including Install Base Assets.
Preventive Maintenance/Maintenance Program: Define and generate daily preventive maintenance forecasts for affected assets within maintenance-enabled organizations. Utilize forecasts to create preventive maintenance work orders, reducing workload for planners and enhancing program auditing, optimization, and exception management.
Work Definition: Identify and manage Maintenance Work Areas based on physical, geographical, or logical groupings of work centers. Define and manage resources, work centers, and standard operations. Develop reusable operation templates (standard operations) detailing operation specifics and required resources. Apply standard operations to multiple maintenance work definitions and work orders.
Work Order Creation, Scheduling, and Dispatch: Track material usage and labour hours against planned activities. Manage component installation and removal. Conduct inspections and ensure seamless execution of work orders.
Work Order Transactions: Apply knowledge of operation pull, assembly pull, and backflush concepts. Execute operation transactions to update dispatch status in count point operations. Manage re-sequenced operations within work order processes. Charge maintenance work orders for utilized resources and ensure accurate transaction recording.

Technical skills required for our support organization for the Maintenance Module
SQL and PL/SQL
REST APIs: creating them, using different methods, and testing via Postman
Knowledge of JSON format
Knowledge of WSDL, XML, and SOAP web services
Oracle SOA: composites, business events, debugging via SOA composite traces and logs
Java and Oracle ADF
Oracle Visual Builder Studio (good to have)
Page Composer (Fusion Apps): customize existing UI (good to have)
Application Composer (Fusion Apps): sandbox, creating custom objects and fields, dynamic page layouts, and Object Functions (good to have)

Career Level - IC3

Responsibilities
As a Sr.
Support Engineer, you will be the technical interface to customers for resolution of problems related to the maintenance and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms. You should also be highly experienced in multiple platforms and able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues. The responsibilities, business process knowledge, and technical skills for this role are as described above.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law.
Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
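The REST/JSON skill set listed above centers on building and testing JSON payloads against work-order endpoints (e.g. with Postman). A hedged sketch of constructing and sanity-checking such a payload before sending it; the field names and required-field rule are illustrative assumptions, not the actual Fusion SCM REST schema:

```python
import json

# Hypothetical required fields for creating a maintenance work order.
REQUIRED = {"AssetNumber", "OrganizationCode", "WorkOrderType"}

def build_payload(asset, org, wo_type):
    """Assemble a request body for a (hypothetical) work-order POST."""
    return {"AssetNumber": asset, "OrganizationCode": org, "WorkOrderType": wo_type}

def validate(payload):
    """Fail fast on missing fields, then serialize to the JSON wire format."""
    missing = REQUIRED - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return json.dumps(payload)

body = validate(build_payload("PUMP-01", "MNT01", "CORRECTIVE"))
print(body)
```

In practice the serialized body would be POSTed with an HTTP client, and the same validation logic reused when debugging payloads captured from Postman.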

Posted 1 month ago

Apply

8.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Job Summary
We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.

Key Responsibilities
Design, develop, and execute comprehensive test plans for ETL and database validation processes
Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI)
Perform manual testing and defect tracking using Zephyr or Tosca
Analyze business and data requirements to ensure full test coverage
Write and execute complex SQL queries for data reconciliation
Identify data-related issues and conduct root cause analysis in collaboration with developers
Track and manage bugs and enhancements through appropriate tools
Optimize testing strategies for performance, scalability, and accuracy in ETL workflows

Mandatory Skills
ETL Tools: Talend, ADF
Data Platforms: Snowflake
Reporting/Analytics: Power BI, VPI
Testing Tools: Zephyr or Tosca, manual testing
Strong SQL expertise for validating complex data workflows

Good-to-Have Skills
API testing exposure
Power BI advanced features (dashboards, DAX, data modelling)
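The SQL data-reconciliation responsibility above typically boils down to two checks: row counts match, and row contents match. A minimal sketch using an in-memory SQLite database to stand in for hypothetical staging and warehouse tables (in practice the same queries would run against Snowflake):

```python
import sqlite3

# Stand-in source ("src") and target ("tgt") tables; names and data are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 99.0);   -- row 2 drifted, row 3 missing
""")

# Check 1: row counts.
src_n = con.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_n = con.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# Check 2: source rows absent from (or different in) the target.
diff = con.execute(
    "SELECT id, amount FROM src EXCEPT SELECT id, amount FROM tgt ORDER BY id"
).fetchall()

print(src_n, tgt_n, diff)  # 3 2 [(2, 20.0), (3, 30.0)]
```

The `EXCEPT` direction matters: running it both ways (source-minus-target and target-minus-source) distinguishes dropped rows from spurious ones.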

Posted 1 month ago

Apply

6.0 years

7 - 8 Lacs

Hyderābād

On-site

Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
The Azure Big Data Engineer is an important role in which you are responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager, and the role is hybrid with two days per week working from the office in Hyderabad.

Responsibilities:
Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric
Develop, configure, and optimize data lakes and warehouses on Azure using services like Azure Data Lake Storage (ADLS), Azure Lakehouse, and Warehouse, and monitor data pipelines for performance, scalability, and reliability
Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs
Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments
Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems
Troubleshoot and resolve data-related issues in a timely manner
Produce clear, concise technical documentation for all developed solutions

Requirements:
Experience with SSIS, SQL Jobs, BIDS, and ADF
Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.)
Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage
Experience in data modeling, data architecture, and implementing ETL/ELT solutions
Proficiency in SQL and familiarity with other programming languages such as Python or Scala
Knowledge of data modeling, data warehousing, and big data technologies
Experience with data governance and security best practices

Qualifications
Bachelor's degree in computer science or a related field; Master's preferred
6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, and Synapse
2+ years of Azure cloud experience

Preferred:
Experience with Microsoft Fabric and Azure Synapse
Understanding of ML/AI concepts
Azure certifications

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social media or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer.
Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off.

#LI-Hybrid

Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here.
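The monitoring-and-alerting responsibility in the Experian role above is usually handled with Azure Monitor and Log Analytics; as a hedged, platform-free sketch of the underlying idea (time each pipeline step and flag slow ones), where the step name, threshold, and alert sink are all illustrative assumptions:

```python
import time

def run_step(name, fn, alerts, threshold_s=1.0):
    """Run one pipeline step, timing it and recording an alert if it is slow."""
    start = time.perf_counter()
    result = fn()
    elapsed = time.perf_counter() - start
    if elapsed > threshold_s:
        # In Azure this would raise a Monitor alert; here it just collects a message.
        alerts.append(f"{name} exceeded {threshold_s}s ({elapsed:.2f}s)")
    return result

alerts = []
total = run_step("aggregate_daily", lambda: sum(range(100)), alerts)
print(total, alerts)  # 4950 []  (a fast step triggers no alert)
```

The same wrapper pattern extends naturally to counting rows processed or catching and reporting exceptions per step.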

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderābād

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure
Design and develop Azure Databricks processes using PySpark/Spark-SQL
Design and develop orchestration jobs using ADF and Databricks Workflows
Analyze data engineering processes being developed, act as an SME to troubleshoot performance issues, and suggest solutions to improve them
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
Build test frameworks for Databricks notebook jobs for automated testing before code deployment
Design and build POCs to validate new ideas, tools, and architectures in Azure
Continuously explore new Azure services and capabilities; assess their applicability to business needs
Create detailed documentation for cloud processes, architecture, and implementation patterns
Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
Contribute to full-lifecycle project implementations, from design and development to deployment and monitoring
Ensure solutions adhere to security, compliance, and governance standards
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
Identify solutions to non-standard requests and problems
Support and maintain the self-service BI warehouse
Mentor and support existing on-prem developers in the cloud environment
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Undergraduate degree or equivalent experience
4+ years of overall experience in Data & Analytics engineering
4+ years of experience working with Azure, Databricks, ADF, and Data Lake
Solid experience working with data platforms and products using PySpark and Spark-SQL
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
High proficiency in Python and SQL
Proven excellent communication skills

Preferred Qualifications:
Snowflake, Airflow experience
Power BI development experience
Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone – of every race, gender, sexuality, age, location and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

#NIC
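The "test framework for Databricks notebook jobs" responsibility in the Optum role above usually starts with factoring the transformation out of the notebook into a plain function that can be asserted on small inputs before deployment. A hedged sketch, shown on lists of dicts (on Databricks the same logic would be a PySpark DataFrame operation; the claim fields and threshold are hypothetical):

```python
def flag_high_cost_claims(claims, threshold=1000.0):
    """Add a boolean 'high_cost' field to each claim record."""
    return [dict(c, high_cost=c["amount"] > threshold) for c in claims]

# Tiny fixture a test suite could run on every commit, before the notebook
# version of the job is deployed.
sample = [
    {"claim_id": "C1", "amount": 250.0},
    {"claim_id": "C2", "amount": 4000.0},
]
out = flag_high_cost_claims(sample)
print([c["high_cost"] for c in out])  # [False, True]
```

Keeping the transformation callable outside the notebook is what makes it runnable in a CI job (Jenkins/GitHub Actions, as listed above) rather than only on a cluster.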

Posted 1 month ago

Apply

2.0 - 3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: AI/GenAI Engineer
Job ID: POS-13731
Primary Skill: Databricks, ADF
Location: Hyderabad
Experience: 3.00
Secondary skills: Python, LLM, Langchain, Vectors, and AWS
Mode of Work: Work from Office
Experience: 2-3 Years
About The Job: We are seeking a highly motivated and innovative Generative AI Engineer to join our team and drive the exploration of cutting-edge AI capabilities. You will be at the forefront of developing solutions using Generative AI technologies, primarily focusing on Large Language Models (LLMs) and foundation models, deployed on either AWS or Azure cloud platforms. This role involves rapid prototyping, experimentation, and collaboration with various stakeholders to assess the feasibility and potential impact of GenAI solutions on our business challenges. If you are passionate about the potential of GenAI and enjoy hands-on building in a fast-paced environment, this is the role for you.
Know Your Team: At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise. Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.
Responsibilities:
Develop GenAI Solutions: Develop and rapidly iterate on GenAI solutions leveraging LLMs and other foundation models available on AWS and/or Azure platforms.
Cloud Platform Implementation: Utilize relevant cloud services (e.g., AWS SageMaker, Bedrock, Lambda, Step Functions; Azure Machine Learning, Azure OpenAI Service, Azure Functions) for model access, deployment, and data processing.
Explore GenAI Techniques: Experiment with and implement techniques like Retrieval-Augmented Generation (RAG), evaluating the feasibility of model fine-tuning or other adaptation methods for specific PoC requirements.
API Integration: Integrate GenAI models (via APIs from cloud providers, OpenAI, Hugging Face, etc.) into prototype applications and workflows.
Data Handling for AI: Prepare, manage, and process data required for GenAI tasks, such as data for RAG indexes, datasets for evaluating fine-tuning feasibility, or example data for few-shot prompting.
Documentation & Presentation: Clearly document PoC architectures, implementation details, findings, limitations, and results for both technical and non-technical audiences.
Requirements:
Overall 2-3 years of experience
Expert in Python with advanced programming concepts
Solid understanding of Generative AI concepts, including LLMs, foundation models, prompt engineering, embeddings, and common architectures (e.g., RAG)
Demonstrable experience working with at least one major cloud platform (AWS or Azure)
Hands-on experience using cloud-based AI/ML services relevant to GenAI (e.g., AWS SageMaker, Bedrock; Azure Machine Learning, Azure OpenAI Service)
Experience interacting with APIs, particularly AI/ML model APIs
Bachelor's degree in computer science, AI, Data Science, or equivalent practical experience
About The Company: Headquartered in New Jersey, US, ValueMomentum is the largest standalone provider of IT Services and Solutions to Insurers. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver the value we promise and drive momentum to our customers' initiatives.
ValueMomentum is amongst the top 10 insurance-focused IT services firms in North America by number of customers. Leading insurance firms trust ValueMomentum with their Digital, Data, Core, and IT Transformation initiatives.
Benefits: We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some benefits available to you are:
Competitive compensation package
Career advancement: individual career development, coaching and mentoring programs for professional and leadership skill development
Comprehensive training and certification programs
Performance management: goal setting, continuous feedback and year-end appraisal
Reward & recognition for extraordinary performers
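The retrieval step of Retrieval-Augmented Generation (RAG), named in the responsibilities above, can be sketched without any cloud dependency. This is an illustrative sketch only: real systems use learned embeddings from a model or embedding API, not the bag-of-words stand-in below, and the documents and prompt are hypothetical.

```python
import math
from collections import Counter

# Illustrative sketch of RAG retrieval: rank documents by similarity to a
# query, then pass the top hit to an LLM as prompt context. The embedding
# here is a toy bag-of-words vector, a stand-in for a real embedding model.

def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    q = embed(query)
    scored = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return scored[:k]

docs = [
    "Policyholders can file a claim through the online portal.",
    "Premium payments are due on the first of each month.",
]
context = retrieve("how do I file a claim?", docs)[0]
# The retrieved context is then injected into the LLM prompt:
prompt = f"Answer using this context:\n{context}\n\nQuestion: how do I file a claim?"
print(context)
```

With a cloud service (e.g. Azure OpenAI or Bedrock), `embed` would call an embeddings endpoint and `prompt` would go to a chat-completion API; the ranking logic is the same.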

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title/Skill: System Administrator - Oracle WebLogic and SOA Administrator
Experience: 4-8 years
Location: Hyderabad
Required Skills:
Experience in Oracle WebLogic Server installation, configuration, performance tuning and troubleshooting
Knowledge and hands-on experience with Oracle FMW SOA / OSB administration, covering installation, configuration, performance tuning and troubleshooting
Experience in patching / upgrading Oracle FMW products (WebLogic, ADF, SOA, OSB, etc.)
Experience in troubleshooting WebLogic and SOA / OSB logs and working with the Oracle team for resolution

Posted 1 month ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineer Location: Pune, India (Hybrid) Type: Contract (6 Months) Experience: 5–8 Years Domain: Financial Services Work Timing: Regular Day Shift Background Check: Mandatory before onboarding Job Description: Seeking experienced Data Engineers with strong hands-on skills in SQL, Python, Azure Databricks, ADF, and PySpark. Candidates should have experience in data modeling, ETL design, big data technologies, and large-scale on-prem to cloud migrations using Azure data stack. Mandatory Skills: Azure Databricks Azure Data Factory Python PySpark Preferred Skills: Spark, Kafka Azure Synapse, Azure SQL, Azure Data Lake, Azure Cosmos DB Batch and real-time ingestion

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Experience level: 5 to 10 years
Strong in Azure and ADF, Azure Databricks or Microsoft Fabric, and data pipelines
At least 3 years of relevant experience in Azure Databricks.

Posted 1 month ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

The Azure Big Data Engineer is an important role, responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager; the role is hybrid, with 2 days per week working from the office in Hyderabad.
Responsibilities:
Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric
Develop, configure, and optimize data lakes and warehouses on Azure using services such as Azure Data Lake Storage (ADLS), Azure Lakehouse and Warehouse, and monitor data pipelines for performance, scalability, and reliability
Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs
Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments
Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems
Troubleshoot and resolve data-related issues in a timely manner
Produce clear, concise technical documentation for all developed solutions
Requirements:
Experience with SSIS, SQL Jobs, BIDS & ADF
Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.)
Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage
Experience in data modeling, data architecture, and implementing ETL/ELT solutions
Proficiency in SQL and familiarity with other programming languages such as Python or Scala
Knowledge of data modeling, data warehousing, and big data technologies
Experience with data governance and security best practices
About Experian: Experian is a global data and technology company, powering opportunities for people and businesses around the world.
We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As an FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.
Experience And Skills:
Bachelor's degree in computer science or a related field; Master's preferred
6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, Synapse
2+ years of Azure cloud experience
Experience with Microsoft Fabric and Azure Synapse
Understanding of ML/AI concepts
Azure certifications
Additional Information: Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For and Glassdoor Best Places to Work (globally, 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success.
Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
Benefits: Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.
Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian on our Careers Site.
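The monitoring and alerting responsibility described above often boils down to wrapping pipeline steps so transient failures retry and persistent failures raise an alert. The sketch below is a hedged illustration with hypothetical names: `send_alert` stands in for whatever channel a real deployment uses (an Azure Monitor action group, a Teams webhook, etc.), and `flaky_step` simulates a source that times out once.

```python
import time

# Sketch of pipeline monitoring/alerting: retry a step with exponential
# backoff, alert after the final failure. send_alert is a hypothetical hook.

def send_alert(message):
    print(f"ALERT: {message}")  # real code would post to an alerting channel

def run_with_retries(step, max_attempts=3, base_delay=0.01):
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == max_attempts:
                send_alert(f"step failed after {attempt} attempts: {exc}")
                raise
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff

calls = {"n": 0}

def flaky_step():
    """Simulated ingestion step that fails once, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source timeout")
    return "loaded 1000 rows"

print(run_with_retries(flaky_step))
```

In ADF or Databricks Workflows the retry policy is usually configured on the activity or task itself; the wrapper pattern applies when custom logic or custom alert routing is needed.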

Posted 1 month ago

Apply

8.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

Job Description: We are seeking a skilled and motivated Data Engineer with 8+ years of experience to join our team. The ideal candidate should be experienced in data engineering on Snowflake, Azure ADF, Microsoft MDS, SQL, and data pipelines, with a focus on developing and maintaining Data Analytics solutions. You will collaborate with cross-functional teams to deliver high-quality data solutions that meet business requirements.
Required Skills And Experience:
Bachelor's or Master's degree in computer science, Data Science, Engineering, or a related field
8+ years of experience in data engineering or related fields
Strong proficiency in SQL, Snowflake, stored procedures, and views
Hands-on experience with Snowflake SQL, ADF (Azure Data Factory), and Microsoft MDS (Master Data Services)
Knowledge of data warehousing concepts
Experience with cloud platforms (Azure)
Understanding of data modeling and data warehousing principles
Strong problem-solving and analytical skills, with attention to detail
Excellent communication and collaboration skills
Bonus Skills:
Exposure to CI/CD practices using Microsoft Azure DevOps
Basic knowledge or understanding of Power BI
Key Responsibilities:
Design, develop, and maintain scalable and efficient data pipelines using Azure Data Factory (ADF)
Build and optimize data models and data warehousing solutions within Snowflake
Develop and maintain data integration processes, ensuring data quality and integrity
Utilize strong SQL skills to query, transform, and analyze data within Snowflake
Develop and manage stored procedures and views in Snowflake
Implement and manage master data using Microsoft Master Data Services (MDS)
Collaborate with data analysts and business stakeholders to understand data requirements and deliver effective data solutions
Ensure the performance, reliability, and security of data pipelines and data warehousing systems
Troubleshoot and resolve data-related issues in a timely manner
Stay up-to-date with the latest advancements in data engineering technologies, particularly within the Snowflake and Azure ecosystems. Contribute to the documentation of data pipelines, data models, and ETL processes.
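The "build and optimize data models within Snowflake" work above typically centers on the merge/upsert pattern. As an illustration only, the sketch below models it with in-memory dicts; in Snowflake itself this would be a single `MERGE INTO` statement over staged and target tables, and the table contents shown are hypothetical.

```python
# Illustrative sketch of the MERGE/upsert pattern behind warehouse loads:
# rows matching on the key are updated, new rows are inserted.

def merge_upsert(target, source, key="id"):
    """Dict-of-rows stand-in for MERGE INTO target USING source ON key."""
    merged = {row[key]: dict(row) for row in target}
    for row in source:
        merged[row[key]] = dict(row)  # source wins on key conflict (update)
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
source = [{"id": 2, "name": "Globex Ltd"}, {"id": 3, "name": "Initech"}]
result = merge_upsert(target, source)
print(result)  # id 2 updated, id 3 inserted, id 1 untouched
```

The same shape underlies MDS-managed master data flowing into Snowflake: staged changes merge into the dimension rather than truncating and reloading it.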

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary: We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.
Responsibilities:
Design, develop, and execute comprehensive test plans for ETL and database validation processes
Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI)
Perform manual testing and defect tracking using Zephyr or Tosca
Analyze business and data requirements to ensure full test coverage
Write and execute complex SQL queries for data reconciliation
Identify data-related issues and conduct root cause analysis in collaboration with developers
Track and manage bugs and enhancements through appropriate tools
Optimize testing strategies for performance, scalability, and accuracy in ETL processes
Required Skills:
ETL Tools: Talend, ADF
Data Platforms: Snowflake
Reporting/Analytics: Power BI, VPI
Testing Tools: Zephyr or Tosca, manual testing
Strong SQL expertise for validating complex data
Preferred Skills:
API testing exposure
Power BI advanced features (dashboards, DAX, data modelling)
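The data reconciliation mentioned above usually compares row counts plus some row-level fingerprint between source and target. A minimal sketch, assuming the two row sets have already been fetched by SQL queries (the sample rows are hypothetical):

```python
import hashlib

# Sketch of source-vs-target reconciliation: compare counts and per-row
# checksums so a mismatch can be pinpointed rather than just detected.

def row_fingerprint(row):
    """Stable checksum of a row, independent of key order."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(source_rows, target_rows):
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": len(src - tgt),
        "extra_in_target": len(tgt - src),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]  # amt drifted on id 2
print(reconcile(source, target))
```

In pure SQL the equivalent check is often a `MINUS`/`EXCEPT` in both directions or a `HASH_AGG`-style comparison; the Python form is handy when source and target live in different systems.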

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Databricks Data Engineer
Key Responsibilities:
Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark
Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing
Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks
Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows
Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing
Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage
Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools
Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management
Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption
Collaborate with Data Scientists, Analysts, and Business Teams to deliver solutions
Required Skills & Experience:
5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake
Strong SQL, PySpark, and Python programming skills
Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow
Expertise in performance tuning, indexing, caching, and parallel processing
Hands-on experience with Lakehouse architecture and Databricks SQL
Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog)
Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins)
Familiarity with Airflow, Databricks Workflows, or orchestration tools
Strong problem-solving skills with experience in troubleshooting Spark jobs
Nice To Have:
Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks
Certifications in Databricks, Azure, AWS, or GCP
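The medallion architecture named above (Bronze raw, Silver cleansed, Gold business-level) can be sketched on plain Python lists. This is a hedged illustration: in Databricks each layer would be a Delta Lake table and the transforms would be PySpark DataFrame operations; the field names and records are hypothetical.

```python
# Illustrative medallion flow: Bronze (raw) -> Silver (cleansed, typed,
# deduplicated) -> Gold (business aggregate), on in-memory records.

bronze = [  # raw ingested records, duplicates and bad rows included
    {"order_id": "1", "amount": "100.0", "country": "IN"},
    {"order_id": "1", "amount": "100.0", "country": "IN"},   # duplicate
    {"order_id": "2", "amount": "bad",   "country": "US"},   # unparseable
    {"order_id": "3", "amount": "50.5",  "country": "IN"},
]

def to_silver(rows):
    """Deduplicate on order_id and enforce types (cleansed layer)."""
    seen, silver = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine the bad row
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount,
                       "country": r["country"]})
    return silver

def to_gold(rows):
    """Business-level aggregate: revenue per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
print(to_gold(silver))  # {'IN': 150.5}
```

The design point the layering buys you: raw data is kept replayable in Bronze, so Silver and Gold can be rebuilt when cleansing rules change.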

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Years of Experience: 5
Job Description: We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!
Primary Skills: ADF, Databricks
Secondary Skills:
Responsibilities:
Translate functional specifications and change requests into technical specifications
Translate business requirement documents, functional specifications, and technical specifications into related coding
Develop efficient code with unit testing and code documentation
Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
Set up the development environment and configure the development tools
Communicate with all the project stakeholders on the project status
Manage, monitor, and ensure the security and privacy of data to satisfy business needs
Contribute to the automation of modules, wherever required
Be proficient in written, verbal and presentation communication (English)
Coordinate with the UAT team
Role Requirements:
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
Knowledgeable in Shell / PowerShell scripting
Knowledgeable in relational databases, non-relational databases, data streams, and file stores
Knowledgeable in performance tuning and optimization
Experience in data profiling and data validation
Experience in requirements gathering and documentation processes and performing unit testing
Understanding and implementing QA and various testing processes in the project
Knowledge of any BI tool will be an added advantage
Sound aptitude, outstanding logical reasoning, and analytical skills
Willingness to learn and take initiative
Ability to adapt to a fast-paced Agile environment
Additional Requirements:
Demonstrated expertise as a Data Engineer, specializing in Azure cloud services
Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics
Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory
Utilize Azure Databricks for data transformation and processing
Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services
Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools
Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages
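The slowly changing dimensions concept listed in the requirements above (Type 2: keep history rather than overwrite) can be sketched as follows. This is an illustrative sketch with hypothetical column names; warehouse implementations express the same logic as a MERGE plus an insert.

```python
from datetime import date

# Illustrative SCD Type 2 apply: when a tracked attribute changes, close
# the current row (end_date, is_current=False) and append a new version.

def scd2_apply(dimension, incoming, key="customer_id", today=date(2024, 6, 1)):
    rows = [dict(r) for r in dimension]
    current = {r[key]: r for r in rows if r["is_current"]}
    for new in incoming:
        old = current.get(new[key])
        if old and old["city"] != new["city"]:
            old["is_current"] = False          # close out the old version
            old["end_date"] = today
            rows.append({**new, "start_date": today,
                         "end_date": None, "is_current": True})
        elif old is None:                       # brand-new member
            rows.append({**new, "start_date": today,
                         "end_date": None, "is_current": True})
    return rows

dim = [{"customer_id": 7, "city": "Pune", "start_date": date(2023, 1, 1),
        "end_date": None, "is_current": True}]
out = scd2_apply(dim, [{"customer_id": 7, "city": "Chennai"}])
print([(r["city"], r["is_current"]) for r in out])
# history preserved: the Pune row is closed out, Chennai is now current
```

Contrast with Type 1, which would simply overwrite `city` in place and lose the history.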

Posted 1 month ago

Apply

7.0 years

0 Lacs

Greater Kolkata Area

Remote

Job Title: Senior Data Engineer - Azure, ETL, Snowflake
Experience: 7+ yrs
Location: Remote
Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, performing data ingestion from multiple sources, and working with modern data tools like ADF, Databricks, Fivetran, and DBT.
Key Responsibilities:
Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake
Write optimized SQL queries, stored procedures, and views to transform and retrieve data
Perform data ingestion and integration from various formats including JSON, XML, Parquet, TXT, XLSX, etc.
Work on data mapping, modelling, and transformation tasks across multiple data sources
Build and deploy custom connectors using Python, PySpark, or ADF
Implement and manage Snowflake as a data storage and processing solution
Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub
Ensure smooth cloud migration and data pipeline deployment using Azure services
Work with Fivetran and DBT for ingestion and transformation as required
Participate in Agile/Scrum ceremonies and follow DevSecOps practices
Mandatory Skills & Qualifications:
7+ years of experience in Data Engineering, ETL development, or similar roles
Proficient in SQL with a strong understanding of joins, filters, and aggregations
Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.)
Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage
Experience with Azure Cloud Services, specifically: Azure Data Factory (ADF), Databricks, Azure Data Lake
Hands-on experience in Snowflake implementation (ETL or storage layer)
Familiarity with data modelling, data mapping, and pipeline creation
Experience working with semi-structured/unstructured data formats
Working knowledge of GitHub for version control and code management
Good To Have / Preferred Skills:
Experience using Fivetran and DBT for ingestion and transformation
Knowledge of AWS or GCP cloud environments
Familiarity with DevSecOps processes and CI/CD pipelines within Azure
Proficiency in Excel and macros
Exposure to Agile methodologies (Scrum/Kanban)
Understanding of custom connector creation using PySpark or ADF
Soft Skills:
Strong analytical and problem-solving skills
Effective communication and teamwork abilities
Ability to work independently and take ownership of deliverables
Detail-oriented with a commitment to quality
Why Join Us?
Work on modern, cloud-based data platforms
Exposure to a diverse tech stack and new-age data tools
Flexible remote working opportunity aligned with a global team
Opportunity to work on critical enterprise-level data solutions
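The semi-structured ingestion skills listed above (API requests, JSON parsing) come down to flattening nested payloads into tabular rows before loading. A minimal sketch, assuming a hypothetical payload shape; a real connector would fetch this body over HTTP and land the rows in a stage or table:

```python
import json

# Sketch of semi-structured ingestion: parse a nested JSON payload (as an
# API response body would arrive) and flatten it into load-ready rows.
payload = json.loads("""
{
  "account": {"id": "A-1", "region": "APAC"},
  "transactions": [
    {"txn_id": "T-1", "amount": 120.5},
    {"txn_id": "T-2", "amount": 75.0}
  ]
}
""")

def flatten(doc):
    """Denormalize parent fields onto each child transaction row."""
    account = doc["account"]
    for txn in doc["transactions"]:
        yield {
            "account_id": account["id"],
            "region": account["region"],
            "txn_id": txn["txn_id"],
            "amount": txn["amount"],
        }

rows = list(flatten(payload))
print(len(rows), rows[0]["txn_id"])
```

An alternative design is to load the raw JSON into a Snowflake VARIANT column and flatten with `LATERAL FLATTEN` in SQL; the Python form suits custom connectors built outside the warehouse.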

Posted 1 month ago

Apply