1432 ADF Jobs - Page 47

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description

Job Title: Java Backend Developer
Location: Chennai / Trivandrum
Experience: 4+ years

Job Summary
We are looking for a skilled Java Backend Developer to join our growing Services Engineering team, contributing to the development of robust, scalable, and innovative solutions in a high-performance environment. This role involves working on Oracle Fusion BPM solutions, developing APIs, and supporting key business processes for a global credit card platform. You'll play a key role in designing backend components that deliver real impact to end users.

Key Responsibilities
- Design, develop, and maintain backend services and APIs using Java and related technologies.
- Collaborate within a self-organized engineering team to build features aligned with the product roadmap and business goals.
- Work within a Service-Oriented Architecture (SOA) methodology, focusing on Oracle Fusion BPM and associated workflows.
- Administer BPM architecture and configurations across enterprise software applications.
- Develop custom BPM workflows and ADF user interfaces based on business and user requirements.
- Write clean, maintainable, and well-documented code while ensuring high-quality deliverables through practices such as TDD, BDD, and pair programming.
- Support continuous integration and deployment, collaborating with DevOps and QA teams.
- Maintain and document customizations and extensions in Oracle Fusion and database components.
- Contribute to team-level innovation and efficiency improvements.

Mandatory Skills
- 4+ years of professional experience in backend development using Java and Object-Oriented Programming (OOP) principles.
- Proficiency in API development, RESTful services, and HTTP protocols.
- Experience in Oracle Fusion BPM development, including custom BPM workflows and ADF UI.
- Solid understanding of SOA methodologies and enterprise application integration.
- Proficiency in Oracle 12c DB, SQL, and relational database concepts.
- Strong collaboration and communication skills, with the ability to work in a fast-paced team environment.
- Experience with TDD, BDD, and pair programming practices.

Good-to-Have Skills
- Experience working in cloud environments such as AWS.
- Exposure to regulated domains; financial services experience is a plus.
- Experience solving real-world problems in complex systems.
- Familiarity with modern CI/CD practices and DevOps tools.

Keywords: Java, Oracle Fusion, BPM, ADF, API Development, SQL, Cloud (AWS), Software Engineering, Backend Development

Skills: Cloud Computing, Java, AWS Cloud

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Snowflake Developer
Location: Gurugram
Experience: 3 to 7 years
Skillset: Snowflake, Azure, ADF (Azure Data Factory)
Job Type: Full-Time

Overview:
We are looking for a talented Snowflake Developer with expertise in Snowflake, Azure, and Azure Data Factory (ADF) to join our dynamic team. In this role, you will be responsible for developing, implementing, and optimizing data pipelines and ETL processes. You will work on cloud-based data platforms to ensure the effective and seamless integration of data across systems. The ideal candidate will have a solid background in working with Snowflake and cloud data services, and be ready to travel to client locations as required.

Key Responsibilities:
• Design, develop, and implement data solutions using Snowflake and Azure technologies.
• Develop and manage ETL pipelines using Azure Data Factory (ADF) for seamless data movement and transformation.
• Collaborate with cross-functional teams to ensure that the data platform meets business needs and aligns with data architecture best practices.
• Monitor, optimize, and troubleshoot data pipelines and workflows in Snowflake and Azure environments.
• Implement data governance and security practices in line with industry standards.
• Perform data validation and ensure data integrity across systems and platforms.
• Ensure data integration and management processes are optimized for performance, scalability, and reliability.
• Provide technical support and guidance to junior developers and team members.
• Collaborate with the client to understand project requirements and ensure deliverables are met on time.
• Be open to travelling to client locations as needed for project delivery and stakeholder engagements.

Skills and Qualifications:
• 3 to 7 years of hands-on experience in Snowflake development and data management.
• Strong working knowledge of Azure (Azure Data Services, Azure Data Lake, etc.) and Azure Data Factory (ADF).
• Expertise in designing and developing ETL pipelines and data transformation processes using Snowflake and ADF.
• Proficiency in SQL and data modeling, with experience working with structured and semi-structured data.
• Knowledge of data warehousing concepts and best practices in Snowflake.
• Understanding of data security, privacy, and compliance requirements in cloud environments.
• Experience with cloud-based data solutions and integration services.
• Strong problem-solving and debugging skills.
• Ability to work effectively with both technical and non-technical teams.
• Good communication skills to collaborate with clients and team members.
• Bachelor's degree in Computer Science, Information Technology, or a related field.

Preferred Skills:
• Experience with other Azure services like Azure SQL Database, Azure Synapse Analytics, and Power BI.
• Familiarity with data governance tools and data pipeline orchestration best practices.
• Ability to optimize Snowflake queries and database performance.

Why Join Us:
• Work with cutting-edge cloud technologies like Snowflake and Azure.
• Exposure to complex, large-scale data projects across industries.
• Collaborative work environment that promotes innovation and learning.
• Competitive salary and benefits package.
• Opportunities for career growth and development.
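As a rough illustration of the kind of work this posting describes, here is a minimal sketch of loading and transforming data in Snowflake from Python with the snowflake-connector-python package. The account, warehouse, stage, and table names are hypothetical, and in practice credentials would come from a secret store rather than environment variables in plain code.

```python
# Minimal sketch: load staged files into a Snowflake table and run a
# transformation. All object names below are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Ingest staged files (e.g. landed by an ADF copy activity) into a raw table.
    cur.execute(
        "COPY INTO raw_orders FROM @orders_stage "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    # Apply a simple aggregation into a curated table.
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.CURATED.DAILY_ORDERS AS
        SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
        FROM raw_orders
        GROUP BY order_date
    """)
finally:
    conn.close()
```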

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description
- Develop and manage ETL workflows using Azure Data Factory (ADF).
- Design and implement data pipelines using PySpark on Azure Databricks.
- Work with Azure Synapse Analytics, Azure Data Lake, and Azure Blob Storage for data ingestion and transformation.
- Optimize Spark jobs for performance and scalability in Databricks.
- Automate data workflows and implement error handling and monitoring in ADF.
- Collaborate with data engineers, analysts, and business teams to understand data requirements.
- Implement data governance, security, and compliance best practices in Azure.
- Debug and troubleshoot PySpark scripts and ADF pipeline failures.

Requirements
- 4+ years of experience in ETL development with Azure Data Factory (ADF).
- Hands-on experience with Azure Databricks and PySpark for big data processing.
- Strong knowledge of Azure services.
- Proficiency in Python and PySpark for data transformation and processing.
- Experience with CI/CD pipelines for deploying ADF pipelines and Databricks notebooks.
- Strong expertise in SQL for data extraction and transformations.
- Knowledge of performance tuning in Spark and cost optimization on Azure.

Skills: Azure Data Factory, PySpark, Azure
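To make the PySpark-on-Databricks part of this role concrete, here is a minimal transformation sketch. The storage paths and column names are hypothetical; a real job would use the workspace's configured mounts or Unity Catalog locations.

```python
# Minimal PySpark sketch: read raw CSV data, aggregate it, write Delta.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders").getOrCreate()

raw = spark.read.format("csv").option("header", "true").load(
    "abfss://raw@examplelake.dfs.core.windows.net/orders/"
)

daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("order_date")
       .agg(F.count("*").alias("order_count"),
            F.sum("amount").alias("total_amount"))
)

# Write as Delta so downstream Synapse/Databricks consumers can query it.
daily.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/daily_orders/"
)
```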

Posted 1 month ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

We’re hiring a Senior ML Engineer (MLOps) — 3-5 yrs
Location: Kochi or Chennai

What you’ll do
- Tame data → pull, clean, and shape structured & unstructured data.
- Orchestrate pipelines → Airflow / Step Functions / ADF… your call.
- Ship models → build, tune, and push to prod on SageMaker, Azure ML, or Vertex AI.
- Scale → Spark / Databricks for the heavy lifting.
- Automate everything → Docker, Kubernetes, CI/CD, MLflow, Seldon, Kubeflow.
- Pair up → work with engineers, architects, and business folks to solve real problems, fast.

What you bring
- 3+ yrs hands-on MLOps (4-5 yrs total software experience).
- Proven chops on one hyperscaler (AWS, Azure, or GCP).
- Confidence with Databricks / Spark, Python, SQL, TensorFlow / PyTorch / Scikit-learn.
- You debug Kubernetes in your sleep and treat Dockerfiles like breathing.
- You prototype with open source first, choose the right tool, then make it scale.
- Sharp mind, low ego, bias for action.

Nice-to-haves
- SageMaker, Azure ML, or Vertex AI in production.
- Love for clean code, clear docs, and crisp PRs.

Why Datadivr?
- Domain focus: we live and breathe F&B — your work ships to plants, not just slides.
- Small team, big autonomy: no endless layers; you own what you build.

📬 How to apply
Shoot your CV + a short note on a project you shipped to careers@datadivr.com or DM me here. We reply to every serious applicant. Know someone perfect? Please share — good people know good people.
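Since experiment tracking and model shipping sit at the heart of this role, here is a hedged MLflow sketch of the log-and-track flow. The experiment name, data, and hyperparameters are placeholders; the tracking URI would come from your platform configuration.

```python
# Minimal MLflow sketch: train a model, log params/metrics, log the model.
# Experiment name and synthetic data are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demand-forecast-poc")
with mlflow.start_run():
    model = LogisticRegression(C=0.5, max_iter=500).fit(X_train, y_train)
    mlflow.log_param("C", 0.5)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, artifact_path="model")
```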

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and for managing day-to-day delivery. The role focuses on technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice, and involves interactions with internal stakeholders and clients to explain technology solutions and build a clear understanding of the client's business requirements through which to guide optimal design.

Job Description:

Must-Have Skills:
- Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.)
- Data warehouse (one or more of BigQuery, Snowflake, etc.)
- ETL tools (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.)
- Experience in cloud platforms: GCP
- Python, PySpark, project and resource management
- SVN, JIRA, workflow automation (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli, or similar)

Good-to-Have Skills:
- UNIX shell scripting, Snowflake, Redshift; familiarity with NoSQL databases such as MongoDB
- ETL tools (Databricks, AWS Glue, AWS Lambda, Amazon Kinesis, Amazon Firehose, Azure Data Factory (ADF), DBT, Talend, Informatica, IICS (Informatica Cloud))
- Experience in cloud platforms: AWS / Azure
- Client-facing skills

Key Responsibilities:
- Design simple-to-medium data solutions for clients using GCP cloud architecture
- Strong understanding of data warehouses, data marts, data modeling, data structures, databases, and data ingestion and transformation
- Working knowledge of ETL processes as well as database skills
- Strong understanding of relational and non-relational databases and when to use them
- Leadership and communication skills to collaborate with local leadership as well as our global teams
- Translate technical requirements into ETL/SQL application code
- Document project architecture, explain the detailed design to the team, and create low-level to high-level designs
- Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages
- Engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions
- Perform mid- to complex-level tasks independently
- Support clients, data scientists, and analytical consultants working on marketing solutions
- Work with cross-functional internal teams and external clients
- Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members
- Manage code, including code review and deployment
- Work closely with the QA/testing team to help identify and implement defect-reduction initiatives
- Work closely with the Architecture team to make sure architecture standards and principles are followed during development
- Perform proofs of concept on new platforms and validate proposed solutions
- Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures
- Must understand software development methodologies, including waterfall and agile
- Distribute and manage SQL development work across the team
- Must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between 6:00 PM and 11:00 PM IST, depending on project needs

Qualifications:
- Bachelor's or Master's degree in Computer Science with 7+ years of IT experience

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
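As a small illustration of the GCP toolchain this role centers on, here is a hedged sketch of running a BigQuery transformation from Python with the google-cloud-bigquery client. The project, dataset, and table names are made up for the example.

```python
# Minimal sketch: run a BigQuery transformation job from Python.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

sql = """
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
    FROM `example-analytics-project.raw.orders`
    GROUP BY order_date
"""

job_config = bigquery.QueryJobConfig(
    destination="example-analytics-project.curated.daily_orders",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# result() blocks until the query job completes.
client.query(sql, job_config=job_config).result()
```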

Posted 1 month ago

Apply

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site

The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and for managing day-to-day delivery. The role focuses on technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice, and involves interactions with internal stakeholders and clients to explain technology solutions and build a clear understanding of the client's business requirements through which to guide optimal design.

Job Description:

Must-Have Skills:
- Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.)
- Data warehouse (one or more of BigQuery, Snowflake, etc.)
- ETL tools (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.)
- Experience in cloud platforms: GCP
- Python, PySpark, project and resource management
- SVN, JIRA, workflow automation (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli, or similar)

Good-to-Have Skills:
- UNIX shell scripting, Snowflake, Redshift; familiarity with NoSQL databases such as MongoDB
- ETL tools (Databricks, AWS Glue, AWS Lambda, Amazon Kinesis, Amazon Firehose, Azure Data Factory (ADF), DBT, Talend, Informatica, IICS (Informatica Cloud))
- Experience in cloud platforms: AWS / Azure
- Client-facing skills

Key Responsibilities:
- Design simple-to-medium data solutions for clients using GCP cloud architecture
- Strong understanding of data warehouses, data marts, data modeling, data structures, databases, and data ingestion and transformation
- Working knowledge of ETL processes as well as database skills
- Strong understanding of relational and non-relational databases and when to use them
- Leadership and communication skills to collaborate with local leadership as well as our global teams
- Translate technical requirements into ETL/SQL application code
- Document project architecture, explain the detailed design to the team, and create low-level to high-level designs
- Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages
- Engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions
- Perform mid- to complex-level tasks independently
- Support clients, data scientists, and analytical consultants working on marketing solutions
- Work with cross-functional internal teams and external clients
- Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members
- Manage code, including code review and deployment
- Work closely with the QA/testing team to help identify and implement defect-reduction initiatives
- Work closely with the Architecture team to make sure architecture standards and principles are followed during development
- Perform proofs of concept on new platforms and validate proposed solutions
- Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures
- Must understand software development methodologies, including waterfall and agile
- Distribute and manage SQL development work across the team
- Must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between 6:00 PM and 11:00 PM IST, depending on project needs

Qualifications:
- Bachelor's or Master's degree in Computer Science with 7+ years of IT experience

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

1. Implementation of data pipelines and data workflows using big data tools such as Azure Data Factory, PySpark, SparkSQL, SQL, and Azure DevOps.
2. Hands-on PySpark programming.
3. Hands-on SQL (stored procedures, complex SQL queries) and data modelling.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to accomplish their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
The Lead Engineer (.NET/Azure) is a hands-on contributor position, responsible for creating solutions and architectures for high-volume, high-transaction applications across the Experian Employer Services (EES) organization. The Lead Software Engineer (.NET/Azure) will write code, participate in code reviews, evaluate SAST findings, and collaborate closely with other members of the larger Experian Employer Services organization to provide high-quality software solutions to our clients and partners. The Lead Engineer (.NET/Azure) will also mentor other engineers, delegate work, and evaluate acquired technologies, guiding the best way to incorporate them into the Experian Employer Services product ecosystem. You will report to the Director - Engineering.

Responsibilities
- Analyze new feature requirements, including architectural design considerations, software development best practices, test strategies, database design, security considerations, and cloud architecture considerations
- Create new and modernize existing applications that look great across multiple devices
- Create new and modernize existing APIs and partner integrations
- Implement high-quality code and unit tests
- Lead design sessions to define Azure-based technical solutions
- Participate in code reviews and provide meaningful feedback
- Adhere to Experian's Secure SDLC practices to ensure secure and compliant development
- Resolve bugs identified by QA in a timely manner
- Demonstrate functionality to the Product team for approval
- Promote a DevOps culture and work closely with IT as required
- Assist other team members as needed
- Delegate tasks to other team members and oversee work quality
- Participate in the on-call rotation for platform emergencies

Technical Requirements
- Extensive experience with C#, .NET Framework, .NET Core, and Azure
- Extensive experience with MS SQL Server, Azure SQL, T-SQL, and relational database design
- Extensive experience with frontend technologies (HTML, CSS, JavaScript, Angular, ReactJS)
- Extensive experience with Azure cloud solutions (IaaS, SaaS)
- Extensive experience with APIs, microservices, container development, and integrations
- Extensive experience with ETL technologies such as SSIS and ADF
- Expert knowledge of the latest architectural patterns and cloud-native development
- Experience with Azure DevOps CI/CD pipelines
- Experience with Agile software methodologies
- Experience with Entity Framework or another ORM

General Requirements
- 12+ years of experience in the Microsoft stack (.NET, .NET Core, C#, and SQL Server), plus architectural experience
- 3+ years of team lead experience
- 5+ years of Azure cloud experience

Qualifications
Bachelor's in Computer Science; Master's preferred

Preferred Qualifications
- Azure certifications
- Understanding of ML/AI concepts
- Strong understanding of DevOps practices and tools

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why.

Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

We are seeking an experienced Azure Data Architect to lead data architecture design and implementation on the Azure cloud. The ideal candidate will have deep expertise in Databricks and Azure Functions, and will hold a valid Azure certification.

Required Skills:
- Experience in data architecture and engineering
- Strong hands-on experience with Azure Databricks and Spark
- Proficiency in Azure Functions, Data Lake, Azure Synapse, and ADF
- Microsoft Azure certification (e.g., AZ-305, DP-203)
- Solid understanding of data modeling, governance, and performance tuning
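For a flavor of the Azure Functions piece of this stack, here is a minimal Python HTTP-trigger sketch using the v2 programming model. The route name and payload handling are illustrative only, not part of the posting.

```python
# Minimal Azure Functions sketch (Python v2 programming model): an HTTP
# trigger that accepts a JSON payload and acknowledges receipt.
# Route name and payload shape are hypothetical.
import json

import azure.functions as func

app = func.FunctionApp()

@app.route(route="ingest", auth_level=func.AuthLevel.FUNCTION)
def ingest(req: func.HttpRequest) -> func.HttpResponse:
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Expected a JSON body", status_code=400)

    # A real function might enqueue this payload or trigger an ADF/Databricks run.
    return func.HttpResponse(
        json.dumps({"received": len(payload)}), mimetype="application/json"
    )
```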

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Notice Period: Immediate joiners
Work Timings: 1 PM - 10 PM
Location: Gurgaon; work from office (hybrid mode) at client location

Technical Role: SQL + ADF (primary and mandatory skill)
• Strong experience in SQL development, along with experience in AWS cloud and good experience in ADF
• Experience: 5+ years
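As an illustration of ADF work from the Python side, here is a hedged sketch of triggering and polling a pipeline run with the azure-mgmt-datafactory management SDK. The subscription ID, resource group, factory, pipeline name, and parameters are all placeholders.

```python
# Minimal sketch: trigger and poll an Azure Data Factory pipeline run.
# All resource names below are hypothetical placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="00000000-0000-0000-0000-000000000000",
)

run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-example",
    pipeline_name="pl_daily_load",
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run reaches a terminal state.
while True:
    status = client.pipeline_runs.get("rg-data", "adf-example", run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status}")
```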

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

Location: Bangalore / Gurgaon

Key Responsibilities:
- Lead the implementation and optimization of Microsoft Purview across the client's data estate on MS Fabric / Azure cloud platforms (ADF, Databricks, etc.).
- Define and enforce data governance policies, data classification, sensitivity labeling, and data lineage to ensure readiness for GenAI use cases.
- Collaborate with data engineers, architects, and AI/ML teams to ensure data discoverability, compliance, and ethical AI readiness.
- Design and implement data cataloging strategies to support GenAI model training and inference.
- Provide guidance on data access controls, privacy, and regulatory compliance (e.g., GDPR, HIPAA).
- Conduct workshops and training sessions for client stakeholders on Purview capabilities and best practices.
- Monitor and report on data governance KPIs and GenAI readiness metrics.

Required Skills & Qualifications:
- Proven experience as a Microsoft Purview SME in enterprise environments.
- Strong knowledge of Microsoft Fabric, OneLake, and Synapse Data Engineering.
- Experience with data governance frameworks and metadata management.
- Hands-on experience with data classification, sensitivity labels, and data lineage tracking.
- Understanding of compliance standards and data privacy regulations.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Microsoft certifications in Azure Data, Purview, or Security & Compliance.
- Experience working with Azure OpenAI, Copilot integrations, or other GenAI platforms.
- Background in data science, AI ethics, or ML operations is a plus.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

Remote

Position Title: Manager - Data Science
Location: Remote (hybrid option available if in Chennai)
Company: ADF Data Science Pvt. Ltd. - Analytics, Risk and R&D
Position Type: Full-Time

Job Summary
We are seeking skilled and motivated Data Scientists with 4+ years of experience in data science and good domain understanding. The ideal candidate will have a strong foundation in data science concepts, proficiency in analytical tools, and the ability to translate data insights into actionable business recommendations. This role requires a blend of technical expertise and business acumen, preferably in financial (credit, risk) domains, to drive data-driven decision making. This will be an individual contributor role, or a lead for a small team if relevant experience is present.

Qualifications

Education: Bachelor of Engineering or a master's degree in a quantitative area. It is mandatory that the candidate be from a tier 1 institute.

Experience
- 4+ years of experience in data science and business analytics projects; exposure to credit risk analytics is ideal.
- Proven experience in data handling and analytics, with good exposure to statistical analysis and machine learning.

Technical Skills
- Expertise in programming languages such as Python and SQL.
- Expertise in machine learning algorithms.

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to lead a team.

(ref:hirist.tech)
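As a toy illustration of the credit-risk modeling work mentioned above, here is a hedged scikit-learn sketch of fitting and evaluating a default-probability classifier. The synthetic data stands in for real credit features; nothing here reflects the company's actual methodology.

```python
# Toy sketch: fit a probability-of-default classifier and check its
# discrimination with AUC. Synthetic data replaces real credit features.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(
    n_samples=5_000, n_features=12, weights=[0.9, 0.1], random_state=0
)  # imbalanced classes, as default events usually are
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0
)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
model.fit(X_train, y_train)

pd_scores = model.predict_proba(X_test)[:, 1]  # estimated probability of default
print(f"Test AUC: {roc_auc_score(y_test, pd_scores):.3f}")
```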

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 18 Lacs

Pune, Gurugram, Bengaluru

Hybrid

- 5-8 years of strong experience working on Microsoft Azure-based technologies and frameworks.
- Good experience in the design and implementation of Azure-based solutions using .NET Core, C#, Function Apps, Azure Service Bus, Azure SQL Server, and SSIS packages, among other Azure-related technologies.
- Experience designing and implementing REST+JSON web services.
- Expertise in the application of software design patterns, object-oriented practices, and the software development life cycle: testing, version control, deployment, production support, and maintenance.
- Expertise in relational and non-relational database concepts, design, and database management systems.
- Ability to capture, document, and implement functional and non-functional requirements into technical solutions.
- Experience working in an Agile-driven development model.
- Ensure quality of deliverables within project timelines.
- Independently manage daily client communication, especially over calls.
- Drive work towards completion with accuracy and timely deliverables.
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.
- Experience delivering projects within an agile environment.
- Experience in project management and team management.
- Good to have: financial services knowledge.

Posted 1 month ago

Apply

5.0 years

0 Lacs

India

On-site

Job Summary:
We are looking for a Data Engineer with solid hands-on experience in Azure-based data pipelines and Snowflake to help build and scale data ingestion, transformation, and integration processes in a cloud-native environment.

Key Responsibilities:
- Develop and maintain data pipelines using ADF, Snowflake, and Azure Storage
- Perform data integration from various sources, including APIs, flat files, and databases
- Write clean, optimized SQL and support data modeling efforts in Snowflake
- Monitor and troubleshoot pipeline issues and data quality concerns
- Contribute to documentation and promote best practices across the team

Qualifications:
- 3-5 years of experience in data engineering or a related role
- Strong hands-on knowledge of Snowflake, Azure Data Factory, SQL, and Azure Data Lake
- Proficient in scripting (Python preferred) for data manipulation and automation
- Understanding of data warehousing concepts and ETL/ELT patterns
- Experience with Git, JIRA, and agile delivery environments is a plus
- Strong attention to detail and eagerness to learn in a collaborative team setting

Posted 1 month ago

Apply

4.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Experience Level: 4 to 5 years
Immediate joiners can apply.

We are looking for a skilled Data Engineer with expertise in Azure services to design, develop, and maintain scalable data pipelines and integrations across structured and unstructured data sources. The ideal candidate should have experience building solutions using Azure Data Factory, Logic Apps, and Functions, with a strong focus on governance, monitoring, and automation.

Responsibilities include:
- Building and managing pipelines using Azure Data Factory, automating workflows via Logic Apps, and implementing custom logic with Azure Functions (Python/C#).
- Setting up API integrations, managing secrets via Key Vault, implementing access control with Entra ID, and cataloging with Azure Purview.
- Performing thorough unit and integration testing, ensuring data integrity, and optimizing pipelines with Azure Monitor.

Required Skills:
- Strong understanding of Azure integration services (ADF, Functions, Event Hubs).
- Hands-on experience with ETL/ELT pipelines, REST APIs, and JSON/XML processing.
- Knowledge of Azure governance and monitoring tools, including Purview and Azure Monitor.
- Proficiency in Python or C#, with working knowledge of SQL.
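To illustrate the Key Vault piece of this role, here is a hedged sketch of fetching a connection secret from Python with the azure-identity and azure-keyvault-secrets packages. The vault URL and secret name are placeholders.

```python
# Minimal sketch: read a secret (e.g. a database connection string) from
# Azure Key Vault. Vault URL and secret name are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential uses managed identity in Azure and falls back to
# developer credentials (e.g. az login) locally.
client = SecretClient(
    vault_url="https://kv-example.vault.azure.net",
    credential=DefaultAzureCredential(),
)

conn_string = client.get_secret("sql-connection-string").value
# Pass conn_string to the pipeline or function instead of hardcoding it.
```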

Posted 1 month ago

Apply

40.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

Profile:
- Hands-on experience in supporting, integrating, and developing with, for example, XML Publisher, BI Publisher Enterprise, Oracle Designer, SQL and PL/SQL, JDeveloper, Java, ADF, and HTML/CSS with JavaScript, and in extending Oracle Cloud (Financials, Distribution, Manufacturing, HCM)
- Experience with Oracle Fusion and Oracle on-premise applications (Finance or Supply Chain), including an understanding of the data model, business process functionality, and its data flow
- Developing integrations using OIC, VBCS, and REST APIs/web services
- Expertise in PaaS offerings such as Oracle Analytics Cloud Services, Visual Builder Cloud Services, and Oracle Integration Services

Career Level - IC4

Responsibilities
As an Advisory Systems Engineer, you are expected to be an expert member of the problem-solving/avoidance team and be highly skilled in solving extremely complex (often previously unknown), critical customer issues. Performing the assigned duties with a high level of autonomy and reporting to management on customer status and technical matters on a regular basis, you will be expected to work with very limited guidance from management. Further, the Advisory Systems Engineer is sought out by customers and Oracle employees to provide expert technical advice. The same technical profile described above applies to this role.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

40.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

Profile:
- Hands-on experience in supporting, integrating, and developing with, for example, XML Publisher, BI Publisher Enterprise, Oracle Designer, SQL and PL/SQL, JDeveloper, Java, ADF, and HTML/CSS with JavaScript, and in extending Oracle Cloud (Financials, Distribution, Manufacturing, HCM)
- Experience with Oracle Fusion and Oracle on-premise applications (Finance or Supply Chain), including an understanding of the data model, business process functionality, and its data flow
- Developing integrations using OIC, VBCS, and REST APIs/web services
- Expertise in PaaS offerings such as Oracle Analytics Cloud Services, Visual Builder Cloud Services, and Oracle Integration Services

Career Level - IC4

Responsibilities
As an Advisory Systems Engineer, you are expected to be an expert member of the problem-solving/avoidance team and be highly skilled in solving extremely complex (often previously unknown), critical customer issues. Performing the assigned duties with a high level of autonomy and reporting to management on customer status and technical matters on a regular basis, you will be expected to work with very limited guidance from management. Further, the Advisory Systems Engineer is sought out by customers and Oracle employees to provide expert technical advice. The same technical profile described above applies to this role.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

7.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries serving more than 9 million customers each day.

At Circle K, we are building a best-in-class global data engineering practice to support intelligent business decision-making and drive value across our retail ecosystem. As we scale our engineering capabilities, we're seeking a Lead Data Engineer to serve as both a technical leader and people coach for our India-based Data Enablement pod. This role will oversee the design, delivery, and maintenance of critical cross-functional datasets and reusable data assets while also managing a group of talented engineers in India. This position plays a dual role: contributing hands-on to engineering execution while mentoring and developing engineers in their technical careers.

About The Role
The ideal candidate combines deep technical acumen, stakeholder awareness, and a people-first leadership mindset. You'll collaborate with global tech leads, managers, platform teams, and business analysts to build trusted, performant data pipelines that serve use cases beyond traditional data domains.

Responsibilities
- Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
- Lead the technical execution of non-domain-specific initiatives (e.g., reusable dimensions, TLOG standardization, enablement pipelines)
- Architect data models and reusable layers consumed by multiple downstream pods
- Guide platform-wide patterns such as parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
- Mentor and coach the team
- Partner with product and platform leaders to ensure engineering consistency and delivery excellence
- Act as an L3 escalation point for operational data issues impacting foundational pipelines
- Own engineering best practices, sprint planning, and quality across the Enablement pod
- Contribute to platform discussions and architectural decisions across regions

Job Requirements

Education
- Bachelor's or master's degree in computer science, engineering, or a related field

Relevant Experience
- 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
- Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse

Knowledge and Preferred Skills
- Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
- Solid grasp of data governance, metadata tagging, and role-based access control
- Proven ability to mentor and grow engineers in a matrixed or global environment
- Strong verbal and written communication skills, with the ability to operate cross-functionally
- Certifications in Azure, Databricks, or Snowflake are a plus
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
- Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM), and data quality tools
- Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
- Hands-on experience with databases (Azure SQL DB, Snowflake, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting
- ADF, Databricks, and Azure certification is a plus

Technologies we use: Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI

Posted 1 month ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Design, develop, and implement scalable data pipelines using Azure Databricks
- Develop PySpark-based data transformations and integrate structured and unstructured data from various sources
- Optimize Databricks clusters for performance, scalability, and cost-efficiency within the Azure ecosystem
- Monitor, troubleshoot, and resolve performance bottlenecks in Databricks workloads
- Manage orchestration and scheduling of end-to-end data pipelines using tools like Apache Airflow, ADF scheduling, and Logic Apps
- Collaborate effectively with the Architecture team in designing solutions and with product owners in validating implementations
- Implement best practices for data quality, monitoring, logging, and alerting on failure scenarios and exception handling
- Document step-by-step processes to troubleshoot potential issues and deliver cost-optimized cloud solutions
- Provide technical leadership, mentorship, and best practices for junior data engineers
- Stay up to date with Azure and Databricks advancements to continuously improve data engineering capabilities
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- B.Tech or equivalent
- 7+ years of overall experience in the IT industry, including 6+ years in data engineering and 3+ years of hands-on experience in Azure Databricks
- Hands-on experience with Delta Lake, Lakehouse architecture, and data versioning
- Experience with CI/CD pipelines for data engineering solutions (Azure DevOps, Git)
- Solid knowledge of performance tuning, partitioning, caching, and cost optimization in Databricks
- Deep understanding of data warehousing, data modeling (Kimball/Inmon), and big data processing
- Solid expertise in the Azure ecosystem, including Azure Synapse, Azure SQL, ADLS, and Azure Functions
- Proficiency in PySpark, Python, and SQL for data processing in Databricks
- Proven excellent written and verbal communication skills
- Proven excellent problem-solving skills and ability to work independently
- Proven ability to balance multiple and competing priorities and execute accordingly
- Highly self-motivated, with excellent interpersonal and collaborative skills
- Proven ability to anticipate risks and obstacles and develop plans for mitigation
- Proven excellent documentation experience and skills

Preferred Qualifications
- Azure certifications (DP-203, AZ-304, etc.)
- Experience in infrastructure as code, scheduling as code, and automating operational activities using Terraform scripts

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
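Since orchestration with Apache Airflow comes up in this role, below is a minimal DAG sketch scheduling a daily transformation task. The DAG id, schedule, and task body are illustrative placeholders, and the `schedule=` argument assumes Airflow 2.4 or newer.

```python
# Minimal Airflow sketch: a daily DAG with a single Python task.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_transform(**context):
    # In a real pipeline this might trigger a Databricks job or an ADF run.
    print(f"Transforming data for {context['ds']}")


with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="transform",
        python_callable=run_daily_transform,
    )
```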

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Role: Data Platform Architect
Location: India (remote)
Type of Employment: Contract

Core Skills: C, C#, .NET, Python, Scala, Databricks, ADF, Event Hubs, Event Grid, ADLS, ADX (Azure Data Explorer), Fluentd, Azure App Services; ability to architect new solutions

Responsibilities:
- Design and architect scalable data platform solutions.
- Lead implementation of data pipelines and integration workflows.
- Collaborate with stakeholders to define data strategies.
- Ensure platform performance, security, and compliance.
- Hands-on experience architecting solutions for a data platform holding 1-2 petabytes of data.

Required Skills:
- Strong experience in data engineering and architecture.
- Expertise in modern data engineering tools and Azure services.
- Strong programming background in C-family languages and Python.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Azure certifications; experience with real-time data processing.

Preferred:
- Knowledge of industry best practices and standards.
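For the Event Hubs portion of this stack, here is a hedged sketch of publishing events from Python with the azure-eventhub package. The connection string, hub name, and event payloads are placeholders.

```python
# Minimal sketch: publish a small batch of JSON events to Azure Event Hubs.
# Connection string and hub name are hypothetical placeholders.
import json
import os

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str=os.environ["EVENTHUB_CONNECTION_STRING"],
    eventhub_name="telemetry",
)

with producer:
    batch = producer.create_batch()
    for reading in ({"sensor": "a", "value": 1.2}, {"sensor": "b", "value": 3.4}):
        batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
```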

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

Responsibilities

Education & Experience:
- BE, BTech, MCA, CA or equivalent preferred; other qualifications with adequate experience may be considered
- 5+ years of relevant working experience

Functional/Technical Knowledge & Skills:
Must have a good understanding of Oracle Cloud Financials (version 12+) capabilities. We are looking for a techno-functional person with real-time, hands-on functional/product and/or technical experience; and/or experience working with L2- or L3-level support; and/or equivalent knowledge.

We expect the candidate to have:
- Strong knowledge of business processes and concepts
- Implementation/support experience in ERP (Cloud Financial modules such as GL, AP, AR, FA, IBY, PA, CST, ZX, and PSA), HCM (Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management), or SCM (Inventory, OM, Procurement); hands-on experience in at least five modules across these pillars is required
- Ability to relate product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials
- Strong technical skills, with expertise in SQL, PL/SQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, payment integration, and personalization
- Strong problem-solving skills
- Strong customer interaction and service orientation, so you can understand customers' critical situations, respond accordingly, and mobilize organizational resources while setting realistic expectations
- Strong operations management and innovation orientation, so you can continually improve processes, methods, tools, and utilities
- Strong team player who leverages others' strengths; you will often collaborate with peers within and across teams
- Strong learning orientation, so you keep abreast of emerging business models/processes, applications product solutions, product features, and technology features, and use this learning to deliver value to customers on a daily basis
- High flexibility, so you remain agile in a fast-changing business and organizational environment
- Create and maintain appropriate documentation for architecture, design, technical, implementation, support, and test activities

Personal Attributes:
- Self-driven and results-oriented
- Strong problem-solving/analytical skills
- Strong customer support and relationship skills
- Effective communication (verbal and written)
- Focus on relationships (internal and external)
- Strong willingness to learn new things and share them with others
- Influencing/negotiating
- Team player
- Customer focused
- Confident and decisive
- Values expertise (maintaining professional expertise in own discipline)
- Enthusiasm
- Flexibility
- Organizational skills
- Values and enjoys coaching/knowledge transfer
- Values and enjoys teaching technical courses

Note: Shift work is mandatory. Candidates should be open to working evening and night shifts on a rotation basis.

Career Level - IC3/IC4/IC5

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

5.0 - 9.0 years

15 - 16 Lacs

Gurugram

Work from Office

- Strong experience in SQL development, with solid expertise in AWS cloud services.
- Proficient in Azure Data Factory (ADF) for building and managing data pipelines in cloud-based data integration solutions.

Mail: kowsalya.k@srsinfoway.com

Posted 1 month ago

Apply

7.0 years

4 - 9 Lacs

Gurgaon

Remote

AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation. At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD. AHEAD is looking for a Sr. Data Engineer (L3 support) to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of Data platforms that support our clients’ advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. The Data Engineer will have responsibility for working on a variety of data projects. This includes orchestrating pipelines using modern Data Engineering tools/architectures as well as design and integration of existing transactional processing systems. The appropriate candidate must be a subject matter expert in managing data platforms. Responsibilities: A Sr. Data Engineer should be able to build, operationalize and monitor data processing systems Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms using batch and streaming mechanisms leveraging cloud native toolset Implement custom applications using tools such as EventHub’s, ADF and other cloud native tools as required to address streaming use cases Engineers and maintain ELT processes for loading data lake (Cloud Storage, data lake gen2) Leverages the right tools for the right job to deliver testable, maintainable, and modern data solutions Respond to customer/team inquiries and escalations and assist in troubleshooting and resolving challenges Works with other scrum team members to estimate and deliver work inside of a sprint Research data questions, identifies root causes, and interacts closely with business users and technical resources Should possess ownership and leadership skills to collaborate effectively with Level 1 and Level 2 teams. Must have experience in raising tickets with Microsoft and engaging with them to address any service or tool outages in production. Qualifications: 7+ years of professional technical experience 5+ years of hands-on Data Architecture and Data Modelling – SME level 5+ years of experience building highly scalable data solutions using Azure data factory, Spark, Databricks, Python 5+ years of experience working in cloud environments (AWS and/or Azure) 3+ years of programming languages such as Python, Spark and Spark SQL. Should have strong knowledge on architecture of ADF and Databricks. Able to work with Level1 and Level 2 teams to resolve platform outages in production environments. 
• Strong client-facing communication and facilitation skills
• Strong sense of urgency, with the ability to set priorities and perform the job with little guidance
• Excellent written and verbal interpersonal skills and the ability to build and maintain collaborative and positive working relationships at all levels
• Ability to work in shifts
• Knowledge of Azure DevOps processes

Key Skills: Azure Data Factory, Azure Databricks, Python, ETL/ELT, Spark, Data Lake, Data Engineering, Event Hubs, Azure Delta Lake, Spark Streaming

Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, encouraging cross-department training and development, and sponsoring certifications and credentials for continued learning.

USA Employment Benefits include:
• Medical, Dental, and Vision Insurance
• 401(k)
• Paid company holidays
• Paid time off
• Paid parental and caregiver leave
• Plus more! See https://www.aheadbenefits.com/ for additional details.

The compensation range indicated in this posting reflects the On-Target Earnings ("OTE") for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
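The posting above asks for pipelines that ingest streaming data via cloud-native tools such as Event Hubs into a data lake. Purely as an illustrative reference, here is a minimal PySpark sketch of that pattern; it assumes a Spark runtime with the Kafka connector and Delta Lake available (e.g., Databricks), and every name in angle brackets, plus the event schema, is a placeholder rather than anything specified by the employer.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("eventhub-ingest").getOrCreate()

# Event Hubs exposes a Kafka-compatible endpoint on port 9093, so the stock
# Kafka source can read from it with SASL/PLAIN authentication.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "<eventhub-name>")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="$ConnectionString" password="<event-hubs-connection-string>";',
    )
    .load()
)

# Hypothetical event schema; real payloads would be defined by the source system.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_time", TimestampType()),
])

parsed = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

# Land the stream as a Delta table in Data Lake Gen2, with checkpointing for recovery.
(
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "abfss://lake@<storage>.dfs.core.windows.net/_chk/events")
    .start("abfss://lake@<storage>.dfs.core.windows.net/bronze/events")
)

The same pattern extends to the batch ELT loads the posting mentions by swapping readStream/writeStream for read/write.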

Posted 1 month ago

Apply

5.0 years

4 - 16 Lacs

Gurgaon

On-site

Position: SQL + ADF (Azure Data Factory)
Experience Required: Minimum 5+ Years
Location: Gurgaon / Hybrid
Job Type: Permanent
Work Timings: 1 PM – 10 PM
Notice Period: Immediate only
Mode of Interview: Virtual

Required Experience:
• Must have strong experience in SQL development
• Must have experience with AWS Cloud
• Must have experience with ADF (Azure Data Factory)

Job Type: Full-time
Pay: ₹400,000.00 - ₹1,600,000.00 per year
Benefits: Health insurance
Schedule: Day shift, Monday to Friday
Work Location: In person
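A role like this typically pairs hand-written SQL with programmatic control of ADF pipelines. As an illustrative sketch only (not taken from the posting), the following uses the azure-identity and azure-mgmt-datafactory Python packages to start an ADF pipeline run and poll its status; the subscription, resource group, factory, pipeline name, and parameter are all hypothetical.

import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a run of a (hypothetical) pipeline, passing a runtime parameter.
run = adf.pipelines.create_run(
    resource_group_name="<resource-group>",
    factory_name="<data-factory>",
    pipeline_name="CopySalesToSql",          # hypothetical pipeline name
    parameters={"load_date": "2024-01-01"},  # hypothetical pipeline parameter
)

# Poll until the run leaves the in-progress states.
while True:
    status = adf.pipeline_runs.get("<resource-group>", "<data-factory>", run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline run {run.run_id} finished with status: {status}")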

Posted 1 month ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
• Design, develop, and implement scalable data pipelines using Azure Databricks
• Develop PySpark-based data transformations and integrate structured and unstructured data from various sources (a minimal sketch of such a transformation appears after this posting)
• Optimize Databricks clusters for performance, scalability, and cost-efficiency within the Azure ecosystem
• Monitor, troubleshoot, and resolve performance bottlenecks in Databricks workloads
• Manage orchestration and scheduling of end-to-end data pipelines using tools such as Apache Airflow, ADF scheduling, and Logic Apps
• Collaborate effectively with the architecture team in designing solutions and with product owners in validating implementations
• Implement best practices that enable data quality, monitoring, logging, alerting on failure scenarios, and exception handling
• Document step-by-step processes for troubleshooting potential issues and delivering cost-optimized cloud solutions
• Provide technical leadership, mentorship, and best practices for junior data engineers
• Stay up to date with Azure and Databricks advancements to continuously improve data engineering capabilities
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
• B.Tech or equivalent
• 7+ years of overall experience in the IT industry and 6+ years of experience in data engineering, with 3+ years of hands-on experience in Azure Databricks
• Hands-on experience with Delta Lake, Lakehouse architecture, and data versioning
• Experience with CI/CD pipelines for data engineering solutions (Azure DevOps, Git)
• Solid knowledge of performance tuning, partitioning, caching, and cost optimization in Databricks
• Deep understanding of data warehousing, data modeling (Kimball/Inmon), and big data processing
• Solid expertise in the Azure ecosystem, including Azure Synapse, Azure SQL, ADLS, and Azure Functions
• Proficiency in PySpark, Python, and SQL for data processing in Databricks
• Proven excellent written and verbal communication skills
• Proven excellent problem-solving skills and ability to work independently
• Proven ability to balance multiple and competing priorities and execute accordingly
• Highly self-motivated, with excellent interpersonal and collaborative skills
• Proven ability to anticipate risks and obstacles and develop plans for mitigation
• Proven excellent documentation experience and skills

Preferred Qualifications:
• Azure certifications such as DP-203, AZ-304, etc.
• Experience with infrastructure as code, scheduling as code, and automating operational activities using Terraform scripts

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
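The PySpark work described in this posting commonly lands curated data in Delta tables. As a minimal, hedged sketch only (not Optum's code), the following cleans an incoming batch and upserts it into a Delta table using the delta-spark API; the landing path, table name, and columns are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Read a (hypothetical) raw landing folder and apply a basic data-quality gate.
incoming = (
    spark.read.json("/mnt/raw/claims/2024-01-01/")
    .where(col("claim_id").isNotNull())
    .withColumn("service_date", to_date(col("service_date")))
)

# Upsert into a (hypothetical) curated Delta table keyed on claim_id.
target = DeltaTable.forName(spark, "silver.claims")
(
    target.alias("t")
    .merge(incoming.alias("s"), "t.claim_id = s.claim_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

The MERGE keeps the load idempotent across re-runs, which matters when an orchestrator such as Airflow or ADF retries a failed task.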

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies