
795 ADF Jobs - Page 28

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bangalore/ Bengaluru

Work from Office

Naukri logo

Job Title: Oracle SCM Developer/Consultant (3 positions)
Job Locations (multiple): Dubai - UAE; Doha - Qatar; Kuwait City - Kuwait; Riyadh - Saudi Arabia; Muscat - Oman
Email: spectrumconsulting1997@gmail.com
Salary: 10k to 15k AED per month, fully tax free
Project Duration: 2 years
Desired Experience: 5 - 10 years
Work permit & visa: sponsored by the company
Qualification: B.E/B.Tech/MCA/M.Tech/MSc IT or any equivalent

Key Responsibilities:
- Implement, configure, and support Oracle E-Business Suite (EBS) SCM modules such as Inventory, Purchasing, Order Management, Advanced Pricing, and WMS.
- Develop and maintain PL/SQL stored procedures, functions, packages, and triggers.
- Write and optimize complex SQL queries for data manipulation and reporting.
- Design and implement data migration strategies and ETL processes.
- Develop and customize reports using Oracle BI Publisher, Oracle Reports, and Oracle Analytics Cloud.
- Customize and extend Oracle Forms and Reports, OAF, and ADF applications.
- Design and implement automated workflows using Oracle Workflow Builder and AME.
- Collaborate with clients to gather requirements, conduct gap analysis, and translate business needs into technical solutions.
- Perform root cause analysis and resolve complex technical issues.
- Participate in Oracle SCM implementation projects, including requirements gathering, configuration, testing, and deployment.
- Provide training and support to end users and clients.

Required Skills and Qualifications:
- 5-10 years of experience as an Oracle SCM Consultant/Developer.
- In-depth knowledge of Oracle EBS SCM modules; Oracle Fusion Cloud SCM is optional but beneficial.
- Proficiency in PL/SQL programming and SQL query optimization.
- Experience with Oracle Integration Cloud (OIC) and web services integration (REST/SOAP).
- Strong understanding of data migration strategies and ETL processes.
- Proficiency in Oracle BI Publisher, Oracle Reports, and Oracle Analytics Cloud.
- Experience with Oracle Forms and Reports customization, OAF, and ADF.
- Knowledge of Oracle Workflow Builder and AME.
- Hands-on experience with Oracle SCM implementation and upgrade projects.

Functional Verticals (any of):
- Financial Services (Banking/Insurance or related)
- Telecom/Healthcare/Retail
- Logistics/Utilities/Energy (Oil and Gas/Power)

Nice to have:
- Any Oracle certifications are an added advantage
- Any onsite experience is an added advantage

No. of positions: 3

Benefits:
- Onsite work permit, visa, insurance, and air ticket sponsored by the company
- Long-term (2-year) project

Job Ref Code: ORA_SCM_0525
If you are interested, please email (spectrumconsulting1997@gmail.com) or WhatsApp your CV as an attachment with the job ref code [ORA_SCM_0525] as the subject.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

7 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Naukri logo

Looking for an Azure Data Engineer with ADF, Python, and PySpark experience.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: Azure Data Engineer
Experience: 7+ years
Location: Chennai
Type: [Full-time]
Notice Period: [Immediate joiners preferred / 20 days]

Job Summary: We are hiring an experienced Azure Data Engineer with 7+ years of hands-on experience building scalable data analytics and data warehouse solutions on Azure. The candidate must have strong proficiency in Python and Azure services, with the ability to develop and monitor robust data pipelines.

Key Responsibilities:
- Design and develop data pipelines using ADF, Synapse, Databricks, and other Azure services.
- Implement monitoring and alerting for pipeline performance and failures.
- Write efficient Python scripts for data transformation and automation.
- Collaborate with cross-functional teams to deliver end-to-end data solutions.

Key Skills: Azure Data Explorer; Azure Databricks; Azure Data Factory (ADF); Azure Synapse Analytics; Microsoft Fabric; Python (advanced); data pipeline development and monitoring; ETL/ELT processes; data warehousing on Azure.
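The posting above asks for "efficient Python scripts for data transformation and automation" alongside pipeline monitoring. A minimal sketch of that kind of script is shown below; all names (`clean_orders`, `REQUIRED_FIELDS`, the order fields) are illustrative assumptions, not part of any real pipeline.

```python
# Illustrative transformation step of the kind an ADF/Databricks pipeline
# might invoke: normalize raw records and separate out bad rows so a
# monitoring/alerting step can report them. Hypothetical schema.

REQUIRED_FIELDS = ("order_id", "amount", "currency")

def clean_orders(rows):
    """Normalize raw order records; return (clean, rejected) lists."""
    clean, rejected = [], []
    for row in rows:
        if any(row.get(f) in (None, "") for f in REQUIRED_FIELDS):
            rejected.append(row)  # surfaced to monitoring/alerts downstream
            continue
        clean.append({
            "order_id": str(row["order_id"]).strip(),
            "amount": round(float(row["amount"]), 2),
            "currency": str(row["currency"]).upper(),
        })
    return clean, rejected

raw = [
    {"order_id": " A1 ", "amount": "19.994", "currency": "inr"},
    {"order_id": "A2", "amount": None, "currency": "USD"},  # missing amount
]
clean, rejected = clean_orders(raw)
```

In a real pipeline the `rejected` count would typically feed an alert metric rather than just being returned.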

Posted 4 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Azure Data Engineer Job Description
Role: Azure Data Engineer
Experience: Minimum 3-5 years
Location: Spaze ITech Park, Sector-49, Gurugram
Working Days: Monday to Friday (9:00 AM - 6:00 PM)
Joining: < 15 days

About Us: Panamoure is a UK-based group with an offshore office in Gurgaon, India. We are known as the ultimate Business and Technology Change partner for our clients, including PE groups and ambitious mid-market businesses. Panamoure is a fast-paced and dynamic management consultancy delivering Business and Technology change services to the UK's fastest-growing companies. Our ability to deliver exceptional quality to our clients has seen us grow rapidly over the last 36 months, and we have ambitious plans to scale substantially further. As part of this growth we are looking to expand both our UK and India teams with bright, ambitious, and talented individuals who want to learn and grow with the business.

Primary Skills: The Azure Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and SQL databases using Azure Data Factory (ADF), Microsoft Fabric, and other Azure services. The role requires expertise in SQL Server, ETL/ELT processes, and data modeling to support business intelligence and operational applications. The ideal candidate will collaborate with cross-functional teams to deliver reliable, scalable, and high-performing data solutions.

Key Responsibilities:
· Design, develop, and manage SQL databases, tables, stored procedures, and T-SQL queries.
· Develop and maintain Azure Data Factory (ADF) pipelines to automate data ingestion, transformation, and integration.
· Build and optimize ETL/ELT processes to transfer data between Azure Data Lake, SQL Server, and other systems.
· Design and implement Microsoft Fabric Lakehouses for structured and unstructured data storage.
· Build scalable ETL/ELT pipelines to move and transform data across Azure Data Lake, SQL Server, and external data sources.
· Develop and implement data modeling strategies using star schema, snowflake schema, and dimensional models to support analytics use cases.
· Integrate Azure Data Lake Storage (ADLS) with Microsoft Fabric for scalable, secure, and cost-effective data storage.
· Monitor, troubleshoot, and optimize data pipelines using Azure Monitor, Log Analytics, and Fabric monitoring capabilities.
· Ensure data integrity, consistency, and security, following data governance frameworks such as Azure Purview.
· Collaborate with DevOps teams to implement CI/CD pipelines for automated data pipeline deployment.
· Utilize Azure Monitor, Log Analytics, and Application Insights for pipeline monitoring and performance optimization.
· Stay updated on Azure Data Services and Microsoft Fabric innovations, recommending enhancements for performance and scalability.

Desired Candidate Profile:
· 4+ years of experience in data engineering with strong expertise in SQL development.
· Proficiency in SQL Server, T-SQL, and query optimization techniques.
· Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Azure SQL Database.
· Solid understanding of ETL/ELT processes, data integration patterns, and data transformation.
· Practical experience with Microsoft Fabric components:
  o Fabric Dataflows for self-service data preparation.
  o Fabric Lakehouses for unified data storage.
  o Fabric Synapse Real-Time Analytics for streaming data insights.
  o Fabric Direct Lake mode with Power BI for optimized performance.
· Strong understanding of Azure Data Lake Storage (ADLS) for efficient data management.
· Proficiency in Python or Scala for data transformation tasks.
· Experience with Azure DevOps, Git, and CI/CD pipeline automation.
· Knowledge of data governance practices, including data lineage, sensitivity labels, and RBAC.
· Experience with Infrastructure-as-Code (IaC) using Terraform or ARM templates.
· Understanding of data security protocols like data encryption and network security groups (NSGs).
· Familiarity with streaming services like Azure Event Hub or Kafka is a plus.
· Excellent problem-solving, communication, and team collaboration skills.
· Azure Data Engineer Associate (DP-203) and Microsoft Fabric Analytics certifications are desirable.

What We Offer:
· Opportunity to work with modern data architectures and Microsoft Fabric innovations.
· Competitive salary and benefits package, tailored to experience and qualifications.
· Opportunities for professional growth and development in a supportive and collaborative environment.
· A culture that values diversity, creativity, and a commitment to excellence.

Benefits and Perks:
· Provident Fund
· Health Insurance
· Flexible Timing
· Office lunch provided

How to Apply: Interested candidates should submit their resume along with a cover letter to hr@panamoure.com. We look forward to adding a skilled Azure Data Engineer to our team!

Posted 4 weeks ago

Apply

8 - 10 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Technical Delivery Manager

We are looking for a seasoned Technical Delivery Manager to oversee the delivery of data-driven projects within the Healthcare sector. The ideal candidate will possess strong technical depth in data engineering and analytics, combined with extensive experience managing distributed delivery teams, ensuring high performance and client satisfaction.

Roles and Responsibilities:
- Lead and manage end-to-end data platform deliveries, including architecture, implementation, and integration.
- Collaborate closely with healthcare clients to translate business needs into technical solutions.
- Drive technical planning, sprint management, resource allocation, and risk mitigation.
- Act as the bridge between client teams and internal engineering teams.
- Ensure data security and compliance with healthcare regulations (HIPAA, GDPR).
- Report on project progress, budgets, and KPIs to senior stakeholders.
- Promote innovation and continuous improvement in delivery practices.

Technical Expertise Required:
- Strong experience delivering projects using Azure Data Services, Snowflake, Databricks, or similar platforms; ADO boards; healthcare domain experience.
- Familiarity with ETL/ELT pipelines, dbt, ADF, and data lake architectures.
- Understanding of healthcare data models, EHR systems, and FHIR/HL7 standards.
- Experience with BI tools like Power BI or Tableau is a plus.
- Prior experience working with Agile/Scrum methodologies and tools (JIRA, Azure DevOps).

Experience: 8 - 10 years
Location: Pune, Chennai, Bangalore, Coimbatore
Notice: Immediate to 15 days

Regards,
TA Team, KANINI Software Solutions

Posted 4 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

Role: Senior Azure Data Engineer
Location: Kolkata
Experience: 7+ years

Must-Have:
- Build solutions for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components.
- Advanced working SQL knowledge and experience with relational databases and query authoring, plus working familiarity with a variety of databases.
- Experience with ADF and Dataflow.
- Experience with big data tools like Delta Lake and Azure Databricks.
- Experience with Synapse.
- Skills in designing an Azure data solution.
- Ability to assemble large, complex data sets that meet functional and non-functional business requirements.

Good-to-Have:
- Working knowledge of Azure DevOps.

Responsibilities of / Expectations from the Role:
- Customer centric: work closely with client teams to understand project requirements and translate them into technical design; experience working in Scrum or with Scrum teams.
- Internal collaboration: work with project teams and guide the end-to-end project lifecycle; resolve technical queries; work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
- Soft skills: good communication skills; ability to interact with various internal groups and CoEs.

Posted 4 weeks ago

Apply

7 - 9 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Title: Power BI Developer
Location: Pune - Onsite
Experience Required: 7 - 9 years
Relevant Experience: 3 - 5 years

Job Summary: We are seeking a skilled Power BI Developer with hands-on experience in building dashboards, data models, and reporting solutions. The ideal candidate will have a strong background in SQL, data transformation, and Microsoft's BI stack. You will be responsible for converting raw data into actionable insights through interactive and user-friendly dashboards and reports.

Key Responsibilities:
- Understand business requirements and translate them into Power BI dashboards and reports.
- Design and build data models and semantic layers.
- Extract, transform, and load (ETL) data from various sources using Power BI, M query, and SQL.
- Develop DAX measures and calculated columns for analytics.
- Implement Row-Level Security (RLS) in reports.
- Publish and schedule Power BI reports in both Power BI Service and on-premises environments.
- Manage report access and configure the Power BI Data Gateway.
- Work closely with BI analysts, data engineers, and application developers to integrate reporting into business processes.
- Stay current with Power BI features and best practices.

Required Skills:
- Total experience: 7-9 years; relevant experience: 3-5 years.
- Hands-on experience in Power BI, SQL, and SSIS (integration).
- Good to have: experience with Azure Data Services such as ADF and Synapse.

Skills and Experience:
- 3+ years of experience developing Power BI reports and dashboards.
- 3+ years of experience in BI, data warehousing, data profiling, and data integration.
- Excellent experience writing SQL and working with RDBMS.
- Well versed in SSIS and ETL tools.
- Excellent experience with the Microsoft BI stack and SSAS Tabular modeling.
- Excellent experience in DAX and M query scripts.
- Exposure to a variety of data sources, including NoSQL and files (JSON, CSV, etc.), in both cloud and on-premises settings.
- Good understanding of Power BI deployment options and license types.
- Experience with Azure data services like ADF and Synapse is a plus.

Posted 4 weeks ago

Apply

6 years

0 Lacs

Tamil Nadu, India

On-site

Linkedin logo

We are seeking a highly skilled Senior Azure Databricks Data Engineer to design, develop, and optimize data solutions on Azure. The ideal candidate will have expertise in Azure Data Factory (ADF), Databricks, SQL, and Python, plus experience working with SAP IS-Auto as a data source. This role involves data modeling, systematic layer modeling, and ETL/ELT pipeline development to enable efficient data processing and analytics.

Experience: 6+ years

Key Responsibilities:
- Develop and optimize ETL pipelines: build robust and scalable data pipelines using ADF, Databricks, and Python for data ingestion, transformation, and loading.
- Data modeling and systematic layer modeling: design logical, physical, and systematic data models for structured and unstructured data.
- Integrate SAP IS-Auto: extract, transform, and load data from SAP IS-Auto into Azure-based data platforms.
- Database management: develop and optimize SQL queries, stored procedures, and indexing strategies to enhance performance.
- Big data processing: work with Azure Databricks for distributed computing, Spark for large-scale processing, and Delta Lake for optimized storage.
- Data quality and governance: implement data validation, lineage tracking, and security measures for high-quality, compliant data.
- Collaboration: work closely with business analysts, data scientists, and DevOps teams to ensure data availability and usability.

Required Skills:
- Azure cloud expertise: strong experience in Azure Data Factory (ADF), Databricks, and Azure Synapse.
- Programming: proficiency in Python for data processing, automation, and scripting.
- SQL and database skills: advanced knowledge of SQL, T-SQL, or PL/SQL for data manipulation.
- SAP IS-Auto data handling: experience integrating SAP IS-Auto as a data source into data pipelines.
- Data modeling: hands-on experience in dimensional modeling, systematic layer modeling, and entity-relationship modeling.
- Big data frameworks: strong understanding of Apache Spark, Delta Lake, and distributed computing.
- Performance optimization: expertise in query optimization, indexing, and performance tuning.
- Data governance and security: knowledge of RBAC, encryption, and data privacy standards.

Preferred Qualifications:
- Experience with CI/CD for data pipelines using Azure DevOps.
- Knowledge of Kafka/Event Hub for real-time data processing.
- Experience with Power BI/Tableau for data visualization (not mandatory but a plus).

Posted 4 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Role: Data Engineer
Location: Bangalore
Type: Full-time
Experience: 6 to 10 years
Notice: Immediate

Job Description - Data Engineer (Azure, ADF, Databricks, PySpark, SCD, Unity Catalog, SQL)

Role Overview: We are looking for a highly skilled, experienced Data Engineer with expertise in Azure Data Factory (ADF), Azure Databricks, Delta Tables, Unity Catalog, Slowly Changing Dimension Type 2 (SCD2), and PySpark. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and ETL workflows while ensuring data integrity, scalability, and security within the Azure ecosystem.

Key Responsibilities:
- Develop and optimize data pipelines using Azure Data Factory (ADF) and Azure Databricks for large-scale data processing.
- Implement Slowly Changing Dimensions in Delta Tables to manage historical data changes effectively.
- Leverage Unity Catalog for secure and organized data governance, cataloging, and access control across Databricks.
- Write efficient PySpark code to process and transform large datasets, ensuring high performance and scalability.
- Design and implement ETL/ELT solutions to integrate data from multiple sources into Delta Lake.
- Monitor, debug, and optimize existing data pipelines to ensure smooth operations and minimal downtime.
- Ensure data quality, consistency, and lineage tracking through best practices and automation.
- Collaborate with data architects, analysts, and business teams to define requirements and implement data-driven solutions.

Required Skills and Qualifications:
- 6+ years of experience in data engineering with a focus on Azure technologies.
- Expertise in Azure Data Factory (ADF) and Azure Databricks for ETL/ELT workflows.
- Strong knowledge of Delta Tables and Unity Catalog for efficient data storage and management.
- Experience with Slowly Changing Dimensions (SCD2) implementation in Delta Lake.
- Proficiency in PySpark for large-scale data processing and transformation.
- Hands-on experience with SQL and performance tuning for data pipelines.
- Understanding of data governance, security, and compliance best practices in Azure.
- Knowledge of CI/CD and DevOps practices for data pipeline automation.

Preferred Qualifications:
- Experience with Azure Synapse Analytics, Data Lakes, and Power BI integration.
- Knowledge of Kafka or Event Hub for real-time data ingestion.
- Certifications in Microsoft Azure (DP-203, DP-900) or Databricks are a plus.
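The SCD2 requirement above boils down to one piece of logic: when a tracked attribute changes, expire the current dimension row and append a new current version. In Delta Lake this is normally done with a `MERGE INTO`; the sketch below illustrates the same logic with plain Python dicts standing in for dimension rows, so the mechanics are visible without a Spark cluster. The schema (`key`, `city`, validity columns) is a hypothetical example.

```python
# Plain-Python illustration of Slowly Changing Dimension Type 2 (SCD2):
# changed rows are closed out (is_current=False, valid_to set) and a new
# current version is appended. In Databricks this would be a Delta MERGE.
from datetime import date

def apply_scd2(dim_rows, updates, today):
    """Return the dimension history after applying `updates` as of `today`."""
    current = {r["key"]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for upd in updates:
        old = current.get(upd["key"])
        if old is not None and old["city"] == upd["city"]:
            continue  # no attribute change: keep the existing current row
        if old is not None:
            old["is_current"] = False
            old["valid_to"] = today  # expire the superseded version
        out.append({"key": upd["key"], "city": upd["city"],
                    "valid_from": today, "valid_to": None, "is_current": True})
    return out

dim = [{"key": 1, "city": "Pune", "valid_from": date(2024, 1, 1),
        "valid_to": None, "is_current": True}]
updates = [{"key": 1, "city": "Mumbai"},   # changed attribute -> new version
           {"key": 2, "city": "Delhi"}]    # brand-new key
history = apply_scd2(dim, updates, date(2025, 1, 1))
```

The history ends up with three rows: the expired Pune row plus two current rows, which is exactly the audit trail SCD2 is meant to preserve.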

Posted 4 weeks ago

Apply

4 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

We have an urgent opening for: Snowflake Developer / Senior Snowflake Developer
Experience: 4 - 8 years
Job Location: Hyderabad / Pune / Coimbatore (WFO, all 5 days)
Mandatory Skills: Snowflake, ADF, SQL
Notice Period: Immediate joiner to 15 days

Job Description: We are looking for a Snowflake and ADF developer to join our Data Leverage team - a team of high-energy individuals who thrive in a rapid-paced and agile product development environment. As a developer, you will provide accountability in the ETL and data integration space, from the development phase through delivery. You will work closely with the Project Manager, Technical Lead, and client teams. Your prime responsibility will be to develop bug-free code with proper unit testing and documentation. You will provide inputs to planning, estimation, scheduling, and coordination of technical activities related to ETL-based applications.

Key Responsibilities:
- Develop, implement, and optimize complex SQL queries and functions using Snowflake.
- Write Snowflake scripts.
- Apply SQL JOINs, CASE statements, and data-format-conversion SQL functions effectively.
- Work with heterogeneous sources, transforming the data into output files.
- Understand the business requirements for data flow processes.
- Understand requirements and functional and technical specification documents.
- Develop mapping documents and transformation business rules (source to target) per scope and requirements.
- Develop Airflow DAGs for data flow processes.
- Analyse existing SQL/Snowflake queries for performance improvements.
- Collaborate closely with onsite lead data analysts on dependencies and requirements.
- Communicate project status continuously, both formally and informally.
- Follow the JIRA story process for SQL development activities.

About Our Client: Our client is among the fastest-growing insurance-focused IT services providers in North America. Leading insurers trust them with their core, digital, and data transformation initiatives. Having grown consistently by 24% every year, they now have over 4,000 employees. They are committed to integrity and to ensuring that each team and employee is successful. They foster an open work culture where employees' opinions are valued, believe in teamwork, and cultivate a sense of fun, fellowship, and pride among employees.

Kindly share your updated resume or forward within your network: ankita.jaiswal@firstwave-tech.com

Posted 4 weeks ago

Apply

3 years

0 Lacs

Hyderabad, Telangana, India

Remote

Linkedin logo

**Job Title:** Azure Integration Developer
**Location:** Hyderabad/Indore/Remote
**About the Company:** ValueLabs is a leading provider of high-tech IT services and solutions, specializing in software development, cloud computing, data analytics, cybersecurity, and more. Our clients span various sectors including hardware and consumer electronics, software and cloud computing, IT services and consulting, internet and e-commerce, semiconductors and electronics, and telecom.
**Job Summary:** We are seeking a highly skilled Azure Integration Developer with at least 3 years of experience to join our team. The ideal candidate will have expertise in designing, implementing, and managing Azure integration solutions that enable seamless data exchange between various systems and applications. This role involves working closely with cross-functional teams to deliver robust, scalable, and secure integration solutions.
**Requirements:**
- **Education:** Bachelor's degree in Computer Science, Information Technology, or a related field.
- **Experience:** Minimum of 3 years of experience in developing Azure integration solutions.
- **Skills:**
  - Proficient in Azure Data Factory (ADF), Logic Apps, Functions, Azure SQL, Azure Data Warehouse, Power BI, DAX, Power Apps, Power Automate, and Azure DevOps.
  - Strong knowledge of ETL processes and data warehousing concepts.
  - Familiarity with Azure Databricks and Azure Synapse Analytics.
  - Knowledge of Agile methodologies and CI/CD practices.
- **Certifications:** Azure Certified Professional or equivalent certifications in relevant technologies.

Posted 4 weeks ago

Apply

8 - 18 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Greetings from TCS!

TCS is hiring for Data Architect.
Interview Mode: Virtual
Required Experience: 8-18 years
Work Location: Chennai, Kolkata, Hyderabad

Data Architect (Azure/AWS)
- Azure: hands-on experience in ADF, HDInsight, Azure SQL, PySpark, Python, MS Fabric, and data mesh. Good to have: Spark SQL, Spark Streaming, Kafka.
- AWS: hands-on experience in Databricks on AWS, Apache Spark, AWS S3 (data lake), AWS Glue, and AWS Redshift/Athena. Good to have: AWS Lambda, Python, AWS CI/CD, Kafka, MLflow, TensorFlow or PyTorch, Airflow, CloudWatch.

If interested, kindly send your updated CV and the details below by email to srishti.g2@tcs.com:
Name:
E-mail ID:
Contact Number:
Highest qualification:
Highest qualification university:
Preferred location:
Current organization:
Total years of experience:
Relevant years of experience:
Any gap (career/education): mention number of months/years:
If any, reason for gap:
Is it rebegin:
Previous organization name:
Current CTC:
Expected CTC:
Notice Period:
Have you worked with TCS before (permanent/contract):

Posted 4 weeks ago

Apply

12 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Overview

At PepsiCo we're redefining operational excellence with a data-driven mindset, and our Global IT team is at the forefront of this transformation. Our technology teams leverage advanced analytics to deliver predictive insights, enhance operational efficiency, and create unmatched consumer and customer experiences. Our culture is guided by our core values, which define our mission to excel in the marketplace and act with integrity in everything we do. We're creating value with every initiative while promoting a sustainable and socially impactful agenda.

Responsibilities

Key areas:
- Predictive AI-based operations
- ServiceNow "Now Assist" at the Service Desk and Digital Experience
- Descriptive analytics and insights generation on ServiceNow data
- Azure cloud, data architecture, and Azure ML services for the Global Service Desk, IT Service Management (ITSM), and Global Workplace
- Leadership and stakeholder management

Responsibilities:
- Predictive Ops and IT experience management: leverage extensive domain expertise in ServiceNow ITSM, Service Desk management, and end-user experience management to identify areas for improvement and opportunities for AI and predictive IT Ops applications, building capabilities and optimizing workplace efficiency.
- Azure Machine Learning: lead the exploration and identification of predictive and forecasting use cases tailored to the ServiceNow platform, focusing on maximizing business impact and user adoption using the Azure stack. Utilize Azure Machine Learning to develop and deploy predictive models, ensuring integration with Azure services and seamless operationalization.
- Product management: prioritize and manage the Digital Brain (i.e., AI use cases) product backlog, ensuring the timely delivery of predictive models, features, and improvements. Oversee the release of high-quality predictive solutions that meet organizational goals.
- Leadership and management: partner with leadership to develop a strategic roadmap for applying AI and predictive capabilities across ITSM, Service Desk, and Digital Experience functions, leveraging ServiceNow data.
- Stakeholder collaboration: collaborate extensively with stakeholders to understand pain points and opportunities, translating business needs into precise user stories and actionable tasks. Ensure clear communication and alignment between business objectives and technical implementation.
- Lead other team members across digital projects, acting as the data science lead and as a subject matter expert; act as stream leader in innovation activities.
- Partner with product managers in taking DS requirements and assessing DS components in roadmaps.
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption.
- Lead ML engineers working on industrialization.
- Coordinate work activities with business teams and other IT services as required.
- Drive the use of the platform toolset, and focus on "the art of the possible" demonstrations to the business as needed.
- Communicate with business stakeholders in the process of service design, training, and knowledge transfer.
- Support large-scale experimentation and build data-driven models.
- Set KPIs and metrics to evaluate an analytics solution for a given use case.
- Refine requirements into modelling problems.
- Influence product teams through data-based recommendations.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer; create reusable packages or libraries.

Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Experience: extensive experience (12+ years) in the ITSM / Service Desk transformation / IT operations arena, with exposure to predictive intelligence, data architecture, data modelling, and data engineering, focused on Azure cloud-based solutions. Experience in cloud-based development and deployment (Azure preferred) and in leading contractors or other team members.
- Technical skills: knowledge of Azure cloud services, including Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Databricks (ADB), and Azure Machine Learning. Strong knowledge of statistical/ML/AI techniques for supervised (regression, classification) and unsupervised problems, with a focus on time series forecasting; experience with deep learning is a plus.
- Domain knowledge: deep understanding of ServiceNow modules, specifically ITSM incident, problem, request, and change management, coupled with extensive knowledge of predictive analytics and data science principles. Understanding of and visibility into IT operations and support services, including Global Workplace Services such as end-user compute, workplace management solutions, and unified communications and collaboration, is an added advantage.
- Analytical skills: outstanding analytical and problem-solving skills to translate extensive business experience into highly effective predictive intelligence solutions.
- Communication: exceptional communication and interpersonal skills, honed through years of collaboration with diverse stakeholders and vendors.
- Methodologies: extensive experience with agile methodologies and a record of working in highly dynamic and agile development environments.
- Project management: proven ability to manage multiple projects concurrently, prioritizing tasks effectively to drive impactful results.
- Leadership: demonstrated leadership and management capabilities, with a track record of guiding teams to achieve strategic goals and fostering a collaborative team environment.

Functional knowledge (at least one of these): IT Service Management (ITSM); IT Service Desk; ServiceNow (ITSM module); Digital Workplace Services.

Technical knowledge: Azure Machine Learning (AML) - mandatory; Azure Databricks (ADB) - mandatory; Azure Data Factory (ADF) - optional; Azure Data Lake Storage (ADLS) - optional.

Certifications (at least one of these): Azure Fundamentals (AI-900); Azure AI Engineer; Azure Data Scientist; ITIL Foundation or above.
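The role above centers on time series forecasting over ServiceNow operational data (e.g., predicting ticket volumes). As a toy illustration of the simplest such model, here is simple exponential smoothing in plain Python; in practice a model like this would be trained and deployed through Azure Machine Learning, and the `tickets` series is invented for the example.

```python
# Simple exponential smoothing (SES): one-step-ahead forecast is the
# final smoothed level, level_t = alpha*y_t + (1-alpha)*level_{t-1}.
def ses_forecast(series, alpha=0.5):
    """Return the one-step-ahead SES forecast for a numeric series."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Hypothetical weekly ticket counts at a service desk.
tickets = [100, 120, 110, 130]
print(round(ses_forecast(tickets, alpha=0.5), 2))  # prints 120.0
```

Higher `alpha` weights recent observations more heavily, which matters when ticket volumes shift after a change rollout.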

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Principal Engineer -Web Application Position: Principal Engineer – Web Application Experience: 8 – 12 Years Shift Timings: 3:30 PM IST to 12:30 AM IST . Work Mode – 3 days from Office, 2 days from home Location – Orbit by Auro, Raidurg, Hyderabad. Job Summary: The Engineer, Web Application will be responsible for developing web applications that support strategic business processes. The person will work independently,partner with IS and business liaisons to understand requirements to deliver enhanced end-user experience for web applications. Recognized expert who anticipates business challenges and drives process, product, and service improvements. Handles complex, abstract tasks requiring integration of diverse functions, industry trends, and policies. Holds high responsibility for budgets, expenditures, and strategic direction, navigating ambiguity with minimal guidance. Leads project teams to achieve key milestones and objectives. Key Responsibilities: ● Provide functional and technical guidance, set priorities, develop Proof of Concepts, review work products, and report on team activities. ● Offer strong technical support and hands-on development. ● Implement industry best practices and Proof of Concepts for emerging technologies, ensuring compliance with policies and maintaining proper documentation and quality controls. ● Design robust data structures and processes to resolve data integrity issues across web applications, supporting business growth and evolving priorities. ● Collaborate with consultants and contractors, review deliverables, and ensure adherence to SDLC, policies, and security standards.Translate business requirements into technical specifications, recommend design improvements, and update user and application documentation. ● Partner with managers and end users to determine optimal IT solutions. 
Preferred Skills:
● Strong experience in Oracle ADF/Angular, React, Java, JavaScript, HTML, CSS, Spring Boot, and Node.js
● Proficient understanding of REST web services and responsive web design
● Experience developing, testing, and debugging software, preferably using SQL/PLSQL, Java, Python, and/or an Oracle/Postgres back end
● Academic experience acceptable
Experience:
● 8-12 years of working experience
● Relevant experience with systems software programming; experience providing z/OS systems support and operational experience related to scheduling, monitoring, and facilitating the troubleshooting of production issues within a mainframe environment
● Bachelor's Degree in Computer Science/Engineering/Information Systems
● Ability to multi-task and prioritize work for self and teammates

Posted 4 weeks ago

Apply

6 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description:
- C#, ASP.NET
- REST API
- Azure Storage
- MS experience
- Knowledge of AI
- Passionate about learning
Good to have: Synapse, ADF, PySpark, Python, Selenium
Other requirements:
- Preferred Location: Hyderabad
- Project Type: Managed
- Communication: Good
- Should be able to work a couple of overlapping PST hours
Experience: 6+ Years
Position: 6 - 8

Posted 4 weeks ago

Apply

6 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description:
Minimum experience needed: 6+ years in .NET only
- C#, ASP.NET
- REST API
- Azure Storage
- MS experience
- Knowledge of AI
- Passionate about learning
Good to have: Synapse, ADF, PySpark, Python, Selenium
Other requirements:
- Preferred Location: Hyderabad
- Project Type: Managed
- Communication: Good
- Should be able to work a couple of overlapping PST hours
Experience: 6+ Years
Position: 6 - 8

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Overview: We are seeking an experienced Azure ETL Tester to validate and verify data pipelines, transformation logic, and data quality across our Azure-based data platform. The ideal candidate will have strong ETL testing skills, deep knowledge of SQL, and hands-on experience with Azure Data Factory (ADF), Synapse Analytics, and related services.
Key Responsibilities:
- Design and execute test plans for ETL workflows built on the Azure data platform.
- Validate data movement, transformation logic, and data loading across systems.
- Write and execute complex SQL queries for data validation, reconciliation, and defect analysis.
- Test data pipelines built using Azure Data Factory, Azure Synapse, and Databricks.
- Identify, report, and track data quality and integrity issues.
- Collaborate with developers and data engineers to resolve defects and improve data accuracy.
- Document test cases, test data requirements, and test results, and maintain traceability.
- Support performance and regression testing of ETL jobs.
Required Skills:
- 4–7 years of experience in ETL testing, data validation, and data warehouse projects.
- Strong SQL skills – ability to write complex joins, aggregations, and data comparison queries.
- Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and/or Azure Databricks.
- Good understanding of ETL processes, source-to-target mappings, and data profiling.
- Experience with defect tracking and test management tools (e.g., JIRA, TestRail, HP ALM).
- Strong analytical and problem-solving abilities.
- Good communication and documentation skills.
Nice to Have:
- Experience with Python or PowerShell for automation.
- Familiarity with data quality tools like Great Expectations or Informatica DQ.
- Exposure to CI/CD pipelines for data testing automation.
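The reconciliation work this role describes (comparing row counts and finding rows that never reached the target) can be sketched in a few lines. Below is a minimal illustration using Python's built-in sqlite3 module; the table and column names (src_orders, tgt_orders) are hypothetical, and a real test would run the same style of queries against the actual source and target systems over separate connections.

```python
import sqlite3

# Minimal sketch of source-to-target reconciliation, the kind of check an
# ETL tester automates. Table/column names here are invented for the example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5);  -- row 3 missing
""")

# Row-count reconciliation: totals must match between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Key-level comparison: find source rows that never reached the target.
missing = conn.execute("""
    SELECT s.order_id FROM src_orders s
    LEFT JOIN tgt_orders t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3,)]
```

The LEFT JOIN with a `WHERE t.order_id IS NULL` filter is a standard way to surface source keys missing from the target; the same idea extends to column-level validation by joining on the key and comparing each mapped column.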

Posted 4 weeks ago

Apply

4 - 12 years

0 Lacs

Hyderabad, Telangana, India

On-site


Greetings from TCS!
TCS is hiring for Azure Data Engineer.
Walk-in interview for Azure Databricks Data Engineer in Hyderabad.
Walk-in Interview Date: 24th May 2025 (Saturday)
Role: Azure Data Engineer
Desired Experience: 4-12 Years
Job Location: PAN India
Job Description:
● Minimum 4+ years of development experience in Azure.
● Must have Data Warehouse / Data Lake development experience.
● Must have Azure Data Factory (ADF) and Azure SQL DB experience.
● Must have Azure Databricks experience using Python, Spark, or Scala.
● Nice to have Data Modelling and Azure Synapse experience.
● Passion for data quality, with the ability to integrate these capabilities into the deliverables.
● Prior use of Big Data components and the ability to rationalize and align their fit for a business case.
● Experience working with different data sources – flat files, XML, JSON, Avro files, and databases.
● Experience developing implementation plans and schedules and preparing documentation for jobs according to the business requirements.
● Proven experience and ability to work with people across the organization; skilled at managing cross-functional relationships and communicating with leadership across multiple organizations.
Registration Time: 09:30 AM – 12:30 PM
Venue: Tata Consultancy Services Ltd, Hyderabad
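The posting above asks for experience with different data sources (flat files, XML, JSON, Avro). As a hedged illustration of what that means in practice, the sketch below normalizes records from a flat file and a JSON feed into one schema using only the Python standard library; the field names (id, city) are invented for the example, and XML or Avro sources would follow the same pattern with their own parsers.

```python
import csv, io, json

# Two toy "sources" in different formats (illustrative data only).
flat_file = "id,city\n1,Hyderabad\n2,Chennai\n"
json_feed = '[{"id": 3, "city": "Pune"}]'

# Normalize both into one record shape before any downstream load step.
records = [
    {"id": int(row["id"]), "city": row["city"]}
    for row in csv.DictReader(io.StringIO(flat_file))
]
records += [{"id": r["id"], "city": r["city"]} for r in json.loads(json_feed)]

print(records)
```

In an ADF or Databricks pipeline the same normalization happens at ingestion, so every downstream transformation sees a single schema regardless of the source format.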

Posted 4 weeks ago

Apply

0 years

0 Lacs

India

Remote


Role: Data Engineer
Years of experience: 5+ years
Location: Remote
Notice: Only immediate joiners
- Minimum 5+ years of experience in a relevant field.
- Deep understanding of private and public cloud architectures.
- Experience with Azure services, including Azure Data Factory (ADF) for orchestrating complex workflows.
- Hands-on expertise with Databricks and PySpark for big data transformation and advanced analytics.
- Strong emphasis on and deep experience working with relational databases (RDBMS) including SQL Server, PostgreSQL, and MySQL, with advanced SQL skills.
- Experience with Terraform, Kubernetes, and service mesh.
- Expertise with open-source stack technologies.
- Design and develop ETL pipelines using Java, Scala, or Python.
- Ingestion and transformation of data to/from RDBMS and NoSQL databases (e.g., Cassandra, PostgreSQL, YugabyteDB).
- Job orchestration using Apache Airflow or Oozie.
- Adept with the Agile software development lifecycle and DevOps principles.
- Prior experience in Informatica PowerCenter or other ETL tools.
Interested candidates can send their profiles to anuritha@prosmcloudinc.com
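The posting above calls for designing ETL pipelines in Java, Scala, or Python. The following is a minimal, self-contained Python sketch of the extract-transform-load shape such a pipeline takes; the sample data, cleaning rule, and sqlite3 target are illustrative assumptions, since a production pipeline would read from real RDBMS/NoSQL sources and be orchestrated by a tool like Apache Airflow.

```python
import sqlite3

def extract():
    # Pretend these rows came from a source system (illustrative data).
    return [("  alice ", 34), ("BOB", 29), (None, 41)]

def transform(rows):
    # Normalize names and drop rows that fail a basic quality rule.
    return [(name.strip().lower(), age) for name, age in rows if name]

def load(rows, conn):
    # Load the cleaned rows into a target table.
    conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT name, age FROM users ORDER BY name").fetchall())
# → [('alice', 34), ('bob', 29)]
```

Keeping extract, transform, and load as separate functions is what makes each stage testable in isolation, which is also how orchestrators like Airflow model pipelines: one task per stage.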

Posted 4 weeks ago

Apply

0 years

0 Lacs

India

On-site


Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore and service a client base across banking and financial services, insurance, information technology, healthcare, retail and supply chain.
*Singapore, onsite. Looking for short-notice candidates only.
Job Description: We are looking for an experienced Oracle IAM Consultant to join our dynamic team.
Design and Implementation:
- Designing and implementing IAM solutions using OIM/OIG.
- Developing custom connectors to integrate OIM with various applications and systems.
- Building and configuring OIM workflows, approval policies, and entitlements.
- Developing custom UI components for OIM self-service pages.
Skills and Experience:
- Experienced in end-to-end integration of IAM solutions using Oracle Identity Governance.
- Prior experience with requirement gathering, analysis, design, development, maintenance, and upgrades in different environments such as DEV, QA, UAT, and PROD.
- Experience with ICF-based framework connectors to integrate with target applications, perform CRUD operations, and manage roles in the target system.
- Extensive hands-on experience with custom code development such as event handlers, validation plugins, and scheduled tasks using the Java API.
- Experience with audit reports using OIM BI Publisher, and customizing the logo and header of the UI screens and audit reports.
- Implement Oracle ADF customizations for user interfaces.
- Build custom Oracle SOA composites for workflows.
Java Experience:
- Best-practice-based secure Java development.
- Exposure and hands-on experience with REST APIs and web services.
- Ability to reuse existing code and extend frameworks.
Administration and Management:
- Administering and managing OIM environments.
- Ensuring the IAM platform is secure, scalable, and supports business requirements.
- Monitoring the performance and health of IAM systems.
Security and Compliance:
- Developing and enforcing IAM policies and procedures.
- Collaborating with security teams to address vulnerabilities.
Support and Troubleshooting:
- Supporting end-users with access-related issues and requests.
- Troubleshooting and resolving technical issues related to the OIM implementation.
Good to Have:
- Hands-on experience with Oracle Access Manager.
- Good understanding of AS400 and relevant infrastructure.
- Unix scripting.
- Strong SQL knowledge.
WHAT’S ON OFFER
You will be remunerated with an excellent base salary and entitled to attractive company benefits. Additionally, you will get the opportunity to enjoy a fun and collaborative work environment, alongside strong career progression.
To submit your application, please apply online or email your updated CV to swathi@aven-sys.com. Your interest will be treated with strict confidentiality.
CONSULTANT DETAILS:
Consultant Name: Swathi
Avensys Consulting Pte Ltd
EA Licence 12C5759
Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Reporting Data Engineer
Join EY as a MARS Data Engineer and be at the forefront of providing and implementing innovative data insights, data products, and data services. MARS is a data platform providing custom data insights, DaaS and DaaP for a variety of EY departments and staff. We leverage software development practices to develop intricate data insights and data products.
Your Key Responsibilities
As a member of the MARS team, you will play a critical role in our mission of providing innovative data insights and in the operations and support of the MARS data platform. This includes supporting customers, internal team members, and management. Operations and support include the estimating, designing, developing and delivery of data products and services. You will contribute your creative solutions and knowledge to our data platform, which features 2TB of mobile device data daily (300K+ devices). Our platform empowers our product managers and helps enable our teams to build a better working world. As a reporting engineer with the MARS team, the following activities are expected:
- Collaborate closely with the product manager to align activities to timelines and deadlines
- Proactively suggest new ideas and solutions, driving them to implementation with minimal guidance on technical delivery
- Provide input to the MARS roadmap and actively participate to bring it to life
- Collaborate with the Intune engineering team to get a clear understanding of the mobile device lifecycle and its relationship to Intune data and reporting
- Serve as the last level of support for all MARS data reporting questions and issues.
Participate and contribute to the activities below:
- Customer discussions and requirement-gathering sessions
- Application reports (daily, weekly, monthly, quarterly, annually)
- Custom reporting for manual reports, dashboards, exports, APIs, and semantic models
- Customer service engagements
- Daily team meetings
- Work estimates and daily status
- Data and dashboard monitoring and troubleshooting
- Automation
- Data management and classification
- Maintaining design documentation for data schemas, data models, the data catalogue, and related products/services
- Monitoring and integrating a variety of data sources
- Maintaining and developing custom data quality tools
Skills and Attributes for Success
General Skills
- Analytical Ability: Strong analytical skills in supporting core technologies, particularly in managing large user bases, to effectively troubleshoot and optimize data solutions.
- Communication Skills: Excellent written and verbal communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical stakeholders. Proficiency in English is required, with additional languages being a plus.
- Interpersonal Skills: Strong interpersonal skills, sound judgment, and tact to foster collaboration with colleagues and customers across diverse cultural backgrounds.
- Creative Problem-Solving: Ability to conceptualize innovative solutions that add value to end users, particularly in the context of mobile applications and services.
- Self-Starter Mentality: A proactive and self-motivated approach to work, with the ability to take initiative and drive projects forward independently.
- Documentation Skills: Clear and concise documentation skills, ensuring that all processes, solutions, and communications are well-documented for future reference.
- Organizational Skills: The ability to define project plans, execute them, and manage ongoing risks and communications throughout the project lifecycle.
- Cross-Cultural Awareness: Awareness of and sensitivity to cross-cultural dynamics, enabling effective collaboration with global teams and clients.
- User Experience Focus: Passionate about improving user experience, with an understanding of how to measure, monitor, and enhance user satisfaction through feedback and analytics.
To qualify for the role, you must have at least three years of experience in the following technologies and methodologies:
- Hands-on experience with Microsoft Intune data and Mobile Device and Application Management data (MSFT APIs, Graph and IDW)
- Proven experience in mobile platform engineering or a related field
- Strong understanding of mobile technologies and security protocols, particularly within an Intune-based environment
- Experience with Microsoft Intune, including mobile device and application management
- Proficient in supporting Modern Workplace tools and resources
- Experience with iOS and Android operating systems
- Proficient in PowerShell scripting for automation and management tasks
- Ability to operate proactively and independently in a fast-paced environment
- Solution-oriented mindset with the capability to design and implement creative mobile solutions, and the ability to suggest and implement solutions that meet EY’s requirements
- Ability to work UK working hours
Specific technology skills include the following:
Technical Skills
- Power BI: semantic models, advanced dashboards, Power BI templates
- Intune reporting and Intune data: compliance, device, policy management, metrics, monitoring
- SPLUNK data and reporting
- Sentinel data and reporting
- HR data and reporting
- Mobile Defender data and reporting
- AAD (Azure Active Directory)
- Data quality and data assurance
- Databricks
- Web analytics and mobile analytics
- Azure Data Factory (ADF) automation, Azure Pipelines/Synapse, Azure SQL DB/Server
- Azure Kubernetes Service (AKS)
- Key Vault management, Azure Monitoring, App Proxy and Azure Front Door data exports
- API development
- Python, SQL, KQL, Power Apps
- MSFT Intune APIs (Export, App Install)
- Virtual machines
- SharePoint: general operations
- Data modeling, ETL and related technologies
Ideally, you’ll also have the following:
- Strong communication skills to effectively liaise with various stakeholders
- A proactive approach to suggesting and implementing new ideas
- Familiarity with the latest trends in mobile technology
- Ability to explain very technical topics to non-technical stakeholders
- Experience in managing and supporting large mobile environments
- Testing and quality assurance: ensure our mobile platform meets quality, performance and security standards
- Implementation of new products and/or service offerings
- Experience working in a large global environment
- XML data formats
- Agile delivery
- Object-oriented design and programming
- Software development
- Mobile
What we look for: A person that demonstrates a commitment to integrity, initiative, collaboration, efficiency and three or more years in the field of data analytics and Intune data reporting.
What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Senior Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting – Business Applications – Oracle practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer
- Completed at least 2 full Oracle Cloud (Fusion) implementations
- Extensive knowledge of database structures for ERP/Oracle Cloud (Fusion)
- Extensive work on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)
Preferred skill sets: Database structures for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 4 years of Oracle Fusion experience
Education Qualification: BE/BTech, MBA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Fusion Middleware (OFM)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure,
Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description: We are looking for passionate and skilled software developers ready to take on the next big challenge: building AI-powered agentic applications for Oracle Fusion ERP. As part of the Fusion Financial Technology team, you will contribute to the design and development of next-generation enterprise agents like Ledger Agent and Payment Agent, which leverage generative AI and autonomous decision-making to simplify financial operations for our customers. Fusion Applications on Cloud is the next-generation applications offering in the Oracle portfolio. The Fusion Financials team builds world-class financial products, common components, and tools using cutting-edge technologies from the Oracle Fusion middleware tech stack: SOA, SOAP and REST services, BIP, ESS, and ADF. We are looking for highly talented, dynamic professionals to be part of the one-of-its-kind "Fusion" project. Come and join the most vibrant common core team in Fusion Financials and make a difference.
Responsibilities:
- Lead the design and development of AI-powered agentic applications such as Ledger Agent and Payment Agent within the Oracle Fusion ERP suite.
- Architect and build scalable, cloud-native financial services using Java, SOA, REST/SOAP, ADF, and Oracle middleware technologies.
- Drive the integration of generative AI techniques to enable intelligent decision-making in enterprise workflows.
- Define and implement microservices-based solutions using Docker, Kubernetes, and CI/CD pipelines (Jenkins, Git).
- Collaborate with product managers, architects, and cross-functional teams to align technical design with business goals.
- Mentor and guide junior developers, perform code reviews, and ensure adherence to coding and design best practices.
- Own complex problem-solving efforts and troubleshoot production issues with a focus on root-cause resolution.
- Stay ahead of technology trends, especially in AI/GenAI, and drive innovation into existing and new application components.
Champion agile methodologies and continuous improvement in development processes, tooling, and team collaboration.
Qualifications: Career Level - IC3
About Us: As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Telangana, India

On-site


Job Description: We are looking for passionate and skilled software developers ready to take on the next big challenge: building AI-powered agentic applications for Oracle Fusion ERP. As part of the Fusion Financial Technology team, you will contribute to the design and development of next-generation enterprise agents like Ledger Agent and Payment Agent, which leverage generative AI and autonomous decision-making to simplify financial operations for our customers. Fusion Applications on Cloud is the next-generation applications offering in the Oracle portfolio. The Fusion Financials team builds world-class financial products, common components, and tools using cutting-edge technologies from the Oracle Fusion middleware tech stack: SOA, SOAP and REST services, BIP, ESS, and ADF. We are looking for highly talented, dynamic professionals to be part of the one-of-its-kind "Fusion" project. Come and join the most vibrant common core team in Fusion Financials and make a difference.
Responsibilities:
- Lead the design and development of AI-powered agentic applications such as Ledger Agent and Payment Agent within the Oracle Fusion ERP suite.
- Architect and build scalable, cloud-native financial services using Java, SOA, REST/SOAP, ADF, and Oracle middleware technologies.
- Drive the integration of generative AI techniques to enable intelligent decision-making in enterprise workflows.
- Define and implement microservices-based solutions using Docker, Kubernetes, and CI/CD pipelines (Jenkins, Git).
- Collaborate with product managers, architects, and cross-functional teams to align technical design with business goals.
- Mentor and guide junior developers, perform code reviews, and ensure adherence to coding and design best practices.
- Own complex problem-solving efforts and troubleshoot production issues with a focus on root-cause resolution.
- Stay ahead of technology trends, especially in AI/GenAI, and drive innovation into existing and new application components.
Champion agile methodologies and continuous improvement in development processes, tooling, and team collaboration.
Qualifications: Career Level - IC3
About Us: As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Hi {fullName},
There is an opportunity for Azure Data Engineer (Databricks, PySpark, Python, SQL) in Hyderabad, for which a walk-in interview will be held on 24th May 2025 between 9:30 AM and 12:30 PM. Please share the details below to mamidi.p@tcs.com with the subject line "Azure Data Engineer 24th May 25" if you are interested:
Email id:
Contact no:
Total exp:
Preferred location:
Current CTC:
Expected CTC:
Notice period:
Current organization:
Highest qualification that is full time:
Highest qualification university:
Any gap in education or employment:
If yes, how many years and reason for gap:
Are you available for walk-in interview on 24th May 25 (Yes/No):
We will share a mail with you by tomorrow night if you are shortlisted.
Role: Azure Data Engineer
Desired Competencies (Technical/Behavioral Competency)
Must-Have:
- Minimum 4+ years of development experience in Azure.
- Must have Data Warehouse / Data Lake development experience.
- Must have Azure Data Factory (ADF) and Azure SQL DB experience.
- Must have Azure Databricks experience using Python, Spark, or Scala.
- Nice to have Data Modelling and Azure Synapse experience.
- Passion for data quality, with the ability to integrate these capabilities into the deliverables.
- Prior use of Big Data components and the ability to rationalize and align their fit for a business case.
- Experience working with different data sources - flat files, XML, JSON, Avro files, and databases.
- Experience developing implementation plans and schedules and preparing documentation for jobs according to the business requirements.
- Proven experience and ability to work with people across the organization; skilled at managing cross-functional relationships and communicating with leadership across multiple organizations.
- Proven capability for strong written and oral communication, with the ability to synthesize, simplify, and explain complex problems to different audiences.
Good-to-Have
- Azure Data Engineer certifications.

Roles & Responsibilities
- Ability to integrate into a project team environment and contribute to project planning activities.
- Turn ambiguous and complex situations into clear, measurable plans.
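The listing's multi-format ingestion requirement (flat files, JSON, Avro, and so on) can be illustrated with a minimal, standalone Python sketch. The file contents, record shape, and function names here are hypothetical, and in a real Azure Databricks job this step would typically use PySpark readers (`spark.read.csv`, `spark.read.json`) rather than the standard library.

```python
import csv
import io
import json

# Hypothetical sample inputs standing in for a "flat file" and a JSON source.
CSV_DATA = "id,city\n1,Hyderabad\n2,Bengaluru\n"
JSON_DATA = '[{"id": 3, "city": "Pune"}]'

def load_csv(text):
    """Parse CSV flat-file text into a list of record dicts."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def load_json(text):
    """Parse a JSON array of objects into a list of record dicts."""
    return json.loads(text)

def ingest(sources):
    """Normalize heterogeneous (format, payload) sources into one record list."""
    loaders = {"csv": load_csv, "json": load_json}
    records = []
    for fmt, payload in sources:
        records.extend(loaders[fmt](payload))
    return records

rows = ingest([("csv", CSV_DATA), ("json", JSON_DATA)])
print(len(rows))  # 3
```

In Databricks the analogous pattern is reading each source into a DataFrame and unioning them after aligning schemas; the dispatch-by-format idea stays the same.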

Posted 1 month ago


Exploring ADF Jobs in India

The job market for ADF (Oracle Application Development Framework) professionals in India is growing significantly, with numerous opportunities available for job seekers in this field. ADF is a popular Java-based framework for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.

Top Hiring Locations in India

Here are five major cities in India with high demand for ADF professionals:

- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai

Average Salary Range

The estimated salary range for ADF professionals in India varies by experience level:

- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Career Path

In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.

Related Skills

In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.

Interview Questions

Here are sample interview questions for ADF roles, grouped by difficulty level:

Basic:
- What is ADF, and what are its key features?
- What is the difference between ADF Faces and ADF Task Flows?

Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?

Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?

Closing Remark

As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!
