
1655 ADF Jobs - Page 20

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.

What would you do?
In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and fixing bugs. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for?
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience (5+ years) as an Azure Data Factory Support Engineer II.
- Expertise in ADF with a deep understanding of its data-related libraries.
- Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
- Proficiency in SQL and experience with SQL database design.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Experience with ADF pipelines.
- Excellent problem-solving and troubleshooting skills.
- Experience in code review and debugging in a collaborative project setting.
- Excellent verbal and written communication skills.
- Ability to work in a fast-paced, team-oriented environment.
- Strong understanding of the business and a passion for the mission of Service Supply Chain.
- Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.

Roles and Responsibilities:
Innovate. Collaborate. Build. Create. Solve.
- Support ADF and associated systems; ensure systems meet business requirements and industry practices.
- Integrate new data management technologies and software engineering tools into existing structures.
- Recommend ways to improve data reliability, efficiency, and quality.
- Use large data sets to address business issues, and use data to discover tasks that can be automated.
- Fix bugs to ensure a robust and sustainable codebase.
- Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
- Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
- Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
- Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
- Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure (see the sketch after this listing).
- Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
- Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
- Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
- Stay updated with the latest ADF technologies and practices to continuously improve the support and maintenance of data systems.
- Flexible work hours, including US time zones; this position may require a rotational on-call schedule (24x7) and evening, weekend, and holiday shifts when the need arises.
- Participate in the Demand Management and Change Management processes.
- Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner, communicating and coordinating delivery of technology services from Technology for Operations (TfO).

Qualifications: Any Graduation, 12th/PUC/HSC
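For illustration, a minimal sketch of the proactive pipeline-run monitoring this role describes, using the azure-mgmt-datafactory Python SDK; the subscription, resource group, and factory names are placeholders:

```python
# Hedged sketch: query ADF for pipeline runs that failed in the last 24 hours.
# Subscription, resource group, and factory names are placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    RunFilterParameters, RunQueryFilter, RunQueryFilterOperand, RunQueryFilterOperator,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
    filters=[RunQueryFilter(
        operand=RunQueryFilterOperand.STATUS,
        operator=RunQueryFilterOperator.EQUALS,
        values=["Failed"],
    )],
)

runs = client.pipeline_runs.query_by_factory("<resource-group>", "<factory-name>", filters)
for run in runs.value:
    # In practice these would be routed to a ticketing tool (e.g., ServiceNow).
    print(run.pipeline_name, run.run_id, run.message)
```

A real deployment would feed these alerts into the team's ticketing workflow rather than printing them.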

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Kochi, Kerala, India

On-site

Location: Kochi, Coimbatore, Trivandrum
Must-have skills: Python/Scala, PySpark/PyTorch
Good-to-have skills: Redshift

Job Summary
You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries.

Roles and Responsibilities
- Design, develop, optimize, and maintain data pipelines that adhere to ETL principles and business goals.
- Solve complex data problems to deliver insights that help the business achieve its goals.
- Source data (structured and unstructured) from various touchpoints, and format and organize it into an analyzable form.
- Create data products for analytics team members to improve productivity.
- Call AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline.
- Foster a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions.
- Prepare data to create a unified database, and build tracking solutions that ensure data quality.
- Create production-grade analytical assets deployed using the guiding principles of CI/CD (a minimal pipeline sketch follows this listing).

Professional and Technical Skills
- Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript.
- Extensive experience in data analysis in Big Data (Apache Spark) environments, data libraries (e.g., Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience with these technologies.
- Experience with one of the many BI tools, such as Tableau, Power BI, or Looker.
- Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs.
- Worked extensively with Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse.

Additional Information
- Experience working with cloud data warehouses such as Redshift or Synapse.
- Certification in one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified: Azure Data Scientist Associate; SnowPro Core; Databricks Data Engineering.

About Our Company | Accenture
Experience: 3.5-5 years of experience is required.
Educational Qualification: Graduation (accurate educational details should be captured).
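As a rough illustration of the pipeline work described above, a minimal PySpark sketch that sources structured data, formats it into an analyzable shape, and persists it; the storage paths and column names are hypothetical:

```python
# Minimal PySpark ETL sketch: read raw CSV, clean it, write curated Parquet.
# Paths and column names are assumptions, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv(
    "abfss://raw@account.dfs.core.windows.net/orders/"
)

clean = (
    raw.dropDuplicates(["order_id"])                     # de-duplicate on the key
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize types
       .filter(F.col("amount").isNotNull())              # basic quality gate
)

clean.write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/orders/"
)
```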

Posted 3 weeks ago

Apply

0.0 - 2.0 years

5 - 12 Lacs

Pune, Maharashtra

On-site

Company name: PibyThree consulting Services Pvt Ltd.
Location: Baner, Pune
Start date: ASAP

Job Description:
We are seeking an experienced Data Engineer to join our team. The ideal candidate will have hands-on experience with Azure Data Factory (ADF), Snowflake, and data warehousing concepts. The Data Engineer will be responsible for designing, developing, and maintaining large-scale data pipelines and architectures.

Key Responsibilities:
- Design, develop, and deploy data pipelines using Azure Data Factory (ADF).
- Work with Snowflake to design and implement data warehousing solutions (a minimal load sketch follows this listing).
- Collaborate with cross-functional teams to identify and prioritize data requirements.
- Develop and maintain data architectures, data models, and data governance policies.
- Ensure data quality, security, and compliance with regulatory requirements.
- Optimize data pipelines for performance, scalability, and reliability.
- Troubleshoot data pipeline issues and implement fixes.
- Stay up to date with industry trends and emerging technologies in data engineering.

Requirements:
- 4+ years of experience in data engineering, with a focus on cloud-based data platforms (Azure preferred).
- 2+ years of hands-on experience with Azure Data Factory (ADF).
- 1+ year of experience working with Snowflake.
- Strong understanding of data warehousing concepts, data modeling, and data governance.
- Experience with data pipeline orchestration tools such as Apache Airflow or Azure Databricks.
- Proficiency in programming languages such as Python, Java, or C#.
- Experience with cloud-based data storage solutions such as Azure Blob Storage or Amazon S3.
- Strong problem-solving skills and attention to detail.

Job Type: Full-time
Pay: ₹500,000.00 - ₹1,200,000.00 per year
Schedule: Day shift
Ability to commute/relocate: Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience: total work: 4 years (Preferred); PySpark: 2 years (Required); Azure Data Factory: 2 years (Required); Databricks: 2 years (Required)
Work Location: In person
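For illustration only, one way an ADF-orchestrated load into Snowflake might finish, using the Snowflake Python connector to COPY already-staged files into a table; the connection values, stage, and table names are placeholders:

```python
# Hedged sketch: load files staged in Snowflake (e.g., landed by an ADF copy
# activity) into a target table. All names and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)
try:
    cur = conn.cursor()
    # COPY INTO pulls staged Parquet files into the target table.
    cur.execute("""
        COPY INTO analytics.orders
        FROM @landing_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    conn.close()
```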

Posted 3 weeks ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Role & Responsibilities:
Within the technical team and under the guidance of the Team Manager, you will:
- Be in charge of installing, configuring, and upgrading/patching the Product applications internally.
- Handle and follow up on technical issues (of wide diversity and complexity) and perform corrective actions.
- Interact actively with the functional and technical teams (including development and architecture) located around the globe.
- Provide advice on choices and implementation of interfaces/surrounds (inbound and outbound), including advice and support on how to develop client reporting.
- Propose solutions to address client challenges.
- Provide on-call support (24x7) on a rotation basis, including weekend/holiday support, and work in shifts on a rotation basis.
- Contribute to the technical knowledge base (preparation of documents/presentations on related topics).
- Provide training, guidance, and support to client IT teams.

Job Description:
- Good skills in Oracle Fusion Middleware 11g/12c (Forms & Reports, ADF, BI Publisher, Oracle Identity Management (OID/OAM)).
- Good skills in handling middleware vulnerability and security (CVE) related queries.
- Excellent analytical and logical skills, with the ability to address problems methodically.
- Ability to anticipate client needs and be proactive.
- Strong motivation to continuously increase quality and efficiency.
- Good presentation and communication skills.
- Autonomous, rigorous, and well organized.
- Capability to work within a global team (spread across geographies) and interact with different teams.
- Willingness to work in rotational shifts.
- Quick learner, keen to learn SQL, PL/SQL, Oracle Database, and new technologies.
- Good to have: knowledge of SQL, PL/SQL, and Oracle Database.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Chennai, Delhi / NCR, Mumbai (All Areas)

Hybrid

Job Description (JD): Azure Databricks / ADF / Synapse, with a strong emphasis on Python, SQL, Data Lake, and Data Warehouse.

Job Title: Data Engineer - Azure (Databricks / ADF / Synapse)
Experience: 4 to 7 Years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 Days

Job Summary:
We are looking for a skilled and experienced Data Engineer with 4 to 8 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential.

Key Responsibilities:
- Requirement gathering and analysis.
- Work with different databases such as Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse.
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases.
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage.
- Implement data security and governance measures.
- Monitor and optimize data pipelines for performance and efficiency.
- Troubleshoot and resolve data engineering issues.
- Provide optimized solutions for any problem related to data engineering.
- Work with a variety of sources such as relational databases, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables (see the sketch after this listing).

Required Skills:
- 4-8 years of experience in Data Engineering or related roles.
- Hands-on experience in Azure Databricks, ADF, or Synapse Analytics.
- Proficiency in Python for data processing and scripting.
- Strong command of SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).
- Understanding of CI/CD practices in a data engineering context.
- Excellent problem-solving and communication skills.

Good to Have:
- Experience with Delta Lake, Power BI, or Azure DevOps.
- Knowledge of Spark, Scala, or other distributed processing frameworks.
- Exposure to BI tools like Power BI, Tableau, or Looker.
- Familiarity with data security and compliance in the cloud.
- Experience leading a development team.
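A minimal sketch of the Delta-table skill called out above, assuming a Databricks-style environment where Delta is the default table format; the schema and table names are hypothetical:

```python
# Hedged Delta sketch: append a batch as a Delta table and read it back.
# Assumes a Spark session with Delta Lake available (as on Databricks,
# where `spark` already exists) and that the `bronze` schema exists.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [(1, "2024-01-01", 250.0)],
    ["id", "order_date", "amount"],
)
df.write.format("delta").mode("append").saveAsTable("bronze.orders")

latest = spark.read.table("bronze.orders")
latest.show()
```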

Posted 3 weeks ago

Apply

6.0 - 8.0 years

12 - 18 Lacs

Hyderabad, Bengaluru

Work from Office

Job Title: Oracle Fusion Functional Consultant - Cash Management & Lease Accounting
Location: Hyderabad / Bangalore
Experience: 6-8 Years
Department: Oracle ERP - Finance

Job Summary
We are seeking an experienced Oracle Fusion Functional Consultant specializing in Cash Management and Lease Accounting, with strong functional knowledge and hands-on expertise in Fusion Financials. The ideal candidate should have 2-3 end-to-end implementation and/or support cycles, with a preference for candidates who have previously held Financial Functional Lead roles. Familiarity with Oracle Cloud tools, workflow processes, and prior EBS experience is highly desirable.

Key Responsibilities
- Lead or support implementations of Oracle Fusion Cash Management and Lease Accounting modules.
- Collaborate with business stakeholders to gather requirements and translate them into functional specifications.
- Write functional design documents (MD50) and test scripts, and support OTBI reports & Fusion analytics.
- Work on Oracle workflow processes and assist technical teams with integrations and reporting needs.
- Leverage FSM (Functional Setup Manager) and ADF (Application Development Framework) for configurations and issue resolution.
- Use Data Integration (DI) tools for mass data uploads and validations.
- Engage in testing, data migration, UAT, and post-go-live support.
- Ensure compliance with Oracle Cloud best practices and security standards.

Required Skills & Experience
- 2-3 implementations or support projects in Oracle Fusion Cash Management & Lease Accounting.
- Strong hands-on knowledge of Oracle Fusion Financials.
- Experience writing functional specs and working on OTBI (Oracle Transactional Business Intelligence) and Fusion Analytics.
- Solid understanding of workflow processes and how to configure them in Oracle Cloud.
- Familiarity with Oracle FSM, ADF tools, and Data Integration (DI) tools.
- Prior experience in Oracle EBS (Financials).
- Proven ability to work with cross-functional teams and technical counterparts.
- Strong communication, documentation, and stakeholder management skills.

Preferred Qualifications
- Experience in a Financial Functional Lead role in past projects.
- Oracle Financials Cloud Certification preferred (e.g., General Ledger, Payables, Receivables).
- Exposure to multi-currency, intercompany, and bank reconciliation processes.
- Familiarity with Agile/Hybrid project methodologies.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

25 - 35 Lacs

Bengaluru

Hybrid

We are hiring Azure Data Engineers for an active project (Bangalore location). Interested candidates can share the following details by mail along with their updated resume:
- Total experience?
- Relevant experience in Azure Data Engineering?
- Current organization?
- Current location?
- Current fixed salary?
- Expected salary?
- Do you have any offers? If yes, mention the offer you have and your reason for seeking another opportunity.
- Open to relocating to Bangalore?
- Notice period? If serving or not working, mention your last working day (LWD).
- Do you have a PF account?

Posted 3 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.

Job Description:
Experience: 6-12 yrs
Location: PAN India
Skill: Azure Data Factory/SSIS

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline:
- Full name as per PAN:
- Mobile no:
- Alt no / WhatsApp no:
- Total experience:
- Relevant experience in Data Factory:
- Relevant experience in Synapse:
- Relevant experience in SSIS:
- Relevant experience in Python/PySpark:
- Current CTC:
- Expected CTC:
- Notice period (official):
- Notice period (negotiable)/reason:
- Date of birth:
- PAN number:
- Reason for job change:
- Offer in pipeline (current status):
- Availability for a virtual interview on weekdays between 10 AM and 4 PM (please mention a time):
- Current residential location:
- Preferred job location:
- Is your educational percentage in 10th std, 12th std, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Pune

Hybrid

Azure Data Engineer
Remote / Pune (Hybrid) | Full-time, Permanent
Company: Academian

Job Description:
We are seeking a skilled Data Engineer with strong experience in Microsoft Azure Cloud services to design, build, and maintain robust data pipelines and architectures. In this role, you will design, implement, and maintain our data infrastructure, ensuring efficient data processing and availability throughout the organization.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines in Azure.
- Work with Azure services such as Azure Data Factory, Azure Data Lake, Synapse Analytics, Azure SQL, and Databricks.
- Implement and optimize data storage and retrieval solutions in the cloud.
- Ensure data quality, consistency, and governance through robust validation and monitoring.
- Develop and manage CI/CD pipelines for data workflows using tools like Azure DevOps.
- Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
- Support and troubleshoot data issues and ensure high availability of data infrastructure.
- Follow best practices in data security, privacy, and compliance.
- Develop and maintain data architectures (data lakes, data warehouses).
- Integrate data from a wide variety of sources (APIs, logs, third-party platforms).
- Monitor data workflows and troubleshoot data-related issues.

Required Skills & Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering or a similar role.
- Strong hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse Analytics, and Databricks.
- Proficiency in SQL, Python, and PySpark.
- Experience with data modeling, schema design, and data warehousing.
- Familiarity with CI/CD processes, version control (e.g., Git), and deployment in Azure DevOps.
- Knowledge of data governance tools and practices (e.g., Azure Purview, RBAC).
- Strong SQL skills and experience with relational databases.
- Proficiency with Apache Kafka and streaming data architectures (see the sketch after this listing).
- Knowledge of ETL tools and processes.
- Familiarity with DW-BI tools such as Power BI.
- Strong knowledge of database systems (PostgreSQL, MySQL, NoSQL).
- Understanding of distributed systems like Kafka or MSK.

Preferred Skills:
- Experience with data visualization tools.
- Experience with NoSQL databases.
- Understanding of machine learning pipelines and workflows.

Regards,
Manisha Koul
mkoul@academian.com
www.linkedin.com/in/koul-manisha
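As a hedged illustration of the Kafka streaming requirement, a minimal consumer using the kafka-python package; the broker address, topic, and message shape are assumptions:

```python
# Sketch of a Kafka consumer feeding a downstream pipeline.
# Broker, topic, and payload fields are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers=["broker:9092"],          # hypothetical broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Downstream steps: validate, enrich, land in the data lake.
    print(event.get("order_id"))
```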

Posted 3 weeks ago

Apply

3.0 - 7.0 years

4 - 7 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

About the Role:
We're hiring two Cloud & Data Engineering Specialists to join our fast-paced, agile team. These roles are focused on designing, developing, and scaling modern, cloud-based data engineering solutions using tools like Azure, AWS, GCP, Databricks, Kafka, PySpark, SQL, Snowflake, and ADF.

Position 1: Cloud & Data Engineering Specialist (Resource 1)

Key Responsibilities:
- Develop and manage cloud-native solutions on Azure or AWS.
- Build real-time streaming apps with Kafka.
- Engineer services using Java and Python.
- Deploy and manage Kubernetes-based containerized applications.
- Process big data using Databricks.
- Administer SQL Server and Snowflake databases; write advanced SQL.
- Utilize Unix/Linux for system operations.

Must-Have Skills:
- Azure or AWS cloud experience.
- Kafka, Java, Python, Kubernetes.
- Databricks, SQL Server, Snowflake.
- Unix/Linux commands.

Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Analyze current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Develop best practices, including reusable code, libraries, patterns, and consumable frameworks, for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Work with other members of the project team to support delivery of additional project components (API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).

Requirements:
- Proven experience working as a data engineer.
- Highly proficient in using the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
- Design and building of data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
- Thorough understanding of Azure Cloud infrastructure offerings.
- Strong experience in common data warehouse modeling principles, including Kimball.
- Working knowledge of Python is desirable.
- Experience developing security models.
- Databricks & Azure Big Data Architecture certification would be a plus.

Mandatory skill sets: ADE, ADB, ADF
Preferred skill sets: ADE, ADB, ADF
Years of experience required: 4-8 years
Education qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: ADF Business Components, ADL Assistance, Android Debug Bridge (ADB)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:

Posted 3 weeks ago

Apply

3.0 years

3 - 5 Lacs

Gurgaon

On-site

#Freepost
Designation: Middleware Administrator L2
Experience: 3+ Years
Qualification: BE/BTech/Diploma with an IT background

Roles & Responsibilities: Application Monitoring Services
✓ Monitor application response times from the end-user perspective in real time and alert organizations when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, the tooling should quickly expose problem sources and minimize the time necessary for resolution.
✓ It should allow specific application transactions to be captured and monitored separately. This allows administrators to select the most important operations within business-critical applications to be measured and tracked individually.
✓ It should use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels. This allows IT administrators to respond quickly to problems and minimize the impact on service delivery. (A minimal sketch of such a check follows this listing.)
✓ It should automatically segment response-time information into network, server, and local workstation components to easily identify the source of bottlenecks.
✓ Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, web cache, BizTalk applications, DNS applications, Tomcat, etc.

Job Type: Full-time
Pay: ₹350,000.00 - ₹500,000.00 per year
Benefits: Health insurance; Provident Fund
Work Location: In person
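The baseline-threshold alerting described above could look roughly like this as a standalone Python check; the endpoints, baseline values, and the "twice the baseline" deviation rule are illustrative, not the actual monitoring product:

```python
# Hedged sketch of a baseline-oriented response-time check.
# Endpoints and baselines are hypothetical placeholders.
import time

import requests

ENDPOINTS = {"portal": "https://portal.example.com/health"}
BASELINE_MS = {"portal": 400}  # assumed baseline response time per endpoint

def check(name: str, url: str) -> None:
    start = time.monotonic()
    resp = requests.get(url, timeout=10)
    elapsed_ms = (time.monotonic() - start) * 1000
    # Alert when the response deviates well beyond its baseline (2x here,
    # an illustrative policy) or the endpoint is unhealthy.
    if resp.status_code != 200 or elapsed_ms > 2 * BASELINE_MS[name]:
        print(f"ALERT {name}: status={resp.status_code} time={elapsed_ms:.0f}ms")

for name, url in ENDPOINTS.items():
    check(name, url)
```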

Posted 3 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Gurgaon

On-site

#Freepost
Designation: Middleware Administrator L1
Location: Gurgaon
Experience: 2-4 years of experience
Qualification: B.E. / B.Tech / BCA

Required Key Skills:

1. Application Monitoring Services

1) Real-Time Performance Monitoring
- Monitor application response times from the end-user perspective.
- Trigger alerts when performance is below acceptable thresholds.
- Segment response times to quickly identify problem sources and reduce resolution time.

2) Transaction-Level Monitoring
- Enable tracking of specific, business-critical application transactions.
- Allow targeted monitoring of selected operations for better visibility and control.

3) Baseline-Oriented Threshold Alerts
- Use dynamic baselines to raise alerts on deviations in application response times.
- Help administrators detect and address issues proactively.

4) Response-Time Segmentation
- Automatically categorize response time into network, server, and local workstation components.
- Assist in pinpointing performance bottlenecks.

5) Supported Applications and Platforms
Monitoring support includes: Oracle Forms 10g and 12.2.1.3; Oracle SSO 10g; Oracle Access Manager 12.2.1.3; Oracle Internet Directory (OID) 10g and 12.2.1.3; Oracle Portal 10g; WebCenter Portal 12.2.1.3; Oracle Reports 10g and 12.2.1.3; Oracle Web Server (OWS) 10.1.2.2.0; Oracle Internet Application Server (OAS) 10.1.2.2.0; Oracle WebLogic Server 12.2.1.3; Oracle HTTP Server 12.2.1.3; Oracle ADF (Fusion Middleware) 12.2.1.3; mobile applications; Windows IIS; Web Cache; BizTalk applications; DNS applications; Apache Tomcat; etc.

6) Operational Activities
- Application shutdown and startup
- MIS report generation
- Load and performance monitoring
- Script execution for user account management
- Event and error log monitoring
- Daily health checklist compliance
- Portal status and content updates

7) Logging and Reporting
- Log system events and incidents.
- Update SR and incident tickets in the Symphony iServe tool.

2. Application Release Management

1) Release Coordination
- Schedule, coordinate, and manage application releases across environments.

2) Deployment Management
- Perform pre-deployment activities, including code backup, new code placement, and restarting services post-deployment.

Job Types: Full-time, Permanent
Benefits: Health insurance; Provident Fund
Schedule: Morning shift; Rotational shift
Work Location: In person

Posted 3 weeks ago

Apply

3.0 years

1 - 9 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Design, develop, and implement data models and ETL processes for Power BI solutions.
- Understand and create test scripts for data validation as data moves through various lifecycles in cloud-based technologies.
- Work closely with business partners and data SMEs to understand Healthcare Quality Measures and the related business requirements.
- Conduct data validation after major/minor enhancements and determine the best data validation techniques to implement (a minimal sketch follows this listing).
- Communicate effectively with leadership and analysts across teams.
- Troubleshoot and resolve issues with jobs, pipelines, and overhead.
- Ensure data accuracy and integrity between sources and consumers.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Graduate degree or equivalent (B.Tech./MCA preferred) with 3+ years of overall work experience.
- 3+ years of advanced understanding of at least one programming language: Python, Spark, or Scala.
- Experience working with cloud technologies, preferably Snowflake, ADF, and Databricks.
- Experience working with Agile methodology (preferably in Rally).
- Knowledge of Unix shell scripting for automating and scheduling batch jobs.
- Knowledge of configuration management: GitHub.
- Knowledge of relational databases: SQL Server, Oracle, Teradata, IBM DB2, MySQL.
- Knowledge of messaging queues: Kafka/ActiveMQ/RabbitMQ.
- Knowledge of CI/CD tools: Jenkins.
- Understanding of relational database models and entity-relationship diagrams.
- Proven solid communication and interpersonal skills.
- Proven excellent written and verbal communication skills, with the ability to provide clear explanations and overviews of their work to others (internal and external).
- Proven solid facilitation, critical thinking, problem-solving, decision-making, and analytical skills.
- Demonstrated ability to prioritize and manage multiple tasks.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
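A minimal sketch of the source-versus-consumer validation this role describes, written against generic DB-API cursors; the connection setup and table names are left as assumptions:

```python
# Hedged sketch: reconcile row counts between a source table and its target
# after a load. Cursors are standard DB-API objects supplied by the caller.
def row_count(cursor, table: str) -> int:
    # NOTE: `table` should come from a trusted allow-list; the sketch skips quoting.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def validate_load(src_cur, tgt_cur, src_table: str, tgt_table: str) -> None:
    src = row_count(src_cur, src_table)
    tgt = row_count(tgt_cur, tgt_table)
    if src != tgt:
        raise ValueError(f"Row count mismatch: {src_table}={src}, {tgt_table}={tgt}")
    print(f"OK: {src} rows in both {src_table} and {tgt_table}")
```

Real validation suites usually go beyond counts (checksums, null-rate checks, referential checks), but the shape is the same.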

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Job Title: Data Engineer
Experience: 5+ Years
Location: Pan India
Mode: Hybrid
Skill combination: Python AND AWS AND Databricks AND PySpark AND Elastic Search

We are looking for a Data Engineer to join our team to build, maintain, and enhance scalable, high-performance data pipelines and cloud-native solutions. The ideal candidate will have deep experience in Databricks, Python, PySpark, Elastic Search, and SQL, and a strong understanding of cloud-based ETL services, data modeling, and data security best practices.

Key Responsibilities:
- Design, implement, and maintain scalable data pipelines using Databricks, PySpark, and SQL.
- Develop and optimize ETL processes leveraging services like AWS Glue, GCP DataProc/DataFlow, Azure ADF/ADLF, and Apache Spark.
- Build, manage, and monitor Airflow DAGs to orchestrate data workflows (see the sketch after this listing).
- Integrate and manage Elastic Search for data indexing, querying, and analytics.
- Write advanced SQL queries using window functions and analytics techniques.
- Design data schemas and models that align with various business domains and use cases.
- Optimize data warehousing performance and storage using best practices.
- Ensure data security, governance, and compliance across all environments.
- Apply data engineering design patterns and frameworks to build robust solutions.
- Collaborate with Product, Data, and Engineering teams; support executive data needs.
- Participate in Agile ceremonies and follow DevOps/DataOps/DevSecOps practices.
- Respond to critical business issues as part of an on-call rotation.

Must-Have Skills:
- Databricks (3+ years): development and orchestration of data workflows.
- Python & PySpark (3+ years): hands-on experience in distributed data processing.
- Elastic Search (3+ years): indexing and querying large-scale datasets.
- SQL (3+ years): proficiency in analytical SQL, including window functions.
- ETL services: AWS Glue; GCP DataProc/DataFlow; Azure ADF/ADLF.
- Airflow: designing and maintaining data workflows.
- Data warehousing: expertise in performance tuning and optimization.
- Data modeling: understanding of data schemas and business-oriented data models.
- Data security: familiarity with encryption, access control, and compliance standards.
- Cloud platforms: AWS (must); GCP and Azure (preferred).

Skills: Python, Databricks, PySpark, Elastic Search
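For the Airflow responsibility above, a minimal DAG sketch; the DAG id, task logic, and schedule are placeholders, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`):

```python
# Hedged Airflow sketch: a two-task daily pipeline. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # placeholder for real extraction logic

def load():
    print("write to warehouse")  # placeholder for real load logic

with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # load runs only after extract succeeds
```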

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Location: HYDERABAD OFFICE INDIA

Job Description
Are you looking to take your career to the next level? We're looking for a Junior Software Engineer to join our Data & Analytics Core Data Lake Platform engineering team. We are searching for self-motivated candidates who will demonstrate modern Agile and DevOps practices to craft, develop, test and deploy IT systems and applications, delivering global projects in multinational teams.

The P&G Core Data Lake Platform is a central component of the P&G data and analytics ecosystem. The CDL Platform is used to deliver a broad scope of digital products and frameworks used by data engineers and business analysts. In this role you will have an opportunity to use data engineering skills to deliver solutions enriching data cataloging and data discoverability for our users. With our approach to building solutions that fit the scale at which P&G operates, we combine software engineering standard methodologies (Databricks) with modern software engineering standards (Azure, DevOps, SRE) to deliver value for P&G.

Responsibilities:
- Writing and testing code for Data & Analytics applications and building end-to-end cloud-native (Azure) solutions.
- Engineering applications throughout their entire lifecycle, from development, deployment, and upgrade to replacement/termination.
- Ensuring that development and architecture adhere to established standards, including modern software engineering practices (CI/CD, Agile, DevOps).
- Collaborating with internal technical specialists and vendors to develop final products that improve overall performance and efficiency and/or enable adaptation of new business processes.

Job Qualifications
- Bachelor's degree in computer science or a related technical field.
- 4+ years of experience working as a Software Engineer (with a focus on developing in Python, PySpark, Databricks, ADF).
- Full-stack engineering experience (Python/React/JavaScript/APIs).
- Experience demonstrating modern software engineering practices (code standards, Gitflow, automated testing, CI/CD, DevOps).
- Experience working with cloud infrastructure (Azure preferred).
- Strong verbal, written, and interpersonal communication skills.
- A strong desire to produce high-quality software through multi-functional teamwork, testing, code reviews, and other best practices.

You also should have:
- Strong written and verbal English communication skills to influence others.
- Proven use of data and tools.
- Ability to manage multiple priorities.
- Ability to work collaboratively across different functions and geographies.

We produce globally recognized brands and we grow the best business leaders in the industry. With a portfolio of trusted brands as diverse as ours, it is paramount our leaders are able to lead with courage the vast array of brands, categories and functions. We serve consumers around the world with one of the strongest portfolios of trusted, quality, leadership brands, including Always®, Ariel®, Gillette®, Head & Shoulders®, Herbal Essences®, Oral-B®, Pampers®, Pantene®, Tampax® and more. Our community includes operations in approximately 70 countries worldwide. Visit http://www.pg.com to know more.

We are an equal opportunity employer and value diversity at our company. We do not discriminate against individuals on the basis of race, color, gender, age, national origin, religion, sexual orientation, gender identity or expression, marital status, citizenship, disability, HIV/AIDS status, or any other legally protected factor.

"At P&G, the hiring journey is personalized every step of the way, thereby ensuring equal opportunities for all, with a strong foundation of Ethics & Corporate Responsibility guiding everything we do. All the available job opportunities are posted either on our website, pgcareers.com, or on our official social media pages, for the convenience of prospective candidates, and do not require them to pay any kind of fees towards their application."

Job Schedule: Full time
Job Number: R000134973
Job Segmentation: Experienced Professionals

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Title: Azure Data Engineer
Experience: 5+ Years

About the Company:
EY is a leading global professional services firm offering a broad range of services in assurance, tax, transaction, and advisory services. We're looking to hire a skilled ADF developer who is proficient in Python and Microsoft Power Platform.

Job Responsibilities:
- Analyse and translate business requirements into technical requirements and architecture.
- Design, develop, test, and deploy Azure Data Factory (ADF) pipelines for ETL processes.
- Create, maintain, and optimize Python scripts that interface with ADF and support data operations.
- Utilize Microsoft Power Platform for designing intuitive user interfaces, automating workflows, and creating effective database solutions.
- Implement Power Apps, Power Automate, and Power BI to support better business decisions.
- Collaborate with diverse teams to ensure seamless integration of ADF solutions with other software components.
- Debug and resolve technical issues related to data transformations and processing.
- Implement robust data validation and error-handling routines to ensure data consistency and accuracy (a minimal sketch follows this listing).
- Maintain documentation of all systems and processes developed, promoting transparency and consistency.
- Monitor and optimize the performance of ADF solutions regularly.
- Proactively stay up to date with the latest technologies and techniques in data handling and solutions.

Required Skills:
- Proven working experience as an ADF developer.
- Hands-on experience with data architectures, including complex data models and data governance.
- Strong proficiency in Python and demonstrated experience with ETL processes.
- Proficient knowledge of Microsoft Power Platform (Power BI, Power Apps, and Power Automate).
- Understanding of SQL and relational database concepts.
- Familiarity with cloud technologies, particularly Microsoft Azure.
- Excellent problem-solving skills and ability to debug complex systems.

Preferred Skills:
- Knowledge of standard authentication and authorization protocols such as OAuth, SAML, and LDAP.

Education:
A BS/MS degree in Computer Science, Engineering, or a related subject is required.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
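A small, hedged sketch of the kind of error-handling routine a Python script supporting ADF might use, retrying a step and logging each failure; the retry policy and the wrapped step are illustrative:

```python
# Hedged sketch: retry a flaky pipeline step with logged failures and backoff.
# The wrapped operation, attempt count, and backoff are placeholders.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("adf-helper")

def with_retries(fn, attempts: int = 3, backoff_s: float = 5.0):
    for i in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            log.exception("Attempt %d/%d failed", i, attempts)
            if i == attempts:
                raise  # give up; let the orchestrator mark the run as failed
            time.sleep(backoff_s * i)  # linear backoff between attempts

with_retries(lambda: print("post-copy validation step"))
```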

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the Company
Why Join 7-Eleven Global Solution Center? When you join us, you'll embrace ownership as teams within specific product areas take responsibility for end-to-end solution delivery, supporting local teams and integrating new digital assets. Challenge yourself by contributing to products deployed across our extensive network of convenience stores, processing over a billion transactions annually. Build solutions for scale, addressing the diverse needs of our 84,000+ stores in 19 countries. Experience growth through cross-functional learning, encouraged and applauded at 7-Eleven GSC. With our size, stability, and resources, you can navigate a rewarding career. Embody leadership and service as 7-Eleven GSC remains dedicated to meeting the needs of customers and communities.

Why We Exist, Our Purpose and Our Transformation
7-Eleven is dedicated to being a customer-centric, digitally empowered organization that seamlessly integrates our physical stores with digital offerings. Our goal is to redefine convenience by consistently providing top-notch customer experiences and solutions in a rapidly evolving consumer landscape. Anticipating customer preferences, we create and implement platforms that empower customers to shop, pay, and access products and services according to their preferences. To achieve success, we are driving a cultural shift anchored in leadership principles, supported by the realignment of organizational resources and processes.

At 7-Eleven we are guided by our Leadership Principles. Each principle has a defined set of behaviours which help guide the 7-Eleven GSC team to Serve Customers and Support Stores: Be Customer Obsessed; Be Courageous with Your Point of View; Challenge the Status Quo; Act Like an Entrepreneur; Have an "It Can Be Done" Attitude; Do the Right Thing; Be Accountable.

About the Role
Job Title: Manager - QA
Location: Bengaluru

Responsibilities
- Lead QA strategy across Oracle MOM modules (ORMS, MFCS, RPM, Invoice Matching, ReSA, EBS, EPM); functional knowledge of Oracle MOM is an advantage.
- Manage test planning, execution, and defect triage across all test phases (Functional, Regression, UAT, Integration).
- Own and evolve the automated test framework using Selenium, Robot Framework, Python, and RPA principles.
- Oversee CI/CD test pipeline integration via Jenkins.
- Define and execute performance testing plans using tools like JMeter or LoadRunner.
- Coordinate API testing using Postman, REST Assured, or similar tools for MFCS and other services.
- Drive regular regression cycles across patch releases, upgrades, and enhancements.
- Collaborate with business analysts, developers, and infrastructure teams to align test coverage.
- Ensure proper test data management and environment readiness.
- Experience managing automation teams of 5-7 resources.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience leading QA strategy across Oracle MOM modules (ORMS, MFCS, RPM, Invoice Matching, ReSA, EBS, EPM); functional knowledge of Oracle MOM is an advantage.
- Experience working in Agile environments is highly preferred.

Required Skills
- Proficiency in developing UI test automation frameworks using Python and RPA tools (e.g., UiPath, Automation Anywhere, or Blue Prism), Jenkins, and Java/JavaScript.
- Good hands-on experience with UI automation using Selenium, RPA, or Cucumber (a minimal Selenium sketch follows this listing).
- Write and maintain automated test scripts using Java/Python and RPA tools (e.g., UiPath, Automation Anywhere, or Blue Prism).
- Work with the development team to create and implement effective test automation strategies for continuous integration/continuous delivery (CI/CD) pipelines.
- Develop automated test frameworks and enhance existing automation suites to improve testing efficiency and coverage.
- Hands-on experience in Oracle ADF Web, API testing, and Oracle Database testing.
- Strong knowledge of test management and defect tracking tools (e.g., JIRA).
- Experience with database validations using Oracle SQL queries.
- Familiarity with Agile methodologies and practices.
- Ability to perform backend data verification.

Preferred Skills
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to work collaboratively across cross-functional teams.
- Attention to detail and a proactive approach to identifying issues.
- Ability to manage multiple tasks and priorities effectively.

Pay Range and Compensation Package
7-Eleven Global Solution Center offers a comprehensive benefits plan tailored to meet the needs and improve the overall experience of our employees, aiding in the management of both their professional and personal aspects.

Equal Opportunity Statement
7-Eleven Global Solution Center is an Equal Opportunity Employer committed to diversity in the workplace. Our strategy focuses on three core pillars: workplace culture, diverse talent, and how we show up in the communities we serve. As the recognized leader in convenience, the 7-Eleven family of brands embraces diversity, equity and inclusion (DE+I). It's not only the right thing to do for customers, Franchisees and employees; it's a business imperative.
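A minimal Selenium sketch of the UI automation skill listed above (Selenium 4 with Python); the URL, locators, and expected title are hypothetical:

```python
# Hedged sketch: log in through a web form and assert the landing page.
# URL, element IDs, credentials, and the expected title are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # Selenium 4 resolves the driver automatically
try:
    driver.get("https://store-portal.example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "login").click()
    assert "Dashboard" in driver.title, "login did not reach the dashboard"
finally:
    driver.quit()  # always release the browser, even on failure
```

In a real suite this would live in a test framework (pytest, Robot Framework) with explicit waits instead of bare asserts.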

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About Gartner IT Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team. About The Role Senior Data engineer for production support who will provide daily end-to-end support for daily data loads & manage production issues. What Will You Do Monitor & support various data loads for our Enterprise Data Warehouse. Support business users who are accessing POWER BI dashboards & Datawarehouse tables. Handle incidents, service requests within defined SLA’s. Work with team on managing Azure resources including but not limited to Databricks, Azure Data Factory pipelines, ADLS etc. Build new ETL/ELT pipelines using Azure Data Products like Azure Data Factory, Databricks etc. Help build best practices & processes. Coordinate with upstream/downstream teams to resolve data issues. Work with the QA team and Dev team to ensure appropriate automated regressions are added to detect such issues in future. Work with the Dev team to improve automated error handling so manual interventions can be reduced. Analyze process and pattern so other similar unreported issues can be resolved in one go. What You Will Need Strong IT professional with 3-4 years of experience in Data Engineering. The candidate should have strong analytical and problem-solving skills. Must Have 3-4 years of experience in Data warehouse design & development and ETL using Azure Data Factory (ADF) Experience in writing complex TSQL procedures on MPP platforms - Synapse, Snowflake etc. Experience in analyzing complex code to troubleshoot failure and where applicable recommend best practices around error handling, performance tuning etc. Ability to work independently, as well as part of a team and experience working with fast-paced operations/dev teams. Good understanding of business process and analyzing underlying data Understanding of dimensional and relational modelling Detailed oriented, with the ability to plan, prioritize, and meet deadlines in a fast-paced environment. Can be added to SDE Knowledge of Azure cloud technologies Exceptional problem-solving skills Nice To Have Experience crafting, building, and deploying applications in a DevOps environment utilizing CI/CD tools Understanding of dimensional and relational modeling Relevant certifications Basic knowledge of Power BI. Who Are You Bachelor’s degree or foreign equivalent degree in Computer Science or a related field required Excellent communication skills. Able to work independently or within a team proactively in a fast-paced AGILE-SCRUM environment. Owns success – Takes responsibility for the successful delivery of the solutions. Strong desire to improve upon their skills in tools and technologies Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. 
We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring.

Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 99740

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
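For a production-support role like the one above, much of the day-to-day ADF monitoring can be scripted. Below is a minimal sketch, assuming the azure-mgmt-datafactory and azure-identity packages and hypothetical resource names, that flags failed pipeline runs from the last 24 hours:

```python
# Minimal sketch: surfacing failed ADF pipeline runs with the
# azure-mgmt-datafactory SDK. Resource names are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-edw-prod"          # hypothetical resource group
FACTORY_NAME = "adf-edw-prod"           # hypothetical data factory

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query pipeline runs updated in the last 24 hours.
now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
)

runs = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
for run in runs.value:
    if run.status == "Failed":
        print(f"{run.pipeline_name} run {run.run_id} failed: {run.message}")
```

A script like this could run on a schedule and feed incident tickets, though in practice Azure Monitor alerts often cover the same ground.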

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Looking for a Solution Architect with at least 8 years of experience in APEX, ADF, Workflow, ATP, PL/SQL and OIC.

Job Summary:
We are seeking an experienced Solution Architect with deep expertise in Oracle technologies, including APEX, ADF, Workflow, ATP, PL/SQL, and Oracle Integration Cloud (OIC). The ideal candidate will play a key role in designing and delivering enterprise-grade solutions, providing technical leadership, and ensuring seamless integration of Oracle applications and services across the ecosystem.

Key Responsibilities:
- Architect and design robust, scalable, and secure enterprise applications using Oracle APEX, ADF, and OIC.
- Lead solution development and integration efforts across Oracle Cloud and on-premise systems.
- Define and implement workflow processes using Oracle Workflow and related technologies.
- Develop complex PL/SQL procedures, functions, and triggers to support business logic (see the sketch after this posting).
- Optimize and manage Oracle Autonomous Transaction Processing (ATP) environments.
- Provide end-to-end integration solutions using Oracle Integration Cloud (OIC).
- Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications.
- Lead technical design reviews, conduct code reviews, and ensure best practices.
- Ensure solutions meet performance, scalability, and security requirements.
- Provide mentorship and guidance to development teams.

Required Skills & Qualifications:
- Proven experience (8-10+ years) in Oracle technologies, especially APEX, ADF, and PL/SQL.
- Strong understanding of Oracle Autonomous Database (ATP) and its capabilities.
- Hands-on experience designing and developing integrations using Oracle Integration Cloud (OIC).
- Experience in Oracle Workflow development and customization.
- Proficiency in database performance tuning, PL/SQL optimization, and Oracle SQL.
- Sound understanding of cloud infrastructure, REST/SOAP web services, and integration patterns.
- Excellent communication and stakeholder management skills.
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
- Oracle certifications are a plus.
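Since the role centres on PL/SQL business logic consumed by applications, here is a minimal sketch of invoking a stored procedure from Python with the python-oracledb driver. The approve_order procedure, its parameters, and the connection details are all hypothetical:

```python
# Minimal sketch: calling a hypothetical PL/SQL procedure from Python
# via python-oracledb. User, password, and DSN are placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="***",
                        dsn="mydb_high")  # e.g. an ATP TNS alias; placeholder

with conn.cursor() as cur:
    # Suppose: approve_order(p_order_id IN NUMBER, p_status OUT VARCHAR2)
    status = cur.var(str)               # bind variable for the OUT parameter
    cur.callproc("approve_order", [1001, status])
    print("Order status:", status.getvalue())

conn.commit()
conn.close()
```

Against ATP specifically, the DSN would typically be a TNS alias from the downloaded wallet, with the wallet location supplied through the driver's connection parameters.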

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

🚀 We’re Hiring – Oracle Retail Techno-Functional Consultant (Remote Opportunity) 🛍️🌐

Join us on a mission to modernize retail systems for a leading US-based retailer! We’re looking for Oracle Retail experts who can bring both functional clarity and technical depth to high-impact projects.

🔹 Key Skills & Experience:
✔️ Strong expertise in Oracle Retail modules: RMS, RPM, ReIM, ReSA, RMFCS, RPCS (cloud experience is a big plus)
✔️ Proficiency in PL/SQL, Oracle APEX, and BI Publisher
✔️ Deep functional understanding of merchandising, inventory, purchase orders, pricing, and invoice matching
✔️ Hands-on experience in data migration & cleansing — familiarity with Azure Databricks, ADF, or Python preferred (see the sketch after this posting)
✔️ Agile and DevOps project experience with proven global stakeholder communication
✔️ Ability to lead modules, manage offshore teams, and troubleshoot production issues independently

🧑‍💻 Work Mode: 100% Remote
🌍 Client: US-based retailer
📈 Great opportunity to work with cross-functional global teams on Oracle Retail transformation programs!
📩 Ready to apply or refer a colleague? Message us or drop your resume today.

#OracleRetail #RemoteJobs #RMS #RetailTransformation #PLSQL #Databricks #ADF #TechLead #RetailIT #HiringNow #OracleJobs #APEX #BIpublisher
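The data migration and cleansing bullet above is the most code-adjacent part of this posting. A minimal pandas sketch of the kind of cleansing pass it hints at, using a hypothetical item-master extract with invented column names:

```python
# Minimal sketch of a Python-based cleansing pass for a hypothetical
# item-master extract. File and column names are assumptions.
import pandas as pd

df = pd.read_csv("item_master_extract.csv")  # hypothetical source file

df["item_desc"] = df["item_desc"].str.strip().str.upper()          # normalise text
df["unit_cost"] = pd.to_numeric(df["unit_cost"], errors="coerce")  # bad values -> NaN
df = df.drop_duplicates(subset=["item_id"])                        # de-duplicate on key
df = df.dropna(subset=["item_id", "unit_cost"])                    # drop unusable rows

df.to_csv("item_master_clean.csv", index=False)
```

At scale the same shape of logic would usually live in a Databricks notebook or an ADF data flow rather than a single-machine script.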

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

WPP is the creative transformation company. We use the power of creativity to build better futures for our people, planet, clients, and communities. Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore. WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500. Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow.

Why we're hiring:
At WPP, technology is at the heart of everything we do, and it is WPP IT's mission to enable everyone to collaborate, create and thrive. WPP IT is undergoing a significant transformation to modernise ways of working, shift to cloud and micro-service-based architectures, drive automation, digitise colleague and client experiences and deliver insight from WPP's petabytes of data.

WPP Media is the world's leading media investment company, responsible for more than $63B in annual media investment through agencies Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the outcomes-driven programmatic audience company Xaxis and the data and technology company Choreograph. WPP Media's portfolio includes Data & Technology, Investment and Services, all united in a vision to shape the next era of media where advertising works better for people. By leveraging all the benefits of scale, the company innovates, differentiates and generates sustained value for our clients wherever they do business.

The WPP Media team in WPP IT is the technology solutions partner for the WPP Media group of agencies and is accountable for coordinating and assuring end-to-end change delivery, managing the WPP Media IT technology life-cycle and innovation pipeline.

As part of the global Data & Measure team, this role will play a key part in building scalable, insightful and user-centric data products. Working closely with global stakeholders and cross-functional teams, the Senior Power BI Developer will lead both the frontend development and backend data modeling for business-critical dashboards and analytics solutions. This is not a pure report-builder role: success depends on a solid understanding of data architecture, ETL, and integration, paired with sharp visual storytelling and dashboard design.

What you'll be doing:
Primary Responsibilities:
- Design and build interactive dashboards and reports using Power BI
- Work closely with product teams to understand reporting needs and translate them into scalable data products
- Own data transformation and modeling in Power Query and DAX
- Maintain and optimize data flows, datasets, and semantic models (see the refresh sketch after this posting)
- Ensure data accuracy, usability, and access control across published solutions
- Collaborate with backend teams to shape data sources for frontend consumption
- Document BI solutions, including business logic, KPIs, and metadata
- Partner with global teams to define standards and reuse patterns

Additional Responsibilities:
- Support Power BI integration into existing or evolving global platforms, such as Measure.
- Contribute to defining global best practices for BI and self-service enablement
- Participate in data quality reviews and source system integration discussions
- Guide junior developers or offshore partners when necessary

What you'll need:
Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5+ years of experience with Power BI development (incl. DAX, Power Query)
- Experience working with large datasets and complex data models
- Strong understanding of SQL and backend data principles
- Ability to bridge business needs with technical implementation
- Excellent data visualization and storytelling skills
- Experience working in cross-functional, global teams
- Strong English communication skills (verbal and written)

Preferred Skills:
- Experience with Azure SQL, Synapse, or similar data platforms
- Familiarity with Azure Data Factory (ADF) for orchestrating data pipelines and ETL processes
- Exposure to data warehousing or MDM concepts
- Familiarity with DevOps processes and version control (e.g., Git)
- Experience working in an agile or product-based environment
- Power Platform exposure (Power Automate, Power Apps) is a plus

Who you are:
You're open: We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are open-minded: to new ideas, new partnerships, new ways of working.
You're optimistic: We believe in the power of creativity, technology and talent to create brighter futures for our people, our clients and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected.
You're extraordinary: We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we deliver the extraordinary every day.

What we'll give you:
- Passionate, inspired people – We aim to create a culture in which people can do extraordinary work.
- Scale and opportunity – We offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry.
- Challenging and stimulating work – Unique work and the opportunity to join a group of creative problem solvers.

Are you up for the challenge?
We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we've adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process.

WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers.

Please read our Privacy Notice (https://www.wpp.com/en/careers/wpp-privacy-policy-for-recruitment) for more information on how we process the information you provide.
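One routine task behind "maintain and optimize data flows, datasets, and semantic models" is automating dataset refreshes. A minimal sketch against the Power BI REST API, assuming an identity that has access to the workspace and a placeholder dataset ID:

```python
# Minimal sketch: queuing a Power BI dataset refresh via the REST API.
# The dataset ID is a placeholder; auth assumes an Azure identity with
# workspace access (with a service principal, notifyOption must be
# "NoNotification" instead of "MailOnFailure").
import requests
from azure.identity import DefaultAzureCredential

DATASET_ID = "<dataset-guid>"  # placeholder

token = DefaultAzureCredential().get_token(
    "https://analysis.windows.net/powerbi/api/.default")

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {token.token}"},
    json={"notifyOption": "MailOnFailure"},
    timeout=30,
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
```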

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

India

On-site

Job Title: Azure Administrator
Location: Bangalore - Hybrid
Job Type: Full-time
Experience Level: Mid to Senior (4–12 years)

About The Role:
We are seeking a skilled and proactive Azure Administrator to join our cloud infrastructure and operations team. The ideal candidate will have strong experience managing and securing Microsoft Azure environments, implementing DevOps practices, and supporting CI/CD pipelines. In addition to infrastructure responsibilities, the role requires hands-on experience with Azure Data Factory for managing data workflows and integrations across systems. You will help ensure seamless infrastructure performance, secure access, and efficient data operations across the cloud ecosystem.

Key Responsibilities:

Azure Infrastructure Management:
- Administer and optimize Azure services including Virtual Machines, Virtual Networks, Storage Accounts, Load Balancers, and Azure Resource Groups (an inventory sketch follows this posting).
- Manage provisioning, scaling, and cost optimization of Azure resources using best practices.

Azure Data Factory (ADF):
- Design, build, and manage data pipelines and integration workflows using Azure Data Factory.
- Collaborate with data teams to support ETL/ELT operations, data movement, and transformation across hybrid and cloud systems.
- Monitor data pipeline performance, troubleshoot failures, and optimize pipeline efficiency.

DevOps & CI/CD:
- Build and maintain automated CI/CD pipelines using Azure DevOps, GitHub Actions, or Jenkins.
- Integrate deployment automation using Infrastructure-as-Code (IaC) tools such as Terraform, Bicep, or ARM templates.
- Ensure seamless deployment, versioning, and rollback of infrastructure and application components.

Identity & Access Management (IAM):
- Administer Azure Active Directory (Azure AD); manage user identities, groups, service principals, and enterprise applications.
- Implement and enforce MFA, Conditional Access policies, and SSO across services.

Role-Based Access Control (RBAC):
- Define and apply granular RBAC policies to control access to resources.
- Ensure least-privilege access principles and maintain access audit logs.

Monitoring & Security:
- Set up and manage monitoring using Azure Monitor, Log Analytics, and Application Insights.
- Respond to incidents, participate in on-call rotations, and resolve critical system issues.
- Apply security recommendations from Azure Security Center and enforce compliance standards.

Documentation & Process Improvement:
- Maintain clear and updated documentation of infrastructure configurations, access controls, and deployment processes.
- Identify opportunities to improve automation, performance, and data reliability.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent hands-on experience).
- 4–7 years of experience in Azure administration, cloud infrastructure, and data integration.
- Strong hands-on expertise with Azure services, including: Azure Virtual Networks, VMs, Storage, and Networking; Azure Active Directory and RBAC; Azure Data Factory (ADF) data pipeline creation, management, and troubleshooting.
- Experience with CI/CD pipeline tools and DevOps practices.
- Proficiency in scripting (PowerShell, Bash, Azure CLI).
- Experience with IaC tools: Terraform, Bicep, or ARM templates.
- Strong understanding of cloud security and identity management.
- Excellent analytical and troubleshooting skills.
- Strong written and verbal communication.
- Collaborative, detail-oriented, and proactive mindset.
Preferred Skills:
- Azure certifications (e.g., AZ-104, AZ-400, AZ-500, or DP-203).
- Experience with hybrid cloud environments.
- Knowledge of data integration with SQL, Blob Storage, Synapse, or on-prem systems via ADF.
- Familiarity with Docker, Kubernetes, or AKS is a plus.
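Much of the administration described above lends itself to small scripts. A minimal sketch, assuming the azure-mgmt-compute and azure-identity packages, that inventories every VM in a subscription, the kind of quick audit that feeds sizing and cost reviews:

```python
# Minimal sketch: a quick VM inventory across a subscription using the
# azure-mgmt-compute SDK. The subscription ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder

compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Print name, region, and size for every VM the identity can see.
for vm in compute.virtual_machines.list_all():
    print(f"{vm.name:30} {vm.location:12} {vm.hardware_profile.vm_size}")
```

The same inventory could equally be produced with `az vm list` in the Azure CLI; the SDK version is convenient when the output feeds further Python processing.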

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the Role:
We are seeking a seasoned Power BI Technical Manager to lead our business intelligence and data visualization initiatives. The ideal candidate will have deep expertise in Power BI and Microsoft's data stack, combined with proven leadership experience managing BI teams and driving data strategy across the organization.

Key Responsibilities:
- Lead the end-to-end design, development, deployment, and maintenance of Power BI dashboards and reports.
- Collaborate with business stakeholders to understand reporting needs, translate requirements into technical solutions, and deliver high-impact dashboards.
- Architect scalable Power BI data models, ensure performance optimization, and enforce data governance best practices.
- Manage a team of BI developers and analysts; mentor and support their growth and technical development.
- Oversee integration of data from various sources (SQL Server, Azure, Excel, APIs, etc.) into Power BI.
- Ensure data accuracy, security, and compliance with internal and external policies (a reconciliation sketch follows this posting).
- Drive adoption of self-service BI practices across departments.
- Collaborate with cross-functional teams including data engineering, IT, and business functions.
- Stay updated on the latest trends in BI, data visualization, and Microsoft Power Platform.

Required Skills & Experience:
- 10 to 12 years of overall experience in Business Intelligence, with at least 5 years hands-on in Power BI.
- Strong experience in DAX, Power Query (M language), data modeling, and visual storytelling.
- Proficiency with SQL Server, SSIS, and Azure Data Services (ADF, Synapse, etc.).
- Solid understanding of data warehousing concepts and data governance frameworks.
- Experience managing and mentoring BI/analytics teams.
- Ability to translate business needs into technical solutions with high accuracy and efficiency.
- Experience working in Agile/Scrum environments.
- Excellent communication, stakeholder management, and presentation skills.
- Microsoft certifications (e.g., DA-100, PL-300, or related) preferred.

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.

Why Join Us?
- Work on cutting-edge BI solutions that drive strategic decisions.
- Collaborative and innovation-driven work environment.
- Competitive compensation and performance-based growth opportunities.
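"Ensure data accuracy" usually starts with simple reconciliation checks. A minimal sketch using pyodbc, with hypothetical servers and table names, that compares row counts between a source system and the warehouse table behind a Power BI model:

```python
# Minimal sketch: a data-accuracy spot check comparing row counts between a
# source SQL Server table and its warehouse counterpart. Connection strings
# and table names are hypothetical.
import pyodbc

def row_count(conn_str: str, table: str) -> int:
    """Return COUNT(*) for the given table."""
    with pyodbc.connect(conn_str) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

SRC = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=src-sql;DATABASE=Sales;Trusted_Connection=yes;"
DWH = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=dwh-sql;DATABASE=EDW;Trusted_Connection=yes;"

src_rows = row_count(SRC, "dbo.Orders")
dwh_rows = row_count(DWH, "fact.Orders")
print("match" if src_rows == dwh_rows else f"mismatch: {src_rows} vs {dwh_rows}")
```

Real reconciliation suites usually go further (checksums, per-partition counts, freshness checks), but row counts catch the most common load failures cheaply.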

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
