
9 Data Access Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

How is this team contributing to the vision of Providence?
The EBAT-HCM team is responsible for providing Oracle ERP implementation support across the HCM, SCM & Finance domains. We cater to functional, technical, infrastructure & app security requirements for end-to-end Oracle Cloud ERP implementation and support.

What will you be responsible for?
As an ERP Analyst, you are responsible for analyzing business needs to help ensure the Oracle solution meets the customer's objectives by combining industry best practices, product knowledge, and business acumen. Your primary role will focus on support of the existing Oracle HCM Talent Management solution, as well as implementation and solution design aspects of continual engagements, ensuring high-quality, integrated software solutions within constraints of time and budget. In addition, you will support existing and new integration solutions using Oracle's SaaS/PaaS offerings. Secondary skills in Absence Management, Benefits & Compensation are preferred.

What would your day look like?
- Function as a domain expert providing best-practice guidance on intercompany business processes and implementation approaches.
- Assist with defining scope and estimates for new projects or builds.
- Understand business requirements, convert them into system configurations in Oracle modules, and bring in diverse perspectives.
- Gather requirements, conduct fit-gap and impact analysis, and design solutions.
- Communicate complex technology solutions to diverse teams, including technical, business, and management audiences.
- Draft and review key deliverable documents such as functional specifications, testing documents, and configuration workbooks.
- Help investigate and resolve system functional and technical errors.
- Troubleshoot systems and data, and generate solutions which may include system modifications, enhancements, or new procedures.
- Identify and analyze operational and systems issues and opportunities, and produce effective solutions.
- Proactively review all relevant upcoming release features and tie them back to the solution at hand.
- Work closely with Oracle wherever required on bug fixes, CWBs, and remediation of release testing issues.
- Monitor critical ongoing processes that are vital to business functioning.
- Work on Redwood changes and AI features, and demo them with key stakeholders for feasibility and adoption.

Who are we looking for?
- 4+ years of full-lifecycle Oracle HCM Cloud Fusion experience with a minimum of 3 large Oracle HCM implementations, with hands-on Redwood experience (advanced preferred).
- Primary skill in Oracle Talent Management (covering Goal Management, Performance Management, Talent Review, Succession Planning, Career Development & Profile Management).
- Secondary skills in other HCM modules (Absence Management, Benefits & Compensation expertise will be given preference).
- Configuration and debugging of approvals.
- A strong understanding of best practices across a range of business processes, cross-pillar dependencies, and related application implementation design and configuration options within large-scale, multi-application implementation and business transformation programs.
- Experience with designing solutions, conducting fit-gap analysis, configuring setups in different HCM modules, and drafting TFS documents.
- Awareness and understanding of the capabilities across a wide range of Oracle applications and platform components, including ERP, EPM, SCM, HCM, Analytics, and Integration.
- Exposure to technical skills in BIP Reports, OTBI, HCM Extracts, conversions (HDL, HSDL), approvals/workflows, security (data access), and notification templates.
- Bachelor's Degree (Computer Science, Business Management, Information Services, or an equivalent combination of education and relevant experience).

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively, understand which business questions can be answered, and unlock the answers.

Why PwC?
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities
- Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
- Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
- Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
- Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
- Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
- Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
- Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Work with other members of the project team to support delivery of additional project components (API interfaces).
- Evaluate the performance and applicability of multiple tools against customer requirements.
- Work within an Agile delivery / DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
- Integrate Databricks with other technologies (ingestion tools, visualization tools).
Requirements
- Proven experience working as a data engineer.
- Highly proficient in using the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.

Mandatory Skill Sets: ADE, ADB, ADF
Preferred Skill Sets: ADE, ADB, ADF
Years of Experience Required: 3-10 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Fields of Study Required: Bachelor of Engineering, Master of Engineering
Required Skills: Databricks Platform, Extract Transform Load (ETL), PySpark, Python (Programming Language), Structured Query Language (SQL)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
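To give a flavour of the Databricks/Spark pipeline work described above, here is a minimal PySpark sketch of reading from a source, applying basic quality checks, and writing to a target. The paths, column names, and quality rules are hypothetical placeholders and not part of the posting.

```python
# Minimal PySpark pipeline sketch: source -> quality checks -> target.
# All paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingest_sketch").getOrCreate()

# Read raw data from a hypothetical landing zone (CSV for simplicity).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/orders/")
)

# Basic data-quality rules: drop records missing a business key,
# remove exact duplicates, and normalise a timestamp column.
clean = (
    raw
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)

# Track rejected rows separately so quality issues stay visible.
rejected = raw.filter(F.col("order_id").isNull())
print(f"clean rows: {clean.count()}, rejected rows: {rejected.count()}")

# Write to a hypothetical curated zone; Parquet is used here, though on
# Databricks a Delta table would be the more common target.
clean.write.mode("overwrite").parquet("/mnt/curated/orders/")
```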

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Software Engineer in the Developer Experience and Productivity Engineering team at Coupa, you will play a crucial role in designing, implementing, and enhancing our sophisticated AI orchestration platform. Your primary responsibilities will revolve around architecting the AI and MCP tooling with a focus on scalability and maintainability, developing integration mechanisms for seamless connectivity between AI platforms and MCP systems, and building secure connectors to internal systems and data sources.

You will have the opportunity to collaborate with product managers to prioritize and implement features that deliver significant business value. Additionally, you will mentor junior engineers, contribute to engineering best practices, and work on building a scalable, domain-based hierarchical structure for our AI platforms. Your role will involve creating specialized tools tailored to Coupa's unique operational practices, implementing secure knowledge integration with AWS RAG and Knowledge Bases, and designing systems that expand capabilities while maintaining manageability.

In this role, you will work at the forefront of AI integration and orchestration, tackling complex technical challenges with direct business impact. You will collaborate with a talented team passionate about AI innovation and help transform how businesses leverage AI for operational efficiency. Furthermore, you will contribute to an architecture that scales intelligently as capabilities grow, work with the latest LLM technologies, and shape their application in enterprise environments.

To excel in this position, you should possess at least 5 years of professional software engineering experience, be proficient in Python and RESTful API development, and have experience building and deploying cloud-native applications, preferably on AWS. A solid understanding of AI/ML concepts, software design patterns, system architecture, and performance optimization is essential. Additionally, you should have experience integrating multiple complex systems and APIs, strong problem-solving skills, attention to detail, and excellent communication abilities to explain complex technical concepts clearly.

Preferred qualifications include experience with AI orchestration platforms or building tools for LLMs; knowledge of vector databases, embeddings, and RAG systems; familiarity with monitoring tools like New Relic, observability patterns, and SRE practices; and experience with DevOps tools like Jira, Confluence, GitHub, or similar tools and their APIs. An understanding of security best practices for AI systems and data access, previous work with domain-driven design and microservices architecture, and contributions to open-source projects or developer tools are also advantageous.

Coupa is committed to providing equal employment opportunities to all qualified candidates and employees, fostering a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, or performance evaluation are made fairly, in compliance with relevant laws and regulations. Please note that inquiries or resumes from recruiters will not be accepted.

By applying to this position, you acknowledge that Coupa collects your application, including personal data, for managing ongoing recruitment and placement activities, as well as for employment purposes if your application is successful. You can find more information about how your application is processed, the purposes of processing, and data retention in Coupa's Privacy Policy.
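The posting references RAG systems, embeddings, and vector databases. As a rough illustration of the retrieval step in such a system, the sketch below ranks documents against a query by cosine similarity over simple bag-of-words vectors. A production system like the one described would use a real embedding model and a vector store; the document names and contents here are hypothetical and not Coupa's implementation.

```python
# Toy retrieval sketch: rank documents against a query by cosine similarity.
# Bag-of-words vectors stand in for real embeddings; a production RAG system
# would use an embedding model and a vector database instead.
import math
from collections import Counter

DOCS = {
    "runbook-01": "Restart the ingestion service and check the error queue",
    "runbook-02": "Rotate the API credentials for the reporting connector",
    "faq-17": "How to request data access for a new internal system",
}

def embed(text: str) -> Counter:
    """Very rough 'embedding': lowercase token counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 2):
    """Return the top-k documents most similar to the query."""
    q = embed(query)
    scored = sorted(DOCS.items(), key=lambda kv: cosine(q, embed(kv[1])), reverse=True)
    return scored[:k]

if __name__ == "__main__":
    for doc_id, text in retrieve("who can grant data access"):
        print(doc_id, "->", text)
```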

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

As a Data Science Trainer, your primary responsibility will be to deliver advanced training sessions on various topics related to data science. You will cover a wide range of subjects, including Python Programming for Data Science, Mathematics & Statistics for AI, AI Ethics, Data Access, Handling and Visualization, Analyzing Data with Python, Visualizing Data with Python, Tableau, Machine Learning, Deep Learning, Natural Language Processing, Computer Vision (CV), Full Stack Development, Generative AI, R Programming, Scala, and Spark. This is a full-time position that requires you to conduct training sessions in person. You are expected to start on the 23rd of July, 2025.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

Join our dynamic Integrated Data Platform Operations team and be at the forefront of data innovation. Collaborate with clients and technology partners to ensure data excellence. Elevate your career by driving data quality and governance in a strategic environment. As an Associate in the Integrated Data Platform Operations team, you will work with clients and technology partners to implement data quality and governance practices. You will define data standards and ensure data meets the highest quality. You will play a crucial role in enhancing data management across the Securities Services business.

Key Responsibilities:
- Define data quality standards
- Investigate data quality issues
- Collaborate with technology partners
- Establish dashboards and metrics
- Support data view and lineage tools
- Embed data quality in UAT cycles
- Assist Operations users with data access
- Work with project teams on implementations
- Implement data ownership processes
- Deliver tools and training for data owners
- Champion improvements to data quality

Required Qualifications, Capabilities, and Skills:
- Engage effectively across teams
- Understand data components for IBOR
- Comprehend trade lifecycle and cash management
- Possess technical data management skills
- Solve operational and technical issues
- Deliver with limited supervision
- Partner in a virtual team environment

Preferred Qualifications, Capabilities, and Skills:
- Demonstrate strong communication skills
- Exhibit leadership in data governance
- Adapt to changing project requirements
- Analyze complex data sets
- Implement innovative data solutions
- Foster collaboration across departments
- Drive continuous improvement initiatives
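As a flavour of the data-quality metrics and dashboard work this role describes, here is a minimal, hypothetical pandas sketch that computes simple quality indicators (null rate, duplicate count) for a dataset; the column names, sample data, and thresholds are illustrative only and not part of the posting.

```python
# Minimal data-quality metrics sketch using pandas.
# Dataset, column names, and thresholds are illustrative placeholders.
import pandas as pd

def quality_metrics(df: pd.DataFrame, key_column: str) -> dict:
    """Compute a few simple data-quality indicators for a dataset."""
    total = len(df)
    return {
        "row_count": total,
        "null_rate_key": float(df[key_column].isna().mean()) if total else 0.0,
        "duplicate_keys": int(df[key_column].duplicated().sum()),
    }

if __name__ == "__main__":
    trades = pd.DataFrame({
        "trade_id": [101, 102, 102, None],
        "notional": [5_000, 12_500, 12_500, 800],
    })
    metrics = quality_metrics(trades, key_column="trade_id")
    print(metrics)
    # A simple threshold check that could feed a dashboard or alert.
    if metrics["null_rate_key"] > 0.01 or metrics["duplicate_keys"] > 0:
        print("Data quality check failed for trade_id")
```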

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

Join our dynamic Integrated Data Platform Operations team and be at the forefront of data innovation. Collaborate with clients and technology partners to ensure data excellence. Elevate your career by driving data quality and governance in a strategic environment. As an Associate in the Integrated Data Platform Operations team, you will work with clients and technology partners to implement data quality and governance practices. You will define data standards and ensure data meets the highest quality. You will play a crucial role in enhancing data management across the Securities Services business.

Responsibilities:
- Define data quality standards
- Investigate data quality issues
- Collaborate with technology partners
- Establish dashboards and metrics
- Support data view and lineage tools
- Embed data quality in UAT cycles
- Assist Operations users with data access
- Work with project teams on implementations
- Implement data ownership processes
- Deliver tools and training for data owners
- Champion improvements to data quality

Required Qualifications, Capabilities, and Skills:
- Engage effectively across teams
- Understand data components for IBOR
- Comprehend trade lifecycle and cash management
- Possess technical data management skills
- Solve operational and technical issues
- Deliver with limited supervision
- Partner in a virtual team environment

Preferred Qualifications, Capabilities, and Skills:
- Demonstrate strong communication skills
- Exhibit leadership in data governance
- Adapt to changing project requirements
- Analyze complex data sets
- Implement innovative data solutions
- Foster collaboration across departments
- Drive continuous improvement initiatives

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Professional Requirements:
- Ability to work in an agile environment
- Experience in Banking, Finance or Insurance is a requirement; experience in Re-insurance & Catastrophe is a plus
- Good analytical capabilities
- Experience using programming languages and database technologies is an advantage
- Ability to manage customers in a dynamically changing environment
- Understanding of complex business scenarios using professional analytical skills
- Self-driven and autonomous, with the ability to work closely with other business analysts, architects, and the development team towards a quality product rollout
- Experience dealing with complex systems: real-time, operational, and analytical (including interacting with these systems and defining specifications for them)
- Demonstrated experience with software development life cycle methodologies, including agile
- Worked in environments that demand fast-paced, effective delivery under tight timelines
- Data access, data exploration, SQL, etc. for interacting with data
- Working with APIs is a good plus, needed for interaction with data and systems
- Confluence for documentation, specifications, etc.; JIRA; BDD templates

Academic Background:
- Bachelor's degree in Computer Science, Economics, or a related quantitative field
- English language proficiency required; German language skills are a plus

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

, India

On-site

How is this team contributing to the vision of Providence?
The EBA team is responsible for providing Oracle ERP implementation support across the HCM, SCM & Finance domains. We cater to functional, technical, infrastructure & app security requirements for end-to-end Oracle Cloud ERP implementation.

What will you be responsible for?
As an Analyst, you are responsible for analyzing business needs to help ensure the Oracle solution meets the customer's objectives by combining industry best practices, product knowledge, and business acumen. Your specialization will be focused on solution design and implementation aspects of engagements, ensuring high-quality, integrated software solutions within constraints of time and budget. You will deliver innovative integration solutions using Oracle's PaaS offerings and maintain or enhance the existing integration solution.

What would your day look like?
- Act as a domain expert providing best-practice guidance on intercompany business processes and implementation approaches.
- Assist with defining scope and estimates for new projects or builds.
- Understand business requirements, convert them into system configurations in Oracle modules, and bring in diverse perspectives.
- Gather requirements, conduct fit-gap and impact analysis, and design solutions.
- Communicate complex technology solutions to diverse teams, namely technical, business, and management teams.
- Draft and review functional specification documents.
- Help investigate and resolve system functional and technical errors.
- Troubleshoot systems and data, and generate solutions which may include system modifications, enhancements, or new procedures.
- Identify and analyze operational and systems issues and opportunities, and produce effective solutions.

Who are we looking for?
- 4+ years of full-lifecycle experience across a minimum of 3 large Oracle HCM implementations.
- A strong understanding of best practices across a range of business processes, cross-pillar dependencies, and related application implementation design and configuration options within large-scale, multi-application implementation and business transformation programs.
- Experience with designing solutions, conducting fit-gap analysis, configuring setups in different HCM modules, and drafting TFS documents.
- Awareness and understanding of the capabilities across a wide range of Oracle applications and platform components, including ERP, EPM, SCM, HCM, Analytics, and Integration.
- Oracle HCM Cloud Fusion experience in at least one or more of these modules: Global Human Resources, Benefits, Global Payroll, Time & Labor, Absence Management, Goal Management, Performance Management, Talent Review, and Workforce Compensation.
- Experience in technical skills like BIP Reports, OTBI, HCM Extracts, conversions (HDL, PBL), workflows, security (data access), and notification templates.
- Bachelor's Degree (Computer Science, Business Management, Information Services, or an equivalent combination of education and relevant experience).

Posted 2 months ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

The Database Test and Tools Development team for Linux/Unix OS platforms is looking for bright and talented engineers to work on the Linux on zSeries platform. It is an opportunity to demonstrate your skills as a Test Development Engineer. The team has the unique opportunity to make significant contributions to testing the Oracle database technology stack across different vendor platforms such as zLinux and LoP.

Detailed Description and Job Requirements
The team works on upcoming releases of the Oracle Database across functional areas such as XML/XDB, Real Application Clusters, Flashback, Oracle Storage Appliance, Automatic Storage Management, data access, Data Warehouse, Transaction Management, Optimization, Parallel Query, ETL, OLAP, Replication/Streams, Advanced Queuing/Messaging, Oracle Text, Backup/Recovery, High Availability, and more. The team has good opportunities to learn, identify, and work on initiatives to improve productivity, quality, testing infrastructure, and tools for automation.

We are looking for engineers with the following requirements:
- B.E / B.Tech in CS or equivalent with a consistently good academic record and 4+ years of experience.
- Strong in Oracle SQL, PL/SQL, and database concepts.
- Experience with the UNIX operating system; good understanding of UNIX concepts, commands, and services.
- Knowledge of C/C++ or Java.
- Experience with shell scripting, Perl, or Python; proficiency in at least one or two.
- Good communication skills.
- Good debugging skills.
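As a small illustration of the kind of SQL-focused test automation this role involves, here is a hedged Python sketch of a functional check using the standard DB-API pattern, shown with the python-oracledb driver. The connection details, query, and assertion are hypothetical placeholders, not part of the posting.

```python
# Minimal functional test sketch using the Python DB-API pattern.
# Driver choice (python-oracledb) and all connection/query details are
# illustrative placeholders.
import oracledb

def test_row_count_matches_expected():
    # Connection parameters would come from a test configuration in practice.
    conn = oracledb.connect(user="test_user", password="test_pass",
                            dsn="dbhost:1521/testpdb")
    try:
        cur = conn.cursor()
        # A trivial functional check: the query must return exactly one row
        # with the expected value.
        cur.execute("SELECT COUNT(*) FROM dual")
        (count,) = cur.fetchone()
        assert count == 1, f"expected 1 row in dual, got {count}"
        print("PASS: dual row count check")
    finally:
        conn.close()

if __name__ == "__main__":
    test_row_count_matches_expected()
```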

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies