37 Job openings at KPI Partners
About KPI Partners

KPI Partners is a global strategic partner for Analytics and Digital Transformation solutions, delivering advanced capabilities in Data Science, GenAI, AI/ML, Cloud Databases, Data Engineering, Analytics & Visualization, and DevOps/DevSecOps/MLOps. Founded in 2006 and headquartered in Newark, California, KPI Partners supports a wide range of clients, including several Fortune 500 companies across industries such as high-tech, retail, CPG, financial services, insurance, manufacturing, and life sciences. Recognized by Gartner, the company has completed over 1,000 successful projects for more than 300 clients. With a global team of 600+ consultants located across North America, Latin America, and India, KPI Partners offers deep industry expertise and broad delivery capabilities. Offices are located in Silicon Valley (HQ), Boston, New York, Atlanta, Los Angeles, Guadalajara (Mexico), Bangalore, Hyderabad, and Pune. For organizations seeking a trusted partner in Data Analytics, AI, and Digital Transformation, KPI Partners offers scalable, cloud-native solutions to unlock data-driven growth.

Gartner references: Gartner Inc.'s "Tool: Vendor Identification for AI, Data and Analytics Service Providers" - https://t.ly/iWEo; Gartner's Market Guide for Data and Analytics Service Providers, 2019 - https://bit.ly/2oOW4rU

SAP SAC and Datasphere Consultant

Pune, Bengaluru, Hyderabad

3 - 8 years

INR 20.0 - 32.5 Lacs P.A.

Hybrid

Full Time

Role: SAP SAC and Datasphere Consultant

Job Description:
- Report building, all on SAP Datasphere; back-end data modeling, with the logical implementation done on SAC.
- Work across S/4HANA, SQL, and Datasphere: preparing the data, data cleansing, and enabling data into Datasphere, then building the reports.
- Experience with an S/4HANA backend database alongside SAC and Datasphere.
- Strong SQL background.
- 3-4 years of experience in these areas is sufficient (for Datasphere, about 1 year is the target).

QA Automation Tester

Pune

5 - 8 years

INR 7.0 - 10.0 Lacs P.A.

Work from Office

Full Time

KPI Partners, Inc. is looking for a QA Automation Tester to join our dynamic team and embark on a rewarding career journey. A QA Automation Tester, also known as an Automation Test Engineer or QA Automation Engineer, is responsible for designing, developing, and executing automated test scripts to ensure the quality and functionality of software applications. Key responsibilities and tasks associated with the role include:

- Test Planning and Strategy: Collaborate with the QA team and other stakeholders to understand project requirements and define the test strategy. This involves identifying test objectives, determining the scope of automation, and selecting appropriate automation tools and frameworks.
- Test Script Development: Design and develop automated test scripts using scripting or programming languages such as Java, Python, or C#. These scripts simulate user interactions, validate software functionality, and perform data-driven testing.
- Test Execution and Reporting: Execute automated test scripts and analyze the results to identify defects or issues. QA Automation Testers conduct regression testing, performance testing, and other types of tests to ensure the software meets the specified requirements. They also generate test reports and provide detailed defect reports to facilitate bug tracking and resolution.
- Test Framework Development: Build and maintain test frameworks and libraries to support test automation efforts. This involves creating reusable components, implementing coding standards, and integrating automation scripts with Continuous Integration (CI) systems.
- Test Environment Setup: Configure and maintain test environments, including software, hardware, and network setups, ensuring the testing environment is consistent and representative of the production environment.
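
For illustration only, here is a minimal sketch of the kind of automated UI check such a role produces, using pytest and Selenium WebDriver; the URL and element IDs are hypothetical, not taken from any real application.

```python
# Illustrative sketch: a pytest + Selenium check against a hypothetical login page.
# The URL and element IDs are placeholders, not a real application under test.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def driver():
    # Chrome must be installed; Selenium Manager resolves the driver binary.
    drv = webdriver.Chrome()
    yield drv
    drv.quit()


def test_login_shows_dashboard(driver):
    driver.get("https://example.com/login")                      # hypothetical app
    driver.find_element(By.ID, "username").send_keys("qa_user")  # hypothetical IDs
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()
    # A simple functional assertion on the post-login page.
    assert "Dashboard" in driver.title
```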

Data Analyst

Pune

0 - 5 years

INR 3.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Analytics Use Case Lifecycle - Requirements through Implementation.

- Partner with business teams to understand their data, data quality, KPI, and reporting requirements.
- Work closely with product managers, architects, and business units for knowledge sharing, mentoring, and training.
- Strong analytical skills: a creative problem solver with the ability to aggregate and analyze large amounts of data from various sources.
- Knowledge of BI tools, dashboard design, and information visualization techniques.
- Experience with database technologies.
- Self-directed and passionate about data, delivering the right results, deriving meaning, and optimizing the visual display of information.
- A sense of pride and personal accountability for the resulting customer experience is a must.
- A curious business mindset with the ability to condense complex concepts and analysis into clear, concise takeaways that drive action.
- Strong written and verbal communication skills.

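For illustration, a short pandas sketch of the kind of aggregation this role describes (condensing raw records into a KPI summary that could feed a dashboard); the file and column names are hypothetical.

```python
# Illustrative sketch: aggregating raw order records into a KPI table with pandas.
# The CSV file and column names are hypothetical.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Monthly revenue and distinct order count per region.
kpis = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M"))
    .groupby(["region", "month"], as_index=False)
    .agg(revenue=("amount", "sum"), orders=("order_id", "nunique"))
    .sort_values(["region", "month"])
)
print(kpis.head())
```
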
Azure Software Engineer

Pune

0 - 5 years

INR 3.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Job Description: We are seeking a Software Engineer with expertise in Generative AI and Microsoft technologies to design, develop, and deploy AI-powered solutions using the Microsoft ecosystem. You will work with cross-functional teams to build scalable applications leveraging generative AI models and Azure services.

Key Responsibilities:
- Design and fine-tune generative AI models (e.g., GPT, Turing) for text, image, or other modalities using Azure.
- Integrate AI features into applications leveraging Azure Cognitive Services, Power Platform, and .NET.
- Build, deploy, and optimize AI solutions using Azure Machine Learning and Azure OpenAI Service.
- Implement efficient pipelines with Azure Kubernetes Service, Azure Functions, or App Services.
- Write clean, maintainable code using C#, Python, and Microsoft DevOps tools.
- Monitor and update deployed AI models, ensuring performance and compliance with Microsoft Responsible AI principles.

Qualifications:
- Strong proficiency in C#, .NET, Python, and Azure cloud services.
- Experience with Azure Machine Learning, Azure OpenAI, and Cognitive Services.
- Knowledge of integrating AI with Power Apps, Power Automate, or Dynamics 365.
- Familiarity with AI frameworks (e.g., PyTorch, TensorFlow) and ONNX on Azure.
- Experience with Azure DevOps for CI/CD and Git workflows.
- Understanding of data systems like Azure SQL, Cosmos DB, or Dataverse.

Education: Bachelor's or Master's in Computer Science, AI, or a related field.
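
For illustration, a minimal sketch of calling a chat model hosted on Azure OpenAI Service from Python with the openai SDK (v1+); the endpoint, deployment name, and prompt below are placeholders, not project specifics.

```python
# Illustrative sketch: a chat completion via Azure OpenAI Service using the openai SDK.
# Endpoint, API key variable, deployment name, and prompt are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # name of your Azure deployment, not the base model
    messages=[
        {"role": "system", "content": "You summarize support tickets."},
        {"role": "user", "content": "Customer cannot reset their password from the portal."},
    ],
)
print(response.choices[0].message.content)
```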

Senior Data Scientist Lead

Pune, Bengaluru, Hyderabad

8 - 12 years

INR 12.0 - 22.0 Lacs P.A.

Hybrid

Full Time

Desired candidate profile:
- 8-12 years of experience as a Data Scientist or in a related field, with expertise in machine learning algorithms.
- Strong proficiency in programming languages such as Python, with knowledge of popular libraries like NumPy, Pandas, and Matplotlib, and experience developing scalable, efficient code for AI/ML workflows.
- Experience with big data processing technologies such as Spark, Hadoop, or Azure Databricks.
- Excellent problem-solving skills, with the ability to work independently on projects requiring strong analytical thinking.
- Proficiency in Pandas and PySpark for data preprocessing.
- Experience with cloud platforms such as Databricks, Azure ML, AWS SageMaker, or equivalent services for managing ML workflows and AI/ML engineering.
- Expertise in working with APIs (Python REST APIs and OpenAPI) for integrating machine learning models and data solutions.
- Proficient in end-to-end AI/ML lifecycle management, including MLOps and LLMOps, for seamless model development, deployment, monitoring, and retraining.

Roles and Responsibilities:
- Develop and implement data science solutions using Python, AI, machine learning, and deep learning techniques.
- Collaborate with cross-functional teams to design and develop predictive models for complex business problems.
- Design and maintain large-scale data pipelines to extract insights from structured and unstructured data sources.
- Conduct exploratory data analysis (EDA) to identify trends, patterns, and correlations in datasets.
- Communicate technical results effectively through reports, presentations, and visualizations.
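
As a minimal illustration of the model-development part of this workflow, here is a pandas plus scikit-learn sketch; the dataset, column names, and target are hypothetical.

```python
# Illustrative sketch: a basic supervised-learning workflow with pandas and scikit-learn.
# The CSV file, feature columns, and target column are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv").dropna()
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Simple holdout evaluation before handing the model to MLOps for deployment and monitoring.
print(classification_report(y_test, model.predict(X_test)))
```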

Python Engineer

Pune, Bengaluru, Hyderabad

3 - 6 years

INR 7.0 - 16.0 Lacs P.A.

Hybrid

Full Time

About KPI Partners: KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are one of the top data, analytics, and AI partners for Microsoft, AWS, Google, Snowflake, and Databricks. Founded in 2006, KPI has over 500 consultants and has successfully delivered over 1,000 projects to our clients in the US. We are looking for skilled data engineers who want to work with the best team in data engineering!

About the Role: We are seeking highly skilled and experienced Data Engineers to join our dynamic team at KPI's Bengaluru office. You will work on challenging, multi-year data transformation projects for our clients. This is an excellent opportunity for a talented data engineer to play a key role in building innovative data solutions. If you are passionate about working with large-scale data systems and enjoy solving complex engineering problems, this role is for you.

Role & responsibilities:
- Design, develop, test, and deploy data pipelines using Azure Data Factory (ADF) to extract insights from various sources.
- Collaborate with cross-functional teams to gather requirements and design solutions that meet business needs.
- Develop high-quality code in Python using NumPy, Pandas, and SQL to process large datasets.
- Troubleshoot issues related to ADF pipeline execution and provide timely resolutions.

Key skills: Python, Pandas, SQL, Azure
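
For illustration, a small sketch of the Python/SQL processing step such a pipeline might call; the connection string, table, and column names are assumptions.

```python
# Illustrative sketch: pull a table with SQL and clean it with pandas/NumPy, the kind of
# transformation step an ADF pipeline might invoke. Connection string, table, and columns
# are placeholders.
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@host:5432/sales")  # placeholder

df = pd.read_sql("SELECT order_id, amount, shipped_at FROM orders", engine)

cleaned = (
    df.drop_duplicates(subset="order_id")
      .assign(
          amount=lambda d: d["amount"].clip(lower=0),                  # guard against bad values
          unshipped=lambda d: np.where(d["shipped_at"].isna(), 1, 0),  # flag unshipped orders
      )
)
cleaned.to_parquet("orders_clean.parquet", index=False)
```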

Scrum Master

Pune, Bengaluru, Hyderabad

8 - 13 years

INR 20.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Position: Scrum Master
Experience: 8+ years
Location: Hyderabad / Bangalore / Pune / Chennai

Key Responsibilities:
- Lead Agile ceremonies (sprint planning, stand-ups, retrospectives) for data projects.
- Facilitate collaboration between global stakeholders and cross-functional teams.
- Ensure project deliverables are aligned with business goals and timelines.
- Manage dependencies and remove blockers for the data engineering and analytics teams.
- Guide the team on continuous improvement and Agile best practices.
- Track and report on project progress, risks, and issues.
- Collaborate with multiple external partners/vendors to ensure project success.

Qualifications:
- 5+ years of experience as a Scrum Master, with a focus on data/analytics projects.
- Experience working in cloud environments, especially GCP and Looker.
- Strong communication and stakeholder management skills.
- Proven ability to lead distributed teams across different time zones.
- Certified Scrum Master (CSM) or similar Agile certification preferred.
- Experience working with external partners and vendors.

ETL Developer

Pune, Bengaluru, Hyderabad

5 - 10 years

INR 20.0 - 35.0 Lacs P.A.

Hybrid

Full Time

Position: ETL Developer
Experience: 5+ years
Location: Hyderabad / Bangalore / Pune
Mandatory Skills: Python, Informatica, Salesforce

Job Description: Develop data pipelines moving from Informatica to AWS Glue, with Glue as the orchestration tool, and connect Glue outputs to visualizations.
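
For illustration, a minimal AWS Glue PySpark job sketch of the kind implied above; the Glue catalog database, table name, and S3 path are placeholders.

```python
# Illustrative sketch: an AWS Glue PySpark job that reads a cataloged table and writes
# curated Parquet to S3. The database, table, and bucket names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table registered in the Glue Data Catalog.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"  # placeholders
)

# Write the result as Parquet for downstream visualization tools.
glue_context.write_dynamic_frame.from_options(
    frame=orders,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},  # placeholder
    format="parquet",
)
job.commit()
```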

Senior Python Web services Developer

Pune, Bengaluru, Hyderabad

7 - 12 years

INR 15.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Role: Senior Python Web Services Developer
Experience: 7-12 years
Location: Hyderabad / Bangalore / Pune
Notice: Immediate to 3 weeks
Must-have skills: Python, APIs, FastAPI, AWS

Interested candidates can share their resume at sahiba.nagpal@kpipartners.com with the subject line 'Sr. Python Developer'.

JD: We are seeking a highly skilled and experienced Senior Python Developer to join our dynamic team. The ideal candidate will have a strong background in Python development, experience with FastAPI, and a solid understanding of AWS. As a senior developer, you will be expected to take a proactive role in driving the team's deliverables and ensuring high-quality outcomes.

Key Responsibilities:
- Design, develop, and maintain high-quality software solutions using Python.
- Build and maintain APIs using FastAPI to ensure robust and scalable backend services.
- Utilize AWS services to deploy, manage, and scale applications in a cloud environment.
- Lead and mentor a team of developers, providing technical guidance and ensuring best practices are followed.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Proactively identify areas for improvement in the codebase and propose solutions.
- Ensure the performance, quality, and responsiveness of applications.
- Conduct code reviews to maintain code quality and share knowledge with the team.
- Troubleshoot and resolve complex technical issues as they arise.
- Stay up to date with the latest industry trends and technologies to ensure our solutions remain cutting-edge.

Required Skills:
- 7-12 years of professional experience in software development with a strong focus on Python.
- Extensive experience with FastAPI for building APIs.
- Proficiency in AWS services and cloud architecture (such as EC2, S3, RDS, Lambda, etc.).
- Strong understanding of web development principles and backend technologies.
- Demonstrated ability to lead and mentor a development team.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work effectively in a collaborative team environment.
- Proactive mindset with the ability to drive projects and deliverables independently.
- Experience with Angular for front-end development.
- Knowledge of containerization technology.

TIA, TAG - KPI Partners
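
For illustration, a minimal FastAPI service sketch consistent with the must-have skills above; the resource model and routes are hypothetical.

```python
# Illustrative sketch: a minimal FastAPI service with an in-memory store.
# The Order model and routes are hypothetical examples.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Orders API")


class Order(BaseModel):
    order_id: int
    amount: float


_ORDERS: dict[int, Order] = {}  # in-memory store, for the sketch only


@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    _ORDERS[order.order_id] = order
    return order


@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    if order_id not in _ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return _ORDERS[order_id]

# Run locally with: uvicorn main:app --reload
```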

ETL Developer

Pune, Bengaluru, Hyderabad

5 - 10 years

INR 22.5 - 27.5 Lacs P.A.

Hybrid

Full Time

Position: ETL Developer
Experience: 5+ years
Location: Hyderabad / Bangalore / Pune
Mandatory Skills: Python, Informatica, Salesforce

Job Description: Develop data pipelines moving from Informatica to AWS Glue, with Glue as the orchestration tool, and connect Glue outputs to visualizations.

Azure Data Engineer

Pune, Bengaluru, Hyderabad

6 - 11 years

INR 12.0 - 22.0 Lacs P.A.

Hybrid

Full Time

Roles & Responsibilities:
- Overall 5+ years of experience, with a minimum of 2+ years in Azure data engineering.
- 4+ years of experience in data engineering or a related field, architecting and developing end-to-end scalable data applications and data pipelines.
- 3+ years of hands-on experience building big data solutions using Azure Data Factory (ADF), Azure Databricks, Azure Data Lake, SQL, big data tooling, and PySpark.
- 3+ years of coding experience with a modern programming or scripting language (Python) and data processing packages/libraries.
- 2+ years of experience in advanced data modelling, SQL, and query performance tuning.
- Lead design, data strategy, and roadmap exercises, and their implementation.
- Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts.
- Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, and testing.
- Review code, provide recommendations, and act as the repository/technology owner.
- Excellent communication skills are mandatory.

Desired Candidate Profile: ADF, strong SQL/T-SQL, Python, PySpark, big data.
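
For illustration, a short PySpark sketch of the kind of transformation run on Azure Databricks or from an ADF pipeline; the storage paths and column names are placeholders.

```python
# Illustrative sketch: a PySpark aggregation over raw order data stored in ADLS.
# Storage paths and column names are placeholders; on Databricks the SparkSession
# is usually provided for you.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

orders = spark.read.parquet(
    "abfss://raw@<storage-account>.dfs.core.windows.net/orders/"  # placeholder path
)

daily = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("order_ts").alias("order_date"), "region")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("orders"),
    )
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@<storage-account>.dfs.core.windows.net/orders_daily/"  # placeholder
)
```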

Aws Data Engineer

Pune, Bengaluru, Hyderabad

7 - 8 years

INR 18.0 - 33.0 Lacs P.A.

Work from Office

Full Time

About KPI Partners: KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are one of the top data, analytics, and AI partners for Microsoft, AWS, Google, Snowflake, and Databricks. Founded in 2006, KPI has over 500 consultants and has successfully delivered over 1,000 projects to our clients in the US. We are looking for skilled data engineers who want to work with the best team in data engineering!

About the Role: We are seeking highly skilled and experienced Data Engineers to join our dynamic team at KPI's Hyderabad, Pune, and Bengaluru offices. You will work on challenging, multi-year data transformation projects for our clients. This is an excellent opportunity for a talented data engineer to play a key role in building innovative data solutions. If you are passionate about working with large-scale data systems and enjoy solving complex engineering problems, this role is for you.

Key Responsibilities:
- Design and build data engineering pipelines using SQL and PySpark.
- Collaboration: Work closely with cross-functional teams to understand business requirements and translate them into robust data solutions.
- Data Warehousing: Design and implement data warehousing solutions, ensuring scalability, performance, and reliability.
- Continuous Learning: Stay up to date with modern technologies and trends in data engineering and apply them.
- Experience in agile delivery methodology in a leading role as part of a wider team.
- Strong team collaboration and experience working with KPI team members and client team members in the US, India, and other global locations.
- Mentorship: Provide guidance and mentorship to junior data engineers, ensuring best practices in coding, design, and development.

Must-Have Skills & Qualifications:
- 3+ years of PySpark and SQL experience building data engineering pipelines.
- Proven expertise in SQL for querying, manipulating, and analyzing large datasets.
- AWS, Databricks.

Good-to-Have Skills:
- Databricks Certification is a plus.
- Azure Certification is a plus.
- Snowflake Certification is a plus.

Education: BA/BS in Computer Science, Math, Physics, or other technical fields is a plus.

Apply Now! If you're ready to take on a key role in data engineering and work on transformative projects with a talented team, we encourage you to apply today.
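
For illustration, a brief sketch combining PySpark with SQL, matching the must-have skills above; the S3 paths, tables, and columns are hypothetical, and on Databricks the SparkSession is already provided.

```python
# Illustrative sketch: register raw datasets as temp views and express the transformation
# in SQL from PySpark. Paths, table names, and columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer_revenue").getOrCreate()

spark.read.parquet("s3://example-bucket/raw/orders/").createOrReplaceTempView("orders")
spark.read.parquet("s3://example-bucket/raw/customers/").createOrReplaceTempView("customers")

result = spark.sql("""
    SELECT c.customer_id,
           c.segment,
           SUM(o.amount) AS lifetime_revenue
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.segment
""")

result.write.mode("overwrite").parquet("s3://example-bucket/curated/customer_revenue/")
```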

Oracle Fusion Techno Functional Consultant (Lead)

Bengaluru

10 - 19 years

INR 30.0 - 40.0 Lacs P.A.

Remote

Full Time

Job Title: Oracle Fusion Technical Consultant (Senior/Lead)
Location: Hyderabad / Bangalore / Pune / Remote
Experience: 9+ years
Employment Type: Full-time

We are seeking an experienced Oracle Fusion Technical Consultant with expertise in BI Publisher (BIP) and Oracle Transactional Business Intelligence (OTBI), and hands-on knowledge of Finance and Supply Chain modules. The ideal candidate should also possess strong SQL skills and have a functional understanding of Oracle ERP processes.

Key Responsibilities:
- Design and develop reports using BI Publisher (BIP) and OTBI.
- Develop and optimize SQL queries for custom reporting and data extraction.
- Work with stakeholders to gather requirements and translate them into technical specifications.
- Support Finance and Supply Chain teams by developing solutions aligned with business processes.
- Troubleshoot and resolve technical issues related to Fusion reports and data flows.
- Collaborate with functional consultants to validate reports and ensure accuracy.

Required Skills & Experience:
- 4+ years of experience in Oracle Fusion technical development.
- Strong hands-on experience with BIP (BI Publisher) and OTBI reporting tools.
- Proficiency in SQL and PL/SQL for report development and data analysis.
- Experience with Oracle Fusion Finance and Supply Chain modules (e.g., AP, AR, GL, PO, INV).
- Good understanding of data models and functional flows in Oracle Fusion.
- Ability to work in an agile environment and manage multiple priorities.

Good to Have:
- Functional knowledge or experience in Oracle Fusion Financials and/or Supply Chain.
- Exposure to OAC.

Educational Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Senior Data Architect

Hyderabad, Pune, Bengaluru

8 - 10 years

INR 30.0 - 45.0 Lacs P.A.

Hybrid

Full Time

Role & responsibilities: Senior Data Architect with at least 10 years of experience, including at least 3 years working in a cloud environment.

Junior Data Modeller

Hyderabad, Pune, Bengaluru

2 - 6 years

INR 10.0 - 15.0 Lacs P.A.

Work from Office

Full Time

- 2+ years of experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeller, working with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in data modelling tools (ER/Studio, Erwin) and IDM/ARDM models across CPG / Manufacturing / Sales / Finance / Supplier / Customer domains.
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly HANA and S/4HANA, and retail data such as IRI and Nielsen.

Must Have: Data Modeling, Data Modeling Tool experience, SQL
Nice to Have: SAP HANA, Data Warehouse, Databricks, CPG

Immediate joiners only. Please apply only if you meet the given requirements.

Data Scientist

Pune

5 - 10 years

INR 8.0 - 12.0 Lacs P.A.

Work from Office

Full Time

Job Summary: We are seeking a highly skilled Data Scientist with deep expertise in computer vision, particularly with YOLO (You Only Look Once) object detection models. The ideal candidate will have experience designing, training, and deploying YOLO-based solutions for real-time or near-real-time applications. You will collaborate with cross-functional teams to develop AI models that drive innovation and efficiency across our products and platforms.

Key Responsibilities:
- Develop and deploy state-of-the-art object detection models using YOLO (v5/v6/v7/v8).
- Design and execute computer vision experiments for image and video analysis, including detection, tracking, and classification.
- Collect, preprocess, and annotate image and video datasets tailored for computer vision tasks.
- Optimize model performance for deployment on edge devices, cloud services, or embedded systems.
- Work closely with data engineers and MLOps teams to integrate models into production pipelines.
- Continuously evaluate and improve models using techniques such as transfer learning, pruning, quantization, and A/B testing.
- Stay updated on the latest research and advancements in computer vision and deep learning.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Electrical Engineering, or a related field (PhD preferred).
- 5+ years of experience in computer vision, with a focus on object detection.
- Proven hands-on experience with YOLO (v5 or later), including custom training and deployment.
- Proficient in Python and deep learning libraries such as PyTorch, TensorFlow, or OpenCV.
- Experience working with GPU acceleration and cloud-based ML platforms (e.g., AWS SageMaker, Azure ML, GCP AI Platform).
- Strong understanding of CNNs, image augmentation, model evaluation metrics, and visual debugging.
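
For illustration, a minimal sketch of fine-tuning and running a YOLOv8 detector with the ultralytics package; the dataset YAML, weights choice, and image file are placeholders.

```python
# Illustrative sketch: fine-tune a pretrained YOLOv8 model and run inference.
# The dataset YAML and image path are hypothetical placeholders.
from ultralytics import YOLO

# Start from pretrained weights and fine-tune on a custom, annotated dataset.
model = YOLO("yolov8n.pt")
model.train(data="custom_dataset.yaml", epochs=50, imgsz=640)  # dataset YAML is hypothetical

# Run inference on a new image and print class, confidence, and bounding box.
results = model("warehouse_frame.jpg")
for box in results[0].boxes:
    cls_id = int(box.cls[0])
    conf = float(box.conf[0])
    print(results[0].names[cls_id], round(conf, 2), box.xyxy[0].tolist())
```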

Power BI Developer

Hyderabad, Pune, Bengaluru

3 - 7 years

INR 7.0 - 11.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities:
- Modify and enhance existing Power BI dashboards built on Workday data.
- Design and develop new dashboards and visualizations to support business operations.
- Build and manage Power BI data imports from Workday and related systems using APIs or data connectors.
- Solid understanding of APIs, data gateways, and secure data access.
- Develop and optimize Power Query (M scripts), data transformations, and data modelling.
- Ensure data integrity, accuracy, and security within Power BI reports.
- Stay updated on Power BI and Workday integration capabilities and best practices.
- Knowledge of Azure and Power Apps is an added advantage.

Data Modeller

Hyderabad, Pune, Bengaluru

5 - 7 years

INR 6.0 - 10.0 Lacs P.A.

Work from Office

Full Time

- 5+ years of experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeller, working with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in data modelling tools (ER/Studio, Erwin) and IDM/ARDM models across CPG / Manufacturing / Sales / Finance / Supplier / Customer domains.
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data such as IRI and Nielsen.

Must Have: Data Modeling, Data Modeling Tool experience, SQL
Nice to Have: SAP HANA, Data Warehouse, Databricks, CPG

Power BI Developer

Pune

3 - 8 years

INR 5.0 - 10.0 Lacs P.A.

Work from Office

Full Time

Key Responsibilities:
- Modify and enhance existing Power BI dashboards built on Workday data.
- Design and develop new dashboards and visualizations to support business operations.
- Build and manage Power BI data imports from Workday and related systems using APIs or data connectors.
- Solid understanding of APIs, data gateways, and secure data access.
- Develop and optimize Power Query (M scripts), data transformations, and data modelling.
- Ensure data integrity, accuracy, and security within Power BI reports.
- Stay updated on Power BI and Workday integration capabilities and best practices.
- Knowledge of Azure and Power Apps is an added advantage.

Senior Data Architect

Hyderabad, Pune, Bengaluru

9 - 12 years

INR 16.0 - 21.0 Lacs P.A.

Work from Office

Full Time

Company: KPI Partners
Location: Bangalore, Karnataka, India; Hyderabad, Telangana, India; Pune, Maharashtra, India
Experience: 9 to 16 years

Job Description: KPI Partners, a leader in providing analytics and data management solutions, is seeking a highly skilled Senior Data Architect to join our dynamic team. This position offers an exciting opportunity to work on innovative data solutions that drive business value for our clients. You will be responsible for designing, developing, and implementing data architectures that align with our organizational goals and client requirements.

Key Responsibilities:
- Lead the design and implementation of data architecture solutions, ensuring alignment with best practices and compliance standards.
- Develop comprehensive data models to support different business applications and analytical needs.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Oversee the integration of SAP and other data sources into the Datasphere platform.
- Create strategies for data governance, quality assurance, and data lifecycle management.
- Ensure scalability and performance of data systems through efficient architecture practices.
- Mentor and guide junior data professionals in data architecture and modeling best practices.
- Stay updated with industry trends and emerging technologies in data architecture and analytics.

Required Skills:
- 9 to 12 years of experience in data architecture, data modeling, and data management.
- Strong expertise in SAP systems and data integration processes.
- Proficient in Datasphere or similar data management platforms.
- Solid understanding of data governance, data warehousing, and big data technologies.
- Excellent analytical and problem-solving abilities.
- Strong communication skills to engage with technical and non-technical stakeholders.
- Ability to work independently and as part of a team in a fast-paced environment.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Relevant certifications in data architecture or SAP technologies.
- Experience in cloud data solutions and platform migrations is a plus.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative work environment fostering innovation and creativity.
- The chance to work with cutting-edge technology and notable clients.

KPI Partners

Business Consulting and Services

Newark CA

501-1000 Employees

37 Jobs

Key People
• Meghan McCarthy - CEO
• Greg D. Lentz - CTO

Job Titles Overview

Senior Data Architect (2)
Power BI Developer (2)
SAP SAC and Datasphere Consultant (1)
QA Automation Tester (1)