The AI technology behind SAP's Autonomous Enterprise pitch
Product management VP Richard Grandpierre explains Joule Work, RPT-1 upgrades, deployment options and what giving AI to on-premises customers means for the 2027 S/4HANA deadline.
ORLANDO, Fla. -- SAP announced major additions to its Business AI portfolio at the annual Sapphire conference this week, including new Joule assistants that coordinate teams of AI agents, and packages of AI agents for eight vertical industries. The vendor also showed important updates to Business AI's technical underpinnings, including AI models, infrastructure and development tools, all geared toward achieving the vision of the "autonomous enterprise."
Richard Grandpierre leads the product management team in SAP's Business AI organization, responsible for the AI "core" where SAP's AI models are orchestrated, including foundational models from SAP and agentic AI components. He sat down with Informa TechTarget at Sapphire to explain the announcements.
Editor's note: The interview was edited for length and clarity.
What do you consider the biggest AI advancements announced this week?
Richard Grandpierre: The biggest announcement is our strategic shift to the autonomous enterprise and the Business AI Platform that is powering that autonomous enterprise. But ultimately, we are rethinking the way SAP end users will work and interact with their SAP applications in the future. That is a fundamental shift for our customers as well as ourselves. Before, we had a more user-focused SaaS application portfolio, and we focused on how we could build everything around the user with features and functions that we continuously enhanced to make our ERP structure bigger and more powerful.
Now we're moving away from building lots of features and rethinking how we can automate and take away busy work so the user doesn't have to do all these things manually. It becomes a different paradigm for how people engage with SAP software: the user becomes the controller of a process that orchestrates a fleet of assistants and agents that do the work for them.
We're very excited because it's leapfrogging many of the challenges posed by the user interface. We're going a step beyond so that, ultimately, you just need one interface or connection point to orchestrate work across all the processes involved in your role, through the power of the new Joule framework we're launching.
Joule will be the user's connection point?
Grandpierre: Correct. Joule is the interface on your desktop, your web application or mobile device. Wherever you are, Joule is basically the engagement point for work.
One of the cool things we're announcing today is very different from what we've talked about so far with Joule: It's not just the assistant, the copilot that you interact with in natural language. Now it's Joule Work, a workspace where you can prompt Joule, ask questions and fetch data. Think of it like a dashboard where you can start to configure tiles, different apps that you want to surface, insights that you want to highlight, and then use that interface to get your work done.
You're moving more of your work into this new interface that is a lot more dynamic and powerful in how you can bring data together in real time, change and modify it and transact.
Another announcement here is an upgrade of RPT-1, or Rapid 1, your foundational AI model for predicting outcomes on tabular data, to version 1.5. What's important about it?
Grandpierre: Three things are key. One is we extended the context. The Rapid models work in a similar way to large language models, where you bring some data into the context. Just as you add context data to a prompt for the model to use in generating a response, the Rapid prediction model uses the data set added to the context to predict values. Until now, the context window has been limited to certain sizes depending on the model. We built something called RAP, for retrieval-augmented prediction -- the same idea as retrieval-augmented generation (RAG) for LLMs -- where we can ingest data sets of any size and then retrieve the right context for your prediction. You can now ingest much larger data tables to generate predictions.
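The retrieval step he describes can be illustrated with a minimal sketch: instead of fitting an entire table into a fixed context window, only the rows most similar to the query row are retrieved and handed to the prediction model. The function names, the distance metric and the averaging "model" below are illustrative assumptions, not SAP's actual RAP implementation.

```python
# Minimal sketch of retrieval-augmented prediction (RAP), by analogy to RAG.
# All names here are hypothetical; a real tabular foundation model would
# condition on the retrieved rows rather than simply averaging them.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve_context(table, query_features, k=3):
    """Pick the k rows whose features are closest to the query row."""
    ranked = sorted(table, key=lambda row: euclidean(row["features"], query_features))
    return ranked[:k]

def predict(table, query_features, k=3):
    """Toy stand-in for the prediction model: average the target over
    the retrieved context rows."""
    context = retrieve_context(table, query_features, k)
    return sum(row["target"] for row in context) / len(context)

# A table far larger than any fixed context window could hold...
table = [{"features": [float(i)], "target": float(i) * 2} for i in range(10_000)]
# ...but prediction only needs the nearest rows.
print(predict(table, [100.0], k=3))  # neighbors 99, 100, 101 -> mean target 200.0
```

The point of the pattern is that the table's total size stops mattering; only the retrieval index has to scale, which is what allows "data sets of any size" to feed a fixed-size model context.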
Second, we introduced explainability to the model. It's a prediction model that predicts fields, but we also use LLMs to explain how the model came to its conclusions, and its confidence level. Now you can also chat with your model. We added this to our playground. You can try it out today, take any data set, chat with it and understand its key influences.
Third, we introduced what we call tabular AI. It's basically multi-model selection, like we have with LLMs, where you can use models from Gemini, OpenAI, AWS or any other vendor. We're also integrating tabular models from Prior Labs, a startup from Germany [whose acquisition by SAP was announced last week]. You will be able to orchestrate between these different models and use them for different qualities.
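Orchestrating "between these different models... for different qualities" amounts to a routing layer: each registered model is tagged with the quality it is best at, and requests are dispatched accordingly. This sketch is an assumption about the general pattern, not SAP's orchestration API; the class and the toy models are invented for illustration.

```python
# Hypothetical sketch of multi-model routing for tabular predictions,
# analogous to choosing between LLM vendors per use case.
from typing import Callable, Dict, List

class ModelRouter:
    """Dispatches a prediction request to the model registered for a quality."""

    def __init__(self):
        self._models: Dict[str, Callable[[List[float]], float]] = {}

    def register(self, quality: str, model: Callable[[List[float]], float]):
        self._models[quality] = model

    def predict(self, quality: str, row: List[float]) -> float:
        if quality not in self._models:
            raise KeyError(f"no model registered for quality '{quality}'")
        return self._models[quality](row)

router = ModelRouter()
router.register("fast", lambda row: sum(row) / len(row))             # cheap baseline
router.register("accurate", lambda row: sorted(row)[len(row) // 2])  # pretend heavyweight model
print(router.predict("fast", [1.0, 2.0, 3.0]))      # 2.0 (mean)
print(router.predict("accurate", [1.0, 9.0, 2.0]))  # 2.0 (median)
```

The same row can thus be scored by a cheap model when latency matters and a heavier one when accuracy does, without the caller knowing which vendor sits behind each quality tag.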
The opening keynote seemed more technical than usual, which suggests practical use of AI remains a technical challenge for customers. But what do these developments mean for business leaders, such as CFOs?
Grandpierre: The AI space is evolving quickly with new models and capabilities coming out every week. The technology race is a big topic for SAP as well. You can feel it in how it enters our messaging and our Business AI Platform that we're now talking about in more detail. These are ultimately differentiating capabilities that enable very powerful use cases for business users in different line-of-business domains, such as finance, logistics and supply chain.
There's still a huge learning curve for everyone in the industry to understand the different technologies and how to use them in business. We also want to provide a glimpse under the hood to explain how we're building this and have credibility that the technology stack is well architected and delivers powerful, reliable and secure outcomes in your business.
Ultimately, of course, the one thing that counts is the actual business value that materializes for CFOs, CIOs, CHROs -- the different domain owners. That's where we have a very strong story with the autonomous enterprise. We go through all these business processes -- financial closing, cash collections, accounts receivable, accounts payable, and so on -- build agents and assistants, and integrate them into your application to do the work for you. You don't have to worry about the tech stack. SAP is taking care of that.
It's understood that large companies probably have relationships with consulting firms like Deloitte and EY. What about medium-sized businesses with, say, under $1 billion in revenue but that maybe have data scientists on staff? How are they going to get it done?
Grandpierre: There are very different ways to do it, and one is not necessarily better than the other. If you have partners to help you implement, that is always a great starting point and typically a good situation to be in. The smaller you are, the less common that is.
Then there are different ways of approaching this with in-house teams. I always advocate having a certain level of AI expertise in your organization because ultimately it is a very complex field. SAP is typically running in a multi-platform landscape, so you have to have a good sense of how to use the different applications, how to connect them and where to prioritize -- also when to partner versus building something on your own. Having your own teams can be a successful way, depending on your size, to take these capabilities and implement them yourself.
SAP is putting a strong focus on making it as easy as possible. We are moving a lot of the AI stack and capabilities into SAP-managed deployment for customers so they don't have to do it themselves. They basically just have to trigger the deployment. Then we provision the systems, connect them and integrate them with our Business Technology Platform components and your business applications.
It can be a good path for smaller companies, as well, to just use the technology from SAP and build it themselves. We are helping our customers on those projects and working very closely with them. We have a special team that we call the Regional Implementation Group.
We have a strong interest in making this easy and seamless so we can get adoption of our AI capabilities and learn from that. It's in our interest for it to work very easily for customers.
Can you explain the role of the SAP Knowledge Graph and the new features announced here?
Grandpierre: A knowledge graph is a semantic representation of your data structures. Think about your APIs and the different endpoints in an application that fetch some data -- maybe sales order data from your S/4HANA system. The knowledge graph is basically a semantic representation of all the APIs and the structure of the tables. If you have an LLM to orchestrate and pull from thousands and thousands of tables and views that we built on top of our data, the SAP Knowledge Graph helps it navigate through this complex structure, find the right tables and query the right components to give you accurate and reliable results.
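The navigation role he describes can be sketched as a mapping from business entities to the tables and keys that hold them, which an orchestrator consults before generating a query. The graph entries, view names and resolver below are illustrative assumptions, not SAP's actual Knowledge Graph schema or API.

```python
# Hypothetical sketch: a knowledge graph as an entity-to-table index that
# lets an orchestrator find the right endpoint among thousands of tables.
knowledge_graph = {
    # entity -> (table/view, key column); names are illustrative only
    "sales order": ("VBAK_VIEW", "order_id"),
    "customer": ("KNA1_VIEW", "customer_id"),
    "invoice": ("BILLING_VIEW", "invoice_id"),
}

def resolve(question: str):
    """Map a natural-language question to a query against the right table.
    Returns None when no known entity matches, rather than guessing."""
    hits = [(entity, loc) for entity, loc in knowledge_graph.items()
            if entity in question.lower()]
    if not hits:
        return None  # surfaces as 'could not fetch the data' instead of a wrong answer
    entity, (table, key) = hits[0]
    return f"SELECT * FROM {table} ORDER BY {key}"

print(resolve("Show my latest sales order"))  # SELECT * FROM VBAK_VIEW ORDER BY order_id
print(resolve("Forecast the weather"))        # None
```

Grounding query generation in an explicit graph like this, rather than letting the LLM guess table names, is what makes the difference between an accurate answer and the "couldn't fetch the data" failures described below.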
We've been talking about our Knowledge Graph for quite some time because we know it's a fundamental technology we need to deliver. What is new here at Sapphire is that we are bringing it into production within Joule. In the new Joule Work interface and architecture, the Knowledge Graph is baked in and pulls data from S/4HANA public cloud, private cloud, Ariba and SuccessFactors. Joule is now knowledgeable about all these applications. The Knowledge Graph helps Joule generate responses much more accurately than before. In the past, Joule was a bit limited. We also had feedback from customers saying it works well on some entities, business objects and queries, but on others, which we hadn't trained it for because the Knowledge Graph wasn't in place, it didn't work so well. It returned a message saying it couldn't fetch the data.
SAP announced today that customers who have said they will migrate to S/4HANA can get some AI features on their legacy on-premises systems. Where does that put the 2027 migration deadline? Until a year or two ago, migrating to S/4HANA was a high priority in SAP's messaging.
Grandpierre: All our work and focus is on the cloud, and we're focused on cloud delivery. It's a whole different game in how we can deliver out-of-the-box AI that works reliably in the customer's system vs. on-premises systems, where we don't have that level of access and can't configure it the right way.
The reality is a lot of customers have on-premises installations. For customers that are migrating to S/4HANA and are already on the journey, we have criteria. They have to have more than 50% of their maintenance converted to the cloud to qualify for that program. We say, OK, you are migrating with us to the cloud so we can deliver all the newest innovations and capabilities to you in the future. We don't want to leave you hanging even if the transformation takes a year or two. We want to support you with whatever we can on your S/4HANA and ECC on-premises systems.
We have to be clear about this. We want to bring these AI capabilities to them, but it's still a significant effort because the technical requirements are different. If an agent in Joule wants to fetch data from the system, when we build this out of the box in the cloud for S/4HANA systems, we know where the endpoints are to fetch that data and can configure that. In an ECC system, those endpoints might not exist, because ECC has a completely different technology layer and customers might have highly customized it. It will require a lot of heavy lifting from us and the customer to deliver certain AI capabilities to on-premises customers.
David Essex is an industry editor who creates in-depth content on enterprise applications, emerging technology and market trends for several Informa TechTarget websites.