These developments suggest that AI's role in 2024 has evolved significantly, moving from experimental tools to integral components of business strategy and daily operations. The question of whether the corporate world should "embrace or evade" AI is a complex one, reflecting both the transformative potential of AI and the risks it introduces. On the one hand, companies that evade AI risk falling behind competitors who adopt it and gain an advantage from it. On the other hand, AI raises ethical questions, particularly around data privacy, bias, job displacement, and accountability. The answer may not be to embrace or evade AI outright, but rather to approach it with a balanced, strategic mindset. Embracing AI thoughtfully can enable companies to innovate and grow while also safeguarding against its risks, creating an ethical and effective AI-driven future.
Since SAP jumped on the AI hype train, we, as an SAP-minded company, wanted to explore what SAP has to offer to realize the aforementioned AI-driven approach by building an AI tool ourselves. In our use-case we focused on customer support as the target business user, since both our company and many of our customers operate in customer support, making it a win-win situation. Furthermore, customer support lends itself perfectly to an AI use-case, as:
- It requires analyzing large volumes of text;
- One needs to respond to the customer in a conversational manner;
- The responses need to be business-specific.
In our case, we leveraged Generative AI to analyze incoming emails from customers by performing categorization, sentiment analysis, and urgency assessment. This helps customer support in analyzing their workload. To make this process more customer- or business-specific, we leverage Retrieval Augmented Generation (RAG) to store and retrieve past emails and responses, so the tool learns from real-world data. We use the information from RAG as context for a prompt in order to generate a user-friendly response to incoming emails, making responding to customers easier and more effective. Finally, our use-case breaks language barriers by translating the emails and their corresponding responses into our preferred working language, in this case Dutch. Faster and better responses lead to customer satisfaction and enhance the workflow for customer support.
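The analysis step can be sketched as follows. This is a minimal illustration, not the actual implementation: the prompt wording is our own, and in the real application the reply string would come back from an LLM via SAP AI Core rather than being hard-coded.

```python
import json

def build_analysis_prompt(email_body: str) -> str:
    """Build a prompt asking the LLM to categorize an incoming email,
    assess its sentiment, and rate its urgency, replying in JSON."""
    return (
        "You are a customer-support assistant. Analyze the email below and "
        "reply with JSON containing the keys 'category', 'sentiment' "
        "(positive/neutral/negative), and 'urgency' (low/medium/high).\n\n"
        f"Email:\n{email_body}"
    )

def parse_analysis(llm_reply: str) -> dict:
    """Parse the model's JSON reply into a dict the support UI can display."""
    return json.loads(llm_reply)

prompt = build_analysis_prompt("My order arrived broken, please fix this today!")

# In the real application this reply is produced by the LLM; here it is a
# hard-coded example of the JSON shape the prompt asks for.
reply = '{"category": "complaint", "sentiment": "negative", "urgency": "high"}'
analysis = parse_analysis(reply)
```

Asking the model for structured JSON rather than free text is what makes the result usable for triage: category, sentiment, and urgency can be shown directly in the support agent's worklist.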
Now that we have described our AI use-case, let’s briefly take a look at some terminology. Firstly, we mentioned the use of Generative AI in our use-case, but what do we actually mean? To put it simply, Generative AI is the subset of AI models that can generate new content, such as text, images, and audio. Currently, Generative AI relies on the use of Large Language Models (LLMs), which are designed to understand and generate human-like language. Tools leveraging Generative AI communicate with LLMs through prompts to produce relevant content. Here, a prompt is an instruction that provides context and direction to the LLM. In our use-case, we used SAP AI Core to infuse our application with Generative AI. In SAP AI Core you can build and maintain your own use-cases with LLMs. Furthermore, you can work with pre-built scenarios to create Generative AI use-cases that leverage LLMs from third parties, such as AWS, Microsoft Azure, or Google Cloud. For administration and prompt management, SAP AI Core works seamlessly with SAP AI Launchpad, which provides a more user-friendly experience on top of AI Core.
Secondly, we mentioned the use of RAG to make our email responses more business-specific. Retrieval Augmented Generation (RAG) is a procedure that combines the retrieval of information with the generation of content. RAG starts by transforming the information into embeddings (i.e. high-dimensional vectors) with a so-called embedding model. Then, based on an incoming prompt, the most relevant embeddings are retrieved and used as context for that prompt. You can see this as an automated form of prompt engineering, where the context is retrieved automatically based on the prompt sent by the user. In our use-case, we used the SAP HANA Cloud Vector Engine to efficiently store and retrieve our embeddings, in this case previously answered emails.
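The retrieve-then-prompt idea behind RAG can be sketched with toy data. In the real application the embeddings come from an embedding model and the similarity search runs inside the SAP HANA Cloud Vector Engine; here, made-up three-dimensional vectors and a plain cosine-similarity search stand in for both.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_embedding, store, k=2):
    """Return the texts of the k stored items most similar to the query."""
    ranked = sorted(store,
                    key=lambda item: cosine(query_embedding, item["embedding"]),
                    reverse=True)
    return [item["text"] for item in ranked[:k]]

# Toy "vector store" of previously answered emails (embeddings are invented
# for this sketch).
store = [
    {"text": "Re: broken order - we shipped a replacement within 2 days.",
     "embedding": [0.9, 0.1, 0.0]},
    {"text": "Re: invoice question - a copy of the invoice is attached.",
     "embedding": [0.0, 0.2, 0.9]},
]

# Embedding of the new incoming email (also invented for this sketch).
query_embedding = [0.8, 0.2, 0.1]
context = retrieve(query_embedding, store, k=1)

# The retrieved past replies become context for the response-generating prompt.
prompt = ("Write a reply to the customer. Use these past replies as context:\n"
          + "\n".join(context))
```

A complaint-like query vector retrieves the past complaint reply rather than the invoice reply, and that reply is then prepended to the prompt — which is exactly the "automated prompt engineering" described above.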
Lastly, in order to properly use Generative AI and RAG in our use-case, we built an application on the SAP Business Technology Platform (SAP BTP), using a standard stack with SAP HANA Cloud, the Cloud Application Programming (CAP) model as back-end, and SAPUI5 as front-end. The standard plugins available for CAP allow the application to seamlessly leverage SAP AI Core as its Generative AI component. We can also use SAP’s Cloud Software Development Kit (SDK) to communicate directly with SAP AI Core. Putting everything together, we get the architecture shown above, where you can see that all the necessary technology runs in the same stack on SAP BTP. Notice that you can also connect the application to other SAP Cloud solutions, such as SAP S/4HANA. To top it off, we can reuse this architecture for other use-cases by provisioning it with Terraform.
Finally, let’s look at some pros and cons of our use case. Some advantages of our use case include:
- Easily switch between models from top tech companies (e.g. AWS, Azure) in SAP AI Core, without using their platform directly;
- Everything running in the same stack (i.e. SAP BTP), making the application easy to maintain;
- Ensured security and protection, since everything is run and maintained by SAP (i.e. data does not leave the SAP landscape).
However, there are also some disadvantages of our use case:
- Extra costs for accessing models through SAP AI Core, in contrast to accessing them directly through the platform of tech companies themselves;
- You are limited to the models and scenarios that SAP provides and supports in SAP AI Core;
- It is not (yet) possible to fine-tune LLMs and other models in SAP AI Core.
Conclusion
In conclusion, we showed a use-case that leverages Generative AI and RAG to streamline the workflow of customer support, creating value by sending more adequate responses to customers at a faster pace. Importantly, this application does not merely replace the work of customer support, but rather assists support agents in their workflows. If you are interested in this particular use-case, come visit our live demo at the VNSG conference on November 28th! Should you not be able to make it to the live demo, feel free to contact us if we have piqued your interest.
Credits
This blog was written by our experts Erik Leemans and Chris Al Gerges.