CAP LLM Plugin helps developers create tailored generative AI-based CAP applications by leveraging SAP HANA Cloud Data Anonymization, the SAP HANA Cloud Vector Engine, and SAP AI Core services. Key features:
- Keep confidential data from being exposed to the LLM by anonymizing sensitive data with SAP HANA Cloud Data Anonymization.
- Seamlessly generate vector embeddings via SAP AI Core.
- Easily retrieve chat completion responses via SAP AI Core (see the sketch after this list).
- Effortlessly perform similarity search via the SAP HANA Cloud Vector Engine.
- Simplified single RAG (retrieval-augmented generation) retrieval method powered by SAP AI Core and SAP HANA Cloud Vector Engine.
- Access the harmonized chat completion API of the SAP AI Core Orchestration service.
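For orientation, here is a minimal sketch of how a CAP service might call the plugin for chat completion. It assumes the plugin is exposed as the CDS service named `cap-llm-plugin` and that `getChatCompletionWithConfig` accepts a model configuration plus an OpenAI-style message payload; the action name, configuration shape, and parameter order are assumptions, so check the API documentation in SAP Samples for the exact signatures.

```js
const cds = require('@sap/cds');

module.exports = class ChatService extends cds.ApplicationService {
  async init() {
    // Hypothetical unbound action 'ask(question)' defined in the service's CDS model.
    this.on('ask', async (req) => {
      // Connect to the CAP LLM Plugin like any other CAP service.
      const llm = await cds.connect.to('cap-llm-plugin');

      // Hypothetical configuration describing the chat model deployed in SAP AI Core.
      const chatConfig = { /* destination, resource group, deployment details, ... */ };

      // Retrieve a chat completion for the user's question via SAP AI Core.
      const response = await llm.getChatCompletionWithConfig(chatConfig, {
        messages: [{ role: 'user', content: req.data.question }]
      });
      return response;
    });
    return super.init();
  }
};
```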
Feature | Details |
---|---|
Seamlessly anonymize sensitive data using a variety of SAP HANA Cloud's anonymization capabilities | Effortlessly anonymize sensitive data within a CAP application by employing a single @anonymize annotation, backed by a diverse range of SAP HANA Cloud's anonymization algorithms |
Effortlessly replace the anonymized data within the LLM response with genuine information | Given that the data provided to the LLM consists of anonymized information, the CAP LLM plugin ensures a seamless replacement of anonymized content within the LLM response with the corresponding authentic data. |
Feature | Details |
---|---|
Embedding generation via SAP AI Core | Connect to embedding models via SAP AI Core and generate embeddings seamlessly |
Similarity search | Leverage SAP HANA Cloud's Vector Engine to perform similarity search via the CAP LLM Plugin |
Chat LLM Access via SAP AI Core | Simple access to LLM models via SAP AI Core with simplified method for chat completion |
Streamlining RAG retrieval | Single method to streamline the entire RAG retrieval process, leveraging SAP AI Core and SAP HANA Cloud Vector Engine (see the sketch after this table) |
Orchestration Service Support | Support for SAP AI Core orchestration service's harmonized chat completion APIs |
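To illustrate the single RAG retrieval method from the table above, the following sketch assumes an SAP HANA Cloud table that already stores document chunks and their vector embeddings. The table name, column names, configuration objects, and the exact parameter list of `getRagResponseWithConfig` are assumptions for illustration; refer to the API documentation in SAP Samples for the real signature.

```js
const cds = require('@sap/cds');

async function answerWithRag(userQuery) {
  const llm = await cds.connect.to('cap-llm-plugin');

  // Hypothetical SAP AI Core configurations for the embedding and chat models.
  const embeddingConfig = { /* deployment details of the embedding model */ };
  const chatConfig = { /* deployment details of the chat model */ };

  // A single call that (conceptually) embeds the query, runs a similarity
  // search in the SAP HANA Cloud Vector Engine, and asks the chat model to
  // answer grounded in the retrieved chunks. Parameter order is assumed.
  const ragResponse = await llm.getRagResponseWithConfig(
    userQuery,
    embeddingConfig,
    chatConfig,
    'DOCUMENT_CHUNKS', // hypothetical table holding the stored embeddings
    'EMBEDDING',       // hypothetical vector column
    'TEXT_CHUNK'       // hypothetical content column
  );
  return ragResponse;
}

module.exports = { answerWithRag };
```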
Please check the samples and documentation:
- For API documentation of the CAP LLM Plugin, refer to SAP Samples.
- For sample use cases leveraging the CAP LLM Plugin, refer to SAP Samples.
From 1.3.* to 1.4.2 (function signatures changed for the following methods; this version is not recommended):
- getEmbedding
- getChatCompletion
- getRagResponse

From 1.3.* to 1.4.4 and above (backwards compatible; new methods support more models):
No change is required unless you want to use the new methods supporting the new models, as described in the API documentation and shown in the sketch after this list:
- getEmbedding (old) -> getEmbeddingWithConfig
- getChatCompletion (old) -> getChatCompletionWithConfig
- getRagResponse (old) -> getRagResponseWithConfig
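The sketch below shows what the call-site change could look like when upgrading, using getEmbedding as the example. The old call is kept as a comment; the shape of the configuration object passed to `getEmbeddingWithConfig` is an assumption, so consult the API documentation for the actual parameters.

```js
const cds = require('@sap/cds');

async function embedText(text) {
  const llm = await cds.connect.to('cap-llm-plugin');

  // Before (1.3.*):
  // const embedding = await llm.getEmbedding(text);

  // After (1.4.4 and above): the *WithConfig variants take an explicit model
  // configuration, which is how the newly supported models are selected.
  const embeddingConfig = { /* hypothetical SAP AI Core deployment details */ };
  const embedding = await llm.getEmbeddingWithConfig(embeddingConfig, text);
  return embedding;
}

module.exports = { embedText };
```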
This project is open to suggestions, bug reports, and similar contributions via GitHub issues. For more information, see our Contribution Guidelines.
If you find any bug that may be a security problem, please follow the instructions in our security policy on how to report it. Please do not create GitHub issues for security-related doubts or problems.
We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone. By participating in this project, you agree to abide by its Code of Conduct at all times.
Copyright 2025 SAP SE or an SAP affiliate company and cap-llm-plugin contributors. Please see our LICENSE for copyright and license information. Detailed information including third-party components and their licensing/copyright information is available via the REUSE tool.