OLS-1869: Update LLM overview topic #95212

Merged 1 commit on Jun 26, 2025
install/ols-installing-openshift-lightspeed.adoc (4 changes: 2 additions & 2 deletions)
@@ -6,8 +6,8 @@ include::_attributes/common-attributes.adoc[]

toc::[]

-The installation process for {ols-official} consists of two main tasks: installing the {ols-short} Operator and configuring the large language model (LLM) provider.
+The installation process for {ols-official} consists of two main tasks: installing the {ols-short} Operator and configuring the {ols-short} Service to interact with the large language model (LLM) provider.

-include::modules/ols-large-language-model-configuration-overview.adoc[leveloffset=+1]
+include::modules/ols-large-language-model-overview.adoc[leveloffset=+1]
include::modules/ols-about-subscription-requirements.adoc[leveloffset=+1]
include::modules/ols-installing-operator.adoc[leveloffset=+1]
modules/ols-large-language-model-overview.adoc
@@ -1,11 +1,21 @@
// Module included in the following assemblies:
// * lightspeed-docs-main/install/ols-installing-openshift-lightspeed.adoc

:_mod-docs-content-type: CONCEPT
[id="ols-large-language-model-configuration-overview_{context}"]
[id="ols-large-language-model-overview_{context}"]

= Large Language Model (LLM) overview

A large language model (LLM) is a type of artificial intelligence program trained on vast quantities of data. The {ols-long} Service interacts with the LLM to generate answers to questions.

= Large Language Model (LLM) configuration overview
You can configure {rhelai} or {rhoai} as the LLM provider for the {ols-long} Service. Either LLM provider can use a server or inference service that processes inference queries.

You can configure {rhelai} or {rhoai} as large language model (LLM) provider for the {ols-long} Service. Either of those LLM providers can use a server or inference service that processes inference queries. Configure the LLM provider before you install the {ols-long} Operator.
Alternatively, you can connect the {ols-long} Service to a publicly available LLM provider, such as {watsonx}, {openai}, or {azure-openai}.

Alternatively, you can connect the {ols-long} Service to one of the publicly available LLM providers, such as {watsonx}, {openai}, or {azure-openai}.
[NOTE]
====
Configure the LLM provider before you install the {ols-long} Operator. Installing the Operator does not install an LLM provider.
====

[id="rhelai-with-ols_{context}"]
== {rhelai} with {ols-long}
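For reviewers, the following is a minimal, illustrative sketch of how an LLM provider might be declared in an `OLSConfig` custom resource after the {ols-long} Operator is installed, matching the ordering the updated overview describes (the provider and its credentials exist first, then the Service is pointed at them). The API version, field names, and all values shown (provider name, model name, endpoint URL, and Secret name) are assumptions for illustration only and are not part of this pull request; consult the {ols-long} documentation for the authoritative schema.

[source,yaml]
----
apiVersion: ols.openshift.io/v1alpha1   # assumed API version
kind: OLSConfig
metadata:
  name: cluster
spec:
  llm:
    providers:
    - name: my-provider                 # hypothetical provider name
      type: openai                      # hypothetical provider type; the provider exposes an inference endpoint
      url: https://llm.example.com/v1   # hypothetical inference endpoint for the configured LLM provider
      credentialsSecretRef:
        name: llm-api-keys              # hypothetical Secret holding the provider API token
      models:
      - name: example-model             # hypothetical model served by the provider
  ols:
    defaultProvider: my-provider        # the Service uses this provider by default
    defaultModel: example-model
----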