
feat(observability): Refactor content observation mechanism to support both logging and tracing #3612


Open · wants to merge 2 commits into main from feature/content-trace

Conversation

traburiss

This commit refactors the content observation mechanism by adding support for trace recording without altering the original logging functionality. The motivation for this change is to ensure complete context availability in scenarios such as integration with Langfuse, which is essential for proper functionality.

Main changes:

1. New configuration options:
   1. `trace-prompt`: Enables recording prompts to the trace
   2. `trace-completion`: Enables recording completions to the trace
   3. `trace-prompt-size`: Limits the length of the prompt context
   4. `content-formatter`: Handles content formatting, currently supporting both 'text' and 'langfuse' modes
2. Added two handlers, `ChatModelCompletionObservationTraceHandler` and `ChatModelPromptContentObservationTraceHandler`, to support recording context to the trace.
3. Introduced the `MessageFormatter` interface and its subclasses to support formatting for prompt and completion content.
4. Rolled back parts of the code and dependencies from commit ca843e8.
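The options above could be wired up roughly like this. This is a hypothetical sketch, not the PR's actual defaults; the `spring.ai.chat.observations` prefix is an assumption based on the `ChatObservationProperties.CONFIG_PREFIX` referenced later in the review:

```yaml
# Hypothetical configuration sketch; prefix and values are assumptions.
spring:
  ai:
    chat:
      observations:
        trace-prompt: true           # record prompts to the trace
        trace-completion: true       # record completions to the trace
        trace-prompt-size: 4096      # limit the length of the prompt context
        content-formatter: langfuse  # 'text' or 'langfuse'
```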

@traburiss force-pushed the feature/content-trace branch from b212a6f to c485721 on June 19, 2025 13:12
@traburiss force-pushed the feature/content-trace branch from c485721 to ef2ad4d on June 19, 2025 13:18
@ilayaperumalg
Member

@jonatan-ivanov Could you take a look at this PR? Thanks!

@shakuzen
Member

It would help in reviewing this if you could share a link to the relevant parts of Langfuse's documentation that specify what they expect for the desired integration. How likely is that to change? This PR reintroduces use of the deprecated API on OTel Spans for adding Event Attributes, which was removed because it is deprecated and the suggested alternative is Log Events.

@traburiss
Author

@shakuzen

https://langfuse.com/docs/integrations/spring-ai

The Langfuse doc still uses `include-prompt` and `include-completion`, but these parameters are now deprecated, and the new parameters `log-prompt` and `log-completion` do not work with Langfuse.
Langfuse needs both prompt and completion data included within spans.
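The formatting side of what the PR describes (truncate content to `trace-prompt-size`, then shape it per backend mode) can be sketched in plain Java. The `MessageFormatter` name comes from the PR description, but everything in this body is illustrative, not the PR's actual code:

```java
// Illustrative sketch of the MessageFormatter idea from the PR description.
// The factory methods and the 'langfuse' wrapping shape are assumptions,
// not the PR's actual implementation.
public interface MessageFormatter {

    String format(String content);

    /** 'text' mode: truncate content to the configured trace-prompt-size. */
    static MessageFormatter text(int maxSize) {
        return content -> content.length() <= maxSize
                ? content
                : content.substring(0, maxSize);
    }

    /** Hypothetical 'langfuse' mode: wrap the truncated content in a small
     *  JSON-like envelope before it is attached to the span. */
    static MessageFormatter langfuse(int maxSize) {
        return content -> "{\"content\":\"" + text(maxSize).format(content) + "\"}";
    }
}
```

A handler would then pick the formatter based on the `content-formatter` property and apply it before recording the content onto the current span.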

```java
		name = "chatModelPromptContentObservationTraceHandler")
@ConditionalOnProperty(prefix = ChatObservationProperties.CONFIG_PREFIX, name = "trace-prompt",
		havingValue = "true")
TracingAwareLoggingObservationHandler<ChatModelObservationContext> chatModelPromptContentObservationTraceHandler(
```

Why are these wrapped into `TracingAwareLoggingObservationHandler`?
Please check what it does: it is used to make tracing data available for log entries (log correlation). If I understand correctly, that is not needed in this use case.

```diff
@@ -61,7 +61,7 @@
 <dependency>
 	<groupId>io.micrometer</groupId>
-	<artifactId>micrometer-tracing</artifactId>
+	<artifactId>micrometer-tracing-bridge-otel</artifactId>
```

spring-ai-commons should not depend on a bridge (implementation instead of the abstraction).
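What the reviewer is asking for, sketched against the diff above: keep the module on the `micrometer-tracing` abstraction rather than swapping in the OTel bridge (a minimal sketch of the suggested dependency, not a verified POM fragment):

```xml
<!-- Depend on the tracing abstraction, not a concrete bridge implementation. -->
<dependency>
	<groupId>io.micrometer</groupId>
	<artifactId>micrometer-tracing</artifactId>
</dependency>
```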

```java
}

@Nullable
public static Span extractOtelSpan(@Nullable TracingObservationHandler.TracingContext tracingContext) {
```

I think we should depend on the abstraction (tracing API) instead of a single implementation (OTel API/SDK).

```diff
@@ -60,7 +60,7 @@
 <dependency>
 	<groupId>io.micrometer</groupId>
-	<artifactId>micrometer-tracing</artifactId>
+	<artifactId>micrometer-tracing-bridge-otel</artifactId>
```

spring-ai-model should not depend on a bridge (implementation instead of the abstraction).

```java
	.getRequired(TracingObservationHandler.TracingContext.class);
Span currentSpan = TracingHelper.extractOtelSpan(tracingContext);
if (currentSpan != null) {
	currentSpan.addEvent(AiObservationEventNames.CONTENT_COMPLETION.value(),
```

It seems this API will be deprecated (maybe already is?), see: micrometer-metrics/micrometer#5238 (comment)

@jonatan-ivanov
Member

> Langfuse doc still uses include-prompt and include-completion, but these parameters are now deprecated. However, the new parameters log-prompt and log-completion cannot work with Langfuse.

This confuses me a bit. Why are we doing this if one option is already deprecated and the other is not supported?

@markpollack
Member

markpollack commented Jun 26, 2025

Basically, my view is that our observability needs to support products like Langfuse. The question is how to achieve that goal.

With regard to this PR:

> Rolled back parts of the code and dependencies from commit ca843e8.

That change in behavior can't occur in 1.0.x, and I'm not sure about the strategy for 1.1 on this front. I'm hoping to use this issue to drive a strategy.

I thought there was flexibility to override the behavior; we could perhaps create ready-to-use repos for products that customize observability in our https://github.com/orgs/spring-ai-community/repositories org.

@ThomasVitale, any suggestions on how to structure the conversation and move forward? A working Google doc would seem to be a good start.

@markpollack
Member

Also related discussion is here - #3151 (comment)
