
[Feature]: Databricks Native Anthropic Messages API Support #15960

@DevOps-zhuang

Description


The Feature

Background:

  • Databricks provides two ways to serve Anthropic models:
    1. OpenAI-compatible endpoints (already supported by LiteLLM via databricks/ provider)
    2. Native Anthropic Messages endpoints (new; the subject of this feature request)

Pain Points Addressed:

  • The OpenAI-compatible format doesn't support Anthropic-specific features like Prompt Caching
  • Users cannot leverage cache token accounting (cache_creation_input_tokens, cache_read_input_tokens) for cost optimization
  • No way to use the full Anthropic Messages API protocol directly with Databricks-hosted models
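To illustrate the cost angle, here is a rough sketch of how cache-aware token accounting could reduce billed input cost. The usage field names follow the Anthropic Messages API; the pricing multipliers (cache writes at ~1.25x the base input rate, cache reads at ~0.1x) are assumptions based on Anthropic's published pricing model, not Databricks-specific values, and the numbers are illustrative only.

```python
# Illustrative sketch: pricing ratios (1.25x for cache writes, 0.1x for
# cache reads) are assumed from Anthropic's public pricing model, not
# Databricks-specific values.

def billed_input_cost(usage: dict, input_rate_per_token: float) -> float:
    """Compute input-side cost from an Anthropic Messages `usage` payload."""
    base = usage.get("input_tokens", 0) * input_rate_per_token
    cache_write = usage.get("cache_creation_input_tokens", 0) * input_rate_per_token * 1.25
    cache_read = usage.get("cache_read_input_tokens", 0) * input_rate_per_token * 0.10
    return base + cache_write + cache_read

# A cache-hit request: most of the prompt is served from cache at ~10% cost.
cached = {"input_tokens": 200, "cache_creation_input_tokens": 0,
          "cache_read_input_tokens": 10_000}
uncached = {"input_tokens": 10_200}

rate = 3e-6  # e.g. $3 per million input tokens (illustrative)
print(billed_input_cost(cached, rate))    # 0.0036
print(billed_input_cost(uncached, rate))  # 0.0306
```

Without the native endpoint surfacing those cache fields, LiteLLM has no way to attribute the cheaper cache-read tokens, so cost tracking overstates spend on cached prompts.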

Motivation, pitch

Suggested Solution:

  • New databricks_anthropic/ provider prefix that uses the native Anthropic Messages API
  • Direct pass-through of Anthropic Messages API requests/responses
  • Full support for Anthropic features including Prompt Caching
  • Proper extraction and tracking of cache-related token usage
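As a sketch of what direct pass-through enables, the snippet below builds an Anthropic Messages request body with a `cache_control` breakpoint on the system prompt. The `databricks_anthropic/` prefix and the model name are hypothetical, taken from this proposal rather than any shipped LiteLLM release; the request body itself follows the Anthropic Messages API shape, which a pass-through provider would forward to the Databricks serving endpoint unchanged.

```python
# Hypothetical sketch: "databricks_anthropic/" is the prefix proposed in this
# issue, not a released LiteLLM provider, and the model name is made up for
# illustration. The body follows the Anthropic Messages API shape that a
# pass-through provider would forward unchanged.

model = "databricks_anthropic/databricks-claude-sonnet-4"  # hypothetical name

request_body = {
    "model": model.split("/", 1)[1],  # strip the LiteLLM routing prefix
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": "You are a helpful assistant. <large shared context here>",
            # Anthropic-native field the OpenAI-compatible route cannot express:
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [{"role": "user", "content": "Summarize the context."}],
}

print(request_body["model"])  # databricks-claude-sonnet-4
```

Because the body is forwarded as-is, any future Anthropic-only field would work without LiteLLM-side translation, which is the main argument for a native route alongside the existing OpenAI-compatible one.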

