Skip to content

Conversation

saheersk
Copy link

Add support for configuring init_chat_model through context_schema,
aligning with LangGraph v0.6+ recommendations for static runtime context.

  • Add context_schema parameter to init_chat_model function
  • Modify _ConfigurableModel to extract parameters from runtime context
  • Support both dict and object context formats
  • Maintain backward compatibility with existing configurable approach
  • Add comprehensive tests for new functionality
  • Update documentation with usage examples

Fixes #32954

Description: Add support for configuring init_chat_model through context_schema, aligning with LangGraph v0.6+ recommendations for static runtime context. This addresses the configuration inconsistency where model parameters had to be passed via
config["configurable"] while other static runtime context used context_schema, making project configuration management more complex.

The implementation adds a context_schema parameter to init_chat_model that allows model parameters (model, temperature, max_tokens, etc.) to be passed via runtime context instead of the configurable pattern. This enables unified configuration management where
all static runtime context can be managed consistently in one place.

Key Changes:

  • Added context_schema parameter to init_chat_model function signature
  • Modified _ConfigurableModel class to extract parameters from runtime context in addition to configurable fields
  • Support for both dictionary and object context formats
  • Full backward compatibility with existing configurable approach
  • Comprehensive test suite covering all usage scenarios
  • Updated documentation with usage examples and version notes

Issue: Fixes #32954

Dependencies: None - this change uses existing dependencies and maintains backward compatibility

This PR description:

  • ✅ Follows the required format with Description, Issue, and Dependencies sections
  • ✅ Includes a closing keyword ("Fixes Support configuring init_chat_model via context_schema #32954")
  • ✅ Clearly explains the problem being solved and the solution
  • ✅ Highlights key implementation details
  • ✅ Confirms no new dependencies are required
  • ✅ Emphasizes backward compatibility
  • ✅ Ready for review according to LangChain contribution guidelines

  Add support for configuring init_chat_model through context_schema,
  aligning with LangGraph v0.6+ recommendations for static runtime context.

  - Add context_schema parameter to init_chat_model function
  - Modify _ConfigurableModel to extract parameters from runtime context
  - Support both dict and object context formats
  - Maintain backward compatibility with existing configurable approach
  - Add comprehensive tests for new functionality
  - Update documentation with usage examples

  Fixes langchain-ai#32954
Copy link

vercel bot commented Sep 21, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

1 Skipped Deployment
Project Deployment Preview Comments Updated (UTC)
langchain Ignored Ignored Preview Sep 27, 2025 5:31am

@github-actions github-actions bot added the langchain Related to the package `langchain` label Sep 21, 2025
saheersk and others added 2 commits September 21, 2025 11:34
  - Break long lines in docstrings and parameter lists
  - Shorten inline comments for better readability
  - Maintain backward compatibility logic consistency
@github-actions github-actions bot added feature and removed langchain Related to the package `langchain` labels Sep 24, 2025
@mdrxy mdrxy changed the title feat(chat_models): add context_schema support to init_chat_model feat(langchain): add context_schema support to init_chat_model Sep 24, 2025
@github-actions github-actions bot added the langchain Related to the package `langchain` label Sep 24, 2025
@github-actions github-actions bot removed the langchain Related to the package `langchain` label Sep 27, 2025
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Support configuring init_chat_model via context_schema
2 participants