Improve Import Error Messages for LLM Client Dependencies #4605

@Xurnnba

Description

What feature would you like to be added?

Current Behavior

When an import error occurs while initializing LLM clients (e.g., Gemini, Anthropic, etc.), the code raises a generic ImportError with a hardcoded message telling the user to install specific packages. However, the actual import failure may have a different cause, such as a missing transitive dependency of those packages or a version conflict.

For example, when trying to use the Gemini client, if there's an error importing vertexai, the code shows:

    ImportError: Please install google-generativeai to use Google OpenAI API.

even though the actual error was:

    ImportError: No module named 'vertexai'
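The loss of information can be reproduced with a minimal sketch of the current pattern. Here `fake_vertexai_dep` is a placeholder module used only to trigger an ImportError deterministically, and the names are illustrative rather than autogen's exact code:

```python
# Sketch of the current behavior: the original ImportError is caught at
# import time, but the error raised later uses a hardcoded message, so
# the real cause (the missing vertexai-style dependency) is hidden.
gemini_import_exception = None
try:
    import fake_vertexai_dep  # placeholder for vertexai; guaranteed to be missing
except ImportError as e:
    gemini_import_exception = e


def init_gemini_client():
    if gemini_import_exception:
        # The hardcoded message drops the original cause entirely.
        raise ImportError(
            "Please install google-generativeai to use Google OpenAI API."
        )
```

Calling init_gemini_client() here raises an error that never mentions the module that actually failed to import.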

Why is this needed?

Proposed Change

Modify the import error handling to:

  1. Include the original error message in the raised ImportError
  2. Provide more context about potential dependencies

Example implementation:
if gemini_import_exception:
    raise ImportError(
        f"Failed to initialize Gemini client. Original error: {gemini_import_exception}\n"
        "Required packages:\n"
        "- google-generativeai\n"
        "- google-cloud-aiplatform (provides vertexai)"
    ) from gemini_import_exception
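A self-contained version of the proposed change can be sketched as follows, with `raise ... from` so the original exception's message and traceback are chained onto the new error. `fake_vertexai_dep` is again a placeholder used only to trigger the failure deterministically:

```python
# Sketch of the proposed fix: record the import failure, then surface it
# alongside the dependency guidance instead of replacing it.
gemini_import_exception = None
try:
    import fake_vertexai_dep  # placeholder for the real optional dependency
except ImportError as e:
    gemini_import_exception = e


def init_gemini_client():
    if gemini_import_exception is not None:
        # Chain the original exception so its cause survives in __cause__
        # and appears in the traceback ("The above exception was the
        # direct cause of ...").
        raise ImportError(
            f"Failed to initialize Gemini client. Original error: {gemini_import_exception}\n"
            "Required packages:\n"
            "- google-generativeai\n"
            "- google-cloud-aiplatform (provides vertexai)"
        ) from gemini_import_exception
    return "client"
```

With this pattern, the user sees both the actionable install guidance and the underlying cause in one error.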

Benefits

  • Users get more accurate information about what's actually causing the import failure
  • Easier troubleshooting of dependency issues
  • Clearer guidance on required packages and their relationships

Affected Files

  • /autogen/oai/client.py
