AI Chat with MCP Servers Using Any LLM Model
Make sure you have installed `uvx` or `npx` on your system:
```shell
# uvx
brew install uv

# npx
brew install node
```
- Configure your LLM API key and endpoint on the `Setting` page
- Install an MCP server from the `MCP Server` page
- Chat with the MCP server
Download: macOS | Windows | Linux
- logs: `~/Library/Application Support/run.daodao.chatmcp/logs`
- chat history (chatmcp.db): `~/Documents/chatmcp.db`
- MCP server config (mcp_server.json): `~/Documents/mcp_server.json`
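If you want to inspect this data directly, the commands below are a quick sketch using standard tools; the exact log file names and the tables inside chatmcp.db are not documented here, so adjust as needed:

```shell
# List the log files (exact file names may vary)
ls ~/Library/Application\ Support/run.daodao.chatmcp/logs

# Inspect the chat history database with the sqlite3 CLI
sqlite3 ~/Documents/chatmcp.db ".tables"

# Review the MCP server configuration
cat ~/Documents/mcp_server.json
```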
To reset the app, remove its data with:

```shell
rm -rf ~/Library/Application\ Support/run.daodao.chatmcp
rm -rf ~/Documents/chatmcp.db
rm -rf ~/Documents/mcp_server.json
```
To run the app from source:

```shell
flutter pub get
flutter run -d macos
```
Download `test.db` to test the SQLite MCP server. `~/Documents/mcp_server.json` is the configuration file for the MCP server.
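As an illustration, an entry for a SQLite MCP server in `~/Documents/mcp_server.json` might look like the sketch below. This assumes the common `mcpServers` layout used by MCP clients and the `mcp-server-sqlite` package launched via `uvx`; the exact schema chatmcp expects and the database path may differ, so treat the keys and values as placeholders.

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "/Users/<you>/Documents/test.db"]
    }
  }
}
```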
Features and roadmap:
- Chat with MCP Server
- MCP Server Market
- Auto install MCP Server
- SSE MCP Transport Support
- Auto Choose MCP Server
- Chat History
- OpenAI LLM Model
- Claude LLM Model
- Ollama LLM Model
- RAG
- Better UI Design
All feature requests are welcome; you can submit your ideas or bug reports in Issues.
You can install MCP servers from the MCP Server Market, a collection of MCP servers that you can use to chat with different data sources.
This project is licensed under the GNU General Public License v3.0 (GPL-3.0).