[Bug] 📄 Does Not Appear in AI Chat Page in AppFlowy AI Using AppFlowy-LAI #7592

bobdowns opened this issue Mar 23, 2025

Bug Description

The instructions at https://appflowy.com/guide/intro-to-appflowy-ai state that users should "Click 📄 to select your page source(s)" in the AI Chat interface. However, when using AppFlowy-LAI with AppFlowy version 0.8.7 on macOS Sequoia 15.3.2, the 📄 button does not appear. Attempts to reference pages manually using "@", "/", or "[]" also fail to retrieve document-specific context. The AI responses remain generic and do not reference any page content.

How to Reproduce

  1. Install AppFlowy version 0.8.7 on macOS Sequoia 15.3.2.
  2. Set up AppFlowy-LAI following instructions at https://appflowy.com/guide/appflowy-local-ai-ollama.
  3. Open AppFlowy and navigate to AI Settings.
  4. Enable "AppFlowy Local AI (LAI)" and set Ollama server URL to http://localhost:11434.
  5. Open an AI Chat page and attempt to reference a page source:
    • Look for the 📄 button (it does not appear).
    • Use "@", "/", or "[]" syntax to reference pages (none work).
  6. Type a prompt that expects document-specific context (e.g., "What was my weight at my weight loss doctor visit on 9/20/24?").
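As a sanity check between steps 4 and 5, the Ollama server configured in AI Settings can be probed directly; a minimal sketch, assuming the default Ollama REST API (GET /api/tags lists the locally installed models):

```shell
# Server URL from step 4; adjust if your Ollama server runs elsewhere.
OLLAMA_URL="http://localhost:11434"

# GET /api/tags returns a JSON list of installed models; a successful
# response confirms the server itself is reachable.
curl -sf "$OLLAMA_URL/api/tags" || echo "Ollama not reachable at $OLLAMA_URL"
```

If this check fails, the missing 📄 button may reflect an unreachable backend rather than a UI defect.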

Expected Behavior

The 📄 button should appear in the AI Chat interface, allowing users to select page sources as described in the documentation. Alternatively, manual referencing methods like "@", "/", or "[]" should retrieve document-specific context from selected pages.

Operating System

macOS Sequoia 15.3.2 (M4-based Mac mini with 16 GB RAM)

AppFlowy Version(s)

AppFlowy version 0.8.7

Screenshots

[Two screenshots attached to the original issue.]

Additional Context

  1. Ollama server is running properly and returning valid JSON responses.
  2. Models llama3.1 and nomic-embed-text are correctly configured in AppFlowy's AI Settings.
  3. The issue persists even after reinitializing the plugin, restarting both Ollama and AppFlowy, and attempting debug mode for af_ollama_plugin.
  4. The problem may be related to UI inconsistencies or incomplete integration between AppFlowy-LAI and Ollama.
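Point 2 above can be verified outside AppFlowy by querying each configured model directly; a hedged sketch using Ollama's standard endpoints (/api/generate for the chat model, /api/embeddings for the embedding model):

```shell
OLLAMA_URL="http://localhost:11434"  # server URL from AI Settings

# Request a one-off, non-streaming completion from the chat model.
curl -s "$OLLAMA_URL/api/generate" \
  -d '{"model": "llama3.1", "prompt": "Say hello.", "stream": false}' \
  || echo "llama3.1 did not respond"

# Request an embedding; an "embedding" field in the JSON reply
# confirms nomic-embed-text is loaded and working.
curl -s "$OLLAMA_URL/api/embeddings" \
  -d '{"model": "nomic-embed-text", "prompt": "test"}' \
  || echo "nomic-embed-text did not respond"
```

If both models answer here but AppFlowy still returns generic responses, that would support the integration-layer hypothesis in point 4.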
appflowy self-assigned this Mar 24, 2025