A small feature suggestion:
It'd be useful if the project supported an LLM service (via any provider, an API key, or a local model). That way, users could read a chapter and directly ask questions about it or summarize it. (I actually plan to fork the project and work on this now.)
Thanks again for open-sourcing it.
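For the sake of discussion, here's a rough sketch of what a provider-agnostic hook might look like. All names here (`LLMProvider`, `summarize_chapter`, `EchoProvider`) are illustrative, not part of the project:

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Abstract backend: a hosted API, a keyed provider, or a local model."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for the given prompt."""


class EchoProvider(LLMProvider):
    """Stand-in for testing; a real implementation would call an actual model."""

    def complete(self, prompt: str) -> str:
        return f"[stub completion for {len(prompt)}-char prompt]"


def summarize_chapter(provider: LLMProvider, chapter_text: str) -> str:
    """Build a summarization prompt from a chapter and send it to the provider."""
    prompt = "Summarize the following chapter:\n\n" + chapter_text
    return provider.complete(prompt)


if __name__ == "__main__":
    print(summarize_chapter(EchoProvider(), "Once upon a time..."))
```

The idea is that the reader UI only ever talks to the `LLMProvider` interface, so swapping in a different API or a local model is just a new subclass.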