Thank you very much for developing such a useful tool. For privacy reasons, I prefer using local, standalone apps. Also, would it be possible for this tool to support inference with a local Ollama model in the future?
mrcfps changed the title from "请求非 docker 的本地部署或者 app" (Request for non-Docker local deployment or an app) to "Self-deploy with non-docker mode 请求非 docker 的本地部署或者 app" on Jan 25, 2025
Thank you very much for sharing your feedback. We truly appreciate it! Here's an overview of our preliminary plan for incrementally enhancing self-hosting support:
1. **Local machine deployment via Docker containers**: This is already available. You can deploy Refly on your local machine using Docker containers, which gives you a straightforward way to get started with self-hosting.
2. **Fully customizable model providers (still container-based)**: We're working on giving you complete control over your inference model providers. Our goal is to include support for Ollama (see the sketch after this list for the kind of local call this would enable), and we expect this feature to be ready by mid-February 2025.
3. **Fully functional native application with one-click install**: We're also developing a local-first native application that installs with a single click and works without a remote server, offering all of Refly's features in a more integrated, privacy-focused way. We expect to release this by the end of March 2025.
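To give a concrete sense of what item 2 enables: Ollama exposes an OpenAI-compatible HTTP API on port 11434, so once providers are configurable, inference requests can be routed to a local model over plain HTTP. Below is a minimal TypeScript sketch of that kind of call, assuming a locally running Ollama instance and an already-pulled model (`llama3.2` is only a placeholder); it illustrates the interaction, not Refly's actual provider implementation.

```typescript
// Minimal sketch: querying a local Ollama instance through its
// OpenAI-compatible endpoint. The base URL, model name, and function
// names here are illustrative assumptions, not Refly configuration.
const OLLAMA_BASE_URL = "http://localhost:11434/v1"; // Ollama's default port

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chatWithLocalModel(messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${OLLAMA_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // any model already pulled via `ollama pull`
      messages,
      stream: false,
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status} ${res.statusText}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}

// Example usage (requires Node 18+ for the built-in fetch)
chatWithLocalModel([{ role: "user", content: "Say hello in one sentence." }])
  .then(console.log)
  .catch(console.error);
```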
We'll keep you updated as we progress towards these milestones. If you have any further questions or suggestions in the meantime, please don't hesitate to let us know.
mrcfps changed the title from "Self-deploy with non-docker mode 请求非 docker 的本地部署或者 app" to "Self-deploy with non-docker option" on Jan 25, 2025
I'm just not really used to using Docker.