OpenLaw is an innovative web application designed to offer free, clear, and accessible legal advice for residents across Switzerland.
Featuring a sophisticated chatbot, OpenLaw breaks down language barriers by leveraging a multilingual language model. This makes it an invaluable resource for understanding complex legal matters, tailored to the diverse linguistic landscape of Switzerland.
It also features widgets such as lawyer recommendations and a community page where users can get additional feedback and advice from other users. Our goal is to decentralize legal knowledge, because we understand the complexities of the Swiss judicial and legal system.
- Chatbot: A simple conversational RAG (retrieval-augmented generation) interface built on the Swiss Code of Obligations. Users can ask about specific provisions of the Code, and the chatbot summarizes the relevant passages in simple language (see the sketch after this list).
- Lawyer Recommendation: For further information and a quote for an initial assessment, the user may need to consult a lawyer via the "lawyer recommendation" widget. It brings the user to a page where lawyers can be filtered by the user's geographical location and budget.
- Community Platform: We are also implementing a community platform that connects the user to others facing a similar predicament, for community support.
- FAQs: An FAQ section that collects the most frequent user queries and their corresponding answers.
- Legal Protection Insurance Recommendation: Understanding that legal action can be expensive and beyond reach for many, we are implementing a widget that recommends legal protection insurance offered by selected insurance companies.
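To make the chatbot's retrieval step concrete, here is a minimal sketch of such a RAG round-trip in TypeScript, assuming the Ollama and Qdrant services set up below are running on their default ports. The collection name `obligations_code`, the payload shape, and the prompt wording are hypothetical illustrations, not the app's actual implementation; reusing mistral for embeddings simply keeps the sketch self-contained.

```typescript
// Minimal RAG round-trip against a local Ollama + Qdrant setup.
// Assumes a Qdrant collection (hypothetically named "obligations_code")
// already holds embedded articles with a { text } payload.

const OLLAMA = "http://localhost:11434";
const QDRANT = "http://localhost:6333";

async function embed(text: string): Promise<number[]> {
  // Ollama's embeddings endpoint; any served model can produce vectors.
  const res = await fetch(`${OLLAMA}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "mistral", prompt: text }),
  });
  const { embedding } = await res.json();
  return embedding;
}

async function ask(question: string): Promise<string> {
  // 1. Retrieve the articles closest to the question.
  const search = await fetch(
    `${QDRANT}/collections/obligations_code/points/search`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        vector: await embed(question),
        limit: 3,
        with_payload: true,
      }),
    },
  );
  const { result } = await search.json();
  const context = result
    .map((hit: { payload: { text: string } }) => hit.payload.text)
    .join("\n---\n");

  // 2. Ask mistral for a plain-language answer grounded in those articles.
  const gen = await fetch(`${OLLAMA}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral",
      stream: false,
      prompt:
        `Using only these excerpts of the Swiss Code of Obligations:\n` +
        `${context}\n\nAnswer in simple language: ${question}`,
    }),
  });
  const { response } = await gen.json();
  return response;
}

ask("Can my landlord raise the rent without notice?").then(console.log);
```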
There are two different ways to set up this project:

- Option 1: Install Ollama & Qdrant locally (the Ollama desktop app is currently only available for macOS and Linux). Ollama will take advantage of your GPU to run the model.
- Option 2: Use the Docker Compose file to run Ollama & Qdrant in containers. This is the easier setup, but Ollama will run on the CPU. In a terminal in the project directory, run `docker compose up -d` to pull & run the containers, then `docker compose exec ollama ollama run mistral` to download & install the mistral model.
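For Option 2, the compose file itself already ships with the repository; purely as an illustration of what `docker compose up -d` brings up, a minimal file wiring the two services together might look like this sketch (images and ports are the public defaults for Ollama and Qdrant; the volume names are hypothetical):

```yaml
# Illustration only — the project ships its own Docker Compose file.
services:
  ollama:
    image: ollama/ollama          # serves the LLM API on 11434
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama # persists downloaded models
  qdrant:
    image: qdrant/qdrant          # vector database on 6333
    ports:
      - "6333:6333"
    volumes:
      - qdrant_data:/qdrant/storage
volumes:
  ollama_data:
  qdrant_data:
```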
If you chose Option 1, continue with these steps:

- 🦙 Download Ollama and install it locally.
- Run `ollama run mistral` to download and install the model locally (4.1 GB download; requires 8 GB of RAM).
- Open http://localhost:11434 to check if Ollama is running.
- Run `docker pull qdrant/qdrant` and then `docker run -p 6333:6333 qdrant/qdrant` to start Qdrant.
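As with the Ollama check above, you can verify that Qdrant is reachable through its REST API; listing collections is a standard Qdrant endpoint (shown in TypeScript to match the earlier sketch), and an empty list still confirms the service is up:

```typescript
// GET /collections is Qdrant's standard endpoint for listing collections.
fetch("http://localhost:6333/collections")
  .then((res) => res.json())
  .then((body) => console.log(body.result.collections));
```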
Both Option 1 and Option 2 continue with the following setup:

- Copy the file `.env.local.example` in the project and rename it to `.env`. Verify that all environment variables are correct.
- Run `yarn install` to install the required dependencies.
- Run `yarn dev` to launch the development server.
- Go to http://localhost:3000 and try out the app.