Development #88

Merged
merged 21 commits on Dec 29, 2023
Commits
92b1dd8
The rules persist until a new one comes along or until all the steps …
fractalego Dec 23, 2023
907243b
updated tests config.json
fractalego Dec 23, 2023
6f28782
removed listener model name from tests
fractalego Dec 23, 2023
42a3cd2
removed local connectors tests
fractalego Dec 23, 2023
466409a
Added a line to the last rule action
fractalego Dec 23, 2023
d6953ac
Merge pull request #85 from fractalego/multi-dialogue-turn-rules
fractalego Dec 23, 2023
dd43f98
Merge pull request #86 from fractalego/multi-dialogue-turn-rules
fractalego Dec 23, 2023
99f61e7
rudimental actions implemented
fractalego Dec 23, 2023
26347f2
resetting conversation memory
fractalego Dec 25, 2023
8384b32
updated docs
fractalego Dec 27, 2023
ba65a1a
refactored a rule_creator
fractalego Dec 27, 2023
320ddbd
refactored away knowledge_name
fractalego Dec 27, 2023
c44d1e7
refactored rules retrieval and added second order rules
fractalego Dec 27, 2023
da441c3
changed retrieval thresholds
fractalego Dec 27, 2023
713aab5
updated testcases with LLM entailment
fractalego Dec 28, 2023
ee310ce
actions have expected behavior
fractalego Dec 28, 2023
b06cf2d
added colors to actions
fractalego Dec 28, 2023
d73729c
wrote docs for actions and testcases
fractalego Dec 29, 2023
9032a18
Merge pull request #87 from fractalego/actions-from-commandline
fractalego Dec 29, 2023
fb8cf0a
Merge branch 'main' of github.com:fractalego/wafl into development
fractalego Dec 29, 2023
a63abd0
resolved merge conflict
fractalego Dec 29, 2023
10 changes: 10 additions & 0 deletions README.md
@@ -47,6 +47,16 @@ Please see the examples in the following chapters.

## LLM side (needs a GPU)
The second part (LLM side) is a model server for the speech-to-text model, the LLM, the embedding system, and the text-to-speech model.

#### Installation
In order to quickly run the LLM side, you can use the following installation commands:
```bash
pip install wafl-llm
wafl-llm start
```
which will use the default models and start the server on port 8080.

#### Docker
A docker image can be used to run it as in the following:

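As a quick sanity check after bringing up the LLM side (whether via `pip install wafl-llm` or the Docker image mentioned above), the default port can be probed from the interface machine. This is an illustrative sketch rather than part of the PR; the endpoints exposed by wafl-llm are not documented here, so only reachability of port 8080 is tested.

```bash
# Sketch: check that something is listening on the default wafl-llm port (8080).
# No specific endpoint is assumed; this only prints the HTTP status code of the root path.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/
```
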
Binary file removed documentation/build/doctrees/environment.pickle
Binary file modified documentation/build/doctrees/index.doctree
Binary file modified documentation/build/doctrees/installation.doctree
2 changes: 2 additions & 0 deletions documentation/build/html/_sources/index.rst.txt
@@ -17,6 +17,8 @@ Welcome to WAFL's 0.0.80 documentation!
running_WAFL
facts_and_rules
examples
testcases
actions
license

Indices and tables
18 changes: 11 additions & 7 deletions documentation/build/html/_sources/installation.rst.txt
@@ -32,20 +32,24 @@ Please see the examples in the following chapters.

LLM side (needs a GPU)
----------------------
The second part (LLM side) is a model server for the speech-to-text model, the LLM, the embedding system, and the text-to-speech model.
In order to quickly run the LLM side, you can use the following installation commands:

The second part is a machine that runs on a machine accessible from the interface side.
The initial configuration is for a local deployment of language models.
No action is needed to run WAFL if you want to run it as a local instance.
.. code-block:: bash

$ pip install wafl-llm
$ wafl-llm start

which will use the default models and start the server on port 8080.

However, a multi-user setup will benefit for a dedicated server.
In this case, a docker image can be used
Alternatively, a Docker image can be used to run it as in the following:

.. code-block:: bash

$ docker run -p8080:8080 --env NVIDIA_DISABLE_REQUIRE=1 --gpus all fractalego/wafl-llm:latest
$ docker run -p8080:8080 --env NVIDIA_DISABLE_REQUIRE=1 --gpus all fractalego/wafl-llm:0.80


The interface side has a `config.json` file that needs to be filled with the IP address of the LLM side.
The default is localhost.
Alternatively, you can run the LLM side by cloning `this repository <https://github.com/fractalego/wafl-llm>`_.

Finally, you can run the LLM side by cloning [this repository](https://github.com/fractalego/wafl-llm).
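
The new installation text above states that the interface side's `config.json` must be filled with the IP address of the LLM side, and that the default is localhost. As a hedged illustration (the actual key layout of `config.json` is not shown in this diff), pointing the interface at a remote GPU server could be as simple as replacing the default host string:

```bash
# Illustrative only: point config.json at the machine running wafl-llm instead of localhost.
# The real key names inside WAFL's config.json are not part of this diff, so a plain text
# substitution is shown rather than editing a specific field; 192.168.1.50 is a placeholder IP.
sed -i 's/localhost/192.168.1.50/' config.json
```
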
2 changes: 2 additions & 0 deletions documentation/build/html/examples.html
@@ -55,6 +55,8 @@
<li class="toctree-l2"><a class="reference internal" href="rules_with_remember_command.html">Rule with remember command</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>

2 changes: 2 additions & 0 deletions documentation/build/html/genindex.html
@@ -46,6 +46,8 @@
<li class="toctree-l1"><a class="reference internal" href="running_WAFL.html">Running WAFL</a></li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>

8 changes: 8 additions & 0 deletions documentation/build/html/index.html
@@ -48,6 +48,8 @@
<li class="toctree-l1"><a class="reference internal" href="running_WAFL.html">Running WAFL</a></li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>

@@ -110,6 +112,12 @@ <h1>Welcome to WAFL’s 0.0.80 documentation!<a class="headerlink" href="#welcom
<li class="toctree-l2"><a class="reference internal" href="rules_with_remember_command.html">Rule with remember command</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a><ul>
<li class="toctree-l2"><a class="reference internal" href="testcases.html#running-the-testcases">Running the testcases</a></li>
<li class="toctree-l2"><a class="reference internal" href="testcases.html#negative-testcases">Negative testcases</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>
</div>
22 changes: 14 additions & 8 deletions documentation/build/html/installation.html
@@ -53,6 +53,8 @@
<li class="toctree-l1"><a class="reference internal" href="running_WAFL.html">Running WAFL</a></li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>

@@ -103,17 +105,21 @@ <h2>Interface side<a class="headerlink" href="#interface-side" title="Permalink
</section>
<section id="llm-side-needs-a-gpu">
<h2>LLM side (needs a GPU)<a class="headerlink" href="#llm-side-needs-a-gpu" title="Permalink to this heading"></a></h2>
<p>The second part is a machine that runs on a machine accessible from the interface side.
The initial configuration is for a local deployment of language models.
No action is needed to run WAFL if you want to run it as a local instance.</p>
<p>However, a multi-user setup will benefit for a dedicated server.
In this case, a docker image can be used</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$<span class="w"> </span>docker<span class="w"> </span>run<span class="w"> </span>-p8080:8080<span class="w"> </span>--env<span class="w"> </span><span class="nv">NVIDIA_DISABLE_REQUIRE</span><span class="o">=</span><span class="m">1</span><span class="w"> </span>--gpus<span class="w"> </span>all<span class="w"> </span>fractalego/wafl-llm:latest
<p>The second part (LLM side) is a model server for the speech-to-text model, the LLM, the embedding system, and the text-to-speech model.
In order to quickly run the LLM side, you can use the following installation commands:</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$<span class="w"> </span>pip<span class="w"> </span>install<span class="w"> </span>wafl-llm
$<span class="w"> </span>wafl-llm<span class="w"> </span>start

which<span class="w"> </span>will<span class="w"> </span>use<span class="w"> </span>the<span class="w"> </span>default<span class="w"> </span>models<span class="w"> </span>and<span class="w"> </span>start<span class="w"> </span>the<span class="w"> </span>server<span class="w"> </span>on<span class="w"> </span>port<span class="w"> </span><span class="m">8080</span>.
</pre></div>
</div>
<p>Alternatively, a Docker image can be used to run it as in the following:</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>$<span class="w"> </span>docker<span class="w"> </span>run<span class="w"> </span>-p8080:8080<span class="w"> </span>--env<span class="w"> </span><span class="nv">NVIDIA_DISABLE_REQUIRE</span><span class="o">=</span><span class="m">1</span><span class="w"> </span>--gpus<span class="w"> </span>all<span class="w"> </span>fractalego/wafl-llm:0.80
</pre></div>
</div>
<p>The interface side has a <cite>config.json</cite> file that needs to be filled with the IP address of the LLM side.
The default is localhost.
Alternatively, you can run the LLM side by cloning <a class="reference external" href="https://github.com/fractalego/wafl-llm">this repository</a>.</p>
The default is localhost.</p>
<p>Finally, you can run the LLM side by cloning [this repository](<a class="reference external" href="https://github.com/fractalego/wafl-llm">https://github.com/fractalego/wafl-llm</a>).</p>
</section>
</section>

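Because the Docker route shown in this rendered page passes `--gpus all`, the host needs working NVIDIA container support before `fractalego/wafl-llm:0.80` can see the GPU. A generic pre-flight check, not part of this PR (the CUDA image tag below is only an example), is:

```bash
# Confirm the Docker daemon can expose a GPU to containers before starting wafl-llm.
# Any CUDA base image that ships nvidia-smi will do; the tag here is an assumption, not from the PR.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```
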
2 changes: 2 additions & 0 deletions documentation/build/html/introduction.html
@@ -49,6 +49,8 @@
<li class="toctree-l1"><a class="reference internal" href="running_WAFL.html">Running WAFL</a></li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>

6 changes: 4 additions & 2 deletions documentation/build/html/license.html
Original file line number Diff line number Diff line change
Expand Up @@ -17,7 +17,7 @@
<script src="_static/js/theme.js"></script>
<link rel="index" title="Index" href="genindex.html" />
<link rel="search" title="Search" href="search.html" />
<link rel="prev" title="Rule with remember command" href="rules_with_remember_command.html" />
<link rel="prev" title="Running Actions" href="actions.html" />
</head>

<body class="wy-body-for-nav">
@@ -48,6 +48,8 @@
<li class="toctree-l1"><a class="reference internal" href="running_WAFL.html">Running WAFL</a></li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1 current"><a class="current reference internal" href="#">License</a></li>
</ul>

@@ -88,7 +90,7 @@ <h1>License<a class="headerlink" href="#license" title="Permalink to this headin
</div>
</div>
<footer><div class="rst-footer-buttons" role="navigation" aria-label="Footer">
<a href="rules_with_remember_command.html" class="btn btn-neutral float-left" title="Rule with remember command" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left" aria-hidden="true"></span> Previous</a>
<a href="actions.html" class="btn btn-neutral float-left" title="Running Actions" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left" aria-hidden="true"></span> Previous</a>
</div>

<hr/>
Binary file modified documentation/build/html/objects.inv
2 changes: 2 additions & 0 deletions documentation/build/html/running_WAFL.html
@@ -55,6 +55,8 @@
</li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>

2 changes: 2 additions & 0 deletions documentation/build/html/search.html
@@ -49,6 +49,8 @@
<li class="toctree-l1"><a class="reference internal" href="running_WAFL.html">Running WAFL</a></li>
<li class="toctree-l1"><a class="reference internal" href="facts_and_rules.html">The rules.yaml file</a></li>
<li class="toctree-l1"><a class="reference internal" href="examples.html">Examples</a></li>
<li class="toctree-l1"><a class="reference internal" href="testcases.html">Creating a testcase</a></li>
<li class="toctree-l1"><a class="reference internal" href="actions.html">Running Actions</a></li>
<li class="toctree-l1"><a class="reference internal" href="license.html">License</a></li>
</ul>
