Update backend.py, index.html, requirements.txt (#1180)
* Update backend.py

Use the model the user selects in the web interface's model dropdown instead of a hard-coded one.

* Update index.html

Added Llama2 as a provider option, along with its model choices: llama2-70b, llama2-13b, llama2-7b.

* Update requirements.txt

Add asgiref to enable async views in the Flask API; without it, Flask raises:
"RuntimeError: Install Flask with the 'async' extra in order to use async views"
hdsz25 authored Oct 28, 2023
1 parent 1dc8e6d commit b8a3db5
Showing 3 changed files with 8 additions and 7 deletions.
10 changes: 5 additions & 5 deletions g4f/gui/client/html/index.html
@@ -130,9 +130,9 @@
 <option value="google-bard">google-bard</option>
 <option value="google-palm">google-palm</option>
 <option value="bard">bard</option>
-<option value="falcon-40b">falcon-40b</option>
-<option value="falcon-7b">falcon-7b</option>
-<option value="llama-13b">llama-13b</option>
+<option value="llama2-7b">llama2-7b</option>
+<option value="llama2-13b">llama2-13b</option>
+<option value="llama2-70b">llama2-70b</option>
 <option value="command-nightly">command-nightly</option>
 <option value="gpt-neox-20b">gpt-neox-20b</option>
 <option value="santacoder">santacoder</option>
@@ -188,7 +188,7 @@
 <option value="g4f.Provider.Aibn">Aibn</option>
 <option value="g4f.Provider.Bing">Bing</option>
 <option value="g4f.Provider.You">You</option>
-<option value="g4f.Provider.H2o">H2o</option>
+<option value="g4f.Provider.Llama2">Llama2</option>
 <option value="g4f.Provider.Aivvm">Aivvm</option>
 </select>
 </div>
@@ -203,4 +203,4 @@
 </script>
 </body>
 
-</html>
+</html>
2 changes: 1 addition & 1 deletion g4f/gui/server/backend.py
@@ -56,7 +56,7 @@ def _conversation(self):
 
     def stream():
         yield from g4f.ChatCompletion.create(
-            model=g4f.models.gpt_35_long,
+            model=model,
             provider=get_provider(provider),
            messages=messages,
            stream=True,
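The one-line backend change means the streaming generator now forwards whatever model the web UI sent, rather than always using gpt_35_long. A minimal stand-alone sketch of that pattern, where `fake_create` is a hypothetical stand-in for `g4f.ChatCompletion.create` (not the real API):

```python
def fake_create(model, messages, stream=True):
    # Hypothetical stand-in for g4f.ChatCompletion.create:
    # yields response chunks for the requested model.
    for chunk in ("echo:", model):
        yield chunk

def stream(model, messages):
    # Same shape as backend.py's stream(): delegate to the generator,
    # passing the user-selected model through instead of a constant.
    yield from fake_create(model=model, messages=messages, stream=True)

print("".join(stream("llama2-70b", [{"role": "user", "content": "hi"}])))
# → echo:llama2-70b
```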
3 changes: 2 additions & 1 deletion requirements.txt
@@ -18,4 +18,5 @@ loguru
 tiktoken
 pillow
 platformdirs
-numpy
+numpy
+asgiref
