
Commit

Merge branch 'main' into Jack-devnlp/issue403
Jack-devnlp authored Feb 23, 2024
2 parents fe12aa0 + 2f8eb31 commit 519c5bb
Showing 4 changed files with 2 additions and 1 deletion.
Binary file removed: .DS_Store (contents not shown)
1 change: 1 addition & 0 deletions .gitignore
@@ -167,3 +167,4 @@ cython_debug/
 # and can be added to the global gitignore or merged into this file. For a more nuclear
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 #.idea/
+.DS_Store
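Taken together, the removed binary and the new `.gitignore` entry follow the usual workflow for untracking macOS Finder metadata. A minimal sketch of that workflow (demonstrated in a throwaway repo; the identity settings and commit messages are placeholders, not part of this commit):

```shell
set -e
# Set up a disposable repo that has accidentally committed .DS_Store.
repo=$(mktemp -d) && cd "$repo" && git init -q
git config user.email demo@example.com && git config user.name demo
: > .DS_Store && git add .DS_Store && git commit -qm "accidentally track .DS_Store"

# The cleanup this commit performs:
git rm --cached -q .DS_Store        # stop tracking it; keep the local file
echo ".DS_Store" >> .gitignore      # ignore it from now on
git add .gitignore
git commit -qm "Remove .DS_Store and ignore it"
git check-ignore .DS_Store          # prints ".DS_Store": now ignored
```

`git rm --cached` only removes the file from the index, so the local copy survives while future commits no longer pick it up.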
1 change: 1 addition & 0 deletions README.md
@@ -939,6 +939,7 @@ With all these resources at your fingertips, you're ready to start your exciting
##### English tutorials
| Type | Deliverable | Date | Author |
|-------------|--------------------------------------------------------|----------------|----------------|
| Video | [Run dolphin-2.2-yi-34b on IoT Devices](https://www.youtube.com/watch?v=NJ89T5mO25Y) | 2023-11-30 | [Second State](https://github.com/second-state) |
| Blog | [Running Yi-34B-Chat locally using LlamaEdge](https://www.secondstate.io/articles/yi-34b/) | 2023-11-30 | [Second State](https://github.com/second-state) |
| Video | [Install Yi 34B Locally - Chinese English Bilingual LLM](https://www.youtube.com/watch?v=CVQvj4Wrh4w&t=476s) | 2023-11-05 | [Fahd Mirza](https://www.youtube.com/watch?v=CVQvj4Wrh4w&t=476s) |

1 change: 0 additions & 1 deletion VL/openai_api.py
@@ -340,7 +340,6 @@ def generate_stream(

     model_name = params.get("model", "llm")
     temperature = float(params.get("temperature", 1.0))
-    repetition_penalty = float(params.get("repetition_penalty", 1.0))
     top_p = float(params.get("top_p", 1.0))
     top_k = int(params.get("top_k", 40))
     max_new_tokens = int(params.get("max_tokens", 1024))
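The one-line deletion drops `repetition_penalty` from the sampling parameters that `generate_stream` parses out of the request. A minimal sketch of that parsing pattern, using only the names and defaults visible in the hunk (the helper function itself is hypothetical, not part of `VL/openai_api.py`):

```python
def parse_sampling_params(params: dict) -> dict:
    """Coerce raw request params into typed sampling settings.

    Mirrors the pattern in the hunk above: each value falls back to
    its default when the key is absent, and is cast to the expected type.
    After this commit, repetition_penalty is no longer read here.
    """
    return {
        "model_name": params.get("model", "llm"),
        "temperature": float(params.get("temperature", 1.0)),
        "top_p": float(params.get("top_p", 1.0)),
        "top_k": int(params.get("top_k", 40)),
        "max_new_tokens": int(params.get("max_tokens", 1024)),
    }

# Unspecified keys take their defaults; string values are coerced.
settings = parse_sampling_params({"temperature": "0.7"})
# settings["temperature"] == 0.7, settings["top_k"] == 40
```

Because `params.get` supplies a default before the cast, a request that omits every sampling key still yields a fully populated settings dict.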
