Update model names (#187)
* Rename models

* Remove test

* Optimize agent_ci.yaml
Bobholamovic committed Dec 21, 2023
1 parent 81fbbad commit eb4069d
Showing 45 changed files with 211 additions and 166 deletions.
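For reference, the rename applied throughout these files maps the old model identifiers to the new ones roughly as follows (a summary sketch inferred from the diff, not an alias table shipped by the SDK):

```python
# Old -> new model identifiers, as inferred from this commit's diff.
MODEL_RENAMES = {
    "ernie-bot": "ernie-3.5",
    "ernie-bot-turbo": "ernie-turbo",
    "ernie-bot-4": "ernie-4.0",
    "ernie-bot-8k": "ernie-longtext",
}
```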
6 changes: 4 additions & 2 deletions .github/workflows/agent_ci.yaml
@@ -73,5 +73,7 @@ jobs:
working-directory: erniebot-agent
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v3
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
token: ${{ secrets.CODECOV_TOKEN }}
directory: erniebot-agent
name: erniebot-agent
24 changes: 12 additions & 12 deletions README.md
@@ -60,10 +60,10 @@ import erniebot
models = erniebot.Model.list()

print(models)
# ernie-bot 文心一言模型(ernie-bot)
# ernie-bot-turbo 文心一言模型(ernie-bot-turbo)
# ernie-bot-4 文心一言模型(ernie-bot-4)
# ernie-bot-8k 文心一言模型(ernie-bot-8k)
# ernie-3.5 文心大模型(ernie-3.5)
# ernie-turbo 文心大模型(ernie-turbo)
# ernie-4.0 文心大模型(ernie-4.0)
# ernie-longtext 文心大模型(ernie-longtext)
# ernie-text-embedding 文心百中语义模型
# ernie-vilg-v2 文心一格模型

@@ -72,7 +72,7 @@ erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

# Create a chat completion
response = erniebot.ChatCompletion.create(model="ernie-bot", messages=[{"role": "user", "content": "你好,请介绍下你自己"}])
response = erniebot.ChatCompletion.create(model="ernie-3.5", messages=[{"role": "user", "content": "你好,请介绍下你自己"}])

print(response.get_result())
```
@@ -87,8 +87,8 @@ erniebot api model.list
export EB_API_TYPE="aistudio"
export EB_ACCESS_TOKEN="<access-token-for-aistudio>"

# Create a chat completion (using ernie-bot, ernie-bot-turbo, etc.)
erniebot api chat_completion.create --model ernie-bot --message user "请介绍下你自己"
# Create a chat completion (using ernie-3.5, ernie-turbo, etc.)
erniebot api chat_completion.create --model ernie-3.5 --message user "请介绍下你自己"

# Set authentication params for image.create
export EB_API_TYPE="yinian"
@@ -102,11 +102,11 @@ erniebot api image.create --model ernie-vilg-v2 --prompt "画一只驴肉火烧"

### Chat Completion

ERNIE Bot SDK provides the ERNIE Bot family of chat-completion models: ernie-bot, ernie-bot-turbo, ernie-bot-4, and ernie-bot-8k.
ERNIE Bot SDK provides ERNIE large models with chat-completion capability: ernie-3.5, ernie-turbo, ernie-4.0, and ernie-longtext.

The models differ in output quality, speed, and other aspects; choose the one that best fits your scenario.

Below is an example of multi-turn chat with the ernie-bot model:
Below is an example of multi-turn chat with the ernie-3.5 model:

```python
import erniebot
@@ -115,7 +115,7 @@ erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

response = erniebot.ChatCompletion.create(
model="ernie-bot",
model="ernie-3.5",
messages=[{
"role": "user",
"content": "请问你是谁?"
@@ -194,7 +194,7 @@ ERNIE Bot SDK提供函数调用功能,即由大模型根据对话上下文确

Through function calling, users can obtain structured data from the large model and then programmatically combine the model with existing internal and external APIs to build applications.

Below is an example of function calling with the ernie-bot model:
Below is an example of function calling with the ernie-3.5 model:

```python
import erniebot
@@ -203,7 +203,7 @@ erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

response = erniebot.ChatCompletion.create(
model="ernie-bot",
model="ernie-3.5",
messages=[{
"role": "user",
"content": "深圳市今天气温多少摄氏度?",
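The function-calling example above is truncated in this view. A minimal end-to-end sketch with the renamed model might look like the following; the function schema and the response handling are illustrative assumptions, not lines taken from the diff:

```python
import erniebot

erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

# Hypothetical function schema; the original example's schema is not shown here.
functions = [
    {
        "name": "get_current_temperature",
        "description": "Get the current temperature for a given city",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"},
            },
            "required": ["location"],
        },
    }
]

response = erniebot.ChatCompletion.create(
    model="ernie-3.5",
    messages=[{"role": "user", "content": "深圳市今天气温多少摄氏度?"}],
    functions=functions,
)

# For a function-call response, get_result() is expected to return the chosen
# function name and its JSON-encoded arguments (assumed SDK behavior).
print(response.get_result())
```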
2 changes: 1 addition & 1 deletion erniebot-agent/examples/baizhong_search_example.py
@@ -117,7 +117,7 @@ def offline_ann(data_path, aurora_db):
print(aurora_search.function_call_schema())
# Tool Test
result = asyncio.run(aurora_search(query=query))
llm = ERNIEBot(model="ernie-bot", api_type="custom")
llm = ERNIEBot(model="ernie-3.5", api_type="custom")
memory = WholeMemory()
# Agent test
agent = FunctionalAgent(llm=llm, tools=[aurora_search], memory=memory)
4 changes: 2 additions & 2 deletions erniebot-agent/examples/cookbook/construction_assistant.ipynb
@@ -351,8 +351,8 @@
"metadata": {},
"outputs": [],
"source": [
"# 创建一个ERNIEBot实例,使用\"ernie-bot-8k\"模型。\n",
"llm = ERNIEBot(model=\"ernie-bot-8k\")\n",
"# 创建一个ERNIEBot实例,使用\"ernie-longtext\"模型。\n",
"llm = ERNIEBot(model=\"ernie-longtext\")\n",
"# 创建一个WholeMemory实例。这可能是一个用于存储对话历史和上下文信息的类,有助于模型理解和持续对话。\n",
"memory = WholeMemory()\n",
"# 创建一个FunctionalAgent实例。这个代理将使用上面创建的ERNIEBot模型和WholeMemory,同时传入了一个名为tool的工具。\n",
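As a rough follow-up to the notebook cell above, the assembled agent might be invoked as sketched below; the `tool` variable comes from earlier notebook cells not shown in this diff, and the `async_run` entry point and the query are assumptions based on other erniebot-agent examples rather than on this hunk:

```python
import asyncio

# Hypothetical next cell: ask the construction assistant a question.
result = asyncio.run(agent.async_run("Summarize the curing requirements for cast-in-place concrete."))
print(result)
```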
2 changes: 1 addition & 1 deletion erniebot-agent/examples/cv_agent/CV_agent.py
@@ -14,7 +14,7 @@ def __init__(self):
self.tools = self.toolkit.get_tools()


llm = ERNIEBot(model="ernie-bot", api_type="aistudio", access_token="<your-access-token>")
llm = ERNIEBot(model="ernie-3.5", api_type="aistudio", access_token="<your-access-token>")
toolkit = CVToolkit()
memory = WholeMemory()
file_manager = get_file_manager()
@@ -71,7 +71,7 @@ def offline_ann(data_path, baizhong_db):
res = offline_ann(args.data_path, baizhong_db)
print(res)

llm = ERNIEBot(model="ernie-bot", api_type="custom")
llm = ERNIEBot(model="ernie-3.5", api_type="custom")

retrieval_tool = BaizhongSearchTool(
description="Use Baizhong Search to retrieve documents.", db=baizhong_db, threshold=0.1
2 changes: 1 addition & 1 deletion erniebot-agent/examples/plugins/multiple_plugins.py
@@ -107,7 +107,7 @@ def examples(self) -> List[Message]:


# TODO(shiyutang): replace this when model is online
llm = ERNIEBot(model="ernie-bot", api_type="custom")
llm = ERNIEBot(model="ernie-3.5", api_type="custom")
memory = SlidingWindowMemory(max_round=1)
file_manager = get_file_manager(access_token="") # Access_token needs to be set here.
# plugins = ["ChatFile", "eChart"]
2 changes: 1 addition & 1 deletion erniebot-agent/examples/rpg_game_agent.py
@@ -64,7 +64,7 @@ def parse_args():
parser = argparse.ArgumentParser(prog="erniebot-RPG")
parser.add_argument("--access-token", type=str, default=None, help="Access token to use.")
parser.add_argument("--game", type=str, default="射雕英雄传", help="Story name")
parser.add_argument("--model", type=str, default="ernie-bot-4", help="Model name")
parser.add_argument("--model", type=str, default="ernie-4.0", help="Model name")
return parser.parse_args()


4 changes: 2 additions & 2 deletions erniebot-agent/src/erniebot_agent/chat_models/erniebot.py
@@ -52,8 +52,8 @@ def __init__(
"""Initializes an instance of the `ERNIEBot` class.
Args:
model (str): The model name. It should be "ernie-bot", "ernie-bot-turbo", "ernie-bot-8k", or
"ernie-bot-4".
model (str): The model name. It should be "ernie-3.5", "ernie-turbo", "ernie-4.0", or
"ernie-longtext".
api_type (Optional[str]): The API type for erniebot. It should be "aistudio" or "qianfan".
access_token (Optional[str]): The access token for erniebot.
close_multi_step_tool_call (bool): Whether to close the multi-step tool call. Defaults to False.
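A minimal instantiation matching the updated docstring might look like this; the import path is an assumption based on the package layout above, and the access token is a placeholder:

```python
# Sketch only: construct the chat model with one of the renamed identifiers.
from erniebot_agent.chat_models import ERNIEBot

llm = ERNIEBot(
    model="ernie-3.5",  # or "ernie-turbo", "ernie-4.0", "ernie-longtext"
    api_type="aistudio",
    access_token="<your-access-token>",
)
```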
@@ -33,7 +33,7 @@ class ErnieBotChat(BaseChatModel):
Example:
.. code-block:: python
from erniebot_agent.extensions.langchain.chat_models import ErnieBotChat
erniebot_chat = ErnieBotChat(model="ernie-bot")
erniebot_chat = ErnieBotChat(model="ernie-3.5")
"""

client: Any = None
@@ -44,7 +44,7 @@ class ErnieBotChat(BaseChatModel):
streaming: bool = False
"""Whether to stream the results or not."""

model: str = "ernie-bot"
model: str = "ernie-3.5"
"""Model to use."""
temperature: Optional[float] = 0.95
"""Sampling temperature to use."""
@@ -23,7 +23,7 @@ class ErnieBot(LLM):
.. code-block:: python
from erniebot_agent.extensions.langchain.llms import ErnieBot
erniebot = ErnieBot(model="ernie-bot")
erniebot = ErnieBot(model="ernie-3.5")
"""

client: Any = None
@@ -34,7 +34,7 @@ class ErnieBot(LLM):
streaming: bool = False
"""Whether to stream the results or not."""

model: str = "ernie-bot"
model: str = "ernie-3.5"
"""Model to use."""
temperature: Optional[float] = 0.95
"""Sampling temperature to use."""
2 changes: 1 addition & 1 deletion erniebot-agent/tests/chat_models/test_chat_model.py
@@ -5,7 +5,7 @@
from erniebot_agent.message import HumanMessage


async def test_ernie_bot(model="ernie-bot-turbo", stream=False):
async def test_ernie_bot(model="ernie-turbo", stream=False):
api_type = "aistudio"
access_token = os.getenv("ACCESS_TOKEN") # set your access token as an environment variable
assert (
@@ -14,7 +14,7 @@

@pytest.fixture(scope="module")
def llm():
return ERNIEBot(model="ernie-bot")
return ERNIEBot(model="ernie-3.5")


@pytest.fixture(scope="module")
4 changes: 2 additions & 2 deletions erniebot-agent/tests/integration_tests/apihub/base.py
@@ -41,9 +41,9 @@ def download_fixture_file(self, file_name: str):

def get_agent(self, toolkit: RemoteToolkit):
if "EB_BASE_URL" in os.environ:
llm = ERNIEBot(model="ernie-bot", api_type="custom")
llm = ERNIEBot(model="ernie-3.5", api_type="custom")
else:
llm = ERNIEBot(model="ernie-bot")
llm = ERNIEBot(model="ernie-3.5")

return FunctionalAgent(
llm=llm,
@@ -11,7 +11,7 @@ class TestChatModel(unittest.IsolatedAsyncioTestCase):
@pytest.mark.asyncio
async def test_chat(self):
eb = ERNIEBot(
model="ernie-bot-turbo", api_type="aistudio", access_token=os.environ["AISTUDIO_ACCESS_TOKEN"]
model="ernie-turbo", api_type="aistudio", access_token=os.environ["AISTUDIO_ACCESS_TOKEN"]
)
messages = [
HumanMessage(content="你好!"),
@@ -64,9 +64,9 @@ async def test_function_call(self):
},
}
]
# use ernie-bot here since ernie-bot-turbo doesn't support function call
# use ernie-3.5 here since ernie-turbo doesn't support function call
eb = ERNIEBot(
model="ernie-bot", api_type="aistudio", access_token=os.environ["AISTUDIO_ACCESS_TOKEN"]
model="ernie-3.5", api_type="aistudio", access_token=os.environ["AISTUDIO_ACCESS_TOKEN"]
)
messages = [
HumanMessage(content="深圳市今天的气温是多少摄氏度?"),
@@ -71,7 +71,7 @@ async def test_erniebot_astream() -> None:

def test_erniebot_params() -> None:
"""Test setting parameters."""
chat = ErnieBotChat(model="ernie-bot-turbo", temperature=0.7)
chat = ErnieBotChat(model="ernie-turbo", temperature=0.7)
message = HumanMessage(content="Hello")
response = chat([message])
assert isinstance(response, BaseMessage)
@@ -19,7 +19,7 @@ def setUp(self) -> None:

def run_query(self, query):
response = erniebot.ChatCompletion.create(
model="ernie-bot",
model="ernie-3.5",
messages=[
{
"role": "user",
@@ -57,7 +57,7 @@ async def test_agent(self):
if yinian_ak is None or yinian_sk is None:
return

eb = ERNIEBot(model="ernie-bot", api_type="aistudio", access_token=aistudio_access_token)
eb = ERNIEBot(model="ernie-3.5", api_type="aistudio", access_token=aistudio_access_token)
memory = WholeMemory()
img_gen_tool = ImageGenerationTool(yinian_ak=yinian_ak, yinian_sk=yinian_sk)
agent = FunctionalAgent(llm=eb, tools=[img_gen_tool], memory=memory)
24 changes: 12 additions & 12 deletions erniebot/README.md
@@ -60,10 +60,10 @@ import erniebot
models = erniebot.Model.list()

print(models)
# ernie-bot 文心一言模型(ernie-bot)
# ernie-bot-turbo 文心一言模型(ernie-bot-turbo)
# ernie-bot-4 文心一言模型(ernie-bot-4)
# ernie-bot-8k 文心一言模型(ernie-bot-8k)
# ernie-3.5 文心大模型(ernie-3.5)
# ernie-turbo 文心大模型(ernie-turbo)
# ernie-4.0 文心大模型(ernie-4.0)
# ernie-longtext 文心大模型(ernie-longtext)
# ernie-text-embedding 文心百中语义模型
# ernie-vilg-v2 文心一格模型

@@ -72,7 +72,7 @@ erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

# Create a chat completion
response = erniebot.ChatCompletion.create(model="ernie-bot", messages=[{"role": "user", "content": "你好,请介绍下你自己"}])
response = erniebot.ChatCompletion.create(model="ernie-3.5", messages=[{"role": "user", "content": "你好,请介绍下你自己"}])

print(response.get_result())
```
@@ -87,8 +87,8 @@ erniebot api model.list
export EB_API_TYPE="aistudio"
export EB_ACCESS_TOKEN="<access-token-for-aistudio>"

# Create a chat completion (using ernie-bot, ernie-bot-turbo, etc.)
erniebot api chat_completion.create --model ernie-bot --message user "请介绍下你自己"
# Create a chat completion (using ernie-3.5, ernie-turbo, etc.)
erniebot api chat_completion.create --model ernie-3.5 --message user "请介绍下你自己"

# Set authentication params for image.create
export EB_API_TYPE="yinian"
@@ -102,11 +102,11 @@ erniebot api image.create --model ernie-vilg-v2 --prompt "画一只驴肉火烧"

### Chat Completion

ERNIE Bot SDK provides the ERNIE Bot family of chat-completion models: ernie-bot, ernie-bot-turbo, ernie-bot-4, and ernie-bot-8k.
ERNIE Bot SDK provides ERNIE large models with chat-completion capability: ernie-3.5, ernie-turbo, ernie-4.0, and ernie-longtext.

The models differ in output quality, speed, and other aspects; choose the one that best fits your scenario.

Below is an example of multi-turn chat with the ernie-bot model:
Below is an example of multi-turn chat with the ernie-3.5 model:

```python
import erniebot
@@ -115,7 +115,7 @@ erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

response = erniebot.ChatCompletion.create(
model="ernie-bot",
model="ernie-3.5",
messages=[{
"role": "user",
"content": "请问你是谁?"
@@ -194,7 +194,7 @@ ERNIE Bot SDK提供函数调用功能,即由大模型根据对话上下文确

Through function calling, users can obtain structured data from the large model and then programmatically combine the model with existing internal and external APIs to build applications.

Below is an example of function calling with the ernie-bot model:
Below is an example of function calling with the ernie-3.5 model:

```python
import erniebot
@@ -203,7 +203,7 @@ erniebot.api_type = "aistudio"
erniebot.access_token = "<access-token-for-aistudio>"

response = erniebot.ChatCompletion.create(
model="ernie-bot",
model="ernie-3.5",
messages=[{
"role": "user",
"content": "深圳市今天气温多少摄氏度?",