This repository has been archived by the owner on Oct 23, 2023. It is now read-only.

Merge pull request #30 from tectalichq/feature/1.5.0
v1.5.0 Release: Adds Function Calling support
thejamescollins committed Jun 19, 2023
2 parents d109f4b + 14c9363 commit 434f48f
Showing 25 changed files with 651 additions and 231 deletions.
17 changes: 17 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,23 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## 1.5.0 - 2023-06-19

### Added
- Add support for **Function Calling** in the chat completions handler. See the [function calling guide](https://platform.openai.com/docs/guides/gpt/function-calling).

### Changed
- Improve examples in the README.
- The `\Tectalic\OpenAi\Models\Completions\CreateRequest::$prompt` property is now required.
- The `\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem::$text` property is now required.
- The `\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem::$index` property is now required.
- The `\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem::$logprobs` property is now required.
- The `\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem::$finish_reason` property is now required.
- The `\Tectalic\OpenAi\Models\ChatCompletions\CreateRequestMessagesItem::$content` property is no longer marked as required, as it is not required for `function` chats but is for all other types.
- Document the valid values for the `\Tectalic\OpenAi\Models\Completions\CreateResponseChoicesItem::$finish_reason` property.
- Document the valid values for the `\Tectalic\OpenAi\Models\ChatCompletions\CreateResponseChoicesItem::$finish_reason` property.
- API version updated from 1.2.0 to 1.3.1.

## 1.4.0 - 2023-03-02

### Added
68 changes: 66 additions & 2 deletions README.md
@@ -59,6 +59,70 @@ Note: GPT-4 is currently in a limited beta and is only accessible to those who h

If you receive a 404 error when attempting to use GPT-4, then your OpenAI account has not been granted access.

### Chat Completion Function Calling using ChatGPT (GPT-3.5 & GPT-4)

The following example uses the `gpt-3.5-turbo-0613` model to demonstrate function calling.

It converts natural language into a function call, which can then be executed within your application.

```php
$openaiClient = \Tectalic\OpenAi\Manager::build(
new \GuzzleHttp\Client(),
new \Tectalic\OpenAi\Authentication(getenv('OPENAI_API_KEY'))
);

/** @var \Tectalic\OpenAi\Models\ChatCompletions\CreateResponse $response */
$response = $openaiClient->chatCompletions()->create(new \Tectalic\OpenAi\Models\ChatCompletions\CreateRequest([
'model' => 'gpt-3.5-turbo-0613',
'messages' => [
['role' => 'user', 'content' => 'What\'s the weather like in Boston?']
],
'functions' => [
[
'name' => 'get_current_weather',
'description' => 'Get the current weather in a given location',
'parameters' => new \Tectalic\OpenAi\Models\ChatCompletions\CreateRequestFunctionsItemParameters(
[
'type' => 'object',
'properties' => [
'location' => [
'type' => 'string',
'description' => 'The worldwide city and state, e.g. San Francisco, CA',
],
'format' => [
'type' => 'string',
'description' => 'The temperature unit to use. Infer this from the user\'s location.',
'enum' => ['celsius', 'fahrenheit'],
],
'num_days' => [
'type' => 'integer',
'description' => 'The number of days to forecast',
],
],
'required' => ['location', 'format', 'num_days'],
]
)
]
],
'function_call' => 'auto',
]))->toModel();

$params = json_decode($response->choices[0]->message->function_call->arguments, true);
var_dump($params);

// array(3) {
// 'location' =>
// string(6) "Boston"
// 'format' =>
// string(7) "celsius"
// 'num_days' =>
// int(1)
//}

```
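
Depending on the prompt, the model may answer directly instead of requesting a function call, so it is worth checking the choice's `finish_reason` (for which `function_call` is one of the documented values) before decoding arguments. A minimal sketch, using the same response object as above:

```php
$choice = $response->choices[0];

if ($choice->finish_reason === 'function_call') {
    // The model asked for a function call: decode its JSON-encoded arguments.
    $params = json_decode($choice->message->function_call->arguments, true);
    // ...run the matching function in your application...
} else {
    // The model answered directly in natural language.
    echo $choice->message->content;
}
```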

[Learn more about function calling](https://platform.openai.com/docs/guides/gpt/function-calling).
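
To complete the round trip, your application runs the requested function itself and sends the result back as a `function` role message (where `name` is required) so the model can produce a natural-language answer. The sketch below assumes a made-up weather result in `$weather`; depending on your library version you may prefer to build each message with the `CreateRequestMessagesItem` model instead of plain arrays.

```php
// Hypothetical result of running get_current_weather() with the decoded $params.
$weather = ['location' => 'Boston, MA', 'temperature' => 22, 'unit' => 'celsius', 'forecast' => 'sunny'];

/** @var \Tectalic\OpenAi\Models\ChatCompletions\CreateResponse $followUp */
$followUp = $openaiClient->chatCompletions()->create(
    new \Tectalic\OpenAi\Models\ChatCompletions\CreateRequest([
        'model' => 'gpt-3.5-turbo-0613',
        'messages' => [
            ['role' => 'user', 'content' => 'What\'s the weather like in Boston?'],
            [
                // The assistant message that requested the function call (content may be null here).
                'role' => 'assistant',
                'content' => null,
                'function_call' => [
                    'name' => 'get_current_weather',
                    'arguments' => $response->choices[0]->message->function_call->arguments,
                ],
            ],
            [
                // The function result; 'name' is required when 'role' is 'function'.
                'role' => 'function',
                'name' => 'get_current_weather',
                'content' => json_encode($weather),
            ],
        ],
    ])
)->toModel();

echo $followUp->choices[0]->message->content;
// e.g. "The weather in Boston is currently sunny with a temperature of 22°C."
```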

### Text Completion (GPT-3)

```php
@@ -242,8 +306,8 @@ See the table below for a full list of API Handlers and Methods.
| --------------------------------- | ----------- | ---------------- |
|`AudioTranscriptions::create()`|Transcribes audio into the input language.|`POST` `/audio/transcriptions`|
|`AudioTranslations::create()`|Translates audio into English.|`POST` `/audio/translations`|
- |`ChatCompletions::create()`|Creates a completion for the chat message|`POST` `/chat/completions`|
- |`Completions::create()`|Creates a completion for the provided prompt and parameters|`POST` `/completions`|
+ |`ChatCompletions::create()`|Creates a model response for the given chat conversation.|`POST` `/chat/completions`|
+ |`Completions::create()`|Creates a completion for the provided prompt and parameters.|`POST` `/completions`|
|`Edits::create()`|Creates a new edit for the provided input, instruction, and parameters.|`POST` `/edits`|
|`Embeddings::create()`|Creates an embedding vector representing the input text.|`POST` `/embeddings`|
|`Files::list()`|Returns a list of files that belong to the user's organization.|`GET` `/files`|
4 changes: 2 additions & 2 deletions manifest.json
@@ -1,5 +1,5 @@
{
"libraryVersion": "1.4.0",
"apiVersion": "1.2.0",
"libraryVersion": "1.5.0",
"apiVersion": "1.3.1",
"buildVersion": "1.2.1"
}
18 changes: 9 additions & 9 deletions src/Client.php
@@ -72,25 +72,25 @@ public function __construct(ClientInterface $httpClient, Authentication $auth, s
}

/**
- * Access to the completions handler.
+ * Access to the chatCompletions handler.
*
* @api
- * @return Completions
+ * @return ChatCompletions
*/
- public function completions(): Completions
+ public function chatCompletions(): ChatCompletions
{
- return new \Tectalic\OpenAi\Handlers\Completions($this);
+ return new \Tectalic\OpenAi\Handlers\ChatCompletions($this);
}

/**
- * Access to the chatCompletions handler.
+ * Access to the completions handler.
*
* @api
- * @return ChatCompletions
+ * @return Completions
*/
- public function chatCompletions(): ChatCompletions
+ public function completions(): Completions
{
- return new \Tectalic\OpenAi\Handlers\ChatCompletions($this);
+ return new \Tectalic\OpenAi\Handlers\Completions($this);
}

/**
Expand Down Expand Up @@ -406,7 +406,7 @@ private function mergeRequestParts(

$request = $request->withHeader(
'User-Agent',
- 'Tectalic OpenAI REST API Client/1.4.0'
+ 'Tectalic OpenAI REST API Client/1.5.0'
);

// Merge Headers.
2 changes: 1 addition & 1 deletion src/Handlers/ChatCompletions.php
@@ -43,7 +43,7 @@ public function __construct(?Client $client = null)
}

/**
- * Creates a completion for the chat message
+ * Creates a model response for the given chat conversation.
*
* Operation URL: POST /chat/completions
* Operation ID: createChatCompletion
2 changes: 1 addition & 1 deletion src/Handlers/Completions.php
@@ -43,7 +43,7 @@ public function __construct(?Client $client = null)
}

/**
- * Creates a completion for the provided prompt and parameters
+ * Creates a completion for the provided prompt and parameters.
*
* Operation URL: POST /completions
* Operation ID: createCompletion
4 changes: 2 additions & 2 deletions src/Models/AudioTranscriptions/CreateRequest.php
@@ -30,8 +30,8 @@ final class CreateRequest extends AbstractModel
protected $ignoreMissing = false;

/**
- * The audio file to transcribe, in one of these formats: mp3, mp4, mpeg, mpga,
- * m4a, wav, or webm.
+ * The audio file object (not file name) to transcribe, in one of these formats:
+ * mp3, mp4, mpeg, mpga, m4a, wav, or webm.
*
* @var string must be an absolute path to a file.
*/
4 changes: 2 additions & 2 deletions src/Models/AudioTranslations/CreateRequest.php
@@ -30,8 +30,8 @@ final class CreateRequest extends AbstractModel
protected $ignoreMissing = false;

/**
- * The audio file to translate, in one of these formats: mp3, mp4, mpeg, mpga, m4a,
- * wav, or webm.
+ * The audio file object (not file name) to translate, in one of these formats:
+ * mp3, mp4, mpeg, mpga, m4a, wav, or webm.
*
* @var string must be an absolute path to a file.
*/
32 changes: 26 additions & 6 deletions src/Models/ChatCompletions/CreateRequest.php
@@ -24,20 +24,39 @@ final class CreateRequest extends AbstractModel
protected const REQUIRED = ['model', 'messages'];

/**
- * ID of the model to use. Currently, only gpt-3.5-turbo and gpt-3.5-turbo-0301 are
- * supported.
+ * ID of the model to use. See the model endpoint compatibility table for details
+ * on which models work with the Chat API.
*
* @var string
*/
public $model;

/**
- * The messages to generate chat completions for, in the chat format.
+ * A list of messages comprising the conversation so far. Example Python code.
*
* @var \Tectalic\OpenAi\Models\ChatCompletions\CreateRequestMessagesItem[]
*/
public $messages;

/**
* A list of functions the model may generate JSON inputs for.
*
* @var \Tectalic\OpenAi\Models\ChatCompletions\CreateRequestFunctionsItem[]
*/
public $functions;

/**
* Controls how the model responds to function calls. "none" means the model does
* not call a function, and responds to the end-user. "auto" means the model can
* pick between responding to the end-user or calling a function. Specifying a
* particular function via {"name": "my_function"} forces the model to call that function.
* "none" is the default when no functions are present. "auto" is the default if
* functions are present.
*
* @var mixed
*/
public $function_call;

/**
* What sampling temperature to use, between 0 and 2. Higher values like 0.8 will
* make the output more random, while lower values like 0.2 will make it more
@@ -80,7 +99,7 @@ final class CreateRequest extends AbstractModel
/**
* If set, partial message deltas will be sent, like in ChatGPT. Tokens will be
* sent as data-only server-sent events as they become available, with the stream
- * terminated by a data: [DONE] message.
+ * terminated by a data: [DONE] message. Example Python code.
*
* Default Value: false
*
@@ -98,8 +117,9 @@ final class CreateRequest extends AbstractModel
public $stop;

/**
- * The maximum number of tokens allowed for the generated answer. By default, the
- * number of tokens the model can return will be (4096 - prompt tokens).
+ * The maximum number of tokens to generate in the chat completion.
+ * The total length of input tokens and generated tokens is limited by the model's
+ * context length. Example Python code for counting tokens.
*
* @var int
*/
49 changes: 49 additions & 0 deletions src/Models/ChatCompletions/CreateRequestFunctionsItem.php
@@ -0,0 +1,49 @@
<?php

/**
* Copyright (c) 2022-present Tectalic (https://tectalic.com)
*
* For copyright and license information, please view the LICENSE file that was distributed with this source code.
*
* Please see the README.md file for usage instructions.
*/

declare(strict_types=1);

namespace Tectalic\OpenAi\Models\ChatCompletions;

use Tectalic\OpenAi\Models\AbstractModel;

final class CreateRequestFunctionsItem extends AbstractModel
{
/**
* List of required property names.
*
* These properties must all be set when this Model is instantiated.
*/
protected const REQUIRED = ['name'];

/**
* The name of the function to be called. Must be a-z, A-Z, 0-9, or contain
* underscores and dashes, with a maximum length of 64.
*
* @var string
*/
public $name;

/**
* The description of what the function does.
*
* @var string
*/
public $description;

/**
* The parameters the functions accepts, described as a JSON Schema object. See the
* guide for examples, and the JSON Schema reference for documentation about the
* format.
*
* @var \Tectalic\OpenAi\Models\ChatCompletions\CreateRequestFunctionsItemParameters
*/
public $parameters;
}
19 changes: 19 additions & 0 deletions src/Models/ChatCompletions/CreateRequestFunctionsItemParameters.php
@@ -0,0 +1,19 @@
<?php

/**
* Copyright (c) 2022-present Tectalic (https://tectalic.com)
*
* For copyright and license information, please view the LICENSE file that was distributed with this source code.
*
* Please see the README.md file for usage instructions.
*/

declare(strict_types=1);

namespace Tectalic\OpenAi\Models\ChatCompletions;

use Tectalic\OpenAi\Models\UnstructuredModel;

final class CreateRequestFunctionsItemParameters extends UnstructuredModel
{
}
21 changes: 16 additions & 5 deletions src/Models/ChatCompletions/CreateRequestMessagesItem.php
@@ -21,28 +21,39 @@ final class CreateRequestMessagesItem extends AbstractModel
*
* These properties must all be set when this Model is instantiated.
*/
- protected const REQUIRED = ['role', 'content'];
+ protected const REQUIRED = ['role'];

/**
- * The role of the author of this message.
+ * The role of the messages author. One of system, user, assistant, or function.
*
- * Allowed values: 'system', 'user', 'assistant'
+ * Allowed values: 'system', 'user', 'assistant', 'function'
*
* @var string
*/
public $role;

/**
- * The contents of the message
+ * The contents of the message. content is required for all messages except
+ * assistant messages with function calls.
*
* @var string
*/
public $content;

/**
- * The name of the user in a multi-user chat
+ * The name of the author of this message. name is required if role is function,
+ * and it should be the name of the function whose response is in the content. May
+ * contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
*
* @var string
*/
public $name;

/**
* The name and arguments of a function that should be called, as generated by the
* model.
*
* @var \Tectalic\OpenAi\Models\ChatCompletions\CreateRequestMessagesItemFunctionCall
*/
public $function_call;
}
