api references
atinylittleshell committed Aug 20, 2023
1 parent 59332b8 commit a43afe8
Showing 10 changed files with 570 additions and 8 deletions.
5 changes: 5 additions & 0 deletions .changeset/warm-panthers-cover.md
@@ -0,0 +1,5 @@
---
'function-gpt': minor
---

Add API reference documentation
4 changes: 4 additions & 0 deletions README.md
@@ -55,6 +55,10 @@ const response = await session.send('count characters in the html content of htt
expect(response).toBe('There are 4096 characters in the html content of https://www.google.com/.');
```

## API References

See [API references](./doc/README.md) for more detailed information on how to use the library.

## Installation

```bash
1 change: 1 addition & 0 deletions doc/.nojekyll
@@ -0,0 +1 @@
TypeDoc added this file to prevent GitHub Pages from using Jekyll. You can turn off this behavior by setting the `githubPages` option to false.
187 changes: 187 additions & 0 deletions doc/README.md
@@ -0,0 +1,187 @@
function-gpt

# function-gpt

## Table of contents

### Classes

- [ChatGPTSession](classes/ChatGPTSession.md)

### Type Aliases

- [ChatGPTFunctionCall](README.md#chatgptfunctioncall)
- [ChatGPTSendMessageOptions](README.md#chatgptsendmessageoptions)
- [ChatGPTSessionMessage](README.md#chatgptsessionmessage)
- [ChatGPTSessionOptions](README.md#chatgptsessionoptions)

### Functions

- [gptFunction](README.md#gptfunction)
- [gptObjectField](README.md#gptobjectfield)

## Type Aliases

### ChatGPTFunctionCall

Ƭ **ChatGPTFunctionCall**: `Object`

Represents a function call requested by ChatGPT.

#### Type declaration

| Name | Type |
| :------ | :------ |
| `arguments` | `string` |
| `name` | `string` |

#### Defined in

[src/session.ts:71](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L71)
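
**`Example`**

A hypothetical value of this shape (the function name and arguments below are illustrative, not taken from this repository); note that `arguments` is a JSON-encoded string rather than a parsed object:

```typescript
import type { ChatGPTFunctionCall } from 'function-gpt';

// `arguments` arrives as a JSON string and must be parsed before use.
// The function name and arguments here are purely illustrative.
const call: ChatGPTFunctionCall = {
  name: 'fetchPageText',
  arguments: '{"url":"https://www.google.com/"}',
};

const parsedArgs = JSON.parse(call.arguments) as { url: string };
```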

___

### ChatGPTSendMessageOptions

Ƭ **ChatGPTSendMessageOptions**: `Object`

Options for the ChatGPTSession.send method.

**`See`**

[OpenAI Chat Completion API](https://platform.openai.com/docs/api-reference/chat/create).

#### Type declaration

| Name | Type | Description |
| :------ | :------ | :------ |
| `frequency_penalty?` | `number` \| ``null`` | Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. **`See`** [See more information about frequency and presence penalties.](https://platform.openai.com/docs/api-reference/parameter-details) |
| `function_call?` | ``"none"`` \| ``"auto"`` \| { `name`: `string` } | Controls how the model responds to function calls. "none" means the model does not call a function and responds to the end-user. "auto" means the model can pick between responding to the end-user or calling a function. Specifying a particular function via `{"name": "my_function"}` forces the model to call that function. "none" is the default when no functions are present; "auto" is the default if functions are present. |
| `function_call_execute_only?` | `boolean` | Stop the session after executing the function call. Useful when you don't need to give ChatGPT the result of the function call. Defaults to `false`. |
| `logit_bias?` | `Record`<`string`, `number`\> \| ``null`` | Modify the likelihood of specified tokens appearing in the completion. Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token. |
| `max_tokens?` | `number` | The maximum number of [tokens](https://platform.openai.com/tokenizer) to generate in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) for counting tokens. |
| `model` | `string` | ID of the model to use. **`See`** [model endpoint compatibility](https://platform.openai.com/docs/models/overview) |
| `presence_penalty?` | `number` \| ``null`` | Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. [See more information about frequency and presence penalties.](https://platform.openai.com/docs/api-reference/parameter-details) |
| `stop?` | `string` \| ``null`` \| `string`[] | Up to 4 sequences where the API will stop generating further tokens. |
| `temperature?` | `number` \| ``null`` | What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or `top_p` but not both. |
| `top_p?` | `number` \| ``null`` | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or `temperature` but not both. |
| `user?` | `string` | A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. **`See`** [Learn more](https://platform.openai.com/docs/guides/safety-best-practices). |

#### Defined in

[src/session.ts:91](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L91)
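
**`Example`**

A minimal sketch of an options object for `send`; only `model` is required, and the model ID and values below are illustrative:

```typescript
import type { ChatGPTSendMessageOptions } from 'function-gpt';

// Only `model` is required; the rest are optional OpenAI parameters.
const options: ChatGPTSendMessageOptions = {
  model: 'gpt-3.5-turbo',
  temperature: 0.2,
  function_call: 'auto',
  // Set to true to stop after executing a requested function call
  // instead of sending the result back to the model.
  function_call_execute_only: false,
};
```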

___

### ChatGPTSessionMessage

Ƭ **ChatGPTSessionMessage**: `Object`

Represents a message in a ChatGPT session.

#### Type declaration

| Name | Type |
| :------ | :------ |
| `content` | `string` \| ``null`` |
| `function_call?` | [`ChatGPTFunctionCall`](README.md#chatgptfunctioncall) |
| `name?` | `string` |
| `role` | ``"system"`` \| ``"user"`` \| ``"assistant"`` \| ``"function"`` |

#### Defined in

[src/session.ts:79](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L79)
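
**`Example`**

An illustrative exchange (contents made up): the assistant requests a function call, and the function's result is fed back with the `function` role:

```typescript
import type { ChatGPTSessionMessage } from 'function-gpt';

// All names, URLs, and contents below are illustrative.
const history: ChatGPTSessionMessage[] = [
  { role: 'user', content: 'How long is the page at https://www.google.com/?' },
  {
    role: 'assistant',
    content: null,
    function_call: {
      name: 'fetchPageText',
      arguments: '{"url":"https://www.google.com/"}',
    },
  },
  { role: 'function', name: 'fetchPageText', content: '<html>...</html>' },
];
```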

___

### ChatGPTSessionOptions

Ƭ **ChatGPTSessionOptions**: { `systemMessage?`: `string` } & `ClientOptions`

Options for the ChatGPTSession constructor. Compatible with the OpenAI node client options.

**`See`**

[OpenAI Node Client](https://github.com/openai/openai-node)

#### Defined in

[src/session.ts:64](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L64)
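
**`Example`**

A sketch of an options object, assuming the API key is read from the environment; `systemMessage` comes from function-gpt, while everything else is passed through to the OpenAI node client:

```typescript
import type { ChatGPTSessionOptions } from 'function-gpt';

// `apiKey` is part of the OpenAI client's ClientOptions; the values
// shown are illustrative.
const options: ChatGPTSessionOptions = {
  apiKey: process.env.OPENAI_API_KEY,
  systemMessage: 'You are a concise assistant.',
};
```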

## Functions

### gptFunction

**gptFunction**(`description`, `inputType`): (`target`: `object`, `propertyKey`: `string`, `descriptor`: `PropertyDescriptor`) => `void`

Use this decorator on a method within a ChatGPTSession subclass to enable it for function-calling.

#### Parameters

| Name | Type | Description |
| :------ | :------ | :------ |
| `description` | `string` | A description of the function. |
| `inputType` | () => `unknown` | The function's input should be an instance of a custom class; this parameter specifies that class. |

#### Returns

`fn`

▸ (`target`, `propertyKey`, `descriptor`): `void`

##### Parameters

| Name | Type |
| :------ | :------ |
| `target` | `object` |
| `propertyKey` | `string` |
| `descriptor` | `PropertyDescriptor` |

##### Returns

`void`

**`See`**

[gptObjectField](README.md#gptobjectfield)

#### Defined in

[src/decorators.ts:19](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/decorators.ts#L19)
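
**`Example`**

A minimal sketch of a function-calling subclass, assuming TypeScript's `experimentalDecorators` setting (implied by the legacy decorator signature above) and Node 18+ for the global `fetch`; the class and method names are illustrative:

```typescript
import { ChatGPTSession, gptFunction, gptObjectField } from 'function-gpt';

// Hypothetical input class; each parameter is a decorated property.
class FetchPageInput {
  @gptObjectField('string', 'URL of the page to fetch')
  public url: string = '';
}

class BrowsingSession extends ChatGPTSession {
  // The description tells the model what the function does; the second
  // argument is the class describing the function's input object.
  @gptFunction('fetch the text content of a web page', FetchPageInput)
  async fetchPage(input: FetchPageInput): Promise<string> {
    const response = await fetch(input.url);
    return await response.text();
  }
}
```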

___

### gptObjectField

**gptObjectField**(`type`, `description`, `optional?`): (`target`: `object`, `propertyKey`: `string`) => `void`

Use this decorator on a property within a custom class to include it as a parameter for function-calling.

#### Parameters

| Name | Type | Default value | Description |
| :------ | :------ | :------ | :------ |
| `type` | ``"string"`` \| ``"number"`` \| ``"boolean"`` \| [``"string"`` \| ``"number"`` \| ``"boolean"``] \| [() => `unknown`] \| () => `unknown` | `undefined` | Type of the field. Use `'string'`, `'number'`, `'boolean'` for primitive types. Use `['string']`, `['number']`, `['boolean']` for arrays of primitive types. Use a ClassName for custom types. Use `[ClassName]` for arrays of custom types. |
| `description` | `string` | `undefined` | Description of the field. |
| `optional` | `boolean` | `false` | Whether the field is optional. Defaults to `false`. |

#### Returns

`fn`

▸ (`target`, `propertyKey`): `void`

##### Parameters

| Name | Type |
| :------ | :------ |
| `target` | `object` |
| `propertyKey` | `string` |

##### Returns

`void`

#### Defined in

[src/decorators.ts:53](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/decorators.ts#L53)
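
**`Example`**

A sketch of the different `type` forms on hypothetical input classes (field names and descriptions are illustrative):

```typescript
import { gptObjectField } from 'function-gpt';

class GeoPoint {
  @gptObjectField('number', 'latitude in degrees')
  public lat: number = 0;

  @gptObjectField('number', 'longitude in degrees')
  public lon: number = 0;
}

class RouteQueryInput {
  // Primitive field:
  @gptObjectField('string', 'name of the route')
  public name: string = '';

  // Array of primitives, marked optional:
  @gptObjectField(['string'], 'tags to filter by', true)
  public tags?: string[];

  // Nested custom type:
  @gptObjectField(GeoPoint, 'starting point')
  public start: GeoPoint = new GeoPoint();

  // Array of custom types, marked optional:
  @gptObjectField([GeoPoint], 'intermediate waypoints', true)
  public waypoints?: GeoPoint[];
}
```
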
158 changes: 158 additions & 0 deletions doc/classes/ChatGPTSession.md
@@ -0,0 +1,158 @@
[function-gpt](../README.md) / ChatGPTSession

# Class: ChatGPTSession

Extend this class to create your own function-calling enabled ChatGPT session.
Provide functions to the assistant by decorating them with the `@gptFunction` decorator.

**`See`**

[gptFunction](../README.md#gptfunction)

## Table of contents

### Constructors

- [constructor](ChatGPTSession.md#constructor)

### Properties

- [metadata](ChatGPTSession.md#metadata)
- [openai](ChatGPTSession.md#openai)
- [options](ChatGPTSession.md#options)
- [sessionMessages](ChatGPTSession.md#sessionmessages)

### Accessors

- [messages](ChatGPTSession.md#messages)

### Methods

- [processAssistantMessage](ChatGPTSession.md#processassistantmessage)
- [send](ChatGPTSession.md#send)

## Constructors

### constructor

**new ChatGPTSession**(`options?`)

#### Parameters

| Name | Type | Description |
| :------ | :------ | :------ |
| `options` | [`ChatGPTSessionOptions`](../README.md#chatgptsessionoptions) | Options for the ChatGPTSession constructor. |

**`See`**

[ChatGPTSessionOptions](../README.md#chatgptsessionoptions)

#### Defined in

[src/session.ts:204](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L204)

## Properties

### metadata

`Private` `Readonly` **metadata**: `GPTClientMetadata`

#### Defined in

[src/session.ts:196](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L196)

___

### openai

`Readonly` **openai**: `OpenAI`

#### Defined in

[src/session.ts:195](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L195)

___

### options

`Private` `Readonly` **options**: [`ChatGPTSessionOptions`](../README.md#chatgptsessionoptions) = `{}`

Options for the ChatGPTSession constructor.

#### Defined in

[src/session.ts:204](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L204)

___

### sessionMessages

`Private` **sessionMessages**: [`ChatGPTSessionMessage`](../README.md#chatgptsessionmessage)[] = `[]`

#### Defined in

[src/session.ts:197](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L197)

## Accessors

### messages

`get` **messages**(): [`ChatGPTSessionMessage`](../README.md#chatgptsessionmessage)[]

#### Returns

[`ChatGPTSessionMessage`](../README.md#chatgptsessionmessage)[]

The messages sent to and from the assistant so far.

#### Defined in

[src/session.ts:254](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L254)

## Methods

### processAssistantMessage

`Private` **processAssistantMessage**(`message`, `options`): `Promise`<`string`\>

#### Parameters

| Name | Type |
| :------ | :------ |
| `message` | [`ChatGPTSessionMessage`](../README.md#chatgptsessionmessage) |
| `options` | [`ChatGPTSendMessageOptions`](../README.md#chatgptsendmessageoptions) |

#### Returns

`Promise`<`string`\>

#### Defined in

[src/session.ts:258](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L258)

___

### send

**send**(`message`, `options?`): `Promise`<`string`\>

#### Parameters

| Name | Type | Description |
| :------ | :------ | :------ |
| `message` | `string` | The user message to send to the assistant. |
| `options` | [`ChatGPTSendMessageOptions`](../README.md#chatgptsendmessageoptions) | Options for the ChatGPTSession.send method. |

#### Returns

`Promise`<`string`\>

The assistant's response.

**`See`**

[ChatGPTSendMessageOptions](../README.md#chatgptsendmessageoptions)

#### Defined in

[src/session.ts:221](https://github.com/atinylittleshell/function-gpt/blob/04eb21b/src/session.ts#L221)
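
**`Example`**

A hedged end-to-end sketch: construct a minimal subclass, send a message, and inspect the conversation; the subclass name and model ID are illustrative, and only `model` is required in the options:

```typescript
import { ChatGPTSession } from 'function-gpt';

// A hypothetical minimal subclass with no functions; see the gptFunction
// docs for a subclass that actually exposes functions to the model.
class EchoSession extends ChatGPTSession {}

async function main(): Promise<void> {
  const session = new EchoSession({ apiKey: process.env.OPENAI_API_KEY });

  const reply = await session.send('Say hello in one word.', {
    model: 'gpt-3.5-turbo',
    temperature: 0,
  });

  console.log(reply);
  // The full exchange so far, including any function-call turns.
  console.log(session.messages);
}

void main();
```
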
4 changes: 4 additions & 0 deletions package.json
@@ -38,6 +38,7 @@
"node": ">=16.0.0"
},
"scripts": {
"doc": "shx rm -rf doc && typedoc",
"build": "tsup",
"lint": "eslint . --ext .ts",
"test": "vitest run --coverage",
@@ -61,7 +62,10 @@
"eslint-plugin-prettier": "^5.0.0",
"eslint-plugin-simple-import-sort": "^10.0.0",
"prettier": "^3.0.2",
"shx": "^0.3.4",
"tsup": "^7.2.0",
"typedoc": "^0.24.8",
"typedoc-plugin-markdown": "^3.15.4",
"typescript": "^5.1.6",
"vitest": "^0.34.1"
},