
feat: Support Azure OpenAI Service
Peek-A-Booo committed Apr 28, 2023
1 parent 6bdd058 commit 4240e07
Showing 19 changed files with 582 additions and 73 deletions.
10 changes: 8 additions & 2 deletions .env.local.demo
@@ -1,6 +1,12 @@
# sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# OpenAI Key. eg: sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
NEXT_PUBLIC_OPENAI_API_KEY=
# your own api proxy url. if empty here, default proxy will be https://api.openai.com
# Your own openai api proxy url. If empty here, default proxy will be https://api.openai.com
NEXT_PUBLIC_OPENAI_API_PROXY=

# Azure OpenAI Key. eg: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
NEXT_PUBLIC_AZURE_OPENAI_API_KEY=
# your own Azure OpenAI api proxy url. If it is empty, the Azure OpenAI Service will not function properly.
NEXT_PUBLIC_AZURE_OPENAI_API_PROXY=

# set your own sentry dsn. if empty here, it will not report error to sentry
NEXT_PUBLIC_SENTRY_DSN=
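
These two Azure variables mirror the existing OpenAI pair: the key authenticates, the proxy supplies the base URL. Below is a minimal sketch of how a client could consume them, assuming (as the a.js worker later in this commit suggests) that the key is sent as a bearer token and the proxy keeps OpenAI-style paths; it is illustrative, not the app's actual request code.

```javascript
// Illustrative sketch only. It assumes the app swaps in the Azure proxy as the base URL
// when the NEXT_PUBLIC_AZURE_OPENAI_* variables are set; the hostname is a placeholder.
const azureProxy = process.env.NEXT_PUBLIC_AZURE_OPENAI_API_PROXY; // e.g. https://azure-proxy.example.workers.dev
const azureKey = process.env.NEXT_PUBLIC_AZURE_OPENAI_API_KEY;

export async function azureChat(messages) {
  // The proxy accepts the OpenAI-style path and maps it to the Azure deployment internally.
  const res = await fetch(`${azureProxy}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${azureKey}`, // a proxy like a.js converts this to Azure's api-key header
    },
    body: JSON.stringify({ model: "gpt-3.5-turbo-0301", messages }),
  });
  return res.json();
}
```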
12 changes: 12 additions & 0 deletions CHANGE_LOG.md
@@ -1,5 +1,17 @@
# L-GPT Change Log

## v0.1.3

> 2023-04-28
### Fixed

- Fix the issue where clicking the clear button in Input does not clear the value

### Add

- Support Azure OpenAI Service

## v0.1.2

> 2023-04-27
28 changes: 21 additions & 7 deletions README.md
@@ -14,9 +14,11 @@ L-GPT is an open-source project that imitates the OpenAI ChatGPT. [Demo](https:/
- Responsive design, dark mode and PWA
- Safe, all data based on local
- Support i18n
- Support Azure OpenAI Service

## Next

- [x] Support Azure OpenAI
- [ ] Introduce prompt words and prompt word templates
- [ ] Support GPT-4 and Claude
- [ ] Desktop version development?
@@ -41,11 +43,17 @@ NEXT_PUBLIC_OPENAI_API_KEY=
# If none of these are being used, then connect directly to the Open AI official address: https://api.openai.com.
NEXT_PUBLIC_OPENAI_API_PROXY=

# Set Your Azure OpenAI key.
NEXT_PUBLIC_AZURE_OPENAI_API_KEY=

# Set Your Azure OpenAI proxy.
NEXT_PUBLIC_AZURE_OPENAI_API_PROXY=

# set your own sentry dsn. if empty here, it will not report error to sentry
NEXT_PUBLIC_SENTRY_DSN=
```

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/Peek-A-Booo/L-GPT&env=NEXT_PUBLIC_OPENAI_API_KEY&env=NEXT_PUBLIC_OPENAI_API_PROXY&env=NEXT_PUBLIC_SENTRY_DSN)
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/Peek-A-Booo/L-GPT&env=NEXT_PUBLIC_OPENAI_API_KEY&env=NEXT_PUBLIC_OPENAI_API_PROXY&env=NEXT_PUBLIC_AZURE_OPENAI_API_KEY&env=NEXT_PUBLIC_AZURE_OPENAI_API_PROXY&env=NEXT_PUBLIC_SENTRY_DSN)

## Running Local

@@ -76,8 +84,12 @@ Rename .evn.local.demo to .env.local and configure it.
```bash
# sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
NEXT_PUBLIC_OPENAI_API_KEY=
# your own api proxy url.
# your own OpenAI Api proxy url.
NEXT_PUBLIC_OPENAI_API_PROXY=
# Azure OpenAI Key. eg: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
NEXT_PUBLIC_AZURE_OPENAI_API_KEY=
# your own Azure OpenAI api proxy url. If it is empty, the Azure OpenAI Service will not function properly.
NEXT_PUBLIC_AZURE_OPENAI_API_PROXY=
# set your own sentry dsn. if empty here, it will not report error to sentry
NEXT_PUBLIC_SENTRY_DSN=
```
@@ -98,8 +110,10 @@ pnpm build && pnpm start

You can configure the following environment variables.

| Environment Variable | Desc | Required | Default |
| ------------------------------ | ------------------------------------------------------------- | -------- | ------------------------ |
| `NEXT_PUBLIC_OPENAI_API_KEY` | your OpenAI API Key | false | |
| `NEXT_PUBLIC_OPENAI_API_PROXY` | your OpenAI API proxy server | false | `https://api.openai.com` |
| `NEXT_PUBLIC_SENTRY_DSN` | your sentry dsn. If empty, it will not report error to sentry | false | |
| Environment Variable | Desc | Required | Default |
| ------------------------------------ | ------------------------------------------------------------- | -------- | ------------------------ |
| `NEXT_PUBLIC_OPENAI_API_KEY` | your OpenAI API Key | false | |
| `NEXT_PUBLIC_OPENAI_API_PROXY` | your OpenAI API proxy server | false | `https://api.openai.com` |
| `NEXT_PUBLIC_AZURE_OPENAI_API_KEY` | your Azure OpenAI API Key | false | |
| `NEXT_PUBLIC_AZURE_OPENAI_API_PROXY` | your Azure OpenAI API proxy server | false | |
| `NEXT_PUBLIC_SENTRY_DSN` | your sentry dsn. If empty, it will not report error to sentry | false | |
30 changes: 22 additions & 8 deletions README_CN.md
@@ -12,9 +12,11 @@ L-GPT 是一项开源项目,借助 OpenAI Api 模仿了 ChatGPT 的功能。 [
- 支持响应式,暗黑模式和 PWA
- 安全,所有数据均基于本地存储
- 支持 i18n
- 支持 Azure OpenAI Service

## 下一步计划

- [x] 支持 Azure OpenAI
- [ ] 引入提示词以及提示词模板
- [ ] 支持 GPT-4 和 Claude
- [ ] 桌面版本开发?
@@ -39,11 +41,17 @@ NEXT_PUBLIC_OPENAI_API_KEY=
# 都没有使用则直连Open AI 官方地址:https://api.openai.com
NEXT_PUBLIC_OPENAI_API_PROXY=

# 配置你的 Azure OpenAI key.
NEXT_PUBLIC_AZURE_OPENAI_API_KEY=

# 配置你的 Azure OpenAI proxy.
NEXT_PUBLIC_AZURE_OPENAI_API_PROXY=

# 配置你的 sentry dsn地址。如果为空, 将不会将错误报告到 sentry
NEXT_PUBLIC_SENTRY_DSN=
```

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/Peek-A-Booo/L-GPT&env=NEXT_PUBLIC_OPENAI_API_KEY&env=NEXT_PUBLIC_OPENAI_API_PROXY&env=NEXT_PUBLIC_SENTRY_DSN)
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/Peek-A-Booo/L-GPT&env=NEXT_PUBLIC_OPENAI_API_KEY&env=NEXT_PUBLIC_OPENAI_API_PROXY&env=NEXT_PUBLIC_AZURE_OPENAI_API_KEY&env=NEXT_PUBLIC_AZURE_OPENAI_API_PROXY&env=NEXT_PUBLIC_SENTRY_DSN)

## 本地运行

@@ -72,10 +80,14 @@ pnpm i
将 .evn.local.demo 重命名为 .env.local 并进行配置。

```bash
# sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# OpenAI 官方key: sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
NEXT_PUBLIC_OPENAI_API_KEY=
# 你个人的 api 代理地址。
# 你个人的 OpenAI api 代理地址。
NEXT_PUBLIC_OPENAI_API_PROXY=
# Azure OpenAI Key: eg: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
NEXT_PUBLIC_AZURE_OPENAI_API_KEY=
# 配置你的Azure OpenAI代理地址。 If it is empty, the Azure OpenAI Service will not function properly.
NEXT_PUBLIC_AZURE_OPENAI_API_PROXY=
# 配置你的 sentry dsn地址。如果为空, 将不会将错误报告到 sentry
NEXT_PUBLIC_SENTRY_DSN=
```
@@ -96,8 +108,10 @@ pnpm build && pnpm start

你可以配置以下环境变量。

| 环境变量 | 描述 | 是否必须配置 | 默认值 |
| ------------------------------ | --------------------------------------------------------- | ------------ | ------------------------ |
| `NEXT_PUBLIC_OPENAI_API_KEY` | 你个人的 OpenAI API key || |
| `NEXT_PUBLIC_OPENAI_API_PROXY` | 你个人的 API 代理地址 || `https://api.openai.com` |
| `NEXT_PUBLIC_SENTRY_DSN` | 你的 sentry dsn 地址。如果为空, 将不会将错误报告到 sentry || |
| 环境变量 | 描述 | 是否必须配置 | 默认值 |
| ------------------------------------ | --------------------------------------------------------- | ------------ | ------------------------ |
| `NEXT_PUBLIC_OPENAI_API_KEY` | 你个人的 OpenAI API key || |
| `NEXT_PUBLIC_OPENAI_API_PROXY` | 你个人的 OpenAI API 代理地址 || `https://api.openai.com` |
| `NEXT_PUBLIC_AZURE_OPENAI_API_KEY` | 你个人的 Azure OpenAI API key || |
| `NEXT_PUBLIC_AZURE_OPENAI_API_PROXY` | 你个人的 Azure OpenAI API 代理地址 || |
| `NEXT_PUBLIC_SENTRY_DSN` | 你的 sentry dsn 地址。如果为空, 将不会将错误报告到 sentry || |
194 changes: 194 additions & 0 deletions a.js
@@ -0,0 +1,194 @@
addEventListener("fetch", (event) => {
event.respondWith(handleRequest(event.request));
});

const config = {
openai: {
originUrl: "https://api.openai.com",
},
azure: {
apiVersion: "2023-03-15-preview",
// The name of your Azure OpenAI Resource.
resourceName: "lgpt-azure-openai",
// The deployment name you chose when you deployed the model.
model: {
"gpt-3.5-turbo-0301": "lgpt-35-turbo",
},
},
};

async function handleRequest(request) {
if (
!request.url.includes("openai-proxy") &&
!request.url.includes("azure-proxy")
) {
return new Response("404 Not Found", { status: 404 });
}

if (request.method === "OPTIONS") return handleOPTIONS(request);

if (request.url.includes("openai-proxy")) {
return handleOpenAI(request);
} else if (request.url.includes("azure-proxy")) {
return handleAzure(request);
}
}

async function handleOpenAI(request) {
const url = new URL(request.url);

url.host = config.openai.originUrl.replace(/^https?:\/\//, "");

const modifiedRequest = new Request(url.toString(), {
headers: request.headers,
method: request.method,
body: request.body,
redirect: "follow",
});

const response = await fetch(modifiedRequest);
const modifiedResponse = new Response(response.body, response);

// 添加允许跨域访问的响应头
modifiedResponse.headers.set("Access-Control-Allow-Origin", "*");

return modifiedResponse;
}

async function handleAzure(request) {
const url = new URL(request.url);

if (url.pathname === "/v1/chat/completions") {
var path = "chat/completions";
} else if (url.pathname === "/v1/completions") {
var path = "completions";
} else if (url.pathname === "/v1/models") {
return handleModels(request);
} else {
return new Response("404 Not Found", { status: 404 });
}

let body;
if (request.method === "POST") body = await request.json();

const modelName = body?.model;
const deployName = config.azure.model[modelName];

if (deployName === "") return new Response("Missing model", { status: 403 });

const fetchAPI = `https://${config.azure.resourceName}.openai.azure.com/openai/deployments/${deployName}/${path}?api-version=${config.azure.apiVersion}`;

const authKey = request.headers.get("Authorization");

if (!authKey) return new Response("Not allowed", { status: 403 });

const payload = {
method: request.method,
headers: {
"Content-Type": "application/json",
"api-key": authKey.replace("Bearer ", ""),
},
body: typeof body === "object" ? JSON.stringify(body) : "{}",
};

const response = await fetch(fetchAPI, payload);

if (body?.stream != true) {
return response;
}

const { readable, writable } = new TransformStream();
stream(response.body, writable);
return new Response(readable, response);
}

function sleep(ms) {
return new Promise((resolve) => setTimeout(resolve, ms));
}

// support printer mode and add newline
async function stream(readable, writable) {
const reader = readable.getReader();
const writer = writable.getWriter();

// const decoder = new TextDecoder();
const encoder = new TextEncoder();
const decoder = new TextDecoder();
// let decodedValue = decoder.decode(value);
const newline = "\n";
const delimiter = "\n\n";
const encodedNewline = encoder.encode(newline);

let buffer = "";
while (true) {
let { value, done } = await reader.read();
if (done) {
break;
}
buffer += decoder.decode(value, { stream: true }); // stream: true is important here,fix the bug of incomplete line
let lines = buffer.split(delimiter);

// Loop through all but the last line, which may be incomplete.
for (let i = 0; i < lines.length - 1; i++) {
await writer.write(encoder.encode(lines[i] + delimiter));
await sleep(30);
}

buffer = lines[lines.length - 1];
}

if (buffer) {
await writer.write(encoder.encode(buffer));
}
await writer.write(encodedNewline);
await writer.close();
}

async function handleModels() {
const data = {
object: "list",
data: [],
};

for (let key in config.azure.model) {
data.data.push({
id: key,
object: "model",
created: 1677610602,
owned_by: "openai",
permission: [
{
id: "modelperm-M56FXnG1AsIr3SXq8BYPvXJA",
object: "model_permission",
created: 1679602088,
allow_create_engine: false,
allow_sampling: true,
allow_logprobs: true,
allow_search_indices: false,
allow_view: true,
allow_fine_tuning: false,
organization: "*",
group: null,
is_blocking: false,
},
],
root: key,
parent: null,
});
}

const json = JSON.stringify(data, null, 2);
return new Response(json, {
headers: { "Content-Type": "application/json" },
});
}

async function handleOPTIONS() {
return new Response(null, {
headers: {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Methods": "*",
"Access-Control-Allow-Headers": "*",
},
});
}
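
a.js appears to be a Cloudflare Worker that fronts both services: requests whose URL contains `openai-proxy` are forwarded verbatim to api.openai.com, while `azure-proxy` requests have their OpenAI-style path rewritten to the Azure deployment endpoint, the `Bearer` token moved into Azure's `api-key` header, and streamed bodies re-chunked by `stream()`. The sketch below shows one way to call such a deployment; the hostname and key are placeholders, and binding the worker to a route containing `azure-proxy` is an assumption based on the `includes()` checks above.

```javascript
// Placeholder route: the worker only requires that the request URL contain "azure-proxy".
const AZURE_PROXY = "https://azure-proxy.example.workers.dev";

async function streamChat() {
  const res = await fetch(`${AZURE_PROXY}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer <your-azure-openai-key>", // forwarded as Azure's api-key header
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo-0301", // mapped to the "lgpt-35-turbo" deployment via config.azure.model
      messages: [{ role: "user", content: "Hello" }],
      stream: true, // streamed replies are re-chunked by the worker's stream() helper
    }),
  });

  // Consume the server-sent-event chunks the worker writes out line by line.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value, { stream: true }));
  }
}

streamChat();
```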
25 changes: 25 additions & 0 deletions a1.js
@@ -0,0 +1,25 @@
const TELEGRAPH_URL = "https://api.openai.com";

addEventListener("fetch", (event) => {
event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
const url = new URL(request.url);
url.host = TELEGRAPH_URL.replace(/^https?:\/\//, "");

const modifiedRequest = new Request(url.toString(), {
headers: request.headers,
method: request.method,
body: request.body,
redirect: "follow",
});

const response = await fetch(modifiedRequest);
const modifiedResponse = new Response(response.body, response);

// 添加允许跨域访问的响应头
modifiedResponse.headers.set("Access-Control-Allow-Origin", "*");

return modifiedResponse;
}
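
a1.js is a much smaller worker: a plain reverse proxy for api.openai.com that adds a permissive CORS header. Once deployed, its URL is presumably what `NEXT_PUBLIC_OPENAI_API_PROXY` is meant to point at. A brief sketch with a placeholder hostname:

```javascript
// Placeholder: replace with your deployed worker URL,
// i.e. the value you would set as NEXT_PUBLIC_OPENAI_API_PROXY.
const OPENAI_PROXY = "https://openai-proxy.example.workers.dev";

async function listModels() {
  const res = await fetch(`${OPENAI_PROXY}/v1/models`, {
    headers: { Authorization: "Bearer sk-<your-openai-key>" },
  });
  console.log(await res.json()); // same payload api.openai.com would return
}

listModels();
```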

1 comment on commit 4240e07

@vercel vercel bot commented on 4240e07 Apr 28, 2023