ChatGPT Web

A ChatGPT demo web page built with Express and Vue 3.


Disclaimer: This project is released only on GitHub, under the MIT License, free of charge and for open-source learning purposes. There is no account selling, no paid services, no discussion groups, and no forums. Beware of fraud.


Introduction

Supports dual models and provides two unofficial ChatGPT API methods:

Method                                      | Free? | Reliability           | Quality
ChatGPTAPI (gpt-3.5-turbo-0301)             | No    | Reliable              | Relatively clumsy
ChatGPTUnofficialProxyAPI (web accessToken) | Yes   | Relatively unreliable | Smart

Comparison:

  1. ChatGPTAPI uses gpt-3.5-turbo-0301 to simulate ChatGPT through the official OpenAI completion API (the most reliable method, but it is not free and does not use models specifically tuned for chat).
  2. ChatGPTUnofficialProxyAPI accesses ChatGPT's backend API via an unofficial proxy server to bypass Cloudflare (uses the real ChatGPT, is very lightweight, but depends on third-party servers and has rate limits).

Details

Switching Methods:

  1. Go to the service/.env.example file and copy the contents to the service/.env file.
  2. For OpenAI API Key, fill in the OPENAI_API_KEY field (Get apiKey).
  3. For Web API, fill in the OPENAI_ACCESS_TOKEN field (Get accessToken).
  4. When both are present, OpenAI API Key takes precedence (see the selection sketch below).
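
To make the precedence concrete, below is a minimal sketch of how the backend might pick a client from these variables. It assumes the chatgpt npm package, and the option names (apiKey, accessToken, apiReverseProxyUrl) follow that package; treat it as an illustration, not this project's exact code.

// service: choose an API client from the environment (illustrative sketch)
import { ChatGPTAPI, ChatGPTUnofficialProxyAPI } from 'chatgpt'

function createClient() {
  // OPENAI_API_KEY takes precedence when both variables are set
  if (process.env.OPENAI_API_KEY) {
    return new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
  }
  if (process.env.OPENAI_ACCESS_TOKEN) {
    return new ChatGPTUnofficialProxyAPI({
      accessToken: process.env.OPENAI_ACCESS_TOKEN,
      // optional, see the Reverse Proxy section below
      apiReverseProxyUrl: process.env.API_REVERSE_PROXY || undefined,
    })
  }
  throw new Error('Set OPENAI_API_KEY or OPENAI_ACCESS_TOKEN in service/.env')
}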

Reverse Proxy:

Available when using ChatGPTUnofficialProxyAPI. Details

# service/.env
API_REVERSE_PROXY=

Environment Variables:

For all parameter variables, check here or see:

/service/.env

Roadmap

[✓] Dual models

[✓] Multiple session storage and context logic

[✓] Formatting and beautifying code-like message types

[✓] Access rights control

[✓] Data import and export

[✓] Save message to local image

[✓] Multilingual interface

[✓] Interface themes

[✗] More...

Prerequisites

Node

node requires version ^16 || ^18 (node >= 14 requires a fetch polyfill); you can use nvm to manage multiple local node versions.

node -v

PNPM

If you have not installed pnpm before:

npm install pnpm -g

Fill in the Keys

Get an OpenAI API Key or accessToken and fill it into the local environment variables (see Switching Methods above).

# service/.env file

# OpenAI API Key - https://platform.openai.com/overview
OPENAI_API_KEY=

# change this to an `accessToken` extracted from the ChatGPT site's `https://chat.openai.com/api/auth/session` response
OPENAI_ACCESS_TOKEN=

Install Dependencies

To make it easier for backend developers to understand, we did not use the front-end workspace mode; the front end and the back end are kept in separate folders instead. If you only need to do secondary development of the front-end page, you can delete the service folder.

Backend

Enter the /service folder and run the following command

pnpm install

Frontend

Run the following command in the root directory

pnpm bootstrap

Run in Test Environment

Backend Service

Enter the /service folder and run the following command

pnpm start

Frontend Webpage

Run the following command in the root directory

pnpm dev

Packaging

Using Docker

Docker Parameter Example

  • OPENAI_API_KEY required, one of the two
  • OPENAI_ACCESS_TOKEN required, one of the two; OPENAI_API_KEY takes precedence when both are present
  • OPENAI_API_BASE_URL optional, available when OPENAI_API_KEY is set
  • OPENAI_API_MODEL optional, available when OPENAI_API_KEY is set
  • API_REVERSE_PROXY optional, available when OPENAI_ACCESS_TOKEN is set Reference
  • AUTH_SECRET_KEY access password, optional
  • TIMEOUT_MS timeout, in milliseconds, optional
  • SOCKS_PROXY_HOST optional, effective with SOCKS_PROXY_PORT
  • SOCKS_PROXY_PORT optional, effective with SOCKS_PROXY_HOST
  • HTTPS_PROXY optional, supports http, https, socks5
  • ALL_PROXY optional, supports http, https, socks5
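
As a reading aid, here is a hedged sketch of how a backend might load these optional variables with fallbacks. The loadConfig helper and the default values are assumptions for illustration only, not the project's actual code.

// illustrative config loader; names and defaults are assumptions
interface ServiceConfig {
  apiKey?: string
  accessToken?: string
  reverseProxy?: string
  authSecretKey?: string
  timeoutMs: number
  socksProxy?: { host: string; port: number }
}

function loadConfig(env = process.env): ServiceConfig {
  return {
    apiKey: env.OPENAI_API_KEY,
    accessToken: env.OPENAI_ACCESS_TOKEN,
    reverseProxy: env.API_REVERSE_PROXY,
    authSecretKey: env.AUTH_SECRET_KEY,
    // TIMEOUT_MS is in milliseconds; 100000 here is an arbitrary fallback
    timeoutMs: Number(env.TIMEOUT_MS) || 100000,
    socksProxy: env.SOCKS_PROXY_HOST && env.SOCKS_PROXY_PORT
      ? { host: env.SOCKS_PROXY_HOST, port: Number(env.SOCKS_PROXY_PORT) }
      : undefined,
  }
}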


Docker Build & Run

docker build -t chatgpt-web .

# foreground operation
docker run --name chatgpt-web --rm -it -p 127.0.0.1:3002:3002 --env OPENAI_API_KEY=your_api_key chatgpt-web

# background operation
docker run --name chatgpt-web -d -p 127.0.0.1:3002:3002 --env OPENAI_API_KEY=your_api_key chatgpt-web

# running address
http://localhost:3002/

Docker Compose

Hub Address

version: '3'

services:
  app:
    image: chenzhaoyu94/chatgpt-web # always use latest, pull the tag image again when updating
    ports:
      - 127.0.0.1:3002:3002
    environment:
      # one of two
      OPENAI_API_KEY: xxxxxx
      # one of two
      OPENAI_ACCESS_TOKEN: xxxxxx
      # api interface url, optional, available when OPENAI_API_KEY is set
      OPENAI_API_BASE_URL: xxxx
      # api model, optional, available when OPENAI_API_KEY is set
      OPENAI_API_MODEL: xxxx
      # reverse proxy, optional
      API_REVERSE_PROXY: xxx
      # access password, optional
      AUTH_SECRET_KEY: xxx
      # timeout, in milliseconds, optional
      TIMEOUT_MS: 60000
      # socks proxy, optional, effective with SOCKS_PROXY_PORT
      SOCKS_PROXY_HOST: xxxx
      # socks proxy port, optional, effective with SOCKS_PROXY_HOST
      SOCKS_PROXY_PORT: xxxx
      # HTTPS proxy, optional, supports http, https, socks5
      HTTPS_PROXY: http://xxx:7890

OPENAI_API_BASE_URL and OPENAI_API_MODEL are optional and only used when OPENAI_API_KEY is set.

Deployment with Railway

Deploy on Railway

Railway Environment Variables

Environment Variable | Required                                  | Description
PORT                 | Required                                  | Default: 3002
AUTH_SECRET_KEY      | Optional                                  | Access password
TIMEOUT_MS           | Optional                                  | Timeout in milliseconds
OPENAI_API_KEY       | Optional                                  | Required for OpenAI API. apiKey can be obtained from here.
OPENAI_ACCESS_TOKEN  | Optional                                  | Required for Web API. accessToken can be obtained from here.
OPENAI_API_BASE_URL  | Optional, only for OpenAI API             | API endpoint.
OPENAI_API_MODEL     | Optional, only for OpenAI API             | API model.
API_REVERSE_PROXY    | Optional, only for Web API                | Reverse proxy address for the Web API. Details
SOCKS_PROXY_HOST     | Optional, effective with SOCKS_PROXY_PORT | Socks proxy.
SOCKS_PROXY_PORT     | Optional, effective with SOCKS_PROXY_HOST | Socks proxy port.
HTTPS_PROXY          | Optional                                  | HTTPS proxy.
ALL_PROXY            | Optional                                  | All proxy.

Note: Changing environment variables in Railway will cause re-deployment.

Manual packaging

Backend service

If you don't need the node interface of this project, you can skip the following steps.

Copy the service folder to a server that has a node service environment.

# Install
pnpm install

# Build
pnpm build

# Run
pnpm prod

PS: You can also run pnpm start directly on the server without packaging.

Frontend webpage

  1. Refer to the .env.example file in the root directory to create a .env file, and change VITE_GLOB_API_URL in that .env file to your actual backend interface address (see the sketch at the end of this section).
  2. Run the following command in the root directory and then copy the files in the dist folder to the root directory of your website service.

Reference information

pnpm build
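
For orientation, here is a minimal sketch of how a Vite frontend typically consumes VITE_GLOB_API_URL at build time via import.meta.env. The request helper is hypothetical and only illustrates the idea, not this project's actual code.

// hypothetical helper: Vite inlines VITE_-prefixed variables at build time
const baseURL = import.meta.env.VITE_GLOB_API_URL || ''

async function request<T>(path: string, body: unknown): Promise<T> {
  const res = await fetch(`${baseURL}${path}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  })
  return res.json() as Promise<T>
}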

Frequently Asked Questions

Q: Why does Git always report an error when committing?

A: Commit messages are validated on commit; please follow the Commit Guidelines.

Q: Where to change the request interface if only the frontend page is used?

A: The VITE_GLOB_API_URL field in the .env file at the root directory.

Q: Why does everything turn red when saving a file?

A: For VS Code, please install the project's recommended extensions, or manually install the ESLint extension.

Q: Why doesn't the frontend have a typewriter effect?

A: One possible reason is that the service sits behind an Nginx reverse proxy with buffering enabled, so Nginx tries to buffer a certain amount of data from the backend before sending it to the browser. Try adding proxy_buffering off; after the reverse proxy parameters, then reload Nginx. Other web servers are configured similarly.

Q: Why is the returned content incomplete?

A: There is a length limit on the content the API returns each time. You can set the VITE_GLOB_OPEN_LONG_REPLY field in the .env file in the root directory to true and rebuild the front-end to enable the long reply feature, which returns the full content. Note that using this feature may incur higher API usage fees.

Contributing

Please read the Contributing Guidelines before contributing.

Thanks to all the contributors!

Sponsorship

If you find this project helpful and circumstances permit, you can give me a little support. Thank you very much for your support~

WeChat Pay

Alipay

License

MIT © ChenZhaoYu