Merge branch 'Yidadaa:main' into fix-theme

This commit is contained in:
AprilNEA 2023-03-27 15:10:10 +08:00 committed by GitHub
commit d8e4808316
21 changed files with 288 additions and 77 deletions

View File

@@ -7,7 +7,7 @@
One-Click to deploy your own ChatGPT web UI.

-[演示 Demo](https://chat-gpt-next-web.vercel.app/) / [反馈问题 Issues](https://github.com/Yidadaa/ChatGPT-Next-Web/issues)
+[演示 Demo](https://chat-gpt-next-web.vercel.app/) / [反馈 Issues](https://github.com/Yidadaa/ChatGPT-Next-Web/issues) / [加入 Discord](https://discord.gg/zrhvHCr79N) / [微信群](https://user-images.githubusercontent.com/16968934/227772522-b3ba3713-9206-4c8d-a81f-22300b7c313a.jpg) / [打赏开发者](https://user-images.githubusercontent.com/16968934/227772541-5bcd52d8-61b7-488c-a203-0330d8006e2b.jpg)

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FYidadaa%2FChatGPT-Next-Web&env=OPENAI_API_KEY&project-name=chatgpt-next-web&repository-name=ChatGPT-Next-Web)

@@ -84,7 +84,7 @@ You can star or watch this project or follow author to get release notifictions
code1,code2,code3
```

-增加或修改该环境变量后,请重新部署项目使改动生效。
+增加或修改该环境变量后,请**重新部署**项目使改动生效。

This project provides limited access control. Please add an environment variable named `CODE` on the environment variables page. The value should be a custom control code separated by comma like this:

@@ -109,29 +109,36 @@ OPENAI_API_KEY=<your api key here>
```

### 本地开发 Local Development

> 如果你是中国大陆用户,不建议在本地进行开发,除非你能够独立解决 OpenAI API 本地代理问题。

1. 安装 nodejs 和 yarn,具体细节请询问 ChatGPT;
2. 执行 `yarn install && yarn dev` 即可。

### 本地部署 Local Deployment

请直接询问 ChatGPT,使用下列 Prompt:

```
如何使用 pm2 和 yarn 部署 nextjs 项目到 ubuntu 服务器上,项目编译命令为 yarn build,启动命令为 yarn start,启动时需要设置环境变量为 OPENAI_API_KEY,端口为 3000,使用 ngnix 做反向代理
```

Please ask ChatGPT with prompt:

```
how to deploy nextjs project with pm2 and yarn on my ubuntu server, the build command is `yarn build`, the start command is `yarn start`, the project must start with env var named `OPENAI_API_KEY`, the port is 3000, use ngnix
```

### Docker Deployment

请直接询问 ChatGPT,使用下列 Prompt:

```
如何使用 docker 部署 nextjs 项目到 ubuntu 服务器上,项目编译命令为 yarn build,启动命令为 yarn start,启动时需要设置环境变量为 OPENAI_API_KEY,端口为 3000,使用 ngnix 做反向代理
```

Please ask ChatGPT with prompt:

```
how to deploy nextjs project with docker on my ubuntu server, the build command is `yarn build`, the start command is `yarn start`, the project must start with env var named `OPENAI_API_KEY`, the port is 3000, use ngnix
```

@@ -143,6 +150,7 @@ how to deploy nextjs project with docker on my ubuntu server, the build command
![更多展示 More](./static/more.png)

## 说明 Attention

本项目的演示地址所用的 OpenAI 账户的免费额度将于 2023-04-01 过期,届时将无法通过演示地址在线体验。

如果你想贡献出自己的 API Key,可以通过作者主页的邮箱发送给作者,并标注过期时间。

@@ -151,6 +159,13 @@ The free trial of the OpenAI account used by the demo will expire on April 1, 20
If you would like to contribute your API key, you can email it to the author and indicate the expiration date of the API key.

+## 鸣谢 Special Thanks
+
+### 捐赠者 Sponsor
+
+[@mushan0x0](https://github.com/mushan0x0)
+[@ClarenceDan](https://github.com/ClarenceDan)

## LICENSE

- [Anti 996 License](https://github.com/kattgu7/Anti-996-License/blob/master/LICENSE_CN_EN)

View File

@@ -2,19 +2,25 @@ import type { ChatRequest } from "../chat/typing";
import { createParser } from "eventsource-parser";
import { NextRequest } from "next/server";

-const apiKey = process.env.OPENAI_API_KEY;
-
-async function createStream(payload: ReadableStream<Uint8Array>) {
+async function createStream(req: NextRequest) {
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

+  let apiKey = process.env.OPENAI_API_KEY;
+
+  const userApiKey = req.headers.get("token");
+  if (userApiKey) {
+    apiKey = userApiKey;
+    console.log("[Stream] using user api key");
+  }
+
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    method: "POST",
-    body: payload,
+    body: req.body,
  });

  const stream = new ReadableStream({
@@ -49,7 +55,7 @@ async function createStream(payload: ReadableStream<Uint8Array>) {
export async function POST(req: NextRequest) {
  try {
-    const stream = await createStream(req.body!);
+    const stream = await createStream(req);
    return new Response(stream);
  } catch (error) {
    console.error("[Chat Stream]", error);
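The key-selection rule this hunk introduces is small enough to isolate. Below is a sketch with a hypothetical `pickApiKey` helper (not part of the codebase): a `token` header supplied by the client, when present, takes precedence over the server-side `OPENAI_API_KEY`.

```typescript
// Hypothetical helper mirroring the selection logic in createStream:
// a user-supplied "token" header overrides the server's OPENAI_API_KEY.
function pickApiKey(
  serverKey: string | undefined,
  userApiKey: string | null
): string | undefined {
  return userApiKey ? userApiKey : serverKey;
}

console.log(pickApiKey("server-key", null)); // "server-key"
console.log(pickApiKey("server-key", "user-key")); // "user-key"
console.log(pickApiKey(undefined, null)); // undefined
```

This is also why the signature changed from `createStream(payload)` to `createStream(req)`: the function now needs the whole request to read the header, not just the body stream.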

View File

@@ -1,7 +1,14 @@
import { OpenAIApi, Configuration } from "openai";
import { ChatRequest } from "./typing";

-const apiKey = process.env.OPENAI_API_KEY;
+export async function POST(req: Request) {
+  try {
+    let apiKey = process.env.OPENAI_API_KEY;
+
+    const userApiKey = req.headers.get("token");
+    if (userApiKey) {
+      apiKey = userApiKey;
+    }

    const openai = new OpenAIApi(
      new Configuration({
@@ -9,14 +16,10 @@ const openai = new OpenAIApi(
      })
    );

-export async function POST(req: Request) {
-  try {
    const requestBody = (await req.json()) as ChatRequest;
-    const completion = await openai!.createChatCompletion(
-      {
-        ...requestBody,
-      }
-    );
+    const completion = await openai!.createChatCompletion({
+      ...requestBody,
+    });

    return new Response(JSON.stringify(completion.data));
  } catch (e) {

View File

@@ -27,6 +27,7 @@ import Locale from "../locales";
import dynamic from "next/dynamic";
import { REPO_URL } from "../constant";
+import { ControllerPool } from "../requests";

export function Loading(props: { noLogo?: boolean }) {
  return (
@@ -146,28 +147,67 @@ function useSubmitHandler() {
export function Chat(props: { showSideBar?: () => void }) {
  type RenderMessage = Message & { preview?: boolean };

-  const session = useChatStore((state) => state.currentSession());
+  const [session, sessionIndex] = useChatStore((state) => [
+    state.currentSession(),
+    state.currentSessionIndex,
+  ]);
  const [userInput, setUserInput] = useState("");
  const [isLoading, setIsLoading] = useState(false);
  const { submitKey, shouldSubmit } = useSubmitHandler();

  const onUserInput = useChatStore((state) => state.onUserInput);

+  // submit user input
  const onUserSubmit = () => {
    if (userInput.length <= 0) return;
    setIsLoading(true);
    onUserInput(userInput).then(() => setIsLoading(false));
    setUserInput("");
  };

+  // stop response
+  const onUserStop = (messageIndex: number) => {
+    console.log(ControllerPool, sessionIndex, messageIndex);
+    ControllerPool.stop(sessionIndex, messageIndex);
+  };
+
+  // check if should send message
  const onInputKeyDown = (e: KeyboardEvent) => {
    if (shouldSubmit(e)) {
      onUserSubmit();
      e.preventDefault();
    }
  };

+  const onRightClick = (e: any, message: Message) => {
+    // auto fill user input
+    if (message.role === "user") {
+      setUserInput(message.content);
+    }
+
+    // copy to clipboard
+    if (selectOrCopy(e.currentTarget, message.content)) {
+      e.preventDefault();
+    }
+  };
+
+  const onResend = (botIndex: number) => {
+    // find last user input message and resend
+    for (let i = botIndex; i >= 0; i -= 1) {
+      if (messages[i].role === "user") {
+        setIsLoading(true);
+        onUserInput(messages[i].content).then(() => setIsLoading(false));
+        return;
+      }
+    }
+  };
+
  // for auto-scroll
  const latestMessageRef = useRef<HTMLDivElement>(null);
-  const [hoveringMessage, setHoveringMessage] = useState(false);
+  // wont scroll while hovering messages
+  const [autoScroll, setAutoScroll] = useState(false);

+  // preview messages
  const messages = (session.messages as RenderMessage[])
    .concat(
      isLoading
@@ -194,10 +234,11 @@ export function Chat(props: { showSideBar?: () => void }) {
      : []
    );

+  // auto scroll
  useLayoutEffect(() => {
    setTimeout(() => {
      const dom = latestMessageRef.current;
-      if (dom && !isIOS() && !hoveringMessage) {
+      if (dom && !isIOS() && autoScroll) {
        dom.scrollIntoView({
          behavior: "smooth",
          block: "end",
@@ -252,15 +293,7 @@ export function Chat(props: { showSideBar?: () => void }) {
        </div>
      </div>

-      <div
-        className={styles["chat-body"]}
-        onMouseOver={() => {
-          setHoveringMessage(true);
-        }}
-        onMouseOut={() => {
-          setHoveringMessage(false);
-        }}
-      >
+      <div className={styles["chat-body"]}>
        {messages.map((message, i) => {
          const isUser = message.role === "user";
@@ -283,13 +316,20 @@ export function Chat(props: { showSideBar?: () => void }) {
              <div className={styles["chat-message-item"]}>
                {!isUser && (
                  <div className={styles["chat-message-top-actions"]}>
-                    {message.streaming && (
+                    {message.streaming ? (
                      <div
                        className={styles["chat-message-top-action"]}
-                        onClick={() => showToast(Locale.WIP)}
+                        onClick={() => onUserStop(i)}
                      >
                        {Locale.Chat.Actions.Stop}
                      </div>
+                    ) : (
+                      <div
+                        className={styles["chat-message-top-action"]}
+                        onClick={() => onResend(i)}
+                      >
+                        {Locale.Chat.Actions.Retry}
+                      </div>
                    )}

                    <div
@@ -306,11 +346,7 @@ export function Chat(props: { showSideBar?: () => void }) {
                ) : (
                  <div
                    className="markdown-body"
-                    onContextMenu={(e) => {
-                      if (selectOrCopy(e.currentTarget, message.content)) {
-                        e.preventDefault();
-                      }
-                    }}
+                    onContextMenu={(e) => onRightClick(e, message)}
                  >
                    <Markdown content={message.content} />
                  </div>
@@ -341,6 +377,9 @@ export function Chat(props: { showSideBar?: () => void }) {
            onInput={(e) => setUserInput(e.currentTarget.value)}
            value={userInput}
            onKeyDown={(e) => onInputKeyDown(e as any)}
+            onFocus={() => setAutoScroll(true)}
+            onBlur={() => setAutoScroll(false)}
+            autoFocus
          />
          <IconButton
            icon={<SendWhiteIcon />}

View File

@@ -4,15 +4,36 @@ import RemarkMath from "remark-math";
import RehypeKatex from "rehype-katex";
import RemarkGfm from "remark-gfm";
import RehypePrsim from "rehype-prism-plus";
+import { useRef } from "react";
+import { copyToClipboard } from "../utils";

+export function PreCode(props: { children: any }) {
+  const ref = useRef<HTMLPreElement>(null);
+
+  return (
+    <pre ref={ref}>
+      <span
+        className="copy-code-button"
+        onClick={() => {
+          if (ref.current) {
+            const code = ref.current.innerText;
+            copyToClipboard(code);
+          }
+        }}
+      ></span>
+      {props.children}
+    </pre>
+  );
+}

export function Markdown(props: { content: string }) {
  return (
    <ReactMarkdown
      remarkPlugins={[RemarkMath, RemarkGfm]}
-      rehypePlugins={[
-        RehypeKatex,
-        [RehypePrsim, { ignoreMissing: true }],
-      ]}
+      rehypePlugins={[RehypeKatex, [RehypePrsim, { ignoreMissing: true }]]}
+      components={{
+        pre: PreCode,
+      }}
    >
      {props.content}
    </ReactMarkdown>

View File

@@ -257,6 +257,20 @@ export function Settings(props: { closeSettings: () => void }) {
          <></>
        )}

+        <SettingItem
+          title={Locale.Settings.Token.Title}
+          subTitle={Locale.Settings.Token.SubTitle}
+        >
+          <input
+            value={accessStore.token}
+            type="text"
+            placeholder={Locale.Settings.Token.Placeholder}
+            onChange={(e) => {
+              accessStore.updateToken(e.currentTarget.value);
+            }}
+          ></input>
+        </SettingItem>

        <SettingItem
          title={Locale.Settings.HistoryCount.Title}
          subTitle={Locale.Settings.HistoryCount.SubTitle}

View File

@@ -14,6 +14,7 @@ const cn = {
    Export: "导出聊天记录",
    Copy: "复制",
    Stop: "停止",
+    Retry: "重试",
  },
  Typing: "正在输入…",
  Input: (submitKey: string) => `输入消息,${submitKey} 发送`,
@@ -68,6 +69,11 @@ const cn = {
      Title: "历史消息长度压缩阈值",
      SubTitle: "当未压缩的历史消息超过该值时,将进行压缩",
    },
+    Token: {
+      Title: "API Key",
+      SubTitle: "使用自己的 Key 可绕过受控访问限制",
+      Placeholder: "OpenAI API Key",
+    },
    AccessCode: {
      Title: "访问码",
      SubTitle: "现在是受控访问状态",

View File

@@ -17,6 +17,7 @@ const en: LocaleType = {
    Export: "Export All Messages as Markdown",
    Copy: "Copy",
    Stop: "Stop",
+    Retry: "Retry",
  },
  Typing: "Typing…",
  Input: (submitKey: string) =>
@@ -73,6 +74,11 @@ const en: LocaleType = {
      SubTitle:
        "Will compress if uncompressed messages length exceeds the value",
    },
+    Token: {
+      Title: "API Key",
+      SubTitle: "Use your key to ignore access code limit",
+      Placeholder: "OpenAI API Key",
+    },
    AccessCode: {
      Title: "Access Code",
      SubTitle: "Access control enabled",

View File

@@ -1,5 +1,5 @@
import { Analytics } from "@vercel/analytics/react";
-import { Home } from './components/home'
+import { Home } from "./components/home";

export default function App() {
  return (

View File

@@ -35,6 +35,10 @@ function getHeaders() {
    headers["access-code"] = accessStore.accessCode;
  }

+  if (accessStore.token && accessStore.token.length > 0) {
+    headers["token"] = accessStore.token;
+  }
+
  return headers;
}
@@ -60,6 +64,7 @@ export async function requestChatStream(
    modelConfig?: ModelConfig;
    onMessage: (message: string, done: boolean) => void;
    onError: (error: Error) => void;
+    onController?: (controller: AbortController) => void;
  }
) {
  const req = makeRequestParam(messages, {
@@ -96,12 +101,12 @@ export async function requestChatStream(
    controller.abort();
  };

-  console.log(res);

  if (res.ok) {
    const reader = res.body?.getReader();
    const decoder = new TextDecoder();

+    options?.onController?.(controller);

    while (true) {
      // handle time out, will stop if no response in 10 secs
      const resTimeoutId = setTimeout(() => finish(), TIME_OUT_MS);
@@ -146,3 +151,34 @@ export async function requestWithPrompt(messages: Message[], prompt: string) {
  return res.choices.at(0)?.message?.content ?? "";
}

+// To store message streaming controller
+export const ControllerPool = {
+  controllers: {} as Record<string, AbortController>,
+
+  addController(
+    sessionIndex: number,
+    messageIndex: number,
+    controller: AbortController
+  ) {
+    const key = this.key(sessionIndex, messageIndex);
+    this.controllers[key] = controller;
+    return key;
+  },
+
+  stop(sessionIndex: number, messageIndex: number) {
+    const key = this.key(sessionIndex, messageIndex);
+    const controller = this.controllers[key];
+    console.log(controller);
+    controller?.abort();
+  },
+
+  remove(sessionIndex: number, messageIndex: number) {
+    const key = this.key(sessionIndex, messageIndex);
+    delete this.controllers[key];
+  },
+
+  key(sessionIndex: number, messageIndex: number) {
+    return `${sessionIndex},${messageIndex}`;
+  },
+};
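The new `ControllerPool` can be exercised in isolation. This sketch re-declares the same object verbatim (so it runs standalone, outside the app) and walks through the add → stop → remove lifecycle that backs the Stop button:

```typescript
// Standalone copy of ControllerPool from app/requests.ts, keyed by
// "sessionIndex,messageIndex" so each streaming reply can be aborted.
const ControllerPool = {
  controllers: {} as Record<string, AbortController>,

  key(sessionIndex: number, messageIndex: number) {
    return `${sessionIndex},${messageIndex}`;
  },

  addController(
    sessionIndex: number,
    messageIndex: number,
    controller: AbortController
  ) {
    const key = this.key(sessionIndex, messageIndex);
    this.controllers[key] = controller;
    return key;
  },

  stop(sessionIndex: number, messageIndex: number) {
    // abort the in-flight fetch, if one is registered for this message
    this.controllers[this.key(sessionIndex, messageIndex)]?.abort();
  },

  remove(sessionIndex: number, messageIndex: number) {
    delete this.controllers[this.key(sessionIndex, messageIndex)];
  },
};

// Register a controller for session 0, message 1, then stop and clean up.
const controller = new AbortController();
ControllerPool.addController(0, 1, controller);
ControllerPool.stop(0, 1);
console.log(controller.signal.aborted); // true
ControllerPool.remove(0, 1);
console.log(Object.keys(ControllerPool.controllers).length); // 0
```

`AbortController` is built into modern Node and browsers, so no imports are needed; in the app, the controller comes from `requestChatStream` via the new `onController` callback.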

View File

@@ -4,7 +4,9 @@ import { queryMeta } from "../utils";

export interface AccessControlStore {
  accessCode: string;
+  token: string;

+  updateToken: (_: string) => void;
  updateCode: (_: string) => void;
  enabledAccessControl: () => boolean;
}
@@ -14,6 +16,7 @@ export const ACCESS_KEY = "access-control";
export const useAccessStore = create<AccessControlStore>()(
  persist(
    (set, get) => ({
+      token: "",
      accessCode: "",
      enabledAccessControl() {
        return queryMeta("access") === "enabled";
@@ -21,6 +24,9 @@ export const useAccessStore = create<AccessControlStore>()(
      updateCode(code: string) {
        set((state) => ({ accessCode: code }));
      },
+      updateToken(token: string) {
+        set((state) => ({ token }));
+      },
    }),
    {
      name: ACCESS_KEY,
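Setting zustand's `create`/`persist` wiring aside, the shape of the new state can be sketched with a plain object (a stand-in for the real store, with no persistence; the key value below is a made-up placeholder):

```typescript
// Plain-object stand-in for useAccessStore (no zustand, no persistence),
// showing the token field and updateToken action added in this hunk.
interface AccessControlState {
  accessCode: string;
  token: string;
}

function createAccessStore() {
  const state: AccessControlState = { accessCode: "", token: "" };
  return {
    get: () => ({ ...state }),
    updateCode(code: string) {
      state.accessCode = code;
    },
    updateToken(token: string) {
      state.token = token;
    },
  };
}

const store = createAccessStore();
store.updateToken("sk-placeholder-user-key"); // hypothetical key value
console.log(store.get().token); // "sk-placeholder-user-key"
```

In the real store, `persist` additionally writes the state to local storage under `ACCESS_KEY`, which is why the user's key survives page reloads.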

View File

@@ -2,7 +2,11 @@ import { create } from "zustand";
import { persist } from "zustand/middleware";
import { type ChatCompletionResponseMessage } from "openai";
-import { requestChatStream, requestWithPrompt } from "../requests";
+import {
+  ControllerPool,
+  requestChatStream,
+  requestWithPrompt,
+} from "../requests";
import { trimTopic } from "../utils";

import Locale from "../locales";
@@ -45,22 +49,24 @@ export interface ChatConfig {
export type ModelConfig = ChatConfig["modelConfig"];

+const ENABLE_GPT4 = true;
+
export const ALL_MODELS = [
  {
    name: "gpt-4",
-    available: false,
+    available: ENABLE_GPT4,
  },
  {
    name: "gpt-4-0314",
-    available: false,
+    available: ENABLE_GPT4,
  },
  {
    name: "gpt-4-32k",
-    available: false,
+    available: ENABLE_GPT4,
  },
  {
    name: "gpt-4-32k-0314",
-    available: false,
+    available: ENABLE_GPT4,
  },
  {
    name: "gpt-3.5-turbo",
@@ -296,6 +302,8 @@ export const useChatStore = create<ChatStore>()(
      // get recent messages
      const recentMessages = get().getMessagesWithMemory();
      const sendMessages = recentMessages.concat(userMessage);
+      const sessionIndex = get().currentSessionIndex;
+      const messageIndex = get().currentSession().messages.length + 1;

      // save user's and bot's message
      get().updateCurrentSession((session) => {
@@ -303,13 +311,16 @@ export const useChatStore = create<ChatStore>()(
        session.messages.push(botMessage);
      });

+      // make request
      console.log("[User Input] ", sendMessages);
      requestChatStream(sendMessages, {
        onMessage(content, done) {
+          // stream response
          if (done) {
            botMessage.streaming = false;
            botMessage.content = content;
            get().onNewMessage(botMessage);
+            ControllerPool.remove(sessionIndex, messageIndex);
          } else {
            botMessage.content = content;
            set(() => ({}));
@@ -319,6 +330,15 @@ export const useChatStore = create<ChatStore>()(
          botMessage.content += "\n\n" + Locale.Store.Error;
          botMessage.streaming = false;
          set(() => ({}));
+          ControllerPool.remove(sessionIndex, messageIndex);
+        },
+        onController(controller) {
+          // collect controller for stop/retry
+          ControllerPool.addController(
+            sessionIndex,
+            messageIndex,
+            controller
+          );
        },
        filterBot: !get().config.sendBotMessages,
        modelConfig: get().config.modelConfig,

View File

@@ -206,3 +206,36 @@ div.math {
    text-decoration: underline;
  }
}

+pre {
+  position: relative;
+
+  &:hover .copy-code-button {
+    pointer-events: all;
+    transform: translateX(0px);
+    opacity: 0.5;
+  }
+
+  .copy-code-button {
+    position: absolute;
+    right: 10px;
+    cursor: pointer;
+    padding: 0px 5px;
+    background-color: var(--black);
+    color: var(--white);
+    border: var(--border-in-light);
+    border-radius: 10px;
+    transform: translateX(10px);
+    pointer-events: none;
+    opacity: 0;
+    transition: all ease 0.3s;
+
+    &:after {
+      content: "copy";
+    }
+
+    &:hover {
+      opacity: 1;
+    }
+  }
+}

View File

@@ -1,4 +1,9 @@
.markdown-body {
+  pre {
+    background: #282a36;
+    color: #f8f8f2;
+  }
+
  code[class*="language-"],
  pre[class*="language-"] {
    color: #f8f8f2;
@@ -116,32 +121,32 @@
  }
}

-@mixin light {
-  .markdown-body pre[class*="language-"] {
-    filter: invert(1) hue-rotate(50deg) brightness(1.3);
-  }
-}
-
-@mixin dark {
-  .markdown-body pre[class*="language-"] {
-    filter: none;
-  }
-}
-
-:root {
-  @include light();
-}
-
-.light {
-  @include light();
-}
-
-.dark {
-  @include dark();
-}
-
-@media (prefers-color-scheme: dark) {
-  :root {
-    @include dark();
-  }
-}
+// @mixin light {
+//   .markdown-body pre[class*="language-"] {
+//     filter: invert(1) hue-rotate(50deg) brightness(1.3);
+//   }
+// }
+
+// @mixin dark {
+//   .markdown-body pre[class*="language-"] {
+//     filter: none;
+//   }
+// }
+
+// :root {
+//   @include light();
+// }
+
+// .light {
+//   @include light();
+// }
+
+// .dark {
+//   @include dark();
+// }
+
+// @media (prefers-color-scheme: dark) {
+//   :root {
+//     @include dark();
+//   }
+// }

View File

@@ -8,13 +8,14 @@ export const config = {
export function middleware(req: NextRequest, res: NextResponse) {
  const accessCode = req.headers.get("access-code");
+  const token = req.headers.get("token");
  const hashedCode = md5.hash(accessCode ?? "").trim();

  console.log("[Auth] allowed hashed codes: ", [...ACCESS_CODES]);
  console.log("[Auth] got access code:", accessCode);
  console.log("[Auth] hashed access code:", hashedCode);

-  if (ACCESS_CODES.size > 0 && !ACCESS_CODES.has(hashedCode)) {
+  if (ACCESS_CODES.size > 0 && !ACCESS_CODES.has(hashedCode) && !token) {
    return NextResponse.json(
      {
        needAccessCode: true,

Binary file not shown. (Before: 728 B · After: 12 KiB)

Binary file not shown. (Before: 728 B · After: 24 KiB)

Binary file not shown. (Before: 728 B · After: 11 KiB)

Binary file not shown. (Before: 657 B · After: 633 B)

Binary file not shown. (Before: 728 B · After: 1.5 KiB)

Binary file not shown. (Before: 8.7 KiB · After: 15 KiB)