Commit 747bbbc7 authored by Marcus Schiesser

fix: set maxTokens to 4096 so vision model is not stopping too early (seems to have a lower default than other models)
parent 962b4bad
@@ -44,6 +44,7 @@ export async function POST(request: NextRequest) {
   const llm = new OpenAI({
     model: MODEL,
+    maxTokens: 4096,
   });
   const chatEngine = await createChatEngine(llm);
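The change passes maxTokens through the llamaindex OpenAI constructor inside the chat route's POST handler. Below is a minimal sketch of how that handler fits together, assuming the LlamaIndex.TS OpenAI class and the template's createChatEngine helper; the import path for the helper, the MODEL constant, and the shape of the chat() call are illustrative assumptions, not part of this commit.

import { NextRequest, NextResponse } from "next/server";
import { OpenAI } from "llamaindex";
import { createChatEngine } from "./engine"; // hypothetical helper path

// Assumed source of MODEL; the template typically reads it from the environment.
const MODEL = process.env.MODEL ?? "gpt-4-vision-preview";

export async function POST(request: NextRequest) {
  const { messages } = await request.json();
  const lastMessage = messages[messages.length - 1];

  const llm = new OpenAI({
    model: MODEL,
    // Vision models seem to default to a smaller completion limit than other
    // models, which cut answers short; raising maxTokens to 4096 avoids that.
    maxTokens: 4096,
  });

  const chatEngine = await createChatEngine(llm);

  // The exact chat() signature varies across llamaindex versions; this mirrors
  // the positional (message, chatHistory) form of early releases.
  const response = await chatEngine.chat(lastMessage.content, messages);

  return NextResponse.json({ result: response.toString() });
}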