Commit 2619d417 authored by Yi Ding

Fixed the context window issue.

Putting a max tokens value into the request was a mistake.
parent b00821db
import { Document, VectorStoreIndex, RetrieverQueryEngine } from "llamaindex";
import essay from "./essay";

// Customize retrieval and query args
async function main() {
  const document = new Document({ text: essay });
  const index = await VectorStoreIndex.fromDocuments([document]);
  const retriever = index.asRetriever();
  retriever.similarityTopK = 5;
  // TODO: cannot pass responseSynthesizer into retriever query engine
  const queryEngine = new RetrieverQueryEngine(retriever);
  const response = await queryEngine.query(
    "What did the author do growing up?"
  );
  console.log(response.response);
}

main().catch(console.error);
@@ -78,8 +78,7 @@ export class OpenAI implements LLM {
     this.temperature = init?.temperature ?? 0;
     this.requestTimeout = init?.requestTimeout ?? null;
     this.maxRetries = init?.maxRetries ?? 10;
-    this.maxTokens =
-      init?.maxTokens ?? Math.floor(ALL_AVAILABLE_MODELS[this.model] / 2);
+    this.maxTokens = init?.maxTokens ?? undefined;
     this.openAIKey = init?.openAIKey ?? null;
     this.session = init?.session ?? getOpenAISession();
     this.callbackManager = init?.callbackManager;
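With this change, leaving maxTokens unset means no max_tokens value is sent with the request, so completions are no longer capped at half of the model's context window by default. A minimal sketch of both options, assuming the OpenAI constructor exported by llamaindex accepts the same init fields shown in the diff (temperature, maxTokens):

import { OpenAI } from "llamaindex";

// Default after this commit: maxTokens stays undefined, so the request
// carries no max_tokens and the API decides how many tokens to generate.
const llm = new OpenAI({ temperature: 0 });

// Callers who still want a hard cap can opt in explicitly; 256 is a
// hypothetical value chosen only for illustration.
const cappedLlm = new OpenAI({ temperature: 0, maxTokens: 256 });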