Unverified commit 2b39ceff authored by ANKIT VARSHNEY, committed by GitHub

docs: doc for structured output (#1761)

parent 77e24cec
---
"@llamaindex/doc": patch
---
Added documentation for structured output in OpenAI and Ollama
@@ -55,6 +55,35 @@ const results = await queryEngine.query({
});
```
## Using JSON Response Format

You can configure Ollama to return responses in JSON format:

```ts
import { Ollama } from "@llamaindex/llms/ollama";
import { z } from "zod";

// Simple JSON format
const llm = new Ollama({
  model: "llama2",
  temperature: 0,
  responseFormat: { type: "json_object" },
});

// Using a Zod schema for validation
const responseSchema = z.object({
  summary: z.string(),
  topics: z.array(z.string()),
  sentiment: z.enum(["positive", "negative", "neutral"]),
});

const structuredLlm = new Ollama({
  model: "llama2",
  temperature: 0,
  responseFormat: responseSchema,
});
```
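
As a rough sketch of how such a response might be consumed (assuming `structuredLlm` and `responseSchema` from the snippet above, a locally running Ollama server, and the standard `llm.chat()` call; the exact response shape can vary), you could parse and validate the output with the same Zod schema:

```ts
// Hedged sketch: ask the model a question and validate the JSON it returns.
// Assumes `structuredLlm` and `responseSchema` from the snippet above.
const response = await structuredLlm.chat({
  messages: [
    { role: "user", content: "Summarize the latest team meeting notes." },
  ],
});

// The response content is expected to be a JSON string matching the schema;
// the cast is an assumption about the content type for this sketch.
const parsed = responseSchema.parse(
  JSON.parse(response.message.content as string),
);
console.log(parsed.summary, parsed.topics, parsed.sentiment);
```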
## Full Example
```ts
...
```

@@ -46,6 +46,33 @@ or
Settings.llm = new OpenAI({ model: "gpt-3.5-turbo", temperature: 0, apiKey: <YOUR_API_KEY>, baseURL: "https://api.scaleway.ai/v1" });
```
## Using JSON Response Format

You can configure OpenAI to return responses in JSON format:

```ts
import { z } from "zod";

Settings.llm = new OpenAI({
  model: "gpt-4o",
  temperature: 0,
  responseFormat: { type: "json_object" },
});

// You can also use a Zod schema to validate the response structure
const responseSchema = z.object({
  summary: z.string(),
  topics: z.array(z.string()),
  sentiment: z.enum(["positive", "negative", "neutral"]),
});

Settings.llm = new OpenAI({
  model: "gpt-4o",
  temperature: 0,
  responseFormat: responseSchema,
});
```
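
For reference, here is a minimal sketch of consuming the validated output; it assumes the `responseSchema` defined above and the standard `Settings.llm.complete()` call, and the exact response shape may differ by model and library version:

```ts
// Hedged sketch: request JSON output and validate it with the Zod schema above.
const completion = await Settings.llm.complete({
  prompt: "Summarize this article and list its main topics as JSON.",
});

// safeParse reports validation failures instead of throwing.
const result = responseSchema.safeParse(JSON.parse(completion.text));
if (result.success) {
  console.log(result.data.summary, result.data.sentiment);
} else {
  console.error(result.error.issues);
}
```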
## Load and index documents
For this example, we will use a single document. In a real-world scenario, you would have multiple documents to index.
......