Commit fa40b365 authored by Marcus Schiesser, committed by GitHub

docs: cleanup (#1745)

parent da8068e9

---
-title: Structured data extraction tutorial
+title: Structured data extraction
---

import { DynamicCodeBlock } from 'fumadocs-ui/components/dynamic-codeblock';
-import CodeSource from "!raw-loader!../../../../../../../../examples/jsonExtract";
+import CodeSource from "!raw-loader!../../../../../../../examples/jsonExtract";

Make sure you have installed LlamaIndex.TS and have an OpenAI key. If you haven't, check out the [installation](../setup) guide.

-You can use [other LLMs](../../examples/other_llms) via their APIs; if you would prefer to use local models check out our [local LLM example](../../examples/local_llm).
+You can use [other LLMs](/docs/llamaindex/modules/llms) via their APIs; if you would prefer to use local models check out our [local LLM example](./local_llm).

## Set up

...
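The setup prose above defaults to OpenAI but links out to other and local LLM options. As a minimal sketch (assuming the `@llamaindex/openai` package used in this commit, and `@llamaindex/ollama` for the local case, which is illustrative and expects a running Ollama server), swapping the global model looks like this:

```ts
import { Settings } from "llamaindex";
import { openai } from "@llamaindex/openai";

// Default setup: OpenAI, reading OPENAI_API_KEY from the environment
Settings.llm = openai({ model: "gpt-4o" });

// Local alternative (assumption: @llamaindex/ollama installed and the model pulled)
// import { Ollama } from "@llamaindex/ollama";
// Settings.llm = new Ollama({ model: "llama3.1" });
```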
/**
- * This example shows how to use AgentWorkflow as a single agent with tools
+ * This example shows how to use a single agent with a tool
 */
import { openai } from "@llamaindex/openai";
-import { Settings, agent } from "llamaindex";
+import { agent } from "llamaindex";
import { getWeatherTool } from "../agent/utils/tools";

-Settings.llm = openai({
-  model: "gpt-4o",
-});
-async function singleWeatherAgent() {
-  const workflow = agent({
+async function main() {
+  const weatherAgent = agent({
+    llm: openai({
+      model: "gpt-4o",
+    }),
    tools: [getWeatherTool],
    verbose: false,
  });

-  const workflowContext = workflow.run(
-    "What's the weather like in San Francisco?",
-  );
-  const sfResult = await workflowContext;
-  // The weather in San Francisco, CA is currently sunny.
-  console.log(`${JSON.stringify(sfResult, null, 2)}`);
+  // Run the agent and keep the context
+  const context = weatherAgent.run("What's the weather like in San Francisco?");
+  const result = await context;
+  console.log(`${JSON.stringify(result, null, 2)}`);

-  const workflowContext2 = workflow.run("Compare it with California?", {
-    context: workflowContext.data,
+  // Reuse the context from the previous run
+  const caResult = await weatherAgent.run("Compare it with California?", {
+    context: context.data,
  });
-  const caResult = await workflowContext2;
  // Both San Francisco and California are currently experiencing sunny weather.
  console.log(`${JSON.stringify(caResult, null, 2)}`);
}

-singleWeatherAgent().catch((error) => {
+main().catch((error) => {
  console.error("Error:", error);
});
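The updated example imports `getWeatherTool` from a utils module that is not part of this diff. As a rough, hypothetical sketch of what such a tool could look like (the `tool()` helper and `zod` schema are assumptions; the name, schema, and canned response are illustrative, not the actual implementation):

```ts
import { tool } from "llamaindex";
import { z } from "zod";

// Hypothetical stand-in for getWeatherTool; the real helper lives in
// ../agent/utils/tools and may differ.
const getWeatherTool = tool({
  name: "getWeather",
  description: "Get the current weather for a given city",
  parameters: z.object({
    city: z.string().describe("The city to look up"),
  }),
  execute: ({ city }) => `The weather in ${city} is sunny.`,
});
```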
@@ -5,8 +5,8 @@
## Current Features:

- Bedrock support for Amazon Nova models Pro, Lite and Micro
-- Bedrock support for the Anthropic Claude Models [usage](https://ts.llamaindex.ai/modules/llms/available_llms/bedrock) including the latest Sonnet 3.5 v2 and Haiku 3.5
-- Bedrock support for the Meta LLama 2, 3, 3.1 and 3.2 Models [usage](https://ts.llamaindex.ai/modules/llms/available_llms/bedrock)
+- Bedrock support for the Anthropic Claude Models [usage](https://ts.llamaindex.ai/docs/llamaindex/modules/llms/bedrock) including the latest Sonnet 3.5 v2 and Haiku 3.5
+- Bedrock support for the Meta LLama 2, 3, 3.1 and 3.2 Models [usage](https://ts.llamaindex.ai/docs/llamaindex/modules/llms/bedrock)
- Meta LLama3.1 405b and Llama3.2 tool call support
- Meta 3.2 11B and 90B vision support
- Bedrock support for querying Knowledge Base

...
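The changelog entries above cover Bedrock models exposed through `@llamaindex/community`. A minimal sketch of wiring one up, assuming the `Bedrock` class and `BEDROCK_MODELS` enum described on the linked usage page (the exact enum key, region, and credential handling here are assumptions; check the usage docs):

```ts
import { Bedrock, BEDROCK_MODELS } from "@llamaindex/community";
import { Settings } from "llamaindex";

// Assumed enum key for a Claude 3.5 Sonnet model; AWS credentials and region
// are resolved from the standard environment configuration.
Settings.llm = new Bedrock({
  model: BEDROCK_MODELS.ANTHROPIC_CLAUDE_3_5_SONNET,
  region: "us-east-1",
});

const response = await Settings.llm.complete({
  prompt: "Say hello from Bedrock",
});
console.log(response.text);
```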