diff --git a/apps/docs/docs/api/_category_.yml b/apps/docs/docs/api/_category_.yml new file mode 100644 index 0000000000000000000000000000000000000000..79eb4b8a2939888be2393145205ffe6bc5fea186 --- /dev/null +++ b/apps/docs/docs/api/_category_.yml @@ -0,0 +1,2 @@ +label: "API" +position: 6 \ No newline at end of file diff --git a/apps/docs/docs/api/classes/Anthropic.md b/apps/docs/docs/api/classes/Anthropic.md new file mode 100644 index 0000000000000000000000000000000000000000..f9dfd8e6deb6b2ec73753feeaef911fc8d421959 --- /dev/null +++ b/apps/docs/docs/api/classes/Anthropic.md @@ -0,0 +1,193 @@ +--- +id: "Anthropic" +title: "Class: Anthropic" +sidebar_label: "Anthropic" +sidebar_position: 0 +custom_edit_url: null +--- + +Anthropic LLM implementation + +## Implements + +- [`LLM`](../interfaces/LLM.md) + +## Constructors + +### constructor + +• **new Anthropic**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`Anthropic`](Anthropic.md)\> | + +#### Defined in + +[llm/LLM.ts:353](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L353) + +## Properties + +### apiKey + +• `Optional` **apiKey**: `string` = `undefined` + +#### Defined in + +[llm/LLM.ts:346](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L346) + +___ + +### callbackManager + +• `Optional` **callbackManager**: [`CallbackManager`](CallbackManager.md) + +#### Defined in + +[llm/LLM.ts:351](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L351) + +___ + +### maxRetries + +• **maxRetries**: `number` + +#### Defined in + +[llm/LLM.ts:347](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L347) + +___ + +### maxTokens + +• `Optional` **maxTokens**: `number` + +#### Defined in + +[llm/LLM.ts:343](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L343) + +___ + +### model + +• **model**: `string` + +#### Defined in 
+ +[llm/LLM.ts:340](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L340) + +___ + +### session + +• **session**: `AnthropicSession` + +#### Defined in + +[llm/LLM.ts:349](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L349) + +___ + +### temperature + +• **temperature**: `number` + +#### Defined in + +[llm/LLM.ts:341](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L341) + +___ + +### timeout + +• `Optional` **timeout**: `number` + +#### Defined in + +[llm/LLM.ts:348](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L348) + +___ + +### topP + +• **topP**: `number` + +#### Defined in + +[llm/LLM.ts:342](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L342) + +## Methods + +### chat + +▸ **chat**(`messages`, `parentEvent?`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +Get a chat response from the LLM + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Implementation of + +[LLM](../interfaces/LLM.md).[chat](../interfaces/LLM.md#chat) + +#### Defined in + +[llm/LLM.ts:388](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L388) + +___ + +### complete + +▸ **complete**(`prompt`, `parentEvent?`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +Get a prompt completion from the LLM + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `prompt` | `string` | the prompt to complete | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | - | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Implementation of + 
+[LLM](../interfaces/LLM.md).[complete](../interfaces/LLM.md#complete) + +#### Defined in + +[llm/LLM.ts:406](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L406) + +___ + +### mapMessagesToPrompt + +▸ **mapMessagesToPrompt**(`messages`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | + +#### Returns + +`string` + +#### Defined in + +[llm/LLM.ts:373](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L373) diff --git a/apps/docs/docs/api/classes/BaseEmbedding.md b/apps/docs/docs/api/classes/BaseEmbedding.md new file mode 100644 index 0000000000000000000000000000000000000000..1f3d55164d57a6c8908eecf88d4d912645fde3d6 --- /dev/null +++ b/apps/docs/docs/api/classes/BaseEmbedding.md @@ -0,0 +1,81 @@ +--- +id: "BaseEmbedding" +title: "Class: BaseEmbedding" +sidebar_label: "BaseEmbedding" +sidebar_position: 0 +custom_edit_url: null +--- + +## Hierarchy + +- **`BaseEmbedding`** + + ↳ [`OpenAIEmbedding`](OpenAIEmbedding.md) + +## Constructors + +### constructor + +• **new BaseEmbedding**() + +## Methods + +### getQueryEmbedding + +▸ `Abstract` **getQueryEmbedding**(`query`): `Promise`<`number`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | + +#### Returns + +`Promise`<`number`[]\> + +#### Defined in + +[Embedding.ts:206](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L206) + +___ + +### getTextEmbedding + +▸ `Abstract` **getTextEmbedding**(`text`): `Promise`<`number`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `text` | `string` | + +#### Returns + +`Promise`<`number`[]\> + +#### Defined in + +[Embedding.ts:205](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L205) + +___ + +### similarity + +▸ **similarity**(`embedding1`, `embedding2`, `mode?`): `number` + +#### Parameters + +| Name | Type 
| Default value | +| :------ | :------ | :------ | +| `embedding1` | `number`[] | `undefined` | +| `embedding2` | `number`[] | `undefined` | +| `mode` | [`SimilarityType`](../enums/SimilarityType.md) | `SimilarityType.DEFAULT` | + +#### Returns + +`number` + +#### Defined in + +[Embedding.ts:197](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L197) diff --git a/apps/docs/docs/api/classes/BaseIndex.md b/apps/docs/docs/api/classes/BaseIndex.md new file mode 100644 index 0000000000000000000000000000000000000000..55948ba06ec1f6995587c3fede307bb77f4856ae --- /dev/null +++ b/apps/docs/docs/api/classes/BaseIndex.md @@ -0,0 +1,153 @@ +--- +id: "BaseIndex" +title: "Class: BaseIndex<T>" +sidebar_label: "BaseIndex" +sidebar_position: 0 +custom_edit_url: null +--- + +Indexes are the data structure that we store our nodes and embeddings in so +they can be retrieved for our queries. + +## Type parameters + +| Name | +| :------ | +| `T` | + +## Hierarchy + +- **`BaseIndex`** + + ↳ [`ListIndex`](ListIndex.md) + + ↳ [`VectorStoreIndex`](VectorStoreIndex.md) + +## Constructors + +### constructor + +• **new BaseIndex**<`T`\>(`init`) + +#### Type parameters + +| Name | +| :------ | +| `T` | + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | [`BaseIndexInit`](../interfaces/BaseIndexInit.md)<`T`\> | + +#### Defined in + +[indices/BaseIndex.ts:122](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L122) + +## Properties + +### docStore + +• **docStore**: `BaseDocumentStore` + +#### Defined in + +[indices/BaseIndex.ts:117](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L117) + +___ + +### indexStore + +• `Optional` **indexStore**: `BaseIndexStore` + +#### Defined in + +[indices/BaseIndex.ts:119](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L119) + +___ + +### indexStruct + +• **indexStruct**: `T` + 
+#### Defined in + +[indices/BaseIndex.ts:120](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L120) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[indices/BaseIndex.ts:115](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L115) + +___ + +### storageContext + +• **storageContext**: [`StorageContext`](../interfaces/StorageContext.md) + +#### Defined in + +[indices/BaseIndex.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L116) + +___ + +### vectorStore + +• `Optional` **vectorStore**: [`VectorStore`](../interfaces/VectorStore.md) + +#### Defined in + +[indices/BaseIndex.ts:118](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L118) + +## Methods + +### asQueryEngine + +▸ `Abstract` **asQueryEngine**(`options?`): [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +Create a new query engine from the index. It will also create a retriever +and response synthesizer if they are not provided. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `options?` | `Object` | you can supply your own custom Retriever and ResponseSynthesizer | +| `options.responseSynthesizer?` | [`ResponseSynthesizer`](ResponseSynthesizer.md) | - | +| `options.retriever?` | [`BaseRetriever`](../interfaces/BaseRetriever.md) | - | + +#### Returns + +[`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +#### Defined in + +[indices/BaseIndex.ts:142](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L142) + +___ + +### asRetriever + +▸ `Abstract` **asRetriever**(`options?`): [`BaseRetriever`](../interfaces/BaseRetriever.md) + +Create a new retriever from the index.
+ +#### Parameters + +| Name | Type | +| :------ | :------ | +| `options?` | `any` | + +#### Returns + +[`BaseRetriever`](../interfaces/BaseRetriever.md) + +#### Defined in + +[indices/BaseIndex.ts:135](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L135) diff --git a/apps/docs/docs/api/classes/BaseNode.md b/apps/docs/docs/api/classes/BaseNode.md new file mode 100644 index 0000000000000000000000000000000000000000..0a1b1298624c84cc3e60616168bcbe37cc0e5e6f --- /dev/null +++ b/apps/docs/docs/api/classes/BaseNode.md @@ -0,0 +1,287 @@ +--- +id: "BaseNode" +title: "Class: BaseNode" +sidebar_label: "BaseNode" +sidebar_position: 0 +custom_edit_url: null +--- + +Generic abstract class for retrievable nodes + +## Hierarchy + +- **`BaseNode`** + + ↳ [`TextNode`](TextNode.md) + +## Constructors + +### constructor + +• **new BaseNode**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`BaseNode`](BaseNode.md)\> | + +#### Defined in + +[Node.ts:48](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L48) + +## Properties + +### embedding + +• `Optional` **embedding**: `number`[] + +#### Defined in + +[Node.ts:39](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L39) + +___ + +### excludedEmbedMetadataKeys + +• **excludedEmbedMetadataKeys**: `string`[] = `[]` + +#### Defined in + +[Node.ts:43](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L43) + +___ + +### excludedLlmMetadataKeys + +• **excludedLlmMetadataKeys**: `string`[] = `[]` + +#### Defined in + +[Node.ts:44](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L44) + +___ + +### hash + +• **hash**: `string` = `""` + +#### Defined in + +[Node.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L46) + +___ + +### id\_ + +• **id\_**: `string` + +#### Defined in + 
+[Node.ts:38](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L38) + +___ + +### metadata + +• **metadata**: `Record`<`string`, `any`\> = `{}` + +#### Defined in + +[Node.ts:42](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L42) + +___ + +### relationships + +• **relationships**: `Partial`<`Record`<[`NodeRelationship`](../enums/NodeRelationship.md), [`RelatedNodeType`](../modules.md#relatednodetype)\>\> = `{}` + +#### Defined in + +[Node.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L45) + +## Accessors + +### childNodes + +• `get` **childNodes**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Defined in + +[Node.ts:104](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L104) + +___ + +### nextNode + +• `get` **nextNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Defined in + +[Node.ts:84](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L84) + +___ + +### nodeId + +• `get` **nodeId**(): `string` + +#### Returns + +`string` + +#### Defined in + +[Node.ts:58](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L58) + +___ + +### parentNode + +• `get` **parentNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Defined in + +[Node.ts:94](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L94) + +___ + +### prevNode + +• `get` **prevNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) 
+ +#### Defined in + +[Node.ts:72](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L72) + +___ + +### sourceNode + +• `get` **sourceNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Defined in + +[Node.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L62) + +## Methods + +### asRelatedNodeInfo + +▸ **asRelatedNodeInfo**(): [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +[`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Defined in + +[Node.ts:124](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L124) + +___ + +### getContent + +▸ `Abstract` **getContent**(`metadataMode`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | + +#### Returns + +`string` + +#### Defined in + +[Node.ts:54](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L54) + +___ + +### getEmbedding + +▸ **getEmbedding**(): `number`[] + +#### Returns + +`number`[] + +#### Defined in + +[Node.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L116) + +___ + +### getMetadataStr + +▸ `Abstract` **getMetadataStr**(`metadataMode`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | + +#### Returns + +`string` + +#### Defined in + +[Node.ts:55](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L55) + +___ + +### getType + +▸ `Abstract` **getType**(): [`ObjectType`](../enums/ObjectType.md) + +#### Returns + +[`ObjectType`](../enums/ObjectType.md) + +#### Defined in + +[Node.ts:52](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L52) + +___ + +### setContent + +▸ 
`Abstract` **setContent**(`value`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `value` | `any` | + +#### Returns + +`void` + +#### Defined in + +[Node.ts:56](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L56) diff --git a/apps/docs/docs/api/classes/CallbackManager.md b/apps/docs/docs/api/classes/CallbackManager.md new file mode 100644 index 0000000000000000000000000000000000000000..1dd090e28437b66ba10e3ad7a7ba8979ab448c6c --- /dev/null +++ b/apps/docs/docs/api/classes/CallbackManager.md @@ -0,0 +1,83 @@ +--- +id: "CallbackManager" +title: "Class: CallbackManager" +sidebar_label: "CallbackManager" +sidebar_position: 0 +custom_edit_url: null +--- + +## Implements + +- `CallbackManagerMethods` + +## Constructors + +### constructor + +• **new CallbackManager**(`handlers?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `handlers?` | `CallbackManagerMethods` | + +#### Defined in + +[callbacks/CallbackManager.ts:67](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L67) + +## Properties + +### onLLMStream + +• `Optional` **onLLMStream**: (`params`: [`StreamCallbackResponse`](../interfaces/StreamCallbackResponse.md)) => `void` \| `Promise`<`void`\> + +#### Type declaration + +▸ (`params`): `void` \| `Promise`<`void`\> + +##### Parameters + +| Name | Type | +| :------ | :------ | +| `params` | [`StreamCallbackResponse`](../interfaces/StreamCallbackResponse.md) | + +##### Returns + +`void` \| `Promise`<`void`\> + +#### Implementation of + +CallbackManagerMethods.onLLMStream + +#### Defined in + +[callbacks/CallbackManager.ts:64](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L64) + +___ + +### onRetrieve + +• `Optional` **onRetrieve**: (`params`: [`RetrievalCallbackResponse`](../interfaces/RetrievalCallbackResponse.md)) => `void` \| `Promise`<`void`\> + +#### Type declaration + +▸ (`params`): `void` 
\| `Promise`<`void`\> + +##### Parameters + +| Name | Type | +| :------ | :------ | +| `params` | [`RetrievalCallbackResponse`](../interfaces/RetrievalCallbackResponse.md) | + +##### Returns + +`void` \| `Promise`<`void`\> + +#### Implementation of + +CallbackManagerMethods.onRetrieve + +#### Defined in + +[callbacks/CallbackManager.ts:65](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L65) diff --git a/apps/docs/docs/api/classes/CompactAndRefine.md b/apps/docs/docs/api/classes/CompactAndRefine.md new file mode 100644 index 0000000000000000000000000000000000000000..2d53f611f90e088e5d0907aa90536ef261a7a81b --- /dev/null +++ b/apps/docs/docs/api/classes/CompactAndRefine.md @@ -0,0 +1,106 @@ +--- +id: "CompactAndRefine" +title: "Class: CompactAndRefine" +sidebar_label: "CompactAndRefine" +sidebar_position: 0 +custom_edit_url: null +--- + +CompactAndRefine is a slight variation of Refine that first compacts the text chunks into the smallest possible number of chunks. 
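The compaction idea can be pictured as a greedy packing pass over the retrieved text chunks. The sketch below is a hypothetical illustration only, not the library's implementation: `compactChunks` and the character budget `maxChars` are invented names for this example (the real class presumably packs against the prompt's token limits via its ServiceContext).

```typescript
// Hypothetical sketch of the compaction step: greedily pack text chunks
// into the fewest combined chunks that fit under a size budget, so the
// subsequent Refine pass needs fewer LLM calls.
function compactChunks(chunks: string[], maxChars: number): string[] {
  const compacted: string[] = [];
  let current = "";
  for (const chunk of chunks) {
    // Start a new compacted chunk when appending this one would overflow
    // the budget (the "\n\n" separator counts toward the size).
    if (current && current.length + chunk.length + 2 > maxChars) {
      compacted.push(current);
      current = chunk;
    } else {
      current = current ? `${current}\n\n${chunk}` : chunk;
    }
  }
  if (current) compacted.push(current);
  return compacted;
}
```

With three 10-character chunks and a 25-character budget, the first two are packed together and the third starts a new chunk.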
+ +## Hierarchy + +- [`Refine`](Refine.md) + + ↳ **`CompactAndRefine`** + +## Constructors + +### constructor + +• **new CompactAndRefine**(`serviceContext`, `textQATemplate?`, `refineTemplate?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `serviceContext` | [`ServiceContext`](../interfaces/ServiceContext.md) | +| `textQATemplate?` | [`SimplePrompt`](../modules.md#simpleprompt) | +| `refineTemplate?` | [`SimplePrompt`](../modules.md#simpleprompt) | + +#### Inherited from + +[Refine](Refine.md).[constructor](Refine.md#constructor) + +#### Defined in + +[ResponseSynthesizer.ts:78](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L78) + +## Properties + +### refineTemplate + +• **refineTemplate**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Inherited from + +[Refine](Refine.md).[refineTemplate](Refine.md#refinetemplate) + +#### Defined in + +[ResponseSynthesizer.ts:76](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L76) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Inherited from + +[Refine](Refine.md).[serviceContext](Refine.md#servicecontext) + +#### Defined in + +[ResponseSynthesizer.ts:74](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L74) + +___ + +### textQATemplate + +• **textQATemplate**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Inherited from + +[Refine](Refine.md).[textQATemplate](Refine.md#textqatemplate) + +#### Defined in + +[ResponseSynthesizer.ts:75](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L75) + +## Methods + +### getResponse + +▸ **getResponse**(`query`, `textChunks`, `parentEvent?`, `prevResponse?`): `Promise`<`string`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `textChunks` | `string`[] | +| `parentEvent?` | 
[`Event`](../interfaces/Event.md) | +| `prevResponse?` | `string` | + +#### Returns + +`Promise`<`string`\> + +#### Overrides + +[Refine](Refine.md).[getResponse](Refine.md#getresponse) + +#### Defined in + +[ResponseSynthesizer.ts:181](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L181) diff --git a/apps/docs/docs/api/classes/CondenseQuestionChatEngine.md b/apps/docs/docs/api/classes/CondenseQuestionChatEngine.md new file mode 100644 index 0000000000000000000000000000000000000000..a70faf9cb09c3f1ad6cef1a29731f5f4b65e75a7 --- /dev/null +++ b/apps/docs/docs/api/classes/CondenseQuestionChatEngine.md @@ -0,0 +1,148 @@ +--- +id: "CondenseQuestionChatEngine" +title: "Class: CondenseQuestionChatEngine" +sidebar_label: "CondenseQuestionChatEngine" +sidebar_position: 0 +custom_edit_url: null +--- + +CondenseQuestionChatEngine is used in conjunction with an Index (for example VectorStoreIndex). +It performs two steps on each user chat message: first, it condenses the chat message +with the previous chat history into a standalone question with more context. +Then, it queries the underlying Index using the new question with context and returns +the response. +CondenseQuestionChatEngine performs well when the input consists primarily of questions about the +underlying data. It performs less well when the chat messages are not questions about the +data, or refer heavily to previous context.
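The first (condense) step amounts to folding the chat history and the follow-up message into a single standalone-question prompt for the LLM. The sketch below is a hypothetical illustration of that idea, not the engine's actual `condenseMessagePrompt`: the function name and the prompt wording are invented for this example.

```typescript
// Invented types and prompt text, for illustration only.
interface ChatMessageSketch {
  role: "user" | "assistant";
  content: string;
}

// Build a prompt that asks the LLM to rewrite a follow-up question as a
// standalone question, using the prior conversation as context.
function buildCondensePrompt(
  chatHistory: ChatMessageSketch[],
  question: string
): string {
  const history = chatHistory
    .map((m) => `${m.role === "user" ? "Human" : "Assistant"}: ${m.content}`)
    .join("\n");
  return (
    `Given the following conversation and a follow up question, ` +
    `rephrase the follow up question to be a standalone question.\n\n` +
    `Chat history:\n${history}\n\n` +
    `Follow up question: ${question}\n` +
    `Standalone question:`
  );
}
```

The condensed question the LLM returns is then sent to the underlying query engine in place of the raw chat message.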
+ +## Implements + +- [`ChatEngine`](../interfaces/ChatEngine.md) + +## Constructors + +### constructor + +• **new CondenseQuestionChatEngine**(`init`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | `Object` | +| `init.chatHistory` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | +| `init.condenseMessagePrompt?` | [`SimplePrompt`](../modules.md#simpleprompt) | +| `init.queryEngine` | [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) | +| `init.serviceContext?` | [`ServiceContext`](../interfaces/ServiceContext.md) | + +#### Defined in + +[ChatEngine.ts:75](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L75) + +## Properties + +### chatHistory + +• **chatHistory**: [`ChatMessage`](../interfaces/ChatMessage.md)[] + +#### Defined in + +[ChatEngine.ts:71](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L71) + +___ + +### condenseMessagePrompt + +• **condenseMessagePrompt**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Defined in + +[ChatEngine.ts:73](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L73) + +___ + +### queryEngine + +• **queryEngine**: [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +#### Defined in + +[ChatEngine.ts:70](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L70) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[ChatEngine.ts:72](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L72) + +## Methods + +### chat + +▸ **chat**(`message`, `chatHistory?`): `Promise`<[`Response`](Response.md)\> + +Send message along with the class's current chat history to the LLM. 
+ +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `message` | `string` | | +| `chatHistory?` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | optional chat history if you want to customize the chat history | + +#### Returns + +`Promise`<[`Response`](Response.md)\> + +#### Implementation of + +[ChatEngine](../interfaces/ChatEngine.md).[chat](../interfaces/ChatEngine.md#chat) + +#### Defined in + +[ChatEngine.ts:100](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L100) + +___ + +### condenseQuestion + +▸ `Private` **condenseQuestion**(`chatHistory`, `question`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `chatHistory` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | +| `question` | `string` | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Defined in + +[ChatEngine.ts:89](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L89) + +___ + +### reset + +▸ **reset**(): `void` + +Resets the chat history so that it's empty. + +#### Returns + +`void` + +#### Implementation of + +[ChatEngine](../interfaces/ChatEngine.md).[reset](../interfaces/ChatEngine.md#reset) + +#### Defined in + +[ChatEngine.ts:118](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L118) diff --git a/apps/docs/docs/api/classes/ContextChatEngine.md b/apps/docs/docs/api/classes/ContextChatEngine.md new file mode 100644 index 0000000000000000000000000000000000000000..50d41da5d10b8b7e2eac0871b86dce9251e36f66 --- /dev/null +++ b/apps/docs/docs/api/classes/ContextChatEngine.md @@ -0,0 +1,111 @@ +--- +id: "ContextChatEngine" +title: "Class: ContextChatEngine" +sidebar_label: "ContextChatEngine" +sidebar_position: 0 +custom_edit_url: null +--- + +ContextChatEngine uses the Index to get the appropriate context for each query. 
+The context is stored in the system prompt, and the chat history is preserved, +ideally allowing the appropriate context to be surfaced for each query. + +## Implements + +- [`ChatEngine`](../interfaces/ChatEngine.md) + +## Constructors + +### constructor + +• **new ContextChatEngine**(`init`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | `Object` | +| `init.chatHistory?` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | +| `init.chatModel?` | [`OpenAI`](OpenAI.md) | +| `init.retriever` | [`BaseRetriever`](../interfaces/BaseRetriever.md) | + +#### Defined in + +[ChatEngine.ts:133](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L133) + +## Properties + +### chatHistory + +• **chatHistory**: [`ChatMessage`](../interfaces/ChatMessage.md)[] + +#### Defined in + +[ChatEngine.ts:131](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L131) + +___ + +### chatModel + +• **chatModel**: [`OpenAI`](OpenAI.md) + +#### Defined in + +[ChatEngine.ts:130](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L130) + +___ + +### retriever + +• **retriever**: [`BaseRetriever`](../interfaces/BaseRetriever.md) + +#### Defined in + +[ChatEngine.ts:129](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L129) + +## Methods + +### chat + +▸ **chat**(`message`, `chatHistory?`): `Promise`<[`Response`](Response.md)\> + +Send message along with the class's current chat history to the LLM. 
+ +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `message` | `string` | | +| `chatHistory?` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | optional chat history if you want to customize the chat history | + +#### Returns + +`Promise`<[`Response`](Response.md)\> + +#### Implementation of + +[ChatEngine](../interfaces/ChatEngine.md).[chat](../interfaces/ChatEngine.md#chat) + +#### Defined in + +[ChatEngine.ts:144](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L144) + +___ + +### reset + +▸ **reset**(): `void` + +Resets the chat history so that it's empty. + +#### Returns + +`void` + +#### Implementation of + +[ChatEngine](../interfaces/ChatEngine.md).[reset](../interfaces/ChatEngine.md#reset) + +#### Defined in + +[ChatEngine.ts:182](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L182) diff --git a/apps/docs/docs/api/classes/Document.md b/apps/docs/docs/api/classes/Document.md new file mode 100644 index 0000000000000000000000000000000000000000..dd2aaaff9427042c264b130c2a4689053139e436 --- /dev/null +++ b/apps/docs/docs/api/classes/Document.md @@ -0,0 +1,496 @@ +--- +id: "Document" +title: "Class: Document" +sidebar_label: "Document" +sidebar_position: 0 +custom_edit_url: null +--- + +A document is just a special text node with a docId. 
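As a rough standalone sketch (not the library source — the `*Sketch` class names are invented for this example), the relationship can be pictured as a text node subclass whose `docId` is simply an alias for the node's `id_`:

```typescript
// Minimal stand-in for TextNode: an id plus text content, configurable
// via a Partial init object, mirroring the documented constructor shape.
class TextNodeSketch {
  id_: string = "";
  text: string = "";
  constructor(init?: Partial<TextNodeSketch>) {
    Object.assign(this, init);
  }
}

// A Document is just a TextNode whose docId exposes the node id.
class DocumentSketch extends TextNodeSketch {
  get docId(): string {
    return this.id_;
  }
}
```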
+ +## Hierarchy + +- [`TextNode`](TextNode.md) + + ↳ **`Document`** + +## Constructors + +### constructor + +• **new Document**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`Document`](Document.md)\> | + +#### Overrides + +[TextNode](TextNode.md).[constructor](TextNode.md#constructor) + +#### Defined in + +[Node.ts:216](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L216) + +## Properties + +### embedding + +• `Optional` **embedding**: `number`[] + +#### Inherited from + +[TextNode](TextNode.md).[embedding](TextNode.md#embedding) + +#### Defined in + +[Node.ts:39](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L39) + +___ + +### endCharIdx + +• `Optional` **endCharIdx**: `number` + +#### Inherited from + +[TextNode](TextNode.md).[endCharIdx](TextNode.md#endcharidx) + +#### Defined in + +[Node.ts:139](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L139) + +___ + +### excludedEmbedMetadataKeys + +• **excludedEmbedMetadataKeys**: `string`[] = `[]` + +#### Inherited from + +[TextNode](TextNode.md).[excludedEmbedMetadataKeys](TextNode.md#excludedembedmetadatakeys) + +#### Defined in + +[Node.ts:43](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L43) + +___ + +### excludedLlmMetadataKeys + +• **excludedLlmMetadataKeys**: `string`[] = `[]` + +#### Inherited from + +[TextNode](TextNode.md).[excludedLlmMetadataKeys](TextNode.md#excludedllmmetadatakeys) + +#### Defined in + +[Node.ts:44](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L44) + +___ + +### hash + +• **hash**: `string` = `""` + +#### Inherited from + +[TextNode](TextNode.md).[hash](TextNode.md#hash) + +#### Defined in + +[Node.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L46) + +___ + +### id\_ + +• **id\_**: `string` + +#### Inherited from + 
+[TextNode](TextNode.md).[id_](TextNode.md#id_) + +#### Defined in + +[Node.ts:38](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L38) + +___ + +### metadata + +• **metadata**: `Record`<`string`, `any`\> = `{}` + +#### Inherited from + +[TextNode](TextNode.md).[metadata](TextNode.md#metadata) + +#### Defined in + +[Node.ts:42](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L42) + +___ + +### metadataSeparator + +• **metadataSeparator**: `string` = `"\n"` + +#### Inherited from + +[TextNode](TextNode.md).[metadataSeparator](TextNode.md#metadataseparator) + +#### Defined in + +[Node.ts:142](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L142) + +___ + +### relationships + +• **relationships**: `Partial`<`Record`<[`NodeRelationship`](../enums/NodeRelationship.md), [`RelatedNodeType`](../modules.md#relatednodetype)\>\> = `{}` + +#### Inherited from + +[TextNode](TextNode.md).[relationships](TextNode.md#relationships) + +#### Defined in + +[Node.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L45) + +___ + +### startCharIdx + +• `Optional` **startCharIdx**: `number` + +#### Inherited from + +[TextNode](TextNode.md).[startCharIdx](TextNode.md#startcharidx) + +#### Defined in + +[Node.ts:138](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L138) + +___ + +### text + +• **text**: `string` = `""` + +#### Inherited from + +[TextNode](TextNode.md).[text](TextNode.md#text) + +#### Defined in + +[Node.ts:137](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L137) + +## Accessors + +### childNodes + +• `get` **childNodes**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Inherited from + +TextNode.childNodes + +#### Defined in + 
+[Node.ts:104](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L104) + +___ + +### docId + +• `get` **docId**(): `string` + +#### Returns + +`string` + +#### Defined in + +[Node.ts:225](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L225) + +___ + +### nextNode + +• `get` **nextNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.nextNode + +#### Defined in + +[Node.ts:84](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L84) + +___ + +### nodeId + +• `get` **nodeId**(): `string` + +#### Returns + +`string` + +#### Inherited from + +TextNode.nodeId + +#### Defined in + +[Node.ts:58](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L58) + +___ + +### parentNode + +• `get` **parentNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.parentNode + +#### Defined in + +[Node.ts:94](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L94) + +___ + +### prevNode + +• `get` **prevNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.prevNode + +#### Defined in + +[Node.ts:72](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L72) + +___ + +### sourceNode + +• `get` **sourceNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.sourceNode + +#### Defined in + 
+[Node.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L62) + +## Methods + +### asRelatedNodeInfo + +▸ **asRelatedNodeInfo**(): [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +[`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +[TextNode](TextNode.md).[asRelatedNodeInfo](TextNode.md#asrelatednodeinfo) + +#### Defined in + +[Node.ts:124](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L124) + +___ + +### generateHash + +▸ **generateHash**(): `void` + +#### Returns + +`void` + +#### Inherited from + +[TextNode](TextNode.md).[generateHash](TextNode.md#generatehash) + +#### Defined in + +[Node.ts:149](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L149) + +___ + +### getContent + +▸ **getContent**(`metadataMode?`): `string` + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | `MetadataMode.NONE` | + +#### Returns + +`string` + +#### Inherited from + +[TextNode](TextNode.md).[getContent](TextNode.md#getcontent) + +#### Defined in + +[Node.ts:157](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L157) + +___ + +### getEmbedding + +▸ **getEmbedding**(): `number`[] + +#### Returns + +`number`[] + +#### Inherited from + +[TextNode](TextNode.md).[getEmbedding](TextNode.md#getembedding) + +#### Defined in + +[Node.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L116) + +___ + +### getMetadataStr + +▸ **getMetadataStr**(`metadataMode`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | + +#### Returns + +`string` + +#### Inherited from + +[TextNode](TextNode.md).[getMetadataStr](TextNode.md#getmetadatastr) + +#### Defined in + 
+[Node.ts:162](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L162) + +___ + +### getNodeInfo + +▸ **getNodeInfo**(): `Object` + +#### Returns + +`Object` + +| Name | Type | +| :------ | :------ | +| `end` | `undefined` \| `number` | +| `start` | `undefined` \| `number` | + +#### Inherited from + +[TextNode](TextNode.md).[getNodeInfo](TextNode.md#getnodeinfo) + +#### Defined in + +[Node.ts:187](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L187) + +___ + +### getText + +▸ **getText**(): `string` + +#### Returns + +`string` + +#### Inherited from + +[TextNode](TextNode.md).[getText](TextNode.md#gettext) + +#### Defined in + +[Node.ts:191](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L191) + +___ + +### getType + +▸ **getType**(): [`ObjectType`](../enums/ObjectType.md) + +#### Returns + +[`ObjectType`](../enums/ObjectType.md) + +#### Overrides + +[TextNode](TextNode.md).[getType](TextNode.md#gettype) + +#### Defined in + +[Node.ts:221](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L221) + +___ + +### setContent + +▸ **setContent**(`value`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `value` | `string` | + +#### Returns + +`void` + +#### Inherited from + +[TextNode](TextNode.md).[setContent](TextNode.md#setcontent) + +#### Defined in + +[Node.ts:183](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L183) diff --git a/apps/docs/docs/api/classes/InMemoryFileSystem.md b/apps/docs/docs/api/classes/InMemoryFileSystem.md new file mode 100644 index 0000000000000000000000000000000000000000..3193bb4677f8d3bb3686b3af988aa049f8a7a02c --- /dev/null +++ b/apps/docs/docs/api/classes/InMemoryFileSystem.md @@ -0,0 +1,129 @@ +--- +id: "InMemoryFileSystem" +title: "Class: InMemoryFileSystem" +sidebar_label: "InMemoryFileSystem" +sidebar_position: 0 +custom_edit_url: null +--- + +A filesystem 
implementation that stores files in memory. + +## Implements + +- [`GenericFileSystem`](../interfaces/GenericFileSystem.md) + +## Constructors + +### constructor + +• **new InMemoryFileSystem**() + +## Properties + +### files + +• `Private` **files**: `Record`<`string`, `any`\> = `{}` + +#### Defined in + +[storage/FileSystem.ts:25](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L25) + +## Methods + +### access + +▸ **access**(`path`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | + +#### Returns + +`Promise`<`void`\> + +#### Implementation of + +[GenericFileSystem](../interfaces/GenericFileSystem.md).[access](../interfaces/GenericFileSystem.md#access) + +#### Defined in + +[storage/FileSystem.ts:38](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L38) + +___ + +### mkdir + +▸ **mkdir**(`path`, `options?`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | +| `options?` | `any` | + +#### Returns + +`Promise`<`void`\> + +#### Implementation of + +[GenericFileSystem](../interfaces/GenericFileSystem.md).[mkdir](../interfaces/GenericFileSystem.md#mkdir) + +#### Defined in + +[storage/FileSystem.ts:44](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L44) + +___ + +### readFile + +▸ **readFile**(`path`, `options?`): `Promise`<`string`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | +| `options?` | `any` | + +#### Returns + +`Promise`<`string`\> + +#### Implementation of + +[GenericFileSystem](../interfaces/GenericFileSystem.md).[readFile](../interfaces/GenericFileSystem.md#readfile) + +#### Defined in + +[storage/FileSystem.ts:31](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L31) + +___ + +### writeFile + +▸ **writeFile**(`path`, `content`, 
`options?`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | +| `content` | `string` | +| `options?` | `any` | + +#### Returns + +`Promise`<`void`\> + +#### Implementation of + +[GenericFileSystem](../interfaces/GenericFileSystem.md).[writeFile](../interfaces/GenericFileSystem.md#writefile) + +#### Defined in + +[storage/FileSystem.ts:27](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L27) diff --git a/apps/docs/docs/api/classes/IndexDict.md b/apps/docs/docs/api/classes/IndexDict.md new file mode 100644 index 0000000000000000000000000000000000000000..16610fe99b348e8cbf85f0c9f6029207cf95cf67 --- /dev/null +++ b/apps/docs/docs/api/classes/IndexDict.md @@ -0,0 +1,151 @@ +--- +id: "IndexDict" +title: "Class: IndexDict" +sidebar_label: "IndexDict" +sidebar_position: 0 +custom_edit_url: null +--- + +The underlying structure of each index. + +## Hierarchy + +- [`IndexStruct`](IndexStruct.md) + + ↳ **`IndexDict`** + +## Constructors + +### constructor + +• **new IndexDict**(`indexId?`, `summary?`) + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `indexId` | `string` | `undefined` | +| `summary` | `undefined` | `undefined` | + +#### Inherited from + +[IndexStruct](IndexStruct.md).[constructor](IndexStruct.md#constructor) + +#### Defined in + +[indices/BaseIndex.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L19) + +## Properties + +### docStore + +• **docStore**: `Record`<`string`, [`Document`](Document.md)\> = `{}` + +#### Defined in + +[indices/BaseIndex.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L46) + +___ + +### indexId + +• **indexId**: `string` + +#### Inherited from + +[IndexStruct](IndexStruct.md).[indexId](IndexStruct.md#indexid) + +#### Defined in + 
+[indices/BaseIndex.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L16) + +___ + +### nodesDict + +• **nodesDict**: `Record`<`string`, [`BaseNode`](BaseNode.md)\> = `{}` + +#### Defined in + +[indices/BaseIndex.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L45) + +___ + +### summary + +• `Optional` **summary**: `string` + +#### Inherited from + +[IndexStruct](IndexStruct.md).[summary](IndexStruct.md#summary) + +#### Defined in + +[indices/BaseIndex.ts:17](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L17) + +___ + +### type + +• **type**: [`IndexStructType`](../enums/IndexStructType.md) = `IndexStructType.SIMPLE_DICT` + +#### Defined in + +[indices/BaseIndex.ts:47](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L47) + +## Methods + +### addNode + +▸ **addNode**(`node`, `textId?`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `node` | [`BaseNode`](BaseNode.md) | +| `textId?` | `string` | + +#### Returns + +`void` + +#### Defined in + +[indices/BaseIndex.ts:56](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L56) + +___ + +### getSummary + +▸ **getSummary**(): `string` + +#### Returns + +`string` + +#### Overrides + +[IndexStruct](IndexStruct.md).[getSummary](IndexStruct.md#getsummary) + +#### Defined in + +[indices/BaseIndex.ts:49](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L49) + +___ + +### toJson + +▸ **toJson**(): `Record`<`string`, `unknown`\> + +#### Returns + +`Record`<`string`, `unknown`\> + +#### Overrides + +[IndexStruct](IndexStruct.md).[toJson](IndexStruct.md#tojson) + +#### Defined in + +[indices/BaseIndex.ts:61](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L61) diff --git 
a/apps/docs/docs/api/classes/IndexList.md b/apps/docs/docs/api/classes/IndexList.md new file mode 100644 index 0000000000000000000000000000000000000000..dd881c84013c112674f3fa6971d7a899f19c534f --- /dev/null +++ b/apps/docs/docs/api/classes/IndexList.md @@ -0,0 +1,140 @@ +--- +id: "IndexList" +title: "Class: IndexList" +sidebar_label: "IndexList" +sidebar_position: 0 +custom_edit_url: null +--- + +The underlying structure of each index. + +## Hierarchy + +- [`IndexStruct`](IndexStruct.md) + + ↳ **`IndexList`** + +## Constructors + +### constructor + +• **new IndexList**(`indexId?`, `summary?`) + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `indexId` | `string` | `undefined` | +| `summary` | `undefined` | `undefined` | + +#### Inherited from + +[IndexStruct](IndexStruct.md).[constructor](IndexStruct.md#constructor) + +#### Defined in + +[indices/BaseIndex.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L19) + +## Properties + +### indexId + +• **indexId**: `string` + +#### Inherited from + +[IndexStruct](IndexStruct.md).[indexId](IndexStruct.md#indexid) + +#### Defined in + +[indices/BaseIndex.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L16) + +___ + +### nodes + +• **nodes**: `string`[] = `[]` + +#### Defined in + +[indices/BaseIndex.ts:85](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L85) + +___ + +### summary + +• `Optional` **summary**: `string` + +#### Inherited from + +[IndexStruct](IndexStruct.md).[summary](IndexStruct.md#summary) + +#### Defined in + +[indices/BaseIndex.ts:17](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L17) + +___ + +### type + +• **type**: [`IndexStructType`](../enums/IndexStructType.md) = `IndexStructType.LIST` + +#### Defined in + 
+[indices/BaseIndex.ts:86](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L86) + +## Methods + +### addNode + +▸ **addNode**(`node`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `node` | [`BaseNode`](BaseNode.md) | + +#### Returns + +`void` + +#### Defined in + +[indices/BaseIndex.ts:88](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L88) + +___ + +### getSummary + +▸ **getSummary**(): `string` + +#### Returns + +`string` + +#### Inherited from + +[IndexStruct](IndexStruct.md).[getSummary](IndexStruct.md#getsummary) + +#### Defined in + +[indices/BaseIndex.ts:31](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L31) + +___ + +### toJson + +▸ **toJson**(): `Record`<`string`, `unknown`\> + +#### Returns + +`Record`<`string`, `unknown`\> + +#### Overrides + +[IndexStruct](IndexStruct.md).[toJson](IndexStruct.md#tojson) + +#### Defined in + +[indices/BaseIndex.ts:92](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L92) diff --git a/apps/docs/docs/api/classes/IndexNode.md b/apps/docs/docs/api/classes/IndexNode.md new file mode 100644 index 0000000000000000000000000000000000000000..d47c2ce9d14445e8c19eed1ee5b0cb034f1af8e9 --- /dev/null +++ b/apps/docs/docs/api/classes/IndexNode.md @@ -0,0 +1,492 @@ +--- +id: "IndexNode" +title: "Class: IndexNode" +sidebar_label: "IndexNode" +sidebar_position: 0 +custom_edit_url: null +--- + +TextNode is the default node type for text. 
IndexNode extends [`TextNode`](TextNode.md), adding an `indexId` reference (see the `indexId` property below). + +## Hierarchy + +- [`TextNode`](TextNode.md) + + ↳ **`IndexNode`** + +## Constructors + +### constructor + +• **new IndexNode**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`TextNode`](TextNode.md)\> | + +#### Inherited from + +[TextNode](TextNode.md).[constructor](TextNode.md#constructor) + +#### Defined in + +[Node.ts:144](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L144) + +## Properties + +### embedding + +• `Optional` **embedding**: `number`[] + +#### Inherited from + +[TextNode](TextNode.md).[embedding](TextNode.md#embedding) + +#### Defined in + +[Node.ts:39](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L39) + +___ + +### endCharIdx + +• `Optional` **endCharIdx**: `number` + +#### Inherited from + +[TextNode](TextNode.md).[endCharIdx](TextNode.md#endcharidx) + +#### Defined in + +[Node.ts:139](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L139) + +___ + +### excludedEmbedMetadataKeys + +• **excludedEmbedMetadataKeys**: `string`[] = `[]` + +#### Inherited from + +[TextNode](TextNode.md).[excludedEmbedMetadataKeys](TextNode.md#excludedembedmetadatakeys) + +#### Defined in + +[Node.ts:43](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L43) + +___ + +### excludedLlmMetadataKeys + +• **excludedLlmMetadataKeys**: `string`[] = `[]` + +#### Inherited from + +[TextNode](TextNode.md).[excludedLlmMetadataKeys](TextNode.md#excludedllmmetadatakeys) + +#### Defined in + +[Node.ts:44](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L44) + +___ + +### hash + +• **hash**: `string` = `""` + +#### Inherited from + +[TextNode](TextNode.md).[hash](TextNode.md#hash) + +#### Defined in + +[Node.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L46) + +___ + +### id\_ + +• **id\_**: `string` + +#### 
Inherited from + +[TextNode](TextNode.md).[id_](TextNode.md#id_) + +#### Defined in + +[Node.ts:38](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L38) + +___ + +### indexId + +• **indexId**: `string` = `""` + +#### Defined in + +[Node.ts:205](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L205) + +___ + +### metadata + +• **metadata**: `Record`<`string`, `any`\> = `{}` + +#### Inherited from + +[TextNode](TextNode.md).[metadata](TextNode.md#metadata) + +#### Defined in + +[Node.ts:42](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L42) + +___ + +### metadataSeparator + +• **metadataSeparator**: `string` = `"\n"` + +#### Inherited from + +[TextNode](TextNode.md).[metadataSeparator](TextNode.md#metadataseparator) + +#### Defined in + +[Node.ts:142](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L142) + +___ + +### relationships + +• **relationships**: `Partial`<`Record`<[`NodeRelationship`](../enums/NodeRelationship.md), [`RelatedNodeType`](../modules.md#relatednodetype)\>\> = `{}` + +#### Inherited from + +[TextNode](TextNode.md).[relationships](TextNode.md#relationships) + +#### Defined in + +[Node.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L45) + +___ + +### startCharIdx + +• `Optional` **startCharIdx**: `number` + +#### Inherited from + +[TextNode](TextNode.md).[startCharIdx](TextNode.md#startcharidx) + +#### Defined in + +[Node.ts:138](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L138) + +___ + +### text + +• **text**: `string` = `""` + +#### Inherited from + +[TextNode](TextNode.md).[text](TextNode.md#text) + +#### Defined in + +[Node.ts:137](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L137) + +## Accessors + +### childNodes + +• `get` **childNodes**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### 
Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Inherited from + +TextNode.childNodes + +#### Defined in + +[Node.ts:104](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L104) + +___ + +### nextNode + +• `get` **nextNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.nextNode + +#### Defined in + +[Node.ts:84](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L84) + +___ + +### nodeId + +• `get` **nodeId**(): `string` + +#### Returns + +`string` + +#### Inherited from + +TextNode.nodeId + +#### Defined in + +[Node.ts:58](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L58) + +___ + +### parentNode + +• `get` **parentNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.parentNode + +#### Defined in + +[Node.ts:94](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L94) + +___ + +### prevNode + +• `get` **prevNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.prevNode + +#### Defined in + +[Node.ts:72](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L72) + +___ + +### sourceNode + +• `get` **sourceNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +TextNode.sourceNode + +#### Defined in + +[Node.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L62) + +## Methods + +### 
asRelatedNodeInfo + +▸ **asRelatedNodeInfo**(): [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +[`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +[TextNode](TextNode.md).[asRelatedNodeInfo](TextNode.md#asrelatednodeinfo) + +#### Defined in + +[Node.ts:124](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L124) + +___ + +### generateHash + +▸ **generateHash**(): `void` + +#### Returns + +`void` + +#### Inherited from + +[TextNode](TextNode.md).[generateHash](TextNode.md#generatehash) + +#### Defined in + +[Node.ts:149](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L149) + +___ + +### getContent + +▸ **getContent**(`metadataMode?`): `string` + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | `MetadataMode.NONE` | + +#### Returns + +`string` + +#### Inherited from + +[TextNode](TextNode.md).[getContent](TextNode.md#getcontent) + +#### Defined in + +[Node.ts:157](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L157) + +___ + +### getEmbedding + +▸ **getEmbedding**(): `number`[] + +#### Returns + +`number`[] + +#### Inherited from + +[TextNode](TextNode.md).[getEmbedding](TextNode.md#getembedding) + +#### Defined in + +[Node.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L116) + +___ + +### getMetadataStr + +▸ **getMetadataStr**(`metadataMode`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | + +#### Returns + +`string` + +#### Inherited from + +[TextNode](TextNode.md).[getMetadataStr](TextNode.md#getmetadatastr) + +#### Defined in + +[Node.ts:162](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L162) + +___ + +### getNodeInfo + +▸ **getNodeInfo**(): `Object` + +#### Returns 
+ +`Object` + +| Name | Type | +| :------ | :------ | +| `end` | `undefined` \| `number` | +| `start` | `undefined` \| `number` | + +#### Inherited from + +[TextNode](TextNode.md).[getNodeInfo](TextNode.md#getnodeinfo) + +#### Defined in + +[Node.ts:187](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L187) + +___ + +### getText + +▸ **getText**(): `string` + +#### Returns + +`string` + +#### Inherited from + +[TextNode](TextNode.md).[getText](TextNode.md#gettext) + +#### Defined in + +[Node.ts:191](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L191) + +___ + +### getType + +▸ **getType**(): [`ObjectType`](../enums/ObjectType.md) + +#### Returns + +[`ObjectType`](../enums/ObjectType.md) + +#### Overrides + +[TextNode](TextNode.md).[getType](TextNode.md#gettype) + +#### Defined in + +[Node.ts:207](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L207) + +___ + +### setContent + +▸ **setContent**(`value`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `value` | `string` | + +#### Returns + +`void` + +#### Inherited from + +[TextNode](TextNode.md).[setContent](TextNode.md#setcontent) + +#### Defined in + +[Node.ts:183](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L183) diff --git a/apps/docs/docs/api/classes/IndexStruct.md b/apps/docs/docs/api/classes/IndexStruct.md new file mode 100644 index 0000000000000000000000000000000000000000..01495f0648a23a6eeeb13e846122b930e5056dcb --- /dev/null +++ b/apps/docs/docs/api/classes/IndexStruct.md @@ -0,0 +1,82 @@ +--- +id: "IndexStruct" +title: "Class: IndexStruct" +sidebar_label: "IndexStruct" +sidebar_position: 0 +custom_edit_url: null +--- + +The underlying structure of each index. 
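
The surface documented on this page (an `indexId`, an optional `summary`, plus `getSummary()` and `toJson()`) can be sketched as follows. This is an illustrative stand-in, not the real implementation; in particular, throwing when `summary` is unset is an assumption made for the example.

```typescript
// Sketch of the IndexStruct surface shown on this page: an indexId,
// an optional summary, getSummary(), and toJson().
class IndexStructSketch {
  constructor(
    public indexId: string,
    public summary?: string,
  ) {}

  getSummary(): string {
    // Assumed behavior for illustration: fail loudly when unset.
    if (this.summary === undefined) {
      throw new Error("summary field not set");
    }
    return this.summary;
  }

  toJson(): Record<string, unknown> {
    return { indexId: this.indexId, summary: this.summary };
  }
}

const struct = new IndexStructSketch("idx-1", "Nodes about cats");
console.log(struct.toJson());
```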
+ +## Hierarchy + +- **`IndexStruct`** + + ↳ [`IndexDict`](IndexDict.md) + + ↳ [`IndexList`](IndexList.md) + +## Constructors + +### constructor + +• **new IndexStruct**(`indexId?`, `summary?`) + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `indexId` | `string` | `undefined` | +| `summary` | `undefined` | `undefined` | + +#### Defined in + +[indices/BaseIndex.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L19) + +## Properties + +### indexId + +• **indexId**: `string` + +#### Defined in + +[indices/BaseIndex.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L16) + +___ + +### summary + +• `Optional` **summary**: `string` + +#### Defined in + +[indices/BaseIndex.ts:17](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L17) + +## Methods + +### getSummary + +▸ **getSummary**(): `string` + +#### Returns + +`string` + +#### Defined in + +[indices/BaseIndex.ts:31](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L31) + +___ + +### toJson + +▸ **toJson**(): `Record`<`string`, `unknown`\> + +#### Returns + +`Record`<`string`, `unknown`\> + +#### Defined in + +[indices/BaseIndex.ts:24](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L24) diff --git a/apps/docs/docs/api/classes/LLMQuestionGenerator.md b/apps/docs/docs/api/classes/LLMQuestionGenerator.md new file mode 100644 index 0000000000000000000000000000000000000000..8db114d5b44beff73c3feb4898256298c7d521b2 --- /dev/null +++ b/apps/docs/docs/api/classes/LLMQuestionGenerator.md @@ -0,0 +1,84 @@ +--- +id: "LLMQuestionGenerator" +title: "Class: LLMQuestionGenerator" +sidebar_label: "LLMQuestionGenerator" +sidebar_position: 0 +custom_edit_url: null +--- + +LLMQuestionGenerator uses the LLM to generate new questions for the LLM using tools and a user query. 
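
The contract of `generate` (documented below) takes tool metadata and a query and yields sub-questions, each routed to a tool. The following self-contained sketch mirrors the `ToolMetadata` and `SubQuestion` shapes from these pages, but the pairing logic is a naive stand-in for the actual LLM call:

```typescript
// Shapes mirroring ToolMetadata and SubQuestion from these docs.
interface ToolMetadata {
  name: string;
  description: string;
}

interface SubQuestion {
  subQuestion: string;
  toolName: string;
}

// Stand-in for LLMQuestionGenerator.generate: a real generator prompts
// the LLM to decompose the query; this sketch just pairs the query
// with each tool to show the input/output contract.
async function generate(
  tools: ToolMetadata[],
  query: string,
): Promise<SubQuestion[]> {
  return tools.map((tool) => ({
    subQuestion: `${query} (answer using: ${tool.description})`,
    toolName: tool.name,
  }));
}

generate(
  [{ name: "sept_22", description: "the September 2022 financial report" }],
  "How did revenue change?",
).then((subQuestions) => console.log(subQuestions));
```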
+ +## Implements + +- [`BaseQuestionGenerator`](../interfaces/BaseQuestionGenerator.md) + +## Constructors + +### constructor + +• **new LLMQuestionGenerator**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`LLMQuestionGenerator`](LLMQuestionGenerator.md)\> | + +#### Defined in + +[QuestionGenerator.ts:34](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L34) + +## Properties + +### llm + +• **llm**: [`LLM`](../interfaces/LLM.md) + +#### Defined in + +[QuestionGenerator.ts:30](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L30) + +___ + +### outputParser + +• **outputParser**: [`BaseOutputParser`](../interfaces/BaseOutputParser.md)<[`StructuredOutput`](../interfaces/StructuredOutput.md)<[`SubQuestion`](../interfaces/SubQuestion.md)[]\>\> + +#### Defined in + +[QuestionGenerator.ts:32](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L32) + +___ + +### prompt + +• **prompt**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Defined in + +[QuestionGenerator.ts:31](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L31) + +## Methods + +### generate + +▸ **generate**(`tools`, `query`): `Promise`<[`SubQuestion`](../interfaces/SubQuestion.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `tools` | [`ToolMetadata`](../interfaces/ToolMetadata.md)[] | +| `query` | `string` | + +#### Returns + +`Promise`<[`SubQuestion`](../interfaces/SubQuestion.md)[]\> + +#### Implementation of + +[BaseQuestionGenerator](../interfaces/BaseQuestionGenerator.md).[generate](../interfaces/BaseQuestionGenerator.md#generate) + +#### Defined in + +[QuestionGenerator.ts:40](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L40) diff --git a/apps/docs/docs/api/classes/ListIndex.md b/apps/docs/docs/api/classes/ListIndex.md 
new file mode 100644 index 0000000000000000000000000000000000000000..244fe2a367cc9b9271ae56988d1504509b568c0d --- /dev/null +++ b/apps/docs/docs/api/classes/ListIndex.md @@ -0,0 +1,294 @@ +--- +id: "ListIndex" +title: "Class: ListIndex" +sidebar_label: "ListIndex" +sidebar_position: 0 +custom_edit_url: null +--- + +A ListIndex keeps nodes in a sequential list structure + +## Hierarchy + +- [`BaseIndex`](BaseIndex.md)<[`IndexList`](IndexList.md)\> + + ↳ **`ListIndex`** + +## Constructors + +### constructor + +• **new ListIndex**(`init`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | [`BaseIndexInit`](../interfaces/BaseIndexInit.md)<[`IndexList`](IndexList.md)\> | + +#### Overrides + +[BaseIndex](BaseIndex.md).[constructor](BaseIndex.md#constructor) + +#### Defined in + +[indices/list/ListIndex.ts:43](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L43) + +## Properties + +### docStore + +• **docStore**: `BaseDocumentStore` + +#### Inherited from + +[BaseIndex](BaseIndex.md).[docStore](BaseIndex.md#docstore) + +#### Defined in + +[indices/BaseIndex.ts:117](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L117) + +___ + +### indexStore + +• `Optional` **indexStore**: `BaseIndexStore` + +#### Inherited from + +[BaseIndex](BaseIndex.md).[indexStore](BaseIndex.md#indexstore) + +#### Defined in + +[indices/BaseIndex.ts:119](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L119) + +___ + +### indexStruct + +• **indexStruct**: [`IndexList`](IndexList.md) + +#### Inherited from + +[BaseIndex](BaseIndex.md).[indexStruct](BaseIndex.md#indexstruct) + +#### Defined in + +[indices/BaseIndex.ts:120](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L120) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Inherited from + 
+ +[BaseIndex](BaseIndex.md).[serviceContext](BaseIndex.md#servicecontext) + +#### Defined in + +[indices/BaseIndex.ts:115](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L115) + +___ + +### storageContext + +• **storageContext**: [`StorageContext`](../interfaces/StorageContext.md) + +#### Inherited from + +[BaseIndex](BaseIndex.md).[storageContext](BaseIndex.md#storagecontext) + +#### Defined in + +[indices/BaseIndex.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L116) + +___ + +### vectorStore + +• `Optional` **vectorStore**: [`VectorStore`](../interfaces/VectorStore.md) + +#### Inherited from + +[BaseIndex](BaseIndex.md).[vectorStore](BaseIndex.md#vectorstore) + +#### Defined in + +[indices/BaseIndex.ts:118](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L118) + +## Methods + +### \_deleteNode + +▸ `Protected` **_deleteNode**(`nodeId`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `nodeId` | `string` | + +#### Returns + +`void` + +#### Defined in + +[indices/list/ListIndex.ts:193](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L193) + +___ + +### \_insert + +▸ `Protected` **_insert**(`nodes`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `nodes` | [`BaseNode`](BaseNode.md)[] | + +#### Returns + +`void` + +#### Defined in + +[indices/list/ListIndex.ts:187](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L187) + +___ + +### asQueryEngine + +▸ **asQueryEngine**(`options?`): [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +Create a new query engine from the index. It will also create a retriever +and response synthesizer if they are not provided. 
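
The defaulting behavior described above (build a retriever and response synthesizer only when the caller does not supply one) can be sketched as follows. All interfaces and class names here are invented stand-ins for illustration, not the real llamaindex types:

```typescript
// Illustrative sketch of the asQueryEngine defaulting pattern.
interface Retriever {
  retrieve(query: string): string[];
}

interface Synthesizer {
  synthesize(query: string, chunks: string[]): string;
}

// Default retriever for a sequential list index: return every node.
class AllNodesRetriever implements Retriever {
  constructor(private nodes: string[]) {}
  retrieve(_query: string): string[] {
    return this.nodes;
  }
}

// Default synthesizer: concatenate retrieved context (a real one
// would prompt an LLM with the query and the context).
class ConcatSynthesizer implements Synthesizer {
  synthesize(query: string, chunks: string[]): string {
    return `Q: ${query} | context: ${chunks.join(" ")}`;
  }
}

function asQueryEngine(
  nodes: string[],
  options?: { retriever?: Retriever; responseSynthesizer?: Synthesizer },
) {
  // Fall back to defaults only for the pieces the caller omitted.
  const retriever = options?.retriever ?? new AllNodesRetriever(nodes);
  const synthesizer = options?.responseSynthesizer ?? new ConcatSynthesizer();
  return {
    query: (q: string) => synthesizer.synthesize(q, retriever.retrieve(q)),
  };
}

const engine = asQueryEngine(["node a", "node b"]);
console.log(engine.query("what nodes exist?"));
```

Passing `options.retriever` or `options.responseSynthesizer` replaces only that piece, which is why the real method accepts the two overrides independently.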
+ +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `options?` | `Object` | you can supply your own custom Retriever and ResponseSynthesizer | +| `options.responseSynthesizer?` | [`ResponseSynthesizer`](ResponseSynthesizer.md) | - | +| `options.retriever?` | [`BaseRetriever`](../interfaces/BaseRetriever.md) | - | + +#### Returns + +[`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +#### Overrides + +[BaseIndex](BaseIndex.md).[asQueryEngine](BaseIndex.md#asqueryengine) + +#### Defined in + +[indices/list/ListIndex.ts:151](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L151) + +___ + +### asRetriever + +▸ **asRetriever**(`options?`): [`BaseRetriever`](../interfaces/BaseRetriever.md) + +Create a new retriever from the index. + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `options?` | `Object` | +| `options.mode` | [`ListRetrieverMode`](../enums/ListRetrieverMode.md) | + +#### Returns + +[`BaseRetriever`](../interfaces/BaseRetriever.md) + +#### Overrides + +[BaseIndex](BaseIndex.md).[asRetriever](BaseIndex.md#asretriever) + +#### Defined in + +[indices/list/ListIndex.ts:138](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L138) + +___ + +### getRefDocInfo + +▸ **getRefDocInfo**(): `Promise`<`Record`<`string`, `RefDocInfo`\>\> + +#### Returns + +`Promise`<`Record`<`string`, `RefDocInfo`\>\> + +#### Defined in + +[indices/list/ListIndex.ts:199](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L199) + +___ + +### \_buildIndexFromNodes + +▸ `Static` **_buildIndexFromNodes**(`nodes`, `docStore`, `indexStruct?`): `Promise`<[`IndexList`](IndexList.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `nodes` | [`BaseNode`](BaseNode.md)[] | +| `docStore` | `BaseDocumentStore` | +| `indexStruct?` | [`IndexList`](IndexList.md) | + +#### Returns + 
+`Promise`<[`IndexList`](IndexList.md)\> + +#### Defined in + +[indices/list/ListIndex.ts:172](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L172) + +___ + +### fromDocuments + +▸ `Static` **fromDocuments**(`documents`, `args?`): `Promise`<[`ListIndex`](ListIndex.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `documents` | [`Document`](Document.md)[] | +| `args` | `Object` | +| `args.serviceContext?` | [`ServiceContext`](../interfaces/ServiceContext.md) | +| `args.storageContext?` | [`StorageContext`](../interfaces/StorageContext.md) | + +#### Returns + +`Promise`<[`ListIndex`](ListIndex.md)\> + +#### Defined in + +[indices/list/ListIndex.ts:112](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L112) + +___ + +### init + +▸ `Static` **init**(`options`): `Promise`<[`ListIndex`](ListIndex.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `options` | `ListIndexOptions` | + +#### Returns + +`Promise`<[`ListIndex`](ListIndex.md)\> + +#### Defined in + +[indices/list/ListIndex.ts:47](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L47) diff --git a/apps/docs/docs/api/classes/ListIndexLLMRetriever.md b/apps/docs/docs/api/classes/ListIndexLLMRetriever.md new file mode 100644 index 0000000000000000000000000000000000000000..9491f2aee5e518725e495f44c1b39030f5902769 --- /dev/null +++ b/apps/docs/docs/api/classes/ListIndexLLMRetriever.md @@ -0,0 +1,137 @@ +--- +id: "ListIndexLLMRetriever" +title: "Class: ListIndexLLMRetriever" +sidebar_label: "ListIndexLLMRetriever" +sidebar_position: 0 +custom_edit_url: null +--- + +LLM retriever for ListIndex. 
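
The core mechanics — grouping nodes into batches of `choiceBatchSize` and parsing the LLM's choice-select answer back into node picks — can be sketched in a few lines. The `Doc: <n>, Relevance: <score>` answer format used here is an assumption for illustration, not the library's exact prompt contract.

```typescript
// Sketch of the batching and answer-parsing steps an LLM-based retriever
// performs around its LLM calls.
function batch<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size)); // one LLM call per group
  }
  return out;
}

// Parse lines like "Doc: 1, Relevance: 9" into structured picks
// (hypothetical format standing in for parseChoiceSelectAnswerFn).
function parseChoiceSelectAnswer(answer: string): { doc: number; relevance: number }[] {
  const picks: { doc: number; relevance: number }[] = [];
  for (const line of answer.split("\n")) {
    const m = line.match(/Doc:\s*(\d+),\s*Relevance:\s*(\d+)/);
    if (m) picks.push({ doc: parseInt(m[1], 10), relevance: parseInt(m[2], 10) });
  }
  return picks;
}

console.log(batch(["a", "b", "c"], 2));
console.log(parseChoiceSelectAnswer("Doc: 1, Relevance: 9\nDoc: 3, Relevance: 4"));
```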
+ +## Implements + +- [`BaseRetriever`](../interfaces/BaseRetriever.md) + +## Constructors + +### constructor + +• **new ListIndexLLMRetriever**(`index`, `choiceSelectPrompt?`, `choiceBatchSize?`, `formatNodeBatchFn?`, `parseChoiceSelectAnswerFn?`, `serviceContext?`) + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `index` | [`ListIndex`](ListIndex.md) | `undefined` | +| `choiceSelectPrompt?` | [`SimplePrompt`](../modules.md#simpleprompt) | `undefined` | +| `choiceBatchSize` | `number` | `10` | +| `formatNodeBatchFn?` | `NodeFormatterFunction` | `undefined` | +| `parseChoiceSelectAnswerFn?` | `ChoiceSelectParserFunction` | `undefined` | +| `serviceContext?` | [`ServiceContext`](../interfaces/ServiceContext.md) | `undefined` | + +#### Defined in + +[indices/list/ListIndexRetriever.ts:64](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L64) + +## Properties + +### choiceBatchSize + +• **choiceBatchSize**: `number` + +#### Defined in + +[indices/list/ListIndexRetriever.ts:59](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L59) + +___ + +### choiceSelectPrompt + +• **choiceSelectPrompt**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:58](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L58) + +___ + +### formatNodeBatchFn + +• **formatNodeBatchFn**: `NodeFormatterFunction` + +#### Defined in + +[indices/list/ListIndexRetriever.ts:60](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L60) + +___ + +### index + +• **index**: [`ListIndex`](ListIndex.md) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:57](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L57) + +___ + +### 
parseChoiceSelectAnswerFn + +• **parseChoiceSelectAnswerFn**: `ChoiceSelectParserFunction` + +#### Defined in + +[indices/list/ListIndexRetriever.ts:61](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L61) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L62) + +## Methods + +### getServiceContext + +▸ **getServiceContext**(): [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Returns + +[`ServiceContext`](../interfaces/ServiceContext.md) + +#### Implementation of + +[BaseRetriever](../interfaces/BaseRetriever.md).[getServiceContext](../interfaces/BaseRetriever.md#getservicecontext) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:127](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L127) + +___ + +### retrieve + +▸ **retrieve**(`query`, `parentEvent?`): `Promise`<[`NodeWithScore`](../interfaces/NodeWithScore.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`NodeWithScore`](../interfaces/NodeWithScore.md)[]\> + +#### Implementation of + +[BaseRetriever](../interfaces/BaseRetriever.md).[retrieve](../interfaces/BaseRetriever.md#retrieve) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:81](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L81) diff --git a/apps/docs/docs/api/classes/ListIndexRetriever.md b/apps/docs/docs/api/classes/ListIndexRetriever.md new file mode 100644 index 0000000000000000000000000000000000000000..628990f5b6a465625d9e6bcc27688fc0761ba743 --- /dev/null +++ 
b/apps/docs/docs/api/classes/ListIndexRetriever.md @@ -0,0 +1,82 @@ +--- +id: "ListIndexRetriever" +title: "Class: ListIndexRetriever" +sidebar_label: "ListIndexRetriever" +sidebar_position: 0 +custom_edit_url: null +--- + +Simple retriever for ListIndex that returns all nodes + +## Implements + +- [`BaseRetriever`](../interfaces/BaseRetriever.md) + +## Constructors + +### constructor + +• **new ListIndexRetriever**(`index`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `index` | [`ListIndex`](ListIndex.md) | + +#### Defined in + +[indices/list/ListIndexRetriever.ts:22](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L22) + +## Properties + +### index + +• **index**: [`ListIndex`](ListIndex.md) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L20) + +## Methods + +### getServiceContext + +▸ **getServiceContext**(): [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Returns + +[`ServiceContext`](../interfaces/ServiceContext.md) + +#### Implementation of + +[BaseRetriever](../interfaces/BaseRetriever.md).[getServiceContext](../interfaces/BaseRetriever.md#getservicecontext) + +#### Defined in + +[indices/list/ListIndexRetriever.ts:48](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L48) + +___ + +### retrieve + +▸ **retrieve**(`query`, `parentEvent?`): `Promise`<[`NodeWithScore`](../interfaces/NodeWithScore.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`NodeWithScore`](../interfaces/NodeWithScore.md)[]\> + +#### Implementation of + +[BaseRetriever](../interfaces/BaseRetriever.md).[retrieve](../interfaces/BaseRetriever.md#retrieve) + +#### Defined in + 
+[indices/list/ListIndexRetriever.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndexRetriever.ts#L26) diff --git a/apps/docs/docs/api/classes/LlamaDeuce.md b/apps/docs/docs/api/classes/LlamaDeuce.md new file mode 100644 index 0000000000000000000000000000000000000000..10aea4cb337de2262799c79a3f4e9bd67f934df6 --- /dev/null +++ b/apps/docs/docs/api/classes/LlamaDeuce.md @@ -0,0 +1,224 @@ +--- +id: "LlamaDeuce" +title: "Class: LlamaDeuce" +sidebar_label: "LlamaDeuce" +sidebar_position: 0 +custom_edit_url: null +--- + +Llama2 LLM implementation + +## Implements + +- [`LLM`](../interfaces/LLM.md) + +## Constructors + +### constructor + +• **new LlamaDeuce**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`LlamaDeuce`](LlamaDeuce.md)\> | + +#### Defined in + +[llm/LLM.ts:208](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L208) + +## Properties + +### chatStrategy + +• **chatStrategy**: [`DeuceChatStrategy`](../enums/DeuceChatStrategy.md) + +#### Defined in + +[llm/LLM.ts:202](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L202) + +___ + +### maxTokens + +• `Optional` **maxTokens**: `number` + +#### Defined in + +[llm/LLM.ts:205](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L205) + +___ + +### model + +• **model**: ``"Llama-2-70b-chat"`` \| ``"Llama-2-13b-chat"`` \| ``"Llama-2-7b-chat"`` + +#### Defined in + +[llm/LLM.ts:201](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L201) + +___ + +### replicateSession + +• **replicateSession**: `ReplicateSession` + +#### Defined in + +[llm/LLM.ts:206](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L206) + +___ + +### temperature + +• **temperature**: `number` + +#### Defined in + 
+[llm/LLM.ts:203](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L203) + +___ + +### topP + +• **topP**: `number` + +#### Defined in + +[llm/LLM.ts:204](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L204) + +## Methods + +### chat + +▸ **chat**(`messages`, `_parentEvent?`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +Get a chat response from the LLM + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | +| `_parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Implementation of + +[LLM](../interfaces/LLM.md).[chat](../interfaces/LLM.md#chat) + +#### Defined in + +[llm/LLM.ts:299](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L299) + +___ + +### complete + +▸ **complete**(`prompt`, `parentEvent?`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +Get a prompt completion from the LLM + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `prompt` | `string` | the prompt to complete | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | - | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Implementation of + +[LLM](../interfaces/LLM.md).[complete](../interfaces/LLM.md#complete) + +#### Defined in + +[llm/LLM.ts:326](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L326) + +___ + +### mapMessageTypeA16Z + +▸ **mapMessageTypeA16Z**(`messageType`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messageType` | [`MessageType`](../modules.md#messagetype) | + +#### Returns + +`string` + +#### Defined in + +[llm/LLM.ts:240](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L240) + +___ + +### mapMessagesToPrompt + +▸ 
**mapMessagesToPrompt**(`messages`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | + +#### Returns + +`string` + +#### Defined in + +[llm/LLM.ts:217](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L217) + +___ + +### mapMessagesToPromptA16Z + +▸ **mapMessagesToPromptA16Z**(`messages`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | + +#### Returns + +`string` + +#### Defined in + +[llm/LLM.ts:229](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L229) + +___ + +### mapMessagesToPromptMeta + +▸ **mapMessagesToPromptMeta**(`messages`, `withBos?`): `string` + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | `undefined` | +| `withBos` | `boolean` | `false` | + +#### Returns + +`string` + +#### Defined in + +[llm/LLM.ts:253](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L253) diff --git a/apps/docs/docs/api/classes/OpenAI.md b/apps/docs/docs/api/classes/OpenAI.md new file mode 100644 index 0000000000000000000000000000000000000000..ccbf3c402573d46e7b1bb7b98458d6cfafb80889 --- /dev/null +++ b/apps/docs/docs/api/classes/OpenAI.md @@ -0,0 +1,193 @@ +--- +id: "OpenAI" +title: "Class: OpenAI" +sidebar_label: "OpenAI" +sidebar_position: 0 +custom_edit_url: null +--- + +OpenAI LLM implementation + +## Implements + +- [`LLM`](../interfaces/LLM.md) + +## Constructors + +### constructor + +• **new OpenAI**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`OpenAI`](OpenAI.md)\> | + +#### Defined in + +[llm/LLM.ts:87](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L87) + +## Properties + +### apiKey + +• `Optional` **apiKey**: 
`string` = `undefined` + +#### Defined in + +[llm/LLM.ts:80](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L80) + +___ + +### callbackManager + +• `Optional` **callbackManager**: [`CallbackManager`](CallbackManager.md) + +#### Defined in + +[llm/LLM.ts:85](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L85) + +___ + +### maxRetries + +• **maxRetries**: `number` + +#### Defined in + +[llm/LLM.ts:81](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L81) + +___ + +### maxTokens + +• `Optional` **maxTokens**: `number` + +#### Defined in + +[llm/LLM.ts:77](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L77) + +___ + +### model + +• **model**: ``"gpt-3.5-turbo"`` \| ``"gpt-3.5-turbo-16k"`` \| ``"gpt-4"`` \| ``"gpt-4-32k"`` + +#### Defined in + +[llm/LLM.ts:74](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L74) + +___ + +### session + +• **session**: `OpenAISession` + +#### Defined in + +[llm/LLM.ts:83](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L83) + +___ + +### temperature + +• **temperature**: `number` + +#### Defined in + +[llm/LLM.ts:75](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L75) + +___ + +### timeout + +• `Optional` **timeout**: `number` + +#### Defined in + +[llm/LLM.ts:82](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L82) + +___ + +### topP + +• **topP**: `number` + +#### Defined in + +[llm/LLM.ts:76](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L76) + +## Methods + +### chat + +▸ **chat**(`messages`, `parentEvent?`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +Get a chat response from the LLM + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | +| 
`parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Implementation of + +[LLM](../interfaces/LLM.md).[chat](../interfaces/LLM.md#chat) + +#### Defined in + +[llm/LLM.ts:124](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L124) + +___ + +### complete + +▸ **complete**(`prompt`, `parentEvent?`): `Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +Get a prompt completion from the LLM + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `prompt` | `string` | the prompt to complete | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | - | + +#### Returns + +`Promise`<[`ChatResponse`](../interfaces/ChatResponse.md)\> + +#### Implementation of + +[LLM](../interfaces/LLM.md).[complete](../interfaces/LLM.md#complete) + +#### Defined in + +[llm/LLM.ts:163](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L163) + +___ + +### mapMessageType + +▸ **mapMessageType**(`messageType`): ``"function"`` \| ``"user"`` \| ``"assistant"`` \| ``"system"`` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messageType` | [`MessageType`](../modules.md#messagetype) | + +#### Returns + +``"function"`` \| ``"user"`` \| ``"assistant"`` \| ``"system"`` + +#### Defined in + +[llm/LLM.ts:107](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L107) diff --git a/apps/docs/docs/api/classes/OpenAIEmbedding.md b/apps/docs/docs/api/classes/OpenAIEmbedding.md new file mode 100644 index 0000000000000000000000000000000000000000..b9f88beb46a05d63bf1af3b3e641d5f897db504a --- /dev/null +++ b/apps/docs/docs/api/classes/OpenAIEmbedding.md @@ -0,0 +1,177 @@ +--- +id: "OpenAIEmbedding" +title: "Class: OpenAIEmbedding" +sidebar_label: "OpenAIEmbedding" +sidebar_position: 0 +custom_edit_url: null +--- + +## Hierarchy + +- [`BaseEmbedding`](BaseEmbedding.md) + + ↳ 
**`OpenAIEmbedding`** + +## Constructors + +### constructor + +• **new OpenAIEmbedding**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`OpenAIEmbedding`](OpenAIEmbedding.md)\> | + +#### Overrides + +[BaseEmbedding](BaseEmbedding.md).[constructor](BaseEmbedding.md#constructor) + +#### Defined in + +[Embedding.ts:222](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L222) + +## Properties + +### apiKey + +• `Optional` **apiKey**: `string` = `undefined` + +#### Defined in + +[Embedding.ts:217](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L217) + +___ + +### maxRetries + +• **maxRetries**: `number` + +#### Defined in + +[Embedding.ts:218](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L218) + +___ + +### model + +• **model**: `TEXT_EMBED_ADA_002` + +#### Defined in + +[Embedding.ts:214](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L214) + +___ + +### session + +• **session**: `OpenAISession` + +#### Defined in + +[Embedding.ts:220](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L220) + +___ + +### timeout + +• `Optional` **timeout**: `number` + +#### Defined in + +[Embedding.ts:219](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L219) + +## Methods + +### getOpenAIEmbedding + +▸ `Private` **getOpenAIEmbedding**(`input`): `Promise`<`number`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `string` | + +#### Returns + +`Promise`<`number`[]\> + +#### Defined in + +[Embedding.ts:237](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L237) + +___ + +### getQueryEmbedding + +▸ **getQueryEmbedding**(`query`): `Promise`<`number`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | + +#### Returns + 
+`Promise`<`number`[]\> + +#### Overrides + +[BaseEmbedding](BaseEmbedding.md).[getQueryEmbedding](BaseEmbedding.md#getqueryembedding) + +#### Defined in + +[Embedding.ts:253](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L253) + +___ + +### getTextEmbedding + +▸ **getTextEmbedding**(`text`): `Promise`<`number`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `text` | `string` | + +#### Returns + +`Promise`<`number`[]\> + +#### Overrides + +[BaseEmbedding](BaseEmbedding.md).[getTextEmbedding](BaseEmbedding.md#gettextembedding) + +#### Defined in + +[Embedding.ts:249](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L249) + +___ + +### similarity + +▸ **similarity**(`embedding1`, `embedding2`, `mode?`): `number` + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `embedding1` | `number`[] | `undefined` | +| `embedding2` | `number`[] | `undefined` | +| `mode` | [`SimilarityType`](../enums/SimilarityType.md) | `SimilarityType.DEFAULT` | + +#### Returns + +`number` + +#### Inherited from + +[BaseEmbedding](BaseEmbedding.md).[similarity](BaseEmbedding.md#similarity) + +#### Defined in + +[Embedding.ts:197](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L197) diff --git a/apps/docs/docs/api/classes/PDFReader.md b/apps/docs/docs/api/classes/PDFReader.md new file mode 100644 index 0000000000000000000000000000000000000000..963eb43ca5de3defec92bf1376e38dfba4d2f8cc --- /dev/null +++ b/apps/docs/docs/api/classes/PDFReader.md @@ -0,0 +1,44 @@ +--- +id: "PDFReader" +title: "Class: PDFReader" +sidebar_label: "PDFReader" +sidebar_position: 0 +custom_edit_url: null +--- + +Read the text of a PDF + +## Implements + +- [`BaseReader`](../interfaces/BaseReader.md) + +## Constructors + +### constructor + +• **new PDFReader**() + +## Methods + +### loadData + +▸ **loadData**(`file`, `fs?`): 
`Promise`<[`Document`](Document.md)[]\>
+
+#### Parameters
+
+| Name | Type | Default value |
+| :------ | :------ | :------ |
+| `file` | `string` | `undefined` |
+| `fs` | [`GenericFileSystem`](../interfaces/GenericFileSystem.md) | `DEFAULT_FS` |
+
+#### Returns
+
+`Promise`<[`Document`](Document.md)[]\>
+
+#### Implementation of
+
+[BaseReader](../interfaces/BaseReader.md).[loadData](../interfaces/BaseReader.md#loaddata)
+
+#### Defined in
+
+[readers/PDFReader.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/readers/PDFReader.ts#L11)
diff --git a/apps/docs/docs/api/classes/Refine.md b/apps/docs/docs/api/classes/Refine.md
new file mode 100644
index 0000000000000000000000000000000000000000..f76493b6e30ace25024d73b008552bc5750f7561
--- /dev/null
+++ b/apps/docs/docs/api/classes/Refine.md
@@ -0,0 +1,139 @@
+---
+id: "Refine"
+title: "Class: Refine"
+sidebar_label: "Refine"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+A response builder that uses the query to ask the LLM to generate a better response using multiple text chunks.
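
The refine strategy can be sketched as a loop: the first chunk produces an initial answer (the `textQATemplate` role), and every later chunk asks the LLM to improve the previous answer (the `refineTemplate` role). The `LLMFn` stand-in and prompt wording below are illustrative assumptions, not the class's actual templates.

```typescript
// Sketch of the refine loop over text chunks. `llm` stands in for a real
// LLM call; prompts here are simplified placeholders.
type LLMFn = (prompt: string) => string;

function refine(llm: LLMFn, query: string, chunks: string[]): string {
  let response = "";
  for (let i = 0; i < chunks.length; i++) {
    const prompt =
      i === 0
        ? `Context: ${chunks[i]}\nQuestion: ${query}\nAnswer:` // textQATemplate role
        : `Existing answer: ${response}\nNew context: ${chunks[i]}\nRefine the answer to: ${query}`; // refineTemplate role
    response = llm(prompt); // each iteration threads the previous answer through
  }
  return response;
}

// A trivial "LLM" that echoes the first line of its prompt, just to show the threading.
const echoLLM: LLMFn = (prompt) => prompt.split("\n")[0];
console.log(refine(echoLLM, "What changed?", ["chunk A", "chunk B"]));
```

The key property is that the answer from chunk *i* is fed into the prompt for chunk *i + 1*, so later chunks can correct or extend earlier answers.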
+ +## Hierarchy + +- **`Refine`** + + ↳ [`CompactAndRefine`](CompactAndRefine.md) + +## Implements + +- `BaseResponseBuilder` + +## Constructors + +### constructor + +• **new Refine**(`serviceContext`, `textQATemplate?`, `refineTemplate?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `serviceContext` | [`ServiceContext`](../interfaces/ServiceContext.md) | +| `textQATemplate?` | [`SimplePrompt`](../modules.md#simpleprompt) | +| `refineTemplate?` | [`SimplePrompt`](../modules.md#simpleprompt) | + +#### Defined in + +[ResponseSynthesizer.ts:78](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L78) + +## Properties + +### refineTemplate + +• **refineTemplate**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Defined in + +[ResponseSynthesizer.ts:76](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L76) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[ResponseSynthesizer.ts:74](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L74) + +___ + +### textQATemplate + +• **textQATemplate**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Defined in + +[ResponseSynthesizer.ts:75](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L75) + +## Methods + +### getResponse + +▸ **getResponse**(`query`, `textChunks`, `parentEvent?`, `prevResponse?`): `Promise`<`string`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `textChunks` | `string`[] | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | +| `prevResponse?` | `string` | + +#### Returns + +`Promise`<`string`\> + +#### Implementation of + +BaseResponseBuilder.getResponse + +#### Defined in + 
+[ResponseSynthesizer.ts:88](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L88)
+
+___
+
+### giveResponseSingle
+
+▸ `Private` **giveResponseSingle**(`queryStr`, `textChunk`, `parentEvent?`): `Promise`<`string`\>
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `queryStr` | `string` |
+| `textChunk` | `string` |
+| `parentEvent?` | [`Event`](../interfaces/Event.md) |
+
+#### Returns
+
+`Promise`<`string`\>
+
+#### Defined in
+
+[ResponseSynthesizer.ts:113](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L113)
+
+___
+
+### refineResponseSingle
+
+▸ `Private` **refineResponseSingle**(`response`, `queryStr`, `textChunk`, `parentEvent?`): `Promise`<`string`\>
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `response` | `string` |
+| `queryStr` | `string` |
+| `textChunk` | `string` |
+| `parentEvent?` | [`Event`](../interfaces/Event.md) |
+
+#### Returns
+
+`Promise`<`string`\>
+
+#### Defined in
+
+[ResponseSynthesizer.ts:149](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L149)
diff --git a/apps/docs/docs/api/classes/Response.md b/apps/docs/docs/api/classes/Response.md
new file mode 100644
index 0000000000000000000000000000000000000000..04202a45fdeed98b39416c4ab479dbd246c3da56
--- /dev/null
+++ b/apps/docs/docs/api/classes/Response.md
@@ -0,0 +1,74 @@
+---
+id: "Response"
+title: "Class: Response"
+sidebar_label: "Response"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Response is the output of an LLM
+
+## Constructors
+
+### constructor
+
+• **new Response**(`response`, `sourceNodes?`)
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `response` | `string` |
+| `sourceNodes?` | [`BaseNode`](BaseNode.md)[] |
+
+#### Defined in
+
+[Response.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Response.ts#L10)
+
+## Properties
+
+### response
+
+• **response**: `string`
+
+#### Defined in
+
+[Response.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Response.ts#L7)
+
+___
+
+### sourceNodes
+
+• `Optional` **sourceNodes**: [`BaseNode`](BaseNode.md)[]
+
+#### Defined in
+
+[Response.ts:8](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Response.ts#L8)
+
+## Methods
+
+### getFormattedSources
+
+▸ **getFormattedSources**(): `void`
+
+#### Returns
+
+`void`
+
+#### Defined in
+
+[Response.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Response.ts#L15)
+
+___
+
+### toString
+
+▸ **toString**(): `string`
+
+#### Returns
+
+`string`
+
+#### Defined in
+
+[Response.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Response.ts#L19)
diff --git a/apps/docs/docs/api/classes/ResponseSynthesizer.md b/apps/docs/docs/api/classes/ResponseSynthesizer.md
new file mode 100644
index 0000000000000000000000000000000000000000..a13a0d5752b0c5800b682c32f11b4cc133e0ab4c
--- /dev/null
+++ b/apps/docs/docs/api/classes/ResponseSynthesizer.md
@@ -0,0 +1,69 @@
+---
+id: "ResponseSynthesizer"
+title: "Class: ResponseSynthesizer"
+sidebar_label: "ResponseSynthesizer"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+A ResponseSynthesizer is used to generate a response from a query and a list of nodes.
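
A rough sketch of what synthesis involves: scored nodes are reduced to plain text chunks and handed to a response-builder function. The names below (and the score-descending ordering) are illustrative assumptions, not the class's actual implementation.

```typescript
// Sketch: turn retrieved scored nodes into text chunks and delegate to a
// response builder (hypothetical stand-ins for NodeWithScore / BaseResponseBuilder).
interface ScoredNode {
  text: string;
  score: number;
}
type ResponseBuilder = (query: string, textChunks: string[]) => string;

function synthesize(query: string, nodes: ScoredNode[], build: ResponseBuilder): string {
  // Order highest-scoring nodes first so the builder sees the best context early
  // (an illustrative choice, not necessarily what the library does).
  const chunks = [...nodes].sort((a, b) => b.score - a.score).map((n) => n.text);
  return build(query, chunks);
}

const build: ResponseBuilder = (q, chunks) => `${q} -> ${chunks.join(" | ")}`;
console.log(
  synthesize("why?", [{ text: "B", score: 0.2 }, { text: "A", score: 0.9 }], build),
);
```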
+ +## Constructors + +### constructor + +• **new ResponseSynthesizer**(`«destructured»?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `«destructured»` | `Object` | +| › `responseBuilder?` | `BaseResponseBuilder` | +| › `serviceContext?` | [`ServiceContext`](../interfaces/ServiceContext.md) | + +#### Defined in + +[ResponseSynthesizer.ts:285](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L285) + +## Properties + +### responseBuilder + +• **responseBuilder**: `BaseResponseBuilder` + +#### Defined in + +[ResponseSynthesizer.ts:282](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L282) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[ResponseSynthesizer.ts:283](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L283) + +## Methods + +### synthesize + +▸ **synthesize**(`query`, `nodes`, `parentEvent?`): `Promise`<[`Response`](Response.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `nodes` | [`NodeWithScore`](../interfaces/NodeWithScore.md)[] | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`Response`](Response.md)\> + +#### Defined in + +[ResponseSynthesizer.ts:297](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L297) diff --git a/apps/docs/docs/api/classes/RetrieverQueryEngine.md b/apps/docs/docs/api/classes/RetrieverQueryEngine.md new file mode 100644 index 0000000000000000000000000000000000000000..e392c39f8fa21c3553a765113ba7c8c3fa6fbeec --- /dev/null +++ b/apps/docs/docs/api/classes/RetrieverQueryEngine.md @@ -0,0 +1,77 @@ +--- +id: "RetrieverQueryEngine" +title: "Class: RetrieverQueryEngine" +sidebar_label: "RetrieverQueryEngine" +sidebar_position: 0 +custom_edit_url: null +--- + +A query engine that uses 
a retriever to query an index and then synthesizes the response. + +## Implements + +- [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +## Constructors + +### constructor + +• **new RetrieverQueryEngine**(`retriever`, `responseSynthesizer?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `retriever` | [`BaseRetriever`](../interfaces/BaseRetriever.md) | +| `responseSynthesizer?` | [`ResponseSynthesizer`](ResponseSynthesizer.md) | + +#### Defined in + +[QueryEngine.ts:34](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L34) + +## Properties + +### responseSynthesizer + +• **responseSynthesizer**: [`ResponseSynthesizer`](ResponseSynthesizer.md) + +#### Defined in + +[QueryEngine.ts:32](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L32) + +___ + +### retriever + +• **retriever**: [`BaseRetriever`](../interfaces/BaseRetriever.md) + +#### Defined in + +[QueryEngine.ts:31](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L31) + +## Methods + +### query + +▸ **query**(`query`, `parentEvent?`): `Promise`<[`Response`](Response.md)\> + +Query the query engine and get a response. 
+ +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`Response`](Response.md)\> + +#### Implementation of + +[BaseQueryEngine](../interfaces/BaseQueryEngine.md).[query](../interfaces/BaseQueryEngine.md#query) + +#### Defined in + +[QueryEngine.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L45) diff --git a/apps/docs/docs/api/classes/SentenceSplitter.md b/apps/docs/docs/api/classes/SentenceSplitter.md new file mode 100644 index 0000000000000000000000000000000000000000..329a9562ba446ea8c7a6ecf1bd3d0b9e08bdf8bd --- /dev/null +++ b/apps/docs/docs/api/classes/SentenceSplitter.md @@ -0,0 +1,238 @@ +--- +id: "SentenceSplitter" +title: "Class: SentenceSplitter" +sidebar_label: "SentenceSplitter" +sidebar_position: 0 +custom_edit_url: null +--- + +SentenceSplitter is our default text splitter that supports splitting into sentences, paragraphs, or fixed length chunks with overlap. + +One of the advantages of SentenceSplitter is that even in the fixed length chunks it will try to keep sentences together. 
+ +## Constructors + +### constructor + +• **new SentenceSplitter**(`chunkSize?`, `chunkOverlap?`, `tokenizer?`, `tokenizerDecoder?`, `paragraphSeparator?`, `chunkingTokenizerFn?`) + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `chunkSize` | `number` | `DEFAULT_CHUNK_SIZE` | +| `chunkOverlap` | `number` | `DEFAULT_CHUNK_OVERLAP` | +| `tokenizer` | `any` | `null` | +| `tokenizerDecoder` | `any` | `null` | +| `paragraphSeparator` | `string` | `"\n\n\n"` | +| `chunkingTokenizerFn` | `any` | `undefined` | + +#### Defined in + +[TextSplitter.ts:35](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L35) + +## Properties + +### chunkOverlap + +• `Private` **chunkOverlap**: `number` + +#### Defined in + +[TextSplitter.ts:28](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L28) + +___ + +### chunkSize + +• `Private` **chunkSize**: `number` + +#### Defined in + +[TextSplitter.ts:27](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L27) + +___ + +### chunkingTokenizerFn + +• `Private` **chunkingTokenizerFn**: `any` + +#### Defined in + +[TextSplitter.ts:32](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L32) + +___ + +### paragraphSeparator + +• `Private` **paragraphSeparator**: `string` + +#### Defined in + +[TextSplitter.ts:31](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L31) + +___ + +### tokenizer + +• `Private` **tokenizer**: `any` + +#### Defined in + +[TextSplitter.ts:29](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L29) + +___ + +### tokenizerDecoder + +• `Private` **tokenizerDecoder**: `any` + +#### Defined in + +[TextSplitter.ts:30](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L30) + +## Methods + +### combineTextSplits + +▸ 
**combineTextSplits**(`newSentenceSplits`, `effectiveChunkSize`): `TextSplit`[] + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `newSentenceSplits` | `SplitRep`[] | +| `effectiveChunkSize` | `number` | + +#### Returns + +`TextSplit`[] + +#### Defined in + +[TextSplitter.ts:155](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L155) + +___ + +### getEffectiveChunkSize + +▸ `Private` **getEffectiveChunkSize**(`extraInfoStr?`): `number` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `extraInfoStr?` | `string` | + +#### Returns + +`number` + +#### Defined in + +[TextSplitter.ts:74](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L74) + +___ + +### getParagraphSplits + +▸ **getParagraphSplits**(`text`, `effectiveChunkSize?`): `string`[] + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `text` | `string` | +| `effectiveChunkSize?` | `number` | + +#### Returns + +`string`[] + +#### Defined in + +[TextSplitter.ts:91](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L91) + +___ + +### getSentenceSplits + +▸ **getSentenceSplits**(`text`, `effectiveChunkSize?`): `string`[] + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `text` | `string` | +| `effectiveChunkSize?` | `number` | + +#### Returns + +`string`[] + +#### Defined in + +[TextSplitter.ts:117](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L117) + +___ + +### processSentenceSplits + +▸ `Private` **processSentenceSplits**(`sentenceSplits`, `effectiveChunkSize`): `SplitRep`[] + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `sentenceSplits` | `string`[] | +| `effectiveChunkSize` | `number` | + +#### Returns + +`SplitRep`[] + +#### Defined in + +[TextSplitter.ts:130](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L130) + +___ + +### splitText + +▸ 
**splitText**(`text`, `extraInfoStr?`): `string`[] + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `text` | `string` | +| `extraInfoStr?` | `string` | + +#### Returns + +`string`[] + +#### Defined in + +[TextSplitter.ts:247](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L247) + +___ + +### splitTextWithOverlaps + +▸ **splitTextWithOverlaps**(`text`, `extraInfoStr?`): `TextSplit`[] + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `text` | `string` | +| `extraInfoStr?` | `string` | + +#### Returns + +`TextSplit`[] + +#### Defined in + +[TextSplitter.ts:219](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/TextSplitter.ts#L219) diff --git a/apps/docs/docs/api/classes/SimpleChatEngine.md b/apps/docs/docs/api/classes/SimpleChatEngine.md new file mode 100644 index 0000000000000000000000000000000000000000..e534def4d126997595448a6ad4177e93601df262 --- /dev/null +++ b/apps/docs/docs/api/classes/SimpleChatEngine.md @@ -0,0 +1,96 @@ +--- +id: "SimpleChatEngine" +title: "Class: SimpleChatEngine" +sidebar_label: "SimpleChatEngine" +sidebar_position: 0 +custom_edit_url: null +--- + +SimpleChatEngine is the simplest possible chat engine. Useful for using your own custom prompts. 
+ +## Implements + +- [`ChatEngine`](../interfaces/ChatEngine.md) + +## Constructors + +### constructor + +• **new SimpleChatEngine**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`SimpleChatEngine`](SimpleChatEngine.md)\> | + +#### Defined in + +[ChatEngine.ts:40](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L40) + +## Properties + +### chatHistory + +• **chatHistory**: [`ChatMessage`](../interfaces/ChatMessage.md)[] + +#### Defined in + +[ChatEngine.ts:37](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L37) + +___ + +### llm + +• **llm**: [`LLM`](../interfaces/LLM.md) + +#### Defined in + +[ChatEngine.ts:38](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L38) + +## Methods + +### chat + +▸ **chat**(`message`, `chatHistory?`): `Promise`<[`Response`](Response.md)\> + +Send message along with the class's current chat history to the LLM. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `message` | `string` | | +| `chatHistory?` | [`ChatMessage`](../interfaces/ChatMessage.md)[] | optional chat history if you want to customize the chat history | + +#### Returns + +`Promise`<[`Response`](Response.md)\> + +#### Implementation of + +[ChatEngine](../interfaces/ChatEngine.md).[chat](../interfaces/ChatEngine.md#chat) + +#### Defined in + +[ChatEngine.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L45) + +___ + +### reset + +▸ **reset**(): `void` + +Resets the chat history so that it's empty. 
+ +#### Returns + +`void` + +#### Implementation of + +[ChatEngine](../interfaces/ChatEngine.md).[reset](../interfaces/ChatEngine.md#reset) + +#### Defined in + +[ChatEngine.ts:54](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L54) diff --git a/apps/docs/docs/api/classes/SimpleDirectoryReader.md b/apps/docs/docs/api/classes/SimpleDirectoryReader.md new file mode 100644 index 0000000000000000000000000000000000000000..1591fae56f5ca07804aadcbbaf74c5a5d6a7ed27 --- /dev/null +++ b/apps/docs/docs/api/classes/SimpleDirectoryReader.md @@ -0,0 +1,43 @@ +--- +id: "SimpleDirectoryReader" +title: "Class: SimpleDirectoryReader" +sidebar_label: "SimpleDirectoryReader" +sidebar_position: 0 +custom_edit_url: null +--- + +Read all of the documents in a directory. Currently supports PDF and TXT files. + +## Implements + +- [`BaseReader`](../interfaces/BaseReader.md) + +## Constructors + +### constructor + +• **new SimpleDirectoryReader**() + +## Methods + +### loadData + +▸ **loadData**(`«destructured»`): `Promise`<[`Document`](Document.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `«destructured»` | [`SimpleDirectoryReaderLoadDataProps`](../modules.md#simpledirectoryreaderloaddataprops) | + +#### Returns + +`Promise`<[`Document`](Document.md)[]\> + +#### Implementation of + +[BaseReader](../interfaces/BaseReader.md).[loadData](../interfaces/BaseReader.md#loaddata) + +#### Defined in + +[readers/SimpleDirectoryReader.ts:37](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/readers/SimpleDirectoryReader.ts#L37) diff --git a/apps/docs/docs/api/classes/SimpleNodeParser.md b/apps/docs/docs/api/classes/SimpleNodeParser.md new file mode 100644 index 0000000000000000000000000000000000000000..8e1af1767c23051e8b3d2057e8ae94e8a89a6d1f --- /dev/null +++ b/apps/docs/docs/api/classes/SimpleNodeParser.md @@ -0,0 +1,120 @@ +--- +id: "SimpleNodeParser" +title: "Class: SimpleNodeParser" +sidebar_label: 
"SimpleNodeParser" +sidebar_position: 0 +custom_edit_url: null +--- + +SimpleNodeParser is the default NodeParser. It splits documents into TextNodes using a splitter, by default SentenceSplitter + +## Implements + +- [`NodeParser`](../interfaces/NodeParser.md) + +## Constructors + +### constructor + +• **new SimpleNodeParser**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Object` | +| `init.chunkOverlap?` | `number` | +| `init.chunkSize?` | `number` | +| `init.includeMetadata?` | `boolean` | +| `init.includePrevNextRel?` | `boolean` | +| `init.textSplitter?` | [`SentenceSplitter`](SentenceSplitter.md) | + +#### Defined in + +[NodeParser.ts:93](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L93) + +## Properties + +### includeMetadata + +• **includeMetadata**: `boolean` + +Whether to include metadata in the nodes. + +#### Defined in + +[NodeParser.ts:87](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L87) + +___ + +### includePrevNextRel + +• **includePrevNextRel**: `boolean` + +Whether to include previous and next relationships in the nodes. + +#### Defined in + +[NodeParser.ts:91](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L91) + +___ + +### textSplitter + +• **textSplitter**: [`SentenceSplitter`](SentenceSplitter.md) + +The text splitter to use. 
+ +#### Defined in + +[NodeParser.ts:83](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L83) + +## Methods + +### getNodesFromDocuments + +▸ **getNodesFromDocuments**(`documents`): [`TextNode`](TextNode.md)[] + +Generate Node objects from documents + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `documents` | [`Document`](Document.md)[] | + +#### Returns + +[`TextNode`](TextNode.md)[] + +#### Implementation of + +[NodeParser](../interfaces/NodeParser.md).[getNodesFromDocuments](../interfaces/NodeParser.md#getnodesfromdocuments) + +#### Defined in + +[NodeParser.ts:124](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L124) + +___ + +### fromDefaults + +▸ `Static` **fromDefaults**(`init?`): [`SimpleNodeParser`](SimpleNodeParser.md) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Object` | +| `init.chunkOverlap?` | `number` | +| `init.chunkSize?` | `number` | +| `init.includeMetadata?` | `boolean` | +| `init.includePrevNextRel?` | `boolean` | + +#### Returns + +[`SimpleNodeParser`](SimpleNodeParser.md) + +#### Defined in + +[NodeParser.ts:111](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L111) diff --git a/apps/docs/docs/api/classes/SimpleResponseBuilder.md b/apps/docs/docs/api/classes/SimpleResponseBuilder.md new file mode 100644 index 0000000000000000000000000000000000000000..924ea44fdbb2f438592c27f30fdc33f5f4a7c43c --- /dev/null +++ b/apps/docs/docs/api/classes/SimpleResponseBuilder.md @@ -0,0 +1,75 @@ +--- +id: "SimpleResponseBuilder" +title: "Class: SimpleResponseBuilder" +sidebar_label: "SimpleResponseBuilder" +sidebar_position: 0 +custom_edit_url: null +--- + +A response builder that just concatenates responses. 
+ +## Implements + +- `BaseResponseBuilder` + +## Constructors + +### constructor + +• **new SimpleResponseBuilder**(`serviceContext`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `serviceContext` | [`ServiceContext`](../interfaces/ServiceContext.md) | + +#### Defined in + +[ResponseSynthesizer.ts:49](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L49) + +## Properties + +### llm + +• **llm**: [`LLM`](../interfaces/LLM.md) + +#### Defined in + +[ResponseSynthesizer.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L46) + +___ + +### textQATemplate + +• **textQATemplate**: [`SimplePrompt`](../modules.md#simpleprompt) + +#### Defined in + +[ResponseSynthesizer.ts:47](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L47) + +## Methods + +### getResponse + +▸ **getResponse**(`query`, `textChunks`, `parentEvent?`): `Promise`<`string`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `textChunks` | `string`[] | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<`string`\> + +#### Implementation of + +BaseResponseBuilder.getResponse + +#### Defined in + +[ResponseSynthesizer.ts:54](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L54) diff --git a/apps/docs/docs/api/classes/SubQuestionOutputParser.md b/apps/docs/docs/api/classes/SubQuestionOutputParser.md new file mode 100644 index 0000000000000000000000000000000000000000..d3071b83f7b13d391989a0d1a1030e7130dcf2af --- /dev/null +++ b/apps/docs/docs/api/classes/SubQuestionOutputParser.md @@ -0,0 +1,67 @@ +--- +id: "SubQuestionOutputParser" +title: "Class: SubQuestionOutputParser" +sidebar_label: "SubQuestionOutputParser" +sidebar_position: 0 +custom_edit_url: null +--- + +SubQuestionOutputParser is used to parse the output of the SubQuestionGenerator. 
+
+## Implements
+
+- [`BaseOutputParser`](../interfaces/BaseOutputParser.md)<[`StructuredOutput`](../interfaces/StructuredOutput.md)<[`SubQuestion`](../interfaces/SubQuestion.md)[]\>\>
+
+## Constructors
+
+### constructor
+
+• **new SubQuestionOutputParser**()
+
+## Methods
+
+### format
+
+▸ **format**(`output`): `string`
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `output` | `string` |
+
+#### Returns
+
+`string`
+
+#### Implementation of
+
+[BaseOutputParser](../interfaces/BaseOutputParser.md).[format](../interfaces/BaseOutputParser.md#format)
+
+#### Defined in
+
+[OutputParser.ts:97](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/OutputParser.ts#L97)
+
+___
+
+### parse
+
+▸ **parse**(`output`): [`StructuredOutput`](../interfaces/StructuredOutput.md)<[`SubQuestion`](../interfaces/SubQuestion.md)[]\>
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `output` | `string` |
+
+#### Returns
+
+[`StructuredOutput`](../interfaces/StructuredOutput.md)<[`SubQuestion`](../interfaces/SubQuestion.md)[]\>
+
+#### Implementation of
+
+[BaseOutputParser](../interfaces/BaseOutputParser.md).[parse](../interfaces/BaseOutputParser.md#parse)
+
+#### Defined in
+
+[OutputParser.ts:89](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/OutputParser.ts#L89)
diff --git a/apps/docs/docs/api/classes/SubQuestionQueryEngine.md b/apps/docs/docs/api/classes/SubQuestionQueryEngine.md
new file mode 100644
index 0000000000000000000000000000000000000000..300902253026e474f145b32c2e301b5b99a878e7
--- /dev/null
+++ b/apps/docs/docs/api/classes/SubQuestionQueryEngine.md
@@ -0,0 +1,143 @@
+---
+id: "SubQuestionQueryEngine"
+title: "Class: SubQuestionQueryEngine"
+sidebar_label: "SubQuestionQueryEngine"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+SubQuestionQueryEngine decomposes a question into subquestions, answers each subquestion with the matching query engine tool, and then synthesizes the individual answers into a final response.
+
+## Implements
+
+- [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md)
+
+## Constructors
+
+### constructor + +• **new SubQuestionQueryEngine**(`init`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | `Object` | +| `init.queryEngineTools` | [`QueryEngineTool`](../interfaces/QueryEngineTool.md)[] | +| `init.questionGen` | [`BaseQuestionGenerator`](../interfaces/BaseQuestionGenerator.md) | +| `init.responseSynthesizer` | [`ResponseSynthesizer`](ResponseSynthesizer.md) | + +#### Defined in + +[QueryEngine.ts:65](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L65) + +## Properties + +### metadatas + +• **metadatas**: [`ToolMetadata`](../interfaces/ToolMetadata.md)[] + +#### Defined in + +[QueryEngine.ts:63](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L63) + +___ + +### queryEngines + +• **queryEngines**: `Record`<`string`, [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md)\> + +#### Defined in + +[QueryEngine.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L62) + +___ + +### questionGen + +• **questionGen**: [`BaseQuestionGenerator`](../interfaces/BaseQuestionGenerator.md) + +#### Defined in + +[QueryEngine.ts:61](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L61) + +___ + +### responseSynthesizer + +• **responseSynthesizer**: [`ResponseSynthesizer`](ResponseSynthesizer.md) + +#### Defined in + +[QueryEngine.ts:60](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L60) + +## Methods + +### query + +▸ **query**(`query`): `Promise`<[`Response`](Response.md)\> + +Query the query engine and get a response. 
+ +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | + +#### Returns + +`Promise`<[`Response`](Response.md)\> + +#### Implementation of + +[BaseQueryEngine](../interfaces/BaseQueryEngine.md).[query](../interfaces/BaseQueryEngine.md#query) + +#### Defined in + +[QueryEngine.ts:106](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L106) + +___ + +### querySubQ + +▸ `Private` **querySubQ**(`subQ`, `parentEvent?`): `Promise`<``null`` \| [`NodeWithScore`](../interfaces/NodeWithScore.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `subQ` | [`SubQuestion`](../interfaces/SubQuestion.md) | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<``null`` \| [`NodeWithScore`](../interfaces/NodeWithScore.md)\> + +#### Defined in + +[QueryEngine.ts:134](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L134) + +___ + +### fromDefaults + +▸ `Static` **fromDefaults**(`init`): [`SubQuestionQueryEngine`](SubQuestionQueryEngine.md) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | `Object` | +| `init.queryEngineTools` | [`QueryEngineTool`](../interfaces/QueryEngineTool.md)[] | +| `init.questionGen?` | [`BaseQuestionGenerator`](../interfaces/BaseQuestionGenerator.md) | +| `init.responseSynthesizer?` | [`ResponseSynthesizer`](ResponseSynthesizer.md) | +| `init.serviceContext?` | [`ServiceContext`](../interfaces/ServiceContext.md) | + +#### Returns + +[`SubQuestionQueryEngine`](SubQuestionQueryEngine.md) + +#### Defined in + +[QueryEngine.ts:82](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L82) diff --git a/apps/docs/docs/api/classes/TextFileReader.md b/apps/docs/docs/api/classes/TextFileReader.md new file mode 100644 index 0000000000000000000000000000000000000000..859d0fa28a68634ff197e6d77f10a7412d0c42d2 --- /dev/null +++ b/apps/docs/docs/api/classes/TextFileReader.md @@ 
-0,0 +1,44 @@ +--- +id: "TextFileReader" +title: "Class: TextFileReader" +sidebar_label: "TextFileReader" +sidebar_position: 0 +custom_edit_url: null +--- + +Read a .txt file + +## Implements + +- [`BaseReader`](../interfaces/BaseReader.md) + +## Constructors + +### constructor + +• **new TextFileReader**() + +## Methods + +### loadData + +▸ **loadData**(`file`, `fs?`): `Promise`<[`Document`](Document.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `file` | `string` | +| `fs` | [`CompleteFileSystem`](../modules.md#completefilesystem) | + +#### Returns + +`Promise`<[`Document`](Document.md)[]\> + +#### Implementation of + +[BaseReader](../interfaces/BaseReader.md).[loadData](../interfaces/BaseReader.md#loaddata) + +#### Defined in + +[readers/SimpleDirectoryReader.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/readers/SimpleDirectoryReader.ts#L12) diff --git a/apps/docs/docs/api/classes/TextNode.md b/apps/docs/docs/api/classes/TextNode.md new file mode 100644 index 0000000000000000000000000000000000000000..4b7f6214d28d91600c2dd34e37bfb3fd6779dc15 --- /dev/null +++ b/apps/docs/docs/api/classes/TextNode.md @@ -0,0 +1,458 @@ +--- +id: "TextNode" +title: "Class: TextNode" +sidebar_label: "TextNode" +sidebar_position: 0 +custom_edit_url: null +--- + +TextNode is the default node type for text. 
Most common node type in LlamaIndex.TS + +## Hierarchy + +- [`BaseNode`](BaseNode.md) + + ↳ **`TextNode`** + + ↳↳ [`IndexNode`](IndexNode.md) + + ↳↳ [`Document`](Document.md) + +## Constructors + +### constructor + +• **new TextNode**(`init?`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init?` | `Partial`<[`TextNode`](TextNode.md)\> | + +#### Overrides + +[BaseNode](BaseNode.md).[constructor](BaseNode.md#constructor) + +#### Defined in + +[Node.ts:144](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L144) + +## Properties + +### embedding + +• `Optional` **embedding**: `number`[] + +#### Inherited from + +[BaseNode](BaseNode.md).[embedding](BaseNode.md#embedding) + +#### Defined in + +[Node.ts:39](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L39) + +___ + +### endCharIdx + +• `Optional` **endCharIdx**: `number` + +#### Defined in + +[Node.ts:139](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L139) + +___ + +### excludedEmbedMetadataKeys + +• **excludedEmbedMetadataKeys**: `string`[] = `[]` + +#### Inherited from + +[BaseNode](BaseNode.md).[excludedEmbedMetadataKeys](BaseNode.md#excludedembedmetadatakeys) + +#### Defined in + +[Node.ts:43](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L43) + +___ + +### excludedLlmMetadataKeys + +• **excludedLlmMetadataKeys**: `string`[] = `[]` + +#### Inherited from + +[BaseNode](BaseNode.md).[excludedLlmMetadataKeys](BaseNode.md#excludedllmmetadatakeys) + +#### Defined in + +[Node.ts:44](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L44) + +___ + +### hash + +• **hash**: `string` = `""` + +#### Inherited from + +[BaseNode](BaseNode.md).[hash](BaseNode.md#hash) + +#### Defined in + +[Node.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L46) + +___ + +### id\_ + +• **id\_**: `string` + +#### Inherited from + 
+[BaseNode](BaseNode.md).[id_](BaseNode.md#id_) + +#### Defined in + +[Node.ts:38](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L38) + +___ + +### metadata + +• **metadata**: `Record`<`string`, `any`\> = `{}` + +#### Inherited from + +[BaseNode](BaseNode.md).[metadata](BaseNode.md#metadata) + +#### Defined in + +[Node.ts:42](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L42) + +___ + +### metadataSeparator + +• **metadataSeparator**: `string` = `"\n"` + +#### Defined in + +[Node.ts:142](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L142) + +___ + +### relationships + +• **relationships**: `Partial`<`Record`<[`NodeRelationship`](../enums/NodeRelationship.md), [`RelatedNodeType`](../modules.md#relatednodetype)\>\> = `{}` + +#### Inherited from + +[BaseNode](BaseNode.md).[relationships](BaseNode.md#relationships) + +#### Defined in + +[Node.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L45) + +___ + +### startCharIdx + +• `Optional` **startCharIdx**: `number` + +#### Defined in + +[Node.ts:138](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L138) + +___ + +### text + +• **text**: `string` = `""` + +#### Defined in + +[Node.ts:137](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L137) + +## Accessors + +### childNodes + +• `get` **childNodes**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md)[] + +#### Inherited from + +BaseNode.childNodes + +#### Defined in + +[Node.ts:104](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L104) + +___ + +### nextNode + +• `get` **nextNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + 
+#### Inherited from + +BaseNode.nextNode + +#### Defined in + +[Node.ts:84](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L84) + +___ + +### nodeId + +• `get` **nodeId**(): `string` + +#### Returns + +`string` + +#### Inherited from + +BaseNode.nodeId + +#### Defined in + +[Node.ts:58](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L58) + +___ + +### parentNode + +• `get` **parentNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +BaseNode.parentNode + +#### Defined in + +[Node.ts:94](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L94) + +___ + +### prevNode + +• `get` **prevNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +BaseNode.prevNode + +#### Defined in + +[Node.ts:72](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L72) + +___ + +### sourceNode + +• `get` **sourceNode**(): `undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +`undefined` \| [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +BaseNode.sourceNode + +#### Defined in + +[Node.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L62) + +## Methods + +### asRelatedNodeInfo + +▸ **asRelatedNodeInfo**(): [`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Returns + +[`RelatedNodeInfo`](../interfaces/RelatedNodeInfo.md) + +#### Inherited from + +[BaseNode](BaseNode.md).[asRelatedNodeInfo](BaseNode.md#asrelatednodeinfo) + +#### Defined in + +[Node.ts:124](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L124) + +___ + +### generateHash + +▸ **generateHash**(): `void` + 
+#### Returns + +`void` + +#### Defined in + +[Node.ts:149](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L149) + +___ + +### getContent + +▸ **getContent**(`metadataMode?`): `string` + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | `MetadataMode.NONE` | + +#### Returns + +`string` + +#### Overrides + +[BaseNode](BaseNode.md).[getContent](BaseNode.md#getcontent) + +#### Defined in + +[Node.ts:157](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L157) + +___ + +### getEmbedding + +▸ **getEmbedding**(): `number`[] + +#### Returns + +`number`[] + +#### Inherited from + +[BaseNode](BaseNode.md).[getEmbedding](BaseNode.md#getembedding) + +#### Defined in + +[Node.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L116) + +___ + +### getMetadataStr + +▸ **getMetadataStr**(`metadataMode`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `metadataMode` | [`MetadataMode`](../enums/MetadataMode.md) | + +#### Returns + +`string` + +#### Overrides + +[BaseNode](BaseNode.md).[getMetadataStr](BaseNode.md#getmetadatastr) + +#### Defined in + +[Node.ts:162](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L162) + +___ + +### getNodeInfo + +▸ **getNodeInfo**(): `Object` + +#### Returns + +`Object` + +| Name | Type | +| :------ | :------ | +| `end` | `undefined` \| `number` | +| `start` | `undefined` \| `number` | + +#### Defined in + +[Node.ts:187](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L187) + +___ + +### getText + +▸ **getText**(): `string` + +#### Returns + +`string` + +#### Defined in + +[Node.ts:191](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L191) + +___ + +### getType + +▸ **getType**(): [`ObjectType`](../enums/ObjectType.md) + +#### Returns + 
+[`ObjectType`](../enums/ObjectType.md) + +#### Overrides + +[BaseNode](BaseNode.md).[getType](BaseNode.md#gettype) + +#### Defined in + +[Node.ts:153](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L153) + +___ + +### setContent + +▸ **setContent**(`value`): `void` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `value` | `string` | + +#### Returns + +`void` + +#### Overrides + +[BaseNode](BaseNode.md).[setContent](BaseNode.md#setcontent) + +#### Defined in + +[Node.ts:183](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L183) diff --git a/apps/docs/docs/api/classes/TreeSummarize.md b/apps/docs/docs/api/classes/TreeSummarize.md new file mode 100644 index 0000000000000000000000000000000000000000..86b9def05519d940193d6ecc62026422b36fb60f --- /dev/null +++ b/apps/docs/docs/api/classes/TreeSummarize.md @@ -0,0 +1,65 @@ +--- +id: "TreeSummarize" +title: "Class: TreeSummarize" +sidebar_label: "TreeSummarize" +sidebar_position: 0 +custom_edit_url: null +--- + +TreeSummarize repacks the text chunks into the smallest possible number of chunks and then summarizes them, then recursively does so until there's one chunk left. 
+ +## Implements + +- `BaseResponseBuilder` + +## Constructors + +### constructor + +• **new TreeSummarize**(`serviceContext`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `serviceContext` | [`ServiceContext`](../interfaces/ServiceContext.md) | + +#### Defined in + +[ResponseSynthesizer.ts:212](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L212) + +## Properties + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[ResponseSynthesizer.ts:210](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L210) + +## Methods + +### getResponse + +▸ **getResponse**(`query`, `textChunks`, `parentEvent?`): `Promise`<`string`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `textChunks` | `string`[] | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<`string`\> + +#### Implementation of + +BaseResponseBuilder.getResponse + +#### Defined in + +[ResponseSynthesizer.ts:216](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L216) diff --git a/apps/docs/docs/api/classes/VectorIndexRetriever.md b/apps/docs/docs/api/classes/VectorIndexRetriever.md new file mode 100644 index 0000000000000000000000000000000000000000..a21d2e155b6c1609959b44db64248161a7112a8f --- /dev/null +++ b/apps/docs/docs/api/classes/VectorIndexRetriever.md @@ -0,0 +1,104 @@ +--- +id: "VectorIndexRetriever" +title: "Class: VectorIndexRetriever" +sidebar_label: "VectorIndexRetriever" +sidebar_position: 0 +custom_edit_url: null +--- + +VectorIndexRetriever retrieves nodes from a VectorIndex. 
+ +## Implements + +- [`BaseRetriever`](../interfaces/BaseRetriever.md) + +## Constructors + +### constructor + +• **new VectorIndexRetriever**(`«destructured»`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `«destructured»` | `Object` | +| › `index` | [`VectorStoreIndex`](VectorStoreIndex.md) | +| › `similarityTopK?` | `number` | + +#### Defined in + +[indices/vectorStore/VectorIndexRetriever.ts:22](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorIndexRetriever.ts#L22) + +## Properties + +### index + +• **index**: [`VectorStoreIndex`](VectorStoreIndex.md) + +#### Defined in + +[indices/vectorStore/VectorIndexRetriever.ts:18](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorIndexRetriever.ts#L18) + +___ + +### serviceContext + +• `Private` **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Defined in + +[indices/vectorStore/VectorIndexRetriever.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorIndexRetriever.ts#L20) + +___ + +### similarityTopK + +• **similarityTopK**: `number` + +#### Defined in + +[indices/vectorStore/VectorIndexRetriever.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorIndexRetriever.ts#L19) + +## Methods + +### getServiceContext + +▸ **getServiceContext**(): [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Returns + +[`ServiceContext`](../interfaces/ServiceContext.md) + +#### Implementation of + +[BaseRetriever](../interfaces/BaseRetriever.md).[getServiceContext](../interfaces/BaseRetriever.md#getservicecontext) + +#### Defined in + +[indices/vectorStore/VectorIndexRetriever.ts:69](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorIndexRetriever.ts#L69) + +___ + +### retrieve + +▸ **retrieve**(`query`, `parentEvent?`): 
`Promise`<[`NodeWithScore`](../interfaces/NodeWithScore.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `parentEvent?` | [`Event`](../interfaces/Event.md) | + +#### Returns + +`Promise`<[`NodeWithScore`](../interfaces/NodeWithScore.md)[]\> + +#### Implementation of + +[BaseRetriever](../interfaces/BaseRetriever.md).[retrieve](../interfaces/BaseRetriever.md#retrieve) + +#### Defined in + +[indices/vectorStore/VectorIndexRetriever.ts:35](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorIndexRetriever.ts#L35) diff --git a/apps/docs/docs/api/classes/VectorStoreIndex.md b/apps/docs/docs/api/classes/VectorStoreIndex.md new file mode 100644 index 0000000000000000000000000000000000000000..436dde1aa617176861346492b5cd7b6af8d23cbb --- /dev/null +++ b/apps/docs/docs/api/classes/VectorStoreIndex.md @@ -0,0 +1,271 @@ +--- +id: "VectorStoreIndex" +title: "Class: VectorStoreIndex" +sidebar_label: "VectorStoreIndex" +sidebar_position: 0 +custom_edit_url: null +--- + +The VectorStoreIndex, an index that stores the nodes only according to their vector embeddings.
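The index stores nodes by their vector embeddings, and a retriever answers a query by scoring the query embedding against each stored embedding and keeping the `similarityTopK` best. A minimal sketch of that scoring step follows; the `topKBySimilarity` helper is hypothetical, and the real `VectorIndexRetriever` delegates this work to the index's `VectorStore`.

```typescript
// Hypothetical sketch of similarityTopK retrieval over stored embeddings.
// The real VectorIndexRetriever delegates scoring to a VectorStore.
interface ScoredNode {
  id: string;
  score: number;
}

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    const x = a[i] ?? 0;
    const y = b[i] ?? 0;
    dot += x * y;
    normA += x * x;
    normB += y * y;
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Score every stored embedding and keep the similarityTopK best.
function topKBySimilarity(
  queryEmbedding: number[],
  nodeEmbeddings: Map<string, number[]>,
  similarityTopK = 2,
): ScoredNode[] {
  return [...nodeEmbeddings.entries()]
    .map(([id, embedding]) => ({
      id,
      score: cosineSimilarity(queryEmbedding, embedding),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, similarityTopK);
}
```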
+ +## Hierarchy + +- [`BaseIndex`](BaseIndex.md)<[`IndexDict`](IndexDict.md)\> + + ↳ **`VectorStoreIndex`** + +## Constructors + +### constructor + +• `Private` **new VectorStoreIndex**(`init`) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `init` | [`VectorIndexConstructorProps`](../interfaces/VectorIndexConstructorProps.md) | + +#### Overrides + +[BaseIndex](BaseIndex.md).[constructor](BaseIndex.md#constructor) + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:36](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L36) + +## Properties + +### docStore + +• **docStore**: `BaseDocumentStore` + +#### Inherited from + +[BaseIndex](BaseIndex.md).[docStore](BaseIndex.md#docstore) + +#### Defined in + +[indices/BaseIndex.ts:117](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L117) + +___ + +### indexStore + +• `Optional` **indexStore**: `BaseIndexStore` + +#### Inherited from + +[BaseIndex](BaseIndex.md).[indexStore](BaseIndex.md#indexstore) + +#### Defined in + +[indices/BaseIndex.ts:119](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L119) + +___ + +### indexStruct + +• **indexStruct**: [`IndexDict`](IndexDict.md) + +#### Inherited from + +[BaseIndex](BaseIndex.md).[indexStruct](BaseIndex.md#indexstruct) + +#### Defined in + +[indices/BaseIndex.ts:120](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L120) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](../interfaces/ServiceContext.md) + +#### Inherited from + +[BaseIndex](BaseIndex.md).[serviceContext](BaseIndex.md#servicecontext) + +#### Defined in + +[indices/BaseIndex.ts:115](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L115) + +___ + +### storageContext + +• **storageContext**: [`StorageContext`](../interfaces/StorageContext.md) 
+ +#### Inherited from + +[BaseIndex](BaseIndex.md).[storageContext](BaseIndex.md#storagecontext) + +#### Defined in + +[indices/BaseIndex.ts:116](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L116) + +___ + +### vectorStore + +• **vectorStore**: [`VectorStore`](../interfaces/VectorStore.md) + +#### Overrides + +[BaseIndex](BaseIndex.md).[vectorStore](BaseIndex.md#vectorstore) + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:34](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L34) + +## Methods + +### asQueryEngine + +▸ **asQueryEngine**(`options?`): [`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +Create a new query engine from the index. It will also create a retriever +and response synthesizer if they are not provided. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `options?` | `Object` | you can supply your own custom Retriever and ResponseSynthesizer | +| `options.responseSynthesizer?` | [`ResponseSynthesizer`](ResponseSynthesizer.md) | - | +| `options.retriever?` | [`BaseRetriever`](../interfaces/BaseRetriever.md) | - | +
#### Returns + +[`BaseQueryEngine`](../interfaces/BaseQueryEngine.md) + +#### Overrides + +[BaseIndex](BaseIndex.md).[asQueryEngine](BaseIndex.md#asqueryengine) + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:215](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L215) + +___ + +### asRetriever + +▸ **asRetriever**(`options?`): [`VectorIndexRetriever`](VectorIndexRetriever.md) + +Create a new retriever from the index.
+ +#### Parameters + +| Name | Type | +| :------ | :------ | +| `options?` | `any` | + +#### Returns + +[`VectorIndexRetriever`](VectorIndexRetriever.md) + +#### Overrides + +[BaseIndex](BaseIndex.md).[asRetriever](BaseIndex.md#asretriever) + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:211](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L211) + +___ + +### buildIndexFromNodes + +▸ `Static` **buildIndexFromNodes**(`nodes`, `serviceContext`, `vectorStore`, `docStore`): `Promise`<[`IndexDict`](IndexDict.md)\> + +Get embeddings for nodes and place them into the index. + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `nodes` | [`BaseNode`](BaseNode.md)[] | +| `serviceContext` | [`ServiceContext`](../interfaces/ServiceContext.md) | +| `vectorStore` | [`VectorStore`](../interfaces/VectorStore.md) | +| `docStore` | `BaseDocumentStore` | + +#### Returns + +`Promise`<[`IndexDict`](IndexDict.md)\> + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:151](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L151) + +___ + +### fromDocuments + +▸ `Static` **fromDocuments**(`documents`, `args?`): `Promise`<[`VectorStoreIndex`](VectorStoreIndex.md)\> + +High level API: split documents, get embeddings, and build index. 
+ +#### Parameters + +| Name | Type | +| :------ | :------ | +| `documents` | [`Document`](Document.md)[] | +| `args` | `Object` | +| `args.serviceContext?` | [`ServiceContext`](../interfaces/ServiceContext.md) | +| `args.storageContext?` | [`StorageContext`](../interfaces/StorageContext.md) | + +#### Returns + +`Promise`<[`VectorStoreIndex`](VectorStoreIndex.md)\> + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:186](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L186) + +___ + +### getNodeEmbeddingResults + +▸ `Static` **getNodeEmbeddingResults**(`nodes`, `serviceContext`, `logProgress?`): `Promise`<[`NodeWithEmbedding`](../interfaces/NodeWithEmbedding.md)[]\> + +Get the embeddings for nodes. + +#### Parameters + +| Name | Type | Default value | Description | +| :------ | :------ | :------ | :------ | +| `nodes` | [`BaseNode`](BaseNode.md)[] | `undefined` | | +| `serviceContext` | [`ServiceContext`](../interfaces/ServiceContext.md) | `undefined` | | +| `logProgress` | `boolean` | `false` | log progress to console (useful for debugging) | + +#### Returns + +`Promise`<[`NodeWithEmbedding`](../interfaces/NodeWithEmbedding.md)[]\> + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:123](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L123) + +___ + +### init + +▸ `Static` **init**(`options`): `Promise`<[`VectorStoreIndex`](VectorStoreIndex.md)\> + +The async init function should be called after the constructor. +This is needed to handle persistence. 
+ +#### Parameters + +| Name | Type | +| :------ | :------ | +| `options` | [`VectorIndexOptions`](../interfaces/VectorIndexOptions.md) | + +#### Returns + +`Promise`<[`VectorStoreIndex`](VectorStoreIndex.md)\> + +#### Defined in + +[indices/vectorStore/VectorStoreIndex.ts:47](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/vectorStore/VectorStoreIndex.ts#L47) diff --git a/apps/docs/docs/api/classes/_category_.yml b/apps/docs/docs/api/classes/_category_.yml new file mode 100644 index 0000000000000000000000000000000000000000..55c7980a46440f2ff50537c3392125668d3bb43e --- /dev/null +++ b/apps/docs/docs/api/classes/_category_.yml @@ -0,0 +1,2 @@ +label: "Classes" +position: 3 \ No newline at end of file diff --git a/apps/docs/docs/api/enums/DeuceChatStrategy.md b/apps/docs/docs/api/enums/DeuceChatStrategy.md new file mode 100644 index 0000000000000000000000000000000000000000..b3ff389937e46b7020404d1b1e9282bfa09f8254 --- /dev/null +++ b/apps/docs/docs/api/enums/DeuceChatStrategy.md @@ -0,0 +1,37 @@ +--- +id: "DeuceChatStrategy" +title: "Enumeration: DeuceChatStrategy" +sidebar_label: "DeuceChatStrategy" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### A16Z + +• **A16Z** = ``"a16z"`` + +#### Defined in + +[llm/LLM.ts:190](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L190) + +___ + +### META + +• **META** = ``"meta"`` + +#### Defined in + +[llm/LLM.ts:191](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L191) + +___ + +### METAWBOS + +• **METAWBOS** = ``"metawbos"`` + +#### Defined in + +[llm/LLM.ts:192](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L192) diff --git a/apps/docs/docs/api/enums/IndexStructType.md b/apps/docs/docs/api/enums/IndexStructType.md new file mode 100644 index 0000000000000000000000000000000000000000..f6ee3cf2640d62ff415f52910d97d27658ed7825 --- /dev/null +++ 
b/apps/docs/docs/api/enums/IndexStructType.md @@ -0,0 +1,27 @@ +--- +id: "IndexStructType" +title: "Enumeration: IndexStructType" +sidebar_label: "IndexStructType" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### LIST + +• **LIST** = ``"list"`` + +#### Defined in + +[indices/BaseIndex.ts:41](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L41) + +___ + +### SIMPLE\_DICT + +• **SIMPLE\_DICT** = ``"simple_dict"`` + +#### Defined in + +[indices/BaseIndex.ts:40](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L40) diff --git a/apps/docs/docs/api/enums/ListRetrieverMode.md b/apps/docs/docs/api/enums/ListRetrieverMode.md new file mode 100644 index 0000000000000000000000000000000000000000..fe1addec49248baf497f8892512d85f1d38c7101 --- /dev/null +++ b/apps/docs/docs/api/enums/ListRetrieverMode.md @@ -0,0 +1,27 @@ +--- +id: "ListRetrieverMode" +title: "Enumeration: ListRetrieverMode" +sidebar_label: "ListRetrieverMode" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### DEFAULT + +• **DEFAULT** = ``"default"`` + +#### Defined in + +[indices/list/ListIndex.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L26) + +___ + +### LLM + +• **LLM** = ``"llm"`` + +#### Defined in + +[indices/list/ListIndex.ts:28](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/list/ListIndex.ts#L28) diff --git a/apps/docs/docs/api/enums/MetadataMode.md b/apps/docs/docs/api/enums/MetadataMode.md new file mode 100644 index 0000000000000000000000000000000000000000..732189f334980bba75d195c07e748a7548474430 --- /dev/null +++ b/apps/docs/docs/api/enums/MetadataMode.md @@ -0,0 +1,47 @@ +--- +id: "MetadataMode" +title: "Enumeration: MetadataMode" +sidebar_label: "MetadataMode" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### ALL + +• **ALL** = 
``"ALL"`` + +#### Defined in + +[Node.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L19) + +___ + +### EMBED + +• **EMBED** = ``"EMBED"`` + +#### Defined in + +[Node.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L20) + +___ + +### LLM + +• **LLM** = ``"LLM"`` + +#### Defined in + +[Node.ts:21](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L21) + +___ + +### NONE + +• **NONE** = ``"NONE"`` + +#### Defined in + +[Node.ts:22](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L22) diff --git a/apps/docs/docs/api/enums/NodeRelationship.md b/apps/docs/docs/api/enums/NodeRelationship.md new file mode 100644 index 0000000000000000000000000000000000000000..3e65d05bf84c9f24bd3d458c6c9c576c74b2334c --- /dev/null +++ b/apps/docs/docs/api/enums/NodeRelationship.md @@ -0,0 +1,57 @@ +--- +id: "NodeRelationship" +title: "Enumeration: NodeRelationship" +sidebar_label: "NodeRelationship" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### CHILD + +• **CHILD** = ``"CHILD"`` + +#### Defined in + +[Node.ts:8](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L8) + +___ + +### NEXT + +• **NEXT** = ``"NEXT"`` + +#### Defined in + +[Node.ts:6](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L6) + +___ + +### PARENT + +• **PARENT** = ``"PARENT"`` + +#### Defined in + +[Node.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L7) + +___ + +### PREVIOUS + +• **PREVIOUS** = ``"PREVIOUS"`` + +#### Defined in + +[Node.ts:5](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L5) + +___ + +### SOURCE + +• **SOURCE** = ``"SOURCE"`` + +#### Defined in + +[Node.ts:4](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L4) diff --git a/apps/docs/docs/api/enums/ObjectType.md 
b/apps/docs/docs/api/enums/ObjectType.md new file mode 100644 index 0000000000000000000000000000000000000000..397aac1bc143275a24fe9422782187cacfe6989f --- /dev/null +++ b/apps/docs/docs/api/enums/ObjectType.md @@ -0,0 +1,47 @@ +--- +id: "ObjectType" +title: "Enumeration: ObjectType" +sidebar_label: "ObjectType" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### DOCUMENT + +• **DOCUMENT** = ``"DOCUMENT"`` + +#### Defined in + +[Node.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L15) + +___ + +### IMAGE + +• **IMAGE** = ``"IMAGE"`` + +#### Defined in + +[Node.ts:13](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L13) + +___ + +### INDEX + +• **INDEX** = ``"INDEX"`` + +#### Defined in + +[Node.ts:14](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L14) + +___ + +### TEXT + +• **TEXT** = ``"TEXT"`` + +#### Defined in + +[Node.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L12) diff --git a/apps/docs/docs/api/enums/SimilarityType.md b/apps/docs/docs/api/enums/SimilarityType.md new file mode 100644 index 0000000000000000000000000000000000000000..5848da6a09891c506c9c834e514b20003c3511ff --- /dev/null +++ b/apps/docs/docs/api/enums/SimilarityType.md @@ -0,0 +1,40 @@ +--- +id: "SimilarityType" +title: "Enumeration: SimilarityType" +sidebar_label: "SimilarityType" +sidebar_position: 0 +custom_edit_url: null +--- + +Similarity type +Default is cosine similarity. Dot product and negative Euclidean distance are also supported. 
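The three measures named by this enum can be written out directly. The functions below are an illustrative sketch, not the implementations in `Embedding.ts`; note that the Euclidean distance is negated so that a larger score always means "more similar", consistent with the other two modes.

```typescript
// Illustrative versions of the three SimilarityType measures.
// Larger values always mean "more similar" (Euclidean is negated).
function dotProduct(a: number[], b: number[]): number {
  return a.reduce((sum, x, i) => sum + x * (b[i] ?? 0), 0);
}

// DEFAULT ("cosine"): dot product normalized by both magnitudes.
function cosine(a: number[], b: number[]): number {
  return (
    dotProduct(a, b) / (Math.sqrt(dotProduct(a, a)) * Math.sqrt(dotProduct(b, b)))
  );
}

// EUCLIDEAN: negative distance, so closer vectors score higher.
function negativeEuclidean(a: number[], b: number[]): number {
  return -Math.sqrt(a.reduce((sum, x, i) => sum + (x - (b[i] ?? 0)) ** 2, 0));
}
```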
+ +## Enumeration Members + +### DEFAULT + +• **DEFAULT** = ``"cosine"`` + +#### Defined in + +[Embedding.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L10) + +___ + +### DOT\_PRODUCT + +• **DOT\_PRODUCT** = ``"dot_product"`` + +#### Defined in + +[Embedding.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L11) + +___ + +### EUCLIDEAN + +• **EUCLIDEAN** = ``"euclidean"`` + +#### Defined in + +[Embedding.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L12) diff --git a/apps/docs/docs/api/enums/VectorStoreQueryMode.md b/apps/docs/docs/api/enums/VectorStoreQueryMode.md new file mode 100644 index 0000000000000000000000000000000000000000..ab30e61ff6cb560d61db7fa949eb2a25b88dddb5 --- /dev/null +++ b/apps/docs/docs/api/enums/VectorStoreQueryMode.md @@ -0,0 +1,77 @@ +--- +id: "VectorStoreQueryMode" +title: "Enumeration: VectorStoreQueryMode" +sidebar_label: "VectorStoreQueryMode" +sidebar_position: 0 +custom_edit_url: null +--- + +## Enumeration Members + +### DEFAULT + +• **DEFAULT** = ``"default"`` + +#### Defined in + +[storage/vectorStore/types.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L12) + +___ + +### HYBRID + +• **HYBRID** = ``"hybrid"`` + +#### Defined in + +[storage/vectorStore/types.ts:14](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L14) + +___ + +### LINEAR\_REGRESSION + +• **LINEAR\_REGRESSION** = ``"linear_regression"`` + +#### Defined in + +[storage/vectorStore/types.ts:18](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L18) + +___ + +### LOGISTIC\_REGRESSION + +• **LOGISTIC\_REGRESSION** = ``"logistic_regression"`` + +#### Defined in + 
+[storage/vectorStore/types.ts:17](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L17) + +___ + +### MMR + +• **MMR** = ``"mmr"`` + +#### Defined in + +[storage/vectorStore/types.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L20) + +___ + +### SPARSE + +• **SPARSE** = ``"sparse"`` + +#### Defined in + +[storage/vectorStore/types.ts:13](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L13) + +___ + +### SVM + +• **SVM** = ``"svm"`` + +#### Defined in + +[storage/vectorStore/types.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L16) diff --git a/apps/docs/docs/api/enums/_category_.yml b/apps/docs/docs/api/enums/_category_.yml new file mode 100644 index 0000000000000000000000000000000000000000..1687a9e03fd705b092a975c2bd86f8e76af69ea1 --- /dev/null +++ b/apps/docs/docs/api/enums/_category_.yml @@ -0,0 +1,2 @@ +label: "Enumerations" +position: 2 \ No newline at end of file diff --git a/apps/docs/docs/api/index.md b/apps/docs/docs/api/index.md new file mode 100644 index 0000000000000000000000000000000000000000..1f6232aeebf3383b7466f64cb4b0b0b71cd65f33 --- /dev/null +++ b/apps/docs/docs/api/index.md @@ -0,0 +1,107 @@ +--- +id: "index" +title: "llamaindex" +sidebar_label: "Readme" +sidebar_position: 0 +custom_edit_url: null +--- + +# LlamaIndex.TS + +LlamaIndex is a data framework for your LLM application. + +Use your own data with large language models (LLMs, OpenAI ChatGPT and others) in TypeScript and JavaScript. + +Documentation: https://ts.llamaindex.ai/ + +## What is LlamaIndex.TS? + +LlamaIndex.TS aims to be a lightweight, easy-to-use set of libraries to help you integrate large language models into your applications with your own data. + +## Getting started with an example: + +LlamaIndex.TS requires Node v18 or higher.
You can download it from https://nodejs.org or use https://nvm.sh (our preferred option). + +In a new folder: + +```bash +export OPENAI_API_KEY="sk-......" # Replace with your key from https://platform.openai.com/account/api-keys +pnpm init +pnpm install typescript +pnpm exec tsc --init # if needed +pnpm install llamaindex +pnpm install @types/node +``` + +Create the file `example.ts` + +```ts +// example.ts +import fs from "fs/promises"; +import { Document, VectorStoreIndex } from "llamaindex"; + +async function main() { + // Load essay from abramov.txt in Node + const essay = await fs.readFile( + "node_modules/llamaindex/examples/abramov.txt", + "utf-8" + ); + +  // Create Document object with essay + const document = new Document({ text: essay }); + + // Split text and create embeddings. Store them in a VectorStoreIndex + const index = await VectorStoreIndex.fromDocuments([document]); + + // Query the index + const queryEngine = index.asQueryEngine(); + const response = await queryEngine.query( + "What did the author do in college?" + ); + + // Output response + console.log(response.toString()); +} + +main(); +``` + +Then you can run it using + +```bash +pnpm dlx ts-node example.ts +``` + +## Playground + +Check out our NextJS playground at https://llama-playground.vercel.app/. The source is available at https://github.com/run-llama/ts-playground + +## Core concepts for getting started: + +- [Document](/packages/core/src/Node.ts): A document represents a text file, PDF file or other contiguous piece of data. + +- [Node](/packages/core/src/Node.ts): The basic data building block. Most commonly, these are parts of the document split into manageable pieces that are small enough to be fed into an embedding model and LLM. + +- [Embedding](/packages/core/src/Embedding.ts): Embeddings are sets of floating point numbers which represent the data in a Node. By comparing the similarity of embeddings, we can derive an understanding of the similarity of two pieces of data.
One use case is to compare the embedding of a question with the embeddings of our Nodes to see which Nodes may contain the data needed to answer that question. + +- [Indices](/packages/core/src/indices/): Indices store the Nodes and the embeddings of those nodes. QueryEngines retrieve Nodes from these Indices using embedding similarity. + +- [QueryEngine](/packages/core/src/QueryEngine.ts): Query engines take the query you put in and give you back the result. Query engines generally combine a pre-built prompt with selected Nodes from your Index to give the LLM the context it needs to answer your query. + +- [ChatEngine](/packages/core/src/ChatEngine.ts): A ChatEngine helps you build a chatbot that will interact with your Indices. + +- [SimplePrompt](/packages/core/src/Prompt.ts): A simple standardized function call definition that takes in inputs and formats them in a template literal. SimplePrompts can be specialized using currying and combined using other SimplePrompt functions. + +## Supported LLMs: + +- OpenAI GPT-3.5-turbo and GPT-4 +- Anthropic Claude Instant and Claude 2 +- Llama2 Chat LLMs (70B, 13B, and 7B parameters) + +## Contributing: + +We are in the very early days of LlamaIndex.TS. If you’re interested in hacking on it with us, check out our [contributing guide](/CONTRIBUTING.md). + +## Bugs? Questions? + +Please join our Discord!
https://discord.com/invite/eN6D2HQ4aX diff --git a/apps/docs/docs/api/interfaces/BaseIndexInit.md b/apps/docs/docs/api/interfaces/BaseIndexInit.md new file mode 100644 index 0000000000000000000000000000000000000000..93a8e6342231f668fa84a740ae0e1870434298f2 --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseIndexInit.md @@ -0,0 +1,79 @@ +--- +id: "BaseIndexInit" +title: "Interface: BaseIndexInit<T>" +sidebar_label: "BaseIndexInit" +sidebar_position: 0 +custom_edit_url: null +--- + +## Type parameters + +| Name | +| :------ | +| `T` | + +## Hierarchy + +- **`BaseIndexInit`** + + ↳ [`VectorIndexConstructorProps`](VectorIndexConstructorProps.md) + +## Properties + +### docStore + +• **docStore**: `BaseDocumentStore` + +#### Defined in + +[indices/BaseIndex.ts:104](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L104) + +___ + +### indexStore + +• `Optional` **indexStore**: `BaseIndexStore` + +#### Defined in + +[indices/BaseIndex.ts:106](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L106) + +___ + +### indexStruct + +• **indexStruct**: `T` + +#### Defined in + +[indices/BaseIndex.ts:107](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L107) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](ServiceContext.md) + +#### Defined in + +[indices/BaseIndex.ts:102](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L102) + +___ + +### storageContext + +• **storageContext**: [`StorageContext`](StorageContext.md) + +#### Defined in + +[indices/BaseIndex.ts:103](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L103) + +___ + +### vectorStore + +• `Optional` **vectorStore**: [`VectorStore`](VectorStore.md) + +#### Defined in + 
+[indices/BaseIndex.ts:105](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L105) diff --git a/apps/docs/docs/api/interfaces/BaseOutputParser.md b/apps/docs/docs/api/interfaces/BaseOutputParser.md new file mode 100644 index 0000000000000000000000000000000000000000..6f9b313d4852a7bc0e4dc6312b2d10666348deed --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseOutputParser.md @@ -0,0 +1,59 @@ +--- +id: "BaseOutputParser" +title: "Interface: BaseOutputParser<T>" +sidebar_label: "BaseOutputParser" +sidebar_position: 0 +custom_edit_url: null +--- + +An OutputParser is used to extract structured data from the raw output of the LLM. + +## Type parameters + +| Name | +| :------ | +| `T` | + +## Implemented by + +- [`SubQuestionOutputParser`](../classes/SubQuestionOutputParser.md) + +## Methods + +### format + +▸ **format**(`output`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `output` | `string` | + +#### Returns + +`string` + +#### Defined in + +[OutputParser.ts:8](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/OutputParser.ts#L8) + +___ + +### parse + +▸ **parse**(`output`): `T` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `output` | `string` | + +#### Returns + +`T` + +#### Defined in + +[OutputParser.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/OutputParser.ts#L7) diff --git a/apps/docs/docs/api/interfaces/BaseQueryEngine.md b/apps/docs/docs/api/interfaces/BaseQueryEngine.md new file mode 100644 index 0000000000000000000000000000000000000000..91d874535af7eb655c0aebb4195612ce73b3fc6f --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseQueryEngine.md @@ -0,0 +1,37 @@ +--- +id: "BaseQueryEngine" +title: "Interface: BaseQueryEngine" +sidebar_label: "BaseQueryEngine" +sidebar_position: 0 +custom_edit_url: null +--- + +A query engine is a question answerer that can use one or more steps. 
+ +## Implemented by + +- [`RetrieverQueryEngine`](../classes/RetrieverQueryEngine.md) +- [`SubQuestionQueryEngine`](../classes/SubQuestionQueryEngine.md) + +## Methods + +### query + +▸ **query**(`query`, `parentEvent?`): `Promise`<[`Response`](../classes/Response.md)\> + +Query the query engine and get a response. + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `parentEvent?` | [`Event`](Event.md) | + +#### Returns + +`Promise`<[`Response`](../classes/Response.md)\> + +#### Defined in + +[QueryEngine.ts:24](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QueryEngine.ts#L24) diff --git a/apps/docs/docs/api/interfaces/BaseQuestionGenerator.md b/apps/docs/docs/api/interfaces/BaseQuestionGenerator.md new file mode 100644 index 0000000000000000000000000000000000000000..129a9844cd1252473319c64a10edd0bd2ed99107 --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseQuestionGenerator.md @@ -0,0 +1,34 @@ +--- +id: "BaseQuestionGenerator" +title: "Interface: BaseQuestionGenerator" +sidebar_label: "BaseQuestionGenerator" +sidebar_position: 0 +custom_edit_url: null +--- + +QuestionGenerators generate new questions for the LLM using tools and a user query. 
+ +## Implemented by + +- [`LLMQuestionGenerator`](../classes/LLMQuestionGenerator.md) + +## Methods + +### generate + +▸ **generate**(`tools`, `query`): `Promise`<[`SubQuestion`](SubQuestion.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `tools` | [`ToolMetadata`](ToolMetadata.md)[] | +| `query` | `string` | + +#### Returns + +`Promise`<[`SubQuestion`](SubQuestion.md)[]\> + +#### Defined in + +[QuestionGenerator.ts:23](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L23) diff --git a/apps/docs/docs/api/interfaces/BaseReader.md b/apps/docs/docs/api/interfaces/BaseReader.md new file mode 100644 index 0000000000000000000000000000000000000000..114d1ede9a14d0abfceec8210c1570761ae9d281 --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseReader.md @@ -0,0 +1,35 @@ +--- +id: "BaseReader" +title: "Interface: BaseReader" +sidebar_label: "BaseReader" +sidebar_position: 0 +custom_edit_url: null +--- + +A reader imports data into Document objects.
+ +## Implemented by + +- [`PDFReader`](../classes/PDFReader.md) +- [`SimpleDirectoryReader`](../classes/SimpleDirectoryReader.md) +- [`TextFileReader`](../classes/TextFileReader.md) + +## Methods + +### loadData + +▸ **loadData**(`...args`): `Promise`<[`Document`](../classes/Document.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `...args` | `any`[] | + +#### Returns + +`Promise`<[`Document`](../classes/Document.md)[]\> + +#### Defined in + +[readers/base.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/readers/base.ts#L7) diff --git a/apps/docs/docs/api/interfaces/BaseRetriever.md b/apps/docs/docs/api/interfaces/BaseRetriever.md new file mode 100644 index 0000000000000000000000000000000000000000..6a499511b81b1d56108ee9d482cf97c8854d0cbf --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseRetriever.md @@ -0,0 +1,50 @@ +--- +id: "BaseRetriever" +title: "Interface: BaseRetriever" +sidebar_label: "BaseRetriever" +sidebar_position: 0 +custom_edit_url: null +--- + +Retrievers retrieve the nodes that most closely match our query in similarity. 
+ +## Implemented by + +- [`ListIndexLLMRetriever`](../classes/ListIndexLLMRetriever.md) +- [`ListIndexRetriever`](../classes/ListIndexRetriever.md) +- [`VectorIndexRetriever`](../classes/VectorIndexRetriever.md) + +## Methods + +### getServiceContext + +▸ **getServiceContext**(): [`ServiceContext`](ServiceContext.md) + +#### Returns + +[`ServiceContext`](ServiceContext.md) + +#### Defined in + +[Retriever.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Retriever.ts#L10) + +___ + +### retrieve + +▸ **retrieve**(`query`, `parentEvent?`): `Promise`<[`NodeWithScore`](NodeWithScore.md)[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | `string` | +| `parentEvent?` | [`Event`](Event.md) | + +#### Returns + +`Promise`<[`NodeWithScore`](NodeWithScore.md)[]\> + +#### Defined in + +[Retriever.ts:9](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Retriever.ts#L9) diff --git a/apps/docs/docs/api/interfaces/BaseTool.md b/apps/docs/docs/api/interfaces/BaseTool.md new file mode 100644 index 0000000000000000000000000000000000000000..3c715d484d0b5f7dbd27067f4a45f034f357d1da --- /dev/null +++ b/apps/docs/docs/api/interfaces/BaseTool.md @@ -0,0 +1,25 @@ +--- +id: "BaseTool" +title: "Interface: BaseTool" +sidebar_label: "BaseTool" +sidebar_position: 0 +custom_edit_url: null +--- + +Simple Tool interface. Likely to change. 
+ +## Hierarchy + +- **`BaseTool`** + + ↳ [`QueryEngineTool`](QueryEngineTool.md) + +## Properties + +### metadata + +• **metadata**: [`ToolMetadata`](ToolMetadata.md) + +#### Defined in + +[Tool.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Tool.ts#L12) diff --git a/apps/docs/docs/api/interfaces/ChatEngine.md b/apps/docs/docs/api/interfaces/ChatEngine.md new file mode 100644 index 0000000000000000000000000000000000000000..720d13a91108f1104303e4716a8cf283da9cb0c7 --- /dev/null +++ b/apps/docs/docs/api/interfaces/ChatEngine.md @@ -0,0 +1,54 @@ +--- +id: "ChatEngine" +title: "Interface: ChatEngine" +sidebar_label: "ChatEngine" +sidebar_position: 0 +custom_edit_url: null +--- + +A ChatEngine is used to handle back-and-forth chats between the application and the LLM. + +## Implemented by + +- [`CondenseQuestionChatEngine`](../classes/CondenseQuestionChatEngine.md) +- [`ContextChatEngine`](../classes/ContextChatEngine.md) +- [`SimpleChatEngine`](../classes/SimpleChatEngine.md) + +## Methods + +### chat + +▸ **chat**(`message`, `chatHistory?`): `Promise`<[`Response`](../classes/Response.md)\> + +Send a message along with the class's current chat history to the LLM. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `message` | `string` | | +| `chatHistory?` | [`ChatMessage`](ChatMessage.md)[] | optional chat history if you want to customize the chat history | + +#### Returns + +`Promise`<[`Response`](../classes/Response.md)\> + +#### Defined in + +[ChatEngine.ts:25](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L25) + +___ + +### reset + +▸ **reset**(): `void` + +Resets the chat history so that it's empty.
+ +#### Returns + +`void` + +#### Defined in + +[ChatEngine.ts:30](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ChatEngine.ts#L30) diff --git a/apps/docs/docs/api/interfaces/ChatMessage.md b/apps/docs/docs/api/interfaces/ChatMessage.md new file mode 100644 index 0000000000000000000000000000000000000000..1a4eaae9a4f364c262e3feccebfbfb435db1faf8 --- /dev/null +++ b/apps/docs/docs/api/interfaces/ChatMessage.md @@ -0,0 +1,27 @@ +--- +id: "ChatMessage" +title: "Interface: ChatMessage" +sidebar_label: "ChatMessage" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### content + +• **content**: `string` + +#### Defined in + +[llm/LLM.ts:21](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L21) + +___ + +### role + +• **role**: [`MessageType`](../modules.md#messagetype) + +#### Defined in + +[llm/LLM.ts:22](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L22) diff --git a/apps/docs/docs/api/interfaces/ChatResponse.md b/apps/docs/docs/api/interfaces/ChatResponse.md new file mode 100644 index 0000000000000000000000000000000000000000..104722024c1784925f11fb7fc9286eea175f3289 --- /dev/null +++ b/apps/docs/docs/api/interfaces/ChatResponse.md @@ -0,0 +1,37 @@ +--- +id: "ChatResponse" +title: "Interface: ChatResponse" +sidebar_label: "ChatResponse" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### delta + +• `Optional` **delta**: `string` + +#### Defined in + +[llm/LLM.ts:28](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L28) + +___ + +### message + +• **message**: [`ChatMessage`](ChatMessage.md) + +#### Defined in + +[llm/LLM.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L26) + +___ + +### raw + +• `Optional` **raw**: `Record`<`string`, `any`\> + +#### Defined in + +[llm/LLM.ts:27](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L27) diff 
--git a/apps/docs/docs/api/interfaces/Event.md b/apps/docs/docs/api/interfaces/Event.md new file mode 100644 index 0000000000000000000000000000000000000000..2448cc948932388bd9a305de4673ccbdca526101 --- /dev/null +++ b/apps/docs/docs/api/interfaces/Event.md @@ -0,0 +1,47 @@ +--- +id: "Event" +title: "Interface: Event" +sidebar_label: "Event" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### id + +• **id**: `string` + +#### Defined in + +[callbacks/CallbackManager.ts:13](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L13) + +___ + +### parentId + +• `Optional` **parentId**: `string` + +#### Defined in + +[callbacks/CallbackManager.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L16) + +___ + +### tags + +• `Optional` **tags**: [`EventTag`](../modules.md#eventtag)[] + +#### Defined in + +[callbacks/CallbackManager.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L15) + +___ + +### type + +• **type**: [`EventType`](../modules.md#eventtype) + +#### Defined in + +[callbacks/CallbackManager.ts:14](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L14) diff --git a/apps/docs/docs/api/interfaces/ExactMatchFilter.md b/apps/docs/docs/api/interfaces/ExactMatchFilter.md new file mode 100644 index 0000000000000000000000000000000000000000..27620144ed75359f4f0394fb4c39803199868cba --- /dev/null +++ b/apps/docs/docs/api/interfaces/ExactMatchFilter.md @@ -0,0 +1,27 @@ +--- +id: "ExactMatchFilter" +title: "Interface: ExactMatchFilter" +sidebar_label: "ExactMatchFilter" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### key + +• **key**: `string` + +#### Defined in + +[storage/vectorStore/types.ts:24](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L24) + +___ + +### value + +• 
**value**: `string` \| `number` + +#### Defined in + +[storage/vectorStore/types.ts:25](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L25) diff --git a/apps/docs/docs/api/interfaces/GenericFileSystem.md b/apps/docs/docs/api/interfaces/GenericFileSystem.md new file mode 100644 index 0000000000000000000000000000000000000000..3d7ce8e36beab1e1a09474c87f821cfd78c67d94 --- /dev/null +++ b/apps/docs/docs/api/interfaces/GenericFileSystem.md @@ -0,0 +1,100 @@ +--- +id: "GenericFileSystem" +title: "Interface: GenericFileSystem" +sidebar_label: "GenericFileSystem" +sidebar_position: 0 +custom_edit_url: null +--- + +A filesystem interface that is meant to be compatible with +the 'fs' module from Node.js. +Allows for the use of a similar interface implementation in +browsers. + +## Implemented by + +- [`InMemoryFileSystem`](../classes/InMemoryFileSystem.md) + +## Methods + +### access + +▸ **access**(`path`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | + +#### Returns + +`Promise`<`void`\> + +#### Defined in + +[storage/FileSystem.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L12) + +___ + +### mkdir + +▸ **mkdir**(`path`, `options?`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | +| `options?` | `any` | + +#### Returns + +`Promise`<`void`\> + +#### Defined in + +[storage/FileSystem.ts:13](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L13) + +___ + +### readFile + +▸ **readFile**(`path`, `options?`): `Promise`<`string`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | +| `options?` | `any` | + +#### Returns + +`Promise`<`string`\> + +#### Defined in + +[storage/FileSystem.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L11) + +___ +
+### writeFile + +▸ **writeFile**(`path`, `content`, `options?`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | +| `content` | `string` | +| `options?` | `any` | + +#### Returns + +`Promise`<`void`\> + +#### Defined in + +[storage/FileSystem.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L10) diff --git a/apps/docs/docs/api/interfaces/LLM.md b/apps/docs/docs/api/interfaces/LLM.md new file mode 100644 index 0000000000000000000000000000000000000000..798e1802b4b3d1fbbdf4463476c3051f59affd73 --- /dev/null +++ b/apps/docs/docs/api/interfaces/LLM.md @@ -0,0 +1,61 @@ +--- +id: "LLM" +title: "Interface: LLM" +sidebar_label: "LLM" +sidebar_position: 0 +custom_edit_url: null +--- + +Unified language model interface + +## Implemented by + +- [`Anthropic`](../classes/Anthropic.md) +- [`LlamaDeuce`](../classes/LlamaDeuce.md) +- [`OpenAI`](../classes/OpenAI.md) + +## Methods + +### chat + +▸ **chat**(`messages`, `parentEvent?`): `Promise`<[`ChatResponse`](ChatResponse.md)\> + +Get a chat response from the LLM + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](ChatMessage.md)[] | +| `parentEvent?` | [`Event`](Event.md) | + +#### Returns + +`Promise`<[`ChatResponse`](ChatResponse.md)\> + +#### Defined in + +[llm/LLM.ts:42](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L42) + +___ + +### complete + +▸ **complete**(`prompt`, `parentEvent?`): `Promise`<[`ChatResponse`](ChatResponse.md)\> + +Get a prompt completion from the LLM + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `prompt` | `string` | the prompt to complete | +| `parentEvent?` | [`Event`](Event.md) | - | + +#### Returns + +`Promise`<[`ChatResponse`](ChatResponse.md)\> + +#### Defined in + +[llm/LLM.ts:48](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L48) diff --git 
a/apps/docs/docs/api/interfaces/MetadataFilters.md b/apps/docs/docs/api/interfaces/MetadataFilters.md new file mode 100644 index 0000000000000000000000000000000000000000..8c0cf31d0a35eb6f09b46ba4fa59bc7230391ca8 --- /dev/null +++ b/apps/docs/docs/api/interfaces/MetadataFilters.md @@ -0,0 +1,17 @@ +--- +id: "MetadataFilters" +title: "Interface: MetadataFilters" +sidebar_label: "MetadataFilters" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### filters + +• **filters**: [`ExactMatchFilter`](ExactMatchFilter.md)[] + +#### Defined in + +[storage/vectorStore/types.ts:29](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L29) diff --git a/apps/docs/docs/api/interfaces/MetadataInfo.md b/apps/docs/docs/api/interfaces/MetadataInfo.md new file mode 100644 index 0000000000000000000000000000000000000000..4bac991dfdb8c415d35593886c51fdaaeb8bd720 --- /dev/null +++ b/apps/docs/docs/api/interfaces/MetadataInfo.md @@ -0,0 +1,37 @@ +--- +id: "MetadataInfo" +title: "Interface: MetadataInfo" +sidebar_label: "MetadataInfo" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### description + +• **description**: `string` + +#### Defined in + +[storage/vectorStore/types.ts:41](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L41) + +___ + +### name + +• **name**: `string` + +#### Defined in + +[storage/vectorStore/types.ts:39](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L39) + +___ + +### type + +• **type**: `string` + +#### Defined in + +[storage/vectorStore/types.ts:40](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L40) diff --git a/apps/docs/docs/api/interfaces/NodeParser.md b/apps/docs/docs/api/interfaces/NodeParser.md new file mode 100644 index 0000000000000000000000000000000000000000..61cb47210076d7cf088f40ad7d19ad1700871a65 --- /dev/null 
+++ b/apps/docs/docs/api/interfaces/NodeParser.md @@ -0,0 +1,37 @@ +--- +id: "NodeParser" +title: "Interface: NodeParser" +sidebar_label: "NodeParser" +sidebar_position: 0 +custom_edit_url: null +--- + +A NodeParser generates TextNodes from Documents + +## Implemented by + +- [`SimpleNodeParser`](../classes/SimpleNodeParser.md) + +## Methods + +### getNodesFromDocuments + +▸ **getNodesFromDocuments**(`documents`): [`TextNode`](../classes/TextNode.md)[] + +Generates an array of nodes from an array of documents. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `documents` | [`Document`](../classes/Document.md)[] | The documents to generate nodes from. | + +#### Returns + +[`TextNode`](../classes/TextNode.md)[] + +An array of nodes. + +#### Defined in + +[NodeParser.ts:73](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L73) diff --git a/apps/docs/docs/api/interfaces/NodeWithEmbedding.md b/apps/docs/docs/api/interfaces/NodeWithEmbedding.md new file mode 100644 index 0000000000000000000000000000000000000000..d976889168a6d73413060a07317a0a81bc156c0b --- /dev/null +++ b/apps/docs/docs/api/interfaces/NodeWithEmbedding.md @@ -0,0 +1,29 @@ +--- +id: "NodeWithEmbedding" +title: "Interface: NodeWithEmbedding" +sidebar_label: "NodeWithEmbedding" +sidebar_position: 0 +custom_edit_url: null +--- + +A node with an embedding + +## Properties + +### embedding + +• **embedding**: `number`[] + +#### Defined in + +[Node.ts:247](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L247) + +___ + +### node + +• **node**: [`BaseNode`](../classes/BaseNode.md) + +#### Defined in + +[Node.ts:246](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L246) diff --git a/apps/docs/docs/api/interfaces/NodeWithScore.md b/apps/docs/docs/api/interfaces/NodeWithScore.md new file mode 100644 index 
0000000000000000000000000000000000000000..4782c65b934aff9c9fcf3d001f18443994ce2b73 --- /dev/null +++ b/apps/docs/docs/api/interfaces/NodeWithScore.md @@ -0,0 +1,29 @@ +--- +id: "NodeWithScore" +title: "Interface: NodeWithScore" +sidebar_label: "NodeWithScore" +sidebar_position: 0 +custom_edit_url: null +--- + +A node with a similarity score + +## Properties + +### node + +• **node**: [`BaseNode`](../classes/BaseNode.md) + +#### Defined in + +[Node.ts:238](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L238) + +___ + +### score + +• **score**: `number` + +#### Defined in + +[Node.ts:239](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L239) diff --git a/apps/docs/docs/api/interfaces/QueryEngineTool.md b/apps/docs/docs/api/interfaces/QueryEngineTool.md new file mode 100644 index 0000000000000000000000000000000000000000..f1a62993dc135ae3cda0bbb198f52bbd45e058cc --- /dev/null +++ b/apps/docs/docs/api/interfaces/QueryEngineTool.md @@ -0,0 +1,39 @@ +--- +id: "QueryEngineTool" +title: "Interface: QueryEngineTool" +sidebar_label: "QueryEngineTool" +sidebar_position: 0 +custom_edit_url: null +--- + +A Tool that uses a QueryEngine. 
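A rough sketch of the pattern: the tool bundles a query engine with routing metadata. The `QueryEngineLike`/`QueryEngineToolLike` types and the echo engine below are illustrative local stand-ins, not the package's real implementations:

```typescript
// Local mirrors of the documented shapes; in real code these come from
// the llamaindex package.
interface ToolMetadata {
  name: string;
  description: string;
}
interface QueryEngineLike {
  query(q: string): Promise<string>;
}
interface QueryEngineToolLike {
  metadata: ToolMetadata;
  queryEngine: QueryEngineLike;
}

// A stub engine that echoes its input, standing in for an index-backed one.
const echoEngine: QueryEngineLike = {
  async query(q: string) {
    return `answer for: ${q}`;
  },
};

// The tool pairs the engine with metadata an agent can use for routing.
const docsTool: QueryEngineToolLike = {
  metadata: {
    name: "docs_query_tool",
    description: "Answers questions about the project documentation.",
  },
  queryEngine: echoEngine,
};
```

The `description` is what lets a question generator or agent decide when to route a sub-question to this tool.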
+ +## Hierarchy + +- [`BaseTool`](BaseTool.md) + + ↳ **`QueryEngineTool`** + +## Properties + +### metadata + +• **metadata**: [`ToolMetadata`](ToolMetadata.md) + +#### Inherited from + +[BaseTool](BaseTool.md).[metadata](BaseTool.md#metadata) + +#### Defined in + +[Tool.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Tool.ts#L12) + +___ + +### queryEngine + +• **queryEngine**: [`BaseQueryEngine`](BaseQueryEngine.md) + +#### Defined in + +[Tool.ts:19](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Tool.ts#L19) diff --git a/apps/docs/docs/api/interfaces/RelatedNodeInfo.md b/apps/docs/docs/api/interfaces/RelatedNodeInfo.md new file mode 100644 index 0000000000000000000000000000000000000000..7797dee0e5726c2659aaba5f6d3f6b0c0b15aa12 --- /dev/null +++ b/apps/docs/docs/api/interfaces/RelatedNodeInfo.md @@ -0,0 +1,47 @@ +--- +id: "RelatedNodeInfo" +title: "Interface: RelatedNodeInfo" +sidebar_label: "RelatedNodeInfo" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### hash + +• `Optional` **hash**: `string` + +#### Defined in + +[Node.ts:29](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L29) + +___ + +### metadata + +• **metadata**: `Record`<`string`, `any`\> + +#### Defined in + +[Node.ts:28](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L28) + +___ + +### nodeId + +• **nodeId**: `string` + +#### Defined in + +[Node.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L26) + +___ + +### nodeType + +• `Optional` **nodeType**: [`ObjectType`](../enums/ObjectType.md) + +#### Defined in + +[Node.ts:27](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L27) diff --git a/apps/docs/docs/api/interfaces/RetrievalCallbackResponse.md b/apps/docs/docs/api/interfaces/RetrievalCallbackResponse.md new file mode 100644 index 
0000000000000000000000000000000000000000..2650301c9f6fae298cf8928cd39747ea67310a72 --- /dev/null +++ b/apps/docs/docs/api/interfaces/RetrievalCallbackResponse.md @@ -0,0 +1,47 @@ +--- +id: "RetrievalCallbackResponse" +title: "Interface: RetrievalCallbackResponse" +sidebar_label: "RetrievalCallbackResponse" +sidebar_position: 0 +custom_edit_url: null +--- + +## Hierarchy + +- `BaseCallbackResponse` + + ↳ **`RetrievalCallbackResponse`** + +## Properties + +### event + +• **event**: [`Event`](Event.md) + +#### Inherited from + +BaseCallbackResponse.event + +#### Defined in + +[callbacks/CallbackManager.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L20) + +___ + +### nodes + +• **nodes**: [`NodeWithScore`](NodeWithScore.md)[] + +#### Defined in + +[callbacks/CallbackManager.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L46) + +___ + +### query + +• **query**: `string` + +#### Defined in + +[callbacks/CallbackManager.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L45) diff --git a/apps/docs/docs/api/interfaces/ServiceContext.md b/apps/docs/docs/api/interfaces/ServiceContext.md new file mode 100644 index 0000000000000000000000000000000000000000..5769bee0963609d5c1fa084ca82adb980768823f --- /dev/null +++ b/apps/docs/docs/api/interfaces/ServiceContext.md @@ -0,0 +1,59 @@ +--- +id: "ServiceContext" +title: "Interface: ServiceContext" +sidebar_label: "ServiceContext" +sidebar_position: 0 +custom_edit_url: null +--- + +The ServiceContext is a collection of components that are used in different parts of the application. 
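The companion `ServiceContextOptions` interface suggests the usual options-over-defaults pattern: supply only the components you want to customize. The sketch below illustrates that pattern with simplified, hypothetical fields and defaults rather than the real llamaindex components:

```typescript
// A local sketch of the options -> context pattern: optional overrides
// are merged over defaults. Field names and default values here are
// illustrative stand-ins, not the real llamaindex implementation.
interface ContextOptions {
  llm?: string;
  chunkSize?: number;
  chunkOverlap?: number;
}
interface Context {
  llm: string;
  chunkSize: number;
  chunkOverlap: number;
}

function contextFromDefaults(options: ContextOptions = {}): Context {
  return {
    llm: options.llm ?? "default-llm",
    chunkSize: options.chunkSize ?? 1024,
    chunkOverlap: options.chunkOverlap ?? 20,
  };
}

// Override only what you need; everything else keeps its default.
const ctx = contextFromDefaults({ chunkSize: 512 });
```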
+ +## Properties + +### callbackManager + +• **callbackManager**: [`CallbackManager`](../classes/CallbackManager.md) + +#### Defined in + +[ServiceContext.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L15) + +___ + +### embedModel + +• **embedModel**: [`BaseEmbedding`](../classes/BaseEmbedding.md) + +#### Defined in + +[ServiceContext.ts:13](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L13) + +___ + +### llm + +• **llm**: [`LLM`](LLM.md) + +#### Defined in + +[ServiceContext.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L11) + +___ + +### nodeParser + +• **nodeParser**: [`NodeParser`](NodeParser.md) + +#### Defined in + +[ServiceContext.ts:14](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L14) + +___ + +### promptHelper + +• **promptHelper**: `PromptHelper` + +#### Defined in + +[ServiceContext.ts:12](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L12) diff --git a/apps/docs/docs/api/interfaces/ServiceContextOptions.md b/apps/docs/docs/api/interfaces/ServiceContextOptions.md new file mode 100644 index 0000000000000000000000000000000000000000..161228facb0b195c8d728b639c68eb75b62f2bad --- /dev/null +++ b/apps/docs/docs/api/interfaces/ServiceContextOptions.md @@ -0,0 +1,77 @@ +--- +id: "ServiceContextOptions" +title: "Interface: ServiceContextOptions" +sidebar_label: "ServiceContextOptions" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### callbackManager + +• `Optional` **callbackManager**: [`CallbackManager`](../classes/CallbackManager.md) + +#### Defined in + +[ServiceContext.ts:24](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L24) + +___ + +### chunkOverlap + +• `Optional` **chunkOverlap**: `number` + +#### Defined in + 
+[ServiceContext.ts:27](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L27) + +___ + +### chunkSize + +• `Optional` **chunkSize**: `number` + +#### Defined in + +[ServiceContext.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L26) + +___ + +### embedModel + +• `Optional` **embedModel**: [`BaseEmbedding`](../classes/BaseEmbedding.md) + +#### Defined in + +[ServiceContext.ts:22](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L22) + +___ + +### llm + +• `Optional` **llm**: [`LLM`](LLM.md) + +#### Defined in + +[ServiceContext.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L20) + +___ + +### nodeParser + +• `Optional` **nodeParser**: [`NodeParser`](NodeParser.md) + +#### Defined in + +[ServiceContext.ts:23](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L23) + +___ + +### promptHelper + +• `Optional` **promptHelper**: `PromptHelper` + +#### Defined in + +[ServiceContext.ts:21](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L21) diff --git a/apps/docs/docs/api/interfaces/StorageContext.md b/apps/docs/docs/api/interfaces/StorageContext.md new file mode 100644 index 0000000000000000000000000000000000000000..12b794cfb6707d946b967dbff439471b57c21213 --- /dev/null +++ b/apps/docs/docs/api/interfaces/StorageContext.md @@ -0,0 +1,37 @@ +--- +id: "StorageContext" +title: "Interface: StorageContext" +sidebar_label: "StorageContext" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### docStore + +• **docStore**: `BaseDocumentStore` + +#### Defined in + +[storage/StorageContext.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/StorageContext.ts#L15) + +___ + +### indexStore + +• **indexStore**: `BaseIndexStore` + +#### Defined in + 
+[storage/StorageContext.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/StorageContext.ts#L16) + +___ + +### vectorStore + +• **vectorStore**: [`VectorStore`](VectorStore.md) + +#### Defined in + +[storage/StorageContext.ts:17](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/StorageContext.ts#L17) diff --git a/apps/docs/docs/api/interfaces/StreamCallbackResponse.md b/apps/docs/docs/api/interfaces/StreamCallbackResponse.md new file mode 100644 index 0000000000000000000000000000000000000000..c2df6bc210f067b5cb40e22df371abe407d9cbba --- /dev/null +++ b/apps/docs/docs/api/interfaces/StreamCallbackResponse.md @@ -0,0 +1,57 @@ +--- +id: "StreamCallbackResponse" +title: "Interface: StreamCallbackResponse" +sidebar_label: "StreamCallbackResponse" +sidebar_position: 0 +custom_edit_url: null +--- + +## Hierarchy + +- `BaseCallbackResponse` + + ↳ **`StreamCallbackResponse`** + +## Properties + +### event + +• **event**: [`Event`](Event.md) + +#### Inherited from + +BaseCallbackResponse.event + +#### Defined in + +[callbacks/CallbackManager.ts:20](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L20) + +___ + +### index + +• **index**: `number` + +#### Defined in + +[callbacks/CallbackManager.ts:39](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L39) + +___ + +### isDone + +• `Optional` **isDone**: `boolean` + +#### Defined in + +[callbacks/CallbackManager.ts:40](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L40) + +___ + +### token + +• `Optional` **token**: [`StreamToken`](StreamToken.md) + +#### Defined in + +[callbacks/CallbackManager.ts:41](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L41) diff --git a/apps/docs/docs/api/interfaces/StreamToken.md b/apps/docs/docs/api/interfaces/StreamToken.md new 
file mode 100644 index 0000000000000000000000000000000000000000..a205c3570cfc9e04569ed09102b0546133c7f06b --- /dev/null +++ b/apps/docs/docs/api/interfaces/StreamToken.md @@ -0,0 +1,57 @@ +--- +id: "StreamToken" +title: "Interface: StreamToken" +sidebar_label: "StreamToken" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### choices + +• **choices**: { `delta`: { `content?`: ``null`` \| `string` ; `role?`: ``"function"`` \| ``"user"`` \| ``"assistant"`` \| ``"system"`` } ; `finish_reason`: ``null`` \| `string` ; `index`: `number` }[] + +#### Defined in + +[callbacks/CallbackManager.ts:28](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L28) + +___ + +### created + +• **created**: `number` + +#### Defined in + +[callbacks/CallbackManager.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L26) + +___ + +### id + +• **id**: `string` + +#### Defined in + +[callbacks/CallbackManager.ts:24](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L24) + +___ + +### model + +• **model**: `string` + +#### Defined in + +[callbacks/CallbackManager.ts:27](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L27) + +___ + +### object + +• **object**: `string` + +#### Defined in + +[callbacks/CallbackManager.ts:25](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L25) diff --git a/apps/docs/docs/api/interfaces/StructuredOutput.md b/apps/docs/docs/api/interfaces/StructuredOutput.md new file mode 100644 index 0000000000000000000000000000000000000000..877781557e783d2c2bd519ca7ebe657be0eb4487 --- /dev/null +++ b/apps/docs/docs/api/interfaces/StructuredOutput.md @@ -0,0 +1,35 @@ +--- +id: "StructuredOutput" +title: "Interface: StructuredOutput<T>" +sidebar_label: "StructuredOutput" +sidebar_position: 0 +custom_edit_url: 
null +--- + +StructuredOutput is a combination of the raw output and the parsed output. + +## Type parameters + +| Name | +| :------ | +| `T` | + +## Properties + +### parsedOutput + +• **parsedOutput**: `T` + +#### Defined in + +[OutputParser.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/OutputParser.ts#L16) + +___ + +### rawOutput + +• **rawOutput**: `string` + +#### Defined in + +[OutputParser.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/OutputParser.ts#L15) diff --git a/apps/docs/docs/api/interfaces/SubQuestion.md b/apps/docs/docs/api/interfaces/SubQuestion.md new file mode 100644 index 0000000000000000000000000000000000000000..7c46de237a80499fea3fcfddc2c247f5a1483337 --- /dev/null +++ b/apps/docs/docs/api/interfaces/SubQuestion.md @@ -0,0 +1,27 @@ +--- +id: "SubQuestion" +title: "Interface: SubQuestion" +sidebar_label: "SubQuestion" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### subQuestion + +• **subQuestion**: `string` + +#### Defined in + +[QuestionGenerator.ts:15](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L15) + +___ + +### toolName + +• **toolName**: `string` + +#### Defined in + +[QuestionGenerator.ts:16](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/QuestionGenerator.ts#L16) diff --git a/apps/docs/docs/api/interfaces/ToolMetadata.md b/apps/docs/docs/api/interfaces/ToolMetadata.md new file mode 100644 index 0000000000000000000000000000000000000000..db25dc536b4737314846c9b5faf163599fcc430d --- /dev/null +++ b/apps/docs/docs/api/interfaces/ToolMetadata.md @@ -0,0 +1,27 @@ +--- +id: "ToolMetadata" +title: "Interface: ToolMetadata" +sidebar_label: "ToolMetadata" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### description + +• **description**: `string` + +#### Defined in + +[Tool.ts:4](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Tool.ts#L4) + +___ +
+### name + +• **name**: `string` + +#### Defined in + +[Tool.ts:5](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Tool.ts#L5) diff --git a/apps/docs/docs/api/interfaces/VectorIndexConstructorProps.md b/apps/docs/docs/api/interfaces/VectorIndexConstructorProps.md new file mode 100644 index 0000000000000000000000000000000000000000..8761b18a159c4576a95f8d136492917a2318010d --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorIndexConstructorProps.md @@ -0,0 +1,97 @@ +--- +id: "VectorIndexConstructorProps" +title: "Interface: VectorIndexConstructorProps" +sidebar_label: "VectorIndexConstructorProps" +sidebar_position: 0 +custom_edit_url: null +--- + +## Hierarchy + +- [`BaseIndexInit`](BaseIndexInit.md)<[`IndexDict`](../classes/IndexDict.md)\> + + ↳ **`VectorIndexConstructorProps`** + +## Properties + +### docStore + +• **docStore**: `BaseDocumentStore` + +#### Inherited from + +[BaseIndexInit](BaseIndexInit.md).[docStore](BaseIndexInit.md#docstore) + +#### Defined in + +[indices/BaseIndex.ts:104](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L104) + +___ + +### indexStore + +• `Optional` **indexStore**: `BaseIndexStore` + +#### Inherited from + +[BaseIndexInit](BaseIndexInit.md).[indexStore](BaseIndexInit.md#indexstore) + +#### Defined in + +[indices/BaseIndex.ts:106](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L106) + +___ + +### indexStruct + +• **indexStruct**: [`IndexDict`](../classes/IndexDict.md) + +#### Inherited from + +[BaseIndexInit](BaseIndexInit.md).[indexStruct](BaseIndexInit.md#indexstruct) + +#### Defined in + +[indices/BaseIndex.ts:107](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L107) + +___ + +### serviceContext + +• **serviceContext**: [`ServiceContext`](ServiceContext.md) + +#### Inherited from + +[BaseIndexInit](BaseIndexInit.md).[serviceContext](BaseIndexInit.md#servicecontext) + 
+#### Defined in + +[indices/BaseIndex.ts:102](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L102) + +___ + +### storageContext + +• **storageContext**: [`StorageContext`](StorageContext.md) + +#### Inherited from + +[BaseIndexInit](BaseIndexInit.md).[storageContext](BaseIndexInit.md#storagecontext) + +#### Defined in + +[indices/BaseIndex.ts:103](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L103) + +___ + +### vectorStore + +• **vectorStore**: [`VectorStore`](VectorStore.md) + +#### Overrides + +[BaseIndexInit](BaseIndexInit.md).[vectorStore](BaseIndexInit.md#vectorstore) + +#### Defined in + +[indices/BaseIndex.ts:157](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L157) diff --git a/apps/docs/docs/api/interfaces/VectorIndexOptions.md b/apps/docs/docs/api/interfaces/VectorIndexOptions.md new file mode 100644 index 0000000000000000000000000000000000000000..5a2f44c512cfc24abe78fc9bd7b86a52ce069fda --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorIndexOptions.md @@ -0,0 +1,57 @@ +--- +id: "VectorIndexOptions" +title: "Interface: VectorIndexOptions" +sidebar_label: "VectorIndexOptions" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### indexId + +• `Optional` **indexId**: `string` + +#### Defined in + +[indices/BaseIndex.ts:151](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L151) + +___ + +### indexStruct + +• `Optional` **indexStruct**: [`IndexDict`](../classes/IndexDict.md) + +#### Defined in + +[indices/BaseIndex.ts:150](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L150) + +___ + +### nodes + +• `Optional` **nodes**: [`BaseNode`](../classes/BaseNode.md)[] + +#### Defined in + +[indices/BaseIndex.ts:149](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L149) + +___ + +### 
serviceContext + +• `Optional` **serviceContext**: [`ServiceContext`](ServiceContext.md) + +#### Defined in + +[indices/BaseIndex.ts:152](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L152) + +___ + +### storageContext + +• `Optional` **storageContext**: [`StorageContext`](StorageContext.md) + +#### Defined in + +[indices/BaseIndex.ts:153](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L153) diff --git a/apps/docs/docs/api/interfaces/VectorStore.md b/apps/docs/docs/api/interfaces/VectorStore.md new file mode 100644 index 0000000000000000000000000000000000000000..7834f91259c1ee6596513a9a48b8f5c6d850cbdc --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorStore.md @@ -0,0 +1,124 @@ +--- +id: "VectorStore" +title: "Interface: VectorStore" +sidebar_label: "VectorStore" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### isEmbeddingQuery + +• `Optional` **isEmbeddingQuery**: `boolean` + +#### Defined in + +[storage/vectorStore/types.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L62) + +___ + +### storesText + +• **storesText**: `boolean` + +#### Defined in + +[storage/vectorStore/types.ts:61](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L61) + +## Methods + +### add + +▸ **add**(`embeddingResults`): `Promise`<`string`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `embeddingResults` | [`NodeWithEmbedding`](NodeWithEmbedding.md)[] | + +#### Returns + +`Promise`<`string`[]\> + +#### Defined in + +[storage/vectorStore/types.ts:64](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L64) + +___ + +### client + +▸ **client**(): `any` + +#### Returns + +`any` + +#### Defined in + 
+[storage/vectorStore/types.ts:63](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L63) + +___ + +### delete + +▸ **delete**(`refDocId`, `deleteKwargs?`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `refDocId` | `string` | +| `deleteKwargs?` | `any` | + +#### Returns + +`Promise`<`void`\> + +#### Defined in + +[storage/vectorStore/types.ts:65](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L65) + +___ + +### persist + +▸ **persist**(`persistPath`, `fs?`): `Promise`<`void`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `persistPath` | `string` | +| `fs?` | [`GenericFileSystem`](GenericFileSystem.md) | + +#### Returns + +`Promise`<`void`\> + +#### Defined in + +[storage/vectorStore/types.ts:67](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L67) + +___ + +### query + +▸ **query**(`query`, `kwargs?`): `Promise`<[`VectorStoreQueryResult`](VectorStoreQueryResult.md)\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `query` | [`VectorStoreQuery`](VectorStoreQuery.md) | +| `kwargs?` | `any` | + +#### Returns + +`Promise`<[`VectorStoreQueryResult`](VectorStoreQueryResult.md)\> + +#### Defined in + +[storage/vectorStore/types.ts:66](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L66) diff --git a/apps/docs/docs/api/interfaces/VectorStoreInfo.md b/apps/docs/docs/api/interfaces/VectorStoreInfo.md new file mode 100644 index 0000000000000000000000000000000000000000..adf63a36947d6a0fdb2dda0815eb41ee5d3331f8 --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorStoreInfo.md @@ -0,0 +1,27 @@ +--- +id: "VectorStoreInfo" +title: "Interface: VectorStoreInfo" +sidebar_label: "VectorStoreInfo" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### contentInfo + +• **contentInfo**: `string` 
+ +#### Defined in + +[storage/vectorStore/types.ts:46](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L46) + +___ + +### metadataInfo + +• **metadataInfo**: [`MetadataInfo`](MetadataInfo.md)[] + +#### Defined in + +[storage/vectorStore/types.ts:45](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L45) diff --git a/apps/docs/docs/api/interfaces/VectorStoreQuery.md b/apps/docs/docs/api/interfaces/VectorStoreQuery.md new file mode 100644 index 0000000000000000000000000000000000000000..794e821e4f00947d1ba4ff686c2765c4541508a4 --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorStoreQuery.md @@ -0,0 +1,87 @@ +--- +id: "VectorStoreQuery" +title: "Interface: VectorStoreQuery" +sidebar_label: "VectorStoreQuery" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### alpha + +• `Optional` **alpha**: `number` + +#### Defined in + +[storage/vectorStore/types.ts:55](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L55) + +___ + +### docIds + +• `Optional` **docIds**: `string`[] + +#### Defined in + +[storage/vectorStore/types.ts:52](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L52) + +___ + +### filters + +• `Optional` **filters**: [`MetadataFilters`](MetadataFilters.md) + +#### Defined in + +[storage/vectorStore/types.ts:56](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L56) + +___ + +### mmrThreshold + +• `Optional` **mmrThreshold**: `number` + +#### Defined in + +[storage/vectorStore/types.ts:57](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L57) + +___ + +### mode + +• **mode**: [`VectorStoreQueryMode`](../enums/VectorStoreQueryMode.md) + +#### Defined in + 
+[storage/vectorStore/types.ts:54](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L54) + +___ + +### queryEmbedding + +• `Optional` **queryEmbedding**: `number`[] + +#### Defined in + +[storage/vectorStore/types.ts:50](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L50) + +___ + +### queryStr + +• `Optional` **queryStr**: `string` + +#### Defined in + +[storage/vectorStore/types.ts:53](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L53) + +___ + +### similarityTopK + +• **similarityTopK**: `number` + +#### Defined in + +[storage/vectorStore/types.ts:51](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L51) diff --git a/apps/docs/docs/api/interfaces/VectorStoreQueryResult.md b/apps/docs/docs/api/interfaces/VectorStoreQueryResult.md new file mode 100644 index 0000000000000000000000000000000000000000..aa5223f33205719b9a5329c2e5ab7c2aabe71f19 --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorStoreQueryResult.md @@ -0,0 +1,37 @@ +--- +id: "VectorStoreQueryResult" +title: "Interface: VectorStoreQueryResult" +sidebar_label: "VectorStoreQueryResult" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### ids + +• **ids**: `string`[] + +#### Defined in + +[storage/vectorStore/types.ts:8](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L8) + +___ + +### nodes + +• `Optional` **nodes**: [`BaseNode`](../classes/BaseNode.md)[] + +#### Defined in + +[storage/vectorStore/types.ts:6](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L6) + +___ + +### similarities + +• **similarities**: `number`[] + +#### Defined in + +[storage/vectorStore/types.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L7) diff 
--git a/apps/docs/docs/api/interfaces/VectorStoreQuerySpec.md b/apps/docs/docs/api/interfaces/VectorStoreQuerySpec.md new file mode 100644 index 0000000000000000000000000000000000000000..75451bb7f7912993d02b6a021939c4e921c23bd1 --- /dev/null +++ b/apps/docs/docs/api/interfaces/VectorStoreQuerySpec.md @@ -0,0 +1,37 @@ +--- +id: "VectorStoreQuerySpec" +title: "Interface: VectorStoreQuerySpec" +sidebar_label: "VectorStoreQuerySpec" +sidebar_position: 0 +custom_edit_url: null +--- + +## Properties + +### filters + +• **filters**: [`ExactMatchFilter`](ExactMatchFilter.md)[] + +#### Defined in + +[storage/vectorStore/types.ts:34](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L34) + +___ + +### query + +• **query**: `string` + +#### Defined in + +[storage/vectorStore/types.ts:33](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L33) + +___ + +### topK + +• `Optional` **topK**: `number` + +#### Defined in + +[storage/vectorStore/types.ts:35](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/vectorStore/types.ts#L35) diff --git a/apps/docs/docs/api/interfaces/WalkableFileSystem.md b/apps/docs/docs/api/interfaces/WalkableFileSystem.md new file mode 100644 index 0000000000000000000000000000000000000000..cf2dcb3bdd0e4f7b455061df29b948a80ad06479 --- /dev/null +++ b/apps/docs/docs/api/interfaces/WalkableFileSystem.md @@ -0,0 +1,47 @@ +--- +id: "WalkableFileSystem" +title: "Interface: WalkableFileSystem" +sidebar_label: "WalkableFileSystem" +sidebar_position: 0 +custom_edit_url: null +--- + +## Methods + +### readdir + +▸ **readdir**(`path`): `Promise`<`string`[]\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | + +#### Returns + +`Promise`<`string`[]\> + +#### Defined in + +[storage/FileSystem.ts:17](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L17) + +___ + +### stat 
+ +▸ **stat**(`path`): `Promise`<`any`\> + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `path` | `string` | + +#### Returns + +`Promise`<`any`\> + +#### Defined in + +[storage/FileSystem.ts:18](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L18) diff --git a/apps/docs/docs/api/interfaces/_category_.yml b/apps/docs/docs/api/interfaces/_category_.yml new file mode 100644 index 0000000000000000000000000000000000000000..43bec88cfa0aadb0e46f28701aad493dec3f2097 --- /dev/null +++ b/apps/docs/docs/api/interfaces/_category_.yml @@ -0,0 +1,2 @@ +label: "Interfaces" +position: 4 \ No newline at end of file diff --git a/apps/docs/docs/api/modules.md b/apps/docs/docs/api/modules.md new file mode 100644 index 0000000000000000000000000000000000000000..abb999529cd423715b86673d190f733496e87de7 --- /dev/null +++ b/apps/docs/docs/api/modules.md @@ -0,0 +1,999 @@ +--- +id: "modules" +title: "llamaindex" +sidebar_label: "Exports" +sidebar_position: 0.5 +custom_edit_url: null +--- + +## Enumerations + +- [DeuceChatStrategy](enums/DeuceChatStrategy.md) +- [IndexStructType](enums/IndexStructType.md) +- [ListRetrieverMode](enums/ListRetrieverMode.md) +- [MetadataMode](enums/MetadataMode.md) +- [NodeRelationship](enums/NodeRelationship.md) +- [ObjectType](enums/ObjectType.md) +- [SimilarityType](enums/SimilarityType.md) +- [VectorStoreQueryMode](enums/VectorStoreQueryMode.md) + +## Classes + +- [Anthropic](classes/Anthropic.md) +- [BaseEmbedding](classes/BaseEmbedding.md) +- [BaseIndex](classes/BaseIndex.md) +- [BaseNode](classes/BaseNode.md) +- [CallbackManager](classes/CallbackManager.md) +- [CompactAndRefine](classes/CompactAndRefine.md) +- [CondenseQuestionChatEngine](classes/CondenseQuestionChatEngine.md) +- [ContextChatEngine](classes/ContextChatEngine.md) +- [Document](classes/Document.md) +- [InMemoryFileSystem](classes/InMemoryFileSystem.md) +- [IndexDict](classes/IndexDict.md) +- [IndexList](classes/IndexList.md) +- 
[IndexNode](classes/IndexNode.md) +- [IndexStruct](classes/IndexStruct.md) +- [LLMQuestionGenerator](classes/LLMQuestionGenerator.md) +- [ListIndex](classes/ListIndex.md) +- [ListIndexLLMRetriever](classes/ListIndexLLMRetriever.md) +- [ListIndexRetriever](classes/ListIndexRetriever.md) +- [LlamaDeuce](classes/LlamaDeuce.md) +- [OpenAI](classes/OpenAI.md) +- [OpenAIEmbedding](classes/OpenAIEmbedding.md) +- [PDFReader](classes/PDFReader.md) +- [Refine](classes/Refine.md) +- [Response](classes/Response.md) +- [ResponseSynthesizer](classes/ResponseSynthesizer.md) +- [RetrieverQueryEngine](classes/RetrieverQueryEngine.md) +- [SentenceSplitter](classes/SentenceSplitter.md) +- [SimpleChatEngine](classes/SimpleChatEngine.md) +- [SimpleDirectoryReader](classes/SimpleDirectoryReader.md) +- [SimpleNodeParser](classes/SimpleNodeParser.md) +- [SimpleResponseBuilder](classes/SimpleResponseBuilder.md) +- [SubQuestionOutputParser](classes/SubQuestionOutputParser.md) +- [SubQuestionQueryEngine](classes/SubQuestionQueryEngine.md) +- [TextFileReader](classes/TextFileReader.md) +- [TextNode](classes/TextNode.md) +- [TreeSummarize](classes/TreeSummarize.md) +- [VectorIndexRetriever](classes/VectorIndexRetriever.md) +- [VectorStoreIndex](classes/VectorStoreIndex.md) + +## Interfaces + +- [BaseIndexInit](interfaces/BaseIndexInit.md) +- [BaseOutputParser](interfaces/BaseOutputParser.md) +- [BaseQueryEngine](interfaces/BaseQueryEngine.md) +- [BaseQuestionGenerator](interfaces/BaseQuestionGenerator.md) +- [BaseReader](interfaces/BaseReader.md) +- [BaseRetriever](interfaces/BaseRetriever.md) +- [BaseTool](interfaces/BaseTool.md) +- [ChatEngine](interfaces/ChatEngine.md) +- [ChatMessage](interfaces/ChatMessage.md) +- [ChatResponse](interfaces/ChatResponse.md) +- [Event](interfaces/Event.md) +- [ExactMatchFilter](interfaces/ExactMatchFilter.md) +- [GenericFileSystem](interfaces/GenericFileSystem.md) +- [LLM](interfaces/LLM.md) +- [MetadataFilters](interfaces/MetadataFilters.md) +- 
[MetadataInfo](interfaces/MetadataInfo.md) +- [NodeParser](interfaces/NodeParser.md) +- [NodeWithEmbedding](interfaces/NodeWithEmbedding.md) +- [NodeWithScore](interfaces/NodeWithScore.md) +- [QueryEngineTool](interfaces/QueryEngineTool.md) +- [RelatedNodeInfo](interfaces/RelatedNodeInfo.md) +- [RetrievalCallbackResponse](interfaces/RetrievalCallbackResponse.md) +- [ServiceContext](interfaces/ServiceContext.md) +- [ServiceContextOptions](interfaces/ServiceContextOptions.md) +- [StorageContext](interfaces/StorageContext.md) +- [StreamCallbackResponse](interfaces/StreamCallbackResponse.md) +- [StreamToken](interfaces/StreamToken.md) +- [StructuredOutput](interfaces/StructuredOutput.md) +- [SubQuestion](interfaces/SubQuestion.md) +- [ToolMetadata](interfaces/ToolMetadata.md) +- [VectorIndexConstructorProps](interfaces/VectorIndexConstructorProps.md) +- [VectorIndexOptions](interfaces/VectorIndexOptions.md) +- [VectorStore](interfaces/VectorStore.md) +- [VectorStoreInfo](interfaces/VectorStoreInfo.md) +- [VectorStoreQuery](interfaces/VectorStoreQuery.md) +- [VectorStoreQueryResult](interfaces/VectorStoreQueryResult.md) +- [VectorStoreQuerySpec](interfaces/VectorStoreQuerySpec.md) +- [WalkableFileSystem](interfaces/WalkableFileSystem.md) + +## Type Aliases + +### CompleteFileSystem + +Ƭ **CompleteFileSystem**: [`GenericFileSystem`](interfaces/GenericFileSystem.md) & [`WalkableFileSystem`](interfaces/WalkableFileSystem.md) + +#### Defined in + +[storage/FileSystem.ts:49](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L49) + +___ + +### CompletionResponse + +Ƭ **CompletionResponse**: [`ChatResponse`](interfaces/ChatResponse.md) + +#### Defined in + +[llm/LLM.ts:32](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L32) + +___ + +### EventTag + +Ƭ **EventTag**: ``"intermediate"`` \| ``"final"`` + +#### Defined in + 
+[callbacks/CallbackManager.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L10) + +___ + +### EventType + +Ƭ **EventType**: ``"retrieve"`` \| ``"llmPredict"`` \| ``"wrapper"`` + +#### Defined in + +[callbacks/CallbackManager.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/callbacks/CallbackManager.ts#L11) + +___ + +### MessageType + +Ƭ **MessageType**: ``"user"`` \| ``"assistant"`` \| ``"system"`` \| ``"generic"`` \| ``"function"`` + +#### Defined in + +[llm/LLM.ts:13](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L13) + +___ + +### RelatedNodeType + +Ƭ **RelatedNodeType**: [`RelatedNodeInfo`](interfaces/RelatedNodeInfo.md) \| [`RelatedNodeInfo`](interfaces/RelatedNodeInfo.md)[] + +#### Defined in + +[Node.ts:32](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Node.ts#L32) + +___ + +### SimpleDirectoryReaderLoadDataProps + +Ƭ **SimpleDirectoryReaderLoadDataProps**: `Object` + +#### Type declaration + +| Name | Type | +| :------ | :------ | +| `defaultReader?` | [`BaseReader`](interfaces/BaseReader.md) \| ``null`` | +| `directoryPath` | `string` | +| `fileExtToReader?` | `Record`<`string`, [`BaseReader`](interfaces/BaseReader.md)\> | +| `fs?` | [`CompleteFileSystem`](modules.md#completefilesystem) | + +#### Defined in + +[readers/SimpleDirectoryReader.ts:26](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/readers/SimpleDirectoryReader.ts#L26) + +___ + +### SimplePrompt + +Ƭ **SimplePrompt**: (`input`: `Record`<`string`, `string`\>) => `string` + +#### Type declaration + +▸ (`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. 
+NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +##### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +##### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +## Variables + +### ALL\_AVAILABLE\_LLAMADEUCE\_MODELS + +• `Const` **ALL\_AVAILABLE\_LLAMADEUCE\_MODELS**: `Object` + +#### Type declaration + +| Name | Type | +| :------ | :------ | +| `Llama-2-13b-chat` | { `contextWindow`: `number` = 4096; `replicateApi`: `string` = "a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5" } | +| `Llama-2-13b-chat.contextWindow` | `number` | +| `Llama-2-13b-chat.replicateApi` | `string` | +| `Llama-2-70b-chat` | { `contextWindow`: `number` = 4096; `replicateApi`: `string` = "replicate/llama70b-v2-chat:e951f18578850b652510200860fc4ea62b3b16fac280f83ff32282f87bbd2e48" } | +| `Llama-2-70b-chat.contextWindow` | `number` | +| `Llama-2-70b-chat.replicateApi` | `string` | +| `Llama-2-7b-chat` | { `contextWindow`: `number` = 4096; `replicateApi`: `string` = "a16z-infra/llama7b-v2-chat:4f0a4744c7295c024a1de15e1a63c880d3da035fa1f49bfd344fe076074c8eea" } | +| `Llama-2-7b-chat.contextWindow` | `number` | +| `Llama-2-7b-chat.replicateApi` | `string` | + +#### Defined in + +[llm/LLM.ts:171](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L171) + +___ + +### ALL\_AVAILABLE\_OPENAI\_MODELS + +• `Const` **ALL\_AVAILABLE\_OPENAI\_MODELS**: `Object` + +We currently support GPT-3.5 and GPT-4 models + +#### Type declaration + +| Name | Type | +| :------ | :------ | +| `gpt-3.5-turbo` | { `contextWindow`: `number` = 4096 } | +| `gpt-3.5-turbo.contextWindow` | `number` | +| `gpt-3.5-turbo-16k` | { `contextWindow`: `number` = 16384 } | +| `gpt-3.5-turbo-16k.contextWindow` | `number` | 
+| `gpt-4` | { `contextWindow`: `number` = 8192 } | +| `gpt-4.contextWindow` | `number` | +| `gpt-4-32k` | { `contextWindow`: `number` = 32768 } | +| `gpt-4-32k.contextWindow` | `number` | + +#### Defined in + +[llm/LLM.ts:64](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L64) + +___ + +### DEFAULT\_CHUNK\_OVERLAP + +• `Const` **DEFAULT\_CHUNK\_OVERLAP**: ``20`` + +#### Defined in + +[constants.ts:5](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L5) + +___ + +### DEFAULT\_CHUNK\_OVERLAP\_RATIO + +• `Const` **DEFAULT\_CHUNK\_OVERLAP\_RATIO**: ``0.1`` + +#### Defined in + +[constants.ts:6](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L6) + +___ + +### DEFAULT\_CHUNK\_SIZE + +• `Const` **DEFAULT\_CHUNK\_SIZE**: ``1024`` + +#### Defined in + +[constants.ts:4](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L4) + +___ + +### DEFAULT\_COLLECTION + +• `Const` **DEFAULT\_COLLECTION**: ``"data"`` + +#### Defined in + +[storage/constants.ts:1](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L1) + +___ + +### DEFAULT\_CONTEXT\_WINDOW + +• `Const` **DEFAULT\_CONTEXT\_WINDOW**: ``3900`` + +#### Defined in + +[constants.ts:1](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L1) + +___ + +### DEFAULT\_DOC\_STORE\_PERSIST\_FILENAME + +• `Const` **DEFAULT\_DOC\_STORE\_PERSIST\_FILENAME**: ``"docstore.json"`` + +#### Defined in + +[storage/constants.ts:4](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L4) + +___ + +### DEFAULT\_EMBEDDING\_DIM + +• `Const` **DEFAULT\_EMBEDDING\_DIM**: ``1536`` + +#### Defined in + +[constants.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L10) + +___ + +### DEFAULT\_FS + +• `Const` **DEFAULT\_FS**: 
[`GenericFileSystem`](interfaces/GenericFileSystem.md) \| [`CompleteFileSystem`](modules.md#completefilesystem) + +#### Defined in + +[storage/FileSystem.ts:62](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L62) + +___ + +### DEFAULT\_GRAPH\_STORE\_PERSIST\_FILENAME + +• `Const` **DEFAULT\_GRAPH\_STORE\_PERSIST\_FILENAME**: ``"graph_store.json"`` + +#### Defined in + +[storage/constants.ts:6](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L6) + +___ + +### DEFAULT\_INDEX\_STORE\_PERSIST\_FILENAME + +• `Const` **DEFAULT\_INDEX\_STORE\_PERSIST\_FILENAME**: ``"index_store.json"`` + +#### Defined in + +[storage/constants.ts:3](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L3) + +___ + +### DEFAULT\_NAMESPACE + +• `Const` **DEFAULT\_NAMESPACE**: ``"docstore"`` + +#### Defined in + +[storage/constants.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L7) + +___ + +### DEFAULT\_NUM\_OUTPUTS + +• `Const` **DEFAULT\_NUM\_OUTPUTS**: ``256`` + +#### Defined in + +[constants.ts:2](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L2) + +___ + +### DEFAULT\_PADDING + +• `Const` **DEFAULT\_PADDING**: ``5`` + +#### Defined in + +[constants.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L11) + +___ + +### DEFAULT\_PERSIST\_DIR + +• `Const` **DEFAULT\_PERSIST\_DIR**: ``"./storage"`` + +#### Defined in + +[storage/constants.ts:2](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L2) + +___ + +### DEFAULT\_SIMILARITY\_TOP\_K + +• `Const` **DEFAULT\_SIMILARITY\_TOP\_K**: ``2`` + +#### Defined in + +[constants.ts:7](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/constants.ts#L7) + +___ + +### DEFAULT\_VECTOR\_STORE\_PERSIST\_FILENAME + +• `Const` 
**DEFAULT\_VECTOR\_STORE\_PERSIST\_FILENAME**: ``"vector_store.json"`` + +#### Defined in + +[storage/constants.ts:5](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/constants.ts#L5) + +___ + +### GPT4\_MODELS + +• `Const` **GPT4\_MODELS**: `Object` + +#### Type declaration + +| Name | Type | +| :------ | :------ | +| `gpt-4` | { `contextWindow`: `number` = 8192 } | +| `gpt-4.contextWindow` | `number` | +| `gpt-4-32k` | { `contextWindow`: `number` = 32768 } | +| `gpt-4-32k.contextWindow` | `number` | + +#### Defined in + +[llm/LLM.ts:51](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L51) + +___ + +### TURBO\_MODELS + +• `Const` **TURBO\_MODELS**: `Object` + +#### Type declaration + +| Name | Type | +| :------ | :------ | +| `gpt-3.5-turbo` | { `contextWindow`: `number` = 4096 } | +| `gpt-3.5-turbo.contextWindow` | `number` | +| `gpt-3.5-turbo-16k` | { `contextWindow`: `number` = 16384 } | +| `gpt-3.5-turbo-16k.contextWindow` | `number` | + +#### Defined in + +[llm/LLM.ts:56](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/llm/LLM.ts#L56) + +___ + +### globalsHelper + +• `Const` **globalsHelper**: `GlobalsHelper` + +#### Defined in + +[GlobalsHelper.ts:50](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/GlobalsHelper.ts#L50) + +## Functions + +### buildToolsText + +▸ **buildToolsText**(`tools`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `tools` | [`ToolMetadata`](interfaces/ToolMetadata.md)[] | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:198](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L198) + +___ + +### contextSystemPrompt + +▸ **contextSystemPrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. 
+NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### defaultChoiceSelectPrompt + +▸ **defaultChoiceSelectPrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. +NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### defaultCondenseQuestionPrompt + +▸ **defaultCondenseQuestionPrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. +NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### defaultRefinePrompt + +▸ **defaultRefinePrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. 
+NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### defaultSubQuestionPrompt + +▸ **defaultSubQuestionPrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. +NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### defaultSummaryPrompt + +▸ **defaultSummaryPrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. +NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### defaultTextQaPrompt + +▸ **defaultTextQaPrompt**(`input`): `string` + +A SimplePrompt is a function that takes a dictionary of inputs and returns a string. 
+NOTE this is a different interface compared to LlamaIndex Python +NOTE 2: we default to empty string to make it easy to calculate prompt sizes + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `input` | `Record`<`string`, `string`\> | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:10](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L10) + +___ + +### exists + +▸ **exists**(`fs`, `path`): `Promise`<`boolean`\> + +Checks if a file exists. +Analogous to the os.path.exists function from Python. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `fs` | [`GenericFileSystem`](interfaces/GenericFileSystem.md) | The filesystem to use. | +| `path` | `string` | The path to the file to check. | + +#### Returns + +`Promise`<`boolean`\> + +A promise that resolves to true if the file exists, false otherwise. + +#### Defined in + +[storage/FileSystem.ts:74](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L74) + +___ + +### getNodeFS + +▸ **getNodeFS**(): [`CompleteFileSystem`](modules.md#completefilesystem) + +#### Returns + +[`CompleteFileSystem`](modules.md#completefilesystem) + +#### Defined in + +[storage/FileSystem.ts:51](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L51) + +___ + +### getNodesFromDocument + +▸ **getNodesFromDocument**(`document`, `textSplitter`, `includeMetadata?`, `includePrevNextRel?`): [`TextNode`](classes/TextNode.md)[] + +Generates an array of nodes from a document. + +#### Parameters + +| Name | Type | Default value | Description | +| :------ | :------ | :------ | :------ | +| `document` | [`Document`](classes/Document.md) | `undefined` | The document to generate nodes from. | +| `textSplitter` | [`SentenceSplitter`](classes/SentenceSplitter.md) | `undefined` | The text splitter to use. 
| +| `includeMetadata` | `boolean` | `true` | Whether to include metadata in the nodes. | +| `includePrevNextRel` | `boolean` | `true` | Whether to include previous and next relationships in the nodes. | + +#### Returns + +[`TextNode`](classes/TextNode.md)[] + +An array of nodes. + +#### Defined in + +[NodeParser.ts:29](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L29) + +___ + +### getResponseBuilder + +▸ **getResponseBuilder**(`serviceContext`, `responseMode?`): `BaseResponseBuilder` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `serviceContext` | [`ServiceContext`](interfaces/ServiceContext.md) | +| `responseMode?` | `ResponseMode` | + +#### Returns + +`BaseResponseBuilder` + +#### Defined in + +[ResponseSynthesizer.ts:262](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ResponseSynthesizer.ts#L262) + +___ + +### getTextSplitsFromDocument + +▸ **getTextSplitsFromDocument**(`document`, `textSplitter`): `string`[] + +Splits the text of a document into smaller parts. + +#### Parameters + +| Name | Type | Description | +| :------ | :------ | :------ | +| `document` | [`Document`](classes/Document.md) | The document to split. | +| `textSplitter` | [`SentenceSplitter`](classes/SentenceSplitter.md) | The text splitter to use. | + +#### Returns + +`string`[] + +An array of text splits. + +#### Defined in + +[NodeParser.ts:11](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/NodeParser.ts#L11) + +___ + +### getTopKEmbeddings + +▸ **getTopKEmbeddings**(`queryEmbedding`, `embeddings`, `similarityTopK?`, `embeddingIds?`, `similarityCutoff?`): [`number`[], `any`[]] + +Get the top K embeddings from a list of embeddings ordered by similarity to the query. 
+ +#### Parameters + +| Name | Type | Default value | Description | +| :------ | :------ | :------ | :------ | +| `queryEmbedding` | `number`[] | `undefined` | | +| `embeddings` | `number`[][] | `undefined` | list of embeddings to consider | +| `similarityTopK` | `number` | `DEFAULT_SIMILARITY_TOP_K` | max number of embeddings to return, default 2 | +| `embeddingIds` | ``null`` \| `any`[] | `null` | ids of embeddings in the embeddings list | +| `similarityCutoff` | ``null`` \| `number` | `null` | minimum similarity score | + +#### Returns + +[`number`[], `any`[]] + +#### Defined in + +[Embedding.ts:77](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L77) + +___ + +### getTopKEmbeddingsLearner + +▸ **getTopKEmbeddingsLearner**(`queryEmbedding`, `embeddings`, `similarityTopK?`, `embeddingsIds?`, `queryMode?`): [`number`[], `any`[]] + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `queryEmbedding` | `number`[] | `undefined` | +| `embeddings` | `number`[][] | `undefined` | +| `similarityTopK?` | `number` | `undefined` | +| `embeddingsIds?` | `any`[] | `undefined` | +| `queryMode` | [`VectorStoreQueryMode`](enums/VectorStoreQueryMode.md) | `VectorStoreQueryMode.SVM` | + +#### Returns + +[`number`[], `any`[]] + +#### Defined in + +[Embedding.ts:119](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L119) + +___ + +### getTopKMMREmbeddings + +▸ **getTopKMMREmbeddings**(`queryEmbedding`, `embeddings`, `similarityFn?`, `similarityTopK?`, `embeddingIds?`, `_similarityCutoff?`, `mmrThreshold?`): [`number`[], `any`[]] + +#### Parameters + +| Name | Type | Default value | +| :------ | :------ | :------ | +| `queryEmbedding` | `number`[] | `undefined` | +| `embeddings` | `number`[][] | `undefined` | +| `similarityFn` | ``null`` \| (...`args`: `any`[]) => `number` | `null` | +| `similarityTopK` | ``null`` \| `number` | `null` | +| `embeddingIds` | ``null`` \| `any`[] | 
`null` | +| `_similarityCutoff` | ``null`` \| `number` | `null` | +| `mmrThreshold` | ``null`` \| `number` | `null` | + +#### Returns + +[`number`[], `any`[]] + +#### Defined in + +[Embedding.ts:131](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L131) + +___ + +### jsonToIndexStruct + +▸ **jsonToIndexStruct**(`json`): [`IndexStruct`](classes/IndexStruct.md) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `json` | `any` | + +#### Returns + +[`IndexStruct`](classes/IndexStruct.md) + +#### Defined in + +[indices/BaseIndex.ts:70](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/indices/BaseIndex.ts#L70) + +___ + +### messagesToHistoryStr + +▸ **messagesToHistoryStr**(`messages`): `string` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `messages` | [`ChatMessage`](interfaces/ChatMessage.md)[] | + +#### Returns + +`string` + +#### Defined in + +[Prompt.ts:300](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Prompt.ts#L300) + +___ + +### serviceContextFromDefaults + +▸ **serviceContextFromDefaults**(`options?`): [`ServiceContext`](interfaces/ServiceContext.md) + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `options?` | [`ServiceContextOptions`](interfaces/ServiceContextOptions.md) | + +#### Returns + +[`ServiceContext`](interfaces/ServiceContext.md) + +#### Defined in + +[ServiceContext.ts:30](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L30) + +___ + +### serviceContextFromServiceContext + +▸ **serviceContextFromServiceContext**(`serviceContext`, `options`): `Object` + +#### Parameters + +| Name | Type | +| :------ | :------ | +| `serviceContext` | [`ServiceContext`](interfaces/ServiceContext.md) | +| `options` | [`ServiceContextOptions`](interfaces/ServiceContextOptions.md) | + +#### Returns + +`Object` + +| Name | Type | +| :------ | :------ | +| `callbackManager` | 
[`CallbackManager`](classes/CallbackManager.md) |
+| `embedModel` | [`BaseEmbedding`](classes/BaseEmbedding.md) |
+| `llm` | [`LLM`](interfaces/LLM.md) |
+| `nodeParser` | [`NodeParser`](interfaces/NodeParser.md) |
+| `promptHelper` | `PromptHelper` |
+
+#### Defined in
+
+[ServiceContext.ts:48](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/ServiceContext.ts#L48)
+
+___
+
+### similarity
+
+▸ **similarity**(`embedding1`, `embedding2`, `mode?`): `number`
+
+The similarity between two embeddings.
+
+#### Parameters
+
+| Name | Type | Default value |
+| :------ | :------ | :------ |
+| `embedding1` | `number`[] | `undefined` |
+| `embedding2` | `number`[] | `undefined` |
+| `mode` | [`SimilarityType`](enums/SimilarityType.md) | `SimilarityType.DEFAULT` |
+
+#### Returns
+
+`number`
+
+Similarity score, with higher numbers meaning the two embeddings are more similar.
+
+#### Defined in
+
+[Embedding.ts:22](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/Embedding.ts#L22)
+
+___
+
+### storageContextFromDefaults
+
+▸ **storageContextFromDefaults**(`«destructured»`): `Promise`<[`StorageContext`](interfaces/StorageContext.md)\>
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `«destructured»` | `BuilderParams` |
+
+#### Returns
+
+`Promise`<[`StorageContext`](interfaces/StorageContext.md)\>
+
+#### Defined in
+
+[storage/StorageContext.ts:28](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/StorageContext.ts#L28)
+
+___
+
+### walk
+
+▸ **walk**(`fs`, `dirPath`): `AsyncIterable`<`string`\>
+
+Recursively traverses a directory and yields all the paths to the files in it.
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `fs` | [`WalkableFileSystem`](interfaces/WalkableFileSystem.md) | The filesystem to use. |
+| `dirPath` | `string` | The path to the directory to traverse. 
| + +#### Returns + +`AsyncIterable`<`string`\> + +#### Defined in + +[storage/FileSystem.ts:91](https://github.com/run-llama/LlamaIndexTS/blob/main/packages/core/src/storage/FileSystem.ts#L91)
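As a rough illustration of the embedding utilities documented above, the following TypeScript sketch shows the kind of computation `similarity` and `getTopKEmbeddings` perform. This is not the library's actual implementation: it assumes plain cosine similarity for the default `mode`, and substitutes numeric indices when `embeddingIds` is omitted.

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|).
// Assumed here as the default similarity mode; the library's
// SimilarityType.DEFAULT may differ in detail.
function similarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Sketch of getTopKEmbeddings: score every candidate embedding against
// the query, drop candidates below the optional cutoff, and return the
// top K as parallel arrays of [scores, ids].
function getTopKEmbeddings(
  queryEmbedding: number[],
  embeddings: number[][],
  similarityTopK = 2,
  embeddingIds: any[] | null = null,
  similarityCutoff: number | null = null,
): [number[], any[]] {
  // Fall back to positional indices when no ids are supplied.
  const ids = embeddingIds ?? embeddings.map((_, i) => i);
  const scored = embeddings.map((e, i) => ({
    score: similarity(queryEmbedding, e),
    id: ids[i],
  }));
  scored.sort((x, y) => y.score - x.score); // highest similarity first
  const kept = scored
    .filter((s) => similarityCutoff === null || s.score >= similarityCutoff)
    .slice(0, similarityTopK);
  return [kept.map((s) => s.score), kept.map((s) => s.id)];
}
```

For example, querying with `[1, 0]` against `[[1, 0], [0, 1], [0.9, 0.1]]` and `similarityTopK = 2` keeps the first and third embeddings, since `[0, 1]` is orthogonal to the query.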