Unverified commit d3fa729a, authored by Marcus Schiesser, committed by GitHub

Cleanups before 0.9 release (#1656)

parent c7c08005
Showing with 208 additions and 97 deletions
@@ -2,6 +2,8 @@
title: Local LLMs
---

import { Tab, Tabs } from "fumadocs-ui/components/tabs";

LlamaIndex.TS supports OpenAI and [other remote LLM APIs](other_llms). You can also run a local LLM on your machine!

## Using a local model via Ollama
@@ -24,7 +26,23 @@ The first time you run it will also automatically download and install the model

### Switch the LLM in your code

To switch the LLM in your code, first make sure to install the package for the Ollama model provider:
<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install @llamaindex/ollama
```
```shell tab="yarn"
yarn add @llamaindex/ollama
```
```shell tab="pnpm"
pnpm add @llamaindex/ollama
```
</Tabs>
Then, to tell LlamaIndex to use a local LLM, use the `Settings` object:
```javascript
Settings.llm = new Ollama({
@@ -34,7 +52,25 @@ Settings.llm = new Ollama({
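The diff above elides the Ollama constructor options. A minimal sketch of the full switch, assuming the `llama3.2` model tag (use whatever model you have pulled with `ollama pull`):

```typescript
import { Ollama } from "@llamaindex/ollama";
import { Settings } from "llamaindex";

// Point LlamaIndex at a locally running Ollama server.
// "llama3.2" is an assumed model tag, not prescribed by the docs above.
Settings.llm = new Ollama({
  model: "llama3.2",
});
```

This is a configuration snippet; it requires the `@llamaindex/ollama` package and a running Ollama server to actually execute.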
### Use local embeddings

If you're doing retrieval-augmented generation, LlamaIndex.TS will also call out to OpenAI to index and embed your data. To be entirely local, you can use a local embedding model from Huggingface like this:
First install the Huggingface model provider package:
<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install @llamaindex/huggingface
```
```shell tab="yarn"
yarn add @llamaindex/huggingface
```
```shell tab="pnpm"
pnpm add @llamaindex/huggingface
```
</Tabs>
And then set the embedding model in your code:
```javascript
Settings.embedModel = new HuggingFaceEmbedding({
......
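The embedding configuration is elided in the diff above; a hedged sketch, assuming the `BAAI/bge-small-en-v1.5` model (any Huggingface embedding model id should work here):

```typescript
import { HuggingFaceEmbedding } from "@llamaindex/huggingface";
import { Settings } from "llamaindex";

// Compute embeddings locally instead of calling the OpenAI embeddings API.
// The model name is an assumption for illustration.
Settings.embedModel = new HuggingFaceEmbedding({
  modelType: "BAAI/bge-small-en-v1.5",
});
```

Configuration snippet only; it requires the `@llamaindex/huggingface` package (the model is downloaded on first use).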
---
title: Installation
description: How to install llamaindex packages.
---

import { Tab, Tabs } from "fumadocs-ui/components/tabs";
......
@@ -70,10 +70,8 @@ In Cloudflare Worker and similar serverless JS environments, you need to be aware that:

- Some Node.js modules are not available in Cloudflare Worker, such as `node:fs`, `node:child_process`, `node:cluster`...
- You are recommended to design your code around network requests, for example using the `fetch` API to communicate with your database, instead of a long-running process in Node.js.
- Some LlamaIndex.TS packages are not available in Cloudflare Worker, for example `@llamaindex/readers` and `@llamaindex/huggingface`.
- The main `llamaindex` package is designed to work in all JavaScript environments, including Cloudflare Worker. If you find any issue, please report it to us.
- `@llamaindex/env` is a JS environment binding module, which polyfills some Node.js/modern Web APIs (for example, we have a memory-based `fs` module and a Crypto API polyfill). It is designed to work in all JavaScript environments, including Cloudflare Worker.
@@ -20,5 +20,5 @@ LlamaIndex.TS provides tools for beginners, advanced users, and everyone in betw

className="w-full h-[440px]"
aria-label="LlamaIndex.TS Starter"
aria-description="This is a starter example for LlamaIndex.TS, it shows the basic usage of the library."
src="https://stackblitz.com/github/run-llama/LlamaIndexTS/tree/main/examples?embed=1&file=starter.ts"
/>
@@ -7,6 +7,7 @@
"what-is-llamaindex",
"index",
"getting_started",
"migration",
"guide",
"examples",
"modules",
......
---
title: Migrating from v0.8 to v0.9
---
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
Version 0.9 of LlamaIndex.TS introduces significant architectural changes to improve package size and runtime compatibility. The main goals of this release are:
1. Reduce the package size of the main `llamaindex` package by moving dependencies into provider packages, making it more suitable for serverless environments
2. Enable consistent code across different environments by using unified imports (no separate imports for Node.js and Edge runtimes)
## Major Changes
### Installing Provider Packages
In v0.9, you need to explicitly install the provider packages you want to use. The main `llamaindex` package no longer includes these dependencies by default.
### Updating Imports
You'll need to update your imports to get classes directly from their respective provider packages. Here's how to migrate different components:
### 1. AI Model Providers
Previously:
```typescript
import { OpenAI } from "llamaindex";
```
Now:
```typescript
import { OpenAI } from "@llamaindex/openai";
```
> Note: This example requires installing the `@llamaindex/openai` package:
<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install @llamaindex/openai
```
```shell tab="yarn"
yarn add @llamaindex/openai
```
```shell tab="pnpm"
pnpm add @llamaindex/openai
```
</Tabs>
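Once installed, the imported class is used exactly as in v0.8; only the import source changed. A short sketch (the model name is illustrative, not prescribed by this guide):

```typescript
import { OpenAI } from "@llamaindex/openai";
import { Settings } from "llamaindex";

// Same Settings-based configuration as before the migration.
Settings.llm = new OpenAI({ model: "gpt-4o-mini" });
```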
For more details on available AI model providers and their configuration, see the [LLMs documentation](/docs/llamaindex/modules/llms) and the [Embedding Models documentation](/docs/llamaindex/modules/embeddings).
### 2. Storage Providers
Previously:
```typescript
import { PineconeVectorStore } from "llamaindex";
```
Now:
```typescript
import { PineconeVectorStore } from "@llamaindex/pinecone";
```
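Apart from the import, usage is unchanged. A hedged sketch of wiring the store into an index, assuming Pinecone credentials are configured via environment variables:

```typescript
import { PineconeVectorStore } from "@llamaindex/pinecone";
import { VectorStoreIndex } from "llamaindex";

// The store picks up its Pinecone credentials from the environment.
const vectorStore = new PineconeVectorStore();
const index = await VectorStoreIndex.fromVectorStore(vectorStore);
```

This requires the `@llamaindex/pinecone` package and a reachable Pinecone index, so it is a sketch rather than a runnable sample.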
For more information about available storage options, refer to the [Data Stores documentation](/docs/llamaindex/modules/data_stores).
### 3. Data Loaders
Previously:
```typescript
import { SimpleDirectoryReader } from "llamaindex";
```
Now:
```typescript
import { SimpleDirectoryReader } from "@llamaindex/readers/directory";
```
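As with the other providers, only the import path changes; the reader itself behaves as before. A short sketch, assuming your documents live in a local `./data` directory:

```typescript
import { SimpleDirectoryReader } from "@llamaindex/readers/directory";

// Load every supported file under ./data into Document objects.
const reader = new SimpleDirectoryReader();
const documents = await reader.loadData("./data");
```

This needs the `@llamaindex/readers` package installed and a `./data` directory present.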
For more details about available data loaders and their usage, check the [Loading Data](/docs/llamaindex/guide/loading) guide.
### 4. Prefer using `llamaindex` instead of `@llamaindex/core`
`llamaindex` is now re-exporting most of `@llamaindex/core`. To simplify imports, just use `import { ... } from "llamaindex"` instead of `import { ... } from "@llamaindex/core"`. This is possible because `llamaindex` is now a smaller package.
We might change imports internally in `@llamaindex/core` in the future. Let us know if you're missing something.
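For example (assuming `Document` and `Settings`, both of which are re-exported):

```typescript
// Before: import { Document, Settings } from "@llamaindex/core";
// Now, simply:
import { Document, Settings } from "llamaindex";
```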
## Benefits of the Changes
- **Smaller Bundle Size**: By moving dependencies to separate packages, your application only includes the features you actually use
- **Runtime Consistency**: The same code works across different environments without environment-specific imports
- **Improved Serverless Support**: Reduced package size makes it easier to deploy to serverless environments with size limitations
## Need Help?
If you encounter any issues during migration, please:
1. Check our [GitHub repository](https://github.com/run-llama/LlamaIndexTS) for the latest updates
2. Join our [Discord community](https://discord.gg/dGcwcsnxhU) for support
3. Open an issue on GitHub if you find a bug or have a feature request
{
"title": "Migration",
"description": "Migration between different versions",
"pages": ["0.8-to-0.9"]
}
@@ -11,36 +11,36 @@
"@azure/cosmos": "^4.1.1",
"@azure/identity": "^4.4.1",
"@azure/search-documents": "^12.1.0",
"@llamaindex/anthropic": "^0.0.33",
"@llamaindex/astra": "^0.0.4",
"@llamaindex/azure": "^0.0.4",
"@llamaindex/chroma": "^0.0.4",
"@llamaindex/clip": "^0.0.35",
"@llamaindex/cloud": "^2.0.24",
"@llamaindex/cohere": "^0.0.4",
"@llamaindex/deepinfra": "^0.0.35",
"@llamaindex/env": "^0.1.27",
"@llamaindex/google": "^0.0.6",
"@llamaindex/groq": "^0.0.50",
"@llamaindex/huggingface": "^0.0.35",
"@llamaindex/milvus": "^0.0.4",
"@llamaindex/mistral": "^0.0.4",
"@llamaindex/mixedbread": "^0.0.4",
"@llamaindex/mongodb": "^0.0.4",
"@llamaindex/node-parser": "^0.0.24",
"@llamaindex/ollama": "^0.0.39",
"@llamaindex/openai": "^0.1.51",
"@llamaindex/pinecone": "^0.0.4",
"@llamaindex/portkey-ai": "^0.0.32",
"@llamaindex/postgres": "^0.0.32",
"@llamaindex/qdrant": "^0.0.4",
"@llamaindex/readers": "^1.0.25",
"@llamaindex/replicate": "^0.0.32",
"@llamaindex/upstash": "^0.0.4",
"@llamaindex/vercel": "^0.0.10",
"@llamaindex/vllm": "^0.0.21",
"@llamaindex/weaviate": "^0.0.4",
"@llamaindex/workflow": "^0.0.10",
"@notionhq/client": "^2.2.15",
"@pinecone-database/pinecone": "^4.0.0",
"@vercel/postgres": "^0.10.0",
@@ -49,7 +49,7 @@
"commander": "^12.1.0",
"dotenv": "^16.4.5",
"js-tiktoken": "^1.0.14",
"llamaindex": "^0.8.37",
"mongodb": "6.7.0",
"postgres": "^3.4.4",
"wikipedia": "^2.1.2"
......
@@ -77,29 +77,10 @@
"default": "./dist/cjs/next.js"
}
},
"./internal/*": {
"import": "./dist/not-allow.js",
"require": "./dist/cjs/not-allow.js"
},
"./*": {
"import": {
"types": "./dist/type/*.d.ts",
@@ -112,7 +93,6 @@
}
},
"files": [
"dist",
"CHANGELOG.md",
"examples",
......
import "@llamaindex/readers/node";
{
"name": "@llamaindex/anthropic",
"description": "Anthropic Adapter for LlamaIndex",
"version": "0.0.33",
"type": "module",
"main": "./dist/index.cjs",
"module": "./dist/index.js",
......
@@ -572,94 +572,94 @@ importers:
specifier: ^12.1.0
version: 12.1.0
'@llamaindex/anthropic':
specifier: ^0.0.33
version: link:../packages/providers/anthropic
'@llamaindex/astra':
specifier: ^0.0.4
version: link:../packages/providers/storage/astra
'@llamaindex/azure':
specifier: ^0.0.4
version: link:../packages/providers/storage/azure
'@llamaindex/chroma':
specifier: ^0.0.4
version: link:../packages/providers/storage/chroma
'@llamaindex/clip':
specifier: ^0.0.35
version: link:../packages/providers/clip
'@llamaindex/cloud':
specifier: ^2.0.24
version: link:../packages/cloud
'@llamaindex/cohere':
specifier: ^0.0.4
version: link:../packages/providers/cohere
'@llamaindex/deepinfra':
specifier: ^0.0.35
version: link:../packages/providers/deepinfra
'@llamaindex/env':
specifier: ^0.1.27
version: link:../packages/env
'@llamaindex/google':
specifier: ^0.0.6
version: link:../packages/providers/google
'@llamaindex/groq':
specifier: ^0.0.50
version: link:../packages/providers/groq
'@llamaindex/huggingface':
specifier: ^0.0.35
version: link:../packages/providers/huggingface
'@llamaindex/milvus':
specifier: ^0.0.4
version: link:../packages/providers/storage/milvus
'@llamaindex/mistral':
specifier: ^0.0.4
version: link:../packages/providers/mistral
'@llamaindex/mixedbread':
specifier: ^0.0.4
version: link:../packages/providers/mixedbread
'@llamaindex/mongodb':
specifier: ^0.0.4
version: link:../packages/providers/storage/mongodb
'@llamaindex/node-parser':
specifier: ^0.0.24
version: link:../packages/node-parser
'@llamaindex/ollama':
specifier: ^0.0.39
version: link:../packages/providers/ollama
'@llamaindex/openai':
specifier: ^0.1.51
version: link:../packages/providers/openai
'@llamaindex/pinecone':
specifier: ^0.0.4
version: link:../packages/providers/storage/pinecone
'@llamaindex/portkey-ai':
specifier: ^0.0.32
version: link:../packages/providers/portkey-ai
'@llamaindex/postgres':
specifier: ^0.0.32
version: link:../packages/providers/storage/postgres
'@llamaindex/qdrant':
specifier: ^0.0.4
version: link:../packages/providers/storage/qdrant
'@llamaindex/readers':
specifier: ^1.0.25
version: link:../packages/readers
'@llamaindex/replicate':
specifier: ^0.0.32
version: link:../packages/providers/replicate
'@llamaindex/upstash':
specifier: ^0.0.4
version: link:../packages/providers/storage/upstash
'@llamaindex/vercel':
specifier: ^0.0.10
version: link:../packages/providers/vercel
'@llamaindex/vllm':
specifier: ^0.0.21
version: link:../packages/providers/vllm
'@llamaindex/weaviate':
specifier: ^0.0.4
version: link:../packages/providers/storage/weaviate
'@llamaindex/workflow':
specifier: ^0.0.10
version: link:../packages/workflow
'@notionhq/client':
specifier: ^2.2.15
@@ -686,7 +686,7 @@ importers:
specifier: ^1.0.14
version: 1.0.18
llamaindex:
specifier: ^0.8.37
version: link:../packages/llamaindex
mongodb:
specifier: 6.7.0
......