Commit 8b0e0e3c authored by Marcus Schiesser, committed by GitHub

docs: use dedicated embedding model for ollama (#745)

parent 87142b29
# Ollama
To use Ollama embeddings, you need to import `OllamaEmbedding` from `llamaindex`.
Note that you need to pull the embedding model before using it.
In the example below, we're using the [`nomic-embed-text`](https://ollama.com/library/nomic-embed-text) model, so you have to call:
```shell
ollama pull nomic-embed-text
```
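
Optionally, you can verify that the model is now available locally before running the snippet below:

```shell
ollama list
```
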
```ts
import { Document, OllamaEmbedding, Settings } from "llamaindex";

Settings.embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });

const document = new Document({ text: essay, id_: "essay" });
// ...
```
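
As a rough, self-contained sketch of how the configured embedding model is used downstream (assuming the standard `VectorStoreIndex` / `asQueryEngine` flow and an Ollama LLM for answer synthesis, neither of which is part of this commit, plus placeholder text in place of the `essay` variable):

```ts
import { Document, OllamaEmbedding, Settings, VectorStoreIndex } from "llamaindex";
import { Ollama } from "llamaindex/llm/ollama";

// Embeddings come from nomic-embed-text; answers are synthesized by llama3.
Settings.embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });
Settings.llm = new Ollama({ model: "llama3" });

(async () => {
  const document = new Document({
    text: "Ollama runs open models such as llama3 locally.",
    id_: "doc1",
  });

  // Indexing calls the configured embedding model for each chunk.
  const index = await VectorStoreIndex.fromDocuments([document]);

  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({ query: "What does Ollama do?" });
  console.log(response.toString());
})();
```
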

The example file in the commit is updated accordingly: `Ollama` handles chat, while a dedicated `OllamaEmbedding` instance produces the embeddings.

```ts
import { OllamaEmbedding } from "llamaindex";
import { Ollama } from "llamaindex/llm/ollama";

(async () => {
  const llm = new Ollama({ model: "llama3" });
  const embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });
  {
    const response = await llm.chat({
      messages: [{ content: "Tell me a joke.", role: "user" }],
      // ...
    });
    // ...
    console.log(); // newline
  }
  {
    const embedding = await embedModel.getTextEmbedding("Hello world!");
    console.log("Embedding:", embedding);
  }
})();
```
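
To sanity-check the embedding side without building an index, two embeddings can be compared directly. The `cosineSimilarity` helper below is purely illustrative and not part of `llamaindex`:

```ts
import { OllamaEmbedding } from "llamaindex";

// Illustrative helper: cosine similarity between two embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

(async () => {
  const embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });
  const a = await embedModel.getTextEmbedding("Ollama runs models locally.");
  const b = await embedModel.getTextEmbedding("Local model serving with Ollama.");
  console.log("Cosine similarity:", cosineSimilarity(a, b));
})();
```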

The commit also bumps the pinned package manager in `package.json` from `pnpm@9.0.4` to `pnpm@9.0.5`:

```diff
     "turbo": "^1.13.2",
     "typescript": "^5.4.5"
   },
-  "packageManager": "pnpm@9.0.4",
+  "packageManager": "pnpm@9.0.5",
   "pnpm": {
     "overrides": {
       "trim": "1.0.1",
```
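
If you rely on Corepack to honor the `packageManager` field, one way to pick up the bumped version locally is:

```shell
corepack enable
corepack prepare pnpm@9.0.5 --activate
```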