Commit 31c99baf authored by Yi Ding

Merge branch 'main' of github.com:run-llama/LlamaIndexTS

parents b2810777 805d6fb8
# LlamaIndex.TS

LlamaIndex is a data framework for your LLM application.

Use your own data with large language models (LLMs, OpenAI ChatGPT and others) in Typescript and Javascript.

Documentation: https://ts.llamaindex.ai/

## What is LlamaIndex.TS?

LlamaIndex.TS aims to be a lightweight, easy-to-use set of libraries to help you integrate large language models into your applications with your own data.
......
@@ -37,15 +37,14 @@ For more complex applications, our lower-level APIs allow advanced users to cust

Our documentation includes [Installation Instructions](./installation.md) and a [Starter Tutorial](./starter.md) to build your first application.

Once you're up and running, [High-Level Concepts](./concepts.md) has an overview of LlamaIndex's modular architecture. For more hands-on practical examples, look through our [End-to-End Tutorials](./end_to_end.md).

## 🗺️ Ecosystem

To download or contribute, find LlamaIndex on:

- Github: https://github.com/run-llama/LlamaIndexTS
- NPM: https://www.npmjs.com/package/llamaindex
- LlamaIndex (Python): https://pypi.org/project/llama-index/

## Community
......
@@ -12,7 +12,7 @@ LlamaIndex.TS offers several core modules, separated into high-level modules for

- [**Indexes**](./high_level/data_index.md): Indexes store the Nodes and the embeddings of those nodes.
- [**QueryEngine**](./high_level/query_engine.md): Query engines process the query you put in and give you back the result. Query engines generally combine a pre-built prompt with selected nodes from your Index to give the LLM the context it needs to answer your query.
- [**ChatEngine**](./high_level/chat_engine.md): A ChatEngine helps you build a chatbot that will interact with your Indexes, as sketched below.
......
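To make the split between these two modules concrete, here is a hedged sketch of wiring a query engine and a chat engine to the same index. It assumes the `Document`, `VectorStoreIndex`, and `ContextChatEngine` exports and the `asQueryEngine()`/`asRetriever()` helpers from the `llamaindex` package; exact method signatures have varied across versions.

```typescript
import { Document, VectorStoreIndex, ContextChatEngine } from "llamaindex";

async function main() {
  // Build a small index from one in-memory document (placeholder text).
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: "LlamaIndex.TS is a data framework for LLM applications." }),
  ]);

  // QueryEngine: one-shot question answering over the index.
  const queryEngine = index.asQueryEngine();
  const answer = await queryEngine.query("What is LlamaIndex.TS?");
  console.log(answer.toString());

  // ChatEngine: multi-turn conversation grounded in the same index,
  // here via a retriever that fetches relevant nodes for each message.
  const chatEngine = new ContextChatEngine({ retriever: index.asRetriever() });
  const reply = await chatEngine.chat("Tell me more about its core modules.");
  console.log(reply.toString());
}

main().catch(console.error);
```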
@@ -11,7 +11,7 @@ In a new folder:

```bash npm2yarn
npm install typescript
npm install @types/node
npx tsc --init # if needed
```

Create the file `example.ts`. This code will load some example data, create a document, index it (which creates embeddings using OpenAI), and then create a query engine to answer questions about the data.
......
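The contents of `example.ts` are collapsed in this diff. As a rough orientation only, a minimal sketch of such a file might look like the following; it assumes the `Document` and `VectorStoreIndex` exports from the `llamaindex` package, an `OPENAI_API_KEY` in the environment, and a placeholder data file and query, and signatures may differ between package versions.

```typescript
import fs from "node:fs/promises";
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  // Load some example data from disk (path is a placeholder).
  const essay = await fs.readFile("data/essay.txt", "utf-8");

  // Wrap the raw text in a Document.
  const document = new Document({ text: essay });

  // Index the document; this computes embeddings via the OpenAI API.
  const index = await VectorStoreIndex.fromDocuments([document]);

  // Create a query engine and ask a question about the data.
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query("What is this essay about?");
  console.log(response.toString());
}

main().catch(console.error);
```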
@@ -24,8 +24,8 @@ export default function Home(): JSX.Element {

  const { siteConfig } = useDocusaurusContext();
  return (
    <Layout
      title={`${siteConfig.title}`}
      description="LlamaIndex is a data framework for your LLM application. Use your own data with large language models (LLMs, OpenAI ChatGPT and others) in Typescript and Javascript."
    >
      <HomepageHeader />
      <main>
......
# Simple Examples

Due to packaging, you will need to run these commands to get started:

```bash
pnpm --filter llamaindex build
pnpm install
```

Then run the examples with `ts-node`, for example `npx ts-node vectorIndex.ts`.
import { LlamaDeuce } from "llamaindex";

(async () => {
  const deuce = new LlamaDeuce();
......
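The remainder of this example is collapsed in the diff. Purely as an illustration of where it is heading, a hedged completion might look like this; the `chat()` call and the `{ content, role }` message shape are assumptions about the package's LLM interface at the time and may not match the actual example file.

```typescript
import { LlamaDeuce } from "llamaindex";

(async () => {
  const deuce = new LlamaDeuce();

  // Send a single-turn chat message to the Llama 2 model and print the reply.
  // The message shape and response fields are assumptions, not the repo's exact code.
  const response = await deuce.chat([{ content: "Hello, world!", role: "user" }]);
  console.log(response.message.content);
})();
```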