Unverified Commit 7c90e1b6 authored by Alex Yang's avatar Alex Yang Committed by GitHub

docs(next): update homepage & init setup toturial (#1386)

parent 21ba0a80
---
title: Getting Started
description: We support multiple JS runtimes, frameworks, and bundlers.
---
<Cards>
<Card title="Node.js" href="/docs/llamaindex/starter/node" />
<Card title="Vite" href="/docs/llamaindex/starter/vite" />
<Card
title="Next.js (React Server Component)"
href="/docs/llamaindex/starter/next"
/>
<Card title="Cloudflare Worker" href="/docs/llamaindex/starter/cloudflare" />
</Cards>
{
"title": "Starter",
"description": "The starter guide",
"defaultOpen": true,
"pages": ["getting-started", "next", "node"]
}
---
title: With Next.js
description: In this guide, you'll learn how to use LlamaIndex with Next.js.
---
Before you start, make sure you have tried LlamaIndex.TS in Node.js so that you understand the basics.
<Card
title="Getting Started with LlamaIndex.TS in Node.js"
href="/docs/llamaindex/starter/node"
/>
## Differences between Node.js and Next.js
Next.js is a React framework that supports both server-side and client-side rendering.
This means you need to be careful when using LlamaIndex.TS in Next.js:
don't leak private data such as API keys to the client side.
Next.js also distinguishes between build time and runtime; some computations, such as document embedding, can be done at build time for better performance.
LlamaIndex.TS has many upstream dependencies, and some of them are not compatible with Next.js out of the box.
You might need to use the `withLlamaIndex` wrapper to make sure LlamaIndex.TS works well with Next.js:
```js
// next.config.mjs / next.config.ts
import withLlamaIndex from "llamaindex/next";
/** @type {import('next').NextConfig} */
const nextConfig = {};
export default withLlamaIndex(nextConfig);
```
If you run into any dependency issues, you are welcome to open an issue on GitHub.
---
title: With Node.js/Bun/Deno
description: In this guide, you'll learn how to use LlamaIndex with Node.js, Bun, and Deno.
---
## Adding environment variables
By default, LlamaIndex uses the OpenAI provider, which requires an API key. Set the `OPENAI_API_KEY` environment variable to authenticate with OpenAI.
```shell
export OPENAI_API_KEY=your-api-key
```
Or you can use a `.env` file:
```shell
echo "OPENAI_API_KEY=your-api-key" > .env
node --env-file .env your-script.js
```
<Callout type="warn">Do not commit the API key to a Git repository.</Callout>
For more information, see [How to read environment variables from Node.js](https://nodejs.org/en/learn/command-line/how-to-read-environment-variables-from-nodejs).
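To fail fast when the key is missing, you can add a small check at startup. This is a minimal sketch: `loadApiKey` is a hypothetical helper, not part of LlamaIndex.TS.

```js
// loadApiKey is a hypothetical helper (not part of LlamaIndex.TS):
// it throws at startup instead of failing at the first OpenAI request.
function loadApiKey(env = process.env) {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    throw new Error(
      "OPENAI_API_KEY is not set; export it or run with --env-file .env",
    );
  }
  return key;
}

// Example: pass an explicit object to see the behavior
// without touching your shell environment.
console.log(loadApiKey({ OPENAI_API_KEY: "sk-placeholder" }));
```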
## TypeScript support
LlamaIndex.TS is written in TypeScript and provides type definitions.
We recommend using [tsx](https://www.npmjs.com/package/tsx) to run TypeScript scripts.
```shell
npx tsx my-script.ts
# or
node --import tsx my-script.ts
```
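As a quick smoke test of your tsx setup, any typed script will do; the file name below is just an example:

```ts
// hello.ts -- a tiny typed script to confirm tsx runs TypeScript directly
const greet = (name: string): string => `Hello, ${name}!`;

console.log(greet("LlamaIndex.TS")); // prints "Hello, LlamaIndex.TS!"
```

Run it with `npx tsx hello.ts`.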
// when we are ready, change to /docs/llamaindex
export const DOCUMENT_URL = 'https://legacy.ts.llamaindex.ai/'