This is a [LlamaIndex](https://www.llamaindex.ai/) project using [Express](https://expressjs.com/) bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama).
## Getting Started
First, install the dependencies with your package manager of choice:

```
pnpm install
# or
npm install
```
Second, run the development server:
```
pnpm run dev
# or
npm run dev
```
Then call the Express API endpoint `/api/llm` to see the result.
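For example, you can send a request with `curl`. The snippet below is only a sketch: it assumes the server listens on port 8000 and that the endpoint accepts a chat-style JSON body, so adjust the port and payload to match your generated route handler:

```
# Port and request body are assumptions; check your server config and route handler.
curl --location 'http://localhost:8000/api/llm' \
  --header 'Content-Type: application/json' \
  --data '{ "messages": [{ "role": "user", "content": "Hello" }] }'
```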