Unverified commit c01502fb authored by Alex Yang, committed by GitHub

docs: update document (#1418)

parent 075f88db
@@ -20,7 +20,16 @@ export default async function Page(props: {
   const MDX = page.data.body;
   return (
-    <DocsPage toc={page.data.toc} full={page.data.full}>
+    <DocsPage
+      toc={page.data.toc}
+      full={page.data.full}
+      editOnGithub={{
+        owner: "run-llama",
+        repo: "LlamaIndexTS",
+        sha: "main",
+        path: `apps/next/src/content/docs/${page.file.path}`,
+      }}
+    >
       <DocsTitle>{page.data.title}</DocsTitle>
       <DocsDescription>{page.data.description}</DocsDescription>
       <DocsBody>
---
title: Langtrace
description: Learn how to integrate LlamaIndex.TS with Langtrace.
---
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
Enhance your observability with Langtrace, a robust open-source tool that supports OpenTelemetry and is designed to trace, evaluate, and manage LLM applications seamlessly. Langtrace integrates directly with LlamaIndex, offering detailed, real-time insights into performance metrics such as accuracy, evaluations, and latency.
## Install
- Self-host Langtrace, or sign up for [Langtrace](https://www.langtrace.ai) Cloud and generate an API key
<Tabs groupId="install-langtrase" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install @langtrase/typescript-sdk
```
```shell tab="yarn"
yarn add @langtrase/typescript-sdk
```
```shell tab="pnpm"
pnpm add @langtrase/typescript-sdk
```
</Tabs>
## Initialize
```js
import * as Langtrace from "@langtrase/typescript-sdk";
Langtrace.init({ api_key: "<YOUR_API_KEY>" });
```
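Once `init` has run, calls made through LlamaIndex.TS are traced automatically. The snippet below is a minimal end-to-end sketch, not a definitive setup: it assumes a current LlamaIndex.TS query API (`Document`, `VectorStoreIndex`, `asQueryEngine`), that your LLM provider credentials (e.g. `OPENAI_API_KEY`) are already configured, and that the API key is read from a `LANGTRACE_API_KEY` environment variable (an example name, not required by the SDK).
```js
import * as Langtrace from "@langtrase/typescript-sdk";
import { Document, VectorStoreIndex } from "llamaindex";

// Initialize tracing once, before any LLM calls are made.
Langtrace.init({ api_key: process.env.LANGTRACE_API_KEY });

async function main() {
  // Build a tiny in-memory index from a single document.
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: "LlamaIndex.TS is a data framework for LLM apps." }),
  ]);

  // The embedding and completion calls behind this query show up as a trace in Langtrace.
  const result = await index.asQueryEngine().query({
    query: "What is LlamaIndex.TS?",
  });
  console.log(result.toString());
}

main().catch(console.error);
```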
Features:
- OpenTelemetry compliant, ensuring broad compatibility with observability platforms.
- Provides comprehensive logs and detailed traces of all components.
- Real-time monitoring of accuracy, evaluations, usage, costs, and latency.
- For more configuration options and details, visit [Langtrace Docs](https://docs.langtrace.ai/introduction).
{
"title": "Integration",
"description": "See our integrations",
"pages": ["open-llm-metry", "lang-trace"]
}
---
title: OpenLLMetry
description: Learn how to integrate LlamaIndex.TS with OpenLLMetry.
---
import { Tab, Tabs } from "fumadocs-ui/components/tabs";
[OpenLLMetry](https://github.com/traceloop/openllmetry-js) is an open-source project based on OpenTelemetry for tracing and monitoring
LLM applications. It connects to [all major observability platforms](https://www.traceloop.com/docs/openllmetry/integrations/introduction) and installs in minutes.
### Usage Pattern
<Tabs groupId="install-traceloop" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install @traceloop/node-server-sdk
```
```shell tab="yarn"
yarn add @traceloop/node-server-sdk
```
```shell tab="pnpm"
pnpm add @traceloop/node-server-sdk
```
</Tabs>
```js
import * as traceloop from "@traceloop/node-server-sdk";
traceloop.initialize({
  apiKey: process.env.TRACELOOP_API_KEY,
  disableBatch: true
});
```
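After `initialize` runs, OpenLLMetry's instrumentation captures the LLM calls that LlamaIndex.TS makes, so an ordinary query is enough to produce traces. The following is a minimal sketch under the same assumptions as the Langtrace example (current LlamaIndex.TS query API, `TRACELOOP_API_KEY` and your LLM provider key already set):
```js
import * as traceloop from "@traceloop/node-server-sdk";
import { Document, VectorStoreIndex } from "llamaindex";

// Initialize once at startup; disableBatch sends spans immediately, which is handy in development.
traceloop.initialize({
  apiKey: process.env.TRACELOOP_API_KEY,
  disableBatch: true,
});

async function main() {
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: "OpenLLMetry traces LLM applications over OpenTelemetry." }),
  ]);

  // The embedding and completion calls behind this query are exported as traces.
  const result = await index.asQueryEngine().query({
    query: "What does OpenLLMetry do?",
  });
  console.log(result.toString());
}

main().catch(console.error);
```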
{
"title": "Loading Data",
"description": "Loading Data using LlamaIndex.TS",
"pages": ["index"]
}
@@ -8,6 +8,7 @@
     "index",
     "setup",
     "starter",
+    "readers",
     "loading",
     "Integration"
   ]
 }
{
"title": "Loading",
"description": "File Readers Collection",
"pages": ["index"]
}
@@ -20,7 +20,7 @@ import {
     <>
       <SiTypescript className="inline" color="#3178C6" /> TypeScript
     </>
-  } href="/docs/llamaindex/setup/typescript.mdx" />
+  } href="/docs/llamaindex/setup/typescript" />
   <Card title={
     <>
       <SiVite className='inline' color='#646CFF' /> Vite
@@ -29,7 +29,7 @@ import {
   <Card
     title={
       <>
-        <SiNextdotjs className='inline' color='#000000' /> Next.js (React Server Component)
+        <SiNextdotjs className='inline' /> Next.js (React Server Component)
       </>
     }
     href="/docs/llamaindex/setup/next"