Unverified Commit 3e02793b authored by Marc Klingen's avatar Marc Klingen Committed by GitHub
fix: public links in langfuse cookbook (#11955)

%% Cell type:markdown id:d6509c3a tags:
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/callbacks/LangfuseCallbackHandler.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
%% Cell type:markdown id:c0d8b66c tags:
# Langfuse Callback Handler
[Langfuse](https://langfuse.com/docs) is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications.
The `LangfuseCallbackHandler` is integrated with Langfuse and empowers you to seamlessly track and monitor performance, traces, and metrics of your LlamaIndex application. Detailed traces of the LlamaIndex context augmentation and the LLM querying processes are captured and can be inspected directly in the Langfuse UI.
%% Cell type:markdown id:4a59a00e tags:
![langfuse-tracing](https://static.langfuse.com/llamaindex-langfuse-docs.gif)
%% Cell type:markdown id:3b9057da tags:
## Setup
%% Cell type:markdown id:5d9dfc7f tags:
### Install packages
%% Cell type:code id:49c3527e tags:
``` python
%pip install llama-index llama-index-callbacks-langfuse
```
%% Cell type:markdown id:bc10630b tags:
### Configure environment
%% Cell type:markdown id:4c256817 tags:
If you haven't already, [sign up on Langfuse](https://cloud.langfuse.com/auth/sign-up) and obtain your API keys from the project settings.
%% Cell type:code id:787e836d tags:
``` python
import os
# Langfuse
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region; for the 🇺🇸 US region use "https://us.cloud.langfuse.com"
# OpenAI
os.environ["OPENAI_API_KEY"] = "sk-..."
```
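%% Cell type:markdown id:a1b2c3d4 tags:
Optionally, you can sanity-check the configuration before continuing. The helper below is not part of the Langfuse SDK; it is a small convenience sketch using only the standard library, and the variable list is an assumption you should adjust to your setup.
%% Cell type:code id:b2c3d4e5 tags:
``` python
import os

# Environment variables this notebook expects to be set.
REQUIRED_VARS = [
    "LANGFUSE_SECRET_KEY",
    "LANGFUSE_PUBLIC_KEY",
    "LANGFUSE_HOST",
    "OPENAI_API_KEY",
]


def missing_env_vars(required=REQUIRED_VARS):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]


missing = missing_env_vars()
if missing:
    print("Missing configuration:", ", ".join(missing))
```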
%% Cell type:markdown id:1fe2ba01 tags:
### Register the Langfuse callback handler
%% Cell type:markdown id:cfef9ddc tags:
#### Option 1: Set global LlamaIndex handler
%% Cell type:code id:72afb2b9 tags:
``` python
import llama_index.core
from llama_index.core import set_global_handler

set_global_handler("langfuse")
# Read the handler via the module attribute: a `from ... import global_handler`
# executed before set_global_handler() would still be bound to the old value.
langfuse_callback_handler = llama_index.core.global_handler
```
%% Cell type:markdown id:0e6557d2 tags:
#### Option 2: Use Langfuse callback directly
%% Cell type:code id:4bdd95bf tags:
``` python
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from langfuse.llama_index import LlamaIndexCallbackHandler
langfuse_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([langfuse_callback_handler])
```
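%% Cell type:markdown id:c3d4e5f6 tags:
Under either option, LlamaIndex emits events (index construction, retrieval, LLM calls) to every handler registered on the callback manager, and the Langfuse handler turns those events into traces. The toy model below is a simplified, hypothetical illustration of that fan-out mechanism, not the actual `CallbackManager` API.
%% Cell type:code id:d4e5f6a7 tags:
``` python
class RecordingHandler:
    """Toy handler that records every event it receives."""

    def __init__(self):
        self.events = []

    def on_event(self, name, payload):
        self.events.append((name, payload))


class ToyCallbackManager:
    """Dispatches each emitted event to all registered handlers."""

    def __init__(self, handlers):
        self.handlers = list(handlers)

    def emit(self, name, payload=None):
        for handler in self.handlers:
            handler.on_event(name, payload)


handler = RecordingHandler()
manager = ToyCallbackManager([handler])
manager.emit("query_start", {"query": "What did the author do growing up?"})
manager.emit("query_end", {"response": "..."})
print(len(handler.events))  # 2
```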
%% Cell type:markdown id:e3e03ce7 tags:
### Flush events to Langfuse
%% Cell type:markdown id:e2c811ec tags:
The Langfuse SDKs queue and batch events in the background to reduce the number of network requests and improve overall performance. Before exiting your application, make sure all queued events have been flushed to the Langfuse servers.
%% Cell type:code id:4e28876c tags:
``` python
# ... your LlamaIndex calls here ...
langfuse_callback_handler.flush()
```
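%% Cell type:markdown id:e5f6a7b8 tags:
To illustrate why `flush()` matters, here is a simplified sketch (not the actual Langfuse SDK) of a client that buffers events and sends them in batches: events still sitting below the batch size would be lost on exit without a final flush.
%% Cell type:code id:f6a7b8c9 tags:
``` python
import queue


class BufferedClient:
    """Toy client that queues events and 'sends' them in batches."""

    def __init__(self, batch_size=10):
        self._queue = queue.Queue()
        self.batch_size = batch_size
        self.sent_batches = []  # stands in for network requests

    def enqueue(self, event):
        self._queue.put(event)
        if self._queue.qsize() >= self.batch_size:
            self._send_batch()

    def _send_batch(self):
        batch = []
        while not self._queue.empty() and len(batch) < self.batch_size:
            batch.append(self._queue.get())
        if batch:
            self.sent_batches.append(batch)

    def flush(self):
        """Drain everything still queued before shutdown."""
        while not self._queue.empty():
            self._send_batch()


client = BufferedClient(batch_size=10)
for i in range(3):
    client.enqueue({"trace": i})
print(len(client.sent_batches))  # 0 -- fewer than batch_size, nothing sent yet
client.flush()
print(len(client.sent_batches))  # 1 -- the final partial batch was sent
```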
%% Cell type:markdown id:6b86f1b5 tags:
Done! ✨ Traces and metrics from your LlamaIndex application are now automatically tracked in Langfuse. If you construct a new index or query an LLM with your documents in context, your traces and metrics are immediately visible in the Langfuse UI. Next, let's take a look at how traces appear in Langfuse.
%% Cell type:markdown id:1f0d4465 tags:
## Example
%% Cell type:markdown id:8a9f3428 tags:
Fetch and save example data.
%% Cell type:code id:aa303ae3 tags:
``` python
!mkdir -p 'data/'
!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham_essay.txt'
```
%% Cell type:markdown id:9f053996 tags:
Run an example index construction, query, and chat.
%% Cell type:code id:983cbedd tags:
``` python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
# Create index
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
# Execute query
query_engine = index.as_query_engine()
query_response = query_engine.query("What did the author do growing up?")
print(query_response)
# Execute chat query
chat_engine = index.as_chat_engine()
chat_response = chat_engine.chat("What did the author do growing up?")
print(chat_response)
# To see the results in Langfuse immediately, flush the callback handler
langfuse_callback_handler.flush()
```
%% Cell type:markdown id:d5cdd88f tags:
Done! ✨ You will now see traces of your index and queries in your Langfuse project.
Example traces (public links):
1. [Query](https://cloud.langfuse.com/project/cltipxbkn0000cdd7sbfbpovm/traces/f2e7f721-0940-4139-9b3a-e5cc9b0cb2d3)
2. [Query (chat)](https://cloud.langfuse.com/project/cltipxbkn0000cdd7sbfbpovm/traces/89c62a4d-e992-4923-a6b7-e2f27ae4cff3)
3. [Session](https://cloud.langfuse.com/project/cltipxbkn0000cdd7sbfbpovm/sessions/notebook-session-2)
%% Cell type:markdown id:0b50845f tags:
## 📚 More details
Check out the full [Langfuse documentation](https://langfuse.com/docs) for more details on Langfuse's tracing and analytics capabilities and how to make the most of this integration.