Commit 244ec3cb authored by Sourabh Desai

add arize phoenix integration

parent 56533760
run:
	echo "Running in local mode."
-	docker compose create db localstack
-	docker compose start db localstack
+	docker compose create db localstack phoenix
+	docker compose start db localstack phoenix
	poetry run start
run_docker:
@@ -31,6 +31,11 @@ For any issues in setting up the above or during the rest of your development, y
- [Open & already closed Github Issues](https://github.com/run-llama/sec-insights/issues?q=is%3Aissue+is%3Aclosed)
- The [#sec-insights discord channel](https://discord.com/channels/1059199217496772688/1150942525968879636)
## LLM Observability
This project will automatically spin up a local version of [Arize Phoenix](https://phoenix.arize.com/) for you and send traces to it as you're using the chat interface.
Arize Phoenix is an open-source LLM observability & evaluation tool. LlamaIndex's event instrumentation system is deeply integrated with Arize Phoenix to make it easier for you to debug your LLM application during development. Simply open the Arize Phoenix Dashboard at [`http://localhost:6006/`](http://localhost:6006/) when running SEC Insights locally to see the traced calls to LLMs, Embedding Models, Vector DBs, and more.
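Under the hood, this works by setting LlamaIndex's global handler at backend startup. The following is a minimal sketch of the same hookup, assuming the `llama-index-callbacks-arize-phoenix` and `arize-phoenix` packages are installed and Phoenix is running on its default ports:

```python
# Minimal sketch: route LlamaIndex calls to a locally running Arize Phoenix
# instance (mirrors what the backend does on startup outside of Render deployments).
import llama_index.core

llama_index.core.set_global_handler("arize_phoenix")

# From here on, LLM, embedding, and retrieval calls made through LlamaIndex are
# traced and can be inspected in the Phoenix UI at http://localhost:6006/.
```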
## Scripts
The `scripts/` folder contains several scripts that are useful for both operations and development.
@@ -11,6 +11,7 @@ from alembic import script
from alembic.runtime import migration
from sqlalchemy.engine import create_engine, Engine
from llama_index.core.node_parser.text.utils import split_by_sentence_tokenizer
import llama_index.core
from app.api.api import api_router
from app.db.wait_for_db import check_database_connection
@@ -89,6 +90,10 @@ async def lifespan(app: FastAPI):
    except FileExistsError:
        # Sometimes seen in deployments, should be benign.
        logger.info("Tried to re-download NLTK files but already exists.")
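
    # Enable Arize Phoenix tracing only outside of Render deployments (i.e. local runs).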
    if not settings.RENDER:
        llama_index.core.set_global_handler("arize_phoenix")

    yield
    # This section is run on app shutdown
    await vector_store.close()
@@ -11,6 +11,7 @@ services:
- "127.0.0.1:8000:8000"
depends_on:
- db
- phoenix
env_file:
- .env
- .env.docker
@@ -41,5 +42,14 @@ services:
- "${LOCALSTACK_VOLUME_DIR:-./volume}:/var/lib/localstack"
- "/var/run/docker.sock:/var/run/docker.sock"
# useful for local workflow debugging
# taken from here: https://docs.arize.com/phoenix/deployment/docker#postgresql
phoenix:
image: arizephoenix/phoenix:latest # Must be greater than 4.0 version to work
ports:
- 6006:6006 # PHOENIX_PORT
- 4317:4317 # PHOENIX_GRPC_PORT
- 9090:9090 # [Optional] PROMETHEUS PORT IF ENABLED
volumes:
postgres_data:
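Aside (not part of the commit): once the stack is up, a quick way to confirm the Phoenix container is reachable on the UI port mapped above is a one-off request with the Python standard library. This is only an illustrative sketch; the URL assumes the default `6006:6006` mapping shown in the service definition:

```python
# Sanity check (illustrative only): verify the Phoenix UI responds on the mapped port.
import urllib.request

with urllib.request.urlopen("http://localhost:6006/") as resp:  # PHOENIX_PORT
    print(resp.status)  # expect 200 once the phoenix container is running
```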
@@ -51,6 +51,8 @@ fire = "^0.5.0"
sec-edgar-downloader = "~5.0"
pytickersymbols = "^1.13.0"
awscli-local = "^0.20"
llama-index-callbacks-arize-phoenix = "^0.4.0"
arize-phoenix = "^8.12.1"
[tool.poetry.scripts]
start = "app.main:start"