    LLM performance metric tracking (#2825) · dd7c4675
    Timothy Carambat authored
    
    * WIP performance metric tracking
    
    * fix: patch UI calling .toFixed() on a null metric (see the formatting sketch after this list)
    Anthropic tracking migration
    cleanup logs
    
    * Apipie implementation, not tested
    
    * Cleanup Anthropic notes, Add support for AzureOpenAI tracking
    
    * bedrock token metric tracking
    
    * Cohere support
    
    * feat: improve default stream handler to track usage for providers that are actually OpenAI-compliant in their usage reporting (see the stream-handler sketch after this list)
    add DeepSeek support
    
    * feat: Add FireworksAI token tracking/reporting
    fix: improve handler when usage: null is reported (why?)
    
    * Add token reporting for GenericOpenAI
    
    * token reporting for koboldcpp + lmstudio
    
    * lint
    
    * support Groq token tracking
    
    * HF token tracking
    
    * token tracking for togetherai
    
    * LiteLLM token tracking
    
    * linting + Mistral token tracking support
    
    * XAI token metric reporting
    
    * native provider runner
    
    * LocalAI token tracking
    
    * Novita token tracking
    
    * OpenRouter token tracking
    
    * Apipie stream metrics
    
    * textgenwebui token tracking
    
    * perplexity token reporting
    
    * ollama token reporting
    
    * lint
    
    * put back comment
    
    * Rip out the LangChain Ollama wrapper and use the official library (see the Ollama sketch after this list)
    
    * patch images with new ollama lib
    
    * improve ollama offline message
    
    * fix image handling in ollama llm provider
    
    * lint
    
    * NVIDIA NIM token tracking
    
    * update OpenAI compatibility responses
    
    * UI/UX show/hide metrics on click for user preference
    
    * update bedrock client
    
    ---------
    
    Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
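
For context on the `.toFixed()` null patch above: a minimal sketch (hypothetical helper and field names, not the actual AnythingLLM component) of guarding a metric that a provider may never report before formatting it for display.

```javascript
// Hypothetical helper: some providers never report usage, so the metric can be
// null/undefined and calling .toFixed() on it directly would throw.
function formatTokensPerSecond(metrics) {
  const tps = metrics?.outputTps;
  if (tps === null || tps === undefined || Number.isNaN(tps)) return "--";
  return `${tps.toFixed(2)} tok/s`;
}

console.log(formatTokensPerSecond({ outputTps: 42.1234 })); // "42.12 tok/s"
console.log(formatTokensPerSecond({ outputTps: null }));    // "--"
```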
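
For the default stream handler change: a minimal sketch of collecting `usage` from an OpenAI-compliant stream and deriving tokens/sec, assuming the official `openai` Node client. The function name and metric field names are illustrative, not the project's actual handler, and `stream_options.include_usage` support varies by provider, hence the defensive handling of `usage: null` chunks.

```javascript
import OpenAI from "openai";

// Sketch: stream a chat completion, accumulate text, and pick up the usage
// block if the provider ever sends one.
async function streamWithMetrics({ baseURL, apiKey, model, messages }) {
  const client = new OpenAI({ baseURL, apiKey });
  const start = Date.now();
  let usage = null;
  let text = "";

  const stream = await client.chat.completions.create({
    model,
    messages,
    stream: true,
    stream_options: { include_usage: true },
  });

  for await (const chunk of stream) {
    text += chunk.choices?.[0]?.delta?.content ?? "";
    // Most chunks carry `usage: null`; only a truly compliant provider sends
    // real token counts on the final chunk, so never assume it is present.
    if (chunk.usage) usage = chunk.usage;
  }

  const seconds = (Date.now() - start) / 1000;
  const completionTokens = usage?.completion_tokens ?? 0;
  return {
    text,
    metrics: {
      prompt_tokens: usage?.prompt_tokens ?? 0,
      completion_tokens: completionTokens,
      total_tokens: usage?.total_tokens ?? 0,
      outputTps: seconds > 0 ? completionTokens / seconds : 0,
      duration: seconds,
    },
  };
}
```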
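
For the Ollama change: a minimal sketch, assuming the official `ollama` npm package, of streaming a chat and reading the token counts Ollama reports on its final chunk. This is an illustration under those assumptions, not AnythingLLM's actual provider code.

```javascript
import { Ollama } from "ollama";

// Sketch: use the official client instead of a LangChain wrapper and build
// metrics from the counters Ollama attaches to the `done` chunk.
async function ollamaChatWithMetrics({ host, model, messages }) {
  const client = new Ollama({ host }); // e.g. "http://127.0.0.1:11434"
  const stream = await client.chat({ model, messages, stream: true });

  let text = "";
  let metrics = null;

  for await (const chunk of stream) {
    text += chunk.message?.content ?? "";
    if (chunk.done) {
      // eval_duration is reported in nanoseconds.
      const seconds = (chunk.eval_duration ?? 0) / 1e9;
      metrics = {
        prompt_tokens: chunk.prompt_eval_count ?? 0,
        completion_tokens: chunk.eval_count ?? 0,
        outputTps: seconds > 0 ? (chunk.eval_count ?? 0) / seconds : 0,
      };
    }
  }
  return { text, metrics };
}
```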