This project is mirrored from https://github.com/BerriAI/litellm.
Pull mirroring failed.
Repository mirroring has been paused due to too many failed attempts. It can be resumed by a project maintainer or owner.
Last successful update.
- Sep 12, 2024
- Ishaan Jaff authored: This reverts commit 5910d87a, reversing changes made to 6d1455de.
- Krish Dholakia authored:
  - fix(caching.py): set TTL for async_increment cache. Fixes issue where the TTL for the Redis client was not being set on increment_cache. Fixes https://github.com/BerriAI/litellm/issues/5609
  - fix(caching.py): fix increment cache with TTL for the sync increment cache on Redis. Fixes https://github.com/BerriAI/litellm/issues/5609
  - fix(router.py): support adding a retry policy + allowed fails policy via config.yaml
  - fix(router.py): don't cooldown single deployments. No point, as there's no other deployment to load-balance with.
  - fix(user_api_key_auth.py): support setting allowed email domains on JWT tokens. Closes https://github.com/BerriAI/litellm/issues/5605
  - docs(token_auth.md): add user upsert + allowed email domain to the JWT auth docs
  - fix(litellm_pre_call_utils.py): fix dynamic key logging when team id is set. Fixes issue where key logging would not be set if team metadata was not none.
  - fix(secret_managers/main.py): load environment variables correctly. Fixes issue where os.environ/ was not being loaded correctly.
  - test(test_router.py): fix test
  - feat(spend_tracking_utils.py): support logging additional usage params, e.g. prompt caching values for deepseek
  - test: fix tests
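The TTL fix above addresses a counter key that was incremented without ever receiving an expiry, so it could live forever. A minimal in-memory sketch of the increment-with-TTL pattern (hypothetical class and names, not litellm's actual Redis code):

```python
import time


class InMemoryCounterCache:
    """Toy in-memory stand-in for a Redis counter (illustrative only)."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def increment(self, key, amount=1, ttl=60):
        now = time.monotonic()
        value, expires_at = self._store.get(key, (0, None))
        # Reset the counter if its TTL has elapsed.
        if expires_at is not None and now >= expires_at:
            value, expires_at = 0, None
        value += amount
        # The gist of the fix: attach a TTL the first time the key is
        # incremented, so the counter cannot accumulate indefinitely.
        if expires_at is None:
            expires_at = now + ttl
        self._store[key] = (value, expires_at)
        return value
```

With Redis itself the equivalent is pairing `INCRBY` with an `EXPIRE` on first increment; the in-memory version just makes the expiry bookkeeping explicit.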
- Ishaan Jaff authored: [Feat] Add load testing for Langsmith and OTEL logging
- Ishaan Jaff authored: [Fix-Perf] OTEL: use sensible default values for logging
- Ishaan Jaff authored: [Langsmith Perf Improvement] Use /batch for Langsmith logging
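The /batch change above swaps one HTTP request per log event for one request per batch of events. A minimal sketch of that buffering pattern, where `send_batch` is a hypothetical callable standing in for a POST to a batch endpoint (not litellm's actual logger class):

```python
class BatchLogger:
    """Buffer log events and ship them in one call per batch
    instead of one call per event (illustrative sketch)."""

    def __init__(self, send_batch, batch_size=50):
        self.send_batch = send_batch  # e.g. wraps a POST to a /batch endpoint
        self.batch_size = batch_size
        self._buffer = []

    def log(self, event):
        self._buffer.append(event)
        if len(self._buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self._buffer:
            self.send_batch(self._buffer)  # one network call for many events
            self._buffer = []
```

A real implementation would also flush on a timer and on shutdown, so a half-full buffer is not lost; this sketch only shows the size-triggered flush.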
- Sep 11, 2024
- Krish Dholakia authored:
  - fix(cost_calculator.py): move noisy warning message on cost-calculation error to debug. Fixes https://github.com/BerriAI/litellm/issues/5610
  - fix(databricks/cost_calculator.py): handle model name issues for Databricks models
  - fix(main.py): fix stream chunk builder for multiple tool calls. Fixes https://github.com/BerriAI/litellm/issues/5591
  - fix: correctly set user_alias when passed in. Fixes https://github.com/BerriAI/litellm/issues/5612
  - fix(types/utils.py): allow passing role for message object. https://github.com/BerriAI/litellm/issues/5621
  - fix(litellm_logging.py): fix Langfuse logging across multiple projects. Fixes issue where the Langfuse logger was re-using the old logging object.
  - feat(proxy/_types.py): support adding key-based tags for tag-based routing. Enables tag-based routing at the key level.
  - fix(proxy/_types.py): fix inheritance
  - test(test_key_generate_prisma.py): fix test
  - test: fix test
  - fix(litellm_logging.py): return used callback object
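The key-based tag routing item above lets tags attached to an API key narrow which deployments a request may reach. A hedged sketch of the core filtering step (names and deployment shapes are illustrative, not litellm's actual API):

```python
def filter_deployments_by_tags(deployments, request_tags):
    """Keep only deployments whose tags intersect the tags attached
    to the request (e.g. inherited from the API key). Illustrative
    sketch of tag-based routing, not litellm's real implementation."""
    if not request_tags:
        return deployments  # untagged requests may use any deployment
    wanted = set(request_tags)
    return [d for d in deployments if wanted & set(d.get("tags", []))]
```

The router would then load-balance as usual over the filtered list, so tagging composes with whatever routing strategy is already configured.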
- Ishaan Jaff authored: docs: update ai21 docs
- Ishaan Jaff authored: Bump body-parser and express in /docs/my-website