llms-vllm: fix VllmServer to work without CUDA-required vllm core. (#12003)
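For context, a minimal sketch of how `VllmServer` is typically used against a remote vLLM API server, which is the mode this fix targets: the client talks to the server over HTTP, so the CUDA-required `vllm` package never has to be importable locally. The endpoint URL and generation parameters below are illustrative assumptions, not values taken from this diff.

```python
# Minimal sketch (assumed usage): VllmServer sends requests to a running
# vLLM API server over HTTP, so the CUDA-bound `vllm` core does not need
# to be installed or importable in the client process.
from llama_index.llms.vllm import VllmServer

# Assumed endpoint: a vLLM API server started elsewhere, e.g. via
#   python -m vllm.entrypoints.api_server --model <model>
llm = VllmServer(
    api_url="http://localhost:8000/generate",  # hypothetical local server
    max_new_tokens=256,
    temperature=0.8,
)

print(llm.complete("What is vLLM?"))
```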
Showing 5 changed files with 193 additions and 118 deletions:
- docs/examples/llm/vllm.ipynb: 20 additions, 93 deletions
- llama-index-integrations/llms/llama-index-llms-vllm/llama_index/llms/vllm/base.py: 28 additions, 23 deletions
- llama-index-integrations/llms/llama-index-llms-vllm/pyproject.toml: 1 addition, 1 deletion
- llama-index-integrations/llms/llama-index-llms-vllm/tests/test_integration.py: 120 additions, 0 deletions
- llama-index-integrations/llms/llama-index-llms-vllm/tests/test_llms_vllm.py: 24 additions, 1 deletion