This project is mirrored from https://github.com/run-llama/llama_index.
- Feb 27, 2025
  - Massimiliano Pippi authored
  - Danielle Neri authored
  - Massimiliano Pippi authored
  - Massimiliano Pippi authored
  - Karol Zmorski authored
  - Logan authored
  - Massimiliano Pippi authored
  - Massimiliano Pippi authored
  - Matt Morris authored
    * build: support `python>=3.9,<4.0` for `llama-index-vector-stores-databricks`
    * build: bump `llama-index-vector-stores-databricks` minor version
    Signed-off-by: themattmorris <matt.morris114@gmail.com>
  - rcodes7 authored
  - Thomas Rothenbächer authored
  - nbrosse authored
  - Yarikama authored
  - Pablo N. Marino authored
  - Wey Gu authored
  - Xu Wu authored
  - Yagiz Degirmenci authored
  - Shihang W authored
  - Eric Brown authored
  - Massimiliano Pippi authored
  - Massimiliano Pippi authored
  - Andre Wisplinghoff authored
    Fix "Invalid value for 'content': expected a string, got null." OpenAI error in the case of empty assistant messages (#17921); see the sketch after this date group.
  - Yarikama authored
  - Massimiliano Pippi authored
  - Eric Brown authored
  - ex0ns authored
  - Harsh Jaykumar Jalan authored
  - Logan authored
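The fix above for the OpenAI "expected a string, got null" error suggests a simple serialization guard: never send `content: null` for an assistant turn. A minimal sketch of that pattern is below, assuming the problem is a missing message body (e.g. a tool-call-only assistant turn); `ChatMessage` and `to_openai_message_dict` are illustrative stand-ins, not the actual llama_index internals.

```python
# Hedged sketch: coerce a missing assistant message body to an empty string
# before building the OpenAI request payload, which rejects `content: null`.
# `ChatMessage` and `to_openai_message_dict` are hypothetical stand-ins.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChatMessage:
    role: str
    content: Optional[str] = None  # None for e.g. tool-call-only assistant turns


def to_openai_message_dict(message: ChatMessage) -> dict:
    # Fall back to "" so the API never sees a null content field.
    return {"role": message.role, "content": message.content or ""}


# Usage: an assistant message with no text serializes with content="".
assert to_openai_message_dict(ChatMessage(role="assistant"))["content"] == ""
```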
- Feb 26, 2025
  - Laurie Voss authored
  - Denrei Keith authored
- Feb 25, 2025
  - Massimiliano Pippi authored
    * update mocked payload
    * fix outlook import paths
    * reduce flakiness
  - Massimiliano Pippi authored
  - Robert Shelton authored
    * update for redisvl 0.4.0
    * update toml
    * bump version
    * simplify creation
  - Ishaan Sehgal authored
    * fix: convert embedding functions to async. This PR wraps the synchronous implementations with asyncio.to_thread so that both _aget_query_embedding and _aget_text_embedding execute without blocking the event loop; see the sketch after this date group.
    * Update pyproject.toml
  - Yarikama authored
    Add support for the anthropic.claude-3-7-sonnet-20250219-v1:0 model with a 200k token context window
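The async-embedding commit above describes a concrete pattern: wrap the blocking embedding calls in asyncio.to_thread so the async variants stay non-blocking. A minimal sketch is below; the class is an illustrative stand-in that only mirrors the _aget_query_embedding/_aget_text_embedding method names, not the real embedding integration.

```python
# Hedged sketch of the commit's pattern: offload blocking embedding calls to a
# worker thread via asyncio.to_thread (Python 3.9+) so the async methods do not
# block the event loop. ExampleEmbedding is illustrative, not the real class.
import asyncio
from typing import List


class ExampleEmbedding:
    def _get_query_embedding(self, query: str) -> List[float]:
        # Placeholder for a blocking call into an embedding backend.
        return [float(len(query))]

    def _get_text_embedding(self, text: str) -> List[float]:
        return [float(len(text))]

    async def _aget_query_embedding(self, query: str) -> List[float]:
        # Run the synchronous implementation in a thread instead of blocking.
        return await asyncio.to_thread(self._get_query_embedding, query)

    async def _aget_text_embedding(self, text: str) -> List[float]:
        return await asyncio.to_thread(self._get_text_embedding, text)


async def main() -> None:
    emb = ExampleEmbedding()
    print(await emb._aget_query_embedding("hello"))


if __name__ == "__main__":
    asyncio.run(main())
```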
- Feb 24, 2025
  - Laurie Voss authored
  - David Wiszowaty authored
- Feb 22, 2025
  - Massimiliano Pippi authored
    fix: `FunctionAgent` and `ReActAgent` error out when the LLM response contains no chat messages (#17884); see the sketch after this date group.
    * fix: error when response contains no chat messages
    * add unit test
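The agent fix above implies a guard around an LLM response that contains no chat messages. A hedged sketch of that kind of guard follows, assuming the failure was an unguarded access into an empty message list; the dict-based message shape and the helper name are hypothetical, not the actual FunctionAgent/ReActAgent internals.

```python
# Hedged sketch: handle an LLM response with no chat messages instead of
# indexing into an empty list. The message shape and the helper name are
# hypothetical stand-ins for the actual agent internals.
from typing import Dict, List


def extract_assistant_message(messages: List[Dict[str, str]]) -> Dict[str, str]:
    if not messages:
        # Fall back to an empty assistant message rather than raising IndexError.
        return {"role": "assistant", "content": ""}
    return messages[-1]


# Usage: an empty response no longer crashes the agent step.
assert extract_assistant_message([]) == {"role": "assistant", "content": ""}
```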
- Feb 20, 2025
  - Laurie Voss authored