This project is mirrored from https://github.com/aurelio-labs/semantic-router.
Pull mirroring updated.
- Mar 07, 2024
  szelesaron authored
- Mar 06, 2024
  szelesaron authored
- Mar 02, 2024
  James Briggs authored
  James Briggs authored
  James Briggs authored
  James Briggs authored
- Feb 27, 2024
- Feb 26, 2024
- Feb 25, 2024
  James Briggs authored
  James Briggs authored
- Feb 23, 2024
  James Briggs authored
  James Briggs authored
  Simonas authored
  Simonas authored
  Simonas authored
  Simonas authored
  Simonas authored
  Simonas authored
- Feb 22, 2024
  Siraj R Aizlewood authored
  This involved updating the prompt in llms > base.py to be more reliable. Also created Notebook 07 to showcase Ollama with dynamic routes.
  dwmorris11 authored
  dwmorris11 authored
  siddicky authored
- Feb 21, 2024
  Siraj R Aizlewood authored
  Siraj R Aizlewood authored
  Ensured that we have default values for the attributes name, temperature, llm_name, max_tokens, and stream. The user can override the values actually used on the fly via new optional arguments to __call__. Also renamed _call_ to __call__, in line with the other LLMs. Previously self.name was used to identify the LLM to be called, but self.name is intended to identify the OllamaLLM instance as an instance of OllamaLLM, so it is now set to "ollama".
  Siraj R Aizlewood authored
  Ismail Ashraq authored
  Ismail Ashraq authored
- Feb 20, 2024
  maxyousif15 authored
  zahid-syed authored
  zahid-syed authored
  zahid-syed authored
- Feb 19, 2024
  James Briggs authored
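The Feb 21 commit message above describes a defaults-with-per-call-overrides pattern for OllamaLLM: attributes such as temperature, llm_name, max_tokens, and stream get defaults at construction, and __call__ accepts optional arguments that override them for a single invocation. A minimal sketch of that pattern is below; the class name, the "llama2" default model, and the returned dict are illustrative assumptions, not the actual semantic-router implementation.

```python
class SketchLLM:
    """Illustrative sketch: init-time defaults, per-call optional overrides."""

    def __init__(self, llm_name="llama2", temperature=0.2,
                 max_tokens=200, stream=False):
        # self.name identifies this wrapper instance, not the model to call,
        # mirroring the commit's note that it is fixed to "ollama".
        self.name = "ollama"
        self.llm_name = llm_name
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.stream = stream

    def __call__(self, messages, llm_name=None, temperature=None,
                 max_tokens=None, stream=None):
        # Each per-call argument falls back to the instance default
        # when the caller does not supply an override.
        llm_name = self.llm_name if llm_name is None else llm_name
        temperature = self.temperature if temperature is None else temperature
        max_tokens = self.max_tokens if max_tokens is None else max_tokens
        stream = self.stream if stream is None else stream
        # A real implementation would send a request here; we just return
        # the resolved parameters so the override behavior is visible.
        return {"model": llm_name, "temperature": temperature,
                "max_tokens": max_tokens, "stream": stream}


llm = SketchLLM()
# Override temperature for this call only; other defaults are kept.
resolved = llm([], temperature=0.9)
```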