This project is mirrored from https://github.com/aurelio-labs/semantic-router.
Pull mirroring updated.
- Feb 23, 2024
  Simonas authored
- Feb 22, 2024
  Siraj R Aizlewood authored
  Updated the prompt in llms/base.py to be more reliable. Also created Notebook 07 to showcase Ollama with dynamic routes.
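A dynamic route differs from a static one in that the LLM must extract function arguments from the query before the route's function is called. The sketch below illustrates that flow under stated assumptions: `fake_llm` stands in for an Ollama call, and the prompt template and helper names are illustrative, not the library's actual API.

```python
import json

def get_time(timezone: str) -> str:
    """Example route function whose arguments the LLM must extract."""
    return f"time in {timezone}"

# Illustrative prompt asking the model to return ONLY a JSON object of
# arguments matching the function's schema.
PROMPT_TEMPLATE = (
    "Extract the arguments for the function from the query.\n"
    "Respond ONLY with a JSON object.\n"
    "Schema: {schema}\nQuery: {query}\n"
)

def fake_llm(prompt: str) -> str:
    # Stand-in for a real Ollama completion; a real model would
    # generate this JSON from the prompt above.
    return '{"timezone": "America/New_York"}'

def call_dynamic_route(query: str, func, schema: dict):
    prompt = PROMPT_TEMPLATE.format(schema=json.dumps(schema), query=query)
    args = json.loads(fake_llm(prompt))  # parse the model's JSON output
    return func(**args)

result = call_dynamic_route(
    "What time is it in New York?",
    get_time,
    {"timezone": "string, an IANA timezone name"},
)
```

The reliability fix in the prompt matters here because any extra prose around the JSON would break the `json.loads` step.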
- Feb 21, 2024
  Siraj R Aizlewood authored
  Siraj R Aizlewood authored
  Ensured that we have default values for the attributes name, temperature, llm_name, max_tokens, and stream; the user can override the values actually used on the fly via new optional arguments to __call__. Also renamed _call_ to __call__, in line with the other LLMs. Previously self.name was used to identify the LLM to be called, but self.name is intended to identify the OllamaLLM instance as an instance of OllamaLLM, so it is now set to "ollama".
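The defaults-plus-overrides pattern described above can be sketched as follows. This is a minimal illustration, not the library's actual implementation: the class name, default values, and the returned options dict are all assumptions made for the example.

```python
class OllamaLLMSketch:
    """Hypothetical sketch of an Ollama LLM wrapper with default
    attributes that can be overridden per call."""

    def __init__(self, name="ollama", llm_name="llama2",
                 temperature=0.2, max_tokens=200, stream=False):
        # self.name identifies this instance as the Ollama wrapper;
        # self.llm_name selects the model actually served by Ollama.
        self.name = name
        self.llm_name = llm_name
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.stream = stream

    def __call__(self, messages, temperature=None, llm_name=None,
                 max_tokens=None, stream=None):
        # Fall back to the instance defaults when no override is given.
        opts = {
            "model": llm_name if llm_name is not None else self.llm_name,
            "temperature": (temperature if temperature is not None
                            else self.temperature),
            "max_tokens": (max_tokens if max_tokens is not None
                           else self.max_tokens),
            "stream": stream if stream is not None else self.stream,
        }
        # A real implementation would send `messages` with these options
        # to the local Ollama server; here we just return the resolved
        # options to show the override logic.
        return opts

llm = OllamaLLMSketch()
defaults = llm([])              # uses the instance defaults
hot = llm([], temperature=0.9)  # per-call override
```

Separating the identifier (`name="ollama"`) from the model selector (`llm_name`) is what lets the route layer dispatch on the LLM type while the served model stays configurable.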
- Feb 19, 2024
  CP500 authored
  Adding Ollama support. How to use:
  from semantic_router.llms.ollama import OllamaLLM
  llm = OllamaLLM()
  rl = RouteLayer(encoder=encoder, routes=routes, llm=llm)