This project is mirrored from https://github.com/aurelio-labs/semantic-router.
Pull mirroring updated.
- Feb 21, 2024

Siraj R Aizlewood authored
Ensured that we have default values for the attributes name, temperature, llm_name, max_tokens, and stream. The user can override the values actually used on the fly via new optional arguments to __call__. Also renamed _call_ to __call__, in line with the other LLMs. Previously self.name was used to identify the LLM to be called, but self.name is intended to identify the OllamaLLM instance as an instance of OllamaLLM, so it is now set to "ollama".
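The pattern described in this commit can be sketched as below. The class name, default values, and return value are illustrative assumptions, not the library's exact API; the real class sends a request to a running Ollama server.

```python
# Minimal sketch of the change described above: instance-level defaults
# that optional __call__ arguments can override per invocation.
# OllamaLLMSketch and its defaults are hypothetical stand-ins.
class OllamaLLMSketch:
    def __init__(self):
        self.name = "ollama"          # identifies the instance type, not the model
        self.llm_name = "openhermes"  # assumed default model name
        self.temperature = 0.2
        self.max_tokens = 200
        self.stream = False

    def __call__(self, messages, temperature=None, llm_name=None,
                 max_tokens=None, stream=None):
        # Fall back to the instance defaults when no override is supplied.
        return {
            "model": llm_name if llm_name is not None else self.llm_name,
            "temperature": temperature if temperature is not None else self.temperature,
            "max_tokens": max_tokens if max_tokens is not None else self.max_tokens,
            "stream": stream if stream is not None else self.stream,
        }
```

This keeps the instance reusable with sensible defaults while letting a single call deviate (e.g. a higher temperature) without mutating the object.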
- Feb 19, 2024

James Briggs authored
feat: Adds VitEncoder for visual transformers

Bogdan Buduroiu authored

CP500 authored
Adding Ollama support. How to use:

    from semantic_router.llms.ollama import OllamaLLM

    llm = OllamaLLM()
    rl = RouteLayer(encoder=encoder, routes=routes, llm=llm)
- Feb 18, 2024

James Briggs authored
chore: move pinecone demo

James Briggs authored
feat: Pinecone demo and hotfix
- Feb 14, 2024

James Briggs authored
fix: Saving JSON/YAML of Layer Config

James Briggs authored
feat: separate indexes and PineconeIndex
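A split like this one might look roughly as follows; the class and method names are illustrative, not the library's exact API:

```python
# Hypothetical sketch of a shared index interface: route embeddings can
# live in a simple local index or in a remote one such as Pinecone, as
# long as both implement the same add/query methods.
class BaseIndex:
    def add(self, embeddings):
        raise NotImplementedError

    def query(self, vector, top_k=5):
        raise NotImplementedError

class LocalIndex(BaseIndex):
    def __init__(self):
        self.vectors = []

    def add(self, embeddings):
        self.vectors.extend(embeddings)

    def query(self, vector, top_k=5):
        # Naive nearest-neighbour search by dot-product similarity.
        scored = sorted(self.vectors,
                        key=lambda v: -sum(a * b for a, b in zip(v, vector)))
        return scored[:top_k]
```

A Pinecone-backed subclass would implement the same interface but delegate add/query to the remote service, so the route layer doesn't care which backend it talks to.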
Siraj R Aizlewood authored
test_from_file_invalid_config()

Siraj R Aizlewood authored
test_from_file_with_llm()
- Feb 13, 2024

James Briggs authored

Siraj R Aizlewood authored
Also made improvements to ensure that temp files are reliably deleted after use.
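A common way to guarantee that kind of cleanup is a try/finally around the temp file's lifetime. The helper below is a hypothetical illustration of the pattern, not the project's code:

```python
import os
import tempfile

def roundtrip_via_temp_file(data: str) -> str:
    # Write data to a temp file, read it back, and remove the file even
    # if an error occurs in between.
    fd, path = tempfile.mkstemp(suffix=".json")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
        with open(path) as f:
            return f.read()
    finally:
        # Runs on success and on exception alike, so the temp file
        # never outlives the function.
        os.remove(path)
```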
Siraj R Aizlewood authored
Saving Layer Configs as JSON or YAML config files by representing the LayerConfig object as a dict and then converting it into JSON/YAML wasn't working. See the issue here: https://github.com/aurelio-labs/semantic-router/issues/144. The problem was that this attempted to serialize every object included in the Layer, including the Routes and the LLMs those Routes use. In the issue above, the LLM was a Cohere one whose attributes include a Client, and this Client is not serializable. So, instead of attempting to represent the entire LLM object as a dict to then be converted into JSON/YAML, we only keep key information about the LLM: 'module': self.llm.__module__, 'class': self.llm.__class__.__name__, 'model': self.llm.name. This is what's saved in route.py and then sent to layer.py to be serialized (in LayerConfig.to_file()). Then, when it comes time to load from the file via LayerConfig.from_file(), the LLM is re-initialized dynamically.
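The approach can be sketched as follows. DummyLLM stands in for a real LLM class such as the Cohere one, and the helper names are illustrative, not the library's:

```python
import importlib

def llm_to_dict(llm) -> dict:
    # Keep only the information needed to re-create the LLM later,
    # rather than serializing the whole object (which may hold
    # non-serializable attributes such as an API client).
    return {
        "module": llm.__module__,
        "class": llm.__class__.__name__,
        "model": llm.name,
    }

def llm_from_dict(d: dict):
    # Dynamically re-import the module and re-initialize the LLM class.
    module = importlib.import_module(d["module"])
    llm_class = getattr(module, d["class"])
    return llm_class(name=d["model"])

class DummyLLM:
    # Stand-in for a real LLM class; only the model name is preserved
    # across the save/load round trip.
    def __init__(self, name: str):
        self.name = name
```

The dict produced by llm_to_dict is plain strings, so it serializes cleanly to JSON or YAML, and any API clients are re-created fresh when the class is re-initialized on load.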
- Feb 12, 2024

Siraj R Aizlewood authored
Using index.describe() instead of the index's shape[].
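As a sketch of the idea (hypothetical names, not the library's exact API), an index can expose a describe() method so callers get index metadata in one place instead of inspecting the underlying array's shape:

```python
class LocalIndexSketch:
    # Hypothetical local index over a list of equal-length embeddings.
    def __init__(self, vectors):
        self.vectors = vectors

    def describe(self) -> dict:
        # Report index metadata in one place; callers no longer need to
        # know that the data happens to be stored as a 2-D array.
        return {
            "type": "local",
            "dimensions": len(self.vectors[0]) if self.vectors else 0,
            "vectors": len(self.vectors),
        }
```

This also lets remote backends (where there is no in-memory array to take a shape from) report the same metadata through the same method.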