This project is mirrored from https://github.com/aurelio-labs/semantic-router.
- Feb 19, 2024
- Siraj R Aizlewood authored
- James Briggs authored
  fix: llm verbose bug
- Siraj Aizlewood authored
- James Briggs authored
  feat: Adds VitEncoder for visual transformers
- James Briggs authored
- Siraj R Aizlewood authored
- Siraj R Aizlewood authored
  To avoid UnsupportedOperation: fileno, as in: https://github.com/aurelio-labs/semantic-router/issues/107 (see the sketch after this list)
- Bogdan Buduroiu authored
- Bogdan Buduroiu authored
- Bogdan Buduroiu authored
- Bogdan Buduroiu authored
- Bogdan Buduroiu authored
- Bogdan Buduroiu authored
- Bogdan Buduroiu authored
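
The "To avoid UnsupportedOperation: fileno" commit above does not show the fix itself; the following is only a rough sketch of the kind of guard that avoids this error when sys.stdout has been swapped for a stream without a real file descriptor (as happens under pytest's output capturing). The helper name and the fallback behaviour are assumptions for illustration, not the repository's actual code.

```python
import io
import sys


def has_usable_fileno(stream) -> bool:
    """Return True only if stream.fileno() can be called safely.

    Hypothetical helper: pytest (and some notebooks) replace sys.stdout with
    an object whose fileno() raises io.UnsupportedOperation, so we probe
    before relying on it.
    """
    try:
        stream.fileno()
    except (io.UnsupportedOperation, AttributeError, OSError):
        return False
    return True


if has_usable_fileno(sys.stdout):
    # Safe to use APIs that need a real file descriptor.
    fd = sys.stdout.fileno()
else:
    # Fall back to plain writes that never touch the file descriptor.
    sys.stdout.write("no usable fileno; falling back\n")
```
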
- Feb 18, 2024
- James Briggs authored
  chore: move pinecone demo
- James Briggs authored
- James Briggs authored
- James Briggs authored
  feat: Pinecone demo and hotfix
- James Briggs authored
- James Briggs authored
- James Briggs authored
- James Briggs authored
- Feb 14, 2024
- James Briggs authored
  fix: Saving JSON/YAML of Layer Config
- James Briggs authored
- James Briggs authored
- James Briggs authored
  feat: separate indexes and PineconeIndex
- Siraj R Aizlewood authored
  test_from_file_invalid_config()
- Siraj R Aizlewood authored
- Siraj R Aizlewood authored
  test_from_file_with_llm()
- Feb 13, 2024
- James Briggs authored
- James Briggs authored
- James Briggs authored
- Siraj R Aizlewood authored
  Also, some improvements to make sure that temp files are definitely deleted after use.
- Siraj R Aizlewood authored
- Siraj R Aizlewood authored
- Siraj R Aizlewood authored
  Saving Layer Configs as JSON or YAML config files by representing the LayerConfig object as a dict and then turning it into JSON/YAML wasn't working. See the issue here: https://github.com/aurelio-labs/semantic-router/issues/144

  The problem was that this attempted to serialize every object included in the Layer, including the Routes and the LLMs those Routes use. In the case of the issue above, the LLM was a Cohere one, which includes a Client as one of its attributes, and this Client is not serializable.

  So, instead of attempting to represent the entire LLM object as a dict to then be converted into JSON/YAML, we only keep key information about the LLM:

    'module': self.llm.__module__, 'class': self.llm.__class__.__name__, 'model': self.llm.name

  This is what's saved in route.py and then sent to layer.py to be serialized (in LayerConfig.to_file()). Then, when it comes time to load from the file via LayerConfig.from_file(), the LLM is re-initialized dynamically.
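
A minimal sketch of the approach described in the commit above: keep only the LLM's module path, class name, and model name when serializing, then re-import and re-instantiate the class when loading. The function names and the constructor signature below are simplified stand-ins, not the actual route.py/layer.py code.

```python
import importlib


def llm_to_dict(llm) -> dict:
    # Keep only what is needed to rebuild the LLM later; the live client
    # object attached to e.g. a Cohere LLM is not JSON/YAML serializable.
    return {
        "module": llm.__module__,
        "class": llm.__class__.__name__,
        "model": llm.name,
    }


def llm_from_dict(config: dict):
    # Re-initialize the LLM dynamically from the saved module/class/model.
    # Assumes the class accepts the model name via a `name` keyword, which
    # may differ from the real constructor.
    module = importlib.import_module(config["module"])
    llm_class = getattr(module, config["class"])
    return llm_class(name=config["model"])
```
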
- Feb 12, 2024
- Siraj R Aizlewood authored
- Siraj R Aizlewood authored
- Siraj R Aizlewood authored
  Using index.describe instead of shape[]. (see the sketch after this list)
- Siraj R Aizlewood authored
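
The "Using index.describe instead of shape[]" commit above swaps an array-style shape lookup for a describe call; the sketch below only illustrates why that helps: a remote index (e.g. Pinecone) is not a NumPy array, so the vector count has to come from a method rather than a `.shape` attribute. The describe() return fields and the class names here are assumptions for illustration, not the library's actual API.

```python
import numpy as np


class LocalIndex:
    """Index backed by an in-memory array (hypothetical simplified class)."""

    def __init__(self, vectors: np.ndarray):
        self.vectors = vectors

    def describe(self) -> dict:
        # Locally the count could come from .shape, but exposing it via
        # describe() keeps the interface uniform across index types.
        return {"vectors": self.vectors.shape[0], "dimensions": self.vectors.shape[1]}


class RemoteIndex:
    """Index backed by a remote service (hypothetical simplified class)."""

    def describe(self) -> dict:
        # There is no array to take .shape of; stats would be fetched from
        # the service. Placeholder values stand in for that call here.
        return {"vectors": 0, "dimensions": 0}


# Callers read index.describe()["vectors"] instead of index.shape[0].
```
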