Commit 8a556ada authored by Simonas

feat: compatible with python 3.9 and 3.12

parent c96769a6
[![Aurelio AI](https://pbs.twimg.com/profile_banners/1671498317455581184/1696285195/1500x500)](https://aurelio.ai)

# Semantic Router

<p>
<img alt="PyPI - Python Version" src="https://img.shields.io/pypi/pyversions/semantic-router?logo=python&logoColor=gold" />
<img alt="GitHub Contributors" src="https://img.shields.io/github/contributors/aurelio-labs/semantic-router" />
@@ -22,7 +23,7 @@ To get started with _semantic-router_ we install it like so:

```
pip install -qU semantic-router
```
- ❗️ _If wanting to use local embeddings you can use `FastEmbedEncoder` (`pip install -qU semantic-router[fastembed]`). To use the `HybridRouteLayer` you must `pip install -qU semantic-router[hybrid]`._
+ ❗️ _If wanting to use local embeddings you can use `FastEmbedEncoder` (`pip install -qU "semantic-router[fastembed]"`). To use the `HybridRouteLayer` you must `pip install -qU "semantic-router[hybrid]"`._
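The quoting added in the new line matters on shells such as zsh, where an unquoted `semantic-router[hybrid]` is treated as a glob pattern and pip never sees the extra; that is presumably the motivation for this part of the change.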
We begin by defining a set of `Route` objects. These are the decision paths that the semantic router can decide to use, let's try two simple routes for now — one for talk on _politics_ and another for _chitchat_:
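The route definitions themselves sit outside this hunk. As a rough sketch of what the README describes, assuming the `Route` class from semantic-router's public API (the utterance lists here are invented examples, not the README's):

```python
from semantic_router import Route

# Two example routes; the utterances are illustrative placeholders.
politics = Route(
    name="politics",
    utterances=[
        "why don't you tell me about your political opinions",
        "what do you think about the election?",
    ],
)

chitchat = Route(
    name="chitchat",
    utterances=[
        "how's the weather today?",
        "lovely weather we're having",
    ],
)

routes = [politics, chitchat]
```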
@@ -114,10 +115,3 @@ rl("I'm interested in learning about llama 2").name
In this case, no decision could be made as we had no matches — so our route layer returned `None`!
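Putting the pieces together, a hedged illustration of the behaviour described here; the `RouteLayer` and `OpenAIEncoder` names are assumptions about the package's API at this version:

```python
from semantic_router.encoders import OpenAIEncoder
from semantic_router.layer import RouteLayer

# Reuses the `routes` list from the earlier sketch.
rl = RouteLayer(encoder=OpenAIEncoder(), routes=routes)

rl("what do you think of the prime minister?").name    # -> "politics"
rl("I'm interested in learning about llama 2").name    # -> None, no route matched
```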
## 📚 [Resources](https://github.com/aurelio-labs/semantic-router/tree/main/docs)
@@ -14,7 +14,7 @@ readme = "README.md"

packages = [{include = "semantic_router"}]

[tool.poetry.dependencies]
- python = ">=3.10,<3.12"
+ python = "^3.9"
pydantic = "^1.8.2"
openai = "^1.3.9"
cohere = "^4.32"
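This is the substance of the commit: Poetry's caret constraint `^3.9` expands to `>=3.9,<4.0`, which admits both 3.9 and 3.12, whereas the old range stopped short of 3.12. A quick check with the `packaging` library (illustrative only, not part of the diff):

```python
from packaging.specifiers import SpecifierSet

new_range = SpecifierSet(">=3.9,<4.0")      # what Poetry expands "^3.9" to
old_range = SpecifierSet(">=3.10,<3.12")

print(new_range.contains("3.9"), new_range.contains("3.12"))    # True True
print(old_range.contains("3.9"), old_range.contains("3.12"))    # False False
```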
...
@@ -16,7 +16,7 @@ class BM25Encoder(BaseEncoder):

        except ImportError:
            raise ImportError(
                "Please install pinecone-text to use BM25Encoder. "
-               "You can install it with: `pip install semantic-router[hybrid]`"
+               "You can install it with: `pip install 'semantic-router[hybrid]'`"
            )
        logger.info("Downloading and initializing BM25 model parameters.")
        self.model = encoder.default()
...
@@ -27,7 +27,7 @@ class FastEmbedEncoder(BaseEncoder):

            raise ImportError(
                "Please install fastembed to use FastEmbedEncoder. "
                "You can install it with: "
-               "`pip install semantic-router[fastembed]`"
+               "`pip install 'semantic-router[fastembed]'`"
            )
        embedding_args = {
...
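Both encoder hunks follow the same optional-dependency pattern: the heavy import happens lazily inside the constructor, and a missing package is converted into an actionable install hint. The surrounding `try` blocks fall outside the visible hunks, so the following is a reconstruction under that assumption, not the repository's exact code:

```python
# Sketch of the lazy optional-import pattern these encoders use.
# The pinecone-text import path is an assumption; the repo's real layout may differ.
class SparseEncoderSketch:
    def __init__(self) -> None:
        try:
            # Only importable when the `hybrid` extra is installed.
            from pinecone_text.sparse import BM25Encoder as encoder
        except ImportError:
            raise ImportError(
                "Please install pinecone-text to use this encoder. "
                "You can install it with: `pip install 'semantic-router[hybrid]'`"
            )
        # Load the default BM25 model parameters provided by pinecone-text.
        self.model = encoder.default()
```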