From d84ed0544f3ea2b85e90b6f8c878652ef7693c6c Mon Sep 17 00:00:00 2001
From: James Briggs <35938317+jamescalam@users.noreply.github.com>
Date: Tue, 16 Jan 2024 14:02:46 +0000
Subject: [PATCH] readme

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 3578cb88..bc853cda 100644
--- a/README.md
+++ b/README.md
@@ -15,6 +15,7 @@
 
 Semantic Router is a superfast decision-making layer for your LLMs and agents. Rather than waiting for slow LLM generations to make tool-use decisions, we use the magic of semantic vector space to make those decisions — _routing_ our requests using _semantic_ meaning.
 
+
 ---
 
 ## Quickstart
@@ -25,7 +26,7 @@ To get started with _semantic-router_ we install it like so:
 pip install -qU semantic-router
 ```
 
-❗️ _If wanting to use local embeddings you can use `FastEmbedEncoder` (`pip install -qU "semantic-router[fastembed]`"). To use the `HybridRouteLayer` you must `pip install -qU "semantic-router[hybrid]"`._
+❗️ _If you want to use a fully local version of semantic-router, you can use `HuggingFaceEncoder` and `LlamaCppEncoder` (`pip install -qU "semantic-router[local]"`, see [here](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb)). To use the `HybridRouteLayer` you must `pip install -qU "semantic-router[hybrid]"`._
 
 We begin by defining a set of `Route` objects. These are the decision paths that the semantic router can decide to use. Let's try two simple routes for now — one for talk on _politics_ and another for _chitchat_:
 
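For illustration, here is a minimal sketch of the kind of code the quickstart goes on to show: two `Route` objects loaded into a `RouteLayer`, paired here with the local `HuggingFaceEncoder` mentioned in the install note. The utterances, variable names, and expected output are placeholder assumptions rather than the README's own example, and the imports assume the package layout of semantic-router at this version.

```python
from semantic_router import Route
from semantic_router.encoders import HuggingFaceEncoder
from semantic_router.layer import RouteLayer

# Two simple decision paths: one for politics talk, one for small talk.
politics = Route(
    name="politics",
    utterances=[
        "what do you think about the latest election?",
        "they're going to raise taxes again",
    ],
)
chitchat = Route(
    name="chitchat",
    utterances=[
        "how's the weather today?",
        "lovely day, isn't it?",
    ],
)

# Local embedding model, no API key required (needs the [local] extra).
encoder = HuggingFaceEncoder()

# The layer embeds each route's utterances once, then routes incoming
# queries to the most semantically similar route.
rl = RouteLayer(encoder=encoder, routes=[politics, chitchat])

print(rl("don't you love politics?").name)  # expected: "politics"
```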
-- 
GitLab