diff --git a/README.md b/README.md
index fd4c12a2e5f441e8cb792d6705c976c7e67b55d4..3578cb88c1c4e161d55c7c960ef13b9482241aec 100644
--- a/README.md
+++ b/README.md
@@ -15,6 +15,8 @@
 
 Semantic Router is a superfast decision-making layer for your LLMs and agents. Rather than waiting for slow LLM generations to make tool-use decisions, we use the magic of semantic vector space to make those decisions — _routing_ our requests using _semantic_ meaning.
 
+---
+
 ## Quickstart
 
 To get started with _semantic-router_ we install it like so:
@@ -114,4 +116,28 @@ rl("I'm interested in learning about llama 2").name
 
 In this case, no decision could be made as we had no matches — so our route layer returned `None`!
 
-## 📚 [Resources](https://github.com/aurelio-labs/semantic-router/tree/main/docs)
+---
+
+## 📚 Resources
+
+### Docs
+
+| Notebook | Description |
+| -------- | ----------- |
+| [Introduction](https://github.com/aurelio-labs/semantic-router/blob/main/docs/00-introduction.ipynb) | Introduction to Semantic Router and static routes |
+| [Dynamic Routes](https://github.com/aurelio-labs/semantic-router/blob/main/docs/02-dynamic-routes.ipynb) | Dynamic routes for parameter generation and function calls |
+| [Save/Load Layers](https://github.com/aurelio-labs/semantic-router/blob/main/docs/01-save-load-from-file.ipynb) | How to save and load `RouteLayer` from file |
+| [Local Execution](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb) | Fully local Semantic Router with dynamic routes — *local models such as Mistral 7B outperform GPT-3.5 in most tests* |
+| [LangChain Integration](https://github.com/aurelio-labs/semantic-router/blob/main/docs/03-basic-langchain-agent.ipynb) | How to integrate Semantic Router with LangChain Agents |
+
+### Online Course
+
+**COMING SOON**
+
+### Community
+
+Julian Horsey, [Semantic Router superfast decision layer for LLMs and AI agents](https://www.geeky-gadgets.com/semantic-router-superfast-decision-layer-for-llms-and-ai-agents/), Geeky Gadgets
+
+azhar, [Beyond Basic Chatbots: How Semantic Router is Changing the Game](https://medium.com/ai-insights-cobet/beyond-basic-chatbots-how-semantic-router-is-changing-the-game-783dd959a32d), AI Insights @ Medium
+
+Daniel Avila, [Semantic Router: Enhancing Control in LLM Conversations](https://blog.codegpt.co/semantic-router-enhancing-control-in-llm-conversations-68ce905c8d33), CodeGPT @ Medium
diff --git a/docs/00-introduction.ipynb b/docs/00-introduction.ipynb
index 0665070391a00545a4230aead9f22bd9b22717b3..2dfb4e81bf73e8aa7c5f969b03b64320b57c5cbe 100644
--- a/docs/00-introduction.ipynb
+++ b/docs/00-introduction.ipynb
@@ -41,7 +41,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "!pip install -qU semantic-router==0.0.15"
+    "!pip install -qU semantic-router==0.0.16"
    ]
   },
   {
diff --git a/docs/01-save-load-from-file.ipynb b/docs/01-save-load-from-file.ipynb
index 6c5945cb9020a5d2296b86a1c282553abfb15736..062474be79bffc91f236b19119520ed87f7b1cfe 100644
--- a/docs/01-save-load-from-file.ipynb
+++ b/docs/01-save-load-from-file.ipynb
@@ -36,7 +36,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "!pip install -qU semantic-router==0.0.15"
+    "!pip install -qU semantic-router==0.0.16"
    ]
   },
   {
diff --git a/docs/02-dynamic-routes.ipynb b/docs/02-dynamic-routes.ipynb
index 88b7794aadac761ec74dc872d2203203208791e7..72c8a3ec44c7e0d24df7a59c90787868bb25d9d8 100644
--- a/docs/02-dynamic-routes.ipynb
+++ b/docs/02-dynamic-routes.ipynb
@@ -26,7 +26,9 @@
       "source": [
         "In semantic-router there are two types of routes that can be chosen. Both routes belong to the `Route` object, the only difference between them is that _static_ routes return a `Route.name` when chosen, whereas _dynamic_ routes use an LLM call to produce parameter input values.\n",
         "\n",
-        "For example, a _static_ route will tell us if a query is talking about mathematics by returning the route name (which could be `\"math\"` for example). A _dynamic_ route can generate additional values, so it may decide a query is talking about maths, but it can also generate Python code that we can later execute to answer the user's query, this output may look like `\"math\", \"import math; output = math.sqrt(64)`."
+        "For example, a _static_ route will tell us if a query is talking about mathematics by returning the route name (which could be `\"math\"` for example). A _dynamic_ route can generate additional values, so it may decide a query is talking about maths, but it can also generate Python code that we can later execute to answer the user's query, this output may look like `\"math\", \"import math; output = math.sqrt(64)`.\n",
+        "\n",
+        "***⚠️ Note: We have a fully local version of dynamic routes available at [docs/05-local-execution.ipynb](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb). The local 05 version tends to outperform the OpenAI version we demo in this notebook, so we'd recommend trying [05](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb)!***"
       ]
     },
     {
@@ -46,7 +48,7 @@
       },
       "outputs": [],
       "source": [
-        "!pip install -qU semantic-router==0.0.15"
+        "!pip install -qU semantic-router==0.0.16"
       ]
     },
     {
@@ -114,16 +116,16 @@
       "cell_type": "code",
       "execution_count": 3,
       "metadata": {
-        "id": "BI9AiDspur0y",
-        "outputId": "27329a54-3f16-44a5-ac20-13a6b26afb97",
         "colab": {
           "base_uri": "https://localhost:8080/"
-        }
+        },
+        "id": "BI9AiDspur0y",
+        "outputId": "27329a54-3f16-44a5-ac20-13a6b26afb97"
       },
       "outputs": [
         {
-          "output_type": "stream",
           "name": "stderr",
+          "output_type": "stream",
           "text": [
             "\u001b[32m2024-01-08 11:12:24 INFO semantic_router.utils.logger Initializing RouteLayer\u001b[0m\n"
           ]
@@ -163,22 +165,22 @@
       "cell_type": "code",
       "execution_count": 4,
       "metadata": {
-        "id": "_rNREh7gur0y",
-        "outputId": "f3a1dc0b-d760-4efb-b634-d3547011dcb7",
         "colab": {
           "base_uri": "https://localhost:8080/"
-        }
+        },
+        "id": "_rNREh7gur0y",
+        "outputId": "f3a1dc0b-d760-4efb-b634-d3547011dcb7"
       },
       "outputs": [
         {
-          "output_type": "execute_result",
           "data": {
             "text/plain": [
               "RouteChoice(name='chitchat', function_call=None)"
             ]
           },
+          "execution_count": 4,
           "metadata": {},
-          "execution_count": 4
+          "output_type": "execute_result"
         }
       ],
       "source": [
@@ -233,26 +235,26 @@
       "cell_type": "code",
       "execution_count": 6,
       "metadata": {
-        "id": "YyFKV8jMur0z",
-        "outputId": "29cf80f4-552c-47bb-fbf9-019f5dfdf00a",
         "colab": {
           "base_uri": "https://localhost:8080/",
           "height": 35
-        }
+        },
+        "id": "YyFKV8jMur0z",
+        "outputId": "29cf80f4-552c-47bb-fbf9-019f5dfdf00a"
       },
       "outputs": [
         {
-          "output_type": "execute_result",
           "data": {
-            "text/plain": [
-              "'06:13'"
-            ],
             "application/vnd.google.colaboratory.intrinsic+json": {
               "type": "string"
-            }
+            },
+            "text/plain": [
+              "'06:13'"
+            ]
           },
+          "execution_count": 6,
           "metadata": {},
-          "execution_count": 6
+          "output_type": "execute_result"
         }
       ],
       "source": [
@@ -272,15 +274,14 @@
       "cell_type": "code",
       "execution_count": 7,
       "metadata": {
-        "id": "tOjuhp5Xur0z",
-        "outputId": "ca88a3ea-d70a-4950-be9a-63fab699de3b",
         "colab": {
           "base_uri": "https://localhost:8080/"
-        }
+        },
+        "id": "tOjuhp5Xur0z",
+        "outputId": "ca88a3ea-d70a-4950-be9a-63fab699de3b"
       },
       "outputs": [
         {
-          "output_type": "execute_result",
           "data": {
             "text/plain": [
               "{'name': 'get_time',\n",
@@ -289,8 +290,9 @@
               " 'output': \"<class 'str'>\"}"
             ]
           },
+          "execution_count": 7,
           "metadata": {},
-          "execution_count": 7
+          "output_type": "execute_result"
         }
       ],
       "source": [
@@ -341,16 +343,16 @@
       "cell_type": "code",
       "execution_count": 9,
       "metadata": {
-        "id": "-0vY8PRXur0z",
-        "outputId": "db01e14c-eab3-4f93-f4c2-e30f508c8b5d",
         "colab": {
           "base_uri": "https://localhost:8080/"
-        }
+        },
+        "id": "-0vY8PRXur0z",
+        "outputId": "db01e14c-eab3-4f93-f4c2-e30f508c8b5d"
       },
       "outputs": [
         {
-          "output_type": "stream",
           "name": "stderr",
+          "output_type": "stream",
           "text": [
             "\u001b[32m2024-01-08 11:15:26 INFO semantic_router.utils.logger Adding `get_time` route\u001b[0m\n"
           ]
@@ -373,33 +375,33 @@
       "cell_type": "code",
       "execution_count": 11,
       "metadata": {
-        "id": "Wfb68M0-ur0z",
-        "outputId": "79923883-2a4d-4744-f8ce-e818cb5f14c3",
         "colab": {
           "base_uri": "https://localhost:8080/",
           "height": 53
-        }
+        },
+        "id": "Wfb68M0-ur0z",
+        "outputId": "79923883-2a4d-4744-f8ce-e818cb5f14c3"
       },
       "outputs": [
         {
-          "output_type": "stream",
           "name": "stderr",
+          "output_type": "stream",
           "text": [
             "\u001b[32m2024-01-08 11:16:24 INFO semantic_router.utils.logger Extracting function input...\u001b[0m\n"
           ]
         },
         {
-          "output_type": "execute_result",
           "data": {
-            "text/plain": [
-              "'06:16'"
-            ],
             "application/vnd.google.colaboratory.intrinsic+json": {
               "type": "string"
-            }
+            },
+            "text/plain": [
+              "'06:16'"
+            ]
           },
+          "execution_count": 11,
           "metadata": {},
-          "execution_count": 11
+          "output_type": "execute_result"
         }
       ],
       "source": [
@@ -427,6 +429,9 @@
     }
   ],
   "metadata": {
+    "colab": {
+      "provenance": []
+    },
     "kernelspec": {
       "display_name": "decision-layer",
       "language": "python",
@@ -443,11 +448,8 @@
       "nbconvert_exporter": "python",
       "pygments_lexer": "ipython3",
       "version": "3.11.5"
-    },
-    "colab": {
-      "provenance": []
     }
   },
   "nbformat": 4,
   "nbformat_minor": 0
-}
\ No newline at end of file
+}
diff --git a/docs/03-basic-langchain-agent.ipynb b/docs/03-basic-langchain-agent.ipynb
index dc728495136a804a79340026d14f56807929e443..6eac40982bf757aa11697b85ca3175281c67d55f 100644
--- a/docs/03-basic-langchain-agent.ipynb
+++ b/docs/03-basic-langchain-agent.ipynb
@@ -78,7 +78,7 @@
    ],
    "source": [
     "!pip install -qU \\\n",
-    "    semantic-router==0.0.15 \\\n",
+    "    semantic-router==0.0.16 \\\n",
     "    langchain==0.0.352 \\\n",
     "    openai==1.6.1"
    ]
diff --git a/docs/04-chat-history.ipynb b/docs/04-chat-history.ipynb
index 4ac533b1ef4126ac63cb6b8cacf2441d73a538ea..38fd769f94f9befc9e7d8969aa18880d45841b51 100644
--- a/docs/04-chat-history.ipynb
+++ b/docs/04-chat-history.ipynb
@@ -21,11 +21,6 @@
     "Applying semantic-router to the most recent interaction in a conversation can work for many cases but it misses scenarios where information provided in the latest interaction."
    ]
   },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": []
-  },
   {
    "cell_type": "code",
    "execution_count": 1,
diff --git a/docs/05-local-execution.ipynb b/docs/05-local-execution.ipynb
index ce36bccb8a88ca7cd728f1263f351098456c805e..f02ecf15478d49a3668338d547f9418171f6664b 100644
--- a/docs/05-local-execution.ipynb
+++ b/docs/05-local-execution.ipynb
@@ -1,11 +1,21 @@
 {
  "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "e92c26d9",
+   "metadata": {},
+   "source": [
+    "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb) [![Open nbviewer](https://raw.githubusercontent.com/pinecone-io/examples/master/assets/nbviewer-shield.svg)](https://nbviewer.org/github/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb)"
+   ]
+  },
   {
    "cell_type": "markdown",
    "id": "ee50410e-3f98-4d9c-8838-b38aebd6ce77",
    "metadata": {},
    "source": [
-    "# Local execution with `llama.cpp` and HuggingFace Encoder\n",
+    "# Local Dynamic Routes\n",
+    "\n",
+    "## Fully local Semantic Router with `llama.cpp` and HuggingFace Encoder\n",
     "\n",
     "There are many reasons users might choose to roll their own LLMs rather than use a third-party service. Whether it's due to cost, privacy or compliance, Semantic Router supports the use of \"local\" LLMs through `llama.cpp`.\n",
     "\n",
diff --git a/pyproject.toml b/pyproject.toml
index 07536a512b342f58d2a0dc77933cdaa3c0f321f6..cd959d96d0023a183b92488e41528e60c1180c4d 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "semantic-router"
-version = "0.0.15"
+version = "0.0.16"
 description = "Super fast semantic router for AI decision making"
 authors = [
     "James Briggs <james@aurelio.ai>",
@@ -8,7 +8,8 @@ authors = [
     "Simonas Jakubonis <simonas@aurelio.ai>",
     "Luca Mannini <luca@aurelio.ai>",
     "Bogdan Buduroiu <bogdan@aurelio.ai>",
-    "Ismail Ashraq <ashraq@aurelio.ai>"
+    "Ismail Ashraq <ashraq@aurelio.ai>",
+    "Daniel Griffin <daniel@aurelio.ai>"
 ]
 readme = "README.md"
 packages = [{include = "semantic_router"}]