[Open In Colab](https://colab.research.google.com/github/aurelio-labs/semantic-router/blob/main/docs/encoders/bedrock.ipynb) [Open In nbviewer](https://nbviewer.org/github/aurelio-labs/semantic-router/blob/main/docs/encoders/bedrock.ipynb)
%% Cell type:markdown id: tags:
# Using Bedrock Embedding Models
%% Cell type:markdown id: tags:
The embedding models available through AWS Bedrock (`amazon.titan-embed-text-v1`, `amazon.titan-embed-text-v2`, and `cohere.embed-english-v3`) can all be used with our `BedrockEncoder`.
%% Cell type:markdown id: tags:
## Getting Started
%% Cell type:markdown id: tags:
We start by installing semantic-router. Support for the new `Bedrock` embedding models was added in `semantic-router==0.0.40`.
%% Cell type:code id: tags:
``` python
!pip install -qU "semantic-router[bedrock]==0.0.40"
```
%% Cell type:markdown id: tags:
We start by defining a dictionary mapping routes to example phrases that should trigger those routes.
%% Cell type:code id: tags:
``` python
from semantic_router import Route

politics = Route(
    name="politics",
    utterances=[
        "isn't politics the best thing ever",
        "why don't you tell me about your political opinions",
        "don't you just love the president",
        "don't you just hate the president",
        "they're going to destroy this country!",
        "they will save the country!",
    ],
)
```
%% Cell type:markdown id: tags:
Let's define another for good measure:
%% Cell type:code id: tags:
``` python
chitchat = Route(
    name="chitchat",
    utterances=[
        "how's the weather today?",
        "how are things going?",
        "lovely weather today",
        "the weather is horrendous",
        "let's go to the chippy",
    ],
)

routes = [politics, chitchat]
```
%% Cell type:markdown id: tags:
Now we initialize our embedding model. The `BedrockEncoder` authenticates with AWS Bedrock, so we need valid AWS credentials (an access key ID, a secret access key, and, if using temporary credentials, a session token) together with the region in which our Bedrock models are enabled.
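%% Cell type:markdown id: tags:
The `RouteLayer` below expects an initialized `encoder`. A minimal sketch of creating one is shown here, assuming the `BedrockEncoder` constructor in `semantic-router==0.0.40` accepts credentials and a region as keyword arguments (check the `semantic_router.encoders` reference for the exact parameter names); the region value is only an example:
%% Cell type:code id: tags:
``` python
import os

from semantic_router.encoders import BedrockEncoder

# Credentials are read from the environment; set these before running.
# AWS_SESSION_TOKEN is only required when using temporary credentials.
encoder = BedrockEncoder(
    access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    session_token=os.environ.get("AWS_SESSION_TOKEN"),
    region="us-east-1",
)
```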
Now we define the `RouteLayer`. When called, the route layer will consume text (a query) and output the category (`Route`) it belongs to — to initialize a `RouteLayer` we need our `encoder` model and a list of `routes`.
%% Cell type:code id: tags:
``` python
from semantic_router.layer import RouteLayer

rl = RouteLayer(encoder=encoder, routes=routes)
```
%% Output
2024-05-13 22:26:54 INFO semantic_router.utils.logger local
%% Cell type:markdown id: tags:
We can check the dimensionality of our vectors by looking at the `index` attribute of the `RouteLayer`.
%% Cell type:code id: tags:
``` python
rl.index.index.shape
```
%% Output
(11, 1024)
%% Cell type:markdown id: tags:
We have 11 utterance embeddings (six from `politics`, five from `chitchat`), each a 1024-dimensional vector. Now let's test them:
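As a quick sketch (the query string here is our own example, not one of the route utterances), calling the layer with a query returns a route choice whose `name` attribute is the matched route:
%% Cell type:code id: tags:
``` python
# Pass a query to the route layer; semantically similar utterances in the
# "politics" route should cause that route to be selected.
rl("don't you love politics?").name
```
We would expect this query to map to the `politics` route.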
When a query matches neither route closely enough, the layer returns `None` because no matches are identified. We always recommend optimizing your `RouteLayer` thresholds for best performance; you can see how in [this notebook](https://github.com/aurelio-labs/semantic-router/blob/main/docs/06-threshold-optimization.ipynb).