# we could use this as a guide for our chatbot to avoid political conversations
politics = Route(
    name="politics",
    utterances=[
        "isn't politics the best thing ever",
        "why don't you tell me about your political opinions",
        "don't you just love the president",
        "don't you just hate the president",
        "they're going to destroy this country!",
        "they will save the country!",
    ],
)
# this could be used as an indicator to our chatbot to switch to a more
# conversational prompt
chitchat = Route(
    name="chitchat",
    utterances=[
        "how's the weather today?",
        "how are things going?",
        "lovely weather today",
        "the weather is horrendous",
        "let's go to the chippy",
    ],
)
# we place both of our routes together into a single list
routes = [politics, chitchat]
```
%% Cell type:markdown id: tags:
As of 13 June 2024, two encoders support async functionality:
* `AzureOpenAIEncoder`
* `OpenAIEncoder`
To use either of these encoders in async mode, we simply initialize them as we usually would. When we then include them within a `RouteLayer` and run `acall`, the route layer will automatically run the encoders in async mode.
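As a minimal sketch, assuming `semantic-router` is installed, `OPENAI_API_KEY` is set in the environment, and the `routes` list from earlier is defined, async usage might look like the following (the query string and the `result.name` access are illustrative):
%% Cell type:code id: tags:
``` python
from semantic_router import RouteLayer
from semantic_router.encoders import OpenAIEncoder

# initialize the encoder exactly as in the sync case — no special setup
encoder = OpenAIEncoder()  # reads OPENAI_API_KEY from the environment

rl = RouteLayer(encoder=encoder, routes=routes)

# `acall` runs the encoder's async methods under the hood
# (top-level await works in Jupyter notebooks)
result = await rl.acall("don't you just love the president")
result.name  # expected to match the "politics" route
```
%% Cell type:markdown id: tags: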
There are several async methods we can call directly:
%% Cell type:code id: tags:
``` python
await pc_index._async_list_indexes()
```
%% Output
{'indexes': []}
%% Cell type:markdown id: tags:
But unless we're using the index directly, we don't need to use these. As with the encoder, once we pass the `PineconeIndex` to our route layer, the route layer will call all async methods automatically when we hit the `acall` method.
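As a sketch of that flow (assuming `semantic-router` is installed, `PINECONE_API_KEY` and `OPENAI_API_KEY` are set, the `routes` list from earlier is defined, and the index name is illustrative):
%% Cell type:code id: tags:
``` python
from semantic_router import RouteLayer
from semantic_router.encoders import OpenAIEncoder
from semantic_router.index.pinecone import PineconeIndex

# PineconeIndex reads PINECONE_API_KEY from the environment;
# the index name here is a placeholder
pc_index = PineconeIndex(index_name="semantic-router-demo")

rl = RouteLayer(
    encoder=OpenAIEncoder(),
    routes=routes,
    index=pc_index,
)

# hitting `acall` triggers the index's async methods automatically —
# we never need to call them ourselves
result = await rl.acall("how's the weather today?")
```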
%% Cell type:markdown id: tags:
## Async RouteLayer
%% Cell type:markdown id: tags:
The `RouteLayer` class supports both sync and async operations by default, so we initialize as usual: