diff --git a/CHANGELOG.md b/CHANGELOG.md
index 91d5944faa76bcce4f1f7b5130149fbcf69e0ccd..95996a03778012675d87e07e2069bc07515a1a30 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,11 @@
 # ChangeLog
 
+## Unreleased
+
+### Bug Fixes / Nits
+
+- Add missing import to `ChatEngine` usage pattern `.md` doc (#8518)
+
 ## [0.8.53] - 2023-10-27
 
 ### New Features
diff --git a/docs/module_guides/deploying/chat_engines/usage_pattern.md b/docs/module_guides/deploying/chat_engines/usage_pattern.md
index 8830e720ca819608c2439bcb0a6f7c6feaeaceae..b04e3fc31313401522ec35e8c0835446707487fb 100644
--- a/docs/module_guides/deploying/chat_engines/usage_pattern.md
+++ b/docs/module_guides/deploying/chat_engines/usage_pattern.md
@@ -74,6 +74,7 @@ Here's an example where we configure the following:
 ```python
 from llama_index.prompts import PromptTemplate
 from llama_index.llms import ChatMessage, MessageRole
+from llama_index.chat_engine.condense_question import CondenseQuestionChatEngine
 
 custom_prompt = PromptTemplate("""\
 Given a conversation (between Human and Assistant) and a follow up message from Human, \