From 9a236d659592583b4e929181dc17435b7315c148 Mon Sep 17 00:00:00 2001
From: Praharsh Bhatt <30700808+praharshbhatt@users.noreply.github.com>
Date: Fri, 27 Oct 2023 11:29:29 -0400
Subject: [PATCH] fix: add the needed CondenseQuestionChatEngine import in the
 usage_pa… (#8518)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* fix: add the needed CondenseQuestionChatEngine import in the usage_pattern docs

* Add entry to CHANGELOG.md

---------

Co-authored-by: Andrei Fajardo <92402603+nerdai@users.noreply.github.com>
---
 CHANGELOG.md                                               | 6 ++++++
 docs/module_guides/deploying/chat_engines/usage_pattern.md | 1 +
 2 files changed, 7 insertions(+)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 91d5944faa..95996a0377 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,11 @@
 # ChangeLog
 
+## Unreleased
+
+### Bug Fixes / Nits
+
+- Add missing import to `ChatEngine` usage pattern `.md` doc (#8518)
+
 ## [0.8.53] - 2023-10-27
 
 ### New Features
diff --git a/docs/module_guides/deploying/chat_engines/usage_pattern.md b/docs/module_guides/deploying/chat_engines/usage_pattern.md
index 8830e720ca..b04e3fc313 100644
--- a/docs/module_guides/deploying/chat_engines/usage_pattern.md
+++ b/docs/module_guides/deploying/chat_engines/usage_pattern.md
@@ -74,6 +74,7 @@ Here's an example where we configure the following:
 ```python
 from llama_index.prompts import PromptTemplate
 from llama_index.llms import ChatMessage, MessageRole
+from llama_index.chat_engine.condense_question import CondenseQuestionChatEngine
 
 custom_prompt = PromptTemplate("""\
 Given a conversation (between Human and Assistant) and a follow up message from Human, \
-- 
GitLab
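
For reviewers who want to see the added import in context: the doc example this patch touches goes on to build a `CondenseQuestionChatEngine` from the custom condense prompt and chat history, which is why the import is needed. Below is a minimal, self-contained sketch of that usage. It assumes the 0.8.x-era API (`CondenseQuestionChatEngine.from_defaults`); the prompt wording, the `./data` path, and the chat history contents are illustrative rather than copied from the doc.

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.chat_engine.condense_question import CondenseQuestionChatEngine
from llama_index.llms import ChatMessage, MessageRole
from llama_index.prompts import PromptTemplate

# Prompt that condenses the chat history + follow-up message into a standalone question.
# The wording here is illustrative; the doc defines its own template text.
custom_prompt = PromptTemplate(
    "Given a conversation (between Human and Assistant) and a follow up message from Human, "
    "rewrite the message to be a standalone question.\n\n"
    "<Chat History>\n{chat_history}\n\n"
    "<Follow Up Message>\n{question}\n\n"
    "<Standalone question>\n"
)

# Illustrative seed history; any list of ChatMessage objects works.
custom_chat_history = [
    ChatMessage(role=MessageRole.USER, content="Hello assistant, let's talk about Paul Graham."),
    ChatMessage(role=MessageRole.ASSISTANT, content="Okay, sounds good."),
]

# Any query engine can back the chat engine; a vector index over local files
# (the "./data" directory is an assumption) is just one example.
index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
query_engine = index.as_query_engine()

# This is the call that requires the CondenseQuestionChatEngine import added above.
chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=query_engine,
    condense_question_prompt=custom_prompt,
    chat_history=custom_chat_history,
    verbose=True,
)

print(chat_engine.chat("What did the author do growing up?"))
```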