diff --git a/.changeset/moody-eggs-destroy.md b/.changeset/moody-eggs-destroy.md
new file mode 100644
index 0000000000000000000000000000000000000000..e078bdd334d03321c27646858333e6babf7e7343
--- /dev/null
+++ b/.changeset/moody-eggs-destroy.md
@@ -0,0 +1,5 @@
+---
+"@llamaindex/doc": patch
+---
+
+docs: update chat engine docs
diff --git a/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx b/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx
index d79496337675fedcbb0ad2ad68de19968566e288..f75b0fc1581d63b453514825c10485be92fa27db 100644
--- a/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx
+++ b/apps/next/src/content/docs/llamaindex/modules/chat_engine.mdx
@@ -12,9 +12,26 @@ const chatEngine = new ContextChatEngine({ retriever });
 const response = await chatEngine.chat({ message: query });
 ```
 
+As a shortcut, you can create a chat engine directly from an index by calling `index.asChatEngine()`, which returns a `ContextChatEngine` ready for chatting.
+
+```typescript
+const chatEngine = index.asChatEngine();
+```
+
+You can also pass options to the chat engine, such as the number of retrieved chunks and a system prompt:
+
+```typescript
+const chatEngine = index.asChatEngine({
+  similarityTopK: 5,
+  systemPrompt: "You are a helpful assistant.",
+});
+```
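+
+The configured engine is used the same way as one constructed manually. A brief sketch (the question string is illustrative, and `response.response` holds the reply text, matching the `chunk.response` field used in the streaming example):
+
+```typescript
+const response = await chatEngine.chat({
+  message: "What topics do the indexed documents cover?",
+});
+console.log(response.response);
+```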
+
 The `chat` function also supports streaming; just add `stream: true` as an option:
 
 ```typescript
+const chatEngine = index.asChatEngine();
 const stream = await chatEngine.chat({ message: query, stream: true });
 for await (const chunk of stream) {
   process.stdout.write(chunk.response);