diff --git a/README.md b/README.md
index b52ddc03553fca25ae3833d16855a537c7ef8494..368826db0bb2f30ba678d395fe57be7a358ec95e 100644
--- a/README.md
+++ b/README.md
@@ -17,14 +17,13 @@ pnpm create llama-app
 bunx create-llama
 ```
 
-You will be asked for the name of your project, and then whether you want to
-create a TypeScript project:
+You will be asked for the name of your project, and then which framework you want to use:
 
 ```bash
-✔ Would you like to use TypeScript? … No / Yes
+✔ Which framework would you like to use? › NextJS
 ```
 
-Select **Yes** to install the necessary types/dependencies and create a new TS project.
+You can choose between NextJS and Express.
 
 ### Non-interactive
 
diff --git a/templates/simple/express/README-template.md b/templates/simple/express/README-template.md
index b00c2e8d80f755ec0519e0c895c24008a0fbd44a..2b4d1153161e4a68fbb7bcd1e7c2f3d79b77a808 100644
--- a/templates/simple/express/README-template.md
+++ b/templates/simple/express/README-template.md
@@ -1,14 +1,21 @@
-1. Install the dependencies
+This is a [LlamaIndex](https://www.llamaindex.ai/) project using [Express](https://expressjs.com/) bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama).
+
+## Getting Started
+
+First, install the dependencies:
+
 ```
-pnpm install
+npm install
 ```
 
-2. Run the server
+Next, run the development server:
+
 ```
-pnpm run dev
+npm run dev
 ```
 
-3. Call the API to LLM Chat
+Then, call the Express API endpoint `/api/llm` to see the result:
+
 ```
 curl --location 'localhost:3000/api/llm' \
 --header 'Content-Type: application/json' \
@@ -16,4 +23,15 @@ curl --location 'localhost:3000/api/llm' \
     "message": "Hello",
     "chatHistory": []
 }'
-```
\ No newline at end of file
+```
+
+You can start editing the API by modifying `src/controllers/llm.controller.ts`. The endpoint auto-updates as you save the file.
+
+## Learn More
+
+To learn more about LlamaIndex, take a look at the following resources:
+
+- [LlamaIndex Documentation](https://docs.llamaindex.ai) - learn about LlamaIndex (Python features).
+- [LlamaIndexTS Documentation](https://ts.llamaindex.ai) - learn about LlamaIndex (TypeScript features).
+
+You can check out [the LlamaIndexTS GitHub repository](https://github.com/run-llama/LlamaIndexTS) - your feedback and contributions are welcome!