diff --git a/README.md b/README.md
index 5cb61026618ef3b4732d1b00ee9d651fd517f460..e326767377837c1dc6c339a6352fb585946c450e 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,16 @@
 # Create LlamaIndex App
 
-The easiest way to get started with LlamaIndex is by using `create-llama`. This CLI tool enables you to quickly start building a new LlamaIndex application, with everything set up for you. 
-To get started, use the following command:
+The easiest way to get started with [LlamaIndex](https://www.llamaindex.ai/) is by using `create-llama`. This CLI tool enables you to quickly start building a new LlamaIndex application, with everything set up for you. 
+
+## Features
+
+- NextJS, ExpressJS, or FastAPI (Python) stateless backend generation 💻
+- Streaming or non-streaming backend ⚡
+- Optional `shadcn` frontend generation 🎨
+
+## Get Started
+
+You can run `create-llama` in interactive or non-interactive mode.
 
 ### Interactive
 
@@ -17,15 +26,25 @@ yarn create llama
 pnpm create llama
 ```
 
-You will be asked for the name of your project, and then which framework you want to use
-create a TypeScript project:
+You will be asked for the name of your project, along with other configuration options.
+
+Here is an example:
 
 ```bash
+>> npm create llama
+Need to install the following packages:
+  create-llama@0.0.3
+Ok to proceed? (y) y
+✔ What is your project named? … my-app
+✔ Which template would you like to use? › Chat with streaming
 ✔ Which framework would you like to use? › NextJS
+✔ Which UI would you like to use? › Just HTML
+✔ Which chat engine would you like to use? › ContextChatEngine
+✔ Please provide your OpenAI API key (leave blank to skip): … 
+✔ Would you like to use ESLint? … No / Yes
+Creating a new LlamaIndex app in /home/my-app.
 ```
 
-You can choose between NextJS and Express.
-
 ### Non-interactive
 
 You can also pass command line arguments to set up a new project
@@ -52,3 +71,15 @@ Options:
 
 ```
 
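+For example, a non-interactive run might look like the sketch below; the positional project name and the `--framework` flag are assumptions for illustration, so check the options listed above for what your version actually supports:
+
+```bash
+# Hypothetical invocation: the project-name argument and flag name below are
+# illustrative assumptions, not confirmed against this version of create-llama
+npx create-llama my-app --framework nextjs
+```
+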
+## LlamaIndex Documentation
+
+- [TS/JS docs](https://ts.llamaindex.ai/)
+- [Python docs](https://docs.llamaindex.ai/en/stable/)