diff --git a/docs/docs/getting_started/starter_tools/chatllamaindex.png b/docs/docs/getting_started/starter_tools/chatllamaindex.png
new file mode 100644
index 0000000000000000000000000000000000000000..2f451d6e624442cc5568278f0806ea784b9c91be
Binary files /dev/null and b/docs/docs/getting_started/starter_tools/chatllamaindex.png differ
diff --git a/docs/docs/getting_started/starter_tools/index.md b/docs/docs/getting_started/starter_tools/index.md
new file mode 100644
index 0000000000000000000000000000000000000000..fdef569eeead35c355ee9af2c728a4a7d72782a9
--- /dev/null
+++ b/docs/docs/getting_started/starter_tools/index.md
@@ -0,0 +1,37 @@
+# Starter Tools
+
+We have created a variety of open-source tools to help you bootstrap your generative AI projects.
+
+## create-llama: Full-stack web application generator
+
+`create-llama` is a CLI tool that generates a full-stack web application, with your choice of frontend and backend, that indexes your documents and lets you chat with them. Getting started is as simple as running:
+
+```shell
+npx create-llama@latest
+```
+
+For full documentation, check out the [create-llama README on npm](https://www.npmjs.com/package/create-llama).
+
+## SEC Insights: advanced query techniques
+
+Indexing and querying financial filings is a very common use case for generative AI. To help you get started, we have created and open-sourced a full-stack application that lets you select filings from public companies across multiple years, then summarize and compare them. It uses advanced querying and retrieval techniques to achieve high-quality results.
+
+You can use the app yourself at [SECinsights.ai](https://www.secinsights.ai/) or check out the code on [GitHub](https://github.com/run-llama/sec-insights).
+
+![SEC Insights](secinsights.png)
+
+## Chat LlamaIndex: Full-stack chat application
+
+Chat LlamaIndex is another full-stack, open-source application that has a variety of interaction modes including streaming chat and multi-modal querying over images. It's a great way to see advanced chat application techniques. You can use it at [chat.llamaindex.ai](https://chat.llamaindex.ai/) or check out the code on [GitHub](https://github.com/run-llama/chat-llamaindex).
+
+![Chat LlamaIndex](chatllamaindex.png)
+
+## LlamaBot: Slack and Discord apps
+
+LlamaBot is another open-source application, this time for building a Slack bot that listens to messages within your organization and answers questions about what's going on. You can check out the [full tutorial and code on GitHub](https://github.com/run-llama/llamabot). If you prefer Discord, there is a [Discord version contributed by the community](https://twitter.com/clusteredbytes/status/1754220009885163957).
+
+![LlamaBot](llamabot.png)
+
+## RAG CLI: quick command-line chat with any document
+
+We provide a command-line tool that lets you quickly chat with your documents. Learn more in the [RAG CLI documentation](rag_cli.md).
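+
+As a quick taste of the workflow, here is a minimal sketch based on the RAG CLI documentation (it assumes `llama-index` is installed and, by default, an OpenAI API key is configured; the file path and question are purely illustrative):
+
+```shell
+# Ingest a local file into the CLI's local index (path is illustrative)
+llamaindex-cli rag --files "./data/report.pdf"
+
+# Ask a one-off question about the ingested documents
+llamaindex-cli rag --question "What are the key findings?"
+
+# Or open an interactive chat REPL over them
+llamaindex-cli rag --chat
+```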
diff --git a/docs/docs/getting_started/starter_tools/llamabot.png b/docs/docs/getting_started/starter_tools/llamabot.png
new file mode 100644
index 0000000000000000000000000000000000000000..91ca10f2272286aed61f7e505b80eb485e16e0ba
Binary files /dev/null and b/docs/docs/getting_started/starter_tools/llamabot.png differ
diff --git a/docs/docs/use_cases/q_and_a/rag_cli.md b/docs/docs/getting_started/starter_tools/rag_cli.md
similarity index 100%
rename from docs/docs/use_cases/q_and_a/rag_cli.md
rename to docs/docs/getting_started/starter_tools/rag_cli.md
diff --git a/docs/docs/getting_started/starter_tools/secinsights.png b/docs/docs/getting_started/starter_tools/secinsights.png
new file mode 100644
index 0000000000000000000000000000000000000000..d5364810af8b544726f74832409808ddc67195fd
Binary files /dev/null and b/docs/docs/getting_started/starter_tools/secinsights.png differ
diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml
index e086561e6f06b4c811f69c24a804948077b51127..a6fe2d8859c5ec20dbed10d182f50e4fff69a251 100644
--- a/docs/mkdocs.yml
+++ b/docs/mkdocs.yml
@@ -24,6 +24,9 @@ nav:
           - getting_started/starter_example_local.md
       - getting_started/discover_llamaindex.md
       - getting_started/customization.md
+      - Starter Tools:
+          - getting_started/starter_tools/index.md
+          - getting_started/starter_tools/rag_cli.md
   - Learn:
       - Building an LLM Application: ./understanding/index.md
       - Using LLMs: ./understanding/using_llms/using_llms.md
@@ -2363,7 +2366,7 @@ plugins:
         ./use_cases/fine_tuning.html: https://docs.llamaindex.ai/en/stable/use_cases/fine_tuning/
         ./use_cases/graph_querying.html: https://docs.llamaindex.ai/en/stable/use_cases/graph_querying/
         ./use_cases/multimodal.html: https://docs.llamaindex.ai/en/stable/use_cases/multimodal/
-        ./use_cases/q_and_a/rag_cli.html: https://docs.llamaindex.ai/en/stable/use_cases/q_and_a/rag_cli/
+        ./use_cases/q_and_a/rag_cli.html: getting_started/starter_tools/rag_cli.md
         ./use_cases/q_and_a/root.html: use_cases/q_and_a/index.md
         ./use_cases/querying_csvs.html: https://docs.llamaindex.ai/en/stable/use_cases/querying_csvs/
         ./use_cases/root.html: https://docs.llamaindex.ai/en/stable/use_cases/