diff --git a/apps/docs/docs/introduction.md b/apps/docs/docs/introduction.md
index 74d1db5b4bfc7fdf383ef9940c0186a95db3f378..e49977dfb5e8c293b2b90e748eab842001d11ab9 100644
--- a/apps/docs/docs/introduction.md
+++ b/apps/docs/docs/introduction.md
@@ -37,14 +37,14 @@ For more complex applications, our lower-level APIs allow advanced users to cust

 Our documentation includes [Installation Instructions](./installation.md) and a [Starter Tutorial](./starter.md) to build your first application.

-Once you're up and running, [High-Level Concepts](./concepts.md) has an overview of LlamaIndex's modular architecture. For more hands-on practical examples, look through our [End-to-End Tutorials](LINK TO EXAMPLES FOLDER).
+Once you're up and running, [High-Level Concepts](./concepts.md) has an overview of LlamaIndex's modular architecture. For more hands-on practical examples, look through our [End-to-End Tutorials](./end_to_end.md).

 ## 🗺️ Ecosystem

 To download or contribute, find LlamaIndex on:

 - Github: https://github.com/jerryjliu/llama_index
-- LlamaIndex (NPM): LINK TO NPM PACKAGE
+- LlamaIndex (NPM): https://www.npmjs.com/package/llamaindex
 - LlamaIndex (Python): https://pypi.org/project/llama-index/.

 ## Community
diff --git a/apps/docs/docs/modules/index.md b/apps/docs/docs/modules/index.md
index 9293c7e880d86c4d377b041724982c68472eaf12..a5f91feb510b4577570773da1291cb8f13026a64 100644
--- a/apps/docs/docs/modules/index.md
+++ b/apps/docs/docs/modules/index.md
@@ -12,7 +12,7 @@ LlamaIndex.TS offers several core modules, seperated into high-level modules fo

 - [**Indexes**](./high_level/data_index.md): indexes store the Nodes and the embeddings of those nodes.

--[**QueryEngine**](./high_level/query_engine.md): Query engines are what generate the query you put in and give you back the result. Query engines generally combine a pre-built prompt with selected nodes from your Index to give the LLM the context it needs to answer your query.
+- [**QueryEngine**](./high_level/query_engine.md): Query engines are what generate the query you put in and give you back the result. Query engines generally combine a pre-built prompt with selected nodes from your Index to give the LLM the context it needs to answer your query.

 - [**ChatEngine**](./high_level/chat_engine.md): A ChatEngine helps you build a chatbot that will interact with your Indexes.
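
For context on the QueryEngine bullet fixed in the second hunk, below is a minimal sketch of how an index and query engine are typically wired together in LlamaIndex.TS. It assumes the `Document` and `VectorStoreIndex` exports of the `llamaindex` NPM package and an `OPENAI_API_KEY` in the environment; the exact `query()` signature has varied between releases, so treat this as illustrative rather than the documented API.

```ts
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  // Wrap raw text in a Document, then build an index over it.
  // The index stores the resulting Nodes and their embeddings.
  const document = new Document({
    text: "LlamaIndex.TS is a data framework for LLM applications.",
  });
  const index = await VectorStoreIndex.fromDocuments([document]);

  // The query engine retrieves relevant nodes from the index and combines
  // them with a pre-built prompt to give the LLM context for the answer.
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query("What is LlamaIndex.TS?");
  console.log(response.toString());
}

main();
```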