The most basic example of using LlamaIndex is semantic search. We provide a simple in-memory vector store to get you started, but you can also choose any one of our [vector store integrations](/community/integrations/vector_stores.md):
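A minimal sketch of that pattern, assuming a local `data/` folder of documents, an LLM configured (OpenAI by default via `OPENAI_API_KEY`), and the `llama_index.core` module layout of recent releases (older versions import from `llama_index` directly):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Load documents from a local folder (the path is a placeholder).
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index and run a semantic ("top k") query.
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author work on?")
print(response)
```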
LlamaIndex gives you the tools to build knowledge-augmented chatbots and agents.
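As a sketch of what "knowledge-augmented" means in practice, an index over your own documents can be wrapped as a chat engine that keeps conversation history. This assumes the same placeholder `data/` folder and `llama_index.core` layout as above:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Index your own documents, then expose them through a chat interface.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# A chat engine keeps conversation history, so follow-up questions work.
chat_engine = index.as_chat_engine()
print(chat_engine.chat("What does the onboarding guide say about laptops?"))
print(chat_engine.chat("And what about security training?"))
```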
Here are some relevant resources:
- [How to build a chatbot](/examples/agent/Chatbot_SEC.ipynb) tutorial
- [Using with a LangChain Agent](/community/integrations/using_with_langchain.md)
- [Building a chatbot](/understanding/putting_it_all_together/chatbots/building_a_chatbot.md) tutorial
- [create-llama](https://blog.llamaindex.ai/create-llama-a-command-line-tool-to-generate-llamaindex-apps-8f7683021191), a command line tool that generates a full-stack chatbot application for you
- [SECinsights.ai](https://www.secinsights.ai/), an open-source application that uses LlamaIndex to build a chatbot that answers questions about SEC filings
- [RAGs](https://blog.llamaindex.ai/introducing-rags-your-personalized-chatgpt-experience-over-your-data-2b9d140769b1), a project inspired by OpenAI's GPTs that lets you build a low-code chatbot over your data using Streamlit
- Our [OpenAI agents](/module_guides/deploying/agents/modules.md) are all chatbots by nature
## External sources
- [Building a chatbot with Streamlit](https://blog.streamlit.io/build-a-chatbot-with-custom-data-sources-powered-by-llamaindex/)
Q&A has all sorts of sub-types, such as:
### What to do
- **Semantic search**: finding data that matches not just your query terms, but your intent and the meaning behind your question. This is sometimes known as "top k" search.
- [Example of semantic search](semantic-search)
- **Summarization**: condensing a large amount of data into a short summary relevant to your current question (a minimal sketch follows this list)
- [Example of summarization](summarization)
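A minimal summarization sketch, again assuming a placeholder `data/` folder and the `llama_index.core` module layout:

```python
from llama_index.core import SummaryIndex, SimpleDirectoryReader

# A summary index visits every node, which suits whole-corpus summaries.
documents = SimpleDirectoryReader("data").load_data()
summary_index = SummaryIndex.from_documents(documents)

# tree_summarize recursively condenses chunks into a single answer.
query_engine = summary_index.as_query_engine(response_mode="tree_summarize")
print(query_engine.query("Summarize these documents in a few sentences."))
```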
### Where to search
- **Over documents**: LlamaIndex can pull in unstructured text, PDFs, Notion and Slack documents and more, and index the data within them.
- [Example of search over documents](combine-multiple-sources)
- [Building a multi-document agent over the LlamaIndex docs](/examples/agent/multi_document_agents-v1.ipynb)
- **Over structured data**: if your data already exists in a SQL database, as JSON, or in any number of other structured formats, LlamaIndex can query the data in these sources (see the sketch after this list).
- [Text to SQL](/examples/index_structs/struct_indices/SQLIndexDemo.md)
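A text-to-SQL sketch using `NLSQLTableQueryEngine` from recent `llama_index.core` releases; the SQLite path and table name are purely illustrative:

```python
from sqlalchemy import create_engine
from llama_index.core import SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine

# Point SQLAlchemy at an existing database (path and table are placeholders).
engine = create_engine("sqlite:///sales.db")
sql_database = SQLDatabase(engine, include_tables=["orders"])

# The query engine translates natural language into SQL, runs it,
# and synthesizes a natural-language answer from the result rows.
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["orders"])
print(query_engine.query("Which month had the highest total revenue?"))
```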
### How to search
- **Combine multiple sources**: is some of your data in Slack, some in PDFs, some in unstructured text? LlamaIndex can query an arbitrary number of sources and combine the results.
- [Example of combining multiple sources](combine-multiple-sources)
- **Route across multiple sources**: given multiple data sources, your application can first pick the best source and then "route" the question to that source (see the sketch after this list).
- [Example of routing across multiple sources](route-across-multiple-sources)
- **Multi-document queries**: some questions have partial answers in multiple data sources, which need to be queried separately before they can be combined.
- [Example of multi-document queries](multi-document-queries)
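One way to route across sources is LlamaIndex's `RouterQueryEngine`. A minimal sketch, assuming two placeholder document folders, an LLM configured, and the `llama_index.core` module layout (tool descriptions are illustrative):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.query_engine import RouterQueryEngine
from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import QueryEngineTool

# Build one index per source (folder names are placeholders).
notes_index = VectorStoreIndex.from_documents(SimpleDirectoryReader("notes").load_data())
reports_index = VectorStoreIndex.from_documents(SimpleDirectoryReader("reports").load_data())

# Describe each source so the LLM selector can pick the right one.
tools = [
    QueryEngineTool.from_defaults(
        query_engine=notes_index.as_query_engine(),
        description="Internal meeting notes and decisions.",
    ),
    QueryEngineTool.from_defaults(
        query_engine=reports_index.as_query_engine(),
        description="Quarterly financial reports.",
    ),
]

# The router asks the LLM which tool fits the question, then forwards it there.
router = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=tools,
)
print(router.query("What did we decide about the Q3 budget?"))
```

For the multi-document case, `SubQuestionQueryEngine` follows the same tool pattern but decomposes a question into per-source sub-questions and combines the answers.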
## Further examples
For further examples of Q&A use cases, see our [Q&A section in Putting it All Together](/understanding/putting_it_all_together/q_and_a.html).