Commit eaaa0927 authored by Laurie Voss, committed by GitHub

Expanding use-cases docs: Q&A and chatbots

# Q&A patterns
(semantic-search)=
## Semantic Search
The most basic example of using LlamaIndex is semantic search. We provide a simple in-memory vector store to get you started, but you can also choose any one of our [vector store integrations](/community/integrations/vector_stores.md):
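For illustration, here is a minimal sketch of in-memory semantic search. The `data/` directory and the question are placeholders, and the imports assume the monolithic `llama_index` package (newer releases move these under `llama_index.core`):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load documents from a local folder (placeholder path)
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index, then run a top-k semantic search with answer synthesis
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

response = query_engine.query("What did the author do growing up?")
print(response)
```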
- [Example](/examples/vector_stores/SimpleIndexDemo.ipynb) ([Notebook](https://github.com/run-llama/llama_index/tree/main/docs/examples/vector_stores/SimpleIndexDemo.ipynb))
(summarization)=
## Summarization
A summarization query requires the LLM to iterate through many, if not most, documents in order to synthesize an answer.
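A minimal sketch, assuming the same placeholder `data/` directory as above; a `SummaryIndex` keeps every node, and the `tree_summarize` response mode builds the answer bottom-up over all of them:

```python
from llama_index import SimpleDirectoryReader, SummaryIndex

documents = SimpleDirectoryReader("data").load_data()

# A summary (list) index retains all nodes so the query engine can visit each one
index = SummaryIndex.from_documents(documents)

# tree_summarize folds the retrieved nodes into a single summary
query_engine = index.as_query_engine(response_mode="tree_summarize")
response = query_engine.query("Give a high-level summary of these documents.")
print(response)
```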
## Queries over Structured Data
LlamaIndex also supports natural-language queries over structured data, such as a Pandas DataFrame or a SQL database. Here are some relevant resources:
- [SQL Guide (Core)](/examples/index_structs/struct_indices/SQLIndexDemo.ipynb) ([Notebook](https://github.com/jerryjliu/llama_index/blob/main/docs/examples/index_structs/struct_indices/SQLIndexDemo.ipynb))
- [Pandas Demo](/examples/query_engine/pandas_query_engine.ipynb) ([Notebook](https://github.com/jerryjliu/llama_index/blob/main/docs/examples/query_engine/pandas_query_engine.ipynb))
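For instance, a minimal sketch of querying a small Pandas DataFrame (the data and the question are illustrative only):

```python
import pandas as pd

from llama_index.query_engine import PandasQueryEngine

# A toy DataFrame standing in for your structured data
df = pd.DataFrame(
    {
        "city": ["Toronto", "Tokyo", "Berlin"],
        "population": [2_930_000, 13_960_000, 3_645_000],
    }
)

# The engine translates the question into Pandas operations and runs them against df
query_engine = PandasQueryEngine(df=df)
response = query_engine.query("Which city has the highest population?")
print(response)
```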
(combine-multiple-sources)=
## Synthesis over Heterogeneous Data
LlamaIndex supports synthesizing across heterogeneous data sources. This can be done by composing a graph over your existing data.
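A sketch under stated assumptions: the Notion and Slack export directories are placeholders, and the index summaries are free-form descriptions you supply for each sub-index:

```python
from llama_index import ComposableGraph, SimpleDirectoryReader, SummaryIndex, VectorStoreIndex

# Placeholder paths for two heterogeneous document sets
notion_docs = SimpleDirectoryReader("notion_export").load_data()
slack_docs = SimpleDirectoryReader("slack_export").load_data()

index1 = VectorStoreIndex.from_documents(notion_docs)
index2 = VectorStoreIndex.from_documents(slack_docs)

# Compose a graph with a summary index at the root and a short description per sub-index
graph = ComposableGraph.from_indices(
    SummaryIndex,
    [index1, index2],
    index_summaries=[
        "Product and planning documents from Notion",
        "Engineering discussions from Slack",
    ],
)

query_engine = graph.as_query_engine()
response = query_engine.query("<query_str>")
```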
- [City Analysis](/examples/composable_indices/city_analysis/PineconeDemo-CityAnalysis.ipynb) ([Notebook](https://github.com/jerryjliu/llama_index/blob/main/docs/examples/composable_indices/city_analysis/PineconeDemo-CityAnalysis.ipynb))
(route-across-multiple-sources)=
## Routing over Heterogeneous Data
LlamaIndex also supports routing over heterogeneous data sources with `RouterQueryEngine` - for instance, if you want to "route" a query to an underlying Document or a sub-index.
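A sketch of routing between a vector index and a summary index built over the same documents; the tool descriptions are placeholders, and the selector shown is just one of the available options:

```python
from llama_index import SimpleDirectoryReader, SummaryIndex, VectorStoreIndex
from llama_index.query_engine import RouterQueryEngine
from llama_index.selectors.llm_selectors import LLMSingleSelector
from llama_index.tools import QueryEngineTool

documents = SimpleDirectoryReader("data").load_data()

vector_tool = QueryEngineTool.from_defaults(
    query_engine=VectorStoreIndex.from_documents(documents).as_query_engine(),
    description="Useful for answering specific questions about the documents.",
)
summary_tool = QueryEngineTool.from_defaults(
    query_engine=SummaryIndex.from_documents(documents).as_query_engine(),
    description="Useful for summarizing the documents as a whole.",
)

# The LLM selector reads the query plus the tool descriptions and picks one engine
query_engine = RouterQueryEngine(
    selector=LLMSingleSelector.from_defaults(),
    query_engine_tools=[vector_tool, summary_tool],
)
response = query_engine.query("<query_str>")
```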
## Compare/Contrast Queries
You can explicitly perform compare/contrast queries with a query transformation module. This module will help break down a complex query into a simpler one over your existing data.
You can also rely on the LLM to _infer_ whether to perform compare/contrast queries (see Multi-Document Queries below).
(multi-document-queries)=
## Multi-Document Queries
Besides the explicit synthesis/routing flows described above, LlamaIndex can support more general multi-document queries as well.
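One way LlamaIndex supports this is via the `SubQuestionQueryEngine`, which decomposes a question into per-source sub-questions and then combines the answers. A sketch, assuming one query engine per document set (the directory names, tool names, and descriptions are placeholders):

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.tools import QueryEngineTool, ToolMetadata

# Build one query engine per document set (placeholder directories)
march_engine = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("reports/march").load_data()
).as_query_engine()
june_engine = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("reports/june").load_data()
).as_query_engine()

query_engine_tools = [
    QueryEngineTool(
        query_engine=march_engine,
        metadata=ToolMetadata(name="march_report", description="Q&A over the March report"),
    ),
    QueryEngineTool(
        query_engine=june_engine,
        metadata=ToolMetadata(name="june_report", description="Q&A over the June report"),
    ),
]

# The engine asks sub-questions against each tool, then synthesizes a combined answer
query_engine = SubQuestionQueryEngine.from_defaults(query_engine_tools=query_engine_tools)
response = query_engine.query("How do the two reports differ?")
```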
# Chatbots
LlamaIndex gives you the tools to build knowledge-augmented chatbots and agents.
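For illustration, a minimal sketch of a chat engine over your own documents; the `data/` directory, the chat mode, and the questions are placeholders:

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Index your own documents (placeholder path)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# A chat engine keeps conversation history and condenses follow-up questions
chat_engine = index.as_chat_engine(chat_mode="condense_question")

print(chat_engine.chat("What did the author do growing up?"))
print(chat_engine.chat("What did they do after that?"))  # follow-up uses the chat history
```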
Here are some relevant resources:
- [How to build a chatbot](/examples/agent/Chatbot_SEC.ipynb) tutorial
- [Using with a LangChain Agent](/community/integrations/using_with_langchain.md)
- [Building a chatbot](/understanding/putting_it_all_together/chatbots/building_a_chatbot.md) tutorial
- [create-llama](https://blog.llamaindex.ai/create-llama-a-command-line-tool-to-generate-llamaindex-apps-8f7683021191), a command line tool that generates a full-stack chatbot application for you
- [SECinsights.ai](https://www.secinsights.ai/), an open-source application that uses LlamaIndex to build a chatbot that answers questions about SEC filings
- [RAGs](https://blog.llamaindex.ai/introducing-rags-your-personalized-chatgpt-experience-over-your-data-2b9d140769b1), a project inspired by OpenAI's GPTs that lets you build a low-code chatbot over your data using Streamlit
- Our [OpenAI agents](/module_guides/deploying/agents/modules.md) are all chatbots in nature
## External sources
- [Building a chatbot with Streamlit](https://blog.streamlit.io/build-a-chatbot-with-custom-data-sources-powered-by-llamaindex/)
# Q&A
Q&A has all sorts of sub-types, such as:
### What to do
- **Semantic search**: finding data that matches not just your query terms, but your intent and the meaning behind your question. This is sometimes known as "top k" search.
- [Example of semantic search](semantic-search)
- **Summarization**: condensing a large amount of data into a short summary relevant to your current question
- [Example of summarization](summarization)
### Where to search
- **Over documents**: LlamaIndex can pull in unstructured text, PDFs, Notion and Slack documents, and more, and index the data within them.
- [Example of search over documents](combine-multiple-sources)
- [Building a multi-document agent over the LlamaIndex docs](/examples/agent/multi_document_agents-v1.ipynb)
- **Over structured data**: if your data already exists in a SQL database, as JSON, or in any number of other structured formats, LlamaIndex can query the data in these sources; a minimal Text-to-SQL sketch follows this list.
- [Searching Pandas tables](/examples/query_engine/pandas_query_engine.md)
- [Text to SQL](/examples/index_structs/struct_indices/SQLIndexDemo.md)
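The sketch below builds a tiny in-memory SQLite table and asks a question against it; the table, its contents, and the question are illustrative only:

```python
from sqlalchemy import create_engine, text

from llama_index import SQLDatabase
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine

# A tiny in-memory SQLite table standing in for your database
engine = create_engine("sqlite:///:memory:")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE city_stats (city_name TEXT, population INTEGER)"))
    conn.execute(text("INSERT INTO city_stats VALUES ('Toronto', 2930000), ('Tokyo', 13960000)"))

sql_database = SQLDatabase(engine, include_tables=["city_stats"])

# The engine writes SQL from the natural-language question, runs it, and explains the result
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["city_stats"])
response = query_engine.query("Which city has the highest population?")
print(response)
```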
### How to search
- **Combine multiple sources**: is some of your data in Slack, some in PDFs, some in unstructured text? LlamaIndex can query an arbitrary number of sources and combine the results.
- [Example of combining multiple sources](combine-multiple-sources)
- **Route across multiple sources**: given multiple data sources, your application can first pick the best source and then "route" the question to that source.
- [Example of routing across multiple sources](route-across-multiple-sources)
- **Multi-document queries**: some questions have partial answers in multiple data sources, which need to be queried separately before the answers can be combined
- [Example of multi-document queries](multi-document-queries)
## Further examples
For further examples of Q&A use cases, see our [Q&A section in Putting it All Together](/understanding/putting_it_all_together/q_and_a.html).