Your Docker host will show the image as online once the build process is completed. The app will then be available at `http://localhost:3001`.
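If you started the container by hand rather than via the compose file, the port publish that makes the app reachable on `http://localhost:3001` looks roughly like this. The image name, flags, and internal port are typical values and should be treated as assumptions, not taken from this document:

```shell
# Sketch: run the image detached and publish its internal port 3001 on the host.
# Image name is an assumption; check your own build tag or Docker Hub.
docker run -d \
  -p 3001:3001 \
  --name anythingllm \
  mintplexlabs/anythingllm
```

Once the container reports as running in `docker ps`, the UI should respond at `http://localhost:3001`.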
## ⚠️ Vector DB support ⚠️

Out of the box, all vector databases are supported. Any vector database requiring special configuration is listed below.

### Using local ChromaDB with Dockerized AnythingLLM

- Ensure in your `./docker/.env` file that you have

```
#./docker/.env
...other configs

VECTOR_DB="chroma"
CHROMA_ENDPOINT='http://host.docker.internal:8000' # Allow docker to look on host port, not container.
# CHROMA_API_HEADER="X-Api-Key" // If you have an Auth middleware on your instance.
# CHROMA_API_KEY="sk-123abc"

...other configs
```

## Common questions and fixes

### Cannot connect to service running on localhost!

If you are in Docker and cannot connect to a service running on your host machine, it is most likely bound to a local interface or loopback address such as:

- `localhost`
- `127.0.0.1`
- `0.0.0.0`

> [!IMPORTANT]
> On Linux `http://host.docker.internal:xxxx` does not work.
> Use `http://172.17.0.1:xxxx` instead to emulate this functionality.

Then in Docker you need to replace that localhost part with `host.docker.internal`. For example, if you are running Ollama on the host machine, bound to `http://127.0.0.1:11434`, you should put `http://host.docker.internal:11434` into the connection URL in AnythingLLM.
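The localhost-to-`host.docker.internal` rewrite described above is mechanical, so it can be sketched as a small shell helper. The function name is hypothetical and the helper is only an illustration of the substitution rule, not part of AnythingLLM:

```shell
# rewrite_for_docker (hypothetical helper): replace a loopback host in a URL
# with host.docker.internal so a container can reach the host service.
rewrite_for_docker() {
  printf '%s\n' "$1" | sed -E 's#(localhost|127\.0\.0\.1|0\.0\.0\.0)#host.docker.internal#'
}

rewrite_for_docker "http://127.0.0.1:11434"
# → http://host.docker.internal:11434
```

On Linux you would substitute `172.17.0.1` instead, per the note above.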
### API is not working, cannot login, LLM is "offline"?