Unverified commit cc3343ba authored by Timothy Carambat, committed by GitHub

add Docker internal URL to READMEs (#441)

add Docker internal URL
parent da0cec7a
@@ -94,6 +94,13 @@ This monorepo consists of three main sections:
*AnythingLLM by default embeds text on instance privately [Learn More](/server/storage/models/README.md)
## Recommended usage with Docker (easy!)
> [!IMPORTANT]
> If you are running another service on localhost, like Chroma, LocalAI, or LMStudio,
> you will need to use http://host.docker.internal:xxxx to access that service from within
> the Docker container running AnythingLLM, since `localhost:xxxx` will not resolve to the host system.
> e.g. Chroma running on localhost:8000 on the host machine must be set as http://host.docker.internal:8000
> when used in AnythingLLM.
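The URL rewrite described in the note above can be sketched as a tiny shell helper; `to_docker_internal` is a hypothetical name used here for illustration only and is not part of the project:

```shell
#!/bin/sh
# Rewrite a host-local service URL so it resolves from inside the
# AnythingLLM Docker container (hypothetical helper, not part of the project).
to_docker_internal() {
  printf '%s\n' "$1" | sed 's|//localhost:|//host.docker.internal:|'
}

to_docker_internal "http://localhost:8000"
# prints: http://host.docker.internal:8000
```

Note that on Linux, `host.docker.internal` is not resolvable by default; with Docker 20.10+ you can typically enable it by passing `--add-host=host.docker.internal:host-gateway` to `docker run`.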
> [!TIP]
> It is best to mount the container's storage volume to a folder on your host machine
> so that you can pull in future updates without deleting your existing data!
...
@@ -6,6 +6,13 @@ Use the Dockerized version of AnythingLLM for a much faster and complete startup
- Install [Docker](https://www.docker.com/) on your computer or machine.
## Recommended way to run dockerized AnythingLLM!
> [!IMPORTANT]
> If you are running another service on localhost, like Chroma, LocalAI, or LMStudio,
> you will need to use http://host.docker.internal:xxxx to access that service from within
> the Docker container running AnythingLLM, since `localhost:xxxx` will not resolve to the host system.
> e.g. Chroma running on localhost:8000 on the host machine must be set as http://host.docker.internal:8000
> when used in AnythingLLM.
> [!TIP]
> It is best to mount the container's storage volume to a folder on your host machine
> so that you can pull in future updates without deleting your existing data!
...