diff --git a/packages/create-llama/templates/types/simple/fastapi/README-template.md b/packages/create-llama/templates/types/simple/fastapi/README-template.md
index f0b92bdfce648a374bd2723fee4ceec69605db69..0e7fb53886cb223956fafa4f5290b42ae8fd3f36 100644
--- a/packages/create-llama/templates/types/simple/fastapi/README-template.md
+++ b/packages/create-llama/templates/types/simple/fastapi/README-template.md
@@ -9,6 +9,13 @@ poetry install
 poetry shell
 ```
 
+By default, we use the OpenAI LLM (though you can customize it; see app/api/routers/chat.py). As a result, you need to specify an `OPENAI_API_KEY` in a `.env` file in this directory.
+
+Example `backend/.env` file:
+```
+OPENAI_API_KEY=<openai_api_key>
+```
+
 Second, run the development server:
 
 ```
diff --git a/packages/create-llama/templates/types/streaming/fastapi/README-template.md b/packages/create-llama/templates/types/streaming/fastapi/README-template.md
index f0b92bdfce648a374bd2723fee4ceec69605db69..0e7fb53886cb223956fafa4f5290b42ae8fd3f36 100644
--- a/packages/create-llama/templates/types/streaming/fastapi/README-template.md
+++ b/packages/create-llama/templates/types/streaming/fastapi/README-template.md
@@ -9,6 +9,13 @@ poetry install
 poetry shell
 ```
 
+By default, we use the OpenAI LLM (though you can customize it; see app/api/routers/chat.py). As a result, you need to specify an `OPENAI_API_KEY` in a `.env` file in this directory.
+
+Example `backend/.env` file:
+```
+OPENAI_API_KEY=<openai_api_key>
+```
+
 Second, run the development server:
 
 ```
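The added README text tells users to put `OPENAI_API_KEY` in a `.env` file and points at `app/api/routers/chat.py` for customization, but it does not show how that variable reaches the backend. The following is only a hypothetical sketch of one common pattern, loading the file with python-dotenv before the FastAPI app starts; the template's actual entry point may handle this differently.

```python
# Hypothetical sketch: load backend/.env into the environment at startup.
# The real template code (e.g. its main.py) may do this differently.
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # reads KEY=value pairs from .env into os.environ

if os.getenv("OPENAI_API_KEY") is None:
    raise RuntimeError("OPENAI_API_KEY is not set; add it to the .env file")
```

Failing fast like this surfaces a missing key at startup rather than on the first chat request.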