feat: support setting maxConcurrentChunks for Generic OpenAI embedder (#2655)
* exposes `maxConcurrentChunks` parameter for the generic openai embedder through configuration. This allows setting a batch size for endpoints which don't support the default of 500
* Add the new field to the new UI
* Add a getter to ensure proper type and format
---------
Co-authored-by:
timothycarambat <rambat1010@gmail.com>
Changed files:
- docker/.env.example — 2 additions, 1 deletion
- frontend/src/components/EmbeddingSelection/GenericOpenAiOptions/index.jsx — 44 additions, 0 deletions
- server/.env.example — 1 addition, 0 deletions
- server/models/systemSettings.js — 2 additions, 0 deletions
- server/utils/EmbeddingEngines/genericOpenAi/index.js — 16 additions, 3 deletions
- server/utils/helpers/updateENV.js — 4 additions, 0 deletions
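The idea behind `maxConcurrentChunks` can be sketched as follows: split the chunks to embed into batches no larger than the configured limit, and send one request per batch. This is a minimal, hypothetical sketch, not the actual `genericOpenAi/index.js` implementation; `embedBatch` is a stand-in stub for the real provider call.

```javascript
// Split an array into batches of at most `batchSize` items.
function toBatches(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Hypothetical stand-in for the real embedding endpoint call;
// returns one fake vector per input text so the sketch is runnable.
async function embedBatch(texts) {
  return texts.map((t) => [t.length]);
}

// Embed all chunks, capping each request's payload at
// `maxConcurrentChunks` items (default 500, per the commit message).
async function embedChunks(chunks, { maxConcurrentChunks = 500 } = {}) {
  const results = [];
  for (const batch of toBatches(chunks, maxConcurrentChunks)) {
    // One request per batch keeps payloads under the endpoint's limit.
    const vectors = await embedBatch(batch);
    results.push(...vectors);
  }
  return results;
}
```

With a limit of 2, embedding five chunks would issue three requests instead of one oversized request, which is the point of exposing the setting for endpoints that reject the default batch size of 500.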