This project is mirrored from https://github.com/Mintplex-Labs/anything-llm.
- Jan 04, 2024: timothycarambat authored
- Dec 19, 2023: Timothy Carambat authored
  * wip
  * btn links
  * button updates
  * update dev instructions
  * typo
- Dec 07, 2023: Timothy Carambat authored
  * Implement use of native embedder (all-MiniLM-L6-v2); stop showing prisma queries during dev
  * Add native embedder as an available embedder selection
  * wrap model loader in try/catch
  * print progress on download
  * add built-in LLM support (experimental)
  * Update to progress output for embedder
  * move embedder selection options to component
  * safety checks for modelfile
  * update ref
  * Hide selection when on hosted subdomain
  * update documentation; hide localLlama when on hosted
  * safety checks for storage of models
  * update dockerfile to pre-build Llama.cpp bindings
  * update lockfile
  * add langchain doc comment
  * remove extraneous --no-metal option
  * Show data handling for private LLM
  * persist model in memory for N+1 chats
  * update import; update dev comment on token model size
  * update primary README
  * chore: more readme updates and remove screenshots - too much to maintain, just use the app!
  * remove screenshot link
- Nov 20, 2023: timothycarambat authored
- Oct 25, 2023
  - timothycarambat authored
  - timothycarambat authored: Update README to point to updated YT video
  - timothycarambat authored
- Aug 12, 2023: timothycarambat authored
- Jul 27, 2023: timothycarambat authored
- Jun 14, 2023: Timothy Carambat authored
  * wip
  * fix file ref
  * update dockerfile
  * mute chown
  * add build script for AWS CF template construction; add comment about script and AWS deployment
  * move aws stuff into its own folder
  * edit readme
- Jun 04, 2023: timothycarambat authored