Select Git revision
  • 0.3.0
  • add-dockers
  • add/azure-code-interpreter
  • add/duckduckgo
  • bump/react-19-stable
  • changeset-release/main
  • chore/add-ollama-url
  • chore/always-generate-tool-config
  • chore/better-dx
  • chore/bump-ai-v4
  • chore/bump-lits
  • chore/bump-llama-index
  • chore/bump-react19-rc
  • chore/fix-always-ask-post-installation-question
  • chore/fix-dependency
  • chore/fix-syntax
  • chore/gemini
  • chore/improve
  • chore/improve-e2b-output
  • chore/lee-fix
  • v0.4.0
  • v0.3.28
  • v0.3.27
  • v0.3.26
  • v0.3.25
  • v0.3.24
  • v0.3.23
  • v0.3.22
  • v0.3.21
  • v0.3.20
  • v0.3.19
  • v0.3.18
  • v0.3.17
  • v0.3.16
  • v0.3.15
  • v0.3.14
  • v0.3.13
  • v0.3.12
  • v0.3.11
  • v0.3.10
Commit history (most recent first), with tags and branch labels:
  • add model mapping to settings
  • fix: hide events per default and optimize python messaging (#64)
  • Create lovely-papayas-exist.md
  • feat: support anthropic
  • fix: changeset status
  • ci: fix pnpm
  • ci: name changeset PRs with version
  • fix: allow onnxruntime in nextjs server side (#59)
  • feat: display chat events (#52)
  • Release 0.1.0 (#44) [tag: v0.1.0]
  • fix: only render sources with metadata
  • Adding support for Llama 3 and Phi3 (via Ollama) (#53)
  • chore: fix format and update to pnpm 9.0.5
  • feat: show alert when getting chat error (#55)
  • refactor: use tsx instead of ts-node (#54)
  • ci: add npm publish to release script
  • feat: display sources in chat messages (#45)
  • fix: aws4 warning (#51)
  • add npmrc file for express to fix long path name issue with node 20 on windows (#49)
  • fix: nextjs type checks
  • fix: use new ToolsFactory
  • ci: add github releases action
  • docs: don't autocommit changesets
  • fix vercel stream breaking [branch: lee/vercel-streaming]
  • add vercel streaming format
  • feat: Use `gpt-4-turbo` model as default. Upgrade Python llama-index to 0.10.28
  • docs(changeset): Use `gpt-4-turbo` model as default. Upgrade Python llama-index to 0.10.28
  • feat: Qdrant support (#42)
  • feat: use poetry run generate (#41)
  • feat: optionally ask to select AI models and use default models (#40)
  • docs(changeset): Use Settings object for LlamaIndex configuration
  • feat: use setting config (#38)
  • fix: release script
  • chore: add changeset release script
  • docs(changeset): Add observability for Python
  • feat: observability for Python with OpenTelemetry (#39)
  • check vison model to append empty data [branch: fix/remove-empty-object-at-start-and-end-of-streaming]
  • RELEASING: Releasing 1 package(s) [tag: v0.0.32]
  • fix: remove non-working vercel streaming with source nodes
  • fix: remove empty object at start and end of streaming