Unverified commit 6281fc8c authored by github-actions[bot], committed by GitHub

Release 0.3.9 (#828)

parent c3747d09
Showing 88 additions and 20 deletions
---
"llamaindex": patch
"@llamaindex/core-e2e": patch
"@llamaindex/next-agent-test": patch
"@llamaindex/nextjs-edge-runtime-test": patch
---
fix: import `@xenova/transformers`
For now, if you use llamaindex in Next.js, you need to add a plugin from `llamaindex/next` to ensure some module resolutions are correct.
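A minimal sketch of wiring that plugin into a Next.js project. This assumes the plugin is the default export of `llamaindex/next` (here bound to `withLlamaIndex`) and wraps the existing config; check the package's own docs for the exact export shape.

```javascript
// next.config.js — sketch, not the definitive setup.
// Assumption: `llamaindex/next` default-exports a function that takes a
// Next.js config and returns one with the module-resolution fixes
// (e.g. for `@xenova/transformers`) applied.
const withLlamaIndex = require("llamaindex/next");

/** @type {import('next').NextConfig} */
const nextConfig = {
  // your existing Next.js options go here
};

module.exports = withLlamaIndex(nextConfig);
```

Wrapping the config this way lets the plugin adjust webpack/module resolution without you hand-maintaining aliases for transitive dependencies.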
# docs
## 0.0.17
### Patch Changes
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
## 0.0.16
### Patch Changes
......
{
"name": "docs",
"version": "0.0.16",
"version": "0.0.17",
"private": true,
"scripts": {
"docusaurus": "docusaurus",
......
# @llamaindex/autotool-01-node-example
## null
### Patch Changes
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
- @llamaindex/autotool@0.0.1
......@@ -12,5 +12,6 @@
},
"scripts": {
"start": "node --import tsx --import @llamaindex/autotool/node ./src/index.ts"
- }
+ },
+ "version": null
}
# @llamaindex/autotool-02-next-example
## 0.1.1
### Patch Changes
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
- @llamaindex/autotool@0.0.1
{
"name": "@llamaindex/autotool-02-next-example",
"private": true,
"version": "0.1.0",
"version": "0.1.1",
"scripts": {
"dev": "next dev",
"build": "next build",
......
......@@ -47,7 +47,7 @@
"unplugin": "^1.10.1"
},
"peerDependencies": {
"llamaindex": "^0.3.0",
"llamaindex": "^0.3.9",
"openai": "^4",
"typescript": "^4"
},
......
# llamaindex
## 0.3.9
### Patch Changes
- c3747d0: fix: import `@xenova/transformers`
For now, if you use llamaindex in Next.js, you need to add a plugin from `llamaindex/next` to ensure some module resolutions are correct.
## 0.3.8
### Patch Changes
......
# @llamaindex/core-e2e
## 0.0.5
### Patch Changes
- c3747d0: fix: import `@xenova/transformers`
For now, if you use llamaindex in Next.js, you need to add a plugin from `llamaindex/next` to ensure some module resolutions are correct.
## 0.0.4
### Patch Changes
......
# @llamaindex/cloudflare-worker-agent-test
## 0.0.10
### Patch Changes
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
## 0.0.9
### Patch Changes
......
{
"name": "@llamaindex/cloudflare-worker-agent-test",
"version": "0.0.9",
"version": "0.0.10",
"type": "module",
"private": true,
"scripts": {
......
# @llamaindex/next-agent-test
## 0.1.10
### Patch Changes
- c3747d0: fix: import `@xenova/transformers`
For now, if you use llamaindex in Next.js, you need to add a plugin from `llamaindex/next` to ensure some module resolutions are correct.
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
## 0.1.9
### Patch Changes
......
{
"name": "@llamaindex/next-agent-test",
"version": "0.1.9",
"version": "0.1.10",
"private": true,
"scripts": {
"dev": "next dev",
......
# test-edge-runtime
## 0.1.9
### Patch Changes
- c3747d0: fix: import `@xenova/transformers`
For now, if you use llamaindex in Next.js, you need to add a plugin from `llamaindex/next` to ensure some module resolutions are correct.
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
## 0.1.8
### Patch Changes
......
{
"name": "@llamaindex/nextjs-edge-runtime-test",
"version": "0.1.8",
"version": "0.1.9",
"private": true,
"scripts": {
"dev": "next dev",
......
# @llamaindex/waku-query-engine-test
## 0.0.10
### Patch Changes
- Updated dependencies [c3747d0]
- llamaindex@0.3.9
## 0.0.9
### Patch Changes
......
{
"name": "@llamaindex/waku-query-engine-test",
"version": "0.0.9",
"version": "0.0.10",
"type": "module",
"private": true,
"scripts": {
......
{
"name": "@llamaindex/core-e2e",
"private": true,
"version": "0.0.4",
"version": "0.0.5",
"type": "module",
"scripts": {
"e2e": "node --import tsx --import ./mock-register.js --test ./node/*.e2e.ts",
......
{
"name": "@llamaindex/core",
"version": "0.3.8",
"version": "0.3.9",
"exports": "./src/index.ts",
"imports": {
"@llamaindex/env": "jsr:@llamaindex/env@0.1.2"
......