
@llamaindex/openai

0.1.44

Patch Changes

  • Updated dependencies [a9b5b993]
    • @llamaindex/core@0.4.19

0.1.43

Patch Changes

  • Updated dependencies [b504303c]
  • Updated dependencies [e0f6cc3b]
    • @llamaindex/env@0.1.25
    • @llamaindex/core@0.4.18

0.1.42

Patch Changes

  • 3d1808b5: chore: bump version
  • Updated dependencies [3d1808b5]
    • @llamaindex/core@0.4.17

0.1.41

Patch Changes

  • 8be45899: chore: bump version
  • Updated dependencies [8be45899]
    • @llamaindex/core@0.4.16
    • @llamaindex/env@0.1.24

0.1.40

Patch Changes

  • Updated dependencies [d2b2722a]
    • @llamaindex/env@0.1.23
    • @llamaindex/core@0.4.15

0.1.39

Patch Changes

  • Updated dependencies [969365ca]
    • @llamaindex/env@0.1.22
    • @llamaindex/core@0.4.14

0.1.38

Patch Changes

  • 90d265cf: chore: bump version
  • Updated dependencies [90d265cf]
    • @llamaindex/core@0.4.13
    • @llamaindex/env@0.1.21

0.1.37

Patch Changes

  • Updated dependencies [ef4f63d9]
    • @llamaindex/core@0.4.12

0.1.36

Patch Changes

  • Updated dependencies [6d22fa2a]
    • @llamaindex/core@0.4.11

0.1.35

Patch Changes

  • Updated dependencies [a7b0ac3c]
  • Updated dependencies [c69605f4]
    • @llamaindex/core@0.4.10

0.1.34

Patch Changes

  • 7ae6eaa0: feat: allow pass additionalChatOptions to agent
  • Updated dependencies [7ae6eaa0]
    • @llamaindex/core@0.4.9

0.1.33

Patch Changes

  • Updated dependencies [f865c984]
    • @llamaindex/core@0.4.8

0.1.32

Patch Changes

  • Updated dependencies [d89ebe02]
  • Updated dependencies [fd8c8827]
    • @llamaindex/core@0.4.7

0.1.31

Patch Changes

  • Updated dependencies [4fc001c8]
    • @llamaindex/env@0.1.20
    • @llamaindex/core@0.4.6

0.1.30

Patch Changes

  • Updated dependencies [ad85bd0b]
    • @llamaindex/core@0.4.5
    • @llamaindex/env@0.1.19

0.1.29

Patch Changes

  • Updated dependencies [a8d3fa68]
    • @llamaindex/env@0.1.18
    • @llamaindex/core@0.4.4

0.1.28

Patch Changes

  • Updated dependencies [95a5cc6e]
    • @llamaindex/core@0.4.3

0.1.27

Patch Changes

  • Updated dependencies [14cc9ebe]
    • @llamaindex/env@0.1.17
    • @llamaindex/core@0.4.2

0.1.26

Patch Changes

  • Updated dependencies [9c73f0a5]
    • @llamaindex/core@0.4.1

0.1.25

Patch Changes

0.1.24

Patch Changes

  • Updated dependencies [60b185ff]
    • @llamaindex/core@0.3.7

0.1.23

Patch Changes

  • Updated dependencies [691c5bca]
    • @llamaindex/core@0.3.6

0.1.22

Patch Changes

  • Updated dependencies [fa60fc66]
    • @llamaindex/env@0.1.16
    • @llamaindex/core@0.3.5

0.1.21

Patch Changes

  • Updated dependencies [e2a0876d]
    • @llamaindex/core@0.3.4

0.1.20

Patch Changes

  • Updated dependencies [0493f679]
    • @llamaindex/core@0.3.3

0.1.19

Patch Changes

  • Updated dependencies [4ba2cfe7]
    • @llamaindex/env@0.1.15
    • @llamaindex/core@0.3.2

0.1.18

Patch Changes

  • a75af835: refactor: move some llm and embedding to single package
  • Updated dependencies [ae49ff4e]
  • Updated dependencies [a75af835]
    • @llamaindex/env@0.1.14
    • @llamaindex/core@0.3.1

0.1.17

Patch Changes

  • Updated dependencies [1364e8ee]
  • Updated dependencies [96fc69cc]
    • @llamaindex/core@0.3.0

0.1.16

Patch Changes

  • 6a9a7b14: fix: take init api key into account

0.1.15

Patch Changes

  • Updated dependencies [5f678203]
    • @llamaindex/core@0.2.12

0.1.14

Patch Changes

  • Updated dependencies [ee697fb1]
    • @llamaindex/core@0.2.11

0.1.13

Patch Changes

  • Updated dependencies [3489e7de]
  • Updated dependencies [468bda59]
    • @llamaindex/core@0.2.10

0.1.12

Patch Changes

  • 2a824132: fix(core): set Settings.llm to OpenAI by default and support lazy load openai

0.1.11

Patch Changes

  • Updated dependencies [b17d439d]
    • @llamaindex/core@0.2.9

0.1.10

Patch Changes

  • Updated dependencies [df441e28]
    • @llamaindex/core@0.2.8
    • @llamaindex/env@0.1.13

0.1.9

Patch Changes

  • 96f72ad8: fix: openai streaming with token usage and finish_reason
  • Updated dependencies [6cce3b12]
    • @llamaindex/core@0.2.7

0.1.8

Patch Changes

  • Updated dependencies [8b7fdba5]
    • @llamaindex/core@0.2.6

0.1.7

Patch Changes

  • Updated dependencies [d902cc3e]
    • @llamaindex/core@0.2.5

0.1.6

Patch Changes

  • Updated dependencies [b48bcc3a]
    • @llamaindex/core@0.2.4
    • @llamaindex/env@0.1.12

0.1.5

Patch Changes

  • Updated dependencies [2cd1383d]
    • @llamaindex/core@0.2.3

0.1.4

Patch Changes

  • Updated dependencies [749b43a3]
    • @llamaindex/core@0.2.2

0.1.3

Patch Changes

  • Updated dependencies [ac07e3cb]
  • Updated dependencies [70ccb4ae]
  • Updated dependencies [1a6137b3]
  • Updated dependencies [ac07e3cb]
    • @llamaindex/core@0.2.1
    • @llamaindex/env@0.1.11

0.1.2

Patch Changes

  • Updated dependencies [11feef8c]
    • @llamaindex/core@0.2.0

0.1.1

Patch Changes

  • 7edeb1c2: feat: decouple openai from llamaindex module

    This should be a non-breaking change; going forward, you can install just @llamaindex/openai on its own to reduce the bundle size
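
Per the 0.1.1 entry, the OpenAI integration can be installed on its own rather than through the full llamaindex package. A minimal sketch, assuming npm (yarn/pnpm work similarly):

```shell
# Install only the OpenAI integration instead of the full llamaindex bundle
npm install @llamaindex/openai
```

Code then imports from the standalone package, e.g. `import { OpenAI } from "@llamaindex/openai"`.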