Commit d4d29a45 authored by Siraj R Aizlewood

OllamaLLM Updates

Ensured that we have default values for the attributes name, temperature, llm_name, max_tokens, and stream.

The user can override the values actually used, on the fly, via new optional arguments to __call__ (see the sketch below).
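
A minimal sketch of what this could look like; the default values, endpoint URL, and payload shape below are assumptions for illustration, not taken from the commit itself:

```python
from typing import List, Optional

import requests


class OllamaLLM:
    """Sketch of the described behaviour; concrete defaults are assumed."""

    name: str = "ollama"            # identifies the wrapper class, not the served model
    temperature: float = 0.2        # assumed default sampling temperature
    llm_name: str = "openhermes"    # assumed default Ollama model name
    max_tokens: int = 200           # assumed default completion length
    stream: bool = False            # assumed default streaming flag

    def __call__(
        self,
        messages: List[dict],
        temperature: Optional[float] = None,
        llm_name: Optional[str] = None,
        max_tokens: Optional[int] = None,
        stream: Optional[bool] = None,
    ) -> str:
        # Optional arguments override the instance defaults on the fly.
        payload = {
            "model": llm_name if llm_name is not None else self.llm_name,
            "messages": messages,
            "options": {
                "temperature": temperature if temperature is not None else self.temperature,
                "num_predict": max_tokens if max_tokens is not None else self.max_tokens,
            },
            "stream": stream if stream is not None else self.stream,
        }
        # Assumed local Ollama chat endpoint; streamed responses would need
        # line-by-line parsing, which is omitted here.
        response = requests.post("http://localhost:11434/api/chat", json=payload)
        response.raise_for_status()
        return response.json()["message"]["content"]
```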

Also renamed _call_ to __call__, in line with the other LLMs.

Previously, self.name was used to identify the LLM to call, but self.name is intended to identify the instance as an instance of OllamaLLM, so it is now set to "ollama".
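
Hypothetical usage under the sketch above, illustrating the fixed "ollama" name and the per-call overrides:

```python
llm = OllamaLLM()
print(llm.name)  # "ollama" — identifies the wrapper, not the model being served

# Defaults are used unless overridden for a single call:
out = llm(
    [{"role": "user", "content": "Hello"}],
    temperature=0.7,   # overrides self.temperature for this call only
    max_tokens=50,     # overrides self.max_tokens for this call only
)
```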
parent daef9df3