OllamaLLM Updates
Ensured that the attributes name, temperature, llm_name, max_tokens, and stream have default values. Users can override the values actually used on the fly via new optional arguments to __call__. Also renamed _call_ to __call__, in line with the other LLMs. Previously self.name was used to identify the LLM to be called, but self.name is intended to identify the OllamaLLM instance as an instance of OllamaLLM, so it is now set to "ollama".
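A minimal sketch of the override behavior described above. The attribute names and the name = "ollama" convention come from this description; the specific default values, the prompt parameter, and the returned dict are illustrative assumptions, not the actual implementation:

```python
class OllamaLLM:
    """Sketch of the described change (illustrative; defaults are assumed)."""

    def __init__(self, llm_name="llama2", temperature=0.7,
                 max_tokens=256, stream=False):
        # self.name identifies this instance as an OllamaLLM,
        # not the model being called.
        self.name = "ollama"
        self.llm_name = llm_name
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.stream = stream

    def __call__(self, prompt, llm_name=None, temperature=None,
                 max_tokens=None, stream=None):
        # New optional arguments override the stored defaults on the fly;
        # None means "use the value set at construction time".
        params = {
            "model": llm_name if llm_name is not None else self.llm_name,
            "temperature": temperature if temperature is not None else self.temperature,
            "max_tokens": max_tokens if max_tokens is not None else self.max_tokens,
            "stream": stream if stream is not None else self.stream,
        }
        # The real implementation would send `prompt` to Ollama here;
        # returning the resolved parameters just demonstrates the override logic.
        return params


llm = OllamaLLM()
defaults = llm("hi")                       # uses constructor defaults
overridden = llm("hi", temperature=0.1)    # per-call override
```

This keeps construction cheap and stateless while letting a single instance serve requests with different sampling settings.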