ollama/api
Bruce MacDonald 1fbf3585d6
Relay default values to llama runner (#672)
* include seed in params for llama.cpp server and remove empty filter for temp

* relay default predict options to llama.cpp

- reorganize options to match predict request for readability

* omit empty stop

---------

Co-authored-by: hallh <hallh@users.noreply.github.com>
2023-10-02 14:53:16 -04:00
client.go add show command (#474) 2023-09-06 11:04:17 -07:00
client.py DRAFT: add a simple python client to access ollama (#522) 2023-09-14 16:37:38 -07:00
types.go Relay default values to llama runner (#672) 2023-10-02 14:53:16 -04:00