llama.cpp/llama_cpp/server
nullname d634efcdd9
feat: adding rpc_servers parameter to Llama class (#1477)
* pass rpc_servers params through (wip)

* enable llama rpc by default

* convert string to bytes

* add rpc package

* Revert "enable llama rpc by default"

This reverts commit 832c6dd56c979514cec5df224bf2d2014dccd790.

* update readme

* Only set rpc_servers when provided

* Add rpc servers to server options

---------

Co-authored-by: Andrei Betlen <abetlen@gmail.com>
2024-06-04 10:38:21 -04:00
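
For context, a minimal sketch of what the new parameter looks like from the Python side. This is a hedged reading of the commit, not an example from the PR: it assumes llama.cpp was built with its RPC backend enabled and that `rpc_servers` takes a comma-separated list of `host:port` addresses (the hosts and model path below are placeholders).

```python
from llama_cpp import Llama

# rpc_servers is assumed to be a comma-separated list of RPC back-ends;
# per the "convert string to bytes" note above, the string is presumably
# encoded before being handed to the llama.cpp C API.
llm = Llama(
    model_path="./models/example.gguf",  # placeholder model path
    rpc_servers="192.168.1.10:50052,192.168.1.11:50052",  # placeholder hosts
)

out = llm("Q: What does RPC offloading do? A:", max_tokens=48)
print(out["choices"][0]["text"])
```

The reverted "enable llama rpc by default" commit suggests the RPC backend still has to be opted into at build time; a build without it would not be able to use the parameter.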
| File        | Last commit                                                                                                   | Date                       |
|-------------|---------------------------------------------------------------------------------------------------------------|----------------------------|
| __init__.py | llama_cpp server: app is now importable, still runnable as a module                                            | 2023-04-29 11:41:25 -07:00 |
| __main__.py | feat: Add support for yaml based configs                                                                       | 2024-04-10 02:47:01 -04:00 |
| app.py      | feat: add MinTokensLogitProcessor and min_tokens argument to server (#1333) (see the request sketch below)     | 2024-05-14 09:50:53 -04:00 |
| cli.py      | Fix python3.8 support                                                                                          | 2024-01-19 08:17:49 -05:00 |
| errors.py   | misc: Format                                                                                                   | 2024-02-28 14:27:40 -05:00 |
| model.py    | feat: adding rpc_servers parameter to Llama class (#1477)                                                      | 2024-06-04 10:38:21 -04:00 |
| settings.py | feat: adding rpc_servers parameter to Llama class (#1477) (see the settings sketch below)                      | 2024-06-04 10:38:21 -04:00 |
| types.py    | feat: add MinTokensLogitProcessor and min_tokens argument to server (#1333)                                    | 2024-05-14 09:50:53 -04:00 |
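
The model.py and settings.py rows carry the same #1477 commit; "Add rpc servers to server options" in the commit body suggests the server-side counterpart is a matching field on the pydantic settings class that model.py forwards to Llama. A hedged sketch, with the field name taken from the commit title and everything else (defaults, wiring) assumed:

```python
from llama_cpp.server.settings import ModelSettings

# Assumed wiring: the server builds its Llama instance from these
# settings, so rpc_servers would pass straight through to the class.
settings = ModelSettings(
    model="./models/example.gguf",     # path to the model (placeholder)
    rpc_servers="192.168.1.10:50052",  # placeholder RPC back-end
)
print(settings.rpc_servers)
```

Since the server exposes its settings fields as CLI flags, the equivalent launch would presumably be `python -m llama_cpp.server --model ./models/example.gguf --rpc_servers 192.168.1.10:50052`.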
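
Likewise for the min_tokens argument referenced by the app.py and types.py rows (#1333): a sketch of a completion request against a locally running server, assuming the default address and that min_tokens rides along in the standard request body. The `requests` dependency and the field placement are assumptions, not from the commit.

```python
import requests

# Per the commit title, a MinTokensLogitProcessor suppresses the
# end-of-sequence token until at least min_tokens tokens have been
# produced, lower-bounding the generation length.
resp = requests.post(
    "http://localhost:8000/v1/completions",  # default server address assumed
    json={
        "prompt": "Write one sentence about distributed inference.",
        "min_tokens": 16,  # field added by #1333
        "max_tokens": 64,
    },
)
print(resp.json()["choices"][0]["text"])
```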