llama.cpp/llama_cpp/server
Lucas Doyle 1e42913599 llama_cpp server: move logprobs to supported
I think this is actually supported (it's in the arguments of `Llama.__call__`, which is how the completion is invoked). Decision: mark as supported.
2023-05-01 15:38:19 -07:00
__init__.py   llama_cpp server: app is now importable, still runnable as a module   2023-04-29 11:41:25 -07:00
__main__.py   llama_cpp server: slight refactor to init_llama function              2023-04-29 11:42:23 -07:00
app.py        llama_cpp server: move logprobs to supported                          2023-05-01 15:38:19 -07:00
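The logprobs commit above reasons that `Llama.__call__` already accepts a `logprobs` argument, so the server's completion endpoint can simply pass it through. A minimal sketch of that call path, assuming llama-cpp-python is installed and a local model file exists (the model path is a placeholder, and the OpenAI-style shape of the returned dict is an assumption, not taken from this listing):

```python
# Sketch, not code from this repo: exercises Llama.__call__, the same call
# app.py's completion endpoint uses, with the logprobs parameter enabled.
from llama_cpp import Llama

# Placeholder path -- point this at a real local model file.
llm = Llama(model_path="./models/7B/ggml-model.bin")

# logprobs=5 requests the top-5 token logprobs alongside each sampled token,
# mirroring the OpenAI-style completion parameter the server exposes.
result = llm("Q: Name the planets. A:", max_tokens=16, logprobs=5)
print(result["choices"][0]["logprobs"])
```

Relatedly, the `__main__.py` entry above is what makes the server runnable as a module (e.g. `python3 -m llama_cpp.server`), while the "app is now importable" change presumably lets an ASGI server import the app object from `app.py` directly.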