a5aa6c1478
`llama.create_chat_completion` definitely has a `top_k` argument, but it's missing from `CreateChatCompletionRequest`. Decision: add it.
Affected files:

- server/
- __init__.py
- llama.py
- llama_cpp.py
- llama_types.py
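A minimal sketch of the decision above: exposing `top_k` on the server's request model so it can be forwarded to `llama.create_chat_completion`. This assumes the request model is a pydantic `BaseModel` (as in llama-cpp-python's server); the default of 40 is an assumption based on llama.cpp's usual sampling default, not confirmed by the note.

```python
from pydantic import BaseModel, Field

class CreateChatCompletionRequest(BaseModel):
    # ... existing fields (messages, temperature, top_p, ...) stay as-is.
    # New field, mirroring the `top_k` argument already accepted by
    # `llama.create_chat_completion`. Default of 40 is an assumption.
    top_k: int = Field(
        default=40,
        ge=0,
        description="Limit sampling to the k most likely tokens.",
    )

# The handler would then pass it through, e.g.:
#   llama.create_chat_completion(..., top_k=request.top_k)
req = CreateChatCompletionRequest()
print(req.top_k)
```

With the field declared, the value validates and round-trips like the other sampling parameters, so the server handler only needs to forward `request.top_k`.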