b8438f70b5
* Added support for min_p. My small contribution to this great project.
  Ref: https://github.com/ggerganov/llama.cpp/pull/3841
  Closes: https://github.com/abetlen/llama-cpp-python/issues/911
* Fix for negative temp (sample_softmax)
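As background on the feature this commit adds: min_p sampling keeps only the tokens whose probability is at least `min_p` times the probability of the most likely token, so the candidate set shrinks when the model is confident and widens when it is not. A minimal sketch of that filtering step is below; the function name `min_p_filter` is hypothetical and not the library's actual API.

```python
import math

def min_p_filter(logits, min_p=0.05):
    """Hypothetical sketch of min_p filtering (not llama-cpp-python's API):
    return the indices of tokens whose probability is at least
    min_p * probability of the most likely token."""
    # Numerically stable softmax over the logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Threshold scales with the top token's probability.
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]
```

With logits `[4.0, 3.0, 0.0]` and `min_p=0.2`, the top token has probability ~0.72, so the cutoff is ~0.14 and the first two tokens survive; raising `min_p` to 0.5 leaves only the top token.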
Files:
  server/
  __init__.py
  _utils.py
  llama.py
  llama_chat_format.py
  llama_cpp.py
  llama_grammar.py
  llama_types.py
  llava_cpp.py
  py.typed