llama.cpp/llama_cpp/server
Latest commit: b8438f70b5 (TK-Master)
Added support for min_p (#921)
* Added support for min_p

My small contribution to this great project.

Ref: https://github.com/ggerganov/llama.cpp/pull/3841

Closes: https://github.com/abetlen/llama-cpp-python/issues/911

* Fix for negative temp (sample_softmax)
2023-11-20 23:21:33 -05:00
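For context, min_p sampling (the feature referenced in PR #3841 above) keeps only those tokens whose probability is at least `min_p` times the probability of the most likely token, then samples from what remains. A minimal illustrative sketch in plain Python follows; the function name `min_p_filter` is hypothetical and this is not the actual llama.cpp implementation:

```python
import math

def min_p_filter(logits, min_p=0.05):
    # Illustrative sketch of min_p sampling, not llama.cpp's code:
    # keep tokens whose probability >= min_p * (top token's probability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    threshold = min_p * max(probs)
    # return the indices of tokens that survive the filter
    return [i for i, p in enumerate(probs) if p >= threshold]
```

With a higher `min_p` the filter is stricter, so fewer low-probability tokens survive; `min_p=0` disables the filter entirely.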
__init__.py - llama_cpp server: app is now importable, still runnable as a module - 2023-04-29 11:41:25 -07:00
__main__.py - Remove confusing helpstring from server cli args. Closes #719 - 2023-09-15 14:09:43 -04:00
app.py - Added support for min_p (#921) - 2023-11-20 23:21:33 -05:00