llama.cpp/llama_cpp
Latest commit aa9f1ae011 by windspirit95, 2024-03-31 13:30:13 -04:00

feat: Add logprobs support to chat completions (#1311)

* Add logprobs return in ChatCompletionResponse
* Fix duplicate field
* Set default to false
* Simplify check
* Add server example

Co-authored-by: Andrei Betlen <abetlen@gmail.com>
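A minimal usage sketch of the feature this commit describes, assuming the new `logprobs`/`top_logprobs` parameters on `create_chat_completion` mirror the OpenAI chat API (as the commit bullets suggest); the model path is hypothetical:

```python
from llama_cpp import Llama

# Hypothetical local model path; any chat-capable GGUF model would do.
llm = Llama(model_path="./models/model.gguf", verbose=False)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in one word."}],
    logprobs=True,    # per this commit, defaults to False
    top_logprobs=3,   # assumed OpenAI-style: alternatives returned per token
    max_tokens=8,
)

# The commit adds a logprobs field to each choice in ChatCompletionResponse.
print(response["choices"][0]["logprobs"])
```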
File                  Date                        Last commit message
server                2024-03-31 13:30:13 -04:00  feat: Add logprobs support to chat completions (#1311)
__init__.py           2024-03-18 11:37:30 -04:00  chore: Bump version
_internals.py         2024-02-28 14:37:07 -05:00  fix: Remove deprecated cfg sampling functions
_logger.py            2024-02-05 21:52:12 -05:00  fix: Use llama_log_callback to avoid suppress_stdout_stderr
_utils.py             2024-02-02 12:18:55 -05:00  Revert "Fix: fileno error google colab (#729) (#1156)" (#1157)
llama.py              2024-03-31 13:30:13 -04:00  feat: Add logprobs support to chat completions (#1311)
llama_cache.py        2024-01-17 09:09:12 -05:00  Move cache classes to llama_cache submodule.
llama_chat_format.py  2024-03-31 13:30:13 -04:00  feat: Add logprobs support to chat completions (#1311)
llama_cpp.py          2024-03-28 12:06:46 -04:00  feat: Update llama.cpp
llama_grammar.py      2024-03-08 21:10:53 -05:00  Fixed json strings grammar by blacklisting character control set. Closes #1259
llama_speculative.py  2024-01-31 14:08:14 -05:00  Add speculative decoding (#1120)
llama_tokenizer.py    2024-02-23 12:23:24 -05:00  fix: LlamaHFTokenizer now receives pre_tokens
llama_types.py        2024-03-31 13:30:13 -04:00  feat: Add logprobs support to chat completions (#1311)
llava_cpp.py          2024-02-26 11:07:33 -05:00  misc: llava_cpp use ctypes function decorator for binding
py.typed              2023-08-11 09:58:48 +02:00  Add py.typed