llama.cpp/llama_cpp
Latest commit cfb7da98ed by anil, 2024-01-16 12:52:52 -05:00:
Support Accept text/event-stream in chat and completion endpoints, resolves #1083 (#1088)
Co-authored-by: Anil Pathak <anil@heyday.com>
Co-authored-by: Andrei Betlen <abetlen@gmail.com>
(A client sketch for this streaming feature follows the file listing below.)
Name                 | Last commit                                                                                | Date
server/              | Support Accept text/event-stream in chat and completion endpoints, resolves #1083 (#1088) | 2024-01-16 12:52:52 -05:00
__init__.py          | Bump version                                                                               | 2024-01-15 12:54:51 -05:00
_utils.py            | Avoid "LookupError: unknown encoding: ascii" when open() called in a destructor (#1012)    | 2024-01-15 10:52:10 -05:00
llama.py             | Add split_mode option. Closes #1085                                                        | 2024-01-15 12:49:20 -05:00
llama_chat_format.py | Add Saiga chat format. (#1050)                                                             | 2024-01-04 18:12:58 -05:00
llama_cpp.py         | Update llama.cpp                                                                           | 2024-01-15 10:12:10 -05:00
llama_grammar.py     | Fix Pydantic model parsing (#1087)                                                         | 2024-01-15 10:45:57 -05:00
llama_types.py       | Add missing tool_calls finish_reason                                                       | 2023-11-10 02:51:06 -05:00
llava_cpp.py         | Make building llava optional                                                               | 2023-11-28 04:55:21 -05:00
py.typed             | Add py.typed                                                                               | 2023-08-11 09:58:48 +02:00
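The latest commit lets clients request Server-Sent Events from the chat and completion endpoints via the Accept: text/event-stream header. Below is a minimal client sketch, assuming the OpenAI-compatible server (`python -m llama_cpp.server`) is running locally; the URL, port, payload, and the use of the third-party `requests` library are illustrative assumptions, not something this listing specifies.

```python
import json

import requests  # third-party HTTP client, used here only for illustration

# Assumed local address; the llama_cpp.server app listens on port 8000 by default.
URL = "http://localhost:8000/v1/chat/completions"

headers = {
    "Content-Type": "application/json",
    # Ask for an SSE stream via the Accept header (the behavior this commit addresses).
    "Accept": "text/event-stream",
}
payload = {
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": True,
}

with requests.post(URL, headers=headers, json=payload, stream=True) as resp:
    resp.raise_for_status()
    for raw in resp.iter_lines():
        if not raw:
            continue  # skip SSE keep-alive blank lines
        line = raw.decode("utf-8")
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data == "[DONE]":  # end-of-stream sentinel used by OpenAI-style SSE
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"].get("content") or ""
        print(delta, end="", flush=True)
print()
```

The same Accept header can be sent to the /v1/completions endpoint; each SSE "data:" event carries one JSON chunk of the incremental response.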