| Name | Last commit message | Last commit date |
| --- | --- | --- |
| server | feat: Update llama.cpp | 2024-02-17 00:37:51 -05:00 |
| __init__.py | chore: Bump version | 2024-02-15 23:10:50 -05:00 |
| _internals.py | feat: Support batch embeddings (#1186) | 2024-02-14 04:26:09 -05:00 |
| llama.py | fix: self.numa missing | 2024-02-17 01:02:33 -05:00 |
| llama_cpp.py | feat: Update llama.cpp | 2024-02-17 00:37:51 -05:00 |
| llama_speculative.py | Add speculative decoding (#1120) | 2024-01-31 14:08:14 -05:00 |
| llava_cpp.py | Update llama.cpp | 2024-02-14 03:47:21 -05:00 |
| py.typed | Add py.typed | 2023-08-11 09:58:48 +02:00 |