| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Andrei Betlen | 446d5f5649 | Add metal ci test | 2023-11-01 21:15:01 -04:00 |
| Andrei Betlen | c89eadafbf | Update CHANGELOG | 2023-11-01 19:40:04 -04:00 |
| Andrei Betlen | 6b3aa7fc8f | Bump version | 2023-11-01 19:25:03 -04:00 |
| NickAlgra | 3fbcded7cd | Add missing n_seq_id to llama_batch (#842) | 2023-11-01 18:56:29 -04:00 |
| Sujeendran Menon | 7b136bb5b1 | Fix for shared library not found and compile issues in Windows (#848): fix windows library dll name issue; Updated README.md Windows instructions; Update llama_cpp.py to handle different windows dll file versions | 2023-11-01 18:55:57 -04:00 |
| cebtenzzre | eefd76fe81 | llama: fix exception in `Llama.__del__` (#846) | 2023-11-01 18:53:57 -04:00 |
| David Ponce | 3fc9147218 | Iterate over tokens that should be biased rather than the entire vocabulary. (#851) | 2023-11-01 18:53:47 -04:00 |
| Marko Tasic | 9c8f4dca5f | fixed `Llama._create_completion` suffix check, it can be either None or str instance (#854) | 2023-11-01 18:52:50 -04:00 |
| Daniel Thuerck | 5f8f369d1b | Pass-Through grammar parameter in web server. (#855) Closes #778 | 2023-11-01 18:51:12 -04:00 |
| Adam Katora | 25cb710281 | Update llama_types.py (#849): Minor typo fix, funcion -> function | 2023-11-01 18:50:11 -04:00 |
| Andrei Betlen | bdf5254658 | Update llama.cpp | 2023-11-01 14:15:56 -04:00 |
| Andrei Betlen | d808fd436c | Update llama.cpp | 2023-10-31 21:29:35 -04:00 |
| Andrei Betlen | 53861c9e53 | Update llama.cpp | 2023-10-24 03:13:32 -04:00 |
| Andrei Betlen | acf50f179a | Update llama.cpp | 2023-10-20 11:17:31 -04:00 |
| Andrei Betlen | 5a045fcbbc | Update llama.cpp | 2023-10-19 17:37:07 -04:00 |
| Andrei Betlen | ef03d77b59 | Enable finish reason tests | 2023-10-19 02:56:45 -04:00 |
| gmcgoldr | 09a8406c83 | Fix streaming doesn't return finish reason (#798). When streaming the yield that contains the finish can be skipped. This change ensures that yield isn't skipped. | 2023-10-19 02:55:56 -04:00 |
| Andrei Betlen | 28c2b884e2 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-10-19 02:55:31 -04:00 |
| Andrei Betlen | cbeef36510 | Re-enable tests completion function | 2023-10-19 02:55:29 -04:00 |
| Andrei Betlen | ff580031d2 | Update llama.cpp | 2023-10-19 02:55:08 -04:00 |
| Xiaoyu Kevin Hu | a315128d66 | update value check for n_gpu_layers field (#826) | 2023-10-18 18:25:25 -04:00 |
| Andrei Betlen | d989ac86e6 | Update llama.cpp | 2023-10-15 15:12:57 -04:00 |
| Pierre Alexandre SCHEMBRI | 10304d75fc | Make use of suppress_stdout_stderr when freeing model (#803) | 2023-10-15 13:52:43 -04:00 |
| Ma, Guokai | a1ac199980 | Fix repeat greeting (#808): fix repeated greeting; remove seperator between role and message | 2023-10-15 13:52:21 -04:00 |
| Eric Liu | b50166500e | Add validation for tensor_split size exceeding LLAMA_MAX_DEVICES (#820): reword | 2023-10-15 13:51:51 -04:00 |
| Andrei Betlen | f30aa20126 | Update llama.cpp | 2023-10-12 02:24:50 -04:00 |
| Andrei Betlen | 622bff19b2 | Update llama.cpp | 2023-10-10 19:23:35 -04:00 |
| Andrei Betlen | d6a130a052 | Print traceback on server error | 2023-10-10 15:56:04 -04:00 |
| Andrei Betlen | 43dfe1e2ab | Update llama.cpp | 2023-10-05 16:07:49 -04:00 |
| Andrei Betlen | 2c0456acf0 | Update llama.cpp | 2023-10-04 20:19:31 -04:00 |
| Andrei Betlen | c305be6db6 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-10-03 15:23:37 -04:00 |
| Andrei Betlen | a7d17b8ac9 | Update llama.cpp | 2023-10-03 15:23:35 -04:00 |
| ccshen | b76724cddc | Update instruction to download GGUF model (#783). Co-authored-by: john.shen <john.shen@bioclinica.com> | 2023-10-02 11:46:47 -04:00 |
| Andrei Betlen | 305482bd41 | Add chatml chat format | 2023-09-30 21:01:34 -04:00 |
| Andrei Betlen | 5ef5280ef9 | Log server exceptions to stdout | 2023-09-30 19:13:36 -04:00 |
| Andrei Betlen | f0af1c7201 | Update llama.cpp | 2023-09-30 19:09:50 -04:00 |
| Andrei Betlen | fab4bccc35 | Bump version | 2023-09-30 16:04:46 -04:00 |
| Andrei Betlen | d696251fbe | Fix logits_all bug | 2023-09-30 16:02:35 -04:00 |
| Andrei Betlen | 6ee413d79e | Bump version | 2023-09-30 13:23:09 -04:00 |
| Andrei Betlen | 42bb721d64 | Fix bug in embedding | 2023-09-30 13:20:22 -04:00 |
| Andrei Betlen | bca965325d | Update CHANGELOG | 2023-09-30 00:08:45 -04:00 |
| Andrei Betlen | 5d62d55a82 | Bump version | 2023-09-30 00:07:06 -04:00 |
| Andrei Betlen | ac853e01e1 | Include git directories | 2023-09-30 00:01:14 -04:00 |
| Andrei Betlen | 9e76613629 | Remove git repo exclude | 2023-09-29 23:28:59 -04:00 |
| Andrei Betlen | b4939c2d99 | Revert BUILD_NUMBER fix | 2023-09-29 23:28:45 -04:00 |
| Andrei Betlen | 541aaff45e | Quote fix attempt #2 | 2023-09-29 23:05:26 -04:00 |
| Andrei Betlen | 39e5feb138 | Fix quote issue | 2023-09-29 23:01:38 -04:00 |
| Andrei Betlen | 3c6e98f945 | Use dev versioning for test pypi | 2023-09-29 22:57:49 -04:00 |
| Andrei Betlen | 1cca20304b | Revert update to publish test pypi | 2023-09-29 22:48:17 -04:00 |
| Andrei Betlen | 85e4d08a2e | Update publish to test pypi workflow | 2023-09-29 22:32:31 -04:00 |