Author | Commit | Message | Date
Andrei Betlen | f72b6e9b73 | Update llama.cpp | 2023-07-15 15:01:08 -04:00
Andrei Betlen | e6c67c8f7d | Update llama.cpp | 2023-07-14 16:40:31 -04:00
Andrei Betlen | 896ab7b88a | Update llama.cpp | 2023-07-13 23:24:55 -04:00
Andrei Betlen | 98ae4e58a3 | Update llama.cpp | 2023-07-06 17:57:56 -04:00
Andrei Betlen | b994296c75 | Update llama.cpp | 2023-07-05 01:00:14 -04:00
Andrei Betlen | c67f786360 | Update llama.cpp | 2023-06-29 01:08:15 -04:00
Andrei Betlen | 952228407e | Update llama.cpp | 2023-06-26 08:50:38 -04:00
Andrei Betlen | e37798777e | Update llama.cpp | 2023-06-20 11:25:10 -04:00
Andrei Betlen | d7153abcf8 | Update llama.cpp | 2023-06-16 23:11:14 -04:00
Andrei Betlen | 715f98c591 | Update llama.cpp | 2023-06-14 21:40:13 -04:00
Andrei Betlen | 6639371407 | Update llama.cpp | 2023-06-10 12:17:38 -04:00
Andrei Betlen | 607d217caa | Allow both .so and .dylib extensions for macos | 2023-06-08 00:27:19 -04:00
Andrei Betlen | aad4b17f52 | Update llama.cpp | 2023-06-06 16:23:55 -04:00
Andrei Betlen | 7b57420ea9 | Update llama.cpp | 2023-06-05 18:17:29 -04:00
Andrei Betlen | fafe47114c | Update llama.cpp | 2023-05-21 17:47:21 -04:00
Andrei Betlen | 01a010be52 | Fix llama_cpp and Llama type signatures. Closes #221 | 2023-05-19 11:59:33 -04:00
Andrei Betlen | 61d58e7b35 | Check for CUDA_PATH before adding | 2023-05-17 15:26:38 -04:00
Aneesh Joy | e9794f91f2 | Fixed CUBLAS DLL load issue on Windows | 2023-05-17 18:04:58 +01:00
Andrei Betlen | cbac19bf24 | Add winmode arg only on windows if python version supports it | 2023-05-15 09:15:01 -04:00
Andrei Betlen | c804efe3f0 | Fix obscure Windows DLL issue. Closes #208 | 2023-05-14 22:08:11 -04:00
Andrei Betlen | cdf59768f5 | Update llama.cpp | 2023-05-14 00:04:22 -04:00
Andrei Betlen | 7a536e86c2 | Allow model to tokenize strings longer than context length and set add_bos. Closes #92 | 2023-05-12 14:28:22 -04:00
Andrei Betlen | 8dfde63255 | Fix return type | 2023-05-07 19:30:14 -04:00
Andrei Betlen | 3fbda71790 | Fix mlock_supported and mmap_supported return type | 2023-05-07 03:04:22 -04:00
Andrei Betlen | 7c3743fe5f | Update llama.cpp | 2023-05-07 00:12:47 -04:00
Andrei Betlen | b5f3e74627 | Add return type annotations for embeddings and logits | 2023-05-05 14:22:55 -04:00
Andrei Betlen | 3e28e0e50c | Fix: runtime type errors | 2023-05-05 14:12:26 -04:00
Andrei Betlen | e24c3d7447 | Prefer explicit imports | 2023-05-05 14:05:31 -04:00
Andrei Betlen | 40501435c1 | Fix: types | 2023-05-05 14:04:12 -04:00
Andrei Betlen | 6702d2abfd | Fix candidates type | 2023-05-05 14:00:30 -04:00
Andrei Betlen | 5e7ddfc3d6 | Fix llama_cpp types | 2023-05-05 13:54:22 -04:00
Andrei Betlen | b6a9a0b6ba | Add types for all low-level api functions | 2023-05-05 12:22:27 -04:00
Andrei Betlen | 1d47cce222 | Update llama.cpp | 2023-05-03 09:33:30 -04:00
Matt Hoffner | f97ff3c5bb | Update llama_cpp.py | 2023-05-01 20:40:06 -07:00
Andrei Betlen | 350a1769e1 | Update sampling api | 2023-05-01 14:47:55 -04:00
Andrei Betlen | 7837c3fdc7 | Fix return types and import comments | 2023-05-01 14:02:06 -04:00
Andrei Betlen | 80184a286c | Update llama.cpp | 2023-05-01 10:44:28 -04:00
Andrei Betlen | ea0faabae1 | Update llama.cpp | 2023-04-28 15:32:43 -04:00
Andrei Betlen | 9339929f56 | Update llama.cpp | 2023-04-26 20:00:54 -04:00
Andrei Betlen | cbd26fdcc1 | Update llama.cpp | 2023-04-25 19:03:41 -04:00
Andrei Betlen | 02cf881317 | Update llama.cpp | 2023-04-24 09:30:10 -04:00
Andrei Betlen | e99caedbbd | Update llama.cpp | 2023-04-22 19:50:28 -04:00
Andrei Betlen | 1eb130a6b2 | Update llama.cpp | 2023-04-21 17:40:27 -04:00
Andrei Betlen | 95c0dc134e | Update type signature to allow for null pointer to be passed. | 2023-04-18 23:44:46 -04:00
Andrei Betlen | 35abf89552 | Add bindings for LoRA adapters. Closes #88 | 2023-04-18 01:30:04 -04:00
Andrei Betlen | 005c78d26c | Update llama.cpp | 2023-04-12 14:29:00 -04:00
Andrei Betlen | 9f1e565594 | Update llama.cpp | 2023-04-11 11:59:03 -04:00
Mug | 2559e5af9b | Changed the environment variable name into "LLAMA_CPP_LIB" | 2023-04-10 17:27:17 +02:00
Mug | ee71ce8ab7 | Make windows users happy (hopefully) | 2023-04-10 17:12:25 +02:00
Mug | cf339c9b3c | Better custom library debugging | 2023-04-10 17:06:58 +02:00