Andrei Betlen | 401309d11c | Revert "Merge pull request #521 from bretello/main" | 2023-07-24 13:11:10 -04:00
    This reverts commit 07f0f3a386, reversing changes made to d8a3ddbb1c.
Andrei | 07f0f3a386 | Merge pull request #521 from bretello/main | 2023-07-24 13:09:28 -04:00
    raise exception when `llama_load_model_from_file` fails
Andrei Betlen | d8a3ddbb1c | Update llama.cpp | 2023-07-24 13:08:06 -04:00
Andrei Betlen | 985d559971 | Update llama.cpp | 2023-07-24 13:04:34 -04:00
bretello | 8be7d67f7e | raise exception when llama_load_model_from_file fails | 2023-07-24 14:42:37 +02:00
Andrei Betlen | 231123ee1e | Update llama.cpp | 2023-07-21 12:41:59 -04:00
Andrei Betlen | b83728ad1e | Update llama.cpp | 2023-07-21 12:33:27 -04:00
Andrei Betlen | a4fe3fe350 | Bump version | 2023-07-20 18:56:29 -04:00
Andrei Betlen | 01435da740 | Update llama.cpp | 2023-07-20 18:54:25 -04:00
Andrei Betlen | 28a111704b | Fix compatibility with older python versions | 2023-07-20 18:52:10 -04:00
Andrei Betlen | d10ce62714 | Revert ctypes argtype change | 2023-07-20 18:51:53 -04:00
Andrei | 365d9a4367 | Merge pull request #481 from c0sogi/main | 2023-07-20 17:41:42 -04:00
    Added `RouteErrorHandler` for server
Andrei | a9cb645495 | Merge pull request #511 from viniciusarruda/patch-1 | 2023-07-20 17:40:39 -04:00
    Update llama_cpp.py - Fix c_char_p to Array[c_char_p] and c_float to …
Vinicius | a8551477f5 | Update llama_cpp.py - Fix c_char_p to Array[c_char_p] and c_float to Array[c_float] | 2023-07-20 17:29:11 -03:00
Andrei | 5549a1cabd | Merge pull request #508 from ctejada85/main | 2023-07-20 16:07:54 -04:00
    Now the last token sent when `stream=True`
Carlos Tejada | 0756a2d3fb | Now the last token sent when stream=True | 2023-07-19 22:47:14 -04:00
Andrei | 36872620d0 | Merge pull request #501 from a10y/patch-1 | 2023-07-18 22:26:42 -04:00
    Update install instructions for Linux OpenBLAS
Andrew Duffy | b6b2071180 | Update install instructions for Linux OpenBLAS | 2023-07-18 22:22:33 -04:00
    The instructions are different than they used to be.
    Source: https://github.com/ggerganov/llama.cpp#openblas
Andrei | a05cfaf815 | Merge pull request #498 from ctejada85/windows-pip-install | 2023-07-18 18:53:24 -04:00
    Added info to set ENV variables in PowerShell
Carlos Tejada | b24b10effd | Added info to set ENV variables in PowerShell | 2023-07-18 17:14:42 -04:00
    - Added an example on how to set the variables `CMAKE_ARGS` and `FORCE_CMAKE`.
    - Added a subtitle for the `Windows remarks` and `MacOS` remarks.
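As a sketch of what b24b10effd documents: on Windows, CMake options for the llama-cpp-python build are passed through the `CMAKE_ARGS` and `FORCE_CMAKE` environment variables before running pip. The exact BLAS flags below are an assumption drawn from llama.cpp's build options, not a quote from the commit:

```powershell
# Set the build variables in the current PowerShell session, then install.
# The -DLLAMA_BLAS flags are an assumed example; use the options your backend needs.
$env:CMAKE_ARGS = "-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS"
$env:FORCE_CMAKE = "1"
pip install llama-cpp-python --no-cache-dir
```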
Andrei Betlen | c9985abc03 | Bump version | 2023-07-18 13:54:51 -04:00
Andrei Betlen | 9127bc2777 | Update llama.cpp | 2023-07-18 13:54:42 -04:00
Andrei | 071ac799d5 | Merge pull request #485 from callMeMakerRen/main | 2023-07-18 12:30:21 -04:00
    expose RoPE param to server start
shutup | 5ed8bf132f | expose RoPE param to server start | 2023-07-18 16:34:36 +08:00
c0sogi | 1551ba10bd | Added RouteErrorHandler for server | 2023-07-16 14:57:39 +09:00
Andrei Betlen | 6d8892fe64 | Bump version | 2023-07-15 17:13:55 -04:00
Andrei Betlen | 8ab098e49d | Re-order Llama class params | 2023-07-15 15:35:08 -04:00
Andrei Betlen | e4f9db37db | Fix context_params struct layout | 2023-07-15 15:34:55 -04:00
Andrei Betlen | bdf32df255 | Add additional direnv directory to gitignore | 2023-07-15 15:34:32 -04:00
Andrei Betlen | d0572f4fca | Merge branch 'custom_rope' into main | 2023-07-15 15:11:43 -04:00
Andrei Betlen | f0797a6054 | Merge branch main into custom_rope | 2023-07-15 15:11:01 -04:00
Andrei Betlen | f72b6e9b73 | Update llama.cpp | 2023-07-15 15:01:08 -04:00
Andrei | 15e0e0a937 | Merge pull request #390 from SubhranshuSharma/main | 2023-07-14 16:53:23 -04:00
    added termux with root instructions
Andrei Betlen | 118b7f6d5c | fix: tensor_split should be optional list | 2023-07-14 16:52:48 -04:00
Andrei Betlen | 25b3494e11 | Minor fix to tensor_split parameter | 2023-07-14 16:40:53 -04:00
Andrei Betlen | e6c67c8f7d | Update llama.cpp | 2023-07-14 16:40:31 -04:00
Andrei | 82b11c8c16 | Merge pull request #460 from shouyiwang/tensor_split | 2023-07-14 16:33:54 -04:00
    Add support for llama.cpp's --tensor-split parameter
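The `--tensor-split` support merged in #460 takes a list of per-GPU proportions. A minimal sketch of the idea, with a hypothetical helper rather than code from the library:

```python
# Sketch (not the library's actual code): how a proportions list like
# llama.cpp's --tensor-split might apportion n_layers across GPUs.
def split_layers(n_layers: int, tensor_split: list) -> list:
    total = sum(tensor_split)
    counts = [round(n_layers * s / total) for s in tensor_split]
    counts[-1] = n_layers - sum(counts[:-1])  # absorb rounding error on the last GPU
    return counts

print(split_layers(40, [3, 1]))  # -> [30, 10]
```

The proportions need not sum to 1; only their ratios matter, which matches how the CLI flag is typically given (e.g. `--tensor-split 3,1`).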
Shouyi Wang | 579f526246 | Resolve merge conflicts | 2023-07-14 14:37:01 +10:00
Andrei Betlen | 6705f9b6c6 | Bump version | 2023-07-13 23:32:06 -04:00
Andrei Betlen | de4cc5a233 | bugfix: pydantic v2 fields | 2023-07-13 23:25:12 -04:00
Andrei Betlen | 896ab7b88a | Update llama.cpp | 2023-07-13 23:24:55 -04:00
Andrei Betlen | 7bb0024cd0 | Fix uvicorn dependency | 2023-07-12 19:31:43 -04:00
randoentity | 3f8f276f9f | Add bindings for custom_rope | 2023-07-10 17:37:46 +02:00
Andrei Betlen | f6c9d17f6b | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-07-09 18:20:06 -04:00
Andrei Betlen | 8e0f6253db | Bump version | 2023-07-09 18:20:04 -04:00
Andrei Betlen | c988c2ac0b | Bump version | 2023-07-09 18:19:37 -04:00
Andrei Betlen | df3d545938 | Update changelog | 2023-07-09 18:13:41 -04:00
Andrei Betlen | a86bfdf0a5 | bugfix: truncate completion max_tokens to fit context length by default | 2023-07-09 18:13:29 -04:00
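The fix in a86bfdf0a5 can be illustrated with a small sketch (hypothetical helper, not the project's actual code): by default, a requested `max_tokens` is truncated so that the prompt plus the completion fits inside the context window.

```python
# Sketch: clamp the requested completion length so that
# prompt_len + max_tokens never exceeds the context size n_ctx.
def clamp_max_tokens(n_ctx: int, prompt_len: int, max_tokens: int) -> int:
    return max(0, min(max_tokens, n_ctx - prompt_len))

print(clamp_max_tokens(2048, 2000, 256))  # -> 48
```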
Andrei Betlen | 6f70cc4b7d | bugfix: pydantic settings missing / changed fields | 2023-07-09 18:03:31 -04:00
Andrei Betlen | 0f3c474a49 | Bump version | 2023-07-09 11:44:29 -04:00