Commit graph

1668 commits

Author / SHA1 / Message / Date
Vinicius
a8551477f5 Update llama_cpp.py - Fix c_char_p to Array[c_char_p] and c_float to Array[c_float] 2023-07-20 17:29:11 -03:00
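The fix above concerns ctypes argument declarations. A minimal standalone sketch of the c_char_p versus Array[c_char_p] (and c_float versus Array[c_float]) distinction, not the actual llama_cpp.py declarations:

```python
import ctypes

# A single C string maps to c_char_p.
single = ctypes.c_char_p(b"hello")

# A list of C strings needs an array type built from c_char_p.
strings = [b"alpha", b"beta", b"gamma"]
CharPArray = ctypes.c_char_p * len(strings)   # Array[c_char_p]
char_array = CharPArray(*strings)

# Same idea for floats: c_float vs. Array[c_float].
values = [0.1, 0.2, 0.7]
FloatArray = ctypes.c_float * len(values)     # Array[c_float]
float_array = FloatArray(*values)
```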
Andrei
5549a1cabd Merge pull request #508 from ctejada85/main 2023-07-20 16:07:54 -04:00
    Now the last token sent when `stream=True`
Carlos Tejada
0756a2d3fb Now the last token sent when stream=True 2023-07-19 22:47:14 -04:00
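For context, a minimal usage sketch of streaming with llama-cpp-python; the model path is a placeholder, and with `stream=True` the call yields completion chunks one at a time, the last of which is what this fix addresses:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/model.bin")  # placeholder model path

# With stream=True the call returns an iterator of chunks instead of a
# single response object.
for chunk in llm("Q: Name the planets in the solar system. A: ",
                 max_tokens=64, stream=True):
    print(chunk["choices"][0]["text"], end="", flush=True)
print()
```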
Andrei Betlen
0b121a7456 Format 2023-07-19 03:48:27 -04:00
Andrei Betlen
b43917c144 Add functions parameters 2023-07-19 03:48:20 -04:00
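A hedged sketch of passing OpenAI-style function definitions to the chat completion API; the `get_weather` schema and model path are purely illustrative, and exact behavior depends on the library version:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/model.bin")  # placeholder model path

# Illustrative only: an OpenAI-style function definition handed to the chat
# completion endpoint; "get_weather" is a made-up example schema.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    functions=[{
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }],
)
print(response["choices"][0]["message"])
```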
Andrei
36872620d0 Merge pull request #501 from a10y/patch-1 2023-07-18 22:26:42 -04:00
    Update install instructions for Linux OpenBLAS
Andrew Duffy
b6b2071180 Update install instructions for Linux OpenBLAS 2023-07-18 22:22:33 -04:00
    The instructions are different than they used to be.
    Source: https://github.com/ggerganov/llama.cpp#openblas
Andrei Betlen
57db1f9570 Update development docs for scikit-build-core. Closes #490 2023-07-18 20:26:25 -04:00
Andrei Betlen
d2c5afe5a3 Remove prerelease python version 2023-07-18 19:38:51 -04:00
Andrei Betlen
7ce6cdf45b Update supported python versions. 2023-07-18 19:37:52 -04:00
Andrei Betlen
792b981119 Fix numpy dependency 2023-07-18 19:30:06 -04:00
Andrei Betlen
19ba9d3845 Use numpy arrays for logits_processors and stopping_criteria. Closes #491 2023-07-18 19:27:41 -04:00
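A rough sketch of the callback shapes implied by this change, assuming a logits processor receives the input token ids and next-token logits as numpy arrays and returns the adjusted logits, and a stopping criterion returns a bool:

```python
import numpy as np

# Assumed logits-processor shape: (input_ids, scores) -> scores, numpy arrays.
def ban_token(input_ids: np.ndarray, scores: np.ndarray) -> np.ndarray:
    banned_token_id = 123          # hypothetical token id to suppress
    scores[banned_token_id] = -np.inf
    return scores

# Assumed stopping-criterion shape: (input_ids, scores) -> bool.
def stop_after_200_tokens(input_ids: np.ndarray, scores: np.ndarray) -> bool:
    return input_ids.size >= 200
```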
Andrei
5eab1db0d0 Merge branch 'main' into v0.2-wip 2023-07-18 18:54:27 -04:00
Andrei
a05cfaf815 Merge pull request #498 from ctejada85/windows-pip-install 2023-07-18 18:53:24 -04:00
    Added info to set ENV variables in PowerShell
Andrei Betlen
6cb77a20c6 Migrate to scikit-build-core. Closes #489 2023-07-18 18:52:29 -04:00
Carlos Tejada
b24b10effd Added info to set ENV variables in PowerShell 2023-07-18 17:14:42 -04:00
    - Added an example on how to set the variables `CMAKE_ARGS` and `FORCE_CMAKE`.
    - Added a subtitle for the `Windows remarks` and `MacOS` remarks.
Andrei Betlen
c9985abc03 Bump version 2023-07-18 13:54:51 -04:00
Andrei Betlen
9127bc2777 Update llama.cpp 2023-07-18 13:54:42 -04:00
Andrei
071ac799d5 Merge pull request #485 from callMeMakerRen/main 2023-07-18 12:30:21 -04:00
    expose RoPE param to server start
shutup
5ed8bf132f expose RoPE param to server start 2023-07-18 16:34:36 +08:00
c0sogi
1551ba10bd Added RouteErrorHandler for server 2023-07-16 14:57:39 +09:00
Andrei Betlen
6d8892fe64 Bump version 2023-07-15 17:13:55 -04:00
Andrei Betlen
8ab098e49d Re-order Llama class params 2023-07-15 15:35:08 -04:00
Andrei Betlen
e4f9db37db Fix context_params struct layout 2023-07-15 15:34:55 -04:00
Andrei Betlen
bdf32df255 Add additional direnv directory to gitignore 2023-07-15 15:34:32 -04:00
Andrei Betlen
d0572f4fca Merge branch 'custom_rope' into main 2023-07-15 15:11:43 -04:00
Andrei Betlen
f0797a6054 Merge branch main into custom_rope 2023-07-15 15:11:01 -04:00
Andrei Betlen
f72b6e9b73 Update llama.cpp 2023-07-15 15:01:08 -04:00
Andrei
15e0e0a937 Merge pull request #390 from SubhranshuSharma/main 2023-07-14 16:53:23 -04:00
    added termux with root instructions
Andrei Betlen
118b7f6d5c fix: tensor_split should be optional list 2023-07-14 16:52:48 -04:00
Andrei Betlen
25b3494e11 Minor fix to tensor_split parameter 2023-07-14 16:40:53 -04:00
Andrei Betlen
e6c67c8f7d Update llama.cpp 2023-07-14 16:40:31 -04:00
Andrei
82b11c8c16 Merge pull request #460 from shouyiwang/tensor_split 2023-07-14 16:33:54 -04:00
    Add support for llama.cpp's --tensor-split parameter
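A minimal usage sketch of the new parameter, which mirrors llama.cpp's `--tensor-split`; the model path, layer count, and split ratios are placeholders:

```python
from llama_cpp import Llama

# tensor_split mirrors llama.cpp's --tensor-split: one proportion per GPU.
llm = Llama(
    model_path="./models/model.bin",  # placeholder path
    n_gpu_layers=35,                  # offload layers to the GPUs
    tensor_split=[0.6, 0.4],          # ~60% of tensors on GPU 0, ~40% on GPU 1
)
```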
Shouyi Wang
579f526246 Resolve merge conflicts 2023-07-14 14:37:01 +10:00
Andrei Betlen
6705f9b6c6 Bump version 2023-07-13 23:32:06 -04:00
Andrei Betlen
de4cc5a233 bugfix: pydantic v2 fields 2023-07-13 23:25:12 -04:00
Andrei Betlen
896ab7b88a Update llama.cpp 2023-07-13 23:24:55 -04:00
Andrei Betlen
7bb0024cd0 Fix uvicorn dependency 2023-07-12 19:31:43 -04:00
randoentity
3f8f276f9f Add bindings for custom_rope 2023-07-10 17:37:46 +02:00
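A sketch of the custom RoPE settings these bindings expose, assuming the parameter names follow llama.cpp (`rope_freq_base` and `rope_freq_scale`); the values are placeholders:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./models/model.bin",  # placeholder path
    rope_freq_base=10000.0,           # standard RoPE frequency base
    rope_freq_scale=0.5,              # linear scaling; 0.5 roughly doubles usable context
)
```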
Andrei Betlen
f6c9d17f6b Merge branch 'main' of github.com:abetlen/llama_cpp_python into main 2023-07-09 18:20:06 -04:00
Andrei Betlen
8e0f6253db Bump version 2023-07-09 18:20:04 -04:00
Andrei Betlen
c988c2ac0b Bump version 2023-07-09 18:19:37 -04:00
Andrei Betlen
df3d545938 Update changelog 2023-07-09 18:13:41 -04:00
Andrei Betlen
a86bfdf0a5 bugfix: truncate completion max_tokens to fit context length by default 2023-07-09 18:13:29 -04:00
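The described default amounts to clamping `max_tokens` to whatever context remains after the prompt. An illustrative sketch, not the library's actual code:

```python
# Illustrative only: clamp max_tokens so prompt + completion fits in n_ctx.
def clamp_max_tokens(max_tokens: int, prompt_tokens: int, n_ctx: int) -> int:
    return max(0, min(max_tokens, n_ctx - prompt_tokens))

# A 2048-token context with a 1900-token prompt leaves room for 148 tokens.
assert clamp_max_tokens(256, 1900, 2048) == 148
```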
Andrei Betlen
6f70cc4b7d bugfix: pydantic settings missing / changed fields 2023-07-09 18:03:31 -04:00
Andrei Betlen
0f3c474a49 Bump version 2023-07-09 11:44:29 -04:00
Andrei Betlen
9aa64163db Update llama.cpp 2023-07-09 11:40:59 -04:00
Shouyi Wang
9f21f548a5 Add tensor split 2023-07-09 23:00:59 +10:00
Andrei Betlen
99f064e681 docker: Add libopenblas to simple image 2023-07-09 01:36:39 -04:00
Andrei Betlen
00da643929 Update llama.cpp 2023-07-08 20:30:34 -04:00