Commit graph

1041 commits

Author SHA1 Message Date
Andrei Betlen
c7c700b0d4 Bump version 2023-07-24 14:11:21 -04:00
Andrei Betlen
11dd2bf382 Add temporary rms_norm_eps parameter 2023-07-24 14:09:24 -04:00
Andrei Betlen
8cd64d4ac3 Add rms_eps_norm 2023-07-24 13:52:12 -04:00
Andrei
e4431a6ade Merge pull request #522 from bretello/llama2-70b-support
Llama2 70b support
2023-07-24 13:48:26 -04:00
bretello
0f09f10e8c add support for llama2 70b 2023-07-24 19:38:24 +02:00
Andrei Betlen
77c9f496b0 Merge branch 'main' into v0.2-wip 2023-07-24 13:19:54 -04:00
Andrei Betlen
4aaaec561d Bump version 2023-07-24 13:12:38 -04:00
Andrei Betlen
401309d11c Revert "Merge pull request #521 from bretello/main"
This reverts commit 07f0f3a386, reversing
changes made to d8a3ddbb1c.
2023-07-24 13:11:10 -04:00
Andrei
07f0f3a386 Merge pull request #521 from bretello/main
raise exception when `llama_load_model_from_file` fails
2023-07-24 13:09:28 -04:00
Andrei Betlen
d8a3ddbb1c Update llama.cpp 2023-07-24 13:08:06 -04:00
Andrei Betlen
985d559971 Update llama.cpp 2023-07-24 13:04:34 -04:00
bretello
8be7d67f7e raise exception when llama_load_model_from_file fails 2023-07-24 14:42:37 +02:00
Charles Duffy
c03fa87956 pyproject.toml: extras list should contain only package list, not versions (#515)
Update poetry.lock accordingly.
2023-07-23 13:15:40 -05:00
Andrei Betlen
436036aa67 Merge branch 'main' into v0.2-wip 2023-07-21 12:42:38 -04:00
Andrei Betlen
231123ee1e Update llama.cpp 2023-07-21 12:41:59 -04:00
Andrei Betlen
b83728ad1e Update llama.cpp 2023-07-21 12:33:27 -04:00
Andrei Betlen
0538ba1dab Merge branch 'main' into v0.2-wip 2023-07-20 19:06:26 -04:00
Andrei Betlen
a4fe3fe350 Bump version 2023-07-20 18:56:29 -04:00
Andrei Betlen
01435da740 Update llama.cpp 2023-07-20 18:54:25 -04:00
Andrei Betlen
28a111704b Fix compatibility with older python versions 2023-07-20 18:52:10 -04:00
Andrei Betlen
d10ce62714 Revert ctypes argtype change 2023-07-20 18:51:53 -04:00
Andrei
365d9a4367 Merge pull request #481 from c0sogi/main
Added `RouteErrorHandler` for server
2023-07-20 17:41:42 -04:00
Andrei
a9cb645495 Merge pull request #511 from viniciusarruda/patch-1
Update llama_cpp.py - Fix c_char_p to Array[c_char_p] and c_float to …
2023-07-20 17:40:39 -04:00
Vinicius
a8551477f5 Update llama_cpp.py - Fix c_char_p to Array[c_char_p] and c_float to Array[c_float] 2023-07-20 17:29:11 -03:00
Andrei
5549a1cabd Merge pull request #508 from ctejada85/main
Now the last token sent when `stream=True`
2023-07-20 16:07:54 -04:00
Carlos Tejada
0756a2d3fb Now the last token sent when stream=True 2023-07-19 22:47:14 -04:00
Andrei Betlen
0b121a7456 Format 2023-07-19 03:48:27 -04:00
Andrei Betlen
b43917c144 Add functions parameters 2023-07-19 03:48:20 -04:00
Andrei
36872620d0 Merge pull request #501 from a10y/patch-1
Update install instructions for Linux OpenBLAS
2023-07-18 22:26:42 -04:00
Andrew Duffy
b6b2071180 Update install instructions for Linux OpenBLAS
The instructions are different than they used to be.

Source: https://github.com/ggerganov/llama.cpp#openblas
2023-07-18 22:22:33 -04:00
Andrei Betlen
57db1f9570 Update development docs for scikit-build-core. Closes #490 2023-07-18 20:26:25 -04:00
Andrei Betlen
d2c5afe5a3 Remove prerelease python version 2023-07-18 19:38:51 -04:00
Andrei Betlen
7ce6cdf45b Update supported python versions. 2023-07-18 19:37:52 -04:00
Andrei Betlen
792b981119 Fix numpy dependency 2023-07-18 19:30:06 -04:00
Andrei Betlen
19ba9d3845 Use numpy arrays for logits_processors and stopping_criteria. Closes #491 2023-07-18 19:27:41 -04:00
Andrei
5eab1db0d0 Merge branch 'main' into v0.2-wip 2023-07-18 18:54:27 -04:00
Andrei
a05cfaf815 Merge pull request #498 from ctejada85/windows-pip-install
Added info to set ENV variables in PowerShell
2023-07-18 18:53:24 -04:00
Andrei Betlen
6cb77a20c6 Migrate to scikit-build-core. Closes #489 2023-07-18 18:52:29 -04:00
Carlos Tejada
b24b10effd Added info to set ENV variables in PowerShell
- Added an example on how to set the variables `CMAKE_ARGS`
  and `FORCE_CMAKE`.
- Added a subtitle for the `Windows remarks` and `MacOS` remarks.
2023-07-18 17:14:42 -04:00
Andrei Betlen
c9985abc03 Bump version 2023-07-18 13:54:51 -04:00
Andrei Betlen
9127bc2777 Update llama.cpp 2023-07-18 13:54:42 -04:00
Andrei
071ac799d5 Merge pull request #485 from callMeMakerRen/main
expose RoPE param to server start
2023-07-18 12:30:21 -04:00
shutup
5ed8bf132f expose RoPE param to server start 2023-07-18 16:34:36 +08:00
c0sogi
1551ba10bd Added RouteErrorHandler for server 2023-07-16 14:57:39 +09:00
Andrei Betlen
6d8892fe64 Bump version 2023-07-15 17:13:55 -04:00
Andrei Betlen
8ab098e49d Re-order Llama class params 2023-07-15 15:35:08 -04:00
Andrei Betlen
e4f9db37db Fix context_params struct layout 2023-07-15 15:34:55 -04:00
Andrei Betlen
bdf32df255 Add additional direnv directory to gitignore 2023-07-15 15:34:32 -04:00
Andrei Betlen
d0572f4fca Merge branch 'custom_rope' into main 2023-07-15 15:11:43 -04:00
Andrei Betlen
f0797a6054 Merge branch main into custom_rope 2023-07-15 15:11:01 -04:00