Andrei Betlen | e38485a66d | Bump version. | 2023-04-15 20:27:55 -04:00
Andrei Betlen | 89856ef00d | Bugfix: only eval new tokens | 2023-04-15 17:32:53 -04:00
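
The fix in 89856ef00d avoids re-evaluating a whole prompt when it shares a prefix with tokens already evaluated into the context. A minimal sketch of the idea (the helper below is illustrative, not the actual llama-cpp-python internals):

```python
from typing import List

def tokens_to_eval(prev_tokens: List[int], new_tokens: List[int]) -> List[int]:
    """Return only the suffix of new_tokens that still needs evaluation."""
    n = 0  # length of the shared prefix between old and new token streams
    for a, b in zip(prev_tokens, new_tokens):
        if a != b:
            break
        n += 1
    return new_tokens[n:]

# The three shared tokens are skipped; only the two new ones are evaluated.
assert tokens_to_eval([1, 2, 3], [1, 2, 3, 4, 5]) == [4, 5]
```
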
Niek van der Maas | 6df27b2da0 | Merge branch 'main' of github.com:abetlen/llama-cpp-python | 2023-04-15 20:24:59 +02:00
Niek van der Maas | 59b37bbbd2 | Support openblas | 2023-04-15 20:24:46 +02:00
Andrei Betlen | 887f3b73ac | Update llama.cpp | 2023-04-15 12:16:05 -04:00
Andrei Betlen | 92c077136d | Add experimental cache | 2023-04-15 12:03:09 -04:00
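
The experimental cache in 92c077136d keeps evaluated model state around so a completion whose prompt shares a prefix with an earlier one can skip straight to the new tokens. A usage sketch, assuming the LlamaCache class and set_cache() method of later llama-cpp-python releases (the commit's original API may differ) and a placeholder model path:

```python
from llama_cpp import Llama, LlamaCache

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path
llm.set_cache(LlamaCache())  # reuse evaluated state across calls

# A repeated or prefix-sharing prompt can now reuse the cached state
# instead of re-evaluating the whole prompt from scratch.
print(llm("Q: What is the capital of France? A:", max_tokens=16))
```
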
Andrei Betlen | a6372a7ae5 | Update stop sequences for chat | 2023-04-15 12:02:48 -04:00
Andrei Betlen | 83b2be6dc4 | Update chat parameters | 2023-04-15 11:58:43 -04:00
Andrei Betlen | 62087514c6 | Update chat prompt | 2023-04-15 11:58:19 -04:00
Andrei Betlen | 02f9fb82fb | Bugfix | 2023-04-15 11:39:52 -04:00
Andrei Betlen | 3cd67c7bd7 | Add type annotations | 2023-04-15 11:39:21 -04:00
Andrei Betlen | d7de0e8014 | Bugfix | 2023-04-15 00:08:04 -04:00
Andrei Betlen | e90e122f2a | Use clear | 2023-04-14 23:33:18 -04:00
Andrei Betlen | ac7068a469 | Track generated tokens internally | 2023-04-14 23:33:00 -04:00
Andrei Betlen | 25b646c2fb | Update llama.cpp | 2023-04-14 23:32:05 -04:00
Andrei Betlen | 6e298d8fca | Set kv cache size to f16 by default | 2023-04-14 22:21:19 -04:00
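
Commit 6e298d8fca flips the key/value cache to 16-bit floats by default, roughly halving its memory footprint versus f32 with little quality impact. In releases of this era the setting was exposed as a constructor flag (later versions removed it); a hedged sketch:

```python
from llama_cpp import Llama

# f16_kv=True stores the KV cache as 16-bit floats; after this commit it
# is the default, so passing it explicitly is only for clarity.
llm = Llama(model_path="./models/ggml-model.bin", f16_kv=True)
```
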
Andrei Betlen | 9c8c2c37dc | Update llama.cpp | 2023-04-14 10:01:57 -04:00
Andrei Betlen | 6c7cec0c65 | Fix completion request | 2023-04-14 10:01:15 -04:00
Andrei Betlen | 6153baab2d | Clean up logprobs implementation | 2023-04-14 09:59:33 -04:00
Andrei Betlen | 26cc4ee029 | Fix signature for stop parameter | 2023-04-14 09:59:08 -04:00
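
For 26cc4ee029: the stop parameter ends generation at the first matching sequence, and the corrected signature accepts a list of strings (later releases also take a single string). A sketch with a placeholder model path:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path

# Generation halts as soon as any stop sequence appears in the output.
out = llm("Q: Name the planets. A:", max_tokens=64, stop=["Q:", "\n\n"])
print(out["choices"][0]["text"])
```
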
Andrei Betlen | 7dc0838fff | Bump version | 2023-04-13 00:35:05 -04:00
Andrei Betlen | 6595ad84bf | Add field to disable resetting between generations | 2023-04-13 00:28:00 -04:00
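
The field added in 6595ad84bf lets a caller keep the model's state between generate() calls instead of clearing it each time, which is useful for interactive, chat-style loops. A sketch assuming the keyword kept the name reset, as in later releases:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path
tokens = llm.tokenize(b"Once upon a time")

# reset=False continues from the existing context rather than clearing it.
for token in llm.generate(tokens, top_k=40, top_p=0.95, temp=0.8,
                          repeat_penalty=1.1, reset=False):
    print(llm.detokenize([token]).decode("utf-8", errors="ignore"))
    break  # demo: take just the first continuation token
```
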
Andrei Betlen | 22fa5a621f | Revert "Deprecate generate method" (this reverts commit 6cf5876538) | 2023-04-13 00:19:55 -04:00
Andrei Betlen | 4f5f99ef2a | Formatting | 2023-04-12 22:40:12 -04:00
Andrei Betlen | 0daf16defc | Enable logprobs on completion endpoint | 2023-04-12 19:08:11 -04:00
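
Together with b3805bb9cc below, 0daf16defc exposes OpenAI-style logprobs on completions. A sketch of how a client reads them, assuming the response follows the OpenAI text-completion layout:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path
out = llm("Q: 2 + 2 = ", max_tokens=1, logprobs=5)

# Per-token log probabilities plus the top-5 alternatives at each position.
lp = out["choices"][0]["logprobs"]
print(lp["tokens"], lp["token_logprobs"], lp["top_logprobs"])
```
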
Andrei Betlen | 19598ac4e8 | Fix threading bug. Closes #62 | 2023-04-12 19:07:53 -04:00
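
A single Llama instance is not safe to share across concurrent server requests, since interleaved evaluations would corrupt the shared context. One common shape of such a fix, shown here as an illustrative sketch rather than the actual patch for #62, is to serialize access with a lock:

```python
import threading

from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path
llama_lock = threading.Lock()  # one model, one request at a time

def create_completion(prompt: str) -> str:
    # Serialize access so concurrent requests cannot interleave their
    # evaluations against the shared context.
    with llama_lock:
        return llm(prompt, max_tokens=32)["choices"][0]["text"]
```
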
Andrei Betlen | 005c78d26c | Update llama.cpp | 2023-04-12 14:29:00 -04:00
Andrei Betlen | c854c2564b | Don't serialize stateful parameters | 2023-04-12 14:07:14 -04:00
Andrei Betlen | 2f9b649005 | Style fix | 2023-04-12 14:06:22 -04:00
Andrei Betlen | 6cf5876538 | Deprecate generate method | 2023-04-12 14:06:04 -04:00
Andrei Betlen | b3805bb9cc | Implement logprobs parameter for text completion. Closes #2 | 2023-04-12 14:05:11 -04:00
Niek van der Maas | 9ce8146231 | More generic model name | 2023-04-12 11:56:16 +02:00
Niek van der Maas | c14201dc0f | Add Dockerfile + build workflow | 2023-04-12 11:53:39 +02:00
Andrei Betlen | 2a60eb820f | Update llama.cpp | 2023-04-11 23:53:46 -04:00
Andrei Betlen | 9f1e565594 | Update llama.cpp | 2023-04-11 11:59:03 -04:00
Andrei Betlen | 213cc5c340 | Remove async from function signature to avoid blocking the server | 2023-04-11 11:54:31 -04:00
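
The rationale for 213cc5c340: FastAPI runs a plain def endpoint in a threadpool, while blocking work inside an async def endpoint stalls the event loop and freezes every other request. An illustrative sketch (route and handler names are assumptions):

```python
from fastapi import FastAPI
from llama_cpp import Llama

app = FastAPI()
llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path

# Plain `def` runs in FastAPI's threadpool, so the long llama.cpp call
# below does not block the event loop; the same body under `async def`
# would stall every other request for the duration of the call.
@app.post("/v1/completions")
def completion(prompt: str):
    return llm(prompt, max_tokens=64)
```
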
Andrei Betlen | 3727ba4d9e | Bump version | 2023-04-10 12:56:48 -04:00
Andrei Betlen | 5247e32d9e | Update llama.cpp | 2023-04-10 12:56:23 -04:00
jm12138 | 90e1021154 | Add unlimited max_tokens | 2023-04-10 15:56:05 +00:00
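
For 90e1021154: in later releases a non-positive (or missing) max_tokens is documented to mean no fixed cap, that is, generate until an EOS token or the context window is exhausted; assuming this commit introduced that behavior:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path

# max_tokens=-1: no fixed cap; stops at EOS or when the context fills up.
out = llm("Write a haiku about llamas:", max_tokens=-1)
print(out["choices"][0]["text"])
```
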
Andrei Betlen | ffb1e80251 | Bump version | 2023-04-10 11:37:41 -04:00
Andrei | a5554a2f02 | Merge pull request #61 from jm12138/fix_windows_install: Add UTF-8 Encoding in read_text. | 2023-04-10 11:35:04 -04:00
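
The fix behind #61: on Windows, Path.read_text() decodes with the locale codec (often cp1252), so reading a UTF-8 file during install can raise UnicodeDecodeError. Pinning the encoding fixes it; the file name here is illustrative:

```python
from pathlib import Path

# Without encoding=..., Windows falls back to the locale codec and a
# UTF-8 README can fail to decode at install time.
long_description = (Path(__file__).parent / "README.md").read_text(encoding="utf-8")
```
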
jm12138 | adfd9f681c | Matched the other encode calls | 2023-04-10 15:33:31 +00:00
Andrei | 0460fdb9ce | Merge pull request #28 from SagsMug/local-lib: Allow local llama library usage | 2023-04-10 11:32:19 -04:00
Mug | 2559e5af9b | Changed the environment variable name into "LLAMA_CPP_LIB" | 2023-04-10 17:27:17 +02:00
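
The LLAMA_CPP_LIB variable from the local-lib branch points the bindings at a custom-built shared library. A minimal sketch, assuming the library is resolved when llama_cpp is imported, so the variable must be set first:

```python
import os

# Must be set before importing llama_cpp; the shared library is located
# and loaded at import time.
os.environ["LLAMA_CPP_LIB"] = "/path/to/libllama.so"  # placeholder path

import llama_cpp
```
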
Andrei | 63d8a3c688 | Merge pull request #63 from SagsMug/main: Low level chat: Added iterative search to prevent instructions from being echoed | 2023-04-10 11:23:00 -04:00
Mug | ee71ce8ab7 | Make windows users happy (hopefully) | 2023-04-10 17:12:25 +02:00
Mug | cf339c9b3c | Better custom library debugging | 2023-04-10 17:06:58 +02:00
Mug | 4132293d2d | Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into local-lib | 2023-04-10 17:00:42 +02:00
Mug | 76131d5bb8 | Use environment variable for library override | 2023-04-10 17:00:35 +02:00
Mug | 3bb45f1658 | More reasonable defaults | 2023-04-10 16:38:45 +02:00