Andrei Betlen
230fc8b535
Bump version
2023-11-21 05:04:55 -05:00
Andrei Betlen
128dc4731f
Fix #569
2023-11-21 04:39:05 -05:00
Andrei Betlen
7a3f87846b
Format
2023-11-21 04:02:20 -05:00
Andrei Betlen
422ebc89ce
Fix: Add logit_bias to all completion api methods
2023-11-21 04:01:36 -05:00
Andrei Betlen
79efc85206
Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
2023-11-21 03:59:49 -05:00
Andrei Betlen
07e47f55ba
Add support for logit_bias outside of server api. Closes #827
2023-11-21 03:59:46 -05:00
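The commit above extends `logit_bias` beyond the server API. As a hedged illustration of what a logit-bias step does (an OpenAI-style per-token-id additive bias, not the library's actual internal code), applied to raw logits before sampling:

```python
def apply_logit_bias(logits, logit_bias):
    """Add per-token-id biases to a list of logits (OpenAI-style logit_bias map)."""
    out = list(logits)
    for token_id, bias in logit_bias.items():
        out[token_id] += bias
    return out

# Example: strongly suppress token 0 and promote token 2.
biased = apply_logit_bias([0.0, 1.0, 0.5], {0: -100.0, 2: 100.0})
print(biased)  # [-100.0, 1.0, 100.5]
```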
James Braza
23a221999f
Documenting server usage ( #768 )
2023-11-21 00:24:22 -05:00
Maarten ter Huurne
c21edb6908
Do not set grammar to None for new LlamaGrammar objects ( #834 )
* Do not set `grammar` to `None` for new `LlamaGrammar` objects
The `grammar` attribute is written by `init()`, but that method always
returns `None`, so `__init__()` would then discard the previously
written object.
* Add minimal test for grammar parsing
2023-11-21 00:23:18 -05:00
mrfakename
ef65fc5ff4
Add MistralLite, Intel, and OpenChat prompt formats ( #927 )
* Add MistralLite format
* Update llama_chat_format.py
* Update llama_chat_format.py
2023-11-21 00:19:25 -05:00
Andrei Betlen
9d7c8307cd
Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
2023-11-20 23:23:20 -05:00
Andrei Betlen
3dc21b2557
tests: Improve llama.cpp mock
2023-11-20 23:23:18 -05:00
TK-Master
b8438f70b5
Added support for min_p ( #921 )
* Added support for min_p
My small contribution to this great project.
Ref: https://github.com/ggerganov/llama.cpp/pull/3841
Closes: https://github.com/abetlen/llama-cpp-python/issues/911
* Fix for negative temp (sample_softmax)
2023-11-20 23:21:33 -05:00
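Per the llama.cpp PR referenced above, min_p sampling keeps only tokens whose probability is at least `min_p` times the probability of the most likely token. A hedged pure-Python sketch of that filtering step (a sketch of the idea, not the library's implementation):

```python
import math

def min_p_filter(logits, min_p):
    """Return token ids whose softmax probability >= min_p * the top probability."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]

# Example: with min_p=0.5, only tokens close to the best candidate survive.
print(min_p_filter([2.0, 1.9, -1.0], 0.5))  # [0, 1]
```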
Andrei Betlen
63fe1370ed
Update llama.cpp
2023-11-20 23:19:47 -05:00
Andrei Betlen
a34d480141
Fix #929
2023-11-20 22:50:59 -05:00
Andrei Betlen
2c2afa320f
Update llama.cpp
2023-11-20 14:11:33 -05:00
zocainViken
6dde6bd09c
bug fixing ( #925 )
2023-11-20 12:31:52 -05:00
ZisisTsatsas
f3117c0cf6
fix: openblas_simple missing pkg-config to build ( #920 )
2023-11-20 12:31:02 -05:00
Andrei Betlen
55efc9e6b2
Update llama.cpp
2023-11-16 18:51:55 -05:00
Andrei Betlen
96a377648c
Merge tag 'v0.2.18' into main
2023-11-14 15:32:15 -05:00
Andrei Betlen
ca30d898e9
Merge tag 'v0.2.17' into main
2023-11-14 15:32:12 -05:00
Andrei Betlen
3af167d8db
Merge tag 'v0.2.16' into main
2023-11-14 15:32:08 -05:00
Andrei Betlen
cc0fe43849
Disable opencl test
2023-11-14 14:59:08 -05:00
Andrei Betlen
f2901d840e
Bump version
2023-11-14 14:10:00 -05:00
Andrei Betlen
a14f46a720
Update llama.cpp
2023-11-14 14:08:52 -05:00
Andrei Betlen
020945049a
Update llama.cpp
2023-11-13 22:20:19 -05:00
Andrei Betlen
01846a76b9
Bump version
2023-11-10 16:36:12 -05:00
Andrei Betlen
7e3e85b53d
Update llama.cpp
2023-11-10 16:33:55 -05:00
Andrei Betlen
4388f33414
Set CUDA_ARCHITECTURES=OFF for windows
2023-11-10 16:32:36 -05:00
Andrei Betlen
74167bdfb2
Update Functions notebook
2023-11-10 13:02:30 -05:00
Andrei Betlen
85ead98a3e
Update Functions notebook example
2023-11-10 12:49:14 -05:00
Andrei Betlen
b7e60b66f4
Bump version
2023-11-10 06:21:24 -05:00
Andrei Betlen
fb743f6c87
Update llama.cpp
2023-11-10 06:21:14 -05:00
Andrei Betlen
5f15a3d91c
Disable wheel repair command
2023-11-10 06:05:42 -05:00
Andrei Betlen
e02d52df29
Try to clean before calling cibuildwheel
2023-11-10 06:01:58 -05:00
Andrei Betlen
ed5a9260f6
Force LD_LIBRARY_PATH
2023-11-10 05:54:23 -05:00
Andrei Betlen
2f070afd61
Don't install in editable mode for release
2023-11-10 05:45:44 -05:00
Andrei Betlen
e32ecb0516
Fix tests
2023-11-10 05:39:42 -05:00
Andrei Betlen
6f0b0b1b84
Fix sampling bug when logits_all=False
2023-11-10 05:15:41 -05:00
Andrei Betlen
d9b38e3e3a
Potential bugfix for eval
2023-11-10 04:41:19 -05:00
Andrei Betlen
52350cc9d7
Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main
2023-11-10 04:24:51 -05:00
Andrei Betlen
b84d76a844
Fix: add default stop sequence to chatml chat format
2023-11-10 04:24:48 -05:00
Andrei Betlen
841f6167cc
Add Code Completion section to docs
2023-11-10 04:06:14 -05:00
Andrei Betlen
1b376c62b7
Update functionary for new OpenAI API
2023-11-10 02:51:58 -05:00
Andrei Betlen
17da8fb446
Add missing tool_calls finish_reason
2023-11-10 02:51:06 -05:00
Andrei Betlen
770df34436
Add $ref and $defs support to json schema converter
2023-11-10 02:50:46 -05:00
Andrei Betlen
faeae181b1
Fix: json_schema_to_gbnf should take string dump of json schema as input
2023-11-10 02:50:17 -05:00
Andrei Betlen
e7962d2c73
Fix: default max_tokens matches openai api (16 for completion, max length for chat completion)
2023-11-10 02:49:27 -05:00
Andrei Betlen
82072802ea
Add link to bakllava gguf model
2023-11-09 03:05:18 -05:00
Andrei Betlen
baeb7b34b3
Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
2023-11-09 00:55:25 -05:00
Andrei Betlen
b62c449839
Bugfix: missing response_format for functionary and llava chat handlers
2023-11-09 00:55:23 -05:00