Andrei Betlen
8d75016549
Install required runtime dlls to package directory on windows
2023-09-16 14:57:49 -04:00
Andrei Betlen
acf18fcdf0
Bump version
2023-09-15 14:22:21 -04:00
Andrei Betlen
c7f45a7468
Update llama.cpp
2023-09-15 14:16:34 -04:00
Andrei Betlen
b047b3034e
Remove confusing helpstring from server cli args. Closes #719
2023-09-15 14:09:43 -04:00
Andrei Betlen
24fec0b242
Bump version
2023-09-14 18:33:08 -04:00
Andrei Betlen
dbd3a6d1ed
Fix issue installing on m1 macs
2023-09-14 18:25:44 -04:00
Andrei Betlen
482ecd79c9
Revert "Update llama.cpp"
This reverts commit f73e385c33.
2023-09-14 17:03:18 -04:00
Andrei Betlen
f73e385c33
Update llama.cpp
2023-09-14 16:37:33 -04:00
Andrei Betlen
ca4eb952a6
Revert "Update llama.cpp"
This reverts commit aa2f8a5008.
2023-09-14 15:28:50 -04:00
Andrei Betlen
7da8e0fbf1
Merge branch 'main' of github.com:abetlen/llama_cpp_python into main
2023-09-14 14:51:50 -04:00
Andrei Betlen
8474665625
Update base_path to fix issue resolving dll in windows isolation container.
2023-09-14 14:51:43 -04:00
Jason Cox
40b22909dc
Update examples from ggml to gguf and add hw-accel note for Web Server ( #688 )
* Examples from ggml to gguf
* Use gguf file extension
Update examples to use filenames with gguf extension (e.g. llama-model.gguf).
---------
Co-authored-by: Andrei <abetlen@gmail.com>
2023-09-14 14:48:21 -04:00
Andrei Betlen
aa2f8a5008
Update llama.cpp
2023-09-14 14:44:59 -04:00
Andrei Betlen
2291798900
Fix dockerfiles to install starlette-context
2023-09-14 14:40:16 -04:00
Andrei Betlen
65a2a20050
Enable make fallback for scikit-build-core
2023-09-14 11:43:55 -04:00
Andrei Betlen
255d653ae3
Add documentation and changelog links in pyproject
2023-09-14 04:00:37 -04:00
Andrei Betlen
95d54808a5
Upgrade pip for editable installs
2023-09-14 02:01:45 -04:00
Andrei Betlen
507bcc7171
Bump version
2023-09-13 23:15:23 -04:00
Andrei Betlen
3e2250a12e
Update CHANGELOG
2023-09-13 23:14:22 -04:00
Andrei Betlen
60119dbaeb
Update CHANGELOG
2023-09-13 23:13:19 -04:00
Andrei Betlen
0449d29b9f
Fix boolean env vars and cli arguments
2023-09-13 23:09:57 -04:00
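Boolean settings are a common source of bugs like the one fixed above, because environment variables and CLI values arrive as strings and `bool("False")` is truthy in Python. A minimal sketch of the kind of coercion involved (the helper name `parse_bool` is hypothetical, not from the project):

```python
def parse_bool(value):
    """Coerce a CLI/env string into a bool; raise on unrecognized input."""
    if isinstance(value, bool):
        return value
    truthy = {"1", "true", "yes", "on"}
    falsy = {"0", "false", "no", "off"}
    v = value.strip().lower()
    if v in truthy:
        return True
    if v in falsy:
        return False
    raise ValueError(f"not a boolean: {value!r}")
```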
earonesty
58a6e42cc0
Update app.py ( #705 )
2023-09-13 23:01:34 -04:00
Andrei Betlen
f4090a0bb2
Add numa support, low level api users must now explicitly call llama_backend_init at the start of their programs.
2023-09-13 23:00:43 -04:00
Andrei Betlen
c999325e8e
Fix boolean cli flags
2023-09-13 22:56:10 -04:00
Andrei Betlen
83764c5aee
Update CHANGELOG
2023-09-13 21:58:53 -04:00
Andrei Betlen
4daf77e546
Format
2023-09-13 21:23:23 -04:00
Andrei Betlen
2920c4bf7e
Update server params. Added lora_base, lora_path, low_vram, and main_gpu. Removed rms_norm_eps and n_gqa (deprecated in llama.cpp)
2023-09-13 21:23:13 -04:00
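The new server settings are typically supplied as CLI flags mirroring the field names. A hypothetical invocation (model and adapter paths are placeholders, and the exact flag spelling is an assumption, not confirmed by this log):

```shell
# Hypothetical example; flag names mirror the settings added above.
python -m llama_cpp.server \
  --model ./models/llama-2-7b.Q4_K_M.gguf \
  --lora_base ./models/llama-2-7b.f16.gguf \
  --lora_path ./adapters/my-lora.bin \
  --low_vram True \
  --main_gpu 0
```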
Andrei Betlen
6a20293fc2
Reorder init params to match llama.cpp order
2023-09-13 21:20:26 -04:00
Andrei Betlen
c8f9b8a734
Explicitly make all init params other than model_path into keyword only params
2023-09-13 21:19:47 -04:00
Andrei Betlen
a68f9e2791
Add kwargs to init to catch extra params
2023-09-13 21:19:02 -04:00
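The two init changes above follow a standard Python pattern: a bare `*` in the signature makes every later parameter keyword-only (so reordering them cannot silently break positional callers), and a trailing `**kwargs` catches stray parameters so they can be warned about instead of raising. A minimal sketch (the class and parameter names are illustrative, not the project's actual signature):

```python
import warnings

class Model:
    def __init__(self, model_path, *, n_ctx=512, seed=1337, **kwargs):
        # model_path may be positional; everything after `*` must be
        # passed by keyword.
        self.model_path = model_path
        self.n_ctx = n_ctx
        self.seed = seed
        # Unknown params are caught here instead of raising TypeError.
        for key in kwargs:
            warnings.warn(f"unknown parameter ignored: {key}")
```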
Andrei Betlen
9e345a47a2
remove print
2023-09-13 21:12:27 -04:00
Andrei Betlen
517f9ed80b
Convert missed llama.cpp constants into standard python types
2023-09-13 21:11:52 -04:00
Andrei Betlen
c4c440ba2d
Fix tensor_split cli option
2023-09-13 20:00:42 -04:00
Andrei Betlen
203ede4ba2
Bump version
2023-09-13 18:07:08 -04:00
Andrei Betlen
759405c84b
Fix issue with Literal and Optional cli arguments not working. Closes #702
2023-09-13 18:06:12 -04:00
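`Literal` and `Optional` annotations are awkward to feed to a CLI parser because they are `typing` constructs, not callable types. A minimal sketch of one way to unwrap them for argparse (the helper name is illustrative and this is not the project's actual implementation):

```python
import argparse
from typing import Literal, Optional, Union, get_args, get_origin

def add_typed_argument(parser, name, annotation, default=None):
    """Translate a Literal/Optional annotation into an argparse argument."""
    origin = get_origin(annotation)
    if origin is Union:
        # Optional[T] is Union[T, None]; unwrap to the non-None member.
        inner = [a for a in get_args(annotation) if a is not type(None)]
        annotation = inner[0]
        origin = get_origin(annotation)
    if origin is Literal:
        # Literal["a", "b"] becomes a choices-restricted argument.
        parser.add_argument(name, choices=get_args(annotation), default=default)
    else:
        # Plain types (int, str, ...) are callable and usable as converters.
        parser.add_argument(name, type=annotation, default=default)

parser = argparse.ArgumentParser()
add_typed_argument(parser, "--mode", Optional[Literal["fast", "slow"]], default="fast")
add_typed_argument(parser, "--count", int, default=1)
```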
Andrei Betlen
6cfc54284b
Add pyproject extra for scikit-build-core to ensure compatible pathspec version
2023-09-13 16:51:57 -04:00
Andrei Betlen
cacfd562ba
Update llama.cpp
2023-09-13 16:51:00 -04:00
Devrim
da9df78db0
Add X-Request-ID request header for mirroring custom IDs. ( #703 )
2023-09-13 16:18:31 -04:00
Andrei Betlen
1372e4f60e
Update CHANGELOG
2023-09-13 02:50:27 -04:00
Andrei Betlen
8e13520796
Bump version
2023-09-13 01:47:58 -04:00
Andrei Betlen
ec1132008e
fix: only ignore c extension files installed in-source, avoids removing ggml-metal.metal from sdist
2023-09-13 01:46:29 -04:00
Andrei Betlen
8ddf63b9c7
Remove reference to FORCE_CMAKE from docs
2023-09-12 23:56:10 -04:00
Andrei Betlen
109123c4f0
docs: Use pymdownx.snippets for easier docs management
2023-09-12 22:28:58 -04:00
Andrei Betlen
2787663a25
Bump version
2023-09-12 21:00:01 -04:00
Andrei Betlen
38cd2ac624
Update CHANGELOG
2023-09-12 20:59:54 -04:00
Andrei Betlen
6e02525971
bugfix: remove git directories from source distribution to avoid build-info.h bug
2023-09-12 20:57:28 -04:00
Andrei Betlen
c59f084005
Add python version classifiers to pyproject.toml
2023-09-12 19:15:44 -04:00
Andrei
3feeafed1a
Merge pull request #499 from abetlen/v0.2-wip
llama-cpp-python v0.2.0
2023-09-12 19:04:18 -04:00
Andrei Betlen
bcef9ab2d9
Update title
2023-09-12 19:02:30 -04:00
Andrei Betlen
89ae347585
Remove references to force_cmake
2023-09-12 19:02:20 -04:00