dependabot[bot] | e03c3806f8 | Bump mkdocs-material from 9.1.16 to 9.1.17 | 2023-06-26 21:16:53 +00:00
  Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.1.16 to 9.1.17.
  - [Release notes](https://github.com/squidfunk/mkdocs-material/releases)
  - [Changelog](https://github.com/squidfunk/mkdocs-material/blob/master/CHANGELOG)
  - [Commits](https://github.com/squidfunk/mkdocs-material/compare/9.1.16...9.1.17)
  ---
  updated-dependencies:
  - dependency-name: mkdocs-material
    dependency-type: direct:production
    update-type: version-update:semver-patch
  ...
  Signed-off-by: dependabot[bot] <support@github.com>
Andrei | e18fe74bd7 | Merge pull request #431 from abetlen/dependabot/pip/numpy-1.24.4 | 2023-06-26 17:15:51 -04:00
  Bump numpy from 1.24.3 to 1.24.4
dependabot[bot] | c9a8b7eb43 | Bump numpy from 1.24.3 to 1.24.4 | 2023-06-26 21:03:34 +00:00
  Bumps [numpy](https://github.com/numpy/numpy) from 1.24.3 to 1.24.4.
  - [Release notes](https://github.com/numpy/numpy/releases)
  - [Changelog](https://github.com/numpy/numpy/blob/main/doc/RELEASE_WALKTHROUGH.rst)
  - [Commits](https://github.com/numpy/numpy/compare/v1.24.3...v1.24.4)
  ---
  updated-dependencies:
  - dependency-name: numpy
    dependency-type: direct:production
    update-type: version-update:semver-patch
  ...
  Signed-off-by: dependabot[bot] <support@github.com>
Andrei Betlen | 452929404f | Updated docs link | 2023-06-26 16:35:38 -04:00
Andrei Betlen | 66b8b979a5 | Update readthedocs setup | 2023-06-26 16:31:16 -04:00
Andrei Betlen | 155dedf28f | Add readthedocs config | 2023-06-26 16:25:17 -04:00
Andrei Betlen | 5193af297b | Bump version | 2023-06-26 08:53:54 -04:00
Andrei Betlen | 3379dc40a1 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-06-26 08:50:48 -04:00
Andrei Betlen | 952228407e | Update llama.cpp | 2023-06-26 08:50:38 -04:00
Andrei Betlen | b4a3db3e54 | Update type signature | 2023-06-26 08:50:30 -04:00
Andrei | 628e3fb3df | Merge pull request #370 from Okabintaro/fix-state-pickle | 2023-06-26 08:46:59 -04:00
  fix: Make LLamaState pickleable for disk cache
Andrei | 5eb4ebb041 | Merge branch 'main' into fix-state-pickle | 2023-06-26 08:45:02 -04:00
Andrei | 04d9218b92 | Merge pull request #420 from samfundev/main | 2023-06-26 08:10:43 -04:00
  Only concatenate after all batches are done
samfundev | d788fb49bf | Only concatenate after all batches are done | 2023-06-24 15:51:46 -04:00
Andrei | 877ca6d016 | Merge branch 'main' into fix-state-pickle | 2023-06-23 15:13:07 -04:00
Andrei | b6f9388436 | Merge pull request #402 from abetlen/dependabot/pip/mkdocs-material-9.1.16 | 2023-06-23 10:10:32 -04:00
  Bump mkdocs-material from 9.1.15 to 9.1.16
Andrei | 0952d533fe | Merge pull request #415 from lexin4ever/patch-1 | 2023-06-23 10:09:38 -04:00
  server: pass seed param from command line to llama
Alexey | 282698b6d3 | server: pass seed param from command line to llama | 2023-06-23 00:19:24 +04:00
Andrei Betlen | 3e7eae4796 | Bump version | 2023-06-20 11:25:44 -04:00
Andrei Betlen | e37798777e | Update llama.cpp | 2023-06-20 11:25:10 -04:00
dependabot[bot] | d5974a1096 | Bump mkdocs-material from 9.1.15 to 9.1.16 | 2023-06-19 21:07:49 +00:00
  Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.1.15 to 9.1.16.
  - [Release notes](https://github.com/squidfunk/mkdocs-material/releases)
  - [Changelog](https://github.com/squidfunk/mkdocs-material/blob/master/CHANGELOG)
  - [Commits](https://github.com/squidfunk/mkdocs-material/compare/9.1.15...9.1.16)
  ---
  updated-dependencies:
  - dependency-name: mkdocs-material
    dependency-type: direct:development
    update-type: version-update:semver-patch
  ...
  Signed-off-by: dependabot[bot] <support@github.com>
Andrei Betlen | 92b0013427 | Bump version | 2023-06-18 09:48:43 -04:00
Andrei Betlen | 44dcb5cf71 | Update llama.cpp | 2023-06-18 09:37:20 -04:00
Andrei Betlen | c7d7d5b656 | Update Changelog | 2023-06-17 13:39:48 -04:00
Andrei Betlen | d410f12fae | Update docs. Closes #386 | 2023-06-17 13:38:48 -04:00
Andrei Betlen | 9f528f4715 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-06-17 13:37:17 -04:00
Andrei Betlen | 60426b23cc | Update llama.cpp | 2023-06-17 13:37:14 -04:00
SubhranshuSharma | 036548365f | Added Termux (with root) instructions | 2023-06-17 14:50:07 +05:30
Andrei | ff9faaa48b | Merge pull request #385 from nb-programmer/main | 2023-06-16 23:12:39 -04:00
  Update llama.py: Added input token count to the ValueError exception message
Andrei Betlen | d7153abcf8 | Update llama.cpp | 2023-06-16 23:11:14 -04:00
Andrei Betlen | 37d5192a92 | Update docs | 2023-06-16 10:41:51 -04:00
imaprogrammer | fd9f294b3a | Update llama.py: Added input token count to the ValueError exception message | 2023-06-16 14:11:57 +05:30
Andrei Betlen | d938e59003 | Bump version | 2023-06-14 22:15:44 -04:00
Andrei Betlen | 54e2e4ffde | Move Metal docs to Metal section of README | 2023-06-14 22:15:22 -04:00
Andrei Betlen | 1e20be6d0c | Add low_vram to server settings | 2023-06-14 22:13:42 -04:00
Andrei Betlen | 44b83cada5 | Add low_vram parameter | 2023-06-14 22:12:33 -04:00
Andrei Betlen | f7c5cfaf50 | Format server options | 2023-06-14 22:08:28 -04:00
Andrei Betlen | 9c41a3e990 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-06-14 21:50:43 -04:00
Andrei | f568baeef1 | Merge pull request #351 from player1537-forks/th/add-logits-bias-parameter | 2023-06-14 21:49:56 -04:00
  Add support for `logit_bias` and `logit_bias_type` parameters
Andrei | abf6d4a0a1 | Merge pull request #367 from ianscrivener/ianscrivener-macos-install-md-docs | 2023-06-14 21:48:04 -04:00
  Ianscrivener macOS install md docs
Andrei | 243c6fdca0 | Merge pull request #368 from mattdennewitz/patch-1 | 2023-06-14 21:47:16 -04:00
  Update README.md
Andrei Betlen | f27393ab7e | Add additional verbose logs for cache | 2023-06-14 21:46:48 -04:00
Andrei Betlen | 4cefb70cd0 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-06-14 21:40:19 -04:00
Andrei Betlen | 715f98c591 | Update llama.cpp | 2023-06-14 21:40:13 -04:00
Andrei | 6517900623 | Merge pull request #360 from Orfeous/main | 2023-06-14 00:25:04 -04:00
  Fixes abetlen/llama-cpp-python #358
Andrei | 4121d03c0a | Merge pull request #366 from abetlen/dependabot/pip/pytest-7.3.2 | 2023-06-14 00:24:27 -04:00
  Bump pytest from 7.3.1 to 7.3.2
dependabot[bot] | fe41cb9043 | Bump pytest from 7.3.1 to 7.3.2 | 2023-06-13 15:07:50 +00:00
  Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.3.1 to 7.3.2.
  - [Release notes](https://github.com/pytest-dev/pytest/releases)
  - [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
  - [Commits](https://github.com/pytest-dev/pytest/compare/7.3.1...7.3.2)
  ---
  updated-dependencies:
  - dependency-name: pytest
    dependency-type: direct:development
    update-type: version-update:semver-patch
  ...
  Signed-off-by: dependabot[bot] <support@github.com>
Andrei | e722bc703f | Merge pull request #365 from abetlen/dependabot/pip/fastapi-0.97.0 | 2023-06-13 11:06:46 -04:00
  Bump fastapi from 0.96.0 to 0.97.0
Okabintaro | 10b0cb727b | fix: Make LLamaState pickleable for disk cache | 2023-06-13 12:03:31 +02:00
  I fixed the issue by making the saved state a bytes object instead of the ctypes buffer, which cannot be pickled (see the sketch below).
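The fix above relies on a standard Python pattern: copy the raw ctypes buffer into a plain bytes object before storing it, and rebuild a ctypes buffer only when the C API needs a writable pointer again. The sketch below illustrates that pattern with hypothetical names (`State`, `state_bytes`, `as_ctypes`); it is not the actual llama-cpp-python implementation.

```python
import ctypes
import pickle


class State:
    """Toy stand-in for a state object wrapping an opaque C buffer.

    The names here are illustrative only, not the real llama-cpp-python API.
    """

    def __init__(self, raw: ctypes.Array, n_tokens: int):
        # A ctypes array cannot be pickled, so copy it into plain bytes.
        self.state_bytes = bytes(raw)
        self.n_tokens = n_tokens

    def as_ctypes(self) -> ctypes.Array:
        # Rebuild a writable ctypes buffer when handing state back to the C API.
        return (ctypes.c_uint8 * len(self.state_bytes)).from_buffer_copy(self.state_bytes)


# The bytes-backed object round-trips through pickle, which is what a disk cache needs.
raw = (ctypes.c_uint8 * 8)(*range(8))
state = State(raw, n_tokens=8)
restored = pickle.loads(pickle.dumps(state))
assert bytes(restored.as_ctypes()) == state.state_bytes
```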
Matt Dennewitz | 613dd70c8a | Update README.md | 2023-06-13 00:56:05 -05:00
  Fixes typo in README