Commit graph

952 commits

Author SHA1 Message Date
samfundev
d788fb49bf
Only concatenate after all batches are done 2023-06-24 15:51:46 -04:00
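A minimal sketch of the pattern this commit message describes, assuming the per-batch results are NumPy arrays; the function names are illustrative, not the library's actual API:

```python
import numpy as np

def run_batches(batches, process_batch):
    # Collect each batch's output in a list and concatenate only once at the
    # end; concatenating inside the loop re-copies the growing array on every
    # iteration, so deferring it keeps the total work linear.
    outputs = []
    for batch in batches:
        outputs.append(process_batch(batch))
    return np.concatenate(outputs)
```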
Andrei
877ca6d016
Merge branch 'main' into fix-state-pickle 2023-06-23 15:13:07 -04:00
Andrei
b6f9388436
Merge pull request #402 from abetlen/dependabot/pip/mkdocs-material-9.1.16
Bump mkdocs-material from 9.1.15 to 9.1.16
2023-06-23 10:10:32 -04:00
Andrei
0952d533fe
Merge pull request #415 from lexin4ever/patch-1
server: pass seed param from command line to llama
2023-06-23 10:09:38 -04:00
Alexey
282698b6d3
server: pass seed param from command line to llama 2023-06-23 00:19:24 +04:00
Andrei Betlen
3e7eae4796 Bump Version 2023-06-20 11:25:44 -04:00
Andrei Betlen
e37798777e Update llama.cpp 2023-06-20 11:25:10 -04:00
dependabot[bot]
d5974a1096
Bump mkdocs-material from 9.1.15 to 9.1.16
Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.1.15 to 9.1.16.
- [Release notes](https://github.com/squidfunk/mkdocs-material/releases)
- [Changelog](https://github.com/squidfunk/mkdocs-material/blob/master/CHANGELOG)
- [Commits](https://github.com/squidfunk/mkdocs-material/compare/9.1.15...9.1.16)

---
updated-dependencies:
- dependency-name: mkdocs-material
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-06-19 21:07:49 +00:00
Andrei Betlen
92b0013427 Bump version 2023-06-18 09:48:43 -04:00
Andrei Betlen
44dcb5cf71 Update llama.cpp 2023-06-18 09:37:20 -04:00
Andrei Betlen
c7d7d5b656 Update Changelog 2023-06-17 13:39:48 -04:00
Andrei Betlen
d410f12fae Update docs. Closes #386 2023-06-17 13:38:48 -04:00
Andrei Betlen
9f528f4715 Merge branch 'main' of github.com:abetlen/llama_cpp_python into main 2023-06-17 13:37:17 -04:00
Andrei Betlen
60426b23cc Update llama.cpp 2023-06-17 13:37:14 -04:00
SubhranshuSharma
036548365f added Termux (with root) instructions 2023-06-17 14:50:07 +05:30
Andrei
ff9faaa48b
Merge pull request #385 from nb-programmer/main
Update llama.py: Added the number of input tokens to the ValueError exception
2023-06-16 23:12:39 -04:00
Andrei Betlen
d7153abcf8 Update llama.cpp 2023-06-16 23:11:14 -04:00
Andrei Betlen
37d5192a92 Update docs 2023-06-16 10:41:51 -04:00
imaprogrammer
fd9f294b3a
Update llama.py: Added the number of input tokens to the ValueError exception 2023-06-16 14:11:57 +05:30
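A rough sketch of the kind of check this message refers to, with the token count included in the error text; the names and exact wording are assumptions:

```python
def check_window(input_tokens: list[int], max_tokens: int, n_ctx: int) -> None:
    # Including the actual token count makes the failure actionable for the
    # caller instead of a bare "context window exceeded".
    if len(input_tokens) + max_tokens > n_ctx:
        raise ValueError(
            f"Requested tokens ({len(input_tokens)}) exceed context window of {n_ctx}"
        )
```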
Andrei Betlen
d938e59003 Bump version 2023-06-14 22:15:44 -04:00
Andrei Betlen
54e2e4ffde Move metal docs to metal section of README. 2023-06-14 22:15:22 -04:00
Andrei Betlen
1e20be6d0c Add low_vram to server settings 2023-06-14 22:13:42 -04:00
Andrei Betlen
44b83cada5 Add low_vram parameter 2023-06-14 22:12:33 -04:00
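As a hedged usage sketch only: the new flag is a plain keyword argument on the constructor, forwarded to llama.cpp's context parameters (the path and layer count below are illustrative):

```python
from llama_cpp import Llama

# low_vram asks llama.cpp to trade some speed for a smaller VRAM footprint
# when layers are offloaded to the GPU.
llm = Llama(
    model_path="./models/7B/ggml-model.bin",
    n_gpu_layers=32,
    low_vram=True,
)
```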
Andrei Betlen
f7c5cfaf50 Format server options 2023-06-14 22:08:28 -04:00
Andrei Betlen
9c41a3e990 Merge branch 'main' of github.com:abetlen/llama_cpp_python into main 2023-06-14 21:50:43 -04:00
Andrei
f568baeef1
Merge pull request #351 from player1537-forks/th/add-logits-bias-parameter
Add support for `logit_bias` and `logit_bias_type` parameters
2023-06-14 21:49:56 -04:00
Andrei
abf6d4a0a1
Merge pull request #367 from ianscrivener/ianscrivener-macos-install-md-docs
Ianscrivener macos install md docs
2023-06-14 21:48:04 -04:00
Andrei
243c6fdca0
Merge pull request #368 from mattdennewitz/patch-1
Update README.md
2023-06-14 21:47:16 -04:00
Andrei Betlen
f27393ab7e Add additional verbose logs for cache 2023-06-14 21:46:48 -04:00
Andrei Betlen
4cefb70cd0 Merge branch 'main' of github.com:abetlen/llama_cpp_python into main 2023-06-14 21:40:19 -04:00
Andrei Betlen
715f98c591 Update llama.cpp 2023-06-14 21:40:13 -04:00
Andrei
6517900623
Merge pull request #360 from Orfeous/main
fixes abetlen/llama-cpp-python #358
2023-06-14 00:25:04 -04:00
Andrei
4121d03c0a
Merge pull request #366 from abetlen/dependabot/pip/pytest-7.3.2
Bump pytest from 7.3.1 to 7.3.2
2023-06-14 00:24:27 -04:00
dependabot[bot]
fe41cb9043
Bump pytest from 7.3.1 to 7.3.2
Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.3.1 to 7.3.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/7.3.1...7.3.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-06-13 15:07:50 +00:00
Andrei
e722bc703f
Merge pull request #365 from abetlen/dependabot/pip/fastapi-0.97.0
Bump fastapi from 0.96.0 to 0.97.0
2023-06-13 11:06:46 -04:00
Okabintaro
10b0cb727b fix: Make LLamaState picklable for disk cache
I fixed the issue by making the saved state a bytes object instead of the ctypes one, which can't be pickled.
2023-06-13 12:03:31 +02:00
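A small sketch of the technique the commit body describes: copying the ctypes buffer into a plain bytes object so the cached state can go through pickle (the helper name is made up for illustration):

```python
import ctypes
import pickle

def freeze_state(state_buf: ctypes.Array) -> bytes:
    # bytes() copies the raw buffer contents; unlike the ctypes object itself,
    # the resulting bytes value pickles cleanly for the disk cache.
    return bytes(state_buf)

buf = (ctypes.c_uint8 * 4)(1, 2, 3, 4)   # stand-in for the saved llama state
blob = pickle.dumps(freeze_state(buf))
restored = pickle.loads(blob)            # bytes again, ready to copy back
```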
Matt Dennewitz
613dd70c8a
Update README.md
Fixes typo in README
2023-06-13 00:56:05 -05:00
Ian Scrivener
7ca50a3e45
Update README.md
add link to main README.md
2023-06-13 09:52:22 +10:00
Ian Scrivener
94f63a66b9
Create macos_install.md
add macOS Metal markdown install instructions
2023-06-13 09:49:19 +10:00
dependabot[bot]
efcf380490
Bump fastapi from 0.96.0 to 0.97.0
Bumps [fastapi](https://github.com/tiangolo/fastapi) from 0.96.0 to 0.97.0.
- [Release notes](https://github.com/tiangolo/fastapi/releases)
- [Commits](https://github.com/tiangolo/fastapi/compare/0.96.0...0.97.0)

---
updated-dependencies:
- dependency-name: fastapi
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-06-12 21:03:40 +00:00
Gabor
3129a0e7e5 correction to add back environment variable support <3 docker 2023-06-11 01:11:24 +01:00
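A minimal sketch of the pattern being restored here: falling back to environment variables so a Docker deployment can configure the server without code changes; the variable names are assumptions, not necessarily the server's real settings:

```python
import os

model_path = os.environ.get("MODEL", "./models/7B/ggml-model.bin")
host = os.environ.get("HOST", "0.0.0.0")
port = int(os.environ.get("PORT", "8000"))

# Values come from the container environment when set, else from the defaults.
print(f"serving {model_path} on {host}:{port}")
```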
Gabor
3ea31930e5 fixes abetlen/llama-cpp-python #358 2023-06-11 00:58:08 +01:00
Andrei
ad4479e609
Merge pull request #359 from matthoffner/main
Document metal support
2023-06-10 19:31:40 -04:00
Matt Hoffner
4eb245afd8
Update README.md 2023-06-10 15:59:26 -07:00
Andrei Betlen
74fbaae157 Bump version 2023-06-10 18:19:48 -04:00
Andrei Betlen
6e302c6ee8 Update makefile and gitignore 2023-06-10 18:17:34 -04:00
Andrei Betlen
c1eaef329a Add resource destination to cmake 2023-06-10 18:11:48 -04:00
Andrei Betlen
890ae442b9 Update llama.cpp 2023-06-10 18:10:01 -04:00
Andrei Betlen
bf2bfec615 Update changelog 2023-06-10 12:22:39 -04:00
Andrei Betlen
21acd7901f Re-enable cache 2023-06-10 12:22:31 -04:00