Andrei Betlen | d938e59003 | Bump version | 2023-06-14 22:15:44 -04:00
Andrei Betlen | 54e2e4ffde | Move metal docs to metal section of README. | 2023-06-14 22:15:22 -04:00
Andrei Betlen | 1e20be6d0c | Add low_vram to server settings | 2023-06-14 22:13:42 -04:00
Andrei Betlen | 44b83cada5 | Add low_vram parameter | 2023-06-14 22:12:33 -04:00
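The two low_vram commits above (44b83cada5 and 1e20be6d0c) add a boolean flag on the model constructor and on the server settings. A minimal sketch of passing it through the Python API, assuming the `Llama` constructor accepts a `low_vram` keyword as of this version; the model path is a placeholder:

```python
from llama_cpp import Llama

# Hypothetical local model path; replace with a real GGML model file.
MODEL_PATH = "./models/ggml-model-q4_0.bin"

# low_vram=True is assumed to map onto llama.cpp's low-VRAM option,
# trading some speed for reduced VRAM usage.
llm = Llama(model_path=MODEL_PATH, low_vram=True)

output = llm("Q: Name the planets in the solar system. A:", max_tokens=32)
print(output["choices"][0]["text"])
```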
Andrei Betlen | f7c5cfaf50 | Format server options | 2023-06-14 22:08:28 -04:00
Andrei Betlen | 9c41a3e990 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-06-14 21:50:43 -04:00
Andrei | f568baeef1 | Merge pull request #351 from player1537-forks/th/add-logits-bias-parameter (Add support for `logit_bias` and `logit_bias_type` parameters) | 2023-06-14 21:49:56 -04:00
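PR #351 (merged in f568baeef1 above, implemented in eb7645b3ba further down) adds `logit_bias` and `logit_bias_type` to the server's completion request. A rough sketch of a request using them, assuming the OpenAI-compatible `/v1/completions` endpoint on the default port and that `logit_bias_type` accepts `"input_ids"`; the token id and bias value are illustrative only:

```python
import httpx

# Assumes the llama-cpp-python server is already running locally,
# e.g. started with `python3 -m llama_cpp.server --model <path>`.
response = httpx.post(
    "http://localhost:8000/v1/completions",
    json={
        "prompt": "The capital of France is",
        "max_tokens": 16,
        # Bias applied to specific tokens before sampling; with
        # logit_bias_type="input_ids" the keys are token ids
        # (the id and value here are placeholders, not real tokens).
        "logit_bias": {"1234": -100.0},
        "logit_bias_type": "input_ids",
    },
    timeout=60.0,
)
print(response.json()["choices"][0]["text"])
```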
Andrei | abf6d4a0a1 | Merge pull request #367 from ianscrivener/ianscrivener-macos-install-md-docs (Ianscrivener macos install md docs) | 2023-06-14 21:48:04 -04:00
Andrei | 243c6fdca0 | Merge pull request #368 from mattdennewitz/patch-1 (Update README.md) | 2023-06-14 21:47:16 -04:00
Andrei Betlen | f27393ab7e | Add additional verbose logs for cache | 2023-06-14 21:46:48 -04:00
Andrei Betlen | 4cefb70cd0 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-06-14 21:40:19 -04:00
Andrei Betlen | 715f98c591 | Update llama.cpp | 2023-06-14 21:40:13 -04:00
Andrei | 6517900623 | Merge pull request #360 from Orfeous/main (fixes abetlen/llama-cpp-python #358) | 2023-06-14 00:25:04 -04:00
Andrei | 4121d03c0a | Merge pull request #366 from abetlen/dependabot/pip/pytest-7.3.2 (Bump pytest from 7.3.1 to 7.3.2) | 2023-06-14 00:24:27 -04:00
dependabot[bot] | fe41cb9043 | Bump pytest from 7.3.1 to 7.3.2 | 2023-06-13 15:07:50 +00:00
    Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.3.1 to 7.3.2.
    Release notes: https://github.com/pytest-dev/pytest/releases
    Changelog: https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst
    Commits: https://github.com/pytest-dev/pytest/compare/7.3.1...7.3.2
    updated-dependencies: pytest (direct:development, version-update:semver-patch)
    Signed-off-by: dependabot[bot] <support@github.com>
Andrei | e722bc703f | Merge pull request #365 from abetlen/dependabot/pip/fastapi-0.97.0 (Bump fastapi from 0.96.0 to 0.97.0) | 2023-06-13 11:06:46 -04:00
Matt Dennewitz | 613dd70c8a | Update README.md (Fixes typo in README) | 2023-06-13 00:56:05 -05:00
Ian Scrivener | 7ca50a3e45 | Update README.md (add link to main README.md) | 2023-06-13 09:52:22 +10:00
Ian Scrivener | 94f63a66b9 | Create macos_install.md (add macOS Metal markdown install instructions) | 2023-06-13 09:49:19 +10:00
dependabot[bot] | efcf380490 | Bump fastapi from 0.96.0 to 0.97.0 | 2023-06-12 21:03:40 +00:00
    Bumps [fastapi](https://github.com/tiangolo/fastapi) from 0.96.0 to 0.97.0.
    Release notes: https://github.com/tiangolo/fastapi/releases
    Commits: https://github.com/tiangolo/fastapi/compare/0.96.0...0.97.0
    updated-dependencies: fastapi (direct:production, version-update:semver-minor)
    Signed-off-by: dependabot[bot] <support@github.com>
Gabor | 3129a0e7e5 | correction to add back environment variable support <3 docker | 2023-06-11 01:11:24 +01:00
Gabor | 3ea31930e5 | fixes abetlen/llama-cpp-python #358 | 2023-06-11 00:58:08 +01:00
Andrei | ad4479e609 | Merge pull request #359 from matthoffner/main (Document metal support) | 2023-06-10 19:31:40 -04:00
Matt Hoffner | 4eb245afd8 | Update README.md | 2023-06-10 15:59:26 -07:00
Andrei Betlen | 74fbaae157 | Bump version | 2023-06-10 18:19:48 -04:00
Andrei Betlen | 6e302c6ee8 | Update makefile and gitignore | 2023-06-10 18:17:34 -04:00
Andrei Betlen | c1eaef329a | Add resource destination to cmake | 2023-06-10 18:11:48 -04:00
Andrei Betlen | 890ae442b9 | Update llama.cpp | 2023-06-10 18:10:01 -04:00
Andrei Betlen | bf2bfec615 | Update changelog | 2023-06-10 12:22:39 -04:00
Andrei Betlen | 21acd7901f | Re-enable cache | 2023-06-10 12:22:31 -04:00
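Several entries in this log touch the prompt cache: f27393ab7e adds verbose cache logs, 21acd7901f above re-enables the cache, and 0da655b3be further down had temporarily disabled it. A minimal sketch of attaching the cache from Python, assuming `LlamaCache` and `Llama.set_cache` behave as in this release; the model path is a placeholder:

```python
from llama_cpp import Llama, LlamaCache

llm = Llama(model_path="./models/ggml-model-q4_0.bin")  # placeholder path

# Attach an in-memory cache so repeated prompts can reuse saved model state
# instead of re-evaluating the shared prefix.
llm.set_cache(LlamaCache())

# The second call with the same prompt should hit the cache.
llm("Q: What is 2 + 2? A:", max_tokens=8)
llm("Q: What is 2 + 2? A:", max_tokens=8)
```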
Andrei Betlen | 6639371407 | Update llama.cpp | 2023-06-10 12:17:38 -04:00
Andrei Betlen | 6b764cab80 | Bump version | 2023-06-09 23:25:38 -04:00
Andrei Betlen | e3542b6627 | Revert "Merge pull request #350 from abetlen/migrate-to-scikit-build-core" | 2023-06-09 23:23:16 -04:00
    This reverts commit fb2c5f7fd9, reversing changes made to 202ed4464b.
Andrei Betlen | 3c6e1b6c42 | Update to smoketest | 2023-06-09 19:08:15 -04:00
Andrei Betlen | d4aed351e3 | Run on workflow_dispatch | 2023-06-09 17:08:42 -04:00
Andrei Betlen | 2fdd873125 | Add GitHub action to test published PyPI version of package | 2023-06-09 16:52:40 -04:00
Andrei Betlen | a553552868 | Add project urls to pyproject | 2023-06-09 16:52:17 -04:00
Tanner Hobson | eb7645b3ba | Add support for logit_bias and logit_bias_type parameters | 2023-06-09 13:13:08 -04:00
Andrei Betlen | c0f7e739c9 | Update llama.cpp | 2023-06-09 12:39:09 -04:00
Andrei Betlen | dd7c7bf80b | Bump version | 2023-06-09 11:52:07 -04:00
Andrei Betlen | 0da655b3be | Temporarily disable cache until save state bug is fixed. | 2023-06-09 11:10:24 -04:00
Andrei Betlen | be0403da98 | Add missing poetry sections to pyproject.toml | 2023-06-09 11:09:32 -04:00
Andrei Betlen | f2a54ecb4c | Update CHANGELOG | 2023-06-09 11:01:42 -04:00
Andrei Betlen | 556c7edf47 | Truncate max_tokens if it exceeds context length | 2023-06-09 10:57:36 -04:00
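Commit 556c7edf47 above clamps `max_tokens` instead of failing when a request asks for more completion tokens than the context window can hold. A simplified sketch of that kind of guard (illustrative names, not the actual implementation):

```python
def truncate_max_tokens(max_tokens: int, prompt_tokens: int, n_ctx: int) -> int:
    """Clamp the requested completion length to the space left in the context.

    Arguments are illustrative: tokens requested, tokens already used by the
    prompt, and the model's context size.
    """
    remaining = n_ctx - prompt_tokens
    return max(0, min(max_tokens, remaining))

# Example: a 2048-token context with a 2000-token prompt leaves room for at
# most 48 completion tokens, even if 256 were requested.
print(truncate_max_tokens(256, 2000, 2048))  # -> 48
```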
Andrei | fb2c5f7fd9 | Merge pull request #350 from abetlen/migrate-to-scikit-build-core (Migrate to scikit-build-core) | 2023-06-09 03:00:01 -04:00
Andrei Betlen | b025a859ae | Add full path to shared library installation path | 2023-06-08 22:11:01 -04:00
Andrei Betlen | 146ca2c59f | Add missing httpx | 2023-06-08 22:03:24 -04:00
Andrei Betlen | 1d6bdf8db6 | Update server dependencies | 2023-06-08 21:59:58 -04:00
Andrei Betlen | 43854e6a83 | Update server dependencies | 2023-06-08 21:55:42 -04:00
Andrei Betlen | c12138f7bd | Update changelog | 2023-06-08 21:53:38 -04:00