Andrei Betlen | 00ea3af51b | Add makefile | 2023-05-26 17:56:20 -04:00
Andrei Betlen | 447a3d249e | Merge branch 'main' into setup | 2023-05-26 17:53:58 -04:00
Andrei Betlen | 030fafe901 | Add project changelog | 2023-05-26 17:32:34 -04:00
Andrei Betlen | 6075e17cb6 | Bump version | 2023-05-26 17:21:51 -04:00
Andrei | 2adf6f3f9a | Merge pull request #265 from dmahurin/fix-from-bytes-byteorder | 2023-05-26 12:53:06 -04:00
    fix "from_bytes() missing required argument 'byteorder'"
Andrei | 34ad71f448 | Merge pull request #274 from dmahurin/fix-missing-antiprompt | 2023-05-26 12:52:34 -04:00
    low_level_api_chat_cpp.py: Fix missing antiprompt output in chat.
Andrei | d78453c045 | Merge pull request #264 from dmahurin/fix-min-keep | 2023-05-26 12:52:05 -04:00
    fix "missing 1 required positional argument: 'min_keep'"
Andrei Betlen | 4c1b7f7a76 | Bugfix for logits_processor and stopping_criteria | 2023-05-26 10:25:28 -04:00
Don Mahurin | 0fa2ec4903 | low_level_api_chat_cpp.py: Fix missing antiprompt output in chat. | 2023-05-26 06:54:28 -07:00
Andrei Betlen | 433a2e3e8a | Add extra logits_processor and stopping_criteria | 2023-05-26 03:13:24 -04:00
Andrei Betlen | 30bf8ec557 | Update llama.cpp | 2023-05-26 03:03:11 -04:00
Andrei Betlen | f74b90ed67 | Fix streaming hang on last token when cache is on. | 2023-05-26 03:03:01 -04:00
Andrei Betlen | 5be8354e11 | Added tokenizer | 2023-05-26 03:00:51 -04:00
Andrei Betlen | 8fa2ef1959 | Format | 2023-05-26 03:00:35 -04:00
Andrei Betlen | 6bd1075291 | Merge branch 'Maximilian-Winter/main' into main | 2023-05-26 02:56:11 -04:00
Andrei Betlen | ca01f98e09 | Add LlamaTokenizer class | 2023-05-25 14:11:33 -04:00
Andrei Betlen | 1d247e0f35 | Add StoppingCriteria and LogitsProcessor to generate to match huggingface API | 2023-05-25 14:04:54 -04:00
Maximilian Winter | c6a9659972 | Merge branch 'abetlen:main' into main | 2023-05-25 17:09:19 +02:00
Andrei | de8d9a810b | Merge pull request #270 from gjmulder/auto-docker | 2023-05-25 09:30:13 -04:00
    "bot-in-a-box" - model d/l and automatic install into a OpenBLAS or CuBLAS Docker image
Gary Mulder | 0e0c9bb978 | Merge branch 'auto-docker' of github.com:gjmulder/llama-cpp-python-gary into auto-docker | 2023-05-25 11:50:34 +00:00
Gary Mulder | 0d2cc21202 | Fixed repeated imports | 2023-05-25 11:50:02 +00:00
Maximilian-Winter | c2585b6889 | Fixed list elements typing | 2023-05-25 10:54:08 +02:00
Maximilian-Winter | da463e6c8c | Added types to logit processor list and stop criteria list | 2023-05-25 09:07:16 +02:00
Maximilian-Winter | c05fcdf42f | Fixed none value of logits processors. | 2023-05-24 22:02:06 +02:00
Maximilian-Winter | 5bb780d455 | Implemented logit processors and stop criteria's | 2023-05-24 21:55:44 +02:00
Andrei Betlen | fab064ded9 | Remove unnecessary ffi calls | 2023-05-23 17:56:21 -04:00
Gary Mulder | ec44bdad61 | Update README.md | 2023-05-23 20:50:39 +01:00
Gary Mulder | ed19071ef8 | Renamed and moved old Dockerfiles | 2023-05-23 19:38:37 +00:00
Gary Mulder | 70f629a72f | Update README.md | 2023-05-23 20:36:21 +01:00
Gary Mulder | eaff7a8678 | Initial commit of auto docker | 2023-05-23 19:26:40 +00:00
Don Mahurin | d6a7adb17a | fix "missing 1 required positional argument: 'min_keep'" | 2023-05-23 06:42:22 -07:00
Don Mahurin | 327eedbfe1 | fix "from_bytes() missing required argument 'byteorder'" | 2023-05-23 00:20:34 -07:00
Andrei Betlen | e5d596e0e9 | Bump version | 2023-05-22 23:50:58 -04:00
Andrei Betlen | c41b1ebca7 | Update llama.cpp | 2023-05-22 23:50:35 -04:00
Andrei | aa3d7a6299 | Merge pull request #263 from abetlen/dependabot/pip/mkdocs-material-9.1.14 | 2023-05-22 23:44:51 -04:00
    Bump mkdocs-material from 9.1.12 to 9.1.14
dependabot[bot] | 2240b949ae | Bump mkdocs-material from 9.1.12 to 9.1.14 | 2023-05-22 21:18:57 +00:00
    Bumps [mkdocs-material](https://github.com/squidfunk/mkdocs-material) from 9.1.12 to 9.1.14.
    - [Release notes](https://github.com/squidfunk/mkdocs-material/releases)
    - [Changelog](https://github.com/squidfunk/mkdocs-material/blob/master/CHANGELOG)
    - [Commits](https://github.com/squidfunk/mkdocs-material/compare/9.1.12...9.1.14)
    updated-dependencies:
    - dependency-name: mkdocs-material
      dependency-type: direct:development
      update-type: version-update:semver-patch
    Signed-off-by: dependabot[bot] <support@github.com>
Andrei | 01c79e7bf1 | Merge pull request #258 from Pipboyguy/main | 2023-05-22 17:17:17 -04:00
    Change docker build dynamic param to image instead of cuda version
Andrei | c3e80b1714 | Merge pull request #262 from abetlen/dependabot/pip/httpx-0.24.1 | 2023-05-22 17:16:16 -04:00
    Bump httpx from 0.24.0 to 0.24.1
dependabot[bot] | 8e41d724ab | Bump httpx from 0.24.0 to 0.24.1 | 2023-05-22 21:05:39 +00:00
    Bumps [httpx](https://github.com/encode/httpx) from 0.24.0 to 0.24.1.
    - [Release notes](https://github.com/encode/httpx/releases)
    - [Changelog](https://github.com/encode/httpx/blob/master/CHANGELOG.md)
    - [Commits](https://github.com/encode/httpx/compare/0.24.0...0.24.1)
    updated-dependencies:
    - dependency-name: httpx
      dependency-type: direct:development
      update-type: version-update:semver-patch
    Signed-off-by: dependabot[bot] <support@github.com>
Marcel Coetzee | e6639e6620 | Change docker build dynamic param to image instead of cuda version | 2023-05-22 10:10:14 +02:00
    Signed-off-by: Marcel Coetzee <marcel@mooncoon.com>
Andrei | 4f7a6daa25 | Merge pull request #248 from localagi/main | 2023-05-22 03:15:15 -04:00
    make git module accessible anonymously
Andrei Betlen | 0adb9ec37a | Use model_name and index in response | 2023-05-21 21:30:03 -04:00
Andrei Betlen | 922b5b2bfd | Merge branch 'main' into server-embedding | 2023-05-21 21:21:38 -04:00
Andrei Betlen | 2c45255a0a | Bump version | 2023-05-21 19:24:20 -04:00
Andrei Betlen | cd102e9da1 | Cache shared library function calls for static tokens | 2023-05-21 19:18:56 -04:00
Andrei Betlen | b895511cca | Fix penalize_nl | 2023-05-21 18:38:06 -04:00
Andrei Betlen | 03e2947b03 | Fix unnecessary memory allocation while sampling | 2023-05-21 18:36:34 -04:00
Andrei Betlen | fafe47114c | Update llama.cpp | 2023-05-21 17:47:21 -04:00
Andrei Betlen | 8f49ca0287 | Bump version | 2023-05-20 08:53:40 -04:00
Andrei Betlen | 76b1d2cd20 | Change properties to functions to match token functions | 2023-05-20 08:24:06 -04:00
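
Several of the commits above (1d247e0f35, 433a2e3e8a, 5bb780d455, 4c1b7f7a76) concern the huggingface-style logits_processor and stopping_criteria hooks. For reference, a minimal sketch of how those hooks might be used with llama-cpp-python as of this period; the model path, the newline token id, and the exact callback shapes are assumptions for illustration, not taken from this log.

    from llama_cpp import Llama, LogitsProcessorList, StoppingCriteriaList

    # Hypothetical local model path.
    llm = Llama(model_path="./models/ggml-model-q4_0.bin")

    def suppress_newline(input_ids, scores):
        # Logits processor: force the newline token's score to -inf so it is never sampled.
        # Token id 13 is assumed to be "\n" in the LLaMA vocabulary.
        scores[13] = -float("inf")
        return scores

    def stop_at_64_tokens(input_ids, logits):
        # Stopping criterion: end generation once the sequence (prompt plus output) reaches 64 tokens.
        return len(input_ids) >= 64

    output = llm(
        "Q: Name the planets in the solar system. A:",
        logits_processor=LogitsProcessorList([suppress_newline]),
        stopping_criteria=StoppingCriteriaList([stop_at_64_tokens]),
    )
    print(output["choices"][0]["text"])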