Commit graph

101 commits

Author  SHA1  Message  Date
Andrei Betlen  507bcc7171  Bump version  2023-09-13 23:15:23 -04:00
Andrei Betlen  3e2250a12e  Update CHANGELOG  2023-09-13 23:14:22 -04:00
Andrei Betlen  60119dbaeb  Update CHANGELOG  2023-09-13 23:13:19 -04:00
Andrei Betlen  83764c5aee  Update CHANGELOG  2023-09-13 21:58:53 -04:00
Andrei Betlen  203ede4ba2  Bump version  2023-09-13 18:07:08 -04:00
Andrei Betlen  1372e4f60e  Update CHANGELOG  2023-09-13 02:50:27 -04:00
Andrei Betlen  8e13520796  Bump version  2023-09-13 01:47:58 -04:00
Andrei Betlen  38cd2ac624  Update CHANGELOG  2023-09-12 20:59:54 -04:00
Andrei Betlen  3f8bc417d7  Bump version  2023-08-25 15:18:15 -04:00
Andrei Betlen  8fc3fa9f1c  Bump version  2023-08-17 23:17:56 -04:00
Andrei Betlen  c7c700b0d4  Bump version  2023-07-24 14:11:21 -04:00
Andrei Betlen  4aaaec561d  Bump version  2023-07-24 13:12:38 -04:00
Andrei Betlen  231123ee1e  Update llama.cpp  2023-07-21 12:41:59 -04:00
Andrei Betlen  a4fe3fe350  Bump version  2023-07-20 18:56:29 -04:00
Andrei Betlen  c9985abc03  Bump version  2023-07-18 13:54:51 -04:00
Andrei Betlen  6d8892fe64  Bump version  2023-07-15 17:13:55 -04:00
Andrei Betlen  6705f9b6c6  Bump version  2023-07-13 23:32:06 -04:00
Andrei Betlen  8e0f6253db  Bump version  2023-07-09 18:20:04 -04:00
Andrei Betlen  df3d545938  Update changelog  2023-07-09 18:13:41 -04:00
Andrei Betlen  0f3c474a49  Bump version  2023-07-09 11:44:29 -04:00
Andrei Betlen  670fe4b701  Update changelog  2023-07-08 03:37:12 -04:00
Andrei Betlen  4c7cdcca00  Add interruptible streaming requests for llama-cpp-python server. Closes #183  2023-07-07 03:04:17 -04:00
Andrei Betlen  a1b2d5c09b  Bump version  2023-07-05 01:06:46 -04:00
Andrei Betlen  4d1eb88b13  Bump version  2023-06-29 00:46:15 -04:00
Andrei Betlen  5193af297b  Bump version  2023-06-26 08:53:54 -04:00
Andrei Betlen  3e7eae4796  Bump Version  2023-06-20 11:25:44 -04:00
Andrei Betlen  e37798777e  Update llama.cpp  2023-06-20 11:25:10 -04:00
Andrei Betlen  92b0013427  Bump version  2023-06-18 09:48:43 -04:00
Andrei Betlen  c7d7d5b656  Update Changelog  2023-06-17 13:39:48 -04:00
Andrei Betlen  60426b23cc  Update llama.cpp  2023-06-17 13:37:14 -04:00
Andrei Betlen  d938e59003  Bump version  2023-06-14 22:15:44 -04:00
Andrei Betlen  74fbaae157  Bump version  2023-06-10 18:19:48 -04:00
Andrei Betlen  bf2bfec615  Update changelog  2023-06-10 12:22:39 -04:00
Andrei Betlen  6b764cab80  Bump version  2023-06-09 23:25:38 -04:00
Andrei Betlen  e3542b6627  Revert "Merge pull request #350 from abetlen/migrate-to-scikit-build-core" (This reverts commit fb2c5f7fd9, reversing changes made to 202ed4464b.)  2023-06-09 23:23:16 -04:00
Andrei Betlen  dd7c7bf80b  Bump version  2023-06-09 11:52:07 -04:00
Andrei Betlen  0da655b3be  Temporarily disable cache until save state bug is fixed.  2023-06-09 11:10:24 -04:00
Andrei Betlen  f2a54ecb4c  Update CHANGELOG  2023-06-09 11:01:42 -04:00
Andrei Betlen  c12138f7bd  Update changelog  2023-06-08 21:53:38 -04:00
Andrei Betlen  90874c01cd  Bump version  2023-06-08 03:26:49 -04:00
Andrei Betlen  0e156ffd66  Fix changelog format  2023-06-06 17:01:10 -04:00
Andrei Betlen  7b57420ea9  Update llama.cpp  2023-06-05 18:17:29 -04:00
Andrei Betlen  18c7b8520e  Bump version  2023-06-04 23:31:51 -04:00
Andrei Betlen  6d5b049801  Update llama.cpp  2023-06-04 23:30:42 -04:00
Andrei Betlen  76e364cdf2  Added 0.1.57 notes  2023-06-04 23:30:10 -04:00
Andrei Betlen  cb0bcdbbb7  Bump version  2023-05-30 03:07:36 -04:00
Andrei Betlen  828f9ec015  Merge branch 'main' of github.com:abetlen/llama_cpp_python into main  2023-05-29 21:39:40 -04:00
Andrei Betlen  b1daf568e3  Update changelog  2023-05-29 21:39:19 -04:00
Andrei Betlen  b0b154cfa6  Add changelog message for numpy  2023-05-26 20:26:08 -04:00
Andrei Betlen  8f35bddd7e  Fix stop sequence performance bug.  2023-05-26 20:23:49 -04:00
Andrei Betlen  030fafe901  Add project changelog  2023-05-26 17:32:34 -04:00