Commit graph

16 commits

Author        | SHA1       | Message                                                              | Date
Andrei Betlen | dd7c7bf80b | Bump version                                                         | 2023-06-09 11:52:07 -04:00
Andrei Betlen | 0da655b3be | Temporarily disable cache until save state bug is fixed.             | 2023-06-09 11:10:24 -04:00
Andrei Betlen | f2a54ecb4c | Update CHANGELOG                                                     | 2023-06-09 11:01:42 -04:00
Andrei Betlen | c12138f7bd | Update changelog                                                     | 2023-06-08 21:53:38 -04:00
Andrei Betlen | 90874c01cd | Bump version                                                         | 2023-06-08 03:26:49 -04:00
Andrei Betlen | 0e156ffd66 | Fix changelog format                                                 | 2023-06-06 17:01:10 -04:00
Andrei Betlen | 7b57420ea9 | Update llama.cpp                                                     | 2023-06-05 18:17:29 -04:00
Andrei Betlen | 18c7b8520e | Bump version                                                         | 2023-06-04 23:31:51 -04:00
Andrei Betlen | 6d5b049801 | Update llama.cpp                                                     | 2023-06-04 23:30:42 -04:00
Andrei Betlen | 76e364cdf2 | Added 0.1.57 notes                                                   | 2023-06-04 23:30:10 -04:00
Andrei Betlen | cb0bcdbbb7 | Bump version                                                         | 2023-05-30 03:07:36 -04:00
Andrei Betlen | 828f9ec015 | Merge branch 'main' of github.com:abetlen/llama_cpp_python into main | 2023-05-29 21:39:40 -04:00
Andrei Betlen | b1daf568e3 | Update changelog                                                     | 2023-05-29 21:39:19 -04:00
Andrei Betlen | b0b154cfa6 | Add changelog message for numpy                                      | 2023-05-26 20:26:08 -04:00
Andrei Betlen | 8f35bddd7e | Fix stop sequence performance bug.                                   | 2023-05-26 20:23:49 -04:00
Andrei Betlen | 030fafe901 | Add project changelog                                                | 2023-05-26 17:32:34 -04:00