Author | Commit | Message | Date
Mug | eaf9f19aa9 | Fix lora | 2023-05-08 15:27:42 +02:00
Mug | 2c0d9b182c | Fix session loading and saving in low level example chat | 2023-05-08 15:27:03 +02:00
Mug | fd80ddf703 | Fix a bug with wrong type | 2023-05-06 22:22:28 +02:00
Mug | 996f63e9e1 | Add utf8 to chat example | 2023-05-06 15:16:58 +02:00
Mug | 3ceb47b597 | Fix mirastat requiring c_float | 2023-05-06 13:35:50 +02:00
Mug | 1895c11033 | Rename postfix to suffix to match upstream | 2023-05-06 13:18:25 +02:00
Mug | 0e9f227afd | Update low level examples | 2023-05-04 18:33:08 +02:00
Mug | c39547a986 | Detect multi-byte responses and wait | 2023-04-28 12:50:30 +02:00
Mug | 5f81400fcb | Also ignore errors on input prompts | 2023-04-26 14:45:51 +02:00
Mug | 3c130f00ca | Remove try catch from chat | 2023-04-26 14:38:53 +02:00
Mug | c4a8491d42 | Fix decode errors permanently | 2023-04-26 14:37:06 +02:00
Mug | 53d17ad003 | Fixed end of text wrong type, and fix n_predict behaviour | 2023-04-17 14:45:28 +02:00
Mug | 0cccb41a8f | Added iterative search to prevent instructions from being echoed, add ignore eos, add no-mmap, fixed 1 character echo too much bug | 2023-04-10 16:35:38 +02:00
Mug | 16fc5b5d23 | More interoperability to the original llama.cpp, and arguments now work | 2023-04-07 13:32:19 +02:00