| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Mug | c4a8491d42 | Fix decode errors permanently | 2023-04-26 14:37:06 +02:00 |
| Mug | 53d17ad003 | Fix wrong end-of-text token type and fix n_predict behaviour | 2023-04-17 14:45:28 +02:00 |
| Mug | 3bb45f1658 | More reasonable defaults | 2023-04-10 16:38:45 +02:00 |
| Mug | 0cccb41a8f | Add iterative search to prevent instructions from being echoed; add ignore-eos and no-mmap; fix one-character over-echo bug | 2023-04-10 16:35:38 +02:00 |
| Andrei Betlen | 196650ccb2 | Update model paths to make clear they should point to a file | 2023-04-09 22:45:55 -04:00 |
| Mug | 16fc5b5d23 | More interoperability with the original llama.cpp; arguments now work | 2023-04-07 13:32:19 +02:00 |
| Mug | 10c7571117 | Fix excess newlines; now onto args. Still needs packaging work so you could run "python -m llama_cpp.examples." etc. | 2023-04-06 15:33:22 +02:00 |
| Mug | 085cc92b1f | Better llama.cpp interoperability. Still has some excess-newline issues, so WIP | 2023-04-06 15:30:57 +02:00 |
| Mug | 283e59c5e9 | Fix bug where init_break was not set when exiting via antiprompt, among others | 2023-04-05 14:47:24 +02:00 |
| Mug | 99ceecfccd | Move to new examples directory | 2023-04-05 14:28:02 +02:00 |
| Andrei Betlen | b1babcf56c | Add quantize example | 2023-04-05 04:17:26 -04:00 |
| Andrei Betlen | c8e13a78d0 | Re-organize examples folder | 2023-04-05 04:10:13 -04:00 |
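
Commit 196650ccb2 above clarifies that the example model paths should point to the model file itself, not its containing directory. A minimal sketch of what that looks like with the library's `Llama` class; the model filename and prompt here are illustrative, not from the commits:

```python
from llama_cpp import Llama

# model_path must point to the model *file* (e.g. a ggml .bin),
# not to the directory that contains it.
llm = Llama(model_path="./models/7B/ggml-model.bin")

output = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(output["choices"][0]["text"])
```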