File | Last commit message | Date
llama.cpp | silence warm up log | 2023-09-21 14:53:33 -07:00
falcon.go | fix: add falcon.go | 2023-09-13 14:47:37 -07:00
ggml.go | unbound max num gpu layers (#591) | 2023-09-25 18:36:46 -04:00
gguf.go | unbound max num gpu layers (#591) | 2023-09-25 18:36:46 -04:00
llama.go | windows runner fixes (#637) | 2023-09-29 11:47:55 -04:00
llm.go | unbound max num gpu layers (#591) | 2023-09-25 18:36:46 -04:00
utils.go | partial decode ggml bin for more info | 2023-08-10 09:23:10 -07:00