ollama/llm
Last updated: 2023-10-05 12:53:47 -04:00

File          Last commit message                             Date
llama.cpp     silence warm up log                             2023-09-21 14:53:33 -07:00
falcon.go     starcoder                                       2023-10-02 19:56:51 -07:00
ggml.go       starcoder                                       2023-10-02 19:56:51 -07:00
gguf.go       starcoder                                       2023-10-02 19:56:51 -07:00
llama.go      increase streaming buffer size (#692)           2023-10-04 14:09:00 -04:00
llm.go        enable q8, q5, 5_1, and f32 for linux gpu (#699) 2023-10-05 12:53:47 -04:00
starcoder.go  starcoder                                       2023-10-02 19:56:51 -07:00
utils.go      partial decode ggml bin for more info           2023-08-10 09:23:10 -07:00