ollama/llm

Latest commit: 77295f716e by Bruce MacDonald
prevent waiting on exited command (#752)
* prevent waiting on exited command
* close llama runner once
2023-10-11 12:32:13 -04:00
File          Last commit message                                Date
llama.cpp     llm: fix build on amd64                            2023-10-06 14:39:54 -07:00
falcon.go     starcoder                                          2023-10-02 19:56:51 -07:00
ggml.go       starcoder                                          2023-10-02 19:56:51 -07:00
gguf.go       starcoder                                          2023-10-02 19:56:51 -07:00
llama.go      prevent waiting on exited command (#752)           2023-10-11 12:32:13 -04:00
llm.go        enable q8, q5, 5_1, and f32 for linux gpu (#699)   2023-10-05 12:53:47 -04:00
starcoder.go  starcoder                                          2023-10-02 19:56:51 -07:00
utils.go      partial decode ggml bin for more info              2023-08-10 09:23:10 -07:00