ollama/llm/llama.cpp

Latest commit: 0c7a00a264 "bump submodules" by Michael Yang, 2023-10-23 11:17:59 -07:00
    pin to 9e70cc03229df19ca2d28ce23cc817198f897278 for now since
    438c2ca83045a00ef244093d27e9ed41a8cb4ea9 is breaking
ggml@9e232f0234 subprocess llama.cpp server (#401) 2023-08-30 16:35:03 -04:00
gguf@9e70cc0322 bump submodules 2023-10-23 11:17:59 -07:00
patches update default log target 2023-10-23 10:44:50 -07:00
generate_darwin_amd64.go update default log target 2023-10-23 10:44:50 -07:00
generate_darwin_arm64.go update default log target 2023-10-23 10:44:50 -07:00
generate_linux.go update default log target 2023-10-23 10:44:50 -07:00
generate_windows.go update default log target 2023-10-23 10:44:50 -07:00
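The "bump submodules" commit above pins the gguf submodule to a known-good commit (9e70cc0322…) rather than tracking the branch tip, because a later upstream commit (438c2ca830…) is breaking. A sketch of how such a pin is typically recorded, assuming the submodule lives at llm/llama.cpp/gguf and its remote is named origin (both are assumptions based on this listing, not confirmed by the source):

```shell
# Enter the submodule checkout (path assumed from this listing).
cd llm/llama.cpp/gguf
git fetch origin
# Detach onto the known-good commit instead of the branch tip.
git checkout 9e70cc03229df19ca2d28ce23cc817198f897278
cd -
# Stage the updated submodule pointer in the parent repo and commit it.
git add llm/llama.cpp/gguf
git commit -m "bump submodules"
```

After this commit, anyone running `git submodule update --init` in the parent repository gets the pinned commit, so the breaking upstream change never enters the build until the pin is moved forward again.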