ollama/llm — directory listing (latest commit: 2024-03-26 16:23:16 -04:00)
| Name | Last commit message | Date |
| --- | --- | --- |
| ext_server | Bump llama.cpp to b2474 | 2024-03-23 09:54:56 +01:00 |
| generate | remove need for $VSINSTALLDIR since build will fail if ninja cannot be found (#3350) | 2024-03-26 16:23:16 -04:00 |
| llama.cpp @ ad3a0505e3 | Bump llama.cpp to b2527 | 2024-03-25 13:47:44 -07:00 |
| patches | Bump llama.cpp to b2474 | 2024-03-23 09:54:56 +01:00 |
| dyn_ext_server.c | Revamp ROCm support | 2024-03-07 10:36:50 -08:00 |
| dyn_ext_server.go | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| dyn_ext_server.h | Always dynamically load the llm server library | 2024-01-11 08:42:47 -08:00 |
| ggla.go | refactor readseeker | 2024-03-12 12:54:18 -07:00 |
| ggml.go | refactor readseeker | 2024-03-12 12:54:18 -07:00 |
| gguf.go | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| llama.go | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| llm.go | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| payload_common.go | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| payload_darwin_amd64.go | update llama.cpp submodule to 77d1ac7 (#3030) | 2024-03-09 15:55:34 -08:00 |
| payload_darwin_arm64.go | Add multiple CPU variants for Intel Mac | 2024-01-17 15:08:54 -08:00 |
| payload_linux.go | Revamp ROCm support | 2024-03-07 10:36:50 -08:00 |
| payload_test.go | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| payload_windows.go | Add multiple CPU variants for Intel Mac | 2024-01-17 15:08:54 -08:00 |
| utils.go | partial decode ggml bin for more info | 2023-08-10 09:23:10 -07:00 |