ollama/llama
Michael Yang 4dc5b117dd automatically set num_keep if num_keep < 0
num_keep defines how many tokens to keep in the context when truncating
inputs. If left at its default value of -1, the server calculates
num_keep to be the length of the system instructions.
2023-08-07 16:19:12 -07:00
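The truncation behavior described above can be sketched in Go. This is a hypothetical illustration, not the repository's actual implementation: `truncate`, its parameters (`numCtx` for the context window size, `numKeep` for the tokens to preserve), and the token values are all assumptions made for the example.

```go
package main

import "fmt"

// truncate keeps the first numKeep tokens (e.g. the system instructions)
// and fills the remainder of the numCtx-sized window with the most
// recent tokens, dropping the middle of the input. A numKeep of -1
// would be resolved by the caller to the system-instruction length
// before this function is invoked.
func truncate(tokens []int, numCtx, numKeep int) []int {
	if len(tokens) <= numCtx {
		return tokens
	}
	out := make([]int, 0, numCtx)
	out = append(out, tokens[:numKeep]...)                    // preserved prefix
	out = append(out, tokens[len(tokens)-(numCtx-numKeep):]...) // most recent suffix
	return out
}

func main() {
	tokens := []int{1, 2, 3, 4, 5, 6, 7, 8}
	// Window of 5, keep the first 2: the prefix [1 2] survives and the
	// last 3 tokens fill the rest of the window.
	fmt.Println(truncate(tokens, 5, 2))
}
```

With `numCtx=5` and `numKeep=2`, the middle tokens are discarded and the result is `[1 2 6 7 8]`: the kept prefix plus the three most recent tokens.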
ggml-alloc.c update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-alloc.h update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-cuda.cu update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-cuda.h update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-metal.h update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-metal.m update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-metal.metal update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-mpi.c update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-mpi.h update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-opencl.cpp update llama.cpp 2023-08-03 11:50:24 -07:00
ggml-opencl.h update llama.cpp 2023-08-03 11:50:24 -07:00
ggml.c update llama.cpp 2023-08-03 11:50:24 -07:00
ggml.h update llama.cpp 2023-08-03 11:50:24 -07:00
k_quants.c update llama.cpp 2023-08-03 11:50:24 -07:00
k_quants.h update llama.cpp 2023-08-03 11:50:24 -07:00
llama-util.h update llama.cpp 2023-08-03 11:50:24 -07:00
llama.cpp update llama.cpp 2023-08-03 11:50:24 -07:00
llama.go automatically set num_keep if num_keep < 0 2023-08-07 16:19:12 -07:00
llama.h update llama.cpp 2023-08-03 11:50:24 -07:00
llama_darwin.go override ggml-metal if the file is different 2023-08-02 12:50:30 -07:00
update-llama-cpp.sh update llama.cpp to d91f3f0 2023-07-28 08:07:48 -04:00
utils.go update predict code 2023-07-27 09:31:44 -07:00