| Name         | Latest commit message                       | Commit date               |
|--------------|---------------------------------------------|---------------------------|
| ext_server   | llm: always add bos token to prompt (#4941) | 2024-06-08 18:47:10 -07:00 |
| generate     | Add ability to skip oneapi generate         | 2024-06-07 08:32:49 -07:00 |
| ggla.go      | simplify safetensors reading                | 2024-05-21 11:28:22 -07:00 |
| llm.go       | revert tokenize ffi (#4761)                 | 2024-05-31 18:54:21 -07:00 |
| llm_linux.go | Switch back to subprocessing for llama.cpp  | 2024-04-01 16:48:18 -07:00 |
| memory.go    | gofmt, goimports                            | 2024-06-04 13:20:24 -07:00 |
| payload.go   | replace x/exp/slices with slices            | 2024-06-04 11:13:30 -07:00 |
| status.go    | Switch back to subprocessing for llama.cpp  | 2024-04-01 16:48:18 -07:00 |