baalajimaestro/ollama
Path: ollama/llm/ext_server (at commit 5b806d8d24)
Latest commit: fcf4d60eee by jmorganca, 2024-04-30 17:38:44 -04:00: llm: add back check for empty token cache
File            Last commit message                                              Date
CMakeLists.txt  Switch back to subprocessing for llama.cpp                       2024-04-01 16:48:18 -07:00
httplib.h       Import server.cpp as of b2356                                    2024-03-12 13:58:06 -07:00
json.hpp        Import server.cpp as of b2356                                    2024-03-12 13:58:06 -07:00
server.cpp      llm: add back check for empty token cache                        2024-04-30 17:38:44 -04:00
utils.hpp       add license in file header for vendored llama.cpp code (#3351)   2024-03-26 16:23:23 -04:00