ollama/llm/ext_server

Latest commit 717f7229eb by Jeffrey Morgan, 2024-06-28 19:39:31 -07:00:
Do not shift context for sliding window models (#5368)
* Do not shift context for sliding window models
* truncate prompt > 2/3 tokens
* only target gemma2
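The commit message above says that for sliding-window models (only gemma2 is targeted), the context is not shifted; instead, an over-long prompt is truncated to at most 2/3 of the context. A minimal sketch of that truncation idea, assuming illustrative names (`truncate_prompt`, `n_ctx`) that are not the actual server.cpp API, and assuming the most recent tokens are the ones kept:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch: sliding-window attention makes shifting the KV
// cache unsafe, so instead of shifting, the prompt is cut down when it
// exceeds 2/3 of the context window. Names are illustrative only.
std::vector<int> truncate_prompt(std::vector<int> tokens, size_t n_ctx) {
    const size_t limit = (2 * n_ctx) / 3;  // keep at most 2/3 of the context
    if (tokens.size() > limit) {
        // Assumption for this sketch: drop the oldest tokens and keep
        // the most recent `limit` tokens.
        tokens.erase(tokens.begin(), tokens.end() - limit);
    }
    return tokens;
}
```

With `n_ctx = 9`, a 10-token prompt would be cut to its last 6 tokens, while a 3-token prompt passes through unchanged.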
CMakeLists.txt  Switch back to subprocessing for llama.cpp              2024-04-01 16:48:18 -07:00
httplib.h       Import server.cpp as of b2356                           2024-03-12 13:58:06 -07:00
json.hpp        Import server.cpp as of b2356                           2024-03-12 13:58:06 -07:00
server.cpp      Do not shift context for sliding window models (#5368)  2024-06-28 19:39:31 -07:00
utils.hpp       log clean up                                            2024-05-09 14:55:36 -07:00