llama.cpp/llama_cpp
Latest commit: llama_cpp server: slight refactor to init_llama function (Lucas Doyle, efe8e6f879, 2023-04-29 11:42:23 -07:00)

Define an init_llama function that starts llama with the supplied settings, instead of doing it in the global context of app.py.

This makes the test less brittle: it no longer needs to modify os.environ before importing the app.
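The refactor described in this commit replaces import-time startup with an explicit initialization call. A minimal sketch of that pattern, assuming a pydantic BaseSettings model and a module-level llama handle (the field names, the fallback behavior, and the Settings class shown here are illustrative assumptions, not the server's exact API):

```python
from typing import Optional

import llama_cpp
from pydantic import BaseSettings  # pydantic v1-style settings, as used at the time


class Settings(BaseSettings):
    # Illustrative fields only; the real server exposes more options.
    model: str
    n_ctx: int = 2048
    n_threads: Optional[int] = None


# Module-level handle, populated by init_llama() instead of at import time.
llama: Optional[llama_cpp.Llama] = None


def init_llama(settings: Optional[Settings] = None) -> None:
    """Create the global Llama instance from explicit settings.

    Because nothing happens at import time, a test can call
    init_llama(Settings(model="...")) directly instead of setting
    environment variables before importing the app.
    """
    global llama
    if settings is None:
        # Fall back to environment variables via pydantic's BaseSettings.
        settings = Settings()
    llama = llama_cpp.Llama(
        model_path=settings.model,
        n_ctx=settings.n_ctx,
        n_threads=settings.n_threads,
    )
```

With this shape, a test can construct its own Settings object and call init_llama() explicitly, rather than mutating os.environ and then importing app.py.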
File            | Last commit message                                       | Last commit date
server          | llama_cpp server: slight refactor to init_llama function | 2023-04-29 11:42:23 -07:00
__init__.py     | Black formatting                                          | 2023-03-24 14:59:29 -04:00
llama.py        | Remove excessive errors="ignore" and add utf8 test        | 2023-04-29 12:19:22 +02:00
llama_cpp.py    | Update llama.cpp                                          | 2023-04-28 15:32:43 -04:00
llama_types.py  | Bugfix for Python3.7                                      | 2023-04-05 04:37:33 -04:00