llama.cpp/llama_cpp/server
Lucas Doyle b47b9549d5 llama_cpp server: delete some ignored / unused parameters
`n`, `presence_penalty`, `frequency_penalty`, `best_of`, `logit_bias`, `user`: not supported and excluded from the calls into llama. Decision: delete them (a sketch of the resulting request schema follows the file listing below).
2023-05-01 15:38:19 -07:00
__init__.py llama_cpp server: app is now importable, still runnable as a module 2023-04-29 11:41:25 -07:00
__main__.py llama_cpp server: slight refactor to init_llama function 2023-04-29 11:42:23 -07:00
app.py llama_cpp server: delete some ignored / unused parameters 2023-05-01 15:38:19 -07:00
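A minimal sketch of what the trimmed completion-request schema in app.py might look like after this commit, assuming the server uses FastAPI with Pydantic request models (as llama-cpp-python's server does). The class name, fields, and defaults below are illustrative assumptions, not the actual app.py contents.

```python
# Illustrative sketch (not the actual app.py): an OpenAI-compatible completion
# request model that keeps only the fields the server actually forwards to
# llama-cpp. Field names and defaults are assumptions for illustration.
from typing import List, Optional, Union

from pydantic import BaseModel


class CreateCompletionRequest(BaseModel):
    prompt: Union[str, List[str]]
    suffix: Optional[str] = None
    max_tokens: int = 16
    temperature: float = 0.8
    top_p: float = 0.95
    echo: bool = False
    stop: Optional[List[str]] = None
    stream: bool = False
    # Parameters supported by llama.cpp but outside the OpenAI schema:
    top_k: int = 40
    repeat_penalty: float = 1.1
    # `n`, `presence_penalty`, `frequency_penalty`, `best_of`, `logit_bias`,
    # and `user` are intentionally absent: the server never passed them to
    # llama-cpp, so they were removed rather than silently accepted.
```

Under Pydantic's default configuration, extra fields in a request body are ignored rather than rejected, so dropping the unused parameters simplifies the schema without breaking clients that still send them.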