llama.cpp/llama_cpp/server
Latest commit b9098b0ef7 by Lucas Doyle, 2023-05-02 14:47:07 -07:00 — llama_cpp server: prompt is a string

Not sure why this union type was here, but looking at llama.py, prompt is only ever processed as a string for completion.

This was breaking types when generating an OpenAPI client.
__init__.py    llama_cpp server: app is now importable, still runnable as a module    2023-04-29 11:41:25 -07:00
__main__.py    Refactor server to use factory                                         2023-05-01 22:38:46 -04:00
app.py         llama_cpp server: prompt is a string                                   2023-05-02 14:47:07 -07:00
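The commit's rationale can be illustrated with a minimal Pydantic sketch (the model and field names below are hypothetical, not the server's actual definitions, and this is an assumption about the shape of the change): a `Union[str, List[str]]` field produces an `anyOf` JSON schema, which many OpenAPI client generators translate into awkward or broken client-side types, while a plain `str` field produces a simple string schema.

```python
from typing import List, Union
from pydantic import BaseModel


# Hypothetical "before" model: the union type yields an anyOf schema.
class CompletionRequestBefore(BaseModel):
    prompt: Union[str, List[str]] = ""


# Hypothetical "after" model: a plain string yields a simple schema.
class CompletionRequestAfter(BaseModel):
    prompt: str = ""


def schema_of(model):
    # Pydantic v2 exposes model_json_schema(); v1 uses schema().
    fn = getattr(model, "model_json_schema", None) or model.schema
    return fn()


# Compare the generated JSON schema for the prompt field.
before = schema_of(CompletionRequestBefore)["properties"]["prompt"]
after = schema_of(CompletionRequestAfter)["properties"]["prompt"]

print(before)  # contains an "anyOf" with string and array-of-string branches
print(after)   # a plain string schema
```

The `anyOf` branch is what code generators for statically typed languages tend to mishandle, which is consistent with the commit's note that the union "was breaking types when generating an openapi client".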