llama.cpp/llama_cpp
Lucas Doyle b9098b0ef7 llama_cpp server: prompt is a string
Not sure why this union type was here, but looking at llama.py, `prompt` is only ever processed as a string for completion.

The union type was breaking type generation for an OpenAPI client.
2023-05-02 14:47:07 -07:00
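The commit narrows the request model's `prompt` field from a union type to a plain string. A minimal sketch of the idea, using a hypothetical `CreateCompletionRequest` dataclass rather than the server's actual Pydantic model (field names and defaults here are illustrative assumptions, not the real schema):

```python
from dataclasses import dataclass

# Hypothetical request model illustrating the change: previously `prompt`
# would have been typed as Union[str, List[str]], but since the server only
# ever processes it as a single string, a plain `str` annotation is enough
# and keeps generated OpenAPI client types simple.
@dataclass
class CreateCompletionRequest:
    prompt: str = ""
    max_tokens: int = 16

# Callers always pass a single string prompt.
req = CreateCompletionRequest(prompt="Q: What is the capital of France? A:")
assert isinstance(req.prompt, str)
```

With the union removed, an OpenAPI generator emits a single string type for the field instead of a `oneOf` schema, which is what the commit message says was breaking client generation.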
server          llama_cpp server: prompt is a string                2023-05-02 14:47:07 -07:00
__init__.py     Black formatting                                    2023-03-24 14:59:29 -04:00
llama.py        Formatting                                          2023-05-01 21:51:16 -04:00
llama_cpp.py    Update sampling api                                 2023-05-01 14:47:55 -04:00
llama_types.py  llama_cpp server: fix to ChatCompletionRequestMessage  2023-05-01 15:38:19 -07:00