b9098b0ef7
Not sure why this union type was here, but taking a look at llama.py, the prompt is only ever processed as a string for completion. The union was breaking types when generating an OpenAPI client.
..
server
__init__.py
llama.py
llama_cpp.py
llama_types.py
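For context, a minimal sketch of the kind of type narrowing the message describes, assuming a Pydantic request model on the server side; the class and field names here are illustrative, not taken from the actual diff:

```python
# Hypothetical sketch: narrowing `prompt` from a union type to a plain string
# so that generated OpenAPI clients see a single, unambiguous type.
from pydantic import BaseModel, Field


class CreateCompletionRequest(BaseModel):
    # Before (assumed): prompt: Union[str, List[str]] = Field(default="")
    # After: llama.py only ever treats the prompt as a single string when
    # running a completion, so the union adds nothing and confuses
    # OpenAPI client generators.
    prompt: str = Field(
        default="", description="The prompt to generate completions for."
    )
    max_tokens: int = Field(
        default=16, ge=1, description="Maximum number of tokens to generate."
    )
    temperature: float = Field(default=0.8, description="Sampling temperature.")
```

With the field typed as a plain `str`, the generated OpenAPI schema emits a simple string property instead of a `oneOf`/`anyOf` variant, which is what most client generators handle cleanly.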