1e42913599
I think this is actually supported (it's among the arguments of `Llama.__call__`, which is how the completion is invoked). Decision: mark as supported.
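One quick way to verify a claim like this is to inspect the call signature directly. A minimal sketch; the stand-in `Llama` class below is hypothetical (in practice you would import the real class from `llama_cpp`), and `stop` is only a placeholder since the comment does not name the parameter under review:

```python
import inspect

# Hypothetical stand-in for llama_cpp.Llama; the real __call__ takes
# many more parameters, but the check works the same way.
class Llama:
    def __call__(self, prompt, max_tokens=16, temperature=0.8, stop=None):
        ...

# List the arguments of the completion entry point.
params = inspect.signature(Llama.__call__).parameters

# Confirm the parameter in question appears in the signature.
assert "stop" in params
```

The same one-liner against the installed package settles whether a given option is wired through the completion API.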
Files:

- server/
- __init__.py
- llama.py
- llama_cpp.py
- llama_types.py