llama.cpp/examples
Latest commit: 153a0049d9 by Andrei (2024-02-12 15:56:07 -05:00)
feat: Generic chatml Function Calling (#957)

* Add demo notebook
* Add initial chat handler
* Update OpenAI types
* Add generic chatml function calling (wip)
* Update chatml generic function calling.
* Progress on auto-tool calls
* fix streaming functions
* Remove print statements
* fix: Suppress output from llama.cpp init and grammar creation
* Add OpenAI v1 python api compatible chat completion function
* Support non-streaming multi-tool calls
* Format
* Include function_call in response.
high_level_api    fix: Run server command. Closes #1143                              2024-01-31 10:37:19 -05:00
low_level_api     Fix low_level_api_chat_cpp example to match current API (#1086)    2024-01-15 10:46:35 -05:00
notebooks         feat: Generic chatml Function Calling (#957)                       2024-02-12 15:56:07 -05:00
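
The function calling described in this commit is reached through llama-cpp-python's OpenAI-compatible create_chat_completion API. The sketch below is a minimal example under a few assumptions: the generic handler is assumed to be registered under the chat_format name "chatml-function-calling", the model path is a placeholder, and the get_weather tool is purely illustrative and not part of this repository.

    from llama_cpp import Llama

    # Placeholder GGUF path; any chat-capable model can be used with the generic handler.
    llm = Llama(
        model_path="./models/model.gguf",
        chat_format="chatml-function-calling",  # assumed handler name for this feature
        n_ctx=4096,
        verbose=False,  # the commit also suppresses llama.cpp init/grammar output
    )

    response = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What is the weather in Berlin?"},
        ],
        tools=[
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # illustrative tool, not from the repo
                    "description": "Get the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        # Forcing a named function keeps the example deterministic; the commit
        # notes work on automatic tool choice as well.
        tool_choice={"type": "function", "function": {"name": "get_weather"}},
    )

    # Per the commit message, the returned assistant message includes the
    # tool call (and a function_call field) when the model invokes a tool.
    print(response["choices"][0]["message"])

Streaming responses and non-streaming multi-tool calls, both mentioned in the commit message, go through the same create_chat_completion entry point; this sketch only shows the simplest single-tool case.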