diff --git a/docs/server.md b/docs/server.md
index f90b0a9..ad48130 100644
--- a/docs/server.md
+++ b/docs/server.md
@@ -66,12 +66,13 @@ Then just update your settings in `.vscode/settings.json` to point to your code
 ### Function Calling
 
 `llama-cpp-python` supports structured function calling based on a JSON schema.
+Function calling is completely compatible with the OpenAI function calling API and can be used by connecting with the official OpenAI Python client.
 
 You'll first need to download one of the available function calling models in GGUF format:
 
 - [functionary-7b-v1](https://huggingface.co/abetlen/functionary-7b-v1-GGUF)
 
 Then when you run the server you'll need to also specify the `functionary-7b-v1` chat_format
 
 ```bash
 python3 -m llama_cpp.server --model --chat_format functionary