docs: edit function calling docs
parent 8c3aa7858b
commit e6a36b840e
1 changed file with 2 additions and 1 deletion

@@ -66,12 +66,13 @@ Then just update your settings in `.vscode/settings.json` to point to your code

### Function Calling

`llama-cpp-python` supports structured function calling based on a JSON schema.
+Function calling is completely compatible with the OpenAI function calling API and can be used by connecting with the official OpenAI Python client.

You'll first need to download one of the available function calling models in GGUF format:

- [functionary-7b-v1](https://huggingface.co/abetlen/functionary-7b-v1-GGUF)

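For reference, one way to fetch the GGUF file is with the `huggingface_hub` package. The sketch below is illustrative only: the quantization filename is an assumption, so check the repository's file list for the one you actually want.

```python
# Sketch: download a functionary GGUF file with huggingface_hub
# (pip install huggingface_hub). The filename is an assumption --
# pick the actual quantization from the repository's file list.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="abetlen/functionary-7b-v1-GGUF",
    filename="functionary-7b-v1.Q4_K_M.gguf",  # hypothetical filename
)
print(model_path)  # pass this path as --model when starting the server
```
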
-Then when you run the server you'll need to also specify the `functionary-7b-v1` chat_format
+Then when you run the server you'll need to also specify the `functionary` chat_format

```bash
python3 -m llama_cpp.server --model <model_path> --chat_format functionary
```
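
Since the server exposes an OpenAI-compatible API (as the added line above notes), connecting from the official OpenAI Python client can look roughly like the sketch below. It assumes the server is running on its default `http://localhost:8000`, uses a made-up `get_current_weather` schema, and passes a placeholder API key, which a local server does not check. Depending on your `llama-cpp-python` version you may need the legacy `functions`/`function_call` parameters instead of `tools`.

```python
# Sketch: call the locally running llama-cpp-python server with the
# official OpenAI Python client (pip install openai). Assumes the
# default host/port; get_current_weather is a hypothetical function.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="functionary",  # placeholder; the server serves whatever --model loaded
    messages=[{"role": "user", "content": "What is the weather in Berlin today?"}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    tool_choice="auto",
)

# If the model decides to call the function, the tool calls carry JSON
# arguments matching the schema above.
print(response.choices[0].message.tool_calls)
```

If the model opts to call the function, the response's tool call `arguments` are a JSON string that validates against the schema supplied in the request.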