docs: update link
Commit 945e20fa2c (parent e6a36b840e)
1 changed file with 1 addition and 1 deletion
@@ -78,7 +78,7 @@ Then when you run the server you'll need to also specify the `functionary` chat_
 python3 -m llama_cpp.server --model <model_path> --chat_format functionary
 ```
 
-Check out the example notebook [here](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
+Check out this [example notebook](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
 
 ### Multimodal Models
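For context on the diff above: the `functionary` chat format lets the server accept OpenAI-style function-calling requests. A minimal sketch of assembling such a request payload is shown below; the `get_weather` tool schema and the `build_function_call_request` helper are hypothetical illustrations, not part of the original document or the library's API.

```python
import json

def build_function_call_request(user_message: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload offering one tool.

    Hypothetical sketch: the tool name and schema below are made up for
    illustration; the request shape follows the OpenAI chat-completions
    convention that the functionary format targets.
    """
    return {
        "model": "functionary",
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical example tool
                    "description": "Look up current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

payload = build_function_call_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

A payload like this would be POSTed to the server's OpenAI-compatible chat-completions endpoint; the linked notebook walks through complete end-to-end examples.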