docs: update link

Andrei Betlen 2023-11-24 00:18:32 -05:00
parent e6a36b840e
commit 945e20fa2c


@@ -78,7 +78,7 @@ Then when you run the server you'll need to also specify the `functionary` chat_format
 python3 -m llama_cpp.server --model <model_path> --chat_format functionary
 ```
-Check out the example notebook [here](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
+Check out this [example notebook](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
 ### Multimodal Models
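
For context, here is a minimal sketch of calling a server started as in the diff above, through its OpenAI-compatible API. The base URL, port, model name, and the `get_weather` tool schema are illustrative assumptions, not part of this commit; the linked notebook covers these use cases in depth.

```python
# Sketch: function calling against a local llama_cpp.server started with
# --chat_format functionary (assumes the default host/port and that the
# openai>=1.0 client is installed). Tool name and schema are hypothetical.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed default server address
    api_key="sk-no-key-needed",           # local server does not validate the key
)

response = client.chat.completions.create(
    model="functionary",  # model name is illustrative
    messages=[{"role": "user", "content": "What is the weather in Berlin?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical function for illustration
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
)

# The model may respond with a tool call instead of plain text.
print(response.choices[0].message)
```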