docs: Add link to function calling notebook

Andrei Betlen 2023-11-24 00:15:02 -05:00
parent de2e2bc083
commit 19e02f1f87


@@ -77,6 +77,8 @@ Then when you run the server you'll need to also specify the `functionary-7b-v1` chat format
python3 -m llama_cpp.server --model <model_path> --chat_format functionary
```
Check out the example notebook [here](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
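
As a quick orientation before opening the notebook, here is a minimal sketch (not taken from the notebook itself) of sending a function-calling request to the local OpenAI-compatible server started above. It assumes the server is listening on `http://localhost:8000`, that the `openai` Python package (v1+) is installed, and that the `get_current_weather` function schema is purely illustrative.

```python
# Minimal sketch: ask the locally running llama-cpp-python server to decide
# whether to call a (hypothetical) get_current_weather function.
from openai import OpenAI

# Point the client at the local OpenAI-compatible server; the API key is unused.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-no-key-needed")

response = client.chat.completions.create(
    model="functionary",  # illustrative model name; the server serves the loaded model
    messages=[{"role": "user", "content": "What is the weather in Berlin right now?"}],
    functions=[
        {
            "name": "get_current_weather",  # hypothetical function schema for illustration
            "description": "Get the current weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
    function_call="auto",
)

# If the model chose to call the function, the message contains a function_call
# with the function name and JSON-encoded arguments.
print(response.choices[0].message)
```

The notebook linked above covers the full round trip, including executing the function and feeding its result back to the model.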
### Multimodal Models
`llama-cpp-python` supports the llava1.5 family of multi-modal models which allow the language model to