diff --git a/docs/server.md b/docs/server.md
index ad48130..4e1e562 100644
--- a/docs/server.md
+++ b/docs/server.md
@@ -78,7 +78,7 @@ Then when you run the server you'll need to also specify the `functionary` chat_
 python3 -m llama_cpp.server --model --chat_format functionary
 ```
 
-Check out the example notebook [here](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
+Check out this [example notebook](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
 
 ### Multimodal Models
 