From 945e20fa2c99a94b2101401001c0f77c006bbeb2 Mon Sep 17 00:00:00 2001
From: Andrei Betlen
Date: Fri, 24 Nov 2023 00:18:32 -0500
Subject: [PATCH] docs: update link

---
 docs/server.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/server.md b/docs/server.md
index ad48130..4e1e562 100644
--- a/docs/server.md
+++ b/docs/server.md
@@ -78,7 +78,7 @@ Then when you run the server you'll need to also specify the `functionary` chat_
 python3 -m llama_cpp.server --model --chat_format functionary
 ```
 
-Check out the example notebook [here](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
+Check out this [example notebook](https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb) for a walkthrough of some interesting use cases for function calling.
 
 ### Multimodal Models