From 9b3a5939f3fdb2cdb5b919f12bf709a07e4113d0 Mon Sep 17 00:00:00 2001
From: Andrei Betlen
Date: Fri, 22 Dec 2023 14:40:13 -0500
Subject: [PATCH] docs: Add multi-model link to readme

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2f413db..b07f449 100644
--- a/README.md
+++ b/README.md
@@ -18,7 +18,7 @@ This package provides:
 - [Local Copilot replacement](https://llama-cpp-python.readthedocs.io/en/latest/server/#code-completion)
 - [Function Calling support](https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling)
 - [Vision API support](https://llama-cpp-python.readthedocs.io/en/latest/server/#multimodal-models)
-
+ - [Multiple Models](https://llama-cpp-python.readthedocs.io/en/latest/server/#configuration-and-multi-model-support)
 
 Documentation is available at [https://llama-cpp-python.readthedocs.io/en/latest](https://llama-cpp-python.readthedocs.io/en/latest).
 
@@ -332,6 +332,7 @@ For possible options, see [llama_cpp/llama_chat_format.py](llama_cpp/llama_chat_
 - [Local Copilot replacement](https://llama-cpp-python.readthedocs.io/en/latest/server/#code-completion)
 - [Function Calling support](https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling)
 - [Vision API support](https://llama-cpp-python.readthedocs.io/en/latest/server/#multimodal-models)
+- [Multiple Models](https://llama-cpp-python.readthedocs.io/en/latest/server/#configuration-and-multi-model-support)
 
 ## Docker image
 