Update Functions notebook
This commit is contained in:
parent
85ead98a3e
commit
74167bdfb2
1 changed file with 29 additions and 4 deletions
@@ -4,7 +4,26 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "## Function Calling with OpenAI Python Client"
+   "# Functions\n",
+   "\n",
+   "The OpenAI compatible web server in `llama-cpp-python` supports function calling.\n",
+   "\n",
+   "Function calling allows API clients to specify a schema that gives the model a format it should respond in.\n",
+   "Function calling in `llama-cpp-python` works by combining models pretrained for function calling such as [`functionary`](https://huggingface.co/abetlen/functionary-7b-v1-GGUF) with constrained sampling to produce a response that is compatible with the schema.\n",
+   "\n",
+   "Note, however, that this improves but does not guarantee that the response will be compatible with the schema.\n",
+   "\n",
+   "## Requirements\n",
+   "\n",
+   "Before we begin you will need the following:\n",
+   "\n",
+   "- A running `llama-cpp-python` server with a function calling compatible model. [See here](https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling)\n",
+   "- The OpenAI Python Client (`pip install openai`)\n",
+   "- (Optional) The Instructor Python Library (`pip install instructor`)\n",
+   "\n",
+   "## Function Calling with OpenAI Python Client\n",
+   "\n",
+   "We'll start with a basic demo that only uses the OpenAI Python Client."
   ]
  },
  {
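The intro text added in this hunk describes constrained sampling. In `llama-cpp-python` this happens at the token level via llama.cpp grammars, but the idea can be conveyed with a character-level toy: the sampler may only emit characters that keep the output a prefix of some schema-valid string. The `valid_outputs` list and function names below are purely illustrative, not the library's actual machinery.

```python
# Toy illustration of constrained sampling: only continuations that keep the
# output a prefix of some valid string are allowed.
valid_outputs = ['{"unit": "celsius"}', '{"unit": "fahrenheit"}']

def allowed_next_chars(prefix: str) -> set[str]:
    """Characters the 'model' is allowed to emit next."""
    return {v[len(prefix)] for v in valid_outputs
            if v.startswith(prefix) and len(v) > len(prefix)}

def greedy_constrained_decode(prefer: str) -> str:
    """Decode greedily, taking the preferred character whenever it is legal."""
    out = ""
    while True:
        allowed = allowed_next_chars(out)
        if not allowed:
            return out  # no legal continuation left: output is complete
        out += prefer if prefer in allowed else sorted(allowed)[0]
```

Whatever the "model" prefers, the result is always one of the valid outputs; this is why constrained sampling makes schema-compatible responses far more likely even from an imperfect model.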
@@ -27,7 +46,7 @@
   "\n",
   "client = openai.OpenAI(\n",
   "    api_key = \"sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\", # can be anything\n",
-   "    base_url = \"http://100.64.159.73:8000/v1\"\n",
+   "    base_url = \"http://100.64.159.73:8000/v1\" # NOTE: Replace with the IP address and port of your llama-cpp-python server\n",
   ")\n",
   "\n",
   "# Example dummy function hard coded to return the same weather\n",
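For context, the `client` configured in this hunk is then given a tool schema and a local function to dispatch to. A self-contained sketch of those two pieces follows; the schema uses the standard OpenAI tools format, but the exact field values and the dispatch loop are illustrative rather than the notebook's exact cells.

```python
import json

# Example dummy function hard coded to return the same weather, as in the notebook
def get_current_weather(location, unit="fahrenheit"):
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

# Tool schema sent with the chat request so the model knows the call format
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}]

# When the model responds with a tool call, look up and invoke the local function
available_functions = {"get_current_weather": get_current_weather}
arguments = json.loads('{"location": "Boston, MA"}')  # model-produced arguments JSON
result = available_functions["get_current_weather"](**arguments)
```

The `result` string would normally be appended to the conversation as a `tool` message so the model can compose its final answer.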
@@ -113,13 +132,19 @@
   "source": [
   "# Function Calling with Instructor\n",
   "\n",
-   "You'll need to install the [`instructor`](https://github.com/jxnl/instructor/) package to run this notebook. You can do so by running the following command in your terminal:\n",
+   "The above example is a bit verbose and requires you to manually verify the schema.\n",
+   "\n",
+   "For our next examples we'll use the `instructor` library to simplify the process and accomplish a number of different tasks with function calling.\n",
+   "\n",
+   "You'll first need to install the [`instructor`](https://github.com/jxnl/instructor/) package.\n",
+   "\n",
+   "You can do so by running the following command in your terminal:\n",
   "\n",
   "```bash\n",
   "pip install instructor\n",
   "```\n",
   "\n",
-   "We'll highlight a few basic examples taken from the [instructor cookbook](https://jxnl.github.io/instructor/)\n",
+   "Below we'll go through a few basic examples taken directly from the [instructor cookbook](https://jxnl.github.io/instructor/).\n",
   "\n",
   "## Basic Usage"
  ]
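The "manually verify the schema" step mentioned in this hunk is roughly what `instructor` (via pydantic validation) automates away. A stdlib-only sketch of that manual check is shown below; the `validate_against_schema` helper is hypothetical and not part of `instructor` or the OpenAI client.

```python
import json

def validate_against_schema(payload: dict, schema: dict) -> list[str]:
    """Minimal manual check: required keys present, enum values respected."""
    errors = []
    props = schema.get("properties", {})
    for key in schema.get("required", []):
        if key not in payload:
            errors.append(f"missing required field: {key}")
    for key, value in payload.items():
        spec = props.get(key, {})
        if "enum" in spec and value not in spec["enum"]:
            errors.append(f"{key}={value!r} not in {spec['enum']}")
    return errors

schema = {
    "type": "object",
    "properties": {
        "location": {"type": "string"},
        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
    },
    "required": ["location"],
}

# A well-formed model reply passes; a malformed one surfaces errors
reply = json.loads('{"location": "Boston, MA", "unit": "celsius"}')
errors = validate_against_schema(reply, schema)
```

With `instructor`, this bookkeeping is replaced by declaring a pydantic model and letting the patched client parse and validate the response for you.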