Commit 74167bd

Update Functions notebook
1 parent 85ead98 commit 74167bd

1 file changed: 29 additions, 4 deletions

examples/notebooks/Functions.ipynb

@@ -4,7 +4,26 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Function Calling with OpenAI Python Client"
+    "# Functions\n",
+    "\n",
+    "The OpenAI compatible web server in `llama-cpp-python` supports function calling.\n",
+    "\n",
+    "Function calling allows API clients to specify a schema that gives the model a format it should respond in.\n",
+    "Function calling in `llama-cpp-python` works by combining models pretrained for function calling such as [`functionary`](https://huggingface.co/abetlen/functionary-7b-v1-GGUF) with constrained sampling to produce a response that is compatible with the schema.\n",
+    "\n",
+    "Note however that this improves but does not guarantee that the response will be compatible with the schema.\n",
+    "\n",
+    "## Requirements\n",
+    "\n",
+    "Before we begin you will need the following:\n",
+    "\n",
+    "- A running `llama-cpp-python` server with a function calling compatible model. [See here](https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling)\n",
+    "- The OpenAI Python Client `pip install openai`\n",
+    "- (Optional) The Instructor Python Library `pip install instructor`\n",
+    "\n",
+    "## Function Calling with OpenAI Python Client\n",
+    "\n",
+    "We'll start with a basic demo that only uses the OpenAI Python Client."
    ]
   },
   {
@@ -27,7 +46,7 @@
     "\n",
     "client = openai.OpenAI(\n",
     "    api_key = \"sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\", # can be anything\n",
-    "    base_url = \"http://100.64.159.73:8000/v1\"\n",
+    "    base_url = \"http://100.64.159.73:8000/v1\" # NOTE: Replace with IP address and port of your llama-cpp-python server\n",
     ")\n",
     "\n",
     "# Example dummy function hard coded to return the same weather\n",
@@ -113,13 +132,19 @@
    "source": [
     "# Function Calling with Instructor\n",
     "\n",
-    "You'll need to install the [`instructor`](https://github.com/jxnl/instructor/) package to run this notebook. You can do so by running the following command in your terminal:\n",
+    "The above example is a bit verbose and requires you to manually verify the schema.\n",
+    "\n",
+    "For our next examples we'll use the `instructor` library to simplify the process and accomplish a number of different tasks with function calling.\n",
+    "\n",
+    "You'll first need to install the [`instructor`](https://github.com/jxnl/instructor/) library.\n",
+    "\n",
+    "You can do so by running the following command in your terminal:\n",
     "\n",
     "```bash\n",
     "pip install instructor\n",
     "```\n",
     "\n",
-    "We'll highlight a few basic examples taken from the [instructor cookbook](https://jxnl.github.io/instructor/)\n",
+    "Below we'll go through a few basic examples taken directly from the [instructor cookbook](https://jxnl.github.io/instructor/)\n",
     "\n",
     "## Basic Usage"
    ]
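To make the Instructor half of the notebook concrete, here is a hedged sketch of the "Basic Usage" pattern from the instructor cookbook, assuming the `instructor.patch` entry point and a `UserDetail` pydantic model; neither is necessarily what this notebook's cells contain, and the server address is again an assumption.

```python
import instructor
import openai
from pydantic import BaseModel

# Hypothetical response model; instructor converts it into a function-call
# schema and validates the model's reply against it.
class UserDetail(BaseModel):
    name: str
    age: int

# Assumed: the same local llama-cpp-python server as in the previous example.
client = instructor.patch(openai.OpenAI(
    api_key="sk-xxx",                     # can be anything for a local server
    base_url="http://localhost:8000/v1",  # replace with your server's address and port
))

user = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserDetail,  # instructor handles the schema and parsing
    messages=[{"role": "user", "content": "Extract: Jason is 25 years old"}],
)
print(user.name, user.age)  # e.g. Jason 25
```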
