2 files changed: +446 -9 lines changed

````diff
@@ -221,12 +221,12 @@ Chat completion is available through the [`create_chat_completion`](https://llam
 The high-level API also provides a simple interface for function calling.
 
 Note that the only model that supports full function calling at this time is "functionary".
-The gguf-converted files for this model can be found here: [functionary-7b-v1](https://huggingface.co/abetlen/functionary-7b-v1-GGUF)
+The gguf-converted files for this model can be found here: [functionary-v2.2](https://huggingface.co/meetkai/functionary-medium-v2.2-GGUF)
 
 
 ```python
 >>> from llama_cpp import Llama
->>> llm = Llama(model_path="path/to/functionary/llama-model.gguf", chat_format="functionary")
+>>> llm = Llama(model_path="path/to/functionary/llama-model.gguf", chat_format="functionary-v2")
 >>> llm.create_chat_completion(
       messages = [
         {
````
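The hunk above moves the recommended weights to meetkai's functionary-medium-v2.2 GGUF release and switches the chat format to match. As a rough, self-contained sketch of that loading step on its own (the local file name and the `n_ctx` value below are assumptions, not taken from the diff):

```python
from llama_cpp import Llama

# Hypothetical local copy of one of the quantizations published at
# https://huggingface.co/meetkai/functionary-medium-v2.2-GGUF
llm = Llama(
    model_path="./functionary-medium-v2.2.q4_0.gguf",  # assumed file name
    chat_format="functionary-v2",  # chat format used in the updated example
    n_ctx=4096,  # assumed context size; adjust for your hardware
)
```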
````diff
@@ -260,12 +260,7 @@ The gguf-converted files for this model can be found here: [functionary-7b-v1](h
           }
         }
       }],
-      tool_choice = [{
-        "type": "function",
-        "function": {
-          "name": "UserDetail"
-        }
-      }]
+      tool_choice = "auto"
 )
 ```
 
````
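Because the two hunks only expose fragments of the README example, here is a hedged reconstruction of what the full `create_chat_completion` call looks like after this change, with `tool_choice="auto"` in place of the removed forced tool selection. The message contents and the exact `UserDetail` parameter schema are plausible fill-ins, not text taken from the diff:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="path/to/functionary/llama-model.gguf",
    chat_format="functionary-v2",
)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "system",
            "content": "You are an assistant that extracts structured user details.",
        },
        {"role": "user", "content": "Extract: Jason is 25 years old."},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "UserDetail",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "age": {"type": "integer"},
                    },
                    "required": ["name", "age"],
                },
            },
        }
    ],
    # "auto" lets the model decide whether to call a tool at all; the removed
    # lines instead forced a call to the UserDetail function.
    tool_choice="auto",
)

# Any tool call the model makes is returned in the assistant message.
print(response["choices"][0]["message"])
```

Forcing a specific function remains possible by passing an object of the form shown in the removed lines for `tool_choice`; `"auto"` simply makes the example follow the more common OpenAI-style default of letting the model choose.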