1 parent 0e70984 commit 663659f
README.md
```diff
@@ -286,7 +286,7 @@ By default [`from_pretrained`](https://llama-cpp-python.readthedocs.io/en/latest
 
 The high-level API also provides a simple interface for chat completion.
 
-Chat completion requires that the model know how to format the messages into a single prompt.
+Chat completion requires that the model knows how to format the messages into a single prompt.
 The `Llama` class does this using pre-registered chat formats (ie. `chatml`, `llama-2`, `gemma`, etc) or by providing a custom chat handler object.
 
 The model will format the messages into a single prompt using the following order of precedence:
```