NOTE: All server options are also available as environment variables. For example, `--model` can be set via the `MODEL` environment variable.
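For instance, assuming a local GGUF model at a purely illustrative path, the following two invocations are equivalent:

```bash
# Pass the model path as a CLI argument...
python -m llama_cpp.server --model models/llama-2-7b.Q4_K_M.gguf

# ...or as the corresponding environment variable.
MODEL=models/llama-2-7b.Q4_K_M.gguf python -m llama_cpp.server
```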
Check out the server config reference below for more information on the available options.
CLI arguments and environment variables are available for all of the fields defined in [`ServerSettings`](#llama_cpp.server.settings.ServerSettings) and [`ModelSettings`](#llama_cpp.server.settings.ModelSettings).
Additionally, the server supports configuration via a config file; check out the [configuration section](#configuration-and-multi-model-support) for more information and examples.
The server supports configuration via a JSON config file that can be passed using the `--config_file` parameter or the `CONFIG_FILE` environment variable.
Config files support all of the server and model options available via the CLI and environment variables; however, unlike the CLI, a config file can specify multiple models.
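As a rough sketch, a multi-model config file might look like the following (the host, port, model paths, aliases, and per-model options here are illustrative, not required values):

```json
{
    "host": "0.0.0.0",
    "port": 8000,
    "models": [
        {
            "model": "models/mistral-7b-instruct.Q4_K_M.gguf",
            "model_alias": "mistral-7b-instruct",
            "n_ctx": 4096
        },
        {
            "model": "models/llama-2-7b-chat.Q4_K_M.gguf",
            "model_alias": "llama-2-7b-chat",
            "chat_format": "llama-2"
        }
    ]
}
```

The server would then be started with `python -m llama_cpp.server --config_file config.json` (or with `CONFIG_FILE=config.json` set in the environment).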
The server supports routing requests to multiple models based on the `model` parameter in the request, which is matched against the `model_alias` values in the config file.
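For example, assuming the aliases from the sketch above and the default host and port, a client selects a model by passing its alias as `model`:

```bash
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mistral-7b-instruct",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```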
At the moment only a single model is loaded into memory at a time; the server will automatically load and unload models as needed.