Releases: matlab-deep-learning/llms-with-matlab
v4.5.0: AI Agent Examples and Improved Tool Calling
Tool Calling: Enforce tool calls when generating text
You can now force models to use one or more tool calls during output generation by setting the ToolChoice name-value argument of the generate function to "required".
This option is supported for these chat completion APIs:
- openAIChat objects
- azureChat objects
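A minimal sketch of enforcing a tool call, assuming your OpenAI API key is configured and using a hypothetical sayHello tool defined with openAIFunction:

```matlab
% Define a tool the model may call (hypothetical example tool).
f = openAIFunction("sayHello","Returns a greeting for a given name");
f = addParameter(f,"name",type="string",description="Name of the person to greet");

% Create the chat object with the tool attached.
chat = openAIChat("You are a helpful assistant.",Tools=f);

% Force the model to use at least one tool call in its response.
[txt,completeOutput] = generate(chat,"Greet Ada.",ToolChoice="required");
```

The suggested tool call, including the encoded input arguments, appears in completeOutput; your script is responsible for executing it.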
Tool Calling: Temporarily override Tools parameter when generating text
You can now set the Tools parameter for a single API call by using the corresponding name-value argument of the generate function.
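For example, a sketch of a one-off override, using a hypothetical getTime tool; the chat object itself keeps its original Tools setting:

```matlab
% Hypothetical tool defined with openAIFunction.
f = openAIFunction("getTime","Returns the current time");

% Chat object created without any tools.
chat = openAIChat("You are a helpful assistant.");

% Provide the tool for this single call only.
[txt,completeOutput] = generate(chat,"What time is it?",Tools=f);
```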
AI Agent Workflows: New examples
Use these new examples to learn about agentic workflows:
- Solve Simple Math Problem Using AI Agent
- Fit Polynomial to Data Using AI Agent (requires Curve Fitting Toolbox™)
Minor Updates
This release includes:
- improved error handling
- bug fixes
- documentation updates
Full Changelog: v4.4.0...v4.5.0
v4.4.0: Support for GPT-5, o3, o4-mini
New OpenAI Models: GPT-5, o3, o4-mini
You can now use the OpenAI® models GPT-5, GPT-5 mini, GPT-5 nano, GPT-5 chat, o3, and o4-mini to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "gpt-5", "gpt-5-mini", "gpt-5-nano", "gpt-5-chat-latest", "o4-mini", or "o3".
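For example, assuming your OpenAI API key is set in the OPENAI_API_KEY environment variable:

```matlab
% Create a chat object that uses GPT-5.
chat = openAIChat("You are a helpful assistant.",ModelName="gpt-5");
txt = generate(chat,"Why is the sky blue?");
```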
Functionality being removed or changed
Support for the OpenAI model o1-preview has been removed because OpenAI no longer supports this model.
Minor Updates
This release includes:
- Improved test infrastructure
- Bug fixes
- Documentation updates
Full Changelog: v4.3.0...v4.4.0
v4.3.0: Support for GPT-4.1
Support for OpenAI GPT-4.1 Models
You can now use the OpenAI® models GPT-4.1, GPT-4.1 mini, and GPT-4.1 nano to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "gpt-4.1", "gpt-4.1-mini", or "gpt-4.1-nano".
Minor Updates
This release includes:
- Improved structured output support for ollamaChat
- MaxNumTokens support with newer models in Azure®
- Documentation updates
Full Changelog: v4.2.0...v4.3.0
v4.2.0: Tool calling with Ollama, support for OpenAI o1 and o3-mini models
Use tool calling with Ollama
You can now use tool calling, also known as function calling, with ollamaChat objects. Some large language models (LLMs) can suggest calls to a tool that you have, such as a MATLAB function, in their generated output. An LLM does not execute the tool itself. Instead, the model encodes the name of the tool and the name and value of any input arguments. You can then write scripts that automate the tool calls suggested by the LLM.
To use tool calling, specify the ToolChoice name-value argument of the ollamaChat function.
For information on whether an Ollama™ model supports tool calling, check whether the model has the tools tag on ollama.com/library.
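A sketch of tool calling with a local Ollama model, assuming a model with the tools tag (llama3.1 here is an example) and a hypothetical getWeather tool; the exact argument names follow the openAIFunction pattern used elsewhere in this toolbox:

```matlab
% Hypothetical tool definition.
f = openAIFunction("getWeather","Returns the current weather for a city");
f = addParameter(f,"city",type="string",description="Name of the city");

% Create the chat object for a tools-capable local model.
chat = ollamaChat("llama3.1",Tools=f);

% The model may respond with an encoded tool call instead of plain text.
[txt,completeOutput] = generate(chat,"What is the weather in Boston?");

% Inspect completeOutput for a suggested tool call, then run the
% corresponding MATLAB function yourself with the encoded arguments.
```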
Support for OpenAI® o1 and o3-mini models
You can now use the OpenAI models o1 and o3-mini to generate text from MATLAB®. When you create an openAIChat
object, set the ModelName
name-value argument to "o1"
or "o3-mini"
.
Full Changelog: v4.1.0...v4.2.0
v4.1.0: Structured Outputs with Ollama
Ensure output format using structured outputs with Ollama
You can now use structured output when generating text with ollamaChat objects.
In LLMs with MATLAB, you can specify the structure of the output in two different ways.
- Specify a valid JSON Schema directly.
- Specify an example structure array that adheres to the required output format. The software automatically generates the corresponding JSON Schema and provides this to the LLM. Then, the software automatically converts the output of the LLM back into a structure array.
To do this, set the ResponseFormat name-value argument of ollamaChat or generate to one of these options.
- A string scalar containing a valid JSON Schema.
- A structure array containing an example that adheres to the required format, for example:
ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0])
For more information on structured output in Ollama™, see https://ollama.com/blog/structured-outputs.
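A sketch of the structure-array variant, assuming a local Ollama model (the model name here is an example); the toolbox derives a JSON Schema from the example struct and converts the model output back into a struct:

```matlab
% Example struct acts as a template for the required output format.
prototype = struct("Name","Rudolph","NoseColor",[255 0 0]);

% Attach the format to the chat object; it applies to every generate call.
chat = ollamaChat("mistral",ResponseFormat=prototype);

% The result is returned as a struct with fields Name and NoseColor.
reindeer = generate(chat,"Invent a reindeer and describe it.");
```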
Full Changelog: v4.0.0...v4.1.0
v4.0.0: Structured Output
This release includes new features and bug fixes.
Ensure output format using structured output
You can now use structured output when generating text with openAIChat or azureChat objects.
In LLMs with MATLAB, you can specify the structure of the output in two different ways.
- Specify a valid JSON Schema directly.
- Specify an example structure array that adheres to the required output format. The software automatically generates the corresponding JSON Schema and provides this to the LLM. Then, the software automatically converts the output of the LLM back into a structure array.
To do this, set the ResponseFormat name-value argument of openAIChat, azureChat, or generate to:
- A string scalar containing a valid JSON Schema.
- A structure array containing an example that adheres to the required format, for example:
ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0])
For more information on structured output, see https://platform.openai.com/docs/guides/structured-outputs.
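A sketch of the JSON Schema variant with openAIChat; the schema shown here is an assumption for illustration, and your schema must satisfy the restrictions that OpenAI places on structured outputs (for example, additionalProperties set to false):

```matlab
% A JSON Schema as a string scalar (hypothetical example schema).
schema = '{"type":"object","properties":{"Name":{"type":"string"},' + ...
    '"Age":{"type":"number"}},"required":["Name","Age"],' + ...
    '"additionalProperties":false}';

% Attach the schema to the chat object.
chat = openAIChat("Extract person data from the prompt.",ResponseFormat=schema);

% The generated text conforms to the schema.
txt = generate(chat,"Alice is 30 years old.");
```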
Argument name changes
The Model name-value argument of ollamaChat has been renamed to ModelName. However, you can still use Model instead.
The Deployment name-value argument of azureChat has been renamed to DeploymentID. However, you can still use Deployment instead.
v3.4.0: Documentation, support for OpenAI o1 models, general updates
This release includes new features, new documentation, and bug fixes.
New Features
Support for OpenAI® o1 models
You can now use the OpenAI models o1-mini and o1-preview to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "o1-mini" or "o1-preview".
Temporarily override model parameters when generating text
You can now set model parameters such as MaxNumTokens and ResponseFormat for a single API call by using the corresponding name-value arguments of the generate function. The generate function then uses the specified parameters for text generation instead of the corresponding model parameters of the openAIChat, azureChat, or ollamaChat input.
For a full list of supported model parameters, see generate.
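For example, a sketch of a per-call override; the chat object's own MaxNumTokens setting is unchanged after the call:

```matlab
% Model parameter set at construction time.
chat = openAIChat("You are a helpful assistant.",MaxNumTokens=1000);

% Override MaxNumTokens for this single call only.
txt = generate(chat,"Summarize relativity in one sentence.",MaxNumTokens=50);
```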
Support for min-p sampling in Ollama™
You can now set the minimum probability ratio to tune the frequency of improbable tokens when generating text using Ollama models. You can do this in two different ways:
- When you create an ollamaChat object, specify the MinP name-value argument.
- When you generate text using the generate function with an ollamaChat object, specify the MinP name-value argument.
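Both ways can be sketched as follows, assuming a local Ollama model (the model name is an example):

```matlab
% Set MinP when creating the chat object...
chat = ollamaChat("mistral",MinP=0.05);

% ...or override it for a single generate call.
txt = generate(chat,"Write a haiku about autumn.",MinP=0.1);
```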
New Documentation
There is now detailed documentation for the features included in LLMs with MATLAB:
- openAIChat
- azureChat
- ollamaChat
- generate
- openAIFunction
- addParameter
- openAIImages
- openAIImages.generate
- edit
- createVariation
- messageHistory
- addSystemMessage
- addUserMessage
- addUserMessageWithImages
- addToolMessage
- addResponseMessage
- removeMessage
You can find these pages in a new directory, functions, inside the doc directory.
Full Changelog: v3.3.0...v3.4.0
v3.3.0: Vision support in Ollama
Vision Support in Ollama
This release adds support for vision models in Ollama™. This allows you to use the ollamaChat function to describe images.
For an example, see Understanding the Content of an Image.
Full Changelog: v3.2.0...v3.3.0
v3.2.0: Supporting Ollama servers not on localhost:11434
What's Changed
- gpt-4o-mini does handle images by @ccreutzi in #52
- Add support for remote ollama by @ccreutzi in #53
Full Changelog: v3.1.0...v3.2.0
v3.1.0: Update OpenAI models
What's Changed
- Azure api version tests by @vpapanasta in #46
- Adding extra tests to increase code coverage in ollamaChat and azureChat by @vpapanasta in #47
- Add Ollama-based RAG example by @ccreutzi in #45
- Create uihtml inside uifigure by @ccreutzi in #48
- Add gpt-4o-mini and make it the default by @ccreutzi in #50
Full Changelog: v3.0.0...v3.1.0