Releases: matlab-deep-learning/llms-with-matlab

v4.5.0: AI Agent Examples and Improved Tool Calling

15 Oct 11:19

Tool Calling: Enforce tool calls when generating text

You can now force models to use one or more tool calls during output generation by setting the ToolChoice name-value argument of the generate function to "required".

This option is supported for these chat completion APIs:

  • openAIChat objects
  • azureChat objects
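For example, a minimal sketch of forcing a tool call (the tool definition and prompt here are illustrative, not part of the release):

```matlab
% Define an illustrative tool the model may call.
f = openAIFunction("getCurrentWeather", "Get the current weather for a city");
f = addParameter(f, "city", type="string", description="City name");

% Create a chat object with the tool attached.
chat = openAIChat("You are a helpful assistant.", Tools=f);

% Force the model to call at least one tool in its response.
[txt, completeOutput] = generate(chat, ...
    "What is the weather in Natick?", ToolChoice="required");
```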

Tool Calling: Temporarily override Tools parameter when generating text

You can now set the Tools parameter for a single API call by using the corresponding name-value argument of the generate function.
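A sketch of a per-call override (the tool definition is illustrative):

```matlab
% Chat object created without any tools.
chat = openAIChat("You are a helpful assistant.");

% Illustrative tool definition.
f = openAIFunction("rollDice", "Roll an n-sided die");
f = addParameter(f, "sides", type="number");

% Override the Tools parameter for this call only;
% the chat object itself is unchanged.
[txt, completeOutput] = generate(chat, "Roll a 20-sided die.", Tools=f);
```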

AI Agent Workflows: New examples

This release adds new examples that demonstrate agentic AI workflows.

Minor Updates

This release includes:

  • improved error handling
  • bug fixes
  • documentation updates

Full Changelog: v4.4.0...v4.5.0

v4.4.0: Support for GPT-5, o3, o4-mini

13 Aug 10:33

New OpenAI Models: GPT-5, o3, o4-mini

You can now use the OpenAI® models GPT-5, GPT-5 mini, GPT-5 nano, GPT-5 chat, o3, and o4-mini to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "gpt-5", "gpt-5-mini", "gpt-5-nano", "gpt-5-chat-latest", "o4-mini", or "o3".
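For example (the system prompt is illustrative):

```matlab
% Create a chat object that uses GPT-5 mini.
chat = openAIChat("You are a helpful assistant.", ModelName="gpt-5-mini");
txt = generate(chat, "Explain sparse matrices in one sentence.");
```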

Functionality being removed or changed

Support for the OpenAI model o1-preview has been removed because OpenAI no longer supports this model.

Minor Updates

This release includes:

  • Improved test infrastructure
  • Bug fixes
  • Documentation updates

Full Changelog: v4.3.0...v4.4.0

v4.3.0: Support for GPT-4.1

16 Apr 12:59

Support for OpenAI GPT-4.1 Models

You can now use the OpenAI® models GPT-4.1, GPT-4.1 mini, and GPT-4.1 nano to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "gpt-4.1", "gpt-4.1-mini", or "gpt-4.1-nano".

Minor Updates

This release includes:

  • Improved structured output support for ollamaChat
  • MaxNumTokens support with newer models in Azure®
  • Documentation updates

Full Changelog: v4.2.0...v4.3.0

v4.2.0: Tool calling with Ollama, support for OpenAI o1 and o3-mini models

07 Feb 09:16

Use tool calling with Ollama

You can now use tool calling, also known as function calling, with ollamaChat objects. Some large language models (LLMs) can suggest calls to a tool that you provide, such as a MATLAB function, in their generated output. The LLM does not execute the tool itself. Instead, it encodes the name of the tool and the names and values of any input arguments. You can then write scripts that automate the tool calls suggested by the LLM.

To use tool calling, specify the ToolChoice name-value argument of the ollamaChat function.

For information on whether an Ollama™ model supports tool calling, check whether the model has the tools tag in ollama.com/library.
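A minimal sketch, assuming tools for Ollama are defined with openAIFunction objects as for openAIChat; the tool definition and model name are illustrative:

```matlab
% Illustrative tool definition.
f = openAIFunction("sindeg", "Sine of an angle in degrees");
f = addParameter(f, "x", type="number", description="Angle in degrees");

% "mistral" is an example; use any Ollama model with the tools tag.
chat = ollamaChat("mistral", Tools=f);

% The model may respond with an encoded tool call instead of text.
[txt, completeOutput] = generate(chat, "What is the sine of 30 degrees?");
```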

Support for OpenAI® o1 and o3-mini models

You can now use the OpenAI models o1 and o3-mini to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "o1" or "o3-mini".

Full Changelog: v4.1.0...v4.2.0

v4.1.0: Structured Outputs with Ollama

13 Dec 12:30

Ensure output format using structured outputs with Ollama

You can now use structured output to generate output using ollamaChat objects.

In LLMs with MATLAB, you can specify the structure of the output in two different ways.

  • Specify a valid JSON Schema directly.
  • Specify an example structure array that adheres to the required output format. The software automatically generates the corresponding JSON Schema and provides this to the LLM. Then, the software automatically converts the output of the LLM back into a structure array.

To do this, set the ResponseFormat name-value argument of ollamaChat or generate to one of these options.

  • A string scalar containing a valid JSON Schema.
  • A structure array containing an example that adheres to the required format, for example:
    ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0])

For more information on structured output in Ollama™, see https://ollama.com/blog/structured-outputs.
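For example, using the structure-array form (the model name and prompt are illustrative):

```matlab
% "mistral" is an example model name; use any model you have pulled.
chat = ollamaChat("mistral");

% Example structure; the software derives a JSON Schema from it.
rf = struct("Name", "Rudolph", "NoseColor", [255 0 0]);

% The model output is converted back into a structure array.
out = generate(chat, "Invent a reindeer.", ResponseFormat=rf);
```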

Full Changelog: v4.0.0...v4.1.0

v4.0.0: Structured Output

29 Oct 10:27

This release includes new features and bug fixes.

Ensure output format using structured output

You can now use structured output to generate output using openAIChat or azureChat objects.

In LLMs with MATLAB, you can specify the structure of the output in two different ways.

  • Specify a valid JSON Schema directly.
  • Specify an example structure array that adheres to the required output format. The software automatically generates the corresponding JSON Schema and provides this to the LLM. Then, the software automatically converts the output of the LLM back into a structure array.

To do this, set the ResponseFormat name-value argument of openAIChat, azureChat, or generate to:

  • A string scalar containing a valid JSON Schema.
  • A structure array containing an example that adheres to the required format, for example: ResponseFormat=struct("Name","Rudolph","NoseColor",[255 0 0])

For more information on structured output, see https://platform.openai.com/docs/guides/structured-outputs.
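For example, using the JSON Schema form (the schema itself is illustrative; OpenAI structured outputs may impose additional schema requirements):

```matlab
chat = openAIChat("You are a helpful assistant.");

% A JSON Schema given directly as a string scalar.
schema = "{""type"": ""object"", ""properties"": " + ...
    "{""Name"": {""type"": ""string""}}, ""required"": [""Name""]}";

% The generated text conforms to the schema.
txt = generate(chat, "Invent a reindeer name.", ResponseFormat=schema);
```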

Argument name changes

The Model name-value argument of ollamaChat has been renamed to ModelName. The name Model continues to work for backward compatibility.

The Deployment name-value argument of azureChat has been renamed to DeploymentID. The name Deployment continues to work for backward compatibility.

v3.4.0: Documentation, support for OpenAI o1 models, general updates

27 Sep 14:35

This release includes new features, new documentation, and bug fixes.

New Features

Support for OpenAI® o1 models

You can now use the OpenAI models o1-mini and o1-preview to generate text from MATLAB®. When you create an openAIChat object, set the ModelName name-value argument to "o1-mini" or "o1-preview".

Temporarily override model parameters when generating text

You can now set model parameters such as MaxNumTokens and ResponseFormat for a single API call by using the corresponding name-value arguments of the generate function. The generate function then uses the specified parameter for text generation instead of the corresponding model parameter of the openAIChat, azureChat, or ollamaChat input.

For a full list of supported model parameters, see generate.
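For example (the prompt and token limits are illustrative):

```matlab
chat = openAIChat("You are a helpful assistant.", MaxNumTokens=1000);

% Override MaxNumTokens for this call only; chat is unchanged.
txt = generate(chat, "Write a haiku about MATLAB.", MaxNumTokens=50);
```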

Support for min-p sampling in Ollama™

You can now set the minimum probability ratio to tune the frequency of improbable tokens when generating text using Ollama models. You can do this in two different ways:

  1. When you create an ollamaChat object, specify the MinP name-value argument.
  2. When you generate text using the generate function with an ollamaChat object, specify the MinP name-value argument.
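Both ways can be sketched as follows (the model name and MinP values are illustrative):

```matlab
% Set MinP when creating the chat object ("mistral" is an example model).
chat = ollamaChat("mistral", MinP=0.05);

% Or override it for a single generate call.
txt = generate(chat, "Name an unusual color.", MinP=0.1);
```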

New Documentation

There is now detailed documentation for the features included in LLMs with MATLAB. You can find these pages in a new functions directory inside the doc directory.

Full Changelog: v3.3.0...v3.4.0

v3.3.0: Vision support in Ollama

09 Aug 13:37
150d9c1

Vision Support in Ollama

This release adds support for vision models in Ollama™. You can now use ollamaChat objects to describe images.

For an example, see Understanding the Content of an Image.

Full Changelog: v3.2.0...v3.3.0

v3.2.0: Supporting Ollama servers not on localhost:11434

24 Jul 10:05
bdb84b8

What's Changed

Full Changelog: v3.1.0...v3.2.0

v3.1.0: Update OpenAI models

19 Jul 09:00
d74ec2b

What's Changed

Full Changelog: v3.0.0...v3.1.0
