NaturalLanguage:-Dialog
complete a sequence of messages sent to a large language model
Calling Sequence

Dialog(messages, opts)
Dialog(system_message, user_prompt, opts)
Dialog(history, user_prompt, opts)
Parameters

messages - list of [string, string] pairs, where the first string in each pair is one of "system", "user", or "assistant"

system_message - system message (string)

user_prompt - user prompt (string)

history - (optional) history object returned by an earlier call to Dialog

opts - (optional) equation(s) of the form keyword = value, selecting the output format and the model to use, as described below
Description

• The Dialog command sends the sequence of messages specified in its arguments to a large language model (LLM). The command returns the message from the LLM, optionally together with the dialog history.

• Each message has an associated source: it can be "system", "user", or "assistant". A system message sets up the dialog; it typically gives the LLM general guidelines on how to respond. A user message typically comes from an end user of the system. An assistant message is a message sent by the LLM. Please see OpenAI's documentation for more information.
• There are three different calling sequences.

– For the most general calling sequence, you submit a list of pairs representing the dialog so far; the LLM generates an assistant message that it judges to be a reasonable continuation of the dialog.

– Alternatively, you can pass just two strings. These are interpreted as a system message and a user prompt: passing strings s and u is equivalent to passing [["system", s], ["user", u]].

– Finally, you can ask Maple to return an object representing the conversation so far as a history object, explained in more detail below. You can then pass this history object back to Dialog to continue the conversation. For this calling sequence, you can also pass zero or more strings that will be treated as user messages and appended to the history before it is sent to the LLM.
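As an illustration of the first two calling sequences, consider the following sketch (the prompt strings are invented for the example):

```
with(NaturalLanguage):
# Most general form: a list of [source, message] pairs.
messages := [["system", "You are a concise math tutor."],
             ["user", "What is the derivative of x^2?"]]:
answer := Dialog(messages);

# Equivalent two-string form: a system message followed by a user prompt.
answer2 := Dialog("You are a concise math tutor.", "What is the derivative of x^2?");
```

Both calls submit the same two messages, so they should elicit the same kind of assistant reply.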
• You can request three types of output:

– By default, Maple returns just the (string) return value from the LLM.

– Alternatively, Maple can return the list of pairs of strings that represents the full conversation, including the newly returned value from the LLM.

– Finally, Maple can return an object that represents the history so far, including the newly returned value from the LLM. You can interrogate this object, or pass it back into the Dialog command to continue the conversation. Ways you can interrogate a history object are:

• You can append a message to the history by specifying the source for the message ("system", "user", or "assistant") and the message string to add to the end of the history.

• You can obtain from the history object a list of pairs of strings of the form you can pass into Dialog.
• Maple can interface with OpenAI's GPT-4o, o1-mini, and o3-mini models. You can select one of these explicitly with the corresponding option. By default, a Maplesoft server chooses an appropriate model. (At the time of release of Maple 2025, this was the o3-mini model.) Note that OpenAI may deprecate and disable models, so the set of models supported may change in the future.
Note: Large language models often generate inaccurate statements. Please keep this in mind: this is not technology to build a bridge with.
Examples
In this example, we instruct the large language model to act as a college instructor. We can see how we can construct a dialog between the model and the user.
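A sketch of such a dialog, built with the list-of-pairs calling sequence documented above (all prompt text is illustrative):

```
with(NaturalLanguage):
# The system message casts the LLM as a college instructor.
dialog := [["system", "You are a college calculus instructor who answers questions patiently."],
           ["user", "Why does the product rule have two terms?"]]:
reply := Dialog(dialog);
# To continue the conversation, append the assistant's reply and a follow-up question.
dialog := [op(dialog), ["assistant", reply],
           ["user", "Can you illustrate that with x^2*sin(x)?"]]:
Dialog(dialog);
```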
When an LLM does not give the exact type of answers you're looking for, it can help to frame the input as a conversation where you include one or a few earlier answers of the form that you need. For example, if you want the LLM to include steps in a computation, it can be hard to describe exactly the level of detail you are looking for. An example, or a few examples, might work better.
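This technique can be sketched as follows: we seed the dialog with one worked answer in the style we want before asking the real question (all message text is illustrative):

```
with(NaturalLanguage):
# One example assistant answer shows the desired step-by-step style.
messages := [["system", "You are a math tutor who shows every step of a computation."],
             ["user", "Solve 2*x + 3 = 7."],
             ["assistant", "Step 1: subtract 3 from both sides to get 2*x = 4. Step 2: divide both sides by 2 to get x = 2."],
             ["user", "Solve 5*x - 4 = 11."]]:
Dialog(messages);
```

The assistant message in the seed conversation acts as a template, so the reply to the final question tends to follow the same level of detail.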
Compatibility

• The NaturalLanguage:-Dialog command was introduced in Maple 2025.