SoftMax
compute the softmax of a Tensor
SoftMaxCrossEntropyWithLogits
compute softmax cross-entropy between labels and logits
SoftPlus
compute the softplus of a Tensor
Calling Sequence
Parameters
Options
Description
Examples
Compatibility
SoftMax(t,opts)
SoftMaxCrossEntropyWithLogits(t,opts)
SoftPlus(t,opts)
t - Tensor
opts - zero or more options as specified below
axis=list(integer) or integer
The value of option axis is an integer or list of integers specifying the axis or axes of the input Tensor along which the operation is applied.
name=string
The value of option name specifies an optional name for this Tensor, to be displayed in output and when visualizing the dataflow graph.
The SoftMax(t,opts) command computes the softmax function of a Tensor, mapping each entry t[i] to exp(t[i])/add(exp(t[j]), j), where the sum runs along the specified axis.
The SoftMaxCrossEntropyWithLogits(t,labels=x,logits=y) command computes the softmax cross-entropy between labels x and logits y.
The SoftPlus(t,opts) command computes the softplus function, log(exp(t)+1).
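As an illustration of these formulas, here is a plain-Maple sketch that does not use DeepLearning Tensors; the list v, the one-hot labels, and the names below are chosen only for demonstration.
# Elementwise formulas implemented by SoftMax, SoftPlus and
# SoftMaxCrossEntropyWithLogits, applied to plain Maple lists.
v := [1.0, 2.0, 3.0]:
softmax_v := [seq(exp(v[i])/add(exp(v[j]), j = 1 .. 3), i = 1 .. 3)];
      # [0.0900305732, 0.2447284711, 0.6652409558]
softplus_v := [seq(ln(exp(v[i]) + 1.0), i = 1 .. 3)];
      # [1.313261687, 2.126928011, 3.048587352]
labels := [0., 0., 1.]:     # hypothetical one-hot target
crossentropy := -add(labels[i]*ln(softmax_v[i]), i = 1 .. 3);
      # 0.4076059645 = -ln(0.6652409558)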
with(DeepLearning):
W := Variable([29., 93., -29., -12., -80., 96., 96., -92., 89.], datatype=float[8])
    W := DeepLearning Tensor: Name: Variable:0, Shape: undefined, Data Type: float[8]
SoftMax(W)
    DeepLearning Tensor: Name: none, Shape: undefined, Data Type: float[8]
SoftPlus(W, name="sp")
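The axis and name options described above could be combined in a call such as the following; this is a hypothetical illustration only, and the axis value 1 is not taken from the original example.
SoftMax(W, axis=1, name="sm")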
The SoftMax, SoftMaxCrossEntropyWithLogits and SoftPlus commands were introduced in Maple 2018.
For more information on Maple 2018 changes, see Updates in Maple 2018.
See Also
DeepLearning Overview