SoftMax - Maple Programming Help


SoftMax

compute the softmax of a Tensor

SoftMin

compute the softmin of a Tensor

SoftPlus

compute the softplus of a Tensor

SoftProduct

compute the softproduct of a Tensor

 

Calling Sequence

Parameters

Options

Description

Examples

Compatibility

Calling Sequence

SoftMax(t,opts)

SoftMaxCrossEntropyWithLogits(t,opts)

SoftPlus(t,opts)

Parameters

t

-

Tensor

opts

-

zero or more options as specified below

Options

• 

axis=list(integer) or integer

The value of option axis is an integer or list of integers specifying which axis or axes of the input Tensor to reduce across.

• 

name=string

The value of option name specifies an optional name for this Tensor, to be displayed in output and when visualizing the dataflow graph.
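As a sketch of the axis semantics (in plain Python rather than Maple syntax, with a hypothetical 2x3 input): reducing across axis 1 of a 2-D Tensor means the softmax is taken within each row, so every row of the result sums to 1.

```python
import math

def softmax_rows(t):
    """Softmax across axis 1 of a 2-D nested list: each row of the
    result is a probability vector summing to 1."""
    out = []
    for row in t:
        m = max(row)  # subtract the row max for numerical stability
        exps = [math.exp(x - m) for x in row]
        s = sum(exps)
        out.append([e / s for e in exps])
    return out

t = [[1.0, 2.0, 3.0],
     [1.0, 1.0, 1.0]]
result = softmax_rows(t)
print([round(sum(row), 6) for row in result])  # each row sums to 1
```

A uniform row such as [1.0, 1.0, 1.0] maps to equal probabilities of 1/3 each, while unequal entries are weighted exponentially toward the largest.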

Description

• 

The SoftMax(t,opts) command computes the softmax function of a Tensor: each entry t_i along the reduced axis is mapped to exp(t_i)/(sum_j exp(t_j)), so the entries along that axis are nonnegative and sum to 1.

• 

The SoftMaxCrossEntropyWithLogits(t,labels=x,logits=y) command computes the softmax cross-entropy between the labels x and the logits y.

• 

The SoftPlus(t,opts) command computes log(exp(t)+1) elementwise.
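The formulas above can be sketched in plain Python (illustrative only, not the Maple API): softmax normalizes exponentials into a probability vector, the cross-entropy variant scores logits against a label distribution, and softplus is the smooth approximation log(1+exp(x)) of max(0, x).

```python
import math

def softmax(v):
    """softmax(v)_i = exp(v_i) / sum_j exp(v_j)"""
    m = max(v)  # shift by the max for numerical stability
    exps = [math.exp(x - m) for x in v]
    s = sum(exps)
    return [e / s for e in exps]

def softmax_cross_entropy_with_logits(labels, logits):
    """-sum_i labels_i * log(softmax(logits)_i)"""
    p = softmax(logits)
    return -sum(l * math.log(q) for l, q in zip(labels, p))

def softplus(x):
    """softplus(x) = log(1 + exp(x))"""
    return math.log1p(math.exp(x))

logits = [2.0, 1.0, 0.1]
labels = [1.0, 0.0, 0.0]  # hypothetical one-hot target
print(round(sum(softmax(logits)), 6))           # softmax sums to 1
print(round(softmax_cross_entropy_with_logits(labels, logits), 4))
print(round(softplus(0.0), 6))                  # log(2)
```

With a one-hot label, the cross-entropy reduces to -log of the softmax probability assigned to the true class, which is why these two operations are commonly fused for numerical stability.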

Examples

with(DeepLearning):

W := Variable(<<29., 93., 29.> | <12., 80., 96.> | <96., 92., 89.>>, datatype=float[8])

DeepLearning Tensor
Name: Variable:0
Shape: [3, 3]
Data Type: float[8]

(1)

SoftMax(W)

DeepLearning Tensor
Name: Softmax:0
Shape: [3, 3]
Data Type: float[8]

(2)

SoftPlus(W, name="sp")

DeepLearning Tensor
Name: sp:0
Shape: [3, 3]
Data Type: float[8]

(3)

Compatibility

• 

The SoftMax, SoftMin, SoftPlus, and SoftProduct commands were introduced in Maple 2018.

• 

For more information on Maple 2018 changes, see Updates in Maple 2018.

See Also

DeepLearning Overview

Tensor