Entropy - Maple Help


StringTools[Entropy]

compute the Entropy of a string

 

Calling Sequence

Parameters

Description

Examples

Calling Sequence

Entropy( s )

Parameters

s - Maple string

Description

• 

The Entropy(s) command returns the Shannon entropy of the string s as a floating-point number.

• 

Shannon's entropy is defined as -add( P( ch ) * log[ 2 ]( P( ch ) ), ch = Support( s ) ), where P( ch ) = CountCharacterOccurrences( s, ch ) / length( s ) is the relative frequency of the character ch in s. It is a measure of the information content of the string, and can be interpreted as the number of bits required to encode each character of the string given perfect compression. The entropy is maximal when each character is equally likely. For arbitrary non-null characters, this maximal value is log[ 2 ]( 255 ), approximately 7.9944.

  

(The null byte, with code point 0, cannot appear in a Maple string. If all 256 single-byte code points could appear, then the maximal entropy would be log[ 2 ]( 256 ) = 8, which is the number of bits per byte.)
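For reference, these bounds can be computed directly:

evalf( log[ 2 ]( 255 ) );    # maximal entropy over the 255 non-null characters, about 7.9944
evalf( log[ 2 ]( 256 ) );    # 8, the number of bits per byte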

• 

Note that the entropy is computed as a floating-point number, at hardware (double) precision.

• 

All of the StringTools package commands treat strings as (null-terminated) sequences of 8-bit (ASCII) characters. Thus, there is no support for multibyte character encodings, such as Unicode encodings.

Examples
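First, load the package (alternatively, use the long form StringTools[Entropy]). The commands below are a small sketch of typical usage: a string built from a single repeated character has zero entropy, while a string in which two characters each occur half of the time has an entropy of one bit per character. (The displayed precision of the floating-point results may vary.)

with( StringTools ):
Entropy( Repeat( "a", 100 ) );    # a single repeated character: entropy is 0
Entropy( "abababab" );            # two characters, each occurring half the time: 1 bit per character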

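A string in which each of the 255 non-null characters occurs exactly once makes every character equally likely, so its entropy attains the maximal value; Iota( 1, 255 ) constructs such a string.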

Entropy( Iota( 1, 255 ) );
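Up to floating-point rounding, the value returned is log[ 2 ]( 255 ), approximately 7.9944, the maximal entropy noted in the Description.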


The following steps illustrate the definition of Entropy.
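(The sample string and the names p and H below are chosen only for illustration; any Maple string can be used.)

s := "hello world":
Support( s );                                                     # the distinct characters of s
p := ch -> CountCharacterOccurrences( s, ch ) / length( s ):      # relative frequency of ch in s
H := -add( p( ch ) * log[ 2 ]( p( ch ) ), ch = Support( s ) ):    # the defining sum
evalf( H );
Entropy( s );                                                     # agrees with the previous value, up to rounding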


See Also

add

convert

evalf

length

log

map

seq

string

StringTools

StringTools[CountCharacterOccurrences]

StringTools[Iota]

StringTools[Random]

StringTools[Repeat]

StringTools[Support]

with

 

