Information Theory¶
Entropy calculation and Huffman coding.
v_huffman ¶

Calculate a D-ary Huffman code.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| p | array_like | Vector of symbol probabilities. | required |
| a | str or list | Alphabet characters. Length determines code order. Default '01' (binary). | '01' |
Returns:

| Name | Type | Description |
|---|---|---|
| cc | list | Code for each symbol (list of strings or lists). |
| ll | ndarray | Code lengths for each symbol. |
| l | float | Average code length. |
Source code in pyvoicebox/v_huffman.py
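The outputs above (per-symbol codes, their lengths, and the average length) can be illustrated with a plain-Python sketch of binary Huffman construction. This is an independent illustration of the algorithm, not the pyvoicebox implementation, and it covers only the binary case (alphabet `'01'`); `huffman_binary` is a hypothetical helper name.

```python
import heapq
from itertools import count

def huffman_binary(p):
    """Binary Huffman code for probability vector p.

    Sketch only: repeatedly merge the two least-probable subtrees,
    prepending a branch bit to every symbol in each subtree.
    Returns a dict mapping symbol index -> codeword string.
    """
    tie = count()  # tie-breaker so heapq never compares symbol lists
    heap = [(prob, next(tie), [i]) for i, prob in enumerate(p)]
    heapq.heapify(heap)
    codes = {i: "" for i in range(len(p))}
    while len(heap) > 1:
        p0, _, s0 = heapq.heappop(heap)  # two least-probable subtrees
        p1, _, s1 = heapq.heappop(heap)
        for i in s0:
            codes[i] = "0" + codes[i]    # prepend the branch bit
        for i in s1:
            codes[i] = "1" + codes[i]
        heapq.heappush(heap, (p0 + p1, next(tie), s0 + s1))
    return codes

p = [0.4, 0.3, 0.2, 0.1]
codes = huffman_binary(p)
lengths = [len(codes[i]) for i in range(len(p))]       # analogue of ll
avg = sum(pi, ) if False else sum(pi * li for pi, li in zip(p, lengths))  # analogue of l
```

For this distribution the code lengths come out as one 1-bit, one 2-bit, and two 3-bit words, giving an average length of 1.9 bits per symbol, just above the source entropy of about 1.85 bits.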
v_entropy ¶

Calculate the entropy of discrete and sampled continuous distributions.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| p | array_like | Probability array. | required |
| dim | int or list of int | Dimensions along which to evaluate entropy. Default: first non-singleton. | None |
| cond | list of int | Dimensions to use as conditional variables. | None |
| arg | list of int | Dimensions to use as parameters in output. | None |
| step | float or list of float | Sample increment for continuous distributions. | None |
Returns:

| Name | Type | Description |
|---|---|---|
| h | ndarray | Entropy value(s). |
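The basic quantity computed here can be sketched for the simplest case, a single discrete distribution, with the standard Shannon formula. This is a minimal illustration, not the pyvoicebox implementation: the `dim`, `cond`, `arg`, and `step` handling of the real function is omitted, and `shannon_entropy` is a hypothetical helper name.

```python
import math

def shannon_entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution, in log-`base` units
    (bits for base 2).

    Sketch only: normalises p, drops zero entries (0*log(0) := 0),
    and returns -sum(q * log(q)).
    """
    total = sum(p)
    probs = [pi / total for pi in p if pi > 0]  # normalise, drop zeros
    return -sum(q * math.log(q, base) for q in probs)

h_uniform = shannon_entropy([0.25, 0.25, 0.25, 0.25])  # uniform over 4 symbols: 2 bits
h_skewed = shannon_entropy([0.4, 0.3, 0.2, 0.1])       # roughly 1.85 bits
```

A uniform distribution over four symbols gives the maximum of 2 bits, while any skewed distribution gives less; a degenerate distribution (one certain outcome) gives 0.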