Review of codebase and docs - Probabilities and Encodings - Datseris (#213)
* update probabilities table
* CountOccurrences works with `Any` input
* better terminology header
* simpler headers in probabilities
* Add encodings page
* simplify SymbolicPermutation docstring
* reference complexity measures
* correct docstring to reference isrand
* more organized tests for symbolic permutation
* full rewrite of `SymbolicPermutation` and proper `encode` for Ordinal.
* type optimization in making the embedding
* remove entropy!
* simplify `probabilities!` even more
* move fasthist to encoding folder
* complete unification of symbolic perm methods
* docstring for weighted version
* add docstring to amplitude aware
* delete ALL other files
* fix all symbolic permutation tests
* fix all permutation tests (and one file only)
* clarify source code of encode Gaussian
* better docstring for GaussEncod
* simplify docstring of Dispersion
* more tests for naivekernel
* Zhu -> Correa
* shorter docstring for spatial permutation
* port spatial permutation example to Examples
* re-write SpatialSymb to have encoding as field. All tests pass.
* better display of examples in decode
* better doc for ordinal encoding
* Some typos/nitpickery
* Probabilities can't compute.
Computations are done with probabilities as *input*
* Don't duplicate `SpatialDispersion`
* Clarify docstrings a bit
* Typo
* Cross-reference spatial estimators
Co-authored-by: Kristian Haaga <[email protected]>
docs/src/devdocs.md (1 addition, 1 deletion)
@@ -11,7 +11,7 @@ Good practices in developing a code base apply in every Pull Request. The [Good
 5. If suitable, the estimator may be able to operate based on [`Encoding`]s. If so, it is preferred to implement an `Encoding` subtype and extend the methods [`encode`](@ref) and [`decode`](@ref). This will allow your probabilities estimator to be used with a larger span of entropy and complexity methods without additional effort.
 6. Implement dispatch for [`probabilities_and_outcomes`](@ref) and your probabilities estimator type.
 7. Implement dispatch for [`outcome_space`](@ref) and your probabilities estimator type.
-8. Add your probabilities estimator type to the list in the docstring of [`ProbabilitiyEstimator`](@ref), and if you also made an encoding, add it to the [`Encoding`](@ref) docstring.
+8. Add your probabilities estimator type to the table list in the documentation page of probabilities. If you made an encoding, also add it to corresponding table in the encodings section.

 ### Optional steps
 You may extend any of the following functions if there are potential performance benefits in doing so:
Some probability estimators first "encode" input data into an intermediate representation indexed by the positive integers. This intermediate representation is called an "encoding" and its API is defined by the following:
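To make the `encode`/`decode` contract concrete, here is a minimal hypothetical sketch. `Encoding`, `encode`, and `decode` are the package API referenced above; the `SignEncoding` type, its field, and its outcomes are invented purely for illustration:

```julia
using Entropies

# Hypothetical encoding: map a real number to one of two integer symbols,
# depending on whether it lies below or above a reference value.
struct SignEncoding <: Encoding
    reference::Float64
end

# `encode` must return a positive integer indexing into the outcome space.
Entropies.encode(e::SignEncoding, x::Real) = x < e.reference ? 1 : 2

# `decode` maps the integer symbol back to a human-readable outcome.
Entropies.decode(e::SignEncoding, i::Int) = i == 1 ? "below reference" : "above reference"

enc = SignEncoding(0.0)
s = encode(enc, -0.3)  # returns 1
decode(enc, s)         # returns "below reference"
```

A probabilities estimator built on such an encoding can then count encoded symbols to form its probability distribution.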
docs/src/examples.md (36 additions, 6 deletions)
@@ -233,7 +233,7 @@ fig

 ### Kaniadakis entropy

-Here, we show how [`Kaniadakis`](@ref) entropy changes as function of the parameter `a` for
+Here, we show how [`Kaniadakis`](@ref) entropy changes as function of the parameter `a` for
 a range of two-element probability distributions given by
 `Probabilities([p, 1 - p] for p in 1:0.0:0.01:1.0)`.
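For readers skimming this hunk, the computation it describes could be sketched roughly as follows. This is only a sketch: the keyword name `a` is taken from the prose above and may not match the actual `Kaniadakis` constructor, and `entropy(e, ::Probabilities)` is assumed to accept a precomputed probability vector:

```julia
using Entropies

# Two-element distributions [p, 1 - p] over a grid of p values.
distributions = [Probabilities([p, 1 - p]) for p in 0.0:0.01:1.0]

# Kaniadakis entropy of each distribution, for a few parameter values
# (keyword `a` assumed from the surrounding text).
for a in (0.5, 1.0, 2.0)
    h = [entropy(Kaniadakis(a = a), d) for d in distributions]
    println("a = $a: max entropy = ", maximum(h))
end
```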
@@ -370,11 +370,41 @@ end
 You see that while the direct entropy values of the chaotic and noisy signals change massively with `N` but they are almost the same for the normalized version.
 For the regular signals, the entropy decreases nevertheless because the noise contribution of the Fourier computation becomes less significant.

+## Spatiotemporal permutation entropy
+
+Usage of a [`SpatialSymbolicPermutation`](@ref) estimator is straightforward.
+Here we get the spatial permutation entropy of a 2D array (e.g., an image):
+
+```@example MAIN
+using Entropies
+x = rand(50, 50) # some image
+stencil = [1 1; 0 1] # or one of the other ways of specifying stencils
+est = SpatialSymbolicPermutation(stencil, x)
+h = entropy(est, x)
+```
+
+To apply this to timeseries of spatial data, simply loop over the call, e.g.:
+
+```@example MAIN
+data = [rand(50, 50) for i in 1:10] # e.g., evolution of a 2D field of a PDE
+est = SpatialSymbolicPermutation(stencil, first(data))
+h_vs_t = map(d -> entropy(est, d), data)
+```
+
+Computing any other generalized spatiotemporal permutation entropy is trivial, e.g. with [`Renyi`](@ref):
+
+```@example MAIN
+x = reshape(repeat(1:5, 500) .+ 0.1*rand(500*5), 50, 50)
+est = SpatialSymbolicPermutation(stencil, x)
+entropy(Renyi(q = 2), est, x)
+```
+
+
 ## Spatial discrete entropy: Fabio

 Let's see how the normalized permutation and dispersion entropies increase for an image that gets progressively more noise added to it.
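As a rough illustration of that last sentence, one could add increasing amounts of noise to a structured image and track the normalized spatial permutation entropy. This is a sketch only: the striped test image, the noise levels, and the use of `entropy_normalized` with a spatial estimator are assumptions for illustration, not part of the diff:

```julia
using Entropies

img = Float64.(reshape(repeat(1:5, 500), 50, 50))  # a simple striped "image"
stencil = [1 1; 0 1]

# Normalized spatial permutation entropy as progressively more noise is added.
h_vs_noise = map(0.0:0.5:2.0) do σ
    noisy = img .+ σ .* randn(size(img))
    est = SpatialSymbolicPermutation(stencil, noisy)
    entropy_normalized(est, noisy)  # assumed to default to normalized Shannon entropy
end
```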
 In the literature, the term "entropy" is used (and abused) in multiple contexts.
-The API and documentation of Entropies.jl aim to clarify some aspects of its usage, and
-to provide a simple way to obtain probabilities, entropies, or other complexity measures.
+The API and documentation of Entropies.jl aim to clarify some aspects of its usage, and to provide a simple way to obtain probabilities, entropies, or other complexity measures.

 ### Probabilities

 Entropies and other complexity measures are typically computed based on _probability distributions_.
-These are obtained from [Input data for Entropies.jl](@ref) in a plethora of different ways.
-The central API function that returns a probability distribution (in fact, just a vector of probabilities) is [`probabilities`](@ref), which takes in a subtype of [`ProbabilitiesEstimator`](@ref) to specify how the probabilities are computed.
-All estimators available in Entropies.jl can be found in the [estimators page](@ref probabilities_estimators).
+These can be obtained from input data in a plethora of different ways.
+The central API function that returns a probability distribution (or more precisely a probability mass function) is [`probabilities`](@ref), which takes in a subtype of [`ProbabilitiesEstimator`](@ref) to specify how the probabilities are computed.
+All available estimators can be found in the [estimators page](@ref probabilities_estimators).

 ### Entropies

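For context, the `probabilities` workflow described in the added lines above looks roughly like this. A hedged sketch: `CountOccurrences` is one of the estimators touched by this PR (see the commit list), and the call signature `probabilities(est, x)` is assumed from the surrounding documentation:

```julia
using Entropies

x = rand(1:3, 1000)        # input data whose distinct elements can be counted
est = CountOccurrences()   # a ProbabilitiesEstimator: probabilities from raw counts
p = probabilities(est, x)  # a Probabilities vector whose entries sum to 1
```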
@@ -40,24 +39,28 @@ Thus, any of the implemented [probabilities estimators](@ref probabilities_estim

 These names are commonplace, and so in Entropies.jl we provide convenience functions like [`entropy_wavelet`](@ref). However, it should be noted that these functions really aren't anything more than 2-lines-of-code wrappers that call [`entropy`](@ref) with the appropriate [`ProbabilitiesEstimator`](@ref).

-In addition to `ProbabilitiesEstimators`, we also provide [`EntropyEstimator`](@ref)s,
-which compute entropies via alternate means, without explicitly computing some
+In addition to `ProbabilitiesEstimators`, we also provide [`EntropyEstimator`](@ref)s,
+which compute entropies via alternate means, without explicitly computing some
 probability distribution. Differential/continuous entropy, for example, is computed
-using a dedicated [`EntropyEstimator`](@ref). For example, the [`Kraskov`](@ref)
-estimator computes Shannon differential entropy via a nearest neighbor algorithm, while
-the [`Zhu`](@ref) estimator computes Shannon differential entropy using order statistics.
+using a dedicated [`EntropyEstimator`](@ref). For example, the [`Kraskov`](@ref)
+estimator computes Shannon differential entropy via a nearest neighbor algorithm, while
+the [`Correa`](@ref) estimator computes Shannon differential entropy using order statistics.

 ### Other complexity measures

-Other complexity measures, which strictly speaking don't compute entropies, and may or may
-not explicitly compute probability distributions, are found in
-[Complexity.jl](https://github.com/JuliaDynamics/Complexity.jl) package. This includes
-measures like sample entropy and approximate entropy.
+Other complexity measures, which strictly speaking don't compute entropies, and may or may not explicitly compute probability distributions, are found in
+[Complexity measures](@ref) page.
+This includes measures like sample entropy and approximate entropy.

 ## [Input data for Entropies.jl](@id input_data)

-The input data type typically depend on the probability estimator chosen. In general though, the standard DynamicalSystems.jl approach is taken and as such we have three types of input data:
+The input data type typically depend on the probability estimator chosen.
+In general though, the standard DynamicalSystems.jl approach is taken and as such we have three types of input data:

 - _Timeseries_, which are `AbstractVector{<:Real}`, used in e.g. with [`WaveletOverlap`](@ref).
 - _Multi-dimensional timeseries, or datasets, or state space sets_, which are [`Dataset`](@ref), used e.g. with [`NaiveKernel`](@ref).
 - _Spatial data_, which are higher dimensional standard `Array`s, used e.g. with [`SpatialSymbolicPermutation`](@ref).
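Finally, a hedged sketch tying the three input-data types above to the two estimator families discussed earlier. The three-argument `entropy(e, est, x)` form mirrors the spatial example earlier in this PR; the keyword `k` for `Kraskov` and the default `WaveletOverlap()` wavelet are assumptions about the API at the time of this PR:

```julia
using Entropies

# Timeseries input with a ProbabilitiesEstimator.
x = rand(1000)
h1 = entropy(Shannon(), WaveletOverlap(), x)

# Dataset (state space set) input with an EntropyEstimator (differential entropy).
D = Dataset(rand(1000, 3))
h2 = entropy(Shannon(), Kraskov(k = 3), D)

# Spatial input: a higher-dimensional array with a spatial estimator.
img = rand(50, 50)
h3 = entropy(SpatialSymbolicPermutation([1 1; 0 1], img), img)
```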
0 commit comments