On cumulative Tsallis entropies
Saved in:

| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | eng |
| Keywords: | |
| Online Access: | Order full text |
Summary: | We investigate the cumulative Tsallis entropy, an information measure
recently introduced as a cumulative version of the classical Tsallis
differential entropy, which is itself a generalization of Boltzmann-Gibbs
statistics. This functional is considered here as a perturbation of the
expected mean residual life via a power weight function. This point of view
leads to the introduction of the dual cumulative Tsallis entropy and of two
families of coherent risk measures generalizing those built on the mean
residual life. We characterize the finiteness of the cumulative Tsallis
entropy in terms of ${\mathcal L}_p$-spaces and show how these entropies
determine the underlying distribution. The range of the functional is
described exactly under various constraints, with optimal bounds improving on
all those previously available in the literature. Whereas the maximization of
the Tsallis differential entropy gives rise to the classical $q$-Gaussian
distribution, a generalization of the Gaussian having a finite range or heavy
tails, the maximization of the cumulative Tsallis entropy leads to an
analogous perturbation of the Logistic distribution. |
DOI: | 10.48550/arxiv.2210.09047 |
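The abstract does not spell out the functional itself. A common definition in the cumulative-entropy literature, for a nonnegative random variable $X$ with survival function $\bar F$, is $CT_q(X) = \frac{1}{q-1}\int_0^\infty \bigl(\bar F(x) - \bar F(x)^q\bigr)\,dx$; since $\mathbb{E}[X] = \int_0^\infty \bar F(x)\,dx$, this is indeed a power-weighted perturbation of the expected (mean residual) lifetime. The sketch below, which assumes this definition, approximates it numerically and checks it against the Exponential distribution, where the integral has the closed form $1/(q\lambda)$:

```python
import numpy as np

def cumulative_tsallis_entropy(survival, q, grid):
    """Approximate CT_q(X) = 1/(q-1) * integral of (Fbar - Fbar**q).

    The formula is taken from the cumulative-entropy literature and is an
    assumption here, not quoted from the paper above. `survival` maps a
    NumPy array of points to the survival function Fbar evaluated there.
    """
    fbar = survival(grid)
    integrand = (fbar - fbar ** q) / (q - 1.0)
    # Trapezoidal rule over the supplied grid (version-agnostic, no np.trapz).
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(grid)))

# Sanity check on Exponential(rate=lam), where CT_q(X) = 1/(q*lam) exactly.
lam, q = 1.0, 2.0
grid = np.linspace(0.0, 50.0, 200_001)
approx = cumulative_tsallis_entropy(lambda x: np.exp(-lam * x), q, grid)
print(approx)  # close to 1/(q*lam) = 0.5
```

As $q \to 1$ the weight $(\bar F - \bar F^q)/(q-1)$ tends to $-\bar F \log \bar F$, recovering the cumulative residual (Shannon-type) entropy, which is the sense in which the Tsallis construction generalizes the Boltzmann-Gibbs case.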