On Cumulative Tsallis Entropies
Published in: Acta applicandae mathematicae 2023-12, Vol. 188 (1), p. 9, Article 9
Format: Article
Language: English
Online access: Full text
Abstract: We investigate the cumulative Tsallis entropy, an information measure recently introduced as a cumulative version of the classical Tsallis differential entropy, which is itself a generalization of the Boltzmann-Gibbs statistics. This functional is here considered as a perturbation of the expected mean residual life via some power weight function. This point of view leads to the introduction of the dual cumulative Tsallis entropy and of two families of coherent risk measures generalizing those built on the mean residual life. We characterize the finiteness of the cumulative Tsallis entropy in terms of L^p-spaces and show how they determine the underlying distribution. The range of the functional is described exactly under various constraints, with optimal bounds improving on all those previously available in the literature. Whereas the maximization of the Tsallis differential entropy gives rise to the classical q-Gaussian distribution, a generalization of the Gaussian having a finite range or heavy tails, the maximization of the cumulative Tsallis entropy leads to an analogous perturbation of the Logistic distribution.
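To make the quantity concrete: a common form of the cumulative Tsallis entropy found in the literature (the paper's exact definition may differ in normalization) is CT_q(X) = 1/(q-1) ∫ (F̄(x) - F̄(x)^q) dx, where F̄ is the survival function and q > 0, q ≠ 1. The following sketch estimates it numerically from samples of a nonnegative random variable; the function name and grid-based integration are illustrative choices, not taken from the paper.

```python
import numpy as np

def cumulative_tsallis_entropy(samples, q, grid_size=10_000):
    """Estimate CT_q(X) = 1/(q-1) * integral of (Fbar(x) - Fbar(x)**q) dx
    from samples of a nonnegative random variable (one common definition
    in the literature; q > 0, q != 1)."""
    samples = np.sort(np.asarray(samples, dtype=float))
    n = samples.size
    # Evaluate the empirical survival function Fbar(x) = P(X > x) on a grid.
    grid = np.linspace(0.0, samples[-1], grid_size)
    fbar = 1.0 - np.searchsorted(samples, grid, side="right") / n
    integrand = fbar - fbar ** q
    dx = grid[1] - grid[0]
    # Riemann-sum approximation of the integral.
    return integrand.sum() * dx / (q - 1)

# Sanity check: for a standard exponential, the formula above gives
# CT_q = 1/q in closed form, so CT_2 should be close to 0.5.
rng = np.random.default_rng(0)
x = rng.exponential(1.0, size=200_000)
print(cumulative_tsallis_entropy(x, q=2.0))
```

For Exp(1) the integrand is e^{-x} - e^{-qx}, whose integral is 1 - 1/q, giving CT_q = 1/q; this closed form makes the estimator easy to validate.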
ISSN: 0167-8019, 1572-9036
DOI: 10.1007/s10440-023-00620-3