Data-Efficient Mutual Information Neural Estimator
Format: Article
Language: English
Online Access: Order full text
Abstract: Measuring Mutual Information (MI) between high-dimensional, continuous random variables from observed samples has wide theoretical and practical applications. Recent work, MINE (Belghazi et al. 2018), focused on estimating tight variational lower bounds of MI using neural networks, but assumed an unlimited supply of samples to prevent overfitting. In real-world applications, data is not always available in surplus. In this work, we focus on improving data efficiency and propose a Data-Efficient MINE Estimator (DEMINE) by developing a relaxed predictive MI lower bound that can be estimated with orders-of-magnitude higher data efficiency. The predictive MI lower bound also enables us to develop a new meta-learning approach using task augmentation, Meta-DEMINE, which improves generalization of the network and further boosts estimation accuracy empirically. With improved data efficiency, our estimators enable statistical testing of dependency at practical dataset sizes. We demonstrate the effectiveness of our estimators on synthetic benchmarks and on real-world fMRI data, with an application to inter-subject correlation analysis.
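The MINE baseline referenced in the abstract maximizes the Donsker-Varadhan lower bound I(X;Z) ≥ E_P[T(x,z)] − log E_{P_X⊗P_Z}[exp T(x,z)] over a neural network T. As a rough, generic illustration of that bound (not the authors' DEMINE or Meta-DEMINE implementation), the PyTorch sketch below estimates MI between two correlated Gaussian variables; the names StatisticsNetwork and mine_lower_bound, the toy data, and all hyperparameters are illustrative assumptions.

```python
import math

import torch
import torch.nn as nn


class StatisticsNetwork(nn.Module):
    """Small MLP T(x, z) that scores joint vs. product-of-marginals samples."""

    def __init__(self, x_dim, z_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=-1))


def mine_lower_bound(T, x, z):
    """Donsker-Varadhan bound: E_P[T] - log E_{P_X x P_Z}[exp(T)].

    Samples from the product of marginals are approximated by shuffling
    z within the batch.
    """
    joint_term = T(x, z).mean()
    z_shuffled = z[torch.randperm(z.size(0))]
    scores = T(x, z_shuffled).squeeze(-1)
    marginal_term = torch.logsumexp(scores, dim=0) - math.log(scores.size(0))
    return joint_term - marginal_term


if __name__ == "__main__":
    torch.manual_seed(0)
    n = 2000
    x = torch.randn(n, 1)
    z = x + 0.5 * torch.randn(n, 1)  # dependent pair; analytic MI = 0.5*ln(5) ~ 0.80 nats
    T = StatisticsNetwork(1, 1)
    opt = torch.optim.Adam(T.parameters(), lr=1e-3)
    for _ in range(500):
        opt.zero_grad()
        loss = -mine_lower_bound(T, x, z)  # gradient ascent on the bound
        loss.backward()
        opt.step()
    print("Estimated MI lower bound (nats):", mine_lower_bound(T, x, z).item())
```

A plain estimator like this tends to overfit when the sample budget is small, which is the limitation the paper's relaxed predictive bound and meta-learning procedure are designed to address.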
DOI: 10.48550/arxiv.1905.03319