Neural networks taking probability distributions as input: A framework for analyzing exchangeable networks
Published in: Neurocomputing (Amsterdam) 2024-06, Vol.584, p.127572, Article 127572
Main authors: , ,
Format: Article
Language: eng
Online access: Full text
Abstract: In recent years, exchangeable network structures that take datasets as input have been widely used to obtain representations of various datasets. Although they perform well, analyzing an exchangeable network with a dataset as input is challenging. Since datasets are drawn from various distributions, this type of network can be viewed as a function acting on probability measures, and this paper therefore analyzes exchangeable network structures theoretically from a probabilistic perspective. This paper proposes a probabilistic analytical framework in which neural networks act on probability measures, an extension of Multi-Layer Perceptrons (MLPs). When only samples from each distribution are available, neural networks acting on probability measures in this framework correspond to the traditional exchangeable structure defined on datasets. Using this framework, we demonstrate the ability of exchangeable network structures to capture complex patterns, as the framework yields the universal approximation property of exchangeable network structures. Furthermore, we derive a consistency result showing that the parameter estimation of exchangeable network structures is statistically consistent under certain conditions.
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2024.127572
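The exchangeable structure the abstract refers to is a network whose output is unchanged under any reordering of the input dataset, i.e. a function of the empirical measure (1/n) Σᵢ δ_{xᵢ} rather than of the ordered sample. A minimal sketch of such a network (in the DeepSets style, with mean pooling as integration of a per-point feature map against the empirical measure) is given below; all dimensions, weights, and function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: input points in R^3, 16 pooled features, 4 outputs.
d_in, d_hid, d_out = 3, 16, 4

# Random (untrained) weights for the two sub-networks.
W1 = rng.normal(size=(d_in, d_hid))
b1 = rng.normal(size=d_hid)
W2 = rng.normal(size=(d_hid, d_out))
b2 = rng.normal(size=d_out)

def phi(x):
    # Per-point feature map phi : R^{d_in} -> R^{d_hid} (one MLP layer).
    return np.tanh(x @ W1 + b1)

def rho(z):
    # Readout rho : R^{d_hid} -> R^{d_out} applied to the pooled features.
    return np.tanh(z @ W2 + b2)

def exchangeable_net(X):
    # X has shape (n, d_in). Mean pooling of phi over the rows is exactly
    # the integral of phi against the empirical measure of the dataset,
    # so the output cannot depend on the order of the samples.
    return rho(phi(X).mean(axis=0))

# Permuting the dataset leaves the output unchanged (up to float rounding).
X = rng.normal(size=(100, d_in))
out1 = exchangeable_net(X)
out2 = exchangeable_net(X[rng.permutation(100)])
assert np.allclose(out1, out2)
```

Because the pooled representation depends on the input only through the empirical measure, the same function extends naturally to arbitrary probability measures by replacing the sample mean with an expectation, which is the viewpoint the paper's framework formalizes.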