Characterizations of Learnability for Classes of {0, ..., n}-Valued Functions
Bibliographic details
Published in: Journal of Computer and System Sciences, 1995-02, Vol. 50 (1), p. 74-86
Main authors: Ben-David, S., Cesa-Bianchi, N., Haussler, D., Long, P.M.
Format: Article
Language: English
Online access: Full text
Description
Abstract: We investigate the PAC learnability of classes of {0, ..., n}-valued functions (n < ∞). For n = 1 it is known that the finiteness of the Vapnik-Chervonenkis dimension is necessary and sufficient for learning. For n > 1 several generalizations of the VC-dimension, each yielding a distinct characterization of learnability, have been proposed by a number of researchers. In this paper we present a general scheme for extending the VC-dimension to the case n > 1. Our scheme defines a wide variety of notions of dimension in which all these variants of the VC-dimension, previously introduced in the context of learning, appear as special cases. Our main result is a simple condition characterizing the set of notions of dimension whose finiteness is necessary and sufficient for learning. This provides a variety of new tools for determining the learnability of a class of multi-valued functions. Our characterization is also shown to hold in the "robust" variant of the PAC model and for any "reasonable" loss function.
ISSN:0022-0000
1090-2724
DOI:10.1006/jcss.1995.1008
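The abstract's n = 1 case rests on the Vapnik-Chervonenkis dimension: the largest set of domain points on which the function class realizes all possible {0, 1}-labelings. The following is a minimal illustrative sketch, not the paper's method: a brute-force VC-dimension computation for a finite class over a finite domain (the function and variable names are placeholders chosen for this example).

```python
from itertools import combinations

def vc_dimension(functions, domain):
    """Brute-force VC dimension of a finite class of {0,1}-valued
    functions over a finite domain. Exponential time; for illustration only."""
    best = 0
    for size in range(1, len(domain) + 1):
        shattered_any = False
        for subset in combinations(domain, size):
            # Collect every labeling the class induces on this subset.
            patterns = {tuple(f(x) for x in subset) for f in functions}
            if len(patterns) == 2 ** size:  # all 2^|subset| labelings realized
                shattered_any = True
                break
        if shattered_any:
            best = size
        else:
            break  # no set of this size is shattered, so none larger is either
    return best

# Example: threshold functions x -> [x >= t] on {0,1,2,3}.
# A single point is shattered, but no pair (a, b) with a < b admits the
# labeling (1, 0), so the VC dimension is 1.
domain = [0, 1, 2, 3]
thresholds = [lambda x, t=t: int(x >= t) for t in range(5)]
print(vc_dimension(thresholds, domain))  # -> 1
```

For n > 1, the multiclass generalizations the abstract refers to (the variants of the VC-dimension proposed for {0, ..., n}-valued classes) replace the "all 2^|subset| labelings" test with different shattering conditions; the paper's contribution is a condition identifying which such notions of dimension characterize learnability.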