Integrating Multiple Learning Strategies in First Order Logics
Published in: Machine Learning, 1997-06, Vol. 27 (3), p. 209
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: This paper describes a representation framework that offers a unifying platform for alternative systems that learn concepts in First Order Logics. The main aspects of this framework are discussed. First, the separation between the hypothesis logical language (a version of the VL21 language) and the representation of data by means of a relational database is motivated. Then the functional layer between data and hypotheses, which makes the data accessible to the logical level through a set of abstract properties, is described. A novelty in the hypothesis representation language is the introduction of the construct of internal disjunction; this construct, first used by the AQ and Induce systems, is here made operational via a set of algorithms capable of learning it for both discrete and continuous-valued attributes. These algorithms are embedded in learning systems (SMART+, REGAL, SNAP, WHY, RTL) using different paradigms (symbolic, genetic, or connectionist), thus realizing an effective integration among them; in fact, categorical and numerical attributes can be handled in a uniform way. In order to exemplify the effectiveness of the representation framework and of the multistrategy integration, the results obtained by the above systems in some application domains are summarized.
ISSN: 0885-6125, 1573-0565
DOI: 10.1023/A:1007361708126
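
The abstract mentions the internal-disjunction construct, by which a single selector lets a categorical attribute range over a set of admissible values and a numeric attribute over an interval. The sketch below is a minimal, hypothetical illustration of that idea, not code from the paper; the `Selector` class and all names are assumptions introduced only to show how discrete and continuous attributes could be tested in a uniform way.

```python
# Hypothetical sketch (not from the paper): an "internal disjunction" selector
# lets one attribute match any of several symbols (discrete case) or fall in
# an interval (continuous case), so both attribute types share one test form.

from dataclasses import dataclass
from typing import Union, FrozenSet, Tuple


@dataclass(frozen=True)
class Selector:
    attribute: str
    # Either a set of admissible symbols, e.g. {"red", "blue"},
    # or a numeric interval (low, high) for continuous attributes.
    domain: Union[FrozenSet[str], Tuple[float, float]]

    def covers(self, example: dict) -> bool:
        value = example.get(self.attribute)
        if value is None:
            return False
        if isinstance(self.domain, frozenset):
            return value in self.domain      # discrete internal disjunction
        low, high = self.domain
        return low <= value <= high          # continuous internal disjunction


if __name__ == "__main__":
    # Rule: color(X) in {red, blue}  AND  length(X) in [2.0, 4.5]
    rule = [Selector("color", frozenset({"red", "blue"})),
            Selector("length", (2.0, 4.5))]
    example = {"color": "blue", "length": 3.1}
    print(all(sel.covers(example) for sel in rule))  # True
```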