AUTOMATED STAR/GALAXY DISCRIMINATION WITH NEURAL NETWORKS
Published in: The Astronomical Journal, 1992-01, Vol. 103 (1), p. 318-331
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: We discuss progress in the development of automatic star/galaxy discriminators for processing images generated by the University of Minnesota Automated Plate Scanner (APS) for cataloging the first epoch Palomar Sky Survey. Classifications are based on 14 image parameters computed for each object detected by the APS operating in a threshold densitometry mode. It is shown that a number of parameter spaces formed with these vector elements are effective in separating a sample into the two basic populations of stellar and nonstellar objects. An artificial intelligence technique known as a neural network is employed to perform the image classification. We have experimented with a simple linear classifier known as a perceptron, as well as with a more sophisticated backpropagation neural network, with the result that we are able to attain classification success rates of 99% for galaxy images with B ≤ 18.5 and above 95% for the magnitude range 18.5 ≤ B ≤ 19.5. The analysis presented here uses a training dataset consisting of 2665 galaxies and 2082 stars, along with a test sample of 936 galaxies and 2378 stars. We have determined the success rate of these classifiers as a function of image diameter and integrated magnitude. Simple numerical experiments have been conducted in an effort to illustrate the robust nature of this method as well as to isolate the most significant image parameters used by the networks in distinguishing image class.
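The simpler of the two classifiers the abstract names, the perceptron, can be sketched in a few lines: a linear unit that maps a 14-element image-parameter vector to a star/nonstellar decision and updates its weights on each misclassified example. The feature values, training data, and thresholds below are synthetic placeholders for illustration only, not the APS measurements or the paper's actual parameters.

```python
import random

N_FEATURES = 14  # the abstract's 14 image parameters per detected object


def train_perceptron(samples, epochs=50, lr=0.1):
    """Classic perceptron rule. samples: list of (features, label),
    label 0 = star, 1 = nonstellar (galaxy)."""
    w = [0.0] * N_FEATURES
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            if err:
                # Nudge the separating hyperplane toward the missed example.
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
    return w, b


def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0


# Toy, linearly separable data: synthetic "galaxies" simply have larger
# feature values than synthetic "stars" in every component.
random.seed(0)
stars = [([random.uniform(0.0, 0.4) for _ in range(N_FEATURES)], 0)
         for _ in range(50)]
galaxies = [([random.uniform(0.6, 1.0) for _ in range(N_FEATURES)], 1)
            for _ in range(50)]

w, b = train_perceptron(stars + galaxies)
accuracy = sum(classify(w, b, x) == y for x, y in stars + galaxies) / 100
```

Because the perceptron can only draw a single hyperplane in the 14-dimensional parameter space, the paper's higher success rates at faint magnitudes rely on the backpropagation network, which can form nonlinear decision boundaries.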
ISSN: 0004-6256, 1538-3881
DOI: 10.1086/116063