Using Artificial Neural Networks and Feature Saliency Techniques for Improved Iris Segmentation
Main Authors: , , ,
Format: Conference Paper
Language: eng
Subjects:
Online Access: Order full text
Summary: One of the basic challenges to robust iris recognition is iris segmentation. This paper proposes the use of a feature saliency algorithm and an artificial neural network to perform iris segmentation. Many current iris segmentation approaches assume a circular shape for the iris boundary if the iris is directly facing the camera; however, occlusion by the eyelid can cause the visible boundary to have an irregular shape. In our approach, an artificial neural network is used to statistically classify each pixel of an iris image with no assumption of circularity. First, a feed-forward feature saliency technique is performed to determine which combination of features contains the greatest discriminatory information. Image brightness, local moments, local oriented energy measurements, and relative pixel location are evaluated for saliency. Next, the set of salient features is used as the input to a multi-layer perceptron feed-forward artificial neural network trained for classification. Testing showed 96.46 percent accuracy in determining which pixels in an image of the eye were iris pixels. For occluded images, the iris masks created by the neural network were consistently more accurate than the truth masks created using the circular iris boundary assumption. Post-processing to retain only the largest contiguous piece of the iris mask increased the accuracy to 98.2 percent.
ISSN: 2161-4393, 2161-4407
DOI: 10.1109/IJCNN.2007.4371143
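The summary above describes a concrete pipeline: per-pixel features, a multi-layer perceptron classifier, and a post-processing step that keeps only the largest contiguous iris region. The sketch below is a minimal illustrative approximation in Python using scikit-learn and SciPy, not the authors' implementation: the oriented-energy features and the feed-forward saliency selection stage are omitted, and the function names (pixel_features, train_pixel_classifier, segment_iris), the local window size, and the hidden-layer width are all assumptions made for the example.

```python
import numpy as np
from scipy import ndimage
from sklearn.neural_network import MLPClassifier


def pixel_features(image):
    """Per-pixel features: brightness, local mean/variance (simple local
    moments), and relative (row, col) position; one feature row per pixel."""
    h, w = image.shape
    brightness = image.astype(float)
    local_mean = ndimage.uniform_filter(brightness, size=9)
    local_var = ndimage.uniform_filter(brightness ** 2, size=9) - local_mean ** 2
    rows, cols = np.mgrid[0:h, 0:w]
    feats = np.stack(
        [brightness, local_mean, local_var, rows / h, cols / w], axis=-1
    )
    return feats.reshape(-1, feats.shape[-1])


def train_pixel_classifier(images, masks):
    """Fit an MLP on labelled eye images; each mask holds 1 for iris pixels."""
    X = np.vstack([pixel_features(im) for im in images])
    y = np.concatenate([m.ravel() for m in masks])
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=300)
    clf.fit(X, y)
    return clf


def segment_iris(clf, image):
    """Classify every pixel, then keep only the largest contiguous
    iris region as a simple post-processing step."""
    pred = clf.predict(pixel_features(image)).reshape(image.shape).astype(bool)
    labeled, n_regions = ndimage.label(pred)
    if n_regions == 0:
        return pred
    sizes = ndimage.sum(pred, labeled, index=range(1, n_regions + 1))
    return labeled == (int(np.argmax(sizes)) + 1)
```

Retaining only the largest connected component mirrors the post-processing step the summary credits with raising accuracy from 96.46 to 98.2 percent; in practice one might also fill holes in the retained region before using it as an iris mask.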