Can Hyperspectral Imaging and Neural Network Classification Be Used for Ore Grade Discrimination at the Point of Excavation?
Saved in:
Published in: Sensors (Basel, Switzerland), 2022-03, Vol. 22 (7), p. 2687
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: This work determines whether hyperspectral imaging is suitable for discriminating ore from waste at the point of excavation. A prototype scanning system was developed for this study, combining hyperspectral cameras and a three-dimensional LiDAR mounted on a pan-tilt head with a positioning system that determined the spatial location of the resultant hyperspectral data cube. This system was used to obtain scans both in the laboratory and at a gold mine in Western Australia. Samples from this mine site were assayed to determine their gold concentration and were scanned with the hyperspectral apparatus in the laboratory to create a library of labelled reference spectra. This library was used as (i) the reference set for spectral angle mapper classification and (ii) the training set for a convolutional neural network classifier. Both classification approaches were found to classify ore and waste on the scanned face with good accuracy when compared to the mine geological model. Finer-grained classification of ore grade was limited by the quality and quantity of the training data. The work provides evidence that an excavator-mounted hyperspectral system could guide a human or autonomous excavator operator to selectively dig ore and minimise dilution.
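The spectral angle mapper (SAM) classification named in the abstract compares each pixel spectrum against labelled reference spectra by the angle between them as vectors, assigning the label of the closest match. A minimal sketch of the idea (the two-band toy spectra and function names are illustrative, not taken from the paper):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference)
    )
    # Clip guards against floating-point values slightly outside [-1, 1].
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def sam_classify(pixel, library):
    """Assign the label of the reference spectrum with the smallest angle."""
    return min(library, key=lambda label: spectral_angle(pixel, library[label]))

# Toy library of labelled reference spectra (two spectral bands, illustrative)
library = {
    "ore":   np.array([0.8, 0.2]),
    "waste": np.array([0.3, 0.7]),
}

pixel = np.array([0.75, 0.25])       # unlabelled pixel spectrum from a scan
print(sam_classify(pixel, library))  # → ore
```

Because the angle depends only on the direction of the spectral vector, SAM is insensitive to overall brightness differences, which is one reason it is a common baseline for hyperspectral classification.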
ISSN: 1424-8220
DOI: 10.3390/s22072687