Robust boosting neural networks with random weights for multivariate calibration of complex samples
Saved in:
Published in: | Analytica chimica acta, 2018-06, Vol. 1009, p. 20-26 |
Main authors: | |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Neural networks with random weights (NNRW) have been used for regression because of their excellent performance. However, NNRW is sensitive to outliers and somewhat unstable when dealing with real-world complex samples. To overcome these drawbacks, a new method called robust boosting NNRW (RBNNRW) is proposed by integrating a robust version of boosting with NNRW. The method builds a large number of NNRW sub-models sequentially by robustly reweighted sampling from the original training set and then aggregates their predictions by weighted median. The performance of RBNNRW is tested on three spectral datasets of wheat, light gas oil and diesel fuel samples. For comparison, the conventional PLS, NNRW and boosting NNRW (BNNRW) methods have also been investigated. The results demonstrate that the introduction of robust boosting greatly enhances the stability and accuracy of NNRW. Moreover, RBNNRW is superior to BNNRW, particularly when outliers exist.
• A novel ensemble method named robust boosting neural networks with random weights (RBNNRW) is proposed.
• A Hampel robust step is introduced into the method.
• The method shows marked superiority in predictive accuracy and stability, especially when outliers exist.
An illustrative code sketch of this procedure follows the record below. |
ISSN: | 0003-2670 1873-4324 |
DOI: | 10.1016/j.aca.2018.01.013 |
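To make the procedure summarized in the abstract concrete, the following is a minimal Python sketch of the idea: single-hidden-layer networks with random, fixed input weights whose output weights are fitted by least squares, sequential resampling of the training set driven by Hampel-type robust weights on the residuals, and aggregation of the sub-model predictions by weighted median. The class and function names, Hampel cut-offs, number of sub-models and boosting details are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the robust boosting NNRW (RBNNRW) idea described in the abstract.
# Hyper-parameters (hidden size, number of sub-models, Hampel cut-offs a/b/c)
# are illustrative assumptions, not values from the paper.
import numpy as np


class NNRW:
    """Single-hidden-layer net with random, fixed input weights; only the
    output weights are fitted, by least squares (ELM/RVFL style)."""

    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng if rng is not None else np.random.default_rng()

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                    # hidden-layer outputs
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # least-squares output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta


def hampel_weights(residuals, a=2.0, b=4.0, c=8.0):
    """Redescending Hampel-type weights: ~1 for small residuals, 0 for gross outliers."""
    s = np.median(np.abs(residuals)) / 0.6745 + 1e-12       # MAD-based robust scale
    r = np.abs(residuals) / s
    w = np.ones_like(r)
    mid, tail = (r > a) & (r <= b), (r > b) & (r <= c)
    w[mid] = a / r[mid]
    w[tail] = a * (c - r[tail]) / ((c - b) * r[tail])
    w[r > c] = 0.0
    return w


def weighted_median(values, weights):
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cdf = np.cumsum(w) / np.sum(w)
    return v[np.searchsorted(cdf, 0.5)]


def rbnnrw_fit_predict(X, y, X_new, n_models=100, n_hidden=50, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    p = np.full(n, 1.0 / n)                  # sampling probabilities over training objects
    models, votes = [], []
    for _ in range(n_models):
        idx = rng.choice(n, size=n, replace=True, p=p)      # reweighted resampling
        m = NNRW(n_hidden, rng).fit(X[idx], y[idx])
        res = y - m.predict(X)
        w = hampel_weights(res)              # down-weight suspected outliers
        rel = np.abs(res) / (np.max(np.abs(res)) + 1e-12)   # relative errors in [0, 1]
        loss = np.average(rel, weights=w + 1e-12)
        if loss >= 0.5:                      # weak sub-model: reset sampling weights
            p = np.full(n, 1.0 / n)
            continue
        beta = loss / (1.0 - loss)
        models.append(m)
        votes.append(np.log(1.0 / beta))     # vote weight of this sub-model
        # emphasise hard-to-fit points while suppressing suspected outliers
        p = w * beta ** (1.0 - rel)
        p = p / p.sum() if p.sum() > 0 else np.full(n, 1.0 / n)
    preds = np.array([m.predict(X_new) for m in models])    # (n_kept_models, n_new)
    votes = np.array(votes)
    return np.array([weighted_median(preds[:, j], votes) for j in range(preds.shape[1])])


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(80, 10))
    y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=80)
    y[:3] += 5.0                             # a few artificial outliers
    print(rbnnrw_fit_predict(X[:60], y[:60], X[60:])[:5])
```

The weighted-median aggregation is what distinguishes this from plain bagging or averaging: a few badly fitted sub-models cannot drag the final prediction far, which is consistent with the stability gains reported in the abstract.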