Predicting Leaf Nitrogen Content in Cotton with UAV RGB Images
Saved in:
Published in: Sustainability, 2022-08, Vol. 14 (15), p. 9259
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Rapid and accurate prediction of crop nitrogen content is of great significance for guiding precision fertilization. In this study, an unmanned aerial vehicle (UAV) digital camera was used to collect cotton canopy RGB images at a flight height of 20 m, and two cotton varieties under six nitrogen gradients were used to predict canopy nitrogen content. After image preprocessing, 46 hand-crafted features were extracted, and deep features were extracted by a convolutional neural network (CNN). Partial least squares and Pearson correlation analysis were used for feature dimensionality reduction. Linear regression, support vector machine, and one-dimensional CNN regression models were constructed with the hand-crafted features as input, and the deep features were used as input to a two-dimensional CNN regression model, to achieve accurate prediction of cotton canopy nitrogen. Both the hand-crafted-feature and deep-feature models built from UAV RGB images showed good predictive performance: the optimal model for Xinluzao 45 achieved R2 = 0.80 and RMSE = 1.67 g kg−1, while the optimal model for Xinluzao 53 achieved R2 = 0.42 and RMSE = 3.13 g kg−1. The results show that UAV RGB imagery and machine learning can be used to predict the nitrogen content of cotton at large scale, but due to insufficient data samples, the accuracy and stability of the prediction models still need to be improved.
ISSN: 2071-1050
DOI: 10.3390/su14159259