Tree Cover Estimation in Global Drylands from Space Using Deep Learning

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), 2020-02, Vol. 12 (3), p. 343
Authors: Guirado, Emilio; Alcaraz-Segura, Domingo; Cabello, Javier; Puertas-Ruíz, Sergio; Herrera, Francisco; Tabik, Siham
Format: Article
Language: English
Online access: Full text
Description
Abstract: Accurate tree cover mapping is of paramount importance in many fields, from biodiversity conservation to carbon stock estimation, ecohydrology, erosion control, and Earth system modelling. Despite this importance, there is still uncertainty about global forest cover, particularly in drylands. Recently, the Food and Agriculture Organization of the United Nations (FAO) conducted a costly global assessment of dryland forest cover through the visual interpretation of orthoimages using the Collect Earth software, involving hundreds of operators from around the world. Our study proposes a new automatic method for estimating tree cover using artificial intelligence and free orthoimages. Our results show that our tree cover classification model, based on convolutional neural networks (CNN), is 23% more accurate than the manual visual interpretation used by FAO, reaching up to 79% overall accuracy. The smallest differences between the two methods occurred in the driest regions, but disagreement increased with the percentage of tree cover. The application of CNNs could be used to improve and reduce the cost of tree cover maps from the local to the global scale, with broad implications for research and management.
ISSN: 2072-4292
DOI: 10.3390/rs12030343
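
Illustrative note: the abstract describes classifying orthoimage patches by tree cover with a convolutional neural network. The sketch below is a minimal, hypothetical PyTorch example of such a patch classifier; the 64x64 RGB patch size, layer widths, and two-class output are assumptions made for illustration only and do not reproduce the architecture or data used in the published study.

import torch
import torch.nn as nn

class TreeCoverCNN(nn.Module):
    """Small CNN that labels a fixed-size orthoimage patch as tree cover / no tree cover."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Three conv + pool stages reduce a 64x64 patch to an 8x8 feature map.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fully connected head producing one logit per class.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = TreeCoverCNN()
    # Four synthetic RGB patches standing in for orthoimage tiles.
    patches = torch.randn(4, 3, 64, 64)
    logits = model(patches)
    print(logits.shape)  # torch.Size([4, 2])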