Explainable artificial intelligence reveals environmental constraints in seagrass distribution

Bibliographic details
Published in: Ecological Indicators, 2022-11, Vol. 144, p. 109523, Article 109523
Main authors: He, Bohao; Zhao, Yanghe; Mao, Wei
Format: Article
Language: English
Abstract:

Highlights:
• Ensemble models were used to predict the potential distribution areas of seagrass.
• For the first time, an explainable artificial intelligence method is proposed to interpret the distribution area of seagrass in Hainan, China.
• Regional- and site-level environmental features are affected differently.
• Explainable artificial intelligence reveals the interactive effects of environmental variables in species distribution models.

Seagrass is a globally vital marine resource that plays an essential role in combating climate change, protecting coastlines, ensuring food security, and enriching biodiversity. However, global climate change and human activities have caused dramatic environmental changes that severely affect seagrass growth and development. It is therefore crucial to understand accurately how environmental changes affect seagrass distribution. In this study, we selected the seagrass distribution area in Hainan, China, as the study area and proposed an ensemble model combining five machine learning models to predict the potential distribution of seagrass. Fifteen environmental variables were entered into the model, and the results showed that the ensemble model provided the highest accuracy (area under the curve (AUC) = 0.91). The environmental variables were then interpreted at the regional and site levels with the help of explainable artificial intelligence (XAI) methods. The contributions of environmental variables differ between the regional and site levels, and the model provides a more reasonable explanation at the site level. Shapley value (SHAP) and partial dependence plot (PDP) analyses explain the importance of environmental variables in the seagrass distribution model and the effect of interactions among multiple environmental variables on the prediction results, thereby opening the machine learning black box. Further evidence that explainable artificial intelligence can explain the effects of environmental variables in seagrass distribution models will help improve environmental understanding in seagrass conservation.
ISSN: 1470-160X, 1872-7034
DOI: 10.1016/j.ecolind.2022.109523
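Illustrative sketch (not the authors' code): the abstract describes an ensemble of machine learning models evaluated by AUC and then interpreted with SHAP and partial dependence plots at regional and site levels. The Python snippet below shows, under assumed tooling (scikit-learn and the shap package) and with synthetic stand-in data, how such a workflow could look in outline; the base learners, feature names, and sample sizes are assumptions and do not reproduce the paper's five-model ensemble or its fifteen real environmental variables.

```python
# Minimal sketch, NOT the authors' code: an ensemble presence/absence classifier
# interpreted with SHAP (regional- and site-level explanations) and partial
# dependence plots (marginal and interaction effects). Data, base learners,
# and feature names are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.inspection import PartialDependenceDisplay
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
import shap

# Synthetic stand-in for seagrass presence/absence with 15 environmental predictors.
X, y = make_classification(n_samples=500, n_features=15, n_informative=8,
                           random_state=0)
feature_names = [f"env_var_{i}" for i in range(15)]  # e.g. depth, SST, salinity
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Soft-voting ensemble of several base learners (the paper combines five models;
# three are used here purely for brevity).
ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("gb", GradientBoostingClassifier(random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
auc = roc_auc_score(y_test, ensemble.predict_proba(X_test)[:, 1])
print(f"Ensemble AUC: {auc:.2f}")


def predict_presence(data):
    """Predicted probability of seagrass presence from the ensemble."""
    return ensemble.predict_proba(data)[:, 1]


# SHAP: model-agnostic explanation of the ensemble's predicted presence probability.
explainer = shap.Explainer(predict_presence, X_train[:100],
                           feature_names=feature_names)
shap_values = explainer(X_test[:50])
shap.plots.bar(shap_values)           # regional-level (global) feature importance
shap.plots.waterfall(shap_values[0])  # site-level explanation for one location

# PDP: marginal effects of two variables and their pairwise interaction.
PartialDependenceDisplay.from_estimator(
    ensemble, X_test, features=[0, 1, (0, 1)], feature_names=feature_names
)
```

A soft-voting ensemble is used here only as one simple way to combine classifiers; the abstract does not specify how the five models are combined, nor which SHAP explainer the authors applied.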