Using U-Net-Like Deep Convolutional Neural Networks for Precise Tree Recognition in Very High Resolution RGB (Red, Green, Blue) Satellite Images

Very high resolution satellite imagery provides an excellent foundation for precise mapping of plant communities and even single plants. We aim to perform individual tree recognition on the basis of very high resolution RGB (red, green, blue) satellite images using deep learning approaches for northern temperate mixed forests in the Primorsky Region of the Russian Far East.

Bibliographic Details
Published in: Forests, 2021-01, Vol. 12 (1), p. 66
Main authors: Korznikov, Kirill A.; Kislov, Dmitry E.; Altman, Jan; Doležal, Jiří; Vozmishcheva, Anna S.; Krestov, Pavel V.
Format: Article
Language: English
Online access: Full text
Abstract: Very high resolution satellite imagery provides an excellent foundation for precise mapping of plant communities and even single plants. We aim to perform individual tree recognition on the basis of very high resolution RGB (red, green, blue) satellite images using deep learning approaches for northern temperate mixed forests in the Primorsky Region of the Russian Far East. We used a pansharpened satellite RGB image by GeoEye-1 with a spatial resolution of 0.46 m/pixel, obtained in late April 2019. We parametrized the standard U-Net convolutional neural network (CNN) and trained it on manually delineated satellite images to solve the satellite image segmentation problem. For comparison purposes, we also applied standard pixel-based classification algorithms, such as random forest, k-nearest neighbor classifier, naive Bayes classifier, and quadratic discrimination. Pattern-specific features based on grey level co-occurrence matrices (GLCM) were computed to improve the recognition ability of standard machine learning methods. The U-Net-like CNN allowed us to obtain precise recognition of Mongolian poplar (Populus suaveolens Fisch. ex Loudon s.l.) and evergreen coniferous trees (Abies holophylla Maxim., Pinus koraiensis Siebold & Zucc.). We were able to distinguish species belonging to either the poplar or coniferous group but were unable to separate species within the same group (i.e., A. holophylla and P. koraiensis were not distinguishable). The accuracy of recognition was estimated by several metrics and exceeded the values obtained for standard machine learning approaches. In contrast to pixel-based recognition algorithms, the U-Net-like CNN does not lead to an increase in false-positive decisions when facing green-colored objects that are similar to trees. By means of the U-Net-like CNN, we obtained a mean accuracy score of up to 0.96 in our computational experiments.
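The pattern-specific GLCM features mentioned above can be illustrated with a minimal NumPy sketch of a normalized grey-level co-occurrence matrix and two common texture descriptors (contrast and homogeneity). This is not the authors' implementation; the 8-level quantization and single-pixel horizontal offset are assumptions chosen for illustration.

```python
import numpy as np

def glcm(patch, levels, dx=1, dy=0):
    """Normalized grey-level co-occurrence matrix for one pixel offset.
    `patch` must already be quantized to integer values in [0, levels)."""
    m = np.zeros((levels, levels))
    h, w = patch.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[patch[y, x], patch[y + dy, x + dx]] += 1  # count co-occurring pairs
    return m / m.sum()

def glcm_features(patch, levels=8):
    """Contrast and homogeneity, two standard GLCM texture descriptors."""
    p = glcm(patch, levels)
    i, j = np.indices(p.shape)
    contrast = (p * (i - j) ** 2).sum()            # high for abrupt grey-level changes
    homogeneity = (p / (1.0 + np.abs(i - j))).sum()  # high for smooth textures
    return contrast, homogeneity
```

A uniform patch yields zero contrast and homogeneity 1, while a checkerboard maximizes contrast; such per-window descriptors are what give pixel-based classifiers some sensitivity to crown texture rather than color alone.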
The U-Net-like CNN recognizes tree crowns not as a set of pixels with known RGB intensities but as spatial objects with a specific geometry and pattern. This specific feature of the CNN excludes misclassifications of objects whose colors merely resemble those of the objects of interest. We highlight that using satellite images obtained within a suitable phenological season is of high importance for successful tree recognition. The suitability of the phenological season is conceptualized as a set of conditions that make the objects of interest stand out against the other components of the vegetation cover. In our case, the use of satellite images captured in mid-spring allowed us to recognize evergreen fir and pine trees as the first class of objects ("conifers") and poplars, which were in a leafless state among other deciduous tree species, as the second class.
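The U-Net-like segmentation described in the abstract rests on a symmetric encoder/decoder with skip connections. A minimal dependency-free sketch of how feature-map shapes propagate through such a network (and why tile sides must be divisible by 2**depth) is given below; the depth of 4 and 64 base channels are the standard U-Net defaults, assumed here rather than taken from the paper.

```python
def unet_shapes(side, depth=4, base_channels=64):
    """Track (spatial side, channels) through a U-Net-like encoder/decoder.

    Each encoder level is saved for its skip connection; 2x2 max-pooling
    halves the side while the channel count doubles, and the decoder
    mirrors this with up-convolutions. The tile side must be divisible
    by 2**depth, or pooling loses pixels and the skip concatenations
    no longer line up.
    """
    if side % (2 ** depth):
        raise ValueError(f"tile side {side} is not divisible by {2 ** depth}")
    encoder, s, ch = [], side, base_channels
    for _ in range(depth):
        encoder.append((s, ch))      # feature map kept for the skip connection
        s, ch = s // 2, ch * 2       # max-pool halves side, channels double
    bottleneck = (s, ch)
    decoder = []
    for skip_side, skip_ch in reversed(encoder):
        s, ch = s * 2, ch // 2       # up-convolution doubles side, halves channels
        assert (s, ch) == (skip_side, skip_ch)  # concatenation needs equal sides
        decoder.append((s, ch))
    return encoder, bottleneck, decoder
```

For a 256-pixel tile this gives a 16-pixel bottleneck with 1024 channels and restores the full 256-pixel resolution at the output, which is what lets the network label every pixel while still reasoning over crown-scale geometry.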
DOI: 10.3390/f12010066
Publisher: MDPI AG, Basel
ISSN: 1999-4907
Source: MDPI - Multidisciplinary Digital Publishing Institute; EZB-FREE-00999 freely available EZB journals
Subjects:
Abies holophylla
Accuracy
Algorithms
Artificial neural networks
Bayesian analysis
Classifiers
Computer applications
Coniferous trees
Conifers
Deciduous trees
Decision trees
Deep learning
Evergreen trees
Forests
High resolution
Image classification
Image processing
Image resolution
Image segmentation
K-nearest neighbors algorithm
Learning algorithms
Machine learning
Mixed forests
Neural networks
Object recognition
Pine trees
Pinus koraiensis
Pixels
Plant communities
Plant species
Poplar
Populus suaveolens
Remote sensing
Satellite imagery
Spatial discrimination
Spatial resolution
Species
Temperate forests
Trees
Unmanned aerial vehicles
Vegetation cover
Vegetation mapping