Learning Multiscale Deep Features for High-Resolution Satellite Image Scene Classification
In this paper, we propose a multiscale deep feature learning method for high-resolution satellite image scene classification. Specifically, we first warp the original satellite image into multiple different scales. The images in each scale are employed to train a deep convolutional neural network (DCNN). However, simultaneously training multiple DCNNs is time-consuming. To address this issue, we explore a DCNN with spatial pyramid pooling (SPP-net). Since different SPP-nets have the same number of parameters and share identical initial values, only the parameters in the fully connected layers need fine-tuning, which ensures the effectiveness of each network and greatly accelerates the training process. Then, the multiscale satellite images are fed into their corresponding SPP-nets to extract multiscale deep features. Finally, a multiple kernel learning method is developed to automatically learn the optimal combination of such features. Experiments on two difficult data sets show that the proposed method achieves favorable performance compared with other state-of-the-art methods.
Published in: | IEEE transactions on geoscience and remote sensing 2018-01, Vol.56 (1), p.117-126 |
---|---|
Main authors: | Liu, Qingshan; Hang, Renlong; Song, Huihui; Li, Zhi |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 126 |
---|---|
container_issue | 1 |
container_start_page | 117 |
container_title | IEEE transactions on geoscience and remote sensing |
container_volume | 56 |
creator | Liu, Qingshan; Hang, Renlong; Song, Huihui; Li, Zhi |
description | In this paper, we propose a multiscale deep feature learning method for high-resolution satellite image scene classification. Specifically, we first warp the original satellite image into multiple different scales. The images in each scale are employed to train a deep convolutional neural network (DCNN). However, simultaneously training multiple DCNNs is time-consuming. To address this issue, we explore a DCNN with spatial pyramid pooling (SPP-net). Since different SPP-nets have the same number of parameters and share identical initial values, only the parameters in the fully connected layers need fine-tuning, which ensures the effectiveness of each network and greatly accelerates the training process. Then, the multiscale satellite images are fed into their corresponding SPP-nets to extract multiscale deep features. Finally, a multiple kernel learning method is developed to automatically learn the optimal combination of such features. Experiments on two difficult data sets show that the proposed method achieves favorable performance compared with other state-of-the-art methods. |
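The fixed-length pooling that lets each SPP-net accept inputs at a different scale can be sketched as follows. This is a minimal NumPy illustration of spatial pyramid pooling over a convolutional feature map, not the authors' implementation; the function name and pyramid levels are assumptions for illustration:

```python
import numpy as np

def spatial_pyramid_pool(fmap, levels=(1, 2, 4)):
    """Max-pool a C x H x W feature map into a fixed-length vector.

    Regardless of H and W, the output has C * sum(n*n for n in levels)
    elements, which is what lets an SPP-net accept inputs at multiple scales.
    """
    c, h, w = fmap.shape
    pooled = []
    for n in levels:
        # Bin boundaries for an n x n grid over the spatial dimensions.
        hs = np.linspace(0, h, n + 1).astype(int)
        ws = np.linspace(0, w, n + 1).astype(int)
        for i in range(n):
            for j in range(n):
                region = fmap[:, hs[i]:hs[i + 1], ws[j]:ws[j + 1]]
                pooled.append(region.max(axis=(1, 2)))
    return np.concatenate(pooled)

# Feature maps from two differently warped inputs yield the same output length.
small = spatial_pyramid_pool(np.random.rand(8, 13, 13))
large = spatial_pyramid_pool(np.random.rand(8, 25, 25))
assert small.shape == large.shape == (8 * (1 + 4 + 16),)
```

Because the output length depends only on the channel count and the pyramid levels, feature maps produced from inputs at any scale can feed the same fully connected layers, which is why only those layers need fine-tuning per scale.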
doi_str_mv | 10.1109/TGRS.2017.2743243 |
format | Article |
fullrecord | Publisher: New York: IEEE. CODEN: IGRSD2. ISSN: 0196-2892; EISSN: 1558-0644. DOI: 10.1109/TGRS.2017.2743243. IEEE document ID: 8036413. ORCID iDs: 0000-0002-5512-6984; 0000-0001-6046-3689. |
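The final fusion step described in the abstract combines per-scale deep features through multiple kernel learning. A minimal sketch of the combination step, assuming RBF base kernels and externally supplied convex weights; the paper learns these weights automatically, which is not reproduced here, and all names are illustrative:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def combine_kernels(feature_sets, weights=None, gamma=1.0):
    """Convex combination of one RBF kernel per feature scale.

    Each entry of feature_sets holds the deep features of the same
    n samples at one scale; the result is a single n x n kernel that
    a kernel classifier (e.g. an SVM) can consume.
    """
    m = len(feature_sets)
    weights = np.full(m, 1.0 / m) if weights is None else np.asarray(weights, float)
    assert np.isclose(weights.sum(), 1.0) and (weights >= 0).all()
    return sum(w * rbf_kernel(F, F, gamma) for w, F in zip(weights, feature_sets))

# Two scales of features for the same five samples yield one 5 x 5 kernel.
rng = np.random.default_rng(0)
K = combine_kernels([rng.random((5, 64)), rng.random((5, 128))])
assert K.shape == (5, 5) and np.allclose(np.diag(K), 1.0)
```

The uniform default weighting stands in for the learned combination: in the paper's MKL setting, the weight vector would be optimized jointly with the classifier rather than fixed in advance.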
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0196-2892 |
ispartof | IEEE transactions on geoscience and remote sensing, 2018-01, Vol.56 (1), p.117-126 |
issn | 0196-2892; 1558-0644 |
language | eng |
recordid | cdi_proquest_journals_2174511998 |
source | IEEE Electronic Library (IEL) |
subjects | Artificial neural networks; Classification; Deep convolutional neural networks (DCNNs); Feature extraction; feature fusion; High resolution; Histograms; Image classification; Image resolution; Learning; Learning systems; multiple kernel learning (MKL); Multiscale analysis; multiscale deep features; Nets; Neural networks; Parameters; Resolution; satellite image classification; Satellite imagery; Satellites; spatial pyramid pooling; Spatial resolution; State of the art; Teaching methods; Training; Visualization; Warp |
title | Learning Multiscale Deep Features for High-Resolution Satellite Image Scene Classification |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-13T15%3A56%3A03IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Learning%20Multiscale%20Deep%20Features%20for%20High-Resolution%20Satellite%20Image%20Scene%20Classification&rft.jtitle=IEEE%20transactions%20on%20geoscience%20and%20remote%20sensing&rft.au=Liu,%20Qingshan&rft.date=2018-01&rft.volume=56&rft.issue=1&rft.spage=117&rft.epage=126&rft.pages=117-126&rft.issn=0196-2892&rft.eissn=1558-0644&rft.coden=IGRSD2&rft_id=info:doi/10.1109/TGRS.2017.2743243&rft_dat=%3Cproquest_RIE%3E2174511998%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2174511998&rft_id=info:pmid/&rft_ieee_id=8036413&rfr_iscdi=true |