FusionOC: Research on optimal control method for infrared and visible light image fusion

• We have created a new fusion model, which can perceive the fusion results, fusion quality and source image features.
• A BP neural network is introduced to improve the automation level of the fusion control system.
• According to the difference of the image quality index, two fusion control modes are constructed to enhance the robustness.

Bibliographic Details
Published in: Neural networks 2025-01, Vol.181, p.106811, Article 106811
Main authors: Dong, Linlu; Wang, Jun
Format: Article
Language: English
Subjects: Algorithms; BP neural network; Feedback; Humans; Image fusion; Image Processing, Computer-Assisted - methods; Infrared Rays; Light; Neural Networks, Computer; Optimal control; PID controller
Online access: Full text
description Infrared and visible light image fusion can solve the limitations of single-type visual sensors and can boost target detection performance. However, since the traditional fusion strategy lacks a controllability and feedback mechanism, the fusion model cannot precisely perceive the relationship between the requirements of the fusion task, the fused image quality, and the source image features. To this end, this paper establishes a fusion model based on the optimal controlled object and control mode, called FusionOC. This method establishes two types of mathematical models of the controlled objects by verifying the factors and conflicts affecting the quality of the fused image. It combines the image fusion model with the quality evaluation function to determine the two control factors separately. At the same time, two proportional-integral-derivative (PID) control and regulation modes based on the backpropagation (BP) neural network are designed according to the control factor characteristics. The fusion system can adaptively select the regulation mode to regulate the control factor according to the user requirements or the task, so that the fusion system perceives the connection between the fusion task and the result. Besides, the fusion model employs the feedback mechanism of the control system to perceive the feature difference between the fusion result and the source image, realize the guidance of the source image features over the entire fusion process, and improve the fusion algorithm's generalization ability and intelligence level when handling different fusion tasks. Experimental results on multiple public datasets demonstrate the advantages of FusionOC over advanced methods. Meanwhile, the benefits of our fusion results in object detection tasks have also been demonstrated.
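The abstract describes a closed-loop design in which a quality evaluation of the fused image feeds a PID controller that regulates a fusion control factor. The sketch below illustrates only that feedback idea under stated assumptions: the weighted-average fusion rule, the standard-deviation quality index, the fixed gains, the target value, and all names (fuse, quality_index, alpha, q_target) are hypothetical stand-ins and not the paper's actual model; in FusionOC the PID regulation modes are based on a BP neural network and are selected adaptively, which is omitted here.

```python
# Minimal sketch of PID-style feedback regulation of a fusion control factor.
# All quantities below are illustrative assumptions, not the FusionOC model.
import numpy as np

def fuse(ir, vis, alpha):
    # Hypothetical fusion rule: pixel-wise weighted average controlled by alpha.
    return alpha * ir + (1.0 - alpha) * vis

def quality_index(img):
    # Stand-in quality metric: contrast measured as the standard deviation.
    return float(np.std(img))

def pid_fused(ir, vis, q_target, kp=0.05, ki=0.01, kd=0.02, steps=50):
    # Closed loop: the quality error drives a PID update of the control factor.
    alpha, integral, prev_err = 0.5, 0.0, 0.0
    for _ in range(steps):
        err = q_target - quality_index(fuse(ir, vis, alpha))
        integral += err
        derivative = err - prev_err
        alpha += kp * err + ki * integral + kd * derivative
        alpha = float(np.clip(alpha, 0.0, 1.0))  # keep the blending weight valid
        prev_err = err
    return fuse(ir, vis, alpha), alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ir = rng.random((64, 64))          # synthetic stand-in for an infrared image
    vis = 0.5 * rng.random((64, 64))   # synthetic stand-in for a visible-light image
    fused, alpha = pid_fused(ir, vis, q_target=0.25)
    print(f"control factor alpha = {alpha:.3f}, quality index = {quality_index(fused):.3f}")
```

The fixed gains and the single scalar quality index are simplifications to keep the sketch short; the paper instead defines two controlled-object models and two BP-neural-network-based regulation modes chosen according to the task.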
doi 10.1016/j.neunet.2024.106811
format Article
publisher Elsevier Ltd (United States)
pmid 39486169
rights Copyright © 2024 Elsevier Ltd. All rights reserved.
orcid 0000-0002-7206-018X; 0000-0002-0791-867X
fulltext fulltext
identifier ISSN: 0893-6080
ispartof Neural networks, 2025-01, Vol.181, p.106811, Article 106811
issn 0893-6080
eissn 1879-2782
language eng
recordid cdi_proquest_miscellaneous_3123548740
source MEDLINE; Elsevier ScienceDirect Journals Complete
subjects Algorithms
BP neural network
Feedback
Humans
Image fusion
Image Processing, Computer-Assisted - methods
Infrared Rays
Light
Neural Networks, Computer
Optimal control
PID controller
title FusionOC: Research on optimal control method for infrared and visible light image fusion
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T20%3A05%3A31IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=FusionOC:%20Research%20on%20optimal%20control%20method%20for%20infrared%20and%20visible%20light%20image%20fusion&rft.jtitle=Neural%20networks&rft.au=Dong,%20Linlu&rft.date=2025-01&rft.volume=181&rft.spage=106811&rft.pages=106811-&rft.artnum=106811&rft.issn=0893-6080&rft.eissn=1879-2782&rft_id=info:doi/10.1016/j.neunet.2024.106811&rft_dat=%3Cproquest_cross%3E3123548740%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3123548740&rft_id=info:pmid/39486169&rft_els_id=S0893608024007354&rfr_iscdi=true