NEURAL NETWORK HAVING EFFICIENT CHANNEL ATTENTION (ECA) MECHANISM
The present disclosure relates to a neural network having an ECA channel attention mechanism. The neural network comprises an ECA channel attention device, which comprises: a first hierarchical quantization unit that performs hierarchical quantization on the input data, converting floating-point input data into fixed-point input data, wherein the whole input tensor shares a single quantization step size and a single quantization zero point; a channel-level quantization unit that quantizes the output of an activation layer, calculating a separate quantization step size and zero point for each channel; and a channel multiplication weighting unit that performs channel-weighted multiplication of the first hierarchical quantization output data and the channel-level quantization output data. According to the present disclosure, lossle…
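The abstract describes two quantization granularities feeding one weighting step: per-tensor ("hierarchical") quantization of the feature map, per-channel quantization of the attention weights, and an element-wise channel multiplication. The following is a minimal NumPy sketch of that data flow; the function names (`quantize_per_tensor`, `quantize_per_channel`, `channel_weighted_multiply`), the asymmetric uint8 min–max scheme, and the sigmoid attention weights are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def quantize_per_tensor(x, num_bits=8):
    # Per-tensor ("hierarchical") quantization: the whole input tensor
    # shares one step size (scale) and one zero point.
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = max((x.max() - x.min()) / (qmax - qmin), 1e-8)
    zero_point = int(round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.uint8)
    return q, scale, zero_point

def quantize_per_channel(x, axis=1, num_bits=8):
    # Channel-level quantization: a separate scale and zero point is
    # computed for each channel along `axis`.
    qmin, qmax = 0, 2 ** num_bits - 1
    reduce_dims = tuple(d for d in range(x.ndim) if d != axis)
    xmin = x.min(axis=reduce_dims, keepdims=True)
    xmax = x.max(axis=reduce_dims, keepdims=True)
    scale = np.maximum((xmax - xmin) / (qmax - qmin), 1e-8)
    zero_point = np.round(qmin - xmin / scale)
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.uint8)
    return q, scale, zero_point

def channel_weighted_multiply(q_x, sx, zx, q_w, sw, zw):
    # Channel multiplication weighting: dequantize both operands and
    # multiply; the per-channel weights broadcast over H and W.
    x = (q_x.astype(np.float32) - zx) * sx
    w = (q_w.astype(np.float32) - zw) * sw
    return x * w

# Toy usage: an (N, C, H, W) feature map and sigmoid attention weights.
x = np.random.randn(1, 16, 8, 8).astype(np.float32)
attn = 1.0 / (1.0 + np.exp(-np.random.randn(1, 16, 1, 1)))
q_x, sx, zx = quantize_per_tensor(x)
q_w, sw, zw = quantize_per_channel(attn, axis=1)
y = channel_weighted_multiply(q_x, sx, zx, q_w, sw, zw)
```

A production implementation would keep the multiplication in the integer domain by folding the two scales into a fixed-point multiplier; the dequantize-then-multiply form above is only meant to show how the per-tensor and per-channel parameters interact.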
Main authors: | ZHAO, Xiongbo; WU, Songling; LI, Xiaomin; JIN, Ruixi; ZHANG, Hui; WANG, Xiaofeng; ZHOU, Hui; YANG, Junyu; XIE, Yujia; LI, Yue; LIN, Ping; LU, Kunfeng; ZHANG, Juan; CONG, Longjian; GAI, Yifan; WEI, Xiaodan; LIN, Yuye |
---|---|
Format: | Patent |
Language: | chi ; eng ; fre |
Subjects: | CALCULATING ; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS ; COMPUTING ; COUNTING ; PHYSICS |
Source: | esp@cenet |
Record ID: | cdi_epo_espacenet_WO2024113945A1 |
Publication date: | 2024-06-06 |
Online access: | Order full text: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-23T14%3A58%3A11IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=ZHAO,%20Xiongbo&rft.date=2024-06-06&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EWO2024113945A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |