NEURAL NETWORK HAVING EFFICIENT CHANNEL ATTENTION (ECA) MECHANISM


Detailed Description

Bibliographic Details
Authors: ZHAO, Xiongbo, WU, Songling, LI, Xiaomin, JIN, Ruixi, ZHANG, Hui, WANG, Xiaofeng, ZHOU, Hui, YANG, Junyu, XIE, Yujia, LI, Yue, LIN, Ping, LU, Kunfeng, ZHANG, Juan, CONG, Longjian, GAI, Yifan, WEI, Xiaodan, LIN, Yuye
Format: Patent
Language: Chinese; English; French
Description
Abstract: The present disclosure relates to a neural network having an ECA (Efficient Channel Attention) channel attention mechanism. The neural network comprises an ECA channel attention device, which comprises: a first hierarchical quantization unit for performing hierarchical quantization on input data, converting floating-point input data into fixed-point input data, wherein the whole input tensor shares a single quantization step size and quantization zero point; a channel-level quantization unit for performing channel-level quantization on the output of an activation layer, wherein a separate quantization step size and zero point are calculated for each channel; and a channel multiplication weighting unit for performing channel-weighted multiplication of the first hierarchical quantization output data and the channel-level quantization output data. According to the present disclosure, lossless …
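The two quantization schemes the abstract distinguishes can be illustrated with a short sketch. This is not the patented implementation; it is a minimal NumPy illustration of the general ideas: per-tensor quantization (one step size and zero point shared by the whole tensor, as in the first hierarchical quantization unit), per-channel quantization (a separate step size and zero point per channel, as in the channel-level quantization unit), and ECA-style channel weighting. All function names are hypothetical.

```python
import numpy as np


def per_tensor_quantize(x, num_bits=8):
    # One step size and zero point shared by the whole tensor
    # (per-tensor, "hierarchical" scheme). Assumes x is not constant.
    qmin, qmax = 0, 2 ** num_bits - 1
    step = float(x.max() - x.min()) / (qmax - qmin)
    zero = int(np.round(qmin - x.min() / step))
    q = np.clip(np.round(x / step) + zero, qmin, qmax).astype(np.uint8)
    return q, step, zero


def per_channel_quantize(x, num_bits=8):
    # A separate step size and zero point for each channel (axis 0),
    # as in a channel-level quantization scheme.
    qs, steps, zeros = [], [], []
    for c in range(x.shape[0]):
        q, s, z = per_tensor_quantize(x[c], num_bits)
        qs.append(q)
        steps.append(s)
        zeros.append(z)
    return np.stack(qs), np.array(steps), np.array(zeros)


def dequantize(q, step, zero):
    # Map fixed-point codes back to floating point.
    return (q.astype(np.float32) - zero) * step


def channel_weight(features, weights):
    # ECA-style channel weighting: multiply each channel of a
    # (C, H, W) feature map by its attention weight, broadcasting
    # the per-channel weight over the spatial dimensions.
    return features * weights[:, None, None]
```

Per-channel quantization typically reduces reconstruction error on attention outputs, because each channel's step size adapts to that channel's dynamic range instead of being dominated by the largest channel.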