Optimizing global sparsity for neural network
Format: Patent
Language: English
Abstract: Some embodiments provide a method for training multiple parameters of a machine-trained (MT) network subject to a sparsity constraint that requires a threshold portion of the parameters to be equal to zero. A first set of the parameters subject to the sparsity constraint is grouped into groups of parameters. For each parameter of a second set of the parameters subject to the sparsity constraint, the method determines an accuracy penalty associated with setting that parameter to zero. For each group of parameters in the first set, the method determines the minimum accuracy penalty for each possible number of parameters in the group being set to zero. The method then uses the determined accuracy penalties to set at least the threshold portion of the parameters to zero.
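The abstract describes a three-step flow: score each parameter with an accuracy penalty for zeroing it, build a per-group curve of the minimum penalty for zeroing any given number of parameters in that group, and then allocate zeros across groups until the global sparsity threshold is met. The sketch below illustrates that flow under assumptions not stated in the source: the per-parameter penalty is approximated by squared weight magnitude, penalties are treated as additive within a group, and the global allocation is a greedy pass over marginal penalties. The function names (`per_parameter_penalty`, `group_penalty_curve`, `choose_zeroed_counts`) are illustrative only, not the patented method.

```python
# Minimal sketch of the described flow; the penalty metric and the
# selection procedure are placeholder assumptions, not the source's method.
import numpy as np

def per_parameter_penalty(weights):
    """Placeholder accuracy penalty for zeroing each parameter individually
    (squared magnitude stands in for a measured accuracy loss)."""
    return weights ** 2

def group_penalty_curve(group_weights):
    """Minimum accuracy penalty for zeroing k parameters of a group, k = 0..n.

    Assuming penalties add, the cheapest way to zero k parameters is to zero
    the k smallest-penalty ones, so the curve is a prefix sum of the sorted
    per-parameter penalties.
    """
    penalties = np.sort(per_parameter_penalty(group_weights))
    return np.concatenate(([0.0], np.cumsum(penalties)))

def choose_zeroed_counts(groups, sparsity):
    """Greedily decide how many parameters to zero in each group so that at
    least `sparsity` (e.g. 0.75) of all grouped parameters become zero."""
    curves = [group_penalty_curve(g) for g in groups]
    counts = [0] * len(groups)
    total = sum(len(g) for g in groups)
    target = int(np.ceil(sparsity * total))
    zeroed = 0
    while zeroed < target:
        # Pick the group whose next zeroed parameter adds the least penalty.
        best_group, best_cost = None, np.inf
        for gi, curve in enumerate(curves):
            k = counts[gi]
            if k < len(curve) - 1:
                cost = curve[k + 1] - curve[k]
                if cost < best_cost:
                    best_group, best_cost = gi, cost
        counts[best_group] += 1
        zeroed += 1
    return counts

# Example: three groups of four weights each, 75% global sparsity,
# so 9 of the 12 parameters are selected for zeroing.
rng = np.random.default_rng(0)
groups = [rng.normal(size=4) for _ in range(3)]
print(choose_zeroed_counts(groups, sparsity=0.75))
```

Because the per-group curves in this sketch are prefix sums of sorted penalties, each curve is convex, which is what makes the greedy marginal-cost allocation optimal here; a non-additive penalty model would instead require an explicit search over the per-group counts.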