ATTENTION NEURAL NETWORKS WITH N-GRAMMER LAYERS
Format: Patent
Language: English
Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes a neural network configured to perform the machine learning task, the neural network comprising an N-grammer layer and an output neural network. The N-grammer layer is configured to, at each of one or more heads: receive a sequence of input embeddings; generate a discrete latent representation of the sequence of input embeddings using a learned product quantization codebook; generate a plurality of n-gram indices from the discrete latent representation; and generate a latent n-gram representation of the sequence of input embeddings; and to then generate a sequence of output embeddings. The output neural network is configured to receive the sequence of output embeddings and process them to generate the network output.
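The abstract's per-head pipeline (embeddings → discrete latents via a quantization codebook → n-gram indices → latent n-gram embeddings → output embeddings) can be sketched as follows. This is a minimal NumPy illustration of one head only, not the patented implementation: the function and parameter names, the bigram (n = 2) hashing scheme, and the fixed, pre-trained codebook and n-gram table are all illustrative assumptions.

```python
import numpy as np

def ngrammer_head(x, codebook, ngram_table):
    """Illustrative sketch of one N-grammer head (names are hypothetical).

    x:           [seq_len, dim]          input embeddings
    codebook:    [num_clusters, dim]     quantization centroids (assumed learned)
    ngram_table: [ngram_vocab, ngram_dim] latent n-gram embedding table
    """
    # 1) Discrete latent representation: assign each position to its
    #    nearest codebook centroid (squared Euclidean distance).
    dists = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    cluster_ids = dists.argmin(axis=-1)                 # [seq_len]

    # 2) N-gram indices from consecutive cluster ids (bigrams here),
    #    hashed into a fixed n-gram vocabulary. Position 0 pairs with
    #    a padding id of 0; the hash is an assumed, simple scheme.
    prev = np.concatenate([[0], cluster_ids[:-1]])
    ngram_vocab = ngram_table.shape[0]
    bigram_ids = (cluster_ids * len(codebook) + prev) % ngram_vocab

    # 3) Latent n-gram representation: embedding-table lookup.
    ngram_emb = ngram_table[bigram_ids]                 # [seq_len, ngram_dim]

    # 4) Output embeddings: here, input embeddings concatenated with
    #    the latent n-gram embeddings along the feature axis.
    return np.concatenate([x, ngram_emb], axis=-1)

# Usage: a 5-token sequence with 8-dim embeddings, 16 centroids,
# and a 1024-entry table of 4-dim n-gram embeddings.
rng = np.random.default_rng(0)
out = ngrammer_head(
    rng.normal(size=(5, 8)),
    rng.normal(size=(16, 8)),
    rng.normal(size=(1024, 4)),
)
print(out.shape)  # (5, 12): 8 input dims + 4 n-gram dims per position
```

A real layer would run several such heads in parallel and learn the codebook and n-gram table jointly with the rest of the network; the sketch only shows the data flow the abstract describes.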