An RNN Model for Generating Sentences with a Desired Word at a Desired Position
Published in: Tehnički vjesnik 2020-02, Vol. 27 (1), p. 81-88
Main Authors: , , , ,
Format: Article
Language: English
Subjects:
Online Access: Full text
Summary: Generating sentences with a desired word is useful in many natural language processing tasks. State-of-the-art recurrent neural network (RNN)-based models mainly generate sentences in a left-to-right manner, which does not allow explicit and direct constraints on the words at arbitrary positions in a sentence. To address this issue, we propose a generative model of sentences named Coupled-RNN. We employ two RNNs to generate sentences backward and forward, respectively, starting from a desired word, and inject position embeddings into the model to solve the problem of position information loss. We explore two coupling mechanisms to optimize the reconstruction loss globally. Experimental results demonstrate that Coupled-RNN can generate high-quality sentences that contain a desired word at a desired position. (See the illustrative sketch after this record.)
ISSN: 1330-3651, 1848-6339
DOI: 10.17559/TV-20190929153200
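
The abstract describes generation with two RNNs that expand a sentence backward and forward from a seed word, with position embeddings injected so the model knows where each token sits. The following is a minimal sketch of that idea only, not the authors' implementation: the class and parameter names (CoupledRNN, hidden_dim, GRU cells, greedy decoding, the absence of an end-of-sentence check) are illustrative assumptions, and the paper's two coupling mechanisms for globally optimizing the reconstruction loss are not shown.

```python
# Minimal sketch (assumed PyTorch) of bidirectional generation from a seed word.
# Not the paper's implementation; names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class CoupledRNN(nn.Module):
    def __init__(self, vocab_size, max_pos=50, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(max_pos, emb_dim)        # injects position information
        self.backward_rnn = nn.GRUCell(emb_dim, hidden_dim)  # emits words to the left of the seed
        self.forward_rnn = nn.GRUCell(emb_dim, hidden_dim)   # emits words to the right of the seed
        self.out = nn.Linear(hidden_dim, vocab_size)

    def _step(self, cell, token, position, hidden):
        # Token embedding plus position embedding, then one recurrent step.
        x = self.tok_emb(token) + self.pos_emb(position)
        hidden = cell(x, hidden)
        return self.out(hidden), hidden

    @torch.no_grad()
    def generate(self, seed_token, seed_pos, total_len=20):
        """Greedy decode: left context from seed_pos-1 down to 0, then right context."""
        device = next(self.parameters()).device
        hidden_dim = self.out.in_features
        left, right = [], []

        # Backward pass: start from the desired word and generate preceding words.
        h = torch.zeros(1, hidden_dim, device=device)
        tok = torch.tensor([seed_token], device=device)
        for pos in range(seed_pos - 1, -1, -1):
            logits, h = self._step(self.backward_rnn, tok,
                                   torch.tensor([pos], device=device), h)
            tok = logits.argmax(dim=-1)
            left.append(tok.item())

        # Forward pass: start again from the desired word and generate following words.
        h = torch.zeros(1, hidden_dim, device=device)
        tok = torch.tensor([seed_token], device=device)
        for pos in range(seed_pos + 1, total_len):
            logits, h = self._step(self.forward_rnn, tok,
                                   torch.tensor([pos], device=device), h)
            tok = logits.argmax(dim=-1)
            right.append(tok.item())

        # Reassemble: reversed left context + seed word + right context,
        # so the seed token ends up exactly at index seed_pos.
        return list(reversed(left)) + [seed_token] + right

# Hypothetical usage: place token id 42 at position 3 of a 10-token sentence.
model = CoupledRNN(vocab_size=10000)
sentence_ids = model.generate(seed_token=42, seed_pos=3, total_len=10)
```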