Technical Gazette, Vol. 27 No. 1, 2020.
Original scientific paper
https://doi.org/10.17559/TV-20190929153200
An RNN Model for Generating Sentences with a Desired Word at a Desired Position
Tianbao Song
orcid.org/0000-0002-0369-9668
Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China

Jingbo Sun
orcid.org/0000-0002-0369-9668
Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China

Yinbing Zhang
orcid.org/0000-0001-8442-1790
Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China

Weiming Peng*
Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China

Jihua Song
Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China
Abstract
Generating sentences that contain a desired word is useful in many natural language processing tasks. State-of-the-art recurrent neural network (RNN)-based models mainly generate sentences in a left-to-right manner, which does not allow explicit and direct constraints on the words at arbitrary positions in a sentence. To address this issue, we propose a generative model of sentences named Coupled-RNN. We employ two RNNs to generate a sentence backward and forward, respectively, starting from a desired word, and inject position embeddings into the model to compensate for the loss of position information. We explore two coupling mechanisms to optimize the reconstruction loss globally. Experimental results demonstrate that Coupled-RNN can generate high-quality sentences that contain a desired word at a desired position.
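To make the generation scheme described above concrete, the following is a minimal, untrained sketch in PyTorch. It is an illustration under stated assumptions, not the paper's implementation: the GRU cells, the greedy decoding loop, the additive word-plus-position embeddings, the fixed maximum length, and all layer sizes are hypothetical choices, and the paper's two coupling mechanisms and its training objective are not modelled here.

# A minimal sketch of the Coupled-RNN idea from the abstract, assuming PyTorch.
# All hyperparameters and the decoding strategy are illustrative assumptions.
import torch
import torch.nn as nn


class CoupledRNNSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, max_len=30):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(max_len, emb_dim)       # injects position information
        self.backward_rnn = nn.GRUCell(emb_dim, hid_dim)    # generates words before the desired word
        self.forward_rnn = nn.GRUCell(emb_dim, hid_dim)     # generates words after the desired word
        self.out = nn.Linear(hid_dim, vocab_size)
        self.max_len = max_len

    def _decode(self, rnn, start_token, start_pos, step, n_steps):
        # Greedily decode n_steps tokens in one direction (step = -1 or +1).
        # A trained model would also stop at an end-of-sentence token.
        h = torch.zeros(1, rnn.hidden_size)
        token, pos, generated = start_token, start_pos, []
        for _ in range(n_steps):
            x = self.word_emb(token) + self.pos_emb(pos)    # word embedding + position embedding
            h = rnn(x, h)
            token = self.out(h).argmax(dim=-1)              # greedy choice of the next word
            pos = (pos + step).clamp(0, self.max_len - 1)
            generated.append(token.item())
        return generated

    def generate(self, desired_word, desired_pos):
        # Generate a sentence containing desired_word at index desired_pos.
        w = torch.tensor([desired_word])
        p = torch.tensor([desired_pos])
        left = self._decode(self.backward_rnn, w, p, step=-1, n_steps=desired_pos)
        right = self._decode(self.forward_rnn, w, p, step=+1,
                             n_steps=self.max_len - desired_pos - 1)
        return list(reversed(left)) + [desired_word] + right


# Usage: place word id 42 at position 5 of a generated (here untrained, hence random) sentence.
model = CoupledRNNSketch(vocab_size=1000)
with torch.no_grad():
    print(model.generate(desired_word=42, desired_pos=5))

In an actual model the two decoders would be trained jointly so that the reconstruction loss is optimized globally, which is where the coupling mechanisms mentioned in the abstract come in.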
Keywords
desired word; lexically constrained; RNN; sentence generation
Hrčak ID: 234163
Publication date: 15.2.2020.