
Original scientific paper

https://doi.org/10.17559/TV-20190929153200

An RNN Model for Generating Sentences with a Desired Word at a Desired Position

Tianbao Song (ORCID: orcid.org/0000-0002-0369-9668); Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China
Jingbo Sun (ORCID: orcid.org/0000-0002-0369-9668); Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China
Yinbing Zhang (ORCID: orcid.org/0000-0001-8442-1790); Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China
Weiming Peng*; Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China
Jihua Song; Beijing Normal University, No. 19, Xinjiekouwai St, Haidian District, Beijing, 100875, P. R. China


Full text: English PDF, 1.065 Kb

Pages: 81-88


Abstract

Generating sentences with a desired word is useful in many natural language processing tasks. State-of-the-art recurrent neural network (RNN)-based models mainly generate sentences in a left-to-right manner, which does not allow explicit and direct constraints on the words at arbitrary positions in a sentence. To address this issue, we propose a generative model of sentences named Coupled-RNN. We employ two RNNs to generate sentences backward and forward, respectively, starting from a desired word, and inject position embeddings into the model to solve the problem of position information loss. We explore two coupling mechanisms to optimize the reconstruction loss globally. Experimental results demonstrate that Coupled-RNN can generate high-quality sentences that contain a desired word at a desired position.
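
The abstract describes the generation scheme only at a high level. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of the general idea: two separate RNNs decode to the left and to the right of a desired word, and a position embedding is added to the input at every step. The class name, the use of GRU cells, the greedy decoding loop, and all hyperparameters are illustrative assumptions, and the paper's coupling mechanisms for optimizing the reconstruction loss globally are not reproduced here.

import torch
import torch.nn as nn


class CoupledRNNSketch(nn.Module):
    """Illustrative two-RNN generator growing a sentence around a pivot word."""

    def __init__(self, vocab_size, max_len, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(max_len, emb_dim)        # injects position information
        self.backward_rnn = nn.GRUCell(emb_dim, hidden_dim)  # generates words to the left
        self.forward_rnn = nn.GRUCell(emb_dim, hidden_dim)   # generates words to the right
        self.out = nn.Linear(hidden_dim, vocab_size)

    def _step(self, rnn, token, pos, h):
        # Word embedding plus position embedding, one recurrent step, then vocabulary logits.
        x = self.word_emb(token) + self.pos_emb(pos)
        h = rnn(x, h)
        return self.out(h), h

    @torch.no_grad()
    def generate(self, word_id, position, max_len):
        """Greedily grow a sentence of `max_len` tokens around `word_id` fixed at `position`."""
        device = self.out.weight.device
        pivot = torch.tensor([word_id], device=device)
        left, right = [], []

        # Backward pass: fill positions position-1, ..., 0, starting from the pivot word.
        h = torch.zeros(1, self.backward_rnn.hidden_size, device=device)
        cur = pivot
        for p in range(position, 0, -1):
            logits, h = self._step(self.backward_rnn, cur, torch.tensor([p], device=device), h)
            cur = logits.argmax(dim=-1)
            left.append(cur.item())

        # Forward pass: fill positions position+1, ..., max_len-1, also starting from the pivot.
        h = torch.zeros(1, self.forward_rnn.hidden_size, device=device)
        cur = pivot
        for p in range(position, max_len - 1):
            logits, h = self._step(self.forward_rnn, cur, torch.tensor([p], device=device), h)
            cur = logits.argmax(dim=-1)
            right.append(cur.item())

        # Left tokens were produced nearest-first, so reverse them before joining.
        return list(reversed(left)) + [word_id] + right


# Usage: ask an (untrained) model for a 10-token sentence with token id 42 fixed at index 3.
model = CoupledRNNSketch(vocab_size=1000, max_len=10)
print(model.generate(word_id=42, position=3, max_len=10))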

Keywords

desired word; lexically constrained; RNN; sentence generation

Hrčak ID:

234163

URI:

https://hrcak.srce.hr/234163

Publication date:

15.2.2020.
