
Original scientific paper

https://doi.org/10.1080/00051144.2020.1809221

GSA-Net: gated scaled dot-product attention based neural network for reading comprehension

Xiang Ma ; Institute of Tourism, Changchun Vocational Institute of Technology, Changchun, People’s Republic of China
Junsheng Zhang ; Research Center for Information Science Theory and Methodology, Institute of Scientific and Technical Information of China, Beijing, People’s Republic of China


Full text: English, PDF, 1,791 KB

pp. 643-650




Abstract

Reading Comprehension (RC) is concerned with building systems that automatically answer questions about a given context passage. The interactions between the context and the question are very important for locating the correct answer. In this paper, we propose a Gated Scaled Dot-Product Attention based model for the RC task. A character-level embedding is incorporated into the word embedding, which helps deal with Out-of-Vocabulary (OOV) tokens. The attention distribution is obtained by a scaled dot product, which captures the interaction between the question and the passage effectively. Further, a self-matching attention mechanism is adopted to resolve the problem of long-distance dependencies. These components provide more information for predicting the start and end positions of the answer. We evaluate our method on the Stanford Question Answering Dataset (SQuAD), and the results show that each component of the model boosts performance.
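As a rough illustration of the mechanism the abstract describes, the sketch below implements scaled dot-product attention between passage and question representations, followed by a sigmoid gate over the fused output, in the spirit of gated attention. The shapes, the gate placement, and all variable names are assumptions made for illustration, not the paper's exact formulation.

# A minimal NumPy sketch of gated scaled dot-product attention.
# All dimensions and names are illustrative assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_scaled_dot_product_attention(passage, question, w_gate):
    """passage: (m, d), question: (n, d), w_gate: (2d, 2d)."""
    d = passage.shape[-1]
    # Scaled dot product: similarity of each passage token to each question token.
    scores = passage @ question.T / np.sqrt(d)           # (m, n)
    attn = softmax(scores, axis=-1)                      # attention distribution
    context = attn @ question                            # question-aware passage, (m, d)
    fused = np.concatenate([passage, context], axis=-1)  # (m, 2d)
    # Sigmoid gate controls how much of the fused representation passes through.
    gate = 1.0 / (1.0 + np.exp(-(fused @ w_gate)))
    return gate * fused

rng = np.random.default_rng(0)
m, n, d = 5, 3, 8
out = gated_scaled_dot_product_attention(
    rng.standard_normal((m, d)),
    rng.standard_normal((n, d)),
    rng.standard_normal((2 * d, 2 * d)) * 0.1)
print(out.shape)  # (5, 16)

Self-matching attention, which the abstract uses for long-distance dependencies, can be sketched with the same function by attending the passage to itself, i.e. passing the passage as both arguments.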

Keywords

Natural language processing; attention; machine reading comprehension

Hrčak ID:

258402

URI

https://hrcak.srce.hr/258402

Publication date:

23 September 2020
