
Original scientific article

https://doi.org/10.32985/ijeces.14.10.4

Improving Scientific Literature Classification: A Parameter-Efficient Transformer-Based Approach

Mohammad Munzir Ahanger (ORCID: orcid.org/0000-0003-0985-905X); University of Kashmir, Faculty of Applied Sciences, Department of Computer Sciences, Srinagar, India *
M. Arif Wani; University of Kashmir, Faculty of Applied Sciences, Department of Computer Sciences, Srinagar, India

* Corresponding author.


Full text: English (PDF, 1,159 KB)

pp. 1115-1123



Abstract

Transformer-based models have been utilized in natural language processing (NLP) for a wide variety of tasks such as summarization, translation, and conversational agents. These models can capture long-term dependencies within the input, giving them significantly greater representational capacity than Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). Nevertheless, they require significant computational resources, with high memory usage and long training times. In this paper, we propose a novel, parameter-efficient document categorization model that encodes text using a single, lightweight, multi-headed attention encoder block. The model also uses a hybrid word and position embedding to represent input tokens. The proposed model is evaluated on the Scientific Literature Classification (SLC) task and is compared with state-of-the-art models that have previously been applied to it. Ten datasets of varying sizes and class distributions are employed in the experiments. The proposed model shows significant performance improvements and a high level of parameter and computational efficiency compared with other transformer-based models, and it outperforms previously used methods.
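To make the described architecture concrete, the sketch below shows a classifier built from a single multi-headed attention encoder block on top of combined word and position embeddings, as the abstract outlines. It is a minimal illustration in PyTorch: the vocabulary size, embedding dimension, number of heads, summation of the two embeddings, and mean pooling are assumptions for demonstration, not the authors' exact configuration.

```python
# Minimal sketch of a single-encoder-block document classifier (PyTorch).
# Hyperparameters and the way word and position embeddings are combined
# (element-wise sum) are illustrative assumptions, not the paper's exact design.
import torch
import torch.nn as nn

class SingleBlockClassifier(nn.Module):
    def __init__(self, vocab_size=30000, max_len=512, d_model=128,
                 n_heads=4, d_ff=256, n_classes=10, dropout=0.1):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos_emb = nn.Embedding(max_len, d_model)        # learned position embeddings
        self.encoder = nn.TransformerEncoderLayer(           # one lightweight encoder block
            d_model=d_model, nhead=n_heads, dim_feedforward=d_ff,
            dropout=dropout, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, token_ids):                             # token_ids: (batch, seq_len)
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.word_emb(token_ids) + self.pos_emb(positions)  # hybrid word + position embedding
        x = self.encoder(x)                                     # single multi-headed attention block
        x = x.mean(dim=1)                                       # pool over the sequence
        return self.classifier(x)                               # class logits

# Usage with random token ids: output shape is (batch, n_classes).
model = SingleBlockClassifier()
logits = model(torch.randint(0, 30000, (2, 128)))
```

Because only one encoder block and one small embedding table are trained, the parameter count stays far below that of full-depth transformer models, which is the efficiency argument the abstract makes.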

Keywords

text classification; document categorization; scientific literature classification; deep learning

Hrčak ID:

311153

URI

https://hrcak.srce.hr/311153

Publication date:

12.12.2023.
