
Original scientific paper

https://doi.org/10.32985/ijeces.14.10.4

Improving Scientific Literature Classification: A Parameter-Efficient Transformer-Based Approach

Mohammad Munzir Ahanger (ORCID: orcid.org/0000-0003-0985-905X); University of Kashmir, Faculty of Applied Sciences, Department of Computer Sciences, Srinagar, India *
M. Arif Wani; University of Kashmir, Faculty of Applied Sciences, Department of Computer Sciences, Srinagar, India

* Corresponding author.


Full text: English PDF, 1.159 Kb

Pages: 1115-1123

downloads: 147



Abstract

Transformer-based models are used in natural language processing (NLP) for a wide variety of tasks, including summarization, translation, and conversational agents. Because they capture long-range dependencies within the input, they offer significantly greater representational capacity than Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). Nevertheless, these models demand substantial computational resources, with high memory usage and long training times. In this paper, we propose a novel document categorization model with improved parameter efficiency that encodes text using a single, lightweight, multi-headed attention encoder block. The model also uses a hybrid word and position embedding to represent input tokens. The proposed model is evaluated on the Scientific Literature Classification (SLC) task and compared with state-of-the-art models previously applied to it. Ten datasets of varying sizes and class distributions are employed in the experiments. The proposed model achieves significant performance improvements while requiring far fewer parameters and less computation than other transformer-based models, and it outperforms previously used methods.
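The core idea in the abstract — a hybrid word-plus-position embedding fed through a single multi-headed attention encoder block — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dimensions, random weight initialization, and mean-pooling readout are assumptions chosen only to make the sketch self-contained.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, n_heads, rng):
    """One self-attention layer; x has shape (seq_len, d_model)."""
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    heads = []
    for _ in range(n_heads):
        # Per-head query/key/value projections (illustrative random weights)
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) * 0.02
                      for _ in range(3))
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        attn = softmax(q @ k.T / np.sqrt(d_head))  # scaled dot-product
        heads.append(attn @ v)
    Wo = rng.standard_normal((d_model, d_model)) * 0.02  # output projection
    return np.concatenate(heads, axis=-1) @ Wo

def encode(token_ids, word_emb, pos_emb, n_heads, rng):
    # Hybrid embedding: learned word embedding + learned position embedding
    x = word_emb[token_ids] + pos_emb[: len(token_ids)]
    h = multi_head_attention(x, n_heads, rng)
    # Mean-pool the encoder output into one document vector for the classifier
    return h.mean(axis=0)

rng = np.random.default_rng(0)
vocab, max_len, d_model = 100, 16, 32
word_emb = rng.standard_normal((vocab, d_model)) * 0.02
pos_emb = rng.standard_normal((max_len, d_model)) * 0.02
doc_vec = encode(np.array([5, 17, 42, 9]), word_emb, pos_emb, n_heads=4, rng=rng)
print(doc_vec.shape)  # (d_model,) document representation
```

Because only one encoder block is used, the parameter count is dominated by the embedding tables and a single set of attention projections, which is where the parameter efficiency claimed in the abstract comes from.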

Keywords

text classification; document categorization; scientific literature classification; deep learning

Hrčak ID: 311153

URI: https://hrcak.srce.hr/311153

Publication date: 12.12.2023.

Visits: 476