Original scientific paper
https://doi.org/10.1080/00051144.2021.1922150
Advancing natural language processing (NLP) applications of morphologically rich languages with bidirectional encoder representations from transformers (BERT): an empirical case study for Turkish
Akın Özçift
Hasan Ferdi Turgutlu Technology Faculty, Software Engineering Department, Manisa Celal Bayar University, Manisa, Turkey
Kamil Akarsu
Hasan Ferdi Turgutlu Technology Faculty, Software Engineering Department, Manisa Celal Bayar University, Manisa, Turkey
Fatma Yumuk
Hasan Ferdi Turgutlu Technology Faculty, Software Engineering Department, Manisa Celal Bayar University, Manisa, Turkey
Cevhernur Söylemez
Hasan Ferdi Turgutlu Technology Faculty, Software Engineering Department, Manisa Celal Bayar University, Manisa, Turkey
Abstract
Language model pre-training architectures have proved useful for learning language representations. Bidirectional encoder representations from transformers (BERT), a recent deep bidirectional self-attention representation learned from unlabelled text, has achieved remarkable results in many natural language processing (NLP) tasks after fine-tuning. In this paper, we demonstrate the efficiency of BERT for a morphologically rich language, Turkish. Morphologically complex languages traditionally require dense language pre-processing so that the data become suitable for machine learning (ML) algorithms; in particular, tokenization, lemmatization or stemming, and feature engineering are needed to obtain an efficient data model and to overcome data-sparsity and high-dimensionality problems. In this context, we selected five Turkish NLP research problems from the literature: sentiment analysis, cyberbullying identification, text classification, emotion recognition and spam detection. We then compared the empirical performance of fine-tuned BERT with that of baseline ML algorithms. Finally, we obtained improved results over the baseline ML algorithms on the selected NLP problems while eliminating heavy pre-processing tasks.
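The pre-processing burden the abstract contrasts with BERT can be illustrated with a minimal sketch. The example below is an assumption, not code from the paper: it uses toy Turkish sentences and a hypothetical crude prefix-based stemmer as a stand-in for a real Turkish morphological analyser, to show how inflected forms of one stem inflate a bag-of-words vocabulary (the data-sparsity problem) until a stemming step collapses them.

```python
# Toy Turkish examples (hypothetical, for illustration only): the stem
# "ev" (house) appears with several suffixes, so a plain bag-of-words
# vocabulary grows with every inflected form -- the sparsity problem
# classical ML pipelines mitigate with stemming or lemmatization.
docs = [
    "evde kitap okudum",    # "I read a book at home"
    "evlerden geliyorum",   # "I am coming from the houses"
    "evimiz guzel",         # "our house is nice"
]

def tokenize(text):
    """Whitespace tokenization -- the first classical pre-processing step."""
    return text.lower().split()

def crude_stem(token, max_stem=2):
    """A hypothetical stand-in for a real Turkish stemmer: keep a short
    prefix so inflected forms of one stem collapse to a single feature."""
    return token[:max_stem]

raw_vocab = {t for d in docs for t in tokenize(d)}
stemmed_vocab = {crude_stem(t) for d in docs for t in tokenize(d)}

# Stemming shrinks the feature space a classical ML model must handle;
# BERT's subword tokenizer makes this explicit step unnecessary.
print(len(raw_vocab), len(stemmed_vocab))  # → 7 5
```

A fine-tuned BERT model instead feeds raw text through its own subword tokenizer, which is why the paper can drop the stemming and feature-engineering stages entirely.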
Keywords
Bidirectional encoder representations from transformers; language pre-processing; morphologically rich language; natural language processing; Turkish
Hrčak ID:
269829
Publication date:
4.6.2021.