Pre-Trained Transformer-Based Approach for Arabic Question Answering: A Comparative Study

Authors

  • Amani Jamal, Department of Computer Science, King Abdulaziz University, Jeddah, Saudi Arabia
  • Kholoud Alsubhi, Department of Computer Science, King Abdulaziz University, Jeddah, Saudi Arabia

DOI:

https://doi.org/10.35877/454RI.asci4209

Keywords:

Arabic Question Answering, Arabic Pre-trained Language Models, Reading Comprehension

Abstract

Question answering (QA) is one of the most challenging yet widely investigated problems in Natural Language Processing (NLP). QA systems aim to produce answers to given questions, where the answers may be drawn from unstructured or structured text. QA is therefore an important research area for evaluating text-understanding systems. A large volume of QA research has been devoted to English, investigating the most advanced techniques and achieving state-of-the-art results. Research on Arabic question answering, however, progresses at a considerably slower pace due to the scarcity of research efforts and the lack of large benchmark datasets. Recently, many pre-trained language models have delivered high performance on Arabic NLP problems. In this work, we evaluate state-of-the-art pre-trained transformer models for Arabic QA using four reading-comprehension datasets: Arabic-SQuAD, ARCD, AQAD, and TyDiQA-GoldP. We fine-tune and compare the performance of the AraBERTv2-base, AraBERTv0.2-large, and AraELECTRA models. Finally, we provide an analysis to understand and interpret the low performance obtained by some of the models.
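As a rough illustration of the fine-tuning setup the abstract describes, the sketch below loads a pre-trained Arabic checkpoint with a span-extraction QA head and runs one training step with Hugging Face Transformers. The hub ID and the answer-span indices are assumptions for illustration only; the paper does not publish its exact training configuration.

    # Minimal sketch (not the authors' exact setup): fine-tuning a pre-trained
    # Arabic transformer for extractive QA with Hugging Face Transformers.
    import torch
    from transformers import AutoTokenizer, AutoModelForQuestionAnswering

    model_name = "aubmindlab/bert-base-arabertv2"  # assumed hub ID for AraBERTv2-base
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # The span-extraction QA head is randomly initialized until fine-tuned.
    model = AutoModelForQuestionAnswering.from_pretrained(model_name)

    # One toy (question, context) pair; real training iterates over a dataset
    # such as ARCD or TyDiQA-GoldP with gold answer spans.
    question = "أين تقع جامعة الملك عبد العزيز؟"
    context = "تقع جامعة الملك عبد العزيز في مدينة جدة بالمملكة العربية السعودية."
    inputs = tokenizer(question, context, return_tensors="pt")

    # Supervision is the token index of the answer's start and end within the
    # context; the indices below are placeholders purely for illustration.
    start_positions = torch.tensor([10])
    end_positions = torch.tensor([12])

    outputs = model(**inputs, start_positions=start_positions, end_positions=end_positions)
    outputs.loss.backward()  # cross-entropy over start/end logits drives fine-tuning

At evaluation time, the fine-tuned model's highest-scoring start/end logits are decoded into an answer span and scored against the gold answers, typically with exact-match and F1 metrics.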


Published

2025-04-30

How to Cite

Jamal, A., & Alsubhi, K. (2025). Pre-Trained Transformer-Based Approach for Arabic Question Answering: A Comparative Study. Journal of Applied Science, Engineering, Technology, and Education, 7(1), 157–170. https://doi.org/10.35877/454RI.asci4209

Issue

Vol. 7 No. 1 (2025)

Section

Articles