Attention Free BIGBIRD Transformer for Long Document Text Summarization

Authors

  • Gitanjali Mishra
  • Nilambar Sethi
  • Agilandeeswari Loganathan
  • Yu-Hsiu Lin
  • Yu-Chen Hu

Abstract

The need for automatic text summarization systems has grown dramatically with the web's tremendous expansion of textual data and the difficulty of locating desired information within this vast volume. Transformer-based automatic text summarization using pre-trained language models is attractive for its state-of-the-art performance and accuracy. Unfortunately, owing to the full attention mechanism, one of these models' key weaknesses is memory that grows quadratically with sequence length; their suitability for summarizing long documents is a further issue. Both issues can be addressed with a BIGBIRD transformer, whose computational complexity is linear, O(n). To improve accuracy by reducing redundancy and increasing similarity among sentences, this paper introduces a novel attention-free hierarchical BIGBIRD Transformer, since the sparse attention of the original BIGBIRD does not scale well. Additionally, to handle long documents efficiently, it is crucial to build an effective model using DistilBART-CNN-12-6 and a multi-objective meta-heuristic algorithm that can learn and represent various compositions efficiently through sentence selection. The sentences selected by both the meta-heuristic and DistilBART-CNN are given as input to the Attention Free BIGBIRD Transformer to strengthen its summarization ability. The proposed Attention Free BIGBIRD Transformer using DistilBART-CNN and a Meta-Heuristic Algorithm for Long Document Text Summarization (AFBB-LDTS) thus achieves better ROUGE and BLEU scores than related state-of-the-art systems while being less complex.
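The linear-complexity claim can be illustrated with a minimal sketch. The snippet below shows an attention-free aggregation in the style of AFT-simple, where each position gates a single shared context vector instead of attending pairwise to all other positions, giving O(n·d) cost rather than the O(n²·d) of full dot-product attention. This is an illustrative assumption for exposition only; the paper's actual Attention Free BIGBIRD layer may differ in its details.

```python
import numpy as np

def aft_simple(Q, K, V):
    """Attention-free aggregation (AFT-simple style sketch).

    Cost is O(n*d): no n-by-n attention matrix is ever formed.
    NOTE: illustrative only -- not the paper's exact layer.
    """
    # Softmax over the sequence axis turns the keys into per-position weights.
    w = np.exp(K - K.max(axis=0, keepdims=True))
    w /= w.sum(axis=0, keepdims=True)
    # Collapse the sequence into one shared context vector per feature dim.
    context = (w * V).sum(axis=0, keepdims=True)      # shape (1, d)
    # Each position gates the shared context with a sigmoid of its query.
    return 1.0 / (1.0 + np.exp(-Q)) * context         # shape (n, d)

# Toy usage on a "long" sequence: memory stays linear in n.
n, d = 1024, 64
rng = np.random.default_rng(0)
out = aft_simple(rng.standard_normal((n, d)),
                 rng.standard_normal((n, d)),
                 rng.standard_normal((n, d)))
print(out.shape)  # (1024, 64)
```

Because the context vector is shared across positions, the per-token work is constant in the sequence length, which is the property that makes attention-free designs appealing for long-document inputs.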

Downloads

Download data is not yet available.

Published

2024-05-24

How to Cite

Gitanjali Mishra, Nilambar Sethi, Agilandeeswari Loganathan, Yu-Hsiu Lin, & Yu-Chen Hu. (2024). Attention Free BIGBIRD Transformer for Long Document Text Summarization. International Journal of Computer Information Systems and Industrial Management Applications, 16(2), 20. Retrieved from https://cspub-ijcisim.org/index.php/ijcisim/article/view/633

Issue

Section

Original Articles