Please use this identifier to cite or link to this item: https://dspace.ctu.edu.vn/jspui/handle/123456789/110404
Full metadata record
dc.contributor.advisor: Lâm, Nhựt Khang
dc.contributor.author: Nguyễn, Phúc Trường Giang
dc.date.accessioned: 2025-01-10T09:51:55Z
dc.date.available: 2025-01-10T09:51:55Z
dc.date.issued: 2024
dc.identifier.other: B2014911
dc.identifier.uri: https://dspace.ctu.edu.vn/jspui/handle/123456789/110404
dc.description: 45 pages [vi_VN]
dc.description.abstract: Text summarization remains a vital challenge in natural language processing and artificial intelligence. With the advancement of deep learning, large language models such as BART (Bidirectional and Auto-Regressive Transformers) have demonstrated substantial success in abstractive summarization by combining bidirectional encoding with autoregressive decoding. This research explores the integration of Mixture of Experts (MoE) into the BART architecture to enhance its flexibility and efficiency in summarizing text. By replacing the feed-forward network in BART with MoE, the model dynamically routes information through specialized “experts”, optimizing both computational resources and the quality of the generated summaries. The study involves fine-tuning this modified BART model on the CNN/Daily Mail dataset and evaluating its performance using the ROUGE metric. The results show that the baseline BART model achieved F1-scores of 36.89 (ROUGE-1), 16.24 (ROUGE-2), and 27.60 (ROUGE-L), while the MoE-enhanced BART model achieved 37.24 (ROUGE-1), 16.60 (ROUGE-2), and 28.51 (ROUGE-L). These results highlight improvements in certain aspects of accuracy and robustness for the MoE-enhanced BART configuration. The findings not only underscore the potential of MoE-enhanced BART in addressing the inherent complexities of text summarization but also contribute to the ongoing development of more adaptive and powerful large language models for NLP applications. [vi_VN]
dc.language.iso: en [vi_VN]
dc.publisher: Trường Đại Học Cần Thơ (Can Tho University) [vi_VN]
dc.subject: CÔNG NGHỆ THÔNG TIN - CHẤT LƯỢNG CAO (Information Technology - High-Quality Program) [vi_VN]
dc.title: ABSTRACTIVE TEXT SUMMARIZATION USING BART WITH MIXTURE OF EXPERTS [vi_VN]
dc.title.alternative: TÓM TẮT TÓM LƯỢC VĂN BẢN SỬ DỤNG BART VÀ MIXTURE OF EXPERTS (Abstractive Text Summarization Using BART and Mixture of Experts) [vi_VN]
dc.type: Thesis [vi_VN]
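
The abstract describes replacing the position-wise feed-forward network (FFN) inside BART's Transformer blocks with a Mixture-of-Experts layer, in which a router sends each token through one of several expert FFNs. The following is a minimal, illustrative PyTorch sketch of that general idea, not the thesis's actual implementation: the class name MoEFeedForward, the top-1 routing scheme, and all dimensions and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch (illustrative assumption, not the thesis code): a top-1 routed
# Mixture-of-Experts feed-forward layer of the kind that could stand in for the
# position-wise FFN inside a BART encoder/decoder block.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEFeedForward(nn.Module):
    """Several expert FFNs plus a learned router that picks one expert per token."""

    def __init__(self, d_model: int = 768, d_ff: int = 3072, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.GELU(),
                nn.Linear(d_ff, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        gate = F.softmax(self.router(x), dim=-1)   # routing probabilities per token
        top_prob, top_idx = gate.max(dim=-1)       # top-1 expert index and its weight
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i                    # tokens routed to expert i
            if mask.any():
                out[mask] = expert(x[mask]) * top_prob[mask].unsqueeze(-1)
        return out


if __name__ == "__main__":
    # Dimensions chosen to resemble bart-base (d_model = 768); purely a smoke test.
    tokens = torch.randn(2, 16, 768)
    moe_ffn = MoEFeedForward()
    print(moe_ffn(tokens).shape)  # torch.Size([2, 16, 768])
```

Per the abstract, only the feed-forward sub-layer changes, so attention and embedding weights can be retained from the pretrained BART checkpoint; the baseline and MoE variants are then fine-tuned on CNN/Daily Mail and compared with ROUGE F1.
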
Appears in Collections: Trường Công nghệ Thông tin & Truyền thông (College of Information & Communication Technology)

Files in This Item:
File: _file_ (Restricted Access)
Size: 1.5 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.