Please use this identifier to cite or link to this item:
https://dspace.ctu.edu.vn/jspui/handle/123456789/110404
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Lâm, Nhựt Khang | - |
dc.contributor.author | Nguyễn, Phúc Trường Giang | - |
dc.date.accessioned | 2025-01-10T09:51:55Z | - |
dc.date.available | 2025-01-10T09:51:55Z | - |
dc.date.issued | 2024 | - |
dc.identifier.other | B2014911 | - |
dc.identifier.uri | https://dspace.ctu.edu.vn/jspui/handle/123456789/110404 | - |
dc.description | 45 pages | vi_VN |
dc.description.abstract | Text summarization remains a vital challenge in natural language processing and artificial intelligence. With the advancement of deep learning, large language models such as BART (Bidirectional and Auto-Regressive Transformers) have demonstrated substantial success in abstractive summarization by combining bidirectional encoding with autoregressive decoding. This research explores the integration of Mixture of Experts (MoE) into the BART architecture to enhance its flexibility and efficiency in summarizing text. By replacing the feed-forward network in BART with an MoE layer, the model dynamically routes information through specialized “experts”, optimizing both computational resources and the quality of the generated summaries. The study fine-tunes this modified BART model on the CNN/Daily Mail dataset and evaluates its performance using the ROUGE metric (illustrative sketches of the MoE layer and of ROUGE scoring follow the metadata table below). The results show that the baseline BART model achieved F1-scores of 36.89 (ROUGE-1), 16.24 (ROUGE-2), and 27.60 (ROUGE-L), while the MoE-enhanced BART model achieved 37.24 (ROUGE-1), 16.60 (ROUGE-2), and 28.51 (ROUGE-L). These results highlight improvements in certain aspects of accuracy and robustness for the MoE-enhanced configuration. The findings not only underscore the potential of MoE-enhanced BART in addressing the inherent complexities of text summarization but also contribute to the ongoing development of more adaptive and powerful large language models for NLP applications. | vi_VN |
dc.language.iso | en | vi_VN |
dc.publisher | Trường Đại Học Cần Thơ | vi_VN |
dc.subject | CÔNG NGHỆ THÔNG TIN - CHẤT LƯỢNG CAO | vi_VN |
dc.title | ABSTRACTIVE TEXT SUMMARIZATION USING BART WITH MIXTURE OF EXPERTS | vi_VN |
dc.title.alternative | TÓM TẮT TÓM LƯỢC VĂN BẢN SỬ DỤNG BART VÀ MIXTURE OF EXPERTS | vi_VN |
dc.type | Thesis | vi_VN |
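The abstract describes replacing BART's position-wise feed-forward sublayer with a Mixture of Experts layer that routes each token through a learned gate to a specialized expert. The thesis code is not part of this record, so the PyTorch sketch below is only a minimal illustration of that idea under assumed design choices: a top-1 (switch-style) router and GELU experts, with all names (`MoEFeedForward`, `num_experts`, and so on) hypothetical rather than taken from the thesis.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    """Drop-in replacement for a Transformer feed-forward sublayer.

    A linear router picks one expert per token (top-1 routing); each
    expert is an ordinary two-layer feed-forward network. Hypothetical
    sketch, not the thesis's actual implementation.
    """

    def __init__(self, d_model: int, d_ff: int, num_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff),
                          nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing
        batch, seq_len, d_model = x.shape
        tokens = x.reshape(-1, d_model)
        gate_logits = self.router(tokens)            # (tokens, experts)
        gate_probs = F.softmax(gate_logits, dim=-1)
        top_prob, top_idx = gate_probs.max(dim=-1)   # top-1 expert per token
        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                # Scale each expert's output by its gate probability so
                # the router weights receive gradient during training.
                out[mask] = top_prob[mask].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape(batch, seq_len, d_model)
```

Because only the selected expert runs for each token, the parameter count grows with the number of experts while per-token compute stays close to the dense feed-forward baseline, which is the efficiency argument the abstract appeals to.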
Appears in Collections: | Trường Công nghệ Thông tin & Truyền thông |
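The reported numbers are ROUGE-1/2/L F1 scores on CNN/Daily Mail, conventionally written as percentages. As a hedged sketch only (the thesis may have used a different ROUGE implementation), the Hugging Face `evaluate` package computes such scores like this:

```python
import evaluate  # pip install evaluate rouge_score

rouge = evaluate.load("rouge")

# Placeholder strings; in the study these would be model-generated
# summaries and the CNN/Daily Mail reference highlights.
predictions = ["the moe-enhanced model summarizes the article ..."]
references = ["the reference highlights of the article ..."]

scores = rouge.compute(predictions=predictions, references=references)
# `evaluate` returns F1-style values in [0, 1]; scale by 100 to match
# the 36.89 / 16.24 / 27.60 style of reporting used in the abstract.
print({name: round(value * 100, 2) for name, value in scores.items()})
```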
Files in This Item:
File | Description | Size | Format
---|---|---|---
_file_ | Restricted Access | 1.5 MB | Adobe PDF