Please use this identifier to cite or link to this item:
https://dspace.ctu.edu.vn/jspui/handle/123456789/94764
Title: TEXT SUMMARIZATION USING BART WITH DEEP ATTENTION
Other Titles: TÓM TẮT VĂN BẢN SỬ DỤNG MÔ HÌNH BART VỚI DEEP ATTENTION
Authors: Lâm, Nhựt Khang; Nguyễn, Khánh Vinh
Keywords: Information technology - high-quality program (CÔNG NGHỆ THÔNG TIN - CHẤT LƯỢNG CAO)
Issue Date: 2023
Publisher: Trường Đại Học Cần Thơ (Can Tho University)
Abstract: Automatic text summarization is a pivotal facet of Natural Language Processing (NLP) and Artificial Intelligence (AI), seeking to distill essential information while preserving the core meaning of the original text. In this study, the BART model, a state-of-the-art transformer-based architecture designed for sequence-to-sequence tasks, was applied to the CNN/DailyMail dataset, a widely adopted text summarization benchmark consisting of news articles paired with human-written summaries. The experiments yielded compelling results, showcasing BART's strong performance. After fine-tuning on the CNN/DailyMail dataset, the model generated coherent and informative summaries, capturing salient details while maintaining contextual coherence, which underscores its usefulness in real-world summarization scenarios. This research contributes to automatic text summarization by demonstrating the successful pairing of the BART model with a high-quality dataset, emphasizing the importance of robust models and datasets for training effective summarization systems in an era of information overload.
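The record itself contains no code, but the workflow described in the abstract (fine-tuning BART and generating summaries for CNN/DailyMail articles) is commonly implemented with the Hugging Face transformers library. The sketch below shows only the inference step under that assumption; the facebook/bart-large-cnn checkpoint and the generation hyperparameters are illustrative choices, not details taken from the thesis.

```python
# Minimal sketch (not from the thesis): summarize one news article with a BART
# checkpoint fine-tuned on CNN/DailyMail. Model name and generation settings
# are assumptions for illustration only.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-large-cnn"  # assumed public checkpoint, not the thesis model
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = "..."  # a news article, e.g. one document from CNN/DailyMail

# Tokenize, truncating to BART's 1024-token encoder limit.
inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")

# Beam-search generation; these are common defaults, not the settings used in the study.
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    min_length=56,
    max_length=142,
    length_penalty=2.0,
    no_repeat_ngram_size=3,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```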
Description: 37 pages
URI: https://dspace.ctu.edu.vn/jspui/handle/123456789/94764
ISSN: B1910726
Appears in Collections: Trường Công nghệ Thông tin & Truyền thông (College of Information & Communication Technology)
Files in This Item:
File | Description | Size | Format
---|---|---|---
Restricted Access | | 1.05 MB | Adobe PDF