Please use this identifier to cite or link to this item:
https://dspace.ctu.edu.vn/jspui/handle/123456789/101209
Title: | Multistage deep learning for air quality index prediction |
Authors: | Nguyen, Viet Hung; Dang, Ngoc Hung |
Keywords: | Convolutional neural network; Air quality monitoring; UAV; Bi-directional long short-term memory |
Issue Date: | 2022 |
Series/Report no.: | Tạp chí Khoa học Công nghệ Thông tin và Truyền thông; No. 04(CS.01), pp. 17-24 |
Abstract: | Air quality prediction is a challenging but practical research topic in machine learning and data analytics. Since air quality directly affects human health and life in the long term, predicting its index values has always attracted much attention from researchers and government agencies. Today, many ground-based stations are established to provide air quality index values in monitored areas. Meanwhile, Unmanned Aerial Vehicles (UAVs) are being used more and more for surveillance applications, making them a good candidate platform for air quality monitoring. However, monitoring and predicting air quality using UAVs is still a new domain and poses many challenges for the research community. To solve the problem of predicting air quality based on sensor values measured by a UAV, in this paper we propose a solution based on a model combining a one-dimensional convolutional neural network and a bi-directional long short-term memory network (1DCNN-BiLSTM). Experimental results with highly efficient and practical performance have shown that our proposed method can be deployed in real monitoring applications. The proposed system can also be a useful source of data to complement ground-based stations. |
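The 1DCNN-BiLSTM pipeline described in the abstract can be sketched as follows: a 1D convolution extracts local features from a window of multi-channel sensor readings, a forward and a backward LSTM pass over those features are concatenated, and a linear head produces the AQI prediction. This is a minimal NumPy illustration of the architecture class only, not the authors' implementation; the window length, channel count, layer sizes, and the random toy data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv1d(x, w, b):
    # x: (T, C_in), w: (k, C_in, C_out), b: (C_out,) -> (T-k+1, C_out) with ReLU
    k = w.shape[0]
    out = np.zeros((x.shape[0] - k + 1, w.shape[2]))
    for t in range(out.shape[0]):
        out[t] = np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def lstm(x, params):
    # Single-layer LSTM forward pass; returns all hidden states (T, H).
    Wx, Wh, b = params            # Wx: (C, 4H), Wh: (H, 4H), b: (4H,)
    H = Wh.shape[0]
    h, c, hs = np.zeros(H), np.zeros(H), []
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        hs.append(h)
    return np.array(hs)

def bilstm(x, p_fwd, p_bwd):
    # Run one LSTM forward in time and one backward, concatenate per step.
    h_fwd = lstm(x, p_fwd)
    h_bwd = lstm(x[::-1], p_bwd)[::-1]
    return np.concatenate([h_fwd, h_bwd], axis=1)

def lstm_params(c_in, h):
    return (rng.normal(size=(c_in, 4 * h)) * 0.1,
            rng.normal(size=(h, 4 * h)) * 0.1,
            np.zeros(4 * h))

# Hypothetical sensor window: 24 time steps, 5 pollutant channels.
T, C, F, H = 24, 5, 8, 16
x = rng.normal(size=(T, C))

feat = conv1d(x, rng.normal(size=(3, C, F)) * 0.1, np.zeros(F))   # (22, 8)
h = bilstm(feat, lstm_params(F, H), lstm_params(F, H))            # (22, 32)
aqi_pred = h[-1] @ (rng.normal(size=(2 * H,)) * 0.1)              # scalar AQI
print(feat.shape, h.shape)
```

In practice such a model would be built with a deep learning framework and trained on labeled UAV sensor data; the sketch only shows how the convolutional features feed the bidirectional recurrence.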
URI: | https://dspace.ctu.edu.vn/jspui/handle/123456789/101209 |
ISSN: | 2525-2224 |
Appears in Collections: | Khoa học Công nghệ Thông tin và Truyền thông |
Files in This Item:
File | Description | Size | Format
---|---|---|---
_file_ (Restricted Access) | | 4.36 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.