SALEH, Nurul, AB GHANI, Hadhrami and JILANI, Zairul (2022). Defining factors in hospital admissions during COVID-19 using LSTM-FCA explainable model. Artificial Intelligence in Medicine: 102394. [Article]
Documents
Jilani-DefiningFactorsHospital(AM).pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Abstract
The COVID-19 pandemic, caused by SARS-CoV-2 infection first reported in Wuhan, China, spread rapidly worldwide, and the resulting situation has produced a dynamic rate of hospital admissions. Efforts by the Artificial Intelligence (AI) and Machine Learning (ML) communities to develop solutions supporting COVID-19-related research have escalated ever since. Despite these efforts, however, many machine-learning-based AI systems are designed as black boxes. This paper proposes a model that uses Formal Concept Analysis (FCA) to explain a machine learning technique called Long Short-Term Memory (LSTM) on a dataset of hospital admissions due to COVID-19 in the United Kingdom. The aim is to increase the transparency of decision-making in the era of ML through the proposed LSTM-FCA explainable model. Both LSTM and FCA evaluate the data and explain the model, making the results more understandable and interpretable. The results and discussion may lead to new research that optimizes the use of ML in real-world applications and helps contain the disease.
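The abstract pairs an LSTM predictor with Formal Concept Analysis as the explanation layer. The sketch below is not the authors' pipeline; it only illustrates, under assumed feature names, sequence shapes, and thresholds, how an LSTM admission classifier and a binary formal context (the object-attribute table that FCA builds a concept lattice from) could fit together.

# A minimal sketch (illustrative only, not the paper's code): an LSTM over a
# short sequence of daily patient features, followed by construction of a
# binary formal context relating patients (objects) to attributes, which is
# the structure Formal Concept Analysis operates on.
import torch
import torch.nn as nn

class AdmissionLSTM(nn.Module):
    """Predicts hospital admission (binary) from a sequence of daily features."""
    def __init__(self, n_features: int, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                          # x: (batch, seq_len, n_features)
        _, (h_n, _) = self.lstm(x)                 # h_n: (1, batch, hidden_size)
        return torch.sigmoid(self.head(h_n[-1]))   # admission probability

# Toy batch: 4 patients, 7 days of observations, 5 assumed daily features.
x = torch.randn(4, 7, 5)
model = AdmissionLSTM(n_features=5)
probs = model(x).squeeze(-1)

# Formal context for FCA: rows are patients, columns are binary attributes
# (here: the model's predicted admission plus two thresholded input features
# on the final day). A concept lattice over this incidence table is one way
# to read off which attribute combinations co-occur with predicted admissions.
attributes = ["pred_admitted", "feat0_high", "feat1_high"]
for patient in range(x.shape[0]):
    row = (bool(probs[patient] > 0.5),
           bool(x[patient, -1, 0] > 0),
           bool(x[patient, -1, 1] > 0))
    print(f"patient {patient}:", dict(zip(attributes, row)))

In the paper's setting the attributes would instead be the recorded admission factors in the UK dataset, but the same object-attribute incidence structure applies.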