AL-TEKREETI, Zeena Sabah Ismaeel (2024). Artificial Intelligence Facial Expression Recognition for Early Prediction of Human Health Deterioration. Doctoral thesis, Sheffield Hallam University.
Documents
Al-Tekreeti_2024_PhD_ArtificialIntelligenceFacial.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Facial expressions are a universally recognised means of conveying internal emotional states across diverse human cultural and ethnic groups. Recent advances in understanding emotions expressed through verbal and non-verbal communication are particularly noteworthy in the clinical context for assessing patients' health and well-being. Facial expression recognition (FER) plays a vital role in healthcare, providing insight into patients' feelings and allowing mental and physical health conditions to be assessed and monitored. However, the subtle and rapid nature of facial expressions makes swift recognition and interpretation challenging. Previous research collaboration between North Middlesex Hospital and the GMPR group demonstrated for the first time that human-recognised patterns of facial action units can be used to predict admission to intensive care. This thesis shows that automatic machine learning methods may predict health deterioration accurately and robustly, independently of human subjective assessment. Methods are developed to create a facial database mimicking the underlying muscular structure of the face, whose action unit motions can then be transferred to human face images, thus displaying animated expressions of interest. To detect and recognise these expressions, five models are proposed and tested. The first model combines a face detection method with a one-dimensional Convolutional Neural Network (1D-CNN), using raw generated data coordinates as input, and achieves 99.74% accuracy in predicting patient deterioration. The second model combines a 1D-CNN with Long Short-Term Memory (LSTM), applying different data pre-processing methods, and reaches an overall accuracy of 99.89%. The third and fourth models, based on Random Forest and Support Vector Machine methods, yield accuracies of 100% and 60% respectively. Finally, the Transformer model yields a low accuracy of 20%. The main contributions to knowledge from this thesis are 1) the generation of visual datasets mimicking real-life samples of facial expressions indicating health deterioration; 2) improved understanding of, and communication with, patients at risk of deterioration through facial expression analysis; and 3) the development of state-of-the-art models that recognise such facial expressions from simulated data. The significance of the investigation and the prediction model designs is that they can directly support clinical systems in detecting and assessing early signs of health deterioration from the analysis of patients' facial expressions. As such, the outcomes of this PhD thesis may help to improve the assessment of health deterioration by introducing real-time health trend analysis and early warning systems to support timely interventions.
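The record does not reproduce the thesis code, so the following is only a minimal sketch of how the second model described above (a 1D-CNN feeding an LSTM over sequences of facial landmark coordinates) might be assembled in Keras/TensorFlow. The sequence length, the 68 two-dimensional landmarks, the layer sizes, and the synthetic stand-in data are all illustrative assumptions and are not taken from the thesis.

    # Hypothetical sketch, not the thesis implementation.
    import numpy as np
    from tensorflow.keras import layers, models

    SEQ_LEN, N_FEATURES = 30, 136  # assumed: 30 frames of 68 (x, y) landmarks

    model = models.Sequential([
        layers.Input(shape=(SEQ_LEN, N_FEATURES)),
        layers.Conv1D(64, kernel_size=3, activation="relu", padding="same"),  # local temporal patterns
        layers.MaxPooling1D(pool_size=2),
        layers.LSTM(64),                        # longer-range temporal dependencies
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of deterioration
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Synthetic stand-in data, used here only to show the expected input shapes.
    X = np.random.rand(100, SEQ_LEN, N_FEATURES).astype("float32")
    y = np.random.randint(0, 2, size=(100, 1))
    model.fit(X, y, epochs=2, batch_size=16, verbose=0)

In such a design the convolutional layer extracts short-range motion features from the landmark sequences before the LSTM summarises them over time; the actual architectures, inputs and pre-processing used in the thesis may differ.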