RODRIGUES, Marcos and KORMANN, Mariza (2019). Real-time detection and analysis of facial action Units to identify patients at risk in critical care (abstract only). In: Proceedings of International Conference on Recent Advance in Social Sciences, Entrepreneurship, Business and Economics Research (SEBER-AUG-2019). CIES. [Book Section]
Abstract
The emotional expressions of humans and animals have been investigated ever since Charles Darwin. It is generally accepted that facial expressions convey what we are feeling, although interpretations may vary among cultural groups. Previous collaborative research between North Middlesex University Hospital (London) and the GMPR Research Group at Sheffield Hallam University [1], [2], [3], [4] demonstrated for the first time that patterns of facial action units identified in deteriorating patients can predict admission to intensive care. Our aims are to extend the current data set and to compare automatic predictions with standard methods. We are collecting facial data from patients in critical care and investigating methods for the automatic recognition of facial action units as predictors of patient deterioration. The system was implemented on a MacBook Pro with a 2.5 GHz Intel Core i7 and 16 GB of 1600 MHz DDR3 memory, running macOS Mojave 10.14.5. The algorithms operate on live video, comparing current measurements with a baseline; the baseline is estimated from an image of the patient's face taken a few hours earlier or on a previous day.
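The abstract does not name the vision libraries behind the detection step; purely as an illustrative sketch, the following assumes OpenCV for video capture and dlib's standard 68-point landmark predictor for the facial data points:

    import cv2
    import dlib

    # Assumption: dlib's 68-point facial landmark model; the abstract does not
    # specify which detector or landmark scheme the system actually uses.
    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def detect_landmarks(frame):
        """Return the (x, y) facial data points for the first detected face."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)
        if not faces:
            return None
        shape = predictor(gray, faces[0])
        return [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]

    # Live video loop: measure each frame and compare against the baseline.
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        points = detect_landmarks(frame)
        # ... evaluate AU15 / AU25 / AU43 from these points (see the sketches below) ...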
Upon detecting all regions and data points of interest on the face, the following parameters are used for AU evaluation:
• AU15: lip corner depressor. From the detected data points on the lip, evaluate the corner depression using both trigonometric relationships and curvature measures (equation (1) below gives the trigonometric measure). Compare these measurements with the baseline and determine whether they indicate deterioration, improvement, or no change.
• AU25: lips relaxation. Determine how relaxed (open) the lips are by evaluating the area enclosed by the detected data points, and determine whether that area is increasing, decreasing, or stable as an indication of deterioration, improvement, or no change; see the area sketch after this list.
• AU43: eyes relaxation. As with the lips, estimate the area enclosed by the detected data points.
The system constantly compares the previous image with the current one and makes a prediction from these measurements in real time. A green banner indicates no significant change from the baseline measurements.
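For AU25 and AU43 the abstract only states that the area enclosed by the detected data points is evaluated; a minimal sketch follows, assuming the shoelace formula over the ordered contour points and a hypothetical 10% tolerance for the trend decision:

    def enclosed_area(points):
        """Area of the polygon formed by ordered (x, y) landmarks.

        Assumption: the abstract does not say how the enclosed area is
        computed; the shoelace formula is one standard choice."""
        n = len(points)
        area = 0.0
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            area += x1 * y2 - x2 * y1
        return abs(area) / 2.0

    def area_trend(current_area, baseline_area, tolerance=0.10):
        """Classify the change against the baseline as 'increasing',
        'decreasing', or 'stable'. The tolerance is a hypothetical
        threshold, not a value given in the abstract."""
        if current_area > baseline_area * (1 + tolerance):
            return "increasing"
        if current_area < baseline_area * (1 - tolerance):
            return "decreasing"
        return "stable"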
To calculate the lip depressor we use the angle between two straight lines. The lines run from each lip corner and intersect at the point at the centre of the top lip; note that these points are detected automatically by our algorithms. Let m1 and m2 be the slopes of the two lines; the angle θ is then estimated by:

    θ = tan⁻¹( ±(m2 − m1) / (1 + m1 m2) )    (1)
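A short sketch of equation (1) in code, assuming each line is defined by a detected lip corner and the centre point of the top lip (the coordinates in the final lines are hypothetical, for illustration only):

    import math

    def slope(p, q):
        """Slope of the line through points p and q (assumes a non-vertical line)."""
        return (q[1] - p[1]) / (q[0] - p[0])

    def lip_depressor_angle(left_corner, right_corner, top_lip_centre):
        """Angle between the lines joining each lip corner to the top-lip
        centre, per equation (1): theta = atan(|m2 - m1| / (1 + m1*m2))."""
        m1 = slope(left_corner, top_lip_centre)
        m2 = slope(right_corner, top_lip_centre)
        denom = 1 + m1 * m2
        if denom == 0:
            return math.pi / 2  # the lines are perpendicular
        return math.atan(abs((m2 - m1) / denom))

    # Hypothetical landmark coordinates (pixels):
    theta = lip_depressor_angle((120, 210), (180, 212), (150, 195))
    print(math.degrees(theta))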
Our approach to the automatic recognition of facial action units can overcome the human factors associated with the failure to recognise deteriorating patients on the ward. We are at TRL 1 (basic principles observed), as we have demonstrated that patterns of action units can be used as a predictor of admission to critical care, and at TRL 2 (technology concept formulated) for a non-contact device to assist admission to critical care. The advantages of our solution are that it is independent of human factors (safety) and can decrease the number of nurses needed to monitor patients on the ward (cost-effectiveness).