User context recognition using smartphone sensors and classification models

OTEBOLAKU, Abayomi and ANDRADE, M.T. (2016). User context recognition using smartphone sensors and classification models. Journal of Network and Computer Applications, 66, 33-51.

Full text not available from this repository.
Official URL: https://www.sciencedirect.com/science/article/pii/...
Link to published version: https://doi.org/10.1016/j.jnca.2016.03.013

    Abstract

    © 2016 Elsevier Ltd. All rights reserved. Context recognition is an indispensable functionality of context-aware applications that deals with the automatic determination and inference of contextual information from a set of observations captured by sensors. It enables the development of applications that can respond and adapt to users' situations. Thus, much attention has been paid to building innovative context recognition capabilities into context-aware systems. However, some existing studies rely on wearable sensors for context recognition, and this practice has limited the incorporation of contexts into practical applications. Additionally, contexts are usually provided as low-level data, which are not suitable for more advanced mobile applications. This article explores and evaluates the use of a smartphone's built-in sensors and classification algorithms for context recognition. To realize this goal, labeled sensor data were collected as training and test datasets from volunteers' smartphones while they performed daily activities. Time-series features summarizing users' contexts were then extracted from the collected data using sliding windows with 50% overlap. Context recognition is achieved by inducing a set of classifiers with the extracted features. Using cross-validation, experimental results show that instance-based learners and decision trees are best suited for smartphone-based context recognition, achieving over 90% recognition accuracy. Nevertheless, using leave-one-subject-out validation, the performance drops to 79%. The results also show that a smartphone's orientation and rotation data can be used to recognize user contexts. Furthermore, using data from multiple sensors, our results indicate an improvement in context recognition performance of between 1.5% and 5%. To demonstrate its applicability, the context recognition system has been incorporated into a mobile application to support context-aware personalized media recommendations.
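    The pipeline the abstract describes can be sketched in a few lines: segment a sensor stream into windows with 50% overlap, summarize each window with time-series features, and classify with an instance-based learner. This is a minimal illustration only, not the authors' implementation; the window length, the choice of mean and standard deviation as features, the 1-nearest-neighbour classifier, and the synthetic "still"/"walking" traces are all assumptions made for the example.

    ```python
    import math
    import random

    def sliding_windows(samples, size, overlap=0.5):
        """Split a 1-D signal into fixed-size windows with the given overlap."""
        step = max(1, int(size * (1 - overlap)))  # 50% overlap -> step of size/2
        return [samples[i:i + size]
                for i in range(0, len(samples) - size + 1, step)]

    def features(window):
        """Summarize a window with simple time-series features (mean, std)."""
        n = len(window)
        mean = sum(window) / n
        var = sum((x - mean) ** 2 for x in window) / n
        return (mean, math.sqrt(var))

    def nn_classify(train, query):
        """Instance-based learner: label of the nearest training feature vector."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(train, key=lambda item: dist(item[0], query))[1]

    # Hypothetical accelerometer-magnitude traces: low variance while still,
    # high variance while walking (values are synthetic, not from the paper).
    random.seed(0)
    still = [9.8 + random.gauss(0, 0.05) for _ in range(256)]
    walking = [9.8 + random.gauss(0, 2.0) for _ in range(256)]

    train = ([(features(w), "still") for w in sliding_windows(still, 64)] +
             [(features(w), "walking") for w in sliding_windows(walking, 64)])

    query = [9.8 + random.gauss(0, 2.0) for _ in range(64)]
    print(nn_classify(train, features(query)))  # a high-variance window
    ```

    In practice the paper extracts features from multiple built-in sensors (including orientation and rotation data) and induces several classifiers, but the window-then-featurize-then-classify structure above is the common skeleton.
    
    
    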

    Item Type: Article
    Uncontrolled Keywords: Context recognition; Smartphone sensing; Multimedia personalization; Context-awareness; Context classification; 0899 Other Information and Computing Sciences; Networking & Telecommunications
    Identification Number: https://doi.org/10.1016/j.jnca.2016.03.013
    Page Range: 33-51
    SWORD Depositor: Symplectic Elements
    Depositing User: Symplectic Elements
    Date Deposited: 05 Jun 2020 09:01
    Last Modified: 05 Jun 2020 09:03
    URI: http://shura.shu.ac.uk/id/eprint/24429
