Algorithmic risk assessment policing models: Lessons from the Durham Constabulary HART model

OSWALD, Marion, GRACE, Jamie, URWIN, Sheena and BARNES, Geoffrey (2018). Algorithmic risk assessment policing models: Lessons from the Durham Constabulary HART model. Information and Communications Technology Law, 27 (2), 223-250.


PDF (Accepted Version): Executive summary Algorithmic Intelligence Analysis - Final version.pdf (323kB). All rights reserved.
Official URL: https://www.tandfonline.com/doi/full/10.1080/13600...
Link to published version: https://doi.org/10.1080/13600834.2018.1458455

Abstract

To permit the use of unproven algorithms in the police service in a controlled and time-limited way, and as part of a combination of approaches to combat algorithmic opacity, our research proposes ‘ALGO-CARE’, a guidance framework of some of the key legal and practical concerns that should be considered in relation to the use of algorithmic risk assessment tools by the police. As is common across the public sector, the UK police service is under pressure to do more with less, to target resources more efficiently and to take steps to identify threats proactively, for example under risk-assessment schemes such as ‘Clare’s Law’ and ‘Sarah’s Law’. Algorithmic tools promise to improve a police force’s decision-making and prediction abilities by making better use of data (including intelligence), both from inside and outside the force. This research uses Durham Constabulary’s Harm Assessment Risk Tool (HART) as a case study. HART is one of the first algorithmic models to be deployed by a UK police force in an operational capacity. Our research comments upon the potential benefits of such tools, explains the concept and method of HART, and considers the results of the first validation of the model’s use and accuracy. The research concludes that if the use of algorithmic tools in a policing context is to result in a ‘better’ outcome, that is to say, a more efficient use of police resources in a landscape of more consistent, evidence-based decision-making, an ‘experimental’ proportionality approach should be developed to ensure that new solutions from ‘big data’ can be found for criminal justice problems traditionally arising from clouded, non-augmented decision-making. Finally, our research notes that there is a sub-set of decisions whose impact upon society and upon the welfare of individuals is too great for them to be influenced by an emerging technology; to such an extent, in fact, that they should be removed from the influence of algorithmic decision-making altogether.

Item Type: Article
Research Institute, Centre or Group: Law Research Group
Departments: Faculty of Social Sciences and Humanities > Department of Law and Criminology
Identification Number: https://doi.org/10.1080/13600834.2018.1458455
Page Range: 223-250
Depositing User: Jamie Grace
Date Deposited: 21 Dec 2017 12:20
Last Modified: 18 Mar 2021 01:16
URI: https://shura.shu.ac.uk/id/eprint/17462
