Algorithmic risk assessment policing models: Lessons from the Durham Constabulary HART model

OSWALD, Marion, GRACE, Jamie, URWIN, Sheena and BARNES, Geoffrey (2018). Algorithmic risk assessment policing models: Lessons from the Durham Constabulary HART model. Information and Communications Technology Law, 27 (2), 223-250. [Article]

Documents
Executive summary Algorithmic Intelligence Analysis - Final version.pdf (Accepted Version, 323kB). Available under licence: All rights reserved.
Abstract
To permit the use of unproven algorithms in the police service in a controlled and time-limited way, and as part of a combination of approaches to combat algorithmic opacity, our research proposes ‘ALGO-CARE’, a guidance framework setting out key legal and practical concerns that should be considered when algorithmic risk assessment tools are used by the police. As is common across the public sector, the UK police service is under pressure to do more with less, to target resources more efficiently, and to identify threats proactively, for example under risk-assessment schemes such as ‘Clare’s Law’ and ‘Sarah’s Law’. Algorithmic tools promise to improve a police force’s decision-making and prediction abilities by making better use of data (including intelligence) from both inside and outside the force. This research uses Durham Constabulary’s Harm Assessment Risk Tool (HART), one of the first algorithmic models to be deployed by a UK police force in an operational capacity, as a case study. We comment on the potential benefits of such tools, explain the concept and method of HART, and consider the results of the first validation of the model’s use and accuracy. The research concludes that if the use of algorithmic tools in a policing context is to result in a ‘better’ outcome, that is to say, a more efficient use of police resources in a landscape of more consistent, evidence-based decision-making, then an ‘experimental’ proportionality approach should be developed to ensure that new ‘big data’ solutions can be found for criminal justice problems traditionally arising from clouded, non-augmented decision-making. Finally, our research notes that there is a sub-set of decisions whose impact upon society and upon the welfare of individuals is too great for them to be influenced by an emerging technology, to such an extent, in fact, that they should be removed from the influence of algorithmic decision-making altogether.
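
The abstract does not set out HART's internal mechanics. As a hedged illustration only, the sketch below shows the general kind of data-driven risk-triage classifier discussed: a model trained on historical cases that assigns each new case a graded risk band to support decision-making. The synthetic features, labels, and the choice of a random forest here are illustrative assumptions, not the published HART specification.

# Illustrative sketch only: a three-band (low / moderate / high) risk triage
# classifier of the general kind described in the abstract. Feature names,
# data, and model settings are hypothetical, not the published HART model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic custody-event features (hypothetical): age at decision,
# prior arrest count, months since last offence, prior violent offence count.
n = 2000
X = np.column_stack([
    rng.integers(18, 70, n),   # age at decision
    rng.poisson(3, n),         # prior arrest count
    rng.integers(0, 120, n),   # months since last offence
    rng.poisson(1, n),         # prior violent offence count
])

# Synthetic labels standing in for observed outcome severity:
# 0 = low, 1 = moderate, 2 = high risk band.
score = 0.4 * X[:, 1] + 0.8 * X[:, 3] - 0.02 * X[:, 2] + rng.normal(0, 1, n)
y = np.digitize(score, bins=np.quantile(score, [0.5, 0.85]))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A forest of decision trees votes on the risk band for each case.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

bands = np.array(["low", "moderate", "high"])
print("Predicted bands for first five held-out cases:",
      bands[model.predict(X_test[:5])])
print("Held-out accuracy:", round(model.score(X_test, y_test), 3))

In practice, and as the abstract's conclusions emphasise, the question of which decisions such a model's output may legitimately influence matters as much as the model's predictive accuracy.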