‘Algorithmic Impropriety’ in UK Policing Contexts: A Developing Narrative?

GRACE, Jamie (2021). ‘Algorithmic Impropriety’ in UK Policing Contexts: A Developing Narrative? In: MCDANIEL, John and PEASE, Ken, (eds.) Predictive Policing and Artificial Intelligence. Routledge Frontiers of Criminal Justice. Abingdon: Routledge.

PDF: Grace Algorithmic Impropriety.pdf (Accepted Version, 260kB)
Restricted to Repository staff only until 26 August 2022. All rights reserved.
Official URL: https://www.taylorfrancis.com/chapters/algorithmic...
Link to published version: https://doi.org/10.4324/9780429265365

    Abstract

    There is increasing use of algorithmic or machine learning-based intelligence analysis in the UK policing context. Two of the most high-profile intelligence retention and analysis practices used by the Metropolitan Police have recently been found to be unlawful: i) the indefinite retention of a peaceable individual’s records on a specialist domestic extremism database, and ii) the overly lengthy retention of records on disproportionately BAME citizens in London on a ‘Gangs Matrix’. These two findings, from the European Court of Human Rights and the UK Information Commissioner’s Office respectively, indicate that forces heeding the 2018 call of Her Majesty’s Chief Inspector of Constabulary to devote more resources to investment in ‘AI’ for policing purposes must do so carefully. Indeed, the new National Data Analytics Solution (NDAS) project, based within West Midlands Police, has recently been the subject of critical ethical scrutiny on a number of fronts. The West Midlands force has seen its own data-driven ‘Integrated Offender Management’ tool delayed by demands for greater clarity from a bespoke ethics committee. This may have headed off a later finding of unlawfulness in the courts, since a challenge could have been brought by way of judicial review on administrative law principles, as well as under data protection, human rights and equality law. This chapter therefore seeks to draw out lessons for policymakers from these early skirmishes in the field of ‘predictive policing’. It concludes with some observations on the need for a set of minimum standards of transparency in a statutory authorization process for algorithmic police intelligence analysis tools (APIATs), in a mooted Predictive Policing (Technology) Bill.

    Item Type: Book Section
    Identification Number: https://doi.org/10.4324/9780429265365
    Date Deposited: 06 Feb 2020 09:45
    Last Modified: 17 Mar 2021 13:30
    URI: http://shura.shu.ac.uk/id/eprint/25795
