GRACE, Jamie (2019). 'Algorithmic impropriety' in UK policing? Journal of Information Rights, Policy and Practice. [Article]
Documents
PDF: Grace_algorithmic_impropriety(VoR).pdf - Published Version. Available under License Creative Commons Attribution.
DOCX: Jamie Grace - Algorithmic impropriety in policing - 2019 proofread version.docx - Accepted Version. Restricted to Repository staff only.
Abstract
There are concerns that UK policing could soon be awash with 'algorithmic impropriety'. Big(ger) data and machine learning-based algorithms combine to produce opportunities for better intelligence-led management of offenders, but also create regulatory risks and some threats to civil liberties - even though these can be mitigated. In constitutional and administrative law terms, the use of predictive intelligence analysis software to serve up 'algorithmic justice' presents varying human rights and data protection problems, depending on the manner in which the output of the tool concerned is deployed. But regardless of exact context, all uses of algorithmic justice in policing raise linked fears: of potential fettering of discretion, arguable biases, possible breaches of natural justice, and troubling failures to take relevant information into account. The potential for 'data discrimination' in the growth of algorithmic justice is a real and pressing problem. This paper sets out a number of arguments, using grounds of judicial review as a structuring tool, that could be deployed against algorithmically-based decision-making processes that one might conceivably object to when encountered in the UK criminal justice system. Such arguments could be used to enhance and augment data protection and/or human rights grounds of review in this emerging algorithmic era - for example, if a campaign group or an individual claimant were to seek a remedy from the courts in relation to a particular algorithmically-based decision-making process or outcome.