News

Watching me watching you – how much authority do the police have when it comes to prediction algorithms?


The film Minority Report raised a number of serious social issues, not least of which was this: is it morally and ethically right for someone to be arrested and charged simply because a psychic predicts they might constitute a threat? Well, although the police are adamant that they do not use psychics, the age of prediction algorithms and 'machine learning' is potentially bringing the dystopian world of Minority Report a little closer.

AI (Artificial Intelligence) is the current buzzword. According to a study published by the Royal United Services Institute in conjunction with the Centre for Information Rights at the University of Winchester, it could have a real impact on the criminal justice system in the years to come. In their words, the use of AI could have 'unintended or indirect consequences that are difficult to anticipate'. Scary stuff indeed.

A technology that's still finding its feet - and your face

Despite what you might see on CSI, facial recognition software isn't as good as you'd think. There have already been cases of mistaken identity, as well as claims that automated facial recognition (AFR) technology violates individuals' privacy rights. Back in June this year, two legal challenges were launched against the South Wales and Metropolitan police services over their use of AFR during public events. The technology is currently unregulated, and the use of biometric checks has been questioned by human rights organisations such as Liberty. In general, the use of high-tech surveillance equipment and AI-based algorithms is making the public feel deeply uneasy.

How bad is this software? Well, at the 2017 Champions League final in Cardiff, the AFR system was found to have incorrectly identified over 2,200 perfectly innocent people as potential criminals. 'Technical issues' were cited, but what kind of protection do the general public have when it comes to this kind of surveillance technology?

GDPR - a potential hole in the legislation?

GDPR has given people more rights than ever before to control what information is held about them and how it is used, including the 'right to be forgotten'. But AI and AFR seem to have slipped through the net somewhat. Liberty and other human rights organisations claim that AFR systems capture people's biometric data without their consent, in direct contradiction to the new legislation. They also claim that because the systems are fundamentally flawed, they disproportionately misidentify the faces of women and people from ethnic minorities, and that in general, AFR turns people into 'walking ID cards'. Seeing as ID cards have been rejected time and again by the public, AFR (for all its well-intentioned good points) doesn't sit easily in the public consciousness.

What can you do if you feel your ID has been 'harvested' by AI?

In short, not a lot. The Home Office is in favour of the use of AFR, especially at high-profile events (it has even been used at the Notting Hill Carnival in recent years). It does acknowledge that people's privacy should be respected; it just hasn't yet worked out how to do that without scrapping AFR altogether and starting again from scratch.

Individuals' rights under GDPR and other existing human rights legislation could be the only way to challenge this Big Brother approach to data collection. But unless you're part of a class action, it's going to be very difficult indeed to prove that your data has been harvested without your consent, or that the use of AFR breaches your personal liberties and human rights.

Currently, there are strict guidelines on where householders can position CCTV cameras on their homes, to ensure that members of the public innocently passing by are not inadvertently caught on camera. It seems, however, that the same cannot be said of AI and AFR, which are indiscriminately (and, as it has turned out, often incorrectly) capturing and processing data without individuals' knowledge.
If you feel that your privacy rights have been violated either by private CCTV surveillance or by AFR, it's best to talk to a solicitor or legal expert who specialises in both GDPR and human rights legislation.

Grant Sanders

Grant Sanders is the Practice Manager and can be contacted on 01323 644222 or gs@stephenrimmer.com
