As our regular readers will know, Be-IT has written regularly about the dangers posed by Facial Recognition Technology (FRT). The recent news that Clearview AI has been fined £7.5 million for acting illegally therefore comes as no surprise.
As The Guardian and others report, “Clearview AI has collected more than 20bn images of people’s faces from Facebook, other social media companies and from scouring the web.
“The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service.” The UK Information Commissioner deems this to be unacceptable.
More specifically, the Information Commissioner’s Office said that “Clearview AI broke UK data protection laws in several ways, including: failing to use information of UK residents in a fair and transparent way; failing to have a lawful reason for collecting that information; and failing to have a process in place to stop the data being retained indefinitely.”
Let’s face it (no pun intended), almost all of us use social media constantly. We all click ‘accept cookies’ because we can’t be bothered to adjust the settings. Yet, if we were told that the police were using FRT to monitor our movements in the streets, we would probably not be happy, especially if this were then used for some form of social credit system (as happens in China). Similarly, parents might be aghast to learn that it is being used in schools. But of course, this wouldn’t happen in this country, would it?
Fact: until recently, several UK police forces, the MOD and the National Crime Agency were using Clearview AI technology, having been offered it on a free trial basis. They have all since stopped, and the company no longer operates in the UK.
Fact: just over a year ago, an analysis of 100 countries found only six that didn’t use FRT. The full table from the report is shown below. Each country was scored out of 40, with higher scores indicating no or less invasive use of FRT and lower scores indicating more widespread and invasive use. China scored lowest, with only four out of 40, followed by Russia with nine, but the UK was none too great either, scoring 17. The report also found that 70% of the governments in the study “are using FRT on a large-scale basis,” and that the same percentage of police forces use it. Some 20% of countries have FRT in at least some schools. Only a handful of countries worldwide have banned its use, and even in some of those, limited police use is still permitted.
Now, it must be stressed that not all use of FRT is malign. It has been used to help track the spread of Covid, for example, and clearly if it can be used to prevent serious crime and terrorism then that is unequivocally a good thing. Clearview AI’s CEO insists his company is a force for good, saying “we only collect data from the open internet and comply with all standards of privacy and law.” The ICO obviously disagreed. And, be honest, do you really trust governments, even democratically elected ones, never to use your data against you…?