CO109 Aaron Naparstek on the War on Cars

Aaron Naparstek is a cohost of the War on Cars podcast, and also the founder of


There have been a couple of stories about facial recognition. This audio is from a BBC report where the police set up a van with cameras filming passersby and searching for records on them based on facial recognition. One man decided that he didn’t like that, and pulled his sweater up over his mouth and nose to frustrate the camera system; the police stopped him, forced him to be photographed, and fined him £90, about $115, for what they called disorderly conduct.

There was no suggestion that he was guilty of any crime, at least of any other crime, if you call not wanting to be filmed a crime.

This was a trial run; the police brought a BBC camera crew along with them to film the demonstration. Notably, three other people were arrested when the cameras and the associated computer system recognized them as people who had outstanding warrants against them.

Compare that to the story of another British man, this time in the French port city of Calais, who will go on trial shortly over an incident that started when police noticed that he was filming them. He says that the police attacked a woman he was with, without provocation; the two of them were recording police behavior.

There is an ongoing dispute at migrant camps around Calais, where volunteers distributing food to migrants say that they are suffering intense harassment from the French police. Amnesty International have said that the charges are an abuse of process, and should be dropped. The man faces large fines and up to five years in prison if he is convicted.

This has echoes of the Glik v Cunniffe case in the United States, where a federal appeals court ruled that citizens have the right to film public officials, including police, who are working in public. There is a clear case here to say that what’s good for the goose is good for the gander. People have a right to film in public space. It has a good effect, too: people, not least the police, who know that their actions are being recorded are more likely to behave in a decent way.

But facial recognition is a whole new technology. It’s notable that San Francisco has banned its use by city agencies. This is not just filming people; it is effectively looking each person up in a database, just because they went out in public.

Now, the UK operation that led to one man being stopped and fined for pulling up his sweater is effectively the equivalent of setting up a checkpoint and saying that nobody is allowed to pass until they have given their ID and been checked. Is it justified?

It certainly seems like a scary use of new technology, but that alone is not a valid argument against it. You can’t rationally say that, while the rest of society moves on, law enforcement must only use technology invented before an arbitrary date.

But new technology has given governments ways of violating rights that were never before contemplated. If the police can use facial recognition to look up all passersby in a database of, say, outstanding warrants, then they can equally use facial recognition to record all passersby into a database.

That is the electronic equivalent of putting a checkpoint on every street corner, and not letting anyone pass unless they produce ID and have their movements recorded. If you don’t immediately see why that is incompatible with democracy, then two things – one, you don’t know the difference between democracy and a police state, and two, you are going to learn about that difference pretty soon.

But remember, in that UK trial, the police arrested three wanted men. That is the electronic equivalent of a cop spotting a fugitive in the street and collaring them. Should we really complain about that? This has the potential to take a lot of criminals off the streets and be a much more efficient use of resources.

My opinion is this: if something is already permissible, there is no good argument for saying that doing the same thing electronically, rather than by a human, is a violation. So spotting a crook in the street, whether by a person or by a machine: fine. But where something is clearly impermissible when done by humans, there is no justification for saying that, because we can do the same thing by machine, and be more subtle about it, it becomes acceptable. It doesn’t.

So the devil is in the detail here. We could say, if the system is programmed only to flag fugitives for the police to arrest, then that’s OK, but if it is programmed to record everyone who passes, for the authorities to use later however they see fit, that is not OK. But that would require an independent audit of the entire system being used. Let’s see how fast police forces are willing to agree to that.
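The distinction an auditor would have to verify can be sketched in a few lines of code. This is a toy illustration, not a real recognition pipeline: the "embeddings" are stand-in tuples, the watchlist names are invented, and a real system would use a face-encoding model. The point is only how small the difference in code is between a flag-only system and a record-everyone system, and why the retention log is what an audit needs to inspect.

```python
# Toy sketch of the policy difference the paragraph above describes.
# "flag-only" retains only confirmed watchlist matches; "record-all"
# retains every face seen. Embeddings are placeholder 2-tuples.

WATCHLIST = {
    (0.9, 0.1): "fugitive-A",  # hypothetical enrolled embeddings
    (0.2, 0.8): "fugitive-B",
}

def match(embedding, threshold=0.05):
    """Return a watchlist name if the embedding is close enough, else None."""
    for enrolled, name in WATCHLIST.items():
        dist = sum((a - b) ** 2 for a, b in zip(embedding, enrolled)) ** 0.5
        if dist <= threshold:
            return name
    return None

def process_frame(faces, mode, log):
    """faces: embeddings seen in one frame. Appends retained faces to log."""
    alerts = []
    for face in faces:
        name = match(face)
        if name:
            alerts.append(name)
        if mode == "record-all":
            log.append(face)      # every passerby is retained
        elif mode == "flag-only" and name:
            log.append(face)      # only confirmed matches are retained
    return alerts

faces = [(0.9, 0.1), (0.5, 0.5), (0.3, 0.3)]  # one match, two strangers

flag_log, record_log = [], []
process_frame(faces, "flag-only", flag_log)
process_frame(faces, "record-all", record_log)
print(len(flag_log), len(record_log))  # prints: 1 3
```

Both modes raise the same alert, so from the outside (three arrests, in the UK trial) the systems are indistinguishable; only the contents of `log` differ. That is why the paragraph above concludes that nothing short of an independent audit of the deployed system can tell the acceptable configuration from the unacceptable one.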