Gun-Toting Police Swarm, Handcuff Young Black Man After AI Mistakes Doritos Bag For a Gun
On Monday night a Black Baltimore teenager, Taki Allen, had finished football practice and was sitting outside his school waiting to be picked up. That school, which is supposed to protect its students, instead brought down on his head a traumatic and potentially deadly incident thanks to its video cameras and their “enhancement” with AI.
Allen said he ate a bag of chips, crumpled it up, and put it in his pocket. Apparently an AI-enhanced surveillance camera decided that he had a gun, and soon the police swarmed the scene. Allen told local reporters, “It was like eight cop cars that came pulling up to us [and] they started walking toward me with guns… They made me get on my knees and put my hands behind my back and cuff me. And they searched me.”
Allen said the police “said that an AI detector or something detected that I had a gun. But I was just holding — they showed me a picture — I was just holding a Doritos bag.”
Anyone being swarmed by eight cars’ worth of gun-waving police shouting orders at them would feel traumatized. But for a young Black man, given the history and present reality of racist policing in this country? That was a life-threatening situation. Allen said that during the incident he was thinking, “Am I gonna die? Are they going to kill me?” The violent response by police should not have happened at all, but it could all too easily have had a very tragic ending.
One reporter wrote that the story is “further evidence that artificial intelligence is not all that intelligent,” and while that’s true, the blame here lies not with the technology but with the human beings and institutions that created the techno-social system behind the incident.
To set up AI to be in a position to trigger this kind of response is grossly irresponsible, and the fault lies with some combination of the school that installed it, the vendor that pushed it on perhaps naïve-about-technology school officials, the school security staff who called the police, and the police, who, it sounds from Allen’s account, were aware that the alert came from an AI and had a still of what the AI had told them was a gun. Humans should have been in the loop here — before police were deployed, guns drawn — and recognized a Doritos bag when they saw one. (The school gave a description of how the alert was transmitted to police, suggesting that a school resource officer escalated the situation — more evidence that putting police in schools is a bad idea.)
So the biggest scandal here is not that AI is imprecise (stop the presses!) but that this was allowed to happen at all — and that it doesn’t look like any of the very human people to blame are being held accountable. Quite the opposite: Schools superintendent Dr. Myriam Rogers said the system worked the way it was supposed to.
The vendor is a company that touts that its AI was “built with military-grade precision” — which made me burst out laughing. “Military grade” is a hoary, over-used marketing term that is often mocked because of the hand-wavy way it evokes seriousness while being applied far beyond the few narrow areas where it may be a real thing. To apply it to today’s AI is even funnier because of how non-deterministic and lacking in precision that technology is. (I recently analyzed the state of AI machine vision here.) If this is indeed military-grade technology, then any military that uses it is in trouble.
That an incident like this would happen was entirely predictable — and in fact I predicted it in a piece about gun-detection technology three years ago:
like all alarm systems, and especially AI systems, there will be false positives — potentially a lot of them. Blanketing public spaces with buggy gun detectors may increase the incidence of tragic confrontations sparked by people holding cell phones, toy guns, or other everyday objects that police have mistaken for firearms.
It didn’t take any special genius or insight on my part to see this coming. All the factors — imprecise AI, AI that is “sold and marketed way beyond real-life performance,” the human predilection to give too much credence to computer alerts, and everybody’s state of high anxiety over the frequency of shootings in this country — were lined up to make it obvious that something like this would happen. And in an American policing system far too prone to shooting people — especially Black people — that is a dangerous combination.