'The Computer Got It Wrong': How Facial Recognition Led To False Arrest Of Black Man
Updated 9:05 p.m. ET Wednesday
Police in Detroit were trying to figure out who stole five watches from a Shinola retail store. Authorities say the thief took off with an estimated $3,800 worth of merchandise.
Investigators pulled a security video that had recorded the incident. Detectives zoomed in on the grainy footage and ran the person who appeared to be the suspect through facial recognition software.
A hit came back: Robert Julian-Borchak Williams, 42, of Farmington Hills, Mich., about 25 miles northwest of Detroit.
In January, police pulled up to Williams' home and arrested him while he stood on his front lawn in front of his wife and two daughters, ages 2 and 5, who cried as they watched their father being placed in the patrol car.
His wife, Melissa Williams, wanted to know where police were taking her husband.
" 'Google it,' " she recalls an officer telling her.
Robert Williams was led to an interrogation room, and police put three photos in front of him: two photos taken from the surveillance camera in the store and a photo of Williams' state-issued driver's license.
"When I look at the picture of the guy, I just see a big Black guy. I don't see a resemblance. I don't think he looks like me at all," Williams said in an interview with NPR.
"[The detective] flips the third page over and says, 'So I guess the computer got it wrong, too.' And I said, 'Well, that's me,' pointing at a picture of my previous driver's license," Williams said of the interrogation with detectives. " 'But that guy's not me,' " he said, referring to the other photographs.
"I picked it up and held it to my face and told him, 'I hope you don't think all Black people look alike,' " Williams said.
Williams was detained for 30 hours and then released on bail until a court hearing on the case, his lawyers say.
At the hearing, a Wayne County prosecutor announced that the charges against Williams were being dropped due to insufficient evidence.
Civil rights experts say Williams is the first documented example in the U.S. of someone being wrongfully arrested based on a false hit produced by facial recognition technology.
Lawyer: Artificial intelligence 'framed and informed everything'
What makes Williams' case extraordinary is that police admitted that facial recognition technology, conducted by Michigan State Police in a crime lab at the request of the Detroit Police Department, prompted the arrest, according to charging documents reviewed by NPR.
The pursuit of Williams as a possible suspect came despite repeated claims by him and his lawyers that the match generated by artificial intelligence was faulty.
The alleged suspect in the security camera image was wearing a red St. Louis Cardinals hat. Williams, a Detroit native, said he would under no circumstances be wearing that hat.
"They never even asked him any questions before arresting him. They never asked him if he had an alibi. They never asked if he had a red Cardinals hat. They never asked him where he was that day," said lawyer Phil Mayor with the ACLU of Michigan.
On Wednesday, the ACLU of Michigan filed a complaint against the Detroit Police Department asking that police stop using the software in investigations.
In a statement to NPR, the Detroit Police Department said after the Williams case, the department enacted new rules. Now, only still photos, not security footage, can be used for facial recognition. And it is now used only in the case of violent crimes.
"Facial recognition software is an investigative tool that is used to generate leads only. Additional investigative work, corroborating evidence and probable cause are required before an arrest can be made," Detroit Police Department Sgt. Nicole Kirkwood said in a statement.
In Williams' case, police had asked the store security guard, who had not witnessed the robbery, to pick the suspect out of a photo lineup based on the footage, and the security guard selected Williams.
Victoria Burton-Harris, Williams' lawyer, said in an interview that she is skeptical that investigators used the facial recognition software as only one of several possible leads.
"When that technology picked my client's face out, from there, it framed and informed everything that officers did subsequently," Burton-Harris said.
One of the leading studies on bias in face recognition was conducted by Joy Buolamwini, an MIT researcher and founder of the Algorithmic Justice League.
"This egregious mismatch shows just one of the dangers of facial recognition technology which has already been shown in study after study to fail people of color, people with dark skin more than white counterparts generally speaking," Buolamwini said.
"The threats to civil liberties posed by mass surveillance are too high a price," she said. "You cannot erase the experience of 30 hours detained, the memories of children seeing their father arrested, or the stigma of being labeled criminal."
Maria Miller, a spokeswoman for the prosecutor's office, said the case was dismissed over insufficient evidence, including that the charges were filed without the support of any live witnesses.
Wayne County Prosecutor Kym Worthy said any case sent to her office that uses facial recognition technology cannot move forward without other supporting evidence.
"This case should not have been issued based on the DPD investigation, and for that we apologize," Worthy said in a statement to NPR. "Thankfully, it was dismissed on our office's own motion. This does not in any way make up for the hours that Mr. Williams spent in jail."
Worthy said Williams is able to have the case expunged from his record.
Williams: 'Let's say that this case wasn't retail fraud. What if it's rape or murder?'
According to Georgetown Law's Center on Privacy and Technology, at least a quarter of the nation's law enforcement agencies have access to face recognition tools.
"Most of the time, people who are arrested using face recognition are not told face recognition was used to arrest them," said Jameson Spivack, a researcher at the center.
While Amazon, Microsoft and IBM have announced a halt to sales of face recognition technology to law enforcement, Spivack said that will have little effect. Most major facial recognition contracts with police are with smaller, more specialized companies, such as South Carolina-based DataWorks Plus, which supplied the Detroit Police Department with its face-scanning software.
The company did not respond to an interview request.
DataWorks Plus has supplied the technology to government agencies in Santa Barbara, Calif., Chicago and Philadelphia.
Facial recognition technology is used by consumers every day to unlock their smartphones or to tag friends on social media. Some airports use the technology to scan passengers before they board flights.
Its deployment by governments, though, has drawn concern from privacy advocates and experts who study the machine learning tool and have highlighted its flaws.
"Some departments of motor vehicles will use facial recognition to detect license fraud, identity theft, but the most common use is law enforcement, whether it's state, local or federal law enforcement," Spivack said.
Government use of facial recognition technology has been banned in half a dozen cities.
In Michigan, Williams said he hopes his case is a wake-up call to lawmakers.
"Let's say that this case wasn't retail fraud. What if it's rape or murder? Would I have gotten out of jail on a personal bond, or would I have ever come home?" Williams said.
Williams and his wife, Melissa, worry about the long-term effects the arrest will have on their two young daughters.
"Seeing their dad get arrested, that was their first interaction with the police. So it's definitely going to shape how they perceive law enforcement," Melissa Williams said.
In the complaint, Williams and his lawyers say that if the police department won't ban the technology outright, then at least his photo should be removed from the database so this doesn't happen again.
"If someone wants to pull my name and look me up," Williams said, "who wants to be seen as a thief?"
Copyright 2021 NPR. To see more, visit https://www.npr.org.