
From The New York Times, I'm Michael Barbaro. This is The Daily. Today: facial recognition is becoming an increasingly popular tool for solving crimes. The Daily's Annie Brown speaks to Kashmir Hill about how that software is not treating everybody equally.


It's Monday, August 3rd. And I'm just going to tape-record with an app that I use. Do you guys have any questions or concerns before we start talking about what happened? No, no.


OK, so where do you think we should start the story of this case?


Kashmir, the story started for the Williams family in January of 2020.


It's a Thursday afternoon in Farmington Hills, Michigan, which is just outside of Detroit. I picked up Julia from school, a regular Thursday. And Melissa Williams, a realtor, is driving home from work. She's picking up her daughter. And it was right around like four o'clock, and I got a call. And she gets a call from somebody who says they're a police officer. They immediately said.


We're calling about Robert, about an incident in 2018. He needs to turn himself in. So I was confused off the bat. She is white and her husband, Robert Williams, is black, and they said, we assume you're his baby mama, that you're not together anymore. And I said, that's my husband. And what is this regarding? And they said, we can't tell you, but he needs to come turn himself in. And I said, well, why didn't you call him?


And they said, Can't you just give him a message?


Wait, so why is this officer calling her? She doesn't know why the officer is calling her. All she knows is that the police want to be in touch with her husband. So she gives the officer her husband's number, and then she calls Robert. And I said, I just got a really weird call, like, what did you do? What is this about? And while they're talking, Robert Williams gets a call on the other line from a detective from the Detroit police department.


He says I need to turn myself in. So, of course, I'm like, for what reason? Well, I can't tell you over the phone. So I'm like, well, then I can't turn myself in.


It was a couple of days before his birthday, so he thought maybe it was a prank call. But it became pretty clear that the person was serious about 10 minutes later, in the driveway.


And when he pulls into his driveway, a police car pulls in behind him, blocking him in, and two officers get out. Yes, so I get out of the car, and the officer who was driving runs up and he's like, are you Robert Williams? I'm like, yeah. He's like, you're under arrest. I'm like, no, I'm not. And the guy comes over with a white sheet of paper, and it's a felony warrant for larceny.


And I'm confused, like, larceny? What do I have that's worth stealing?


His wife comes out with his two young daughters, and his oldest daughter, who's five, is watching this happen. I say, go back in the house, I'll be back in a minute. And the cops are like, make no mistake, he's not coming right back. The other cop is behind me already. So he's like, come on, you know the drill. And I'm like, what? The officers arrest him. They have to use two pairs of handcuffs to get his hands behind his back because he's a really big guy.


We started moving seats around, trying to get me in the back of this little bitty Impala, and off we go. And then they drive to the detention center. They took fingerprints and mug shot pictures, then he's put in a cell to sleep overnight. At this point, I'm in a holding cell with two other guys, and I'm just like, I don't know.


So when do you actually find out why you've been arrested on this kind of vague charge?


So, well, maybe like noon the next day. Around noon the next day, he is taken to an interrogation room, and there are two detectives there. They have three pieces of paper face down in front of them, and they turn over the first sheet of paper. And it's a picture from a surveillance video of a large black man standing in a store, wearing a red Cardinals cap and a black jacket. And the detectives ask, is this you?


And I said, no, that's not me. So they turn over a second piece of paper, which is just a close-up of that same guy's face. And he says, I guess this is not you either? I said, no, this is not me.


So Robert picks the piece of paper up and holds it next to his own face.


Like, what, you think all black men look alike? And he says, do all black men look the same to you? So what's your understanding, Kashmir, of what happened to bring Robert Williams into that police department?


So Robert Williams had no idea what was happening. But two years earlier, in October 2018, a man who was not him walked into a Shinola store in downtown Detroit. And Shinola is kind of like a high-end store that sells expensive watches and bikes. And so this man came in, he was there for a few minutes, he stole five watches worth $3,800, and walked out. None of the employees there actually saw the theft occur, and so they had to review the surveillance footage, and they found the moment it happened.


So they sent that surveillance footage picture, the one that Robert Williams had been shown, to the Detroit police. And the police turned to what a lot of police turn to these days when they have a suspect that they don't recognize: a facial recognition system. So they ran a search on what they call a probe image, this picture from the surveillance video, which is really grainy; it's not a very good photo. And the way these systems work is that they have access not just to mug shots, but also to driver's license photos.


You get a bunch of different results and there's a human involved who decides which of the results looks the most like the person who committed the crime.
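The search-then-human-review pipeline Kashmir describes can be sketched roughly in code. This is a hypothetical illustration, not any vendor's actual system: the `cosine_similarity` and `candidate_lineup` functions and the toy embeddings below are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def candidate_lineup(probe, gallery, k=3):
    """Rank every enrolled photo (mug shots, driver's license photos)
    by similarity to the probe image and return the top k candidates
    for a HUMAN analyst to review. Nothing here is an identification."""
    ranked = sorted(gallery.items(),
                    key=lambda item: cosine_similarity(probe, item[1]),
                    reverse=True)
    return ranked[:k]

# Toy 3-dimensional embeddings; real systems use vectors from a deep
# network with hundreds of dimensions.
gallery = {
    "person_A": [0.9, 0.1, 0.2],
    "person_B": [0.2, 0.8, 0.1],
    "person_C": [0.4, 0.4, 0.4],
}
probe = [0.85, 0.15, 0.25]  # embedding of the grainy surveillance frame

for person_id, embedding in candidate_lineup(probe, gallery, k=2):
    print(person_id, round(cosine_similarity(probe, embedding), 3))
```

Even in this toy version, the key point from the interview survives: the system only ranks lookalikes, and a human decides which candidate, if any, becomes a lead.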


Mm hmm. So you're saying the facial recognition algorithm basically created a lineup of potential suspects, and then from that lineup, someone picks the person that they think looks the most like the man in the surveillance video. Right. And so that is how they wound up arresting Robert Williams. So back in this room, the two detectives now have the real Robert Williams in front of them, and he doesn't look like this guy. They sat back and looked at him, and then at the photo.


So I guess the computer got it wrong, too. And so they kind of leaned back and said, I guess the computer got it wrong. And what is the significance of that statement, that the computer got it wrong? So this was an admission by the detectives that it was a computer that had pointed the finger at Robert Williams. And that's significant because this is the first documented case of an innocent person being arrested because of a flawed facial recognition match.


And just to put all of this into context for a second: the last time that you and I talked, Kashmir, we were talking about a different development in facial recognition, this new algorithm being used by some police departments that drew from pictures all over social media and all over the Internet to make a kind of super algorithm. But the fear wasn't that it wasn't accurate. It was almost that it was too accurate, that it knew too much. But what you're describing is something altogether different, right?


So when we talk about facial recognition, we often think of it as a monolith, that there's kind of one facial recognition. But in fact, there are a bunch of different companies that all have their own algorithms, and some work well, some don't work well, and some work well only sometimes. Like, identifying a really clear photo is a lot easier than identifying surveillance footage.


And why wouldn't police departments be using the most sophisticated, the most kind of up to date version of this software?


And this is where you run into just bureaucracy, right? You have contracts with companies that go back years, and just a lot of different vendors. And so in this case, I tried to figure out exactly whose algorithms were responsible for Robert Williams getting arrested, and I had to really dig down. And I discovered the police had no idea. You know, they contract out to a company called DataWorks Plus, and DataWorks Plus contracts out to two other companies, called NEC


And Rank One, which actually supply the algorithms. So there's this whole chain of companies that are involved, and there's no standardized testing. There's no one really regulating this. There's just nobody saying which algorithms, you know, pass the test to be used by law enforcement. It's just up to police officers, who for the most part seem to be just testing it in the field to see if it works, if it's identifying the right people. But the really big problem is that these systems have been proven to be biased.


The Michelle Obama Podcast is out now on Spotify. The series brings listeners inside the former first lady's most candid and personal conversations, showing us what's possible when we allow ourselves to be vulnerable, to open up, and to focus on what matters most. Listen free at spotify.com/michelleobama.


This is Sarah Koenig, host of the Serial podcast. I want to tell you about our new show, Nice White Parents. It's reported by Chana Joffe-Walt, who's made some of the best, most thought-provoking, most emotional radio stories I've ever heard.

Back in 2015, Chana wanted to find out what would happen inside one public school in her neighborhood during a sudden influx of white students into a school that had barely had any white students before. And then, not satisfied that she fully understood what she was seeing, she went all the way back to the founding of the school in the 1960s, and then forward again up to the present day.


And eventually Chana realized she could put a name to the unspoken force that kept getting in the way of making the school better.


White parents. I've been waiting so long to tell people about this show, and now I can finally say it: go listen to Nice White Parents. Nice White Parents is made by Serial Productions, a New York Times company. You can find it wherever you get your podcasts.


So help me understand how an algorithm can become biased. Well, the bias tends to come from how the algorithm is trained, and these algorithms tend to be trained by basically feeding them a bunch of images of people. But the problem is that they tended to be trained with non-diverse data sets. So one good example is that many of the algorithms used by law enforcement in the U.S., by government in the U.S., are very good at recognizing white men and not as good at recognizing black people or Asian-Americans.


But if you go to an algorithm from a company in China, where they fed it a lot of images of Asian people, it's really good at recognizing Asian people and not as good at recognizing white men. So you can see the biases that come in from the kind of data that we feed into these systems.
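One way to see how this plays out in practice is a small simulation. This is a hypothetical sketch, not a model of any real system: it assumes a face matcher gives higher similarity scores to pairs of different people from a group it saw less of during training, and shows how a single decision threshold then produces more false matches for that group. The distributions and numbers are invented for the illustration.

```python
import random

random.seed(0)

def impostor_scores(n, mean):
    """Similarity scores for pairs of DIFFERENT people. A model trained
    on non-diverse data confuses under-represented faces more, which we
    model here as a higher mean impostor score."""
    return [min(1.0, max(0.0, random.gauss(mean, 0.1))) for _ in range(n)]

def false_match_rate(scores, threshold):
    """Fraction of different-person pairs wrongly called a match."""
    return sum(s >= threshold for s in scores) / len(scores)

# Hypothetical numbers: the model separates well-represented faces
# cleanly (low impostor scores) but confuses under-represented ones.
well_represented = impostor_scores(10_000, mean=0.30)
under_represented = impostor_scores(10_000, mean=0.55)

threshold = 0.7  # one threshold applied to everyone
print(false_match_rate(well_represented, threshold))   # near zero
print(false_match_rate(under_represented, threshold))  # noticeably higher
```

The point of the sketch is the asymmetry: the same system, run the same way, makes wrong matches far more often for the group its training data covered poorly.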


And is this a widely agreed upon reality that because of these methods, the algorithms used in the US are just worse at identifying faces that aren't white men?


Yeah. A few years ago, an MIT researcher did a study and found that facial recognition algorithms were biased, better able to recognize white men. And shortly after that, NIST, the National Institute of Standards and Technology, decided to run its own study on this, and it found the same thing. It looked at over a hundred different algorithms and found that they were biased. And actually, the two algorithms that were at the heart of this case, Robert Williams's case, were in that study.


So the algorithm that was used by this police department was actually studied by the federal government and was proven to be biased against faces like Robert Williams's. Exactly.


So given these widely understood problems with these algorithms, how can police departments justify continuing to use them?


So police departments are aware of the bias problem, but they feel that facial recognition is just too valuable a tool in their tool set for solving crimes. And their defense is that they never arrest somebody based on facial recognition alone, that facial recognition is only what they call an investigative lead. It doesn't supply probable cause for arrest. So what police are supposed to do is, when they get a facial recognition match, they're supposed to do more investigating. So you could go to the person's, you know, social media account and see if there are other photos of them wearing the same clothes that they were wearing on the day the crime was committed.


Or you can try to get proof that they were in that part of town on the day that the theft occurred. You know, try to get location data, basically find other evidence that this person is the person who committed the crime. But in the case of Robert Williams, they didn't do any other investigating. They went out and they arrested Mr. Williams. But if the police had done their job correctly, if they had looked into his social media accounts, if they had tried to get his location data from his phone records, essentially surveilling him more closely, wouldn't that be its own sort of violation, just because their technology wrongfully identified this man?


He gets more closely watched by the police without his knowledge.


Right. And this is actually what police ask the facial recognition vendors to do. They want more of what you'd call false positives, because they want to have the greatest pool of possible suspects that they can, because they want to find the bad guy. But there's a real toll from that.
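The "investigative lead, not probable cause" rule discussed above could, in principle, be encoded as a simple guard in a case-management system. This is purely a hypothetical sketch of the policy, not any department's actual software; the `Lead` class, its fields, and the example evidence strings are all invented.

```python
from dataclasses import dataclass, field

@dataclass
class Lead:
    """A facial-recognition hit: only a starting point, never proof."""
    suspect_id: str
    match_score: float
    # Evidence gathered OUTSIDE the facial recognition system.
    corroboration: list = field(default_factory=list)

def may_seek_warrant(lead: Lead) -> bool:
    """Policy guard: no matter how high the match score, a warrant
    request requires at least one piece of independent evidence."""
    return len(lead.corroboration) > 0

fr_hit = Lead("suspect_123", match_score=0.99)
print(may_seek_warrant(fr_hit))  # False: a score alone is not probable cause

fr_hit.corroboration.append("witness identification from a proper lineup")
print(may_seek_warrant(fr_hit))  # True: independent evidence now exists
```

The sketch makes the failure in the Williams case concrete: the guard that the policy promises, independent corroboration before arrest, was simply skipped.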


Hmm. You know, as a person who's been reporting on technology for a decade, I just think people trust computers. And even when we know something is flawed, if it's a computer telling us to do it, we just think it's right. And this is what we used to see for a long time when mapping technology was first being developed and it wasn't that great. You know, people would drive into lakes, they would drive over cliffs, because a mapping app said, you're supposed to go straight here.


Right. And even though they could look and see that their life was going to be in danger, they would think, well, this app must know what it's talking about. That's facial recognition now. And when I was reporting the story, all the experts I talked to said this is surely not the first case where an innocent person has been mistakenly arrested because of a bad facial recognition match. But usually people don't find out about it.


Police don't tell people that they were arrested because of facial recognition. You know, usually when they charge them, they'll just say they were identified through investigative means. It's kind of vague: there were clues that pointed at you. In that way, Robert's case was unusual, because there was so little evidence against him. They basically had to tell him that they used facial recognition, you know, to put him there. Right.


They showed him what most people don't get to see, which is this false match between his photo and the photo of the crime. Right. And what's happened since Robert was arrested? So Robert had to hire a lawyer to defend himself, but when he went to the hearing, the prosecutor decided to drop the case. But they dropped it without prejudice, which meant that they could charge him again for the same crime. So as I was reporting the story, you know, I went to the prosecutor's office, I went to the Detroit Police Department, and I said, you know, what happened here?


Did you have any other evidence? It just seems like a clear misfire and misuse of facial recognition. And everyone involved was pretty defensive and said, well, you know, there might be more evidence that proves that Robert Williams did it. But after the story came out, everybody's tune changed dramatically. The prosecutor's office apologized and said that Robert Williams shouldn't have spent any time in jail. The Detroit Police Department said this was a horrible investigation, that the police officers involved just did this all wrong.


This isn't how it's supposed to work.


And they said that Robert Williams would have his information expunged from the system, his mug shot, his DNA. And they personally apologized to the Williams family, though the Williams family told me that no one ever actually called them to personally apologize. But he can no longer be charged in the future for this crime?


That's exactly right. And what about their use of facial recognition software? Has there been any change there?


So one thing the Detroit Police Department said was, well, this was, you know, a case that predates this new policy we have that says we're only supposed to be using facial recognition for violent crimes.


And what do you make of that, why they only use this tool for that? Well, their justification is that when it comes to violent crimes, when it comes to murder, rape, they need to solve these cases, and they'll use any clue they can to do it, including facial recognition. But I think about something that Robert's wife said when they pulled up to our house.


They were already combative on the phone. They were aggressive in the doorway to me. What if he had been argumentative? What if he'd been defensive, if we had not complied? What could that have turned into in our yard? It could have gone a different way. And the recent news has shown us that it definitely could have gone a different way.


Do you feel like there's a shame to this, to being arrested by the police even though you did nothing? It's a little humiliating. It's not something that easily rolls off the tongue, like, oh yeah, and guess what? I got arrested. And what about for Robert himself? What has life been like for him after the arrest? So this has been very embarrassing for him.


It's kind of painful in some ways. He had perfect attendance at work until the day that he was arrested, and his wife had to email his boss and say that they had a family emergency and that he couldn't show up that day. Once he did tell his boss what happened, his boss said, you know, you don't want to tell other people at work; it could be bad for you.


The night he got home, his daughter, his five-year-old, was still awake. Julia was still up.


And I was like, what are you doing up? And right away she was like, you told me you'd be right back. Like, you didn't come right back. So we told her that they made a mistake and it just took longer than we expected.


But she started wanting to play cops and robbers, and she would always pretend that he was the robber who stole something, and she would need to lock him up in the living room.


She told us that she told one of her friends at school, and we weren't sure. Did she tell her teacher? Did she tell her friends what? We're not sure. And we didn't know what to say to people, like, do you just bring it up out of nowhere? Like, oh yeah, in case anyone mentions it, he was arrested, but he didn't do anything.


Has this made you look back to see where you were in October 2018? Yeah, I looked it up.


At the time, I was on my Facebook and Instagram a lot.


He has since looked back and realized that he had posted to Instagram at basically the same time as the shoplifting was occurring. He was driving home from work, and a song came on the radio that his mother loved, the song We Are One by Maze featuring Frankie Beverly, and he posted a video of himself singing along on his way home in the car. So if the cops had looked into his social media, if they had tried to verify whether it was possible that he could have committed this crime, they could have found this video.


Right. If the police had done a real investigation, they would have found out he had an alibi that day. Kashmir, thank you so much. Thank you. We'll be right back. Sal Khan spent more than a decade building Khan Academy, the free remote learning platform.


Now, all of a sudden, it seems custom-made for today. We realized this is one of those moments where you look left, look right, and you're like, I think this is us.


I'm Alicia Burke, host of the podcast That Made All the Difference. I'll be talking to some incredible people like Sal about how they're managing the crisis while helping others through it. Find That Made All the Difference anywhere you get your podcasts. Created by Bank of America. Here's what else you need to know.


Federal unemployment benefits have expired for tens of millions of Americans after Congress failed to reach a deal to renew them last week.


So what do you say to those 30 million Americans who are now without federal unemployment help? I say to them, talk to President Trump. He's the one who is standing in the way of that. We have been for the $600. They have a $200 proposal, which does not meet the needs of America's working families.


And in interviews on Sunday with ABC's This Week, House Speaker Nancy Pelosi blamed Republicans for demanding a drastic cut in the weekly benefit, while Treasury Secretary Steven Mnuchin claimed that the $600 payments risked overpaying unemployed workers.


So you do think it is a disincentive to find a job if you have that extra $600? There's no question that in certain cases we're paying people more to stay home than to work. That's created issues in the entire economy.


And The Times reports that July was a devastating month for the pandemic in the U.S. The country recorded nearly two million new infections, twice as many as in any previous month.


I want to be very clear: what we're seeing today is different from March and April. It is extraordinarily widespread. It's into the rural areas equally with the urban areas. In an interview on Sunday with CNN, Dr. Deborah Birx, a top White House adviser on the pandemic, acknowledged that the United States has failed to contain the virus.


And to everybody who lives in a rural area: you are not immune or protected from this virus. And that's why we keep saying, no matter where you live in America, you need to wear a mask and socially distance and do the personal hygiene.


That's it for The Daily. I'm Michael Barbaro. See you tomorrow.