
Transcript

[00:00:01]

This is MIT Technology Review. I was completely shocked and stunned to be arrested in broad daylight in front of my daughter, in front of my wife, in front of my neighbors. It was one of the most shocking things I ever had happen to me.

[00:00:18]

That's Robert Williams. He's describing what happened outside of his home in an affluent suburb of Detroit called Farmington Hills back in January. The day started like any other Thursday. He got up, went to work, but then things got weird. He's on the phone with Melissa.

[00:00:37]

That's his wife. They're in the middle of a call

[00:00:39]

when he hears the other line. I click over, I'm like, Hello? Robert? And I'm like, who is this? You need to come down and turn yourself in.

[00:00:48]

Who is this? Oh, it was somebody from the Third Precinct.

[00:00:52]

And I need to turn myself in for what? He was like, I can't tell you that. Then I can't come down. Well, if you come down, it'll be much easier on you. You don't want us to come out to your job, do you? At this point, I think it's a prank call. So I'm like, look, man, if you want me, come get me.

[00:01:09]

I'll be at home. Bring a warrant. And I hung up on him.

[00:01:14]

Melissa's at home waiting for her mom and daughter and she goes to greet them when they pull in.

[00:01:19]

And as I was walking through, I looked out and the cop car was outside, and I said, oh, so it wasn't a prank call. There really are people here. They came to the door. I answered it, and they kind of stuck their foot in the door and said, send Robert out. And I said, he's not here. And they said, we just saw him come out of that van. And I said, that was my mom.

[00:01:39]

He's not here.

[00:01:41]

Clearly, something is very wrong, but they don't know what it is.

[00:01:44]

There's got to be a mistaken identity or something. I don't know why Detroit police are at my house. Turns out they were there because facial recognition software had wrongly matched his driver's license photo to security camera footage of a person stealing watches.

[00:02:00]

I pull in the driveway here, pull up in my regular spot, hop out.

[00:02:08]

By the time I close the door, the car's in the driveway blocking me in. They parked this way, across my driveway, as if I'm going to back out or something and try to take off.

[00:02:20]

As soon as he shut the door, they were right on him. And I was in here still because I had the girls, and they were already starting to cuff him.

[00:02:28]

By the time we got out there, he told his daughter to go back inside, that the police were making a mistake and he'd be right back. But he wasn't right back. The police took him into custody and he spent the night in jail. He still had no idea what was going on and he was angry. But he says as a black man, he had to consider what could happen if he let that show. So he stayed calm and he waited.

[00:02:53]

The next morning, officers showed him some photos of a man stealing watches, except those photos weren't of him. They were of someone else. So I look, I said, no, that's not me. He turns another paper over. He says, I guess that's not you either. I picked that paper up and held it next to my face and I said, this is not me. Like, I hope you all don't think all black people look alike.

[00:03:20]

And then he says, the computer said it was you. He really just brought the picture with him. He could have looked it up and down. He could have left and said, oh, my bad, I didn't mean to bother you. What's unusual about this story is not that face ID was used to find a suspect.

[00:03:36]

What's unusual is that Robert Williams was told, because police aren't required to disclose it. Facial recognition isn't regulated: not how it's used by law enforcement, not how it's used by employers. I'm Jennifer Strong, and this is episode one of a new series exploring what happens when everything around us gets automated. We're kicking things off with a four-part look at facial recognition and policing. We'll meet people building this technology, fighting against it, and trying to regulate how it gets used.

[00:04:15]

Welcome to the Age of With. Predict what's possible in the Age of With, then translate insight into trustworthy performance. Deloitte brings together end-to-end offerings with domain and industry insight to drive stronger outcomes through human and machine collaboration. Let's go. In Machines We Trust. I'm listening to a podcast about the automation of everything.

[00:04:48]

You have reached your destination.

[00:04:53]

Think of it this way. Facial recognition is being used as a search engine for criminals, and your face is the search term. By 2016, the faces of half of all U.S. adults were believed to be stored inside systems police use to name suspects. Some refer to it as the perpetual lineup. But the nation may be at an inflection point, both in its relationship with policing and with this technology. In June, tech giants Amazon and Microsoft put a pause on selling their face ID products to law enforcement.

[00:05:26]

IBM stopped selling it altogether. Then New York City passed a bill providing oversight of all surveillance technologies, despite opposition from the NYPD. And after the wrongful arrest of Robert Williams came to light, Detroit police say they'll only use face ID to investigate violent crimes, and they'll do it with still photos, because those are more likely to produce an accurate match. But is it enough? OK, so at the moment, we're in East London, in a place called Stratford. Peter Fussey is a criminologist at the University of Essex.

[00:06:03]

I took a walk with him back in February, before the pandemic. It's historically been an area of a lot of deprivation, which had an awful lot of investment just before the 2012 Olympics, which were staged here.

[00:06:15]

It's a spot where the London police tested cameras that match faces with identities in real time. You are part of a team working on a national surveillance strategy, is that right?

[00:06:25]

So we're part of a research project. We look at emerging technology and the human rights implications. Separate to that, I also work with the surveillance camera regulator in the U.K., and I lead part of his strategy on human rights.

[00:06:39]

He's studied technological surveillance for more than 20 years, starting with closed-circuit television, or CCTV, cameras. The very familiar on-street CCTV. I was always surprised by how little people seemed concerned about it. I'd be making a case for why we should regulate, and it was largely met with indifference. Facial recognition seems very different. It's caught the public imagination; it's in the media on a daily basis.

[00:07:09]

Well, your face can tell people a lot more than you might think. In a new world of facial recognition technology, your every move can be tracked.

[00:07:16]

Shoppers don't know it, but a computer's scanning their faces and comparing their features to those of known shoplifters. It's a horrible invasion of privacy.

[00:07:24]

This technology is being installed with zero public oversight and accountability.

[00:07:29]

We're being bullied into taking our picture in order to get our key.

[00:07:32]

Even pop star Taylor Swift secretly deployed the technology to root out stalkers.

[00:07:39]

But while the public outcry has led some places to ban the technology, including tech hub cities like San Francisco and Cambridge, Massachusetts, where MIT is based, London's police tested a highly aggressive version of it in 10 different public spaces.

[00:07:54]

What you see in the U.K. is live facial recognition, which means that there is a database of individuals the police are interested in. Then, as the public walks past the camera, each of those people is scanned and then matched against a database. Here you are enacting surveillance before you know of any offense. It's one thing for a police department to hold up a photo of someone to try to identify them in a system. It is something very different to have live identification happening in real time.
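
What Fussey is describing, every passerby scanned against a watchlist, boils down to a comparison like the minimal Python sketch below. It assumes faces have already been reduced to numeric embeddings by some detection model; the 128-dimension vectors, the cosine measure, and the 0.7 alert threshold are illustrative stand-ins, not the parameters of any real police system.

    import numpy as np

    def cosine_similarity(a, b):
        # Similarity between two face embeddings, roughly in [-1, 1].
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_against_watchlist(face_embedding, watchlist, threshold=0.7):
        # Return the watchlist identity whose embedding is most similar,
        # but only if it clears the alert threshold; otherwise no alert.
        best_name, best_score = None, threshold
        for name, listed_embedding in watchlist.items():
            score = cosine_similarity(face_embedding, listed_embedding)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

    # Stand-in embeddings: a real system would produce these with a face
    # recognition model, not with a random number generator.
    rng = np.random.default_rng(0)
    watchlist = {"person_of_interest_1": rng.normal(size=128),
                 "person_of_interest_2": rng.normal(size=128)}
    passerby = rng.normal(size=128)
    print(match_against_watchlist(passerby, watchlist))  # most likely None: no alert

The arithmetic itself is simple; the point of contention in this conversation is running that comparison continuously on everyone who happens to walk past the camera.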

[00:08:25]

Yeah, that's exactly right. And I think it's a really important part of the debate that often gets lost. The difference is that existing cameras, or low-tech analog human surveillance, don't involve biometric data, the collection of which is universally seen as an intrusive practice.

[00:08:43]

And that special category of data has to be safely handled and stored. And as he points out, no human can possibly process the volume that's being captured by these systems. That raises some serious questions about how proportionate that is, for instance, how necessary it is to biometrically scan tens of thousands of people just because you're interested in talking to somebody. Now, if it's a known killer on the loose, or, the example that's always given, a terrorist attack about to happen, then that's different.

[00:09:12]

You can make a much stronger necessity and proportionality argument around that, but less so if it's just somebody you're interested in talking to about an incident of anti-social behaviour or something like that. Well, the other question: you say humans can't process that information, but it's also unclear whether the technology can yet, either. What happens if you're falsely identified? If the camera says that you are a suspect, somebody on the watchlist, how many times do we know

[00:09:40]

it's correct? In our research, we found it was correct eight times out of 42.

[00:09:45]

So, over six full days sitting in police vans: eight times. He did the only independent review of these trials, and he found it was accurate less than 20 percent of the time, eight correct matches out of 42 alerts, or roughly 19 percent.

[00:09:55]

It might work brilliantly in lab conditions. But, you know, outside, in an environment like the one we're in now, the light is fading, winter light. Much of the intelligence picture for a lot of the offences is linked to the night-time economy. So facial recognition works less well in low light, and there are all sorts of issues around that.

[00:10:14]

It's also less effective across different demographics. So not just ethnicity or race, but also gender, which then falls into a whole issue around transgender rights, and age as well.

[00:10:26]

So, for example, kids' faces are harder for the technology to read. That's because they're still developing. Why that's important is, if the police are using a technology which is not as effective for different groups, then unless they are aware of those limitations, unless they can somehow mitigate against them, it's impossible to say that they are employing a technology that is compatible with human rights. How do you align a human rights strategy and a surveillance strategy?

[00:10:56]

We often think about things like security as being oppositional to human rights. But, of course, the first responsibility of states under the UN Declaration of Human Rights is to provide for the safety and security of their citizens. So there's often this framing of liberty versus security, which myself and my colleagues find quite unhelpful. You know, you can have both and you can have neither.

[00:11:19]

We make our way to another spot he wants to show me. Just at the end of this bridge, you can see a pole with some cameras on it. If you're walking along this bridge towards those cameras, you would get to a point where there was a sign saying that facial recognition was in operation. Now, if you wanted to continue your journey, you would have to walk past those cameras. However, the police were saying this was a trial.

[00:11:41]

So if you didn't want to be part of that trial, you had to turn around. And to get to the same point beyond those cameras would take about a 20-minute detour.

[00:11:50]

OK, so this part is really important here. There's no real meaningful consent. If you withdraw consent because you don't want to be on the camera, then you should be able to withdraw consent without penalty. Otherwise, it's not consent.

[00:12:04]

Something else: when you walk down the street, are you aware of the times you've crossed from a public sidewalk onto concrete that's owned by a business? Did you know your rights to privacy might be different in just a few steps?

[00:12:16]

So here, where we're standing outside the Westfield shopping mall, is private space. But you can see there's lots of people around here; it has a sense of a public space. What happens, though, is if you walk 30 meters to our left, you're in a public area and all the cameras are owned by public authorities. And if you walk 30 meters to our right, they're owned by private companies. Now, what about the one over your head?

[00:12:42]

Which one? That one? So, yeah, this is owned by a private company.

[00:12:47]

The difference comes back to a simple point. Public authorities are meant to be accountable to the public. And if private companies grab an image of your face, they can do just about whatever they want with it. It's also worth mentioning what happened to a man who didn't want to participate in one of those police trials.

[00:13:03]

So he covered his face and the BBC captured it on tape.

[00:13:06]

Police stopped him. They photographed him anyway.

[00:13:10]

An argument followed. The police said this was disorderly behavior, so they gave him a fine. We'll be back in a moment, right after this.

[00:13:30]

At Deloitte, we believe the Age of With is upon us. Look at what's happening around us: shared data, digital assistants, cloud platforms, connected devices. It's not about people versus AI, it's about the potential for people to collaborate with AI to discover and solve complex problems. But for organizations to enable innovation and break boundaries, they must harness this power responsibly. Deloitte's Trustworthy AI framework is a common language for leaders and organizations to articulate the responsible use of AI across internal and external stakeholders.

[00:14:02]

Prepare to turn possibility into performance. To take a closer look at the Deloitte Trustworthy AI framework, visit Deloitte dot com slash US Trust AI. Face ID works by mapping out the unique set of measurements between your features, like the spacing between your eyes, the length of your nose, and the curvature of your lips. The earliest systems were invented in the 1960s, but for decades the technology wasn't really useful. Then in the early 2000s, local law enforcement in Florida created a statewide face recognition program.
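
As a rough illustration of that "measurements between your features" idea, here is a toy Python sketch that turns a handful of labeled landmark points into a vector of pairwise distances and compares two faces by how close those vectors are. The landmark coordinates are invented for the example, and modern systems learn their features from data rather than using hand-picked distances.

    import itertools
    import math

    def measurement_vector(landmarks):
        # Pairwise distances between labeled landmark points, in a fixed order.
        names = sorted(landmarks)
        return [math.dist(landmarks[a], landmarks[b])
                for a, b in itertools.combinations(names, 2)]

    def face_distance(face_a, face_b):
        # Euclidean distance between two faces' measurement vectors.
        va, vb = measurement_vector(face_a), measurement_vector(face_b)
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(va, vb)))

    # Made-up landmark positions (x, y) for two photos of a similar face.
    face_a = {"left_eye": (30, 40), "right_eye": (70, 40), "nose_tip": (50, 60),
              "mouth_left": (38, 80), "mouth_right": (62, 80)}
    face_b = {"left_eye": (31, 41), "right_eye": (69, 39), "nose_tip": (50, 61),
              "mouth_left": (39, 80), "mouth_right": (61, 79)}
    print(face_distance(face_a, face_b))  # a small number means similar geometry

The same idea, a compact numeric signature of a face plus a distance between signatures, underlies the much larger learned systems used in airports and by police today.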

[00:14:41]

A decade after that, Facebook invented a new way to start recognizing and auto-tagging people in photos, rapidly improving face recognition to what we have today. Now it's widely used in airports and by police, but there's little transparency about what systems are used or how.

[00:14:59]

Any time surveillance gets legitimized, then it is open to be expanded over time.

[00:15:05]

Hamid Khan is an activist fighting against the use of surveillance and many other technological tools by Los Angeles police. Surveillance, he says, has historically been used to trace and track and monitor and stalk particular communities: communities who are poor, black and brown communities, communities whose bodies would be considered suspect. And it's a process of social control. Khan created the Stop LAPD Spying Coalition, a group he describes as fiercely abolitionist. He doesn't think restricting the way police use face ID will work.

[00:15:40]

And so during what could best be described as a tsunami of adoption, with debate mostly focused on best practices, his focus is on getting these technologies banned.

[00:15:50]

Algorithms have no place in policing. I think it's crucial that we understand that, because there are lives at stake. And he's been successful.

[00:15:59]

Several data-driven policing and predictive policing programs in Los Angeles ended after public and legal pressure from his group. To Khan, part of how we got to this moment is by changing the way we define and police suspicious activity.

[00:16:12]

The definition is that it's observed behavior reasonably indicative of pre-operational planning of criminal and/or terrorist activity. So you're observing somebody's behavior. Not a fact, but a concern that a person may be thinking of doing something wrong, right? So this is now going into speculative and hunch-based policing, and it's real.

[00:16:36]

What we do know, thanks to academic and government research, is facial recognition works best on white men.

[00:16:43]

Hi, camera. I've got a face. Can you see my face? No-glasses face?

[00:16:50]

That's MIT researcher Joy Buolamwini giving a TED talk.

[00:16:54]

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. So how this works: you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

[00:17:23]

In 2018, she led a groundbreaking study showing that commercial face recognition systems repeatedly failed to classify dark-skinned women like her. A year later, a major report on face ID from a federal agency called NIST, the National Institute of Standards and Technology, found some face ID algorithms were up to 100 times more likely to falsely match photos of people of color. But even if these systems can be engineered to reach perfect accuracy, they can be used in dangerous ways.
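
To show what a disparity like that means in practice, here is a small Python sketch of how an audit can compare false match rates across demographic groups. The counts below are invented purely to demonstrate the calculation; they are not NIST's numbers.

    def false_match_rate(false_matches, impostor_comparisons):
        # Fraction of comparisons between different people that the system
        # wrongly declared to be a match.
        return false_matches / impostor_comparisons

    # Hypothetical audit results, with the same number of comparisons per group.
    results = {
        "group_a": {"false_matches": 5,   "impostor_comparisons": 100_000},
        "group_b": {"false_matches": 250, "impostor_comparisons": 100_000},
    }

    rates = {group: false_match_rate(r["false_matches"], r["impostor_comparisons"])
             for group, r in results.items()}
    print(rates)
    print(f"group_b is falsely matched {rates['group_b'] / rates['group_a']:.0f}x "
          "as often as group_a in this made-up example")

The same per-group comparison, run over millions of photo pairs, is how audits arrive at figures like "up to 100 times more likely."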

[00:17:54]

And these problems go deeper than just skewed data and imperfect math.

[00:17:59]

Technology is not operating by itself. From the design to the production to the deployment, there is constantly bias built in. And it's not just the biases of the people themselves. That is only one part of it. It's the inherent bias within the system. Next episode: would it be surprising that photos of you, including some you've maybe never seen, are used by companies to build facial recognition systems, from places like Twitter?

[00:18:31]

Do you remember this photo at all?

[00:18:34]

No, I didn't know that was taken. And I look very, very serious in that one. Yeah.

[00:18:42]

In part two, we meet the founder of one of the most controversial companies working in this space: Clearview AI chief executive Hoan Ton-That. This episode was reported and produced by me, Tate Ryan-Mosley, and Emma Cillekens. We had help from Karen Hao and Benji Rosen. We're edited by Michael Reilly and Gideon Lichfield. Our technical director is Jacob Gorski. Special thanks to Kyle Thomas Hemingway, Eric Mongeon, and the ACLU for sharing their recordings of Robert Williams. Thanks for listening.

[00:19:16]

I'm Jennifer Strong.

[00:19:21]

This is MIT Technology Review.