
Transcript

[00:00:02]

This is MIT Technology Review. All right, so we're out taking a drive on one of New York City's busiest bridges. It's called the RFK. And the reason we're doing this is because the transit authority, the MTA, has installed live facial recognition. Actually, we're about to go under it right now. Do you see that? The camera's pointed right at our faces. They've now put this all over the city. It's at a number of bridges and tunnels.

[00:00:30]

But the reason we're on this one is because this is where it all started. And what it does, at least in theory, what it's supposed to do, is read our faces through our windshield. And what's crazy about this is, well, what's crazy about it is that nobody knew there was a test. But what's also crazy about it is when the results of that test were leaked to the press last year. You want to guess how many faces were captured during the test period?

[00:00:56]

What do you think? This is Emma Cillekens, by the way, my producer. I don't know, like thousands maybe? Maybe millions? New York City, maybe millions. No, they got none. Zero percent. But they moved forward and they proceeded with it anyway.

[00:01:12]

The cashless tolling, which is really not about tolling, it's really about putting in an electronic system that we have never had before.

[00:01:23]

That's New York Governor Andrew Cuomo speaking at an event in 2018.

[00:01:28]

What it is doing is it's installing a system that reads every license plate that comes through that crossing and reports to the police car that is stationed right there within five seconds.

[00:01:45]

But this is not some tech event, nor is it about policing. It's celebrating the end of repairs to a tunnel connecting Brooklyn and Manhattan. You see, the city's subways and tunnels flooded with saltwater in the aftermath of Hurricane Sandy. All the electronics and wiring had to be replaced, and it gave them an opportunity to try something new.

[00:02:04]

We're now moving to facial recognition technology, which takes you to a whole new level where it can see the face of the person in the car and run that against databases.

[00:02:20]

And they're experimenting with something even more cutting edge than reading faces or license plates.

[00:02:26]

They're attempting to read people's ears because many times a person will turn their head when they see a security camera. So they're now experimenting with technology that just identifies a person by the ear.

[00:02:39]

Believe it or not, I'm Jennifer Strong and this is part three of our series on police and facial recognition. This episode, we take a closer look at how the technology is being used in different cities and meet some police chiefs helping to make those decisions.

[00:03:02]

Welcome to the Age of With. Predict what's possible in the Age of With, then translate insight into trustworthy performance. Deloitte brings together end-to-end offerings with domain and industry insight to drive stronger outcomes through human and machine collaboration.

[00:03:27]

Let's go. In Machines We Trust. I'm listening to a podcast about the automation of everything.

[00:03:35]

You have reached your destination. Roads are not the only place New York City's transit authority may be experimenting with face ID, but without rules for how it gets used, we tend to find out about it after the fact and often by accident, like when a New York Times reporter called out on Twitter something she'd seen in the subway.

[00:03:59]

So this was a monitor set up in the Times Square subway station that had these yellow boxes around people's faces, indicating that it was using, at a minimum, face detection technology, if not facial recognition.

[00:04:13]

Albert Fox Cahn is the founder of the Surveillance Technology Oversight Project, otherwise known as STOP. He reached out to the transit agency, the MTA, after seeing that tweet. But he says they didn't give much of a response.

[00:04:26]

So we sued them. And recently we got a favorable decision from a New York state judge, who said that the MTA wrongly withheld information about those monitors without providing us any explanation of how they were being used, whether there was facial recognition involved, and what the purpose of setting them up in the first place was. Because the MTA's justification for using these monitors was: don't worry, these aren't facial recognition. We just want to scare the public into thinking we had facial recognition so that they wouldn't skip their fare.

[00:05:04]

But scare tactics aren't the only thing he's concerned about. He's been arguing with the NYPD about how they use this technology for a number of years.

[00:05:12]

You have this incredibly powerful technology, but it's already prone to certain types of errors and certain types of biases. And the NYPD is going through and using it in a way that only makes that risk more pronounced. You have them feeding in doppelganger photos, and we have no idea how many times it's been done.

[00:05:34]

There's even a name for this: the celebrity comparison. A few years ago, investigators were searching for a suspect caught on tape stealing beer from a drugstore in New York City. The camera captured his face, but not well enough for the facial recognition system to return any matches. A detective noticed the suspect looked a little bit like the actor Woody Harrelson. And after running Harrelson's picture through the system, they found a match. We don't know how often this happens, but we know it's more than once.
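
The episode doesn't tell us which vendor or algorithm the NYPD uses, so as a rough illustration only, here is how a generic face-matching pipeline ranks gallery candidates against a probe photo. It's a minimal sketch built on the open-source face_recognition library; every file name and person below is hypothetical.

```python
# Rough sketch of a generic face-matching pipeline, NOT the NYPD's
# actual system: encode a gallery of known faces, encode a probe photo,
# and rank the gallery by embedding distance. Built on the open-source
# face_recognition library; all file names and people are hypothetical.
import face_recognition

gallery_files = {"person_a": "mugshot_a.jpg", "person_b": "mugshot_b.jpg"}
gallery_encodings = {}
for name, path in gallery_files.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # keep only images where a face was actually found
        gallery_encodings[name] = encodings[0]

# In the "celebrity comparison," the probe is the lookalike's picture,
# not the unusable surveillance still.
probe = face_recognition.load_image_file("probe.jpg")
probe_encodings = face_recognition.face_encodings(probe)
if not probe_encodings:
    raise SystemExit("No face detected in the probe photo.")

names = list(gallery_encodings)
distances = face_recognition.face_distance(
    [gallery_encodings[n] for n in names], probe_encodings[0]
)
# Smaller distance = more similar; candidates come back ranked either way.
for name, dist in sorted(zip(names, distances), key=lambda t: t[1]):
    print(f"{name}: distance {dist:.3f}")
```

The sketch's point is that a system like this will rank candidates for whatever face it's handed, whether that's the actual suspect or a lookalike.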

[00:06:03]

In another example from the NYPD, a New York Knicks player stands in for the suspect.

[00:06:08]

And so using a celebrity's photo, to me, that's really problematic, that you're appropriating someone's image for this sort of purpose. Also, there's no evidence that this is accurate.

[00:06:21]

Another practice he finds troubling: police manually Photoshopping images. You see, face recognition often fails to recognize faces with closed eyes. So some police departments will paste open eyes onto those photos to get results. They also alter their photos in other ways.

[00:06:37]

If the mouth is open, you Photoshop it closed. If part of the face is obscured, you'll see cases where they go on Google Images and copy and paste, like, a part of another image to try to create a composite that the facial recognition algorithm will identify as a viable human face. Because even though these tools can be incredibly powerful, they are also so fragile. And even having one eye closed can be enough to make it so that a facial recognition algorithm doesn't actually see a human face that it can match.
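
The fragility Cahn describes is easy to demonstrate: detection is a hard gate, and if no face is found there is nothing to encode or search, which is the failure that drives the Photoshopping described above. A minimal sketch under the same assumptions as before (open-source face_recognition library, hypothetical file name):

```python
# Minimal sketch of the "hard gate" Cahn describes: if the detector
# finds no face (one eye closed, mouth open, partial occlusion), there
# is no encoding to search, so the system returns nothing at all. Uses
# the open-source face_recognition library; the file name is hypothetical.
import face_recognition

image = face_recognition.load_image_file("surveillance_still.jpg")
boxes = face_recognition.face_locations(image)

if not boxes:
    print("No face detected -- nothing to run against the database.")
else:
    encodings = face_recognition.face_encodings(image, known_face_locations=boxes)
    print(f"Detected {len(boxes)} face(s); produced {len(encodings)} encoding(s).")
```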

[00:07:15]

What's more, we don't really even know what tools police have in some places, including New York.

[00:07:21]

But even our elected officials, our city council, don't know what tools they're using. And it's a threat to democracy itself when you have the police operating without any oversight. But it also makes it impossible to have the sort of public pushback we need to start banning these tools when we don't know that they are being used in the first place.

[00:07:42]

But this is changing. This summer, the city passed the Public Oversight of Surveillance Technology Act, or the POST Act. It requires police to disclose basic information about what surveillance technologies they have, how they work, how they're used, and how often. It covers tech ranging from cell phone location trackers, automated license plate readers, body cameras, and social media monitoring software to, of course, facial recognition. But it's not a new bill. It had been sitting around for the last three years.

[00:08:12]

None of the sweeping reforms we've seen in New York would have been possible without the incredible protests and uprisings we've seen around the city, around the country. It's fundamentally changed the debate over policing in the city.

[00:08:27]

The ACLU is another player working on the same puzzle. It's a nonprofit focused on individual rights and liberties. They're outspoken about the need for regulation of face ID, especially in the case of Robert Williams, the man we met in episode one. I didn't know that they used any type of facial recognition or anything like that until talking with the detectives, who showed me that, and that's what they used to apprehend me. We didn't think this could happen. We didn't think it was a thing.

[00:08:52]

Even with me following the facial recognition news and how it was being used, I didn't ever expect the police to show up at our doorstep and arrest my husband. So I just feel like other people should know that it can happen. And it did happen and it shouldn't happen.

[00:09:08]

He and his wife, Melissa, who you just heard, are represented by Phil Mayor, a senior staff attorney at the ACLU of Michigan.

[00:09:15]

Mr. Williams's defense attorney talks about how this is the first time in her years of representing criminal defendants that she's actually learned that her client was identified through facial recognition technology. And again, that didn't come out in court. It came out because the police accidentally said something. So I just am as confident as I can be that there are people who are in jail today, convicted of crimes that they took plea bargains to, all because a computer made the same kind of mistake it did in Mr. Williams's case.

[00:09:44]

But real harm can still be done by a false match going public, even when nobody gets arrested. On the morning of April 25th, in the midst of finals season, I woke up in my dorm room to 35 missed calls, all frantically informing me that I had been falsely identified as one of the terrorists involved in the recent Easter attacks in my beloved motherland, Sri Lanka.

[00:10:08]

Amara Majeed was 22 years old and a senior at Brown University when her face appeared on a poster with the name of a different woman accused of these attacks. She'd been falsely identified by an algorithm.

[00:10:20]

There are no words to describe the pain of being associated with such heinous attacks on my own native homeland and people. The pictures and posts falsely implicating me have compromised my family's peace of mind and endangered our extended family's lives.

[00:10:38]

Sri Lankan authorities later apologized for the mistake, but not before she was harassed, trolled, and threatened on social media. It all boils down to this: policy.

[00:10:47]

Transparency and oversight of facial ID differs radically from one place to the next. And even when there are rules in place to prevent these types of things from happening, they're only as good as they are followed. When we come back, we'll meet some police departments working on that.

[00:11:09]

At Deloitte, we believe the Age of With is upon us. What's happening around us? Shared data, digital assistants, cloud platforms, connected devices. It's not about people versus A.I. It's about the potential for people to collaborate with A.I. to discover and solve complex problems. But for organizations to enable innovation and break boundaries, they must harness this power responsibly. Deloitte's Trustworthy A.I. framework is a common language for leaders and organizations to articulate the responsible use of A.I. across internal and external stakeholders. Prepare to turn possibility into performance.

[00:11:43]

To take a closer look at the Deloitte Trustworthy A.I. framework, visit Deloitte.com slash U.S. Trustworthy A.I.

The NYPD released its face ID policy back in March, after nearly a decade of public pressure. But other police departments have been much more willing to engage the public on this from the start, including in the land of casinos, Las Vegas, where visitors and those who live there are no strangers to being surveilled because of the huge sums of money that pass through those gaming room floors.

[00:12:18]

Police Captain Dori Koren is the commanding officer who oversees the Las Vegas Strip, and he tells me about a high-tech surveillance room that sounds like it could be featured on one of those cop shows. Because of coronavirus, I can't visit the command center in person. So he describes it for me.

[00:12:34]

It is a little bit Hollywoodish, and we did that on purpose, in terms of how we wanted it to feel. So imagine walking into a large room, and on the front wall you have this massive display, all kinds of camera feeds, like a surveillance room, but a little bit more high tech, a little bit bigger, a little bit more advanced. Everything in this room is connected.

[00:12:54]

Alerts ping when audio sensors detect a shooting, plus lots of other things officers might be interested in knowing about. And all of these identity technologies show up on a big gridded map, so people and events can be placed at a specific location in near real time. And it kind of speaks to the city as a whole. Las Vegas is one of the country's pioneering smart cities, with sensors embedded just about everywhere, all the way down to trash cans that can smell what's inside.

[00:13:23]

And Captain Koren says face ID has revolutionized policing. Instead of just responding to crimes, he says they can get in front of them. Facial recognition for us has proven absolutely instrumental. Instrumental in saving lives, instrumental in preventing violent crime.

[00:13:39]

He says it allows officers to identify a pattern of criminal activity in real time. Figure out that there's a second robbery that just happened at a convenience store, and perhaps the suspect who committed the robbery with a firearm has the same color shirt as in the robbery that just happened 10 minutes ago, or the same vehicle, or the same behavior. And they try to pick up on these patterns, and then they send that out in real time so that way they can prevent the third robbery or fourth robbery from happening.

[00:14:10]

And they're happy to talk about it. We're firm believers in being transparent. I mean, at the end of the day, police serve the community that they come from. So we want to make sure the innovative and advanced things that we are doing, particularly when it comes to deploying technologies for fighting crime, are accepted by the community. This kind of openness is rare, and not just with law enforcement. Private companies use facial recognition too, often without telling anyone.

[00:14:36]

A recent example: face ID cameras were quietly installed at hundreds of Rite Aid pharmacies, largely in lower-income and non-white parts of Los Angeles and New York, according to Reuters. And casinos have been using some sort of facial identification for decades for fraud prevention, enhanced security, and even to recognize gamblers at the tables.

[00:14:56]

The private sector has some of the more advanced on-edge facial recognition platforms. These are platforms that are recording and analyzing people's faces in real time as they cross the camera view. We don't want to leave just the private companies, and the general public, which also will include all your criminals, to have the best and most advanced technologies, and to leave law enforcement with archaic tools that aren't able to do the job in the 21st century.

[00:15:22]

But that doesn't mean he thinks police should have free rein with all types of the technology.

[00:15:27]

We do not use facial recognition on edge, which basically means live recording of the footage on that camera to scan everybody's face that comes across that camera angle. We don't do that. I don't know if law enforcement's ready to use that. And certainly I don't think that law enforcement should, unless their community supports the use of that type of very advanced technology. They also don't search social media photos for suspects. They only run searches on someone who has committed a crime.

[00:15:54]

And if they get a match, they limit its value, meaning you can't just go arrest that person based on that match. But learning all of this meant I was in for a surprise when I asked, do you ever alter the photos so that they work better?

[00:16:08]

That aspect of altering the photos has gotten so much negative attention, as if the police were doing something wrong. And I don't think that the public generally understands why you would alter a photo. So the answer to your question is yes, we do alter photos as part of that facial recognition exam. If anything, we should be arguing for that, not against it.

[00:16:28]

He says in some cases, if they didn't do that, their accuracy would actually go down. Changing the photo, by making it a little lighter or a little darker, or changing the angle, gives us the best chance to be able to confirm the results, to be able to test the algorithm. We don't want to go and pursue the wrong lead or the wrong individual. And so whatever we have to do to modify the photo to be able to get the most accurate results.

[00:16:51]

And like I said, for us, we use a variety of other checks and balances to then determine that that is truly the likely candidate. Koren did go on to say they don't merge other people's faces with the input photo. So no pasting on open eyes, for example. But it raises the question: how exactly are we measuring accuracy, and who gets to decide when and how this practice of modifying photos is fair play?

[00:17:12]

But safety is not just physical safety. It's also doing so with a balance to privacy, civil rights, civil liberties. And I think that there's a right balance. There's a way to do that. But the conversation has to be open, and people have to be open on both sides to have that conversation. So the future for policing can become much better with these technologies, so long as we do it right.
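
Koren doesn't name the software his unit uses, but the adjustment he described a moment ago, making a probe photo a little lighter or a little darker, amounts to ordinary image preprocessing. A minimal sketch of that idea, assuming Pillow plus the same open-source face_recognition library as above; the file name and brightness factors are invented for illustration:

```python
# Sketch of the preprocessing Koren describes: generate lighter and
# darker variants of a probe photo and keep whichever ones a face
# detector can actually read. Pillow + the open-source face_recognition
# library; "probe.jpg" and the factors are invented for this example.
import numpy as np
import face_recognition
from PIL import Image, ImageEnhance

original = Image.open("probe.jpg").convert("RGB")

for factor in (0.6, 0.8, 1.0, 1.2, 1.4):  # <1.0 = darker, >1.0 = lighter
    variant = ImageEnhance.Brightness(original).enhance(factor)
    boxes = face_recognition.face_locations(np.array(variant))
    print(f"brightness x{factor}: {len(boxes)} face(s) detected")
```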

[00:17:37]

It cannot be overstated just how much things like police department policies, tools, community relationships, and public oversight vary from place to place. So this balance of civil liberties and policing has to be struck not just in Las Vegas, but in every town in the country. Still, we spoke to other departments, and there are some common themes. Armando Aguilar is Miami's assistant chief of police, and he oversees the Criminal Investigations Division.

[00:18:04]

This is an invaluable tool for law enforcement, but it's certainly very dangerous. And if you think about it, many of the other tools that law enforcement officers have in the wrong hands are also very dangerous. And so we wanted to make sure that we not only use the technology, but that we use the technology responsibly and that we did so in a way that the people that we served were comfortable with.

[00:18:26]

They've been using facial recognition in some form since 2013. We started out with, and continue to use, a program called FACES, which is operated by the Pinellas County Sheriff's Office. It's one of the counties in Florida, and it's a shared database with all the counties throughout the state. And so we moved in late 2019 to a product called Clearview AI.

[00:18:51]

FACES runs off a government database of photos, like Las Vegas, and Clearview runs off of billions of photos scraped from the web.

[00:19:00]

The technology moves faster than policy. There was a time when the technology was in use here without a guiding policy. And so once I stepped into my current role, I thought it was very important to set those parameters and just make sure that we were responsible in our use of the technology. Coronavirus threw them a curveball.

[00:19:20]

And like everything else, their town halls went virtual, such as this one on Facebook Live.

[00:19:25]

What data does Miami PD send, or what gets tracked, by Clearview?

[00:19:31]

While crafting their policy, Aguilar says he also met with the ACLU.

[00:19:35]

They respectfully started the meeting by saying that there is nothing that we could do that would satisfy them, other than saying we're going to scrap the program altogether. But knowing that that likely wouldn't happen, they had about seven or eight very valid concerns that they brought to our attention. And we actually agreed with them, and we incorporated each one of those into our policy.

[00:19:57]

They also put limits on who can use it. So what we did also was, in order to make sure that we would limit the opportunities for abuse, we limited the number of users. So anybody that needs to have somebody's picture run through Clearview or through FACES needs to send a request to our real-time crime center.

[00:20:18]

He made sure that all trial Clearview accounts offered to police in Miami were canceled and he shrunk the number of people who have access.

[00:20:26]

It's a lot easier to control when you have a dozen people working in the same unit, versus people that are spread out throughout the entire eighteen-hundred-member police department using the program. With these new guidelines, even if a photo finds a match, it doesn't give probable cause to go make an arrest.

[00:20:43]

So now we have this match. But can we put that person in a photographic lineup and also have an eyewitness identify them? Is there fingerprint evidence? Is there DNA evidence? Can the investigating detective or officer make an identification himself after examining the video and after examining the photograph that was uploaded into the system? So it's not just automatically, hey, this person came up as a ninety-nine percent match, let's go get them. Just because somebody calls in a tip and says, hey, I think that's my neighbor Bobby,

[00:21:12]

we don't go and arrest Bobby.
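
No department shared its actual checklist, but the rule Aguilar describes can be written down precisely: a facial recognition match, however confident, is only a lead, and action requires independent corroboration. A hypothetical sketch, with invented field names and threshold:

```python
# Hypothetical sketch of the decision rule Aguilar describes: a facial
# recognition match, no matter how confident, is only a lead, and an
# arrest requires independent corroboration. Field names and the
# threshold are invented for illustration.
from dataclasses import dataclass

@dataclass
class Lead:
    match_confidence: float      # e.g. 0.99 -- "a ninety-nine percent match"
    lineup_identification: bool  # an eyewitness picked them from a lineup
    fingerprint_match: bool
    dna_match: bool

def may_pursue(lead: Lead) -> bool:
    """A match alone never clears the bar, however high its confidence."""
    corroborated = (
        lead.lineup_identification
        or lead.fingerprint_match
        or lead.dna_match
    )
    return lead.match_confidence >= 0.9 and corroborated

print(may_pursue(Lead(0.99, False, False, False)))  # False: match alone
print(may_pursue(Lead(0.92, True, False, False)))   # True: corroborated
```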

[00:21:14]

And once again, I wanted to know if they ever manipulate their photos before they run them through the software to up the odds of getting a match.

[00:21:21]

No, we do not in any way manipulate our probe photograph, which is, again, the photograph that we upload into the system. We use whatever is available, and we either come up with a match or we don't.

[00:21:34]

They also don't sub in celebrities to help find a suspect, nor monitor people in real time. And they also thought about how face ID might infringe on constitutional rights.

[00:21:44]

Maybe if one day we have protests outside of the police station, which became very real in the last few months, we don't want to use this technology to just go identify protesters or protest organizers. People want to live life out loud and post every waking moment of their life on social media, and also ask for privacy. And so certainly many of us do set our personal social media accounts to private settings. And so I do want to make sure that everybody is aware that this system in no way breaks into your private settings.

[00:22:19]

So this only searches the internet and social media pages for those images that are available for anyone to see. And so anything beyond that would be a clear violation of people's Fourth Amendment rights. And that's something that we neither have the capability of doing nor the interest in doing.

[00:22:38]

So what he's saying is we're responsible for protecting our own privacy on the web, and we do have the ability to turn Facebook settings to private. But using default settings on social media isn't quite the same thing as walking into a police department and providing a bunch of photos that give away lots of personal information. And this is part of what makes consent online really thorny. Next episode, we meet the founder of the Russian company behind what may be the world's largest real-time face ID system, which aims to not only work on people wearing masks, but also tell if they're worn properly, and read your car's license plate at the same time.

[00:23:16]

Do you ever worry that somebody might take all of your hard work and use it to build a world you don't really want to live in? Join us as we wrap up this mini series by exploring the way forward and examining what policy might look like.

[00:23:33]

This episode was reported and produced by me, Tate Ryan-Mosley, and Emma Cillekens. We had help from Benji Rosen and Karen Hao. We're edited by Michael Reilly and Gideon Lichfield. Our technical director is Jacob Gorski. Thanks for listening. I'm Jennifer Strong. This is MIT Technology Review.