[00:00:04]

I'm Elise Hu, and this is TED Talks Daily. It feels like ancient history now, but the scandal surrounding Elizabeth Holmes and her startup Theranos was one of the biggest business frauds in Silicon Valley history. Entrepreneur Erika Cheung blew the whistle on the company in her talk from TEDxBerkeley in 2020.

[00:00:22]

She takes us on her journey from stressed college student to whistleblower, how she trusted her gut and spoke truth to power. And it asks a bigger question for all of us: how do we make it easier to speak up about problems in our workplaces and institutions?

[00:00:40]

So I had graduated seven years ago from Berkeley with a dual degree in molecular and cell biology and linguistics, and I had gone to a career fair here on campus where I had gotten an interview with a startup called Theranos.

[00:00:56]

And at the time, there wasn't really that much information about the company, but the little that was there was really impressive.

[00:01:04]

Essentially, what the company was doing was creating a medical device where you would be able to run your entire blood panel on a fingerstick of blood so you wouldn't have to get a big needle stuck in your arm in order to get your blood test done.

[00:01:20]

So this was interesting, not only because it was less painful, but also it could potentially open the door to predictive diagnostics. If you had a device that allowed for more frequent and continuous diagnosis, potentially you could diagnose disease before someone got sick.

[00:01:41]

And this was confirmed in an interview that the founder, Elizabeth Holmes, had given to The Wall Street Journal: You know, the reality within our health care system today is that when someone you care about gets really sick, by the time you find out, it's too late to do anything about it. And it's heartbreaking.

[00:01:57]

This was a moonshot that I really wanted to be a part of and I really wanted to help build.

[00:02:03]

And there was another reason why I think the story of Elizabeth really appealed to me.

[00:02:10]

So there was a time that someone had said to me, Erika, there are two types of people. There are those that thrive and those that survive, and you, my dear, are a survivor. Before I went to university, I had grown up in a one-bedroom trailer with my six family members. And when I told people I wanted to go to Berkeley, they would say, Well, I want to be an astronaut, so good luck. And I stuck with it and I worked hard and I managed to get in.

[00:02:36]

And honestly, my first year was very challenging.

[00:02:39]

I was the victim of a series of crimes. I was robbed at gunpoint, I was sexually assaulted, and I was sexually assaulted a third time, which spurred on very severe panic attacks where I was failing my classes, and I dropped out of school. And at this moment, people had said to me, Erika, maybe you're not cut out for the sciences. Maybe you should reconsider doing something else. And I told myself, you know what? If I don't make the cut, I don't make the cut, but I cannot give up on myself.

[00:03:08]

And I'm going to go for this. And even if I'm not the best for it, I'm going to try and make it happen. And luckily, I stuck with it and I got the degree and I graduated, too.

[00:03:25]

So when I heard Elizabeth Holmes had dropped out of Stanford at age 19 to start this company, and it was being quite successful, to me it was a signal that, you know, it didn't matter what your background was; as long as you committed to hard work and intelligence, that was enough to make an impact in the world.

[00:03:46]

And this was something that I personally had to believe in my life, because it was one of the few anchors I had that got me through the day.

[00:03:56]

So you can imagine when I thought about Theranos, I really anticipated that this would be the first and the last company that I was going to work for. This was finally my opportunity to contribute to society to solve the problems that I had seen in the world.

[00:04:11]

But I started to notice some problems. So I started off as an entry level associate in the lab, and we would be sitting in a lab meeting reviewing data to confirm whether the technology worked or not.

[00:04:27]

And someone would say to me, well, let's get rid of the outlier and see how that affects the accuracy rate.

[00:04:36]

So what constitutes an outlier here? Which one is the outlier? And the answer is, you have no idea. You don't know, right? And deleting a data point really violates one of the things that I found so beautiful about the scientific process, which is that it allows the data to reveal the truth to you. And as tempting as it might be in certain scenarios to place your story on the data, to confirm your own narrative, when you do this it has really bad future consequences.

[00:05:11]

So this to me was almost immediately a red flag, and it kind of folded into the next experience and the next red flag that I started to see within the clinical laboratory. So a clinical laboratory is where you actively process patient samples. And so before I would run a patient sample, I would have a sample where I knew what the concentration was. And in this case, it was point two for PSA, which is an indicator of whether someone has prostate cancer or is at risk of prostate cancer or not.

[00:05:42]

But when I would run it in the Theranos device, it would come out eight point nine, and then I'd run it again and it would come out five point one, and I'd run it again and it would come out point five, which is technically in range. But what do you do in this scenario?

[00:05:59]

What is the accurate answer? And this wasn't an instance that I was seeing just as a one-off. This was happening nearly every day, across so many different tests.

[00:06:14]

Mind you, this is, for example, where I know what the concentration is. What happens when I don't know what the concentration is, like with a patient sample? How am I supposed to trust what the result is at that point?

[00:06:31]

So this led to sort of the last and final red flag for me.

[00:06:36]

And this is when we were doing testing in order to confirm and certify whether we could continue processing patient samples.

[00:06:45]

So what regulators will do is they'll give you a sample and they'll say, run this sample, just like the quality control through your normal workflow, how you normally test on patients and then give us the results and we will tell you, do you pass or do you fail?

[00:07:02]

So because we were seeing so many issues with the Theranos device that was actively being used to test on patients, what we had done is we had taken the sample and we had run it through an FDA approved machine and we had run it through the Theranos device. And guess what happened? We got two very, very different results. So what do you think they did in this scenario? You would anticipate that you would tell the regulators, like we have some discrepancies here with this new technology.

[00:07:32]

But instead, Theranos had sent the results of the FDA approved machine. So what does this signal to you? This signals to you that even within your own organization, you don't trust the results that your technology is producing. So how do we have any business running patient samples on this particular machine? Of course, you know, I am a recent grad, I have at this point run all these different experiments, I've compiled all this evidence, and I've gone into the office of the CEO and I was raising my concerns within the lab.

[00:08:12]

We're seeing a lot of variability. The accuracy rate doesn't seem right. I don't feel right about testing on patients. These things I'm just not comfortable with. And the response that I got back was, you don't know what you're talking about. What you need to do is what I'm paying you to do, and you need to process patient samples. So that night I called up a colleague of mine who I had befriended within the organization, Tyler Shultz, who also happened to have a grandfather who was on the board of directors.

[00:08:45]

And so we had decided to go to his grandfather's house and tell him at dinner that what the company was telling him was going on was actually not what was happening behind closed doors. And not to mention that Tyler's grandfather was George Shultz, the former Secretary of State of the United States. So you can imagine me as a 20-something-year-old just shaking, like, what are you getting yourself into?

[00:09:13]

But we had sat down at his dinner table and said, when you think that they've taken this blood sample and they put it in this device and it pops out a result, well, what's really happening is, the moment you step outside of the room, they take that blood sample, they run it to a back door, and there are five people on standby that are taking this tiny blood sample and splitting it amongst five different machines. And he says to us, I know Tyler is very smart.

[00:09:43]

You seem very smart. But the fact of the matter is, I brought in a wealth of intelligent people and they tell me that this device is going to revolutionize health care. And so maybe you should consider doing something else. So this had gone on over a period of about seven months, and I decided to quit that very next day.

[00:10:14]

But this was a moment where I had to sit with myself and do a bit of a mental health check. I had raised concerns in the lab. I had raised concerns with the CEO. I had raised concerns with a board member.

[00:10:29]

And meanwhile, Elizabeth is on the cover of every major magazine across America.

[00:10:37]

So there's one common thread here, and that's me. Maybe I'm the problem. Maybe there's something that I'm not seeing. Maybe I'm the crazy one.

[00:10:47]

And this is the part of my story where I really get lucky.

[00:10:51]

I was approached by a very talented journalist, John Carreyrou from The Wall Street Journal.

[00:10:56]

And he had basically said that he had also heard concerns about the company from other people in the industry and people working for the company.

[00:11:09]

And in that moment, it clicked in my head: Erika, you are not crazy. You're not the crazy one. In fact, there are other people out there just like you who are just as scared of coming forward, but who see the same problems and have the same concerns that you do.

[00:11:26]

So before John's exposé came out, the investigative report that revealed the truth of what was going on in the company,

[00:11:33]

the company decided to go on a witch hunt for all sorts of former employees, myself included, to basically intimidate us from coming forward or talking to one another.

[00:11:45]

And the scary thing really for me in this instance was the fact that it triggered the realization that they were following me.

[00:11:53]

But it was also, in a way, a bit of a blessing, because it forced me to call a lawyer. And I was lucky enough to find a free lawyer, and he had suggested, why don't you report to a regulatory agency?

[00:12:07]

And this was something that didn't even click in my head, probably because I was so inexperienced.

[00:12:13]

But once that happened, that's exactly what I did.

[00:12:17]

I had decided to write a complaint letter to regulators illustrating all the deficiencies and the problems that I had seen in the laboratory.

[00:12:27]

And as endearingly as my dad kind of notes this as being my, like, dragon-slayer moment, where I had risen up and sort of fought this behemoth and it caused this domino effect,

[00:12:38]

I can tell you right now, I felt anything but courageous.

[00:12:42]

I was scared. I was terrified. I was anxious.

[00:12:47]

I was slightly ashamed that it took me a month to write the letter. There was a glimmer of hope in there that maybe somehow no one would ever figure out that it was me. But despite all that emotion and all that volatility, I still did it.

[00:13:04]

And luckily, it triggered an investigation that brought to light that there were huge deficiencies in the lab, and it stopped Theranos from processing patient samples.

[00:13:22]

So you would hope that, going through a very challenging and crazy situation like this, I would be able to somehow come up with a recipe for success for other people that are in this situation. But frankly, when it comes to situations like this, the only quote that kind of gets it right is this Mike Tyson quote that says, everyone has a plan

[00:13:44]

until they get punched in the mouth. And that's exactly how this is.

[00:13:52]

But today, you know, we're here to kind of convene on moonshots, and moonshots are these highly innovative projects that are very ambitious, that everyone wants to believe in.

[00:14:04]

But what happens when the vision is so compelling and the desire to believe is so strong that it starts to cloud your judgment about what reality is and particularly when these innovative projects start to be a detriment to society?

[00:14:24]

What are the mechanisms in place by which we can prevent these potential consequences? And really, in my mind, the simplest way to do that is to foster stronger cultures of people who speak up, and of listening to those who speak up.

[00:14:42]

So now the big question is, you know, how do we make speaking up the norm and not the exception?

[00:14:58]

So luckily, you know, in my own experience, I realized that when it comes to speaking up, the action tends to be pretty straightforward in most cases, but the hard part is really deciding whether to act or not.

[00:15:12]

So how do we frame our decisions in a way that makes it easier for us to act and produce more ethical outcomes? UC San Diego came up with this excellent framework called the three C's: commitment, consciousness, and competency. Commitment is the desire to do the right thing regardless of the cost. In my case at Theranos, if I was wrong, I was going to have to pay the consequences.

[00:15:42]

But if I was right, the fact that I could have been a person that knew what was going on and didn't say something, that was purgatory. Being silent was purgatory. Then there's consciousness, the awareness to act consistently and apply moral convictions to daily behavior.

[00:16:05]

And the third aspect is competency. And competency is the ability to collect and evaluate information and foresee potential consequences and risk. And the reason I could trust my competency was because I was acting in service of others. So I think a simple process is really taking those actions and imagining, if this happened to my children, to my parents, to my spouse, to my neighbors, to my community, and if I took that action, how would it be remembered? And with that, I hope that as we all venture off to build our own moonshots, we don't just conceptualize them as a means for people to survive, but really see them as opportunities and chances for everybody to thrive.

[00:16:59]

Thank you.

[00:17:02]

TED Talks Daily is hosted by Elise Hu and produced by TED. Our theme music is from Allison Leyton-Brown, and our mixer is Christopher Fazi Bogon.

[00:17:11]

We record the talks at TED events we host or from TEDx events, which are organized independently by volunteers all over the world. And we'd love to hear from you. Leave us a review on Apple Podcasts or email us at podcasts@ted.com.