[00:00:00]

Hey, Casey.

[00:00:01]

Where are you right now?

[00:00:02]

Okay, first of all, you have to stop calling me, Kevin. I'm trying to be on vacation over here. I'm literally in the middle of Gate E8 at San Francisco International Airport.

[00:00:12]

Is it loud there? How are you recording this?

[00:00:15]

It's surprisingly quiet, but only because my flight just got canceled.

[00:00:18]

No, what happened?

[00:00:22]

Well, step one was they were waiting on some paperwork. Step two was they opened up some sort of compartment and they said that they needed to replace a broken latch. And then during that process, they announced that something was cracked and they were taking the airplane out of service. The best option for us was to hop on and chat while we wait for our next plane to arrive. And that's how my Thanksgiving is going.

[00:00:44]

I'm so sorry that happened to you.

[00:00:45]

Well, how's your holiday going so far?

[00:00:48]

Well, I would say it's been very relaxing. I've just been spending quiet afternoons reading books and catching up on sleep. No, of course, I have been in a frenzy of reporting and trying to figure out what the heck is going on behind all this drama. This is a sort of triumphal return of Sam Altman, and some OpenAI employees have been saying things like, we are so back.

[00:01:13]

Well, before we get into the aftermath, Kevin, we should probably just quickly tell folks who may not have heard what happened since we last came to them in an emergency podcast just hours ago.

[00:01:25]

I feel like this week has just been one never-ending emergency podcast. I feel like we have really not stopped recording since Friday. But basically, here's the deal. Late Tuesday, OpenAI announced that Sam Altman was being brought back after this five-day campaign that had been waged by Sam and his allies: OpenAI's employees, who had threatened to quit en masse and go work at Microsoft, as well as the company's investors. They said they had a quote, agreement in principle for Sam Altman to return and that they were overhauling the company's board of directors. So Adam D'Angelo, who is the chief executive of Quora and was one of the board directors who had voted to fire Sam Altman, is staying on. But he is the only member of that board who is staying on. Helen Toner and Tasha McCauley, two of the other board members who voted to fire Sam Altman, are leaving the board, and two new people are joining to replace them. Bret Taylor, who is an early Facebook executive and a former executive of Salesforce, is coming onto the board. He will be the new chairman. And Larry Summers, the former Treasury Secretary, is also coming onto the board.

[00:02:42]

So, Casey, what was your reaction to this news?

[00:02:45]

I mean, on one hand, it was very surprising given that there had been a few failed attempts to return Sam to this position so far. On the other hand, though, I think by the time this happened, Kevin. This really was inevitable. And there was one particular detail I read in some reporting that I want to share right now. And this was the moment where I thought, there is no way Sam Altman doesn't come back to this board. Can I just share this with you?

[00:03:09]

Yes, please.

[00:03:10]

So this is from a story written by Keach Hagey, Deepa Seetharaman, and Berber Jin at the Wall Street Journal. And in their piece on the matter, they said, and this is, of course, as the board is discussing the situation with some supporters of Sam Altman. And this is the quote: the board agreed to discuss the matter with their counsel. After a few hours, they returned, still unwilling to provide specifics, specifics in this case about why Altman was fired. The story goes on. They said that Altman wasn't candid and often got his way. The board said that Altman had been so deft they couldn't even give a specific example, according to the people familiar with the executives. So when the people who were trying to get Sam back asked the board, hey, no, seriously, why did you fire this guy? Their answer was, he's so good at being bad, we can't even tell you what he did bad. And that was the moment where I thought, this man is going to be CEO of this company again.

[00:04:00]

Yeah, that's a good observation. And it dovetails with some reporting that my colleagues and I have been doing at the Times about why Sam Altman was fired and about some of the conflicts between him and the board that have been going on for a while now. In particular, this conflict between Sam Altman and Helen Toner, one of the board members who is departing, over this academic paper that she had written that sort of cast OpenAI in a negative light. And Sam Altman was upset about this. And this is sort of part of what sparked the disagreement between him and the sort of Helen Toner faction of the board. But obviously you're right. We still don't know exactly what the trigger was for firing Sam Altman, but it seems to have been vague enough or unconvincing enough that the faction of the board that wanted to push him out was not able to stand its ground. And ultimately, in these bargaining sessions, they agreed to bring him back in exchange for certain changes to the company's governance.

[00:05:03]

That's right. I do think that dispute that you mentioned is important to spend another beat on, though, right? Because I do think that the entire conflict is contained in this story. OpenAI is, of course, famously a nonprofit whose board oversees a for-profit company. Helen represented the nonprofit board. Sam, his duty is to the nonprofit, right? He is hired essentially by the nonprofit. But I also think that his loyalties have been much more to the sort of commercial, corporate side of the venture, at least as this most recent drama has been playing out. And the paper that Helen co-wrote was a paper in part about AI safety. And the thing that she and her co-authors wrote was that OpenAI's rival Anthropic, which is, of course, co-founded by a bunch of former OpenAI people, they wrote that Anthropic had essentially built their product more safely than OpenAI had. And so you can understand why in Sam's mind that was a betrayal. Right. But you can also understand why in Helen's mind, that was just her doing her job. Her job is to make sure that AI gets built in the safest manner possible.

[00:06:07]

Her job is not to protect the reputation of OpenAI. And so that appears to be where the schism was. And even if that wasn't the trigger for why Sam got fired, I think it tells you a lot about what happened over the past week.

[00:06:19]

Yeah. So we'll see who gets added to the board in the coming days. This is not the final composition for the board.

[00:06:26]

Are you throwing your hat in the ring, by the way, Kevin?

[00:06:29]

I will not serve if elected. This company has already cost me too much sleep. So it remains to be seen who will end up on the final version of the board. This is sort of being seen and portrayed as an interim board that's just there to kind of sort things out and ultimately decide who should be on the nonprofit board going forward. But I would say a couple things. One is Microsoft is definitely going to have a bigger hand in the governance of OpenAI going forward. When Microsoft did this deal with OpenAI, investing billions of dollars in the company, they were kind of a passive investor, right? They did not have a seat on the board. They were not making the decisions about the future of this company, even though this company and its technology have become very important to Microsoft's future business plans.

[00:07:18]

The best joke I heard about that, by the way, was from Matt Levine, who wrote something like, Microsoft invested in a nonprofit at a valuation of $80 billion.

[00:07:26]

Yeah. So Microsoft obviously will want to ensure that something like this doesn't happen again, that its investment in OpenAI is not sort of jeopardized by this nonprofit board. And so I expect that they will want a board seat going forward. And the bigger picture here, I think, and this is something that I've been writing about today, is that this war in AI between sort of the capitalists and the catastrophists. ("That's catchy." "Yeah, thank you.") The people who think that AI is going to be a powerful business tool, and the people who worry that it's going to take over the world and destroy humanity. That war, I think, as of now, is over. The capitalists have won. The people who are now in charge of the board of OpenAI are the kind of seasoned deal makers and Silicon Valley insiders that you would expect to govern a for-profit technology company. They are not the kind of academics and ideologues who worry that AI could become too powerful and need to be shut down. And I think that's mostly the way that things are going to be from here on out.

[00:08:30]

Certainly, I think the pro-safety people have lost the most important perch of power that they had in this circumstance. At the same time, there is a faction throughout government, academia, journalism, and the industry itself that wants to build this technology in a safe way. So I don't think that disappears. But you're right, it did lose a lot of power. I think my sort of wrapping-up question for you, Kevin, is how much do you think this changes OpenAI? Is it the case that Monday morning rolls around and it is just back to business as usual for these folks? Or do you think that this crazy series of events will have affected Sam and the company in some profound way that might change what we expect to see from them going forward?

[00:09:12]

It's hard to say. I was talking to some OpenAI employees who were going back to the office to celebrate. They were having a party at the office. I did not get invited to that party, but I was hearing dispatches from inside of it, and at one point, the fire alarm at the OpenAI offices was set off by a fog machine. In case you want to do sort of a vibe check on how people at the company are feeling right now: they are very happy. They are celebrating. They are bonded. Nothing bonds people together like going through a crisis like the one we're having.

[00:09:46]

Right now, while I'm recording an emergency episode in the airport. It's true. I've never felt closer.

[00:09:50]

I do feel very bonded to you right now, Casey. The people I'm talking to, they think that this is going to be a real moment of reinvigoration for the company, that employees are feeling optimistic about the future, and they are now even more devoted to this mission of building AGI. And obviously, I think there are going to be some people who come out ahead or behind in this kind of reorganized, reconstituted OpenAI. And there are a lot of questions we still don't know the answers to about how the company will change going forward. But I think if you're looking for a sort of clear before-and-after picture of OpenAI, I would say before, there was this sense of a fragile structure that needed to sort of balance the needs of the business and the needs of the nonprofit. And now I think people feel like the business is firmly in the driver's seat.

[00:10:45]

That all sounds right to me. I think the one thing I would add is that I do think the company, and Sam Altman in particular, are just going to be under more scrutiny now. Right? We all learned a lot about the history of this company, and of Sam Altman in particular, over the past week. And I think to the extent that the company makes moves that are perceived as sort of pro-corporate, pro-Microsoft, and anti-safety, I just think they're going to get ten times the attention that they did before all of this happened. And that might be a good thing. Right? So at the end of the day, I think that the board that they had did not execute its responsibilities well and did need to go. But I do hope that you and I will keep our eye on some of the concerns that they were raising behind closed doors, even if they would never be straightforward about what those concerns actually were.

[00:11:29]

Totally. Well, Casey, thank you for taking one for the team and recording an emergency podcast from the airport. I hope that you are able to get on a new flight and make your Thanksgiving plans after all.

[00:11:40]

Yeah. And let me just say to the people of OpenAI, this is the last one I have in me this week, okay? I don't care what kind of crazy shenanigans you guys get up to with your fog machine and your fire alarms that you're pulling at company headquarters. You're not going to hear my voice again until next Friday, and you can count on that.

[00:11:54]

That's the Hard Fork promise. We are going to take a vacation.

[00:11:58]

All right? Maybe go grab that turkey out of the oven, Kevin. I'm starting to see some smoke coming out from the door behind you.

[00:12:03]

That's just my fog machine in solidarity.

[00:12:48]

Hard Fork is produced by Rachel Cohn. We're edited by Jen Poyant. Today's show was engineered by Dan Ramirez. Original music by Dan Powell. Our audience editor is Nell Gallogly. Video production by Ryan Manning and Dylan Bergeson. Special thanks to Paula Szuchman, Pui-Wing Tam, Kate LoPresti, Corey Schreppel, and Jeffrey Miranda.