[00:00:00]

Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now. Join Wondery Plus in the Wondery app or on Apple Podcasts, or you can listen for free wherever you get your podcasts. Welcome, welcome, welcome to Armchair Expert, Experts on Expert. My friend is here today. Yes. Ken Goldberg. You'll hear our origin story, but I met him somewhere. I was judgmental of him. Then, of course, I fell in love with him. And now I'm just smitten with this gentleman. Ken Goldberg is the William S. Floyd Distinguished Chair in Engineering at UC Berkeley and an award-winning roboticist, filmmaker, artist, and public speaker on AI and robotics. Now, really quick, I've read a couple of comments that people are over AI. I get it. People feel a little inundated, and I get it. It's the topic of the day. A, this isn't very heavy in AI talk, and then B, this is a million times more playful than you could ever imagine robotics could be. I'll also add he has an art exhibit that is going until March. Don't wait till then, go now, at Skirball, if you live in LA or are visiting the Skirball Cultural Center in LA.

[00:01:09]

It's called Ancient Wisdom for a Future Ecology: Trees, Time, and Technology. Very, very cool art project with these tree rings that are gorgeous and very creative. Ken Goldberg, I love you. I think you all will love him, too. Please enjoy. I'm Afua Hirsch. I'm Peter Frankopan. In our podcast, Legacy, we explore the lives of some of the biggest characters in history.

[00:01:34]

This season, we're looking at the life of the most famous Queen of France, Marie-Antoinette.

[00:01:41]

Her death is seemingly more well known than her life, but her journey from the daughter of the Austrian Emperor to becoming the most hated woman in France is just as fascinating.

[00:01:52]

We're going to look at the ways in which her story was distorted during the French Revolution and dig deeper into her real experiences in a troubled, difficult time.

[00:02:01]

Marie-Antoinette is one of the most well-recognized but least well-understood names in history.

[00:02:07]

We're talking about how her death led to the way that she was spoken about in the 19th, 20th, and 21st centuries. Follow Legacy now from wherever you get your podcasts.

[00:02:18]

Or binge entire seasons early and ad-free on Wondery Plus. Alice and Matt here from British Scandal. Matt, if we had a bingo card, what would be on there? Compelling storytelling, egotistical white men, and dubious humor. If that sounds like your cup of tea, you will love our podcast, British Scandal, the show where every week we bring you stories from this green and not always so pleasant land. We've looked at spies, politicians, media magnates, a king; no one is safe.

[00:02:48]

And knowing our country, we won't be out of a job anytime soon.

[00:02:51]

Follow British Scandal wherever you listen to your podcasts.

[00:03:08]

Ken Degasee.

[00:03:14]

Good company. I prepped Monica by saying, You're going to really like my friend Ken. If Fred Armisen was a roboticist, this would be Ken. Okay. I think maybe, am I unique in that comparison, or have you ever heard that before?

[00:03:28]

I remember you saying it. I don't hear it a lot, but I take it as a compliment. You should.

[00:03:33]

He's one of our favorites.

[00:03:34]

We love him. Really? Oh, yeah.

[00:03:35]

Very unicorny, like you.

[00:03:38]

Unicorny, though. Does that mean rare?

[00:03:40]

Well, not corny. Rule out the corny part. Focus on the unicorn part of that.

[00:03:44]

It is mixed, which is confusing, because he calls people he likes unicorns, but then he also says he doesn't like unicorns. So it is tricky. There's confusion. You're right to be confused. Understandably.

[00:03:53]

Okay, all right. Because I don't know if you meant rare as a human being. Rare and special. Rare and special. Okay, I'll take that.

[00:03:58]

Rare, unique, special, colorful, vibrant, playful. Our favorite word.

[00:04:04]

I love it. Good.

[00:04:05]

I think we should start with how we met because I would love to hear your perspective. I have a very specific perspective. I don't even know if I let you in on the full details.

[00:04:14]

I'm curious now.

[00:04:15]

You and I were at a conference, and it's the conference I would have never imagined getting invited to. There's a lot of people there like yourself, professors and stuff.

[00:04:24]

Smarty Pants.

[00:04:25]

Smarty Pants, billionaires. A fun group, actually. We're walking into this event, and they are very militant about everyone wearing name tags, as they should be, because everyone there thinks everyone knows their name, but people don't know each other's names. I see this guy with crazy hair, and he doesn't have a name tag on.

[00:04:43]

Bucking the system.

[00:04:44]

I'm like this, which is funny because I should immediately love that. I should go, Yeah, fuck these name tags. That's my essence. But for some reason, because I've complied, I'm like, Who does this guy think he is? Is he so famous he's the only one here that doesn't need a name tag? I'm immediately a little triggered. And I say to Kristen, I'm like, Who's this guy with the wild hair? He doesn't have a name tag on. And then by luck, somehow I hear your name, Ken Goldberg. And then I immediately go into the bathroom. I go into the bathroom before the event starts. You don't know any of this. What? Don't worry, you're not being led in a bad direction. I go into the bathroom and I Google you and I see Robot Professor at Berkeley. And I immediately am like, That's a cool job. Okay, so he's not a billionaire who thinks he doesn't need to wear a name tag. This guy is just an absent-minded genius, maybe.

[00:05:34]

Yeah, because I didn't know about the name tag.

[00:05:37]

That says so much more about Dax than it does about you.

[00:05:40]

These are all my shortcomings and character defects, but they work out beautifully, because then 20 minutes later, and most of the seminars we were sitting through were very AI-heavy, and I have a chip on my shoulder about AI. So now I'm standing next to you randomly, and this is probably where I would enter your life story, because I just lean over to you and I say, You're a robotics professor, yeah? And you go, Yeah. These robots are so far away from doing our laundry and working on our car for us or doing anything, really. It's like they keep saying AI is going to take over everything. You're going to be a leisure class. What are we going to do with all this? And I'm like, Where are the fucking robots? And you go, Oh, I'm so delighted this is your question.

[00:06:22]

All right, so should I tell you my side of this?

[00:06:24]

Because... I'll just wrap it up by saying within 30 seconds of talking to you, I'm like, Oh, this is my favorite guy here by a long shot. I hope I'm at every dinner with him. We've since developed a friendship.

[00:06:34]

Yes. Okay. I remember that I was in this place also a little intimidated, because there were a lot of A-list people. I was sitting there, and I forget who it was. He was on the stage, and you raised a comment and said something about how he looked really sharp.

[00:06:49]

Pharrell. Pharrell. Yeah. I think I said he was really dazzling.

[00:06:53]

Exactly. Yeah. I just love the way you said that. I thought it was such an unusual thing to say. It was spot on, but it was just the thing that nobody would normally say.

[00:07:02]

Yeah, unicorny.

[00:07:03]

Very unicorny. I think it was after the lunch or something, I saw you standing over there and I just went over and I said, Hey, I love that comment you made. Then we started talking. That's how I remember it. I didn't know anything about you.

[00:07:18]

You must have known Kristen.

[00:07:19]

No. We were just having this fun conversation and you guys were so charming. Then Tiffany came over. Your wife. My wife.

[00:07:26]

Also dazzling and unicorny.

[00:07:27]

Yes. We walked away and she said, Do you know who they were? And I was like, No. Of course not.

[00:07:33]

I hear about important stuff.

[00:07:35]

Then we had several great conversations at that thing.

[00:07:38]

Yeah, but it was really comforting to hear you say that as someone who is an authority in the space, because I've heard many people lecture on AI, and I'm hearing all of the... What are you peeking at?

[00:07:48]

Is there underwear on the floor?

[00:07:50]

Oh my God. Okay, I got to walk. I'm so sorry, Ken. But obviously this needs to be addressed. What's happening? That took me a second. Okay, so here's what happened. I just put the pieces together.

[00:08:07]

There's an explanation. There is.

[00:08:09]

Monica and I did a commercial yesterday. As I told you. You told me that. When I arrived, I changed my clothes and I put them in a bag, and I brought a bag of extra shoes and pants they asked me to bring. Then I threw this sweater in there, and that underwear was in there. Then I just threw on my sweater just now. Clearly, my panties were attached and now they've fallen off. Wow. For the viewer, I would be so angry if I didn't get to see the... Oh my God, we don't even see that. You have to, if you're watching and you're like, Everyone's seeing these panties and I'm not. Wouldn't you throw your computer out the car window?

[00:08:43]

I love it. Full disclosure.

[00:08:45]

Okay, so these are the offensive panties.

[00:08:48]

Look at them, they have an elephant. They do. That's MeUndies, a former sponsor.

[00:08:53]

Are you being polite?

[00:08:54]

That was quite nice. Or do I now have a Christmas idea for you?

[00:08:57]

Well, actually, yeah. I'm going to buy a pair. I like that.

[00:09:00]

I like that. MeUndies is a great brand. Really?

[00:09:02]

It's a brand? Yeah. Okay. I'm always looking for good.

[00:09:04]

Very comfortable, very playful. It's almost as if they were a current sponsor and we planned all that.

[00:09:09]

That would be great. Placement.

[00:09:11]

Payola. Sorry, I just had to call that out. Yes.

[00:09:15]

The look on your face, I thought there was maybe a squirrel under my chair. She had a very damn good look on her face.

[00:09:21]

Well, it was a little surprising. I'm thinking like, what else was going on?

[00:09:24]

Yeah, there was some implication. This is video, so I know people can see that there's something on the floor, so I had to say it.

[00:09:31]

Glad you did, because if you had just not said it, it would have been this lingering presence.

[00:09:35]

An elephant in the room, if you will.

[00:09:37]

Oh, my God. Did they know if you really planned it?

[00:09:41]

Oh, my God. That is brilliant.

[00:09:42]

The elephant panties in the room.

[00:09:44]

Oh, my God.

[00:09:45]

Oh, man.

[00:09:48]

We're not going to top that in the episode. We should wrap it up. Okay, so back to AI. Everyone's quite scared, and I think there's a lot of reasons to be scared. But also, I think maybe we're a little more panicked than we need to be. I just found you to be a comforting voice in that. Oh, good. We became friends, and you've been over, and we love you and your wife, and you're also artists, so you're impossibly interesting. Let's start, though, with, of course, you would be born in Nigeria. Is that where you were born? I was. Of course.

[00:10:14]

Of course. All the best unicorns are. How did that happen?

[00:10:19]

My parents were idealists during the '60s, and they were at Penn in Philadelphia, and they were going on civil rights marches and things like that. They wanted to continue that idea of doing things for civil rights. When they were graduating, they wrote to various people in Africa and said, We'd love to help. One person who ran a school there, and he's actually quite famous in Nigeria, Tai Solarin, invited them to come to his school and work for two years. So they basically got over there, and there was no running water and no electricity when they arrived. It was very rough. And they lived under these circumstances. My dad taught physics and my mom taught English.

[00:11:01]

They were graduate students at Penn or undergrad?

[00:11:04]

Undergrads. They just finished their undergrad.

[00:11:05]

Also, they were ahead of the curve because you were born in '61? Yeah. The civil rights movement in its full velocity is later.

[00:11:13]

Yeah, it's a good point. It was starting. Philadelphia, the city of Brotherly Love, has a lot of integration, great history there. They were starting there. But that was also around the time of Nigerian independence. There was a real movement across Africa. I was very proud of them. I'm still proud of them for doing that.

[00:11:29]

Were you delivered at a hospital?

[00:11:31]

Yeah. I always thought it must have been an accident, because why would you do that? Who would plan it? Yeah. I never asked, because it was like that elephant in the room. I just didn't want to know. But a couple of years ago, my mom said, We wanted to have a baby because we had all this time together, and we knew it would be a time to focus on you. I was born in a hospital in a nearby city called Ibadan, which is about an hour from this village. But I have a really big vaccination mark from that.

[00:11:58]

My father had that one, right? Isn't it the size of a quarter and indented? What? Yes. It's a specific vaccine that would do that, I think.

[00:12:05]

I think you're right. I don't know what it is, but yes, exactly.

[00:12:08]

I would gaze at it on my father's shoulder all the time. Looked like someone put a cigar out on us. Yes. That's it.

[00:12:13]

That's a great way to put it. That's exactly what it is. Oh, my God.

[00:12:16]

What if that was the vaccine? The doctor just left the cigar. How long were you there as a baby?

[00:12:22]

Just six months.

[00:12:23]

Did you get any citizenship out of that deal?

[00:12:26]

No, I looked into that, too, because I thought it'd be nice to have a dual citizenship.

[00:12:29]

Yeah, be getting water up there.

[00:12:30]

Yeah, it's always good in case you need to flee the country. No, but apparently you can't have both. They don't allow it.

[00:12:36]

They're like, Fuck you. We're not a side dish.

[00:12:38]

You can't have both. Yeah.

[00:12:39]

Or the entree.

[00:12:40]

Right.

[00:12:41]

Then you do grow up in, I guess, would it be the suburb of Philadelphia?

[00:12:46]

Yeah, Bethlehem, Pennsylvania, Steeltown.

[00:12:48]

Right, Bethlehem Steel. Yeah.

[00:12:50]

My parents came back there because my dad was a metallurgist.

[00:12:53]

Well, this is fascinating.

[00:12:54]

In addition to being a physics teacher?

[00:12:57]

Yeah, well, physics was what he could teach as an engineering grad. Then he went back and he actually got his PhD at Ohio State. Then we moved to Bethlehem. Bethlehem was known as the place where the time and motion studies were done. There's a whole history of scientific management. You know about this? Frederick Taylor?

[00:13:15]

No, but is this to increase productivity through the scientific method or something?

[00:13:19]

Yes. Okay, tell us. All right, so this is fascinating. You've heard of time and motion studies where they have a stopwatch and they would time people doing their work?

[00:13:27]

I hadn't heard of that.

[00:13:28]

Just to see for productivity? Efficiency?

[00:13:29]

Yeah, efficiency experts. It was very big in the early part of the 20th century.

[00:13:35]

Datafying.

[00:13:36]

Yes. Making work scientific by quantifying it. What they would do is they had all these things, mostly stopwatches back then, and they would time how long it took you to, say, carry a shovel of ore from one end of a lot to another. Then they would clock people, and then they would try to get them to increase their speeds. This guy wrote a book called Scientific Management, something like that. It was very influential on Stalin, apparently. Oh, really? Yes, but workers hated it. Sure. For obvious reasons.

[00:14:04]

You're rushing. You're getting enough efficiency out of me.

[00:14:07]

Exactly. The whole thing was that you could increase the productivity of the average worker by a factor of two or more if you followed these methods, but it would squeeze the workers to the breaking point. So they didn't like it, but it became popular until unions came and pushed back. But this whole wave was still around in the form of industrial engineering, which is actually the department I'm in at Berkeley. It used to do these studies on how you arrange an office to be the most efficient, or the assembly line to be most efficient, for workers and now machines.

[00:14:36]

Well, and by the way, we'll just earmark this: among the AI accomplishments that I find most fascinating is the ability to make things more efficient. I know there was a server farm they let AI loose on, and it had been studied forever. Within hours, it figured out how to make it like 30% more efficient or something crazy.

[00:14:55]

Yeah, so energy efficiency. They could lower the amount of electricity it used, which is an undeniably good thing. That's good for the environment and everything else. Yeah.

[00:15:03]

Okay, so you're growing up in Bethlehem. Yeah. Your mom and dad didn't teach in your childhood?

[00:15:07]

No, he was working at the research lab. Actually, she did. She taught at the elementary school. We were very close. She was a great mom. I was lucky. It was a good town to grow up in, but a little rough and tumble because you had to fight. Really?

[00:15:19]

Okay, great. Here's my guess, because I'm from Detroit. You have this enormous working class. Many of the folks had migrated up from Kentucky to fill these roles. You have this culture of pride. Violence was on the table at all times. Yes.

[00:15:34]

That's interesting. You say the pride thing. I didn't know about that, but the pecking order was all about fighting, and kids would call you out and say, I'll see you after school, and you had to do it. Everyone would go watch.

[00:15:42]

There'd be a few additional fights from the people who got excited watching the first fight. Yeah, exactly. The most dangerous thing was watching one of those fights, because afterwards there were going to be a few more.

[00:15:50]

Another fight would break out. I had both my front teeth knocked out.

[00:15:54]

You did? In what grade?

[00:15:56]

Like, 10th grade.

[00:15:57]

In school?

[00:15:58]

Okay, so the story is that I was at a party and a girl asked me to take her home because she was having a fight with her boyfriend. I was being a nice guy, I thought. I drove her, and then I didn't even know who her boyfriend was, but she said it was Eddie, who was a very tough guy.

[00:16:17]

Perfect name for him.

[00:16:18]

Exactly. I was like, Oh, no, that's not good. I did not want to cross Eddie. And so the next thing I know, we had gone over, I dropped her at her friend's, and the doorbell rings, and it was Eddie. I came out and Eddie just cocked me right in the mouth. Right out of the gate. Right out of the gate. Like a sucker punch. I remember it was snowing, and all this blood on the snow. Oh, yeah. That was my front teeth.

[00:16:42]

Now, do you have the same thing I have, which is, we're both really lucky and we're running in circles that are mostly people that are college-bound and stuff. I try to explain the level of violence that was ever-present. I can tell there's no connection to what I'm saying. Then I wonder, was it an era? Do you wonder if it's still like that in Bethlehem? Because I'm curious, was that just our generation?

[00:17:02]

I don't know. It's interesting because it wasn't talked about. We didn't report it. I don't remember it even occurring to me to tell anyone. You teased it. Yeah. I wasn't going to... Well, that would lead to more abuse. Probably. You just sucked it up and you took it. It was definitely rough. Although it's interesting, the way it does come up now. A few years ago, I was in this academic setting and this guy double-crossed me, and he basically said, Well, we're going to do it my way. I remember sitting across from him, and I was really upset because I had put all this work into something, and he was basically going to trash it and put somebody else in to take the credit. And I said, You don't know this, but where I come from, I don't stand for that. I said, I'm going to really... You've got your hands full. I'm going to get physical. Oh, no, I'm not going to get physical. No, I didn't say that. Somehow, I can't remember the actual language, but I wasn't saying I'm going to hit you; I was basically saying, I'm going to come back.

[00:17:54]

I'm going to fight this. Yeah, not a pushover. Yeah, I'm not going to take this laying down.

[00:17:58]

For better or worse, at least in my experience, you walk away with this weird paradigm, which is it's better to get beat up and stand up for yourself, because if you don't, it's going to lead to so much more suffering. It's just an equation. You come to accept it, and that's it.

[00:18:14]

All right, so how do you feel about bullies?

[00:18:16]

Number one pet peeve in life is bullies. Bullies. I hate bullies.

[00:18:20]

I hate them.

[00:18:20]

I was big, so I can't claim that I was some victim, but I also was a punk rock skateboarder, snowboarder. So I was alternative. Bullies, for me, were a big, big thing.

[00:18:29]

I learned that the best way to deal with them was to stand up to them, even if they were bigger than you. Then oftentimes, they would cave in. They were cowards.

[00:18:37]

And/or they would at least move on to someone else who wasn't going to stand up for themselves. They took on another guy. You only had to do it once or twice.

[00:18:43]

Oh, that's interesting because there was a big reputation thing. It was very weird. You had this whole pecking order, and so people knew not to mess with you, and you had to be in a few fights, and then people would lay off.

[00:18:52]

Yeah, then you could exist. Okay, but now back to... Wild. You did engineering-type stuff with dad. You were bonding with dad over that as a kid. But then you decided you wanted to be an artist at some point?

[00:19:05]

Yeah, because my mom was an artist. She would paint, and she took us to the art school in the neighboring town, and I really loved that. They were both into modern art and would take us to museums, like in New York or in Philadelphia. Philadelphia Museum. I have very fond memories.

[00:19:20]

They discouraged you from pursuing that?

[00:19:23]

Yeah. I remember talking to my mother and saying, I think I'm going to major in art. And she said, oh, great, you can major in art after you finish your engineering degree.

[00:19:32]

Sure.

[00:19:33]

It's a very immigrant parent thing to do, actually. So that's funny.

[00:19:37]

Well, it was also because my parents had a lot of financial troubles growing up. And so it was hard because there were times when we didn't go on vacation for many years. I want to be really careful because I don't want to sound like we were suffering. There was some real poverty out there, and we weren't facing that. But we had money problems, and my mom and dad would fight a lot about that.

[00:19:56]

See, that's, I think, the most relevant aspect: was it this concept in your life that every time it was brought up, you saw fear on the faces of the adults, and it was the cause of fighting, and it is this big elephant panties in the room. I think it's just once you have that association with it. So yes, there were a lot of people poorer than me, but I had a single mom, and I still have the most unrealistic relationship with money to this day. It's just so grounded in fear that there's really nothing I can do to overcome it.

[00:20:26]

Yeah. A simple thing is whenever I look at a menu, I'm always studying the price. Still. Yeah. I will never order the most expensive thing on the menu. And you could. Yeah, but it's interesting because I'll contrast that with my wife. She'll say, What do you think of this dress? I'll come over and I'll be like, What does it cost? That'll be the first question. She'll say, I don't know. She doesn't know.

[00:20:45]

She's evaluating if it's beautiful.

[00:20:47]

But the first thing I'm looking at is the price. Before I try something on, I want to know if I could even afford it.

[00:20:53]

Right. Even though you can.

[00:20:55]

Probably, but it's still in that mind. It's in there.

[00:20:58]

Totally. How about this, without revealing your net worth. Let's just say if you had a billion dollars, don't you think you'd probably still be the same way? Yes. That's why I'm saying that it's an irrational relationship with it. It's not grounded in the facts at all.

[00:21:11]

No, no, it's true. Another pet peeve is if I'm in a hotel or something, and you know how they charge $20 for a Diet Coke? Then they deliver it and there's an additional delivery charge and a tip on top of it. Then it comes, and you're going to give a tip on top of that. So it's going to be $40.

[00:21:27]

You're in triple digits for a Diet Coke. Yeah. That actually triggers a second issue I have, which is, and we had an expert on talking about this, the bias against being taken advantage of, being the sucker. Then I have two things going: I've got my financial insecurity, and then, These people think I'm a fucking idiot. I'm a sucker. I'm going to pay this much for a Diet Coke. So it's a lot going on. You do what they urge you to do, and then you go to Penn, and you double major in economics and in engineering. Summa cum laude in both. Wow. Yeah. That's wild.

[00:21:58]

For a double major, that's... I was a double major, and I was also summa. We won't say what the majors were because that might dilute what I'm saying.

[00:22:08]

Are you a double major, too?

[00:22:09]

Yeah. For the same reason, because I wanted to do theater. My parents were like, Probably not. You're going to need to do something else.

[00:22:15]

You have a safety net.

[00:22:16]

A realistic plan in place as well.

[00:22:19]

Because it's risky to be an artist, for sure.

[00:22:22]

Then you go to graduate school at Carnegie Mellon, and you get a PhD in computer science. Yes. Okay, but you have a trip. You study abroad in Edinburgh. Yeah. Which, by the way, when I read Edinburgh today, I was like, Isn't it Edinburgh?

[00:22:35]

Yes. It is Edinburgh? Oh, it's Edinburgh.

[00:22:37]

But we don't have an O at the end of that.

[00:22:39]

I know, but that's the way it is.

[00:22:41]

This language is madness.

[00:22:42]

I know, but it is Edinburgh.

[00:22:45]

Okay, thank God. I thought it was in the same way. I've been saying that wrong for 30 years.

[00:22:48]

No, you're absolutely right. But I'm glad you brought that up because that was a huge turning point in my life.

[00:22:52]

Yeah, tell me, where in this eight-year schooling?

[00:22:55]

My junior year abroad. And also my dad was very sick. He had leukemia, because at the plant there were a lot of toxic chemicals and stuff. But then he went into remission, so he was feeling better. I was so stressed with that whole thing, I wanted to take some time off. And Penn had this junior year abroad program. I had never left the country, actually, since I was a baby. But it set off this year-long adventure. I had so much fun.

[00:23:20]

You had junior year, you're like, '21, '20?

[00:23:23]

'19. I remember that distinctly, because I remember saying, I'm 19 years old and I'm on my own. I had a backpack and Let's Go Europe, this big volume, and it was my Bible. I would just travel as much as I could.

[00:23:36]

Yeah, you're around, the train schedule. Oh, my God, I can go all the way to here.

[00:23:40]

One of the highlights was going to Morocco.

[00:23:42]

Back to your continent of birth.

[00:23:45]

It's very interesting you say this, because this is the story I always like to tell. Basically, with some friends, we said, Let's go somewhere really exotic. We'll go to Morocco. It was over Christmas, and we went to Spain, Madrid, and then we were taking the train down. On the train, it was all these soldiers. Everybody was drunk and it was really super fun. We're having this blast going to the last stop, and it was packed. When we get there, we get on the ferry, and we all have to turn in our passports for processing. My mother had warned me. She said, You're going to Morocco, but you're Jewish. It's an Arab country and you should be careful. But I was like, Oh, that's ridiculous. We got off the ferry and all my friends, there were about three or four of us, they had gotten their passports. I didn't get mine. We're waiting, and then it stretched into like 45 minutes. I was like, Listen, guys, I think there's going to be a problem. Go ahead. I'll just go back. But I was definitely queasy about what would happen.

[00:24:36]

Yeah. Because you're already now on the other side of... We were on the other side, but we're still on the ferry because we can't get off without the passports.

[00:24:42]

Then this door opens. I'll never forget this. I can see it like it was yesterday. Across the back of the ferry, I see this guy walking over. He's in a full keffiyeh headdress, very Arab-looking. Yeah. He's walking toward us, and now I'm starting to think, Oh, she was right. Then he holds up a passport and he says, in French, Monsieur Goldberg? I walk forward, and he looks at me, he looks at the passport, he looks at me, and then he throws open his arms. With this big smile, with gold teeth, I remember, he says, Welcome home. What? I have no idea what he's talking about. That's sweet. Then it hits me. Just what you said: it was the first time I had set foot on the continent of Africa.

[00:25:27]

Yeah, since you were here.

[00:25:29]

Oh, my God. That's... Did he have any sense of that?

[00:25:31]

It's in the passport that I was born in Africa, but I have no stamps or anything else.

[00:25:34]

Oh, wow. That is very heartwarming.

[00:25:37]

It was wonderful.

[00:25:38]

The delta between what you were expecting and what came, maybe the biggest delta of your life.

[00:25:44]

Oh, my God. Yeah, exactly. I was ready. He was going to clap me in handcuffs or something.

[00:25:48]

Welcome home with a hug. It's incredible.

[00:25:50]

Also for people with backgrounds like the both of you, you're expecting the worst thing to happen. So it's really nice to have evidence that it could go a positive way.

[00:25:59]

Totally.

[00:26:00]

Sometimes it goes the other way. Yeah. Is it in Edinburgh? At some point, you get introduced to, I guess, the concept of AI?

[00:26:08]

Yes. They actually have one of the few AI departments in the world there. I didn't know that.

[00:26:13]

Is this Edinburgh University? Yeah.

[00:26:15]

They had connections with Alan Turing and all the early work in AI. They had this department, and I remember going into this fair, all the departments had these little tables, geography, art history. I saw this little table with AI, and I was like, What? I walked over, and sure enough, they had a class that you could take in AI.

[00:26:34]

This is in 1987?

[00:26:36]

1981.

[00:26:37]

No, you graduated in '84. '81. Yeah, that's early, early. I loved it.

[00:26:41]

It was a great course. We had a little bit of robotics, a little bit of natural language and a little bit of computer vision, all these different aspects of AI. It was so much fun.

[00:26:49]

At that point in 1981, what was the robotics component of that course?

[00:26:54]

Basically, controlling these arms to move in certain ways.

[00:26:58]

The kind we're used to seeing in automotive assembly plants?

[00:27:01]

Those claws. The big thing was like, how can you get it to just move around on a table or something like that?

[00:27:06]

Why do you think that was so exciting?

[00:27:10]

Well, I loved machines like that. I guess it was partly my dad's influence. We had a go-cart when I was a kid. I was really into that. And rockets. Model rockets were a big thing. Also, of course, I watched those shows like Lost in Space and things, and I liked robots from that.

[00:27:24]

You then go and you get this PhD in computer science, and then you teach at USC for a minute, which is interesting.

[00:27:31]

Yeah. So I was there yesterday, and it's actually so nice to return. It was wonderful. Okay, the story is that when I graduated in 1990, there were no jobs for robotics. And this comes back to what you were saying earlier. Robotics has had these waves and it was in the trough at that time. It was a backlash to a lot of the over-optimism about robots. There's this thing called AI Winters.

[00:27:55]

Yeah. You know about this? We've heard it; a few people have talked about these.

[00:27:59]

Yes. Yeah, we had Fei-Fei Li. Oh, good. Yeah, I read her book. They give up on it, then something happens, they come back to it. Yes.

[00:28:05]

Since maybe the year 2000, it's only been positive. There's been no negative. The students don't know. They've never seen that. But Fei-Fei and I, we've lived through it, and it's quite dramatic when suddenly everybody decides it's not going to work and it's not useful and all the funding dries up. Yes. That time, it was very interesting. Japan was on the rise and everybody thought Japan was the future. There was a whole lot of hope that robots were going to do all the things we're saying today, and it didn't work. And so it was dismissed. It was a backlash.

[00:28:34]

They were like, We tried it, but it didn't work.

[00:28:36]

Exactly.

[00:28:36]

Am I wrong? Even Honda was one of the first big corporations that committed some real money to building a robot.

[00:28:41]

That was later. But in 1990, I was looking at jobs in Japan, and that was my only option. And then this job at USC came along. I was so lucky. I was so happy that they hired me.

[00:28:53]

I don't know how you teach here and then leave at any point. This is a flytrap, LA. The weather's just too good, especially if you're from Pennsylvania.

[00:28:59]

Especially if you're from Pennsylvania. I know. It was quite good.

[00:29:02]

Let's add, this is a sincere question. The most shocking thing that occurred to me when I moved to California, I remember it so clearly. I was at a Carrows restaurant, which is like a Denny's. I'm at a booth by myself, and there's a guy at a booth by himself, and he stares right at me, and I stare at him, and I'm now waiting for what happens in Detroit. Like, either he looks away, I look away, or I go, What's up? That whole exchange that's just unavoidable where I came from. And then he's just looking at me, and I realized, this guy's just looking at me, and I'm looking at him. We can do this here. That's California. I couldn't believe you could just look at somebody.

[00:29:39]

Okay, so my version of this is that I was visiting California, and we were driving from San Francisco down to LA. This was a couple of years before. Someone had figured out that you could go to Esalen in the middle of the night and just go in and experience the hot tubs.

[00:29:53]

Esalen is this crazy retreat, hippie-ish vibe, nudity is welcomed, right? That's the vibe.

[00:29:58]

Have you ever been there?

[00:29:58]

No, but my father, I'm sure, used to go there from Detroit.

[00:30:01]

What? Oh, my gosh. Yes. Well, it's been around for a long time, right? It's on the coast. I remember we go and it's dark and there's nobody there. I remember thinking, they're not going to open up in the middle of the night. What are you talking about? We were standing there and I was like, let's go. Then all of a sudden, this mysterious figure comes and unlocks this gate. We go in. Sure enough, a couple comes out of those shadows and it's these two beautiful women, and they come in there. So there's three guys. Are they clothed? Everybody's clothed at this point. But then we go in.

[00:30:29]

That's encouraging.

[00:30:30]

Well, that's the thing because they say, you can leave your clothes here and we'll walk down to the thing. I'm like, I don't know what they mean by that. I take off everything but my jeans. I go down there. As soon as I get down there, it's on the Coast, like right on the cliffs. I realize everyone's naked. And so they all get in. I take my pants off and I jump in. I'm sitting there at this moment with the stars above me and everything else and thinking, this is where I want to be. I'm never leaving. I'm never leaving. This is California.

[00:30:58]

I'm talking about a culture shift. California, baby. Pennsylvania.

[00:31:00]

After being there for an hour or so, we get up to leave and I pick up my pants, and I had put them down at a place where the water was rushing through, so they were completely soaked.

[00:31:10]

Seawater dungarees for the rest of the ride.

[00:31:13]

Exactly.

[00:31:13]

My only pair. So how do you end up at Berkeley?

[00:31:15]

I love Berkeley because I love the counterculture part, too. When I was a kid, there was also a head shop in Bethlehem. I got a little taste, like the Furry Freak Brothers, and I listened to the Grateful Dead. I don't know what a head shop is.

[00:31:28]

Where you go buy a water pipe and some tie-dye clothing. Yeah.

[00:31:32]

Okay. Yeah.

[00:31:33]

Countercultural.

[00:31:34]

They also had those blacklight posters. Yeah.

[00:31:37]

Remember those?

[00:31:38]

Yeah. They also had a smell to them, remember?

[00:31:41]

Yeah. Patchouli.

[00:31:42]

Patchouli, exactly.

[00:31:43]

I don't like it, but I'm glad you like it. Oh, interesting.

[00:31:46]

Okay, all right. Well, that was a big thing for me. It was somehow a little bit illicit and off-limits, and I found it very interesting to see what was going on there. I also love the Beat Poets and all the rebels. So, Berkeley was a big attraction. Stay tuned for more Armchair Expert, If You Dare.

[00:32:10]

Transform your home into a sanctuary of style. Tilestyle's Winter Sale, now on, brings you the perfect opportunity with a guaranteed 40% off all in-stock tiles, including the exquisite Porcelanosa collection. But that's not all. Discover incredible savings on luxury bathrooms, kitchens, and wood flooring, too. Only while stocks last. Shop in store at Ballymount or online at tilestyle.ie. Exclusions apply.

[00:32:36]

Tile Style, where your dream home becomes a reality. On January fifth, 2024, an Alaska Airlines door plug tore away mid-flight, leaving a gaping hole in the side of a plane that carried 171 passengers.

[00:32:51]

This heart-stopping incident was just the latest in a string of crises surrounding the aviation manufacturing giant, Boeing. In the past decade, Boeing has been involved in a series of damning scandals and deadly crashes that have chipped away at its once sterling reputation. At the center of it all, the 737 MAX. The latest season of Business Wars explores how Boeing, once the gold standard of aviation engineering, descended into a nightmare of safety concerns and public mistrust; the decisions, denials, and devastating consequences bringing the titan to its knees; and what, if anything, can save the company's reputation. Follow Business Wars on the Wondery app or wherever you get your podcasts. You can binge Business Wars: The Unraveling of Boeing early and ad-free right now on Wondery Plus. I like the physics history there as well. There's so many cool historical elements to Berkeley that I think make it such a special place.

[00:33:54]

I agree. There was this idea that it's really about being a rebel at some level and being able to buck the status quo. And that I've always admired, from Oppenheimer on. They discovered all these elements, and all these Nobel Prize winners.

[00:34:06]

That's what's rad is they're fringe, but they're crushing. They're doing it in a totally untraditional way, but they're still bringing in the results. Rigorous.

[00:34:13]

It's not about just being space cadets.

[00:34:15]

Being on acid all day.

[00:34:17]

Right. Because there is a Berzerkeley idea, right? But it's not, because you have to be grounded. Berkeley is not the easiest school to attend because it's big and it doesn't hold your hand.

[00:34:28]

I've driven up to just look at it and you don't get the sense of, oh, I'm entering this secure border.

[00:34:32]

It's very shaggy. There's no real border. There's no gates. By the way, USC has these gates. You have to have an ID and everything. Nothing like that at Berkeley. A hippie vibe is still floating around. It's definitely got a lot of coffee shops, which I love. But the rigor is that you have to work hard and you have to be willing to go get what you want. You can't get into this class? You go and you camp out in your sleeping bag next to the professor and talk your way in.

[00:34:55]

You have to be motivated, self-motivated.

[00:34:57]

Motivated and grit.

[00:34:58]

Angela Duckworth.

[00:34:59]

That's right. Yes, exactly. She's the queen. I think that word really sums it up. Students, when they ask, should I come to Berkeley or not, or even faculty, I say, well, if you want to be comfortable, maybe not. It's not the most comfortable.

[00:35:11]

Stanford is right up the road.

[00:35:13]

Exactly.

[00:35:13]

I wasn't going to say it. I think you'll love it there. Okay, so could you walk us through the history of robotics? When did this automation arrive? What are the pillars of progress in robotics?

[00:35:27]

Okay, so it goes all the way back to the ancient Egyptians. Oh, really? Yeah, because if you think about something that looks human or a machine that has surprising abilities, people have always been fascinated by that. They had these statues that used steam to move their arms and stuff. That has a long history.

[00:35:45]

But they were functional or these were ideas drawn?

[00:35:48]

No, they're functional. Wow. Yeah, with levers and chambers that would fill with fluid, and then they would raise their arm.

[00:35:54]

But obviously, they're using some hydraulics or something.

[00:35:58]

Yeah, it wasn't steam steam, per se, it was just like liquid that would fill up a chamber. But they had simple mechanisms. That goes up through the Greeks. Then there's all these stories about Pygmalion, the statue who comes to life. The story, of course, there is that he falls in love with the statue.

[00:36:15]

I don't know Pygmalion, so help me.

[00:36:17]

Pygmalion is a really good story to know. It's one of the Greek myths, and it's a sculptor who's renowned for being incredibly skilled. He, at one point, sculpts this beautiful woman, and it's so lifelike that he falls in love with her. How could he not? How could he not? But then it has a tragic end because he won't eat and it never returns his affections.

[00:36:36]

It's literally the movie Her. Yeah.

[00:36:38]

That's this archetype. It's Frankenstein. It's the same story that you see over and over again.

[00:36:43]

Falling in love with an inanimate object.

[00:36:45]

Falling in love with your creation.

[00:36:47]

Oh, your creation. Oh, that's a good detail. It's hubris. It's hubris. Oh, yeah, that's juicy.

[00:36:52]

That's so deeply rooted in Western culture that we're warned against these things. It's overstepping to try to take on this God-like role.

[00:36:59]

It's God because you're being a creator.

[00:37:02]

Exactly. There's a lot of idea that that's going to come to a bad end. I think that's largely what's behind all these fears.

[00:37:08]

It underpins a lot of our current fears. Totally. Even if we don't believe in God, we have some sense that we're not supposed to tamper. Let's just use nature as God.

[00:37:16]

That it's in the back of our minds that if we do this, there's going to be some price to pay. It's going to run amok. That's the story with Frankenstein, right? It runs amok. Then the Golem story, it precedes Frankenstein. In the 16th century, there were a lot of pogroms in this little village, so a rabbi makes a robot out of clay, just a being out of clay. He puts these words on its forehead that bring it to life. And then it goes around and it basically defends against all the bad guys and defends the community. But then when it's done, he's like, well, now why don't you go fetch some water for me? Then he goes to sleep. Anyway, he wakes up and he's drowning because the golem just keeps fetching water over and over again. He has to stop it, but he doesn't know how. So he reaches up and wipes off the words on the forehead, and the golem collapses on top of him and kills him.

[00:38:06]

Now, that one specifically, I hear this anecdote all the time: that you could deploy AI for some simple, innocuous task, but it could determine that, to execute that task, it would be best to kill all the humans. This is a very common one that goes around right now.

[00:38:22]

If you want to save energy, get rid of these humans. If you're not careful...

[00:38:26]

The most efficient way is to do that.

[00:38:28]

It has such a myopic view. It's a command it's following. Right. Oh, great. Okay, so then when do we get into something that is actually practically helping us? I'm an idiot. I'm thinking Henry Ford is when this starts.

[00:38:40]

No, it's actually really good. So the Industrial Revolution with the invention of the steam engine, all those things, all the machinery starts coming out. And so Henry Ford is definitely part of the assembly line and the car. But robots actually also start at Ford. There are some very early robots in the late '50s, early '60s. They're like a programmable machine. And so you can basically tell it to go from point A to point B. And so it's very primitive. But they're in the World's Fair and people start talking. Oh, and by the way, the word robot is coined in 1920.

[00:39:10]

By a sci-fi writer? Yeah.

[00:39:12]

By a sci-fi writer, it's actually a playwright in Czechoslovakia. So interestingly, it's right around the time of the pandemic, the 1918 pandemic. And also, I think significantly, Sigmund Freud wrote this essay called The Uncanny in 1919. So a year later, this play comes out about essentially robots. That's the first time this word had ever been used. Oh, wow. Yeah, robot, which means worker, or a forced worker, in Czech.

[00:39:40]

Now, the '60s Ford robot, was it being employed to move objects or was it like a CNC cutting device?

[00:39:47]

It's much like a CNC. It's computer-numerical-controlled.

[00:39:51]

They were able to program these spinning drill bits with three axes, right? It could say, go up, go left, go right. Through those little three-axis movements, it could carve out the perfect shape of a part from a block of metal. Steel or something. Steel, whatever. So you go like, oh, I want this rim for your car. I start with this big block of aluminum. This thing, just with a drill, it can go through. It has all these steps programmed in, and it just, like a sculptor, chisels out everything you don't want.

[00:40:19]

And it's programmable. So then you can make a whole bunch of them over and over again. And that's still used, by the way, very heavily.
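That stored, replayable toolpath is easy to sketch. This is a minimal illustration in Python, not any real controller's language (real machines are typically programmed in G-code), and the coordinates are invented:

```python
# A stored "program": plunge into the block, cut two passes, retract.
# Each entry is an invented (x, y, z) target in millimeters.
PROGRAM = [
    (0, 0, 5),     # start above the block
    (0, 0, -2),    # plunge 2 mm into the material
    (50, 0, -2),   # cut a straight pass
    (50, 40, -2),  # cut the next side
    (50, 40, 5),   # retract clear of the part
]

def run(program):
    """Replay the stored toolpath; a real controller would drive motors here."""
    positions = []
    for x, y, z in program:
        positions.append((x, y, z))
    return positions

# Determinism is the whole point: every part gets the identical path.
print(run(PROGRAM) == run(PROGRAM))
```

On a real machine each tuple would become a linear move; that repeatability is what made the early programmable machines useful in factories.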

[00:40:25]

And then what's the next big leap forward?

[00:40:27]

There's a lot of fear around that time that robots are going to take over; in the newspapers, there's all these articles that they're going to do all the work, but that doesn't happen. And the first robotics conference is in 1984. Then there's a big research field that starts to grow around robotics. But then it started really taking off in factories, especially automotive. The big thing that it's used for is welding and spray painting.

[00:40:50]

The welding is awesome.

[00:40:52]

Yeah, the welding is fun because you get those... It's like little pincers just come in, boom, boom, right?

[00:40:55]

And welding sheet metal is very hard to do.

[00:40:58]

For humans.

[00:40:58]

You burn right through it so easy.

[00:41:00]

Right. So it's very delicate, but then you're just basically doing the same thing over and over again. So it's repetitive. And that is very good for factories and also some of the assembly, putting together various devices and appliances and things like that. That's a big wave. And that's also happening in Japan and other places. So that's growing, the industrial robotics. Then the next big breakthrough comes in 2012: the breakthrough of deep learning and AI.

[00:41:23]

How does that open the door for us?

[00:41:25]

Let me back up a little bit, which is that when I did my PhD, I was interested in this incredibly simple problem of just trying to pick up objects, just to grasp. It's something everybody does. Babies do it. I was actually clumsy as a kid. I later thought maybe that's why I wanted to study that, but it's still an open problem. Really? Getting robots to pick things up.

[00:41:45]

What's it called, Moravec's Paradox? What is it?

[00:41:47]

Oh, yeah, yeah. Moravec. There we go. Okay. Moravec was actually at CMU when I was there. He was this very eccentric guy. He wrote this book, and he was saying, It's a paradox that what's easy for robots, like lifting a heavy car, is hard for humans. But what's easy for humans, like stacking some blocks, that's hard for robots. And that's still true today. Yes.

[00:42:11]

You have a great TED Talk. I urge everyone to watch it.

[00:42:15]

It's called Where are the Robots? Where are Robot Butlers?

[00:42:19]

That is not the title of it, but that is really close. I found it. Why Don't We Have Better Robots Yet? That's the title of your TED Talk, and it's very, very good. Yeah, so it's incredibly hard for a robot to grasp things. There's a bunch of reasons, right? Yeah.

[00:42:35]

It's very counterintuitive because for humans it's so easy. But we've evolved over millions of years, just like dogs and crows. Crows are able to pick up things amazingly.

[00:42:46]

They can put coins in slots. They can do eight-step problems.

[00:42:49]

They're far more dextrous than robots, for sure.

[00:42:52]

I would much more trust a crow to handle my business than a robot currently.

[00:42:56]

Exactly. With robots, there's a lot of uncertainty in the environment. Even if you tell a robot to go to one specific spot, because of the motors and levers and gears that are in it, it won't go to that exact spot. You want it to put its jaws or something at a specific point to grasp this cup; it'll be slightly off. That will cause it to miss or drop the object.

[00:43:17]

Because every single movement is going to have some margin of error, and then that's going to compound. The more movements you add, all these different little margins of error start stacking up.
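That stacking of errors can be shown with a tiny Monte Carlo sketch. The half-millimeter per-move error is an invented figure; the point is only that for independent zero-mean errors, the expected miss distance grows roughly with the square root of the number of moves:

```python
import math
import random

def mean_final_error(n_moves, per_move_sigma=0.5, trials=2000, seed=0):
    """Average distance from the target after n_moves noisy motions (mm)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Each motion adds an independent Gaussian error on each axis.
        dx = sum(rng.gauss(0, per_move_sigma) for _ in range(n_moves))
        dy = sum(rng.gauss(0, per_move_sigma) for _ in range(n_moves))
        total += math.hypot(dx, dy)
    return total / trials

# 25 motions miss by roughly five times as much as a single motion.
print(mean_final_error(1), mean_final_error(25))
```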

[00:43:28]

Exactly. And then the other is sensing. We can take a high-resolution picture of an environment like this room. But there's no sensor that can give me the depth, the three-dimensional part of this room.

[00:43:38]

What if you used a 3D camera, so you had bilateral?

[00:43:41]

There's errors. There's a little noise in those things. If you look at the result, there'll be a depth map, which is like a 3D camera image, but you'll see lots of noise and imprecision and mistakes in it. Those are inevitable. There's no camera that really works reliably for 3D.
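That sensor noise can be mimicked in a few lines. A sketch with an invented noise level: even a perfectly flat surface, read through a noisy depth sensor, never comes back flat.

```python
import random

def sense_depth(true_depth_m=1.0, width=8, height=8, noise_sigma=0.01, seed=1):
    """Simulate a depth map of a flat surface with additive sensor noise."""
    rng = random.Random(seed)
    return [[true_depth_m + rng.gauss(0, noise_sigma) for _ in range(width)]
            for _ in range(height)]

depth = sense_depth()
perfectly_flat = all(abs(d - 1.0) < 1e-9 for row in depth for d in row)
print("perfectly flat reading?", perfectly_flat)
```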

[00:43:56]

Okay, I don't know if this is the time, but this is one of the parts I want to talk about, as I've been frustrated with the exuberance of AI. One thing that has occurred to me is that our fascination with ourselves seems to be about our intelligence, and that, in fact, is what AI is trying to replicate and/or surpass: our executive function, our problem solving, all these different things. But if you look at us as an animal, our motor control took 300 million years to evolve as mammals, and our big brain, everything we're trying to replicate in the AI space, started six million years ago when hominids arrived. You have so much time spent evolving to do all these tasks that we think are just standard. And then we think this last-minute thing that took the least amount of time to evolve is somehow superior to that. So when I look at this, it's like, forget artificial intelligence; try to figure out artificial physicality.

[00:44:52]

That's such a good way to put it. And that's exactly right. So if you look at that history, hundreds of millions of years of evolution went to mobility and being able to manipulate, just the opposable thumb and all of those things. And so all these other things, like math, are relatively very recent.

[00:45:08]

Yeah, we think that's the high watermark, but I think the most impressive thing is us moving through time and space, smelling, touching, these five senses. I would imagine if you could quantify it, that's like this much, and then our intelligence is like this much.

[00:45:20]

I think it's helpful for people to understand why we've made all this progress in these, quote, hard problems like playing chess and go. Yeah. But we really haven't made much progress in just being able to clear the coffee table.

[00:45:32]

Okay, so then my question is, is that a software or hardware problem? So one of the things I think about, it must be so hard when you're trying to design a robot, is you're limited to all of these pulleys to operate a hand the way ours moves. And as much as we are pulleys, we're also not, right? Because the muscles are such a unique way to operate the pulley. It opens up infinite options of movement, whereas these robots are really confined to this, this, this, this, right?

[00:46:02]

You're right. The muscles in the human body: there are hundreds of muscles and bones, and they pull in all these nuanced ways. We have this skin that's very complex. What's amazing is how much we don't know about human biology. We don't understand how touch works. Touch is incredibly complex. Really? We can feel things that are so small, much smaller than a human hair. We can perceive very complex vibrations and other things.

[00:46:28]

You add in temperature, we can feel. You add in moisture. Have you ever read the book An Immense World? Yes.

[00:46:34]

You have. Oh, yes.

[00:46:35]

By Ed Yong. I love it. Holy fuck. He's fascinating.

[00:46:38]

I love that book.

[00:46:39]

You get into this mole that has this star-shaped nose. That's a touch sense. Definitely. That touch sense can detect the movement of moisture in the sand it's exploring at a distance of 12 inches. It actually can see through everything, but it's not seeing through, it's not smelling through. It's touch feeling through. Yes.

[00:47:00]

That's great.

[00:47:01]

I know. How would you replicate that with a machine?

[00:47:03]

Exactly.

[00:47:04]

Also, humans don't even have the ability to do a lot of the stuff that these animals can do. Totally. Which one are you even aiming for? Robotic touch sense is extremely primitive.

[00:47:14]

What do you think would be the mechanism that could replicate it? Would it be electricity? What would it be?

[00:47:20]

When I was in undergrad, I tried to use electricity to do that, and it failed. It did not work. But what people are using now is light. They transform the touch into light. Imagine that you have a little camera in your fingertip, looking inside at a pad from the bottom. When the pad gets indented, you see the pattern of what it's touching.

[00:47:41]

Okay. It's like a membrane, and above that, the membrane is being observed. That's exactly what it is.

[00:47:45]

But what happens is the membrane gets rubbed off or over time, those sensors get deformed, and so it doesn't solve the problem. It's just the latest method we're trying.
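The camera-under-a-membrane idea (the principle behind optical tactile sensors such as GelSight) boils down to comparing the membrane's resting shape with its deformed shape. A toy sketch with invented numbers, not any real device's pipeline:

```python
def contact_map(baseline, pressed, threshold=0.2):
    """Mark cells where the pad is indented more than `threshold` (mm)."""
    return [[1 if abs(p - b) > threshold else 0
             for b, p in zip(brow, prow)]
            for brow, prow in zip(baseline, pressed)]

baseline = [[0.0] * 4 for _ in range(4)]  # resting membrane
pressed = [[0.0] * 4 for _ in range(4)]
pressed[1][1] = pressed[1][2] = 0.5       # a small edge pressing into the pad

print(contact_map(baseline, pressed))
```

A real sensor images the membrane at camera resolution, which is what makes the approach so sensitive, and also why wear on the membrane degrades it.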

[00:47:55]

This is why the Roomba worked, because it didn't have to use any digits or anything. It was just that random movement.

[00:48:02]

The Roomba replicated the very first multi-cell organism. What it actually ended up knocking off was something that can only move forward and then turn and move forward and turn and move forward. Some paramecia. What we've achieved is like, that's where we are. Multi-cell organism. You're right.

[00:48:21]

The Roomba is the most successful robot of all time. So when they count robots out there, they count these Roombas where there's 10 million of those, but that's the robot, right? It's very simple. It's basically just random motion. And over time, it does cover your carpet, and it's pretty reliable. But of course, it also has this problem that it gets stuck all the time and tangled up in stuff. Sure. And so it's not ideal, and it can't go upstairs. Also, a lot of people bought them as a novelty. There's a lot of them sitting in the closet somewhere.
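That bump-and-turn strategy is simple enough to model. A toy grid-world sketch, where the grid size, step count, and turn behavior are all invented; real Roombas move in continuous space with richer behaviors:

```python
import random

def coverage(steps=20000, size=20, turn_prob=0.05, seed=42):
    """Fraction of a size-by-size room visited by bump-and-turn motion."""
    rng = random.Random(seed)
    dirs = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    x = y = size // 2
    dx, dy = 1, 0
    visited = {(x, y)}
    for _ in range(steps):
        if rng.random() < turn_prob:      # occasional spontaneous turn
            dx, dy = rng.choice(dirs)
        nx, ny = x + dx, y + dy
        if 0 <= nx < size and 0 <= ny < size:
            x, y = nx, ny                 # keep driving straight
            visited.add((x, y))
        else:
            dx, dy = rng.choice(dirs)     # "bump": turn a random way
    return len(visited) / (size * size)

print(f"fraction of floor covered: {coverage():.0%}")
```

With no map and no plan, enough random motion still visits most of the floor, which is the Roomba's whole trick; what it can't model is getting tangled in a cable.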

[00:48:50]

Yeah, they want to see it work once, or maybe even they bought it and they got intimidated about even turning it on.

[00:48:55]

Yeah, that would be my thought. It is tricky.

[00:48:57]

Even understanding it, I don't really want to deal with pulling it out of the box to figure out how I deploy this thing.

[00:49:01]

I actually have a drone that's in that category. Oh, yeah.

[00:49:03]

I have two drones. I'm like, I'm not going to be able to figure out how to deploy this thing.

[00:49:06]

No, I need six hours to basically figure it out, and it's sitting in the box.

[00:49:11]

You're right. I'm like, That's going to require a day, and I don't know if I'm going to enjoy flying it enough to justify a day of me figuring it out. Okay, good. I like that we're agreeing that we're really underestimating how complicated our physical abilities are, and we're really overplaying our mental capabilities.

[00:49:28]

Right. Everybody's impressed. The analogy: if you say, okay, you can beat the best person in the world at chess, then that means you have a very powerful machine, artificial intelligence. Now it can beat the best person at Go, and nobody can play Go or chess that well. So you think this is smarter than everybody. That's the way people reason. But then it can't drive a car. There's a whole bunch of things it can't do. Anything physical, just picking up or opening this can like I just did, that is impossible for a robot.

[00:49:55]

Tell people about your folding project. I don't know what year you're into this, but it's one of your projects.

[00:50:00]

Is Folding Clothes?

[00:50:01]

Folding Clothes.

[00:50:02]

That's one that I think everybody would like to have.

[00:50:05]

If it just sat on your washing machine and you could dump it in a barrel, and then, fuck. Yeah, that would be- Also, can you do a dishes one?

[00:50:12]

Like putting them... Have you ever watched them do this? No.

[00:50:16]

There's nothing funnier than watching a robot try to cook breakfast or do dishes. It just smashes everything.

[00:50:22]

Splatters, broken glass. But I agree, taking things in and out of the dishwasher would be great. Just unloading and loading the dishwasher. Someone would say that their dishwasher is a robot. It is very successful. See, there's this idea that if you can use humans and robots together, that's very powerful. That's what I call complementarity. If you figure out that you have a machine, you can use it, but you have the human do the parts that we're good at and then let it do the parts it's good at. Together, you have a great system. And the dishwasher is a beautiful example. And the washing machine. They do all this, but we have to load it and unload it. In the laundry aspect, it's also that you want your clothes to be folded at this precise time, right when they come out, because then they're at the perfect stage. No wrinkles. No wrinkles. If you do it too soon, they're soggy. If too late, they get all wrinkled. Having a machine to do that would be quite good. There's some really interesting new results that just came out about this. But we've been working on it, too.

[00:51:14]

One of the ideas is you fling the clothes up and you use air to help smooth them out. Like humans do that all the time, right? You snap. That has only been really done in robotics in the last five years.

[00:51:27]

Oh, my God. This is huge. Good job. I'm just so annoyed all day long.

[00:51:32]

Yeah, that's a perfect time for me to ask you: anyone who knows your work knows you'll experience failure. I don't know a field that would surpass it. It's just failure, failure, failure. How do you stay optimistic?

[00:51:46]

I'm super optimistic. I love working on this topic, and I feel like we have a lot more work to do. So that's also encouraging. I don't worry that it fails. I actually love the times when it does succeed. That's super rewarding knowing how hard it is.

[00:51:59]

It's why I'm like a fan of hockey instead of basketball. Why? They're only going to score once a game. Whereas basketball, they're going to score 110 points. So you're like, Oh, yeah, I don't get it. But when I get it, boy, it's...

[00:52:11]

That's interesting. Never thought of it that way. Yeah, because in grasping, we've actually made some good progress just in picking up objects. And that was the breakthrough. So coming back to this timeline. So in 2012, there was this breakthrough in vision. And suddenly deep learning, this new way of building these very large networks, using lots of data and using GPUs, graphical processing units. It's basically a new computer. It has this breakthrough where suddenly machines are able to recognize images and things in images. Like it'll say that's a book and that's a cup and that's a microphone. That's part of Fei-Fei Li's work. Exactly. Fei-Fei Li is actually at the center of this. She builds this data. So she plays a pivotal role. When all that gets put together, suddenly it's a revolution. And that's a big moment in robotics and history. We apply it to robotics. And so her system was called ImageNet. And so the system that we designed for grasping we called Dex-Net as an homage to her. Oh, that's great. Yeah. Dex-Net was our system. We worked on it for five years, and we basically applied deep learning techniques to be able to figure out where to grasp objects.

[00:53:11]

It started working better than anything had been done before. I was so surprised because I had been trying to work on this problem, and then I suddenly was able to pick up almost everything we could put in front of it. Oh, wow.

[00:53:24]

Well, there's this critical mass point for all these things, isn't there? In her book, she needed such a humongous pile of data, and halfway there gets you nothing. But it's like stagnation, and then the acceleration is probably shocking for you.

[00:53:37]

Yeah, no, that's a really great point. There is a critical point when you get enough data and suddenly it starts working. It took a lot. It was 80 million plus images that Fei-Fei had put together. In our case, we had seven million grasp examples that we had found. Then it started to work and it was like, Oh, this is so exciting.
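The shape of that learn-from-many-examples approach can be echoed in miniature. This is emphatically not Dex-Net: a two-feature logistic model trained on synthetic labels stands in for a deep network trained on millions of real grasps, and both the features and the success rule below are invented:

```python
import math
import random

def make_example(rng):
    # Hypothetical features: distance of the gripper from the object's
    # center, and misalignment with the surface. Labels use a made-up rule.
    offset = rng.random()      # 0 = centered, 1 = far off
    misalign = rng.random()    # 0 = square on, 1 = sideways
    success = 1 if offset + misalign < 0.8 else 0
    return (offset, misalign), success

def train(n=5000, lr=0.5, epochs=30, seed=0):
    """Fit a logistic 'grasp quality' model by stochastic gradient descent."""
    rng = random.Random(seed)
    data = [make_example(rng) for _ in range(n)]
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = 1 / (1 + math.exp(-(w1 * x1 + w2 * x2 + b)))
            g = p - y                  # gradient of the log loss
            w1 -= lr * g * x1
            w2 -= lr * g * x2
            b -= lr * g
    return w1, w2, b

def quality(params, offset, misalign):
    """Predicted probability that a grasp with these features succeeds."""
    w1, w2, b = params
    return 1 / (1 + math.exp(-(w1 * offset + w2 * misalign + b)))

params = train()
# A centered, aligned grasp scores high; an off-center, sideways one low.
print(quality(params, 0.1, 0.1), quality(params, 0.9, 0.9))
```

The real systems differ in every detail, but the workflow is the same: score candidate grasps with a model trained on a huge pile of labeled examples, then execute the highest-scoring one.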

[00:53:56]

What could it do? You could put it in a new environment. It could evaluate the environment and then you could ask it to do something.

[00:54:02]

It was just grasping. Think of it with a very simple gripper, just a parallel pincer. You would put a bin of objects in front of it and it would start picking them up one by one and putting them out. We would test it by going into the basement, the garage. We'd just throw all kinds of stuff in there. It would just pick them up consistently and clear the bin. We would try and fool it.

[00:54:20]

You must have been elated.

[00:54:21]

It was so much fun. There's a story where we got invited to show this to Jeff Bezos. He invited us down to this event in Palm Springs. He said, Bring the robot. I want to see this. It had never left the lab before, so it was a big deal to put it on a truck. We weren't sure it was going to work. We had 300 objects that we brought with us, got it all set up. He came in the booth and it was working. We were so relieved. He was trying it with different things, and it was just like it was in the lab. Everything was going great. Then his assistant, standing there, took off his shoe. He said, Well, can I try my shoe? I remember my mouth goes dry because of all the things we've tried it with, we've never tried a shoe. I have no idea. But what can we do? We have to say, go ahead.

[00:55:07]

Otherwise, it feels all mapped out, maybe. Yeah. Yeah, right. Like the panties on the ground.

[00:55:10]

Right, exactly. He drops his shoe into a bin, and we're all sitting there, and the robot just reached over and picked up the shoe. It took it right out. I remember calling Tiffany, my wife, and I said, This was the best moment of my life. She said, What about our wedding? Yeah, exactly.

[00:55:31]

But you can't tell it like it has the bin and it has all the objects, and you can't say, Pick the shoe.

[00:55:38]

That's exactly right. That's very important. You can't tell it to select a specific object.

[00:55:43]

You can't say, Can't you go through this bin and find me all the panties? No. Exactly.

[00:55:47]

That's called rummaging. That's very interesting. We're looking at that now. Much different problem, much harder.

[00:55:52]

That could be incredible for recycling.

[00:55:54]

Yes. Also, if you think about it, you do this all the time. If you reach in your pocket and you want to pull out a pen, you can always find it. Or in your purse. People are very good at that. What's going on is very complicated. A robot cannot do that at all. But pulling one thing out of the bin is really interesting, because you have to move things around a little bit, see a little piece of it, then pick it up. That's actually very important for warehouses. For Amazon to be able to deliver packages, you have to be able to rummage and find the thing you want, and that's still unsolved.

[00:56:21]

This might be a good moment to bring up something you say in that TED Talk, which I think would shock people: we are much better at predicting the trajectory of an asteroid that's a million miles away than we are at predicting how a plastic bottle on a table will move if we poke it.

[00:56:40]

Yeah, because there's physics. We really don't understand friction. And friction is so important. It lets us all sit here without things slipping around. Friction is so important, but it's a very, very complex process. We can approximate it, and there's this model, Coulomb friction, et cetera. But to really get friction right is actually impossible. If I want to push something across the table, the way it's going to move and react to my pushing force is going to depend on what's underneath it. If there's one grain of sand, it's going to change it.
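The Coulomb model Ken mentions is a one-line inequality; a minimal sketch of the standard statement, with the usual symbol names:

```latex
% Coulomb friction: the tangential (friction) force F_t that a contact
% can resist is bounded by the normal force F_n scaled by the friction
% coefficient \mu.
|F_t| \le \mu \, F_n
```

The difficulty he's describing is that the coefficient and the pressure distribution under the object are unobservable, so the bound tells you when slipping can start but not how a pushed object will actually translate or rotate.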

[00:57:09]

If it's in the right corner, the whole thing is going to rotate clockwise.

[00:57:12]

Exactly. If it's in the left corner, it's going to rotate the other way. But I can't know that. The robot can't know it. So right there is one of the great mysteries of nature. You don't have to talk about quantum physics. That is one unknowable thing that's sitting right in front of us. Oh.

[00:57:25]

My God. Wow.

[00:57:27]

And we deal with it all the time. So you might say, what do we do? Well, we compensate. When we reach for a glass, we don't just reach our gripper right up to it. We scoop it up.

[00:57:37]

We're almost anticipating the many different ways it could go wrong.

[00:57:40]

Exactly. We haven't figured out how to do that for robots yet.

[00:57:43]

Oh, man. And again, is that a software problem or a hardware problem?

[00:57:47]

It's largely software because we don't have the sensors, we don't have the control, we don't understand the models of physics. But I also think we need better grippers, too. But that's a whole other story. But the bottom line is that we're far from anything approximating human level performance. And there's been so much hype. And that's what I worry about.

[00:58:06]

Do you worry this whole field will go through one of those other stagnation patterns that we've already seen a bunch of times?

[00:58:12]

I really do. I think we're on a collision course with a bubble that's going to burst because people are expecting that we're almost there, especially when they see these videos.

[00:58:23]

Okay, great. Tell me about these videos because you watch them and you think we're there. And this is a big problem.

[00:58:28]

Those guys that are running. Right.

[00:58:30]

Okay, the first thing to ask is how many takes were required? Many times they get to work once, and that's the video they show.

[00:58:38]

Out of hundreds of takes. Hundreds? Yeah, you show the clip of a robot doing a backflip, which is mind-blowing. You're like, well, fuck it, we're there. That thing is going to work on my car next month. Right.

[00:58:46]

That was one in 200 takes.

[00:58:48]

In the other 199 takes, this thing is flying off the table and smashing around. It's violent when it gets it wrong.

[00:58:55]

In a research lab, that's what we're dealing with. It's always failing. You'll be lucky if you get it to work once. So if you put it on YouTube, you should say the success rate is one out of 200 or something. But nowadays, there's so much hype that people are not putting those caveats in there.

[00:59:08]

Because you're an academic, and this becomes one of my next questions: a lot of these videos, I'm imagining, are coming from startups that are trying to raise funding. So they're heavily incentivized to mislead you.

[00:59:18]

Someone might say, I have to do that. That's how I'm going to get the next round of funding. But I really cringe at that, because it goes against my instincts of, as I like to say, under-promise and over-deliver. I'm going to be really careful. I've never over-promised about what we're doing or a result in a paper. We're always careful. Don't exaggerate the result. It really is a problem for robotics, where you see the videos, and then the other thing is they can be teleoperated, and there's a human behind the curtain.

[00:59:45]

Okay, so what about this Tesla robot?

[00:59:47]

The one Kim Kardashian got. Optimist.

[00:59:48]

What's it called? Optimus.

[00:59:50]

She owns one?

[00:59:51]

I guess Elon gave her one. He gave her one? I guess she's the only one.

[00:59:55]

What a romantic gesture.

[00:59:56]

Yeah, I know. What's the deal?

[00:59:58]

Okay, so I know this is going to disappoint people when I tell you this, but it's far from being humanlike in its abilities. In dexterity, it's very, very weak. It looks good, they're beautiful designs, and they actually have made progress in the motors and the hardware, so it can move more smoothly. They're also getting very good at walking. There's definitely something positive there. But what can they do? See, this triggers that old idea that we've had in the back of our mind, which is we want these things. We've been reading about them, watching them in movies and TV, we think they're going to come, and yet there's this huge gap. If you watch the demos carefully, they're being somewhat teleoperated. That means there's a human moving them around, essentially remote controlling them. Or if we watch what their hands are doing, they're very primitive. Now, there's a lot of work trying to address that. Stay tuned for more Armchair Expert, if you dare.

[01:01:04]

Okay, so now I want to ask you, how impacted are you by this general paradigm shift, where most technology for the 20th century was coming out of military and government spending, DARPA, MIT, these great institutions, academia. And then you saw these private corporations amass trillion-dollar valuations that the government can't really compete with now, and academia can't compete with. What do we think the price of that will be? What are your thoughts on that whole realm?

[01:01:39]

So it's a great question, because it just came up yesterday at USC. People are talking about how there's not so much funding available from the government agencies, DARPA and others, that used to fund a lot of this research when it was more esoteric. Now they're saying, well, the companies are doing it, and yet it's not necessarily being shared, so it's closed and complicated. But fortunately, a lot of the companies are pretty open. Google and NVIDIA and many others are actually publishing their results. We work with them. They work with us. And so robotics is surprisingly open. The minute they get a result, they publish it, and we all see it. You see it on Twitter or arXiv. So there's a lot of communication. And by the way, all this tension between China and the US, it's really interesting, it doesn't exist in academia. We are freely exchanging information. Students are coming from China. I just came from a conference where half the papers were from China. It's a free, open exchange.

[01:02:32]

No, science always, and it's so cool. It's so punk rock. They're always like, before I'm German or I'm Hungarian or I'm American, I'm a physicist. The collaboration and how everyone got along is something to be really modeled. They have a higher calling, which is knowledge.

[01:02:47]

Yeah, but wasn't that part of the whole thing with Oppenheimer? I mean, they were trying to keep it to America.

[01:02:53]

Yes, Oppenheimer had some folks working under him that were spies for Russia and were leaking our nuclear technology. So, yeah, I guess that's not to say it was devoid of any statecraft. But just in general, if a Chinese roboticist has a breakthrough, he doesn't give a fuck where the other guy came from. He wants to know about the robotics breakthrough.

[01:03:11]

Well, you don't, but do they feel like we got to protect? I mean, obviously not, which is amazing. I'm shocked because that's our whole thing. We talk about it all the time with AI. Like, well, if we want to put the brakes on it- China is not going to.

[01:03:23]

Yeah, exactly.

[01:03:24]

Other countries aren't. So what are we doing?

[01:03:26]

A lot of it is open now. Facebook or Meta has actually been quite exemplary in that they share all the models.

[01:03:32]

They're doing their AI as open source, which is unique.

[01:03:35]

Really wonderful. Actually, we use those tools all the time, and it's very, very helpful. Then others like OpenAI, we use the tool, but we don't actually get access to the source code, so we don't know how it's doing it behind the scenes. But they let us access it and use it, and it's doing some incredible things. But I guess your question is about government versus private sector.

[01:03:55]

Let's just take a hard example really quick. I'm not an Elon Musk hater or lover. I respect him as a modern-day Edison. He is a once-in-a-generation engineer, and I respect that. I agree. His other stuff is questionable to me, but whatever. Even with that said, I can't say that I love that he has 5,000 satellites in orbit around the planet, en route to having 12,000. It's just an interesting level of power to give one individual, when I think I'd feel safer if the University of Michigan had 8,000 satellites, or the US government, maybe further down that list, but still I'd prefer that. It's a dicey situation when someone has a monopoly on a technology that's hugely impactful.

[01:04:39]

It's a great point. And not only that, but that individual also has a lot of power in terms of, let's say, Twitter and X. Media. Media. And seems to know how to use it very effectively. So I think it's a concern. Coming back to the fear: the reason roboticists, and I'm not the only one, almost all of us, are not fearful that robots are going to take over and eliminate humans is that they're just not that sophisticated, by a long shot. And certainly the other fear is about jobs. I don't see them taking over jobs and putting people out of work, because all the jobs that require manual labor are extremely difficult to automate.

[01:05:16]

I looked it up this morning. So 39% of US jobs are still manual labor.

[01:05:22]

Oh, that's a good number.

[01:05:23]

But I think even more importantly, that represents one third of your waking day. You have another third of your waking day where you're still going to have to do your laundry, make dinner. You have this whole sector, too, that no matter who you are, is still manual. You add those two together, and now you're really looking at a number like 78% of the stuff done on planet Earth in a day being manual.

[01:05:43]

A huge amount of those jobs cannot be replaced. The gardener. The mechanic.

[01:05:48]

The mechanic. The dream for me is I've got a robot that's maintaining all this bullshit I bought. Diagnose why the car is not running, fix it, get in there. Think about how intricate working on an engine is. Forget it. That's like 100 years away. Exactly.

[01:05:59]

Mechanics are so dexterous. They can reach around things and feel, they can take off screws.

[01:06:04]

Bell housing on a trans, you can't see anything.

[01:06:06]

That's the thing, that's way beyond robots. And I think there's a shortage of workers in the trades, because we're aging and we need more people to be doing all these jobs. They're not going to be unemployed.

[01:06:17]

Yeah, right. In fact, it's probably the most job security.

[01:06:21]

People are realizing that they're actually in demand, which is great. They're actually getting higher wages.

[01:06:26]

Okay, so my theory on this is that the people being interviewed for the media, where we're getting our information about AI, the people being consulted and interviewed, are the actual people whose jobs could be replaced. So they are very misled by their own experience. You're talking to computer programmers about it. All these domains where AI actually will threaten those jobs, and they're the mouthpiece of this whole thing. It's all very lopsided because those are the jobs.

[01:06:55]

Well, I have a theory about this. I think about some of the most vocal doomsayers who are saying we're on the verge of these things taking over. It was very telling. The person who just won the Nobel Prize, Geoff Hinton, he said, We have never encountered anything more intelligent than ourselves. That was his big line. I read that and I thought, Wait, I encounter something more intelligent than myself every day. I mean, there's tons of people around that are more intelligent than me. I'm not afraid of them. They don't freak me out. I want to talk to them. I want to get to know them.

[01:07:25]

We spent a week with Bill Gates and we loved it. We loved it. We weren't scared.

[01:07:29]

Right. Exactly. And so most of us aren't afraid of something more intelligent than us. But there's a small group, and I think they think they're the most intelligent people.

[01:07:37]

That's interesting. For someone to be smarter than them, where they're at the most upper echelon, is scary to them. It's a threat to their identity.

[01:07:46]

If the AI robot is good at drifting in a car, I'm fucked because I'm defining my whole identity and self-esteem on my ability to do that. Oh, wow. That's a great insight.

[01:07:56]

That's what I think. It's not really something the rest of us need to worry about.

[01:08:00]

We're constantly bumping into people smarter than us, and it's fine. Yeah.

[01:08:03]

It's great, actually.

[01:08:04]

It's great. Actually, I think this is the thing about AI: it can actually be this interesting partner for us and can enhance our world and our abilities. Well, like this thing I was talking to you guys about.

[01:08:14]

You sent us.

[01:08:15]

Notebook LM.

[01:08:17]

Is that what it's called? Yes. You sent Monica and I a podcast that is entirely AI. There's a male host and a female host. Yes. And you said, do you think this thing was trained on you guys?

[01:08:29]

Yes. Which was so flattering.

[01:08:32]

Guys, do you not hear that? I mean, it sounds so much like you.

[01:08:36]

By the way, it's great, too. I was listening to it. I was like, it sounds like NPR. I would listen to this if the information was good.

[01:08:42]

It's so good, but it has a rhythm that is very much like you two.

[01:08:45]

When you said it, I heard it. Do they ever find panties, though? That's what keeps us human.

[01:08:48]

That's what keeps us human. But it has these amazing insights. It'll just come up with these analogies and things, and they weren't in the document that you gave it. It just came up with these other things. But the back and forth, the pacing and everything else is so good.

[01:09:02]

The cadence. Yeah, they got it.

[01:09:04]

It has a sense that they're comfortable with each other, their back and forth.

[01:09:06]

You can feel their rapport. Rapport? Yeah. Perfect.

[01:09:10]

That's the word I wasn't thinking.

[01:09:12]

They have fabricated rapport.

[01:09:14]

But that's scary. That's the thing you think as a person, you cannot replicate.

[01:09:18]

And I'm like, you can.

[01:09:20]

But yeah, we're thinking doomsday, we're out of a job. But think about this, what if we own the thing and it puts out the show? We're on a beach somewhere. The show is just as good. I would love it. It's like the Picasso story where it's like, yes, I drew this in five minutes, but it took me 40 years to learn how to do this. Exactly. It's like, yes, we're not doing it anymore, but because we did so much of it, we've earned this. That's how we'll justify our beach life.

[01:09:41]

Well, you guys are not going to be replaced. Don't worry. You have a very genuine rapport, and so on. Also, by the way, if you listen to a couple of these, and I have, it starts to become a little repetitive. It's not like it's going to do a whole podcast series. There's something that's not quite new and fresh about them. At first, it sounds really great, but after a while, it's not really satisfying.

[01:09:58]

Do you think the missing ingredient is that we cannot help but evolve and change? We're aging, our bodies are morphing, our children get older. All these elements that really funnel into this don't exist there. What it can do is replicate very well, and even create within that framework, but it's not going to evolve the way we just can't help but evolve.

[01:10:21]

Yeah, I think unless we can figure out ways to feed it and prompt it, and so we can use it as a tool to discover new things. Sometimes it's good at that where you give it a paper and you say, come up with 10 extensions of this paper, and maybe one of them will be really interesting. Or you could do it for subjects for the show. You could say, well, what are some good brainstorming topics that would come up that we could have as our, what do you call it conversation starters? Yeah. It could come up with new ones of those.

[01:10:48]

Like our armchair anonymous.

[01:10:50]

Yeah. Oh, my God.

[01:10:51]

What are some good prompts? Yeah.

[01:10:53]

It would feel at least a little true for us if we could use our data set. Exactly. Then I would feel a little less fraudulent about it.

[01:11:00]

Right. You give it your set and then build on that and see what it does.

[01:11:04]

Okay, I have a couple of rapid-fire questions, then I want to talk about your art. Where are we ahead and where are we behind in our expectations? Are we ahead anywhere? We have all these fantasies about what robotics is going to do, and clearly, we're aware of all the ones we think we're behind on. We don't have Rosie the Robot in our house. But are there areas in which it would shock people how far ahead we are? No.

[01:11:23]

It's hard to say. One thing I do think we're going to see is the extension of the Roomba is a robot that can pick up clothes and declutter around the house.

[01:11:32]

I like that.

[01:11:32]

I think it might have four legs. It'll be a little dog with an arm.

[01:11:38]

You thought it might have a scooper on its back, like the tail would be a scooper?

[01:11:42]

I think it might have a tail, because tails are actually really important. For balance? Also for user interface. A dog's tail is very interesting, and that's very deeply rooted in our psychology. We have a reaction when we see a dog with its tail wagging.

[01:11:55]

An emotional reaction.

[01:11:57]

But if you notice, none of these dogs that are out there yet have tails. So we're building one.

[01:12:00]

Okay. You and your wife are incredible artists. You have an exhibit that's at Skirball right now. Oh, wow.

[01:12:06]

Oh, yeah. No, I'd love to tell you. She's a filmmaker and has been involved in technology for a long time, Tiffany Shlain. She and I have collaborated a little bit, but this is our first big collaboration. And we've been having so much fun. We got invited as part of this citywide exhibition on art and science that the Getty is doing. It's going on for a whole year. They have each of these different institutions do exhibits. The Skirball invited Tiffany and I to do one for them related to art and science. We've been working on it for three years, and we came up with this idea of talking about history more broadly, but also using trees and the science of tree dating. You know how you count the rings?

[01:12:46]

If you've ever been to Muir Woods- Exactly. There's a great cross-section of a redwood, and they put different events in history on the rings. This tree has been alive for... it has Jesus on the ring. Right.

[01:12:59]

And they're very Western, patriarchal. She had actually started doing a feminist tree ring over the pandemic. Then a couple of other ones; she's been developing these sculptures, and they're salvaged wood. We don't cut down trees to do this, but there's a lot of these big redwoods and other forms out there. For this show, we wanted to do something around... It started with the tree of knowledge.

[01:13:21]

This is very cool.

[01:13:22]

We found a tree stump that was gigantic, almost as big as this room. It's 7,000 pounds. Holy shit. It's a eucalyptus, but it was uprooted and fell over. Then one side is sanded down. When you walk into the gallery, you see the back end of it. It's all these knotted, gnarly roots. Then around the other side, we inscribed it with questions, trying to talk about the history of knowledge and how it evolved, from What is fire? and Can I eat this?, which are the questions we asked thousands of years ago. But those evolved into, Will machines be intelligent? And on the far side... So it has 600 questions or something on there.

[01:14:03]

That's awesome. One of them is super cool. It's so wild to think the tree was around at that point. The first mark on the ring is from 530 BC, and it's the Pythagorean theorem. You see a² plus b² equals c². It goes all through these great breakthrough math equations.

[01:14:22]

That's another piece, actually, one that we call Abstract Expression, which is a redwood. Yes, it starts with Pythagoras. But remember, it's not literal, because that tree wasn't 5,000 years old. We take some liberties.

[01:14:32]

Okay, how old was that tree?

[01:14:34]

I think like 400 years old, maybe 500. But that's the idea. We're playing off of that known concept. But this time we wanted to tell the history of science and do it through just equations. We never say Pythagoras on there. It just has the equation. But those equations are beautiful in their own right. They're artistic, because in a way, art takes an image and there's a lot of content and meaning behind it. That's true of the scientific equations.

[01:15:00]

Yeah. Yes. We were talking to Fei-Fei about that. Especially with physics, there's something mystical about it. There's something about releasing your firm hold on life and giving it up.

[01:15:11]

Yes. So I have one story about this. Oh, please. When I was in grad school, I had developed this method of orienting parts without sensing. So just by pushing the part along in different directions, you could orient it. And I showed it to my advisor, and he was very excited about it. And he said, well, can you prove that that would work for any part? I worked on this problem for a year and a half. I tried all these methods, and it was basically extremely difficult to prove that it would work for all these geometries. I was living at the end of this alley, and it was down some stairs. And so I was sitting on my porch all the time just working on this. I have this moment where it pops into my head to use this step function, and it looks like stairs. I remember writing down these equations and crossing off terms, and everything turned into zero, and then it worked. Oh, my God. It was this moment where the whole thing integrating to zero meant that there had to be a solution for any polygonal part.
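As a toy illustration of the sensorless-orienting idea in this story, not the actual proof Ken describes: for a convex part, squeezing with frictionless parallel jaws tends to rotate it toward a local minimum of its width, so the stable orientations can be enumerated without any sensing. A minimal sketch for a rectangle; the function names are hypothetical:

```python
import math

def width(a, b, theta):
    """Width of an a-by-b rectangle measured perpendicular to jaws
    oriented at angle theta (the part's 'diameter function')."""
    return a * abs(math.cos(theta)) + b * abs(math.sin(theta))

def stable_orientations(a, b, n=3600):
    """Angles where the width is a local minimum: squeezing with
    frictionless parallel jaws rotates the part toward these."""
    thetas = [math.pi * k / n for k in range(n)]  # width has period pi
    ws = [width(a, b, t) for t in thetas]
    minima = []
    for k, w in enumerate(ws):
        # strict local minimum, with wraparound (period pi)
        if w < ws[k - 1] and w < ws[(k + 1) % n]:
            minima.append(thetas[k])
    return minima

# A 2-by-1 rectangle has two stable squeeze orientations: 0 and pi/2.
print(stable_orientations(2.0, 1.0))
```

Chaining a few squeezes at well-chosen angles then funnels every starting orientation into a single one, which is the flavor of the result, though the real proof covers arbitrary polygons.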

[01:16:09]

Did it feel transcendent?

[01:16:11]

It felt quite transcendent. Yeah.

[01:16:12]

It worked. You would tap into something a little mystical. Totally.

[01:16:16]

It was not something that I felt like I did. It was just revealed.

[01:16:22]

You were like the vessel for this. Yes.

[01:16:25]

Very much. I still remember that very distinctly. I'll admit that one of the equations on the tree is mine.

[01:16:32]

Oh, good. That's your signature.

[01:16:35]

I put it in there. Up with Gauss and Einstein.

[01:16:40]

But you're right. There's something really elegant about those formulas. You think of the most famous one, E equals mc². On the surface, it's just so simple. No one could think of that for so many years.

[01:16:53]

The elegance of some of these. Euler's equation is the one that mathematicians truly love. What's that? It's e to the i pi plus one equals zero. It's amazing, because you have these three quantities. You have e, the base of the natural logarithm, which is this 2.718 blah, blah, blah. Then you have pi, 3.14159. Then you have i, the imaginary unit. And those three, there's no reason that those should all relate.
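Euler's identity, e to the i pi plus one equals zero, can be checked numerically; a small sketch using Python's standard cmath module:

```python
import cmath
import math

# Euler's identity: e^(i*pi) + 1 = 0, tying together e, pi, and i.
z = cmath.exp(1j * math.pi) + 1

# Floating-point arithmetic leaves a tiny imaginary residue,
# but the result is zero to machine precision.
print(abs(z) < 1e-12)
```

It falls out of Euler's formula e^(i*theta) = cos(theta) + i*sin(theta) evaluated at theta = pi, which is why the three constants meet so cleanly.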

[01:17:20]

Because they're all going to infinity, so none of them work great in math.

[01:17:24]

The irrational numbers, yes. And they all came from very different sources, but they all come together at this magical moment, and it's mind-blowing. To a mathematician, it shouldn't be. It's one of those moments where you're like, the universe makes sense. By the way, I feel like that's where I believe in a higher power.

[01:17:43]

Yes. I'm a hardcore atheist. I believe in nothing. But I don't want to believe in nothing. There's something happening with my kids that I feel like is something I can't articulate. It really opens my mind to, like, well, maybe there is some magic happening.

[01:17:58]

There's symmetry at the very least. From where? I don't know.

[01:18:02]

There's structure. There's some beauty that makes sense, and that's out there. When we discover it, we get a glimpse of it; it's like there's something beyond us. I had that one little taste of it that I'll never forget, and that really influenced me. I know it's out there, and I think that's when these breakthroughs happen. In 2012, you had the one breakthrough with AI, and then in 2022, the second one was ChatGPT. That has changed so much, and so many people are in robotics now, by the way. That's the new wave that everybody's excited about. But now we'll have to wait for the next one. If you look at it, it's every 10 years or more when we get another breakthrough. I think we need a few more breakthroughs.

[01:18:42]

When you look at evolution, they talk about punctuated evolution. It's like nothing happens and then everything happens and then nothing happens.

[01:18:51]

Exactly, Dax. You're the anthropologist. Punctuated equilibrium, you have a plateau, and then there's a breakthrough or a change, and then there's a long plateau where it gets digested and everything. And then another... But that's really what progress looks like.

[01:19:05]

You would think it's this nice linear...

[01:19:09]

Not at all. And people say exponential. That's not the case at all. We're not living in exponential times. Most technologies do not increase like that. There'll be a little breakthrough and then a long plateau. Look at air travel: it hasn't really improved. There was a huge jump when it started. We have had things like carbon fiber planes, and there are definitely things that make it more energy efficient. But in terms of comfort level, where's the breakthrough there?

[01:19:30]

I think it's actually gone backwards.

[01:19:32]

Yeah, right.

[01:19:34]

Well, Ken, you are a blessing on planet Earth. I think you are so fun and interesting and encouraging. You're like a polymath, a cosmopolitan. You're an artist. You have all these interests. You're youthful beyond your years. It's a pleasure to know you. I'm so glad you came in and talked to us about all this.

[01:19:53]

You guys are so great. I have to say, I get this real joy listening to you because you're so open and the way you bring out the best in people. Now I see how it works. You sit down and you just make us feel so at home. Oh, good. But your real genuine rapport is so incredible. I think that's why you're so incredibly successful because people hear it. It's a pleasure, and it's so genuine.

[01:20:19]

Thank you. That's really nice. I love it. Okay, well, to the many dinners we will have in the future. All right.

[01:20:24]

How fun. Thank you so much. Yes, thank you. What a pleasure. Thank you, guys.

[01:20:29]

Stay tuned to hear Ms. Monica correct all the facts that were wrong. That's okay, though. We all make mistakes. I'm in an incredibly beautiful new sweater that my friend got me. It looks gorgeous. I just put it on for the first time, and I'm truly blown away.

[01:20:44]

The green is really nice.

[01:20:45]

And the fit is really perfect.

[01:20:47]

I know. They know how to do it.

[01:20:49]

And I think I like these cuffs, where you have to roll them up. They're too long on their own. They're clearly designed to be rolled. See, look, that's a seven-inch, nine-inch cuff.

[01:20:57]

It's nice, though. Yeah.

[01:20:58]

But you wouldn't wear it like that, right?

[01:21:00]

You're supposed to go- Well, I'm not allowed because I'm short, and they have a rule that if you're short, you have to show a little bit of skin. If you're wearing oversize clothing, you have to show a little bit of skin on your arm.

[01:21:09]

Because you'll get lost in it. I just rode my bicycle. Oh. Yeah. Nice. My first time in biker shorts.

[01:21:17]

How did it go?

[01:21:18]

Well, I just can't believe I'm a person that owns biker shorts and wears them now. I'm having a hard time.

[01:21:23]

Yeah, well, you're 50. So a lot of things have changed.

[01:21:27]

Yeah, but also I think that's something that's best on much younger people. Almost being 50 compounds it. First, I never envisioned myself as someone that would be in those biker shorts. Yeah, sure. But they have a pad built into them. And the seat is very tiny on the road bike. And it hurts your anus. I don't want to say it hurts here. It just hurts. Yeah, it hurts. It's uncomfortable. I was like, Oh, it's a nice padding. I put them on for the first time. I felt like, as Panay always talks about, eventize your run. I was like, Well, these are built for nothing other than riding a bicycle. Sure. And let's do that. And the padding was nice. And I love that there's no fabric flowing anywhere else. I think I went up the hill faster because of them.

[01:22:06]

Probably aerodynamics.

[01:22:07]

I'm not going to adopt the Lycra shirt, though. I decided. I wore a wife beater.

[01:22:13]

So you're back home.

[01:22:14]

I'm back home. I'm very, very, very happy to be back home.

[01:22:17]

I got you a present for your birthday. Do you want to open it?

[01:22:19]

I would love to open it. Let me really take my time here. I'm looking at beautiful tissue paper with... Oh, can I say one thing? This will sound derogatory. But let me preface it by saying, I could be on the tourism board for Mexico City. I love it. It's an enchanted, romantic city. The food's dynamite. If you ever go, go to Havre 77, a French restaurant. We went twice. The French onion soup is the best I've ever had in my life. On the second trip, on my birthday night, I got two bowls of it to start.

[01:22:53]

Oh, wow. Like when you got two steaks?

[01:22:55]

Yes. And I would tear out a finger right now to have it again and share it with you. It was the most incredible. But anyways, the facial tissue, and I had a cold, it wasn't ideal. And where it really hit me was- One-ply? Maybe less. I was at a nice hotel, mind you. We got on the flight, I went into the bathroom, and I pulled the tissue out by the mirror that's in the lavatory of the airplane. And the second I touched it, I was like, oh, that's soft. And then I thought, how bad was the tissue where the airplane tissue felt like Puffs Plus with lotion? Oh, my. Just to make it relative.

[01:23:32]

Yeah, because that's one-ply.

[01:23:33]

Yeah, I think it was like 0.6-ply. Oh, okay. Anyways, beautiful tissue paper with purple flowers.

[01:23:42]

Really nice. The tissue is from Nikki Kehoe. The present is not.

[01:23:45]

Oh, this is a multi-stage gift. Yeah. Okay. Beautiful tissue paper, and then a burlap sack.

[01:23:51]

Yeah, also from Nikki Kehoe. That's how they wrap.

[01:23:53]

Wonderful. Oh, buddy. The Stories of Raymond Carver. Will you please be quiet, please? Is this an original?

[01:24:05]

I bought it as a first edition, and it is signed. It's signed? Yeah.

[01:24:13]

Did you pay the face value of $8.95?

[01:24:16]

No, it was on sale, actually. Half off.

[01:24:19]

What year was this published? Because we can... I think it's fascinating that a hardcover, beautifully bound book was $8.95. I know. That's true. I know I'm all over the place and a little manic, but I just got to add, back to Little Women, which I love. As you know, I'm Greta Gerwig's number one superfan now. At the end of that movie, they show them pressing and making her first book, the book, Little Women. I don't know if you remember that sequence.

[01:24:49]

I don't know if I remember it.

[01:24:50]

But the amount of time and effort it took to make a book in the 1890s, where they were pressing it all, they were cutting it with a saw, they were sewing the binding by hand, and then they were cutting leather out in a pattern, and then gluing and putting that in a press. I'm like, it took like a week to make a single volume. They should have been $600.

[01:25:14]

Exactly. Well, That's why they're so rare.

[01:25:16]

And it explains why. I think it was Carnegie who invented the library. There were no libraries. Books were just too expensive. They were like, probably in today's dollars, they probably were hundreds of dollars, that amount of manpower. Okay, so this was... We think about wealth disparity now, but then, in order to even read a book, you had to be a millionaire. Yeah, I'll get the number wrong, but to put it into perspective, I guess Elon is now worth $400 billion recently, although that stock just fell. Whatever. Let's just say he hit $400 billion. $400 billion of our total GDP and national amount of money isn't even 0.01%. When Rockefeller hit a billion, they say he actually had like 15 cents of every dollar that existed in America. So it's like, as bad as it feels now, it was- It was worse. An order of magnitude crazier with the first rich people.

[01:26:15]

Yeah, that's true.

[01:26:16]

Okay, so this was 1963. So this book cost $8.95 in 1963.

[01:26:23]

How much do we think that is now? Rob, can you put it in?

[01:26:26]

Well, that's great. We have that technology. Yeah, we sure do. I added a new... I actually wrote up my resolutions last night. Oh, great. Which I don't know if I've ever written them down.

[01:26:35]

Yeah, I wrote some down, too.

[01:26:37]

You did. Did you journal this morning?

[01:26:38]

I did. Congratulations. I journaled every day. I'm proud of you. I had therapy, too, and we talked about it, and she said I could- Burn them? Yeah, or shred them or whatever.

[01:26:51]

Can I have her number? No.

[01:26:55]

She's like, if that's going to allow you to really be able to be honest and truthful with yourself in a way you won't be able to otherwise- And let it out of your body. Sometimes her and I talk about... There are things that I talk about with her that only she gets to hear. And she said, it's not just me. You also have you. Yeah. And you can have a dialog with yourself, especially via the journal. Yeah. But yes, of course, I have to be very honest with myself there. And so if I'm out of fear not doing that, then it's not worth it. So I'm still deciding.

[01:27:40]

We may have talked about this, and I had mentioned there was a period I stopped journaling over the last 20 years. And then I had a relapse, obviously. And I didn't even put all this together. But through therapy with Mark, I think what occurred to me was... There were things I couldn't write down, just like you were saying. Are you afraid someone's going to find it? And I'm like, no. But in truth, there was a moment, yes, I'd be afraid someone would find it. And I had this weird dedication to never lie to that journal.

[01:28:14]

Right.

[01:28:15]

So it didn't feel like I was making a decision to stop journaling. It just was like, this is really weird. I've been journaling for 17 years or whatever, and I haven't in a while, but I'm not overthinking it. But of course, in reflection, I was like, I couldn't really be dishonest to this thing.

[01:28:32]

Yeah.

[01:28:33]

I love this. This is such a thoughtful, wonderful present. I'm glad.

[01:28:37]

It would have cost $89.18. Wow.

[01:28:41]

$89 for a book.

[01:28:43]

That's a lot. It's not enough, though. I wish it was 5,000.

[01:28:49]

Okay, this is a fantastic present. Very thoughtful. Thank you so much.

[01:28:53]

You're welcome.

[01:28:54]

Okay, how was therapy?

[01:28:55]

It was good. It's my first therapy of the new year. So for a second, I was debating. I was like, maybe I only need to start going as check-ins now. Maybe I don't really need to be on this consistent of a schedule. But then today, I was like, no, I need to keep up my once every two weeks.

[01:29:18]

Look, I've stopped. So I really am in no position to say this, but it definitely falls under the umbrella of like, well, it couldn't hurt to go.

[01:29:26]

It does not hurt.

[01:29:27]

And it potentially could hurt to not go.

[01:29:30]

Yeah.

[01:29:31]

It's the vitamin debate. It's like the scientific community is split down the middle, whether vitamins work or not. Yes. But it's like, I don't know, on the chance that they work, they're not going to harm you. All right. Someone's going to comment. Yes, I hear you.

[01:29:44]

There are some bad ones.

[01:29:45]

Oh, I know. And you can have too much of certain things. But just in general, if you're not taking above the daily dose of any one thing, it's not going to harm you.

[01:29:53]

Speaking of, okay, you know how I'm always paranoid about drowning my cells? In too much water? Yeah. Or people in general, drinking too much water and then drowning their cells.

[01:30:03]

You know, Gundry's new movement is less water. Not shockingly.

[01:30:09]

So him and I are aligned. Soulmates. That's why he's got those fresh hands. He doesn't drink any water. No hydration. Oh, my gosh. I'm going to put my hair up real-time.

[01:30:21]

If you want to see it- It looks so good down, but go ahead. Let's see what happens there.

[01:30:24]

Okay. If you want to see it, go to YouTube.

[01:30:27]

Do you ever do an up and then a braid and back?

[01:30:30]

Yeah. Well, I did it for... When's this out?

[01:30:34]

The eighth.

[01:30:35]

I did it for a commercial we were just in together. Oh, yeah.

[01:30:41]

That comes out yesterday.

[01:30:43]

It came out yesterday. Oh, my God. Our little commercial.

[01:30:47]

Yes. Our second commercial of, I hope, many.

[01:30:50]

Yes, exactly. It was so fun. And it's out. It was out yesterday. It's on our Instagram. And in it, I do have a ponytail with a braid that I love. It's just really hard for me to do on my own. I had a hair stylist that day. Oh, right, right. But I do like it.

[01:31:08]

Maybe your therapist can style your hair on the days you don't want to share.

[01:31:11]

Oh, hairplay? I would go every day. Yeah, yeah, yeah. I'd pay for that. Anyway, okay, so, drowning cells, everyone laughs at me. They guffaw. And I met someone who drowned his cells.

[01:31:26]

Oh, tell me.

[01:31:28]

And it was really bad. Tell me more.

[01:31:30]

Okay. Who did you meet? Where did you meet him? In front of 711?

[01:31:33]

No, he's a real person I know. I'm not going to say... I'm not going to out him.

[01:31:36]

Him or her's name.

[01:31:38]

Right. He's a friend of a friend. This is a sad story. I'm transitioning into a sad story. When I was home- If you were having fun and laughing, stop.

[01:31:45]

Sorry.

[01:31:45]

Yeah, stop. A big group of friends was meeting, and one, Robbie.

[01:31:52]

Yeah, sweet Robbie from our chain.

[01:31:54]

Yes, from the Connections chain, wasn't there. I was like, Where's Robbie? And his wife said, Oh, he's at the hospital with- The mutual. Yes, with- The Unmentionable. No, because- He's not underwear.

[01:32:08]

Because that's his name?

[01:32:11]

No, his name is Unmentionable. His name is Unmentionable? We can't call him Untouchable because he's Indian. Oh, he is? Yeah. So now I'm giving a lot of info away.

[01:32:18]

Yeah, it's pretty easy to narrow this down at some point. If you know an Indian in Atlanta who's friends with Robbie.

[01:32:25]

Bingo. That's true. There is one. There is one. Anyway, this is sad. This is... And he had a seizure.

[01:32:31]

And I guess he had already had a seizure a year before and was on seizure medication and stuff.

[01:32:43]

But when he went- You're the perfect person to tell this story because you have the same condition.

[01:32:46]

Right, exactly. And you're Indian. And I'm Indian. When he went the first time after his seizure, they checked his salinity levels and they were so low. And he did drink a really excessive amount of water. Do we know why he- And he drowned his cells.

[01:33:05]

Yeah. He got rid of too much salt.

[01:33:07]

He drowned his cells.

[01:33:08]

That was the medical explanation? Yeah. Oh, okay. You have a good deal of salt, I think, from your diet. Don't take offense to that.

[01:33:17]

Are you referring to the potatoes I made or something?

[01:33:19]

No, but you have a nice seasoned chicken. I think you have a good amount of salt in your diet.

[01:33:26]

Yeah, I feel fine about my salinity.

[01:33:27]

Yes. So I don't drink any water, so I'm good there. Do we know why he was drinking so much water? Was he on an exercise routine?

[01:33:35]

He was on an exercise routine, and I'm not sure why. Anyway, so it turns out, per usual, I'm right. You can drown your cells. Per usual. And- Unsurprisingly. Please look out for that. Okay.

[01:33:53]

Yeah. I don't know why I brought that up. You and Gundry should collab on this.

[01:33:57]

I'm happy to join forces.

[01:33:59]

Also, just if you are having a lot of water, maybe use some- Electrolytes. Electrolytes. That's right. Keep an eye on your electrolytes. The only cases I've ever heard of is no one's ever died from ecstasy, but people have drunk too much water on it.

[01:34:16]

Exactly. They drowned their cells. Yeah, okay.

[01:34:20]

I wonder if they drowned their cells or if, when they drink way too much water, it backs up like congestive heart failure, basically, ends up filling up their body. Because my father, who had congestive heart... I don't know if it's congenital. He had heart disease. And what would regularly happen is his heart was too big on one side and normal on one side. And so it would pump in a lot, but it couldn't pump out a lot. And then it just ends up backing your whole body up with water, and you get really bloated, and you put on all this water weight. And then it starts really affecting your breathing and your lungs and everything else. And so my dad would go into the hospital for four days, and he'd be on diuretics, and he'd just be getting rid of gallons of water.

[01:35:03]

Oh, my God. Yeah. Okay. It says, yes, cells can drown in a condition called water intoxication or hyponatremia, which occurs when there's too much water in the body. When there's too much water in the body, sodium levels drop, causing water to move into cells and causing them to swell. This can be especially dangerous for brain cells, as it can lead to pressure in the brain, confusion, drowsiness. Wait, no.

[01:35:28]

Epilepsy, pressure in the brain, might have been completely all related. Exactly. Oh, man. Well, I'm sending love and well wishes to this anonymous person. Untouchable. Why does Robbie have two very close Indian epileptic friends? I know. He's very over-indexed.

[01:35:49]

He is extremely over-indexed.

[01:35:51]

Because I consider myself unique in America, low percentage, where I have a best friend who's Indian and epileptic, and he's got... now I know. He has a fetish. I know you don't like that word, but- You think it's a kink? Ask if there's a third. If there's a third, he has a condition.

[01:36:10]

Yeah, it is weird. Then I wonder, is epilepsy... Let's see.

[01:36:14]

And what ethnicity is Robbie's wife?

[01:36:16]

White. White? Yeah. She doesn't have it. Actually.

[01:36:20]

No, no, no, no, no, That would make sense. He's one of the sweetest people I've ever met over text.

[01:36:32]

Yeah, but he has a dark side. Hi, Robbie. Nasty. So his wife is my oldest best friend. And when we were in high school, she had seizures. And they were dating at that time. Okay. And she got in this car accident because she had one. Yours were different, though. She had... She didn't have grand mal seizures.

[01:36:56]

Were you about to say petite seizure?

[01:36:58]

Petite Mall. That's what they're called. That's so cute.

[01:37:00]

And then she had to- And you picture like a mall, you'd walk in, but there's only three stores. And then the food court is like four food carts.

[01:37:07]

They should call it Boutique Seizures.

[01:37:10]

That's way better. Yeah, that's cute. You and Gundry can work on that.

[01:37:15]

Anyway, yes. He has three.

[01:37:18]

There's a fourth. I mean, I only know of three people in his life and all three of them have seizures. So certainly there's more. Should we get Robbie on the phone?

[01:37:27]

Do you want to? Yeah.

[01:37:29]

We got to grill him about this.

[01:37:30]

I mean, he's definitely at work.

[01:37:32]

On Saturday? Oh, I forgot. During the NFL playoff game? I'm so sorry Georgia lost, by the way.

[01:37:39]

Oh, no. The Sugar Bowl? You didn't know that?

[01:37:40]

They lost? They lost.

[01:37:43]

To who? Don't say Texas.

[01:37:46]

They didn't lose to Texas, but Texas won theirs. Texas is still in it. Notre Dame.

[01:37:50]

But they're still in it because... Oh, no.

[01:37:53]

That game we saw was one of our only two losses. They ended up being really good.

[01:37:59]

I know, but they played... It's- Welcome to the SEC, bitch.

[01:38:02]

Is that what you said?

[01:38:03]

Yeah, I did. Hold on. I got to call Robbie. He's the one also that knows about all of this. Yeah, he's not at work.

[01:38:07]

He's at the hospital. It's one of his many epileptics.

[01:38:10]

Don't say that. Knock on wood. Hello. Hey, Robbie. Yeah. You're on candid camera. You are on air. Armchair candid. You're on air.

[01:38:24]

I'm on air. Do we have your consent, and can you hear me? Yes, yes, yes to both. Yes.

[01:38:29]

Okay, great. Well, we wanted to call you about one thing, but now we have two things to talk to you about that are very important.

[01:38:37]

And we did not name any names, but I'm just learning of the fact that you have a second Indian friend with epilepsy, which I find to be almost statistically impossible. And then Monty said, it doesn't stop there. His wife has epilepsy.

[01:38:51]

Well, she doesn't have... Okay, not specific epilepsy, but you do have three people in your life that have had seizures, and now we're starting to think you're at the- I know.

[01:39:01]

Yeah. I see where you're going. I honestly hadn't ever thought of this.

[01:39:07]

That's what he would say.

[01:39:10]

Monica, my sister, too. I knew it. I fucking knew it. I said, I- It's worse. Robbie, I said there's a fourth for sure.

[01:39:19]

Fuck.

[01:39:20]

Robbie. What are you doing to everyone? I don't know. I really don't know. Oh, my gosh. I'm just looking at things in my life. I don't know.

[01:39:29]

This is wild.

[01:39:31]

Do you think it's because you're so calm and sweet, all of a sudden the other person's brain feels erratic and unhinged? Is it relative to your calmness, people short circuit? It could be. I mean, yeah, that's the best.

[01:39:43]

I think that's the best we have to work with right now. My guess is Gina would say otherwise. But this is wild for her, Robbie. Four is a lot.

[01:39:55]

Now, I'm being sincere. Is there something environmental in Duluth where half the population has seizures? Wait.

[01:40:04]

No, because mine happened once I left.

[01:40:08]

But you grew up with that water.

[01:40:10]

Oh, you think it's the water?

[01:40:12]

Yeah, you have late onset.

[01:40:14]

Duluth... Because I didn't drink enough water, and then it caught up.

[01:40:19]

I'm not sure about the logic of that. But what I'm saying is there's something in the soil where you grew up, where 70% of all people have seizures. There's got to be. Monica's house was super close to mine. The other friend also lived right down the road, too.

[01:40:34]

And Gina.

[01:40:36]

Yeah. Honestly, if you draw a polygon of the four points, it's a very small area, and so likely shared whatever water source.

[01:40:49]

It's pretty narrow there. You're right. Guys, did we just break an enormous case? Do we need to call the New York Times immediately? Fuck. You're going to have to do a new podcast. You're going to start a new podcast where you investigate this issue. Wow. It's going to be called Poison Paradise. Under the veil of suburban beauty and tranquility- Oh, my God. Lies a burbling poison that results in the shudders. Okay.

[01:41:14]

That's a lot of words. That's a lot. That's too many words. You need it to be small.

[01:41:18]

Sure. No, first was the title, and then I was entering into the first episode. Oh. You got going already. Oh, my God. You're halfway there, it sounds like. This thing writes itself.

[01:41:28]

Okay, now, moving on to point number- Well, no, I have one follow-up on that, Robbie.

[01:41:33]

Okay. In your free time, which I know you don't have much of, can you sniff around and see if any more folks have had seizures? Yeah. Okay. I will. I'll report back. I'll start casually throwing that into each conversation I have. So by the way, this is weird, but do you have a history of epilepsy? Just move on from there. Yeah.

[01:41:53]

That sounds like a good plan. That's going to work. Okay. Now, point number two is football, and you are my main source of information for football. I was texting you during the Texas-Georgia game, and we were secretly gloating while I was amongst a bunch of Texans. And then Dax just told me- Can you tell me, did Texas then go on to win all the rest of their games? They're still in it.

[01:42:19]

They're still in it. Yeah. So they have a tough matchup against Ohio State because Ohio State looks really good right now. But yeah, they're still in it.

[01:42:29]

But would that be like a fluke?

[01:42:32]

Shut up. Well, it can't be a fluke because the only team to beat Texas this year is Georgia. Georgia beat Texas twice this year.

[01:42:40]

Oh, twice. But then it's a fluke that we aren't... It doesn't make sense that we beat them twice and we're now out.

[01:42:49]

Yeah. You know how it's like, how can Federer be the best ever if he can't ever beat Nadal? It's very similar. We all have our albatrosses.

[01:42:58]

Yeah, exactly. Yeah. All right. Well, that clears that up, I guess.

[01:43:03]

And I just want to end on this, Robbie. Your voice was built for radio. You must be involved in Poison Paradise. I'd love to help you.

[01:43:13]

Let me know.

[01:43:14]

I'm a hard worker, too. So just let me know what you need.

[01:43:18]

All right. Thanks, Robbie.

[01:43:20]

All right. Thanks, guys. Take it easy. Bye.

[01:43:24]

Stay tuned for more Armchair Expert, if you dare. That was a great use of time.

[01:43:38]

Yeah. I wasn't expecting his voice to be that velvety. He's a very handsome man.

[01:43:42]

You know, I only have handsome and beautiful friends. Right. This has started from day one.

[01:43:47]

Good for you.

[01:43:48]

I know. Anywho. Okay, good luck to...

[01:43:50]

Will you plug your ears? Good luck to UT. Hook 'em. I'm cutting that. This is the same as that story I told about the people flying to LA to watch the Red Sox play LA, hoping the Red Sox would lose because they had just beaten New York. But the other guy was like, no, they must win. That way, New York's number two. Wouldn't you want your team to have twice beat the champions?

[01:44:13]

I guess you're right.

[01:44:15]

I think it's time for you to transition into rooting for them for your own.

[01:44:20]

For my own gain. Yeah. Okay. I see that logic.

[01:44:24]

Speaking of which, and I know we're all over the map and I've taken up too much time, but I want to go on... I have to say that I finished the Churchill documentary on the flight home yesterday, and I got very swept up in it. This has happened a few times, and I'm sure you've watched shows on this. When you're forced to watch what the Brits went through, 57 nights in a row of carpet bombing of London. Everyone's sleeping in the subway, no bathrooms, getting up, going straight to work, and carrying the fuck on. And they were so outgunned and outmanned and out-everything. And they alone took on Nazi Germany at that point. Everyone was already defeated. The amount of will and resolve is so historic. I found myself like, this is so cheesy. I found myself being really proud that I know Jethro.

[01:45:27]

Oh, that's nice.

[01:45:28]

Yeah. I was like, by God, that little island, you motherfuckers refused. Yeah. And Churchill, he is a very flawed person. He was horrendous to India. I'll acknowledge that. But truly, one man got those people to that state of mind. If you watch this doc, you're like, who knows, if that person doesn't exist, what happens? Because he had two burdens. One is to be fighting off these Nazis, who are just bombing every single night, trying to keep morale high. And he has got to get America into the war or they're going to die. Everyone's going to die because they're not going to surrender. And so his skill at wooing FDR and developing this relationship and slowly getting us more and more involved is so impressive. And his own story is so unique in that he was a soldier in his youth, and he was an incredible soldier. Then he went into politics and he was a boy wonder because the whole time he was in the war, he was also a reporter. So he was reporting firsthand from all these wars, and he's one of the best writers to ever live. So he was in this crazy, unique situation where he leaves the service as a hugely popular figure in Britain, goes into politics, has this meteoric rise, and then plateaus, and then plummets, and he's completely on the outs, and he can't get anything done.

[01:46:56]

And then World War I comes along, and he decides, in his 40s or 50s, to rejoin the army. He becomes a commander. He wins all this glory, returns, and for four years is begging Britain to understand Hitler cannot be trusted and don't believe a thing he's saying, and we can't be signing these deals, and no one's listening, no one's listening. He never relents. And finally, the Brits realize he has been right the whole time. And overnight, he becomes Prime Minister. The story of the up and the down and the out and the miscast in the... What a story. Horrible to the Indians. Let's be clear. A colonist, grew up in Elizabethan England, definitely wanted the Empire to stay alive. Also, a miraculous feat of will and resolve, and the poetry with how he motivated people. He gave this speech to our Congress to help us embrace the fact that we were entering the war. And it's like the most incredible speech. I cannot recommend the doc enough. Wow. I don't know why I went on that tangent, but it's been burning a hole in my brain. I know. I'm making you nervous.

[01:48:15]

My energy level is at 15.

[01:48:17]

I'm home. It's not making me nervous. It's like- Go ahead. No. It's just like, where's it going?

[01:48:24]

Oh, I'm just sharing all the things that I missed out on sharing in the last three weeks.

[01:48:28]

God. You're so much like my father.

[01:48:33]

I am.

[01:48:34]

He just loves to explain stuff.

[01:48:37]

Yeah, it's a male trait. But does that story... Is there a male/female thing going on? Is this the Roman Empire? Does that whole chapter just not interest you?

[01:48:49]

Parts do, but not that part.

[01:48:53]

Of an individual story where someone's completely discarded and publicly reviled, then finds their way back, then becomes so valued and important, then gets discarded again, and then doesn't quit, has a calling that can't be ignored, and then matched with this Shakespearean ability to write speeches.

[01:49:17]

Yeah.

[01:49:18]

No.

[01:49:18]

No. I'm more into the Anne Frank story of that era. I guess I'm really not drawn deeply to people in power. That's not a thing that- You're drawn to the disenfranchised.

[01:49:38]

Yeah. This makes total sense.

[01:49:39]

Well, I just find that, as a human story, way more compelling. I find that overcoming, like a true overcoming, much more compelling than someone who's just feeding off power.

[01:50:00]

I think the thing that interests me about it is as big as this world is and as complex and dynamic as it is, single individuals radically change the face of the world.

[01:50:14]

Oh, I agree. Yes. I find that fascinating. Those figures, they don't do it for me.

[01:50:18]

Yeah, they don't get you going.

[01:50:20]

I'm like towards them.

[01:50:24]

For the listener, she just... It was an interesting one. It wasn't an eye roll. It was a back and forth, side to side. Speaking of- Go ahead.

[01:50:33]

Eye roll.

[01:50:34]

You found the origin of your- I figured it out.

[01:50:37]

I figured out where my eye roll comes from. We thought it was an Indian thing.

[01:50:43]

Or just maybe a genetic innate thing.

[01:50:45]

I thought it was maybe just a full resentment I have of everything and everyone. We didn't know, but I knew that's not right. That's not it. It's a habit. But why? And now I know.

[01:50:57]

Well, you sent it to me, so I saw it.

[01:50:59]

I'm going to show the world. The world.

[01:51:01]

Show the world. I'm going to have to describe for the listener, because let's be clear, 98% of our audience is still just listening, not watching.

[01:51:08]

Check us out on YouTube and you can see this.

[01:51:10]

Yes, please do. For the listener, it is a two or three-year-old Mary-Kate and/or Ashley Olsen from the Full House program. It says, Duh, across the screen. She's shaking her head, and she gives the most expressive eye roll you've ever seen. They have enormous Disney eyeballs, so it's very expressive and clear.

[01:51:35]

Yes.

[01:51:36]

Yes.

[01:51:37]

We got it? We got it. All right. Now, Full House was my original friend. I was obsessed with it. The only time I was ever punished by my parents, the punishment was I couldn't watch Full House that night. That's in my cells. Yeah. That's where I got it. I got it from original Mary-Kate and Ashley Full House.

[01:51:56]

You started probably reenacting it. Always. Yeah. Aping it.

[01:52:02]

Yeah. Mimicking. They were my models then and now.

[01:52:05]

Yeah. It might be all the way to the end. I think it is. They might be your Aaron Weekly. I mean, you already have your Aaron Weekly. I think it would be sad if they're on my Aaron Weekly because they don't know me.

[01:52:17]

But they are my ride or die.

[01:52:21]

What I'll say is they're radically different people, which is so fascinating.

[01:52:25]

Yeah. I guess that makes sense, but also doesn't make sense. It doesn't make sense. Well, they're not identical twins. You know that, right?

[01:52:33]

They have to be. Baloney.

[01:52:36]

They're fraternal twins.

[01:52:37]

No. Well, sisters have never looked that much alike. I know.

[01:52:42]

It's crazy.

[01:52:44]

Do we know this for positive? God, don't make me.

[01:52:46]

Okay.

[01:52:47]

AI Google says that.

[01:52:48]

Thank you. I know. I know.

[01:52:52]

They're not?

[01:52:53]

That's like me saying I know something about Valentino Rossi.

[01:52:57]

Tell me, what do you know? I know nothing. Yellow 46. Exactly.

[01:53:00]

It's the whole point.

[01:53:02]

I'm impressed you remembered his name.

[01:53:03]

Thank you. So, yeah, fraternal.

[01:53:07]

One's left-handed, one's right-handed. But that's super common in twins. And one's one inch taller than the other. Even when they're identical.

[01:53:15]

Well, they're fraternal.

[01:53:16]

But that could be a posture thing.

[01:53:17]

Okay, whatever. All right, let's stop. They're fraternal twins. I believe you. You'd never know it by looking at them. Don't judge a book by its cover. You would not. I agree with you. It's shocking. Yeah.

[01:53:29]

But I've met a lot of boy/girl twins, and they have all told me... They have all had the experience where someone asked if their twin was identical, even though they knew one was a boy, one was a girl. What? Yes. I'm telling you.

[01:53:43]

Okay. Well, some people don't understand twins. They don't understand what identical means versus fraternal. They must not, or...

[01:53:53]

It must be way lower percentage that you get a boy and a girl than same-gendered twins.

[01:53:59]

For- For fraternal? For fraternal. I think the opposite. Oh, you do. I feel like most fraternal twins I know are boy and girl. Oh, really? That's why they are very confusing.

[01:54:11]

We should have a twins expert on. Yes. Because what that means is that there were two ova in the uterus, and that one male sperm and one female sperm hit the two. And generally, you would think, well, either the males were making it because they swim slower and they're more robust, or vice versa, and one swims fast. So it's weird that one- It's not going to swim fast, but you know what I'm saying?

[01:54:32]

I don't know. The body is a wonderland. It is a wonderland. John Mayer. Yeah. All right, let's do a little bit of facts. This is for Ken Goldberg. He was wonderful. I really, really liked Ken.

[01:54:42]

Yeah, what a unicorn. A lot.

[01:54:44]

Okay, now this episode starts with your underwear on the floor.

[01:54:49]

Which was interesting.

[01:54:51]

That was shocking.

[01:54:52]

That's an experience, to look down and your underwear's outside your pants, because your first thought is, my underwear came off my- Yeah.

[01:55:00]

It lifted off. Yeah.

[01:55:01]

It doesn't seem to be torn in half. Yeah. That's a real, where am I at in time and space that my underwear has made its way off of my body and onto the floor? Yeah. I mean, it's so obvious later when you think it was clearly my pant leg.

[01:55:14]

No, but in the moment, you can't think straight. Oh, my God.

[01:55:17]

My underwear is falling off. It's like in I Think You Should Leave, when Tim Robinson, they put a whoopee cushion on his chair and he doesn't understand it. He goes, What happened? Like, he really is shook because he didn't feel himself fart, but he heard a fart? What happened?

[01:55:33]

Oh, my God. That's so funny. Okay, but also, so that happened, the underwear. But then when I was editing it, the inside-out of my pant pocket- Was exposed? Was exposed the whole time. That's a weird coincidence. It is weird, but no one caught that. So the whole episode, the inside of my pant pocket is out.

[01:55:55]

Which people could have thought might be her underwear. You know it's the lining of your pocket, but other people could be like, why are both of their underwears falling off?

[01:56:06]

Now, the vaccination mark. The smallpox vaccine scar is a small mark you might have on your upper arm if you received the Dryvax or ACAM2000 smallpox vaccines. It's a sign that the vaccine successfully spurred an immune response in your body to protect you against smallpox. Not many people receive a smallpox vaccine today, so the scar is far less common than it used to be. The smallpox vaccine leaves a scar because it causes a minor infection in your skin. Your body fights off the infection, but this process leaves behind a small mark on your skin where the infection and related inflammation took place.

[01:56:44]

That makes a lot of sense. I assumed, wrongly now, that it had something to do with the mechanism of injecting it. Did they use some weird thing? Because again, my dad's was... I have such a good memory of my dad's. I don't know that my mom has one, weirdly. Yeah. But my dad's is seared in my brain, and I was like, it looked like, I think I said, a cigar. Like they administered it with a burning cigar.

[01:57:06]

You can look at pictures online. They have them, and they do look like that. Okay, the scientific management book that was influential on Stalin is called The Principles of Scientific Management by Frederick Taylor.

[01:57:19]

See, this all paid off my diatribe on Churchill because Stalin was the trickiest figure in that triumvirate.

[01:57:25]

Okay. Now, so Kim Kardashian posted some pictures with the Optimus robot, and it said that Elon gave it to her, and she denies that she was paid for those pictures. Okay.

[01:57:42]

Other than the free robot?

[01:57:44]

Right.

[01:57:44]

That she may or may not have. Right. Okay.

[01:57:47]

This is what it says the robot can do, the Tesla Optimus robot. Okay. It says it can do physical labor. It says it can move materials, assemble parts, and load items onto machinery.

[01:58:04]

Okay.

[01:58:05]

Yeah. I'm skeptical of that.

[01:58:06]

I'm skeptical. That's how we do it without getting sued. I'm highly skeptical.

[01:58:10]

This is also on the AI overview, so they're buddies. Okay. So he's gone. They're all in cahoots. Yeah. Inventory management. Optimus can use barcode or RFID scanning to track inventory in real time. Home chores. Optimus can carry groceries, help the elderly. Help the elderly. And perform other home tasks. Look at, only helps the elderly. I mean, that would be good. Data collection and research. Optimus can be used in labs or remote monitoring environments to collect data. I mean, that's just like... That's a computer. The brain. Yeah. Smart home integration. Optimus can link up with Tesla cars and energy systems to become part of a smart home. Optimus can walk among people and serve drinks at a bar.

[01:58:54]

I doubt it, but I'm sorry, I'm skeptical.

[01:58:57]

But have you heard about that? Okay, apparently there's a place in, like, Culver City or something that is run by... It's like a burger place that is run by robots, and the robots drop off your food.

[01:59:07]

Okay. I think I've heard that, but also my assumption of what that was, was like very simple mechanized arms, not bipedal robots walking it out. It can make it in the kitchen, then it goes on a conveyor belt, and then it lands exactly in front of your thing. It doesn't necessarily mean that a bipedal robot carried it, as much as there might be automation that gets it all the way to your- I think it's saying it delivers it to your table, but it might not be bipedal. We should go.

[01:59:34]

We should go.

[01:59:35]

I'd love to go to a robot restaurant. What is it?

[01:59:37]

Cali Express in Pasadena.

[01:59:39]

Oh, it's in Pasadena. That's much closer. Yeah. That just upped the odds of us actually doing that by a lot.

[01:59:45]

I do think there's a little guy that rides around and brings you your food.

[01:59:49]

A little fella. Cali Express by Flippy, the world's first fully autonomous restaurant. Grill and fry stations are automated.

[01:59:58]

It looks like a little thing with... are there vane trades and American flags? That goes to your table.

[02:00:03]

We'll have to go. But okay, it says, Optimus can perform precise movements and heavy lifting. Optimus can adapt its behavior over time to reach the desired results. Optimus can play games like rock, paper, scissors.

[02:00:18]

Okay.

[02:00:19]

Anyway, that's what AI claims its buddy Optimus can do. Okay. They're best friends. Okay. Our robot feels a little left out.

[02:00:29]

No, no, he's more boylike, remember? I know, but- Big time glass half full.

[02:00:33]

He's wondering what's going on because there's a lot of other robots now.

[02:00:36]

There are a lot of other robots, but he's becoming charming and flawed. Wabi-sabi. Wabi-sabi. Wabi-sabi.

[02:00:46]

Wabi-sabi. Wabi-sabi. Wabi-sabi. There was... Prada has these bag charms that I really want that are robots.

[02:00:57]

Bag trash? Is that what it's called?

[02:01:01]

No, it's a bag charm.

[02:01:03]

I'm learning this from Nicole. This is the movement now. It's like you have these very fancy handbags, and then you put all these little trinkets that hang off the side, and I think she calls it bag trash or something.

[02:01:13]

She might, but they're called bag charms. And look, Prada has this one. This one's in, like, snow gear.

[02:01:25]

Yeah, that's really cute. Isn't it? Yeah. I'm about to be critical. I just think it's funny. Fashion is very funny. Sure. So you get this perfect, outrageously expensive bag, and then you're supposed to drape some trash on it. Obviously, it's like, downplay it. It's like, what's happening?

[02:01:39]

I agree, but it's not trash. This is $1,100.

[02:01:43]

Well, I didn't say it wasn't expensive. Oh.

[02:01:45]

Yeah. Well, okay. But I agree. I would not put... People love bag charms, and I think that's great. And it's a way to show your identity. But they're not for me on my bag. But I want this little robot to just sit in my house.

[02:01:59]

Yeah, that's great.

[02:02:01]

Yeah. He's pretty big. Look at him compared to the bag.

[02:02:04]

Oh, that's preposterous. He's larger than the bag.

[02:02:06]

You said 39% of US jobs are still manual labor. The Bureau of Labor Statistics reported that 39.1% of the civilian workforce in the US performs physically demanding jobs that require lifting, carrying, pushing, pulling, kneeling, stooping, crawling, and climbing activities in varied environmental conditions.

[02:02:25]

Sucking, fucking, don't leave out sex workers. That's manual labor. Don't we honor sex workers?

[02:02:33]

Yeah. But I'm just wondering, is it really manual? Yeah.

[02:02:35]

It's definitely manual.

[02:02:36]

It's laborious. All right. Well, that's it for Ken.

[02:02:43]

I'm glad we ended on that note for Ken. I think he would appreciate that.

[02:02:48]

All right. All right. Bye, Ken. Love you. I love you.

[02:03:05]

Follow Armchair Expert on the WNDRI app, Amazon Music, or wherever you get your podcasts. You can listen to every episode of Armchair Expert early and ad-free right now by joining WNDRI Plus in the WNDRI app or on Apple Podcasts. Before you go, tell us about yourself by completing a short survey at wondery.com/survey.

[02:03:25]

Are you considering fertility treatment in 2025?

[02:03:28]

Waterstone Clinic is renowned for delivering advanced fertility science with personalized care. Now, in the heart of Dublin, you can have rapid access to fertility assessment and all IVF treatments.

[02:03:39]

With over 10,000 babies born, we achieve some of the best success rates in Ireland.

[02:03:44]

Take the first steps towards your future family at our PHI, Lea, and HSC-approved clinics.

[02:03:49]

Contact waterstoneclinic.ie to book your free advice call.

[02:03:53]

Waterstone Clinic, science delivered with care.