[00:00:00]

Here we go: three, two, one. Lex, handsome as ever. Thank you. Well dressed. I always feel like a slob at home around you. Do you dress like that in real life, or only when you do podcasts?

[00:00:11]

Yeah. So I have two outfits: this, and black shirt and jeans.

[00:00:15]

Slick outfit. There's nothing... nothing more classic than a dark suit with a white shirt and a black tie. Is that a black tie, or is it dark blue? Black tie. Black tie, black suit. Black tie is armor. Yes. Makes me feel like... it focuses the mind. Like a pro. Yeah. Yeah, I'm taking this seriously.

[00:00:34]

Yes, yes, yes. Like you're fucking for real, man. You get notes and shit. Yeah, I got notes and shit. But I...

[00:00:43]

But even in this suit, like, I like to get dirty. Like, I like to work on a car or whatever. I don't want to... I love to get in a fight. And this isn't, like, me trying to protect myself from the messiness of the real world. Oh, I understand. It's just...

[00:00:59]

It just looks good. It makes you feel... it feels fashionable again.

[00:01:02]

Is it flexible? Like, you know, they make clothes that are flex.

[00:01:06]

Yeah, you can move in it. Oh, nice. And I can... I mean, you showed me how you can choke me last time with the tie. Did you get a breakaway tie? No, I didn't.

[00:01:16]

But, you know, I can... I can let you have that one, because I think I can defend it pretty well. Well, you're probably very good at defending chokes. Yeah.

[00:01:24]

No, no. Good luck with the tie. I don't have a system yet. I'll have to talk to John Danaher to develop a tie system.

[00:01:29]

All you have to do, man, is just take the back of the tie, cut it, put a little piece of Velcro on each end. Then you... at the same time.

[00:01:37]

Nope. Nope. But I think, you going under the tie to try to start to choke... Mm hmm. Actually, I mean, you're making yourself vulnerable, like, maybe to an armbar or something like that. I think there's... Don't be silly.

[00:01:50]

Don't be silly. Well, listen, if someone grabs a hold of your collar, that's the same thing. Ezekiel chokes are deadly. Yeah.

[00:01:57]

It's over. Yeah. If you sink it in.

[00:01:59]

But so collars are a real problem, right? In jiu-jitsu. Yeah, they're a real problem.

[00:02:07]

If someone gets deep on your collar... like, if someone with a suit starts doing this, man, you're caught.

[00:02:14]

Not good. Not good. Collars are not good if the grip gets deep.

[00:02:18]

Yeah, well, the problem with that is it's a handle. It's worse than a collar, because I'll get underneath that knot and I'll grab a hold of it, and then it's all just twisting. Yeah.

[00:02:29]

But you have to. You're right. Yeah. I would have to get it. I'd have to get it.

[00:02:34]

And you also have to hold on to this part, because it can loosen naturally. Unless you're really good at, like... Because it loosens, doesn't it? It loosens naturally. There's a system to this. I think you haven't thought this through.

[00:02:47]

You don't think I have? You know, I've tried to choke people with ties on. As friends.

[00:02:52]

Yeah. Well, let me grab a hold of that tie real quick and watch what happens if I do this. Yeah. No, no, not jiu-jitsu. And also, it's probably a choke. Yeah.

[00:02:59]

If I was fighting for my life, I think it'd be different. Sure, you're a tough guy. You're an actual trained martial artist. I mean, I'm not saying it'd be easy to grab your tie and choke you to death. What I'm saying is it's one more area of vulnerability that doesn't need to exist.

[00:03:12]

But see, I'm disagreeing with you and saying, like, if I was gonna fight to the death...

[00:03:17]

I would wear the suit. OK. Because I would look good. Let me tell you something about CIA agents and Secret Service guys. They wear breakaway ties. That's because they're not good at martial arts. That's not true. A lot of those guys are savages. Are they? Fuck yeah. Black belts, man. If you're a fucking... if you're a Secret Service guy, you're supposed to be protecting the president. I guarantee you a bunch of those guys are savages.

[00:03:40]

I think they're smart enough to use guns. That too. Yeah, but, you know, if they have to wear a tie, a lot of them like to wear breakaway ties. Is that a fact? Mm hmm. I might be making this up, though.

[00:03:50]

I think that's it, bro.

[00:03:51]

I know, but it's... it's a little bro fact. But only 10 percent, I think.

[00:03:56]

Let's Google "breakaway ties for self-defense," because, dude... Look, I'm definitely a dummy.

[00:04:06]

Right. Okay. I think about this stuff too much. But when I was driving limos, I always felt super vulnerable when I had to wear that tie. It looks good, though. One of my... actually, my first album, my first real album that I ever did for Warner Brothers, was in nineteen ninety-nine, and I wore exactly that outfit.

[00:04:24]

I wore a black suit with a white shirt and a black tie. It looked dope. It's called "I'm Gonna Be Dead Someday."

[00:04:31]

"Breakaway ties"... stuff like that: breakaway ties, and low-pro breakaway tie. That's what I'm wearing.

[00:04:38]

Let me explain something. Come get some. Most people, when they're vulnerable, like, say, "I'm afraid I'm gonna be picked on by bullies," they learn a martial art, how to defend themselves. Yes. You, when you felt vulnerable wearing a tie, decided not to wear a tie, as opposed to learning how to defend yourself while wearing a tie. There must be a system.

[00:04:59]

I guarantee you that.

[00:05:00]

Or you could... you could say that you could defend yourself with a dog collar around your neck, too. But I wouldn't recommend it to anyone.

[00:05:07]

I choked my dog out once. I had a pit bull, and he bit my cat... grabbed a hold of my cat. It's a terrible story. I had a crazy dog. One of my dogs was a dog that I'd gotten... I was young and irresponsible, in my 20s, and I'd gotten this dog that was bred from a pig-hunting dog in Hawaii. Wow. And those dogs are hyper animal-aggressive. They're great with people. He was great with people. He loved people.

[00:05:35]

But everything that moved, he was, like, locked in on. He would spend his days in my yard chasing lizards. His thing was to jump up on the wall of the house and try to snatch lizards. It was like a video game for him. And my friend Eddie was terrified of this dog. Eddie Bravo? Yeah. And so Eddie would come over the house, and Frank would just decide that he runs shit when Eddie's around, because Eddie was so scared of him.

[00:05:57]

He bullied him. And then he tried to kill this cat.

[00:05:59]

So he just tried to kill my cat. And I got a hold of him in time. I got my hand into his collar and I choked him unconscious. From behind? Yeah, from behind. I just dug my hand under his collar and choked him, and I put him to sleep. He went right out. It's crazy. Yeah, it works. Works on dogs.

[00:06:18]

Yeah. Maybe from the back is the best thing. From the back it's easier than from the front. But from anywhere, you can grab a dog collar if you get your hand in there.

[00:06:25]

If you're strong enough and you have good technique, you know how to go knee-on-belly and sink it in, you can put a dog to sleep. Well, you're changing my mind.

[00:06:34]

See, yeah, but it's... I don't know. I mean, look, obviously you're gonna be aware of that and you're gonna defend it, you get it. But it's an area of vulnerability.

[00:06:44]

Right. Nineteen ninety-nine. Pull up "I'm Gonna Be Dead Someday."

[00:06:48]

The cover, 'cause I'm literally dressed exactly like you on the cover. Were you actually dressed like that doing the show?

[00:06:55]

No, I never wore it doing a show. I think I just wore it for the cover, almost ironically.

[00:07:01]

No, I kinda liked the way it looked. Ah, you know... there it is. Bam.

[00:07:06]

It's hard to tell there, because that one's orange and the other one's hot pink.

[00:07:09]

But I think the shirt collar's a little more open. Like you don't give a damn.

[00:07:15]

Well, that was a long day, and it was a long photo shoot. We were drinking.

[00:07:19]

Yeah, there was a lot of chaos involved. Yeah, it was legit.

[00:07:24]

There was a lot of stuff going on there. But it was just... it's just, I like that look. Good look.

[00:07:31]

By the way, congratulations on the 10 years. Oh, thank you very much. I don't think you've celebrated. All I saw is, on Jamie's Instagram, like, a naked picture of Bert.

[00:07:42]

It says it's a 10-year picture. Yeah, we probably should do something. It was December, was officially 10 years. So it was two months ago. We should have, like, some sort of a party or something.

[00:07:52]

I know you don't like to talk about it, think about it, but you've inspired millions. So that's very nice.

[00:07:58]

It's a very nice side effect. But it's a weird gig, man. You know, it's a gig that became what it is slowly, without me understanding what was happening, which makes it weirder and weirder.

[00:08:11]

And with it has come increasingly stronger levels of responsibility, to where, you know, now I have to actually vet guests and think about what they're saying. Whereas before, I would have someone on who's crazy, like, "Crazy motherfucker, let's hear what he has to say." And people would say a lot of crazy shit. Then they would say, "Oh, you know, you didn't push back," or, "You had this person on and they said something irresponsible," and I had no idea what they were gonna say.

[00:08:37]

There's a lot of people that have said some pretty outrageous things that I had no idea they were going to say.

[00:08:43]

One of the things you inspired me to do is start a podcast on artificial intelligence, and I have Jack Dorsey as a guest coming up. And that's a good example of somebody who got, like, an insane amount of pushback.

[00:08:58]

Yes. Because they were mad that I didn't talk to him about censorship.

[00:09:00]

My take on it was... it was certainly irresponsible on my part, the first podcast, because my take on it was, I just wanna see what it's like to be a guy that starts this thing and it becomes probably one of the most important conversation tools the world's ever known. And also, along the way, it becomes... you know, it becomes something... Like, right now it's weird. Like, Twitter now is 50 percent hot dumpster fire and, you know, amazing, inspiring stuff.

[00:09:39]

You can always find the dumpster fire in all kinds of conversation. Yes. And the confusing thing to me about your conversation with Jack... I didn't look at the Internet before I listened to it. I mean, I enjoyed it. It was interesting. I learned a lot from your first conversation with Jack. And then I looked at the Internet, and it told me I'm supposed to hate that conversation.

[00:09:59]

And what I'm confused about is why. Why is there such hatred thrown towards you? I also talked to the head of YouTube search and discovery. There's a lot of hate towards YouTube. You know, there's a lot of hate towards Twitter, a lot towards Facebook, and deservedly so, there are some challenges and so on. But they're doing, like, an incredible service. And the algorithm they're trying to develop and control is really hard to develop and control.

[00:10:27]

Yes. So, for sure. So the pushback that people get... it's almost like they're ignoring... they're taking specific anecdotal pieces of evidence: "Look, this person said this. It's not that problematic in our eyes, but they somehow got censored, removed from the platform." And they don't look at the bigger picture of how challenging the entirety of it is, and how incredible, first of all, the platform is, to have a global conversation like this.

[00:10:58]

And how hard it is to achieve the goal of having... it sounds cheesy, but having, like, a healthy conversation, a healthy discourse. Because you want an algorithm and a platform that removes the assholes from the scene, and it's a really difficult challenge. Because, you know, if there's one person who's really loud, screaming in the room... It comes back to the party: you have a cool party, and you invite people, some communists, some right-wingers, whatever. It doesn't matter.

[00:11:30]

They can all disagree, but they're not assholes. Then they have, like, interesting debate, conversation, and so on. And then there's somebody that comes in and just starts screaming, like, one slogan or something like that, or is trolling, is completely sort of non-genuine in their way of communication. They're destroying the nature of the conversation. And then, of course, that person... you know, the bodyguards come in and say, "Can you please leave the party, sir?"

[00:11:58]

Then they get extremely... that's exactly the kind of personality that gets extremely upset. Mm hmm. And sometimes they almost look for that. So what are you supposed to do, as a Jack Dorsey, as a leader of that kind of platform?

[00:12:10]

It's a very good question. I really think that there's no real answer. It's one of the reasons why it's so frustrating. You know, if you just let people say whatever they want, whenever they want to. There's gonna be a lot of people that get turned off to that kind of a platform because you're going to have a lot of people yelling out racial slurs, ethnic slurs, gender slurs, homophobic slurs.

[00:12:36]

There's going to be a bunch of people out there trolling. There's gonna be a bunch of people that just say things to rile people up. There's gonna be a bunch of people that just want to shit-stir, and they want to dox people. So then you have to set parameters. What are the parameters? You can't dox people. Don't say racial slurs. Don't say ethnic slurs. You're managing at scale, and you're managing an insane amount of people.

[00:13:01]

But then there's legitimate criticism that they lean towards progressive people and liberal people, and they have woke politics. Like, for instance, you can get banned from Twitter...

[00:13:15]

For life, if you deadname someone. So, Lex, if you became a female and you changed your name to Ali, and I said, "Fuck you, man, you're Lex"... banned for life.

[00:13:27]

That's what a deadname is?

[00:13:28]

That's deadnaming.

[00:13:29]

Like, if you wanted to call Caitlyn Jenner "Bruce" on Twitter, you would be deadnaming her, and you would get banned for life. A woman named Meghan Murphy, who is a TERF... you know what a TERF is? What do you think? I don't know the term.

[00:13:45]

I'm sure you don't, because you're too balls-deep in science. TERF is trans-exclusionary radical feminist. So, trans exclusionary... Why do I have such a hard time with that word? Exclusionary, right? Exclusion. Why does it sound wrong? Exclusionary sounds wrong.

[00:14:03]

What does it mean to be exclusionary to trans people?

[00:14:07]

So... look, TERFs do not want trans people to have a say in women's issues. I see. They think that they are a different thing: that there's women and women's issues, and these feminists that have been female their whole life, dealing with women's issues, do not want trans people coming in. And in many cases, what you find is that trans people come in and then the conversation changes, and it becomes about trans issues. And they want these conversations to be about women's issues and feminist movements.

[00:14:41]

It's complicated, right?

[00:14:43]

She got banned from Twitter for life for saying a man is never a woman. They made her take the tweet down. So she took a screenshot of it, took it down, and then put the screenshot back up. And then they banned her for life. Should she get banned? No. No, she shouldn't, because biologically she's correct. There's an argument there, a scientific argument, whether a man is never a woman. But can a man identify as a woman, and should you respect him or her and treat them as a woman?

[00:15:13]

Yes.

[00:15:14]

So, yes. So the question is where? I mean.

[00:15:18]

I'm not too deep into thinking about these specific issues, but the question is whether you should get banned for being an asshole, or should they get banned for lying? Because I think lying is... OK, true, whatever, a lot of people lie on Twitter. And sort of...

[00:15:33]

Or insults. You can insult people on Twitter as long as you're not specific about their gender. See, that insult thing, that's where it gets... It's the party thing. When you have that asshole, douchebag, whatever term you want to use, they show up to the party. And if a person shows up to the party and a lot of people leave because they're annoying or whatever... Yes, we should do something to discourage that behavior.

[00:15:57]

That's a good point.

[00:15:57]

However, let's paint a different picture of a party. Let's have a party where everyone says, "My pronouns are they/them," and "xir," and so on. And then you come in: "Come on, bro. You're a guy." "No, no, no, I'm a..."

[00:16:14]

"You fucking cisgendered heteronormative piece of shit." And then they want to kick you out of the party. Now, all you're saying is, "You're a guy." Ban both of them? No way. Ban...

[00:16:26]

Ban the person who's not open-minded or respectful. Or, you know, don't ban people.

[00:16:33]

Here's the thing... no, no, no, no, not that. Yeah. So, of course, it's been well documented by people now: the reason we probably have the current president is that people on the left are also very rude and disrespectful.

[00:16:47]

It's a small percentage of the people on the left. A very small... but they're all on Twitter, that's part of it, really. They are all on Twitter. But it's also a small percentage when you...

[00:16:57]

It's so hard to have a group and call that group the left because the variables are so extreme.

[00:17:06]

There's so many different people that follow politics, or that espouse certain belief systems, that recognize themselves as left. Funny enough, you're probably on the left. Yes, I'm very much on the left.

[00:17:19]

Yeah, but I'm not considered to be on the left, because I'm a cage-fighting commentator with the American flag behind me. Yeah. I'm very bro-ish. I hunt. I bow-hunt, which is even more bro-ish. Right. And I am unabashedly masculine. I am a man and a comedian, yes I am, and I'm a dirty comedian. And I make fun of everything, including sacred cows like gender, homosexuality, heterosexuality, my own kids, my wife, my mom, everybody.

[00:17:49]

I make fun of everybody. And if you take that stuff out of context and just publish a bunch of it, it makes you look like a moron, or makes it seem like you're an asshole.

[00:17:58]

And, you know... what is the left, right? What is the left? In my mind... when I was a child, I always thought of the left because of how I grew up. My parents were hippies, all right? My stepdad was an architect, and before that, he was a computer programmer.

[00:18:16]

He had long hair until... I think I was 20 years old when he cut his hair. I mean long, like, down to his ass, like a Native American. Nice. And he, you know... he smoked pot when I was little.

[00:18:29]

I mean, he was always around hippies. I lived in San Francisco from the time I was seven till I was eleven. And my family was very left-wing. They were always pro gay marriage, pretty pro gay rights, pro racial equality. Pro... just name it, man. Pro welfare, pro... you know, the idea was open-mindedness, education, all these things are good. And war was bad.

[00:19:04]

And, you know, there's a lot of things that they had very strong beliefs on that maybe they weren't entirely nuanced on as well. You find that about people on the left as much as you find that about people on the right. But it's the radicals on both sides. I mean, there's nothing wrong with being conservative, right?

[00:19:28]

There's nothing wrong with valuing hard work.

[00:19:30]

There's nothing wrong with someone who values fiscal frugality, or someone who has, you know, a conservative view on economics or on social policies and wants less government. There's nothing wrong with those things either.

[00:19:47]

It's when you get extreme. Like, yeah, you recently had an amazing guest who had converted a bunch of folks in the Klan, Daryl Davis.

[00:19:55]

Daryl Davis, he sat right here. This is his seat. He's amazing. He's an incredible human, man. But that kind of thinking, I wish we saw more of that. Yeah. It's sort of, even if you're on the left, to be able to talk to people on the right.

[00:20:08]

Instead of just shutting them out. That's the problem with this idea of kicking people out of a party. If you kick people out of the party, guys like Daryl Davis never get to convert them. There've been people from Twitter that have been converted. You know, Megan Phelps-Roper is a famous one. She was a part of the Westboro Baptist Church.

[00:20:25]

Her grandfather was Fred Phelps, that fucking famous crazy asshole who was, like, super rude, who, you know, would make them take those signs that say "God hates fags" and literally go to soldiers' funerals and say that soldiers died because God is angry that people are homosexual.

[00:20:45]

So Megan was completely entrenched in this toxic ideology.

[00:20:51]

What did it take for her to escape that? Yes.

[00:20:52]

She met her husband on Twitter, from arguing back and forth. And now she's out. And if you talked to her, you would never believe it, man. And not that long ago, either. Not that long ago... she was in that church, like, six years ago.

[00:21:04]

It's kind of incredible that you can sometimes grow out of that mindset. I mean, it's inspiring that you can hold a mindset of hatred and, I guess, escape it. Well, she was indoctrinated into it from the time she was a child. And, you know, for her, it was the only life she knew. Right. Her family is in that. And for her...

[00:21:23]

She just was... I mean, by whatever grace of the grand universe's plan, she had enough open-mindedness to take into consideration some of these other things that people were saying. Now, we have a problem today with cancel culture. It's a real problem, where you just want to write people off. Well, those people still exist. It's basically a cultural form of euthanasia.

[00:21:47]

It's like you want to go out and whack everyone who doesn't agree with you.

[00:21:50]

But if you do that... it's eugenics, or whatever you want to call it. You just eliminate everyone who's not the way that you like. Culturally eliminate them, take them out of the conversation.

[00:22:04]

They still exist and they still exist.

[00:22:06]

So what happens? Well, then they're angry. They're angry, they're left out of the conversation, and they don't grow. And then you've written them off as a human being. You've said that they're 100 percent bad. Now, if you had a spectrum of people in this world, from 100 percent bad to 100 percent good... I mean, there are some beautiful people that really are 100 percent good. My friend Justin Wren, who runs the Fight for the Forgotten charity, he's about as close to 100 percent good as you can get.

[00:22:31]

I mean, this beautiful person goes to the Congo and makes wells for the Pygmies, and gets malaria, and still isn't discouraged. He's got some crazy parasite now that they don't even know what it is. They can't recognize it. He's been suffering for eight months now. I think that's about as good as you can get, right? And then there's people, you know... well, it's a gray area when you start to drift away from the...

[00:22:56]

I guess I have the same thing in my life. The focus I have is in an academic setting, in science. And that's the inspiration your podcast gave me: to talk to the people outside those that are sort of conventionally accepted by the scientific community. Like, a little bit on the fringes. Yeah, the quote-unquote fringes. We have the same thing in machine learning and artificial intelligence. There's people that are working on specific...

[00:23:20]

It's called deep learning: these learning methodologies that are accepted. You know, there's conferences, and we all kind of accept the problems we're working on.

[00:23:28]

And the people that are on the fringes are people in neuroscience... actually, anybody thinking about or working on what's called artificial general intelligence is already on the fringes. Yeah, if you even raise the question, "OK, so how do we build human-level intelligence?" that's a little bit of a taboo subject. And consciousness was called the C-word for a while. Consciousness? Really? Yeah. Well, it's scientists, so... I know, I understand.

[00:23:54]

Yeah, but explain it to me. Like, what's the aversion? What is everyone worried about? It's this culture of rolling your eyes, the same way you might roll your eyes if somebody tells you the earth is flat. Mm hmm. They sort of put all other things in that category as well, like, "Ah, that's... OK, whatever." So in the case of consciousness, we really don't understand very much at all what consciousness is: the subjective experience, the fact that it feels like something to take in the world, that it's not just raw sensory information being processed. It actually feels like something to touch something, to taste something, to see something.

[00:24:39]

It's, like, incredible. David Chalmers calls it the hard problem of consciousness: why do we feel it? OK, but we don't have scientific, physics, engineering methods of studying consciousness. So it immediately gets put into this bin of, it's not a legit thing: you're a little bit crazy, off the reservation. I think somebody was saying that's a slur. I never even thought of that. I didn't think of what that meant. Yeah. So they already put you in this bin.

[00:25:07]

You're not a legitimate researcher, in the same kind of, you know... And I think we're now in a culture, which is great... You know, Eric Weinstein is good at this. I'm hoping to be good at this. You're good at this. Allowing those people on the fringes in and saying, "What are your ideas? Explore those." Yeah. Of course, as you have a greater and greater platform, there is a line. You don't want to go too far into the fringes.

[00:25:31]

Yeah, that's something I'm aware of now that I wasn't aware of, say, like, three or four years ago. And I used to have a lot of those. I've had some people on that I would never have on again. You know, and then I've had some people on that I've been criticized for having on.

[00:25:46]

I'm like, "Okay, I see why you're upset." But I think there's value in having conversations with people that are on the fringes. There's people that are bad-faith actors, right? They act in bad faith. Those are the ones you have to be careful of. And sometimes you don't know who they are until you get to know them. Yeah.

[00:26:03]

And then you've already kind of opened the door. Like, for some people, for, like, the legitimate Democratic Party, Tulsi Gabbard is on the fringe. Yeah, right. But I think you having her on is great, is exploring... You know, she's one of the young minds exploring sort of the role of the United States, its foreign policy in the world, militarily, in terms of trade and so on. So she's an excellent mind who I don't think is on the fringe. I'm saying I don't think she's on the fringe.

[00:26:34]

Bernie Sanders, for many people, still is on the fringe. Yeah. And I think he gets misrepresented, though. Yeah, for sure. One of the things that was tremendously beneficial for me was to sit down with him for hours and have a conversation, and you go, "Oh, you're a real person. You're not this wacky guy yelling about billionaires." You know, when you get these 90-second soundbites in these debates, you don't get a chance to know who someone is.

[00:27:01]

Yeah. So, listen to this. I listen to a lot of radio on the left and the right, to try to, like, take in what people are thinking. I still listen to this program, a thing called the Thom Hartmann Program. He's, like, a major lefty, but he had this segment called "Brunch with Bernie," and he would invite Bernie Sanders on, like, every Friday or something like that. And just the intellectual honesty and curiosity that Bernie exhibited was fascinating. Sort of, as opposed to doing the political thing of repeating the same message over and over, which is kind of what it sounds like when you listen to him now publicly, he's actually a thinking individual and somebody who is open to changing his mind.

[00:27:47]

But within that, he's just been completely consistent.

[00:27:50]

What people are terrified of is that he's going to raise taxes on successful people and ruin business. Yeah, that's what people are worried about: that in doing that, it will crash the economy. Yes. I don't know if they're right. I don't even know if they're...

[00:28:05]

So, first of all, people are using the word "socialist." "So you're saying he's a socialist? Do you really want socialism? America is a great country because we're capitalists," kind of thing. From my perspective, I think we already have a huge number of social... Well, he's a democratic socialist. Democratic socialist.

[00:28:22]

So it's a different perspective. He just values workers. The idea is, he wants people to earn a living wage. He wants people to not be indebted with a tremendous amount of student loan debt when you're just 21 years old and getting out of college. He thinks it's insane, and I agree with him. He doesn't want people to be burdened in this insane way if you ever get sick. And I agree with him.

[00:28:47]

Let's improve the health care system. I think, as a community, if we're looking at the United States as a community... you know, look, it's great to support business. It's great to have a strong economy. It's great to give business the confidence to take chances, and a lot of people think Donald Trump does that. It's also great to take care of our own. And I don't think we do that enough.

[00:29:09]

I don't think we take care of our own enough, in terms of: we have the same problems in the same inner cities that we've had for decade after decade after decade, and there's no significant attempt to change that. But meanwhile, we do these nation-building projects in other countries, and we have the interventionist foreign policy where we go in and invade these countries and try to prop up new governments and try to support them. And we spend insane amounts of money doing that.

[00:29:36]

And all the while, we don't do anything for our inner cities, which are the exact same fucked-up places that they were in the 70s and in the 60s. You know Michael Wood Jr.? He was on the podcast a couple of times, and he used to be a police officer in Baltimore. Yes, I know. OK. So... I guess I'm just horrible with names. His experience was, first of all... he found a piece of paper that showed, like, a crime

[00:30:06]

docket from the 1970s: all the stuff like drugs, crime, robbery. It was all the same issues in the same neighborhoods that he was patrolling today. And he was like, "Holy shit." And he realized, like, "Oh, this is a quagmire." And then he found out about the laws that were in place from way back in the day, where, literally, if you were an African-American, you couldn't buy a home in certain areas.

[00:30:35]

Right. They had... what is that term? Redlining? Is that what the term is? Where they designate certain areas where they literally won't sell homes to black people.

[00:30:47]

And he was becoming aware of this shit while he was a cop. And, you know, in the beginning he was all gung ho. He's like, yo, I'm a cop, I'm here to bust bad guys and do the right thing. And then along the way, he kind of recognized what you're dealing with: systemic racism.

[00:31:03]

That's right. Redlining. Yeah. So, yeah, that hasn't been addressed.

[00:31:07]

It's all about... I mean, there's a million other things at home. Education. Yeah. Everything. And all those things.

[00:31:12]

I think Bernie Sanders, when he talks about those things, he seems like a guy who really cares about education, health care and people that live in poverty. Yeah. I don't know if he's going to be able to do anything. I don't know.

[00:31:27]

That's the main thing. People say democratic socialist and so on, but he's just going to make a slight move in whatever direction he's trying to advocate, which in this case is more investment into the infrastructure and so on here at home. But, you know, he's just one human being. There has to be a Congress that represents the people. And if there's anything, I think Congress is probably the most hated entity in all of the universe.

[00:31:52]

Like, if you look at all the polls of what people like and hate, rats are above Congress in terms of favorability ratings.

[00:32:00]

So Congress is really the broken system. Bernie won't be able to do much. The role of the President, as I see it, is twofold. One, a terrifying one, is to start wars. So that's a very serious responsibility you have to take. And the second is to inspire the population. In terms of executive power, of enacting laws, there's not much power. All you can do is what our current president is doing: sort of inspiring, in that case, the Republicans in Congress to work together on certain legislation. So you can inspire the Congress and inspire the people, but you don't have actual direct power.

[00:32:42]

So Bernie is not going to turn America into a socialist, you know, haven.

[00:32:51]

He's going to take a small step, maybe focusing on one aspect like health care or something like that, like President Obama did, and try to make a little change. So in that sense, people that are genuine and have ideas... like Andrew Yang is another one. He has a ridiculous number of ideas. Have you seen? He thinks all cops should be purple belts in jiu-jitsu.

[00:33:16]

I like it. Go, Andrew. Yeah. He has a million other ideas, as you'll see.

[00:33:21]

Well, he's a genius. He's a brilliant guy and he's an entrepreneur. So he comes at this stuff from a different angle. Yeah.

[00:33:28]

And he's open-minded. Although I disagree with him on his evaluation of the state of artificial intelligence and automation, in terms of its capabilities and its impact on the economy.

[00:33:42]

You don't think it's going to be as big a deal as he thinks it is, on the timescale that he thinks? But I also want to be careful commenting on that, because I think for him it's a tool to describe the concerns, the suffering that people go through in terms of losing their jobs, the pain that people are feeling throughout the country. So it's a mechanism he uses to talk to people about the future, and about the fact that there are people that are well off, like the different tech companies, that should also contribute to investing in our community.

[00:34:16]

I mean, on the specifics, I want to kind of sit back and relax a little bit. It's like when you watch a sci-fi movie and the details are all really bad: I just want suspension of disbelief, or whatever, and to enjoy the movie. In the same way, the stuff he says about AI... he's not very knowledgeable about AI and automation, so it touches me a little bit the wrong way. We're not as far along as that. The transformative effects of artificial intelligence in terms of replacing humans in trucking, autonomous vehicles, is something I know a couple of things about, so I can speak relatively confidently.

[00:34:54]

The revolution in autonomous vehicles will be more gradual than Andrew is describing, but that's OK. He has a million other ideas, and UBI, the universal basic income, or some kind of support structure like that, could nevertheless be a very good idea for people that lose their jobs, for people to be mobile in terms of going from one type of job to another type of job, continually learning through life. It's just that artificial intelligence, in this case, I don't think will be the enemy.

[00:35:25]

There could be other things that are sort of neighbors of artificial intelligence, like the software world eating up some of the mechanization of factories and so on. You know, maybe the fact that the way Tesla and Elon Musk are approaching the design and engineering of vehicles is a little bit more software-centric will move some of the jobs from Detroit, Michigan, in terms of cars, to Silicon Valley.

[00:35:59]

Not necessarily location-wise, but a different type of person would need to be hired to work on cars: a little bit more of a software engineer, software-centric, versus the hardcore mechanical engineers, the more traditional car guys. Yeah. And gals, right.

[00:36:17]

Yeah. So there will be some job replacement. But it's not that artificial intelligence in trucks will completely replace the job. In the case of trucks, there are a lot of complicated aspects to the impact of automation on trucking jobs. There's actually a lot of need for those jobs.

[00:36:37]

There are already people leaving that job sector. It's a really difficult job. It doesn't pay as well as it should. It's really difficult to train people, and so on. So the impact that he talks about in terms of AI is a little bit exaggerated.

[00:36:56]

Well, like I said, a million really good ideas, and he's open-minded. So I think the nice role of a president is to have ideas like the purple belt one, inspire people and inspire Congress to implement some of these ideas, and be open-minded and not take yourself seriously enough to think you know all the right answers. Andrew Yang and Bernie are like that, although Bernie is like 70 years old. So yeah, he's getting up there.

[00:37:29]

Yeah. Then it'll be President Tulsi when he kicks the bucket. You know, I didn't know that. Yeah, that's so... I think Hillary Clinton accidentally endorsed Bernie and Tulsi Gabbard for president. A reverse endorsement.

[00:37:42]

Bye. Yeah. Bye.

[00:37:44]

It's just such a petty thing to say, that no one likes Bernie. Come on, lady. You're in the twilight of your life.

[00:37:52]

I think she's really aware of the fact that if she says something like that, people are going to like Bernie more. I think it's an endorsement.

[00:38:00]

I don't think she has any idea about that. I think she's super insulated. I don't think so. I think she thinks that she can actually hamstring him by saying something like that, and she doesn't understand that it just makes people realize the things that they say about her are correct.

[00:38:13]

Yeah. You don't give her enough credit. Really? You gave her credit for killing Epstein?

[00:38:18]

I was joking and I don't think she did.

[00:38:20]

I think Bill... I'm joking. Somebody did it. I don't know who it was. Maybe he's still alive. Could be. That's what Eddie Bravo thinks now. Eddie Bravo thinks he's in the Dominican Republic somewhere, eating bananas and drinking mai tais.

[00:38:37]

It's a conspiracy in a conspiracy. Yeah, well, Eddie's always like that. He's already levels deep. He plays 4-D chess when it comes to conspiracies.

[00:38:45]

Do you think that Andrew Yang is off but will ultimately be correct in terms of the automation timeline? Or do you think that maybe he doesn't know?

[00:38:56]

He clearly doesn't know as much as you know about automation and artificial intelligence, but do you think it's possible that... I think he's looking at a timeline, I think he was thinking within the next 10 years, millions and millions of jobs are going to be replaced. Do you think that it's more like 20 years or 30 years, but still something that concerns...

[00:39:14]

The timeline? Of course, nobody knows. But I think the timescale is more stretched out, at 20, 30 years, and it'll continue. There'll be certain key revolutions, and those revolutions...

[00:39:28]

So "revolution" is maybe an incorrect word to use, because they'll be stretched out over time. The autonomous vehicle revolution, to achieve a scale of millions of vehicles that are fully autonomous in navigating our streets, I think is 20, 30 years away. Mm hmm. And it won't be all of a sudden; it'll be gradual. There are people like Waymo, the former Google self-driving car company, who are doing a lot of testing now. Incredible journey. I visited them for a day.

[00:39:57]

They'll be expanding their efforts slowly. They're also doing Waymo trucks, autonomous trucking; they're deploying them in Texas, I think. And then, of course, Tesla, who's going to approach a million vehicles, and they're trying to achieve full self-driving capability. But that's going to be gradual.

[00:40:15]

I just got a new update for the Tesla. Some new self-driving update. It cost four grand.

[00:40:22]

And I was like, what is it? You know? And... I think I was high.

[00:40:27]

And I was looking at my phone, and I was like, OK, let's do it. And so I got this update. But I'm like, what did I just pay for? And I don't even know if I'm going to use it. I think it changes lanes and... yeah. Well, OK, I'm not exactly sure what the update is. See if you can find out, Jamie.

[00:40:46]

So it's probably the quote-unquote full self-driving. Yeah. Very important, the safety person in me here: it cannot drive itself fully autonomously. You have to keep your eyes on the road. Oh, I saw a guy sleeping on the Internet and he was fine. Yeah, well, in a car.

[00:41:05]

Out cold.

[00:41:07]

I'll look into it. It was on CNN. Someone filmed the guy.

[00:41:13]

He was in his car, passed out. And not just one; there have been a few examples of that. People commuting on the way to work.

[00:41:19]

Out cold.

[00:41:20]

So some are for fun and fake, but certainly the real thing happens, you know, falling asleep. We do that with manually driven cars too.

[00:41:28]

I enjoy driving home in my Tesla from the Comedy Store at like 1:00 in the morning, hitting that Autopilot. And I keep my hand on the wheel. But there's a level of relaxation to it.

[00:41:38]

Keep your eyes on the road. Yes. I'm not looking at my phone or anything stupid. But all you do is press that double button. Yeah. And it changes lanes.

[00:41:48]

Oh, it doesn't change lanes? It stays in the lane, does it? It can change lanes.

[00:41:52]

But I think you have to prompt it, like if you want it to. There's an option for Navigate on Autopilot that would take you everywhere you need to go.

[00:42:01]

Yeah. But I think you need to step in at certain points.

[00:42:03]

Yeah. And now it can change lanes without you pressing; that's what it is now. Yes, it can do it automatically. And they're doing hundreds of thousands... I think they're tracking the number of automated lane changes. It's kind of incredible that this is possible. There are hundreds of thousands of automated lane changes, without human initiation, happening right now. I mean, to me, as sort of a robotics person, it's just incredible.

[00:42:29]

Here it is, from whole snack on Twitter. It says: Tesla's new update lets the car recognize traffic cones, stop signs, stop lines, trash cans, and stoplights and their colors. If you try to run a stop sign under Autopilot, the car emergency brakes and forces you to take over. After a while, you can't run stop signs.

[00:42:49]

So this isn't the update you paid four thousand dollars for. That's already part of that. But I'm actually surprised that the $4,000...

[00:42:55]

But it says Tesla's new update. What's the time on this, Jamie? What date?

[00:42:59]

December 24th. Oh, OK, so fairly recent. So this isn't the exact update you pay $4,000 for. I think this is a general part of the full self-driving package, which is the $4,000.

[00:43:12]

And just to be clear, again, safety person: it's not like it detects traffic lights and stops at the traffic lights for you. Maybe in this case it does emergency braking at the stop sign, but it's not good enough. Don't trust it. It's not there. In fact, as you know, there are a lot of people, including myself, who think it's quite a few years away.

[00:43:34]

But also, sort of like on the podcast, you just got to talk to Elon Musk, meet him, talk to him in person, and realize that there are people in this world that can make the impossible happen.

[00:43:48]

You interviewed him as well. Yeah, twice. Yeah. Yeah. Tell me, what was that experience like for you?

[00:43:54]

So, you know, it was quite incredible, in the sense that he is a legit engineer and designer, which is a pleasure for me. I've talked to a few CEOs, talked to Eric Schmidt, and CEOs are usually a little bit more business-oriented. Elon is really, really focused on the fundamentals, the first principles, down to the physics level of the problems being solved. Whether that's SpaceX, with the fundamentals of reusable rockets and, you know, going into deep space and colonizing Mars, or whether that's Neuralink,

[00:44:28]

getting to the core, the fundamentals of what it's like to have a computer communicate with the human brain. And with Tesla, on the battery side, he threw away a lot of the conventional thinking about what's required to build, first of all, an appealing electric car, but also one that has long range. That's something I don't know as much about. But on the AI side, I mean, he boldly said: from scratch, we can build a system ourselves, in a matter of months

[00:44:59]

or a couple of years, that's able to drive autonomously. And most people would laugh at that idea. Most roboticists, who know from the DARPA challenges how hard this problem is... and he said, no, no, no, not only are we going to throw away lidar, which is this laser-based sensor, we're going to use cameras only. And we're going to use deep learning, machine learning, a learning-based system, so a system that learns from scratch.

[00:45:26]

And we're going to teach it to drive from eight cameras and so on. Just talking to somebody like that... it's not just the fact that he thinks like that. It's just fun to talk to people like that. I don't meet them often. Let's say, let's stop this bullshit of thinking that this task is impossible. Why is it impossible? Is it really impossible? What you find out when you start to think about most problems from first principles is that it's not actually impossible.

[00:45:57]

And then you have to think, OK, so how do we make it happen? How do we create an infrastructure that allows you to learn from huge amounts of data? So one of the most revolutionary things that Tesla's doing, and hopefully other car companies will be doing, is the over-the-air software updates, just like the update that you got. The fact that, just like on your phone, you can get updates over time means you can have a learning system, a machine-learning-based system

[00:46:23]

that can learn and then deploy what it learned, over time, and do that weekly. That sounds maybe trivial, but nobody else is doing it. It's completely revolutionary. Most cars, once you buy them, don't learn. A Tesla learns, and that's a huge thing. Forget about Autopilot and all this stuff; just the fact that you can update the software, I think, is a revolutionary idea. And then they're also doing everything else from scratch.
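The over-the-air update loop he's describing can be sketched in a few lines. This is a purely hypothetical illustration, not Tesla's actual mechanism: the version numbers, the `fetch_latest_model` helper, and the weight values are all invented for the example. The point is just that the car itself never trains; it periodically swaps in a newer fleet-trained model.

```python
# Hypothetical sketch of an over-the-air model update loop.
# The car runs a frozen model and periodically hot-swaps in a
# newer one trained centrally on fleet data.

CURRENT = {"version": 41, "weights": [0.10, 0.20, 0.30]}

def fetch_latest_model():
    # Stand-in for the update server: returns the latest published model.
    return {"version": 42, "weights": [0.11, 0.19, 0.31]}

def maybe_update(current):
    latest = fetch_latest_model()
    if latest["version"] > current["version"]:
        # Swap in the newly trained weights; no learning happens on-device.
        return latest
    return current

CURRENT = maybe_update(CURRENT)
print(CURRENT["version"])  # prints 42: the car now runs the newer model
```

In a real system the interesting parts are everything this sketch omits: signed packages, staged fleet rollouts, and the ability to roll back a bad model.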

[00:46:50]

This is this first-principles type of thinking. The hardware: so the hardware in your car, I don't know when you got the Tesla, but it should be Hardware version 2. That hardware performs what's called inference. So it's already trained, it's already learned its thing, and it's just taking in the raw sensory input and making decisions. OK. They built that hardware themselves from scratch. Again, a ballsy move. Now they're building what they're calling, again,

[00:47:21]

he's such a troll, but they're calling it Dojo. That's the name of the specialized hardware for training the neural networks, for training the models.

[00:47:32]

Training is the learning side of it. So they're building their own, like, supercomputer. Google has a TPU to improve training. TPU stands for tensor processing unit. Compare that to the more general option: Nvidia has graphics processing units, GPUs, that all the nerds, all the people like me, have been using for machine learning, to train neural networks. It's also what most gamers use to play video games, right? But they have this nice quality:

[00:48:00]

they can train huge neural networks on them. OK.
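The training-versus-inference split he keeps returning to can be shown with a toy model. This is a minimal sketch with one invented weight and a made-up dataset, nothing to do with Tesla's or Google's actual systems: training is the expensive loop of forward pass plus gradient computation plus weight updates, which is what TPU- or Dojo-class hardware accelerates, while inference, which the in-car hardware runs, is just the forward pass on frozen weights.

```python
# Toy single-weight model: y = w * x. Illustrative only.

def forward(w, x):
    return w * x  # inference: this forward pass is all the in-car hardware does

def train(w, data, lr=0.1, epochs=50):
    # Training: forward pass PLUS gradient computation and weight updates.
    # This extra work is what specialized training hardware is built for.
    for _ in range(epochs):
        for x, target in data:
            pred = forward(w, x)
            grad = 2 * (pred - target) * x  # d/dw of squared error
            w -= lr * grad
    return w

# Fit w so that y = 3x, then "deploy" the frozen weight for inference.
w = train(0.0, [(1.0, 3.0), (2.0, 6.0)])
print(round(forward(w, 4.0), 2))  # prints 12.0
```

The asymmetry is the design point: training touches huge datasets and runs backward passes, so it lives in the data center; the deployed model only ever runs `forward`, so the in-car chip can be much simpler and lower-power.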

[00:48:03]

A TPU is specialized hardware for training neural networks. GPUs allow you to play video games and train neural networks. TPUs clean some stuff up to make it more efficient: more energy-efficient, more efficient for the kinds of computation neural networks need. Google has them; a bunch of other companies have them. Most car companies would be like, OK, let me partner with somebody else, get TPUs from Google to use,

[00:48:32]

or use Nvidia's GPUs. Tesla is building it from scratch. So that kind of from-scratch thinking is incredible. And the other two things I really like about Musk: one is the hard work. We live in a culture where, like, so many people... I often don't sleep, I do crazy shit in terms of focus, stay up nights at a time, and people often recommend to me: you know, balance is really important, and taking a break is important.

[00:49:03]

You know, that you rejuvenate yourself, you return with fresh ideas. All those things are true. Sleep is important; you've had people on the podcast tell you how important sleep is. But what most people don't advise me is that hard work is more important, passion is more important, than all of those things. That should come first, and then sleep empowers it, rest empowers it, rejuvenation empowers it. Especially in engineering disciplines, hard work is everything.

[00:49:32]

And he's sort of unapologetic about that.

[00:49:35]

It's not like, come, come work with us, it'll be a friendly environment with free snacks.

[00:49:43]

It's like, you're going to work the hardest you've ever worked,

[00:49:48]

whether you agree with him or not, on the most important problems of your life. OK? I like that kind of thinking, because it emphasizes the hard work.

[00:49:57]

The other part, in terms of meeting him in person: I know you got to interact with him off-mic, because when he was on the mic with you, he was very kind of...

[00:50:10]

It was hard to bring it out of him. Yeah. In person, before that, he was very jovial and friendly and huggy. He's great. Yeah. And then once he got on the microphone, I was like, oh, there's heavy lifting to bring this out of him. So then we started drinking. Oh yeah, and it helps a lot. And then once we started drinking, you know, then I got to see who he is.

[00:50:31]

Yeah. I should've done that.

[00:50:33]

But I like drinking. Yeah. No, the thing that's really interesting is, if you look at his biography, the kind of stress he's been under: he's been at the brink of losing his companies several times. Yes. And, you know, he lost a child, and he just... Well, that's the other thing that inspired me: that he can be a good dad while running so many companies. Because I'll often wonder about, you know, the kind of hours I put into what I'm doing.

[00:51:07]

Can I have a family? Can I really give, because I'd love to be a father. Can I have a family and be a good father? I'd like kids. Very, very, very, very difficult.

[00:51:15]

If you're working 18 hours a day? Yes. To give your kids the time that they need.

[00:51:19]

But it's possible. Not at 18 hours, no. There's always, I believe, in life, days, months, maybe years where you have to do the 18 hours a day, but not all the time. Right, you do the sprints. Yeah. And then establish everything and then sit back. But the problem with a lot of guys like him is, first of all, it's very difficult to find a replacement for the way he thinks. Right. If he's the CEO of these companies and he's the mastermind behind all these things, and then he wants to step back,

[00:51:50]

Finding a commensurate replacement is insanely difficult because most people who would be a potential replacement are already off doing their own shit.

[00:52:01]

And there's not many people like him. That's interesting. It's actually the disappointing thing to me, that his kind of thinking is a rarity. Yes. I'm not sure why that is, exactly.

[00:52:13]

Well, he jokes around about it, but I think there's a spectrum of evolution.

[00:52:22]

And his mind is clearly way more advanced than my mind. There's something going on in his mind in terms of his attraction to engineering: solutions to global problems, solutions to traffic problems, pollution problems, all the things... and he's trying to give the world Internet. I mean, he's got all these things going simultaneously. And one of the things that I got out of Elon when I was talking to him was that he almost has a hard time containing these ideas that are just pouring out of his head like a raging river, like he's trying to capture handfuls of water as this raging river of ideas goes through his head. And when he described his childhood, he said he thought that everybody was like that.

[00:53:11]

And then as he got older, he thought he was insane.

[00:53:14]

Yeah. And I can relate to that. I'm trying to learn how to talk, but I have trouble talking, because it's like a million ideas running in my head. Anything you say, I'll immediately start... there are these weird tangents that go off, and I want to start thinking about them.

[00:53:31]

Is that true with a lot of people in your line of work? I think so. I think that kind of puzzle-solving, that's one of the commonalities. I'm just surprised that a CEO is able to continue being that kind of puzzle solver.

[00:53:43]

Did you see that tweet that he made about his plans? He put a tweet up, and I think it was from 2006,

[00:53:51]

and he has essentially done all those things. It's done, done on this thing. Now, the thing is, most people... so a lot of people love Elon Musk, but there's quite a large community of people that don't love him so much.

[00:54:05]

Well, that's always the case. That's the case with anybody great. I don't know if that's always the case. When is it not the case? Who accomplishes as many things as that guy does, and everybody loves them?

[00:54:18]

It's difficult. I mean, I'm not a historian. I could say... is Steve Jobs a terrible example?

[00:54:26]

So many people hated that guy. So many people hated that guy. I have personal friends that are involved in technology that wouldn't use Apple products because he was such a twat. Sure. They didn't want to have anything to do with him. They knew people that were engineers under him. They said he was horrible and mean, and he just required so much; he would scream at people and insult them. And, you know, he had these ideas in his head that he needed to get done.

[00:54:48]

And if you couldn't work the hours that he needed to do what he wanted to accomplish, you know, he'd treat you like shit. You're right. And I just wish the world was better, I think. Because, like with all people like that, like with Steve Jobs and with Musk: when he dies, people will only remember the greatness. Right. Yeah. So that's how it seems to work. It's just sad that you can't celebrate that currently.

[00:55:13]

But I do think there's one particular aspect of his personality, that I also share, that pisses people off really bad, which is, like you said, he had a plan and he's laid out that plan. He promises things, and he ends up being a year or two or three late.

[00:55:30]

Right. And that really... I don't know if it actually angers people, or if people that already don't like him use that as a thing to say why they don't like him. But it's certainly a thing that people say a lot. Yeah. But I think that's an essential element of doing extremely difficult things: over-promising and trying to over-deliver. That's the whole point, right? To make all the engineers around you believe that it's doable in a year.

[00:55:57]

That's essential to doing it in two years.

[00:56:00]

Like that.

[00:56:01]

That kind of truly believing it seems to be essential. Well, didn't he have people pay full price for that Roadster, like, way ahead of time?

[00:56:11]

Yes. So you paid a quarter of a million dollars for a car that's essentially vaporware.

[00:56:17]

Yeah, but the thing is, so I don't know.

[00:56:19]

There's a whole bunch of financial people that get, like, mad at him. Oh yeah. Yeah, I agree. There are investors... you know, I think it's the most shorted stock in history. Yeah. And he just keeps kicking ass. I don't...

[00:56:32]

This confuses the fuck out of people. To me, the stock market is the most boring thing ever. And it's gambling. Yes. And so you're trying to say you're an expert in investing in the stock market? I block... I remove those people from my life, because they don't say any interesting ideas. That said, when you're doing legitimate investment, yes, that's a really important service to society.

[00:57:00]

But if you're commenting on the fundamentals of engineering problems that real engineers are trying to solve, that's not interesting to me. So that kind of stuff upsets, I think, the financial folks. But the beautiful thing is when you have people buy vaporware and you bring that vaporware to reality. That's the amazing thing.

[00:57:22]

Yeah, he will definitely bring that Roadster to reality. If he doesn't die, that Roadster will happen. Yeah. If he dies, bail out now. Same.

[00:57:32]

Same with that insane Cybertruck. Yeah. I wish... it was fucking awesome. It's so ridiculous. If Elon lives long enough, you better believe there's humans being put on Mars, whether it's him or he gets everybody else...

[00:57:45]

Yeah, that one I'm skeptical of. I just think of what type of people are going to want to go, see.

[00:57:53]

You're not talking about the engineering problem now, of getting there.

[00:57:55]

You think it's possible that it ultimately... you know, I mean, look, can we put people in space? For sure, we've definitely done it. Can we... oh, some people think SpaceX is fake.

[00:58:09]

SpaceX is fake. Did you ever Google hashtag SpaceX-is-fake? It's wonderful. It's a testament to the education system in this country. Well, and that's a tiny little tangent I've gone on: I joked about flat earth and space-is-fake a little bit, almost like saying that's an interesting way of being open-minded. And then I realized that's not something to joke about, that there is a community of people that take it extremely seriously.

[00:58:35]

And then some of them thanked me for acknowledging the possibility. And then I said, OK, plus a little heart.

[00:58:42]

OK, these are not the brightest brains, but I appreciated their open-mindedness.

[00:58:48]

But they should take... in physics, MIT OpenCourseWare provides courses on physics. They should. Can a regular person just sign up for that? Yeah, yeah, it's open. Free. So how does it work? What would you have to do in order to take those courses?

[00:59:04]

It's all made available online. Just go to MIT dot org, or is it...

[00:59:09]

MIT OpenCourseWare is the website. I mean, for most people, it's all on YouTube now. Oh, that's beautiful. All the lectures, they have like millions of views: introductory lectures to physics, mathematics, statistics. They have courses on the basics.

[00:59:23]

But in order to understand the work that's been done to recognize the fact that the earth is round, what would you recommend? Right away, classical mechanics, with experimental focus. See, none of those things are going to... Newtonian mechanics is good.

[00:59:40]

But if you're a dingbat, you're not going to be able to absorb all that. Oh, look up the Wikipedia page for gravity.

[00:59:46]

I don't think that's going to help either. They say gravity has never been proven. No one understands gravity; there's no one who actually understands gravity. We just know the effects of it. It's actually magnetism. Yes, for sure.

[00:59:59]

So you have to undertake the effort of proving the Wikipedia article for gravity wrong.

[01:00:06]

Wikipedia, bro? What a terrible example. Wikipedia is sketchy. It says I'm Brian Callen's brother, says I've got celiac disease, says a bunch of shit that's not real.

[01:00:15]

How do you know you're not related? I know. No. I'm pretty sure.

[01:00:22]

I mean, you might as well be my brother. I don't know if it says it anymore, but it could; someone could put it in there again. Fuck it.

[01:00:29]

Wikipedia is actually another distributed system. It's incredibly surprising to me that it works.

[01:00:34]

Yeah, it is. Right. Because even though there is a lot of misinformation in it and there's a lot of, you know, falsehoods, there's a lot of really good information as well.

[01:00:42]

You know, particularly about historical figures and interesting stuff. If you want to find facts on research, science and technical topics, not nutrition science or things where there's a lot of debate, but like physics and math and so on, it's really good. It's really, really good.

[01:00:59]

So it's community-supported, by other physicists. But moving back: can we go back to why you think we're not going to be colonizing Mars?

[01:01:08]

Oh, no, I'm not saying ever. I'm just saying the problem to me is the type of people that would want to do it, because they can't return.

[01:01:17]

You know, that's the real issue with going to Mars: you can't return.

[01:01:21]

You don't think there's a huge number of non-crazy explorers in this world that want to die on Mars?

[01:01:27]

I had a whole bit about it. I really believe that it's the fringe of the fringe

[01:01:34]

that would be willing to die on Mars.

[01:01:36]

No. I would be willing to die on Mars. Really? Stay here. Come on. I like you.

[01:01:42]

Here's my take on it. Life, to me, it's all temporary. What's all temporary? Life? Yeah, it's temporary. Right. You're gonna die someday.

[01:01:50]

Sure. But if you decide to die on fuckin' Mars... Bro, you'll be sending me emails from Mars. "Dude, I fucked up." It will be a sad success.

[01:01:59]

This is the thing. You're on to the Native Americans. I've been following your work there.

[01:02:07]

I'm so obsessed, man. I've been obsessed with World War Two and One, but you're converting me to think about both the warrior cultures and the suffering in that world.

[01:02:17]

The suffering is insane. It's insane. This book on Black Elk, man, it details his life from when he was a young boy. During Custer's Last Stand, he was there when Custer was killed. Black Elk. This guy, what do you call it, he's an Oglala Lakota medicine man.

[01:02:35]

Medicine man? Yeah. And he lived through the transition. He lived through the transition from them battling with the U.S. soldiers to them being on the reservation in fucking insane poverty.

[01:02:49]

Insane.

[01:02:52]

The stories of people, the illnesses and the deaths. How many people's children died: malnourishment, starvation, abuse. And then just how much they hated where they were living and how they were living on the reservation.

[01:03:10]

Yes.

[01:03:11]

It's horrific, man. It's horrific.

[01:03:13]

It's hard to imagine. It's hard to imagine when you're reading it that this just happened. The really horrible parts at the end were in the early 1920s, 1930s. And it's hard to imagine that this tribe, from one hundred years prior, in the 1820s, were living wild and free, living the same way they'd lived for hundreds of years, and had this incredible relationship with the land and this incredible religion that they practiced, where they worshipped the earth and the animals and the sky.

[01:03:58]

And they had all these concepts for the way you should live your life and how to guarantee prosperity and how to guarantee success. And, man, it's just they had a fascinating culture.

[01:04:14]

I mean, and it's gone. It was wiped off the face of the map. There was nothing like it anywhere else on earth. There was no culture anywhere on earth that was like the Native American culture in the sixteen, seventeen, eighteen hundreds. But in that period of time, they had this spectacular way of life. And it was often very cruel and very ruthless, and they warred on each other. This idea that Native Americans were living in peace and harmony with each other is nonsense.

[01:04:46]

Yes. I started... I was listening while doing hills. Yes. You kicked my ass with "Empire of the Summer Moon."

[01:04:54]

Look, I commented on your Instagram, saying something, you know, basically admiring the purity of that way of...

[01:05:04]

Life. Yeah, I got so much shit from people saying, oh, you think rape and murder is pure and admirable. So there is certainly an aspect of their way of life, which is sort of the warrior ethos, right?

[01:05:17]

The Comanches in particular, man, they were the most ruthless, the most warlike.

[01:05:22]

That's all they did. Basically like Genghis Khan, the same kind of thing, the same horses. They were the innovators, actually.

[01:05:29]

Yeah. And all they ate was meat as well. I mean, all they ate was buffalo. They essentially rode with the buffalo, killed buffalo, hunted buffalo, and then raided other tribes. Until the white man came, and then they started raiding the white man, killing the white man.

[01:05:43]

But they were at war with white people for hundreds of years. I mean, they were the reason why the West was hard to settle. And this sneaky shit... I don't know if you've gotten to the point where they were giving people these big swaths of land in Oklahoma. And this essentially set them up to be killed by the Comanche. They would say, hey, go on out here, we'll give you sixteen hundred acres. It's all yours.

[01:06:09]

And they're like, oh, terrific. Let's get our family and get in a wagon. And no one let them know that the wildest motherfuckers that ever lived on this continent were running that place, and they would go there and just get slaughtered. One after another, families were wiped out that way, and people were kidnapped. And that lady that I have on the wall outside, Cynthia Ann Parker, was adopted by the Comanches.

[01:06:33]

Her family was murdered in front of her when she was nine years old. And she became the wife of a great Comanche chief and her son became the last Comanche chief, Quanah Parker.

[01:06:44]

It's crazy. It's the craziest story. All these tribes, some probably more warlike, some more peaceful, that had a way of life here. I don't want to romanticize it too much. Most people don't believe me, but I'd really like that way of life. That closeness to nature. You said texting me from Mars or whatever.

[01:07:06]

I like... you know, I wouldn't choose it, but I would be happier if I was forced into it. It seems like a counterintuitive notion, but I'm so weak, I'm so soft. Even running hills yesterday, I realized how soft. Well, you work too much. Yeah, behind a computer. Little fingers typing. But you're also a black belt in jujitsu.

[01:07:28]

You're also a martial artist, you know. You know me, though. Against a Comanche warrior? Good luck. I think, you know, a Comanche back then...

[01:07:35]

Don't know how to fight for real. If they had a weapon, they'd kill you.

[01:07:38]

You think you're just. I know. I listen.

[01:07:43]

First of all, they're pretty small. They weren't very big people, maybe. Second of all, they didn't know jujitsu, against the average person that does know jujitsu. So you're going to choke the fuck out of them.

[01:07:53]

It'd be fun, actually, to sort of go into different warring cultures. Go to Genghis Khan times. Yeah. Without weapons, to see what kind of combat styles they had. Just send Francis Ngannou.

[01:08:03]

He'd clean out the entire... I mean, just send Royce Gracie.

[01:08:08]

Yeah, I'm sure with Francis Ngannou, all generations would be screwed.

[01:08:15]

But I think.

[01:08:16]

Interesting. Yeah. Right. It's just overwhelming.

[01:08:19]

But I think that if you had real jujitsu skills, what you know now, today, particularly because jujitsu has evolved so much. I mean, even the jujitsu of 2020 is so radically different from the jujitsu of, you know, 1990. It's radically different, almost unrecognizable in a lot of ways. But clearly the basics are still the most important. And there are some of the greats of all time who just operate with the basics, whether it's Royce Gracie or Rickson Gracie. There's a lot of great, great jujitsu players that just have those solid basics, honed to a razor-sharp edge, you know?

[01:09:07]

You know, Kron, Kron Gracie. When I say basic, it is a compliment. I mean, armbars, triangles, guillotines, rear naked chokes, those types of things, but perfected to a level... They don't participate in a lot of the more modern stuff. There's a lot of crafty, weird stuff that a lot of guys try today, and some of the greats, even the greats that participate in jujitsu matches today and are effective at it,

[01:09:37]

Don't don't really have that kind of style.

[01:09:39]

Yeah. I mean, Kron actually has some creativity. If you look at Roger Gracie, that's how basic it gets. I don't even know if he does foot locks. My favorite thing to do is just watch Roger Gracie matches. He looks like he's half asleep. And he demolishes the greatest black belts in the world, slowly, in this half-asleep way: taking them down, passing

[01:10:04]

guard, going to mount, and doing a choke. Yeah. And against, I don't know what you could do against it, just the best.

[01:10:13]

My instructor, Jean Jacques Machado, same thing, man. His style is just solid basics of jujitsu.

[01:10:22]

And he has a saying: the more you know, the less you use. Which is really interesting.

[01:10:27]

Well, you mentioned Comanche warriors and the meat. Yeah, great segue. I saw that you did the carnivore diet.

[01:10:34]

Yeah, man. Here's something crazy. I got off that diet this weekend. I did the month, and then once Saturday came around, I ate Italian food. I had Girl Scout cookies and pasta. And then yesterday I went to Disneyland.

[01:10:49]

So yesterday I went way, way off the diet. I had ice cream, I ate all kinds of shitty food. And I was getting back pains and knee pains and all these weird pains that went away when I was on the diet. Now, this is not a testament against a plant-based diet, because I was eating shitty food, pasta, you know, a lot of white pasta.

[01:11:17]

Yeah, spaghetti. That stuff causes inflammation. It just does. Sugar causes inflammation. But it's interesting: I had this great month where, basically two weeks in, after the diarrhea died off, I had two solid weeks of no aches and pains and feeling great. Like, this is wild, I feel amazing. And then two days of eating shit, and my back hurts.

[01:11:43]

Right now I'm sitting here and my back is hurting, and my knee was hurting a lot yesterday. All those weird aches come right back. What's that about?

[01:11:52]

The nice thing about the Joe Rogan effect is that with you trying the diet and talking about keto a lot, it's become more socially acceptable to do. I've been eating keto or low-carb for many years, and doing fasting, like 24- or 48-hour fasting. And I was kind of keeping it on the low-down. But even when traveling... given my current situation, I'm trying not to spend much money.

[01:12:22]

And so one of the best ways to go either carnivore or keto is to go to McDonald's and just order beef patties. They'll sell you just the beef patties, a dollar fifty a patty, for a quarter pound. Yeah. So, you know, I usually eat about two pounds of meat a day, and that's...

[01:12:43]

What is it? I don't know. That's like fifteen bucks.
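For what it's worth, the back-of-the-envelope math here can be sketched out using the figures as stated in conversation (a $1.50 quarter-pound patty, two pounds of meat a day; illustrative only, not verified menu prices):

```python
# Rough daily cost of eating only quarter-pound beef patties,
# using the numbers mentioned in conversation (not verified prices).
PATTY_WEIGHT_LB = 0.25   # quarter-pound patty
PATTY_PRICE_USD = 1.50   # stated price per patty
MEAT_PER_DAY_LB = 2.0    # stated daily meat intake

patties_per_day = MEAT_PER_DAY_LB / PATTY_WEIGHT_LB
cost_per_day = patties_per_day * PATTY_PRICE_USD

print(f"{patties_per_day:.0f} patties/day, ${cost_per_day:.2f}/day")
# → 8 patties/day, $12.00/day
```

At those stated numbers it works out to $12 a day, which is in the ballpark of the "fifteen bucks" mentioned once tax and rounding are allowed for.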

[01:12:47]

So you've been doing this carnivore thing too? Mm hmm. How long you been doing it for? Off and on. The carnivore I've done since the first time you had, I think, Jordan Peterson on your podcast, that kind of thing. I dived in. But before then I'd been doing keto. My favorite meal is just meat and, I know some people hate cauliflower, but cauliflower and/or green beans.

[01:13:16]

Why do you have to make that distinction? Some people hate cauliflower.

[01:13:21]

Who's out there hating cauliflower? Who the fuck are those people?

[01:13:24]

That's a weird thing to say. A bunch of people said cauliflower sucks recently, so, yeah. You're right, though. It doesn't suck. It's good.

[01:13:33]

Buffalo cauliflower, like buffalo wings. Buffalo sauce. Cauliflower, fucking delicious.

[01:13:38]

What's that? But that's the sauce. Yeah. No, sauces, it's like you're giving in to your weakness. And spices, you've given in.

[01:13:46]

No. See, the cauliflower, a blander taste sometimes is better, because you get to appreciate the fundamentals of the food.

[01:13:53]

So I gotta say, I just enjoy it. You do salt meat? Salt. How do you do that, when you don't appreciate the fundamentals of the meat? Yeah.

[01:14:05]

Good point. Yeah. I'm playing checkers, you're playing chess. You like hot sauce?

[01:14:10]

I put hot sauce on everything. Yeah, I do. But I stay away from it. I try to lessen it. Food to me right now, in my life, is a source of energy, not a source of pleasure. But it can be both.

[01:14:24]

Unfortunately... I'm not addicted to drugs, I'm not addicted to many things. But with food, my mind, I don't know how to moderate, really. So anything pleasurable is a problem for me.

[01:14:36]

Cookies... you put two cookies in front of me, I don't know how to eat just one of them. My brain is terrible at it.

[01:14:44]

It's Girl Scout cookie season, son. Yeah, they changed the name of Samoas. Those are my favorite. And now they have a new name. I thought they were calling them Tagalongs or something like that... Am I talking about the wrong thing? The ones that have the chocolate on the bottom. Samoas have that coconut.

[01:15:10]

Yeah, yeah, yeah. That doesn't sound like the words of a man who's going to stick to the diet. Can't afford that mistake.

[01:15:16]

All right. Yeah. I mean I'll have cheat days or cheat meals, I should say.

[01:15:20]

Tagalongs are the peanut butter ones. Oh, that's right. Those are good. Those are fucking good.

[01:15:24]

What are the Samoas now? What do they call them now? The Samoas. They just changed it to something new.

[01:15:31]

I hadn't had a cookie in forever, until now. What are you, a robot? I'm Russian, so basically a robot. I ate six of those and I was feeling like shit.

[01:15:44]

Oh.

[01:15:46]

What? Maybe this is it: "Caramel deLites," as they're called now. I think so. They must have moved on to another company, or whoever they were paying to make them.

[01:15:55]

I was wondering if it was a racial issue. That's what the question was, someone asking whether they changed it because it was racist. It's an odd question. No, the names of the cookies are owned by the two different companies who make them. They outsource it.

[01:16:06]

And they just, you know, changed the name. A racial issue?

[01:16:09]

Well, because someone might be, like, sensitive to having a cookie named after an island. Mm hmm. People like, hey, fuckface, that's our island, not your cookie, you know.

[01:16:20]

Those cookies don't sound good to me anymore. What about American cheese? Is that OK, cheese? Is it American?

[01:16:26]

Not OK or American? Yeah. What, like tvorog, the Russian one? It's a cottage cheese.

[01:16:34]

Well, there's Swiss cheese, there's American cheese, and that's it, right? Is there any other country that has a cheese named specifically after it? I bet it's not even... just like French fries, I bet you American cheese is not even American.

[01:16:50]

Do you remember when there were freedom fries? When people were trying to call fries "freedom fries," like, post-9/11? Because we were mad that France didn't want us going over to Iraq.

[01:16:58]

And then, "the French people hate freedom." Yeah, freedom fries. Oh, that's so dumb. Well, the thing I really like, actually, the thing that people don't often talk about, is the focus. In my life, and I think for a lot of people, it's being able to focus for long periods of time. And that's why I stuck with keto, eating windows, fasting especially. Yes.

[01:17:24]

The focus is pretty tremendous, right? Well, that's what I really got with the carnivore diet: the flatness of my energy, the lack of dips and valleys, peaks and valleys. It's amazing.

[01:17:35]

It's great. And the fasting, fasting helps me too. Like Jack Dorsey does, only what's called OMAD, one meal a day. Yeah. "One meal a day." That OMAD stuff. Jesus Christ, bro.

[01:17:48]

OMAD, my hip-lingo guy. I think I read it as "OMAD." OK. Now, I don't do one meal a day, but a 20-, 24-hour fast, that's a powerful weapon you have to play with, at least for me. I like it. For some it's weird. It helps your mind really focus. I can sit sometimes for five, six hours a day programming, really thinking, and lose track of time and really focus.

[01:18:15]

But when you do that, when you interact with other human beings, you're kind of a little bit of an asshole. I am? Sorry. I mean, I say it in a way where it's funny, but if there's something about a person that's full of crap, you are more likely to point that out when you're on keto or carnivore.

[01:18:41]

No, it's irrespective of the diet, keto, carnivore, whatever. It's the fasting. Oh, the fasting. The fasting, really?

[01:18:48]

So you're just more irritable? Is that what it is? I think it's irritability, but you also see things more clearly.

[01:18:54]

It's like, I don't know. I'll talk to my parents, right, or something like that. When I'm more well-fed, I'll just enjoy having fun with them. And if I'm fasted, I'll be like, "Why are you always judging me?" That kind of thing.

[01:19:06]

Right. You realize the aspects of the interaction which are problematic, and you want to sort of highlight them, or you're just noticing them. Which is problematic when you're in a working environment, especially deliberating, discussing with other engineers how to solve a problem. I'm more likely, especially when leading a team, to say that somebody is a little bit full of shit when I'm fasting, as opposed to being a little more kind and eloquent about expressing why they're full of shit.

[01:19:38]

I found myself feeling more aggressive and more inclined to use recreational insults when fasting, or kind of on carnivore.

[01:19:48]

What's a recreational insult?

[01:19:49]

So, like, "come on, fuckface," you know, saying something like that to someone, or fill in the blank with whatever other word you'd like. Like in an academic paper?

[01:19:57]

The rate of "fuckface" goes up.

[01:19:59]

Well, just in casual conversation I'd find myself using fun insults, but more with the intent of kindness behind it. No, I mean, not even talking about people who aren't there. Just having fun.

[01:20:14]

But that's also a function of being a comedian.

[01:20:19]

We do that to each other really bad. Like, man, I had a birthday, and my friends made me a cake that said, "Happy birthday, faggot."

[01:20:28]

It's like that kind of shit is just so a part of the culture of comedians. Like everybody calls everybody, bitch, everybody. You know, it's just.

[01:20:38]

Yeah. Which is awesome, because the comedian culture is now at full-on war with the cancel culture.

[01:20:44]

And it's two armies: the people who don't give a damn and the people who give way too much of a damn. Well...

[01:20:51]

I have mixed feelings about all that stuff, but I ultimately feel like the direction it's moving in, the reason why it's happening, is good. I think there's a lot of people that are complaining about things and trying to cancel people and all that stuff, and, you know, ultimately some of it's misguided.

[01:21:07]

But I think the ideas behind it, the primary push, the gravity behind it, is that people want less racism, less discrimination, less of a lot of things. But then along the way, you have hypocritical human behavior that gets involved. And you have people that are deeply flawed themselves pointing out minor flaws in other people. And then they get exposed, and they feel horrible. For every person who participates in this cancel culture...

[01:21:42]

It's like the wave is coming back at you. I mean, it comes in, it comes out.

[01:21:47]

And if you go too far out on that fuckin' pier, it's gonna get ya. And this is part of what we're learning. And I think... what people are today. Like, if you look at just the...

[01:22:04]

If you look at humanity from, like, the 1930s, those were hard men. People lived in a hard way. It was ruthless.

[01:22:12]

If you watch films from the nineteen hundreds, the early nineteen hundreds...

[01:22:17]

First of all, domestic violence was so normal. Heroes in movies in the 50s and 60s just smacked women in the face.

[01:22:29]

Heroes smacked their wives, you know, hit their kids. It was a different world.

[01:22:36]

And people will probably look at our time today and say, you know, people openly ate meat. Meaning not... I can see it: not engineered meat.

[01:22:51]

Yeah, sort of, ate meat from factory farms.

[01:22:54]

As opposed to recreationally hunting it themselves and eating what they hunted, or engineered meat. Lab meat. Yeah. Or you can get ethically raised food.

[01:23:06]

I mean, there are a lot of ranchers. It's one of the things that ButcherBox does very well: they make sure that they have relationships with ranchers who have a commitment to ethically raised and ethically killed animals.

[01:23:20]

And what that means is, you know, they don't participate in anything that has anything to do with factory farming. No antibiotics, no added hormones, ever. And that is possible. I mean, people have been eating animals from the beginning of time. Literally 97 percent of the world eats animals. And this idea that the only way to do it is through factory farming, I don't think that's correct. I mean, the idea is that if you eat meat, you participate in factory farming.

[01:23:50]

And that's horrific. I don't think that's true, but I do think it is true when it comes to fast food for the most part. And that's unfortunate.

[01:23:58]

And I think if they could I mean, we need more transparency for sure when it comes to that stuff.

[01:24:05]

That's one of the reasons why those ag-gag laws, agricultural gag laws, are a real problem. There are laws that prohibit people who work in these factory farming situations from exposing the horrors of those environments. That's a real issue, clearly designed to protect that industry and allow them to commit these crimes.

[01:24:28]

Yeah. It's one of the things... I'm conscious of my own hypocrisy in this. I, unfortunately, deeply love meat. And I'm aware now of how unethical factory farming is. So those two things I have to sit with and be conscious of. I don't know. It's a question: when did that happen? When did the factory farming thing start? If you go back to the 1930s, there was no factory farming. There was just farming.

[01:25:01]

Sure. I think it was probably incremental. It's not like in 1930 there wasn't already some mass production. What's driving factory farming is scale, but also sort of the suffering. There's a certain line you start to cross. I mean, it's unclear at which point it really becomes torture versus agriculture. It's an interesting line. There's probably a good answer for that. The real problem is fast food, the birth of fast food, probably.

[01:25:32]

Sure. When did McDonald's start? McDonald's probably started...

[01:25:36]

I don't know.

[01:25:37]

I'm not sure. Once it started scaling, you know, the feeding of massive amounts of people that aren't growing anything.

[01:25:45]

That's the real issue: whether you're in New York City or Shanghai or Los Angeles, large, gigantic metropolitan areas that aren't growing anything, you've gotta get a lot of food to those people. If you have 20 million people, like in Los Angeles, and 20 million people eat meat, that's a lot of meat.

[01:26:06]

Yeah. Yeah. You've got to feed them all, so science steps up. I think lab-engineered meat is kinda interesting.

[01:26:11]

Yeah, it is interesting. How much have you paid attention to it? Not much. I'm waiting. This is the horrible thing, and I'm very cognizant of it: I kind of don't allow my brain to think much about this whole space, because I love meat and I'm trying to save money.

[01:26:28]

I get it right.

[01:26:29]

So you eat those McDonald's patties, you know. The life of a scientist, right. And especially now, I've taken a leap, a difficult leap. I'm still affiliated with MIT, but I decided to leave my full-time position to do a startup. I want to try to build the kind of thing I dream about. We talked about the movie Her. And that's been 80, 90 percent of my day.

[01:26:59]

In fact, me doing the podcast is trying... no, not trying, it's already successful at giving me enough money for food and shelter. Tell people the name of the podcast so they can find it.

[01:27:09]

The Artificial Intelligence Podcast. Lex Fridman. Thanks for that.

[01:27:14]

Who's been on there? What is it... Elon Musk. Eric Weinstein is on there. Garry Kasparov. Chomsky. Sean Carroll. Sean Carroll's brilliant. He is brilliant. What was it like talking to Chomsky? I loved it.

[01:27:29]

Yeah. Well, most people say my voice is very boring, that I talk slowly. To those people I say: go fuck yourself. I love you. I love you, you're right.

[01:27:42]

I'm trying to actually.

[01:27:45]

It's very difficult, and Sam Harris talks about this, to express thoughts with the kind of humor and eloquence that they have in your brain. To convert them. As a comedian...

[01:28:00]

You're essentially a storyteller. You probably don't even acknowledge it, you don't even know how you do it. You're like Roger Gracie: you've developed this art of storytelling, of being able to laugh and make other people laugh, bouncing back and forth. To me, most of my life has been spent behind a book or a computer, thinking interesting thoughts but not connecting with other people, not doing that dance of conversation. And so learning that dance while also thinking is really tough.

[01:28:30]

So Chomsky was a pleasure, because we could both be robots. But I think he's, like, ninety-two years old.

[01:28:38]

Is he really? Yeah. And the thing I love about him... So, you know, there's all that political stuff, and I don't pay attention to that. I mean, he's a major activist, but he's also a linguist who thinks that language is at the core of everything, of cognition. Like, it's at the bottom. Everything starts with language: cognition, reasoning, perception, all of that is built on top of language. There's brilliant, seminal research on that.

[01:29:02]

But at ninety-two years old, he still looked in my eyes and really listened and really thought, and really sharp ideas came out. You do the same thing. People ask me, like, "Joe Rogan, you don't take yourself too seriously, even with your celebrity, with the popular podcast." That's a huge thing. And with Chomsky, what was really surprising to me is that while he's pretty stubborn in his ideas and so on, people criticize him, he's so stubborn in his ways,

[01:29:29]

he didn't take himself too seriously. I sat there, just some kid talking, and he, like, really listened. Hmm. The stupid questions, the interesting questions, he really listened. Ninety-two years old, to have that kind of curiosity... I'm so happy when I see that kind of thing.

[01:29:45]

Yeah, that's a wonderful example of a career academic who's still just concentrating on ideas. Still thinking, always. Because academia, like really any other endeavor, any other discipline, you can get lazy. Right? You see that in almost every walk of life: there are certain people that rest on their laurels.

[01:30:10]

And especially when you become popular, you get really good at explaining. Like, you do these talks, these lectures, and you start saying the same thing over and over, and you forget to listen.

[01:30:21]

Because of this podcast, the Artificial Intelligence Podcast, but also Joe Rogan, there are two very different groups of fans, whom I both like. You know, people come up to me and start a conversation and I love it, just listening to them. I hope I never lose that. I'm, like, younger than Chomsky.

[01:30:41]

I hope you stay that way.

[01:30:43]

It's nice if you have the time. It's a problem if you're in a rush and someone wants to talk to you about something very deep. Yes. I've had those moments where someone says, "Hey man, I gotta ask you..." and it's, like, this is a long conversation, I can't do that right now. That's the burden. That's your burden, actually.

[01:31:00]

I'm in a beautiful place, which I don't think will last too long: I'm not sufficiently famous for that. Those things don't happen often, and when they do, I can have that conversation. Right, you have the luxury. Although, let me say, I got to hang out with Brian Callen, who I was a huge fan of, on New Year's Eve. I got to watch the old man do some dance moves. And this funny thing happened.

[01:31:25]

He's a celebrity. Yeah. So we're hanging out, and two times somebody came up to me and Brian, and they said, "Wow, it's Lex Fridman! It's so good to meet you."

[01:31:39]

And then they completely ignored Brian. They just saw you, and that was it.

[01:31:43]

Brian's like, "you fucker." My mom is so proud. I think it's, like, nerds or whatever. Sure. Yeah. That is funny, though. That's hilarious.

[01:31:53]

He's one of my... I mean, it was incredible. I didn't know you guys were friends until it all came together, I guess. I was a huge fan of his from, like, the Mad TV days.

[01:32:02]

He's one of my oldest friends. Yeah. And you guys, what, were you on Mad TV together?

[01:32:07]

I was the host one week, and he was, you know, one of the stars of the show. He's an awesome guy, man, a really underappreciated person. And he's a guy that, because he acts so much and because he's always in that world, he didn't put the same amount of time into doing his personal podcast as I think he should have, because he's great at it.

[01:32:32]

You know, he was one of the first people that I knew that interviewed Jordan Peterson. And he's known for that. Yeah. He's had a bunch of brilliant people on his podcast, a bunch of really interesting intellectuals and scientists.

[01:32:48]

And I think it's Mixed Mental Arts or something like that. Yeah. And he was doing it with Hunter, his friend.

[01:32:54]

He stopped doing it with him. He's an unusual guy, Brian Callen is, because he's silly but he's also brilliant.

[01:33:01]

Yeah. You can see that. Eric Weinstein has the same quality, obviously from a different world. Through the silliness you can see that there's an intelligent, first of all, a good human being there. But at the same time, he's the butt of every joke. I appreciate that so much.

[01:33:16]

I love silly people. Silly people are so much more fun. The people that are easily offended and easily upset, that's so exhausting. Silly people are the best.

[01:33:27]

I actually... So I played your theme song on guitar, and Brian Callen was researching it, like, how do you play it? Your theme song. And there's a video of Brian Callen singing, yeah, like, "Joe Rogan, shoulders for days."

[01:33:45]

Yes. Yeah. Some silly song he made up.

[01:33:48]

I'm working on it, dude. I'm going to try to figure it out, because I can play guitar: play the theme song and put it up online. You guys have got to work together and make an album. We're gonna make, like, a Joe Rogan theme. He's gonna come up with some words for it.

[01:33:59]

What are the notes you've got? You've got pages and pages of notes in front of you. Is this stuff that you really wanted to discuss?

[01:34:06]

Yeah, well, we haven't talked about it at all. But let me at least bring up Boston Dynamics.

[01:34:12]

I think it was a fake video that I sent Jamie today. These motherfuckers — they keep getting me. There's a new fake video.

[01:34:19]

Was it the same one? I think someone has taken a clip from it. Those guys have been making VFX videos on YouTube for 10+ years. They're really good at it.

[01:34:26]

So it's so good that people don't know it's fake. There's a single YouTube channel that does, like, visual effects — fake humanoid robots or—

[01:34:38]

Yeah, robot dogs, things that kind of resemble Boston Dynamics robots. There was one with crazy stuff with guns. Yeah, this one — they gave the robot a gun. Have you seen it? Pull it up, Jamie. What's the channel?

[01:34:51]

Corridor Digital on YouTube — they're the guys that make it. Corridor Crew is the YouTube channel.

[01:34:56]

Fucking incredible. It's not real, but it looks so real. And so the robot — they kick it, they hit it with a hockey stick.

[01:35:05]

There's a long video they made a while ago. They might have made a new one, which is the one that does it. But I think I've seen it before.

[01:35:12]

I see — they trick you with the name. It's not Boston Dynamics, it's Bosstown Dynamics.

[01:35:16]

Bosstown Dynamics. It looks so realistic. But here's the thing: we're not that far off from this.

[01:35:23]

No. Okay. Okay, let's walk it back. It's not realistic — in what way? It looks human-realistic. A robotics person could tell it's a human, because it's really difficult to do that kind of motion, that kind of movement — when it's getting shot at, or not getting shot at. There's a lot of movement it does purely for the purpose of comedy.

[01:35:47]

Right. Like, it's on purpose trying to look like a human, for the comedic Internet effect — like a human that's getting pissed off and so on. Yeah, those kinds of qualities — like this here, where it's like, yeah, I mean, I give you guys—

[01:36:02]

Oh come on. So for. Oh yeah. Yeah.

[01:36:08]

It produces any type of movement.

[01:36:11]

Some of those are just comedic. You know, you don't need a Terminator-type robot.

[01:36:14]

Right. But they do have legitimate robots that can do backflips now, and so on.

[01:36:21]

Well, a backflip is just — what's more real is manipulation. So all of these robots, depending on which ones we're talking about — those are remote-controlled, and these are single demonstrations that they've perfected. So it's really important to distinguish between the body of the robot and the brain of the robot. These bodies — unlike anything else, like a Roomba, or a drone, which can also be very threatening — these bodies, somehow we anthropomorphize them, and they terrify us.

[01:36:55]

I don't know what it is. I met Spot Mini in person. That was one of the most transformative moments of my life, really — because I know how dumb it is. But the experience of it — it's not even a head, it's supposed to be a hand, but it looks like a head — and it was looking up at me with that hand. I felt like — it was magic. It was like Frankenstein.

[01:37:18]

Coming to life — this moment of creation. And what I realized is my own brain anthropomorphized it. In the same way, you're looking at these robots and you're thinking, these things are terrifying.

[01:37:30]

Yeah.

[01:37:30]

Like, you know, in 10, 20 years what we're gonna be — yeah, that's our brain playing tricks on us. Because the key thing that's a threat to humanity, or an exciting possibility for humanity, is the intelligence of the robots — the brains, the mind. And these robots have very, very little intelligence, in terms of being able to perceive and understand the world and, very importantly, to learn about the world from scratch.

[01:37:59]

So the terrifying thing — you've talked about this often — is the philosophical notion Sam Harris talks about: exponential improvement, reaching human-level intelligence, then superhuman-level intelligence in a matter of days, then becoming more intelligent than that. That's all a learning process. Being able to learn — that's the key aspect. We're in the very early days of that. There's an idea — you know, "Big Bang" is a funny term for one of the most fundamental ideas in the nature of our universe.

[01:38:31]

In the same way, "self-play" is the term for, I think, one of the most important and powerful ideas in artificial intelligence that people are currently working on. So, self-play. Are you familiar with the companies DeepMind — Google DeepMind — and OpenAI? And the games — I know you're a first-person-shooter guy, but StarCraft and Dota 2. These are what you'd call real-time strategy games, I guess, where people win millions of dollars in e-sports competitions.

[01:39:05]

OpenAI separately had OpenAI Five, which took on Dota 2 — Dota 2 is a computer game that grew out of Warcraft 3; it's the most popular e-sports game. And then DeepMind took on StarCraft with their AlphaStar system. And the key amazing thing is, similar to how AlphaGo and AlphaZero learned to play Go, the mechanism is self-play. That's the exciting mechanism that, if we can figure out how to apply it to more serious problems than games, will be transformative.

[01:39:37]

OK. What is it? It's learning from scratch in a competitive environment. So think of it in jujitsu terms: you have two white belts training against each other, trying to figure out how to beat each other without ever having black belt supervision — an instructor and so on — and slowly getting better that way. Coming up with, inventing, new moves that way. And eventually they get better and better through that competitive process. That's the machine playing itself, without human supervision.

[01:40:08]

The interesting thing is, there are a lot of cases in which, if you set up the competitive environment well enough for those two white belts, they'll learn to be black belts. And not only black belts — they'll learn to be better than that. Exactly the kind of evolution that's happening in MMA right now. If you put that in a digital space and speed it up a million fold, you'll continue to improve.
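The self-play idea described here can be sketched in a few lines. To be clear, this is a toy illustration, not what DeepMind or OpenAI actually run — their systems use deep networks and enormous compute. Here a single tabular Q-learning agent plays both sides of the tiny game Nim (take 1-3 stones; whoever takes the last stone wins), learning from nothing but its own wins and losses — no instructor, no human examples. All names and parameters are invented for the sketch.

```python
import random

def train_self_play(pile=7, episodes=20000, alpha=0.5, gamma=1.0, eps=0.2, seed=0):
    """Tabular Q-learning where one agent plays BOTH sides of Nim.
    The only learning signal is winning or losing against itself."""
    rng = random.Random(seed)
    Q = {}  # Q[(stones_left, action)] = estimated value for the player to move

    def q(s, a):
        return Q.get((s, a), 0.0)

    def actions(s):
        return range(1, min(3, s) + 1)

    for _ in range(episodes):
        s = pile
        while s > 0:
            # epsilon-greedy: mostly exploit the current policy, sometimes explore
            if rng.random() < eps:
                a = rng.choice(list(actions(s)))
            else:
                a = max(actions(s), key=lambda m: q(s, m))
            s2 = s - a
            if s2 == 0:
                target = 1.0  # took the last stone: a win for the mover
            else:
                # zero-sum game: the opponent's best value from s2 is our loss
                target = -gamma * max(q(s2, b) for b in actions(s2))
            Q[(s, a)] = q(s, a) + alpha * (target - q(s, a))
            s = s2
    return Q

def best_move(Q, s):
    """Greedy policy extracted from the learned Q-table."""
    return max(range(1, min(3, s) + 1), key=lambda a: Q.get((s, a), 0.0))
```

After training, the greedy policy rediscovers optimal Nim play (always leave your opponent a multiple of four stones) without ever being told the rule — the "two white belts inventing moves" dynamic in miniature.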

[01:40:31]

Let me pause you here, because this is one of the things that I think translates from jujitsu to A.I. You need more than one opponent. You can't have one input — one person training with one person, specifically and singularly, you're not going to develop the type of game that you need to become a real black belt in jujitsu.

[01:40:55]

One hundred percent. Exactly. So that's part of the brilliance of this mechanism. So imagine you didn't just have two white belts — you had the opportunity to generate a new random white belt. A big fat one, a little one, right, all kinds of different ones. An aggressive one, an Eddie Bravo, a passive-aggressive one.

[01:41:15]

And let them play. And what you find is — jujitsu might be simpler than more general problems like StarCraft and so on — but there are sets of strategies in this giant space, complex hierarchical strategies, high-level strategies and the specifics of different moves, that emerge, some of which you didn't even realize existed. And that requires that you start with huge amounts of random initial states: the fat person, the skinny person, the aggressive person and so on.

[01:41:51]

And then you also keep injecting randomness into the system so you discover new ideas. So even when you reach purple belt, you don't continue with those same people. You start your own school, you start expanding to totally random new ideas. And what you find is there are things totally surprising to human beings — in the game of chess, in the game of Go, in the game of StarCraft, this self-play mechanism can do what sort of A.I.

[01:42:19]

people have dreamed of, which is: be creative. Create totally new behaviors, totally new strategies that are surprising to human experts.
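The "population of random white belts, plus injected randomness" idea can also be sketched as a toy league, again with Nim standing in for jujitsu. This is a hypothetical evolutionary sketch, not AlphaStar's actual league mechanism: a pool of random fixed policies plays round-robin, the top half survives, a few survivors are mutated, and brand-new random agents are injected every generation so the pool keeps exploring.

```python
import random

STATES = range(1, 8)  # Nim pile sizes; take 1-3 stones, last stone wins

def make_agent(rng):
    # a fresh "white belt": an arbitrary fixed policy (state -> stones to take)
    return {s: rng.randint(1, min(3, s)) for s in STATES}

def play(first, second, pile=7):
    # returns True if the player moving first wins
    policies, s, turn = (first, second), pile, 0
    while True:
        s -= policies[turn][s]
        if s == 0:
            return turn == 0
        turn = 1 - turn

def league(pool_size=20, generations=200, seed=0):
    rng = random.Random(seed)
    pool = [make_agent(rng) for _ in range(pool_size)]
    for _ in range(generations):
        # every agent plays every other agent (moving first); score = wins
        scores = [sum(play(p, q) for q in pool if q is not p) for p in pool]
        ranked = sorted(range(pool_size), key=lambda i: scores[i], reverse=True)
        top = [pool[i] for i in ranked[: pool_size // 2]]
        # mutate a few survivors: tweak the move at one random state
        mutants = []
        for p in top[: pool_size // 4]:
            child = dict(p)
            s = rng.choice(list(STATES))
            child[s] = rng.randint(1, min(3, s))
            mutants.append(child)
        # inject brand-new random agents: the "keep injecting randomness" step
        fresh = [make_agent(rng) for _ in range(pool_size - len(top) - len(mutants))]
        pool = top + mutants + fresh
    scores = [sum(play(p, q) for q in pool if q is not p) for p in pool]
    return pool, max(scores)
```

With enough generations, the pool reliably produces an agent that wins every first-player game, because diverse opponents punish every weakness — the same reason one training partner isn't enough to make a black belt.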

[01:42:26]

That's why Go was so astounding to them, right? Because it's such a complex game, such a hard game.

[01:42:33]

And it's able to — well, the first astounding thing is it's able to beat the world champion. Yeah. The second astounding thing, about both chess and Go, is it's able to create totally new ideas. I'm not good enough at chess or Go to understand the newness of them, but grandmasters talk about the way AlphaZero plays chess, and they say there are a lot of brilliant, interesting ideas — very counterintuitive ideas. And that's just the first breakthroughs.

[01:43:04]

The first systems didn't have as much self-play; they were trained on human experts. But AlphaZero, AlphaStar, and OpenAI Five — these systems are all fundamentally self-play, meaning no human supervision, starting from scratch. No black belt instructor; you just roll. And learning from scratch — that's exceptionally powerful. Through that process, from zero, you can get to superhuman-level intelligence in a particular task in a matter of days.

[01:43:37]

That's super powerful, super exciting, super terrifying, depending on how you think about it. The challenge is, we don't know how to do that in the physical space, in the space of robots. There's something fundamentally different about being able to perceive and understand this environment, to do common-sense reasoning. The thing we really take for granted is our ability to reason about the physics of the world — the fact that things have weight, that you can stack things on top of each other.

[01:44:08]

The fact that some things are hard, some things are soft, and some things are painful when you touch them. There seems to be a giant Wikipedia inside our brain of common-sense logic that's very tough to build up. That seems to be an exceptionally difficult learning problem that Boston Dynamics will have to solve in order to achieve even the same kind of physical movement behavior that we saw in those videos. And then on top of that, to have the ethical behavior — not even ethical, just the objectives, the complex strategies involved in first following orders, then getting frustrated, then shooting everybody.

[01:44:55]

That's an exceptionally difficult thing to arrive at, because ultimately these systems are operating on a set of objectives. And a lot of people who think about artificial general intelligence say the objectives we inject into these systems need to have, one, uncertainty. So they should always doubt themselves — just like, if you want to be a good black belt, you should always be sort of open-minded, relaxed, always ready to learn new techniques. It's OK to get submitted.

[01:45:25]

So always have a degree of uncertainty about your world view — the kind of thing we've criticized Twitter outrage mobs for not having. So, uncertainty. And the other thing is to always have a place where there can be human supervision. And I think we have good mechanisms for that in place, so I'm very optimistic about where these kinds of learning systems can take us. The exciting thing — or terrifying, depending on whether you think I'm a trustworthy human being — is Boston Dynamics.

[01:46:03]

Boston Dynamics is now opening up their platform. So they are working with a few people — I'm quite busy these days, but I'll try to make time to make it happen — to work with them, to build stuff on top of the platform. The platform I'm referring to is Spot Mini. So this robot is dumb. It's like a Roomba: a dumb mechanistic thing that can move for you. But you can build—

[01:46:28]

You can add a brain on top of it. You can make it learn, you can make it see the world, and so on. That's all extra — that's not what Boston Dynamics offers. So they want to work with people like me to add that kind of capability. And that's exciting, because now you can have hundreds of people start to add interesting learning capabilities.

[01:46:48]

So I may have to retract my words about how far away we are from these robots having real capabilities, once they open it up like this. I was speaking about Boston Dynamics — I think they're solving the really hard robotics problem. But once you open it up to the huge world of researchers doing machine learning, computer vision, and AI research, the kinds of capabilities they might add to these robots might surprise us. That's where people are concerned, right?

[01:47:15]

The big leaps. The big leap, and then sort of not being aware of the consequences of these big leaps. And once you let the genie out of the bottle, you can never put it back.

[01:47:26]

All right. And the genie is the self-play mechanism, where you grow from zero to becoming a world-class chess player. That's the genie being out of the bottle.

[01:47:35]

Did you see Black Mirror? Yeah, Black Mirror. You know that episode, Metalhead, with the robot dogs? Very difficult to pull that off. Very.

[01:47:43]

So, for now. For now. And yet — I had a conversation with Nick Bostrom; I'm also talking with him on the podcast. One of the things he mentioned — I don't think he thinks about this stuff as much as I do — is military applications. When I talk to folks, that's one of the things people don't discuss. Just like with me, they kind of put it to the side; they don't want to think about military applications. I'd be more worried about drones than I would be about robot dogs.

[01:48:12]

Because the kind of stuff that's in the Black Mirror episode is really difficult to pull off — to make a robot learn.

[01:48:21]

Drones are kind of more impressive, right? Because they hover, they can move through 3-D space, they have Hellfire missiles attached to them. And there's a lot of crazy shit that they can absolutely do right now with drones. And you're talking about large-scale drones, but you can think of small-scale drones.

[01:48:37]

And I think there's also a Black Mirror episode with drones, where they take over. I haven't seen that one.

[01:48:44]

So there are drones everywhere, and they're doing, you know, your basic friendly government mass-surveillance kind of thing.

[01:48:55]

I think they said in the episode it's for a good cause. Spoiler alert — but I think they, like, start killing everybody. Of course.

[01:49:05]

There has been research done on making artificial insects that have little cameras inside of them, that look like a dragonfly or some sort of bug. And they fly around and they can film things.

[01:49:17]

And the thing that terrifies a lot of people is going more microscopic than that.

[01:49:21]

More like robots inside the body that help you cure diseases. There are certain things even at the nanoscale.

[01:49:31]

So basically creating viruses. Yeah, new viruses.

[01:49:34]

Little tiny ones. Yeah. And if they learn — they can be pretty dumb, but on a mass scale, dumb things can be intelligent enough to destroy all of human civilization.

[01:49:45]

So the real question about this artificial intelligence stuff — what everybody seems to see as the ultimate end of the line, what Sam Harris is terrified of — is it becoming sentient, making its own decisions, and deciding that we don't need people. That's what everybody's really scared of.

[01:50:04]

Right? I'm not sure everybody's scared of it. Yeah, they might be. I think that's the story that's the most compelling, the sexiest story, the one the philosopher side of Sam Harris is very attracted to. I'm also interested in that story. But achieving sentience — I think that requires creating consciousness, and the kind of intelligence and cognition and reasoning abilities that are really, really difficult. I think we'll create dangerous software-based systems before then.

[01:50:40]

They'll be a huge threat. I think we already have them: the YouTube algorithm, the recommender systems of Twitter and Facebook and YouTube — from everything I know, having talked to those folks, having worked on it.

[01:50:54]

The challenging aspect there is they have the power to control minds — sort of, what the mass population thinks. And YouTube itself and Twitter itself don't have the direct ability to control the algorithm in the way that matters. They don't have a way to understand the algorithm, and they don't have a way to control it. Because what I mean by control is controlling it in a way that leads, in aggregate, to a better civilization — in the Steven Pinker, Better Angels of Our Nature sense.

[01:51:33]

So, encouraging the better sides of ourselves. It's very difficult to control a single algorithm that recommends the journey of millions of people through the space of the Internet. It's very difficult. And I think the intelligence instilled in those algorithms will have a much bigger effect, positive or detrimental, than sentient killer robots. I hope we get to sentient killer robots, because that's a problem I think we can work with. I'm very optimistic about the positive aspects of approaching sentience, of approaching general intelligence.

[01:52:14]

There's going to be a huge amount of benefit, and I think there will be — there are a lot of mechanisms that can protect against that going wrong, just because we know how to control intelligent systems when they're sort of in a box, when they're singular systems. When they're distributed across millions of people and there's not a single control point, that becomes really difficult. Mm hmm. And that's the worry for me: the distributed nature of dumb algorithms on every single phone, sort of controlling the behavior, adjusting the learning journey of different individuals.

[01:52:56]

So to me, the biggest worry, and the most exciting thing, is recommender systems — that's what they're called at Twitter, at Facebook, at YouTube. YouTube especially. That one has — just like I think you mentioned, there's something special about videos in terms of educating and sometimes indoctrinating. And YouTube has the hardest time. I mean, they have such a difficult problem on their hands in terms of that recommendation, because they don't—

[01:53:31]

This is a machine learning problem. But knowing the contents of tweets is much easier than knowing the contents of videos. Our algorithms are really dumb in terms of being able to watch a video and understand what's being talked about. So all it's looking at is the title and the description — and that's it, mostly the title. It's basically keyword matching, plus looking at the clicking and viewing behavior of different people.

[01:54:00]

So it figures out that, say, flat earth supporters enjoy these kinds of videos. It forms a different kind of cluster and makes decisions based on that. By the way, it seems to make definitive decisions about — you know, it doesn't like flat earth, I think.
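The clustering-by-viewing-behavior mechanic described here can be shown in a stripped-down form: recommend videos watched by users with similar histories. The users and video ids below are invented for illustration, and YouTube's real system is vastly more complex, but the "people like you watched X" family of collaborative filtering is the same basic idea — which is also why a cluster of conspiracy viewers keeps getting fed more of the same.

```python
import math

# toy watch history: user -> set of watched video ids (all names hypothetical)
HISTORY = {
    "alice": {"flat_earth_101", "nasa_lies", "moon_hoax"},
    "bob":   {"flat_earth_101", "moon_hoax", "chemtrails"},
    "carol": {"jre_1234", "chess_openings", "guitar_lesson"},
    "dave":  {"jre_1234", "guitar_lesson", "bjj_basics"},
}

def similarity(a, b):
    # cosine similarity between two users' watch sets (0 = no overlap)
    inter = len(HISTORY[a] & HISTORY[b])
    return inter / math.sqrt(len(HISTORY[a]) * len(HISTORY[b]))

def recommend(user, k=2):
    # rank unseen videos by the summed similarity of the users who watched them
    scores = {}
    for other in HISTORY:
        if other == user:
            continue
        w = similarity(user, other)
        for video in HISTORY[other] - HISTORY[user]:
            scores[video] = scores.get(video, 0.0) + w
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Run it and the flat-earth viewer gets recommended another conspiracy video, because the only user similar to her watches those — the cluster reinforces itself with no understanding of the content at all.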

[01:54:17]

Well, YouTube in particular — they're trying to do something about the influx of conspiracy theory videos. Yeah. And the indoctrination aspect of them. You know, one of the things about videos is, say someone makes a video on a very particular subject and they speak eloquently and articulately, but they're wrong about everything they're saying. They don't understand the science. Say they're talking about artificial intelligence — saying things about a subject that you are an expert in.

[01:54:48]

Without being checked, without someone like you in the room who says, "that's not possible because of X, Y, and Z," they can just keep talking. So whether it's about flat earth, or dinosaurs being fake, or nuclear bombs being fake, they can just say these things, and they do it with an excellent grasp of the English language.

[01:55:12]

Right. So they're very compelling in the way they speak. They'll show you pictures and images. And if you're not educated enough to understand that this is nonsense — and especially if you're not skeptical — you can get roped in. You can get roped in real easy. And that's a problem. And it's a problem for some of the people who work at these platforms: their children get indoctrinated, and they get angry that their children get indoctrinated.

[01:55:41]

Now, what's interesting is they also get indoctrinated with right-wing ideology, and then people get mad that they're indoctrinated — like with Ben Shapiro videos. So they'll get pissed off at that. Well, but you're okay with left-wing? Why? Because you're left-wing. So then it becomes: okay, what is really a problem, and what is just something that's opposed to your personal ideology? And who gets to make that distinction?

[01:56:10]

And that is where the arguments for the First Amendment come into play. Should these social media companies, with massive amounts of power and influence, be held to the same standards as the First Amendment? Should these platforms be treated as essentially a town hall, where anyone can speak? And there's a real problem in that there aren't that many of them. Twitter is the place where people go to argue and talk, and Twitter maybe has Facebook as a competitor.

[01:56:48]

But YouTube certainly doesn't have a competitor. YouTube doesn't have any competitor. I mean, there's Vimeo, there are a few other platforms, but realistically, it's YouTube.

[01:56:57]

You know, YouTube is a giant, giant platform. What does this say? Alphabet reports YouTube ad revenue for the first time: the video service generated fifteen point one billion in 2019. Holy shit.

[01:57:15]

As a comparison, I just looked up Twitch: ad revenue was supposedly around 500 to 600 million. Wow.

[01:57:23]

That's a big difference. And what about Facebook? Facebook is so stupendously valuable, probably way higher than that.

[01:57:30]

But Facebook — I don't think Facebook pays you like YouTube does. YouTube paid for my McDonald's burgers yesterday.

[01:57:39]

Yeah, Facebook's not, right.

[01:57:40]

Facebook is not. Neither are Twitter and Instagram — they're not paying you directly. But there are a lot of calls to break up Facebook. I mean, I'm on Facebook, but I'm not on it — I don't use it. It's just connected to my Instagram: when I post something on Instagram, it goes to Facebook as well. I never go to Facebook. Is there a Joe Rogan Facebook group?

[01:57:57]

That's a dumpster fire. Brilliant folks — let's just put it that way. Look at this.

[01:58:03]

Facebook revenues amounted to twenty-one point eight billion dollars in the fourth quarter — Jesus Christ, just the fourth quarter — the majority of which were generated through advertising. The company announced over seven million active advertisers on Facebook during the third quarter of 2019. That probably also includes Instagram. The thing with YouTube is, it's just YouTube, not Google — plus YouTube Premium. Anything else, you know?

[01:58:26]

And to be fair, the cash they have, they spend. Like Facebook AI Research — it's a huge group, some of the most brilliant people, doing general open-ended research. Google Research, Google Brain, Google DeepMind — doing open-ended research. They're not doing the ad stuff; they're really trying to build things. That's the cool thing about these companies having a lot of cash: they can bring in some of the smartest people and let them work on whatever, in case it comes up with a cool idea — like autonomous vehicles with Waymo.

[01:58:57]

Yeah, it's like let's see if we can make this work. Let's throw some money at it even if it doesn't make any money in the next 5, 10, 20 years. Let's make it work. That's the positive sort of side of having that kind of money.

[01:59:07]

Yeah, that makes sense — as long as they keep doing those kinds of things. The real concern, though, is that they're actually severely influencing the democratic process.

[01:59:20]

It is difficult. I mean, certainly Jack Dorsey — of the CEOs I've interacted with, I think he's one of the good guys. Yes, I agree. Yeah.

[01:59:31]

I mean, he wants a Wild West Twitter. Well, he doesn't know what he wants.

[01:59:35]

He wants a good Twitter. He's kind of thinking about the Wild West, but one of his ideas is to have two Twitters.

[01:59:42]

One that's filtered, and one that's like crypto: anything goes.

[01:59:48]

Haha.

[01:59:49]

But I think — the point is, nobody knows what the best kind of Twitter is. Even having two Twitters: do you really want the Wild West? Do you want the First Amendment, free speech for everyone? It's difficult; there's a gray area there. You were just talking about YouTube and certain people — like, say I'm an expert in AI or autonomous vehicles, but I disagree with a lot of people. And those people make videos, and maybe they don't have a PhD, God forbid.

[02:00:18]

Are they not experts, then? Am I? Right. Yeah. I'm actually personally sick of the academic, cathedral sort of thinking that says a PhD makes you an expert. Like, I'm not an expert. I'm an idiot.

[02:00:30]

Do you feel like that line is getting more blurred, with access to all those MIT courses that are online and the extreme amount of data that's available to people? That there are going to be a lot of people who, even though they might not be classically trained, have a massive amount of information and an open mind?

[02:00:49]

Yeah. You know, I record a podcast, right? First of all, shout out to Jamie for being an incredible mastermind of audio production.

[02:01:01]

Right. Watch, sir.

[02:01:03]

Yeah.

[02:01:04]

But he's the GOAT. The reason I'm giving a shout-out is because I suck so badly, and I have to do it all myself. But I've learned to do it, you know, pretty well. When you learn something yourself from scratch — just like what you were saying with music, learning guitar from scratch — you can learn with the online material they have now. You could become really good. And the journey you take is not the traditional, conformist journey through the education process.

[02:01:33]

You take your own journey, and we have millions of people taking their own journeys through that process. There are going to be brilliant people with PhD-level knowledge without ever having gone to college. Right. And, I mean, it's difficult to know what to do with that, especially on political questions. Economists — you know, Paul Krugman, Nobel Prize winner, economist; Harvard economists — they're supposed to be the holders of the truth about the fundamentals of our economy. When is there going to be a crash?

[02:02:05]

What's good for the economy — the left, the right, what taxation system is good for the economy — nobody really knows. It's like nutrition science, psychology, economics: anything that involves humans is a giant mess. So expertise can come from anywhere, right?

[02:02:22]

Like Rhonda Patrick. I think she gets criticized partly because she's kind of young. And she's, I would say, incredibly knowledgeable — one of the world's sort of experts. But I think academia probably doesn't acknowledge her as an expert. She's young, she got her PhD relatively recently. There's that kind of hierarchy that people enforce.

[02:02:45]

She's been unjustly criticized by people who don't even know her actual credentials. There was one guy criticizing her, saying, well, she's not a clinical researcher — that was one of the things he was backing his criticism with. Like, no, that's exactly what she is, and she's been doing it for years. You don't know what the fuck you're talking about. People get very touchy with her because she's young, and also because she's incredibly brilliant.

[02:03:09]

She brings stacks and stacks of notes when she comes here, and she doesn't even look at them. She rattles off all those studies off the top of her head. She has a massive amount of data available, and she's very unbiased in her perception of things. She's all about: what do the results say? What have the studies proven? What can we learn from those studies? And what do we have to take into consideration when we're assessing this data?

[02:03:34]

She's brilliant. She's off-the-charts brilliant. And people get fucking jealous. I've seen it.

[02:03:40]

I've seen it with weaker, lazier minds in academia that criticize her — people who at least at one point in time had a larger platform. I think her platform is bigger now, and I'm happy that I've played a part in that.

[02:03:53]

I don't want to be a social justice warrior, but I have seen women being criticized more harshly in a lot of domains of science. Yeah, I think you're right.

[02:04:01]

Yeah, well, you know, she's pretty, too. There's a lot going on there. You know, I'm criticized for that, too.

[02:04:07]

Like, good-looking, beautiful guys get dissed. Well, you're funny, handsome. No, actually, the criticism is: this guy's an idiot, boring — why can't he be more like Joe Rogan?

[02:04:18]

What else you got there in the notes? Yeah, I've got — something about martial arts. Okay.

[02:04:24]

I've got to talk to you about — well, I'm a huge fan of wrestling, and a huge fan of the Dagestan region. Yes. And I've gotten a lot of shit for it. I posted that Conor was going to beat Cowboy before that happened.

[02:04:41]

I'm also a huge fan of the different styles of fighters in MMA, and I'm surprised how much shit Conor actually gets, even though he brought — besides, sort of, all the—

[02:04:57]

All the mess that came with him, he also brought an interesting style, an interesting way of approaching fights. And the way he was thinking, and also philosophizing, about fighting, I think, is amazing. It clashes with the style of Khabib — and to me, Khabib could beat him, obviously. And so I posted that Conor would beat Cowboy. And — I didn't know Dana was going to say it — I was going to say he should face Masvidal next, and I thought he could beat Masvidal.

[02:05:27]

And then the biggest fight ever: thirty thousand people in Moscow, Conor against Khabib for the rematch. For me, a rematch would be the greatest. Conor getting past Masvidal — not easy. And Khabib getting past Tony Ferguson — not easy.

[02:05:40]

Yeah, both of those fights. And first of all, Masvidal is now going to fight Usman, which is very interesting. Yeah, that's July. Very, very interesting fight.

[02:05:52]

Usman is such a tank. He's fucking terrifying.

[02:05:56]

They hate each other. There's a lot of — yeah, certainly a lot of animosity, a lot of shit-talking. But the more that happens, the better it is for both of them in terms of revenue generated.

[02:06:09]

It's a really interesting fight. Now, let me tell you something. When Masvidal was at the Conor-Cowboy fight, when they put the camera on him.

[02:06:16]

Biggest pop from the crowd, the biggest. All the people were screaming; people went nuts.

[02:06:22]

When they saw him live.

[02:06:25]

And he was wearing that robe. He's hilarious, dude.

[02:06:29]

I mean, he is. Look, he's a slow starter in terms of his career, in terms of being recognized for the kind of fighter that he is, and also being recognized publicly as a superstar. But his time has come. He is here. He is a fucking star. When that camera went on him and the audience saw him, that crowd went bananas. The entire T-Mobile Arena, they went crazy.

[02:06:54]

Yeah, it'll be... I mean, yeah, this would be an epic fight. Maybe it's me romanticizing the notion, like Rocky 4, but Conor versus Khabib in Moscow. I can just see it, with Putin and Fedor sitting there next to each other.

[02:07:10]

Do you think they would do it in Moscow? Yeah, thirty thousand people. If Conor went to Moscow? Man. Good luck getting out of there if you win.

[02:07:21]

Good luck getting out of there if you lose, too, though.

[02:07:24]

But Khabib is so loved, so loved in Russia. But I think Russian people also love Conor, and they generally love the fights. The number of people that love fighting in Russia is huge. And I know it seems like, on the Internet, they love Khabib and Conor is hated. But I think ultimately they'll love a good... what does he call it? A good...

[02:07:55]

No, a good scrap. I think he calls it a good fight. Yeah. Yeah. I think it'll probably be the biggest fight of all time. And I think actually Conor has a shot. Like, this... I love Khabib, he's my favorite fighter. I love that style of fighting. I like the Saitiev brothers, that I think Frankie Edgar mentioned to you: Buvaisar Saitiev, probably the greatest freestyle wrestler of all time. Just epic.

[02:08:24]

His brother Adam has a match against... the Soldier of God, what's his name? Yoel Romero. At the 2000 Olympics, in the finals. Yoel Romero looks like, if you were to imagine the most terrifying opponent ever: he's just shredded, ripped. And then Adam Saitiev looks like, I don't know, dad bod, very skinny, like a nerd. And he just effortlessly destroys him. Really?

[02:08:55]

Yeah. With a trip like.

[02:08:56]

Let me see that video. I've seen it online. 2000 Olympics, Sydney, finals. Adam Saitiev. His name: Adam, S-A-I-T-I-E-V. Versus Yoel Romero, this giant.

[02:09:21]

Yeah. Not too much of a nerd, but... well, he definitely doesn't look as built as Yoel. Yeah, well, as a freak, Yoel is probably the freakiest athlete I think I've ever seen personally, in terms of his build, his small waist. He hugged that guy and picked him up. That was at the end of it. That's what I was looking for.

[02:09:41]

There's a couple of moments where he takes him down and scores points. He's got him down here. I guess he's up by four to one, and I think once again he takes him down. Let's start it off at the beginning so we can watch it. There's a certain moment... I mean, there he goes, right there. They're basically technicians. Yes, for sure. When you look at the Dagestani people, I mean, there's such an emphasis on technique.

[02:10:08]

Yeah, that above everything else. But also toughness.

[02:10:11]

It's like they have both things. One of the things that Georges St-Pierre told me about training in Russia... excuse me, training in Montreal... what a takedown right there. That was spectacular. Oh, my God. It's amazing. Look at that. Yes, yes. Watch, he covers his mouth. What Georges St-Pierre told me about training with Russian nationals in Montreal: he said they're so technical. You get a lot of Americans that are definitely technical, but they emphasize being hard and tough, grueling training camps, grinding, butting heads in practice.

[02:10:54]

And he said, whereas the Russian nationals are far more committed to drilling, far more committed to the technical aspects of exchanges, and going through, you know, one technique after the other, chaining these techniques together, understanding the paths.

[02:11:11]

Also, at least to me, one of the differences... Khabib is actually similar to Yoel Romero's philosophy. The philosophy of the Dagestani, the Russian people, the Soviet Union, is that recognition, fame, money, all of that stuff doesn't matter. Even winning doesn't matter.

[02:11:28]

The purity of the art is what matters. At least with the Saitiev brothers, that's what they stood for.

[02:11:34]

Well, that mirrors what Khabib says about Conor, that he doesn't want a rematch with him. Yeah, he's like, fuck that dude.

[02:11:41]

I mean, yeah, Khabib is a little bit more of the modern age. He has Instagram and Twitter and so on right now.

[02:11:47]

And Khabib, despite what he says, also does a little bit of trash talking. And, you know, he still plays the game a little bit. "To change his face." Yes, that's my favorite. "Send me location." "I want to change his face." The best phrase, for sure.

[02:12:06]

That's I take that as a cop.

[02:12:10]

That's one of my favorite quotes he's ever said, though: "I want to change his face." To change his face. It's terrifying. It's terrifying because he can do it. The cool thing with Conor is that doesn't affect him.

[02:12:20]

The confidence he has... the confidence Conor has is just incredible.

[02:12:24]

Well, that, and he wants to do it again. But I know for a fact that Conor was going through a whole lot of shit before that fight and did not have the best training camp.

[02:12:32]

He did an amazing training camp for this one; he really prepared, like he did for the Conor... excuse me, Cowboy... the Cowboy fight.

[02:12:38]

His coaches were saying he's never looked better. He was just on fire, so focused, so accurate and precise in training. And that just seemed to be... all of the bullshit and the distractions and all the things that sort of come with being the kind of global superstar that Conor is, he managed to figure out a way to get away from those and to just really concentrate on his craft and pull everything together to a championship level again.

[02:13:11]

And God damn it, he looked like it against Cowboy.

[02:13:14]

And to see the contrast of those two cultures... I mean, it is a Rocky 4 type of situation. Yeah, the trash talk. Because you better believe Conor McGregor will resume trash talk. Who knows? He might not. He might not.

[02:13:25]

I mean, he didn't in this fight with Cowboy at all. He didn't do any trash talking. I wonder if maybe he has learned.

[02:13:32]

And I wonder if, you know, his desire to beat Khabib eclipses his desire to get inside of his head and play all the games that he usually plays, the promotional games that ultimately probably won't be necessary. But, you know, the UFC is trying to push for it right now. They're pushing right now for a rematch with Khabib, but they're ignoring Tony Ferguson in a lot of ways, in my eyes. And I'm like, that is the boogeyman.

[02:13:58]

It's going to be exciting. It can go anywhere. He's the boogeyman. Dude, he doesn't get tired. He doesn't get tired. He slices everybody up. He hasn't lost... he's lost one fight in, you know, X amount of years, and that was because he had a broken arm; Michael Johnson broke his arm. So when you think about what Tony has been able to do to world-class fighters, what he did to Donald... I mean, he just smashed Donald's face.

[02:14:23]

He smashed Anthony Pettis. He smashes everybody. Tony Ferguson is the goddamn boogeyman. He really is. He doesn't get tired, man.

[02:14:32]

And he'd get taken down, for sure. Khabib will take him down and do his thing on him.

[02:14:37]

He's not scared to be taken down. That's the difference between Tony and everyone else. If he gets taken down, he might let him take him down and just attack off of his back, elbow the shit out of him off of his back. He's fucking dangerous off his back. He's hard to control. He scrambles very, very well. He also has fantastic submissions. He catches them from everywhere. I mean, he catches triangle chokes, D'Arce chokes. His D'Arce chokes are spectacular.

[02:15:05]

He's got one of the best D'Arce chokes in the sport. And he's not scared.

[02:15:10]

If Khabib gets submitted... man, it would be crazy if he puts Khabib to sleep. Look. Yeah. Do you remember when Dustin Poirier caught Khabib in a guillotine? Yeah. Yeah, he did. He caught Khabib in a guillotine.

[02:15:23]

Listen, that is not where you want to be with Tony Ferguson. You do not want to be in that position with Tony Ferguson; that's a different kind of guillotine than Dustin Poirier's. Poirier is primarily a striker. Clearly he has submission skills; he submitted guys before, he submitted Max Holloway, and Dustin Poirier is a bad motherfucker, no doubt about it. But when it comes to pure submission skills, Tony Ferguson has an edge. And, you know, he's a black belt, a 10th Planet black belt.

[02:15:51]

He's a master of submissions and a great wrestler and a great scrambler. And the thing about him that's so fucking terrifying is his cardio. It's all the things, right? It's the striking, it's the grappling, it's the submission abilities.

[02:16:05]

But he's not going to get tired. He doesn't get tired. And his mind is impenetrable. His mind's impenetrable. People are looking past that Ferguson fight. No, not me, man. I don't understand it. When the UFC is talking about, you know... look at everybody he's fought. He beat the fuck out of everybody: Edson Barboza, Rafael dos Anjos.

[02:16:25]

Yeah, yeah. I mean, he smashes people. He smashes people.

[02:16:31]

I mean, it's crazy. I would say it's probably Khabib's toughest fight. I think it is his toughest fight, I do. And a lot of people put Khabib close to the top 10 of all time.

[02:16:45]

Oh, he's in the top 10 of all time in my eyes. He's 28 now.

[02:16:50]

And as a lightweight, who cares about the record? You look at the people he's beaten. Sometimes we idolize people for the perfection of the record.

[02:16:56]

Too much. The way he ragdolled Rafael dos Anjos, the way he steamrolled... I mean, he's beaten top-flight competition and made them look like they have no business being in there with him. But I think if you beat Tony Ferguson, I mean, yeah, that's immense. That's immense. And people put him above... like, I don't know, I think Conor deserves to be in that top fifteen, top ten perhaps.

[02:17:24]

Perhaps. Like beating Jose Aldo. Yes. Oh yeah.

[02:17:28]

I don't know why people look past like Jose Aldo or Eddie Alvarez.

[02:17:33]

Oh yeah. The Eddie Alvarez fight was unbelievable.

[02:17:36]

Maybe I'm just biased in the sense that I thought there's no way that Conor beats Jose Aldo. And then, there's no way Conor beats Eddie Alvarez moving up a weight class. Like, I always thought he's going to lose, and I kept being surprised. It makes me up Conor's ability in my head.

[02:17:55]

Well, he's phenomenal. With Conor, it seems to be a matter of how focused he is, and who is he fighting, and, you know, where's he at in his life. His life is so chaotic. He's always filled with so many distractions. I mean, think about all the crazy shit that he's done, throwing the dolly at the bus and just all the nutty shit he's done.

[02:18:18]

But it's nice that he seems to be still hungry to fight, even though he probably has a lot of money in the bank.

[02:18:23]

Well, he certainly was hungry to fight Cowboy. I mean, he looked fantastic in that fight. And again, you know, he's worth a couple hundred million dollars. So it's just the pure love of the game. The pure love of the game.

[02:18:35]

And that's kind of the ethic, the warrior ethos that Khabib represents, and it's cool to see that. Nobody's ever said anything in Russian on the Joe Rogan podcast, probably. No, I don't think so. If you ever need a translator, okay, I'm your man. Now, can I read just a few lines in Russian? Okay. Buvaisar Saitiev would read Boris Pasternak, who is a famous Russian poet, won the Nobel Prize, before every match, and he kind of captures that ethic.

[02:19:06]

So this is the poem. I'll say it in Russian. Okay. And then in English, please.

[02:19:11]

Okay. "Другие по живому следу пройдут твой путь за пядью пядь, но пораженья от победы ты сам не должен отличать. И должен ни единой долькой не отступаться от лица, но быть живым, живым и только, живым и только до конца." I know there's a bunch of Russian people that will appreciate that. The translation is a bit crappy, it's very difficult to translate the Russian language, but it is: the others, step by step, will follow the living imprint of your feet.

[02:19:46]

But you yourself must not distinguish your victory from your defeat, and never for a single moment betray your credo or pretend, but be alive. Only this matters: alive and burning to the end. So this is the end of the poem. Most of the poem says that fame, recognition, money, none of that matters. Winning and losing, none of that matters. What matters is the purity of the art, just giving yourself completely over to the art.

[02:20:18]

So others will write your story. Others will tell whether you did good or bad. Others will be inspired by your story. But as the artist, in the case of Pasternak a poet and writer, he wrote Doctor Zhivago, yours is the art. You should only think about the art and the purity of it and the love of it. And so when you look at Buvaisar Saitiev, the brothers, and that whole Dagestan region, they shunned fame.

[02:20:48]

So there's the fact that Khabib is thrust into this MMA world, which is fundamentally, I mean, really a popular sport. It's an interesting thing. I mentioned, I think, the last time I was on here, the most terrifying human being. You know, like investors when they buy a penny stock thinking it's going to blow up: to me, the most terrifying human being in the heavyweight division is the Russian tank I mentioned last time, Sadulaev, who now just continues destroying everybody.

[02:21:18]

And he's already won the gold medal, won a bunch of world championships. He's a heavyweight. The heavyweights in the UFC should be scared if he ever comes to MMA.

[02:21:28]

So the hard thing: spell his name. Let's get a video so we can look at it. Seriously. Jamie's got it. Yeah.

[02:21:35]

Bam. I think the quote is, "I will never join them." MMA, that is. Classic.

[02:21:40]

That's not it yet. That's the enemy. That's part of the quote.

[02:21:46]

And that's not... yeah, that's not it. Yeah, I love that. That's closer. He's still chasing it, just the 2020 Olympics.

[02:21:57]

How do you say the name? How do you say his first name? Well, I just call him the Russian tank, but it's Abdulrashid, Abdulrashid Sadulaev. OK. 23, 24 years old. And I think his intention is... he says he has a lot of close friends who are MMA fighters. He loves watching it. He feels a lot for them. But it's the very thing that this poem gets at. He doesn't want it.

[02:22:27]

He thinks that wrestling, the pure sport of wrestling, is all about courage, skill; he describes it in this way. He thinks MMA also has to have this component of trash talk, of showmanship, and he doesn't like it. But I think that MMA needs that guy, right? Like a heavyweight Khabib. A heavyweight Khabib.

[02:22:52]

Every Conor needs a Khabib. Like, every showman needs a person who says showmanship sucks.

[02:22:58]

Every Ali needs a Frazier. A Frazier, right? Yeah, I think.

[02:23:02]

But this guy is terrifying. I think he would do the same thing to that division that Khabib did. Again, humble; technique is everything. But strength-wise, he's also a monster.

[02:23:14]

Is he really thinking about fighting or no?

[02:23:17]

That's hard to say. It's hard to say because, again, he's one of the greatest wrestlers of all time, really focused on the 2020 Olympics. He is throwing punches here. I think what's going to happen is, once he likely wins gold at this Olympics, you know, this Titanic ship, a 23-, 24-year-old ship, is going to start thinking and turning: maybe there is artistry, maybe there is skill and courage in mixed martial arts.

[02:23:45]

Well, there definitely is. I mean, he doesn't have to do the trash-talking thing. There's a lot of people that are very stoic that fight, and they don't participate in any of that stuff. You know, and then there's people that thrive on that stuff. I mean, it's really up to you. The UFC doesn't tell you.

[02:24:02]

You have to talk trash. You know, I mean, results are what matters, right?

[02:24:07]

And even the trash talk, that's interesting. I think stories are interesting. Yeah. That's why people like team sports, like the NFL. There's the Super... did you watch the Super Bowl?

[02:24:18]

No, I didn't. I went to Disneyland. I wanted to talk to you about something that was at Disneyland. What's that? There's a new Star Wars ride, right? Yeah. This crazy Star Wars ride. And it's a 20-minute ride. I mean, it's just a crazy long ride, and a lot of it you're in a vehicle. Yes. And the vehicle is all programmed by computers: the direction, the way the vehicle moves.

[02:24:42]

It's very complex. There are no tracks. So you're riding around in this vehicle, and they're shooting at you. The vehicle has to back up, you go through a new door, the vehicle knows how to go around a corner. And what's that guy's name? Darth Maul is trying to cut through the wall. Spoiler, fuck.

[02:24:59]

This new ride is amazing. It's crazy how intricate and complicated it is, and how far off the deep end Disneyland went to create this thing. I mean, it looks so crazy. You're like, how much money did this cost? This is it right here. So you're riding around in these things and stormtroopers are shooting at you.

[02:25:19]

And are there rails or. No, no, no. There's no rails, man.

[02:25:22]

Everything is done by computer. The computer tracks the environment and knows where each one of these things goes. And by the way, there are several cars moving at the same time. So there are people in front of you in cars; they get shot at. And look at the fucking scale of this place. So that's one of them, the giant four-legged robot things, that's in Star Wars. So you're moving underneath them. There are giant cannons that you have to move through.

[02:25:49]

It's in Rise of the Resistance. It's trackless. Yeah. So are those wires in the ground?

[02:25:55]

There's some. Yeah.

[02:25:57]

I think those lines in the ground are just from the wheels going the same way over and over and over again. So I just wanted to sort of comment that they're probably using computer recognition. So, yeah, I think it's probably lidar-based.

[02:26:08]

I don't know what it's based on, but the computer is coordinating all of these different things at the same time. You go through this room and you're seeing battles outside and you feel it; you see the walls get hit, like that. Yeah, it's fucking crazy, man. It's amazing. I mean, the ride... the line is bonkers.

[02:26:29]

So the robotics aspect of this, like the A.I. aspect to this is probably minimal.

[02:26:34]

Look at that. Look at that. You're in this thing, you move through this room, and in the background you're watching these starships shooting at each other, and it's all timed perfectly.

[02:26:44]

Yeah, it's crazy, man. So, really cool that they made this happen. I mean, these are people that were willing to invest probably hundreds of millions in this. Guaranteed. So I think there's very minimal AI in this, because AI creates uncertainty, and uncertainty is very undesirable in situations like this.

[02:27:03]

Yes. Yeah. I don't think there's any AI in it, but for sure there's some sort of automation, some computer automation. Yeah. But it's basic software. It's software, and all of it's basic.

[02:27:15]

Don't you dare. Don't you dare. Lex, it's Star Wars. It's not even real.

[02:27:21]

How are you?

[02:27:22]

There's reusable rockets being launched on a monthly basis and we're gonna colonize Mars for reals.

[02:27:28]

That's real, right? That's more interesting, for sure. Definitely. But this is dope.

[02:27:32]

Meanwhile, you know, I'll be here riding fucking Disneyland rides. And then I'll go home and sleep in a bed and breathe air. Fuck. You'll be out there on Mars, and the history books will remember you.

[02:27:43]

Okay. The history books, I don't know. History books don't matter once you're dead. Yeah. I mean, it's nice that we have access to the history books, and I praise the historians, for sure. But I'm not interested in making history.

[02:27:57]

Yeah, I don't know actually why I said that, because I don't care about the history books. To me, it's just exciting. It's one of the only frontiers left where we can actually be explorers. Like, we've explored... well, the depths of the ocean? Barely. Exactly, right. Yeah, but outer space, that's the most exciting one for engineering and science that we can explore. And the mind. Like, we don't know shit about the mind, and we can explore that with neuroscience, with AI.

[02:28:26]

Mm hmm. Just all of that. The consciousness, all the other things you talked about with Bostrom, the simulation. Yeah.

[02:28:33]

I wanted to talk to you about that too, because you brought up Bostrom. What does Bostrom rely on? I mean, he was relying on mathematical theories of probability to say that he thinks it's more likely that we're in a simulation. Yeah, he has a thing. I think he's articulate. I don't think he came up with the idea of the simulation; he's just really thought about it deeply. He came up with this simulation argument, which is these three categories that describe the possible outcomes.

[02:29:06]

I think the first one is we destroy ourselves before we ever create a simulation. The second one is that we lose interest in creating a simulation at some point. And the third one is we're living in a simulation. Yeah.

[02:29:21]

Where do you lean? I think the three paths he highlighted make it sound like it's so clear, like there's just three. But I think there's going to be a huge number of possibilities for the kinds of simulations. To me... I keep asking, you know... I asked Elon Musk about the simulation. He said, what's on the other side? What's outside the simulation? Yeah. I think I asked, what would you ask an AGI system?

[02:29:52]

He said, "What's outside the simulation?" That's the question.

[02:29:54]

Does he believe in it, or is he saying it as a troll? Elon Musk embodies, like, the best of the Twitter Internet troll meme and a brilliant engineer and designer in one.

[02:30:10]

It's like, yeah, it's like a quantum state that you can't quite figure out. What's the coupling? Because I don't know if he's trolling. But I'm the same way. I love asking people about the simulation, even though I get a little bit of hate from the scientific community.

[02:30:22]

I do get hate from the scientific community about the simulation, because to them it's a ridiculous notion.

[02:30:29]

It's literally that it's not a testable thing, whether we're in a simulation or not. Like, why are you talking about this? Why do you sit down with Elon Musk and talk about the simulation when you're sitting with a world expert in particular aspects of rockets or robotics? Like, I'm an expert... well, I just said that I'm not an expert, but I know a few things about autonomous vehicles. Why don't you talk to him about that?

[02:30:54]

Right. Why do you start with the simulation? Well, the thing is, the simulation pushes you outside of the muck, the messiness of everyday details of science, and makes you ask big questions about, like, the nature of our reality.

[02:31:07]

And I like to think of it as: how do we build a simulation? What would be a compelling enough virtual reality game that you'd want to stay there for your whole life? That's a first step, and that's useful to think about. Like, what is our reality? What aspects are the most interesting for us humans to perceive, with our limited perception and cognitive abilities, to interpret and interact with? And then a bigger question is: how do you build a larger-scale simulation that would be able to create that virtual reality game? I think it's a possible future.

[02:31:45]

We're already creating virtual worlds for ourselves on Twitter and social networks and so on. I really believe that we will enter virtual reality, that we'll spend more and more of our lives in the next 50 to 100 years in virtual worlds. And the simulation hypothesis discussion is part of that, I think. I think the question of what's outside the simulation is really interesting. That's another way of asking: what created us, what started the whole thing?

[02:32:16]

It's the modern version of asking: what is God? What does God look like? You know, it's asking: what does the programmer look like? I think that's a fascinating question. But arguing that we're already living in a simulation... I think you got stuck on that little point. Mm hmm. I think it's a bit of a language barrier.

[02:32:41]

There's a technical... yeah. I think Nick, it's funny, it's funny, Nick is a legit philosopher. So he's been fighting battles in the philosophy game. If you ask him, does somebody disagree with him on this? There's a bunch of philosophers who disagree with him, including Sean Carroll, on the philosophical level. And a lot of the arguments are in philosophy, not sort of technical; they're about language, about terms and so on.

[02:33:07]

But I think, yeah, it's very possible that we live in a simulation. I think one of the constructs of theoretical physics, the many-worlds interpretation of quantum mechanics that Sean talked to you about, reveals some interesting fundamental building blocks of our reality. But there's something I don't think people have talked to you about, which is, like, the coolest thing to me, the most amazing thing that nobody can explain yet: things called cellular automata.

[02:33:40]

And there is a mathematician, John Conway, who came up in the '70s with a thing called the Game of Life. Cellular automata are these two-dimensional, or one-dimensional, grids; the Game of Life is a two-dimensional grid where every single little cell is really dumb and behaves based on the cells next to it. A cell is born when there's a certain number, like three cells, alive next to it, and it dies otherwise. So it's a simple rule for birth and death, and all a cell knows is its nearby surroundings and its own life.

[02:34:14]

And if you take that system with a really dumb rule and expand it in size, arbitrary complexity emerges. You can have Turing machines, so you can simulate full computers with that system, and it can grow, and all these behaviors emerge. If people Google "Game of Life," you can watch this extremely dumb, simple system just grow arbitrary complexity. And what you start to realize is that from such incredibly simple building blocks that don't know anything about the bigger world around them, you can build our entire universe.
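The birth-and-death rule described here fits in a few lines of Python. This is a minimal, illustrative sketch, not anything from the conversation; the function and pattern names are mine:

```python
# A minimal sketch of Conway's Game of Life on an unbounded grid.
# Each cell is "dumb": it only sees its 8 neighbors. A dead cell with
# exactly 3 live neighbors is born; a live cell with 2 or 3 survives;
# everything else dies.
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Count how many live neighbors every cell (live or dead) has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "glider": five cells that travel diagonally forever, one of the
# moving patterns that make the system rich enough to build computers in.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
gen = glider
for _ in range(4):
    gen = step(gen)
print(gen == {(x + 1, y + 1) for (x, y) in glider})  # True: shifted by (1, 1)
```

The point of the sketch is exactly what's described above: the rule knows nothing beyond a cell's immediate neighbors, yet gliders, oscillators, and eventually full Turing machines emerge from it.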

[02:34:57]

You can build the kinds of complexity we see. And so, like, we think that God is designing every little aspect of our world, or in the simulation hypothesis that the simulation is designed by hand, like, I'm going to craft these things. Well, what you realize is all you need to do is set some initial conditions, set some really basic rules, and allow the system to grow, and it can grow arbitrarily.

[02:35:23]

Just crazy stuff. Amazing stuff can happen; from simplicity, complexity can emerge. And if you study this a little bit closer, just watch it... people can watch the Game of Life on YouTube and think about what it's showing for ten minutes. It'll blow your mind. The fact that from simplicity, arbitrary complexity and beauty can emerge is, like, incredible. So for the simulation, the creator of the simulation is probably some 13-year-old nerd living in his mom's basement who just set some rules in this video game and pressed play.

[02:36:03]

And arbitrary complexity can emerge. It can have a Joe Rogan. It can have an Elon Musk. All the technologies that we've developed, and probably millions of other alien species that are living throughout our universe. So, yeah, to me, cellular automata reveal that the simulation is much easier to create than we might think. But there's a lot of variability in the kinds of simulations we'll create. The simulation hypothesis makes it sound like, you know, there's just one.

[02:36:36]

But I think there are going to be a lot of varieties. There are a lot of possible different rule sets, a lot of different physical mediums in which these simulations could be created. It can be a completely virtual world. The role of consciousness: whether you make most people conscious or not, whether most of them are philosophical zombies, just non-player characters, and it's just you. Or is your mind simulated?

[02:37:03]

Like the role of suffering. So consciousness brings with it this idea of, basically, you know, subjective experience, and with subjective experience comes the idea of pain and fear and so on. Again, maybe it's my Russian romanticization of it, but I think fear of death is essential, scarcity is essential, for life. Yeah. And that's a nice feature of this little simulation we've got going on, but there could be a lot of different alternatives.

[02:37:33]

I think it could be less individualistic. Consciousness can be present in different kinds of forms. So I see a lot more options than those three that he highlights. And we can destroy ourselves, entire civilizations, in a lot of interesting ways, from AI to nuclear weapons to biological to all kinds of weapons.

[02:37:53]

So it's almost like whether it's a simulation or not is almost irrelevant. The complexity of existence and all of the various pushes and pulls that keep everything together, they're almost operating like some grand plan, whether or not a grand plan exists. All these different things are happening, and everything is moving in a very specific direction, right? It's moving towards further complexity. Like, I was having a conversation with a friend of mine last night where we were talking about phones, and we were like, you know, when are they ever going to look at a phone and say, I think we're good?

[02:38:40]

We don't have to. The camera works great, the cell signal's great, we can call people, we can text people. Let's just stop innovating right here. And we're both laughing, like it's never going to happen. Even though we admit, like, if you have an iPhone 11 or a Pixel 4, you're set.

[02:38:56]

Yeah, they work fine.

[02:38:57]

They work great, and you don't really need anything better. Like, in terms of the way our culture works, you get so much done on these things, you can bank on them. Is it okay if I'm drinking all your water, or do you have a lot of water?

[02:39:10]

Yeah, please. That's very polite of you.

[02:39:18]

Just existence itself, whether or not there's a design to it, seems to operate in a manner that would indicate there's a design. The design doesn't have to be real. It doesn't have to be a simulation, doesn't have to be a grand plan. But it moves the same way as if there were a grand plan.

[02:39:40]

It's weird. It's hard to put into words, but there is a direction and a momentum, like the evolutionary process. The fact that life was created, the fact that there is a kind of progress. And also, just like with the Native Americans, the fact that suffering seems to be a constant in the story we've been in. As we constantly progress, we seem to keep creating the other and torturing them; this growth process seems to cause suffering and war and so on, and death is a huge part of that.

[02:40:14]

And conflict, even social conflict, like we were talking about with social justice warriors and that type of thing.

[02:40:21]

I think they almost have to exist. It's almost like the world creates a space for them and people find a way to fill that space.

[02:40:30]

Speaking of conflict, by the way, even though you were kind of thrust into politics, there is the Iowa caucus going on today, the first vote for the Democrats. Yeah. And Bernie is leading in the polls, which is interesting. That's a fun little one. Americans have their own little conflict going on here.

[02:40:52]

Oh, there's always going to be conflict with all groups of people, with everything. I mean, there's conflict in the comedy community. I'm sure there's conflict in the AI and autonomous vehicle community. I mean, those things are critical. You know, you learn from conflict.

[02:41:10]

If everything was just simple and easy and there was no resistance whatsoever, nothing would get done. And also, your own personal systems would never get tested. I feel like every adversity you experience is really a gift, because on the other end of that adversity, there's an opportunity for massive growth. What was that Think and Grow Rich quote that Lovato said the other day? Every adversity carries the seed of an equivalent advantage. I mean, just that.

[02:41:44]

Yes. It's just a beautiful way to see it. A beautiful quote. I've got to write it down. I bought that book, too. I'm gonna get to it once I'm worn out on Native Americans. I've got about seven other Native American books.

[02:41:57]

So, like I mentioned, I've been doing the startup thing since August, and it's been a bit of a torture. The self-doubt is pretty hardcore, because I've been failing nonstop. I spend most of my day programming, trying to build a her, or a she, whatever it is.

[02:42:15]

Let's hear her. Was Her a "her"?

[02:42:17]

But now I'm on that path with this particular thing. Because you want to create a business, you have to create tools that people would enjoy using along the way. It's a long journey to create a companion that can form a deep friendship.

[02:42:31]

It seems so weird. Everything seems weird until your life becomes better because of it. Mm hmm. Like, flying cars seem weird. Oh yeah. To me, still. In fact, Uber and Hyundai just partnered. They're still pushing this idea of flying cars, electric VTOL. I just feel like people are gonna slam into each other unless they're autonomous and they have, like, magnets so they repel, like they can't hit each other when they get close.

[02:43:03]

And what happens when they crash? Like, to me, what does an accident look like? One falls on your head?

[02:43:08]

Yeah. You're hanging out in your house trying to watch, you know, Black Mirror or whatever.

[02:43:13]

Most accidents people can walk away from. Like, cars today are incredible, right? But I don't know how you walk away from a flying car crash.

[02:43:21]

Good question. Very good question. You probably won't. Yeah. Fuck, that's scary.

[02:43:26]

Yeah, but any new technology kind of seems awkward or weird. You can be terrified of it, or you can think it's weird, until it takes over. I mean, none of us know what it would look like to have a closer connection with a system like that. I don't know.

[02:43:44]

One of the things in this book that I'm in the middle of, actually towards the end of, this Black Elk book, is that it details the invention of the automobile and the implementation of it, how quickly the world changed.

[02:43:57]

That was the other surprising thing about this book: it's so recent. It's crazy. Really, really recent. Yeah. Yeah.

[02:44:03]

So during this time, when Black Elk was a young boy, he sees Custer get killed, takes his first scalp, remembers the sound of the man gritting his teeth as he's cutting his scalp off.

[02:44:19]

And then later on in his life, as he's an older man, the world goes from very few automobiles to most people having an automobile. During his lifetime, most travel became travel by automobile. What does he say about this world, this new world? I don't know, let's read the book. All right.

[02:44:40]

I mean, he doesn't even know about this world. He knows about the world of the 1930s. I believe he died in the late thirties.

[02:44:46]

It's just, it's scary, not scary, but I don't know what it would feel like to be born into this natural world, to see that kind of suffering at the hands of the U.S. military, and then see the technology of the Industrial Revolution kind of propagate and be faced with that. I don't know what that would feel like.

[02:45:07]

I don't know which world is better. Well, the new world represents progress, right?

[02:45:11]

What is progress? I mean, progress seems to be inevitable. Complexity is inevitable, never-ending complexity. And there's this push towards it.

[02:45:24]

And I've always wondered, I mean, Elon has a saying that human beings are the biological bootloader for AI. And I've always thought that if you paid attention to human beings' desire for materialism, like, materialism seems to be constant throughout cultures. People want things. And when they have things, they want better things, newer things. Well, that generates a consistent level of innovation inside that civilization, that culture. People are going to make better stuff because people are gonna want better stuff, so they're gonna improve upon things.

[02:45:59]

Well, if you just scale that and you keep improving and improving, where do you get to? Do you get to something like artificial intelligence? Do you get to some sort of an event, some sort of a thing where the world changes? And I think technology will help us ride that wave. I'm an optimist in that sense. We haven't talked about it much, but I'm an optimist on Neuralink.

[02:46:25]

I think there'll be a few exciting developments there, with Neuralink, brain-computer interfaces. There's an exciting possibility there. I'm skeptical about some of it, but more positive about increasing the bandwidth of our brain, being able to communicate with the Internet, with information. It doesn't necessarily need to be through brain-computer interfaces, but increasing that bandwidth would expand the ability of our mind to reason. Not expand the ability to reason itself, but take the mechanism of our mind's ability to reason and extend it with access to a lot of information, increase that bandwidth to be able to reason with facts.

[02:47:04]

Just like we can look up stuff on Wikipedia now, increasing the speed at which we can do that would, I think, fundamentally transform our ability to think.

[02:47:13]

Do you think that's ever going to be a wireless option? Because right now they have to drill holes in your head, right? Right.

[02:47:19]

I think there could be other interfaces. Yeah, I think so. But also, like I said, any technology seems weird at first. Holes in your head sound terrifying right now, but it could become as normal as an ear piercing. Like ear piercings. Yeah.

[02:47:37]

Then it's something standard, like, hey, did you get fitted for Neuralink yet? Billy's only 13, he's not ready for Neuralink. We're gonna wait until he's 16. And he's like, Dad,

[02:47:47]

all my friends have it. Yeah. Come on, Dad. I want to get fitted.

[02:47:51]

And it's just like surgery. People have brain surgery today, and you take that for granted. Yeah. You're okay with it. But still, on the brain, it's just scary.

[02:48:01]

Sketchy. Can I, because I know you probably gotta go. Yeah. Can I close it out with a poem? Let's do it. I'm that guy, okay? Because I've been doing the startup, I've been suffering, I've been reading a lot of Bukowski. Oh, Bukowski poems. Do you get drunk when you read them? Of course. Some whiskey. "Roll the Dice." Vodka?

[02:48:23]

Vodka is for friends and family. When you're by yourself, it's whiskey. No, a man does not need to drink. Well, this man doesn't, quite. Whiskey is more of a relaxed-thinking drink. Vodka is, we're going crazy. Yeah, we're going dark. We're going to raid and pillage. "Roll the Dice," or "Go All the Way," by Charles Bukowski. If you're going to try, go all the way.

[02:48:54]

Otherwise, don't even start. If you're going to try, go all the way. This could mean losing girlfriends, wives, relatives, jobs, and maybe your mind. Go all the way. It could mean not eating for three or four days. It could mean freezing on a park bench. It could mean jail. It could mean derision, mockery, isolation. Isolation is the gift. All the others are a test of your endurance, of how much you really want to do it. And you'll do it, despite rejection and the worst odds.

[02:49:26]

And it will be better than anything else you can imagine. If you're going to try, go all the way. There is no other feeling like that. You will be alone with the gods, and the nights will flame with fire. Do it. Do it all the way. All the way. You will ride life straight to perfect laughter. It's the only good fight there is. I want to take a picture of you reading it. Pick up that piece of paper real quick.

[02:49:57]

What's this for? Take it, fake it for Instagram. For the fans, the people on Instagram that watch it. That was awesome. Appreciate you, brother. Thank you very much for coming in here. It's always a pleasure. We gotta do it more often. 10 more years. Yes, 10 more years. Bye, everybody.
