Transcribe your podcast
[00:00:00]

What's happening, people? Welcome back to the show. My guest today is Gurwinder Bhogal. He's a programmer and a writer. He also happens to be one of my favorite Twitter follows. He's written yet another mega thread, Exploring Human Nature, Cognitive Biases, Mental Models, Status Games, Crowd Behavior, and Social Media. And it's fantastic. So today we get to go through a ton of my favorites. Expect to learn whether cynical people are actually smarter, why people tend to find uncertain outcomes so intolerable, why you would rather lie than say what you really think, why people would rather be hated than unknown, why appearing to do good has become more important than actually doing good, and much more. This guy is so great. This must be the sixth or seventh episode, I think, that he's had on the show now, and he's just so incisive and interesting and unique with the way that he goes about things. You should check out his Substack. His Substack is great. Phenomenal writer, great speaker. I can't get enough of these ones. I hope that you take tons away from this because I had an awful lot of fun recording it. Also this Monday, Dr.

[00:01:05]

Mike Israetel, one of the best, if not the best, evidence-based training coaches on the planet, doctor of exercise science. He is a professor at Lehman College in the Bronx, and he's going to teach us over the space of two hours how to build muscle using science and research. And none of it is bro science. So yeah, huge few weeks coming up, including some massive, massive guests next month as well. So get ready for those ones. This episode is brought to you by Shopify. Shopify is the global commerce platform that helps you sell at every stage of your business. From the launch-your-online-shop stage to the first-real-life-store stage, all the way to the did-we-just-hit-a-million-orders stage, Shopify is there to help you grow. Whether you're selling scented soap or offering outdoor outfits, Shopify helps you sell everywhere, from their all-in-one e-commerce platform to their in-person POS system. Wherever and whatever you're selling, Shopify has got you covered. Shopify helps you turn browsers into buyers with the internet's best converting checkout, 36% better on average compared to other leading commerce platforms. You would be amazed at how many massive brands you love use Shopify.

[00:02:12]

Gymshark, perhaps one of the biggest independent sportswear companies in the world, uses Shopify, and if it is good enough for them, it is good enough for you. So if you are looking to get started at selling something online, Shopify is the easiest, quickest, and most convenient way to do it. Plus, you can sign up for a $1 per month trial period at shopify.com/modernwisdom, all lowercase. That's shopify.com/modernwisdom to grow your business no matter what stage you're in. Honestly, the difference in the quality of your life when you have a world-class backpack is pretty hard to describe. Nomatic make the most functional, durable, and innovative backpacks, bags, luggage, and accessories that I've ever used. Their 20-litre Travel Pack and Carry-On Classic are absolute game-changers. The amount of thought that they've put into every pouch is incredible. They're beautifully designed and not over-engineered and will literally last you a lifetime because they've got a lifetime guarantee. So if it breaks at any point, they'll give you a new one. They also offer free shipping on orders over $49 in the contiguous United States. And if you don't love your purchase, you can return or exchange your item 30 days after you've received it for any reason.

[00:03:18]

Nomatic is offering Modern Wisdom listeners 20% off their first purchase when they go to nomatic.com/modernwisdom and use the code Modern Wisdom at checkout. That's nomatic.com/modernwisdom and Modern Wisdom at checkout. This episode is brought to you by Element. I have started my morning every single day for the last three years the same way, which is with Element in water. It tastes fantastic. It reduces muscle cramps and fatigue. It optimizes your brain health. It regulates appetite, and it helps to curb cravings. It's got a science-backed electrolyte ratio of sodium, potassium, and magnesium. Super simple. There is nothing fancy going on here, but it really does work. Also, they have a no-BS, no-questions-asked refund policy, so you can buy it 100% risk-free. If you do not like it, they will give you your money back and you don't even need to return the box. That's how confident they are that you'll love it. They're the exclusive hydration partner to Team USA Weightlifting and relied on by tons of Olympic athletes and high performers in the NFL, NBA, NHL, and FBI sniper teams, plus tech leaders and everyday athletes around the world.

[00:04:23]

Head to drinklmnt.com/modernwisdom to get a free sample pack of all eight flavors with any purchase. That's drinklmnt.com/modernwisdom. But now, ladies and gentlemen, please welcome Gurwinder Bhogal. Every single time, dude, you keep releasing these mega threads with cool ideas, and I keep loving going through them. So today we're going to go through as many of your ideas and some of mine that I've made earlier, and we'll see what we can get to. First one, cynical genius illusion. Cynical people are seen as smarter, but sizable research suggests they actually tend to be dumber. Cynicism is not a sign of intelligence, but a substitute for it, a way to shield oneself from betrayal and disappointment without having to actually think.

[00:05:30]

Yeah, so this is actually based on a pretty large study, which was conducted in 2018 by Stavrova et al. Basically, what they did was they ran a series of surveys to test the hypothesis that cynical people are more intelligent. Because a lot of TV and popular culture portrays cynical people as intelligent. You see characters like Dr. House, played by Hugh Laurie in that show, or Sheldon from The Big Bang Theory. A lot of these characters tend to be very cynical, very pessimistic, but also geniuses. It's become a bit of a stereotype. These researchers decided to test this by actually doing a massive study which involved 200,000 people in 30 different countries. It was a series of surveys, firstly to test their cynicism, and secondly to test their competence, essentially their IQ. It was interesting because they actually found the opposite of what a lot of people believe, which is that cynical people actually tend to be lower IQ, or at least lower in their performance on cognitive tests. It's actually very interesting because they posit as an explanation for this the idea that cynicism is basically an evolutionary heuristic to save people from having to think.

[00:07:02]

It's basically a way to protect yourself against betrayal, to protect yourself against any form of treachery, including the treachery of your own expectations. I can see how this would have probably been a useful heuristic, say, about 100,000 years ago. In the study, they describe it as the better-safe-than-sorry heuristic. It's this idea that, for instance, if you're out there and you're in a low information environment, so let's go 100,000 years back into the past. We don't have the Internet, we don't have TV, we don't have books, we don't have real knowledge. We're in a low information environment. We're in the middle of a forest, and we see this alien-looking fruit on a tree, and we have the choice whether we can eat it or not eat it. And we don't know what this fruit is. We've got no books, we've got no understanding of it. We've never seen it before. So in that situation, the best thing to do is to default to believing that it's dangerous. Because obviously, one fruit, if you eat it and it turns out to be harmless, is not going to benefit you that much. But if you eat that fruit and it turns out to be poisonous, that's the end.

[00:08:16]

So obviously, from that point of view, it makes sense to have this pessimistic, risk-averse approach to life. Now, the thing is, obviously the world now is very, very different from the world that we had then. And yet we retain the same basic psychology, the same biology. We are averse to risk, and that involves being distrusting of other human beings because we don't know who these people are- That's a question now.

[00:08:43]

One thing that I'm trying to bifurcate here: what's the difference between cynicism and conservatism or risk aversion or something like that?

[00:08:52]

So cynicism is a pessimism, but it's a pessimism with respect to other people's intentions. So it's believing that people are always doing things for the worst possible reasons. You can usually summarize it as saying that people are only in it for themselves. So basically, you can't trust people. Obviously, some conservatism could be a function of cynicism, but I think that conservatism is much broader than that, and it takes into account many other different heuristics. The thing with cynicism is it's very low cognitive effort. It doesn't require you to really expend much mental effort to do anything. All you've got to do is not trust something and basically just say to yourself, Oh, wow, I shouldn't do this because something bad might happen. Our brains are very good at finding reasons not to do something. There's this idea where if you have a hole in your roof, you could reason to yourself that on a sunny day you don't need to repair that hole in your roof, so you just don't do it. It'd be like, Oh, what's the point? I don't need to do it. It's sunny outside. It's just letting sunshine into my house.

[00:10:09]

It's actually a good thing. On the other hand, if it's raining, you could also say, Oh, wow, it's raining. So I don't want to get wet, so I won't go out. I might slip from the ladder and fall. So your brain is very good at inventing reasons not to do things. We have this natural cynicism, and it actually takes mental effort to overcome that. In the study, they actually found that people with higher IQs actually tend to be more trusting, which is quite an unusual thing. You would expect it to be the other way around. You'd expect highly intelligent people to be less trusting, but they're actually more trusting. This is because they tend to be... They're not necessarily better at determining whether they should trust someone or not, but they're better at determining whether cynicism is warranted or not, which is slightly different.

[00:10:54]

Why is there this presumption that hoping for the best or believing in people is naive, and that smart people would never be naive? One of the worst things that you could do is have the wool pulled over your eyes. It's seen as juvenile or innocent or unsophisticated, and the converse of that is that cynicism or skepticism is more mature intellectually in some way.

[00:11:22]

Yeah. I mean, this is a very popular misconception, I think. And that's why cynicism is very popular, because it gives that illusion. Because obviously, if you take no risks in life, then you're not going to fail ever at anything because you didn't put yourself out there. You have this idea that I've heard you speak about called the cynicism safety blanket, which I think really jives with this very well, because obviously cynicism is a form of protection. It's like this front that you put up which protects you from any risk-taking. If you don't take any risks, if you don't go out there and if you don't try to succeed at anything, then you won't fail at anything. It's basically a way to guard yourself against any form of failure. That's why I think people who maybe don't want to expend mental effort or emotional effort, because there's an emotional aspect to this as well, will instead just choose not to take the risk. It's much easier to just say, Oh, I'm not going to take a risk because everything's gone to shit. Everybody's out for themselves. I'm not going to trust this person. I'm not going to love this person because they might betray me.

[00:12:33]

They might not return the affection. I'm not going to go out and try this new thing because I might fail. It's much easier just to not do any of that stuff. Then you can just say to yourself, Oh, well, I've never failed. It's like an ego trip that you put yourself on. But the thing is, the truly intelligent people will say to themselves, Well, look, yeah, I might fail. But at the end of the day, it's worth trying, because at the end of the day, if you don't try, you'll never achieve anything. You're not actually going to better yourself. You're just going to remain in the same situation. And even failure can be good. If you're intelligent, failure can be good because you learn from failure. In fact, failure is pretty much the only thing we learn from. It's the only lesson that we learn from. We don't learn when we succeed. We don't learn when we're happy. So intelligent people will tend to put themselves out there. They will risk engaging in ambitious endeavors because they know that at the end of the day, even if they fail at that endeavor, they're actually still improving their station because they're improving their knowledge.

[00:13:37]

They're learning from it. I think that's ultimately what it comes down to: if you don't have a high IQ, you can feign a high IQ by criticizing other people and their efforts and saying, Oh, look at this fool, he failed. Whereas you'll never fail, so you always have that: I've never failed. But then you've never actually succeeded either. I think it's a guard. It's an emotional guard, and it's an intellectual guard.

[00:14:07]

Segal's law. A man with a watch knows what time it is. A man with two watches is never sure. Ancient societies followed a single narrative. Modern societies are cacophonies of competing narratives. Without trust, more data doesn't make us more informed but more confused.

[00:14:27]

If you talk to a lot of these disinformation academics, people who study disinformation and stuff, they'll often say that there's a problem of people not getting enough information. There's this whole idea of low information voters and stuff. That's what people euphemistically tend to call people that they regard as stupid: low information. But the thing is, the problem in society at the moment is not actually a lack of information. It's a lack of trust. That's the bottleneck that is stopping progress. Because look, we have more information than we've ever had in the whole of human history. I think I read somewhere that every year more information is produced than in all of the preceding years of human history. That's how much information is exploding.

[00:15:21]

It's the most exponential of exponentials.

[00:15:23]

Yeah. So information is not the problem. We have more than enough information. The thing that's holding people back is a lack of trust. I think this got particularly bad since the pandemic, because obviously our mainstream institutions, which we rely on to navigate the world for us, showed that they were flawed during the pandemic. For instance, at the beginning of the pandemic, the World Health Organization said that COVID is not airborne. If you go on Twitter and you look at their page, the tweet's still up which says that COVID is not airborne. But we very quickly found out that COVID was airborne, and that was actually dangerous because people, obviously, were lulled into a false sense of security. That was obviously a big problem. Then we also had the problem with the masks. How efficacious are they? Then there was the problem of vaccines, how efficacious are vaccines? What are the side effects? Then, of course, there was the lab leak hypothesis, and that was instantly dismissed as a conspiracy theory, despite the fact that there is at least as good an argument that COVID escaped from a lab as that it was the result of a natural spillover.

[00:16:33]

So these events, I think, really destroyed trust in institutions. But I mean, obviously, this problem began before COVID. It's just COVID exacerbated it a lot. And obviously, things have not gotten any better since then. We've seen, for instance, the whole Harvard scandal, the plagiarism scandal. This year, we've seen many big academic studies which have been shown to be completely bunk. There's a famous professor, his name escapes me, but he did a series of studies about systemic racism, in which he basically showed that systemic racism is a thing. This was picked up by the New York Times, the Washington Post, to basically say, Hey, look, systemic racism is a real thing. Look at these disparities in treatment of white people and black people. That was all shown to be complete nonsense. It was all fabricated. All the data was fabricated. Dan Ariely, who's a famous psychologist, his work was also found to be fabricated. Ironically, there was a Harvard professor who was studying the faking of information whose own work ended up being faked. This year has been really bad for academia. There's been a massive drop in trust. If you look at any poll regarding trust in the media, you see a gradual downward slope.

[00:17:49]

You see a decline on both sides of the aisle, but particularly amongst people on the right, because obviously there's this idea that most of the mainstream institutions in the West lean left. But even the left have less trust over time in institutions. Obviously, this has gotten a lot worse over the past few years. The problem with trust is it's like a tree, where it takes a long period of time of nourishment and light, of seeing what's going on, to actually grow it. But it can be chopped down in a day. It takes years for a tree to grow, but it can be chopped down in a single day. Institutions, over many years, they tried to build trust with the public. But a few real bad instances of betrayal of that trust have now caused that trust to nosedive. What's interesting here is this dovetails with what we were talking about previously about the cynical genius illusion, because a lack of trust leads to more cynicism. The cynicism stops people from doing things. People become more risk averse. They become less likely to form partnerships with people, even to form relationships with people. There's a lot less innovation in a sense, because people distrust a lot of things.

[00:19:22]

You see it in our daily lives with the ways... Again, I'm not saying that this distrust is unwarranted. A lot of it is warranted. If you look at what's going on in America, in San Francisco, in places like that, where you see the government in San Francisco had an opportunity to clean up the streets, to take the fentanyl users off the streets, to house them in a decent place, and to try and give them help and to clean up the streets generally. And they didn't do it. They only did it when the Chinese leader, Xi Jinping, came to visit. They thought, Okay, now we've definitely got to do something about it. So that just showed that they just didn't care. Obviously, when there's a foreign leader coming to visit, then they suddenly clean up the streets. So this distrust isn't necessarily unwarranted. But what's happened is the result of this is that people tend to... No matter how much information you give them, no matter how much information the World Health Organization or governments or corporations even try to give people, the fact is that there's this paucity of trust. And to be honest, I don't think that this trust is ever going to be fully restored.

[00:20:34]

I personally don't trust institutions anymore. I find that it's easier to trust individuals now. That's what I do. The reason for this is, although there are a lot of low-integrity individuals, there are also a lot of extremely high-integrity individuals. It's much easier to gauge whether an individual is high-integrity than whether an institution is high-integrity. In fact, most institutions tend to fall to the level of their lowest-integrity members. This is because corrupt people obviously tend to rise high in institutions, because they tend to be more ruthless, they tend to be more dishonest, they tend to play the game. The dishonest people rise to the top in institutions.

[00:21:18]

People who are trustworthy on their own, in solitude, also become untrustworthy due to negligence or fear or compliance or the Abilene paradox. All of those things happen. So you get honest individuals and untrustworthy, highly falsified groups, even if the constituent parts are trustworthy.

[00:21:42]

Yeah, that's it. It all comes down to the perverse incentive structures that institutions have. They tend to be these closed systems of status games. They also tend to be chasing money. And a lot of the time, these people are playing against each other for status. So it leads to purity spirals, for instance. A lot of these perverse incentives ensure that institutions can never really rise above their worst members. Whereas individuals are a lot more variable. Not every individual is more trustworthy than every institution, but the high-integrity individuals are a lot more trustworthy than high-integrity institutions. So I tend to trust individuals a lot more. The way that I learn whether I can trust someone or not: I have a few heuristics, but for instance, one of them would be, are they willing to publicly admit when they get things wrong? Because it takes integrity to admit when you're wrong, but it takes a huge amount of integrity to do it publicly. If you can do that, and that's a very rare skill, it takes a huge amount of strength to be able to go out there and say, Okay, I was wrong.

[00:22:53]

And so that, for me, is a very good indicator that somebody is high integrity. It shows that they value the truth more than their own ego.

[00:22:59]

Do you know what? One of my favorite heuristics for this is: when was the last time that the person you're thinking about surprised you with one of their takes? If they are predictable with the things that they do, if you know one of their views and from it you can accurately predict everything else that they believe, they're probably not a serious thinker. They've just absorbed some ideology wholesale. What you want is someone who you don't always necessarily agree with, but who you definitely can't predict. Obviously, most people do fall into some grouping of ideologies. That's why we tend to have birds of a feather. But yeah, when was the last time that this person surprised you with something that they commented about?

[00:23:39]

Yeah, that's definitely one of mine as well, because it shows that somebody's willing to think for themselves rather than subscribe to a total package ideology, which just gives you everything. Tells you what to think about abortion, tells you what to think about gun control, tells you what to think about free speech. All of these things are generally unrelated. But if somebody's got all of these predictable opinions, it shows you that they're getting it all wholesale from someone else.

[00:24:03]

There's something that I think is associated with this, another one of yours, ambiguity aversion. People tend to find uncertain outcomes less tolerable than bad outcomes. De Berker et al., 2016, found that test participants who were told they had a small chance of receiving an electric shock exhibited much higher stress levels than those who knew they'd certainly receive an electric shock.

[00:24:26]

Yeah, I mean, this explains so much. I mean, everything from the world of investing. It explains market volatility, but it also explains things at a personal level, where one thing I've found in my personal life is that things are never as bad as I think that they're going to be, pretty much. It's a very simple thing, but I find that the anxiety of trying to anticipate what's going to happen is often worse than any actual eventuality, even the worst one. So for instance, if I were one of my old selves from, say, 10 years ago, I might be nervous having this conversation with you right now, knowing that a lot of people are listening. I would probably be playing in my head a lot of ways it could go wrong. I might say the wrong word. I might say something really bad. I might say the N-word or something accidentally. That's it, I'd think about the worst possible scenario, and that would really give me nightmares. But then I would find that even if the worst did happen, it probably wouldn't actually be that bad. Not that I'm actually going to say the N-word, but it was just...

[00:25:36]

Things are always worse in your mind because your mind is more terrifying than reality. Your imagination is more terrifying than reality. It's a more skilled scaremonger than reality because it knows your worst fears. I think when you're uncertain, you can often imagine extremely bad outcomes, because in that uncertainty, that's where your imagination runs riot. That's one aspect of it. With regards to the ambiguity aversion that you're talking about with the electric shocks, again, it's managing the anxiety of uncertainty that takes a bigger toll on somebody than actually just resigning themselves to the worst outcome. I've found that, again, if I know that something bad is going to happen, it gives me a sense of peace of mind because I know what to predict, I know what to expect. I don't need to expend stress and mental effort in trying to find a way out of it, trying to predict what's going to happen. Because trying to predict what's going to happen is a very stressful thing to do. It basically requires you to consider an extremely wide swath of possibilities, and our minds are just not very good at doing that.

[00:27:06]

If we have just one path ahead of us, even if that's a bad path, even if it's got a ditch at the end of it, it's much easier to just continue along that path and say, Okay, so when it happens, I'll deal with it, than it is to say, Okay, which of these paths has got a ditch at the end? How many steps away is it? Every step you take, you have to be worried that you might fall down that ditch. So it's the stress of having to navigate possibility which ends up causing more mental discomfort than the actual bad outcome itself.

[00:27:41]

Do you think that ambiguity aversion explains some of the conspiratorial thinking, doomsday cultish fads that we've seen, that it actually closes down the potential optionality of the world to one thing, one bad thing, but it gives you a sense of certainty as opposed to leaving you open to ambiguity?

[00:28:03]

Yeah, absolutely, 100%, because I think there is one thing that's scarier than a conspiracy of people plotting everything, and that is no conspiracy of people plotting everything. That everything is just rudderless. Society is rudderless, basically. Nobody knows what they're doing. Everybody is just trying to navigate the world as best as they can. There is no overarching plan. That's scary. Also, it leads to uncertainty. When you don't know what to expect, when you can't blame your problems on a single thing, that leaves, again, so many paths ahead of you that you just become overwhelmed. The stress of trying to work out which path is the true one, that is an underrated form of stress. Whereas the stress of knowing that there's a group of bad people out there who are plotting everything, that actually isn't really stressful at all. In fact, it's actually quite interesting, because then you want to go online and you want to learn more about it.

[00:29:03]

There's a degree of certainty about it. Yeah, I came up with this idea called anxiety cost. So in the same way as you have opportunity cost, it's the amount of time that you spend thinking about the thing that you could have gotten rid of had you just done the thing. When you wake up in the morning, you need to meditate, walk the dog, go to work. The longer that it takes to meditate, the more times you have to have the thought, I still need to meditate today. That is a very effortful thing to do. And this is like a protracted version of that. There was this from Matthew Syed, this is back in 2020. Psychologists have conducted experiments to shed light on why people lose or at least suspend rationality. One experiment asked people to imagine going to a doctor to hear an uncertain medical diagnosis. Such people were significantly more likely to express the belief that God was in control of their lives. Another asked participants to imagine a time of deep uncertainty when they feared for their jobs or the health of their children. They were far more likely to see a pattern in meaningless static or to infer that two random events were connected.

[00:30:00]

This is such a common finding that psychologists have given it a name, compensatory control. When we feel uncertain, when randomness intrudes upon our lives, we respond by reintroducing order in some other way. Superstitions and conspiracy theories speak to this need. It is not easy to accept that important events are shaped by random forces. This is why, for some, it makes more sense to believe that we are threatened by the grand plans of malign scientists than the chance mutation of a silly little microbe.

[00:30:30]

Yeah, absolutely. I think it explains so much about why we dramatize reality. We tend to turn events into stories because it's much more orderly. If you try to comprehend the world as it actually is, your mind will be overwhelmed. There are so many variables going on all over the world that we have to reduce things down to simple patterns, which we call stories, in which we basically collapse the web of causality down to a single thread. That makes life a lot easier to comprehend, even if it's not completely true what we believe, but it's true enough that we can get on with our lives and just not have to worry about it. So much of our brains- What you're looking for with any sense-making, truth-making system is, I want to be able to move through the world with reliable predictive accuracy of what's going to happen.

[00:31:35]

But really what's deeper than that is, I just don't want to expend that much mental effort trying to work out what's going to happen. And the difference between those two allows this to slip in, which is what mono-thinking is. If every single problem in the world is because of capitalism or climate change or the libtards or whatever, if every single problem has the same solution, that's because the demand for answers outstrips your ability to supply them. So you just retrofit one answer to all questions.

[00:32:07]

Yeah, absolutely. Again, it's a cognitive... It's an energy-saving mechanism that people engage in. I think it explains so much of the current landscape, the current online landscape in particular. It explains tribalism. It's much easier just to... For instance, I saw this really good tweet by Michael Malice. I think he's been on your show.

[00:32:35]

Many times, unfortunately. Yeah.

[00:32:38]

I haven't got it in front of me, but he wrote something in these terms. He said, Most people don't navigate the world by a true and false filter, but by an us and them filter. It's like this true and false is too much of a cognitive demand. Trying to work out what's true and false is just way too much effort for most people. It requires statistical analysis. It requires looking at hard data. It requires suppressing your own emotions. There's so much that you need to do in order to actually work out what's true. Whereas if you just adopt a very simple us versus them heuristic, it's so much easier. You can still get on in your life, because if you have an us versus them strategy, then you're going to be in the same boat with a group of other people who will help you navigate the world and they'll become your allies. It's just so much less cognitively demanding to do that. Pretty much everything about our mental architecture is configured to this system because that's how we evolved. When we were hunter-gatherers, we lived in tribes and we engaged in tribal warfare.

[00:33:56]

Everything that we've just been talking about, this pattern matching and everything, is all in the service of tribalism, ultimately. We will see the best in what our allies say, and we will see the worst in what our enemies say. We'll interpret it in the worst possible way. We will see signs in the clouds that portend that God is on our side or whatever. He's on our side and he hates the enemies. Whatever it is, we'll find patterns that justify an us versus them attitude naturally. That's what comes naturally to us. It also explains why we see things in terms of drama rather than data. I think this was one of my other concepts, the one about compassion fade. It's this idea... There were experiments that were conducted in which people basically engaged in these appeals for charity. What they did is it was like a campaign for funding for charity. They had two different ways of doing it. One way was based on presenting famine statistics and hard data, and the other was based on presenting the story of a single starving girl. The people tended to donate a lot more to the girl.

[00:35:20]

And the reason for this is that the hard data is alien to the human brain. And statistics is just something that we're not... Our brains are not formatted for that data analysis. We're just not... It's too much effort. It requires too many calories and too much time. So what our brains do is, again, we collapse the web of causality. We collapse all the variables into a single thread, a single line, a single linear vector, which just has a beginning, a middle, and an end. The girl is starving. She needs your help. You give her your help. She is no longer starving, and therefore you've saved a girl, and then that's it, and then you're a good person. That's how we collapse the whole world down to these single narrative threads. Because obviously, we think in the language of story, so if you want to convince people, that's how you've got to appeal to people. Statistics aren't going to help. You can rattle off all the numbers you want. The bigger they are, the more alien they are, and the less they'll be believed, the less they'll be really comprehended.

[00:36:26]

You get the story of a single girl, and you present her story in a narrative sequence in the way that people tell stories. You could use a three-act structure, you could use the hero's journey, whatever system you want. But as long as it's a narrative thread, a single narrative thread, you'll reach a lot more people.

[00:36:45]

Yeah, we're not donating a million times more money or feeling a million times worse when we hear the story of a million kids compared with the story of the single kid. In fact, it's probably the opposite. That's what pulls on our heartstrings, the personification of data into stories. And you can see this, the charity example is perfectly right. They are split testing into oblivion what the most effective way to pull on people's heartstrings is. They know. So if you want to find out how to motivate people's behavior, just watch a charity advert, because they're not doing the thing that doesn't motivate behavior. They're doing precisely the thing that motivates behavior. And they'll have had behavioral scientists, behavioral economics guys, they'll have had Rory Sutherland in there, and the copywriters and all the rest of it, split testing everything. And that's what they've arrived at. Right, next one. Preference falsification. If people are afraid to say what they really think, they will instead lie. Therefore, punishing speech, whether by taking offense or by threatening censorship, is ultimately a request to be deceived.

[00:37:49]

Yeah. I think this is another reason why there's actually a distrust in institutions, because they've tended to react to criticism by essentially censoring people. But censorship is based on a very outdated way of operating. It's based on a very outdated information architecture. Censorship would have worked very well 100 years ago when there was a centralized authority which passed information down to everybody, whether it was via printed leaflets or television screens. Information was very centralized. But that system no longer works, because the reason it worked in the past was that the authorities provided a single system of information. For instance, think about the TV. In the UK, the TV tended to only have four channels originally, when I was very young. Those four channels all tended to have the same narrative. If you wanted to censor certain information, you could just basically pass a law, because this was broadcast media, so they were beholden to government interventions. You could pass a law saying that the four channels are not allowed to talk about this. Therefore, now none of that information is going to get beamed into people's homes.

[00:39:16]

Now, nobody can ever know what that information was. But that centralized information structure no longer exists. All information in the West, at least, is decentralized. Or it's decentralisable in the sense that somebody can pick up on anything now and make it go viral. Now, censorship doesn't work. Now, what happens is people are well aware of what's being censored. You have this thing, obviously, the Streisand effect, where when people learn what's being censored, then they want to know what that thing is even more. In the past, the further back into the past we go, the less likely the Streisand effect was, because people wouldn't even know what was being censored, since information was centralized. But now, because information is everywhere, that information is going to leak out. People are going to know what's being censored. People are going to know. Even if they don't know the precise thing that's being censored, they're going to know what information is being censored from them, because somebody's going to spill the beans somewhere because of how interconnected everything is. All it takes is just one person to spill the beans, and then that's going to go viral. Everybody's going to find out about it.

[00:40:31]

We see this repeatedly. For instance, with the lab leak, going back to the lab leak hypothesis. Immediately, as soon as Facebook and Twitter and everybody else tried to stifle that story, it went viral and everybody was talking about it, because it just isn't possible. Hunter Biden's laptop, that's another perfect example. There are many other examples. As soon as one organization tries to censor something, other individuals will immediately raise the alarm. As soon as that happens, everybody now wants to know what that thing was that was censored. They want to know why it was withheld from them. This is this thing called reactance, sometimes called the backfire effect, where when you withhold, when you say people can't have something, they become even more adamant that they want it. They want it even more. This leads to essentially a backfire. That's what it's called, the backfire effect. What happens is that people will then decide that, hang on a second, if this is being withheld from me, then it's going to obviously... I'm veering off a little bit from the original thing. That's one aspect of it. But another aspect of this whole censorship thing is that when people realize that they can't say certain things, they instead will lie, and they're not going to change their beliefs.

[00:41:58]

Like I said, the backfire effect means that people don't become... If you censor people, they're not going to become less likely to believe that thing. They're going to become more likely to believe that thing. The only thing that's going to change is that if they know that they're going to get banned for saying something, they'll just lie. But it's not going to change their thoughts. In fact, the opposite is happening. And so it's a counterproductive thing to do in the digital age. That's why censorship just doesn't work in the digital age. Because although you can control what people say online, you can't control what they think. In fact, what you do is you make people more adamant to think what they want to think.

[00:42:33]

They become more entrenched in their beliefs. Well, you taught me a couple of episodes ago about the chilling effect. When punishment for what people say becomes widespread, people stop saying what they really think and instead say whatever is needed to thrive in the social environment. Thus, limits on speech become limits on sincerity. It seems very similar to preference falsification. Is there a distinction between the two? Where is the difference?

[00:42:55]

Yeah, so they are essentially the same thing. Maybe the difference would be something of scale, where preference falsification really refers more to individual actions. Then you have things like the spiral of silence, which is another way of saying the same thing. The spiral of silence is the cumulative effect of preference falsification. So what happens is that certain ideas become more and more verboten over time. When they become verboten, then people don't want to be the first person.

[00:43:23]

What's that word verboten?

[00:43:25]

It's just a fancy way of saying forbidden.

[00:43:27]

Okay, that's cool. Verboten.

[00:43:29]

Yeah. It's German. For some reason, I don't know why I said verboten, but I could have just said forbidden. That's nice. I like it.

[00:43:33]

I prefer it.

[00:43:35]

Yeah. But what happens is that it leads to a spiral of silence. The more that an idea becomes unsayable, the less likely people are to say it, and so the more it becomes unsayable.

[00:43:51]

It becomes a cycle.

[00:43:54]

It's self-reinforcing. I just don't know what people are thinking, these organizations, when they think that they can censor information in the digital age. It just very, very rarely works. It might work in a place like China, but even in China, where the government has absolute control, they've got the great firewall, what they call the great firewall. But even that is not enough now. There have been cases now where information has gone viral that the CCP didn't want to go viral, because they were trying to stifle it, even though they do all they can. It just isn't possible because of how fast information travels in the digital age and because of the number of connections between nodes. It's just not possible to use censorship anymore. So any organization that's trying to use censorship, they're using 20th century tactics against 21st century information systems. It just doesn't work. Again, it leads to more distrust of institutions. This goes back to this whole thing that we're talking about with the problem of trust in society, and it leads to more cynicism. All of it, between the backfire effect and the whole cynical thing, it just makes things worse.

[00:45:11]

I don't know when institutions are going to learn this, but eventually they will, hopefully.

[00:45:16]

You end up with this game where they chase their own tail a little bit. So for instance, you see this with YouTube channels. So a YouTube channel will begin to struggle with plays, and they won't be too sure why. And everybody on YouTube, when it comes to the way that they frame their episodes and what they do, both content and framing, has an Overton window that they exist within, and they're not prepared, usually, to go beyond a particular level of boring, because people aren't going to click. There's usually an upper bound of clickbait-ness that they're also not prepared to go past, because that seems hacky. And what will happen is they will begin to skew more and more toward the clickbait side. They will use more limbic hijack words, war, battle, imagery, the whole MrBeastification, the faces. They'll lean more down that route. But the problem that you have as you begin to pull that lever more and more to chase ever-declining plays is that your audience becomes increasingly desensitized to the subtlety that you want them to come back to. So it's a one-way street. When you start to pull that rip cord, like Russell Brand's channel, regardless of what you think about Russell Brand, what he says, his content, I would challenge anybody to say that the framing around his YouTube channel is fair and gentle and reassuring.

[00:46:35]

As someone that talks a lot about love and you awakening wonders and all this stuff, it's like, They are coming for your kids. You won't believe they did WHAT? It's like the most limbic hijack. I'm pretty sure it was his channel that did that image of the Hawaiian laser beam hitting the roof of a thing. I'm pretty sure that either he created it or his team created it or used something like this. Anyway, my point being that you chase that limbic hijack game, and it makes people become increasingly desensitized to the things that you can say, in the same way as institutions that feel like they're losing control increasingly apply more rigorous, higher levels of scrutiny, higher levels of control. And what happens? It drives the trust down ever more. You can't dictate trust top down. It has to be emergent. It has to come out bottom up. But they're chasing more and more. Oh, my God, we need to do more because the trust is declining. And that means that we need to use ever more totalitarian techniques to do this. And it doesn't work.

[00:47:38]

And the fact that they think that it's going to work actually makes it even harder to trust them, because they're just so wrong about that. So you ask yourself, what else are they wrong about? They've got to be wrong about so many other things. If they don't understand this basic facet of human psychology, then they're pretty much hard to trust on anything else.

[00:47:57]

Yeah. Herostratic fame. Many people would rather be hated than unknown. In ancient Greece, Herostratus burned down the Temple of Artemis purely so he'd be remembered. Now we have nuisance influencers who stream themselves committing crimes and harassing people purely for clout.

[00:48:20]

Yeah, so this has become a serious problem now, I think. I don't know if you know who Jack Doherty is.

[00:48:27]

I do. This world of IRL streamers, and Jack Doherty, tell me when I get this wrong.

[00:48:34]

There are a few of them.

[00:48:35]

He appears to start fights in person and has massive bouncer/security guys with him, most of whom seem to be black. And then they will sort out whatever the issue is by punching or choking out the person that Jack just started some beef with. And then the Internet goes completely crazy by saying this dude started on somebody, then got his 6'7" behemoth of a security guy to step in and smash some kid in the face. And now he's getting paid millions of dollars and has a Lambo and lives in LA or something.

[00:49:12]

Exactly. Yeah. And he's not the only one. I mean, this is a whole trend. So there's people like Mizzy, for instance. You probably know about Mizzy as well, who is the guy who was going into libraries and ripping up the books whilst filming the librarians to see what they would do. And then you have Johnny Somali, who would go out and start harassing people in the streets and recording their reactions. He actually went to Japan, and it's quite interesting because first he got knocked out. He got hit in the face and knocked out because in Japan, they don't screw around. Then he got arrested, and now he's in jail. At least last I heard, he was in jail. He's in jail in Japan, right? So occasionally there is comeuppance. But most of the time, there is no comeuppance for these influencers. They just go out there and they harass people in the streets and they record it because they know, again, this limbic hijacking, right? They know that by appealing to the worst, basest impulses of the human brain, they can get a lot of eyeballs. And so they just basically... There's a lot of pressure on young people to have a lot of followers on social media, for instance.

[00:50:17]

They want to be popular. Everybody wants to be the cool kid. One way to get a large following online, if you don't have other talents, is to just be an asshole. Just be an asshole and film people around you. People will get hate followers. They'll get hate audiences who watch them simply to hate on them. I think people like Mizzy and Jack Doherty have fallen into this strategy. I think Jack Doherty, originally, he just did some other lifestyle stuff, but he obviously found this niche and he thought, Wow, I'm making way more money doing this. Now he's a millionaire. I mean, he's got a lot of money. He's got a very glamorous lifestyle. At least it appears glamorous. If you look at his Instagram account, he's surrounded by fancy cars and beautiful women and all this stuff. He portrays this lifestyle of success. But really, when you look at what he does to earn that success now, he just goes out there and he just makes life miserable for everybody. This is bad because this is creating, again, a very perverse incentive structure, fueled by TikTok, again.

[00:51:23]

The Chinese government probably knows that they're doing this, and that they're allowing these nuisance influencers to get a lot of views on TikTok because they know that it's bad for America, and it's bad for the UK, and it's bad for the West in general. But yeah, it's a race to the bottom now, where you've got a lot of people competing to be the biggest nuisance, to be the worst possible human being. People who formerly were pranksters, people like FouseyTube. You probably know about FouseyTube.

[00:51:55]

He basically had a full-on psychological break on camera, got arrested by Miami police, called the cops on himself, pretended that someone had a knife or a gun or something. Yeah, wild.

[00:52:10]

Exactly. And the crazy thing is that we don't even know if this was genuine or not. This could have all been part of, again, just being a nuisance. It might be real, it might not. We don't know, because the line between real and fiction is blurring now. For instance, Mizzy said that all of his pranks were planned and stuff. But it's hard to believe that he would go into, say, Asda, he'd go into a superstore, and he would start riding on the disabled trolley things that they have and just smashing shelves and stuff, and that the supermarkets would actually allow them to do that. I don't believe that. But a lot of them will say stuff like that to defend themselves if they get into a lot of hot water. Ultimately, what this does is create really bad incentives for kids. Because if you think about it, at the dawn of YouTube, for instance, in order to get a big following on YouTube, you tended to have to do something that was extraordinary in some way, and extraordinary in a positive sense. You tended to have to be talented at something.

[00:53:13]

The first big YouTubers tended to be musicians or athletes or people who had some skill. But very soon people realized that you could actually develop just as big of a following by having zero talent and just being a nuisance, just being an asshole to people. Once that happened, this nuisance influencing went viral. It's essentially a race to the bottom now, where people are competing to be the worst possible human being, which really sets a bad precedent. It sets bad incentives for other young kids watching this, because when the kids watch it, they say, Oh, I want to be like Mizzy. I want to be like Jack Doherty. I want to have all these fancy girls, all these fancy cars. I want to be like that. I'm going to learn how to be an insufferable human being.

[00:54:03]

That person is bringing no value to life, and they're getting rewarded for it.

[00:54:08]

People respond to incentives.

[00:54:11]

Yeah, they respond to incentives. And if you say, rather than working really hard at a thing consistently for a long period of time and accumulating skills and making yourself worthwhile, the bottom of the brain... It's the reason, I think, in part, that there is some distaste against silver-spoon dynasty children and OnlyFans influencers, that there's something unfair. It feels like, well, you got that, but you didn't work for it. And in a meritocratic system, which is what we've got, that's always going to get people's backs up. I have to work harder than this person to get less. How can that be fair? Oh, well, it's because they were given a privilege that I didn't get. That seems unfair. It's because they're prepared to compromise their morals in some way that I wouldn't. Therefore, I am somehow superior to them. There's this Puritan nobility that gets associated with it. But when we're talking about nuisance influencers, which I think is a phenomenal term I've never heard before. And dude, that first sentence that you put, many people would rather be hated than unknown. Just brilliant. And I know that you've got two books in the works, one of which you may have submitted, but I can't wait for both of them, man.

[00:55:22]

All of the time, I watch very similar stuff to what you watch, and yet what you're able to pull out of it is significantly more in-depth than me. So I'm very, very excited for what you've got coming up.

[00:55:35]

Thank you.

[00:55:36]

Yeah. So I've got one. Here's one that I made earlier. So toxic compassion. In a world where our opinions have been separated from our deeds, appearing to do good has become more important than actually doing good. The prioritization of short-term emotional comfort over actual long-term flourishing motivates people to say the things which make them appear caring and empathetic, even if they result in negative outcomes over time. And this is seen most obviously in support for the body positivity movement. Rather than make someone feel uncomfortable about their weight, you would say that weight has no bearing on health, even if that discourages people from losing weight, which results in worse outcomes over the long term. The same thing could be seen with defund the police, that rather than talk about some of the challenges that are faced by different groups when it comes to policing, you say that all police are mistreating minorities. Therefore, the police should be withdrawn, even if the actual outcome over the long term is poorer policing and more negative outcomes for those precise minorities that you were looking to protect in the first place.

[00:56:49]

Yeah, absolutely. So this brings together quite a few very interesting and informative ideas, one of which would be luxury beliefs, which I think you alluded to at the end there. Also my idea of the opinion pageant, where the whole thing is that social media has caused us to overvalue opinions as a gauge of character. We're judged more by what we say than by what we do. This goes to what you were saying initially about how it's all about looking good rather than doing good, which, again, echoes what Elon Musk said, I think, in a talk with the New York Times a couple of weeks ago, where he just expressed a bit of outrage at how corporations are trying to look good but not actually doing good. I think this is one of the key concepts to understand the digital age, where because we now have an image-oriented economy, your success in life is based on how you appear to others now more than ever, because social media is where people come to promote their stuff, whether you're a corporation, whether you're a politician, whether you're an influencer. Everybody's on social media trying to promote themselves, trying to show why their brand is the brand that you should buy into.

[00:58:13]

And part of this is this whole social game, this new social game. I mean, obviously, there's always been a social game as long as there's been a society, but it's been pushed to the forefront by the fact that the vast majority of our lives now are spent trying to appear a certain way to people on social media. It really explains so much of everything from cancelation to the kinds of politics that we have now, polarization and even disinformation. All of these things really ultimately come down to people trying to look as good as they can rather than trying to do as good as they can. People are peddling theories that are going to hijack people's brains and scaremonger them, or they're trying to convince people that they're morally superior, so they'll post their luxury beliefs online. I think that it's hard to really work out where we go from here, where everything is image-oriented and things are becoming more so. I think that ultimately there may be some... I mean, we're seeing it already, where there's a backlash, people just going against looking good, people counter-signaling.

[00:59:47]

There's been a rise of counter-signaling. I think that Trump's election in 2016 was a form of counter-signaling, where people elected somebody outwardly obnoxious, somebody who made no effort to even appear good, or at least did it in a really obnoxious, overbearing, cartoonish way, almost as a parody of the society that we're living in. I think that was counter-signaling. But I think that, yeah, there's been the rise of vice signaling as a response to this prevalence of virtue signaling. Vice signaling is where people will outwardly say things that they know are going to upset people. You could even say that this nuisance influencing is a form of vice signaling, where people are like, I don't care. I'm over and above the morality game. I don't have to appear good. I can just be the worst person possible. People like Andrew Tate, for instance, who have developed massive followings by saying the opposite of what is considered good by the majority of society. You see it even with Elon Musk. Elon Musk is counter-signaling very, very strongly on Twitter a lot of the time, where he will say things that are the complete opposite of what we've been taught we should say by the New York Times, by the Washington Post, by the World Health Organization, all these other mainstream organizations.

[01:01:08]

They tell us that we should be expressing these kinds of beliefs, we should be portraying this persona, this is how we should be to be a good person. Then you've got these rogues like Elon, like Donald Trump, like Andrew Tate, who are basically saying, No, screw that. Let's do the opposite of what they say. That's a backlash. But in a strange sense, this vice signaling is itself a form of virtue signaling, because it is signaling to others that you are way above all of this silly bickering that people are engaging in.

[01:01:40]

It's the same reason why Yeezys have got progressively uglier over time. And if you actually look at what counts for a lot of super fashionable streetwear at the moment, it's almost like hobo chic. That's because you're saying, look, I have so much surplus cool in me that I can dress in something completely orthogonal to what other people think is cool and still be cool. That's how cool I am. And that works, broadly, because cool is just so subjective: if you call something cool and enough people agree, it is, and no one can falsify whether it is or not. But yeah, this toxic compassion thing I've been playing around with for ages. And the interesting bit is that second part, the prioritization of short-term emotional comfort over long-term flourishing. You're totally correct. Living life online has caused us to flatten down how we are judged, to be about proclamations rather than actions. And it's the reason that people were bullied about whether they did or didn't post a black square. It's about whether you do or don't have Ukraine in your bio. It's about whether you do or don't have pronouns in your email signature, all of those things.

[01:02:49]

And yeah, the addition- Again, it's perverse incentives. I think that's probably the running theme of today's discussion: we're creating all these perverse incentives for people to follow, and that's essentially what's driving these behaviors. Like you said, we're rewarding the short-term gains over the long-term gains, the actual proper gains. We're trapping ourselves in these compulsion loops. Compulsion loops are this idea from gaming and gamification, where you trap people in these short-term cycles of effort and reward that can often lead them away from what they should really be doing. We're all getting trapped in these compulsion loops, whether it's being a nuisance, being an asshole online, or whether it's being a virtue signaler online. We're all chasing these short-term rewards at the expense of the long term. Well, not all of us, but many of us are. I like to think that you and I are a little bit better. But we're not completely immune to it.

[01:03:54]

I mean, dude, think about how many times anyone that's ever been on a plane knowing that they don't have connection, gets their phone out, swipes up, cycles through a bunch of apps, even knowing that nothing can have happened. It's a compulsion. It's ingrained in there. Right, next one. Tarswell's Razor. Emotion causes bias, but it also causes motivation. As such, we're most likely to act when our judgment can be trusted least. Solution: don't trust thoughts you have while emotional. Instead, pause and wait for the feeling to pass before acting.

[01:04:30]

Yeah. I think everybody is not a single person but a collection of selves. Some of these selves are much more representative of who we are at our core than others. I think emotion can bring out a side of us that is not really us, and it can cause us to act in ways that we would later regret. I found this myself. I don't really do it anymore, but back in the early days, 10 years ago, sometimes I'd get angry online if somebody said something nasty to me, and I would be spiteful and I would say something nasty back. I would later read back what I'd written, and I'd just be like, Wow, I can't believe I actually said that. What an asshole I was. I was basically just as bad as them. I should be better. I realized that the person saying those things was not actually me, because if, when I'm calm, I later regret what I said when I was angry, then it's not really me. One of the things I say is that when you act while you're emotional, you are an ambassador for your most primitive self.

[01:05:45]

You're basically acting for your most animal self because you're engaging your reptilian brain. Any decision that I've made when I've been emotional has pretty much turned out to be a bad decision, or at least it's been suboptimal. I always make better decisions when I'm mentally balanced. I think that's true of pretty much anybody. If you send that email in the heat of the moment, more often than not, you're going to think, I could have worded that better. I could have worded that a lot better. What I do now is, it's not like I'm a robot. I do feel emotions. If somebody says something nasty to me online, I get an urge to just be nasty back. I get it. We all do. We're all humans. But I never do it now. I'm never spiteful. If I reply to somebody, sometimes I'm snarky, I am snarky, but I tend to do it in a way that I think is more productive. But what I always do is, if I'm feeling particularly emotional, I'll wait for that emotion to pass, because it will pass. It's amazing how often, when you let that emotion pass and then consider what you would have done while you were emotional, you realize how idiotic it would have been.

[01:06:56]

That's happened to me so many times that I'm actually afraid of acting when I'm emotional, because I've just realized how demented I am when I'm emotional. I think this is true of everybody. Yeah, it is deranging. Because, I mean, emotions ultimately are the opposite of rationality. They are a shortcut. There's this thing called the affect heuristic, which relates to why emotions evolved. I mean, I would say emotions evolved for two purposes. One of them is motivation, and the other is decision-making in low-information environments. Your gut feeling, for instance, is how you make decisions when you don't have enough information. The thing with gut feeling is it's actually often wrong. People will say, Oh, I swear by my gut. I've got a really good gut feeling. I always trust my gut. But what they're doing is engaging in confirmation bias. They will usually remember when their gut feeling was right, but they won't remember when their gut feeling was wrong. So they're naturally going to be skewed towards believing that their gut feeling is more accurate than it actually is.

[01:08:02]

That's why I don't really trust it so much. There's a certain thing called intuition, which is a little bit more than gut feeling, which is something that you've learned to trust over time. It's built on cues that you've learned to pick up on, and from those you can build a fuller picture. But just relying on emotion alone is usually not a good strategy for decision making, because, again, emotion favors short-term impulses. It favors short-term compulsion loops over long-term ones. This is why I think, if you're going to make an important decision, you should always just wait for the emotion to pass. It will pass. Most emotions don't last very long. Most emotions last a few minutes, and then they usually weaken and fade. That's all you've got to do. Just wait a couple of minutes and then compare how you would act when you're not emotional to how you were going to act when you were emotional, and you will realize there's a massive difference. And that way, you'll prevent yourself from many regrets, I think.

[01:09:10]

Semantic stop sign. One way people end discussions is by disguising descriptions as explanations. For instance, the word evil is used to explain behavior, but really only describes it. It resolves the question not by creating understanding, but by killing curiosity.

[01:09:30]

Yeah. We see this online a lot, again, with people calling other people names in order to dismiss anything that they've said. An example of this might be calling somebody a bigot. Saying, Oh, you're a bigot, and basically saying, Oh, why does he feel this? Why does he think that? Oh, because he's a bigot. For many people, that's enough. Oh, okay, he's a bigot, so I don't need to listen to what he has to say anymore. But really, what is bigotry? Bigotry is not an explanation for behavior. It's a description of behavior. Basically, it's a statement that somebody is prejudiced towards somebody. I mean, you could use it as a very shallow explanation, but it doesn't really explain much. If you really want the explanation, then you've got to delve a little bit deeper. You've got to go a bit further back and say, Okay, so this person is a bigot. That's a description. Now we need an explanation for why that person is a bigot. Why would they say that thing? Then you would say, Oh, okay, it could be many things.

[01:10:43]

For instance, let's use an example of classical bigotry. Somebody might, for instance, hate immigrants. They might say, Oh, I hate immigrants. I don't want these boats to just keep coming to our shores or whatever. The standard response from many people in positions of power is to say, Oh, that's just bigoted. Move on. Next question. But if you really want to understand, you've got to ask yourself, why is this person bigoted? It may be a pretty enlightening answer. It might be that they had their job taken away by immigrants, and now they're out of work, and they're on the dole, on welfare or whatever. All their plans have been destroyed by the fact that they've been superseded by somebody from another country. Or it might be that their family member was a victim of a crime by an immigrant. If you can actually go past the instinct to dismiss somebody by disguising a description as an explanation, then you can get to the real explanation, and then you can start to actually resolve the question. You can say, Okay, well, if this is the case, then I can go out there and I can convince this person that, hang on a second, immigration might have taken your job, but some immigrants also create jobs or whatever.

[01:12:10]

I'm not going to go into the whole question of whether immigration is good or bad. But this is just an example of what somebody could do. If you were interested in getting people to accept immigrants, that's what you could do. Instead of dismissing them and making them hate you even more and hate immigrants even more, which is what's going to happen. If you dismiss somebody's concerns, they're only going to react. Again, it's what we talked about earlier, reactance, the backfire effect. If you tell people that their opinions are bigoted, it's not going to stop them from being bigoted. It's going to make them more bigoted. They're going to start thinking, Oh, there's a conspiracy now. There's a conspiracy by the Jews to flood the West with immigrants and all this. These people are calling me a bigot because they're trying to destroy my life, because they don't want the truth to come out. It's basically just going to have a negative effect for everybody. It's just going to make things worse for everybody.

[01:13:05]

That's why these semantic stop signs are bad, because they don't resolve the question. They don't solve anything. They just make the problem worse. That's why I don't call people racist. I don't call people bigoted. I don't call people transphobic. What I do is I might call something that they've said bigoted. I don't really even do that, because I don't like the word bigoted; I feel it's overused. I don't like the word racist; I feel it's overused. I don't think these words really mean anything anymore. But if I were going to use those words, I wouldn't call people racist. I wouldn't call people bigoted. I would call their actions bigoted. I would call their actions racist, because I think that's much more helpful. Because if you call somebody bigoted, or call them racist, or transphobic, or sexist, or misogynistic, or fascist, or any of these other words that are thrown around so casually these days, if you use those terms to describe a person, you're essentially implying that that person is irredeemable, that you can't help that person because they're a lost cause, because they're just a bigot.

[01:14:08]

Whereas if you call their actions bigoted, if you call their actions racist or transphobic, and I'm not advocating this, but I'm just saying it's better than calling them a bigot. Because if you call their actions bigoted, that actually allows you to still see them as a human. Because I feel that calling somebody a racist is actually dehumanizing in a sense. Especially when you consider that terms like fascist and Nazi, a lot of these terms, are used to paint people as the worst possible human beings. Because when you think of the term fascist, when you think of the term Nazi, racist, when you think of these terms, you think of pretty much the worst human beings. You think of the Nazis of Germany in the 1930s. You think of the Ku Klux Klan. You think of people who lynched black people. You think of the worst human beings. It's dehumanizing in a sense because you're portraying people as villains. You're saying this person is a villain, so I can just discount everything that they say. Whereas when you call their actions bigoted or whatever, then you can say, Okay, well, we can actually convince this person to behave differently.

[01:15:16]

I think these semantic stop signs are a very harmful aspect of our society. That's just one example that I just gave you. There are many other examples in which the questions people have are just dismissed by disguising descriptions as explanations.

[01:15:42]

Yes. Mack's Content Razor. This is from mutual friend George Mack. Would you consume your own content? If not, don't post it. And it's just the easiest way to work out whether or not what you're producing is actually something that you should continue producing. And I had a similar idea, a tangential idea, post-content clarity. If we presume that your body is made up of what you put in your mouth, and your mind is made up of what you put in your eyes and ears, your content diet should be spirulina for your soul, not fast food for your amygdala.

[01:16:18]

Yeah, 100%. I agree. I'm very selective now about the content that I consume. I didn't use to be very careful. I used to just mindlessly browse my Twitter feed, and whatever got my attention, I would follow it. But the thing is, I found that that just leads to a lot of wasted time and very low information. Social media is not very information dense. I mean, your feed is probably a lot better than mine, because you only follow about 100 people, whereas I follow 600. That's why I hardly ever browse my feed. I usually just use lists. But I do absolutely go by that razor, because I find that it's a good heuristic to use. One of the reasons why I originally started writing those mega threads on Twitter was because they were the kinds of things I wanted to read. I wanted to learn about the world, and I thought, Well, this would be a good exercise for me. I thought, If I can get 40 concepts that are very useful, that I think can help people understand the world better, that's exactly the content I would love.

[01:17:21]

But nobody was doing it at the time, that I was aware of. So I thought, Okay, I'll do it then. I'll be the person to do it. And it's interesting because I think it was right at the beginning of 2020 that I posted the first mega thread, and it went viral. I realized there were so many people that actually wanted to see that thing, but nobody had thought of it before. Even though Twitter had been around for quite a while, as far as I could tell, anyway, nobody had thought of it before. But what was quite interesting was that in the aftermath of that, there were a huge number of people who did exactly the same thing that I was doing, in order to replicate the success I had with that first mega thread. I just saw them all over the place, people doing their own threads. I've got nothing against people doing that. I don't think I've got the sole rights to do it or anything. But it was interesting because I think it just made something click in people's minds where they thought, Wow, this is a great idea.

[01:18:17]

Why didn't I think of this? And then they did it themselves. It showed that if you create the kinds of content that you want to see, then because you're a human being and you share 99% of your DNA with every other human being, there's going to be a large number of other people with similar enough interests that they will actually want to see what you want to see. I suppose this actually fits in quite nicely with one of the other concepts in one of my recent mega threads, which is Hotelling's law, which is basically this idea that people will tend to copy whatever's successful, whether we're talking about business, politics, art, or whatever. As a result of that, content tends to converge. It tends to become more similar over time. You see it with TikTok. There were a very small number of people like Bella Poarch and Charli D'Amelio who became extremely popular on TikTok. They're basically the most viewed people on TikTok. All they did was lip syncing and dancing. Now, I have no interest in watching that stuff, but evidently, they thought it was fun.

[01:19:34]

Maybe that's the content they wanted to see. But somehow that stuff blew up. As a result of that, it started a whole new genre of TikTok video, where you just had people lip syncing and dancing, and everybody was doing it. It decreased the value of doing that. It's the same with politics. If you look at, for instance, the UK, you had the political parties, Labour and the Conservatives. If you look at, say, the postwar period, you had Clement Attlee versus Winston Churchill. Clement Attlee was a socialist. The Labour Party was a full-on socialist party. And I think Winston Churchill's Conservatives were proper Conservatives. They were like Burkean Conservatives. Over time, the two parties have moved towards the center. So Labour has become more right wing and the Conservatives have become more left wing. And it's interesting because the right wing party of the UK, the Conservatives, are now to the left of the left wing party in the US. And the reason this has happened is because of Hotelling's law: when certain politicians in both these parties appealed to the center, they had a huge amount of success. And the other people saw this and thought, Wow, we'd better capture the center, get some of these people's audiences from them.

[01:20:50]

So these two parties gradually began to try to eat the center, eat as much of the center as possible before the other party got it. They moved closer and closer together and they converged. It's the same with content creators. They tend to converge over time. The great thing about Mack's razor that you just spoke about, when you create content that you yourself would want to see, is that you can avoid Hotelling's law, because you're creating content that you want to see. You're not chasing what everybody else is doing. You're doing the opposite. Because the interesting thing about Hotelling's law is that the more it happens, the more these content creators or these politicians or whatever converge, the more value there is in being different and in actually trying to do something that you want to see. For instance, going back to my mega threads, I saw a lot of stuff about mental models, but it was not portrayed in the way that I decided to do it. It was more about taking a single mental model and doing a thread about it. Loads of people were doing that.

[01:21:52]

I initially was going to do that, but then I thought, I'm just doing the same thing that everybody else is doing if I do that. Because that format was originally popularized by, I think, people like Tim Ferriss. They popularized that stuff and became very successful with it. It was such a good formula that a lot of other people tried to do it. I thought, Well, why don't I do something different instead? So I decided to just go against that. I thought, I don't want to see this. I don't want to actually consume this content, because I've already consumed it, because so many other people are doing it. I thought, Let me do something a little bit different and create a thread of various different concepts. And that was different enough that it actually allowed me to go viral when I did it. So it's a very good strategy to chase not what other people are doing, but what you want to see, I think.

[01:22:37]

I agree. I understand some people would say that if you copy successful content, you avoid making stuff which is absolutely atrocious. Like, your instinct could just be completely off kilter. Like you're aiming at a target to the north and you shoot south, basically. So there's a base layer. There's a foundation of understanding writing, for instance. If you were going to do the thing but you couldn't write, it doesn't matter how good your idea is, it's not going to work. If you don't understand how Twitter works, if you don't understand the concepts, if you can't portray them in an interesting way, there's a lot of things that you need to get in place. But once you've got basically the rules of the game, you can then start to maybe step outside and completely break them. So for instance, with these listicle style episodes that I do, and they're some of my favorites, and I think that they keep the episode moving really quickly, I know that me and you, when we finish these episodes, feel like we've been in a fucking fever dream for two hours. I'm like, How has it been two hours already?

[01:23:36]

And I did them with Hormozi. I've done them with Shaan Puri. I've done them with George Mack. I've done them with yourself. Going through a list of things, because that would be fun to me. If I'd been hit with this pressure hose of insights about human behavior, I would have left an episode going, wow, that's cool. And yeah, it was something that was my instinct. Now, that being said, it's framed in a way that we know works for the algorithm. It's presented from a tech perspective in a way that we think is engaging. Dean edits these things in a way that keeps stuff engaging. So again, we're playing within the physics of the system in some regard, but we're also trying to give our own spin on something, with something new. And Douglas Murray has said this as well: follow your instincts. Your instincts are a pretty good guide. It allows you to be unbelievably unique. If you are interested in something, there is a pretty good likelihood that some non-insignificant minority of other people are also interested in it. And given how broad the access that you have on the Internet is now, you only need some non-insignificant minority of other people to have a massive audience, like millions of people.

[01:24:41]

Absolutely.

[01:24:42]

Yeah, that's one aspect of it. And another aspect is that if you are genuinely passionate about something, if you're genuinely interested in something, you will make it interesting to other people, because you'll be passionate about it. If you're just chasing metrics, if you're just looking at what other people are doing and then copying them, your passion is not going to be in it. You're not going to be interested in it. You're just going to be interested in getting as many views as possible or whatever. You'll be chasing the wrong metrics. The right metric is interestingness, interestingness to you. Because if it's interesting to you, you'll make it interesting to other people, because your passion is contagious. I think that's the best advice I'd give to somebody who wants to make a start at being an influencer or whatever. It's just to find what interests you. Don't try to find what you think other people are going to find interesting, because no matter what it is, even if it's something like stamp collecting or whatever, right? If you are passionate about it enough, you will make it interesting to other people.

[01:25:39]

Dude, so me and my housemate, Zack, love these videos of guys that watch rallycross. So it's like Colin McRae, four-wheel drive cars going through a dirt road. And these blokes will have gone up to fucking Ayr in Scotland or Quebec or something, and they're stood in a poncho, under an umbrella, in the pissing rain, basically in the middle of a forest, to see that 0.3 of a second. And then when these cars go past, they all turn to each other and go, and we love watching it because watching anyone get fired up about anything makes you feel fired up as well. I love people that love things. If you follow your passions in that regard, you're always going to remain on the right side.

[01:26:28]

And you'll also be motivated as well. Yeah, of course. Another thing is you'll be more motivated. Yeah, absolutely.

[01:26:33]

All right, next one. Epistemic luck. You know that if you'd lived in a different place or time, read different books, had different friends, you'd have different beliefs, and yet you're convinced that your current beliefs are correct. So are you wrong or the luckiest person ever?

[01:26:52]

Yeah. And this is one that gets me a lot, because I find that a lot of my opinions are in sync with the society in which I live. I'm quite liberal, in a sense. I wouldn't say that I'm actually a liberal, but I have very liberal views. And we live in a liberal society. I find that it's hard to extricate my beliefs from the time and place in which I'm living. I always wonder what I would believe if I'd, say, been born in India, for instance. If I'd been born in India, what would I believe there? If I'd been born in the 19th century, what would I believe? If I was born into a rich family rather than a poor family, what would I believe? All of these things make me question my beliefs, because I think to myself, my beliefs seem to be quite local to where I'm living in time and space. I think this is very true of religious people in particular. Take, say, a Muslim person. A Muslim obviously believes in a belief system that was originally invented in seventh-century Arabia.

[01:28:12]

But what would happen if that person was born before the creation of Islam? If they had been born in the second century, would they still be a Muslim? Obviously not. Would they still have Muslim principles? Obviously not. This is interesting because Islam is supposed to be a religion for all times and all places. That's its main claim to fame. Although there's this concept in Islam called Jahiliyyah, which is basically this idea that before the coming of Islam, there was ignorance. Still, you've got to ask yourself: surely that means that being born before the creation of Islam means you're not going to have the advantage, in God's eyes, of somebody who's born after the creation of Islam, because the person who's born after the creation of Islam is going to be more likely to follow Islam than the person born before. There's this weird disparity there. I think you could extend this to any belief system. Communism, for instance, as well. If you were born before the creation of communism, you're not going to be a Communist. Would a Communist be different if they were born before the creation of communism?

[01:29:26]

Of course they would. How can they be sure that their belief is right? Did they just happen to be born at the right time in history to have the right beliefs? That's why my solution to this problem is to try to find beliefs that are as universal as possible. One way that I can gauge whether a belief is a good one is whether I can view myself as having believed it no matter what time or place I was living in. It's not a perfect system, because obviously knowledge is constantly growing. Obviously, I wouldn't have known the germ theory of disease a thousand years ago, but I do believe it now. I think I'm pretty justified in believing in the germ theory of disease, given the evidence for it. But as a general rule, I think it's a pretty good one, where you think about: is this belief a product of the society in which I'm living, or is it one that can be applied to any time and any place? The thing with the germ theory of disease is that even though it didn't exist a thousand years ago, it would still have helped me a thousand years ago.

[01:30:26]

It would still have been beneficial to believe in it a thousand years ago. I think that's a good heuristic to use in order to determine whether your beliefs are sound. It doesn't matter if they're a product of your time. What matters is, will they be useful in any time, in any place? That's the universality of a belief. If your beliefs wouldn't work very well a thousand years ago, then that's a good sign that you're probably just imbibing what you're learning from the present day. You're myopically trapped in the present moment and in the present place. So yeah, I think universality of applicability is what you want to look at. Can you apply it universally? If you can, then that's a sign that it's a good belief.

[01:31:11]

So Rob Henderson put something in his newsletter a couple of weeks ago, and I gave it a name. So I've come in at the end and thrown a pretty bow on top of something which I really like as an idea. So I called this the Intellectual's Treadmill. Some thinkers, as they rise in prominence as a result of their interesting ideas, gradually devote less time to reading and more time to lucrative opportunities. This is a mistake. They are neglecting one of the core habits that made them so interesting in the first place.

[01:31:41]

I think I'm guilty of this. I tend to read less than I used to, but I definitely agree with it in general. I think one of the problems with a lot of thinkers is that they tend to just resort to the same set of tools that got them famous. A classic example of this would be somebody like Nassim Taleb. He became famous through a handful of concepts like antifragility, the Lindy effect, skin in the game. Obviously, they're great ideas. They're really good ideas, and that's why they became popular. But since then, what I've noticed is that he tends to try to apply these concepts to pretty much anything that happens.

[01:32:27]

This is the Golden Hammer, isn't it?

[01:32:28]

The Golden Hammer, yeah. We've spoken about this before, the golden hammer. It also links in with another thing called the toothbrush problem, which is basically where intellectuals treat theories like toothbrushes: they don't want to use anybody else's. They just want to use their own.

[01:32:49]

That's the opposite of me who just shamelessly repurposes everybody else's.

[01:32:55]

Well, I think that's the healthiest way to be. I think oftentimes, when you just rely on your own theories, you're closing yourself off from so much learning and so much knowledge. It's why I try not to do these things. But it's hard, because when you do become famous for a certain idea, you develop a certain brand and you want to overstate the importance of your ideas. So obviously, Taleb got very famous from his three major ideas, and tail risk and all the other ideas that he's come out with. So he's incentivised, instead of learning new ideas by reading books, to just double down on his own ideas by constantly writing about them. And that's obviously going to get him more clout, because the more important his ideas seem, the more important he seems, and the more opportunities he's going to get to expound upon various social issues and apply his golden hammers to them.

[01:33:56]

I remember hearing Peterson a while ago, it was probably five years ago, he was on Rogan, and he was really at the crest of this huge growth curve that he was on, maybe just after the Cathy Newman interview, something like that. And he said something along the lines of, I need to take some time to go away, because if you are outputting more than you are inputting, all that you're doing is just saying the same things over and over again, and you end up becoming a caricature of yourself, which is dangerous. There's this thing I learned from the Critical Drinker. Do you follow that guy?

[01:34:28]

Yeah, I watch some of his stuff. Yeah, he's funny. Brilliant.

[01:34:31]

So I learned from him that there's four stages to most media movements. So let's say the superhero genre that we've seen since the mid-noughties. There is the introduction phase, the growth phase, the maturity phase, and then the parody phase. What's interesting about that is you can track it perfectly with Thor. Maybe less so with Iron Man, because he died, I guess, before he could get into parody. But certainly with Thor, you get this ground-breaking one and everyone's like, oh my God, Chris Hemsworth is so ripped. And then you get into growth and it's still developing. Then you get into maturity, where it's a little bit more predictable and you've got an idea. Then you get into Love and Thunder, which was the most recent one. And you even saw bits of parody earlier on in it.

[01:35:13]

But where he's the butt of the joke. Jumping the shark, I suppose. Yeah. Yeah. Yeah.

[01:35:16]

Yeah. Yeah. He's the butt of all of the jokes. He's doing the splits on top of a pair of dragons like Jean-Claude Van Damme. Even Doctor Strange, I guess he featured as an ancillary character in lots of other things, but he only got two. So he had the first Doctor Strange with Benedict Cumberbatch. He's a phenomenal actor. The first one, super sincere in the way that they did it, and it was very meaningful about him. In the second one, a zombie version of Benedict Cumberbatch goes back in time to a different universe to tell the Central American daughter of a lesbian couple, called America Chávez, that she just needs to believe in herself. It's just the most parody of the most parody that you can think of. So, yeah, I think that one of the problems that you get is what Peterson identified. If you are outputting more than you're inputting, you end up just regurgitating ideas, you bastardize them, you don't have anything fresh, you become a caricature of yourself. You become easy to parody. And that's dangerous. And he was saying, I need to take some time away. It's someone that we can say absolutely has adhered to that.

[01:36:24]

And then, dude, how are you going to say no to another speaking gig? How are you going to say no to another Joe Rogan Experience episode? How are you going to say no to all these things? I get it, right? But someone that definitely has done this was Naval, who just said, I did my Rogan episode and I'm now away on sabbatical, because I never want to say the same thing twice, and I won't be doing any more podcasts until I have three hours' worth of new things to talk about. Fair play.

[01:36:50]

Yeah. I think Naval is very wise in that he's done this, I think, to avoid audience capture. I think that's ultimately what we're talking about, because when you have the same set of ideas, there's a pressure on you to continue to talk about those ideas, again, to emphasize their importance. I think Taleb is a very good example of this, going back to him, because I feel he has been audience-captured in the sense that it's now expected that he's going to try to explain things in terms of tail risk or whatever. It's because it's what he knows. I understand why he does it, because it's wise to a certain extent to just stick to what you know. But he's clearly a very intelligent man, and he's a man who could learn a lot more about many other things. But he instead just chooses to pretty much talk about the same things again and again. He's doing what Jordan Peterson essentially warned about. Because Taleb, he's a smart guy, but he's arrogant as hell. He thinks that he has the final answer. He thinks he understands things even when he doesn't really have a grounding in them.

[01:38:01]

He thinks he understands IQ, but he makes very elementary mistakes about IQ. But yeah, he tends to just focus on a very narrow field of maths, statistical tail risk analysis, risk analysis, that stuff. He uses a very narrow set of tools. They're very useful tools, but they're very narrow. He uses that very narrow system of tools to explain everything from COVID to polarization to Israel and Palestine. He talks about a lot of these things, often just using a very narrow set of tools. It's weird because otherwise, he's quite an erudite guy, but he just chooses not to progress beyond what made him successful. I see this with a lot of other influencers, a lot of other intellectuals, where they just stick to the thing that made them successful over and over again, as if they're just scared of venturing into new territory. You see it with a lot of anti-woke accounts online now as well, where the same thing is always the case. It's always about wokeness. Everything's wokeness. Everything can be explained in terms of wokeness. You see it on the opposite side where everything's racist. Racism is the explanation for everything. Oh, it's because of systemic racism.

[01:39:26]

It's because of whiteness. It's because of white fragility, all of this stuff. Then you just see the same sets of explanations being used over and over again, because these people are not reading new things. They're just regurgitating what was already in their head again and again and again. They're basically being spoon-fed their own intellectual vomit and just recycling it and vomiting it out again. And it just degrades. It's like ChatGPT being trained on its own outputs. It's a very dangerous thing. And that's why I've tried to go broad rather than narrow in on one thing. I do occasionally narrow in on one thing, where I write a long read or whatever. But what I try to do is to just keep learning new concepts and new things. I've set up a pretty good thing now where I've got an audience that expects me to write about a wide range of different things, but quite shallowly. I do write pretty shallow stuff in general, just because I've got so many ideas to cover that I can't go into too much detail. I'm not always shallow. I do sometimes go on deep dives into articles and essays where I write 4,000, 5,000 words about a single concept.

[01:40:36]

But usually, I write about a wide range of things, but quite shallowly, in order to give people ideas for them to springboard their own ideas from. That's generally what I like to do. I find that that's a healthy way to approach it, because it means I'm constantly learning new ideas instead of just focusing on one idea and using that one tool to explain everything, which is a temptation.

[01:40:58]

It seems like this is related to another one I got from you: the beginner's bubble effect. You cannot learn that which you think you already know, from Epictetus. The most ignorant are not those who know nothing, but those who know a little, because a little knowledge grants the illusion of understanding, which kills curiosity and closes the mind.

[01:41:17]

Yeah. So this would appear to go against what I've just said. It would seem like, Oh, okay, you shouldn't learn just a little about a thing. You should really go deep into it. But in practice, that's not actually possible. You can't just learn one thing in loads and loads of detail and not learn anything else. You're always going to be in a situation where you have to learn a little bit. The key to overcoming the beginner's bubble effect is not to learn more, because you can't learn more about everything. The key is to recognize your limits, to recognize how much you actually know, basically. Once you learn how much you actually know, and that comes from humility and from curiosity, then you're no longer subject to the beginner's bubble effect. The beginner's bubble effect is a product of thinking you know more than you actually do. It usually comes from having a very shallow explanation for something. Because once you have a shallow explanation, you think you have a full explanation. It's just the way our brain works. It kills your curiosity. When you have a shallow explanation for something, it fools your brain into thinking that you understand it.

[01:42:21]

That's where the danger lies. I'm not saying you shouldn't learn little things. In fact, I actually advocate the opposite. I think you should learn a little about a lot rather than learn a lot about a little. The reason for this, well, this goes to Philip Tetlock's work. Philip Tetlock is one of the founding fathers of decision theory, along with people like Robert Cialdini and Daniel Kahneman; they founded the field of rationalism. Tetlock's all about predicting the future. His basic idea is that the true measure of how rational you are and how much truth you have is whether you can predict the future consistently, because only truth allows you to do that. You can't bullshit your way to predicting the future. That's one thing you cannot bullshit. So you have to know the truth in order to consistently predict the future. And that's why he's into the whole thing about superforecasting. He basically found the people who were most accurate at predicting the future, because he did a series of trials which actually involved massive funding from the CIA.

[01:43:29]

He did some pretty crazy stuff in the 1980s, where he basically ran these competitions to see who could predict the future the best. People adopted various strategies of various kinds. This phenomenon became known as superforecasting. What Tetlock found was that the people who tended to be the best at predicting the future were not the people who knew a lot about a little, but actually the people who knew a little about a lot. I think there are probably several explanations for it, but I think one of the key ones is that the people who know a lot about a little tend to try to solve all problems by recourse to that little, narrow sliver of information that they know really, really well, because they feel they're safe on that territory and they don't want to venture outside of it. They view everything through the lens of what they know really, really well. Whereas the people who know a little about a lot tend to be a lot more generalist, and they are more flexible in their thinking.

[01:44:32]

This is why, if you have a choice between specializing in just a small number of topics or learning a little about a lot, I would advocate the latter, because that puts you in good territory to be flexible in your thinking and to learn. You can then learn: if you want to know more about a certain thing, you can learn about it. There's a concept called the curiosity zone, which is that when you learn a little about a lot, your curiosity gets stoked and you want to learn more, because curiosity is not stoked by an absence of knowledge. It's stoked by having a little knowledge. Curiosity is the desire to fill gaps in knowledge, and in order to have gaps in knowledge, you need to actually learn things, because a complete absence of knowledge is not a gap in knowledge.

[01:45:29]

A gap is... Learning something teaches you what you don't know.

[01:45:33]

Yeah. A gap can only exist between two objects. You can't have a gap without... The empty space is not a gap. It's got to be in the middle of two things. If you learn those two things, then you have a gap, a gap in that knowledge. That gap is where your curiosity blooms, basically. If you want to stoke your curiosity, if you want to evoke curiosity in yourself, then the best way to do that is to learn a little about a lot, because that way you'll want to know more. It'll motivate you to want to know more. Yeah, that's what I would definitely advocate doing. That's why I like to be more of a generalist rather than specializing in a single concept. I think it's much better to do that.

[01:46:21]

Agenda setting theory. Most of the time, what's happening in the news isn't actually important. It only appears important because it's in the news. The public conversation is based on whatever's reported by the press, giving the impression that this news matters most when really it's just what was chosen by a few editors and thoughtlessly amplified by the masses.

[01:46:43]

Yeah. So this is why I don't really read the news very much. I browse it very, very casually, often just once in a while. I don't really read it much, because what I found is that 99% of the time, the news doesn't make me any wiser. It doesn't make me any more informed. It doesn't really help me in my day-to-day life. It doesn't help me understand the world any better. It's just something I do for entertainment. I think most news is just that. It's just entertainment. I think it's entertainment that is presented in such a way that you don't feel guilty for consuming it, because you think you're learning about the world. A lot of the time, the reason for this is that news is hijacking what we call shiny object syndrome. Shiny object syndrome is another concept, I think, from one of my recent threads. In our evolutionary history, we evolved to favor new information over old information, because new information tended to be more useful. In a low-information environment, new information can often be the difference between life or death.

[01:48:04]

So new information, for instance, a thousand years ago or 100,000 years ago, would be seeing a lion coming out from the undergrowth. That's new information. And that's crucial information. Do you know what I mean? If a lion is coming out of the undergrowth and it's charging towards you, you need to know. So obviously, we became biased towards new information, because new information could be the difference between life or death in a way that old information wasn't. We have this bias towards novelty. We're attracted to the new. Anything that's new, we're just attracted to it by virtue of its novelty. News hijacks this evolutionary impulse by providing us with new content. People are always searching for what's new. They're constantly looking for the breaking news, the big red bar which says breaking news, or they're looking to see new tweets or whatever. Click the button, see new tweets, see the latest posts, all this stuff. People want to see what's the latest. They want to know what's the latest. This is a maladaptive desire, because in a world where information is mass-produced, it's no longer actually valuable to have new information most of the time, because the majority of the new information has been created for one reason and one reason only, and that is to hijack your impulse for novelty, your desire for novelty.

[01:49:31]

It's there to just... Basically, the information is rushed out. If you look at a lot of the latest breaking news, it's usually wrong, because the journalist wanted to be the first person to break the story, so they just rushed it out as fast as they could. And they didn't do their due diligence, and they didn't give you all the facts. And likewise, people want to be the first to retweet this news story and talk about it. So they'll just hastily retweet the headline without reading the article or whatever. So a lot of this news stuff is rushed out. And that's why news is generally not that valuable, because it's often reported impulsively by editors and by journalists. They just say, Oh, okay, this sounds like it might do well online, so let's just post this. Let's just write about this. And then what happens is that people think that because it was reported by the news, it must be important. But a lot of the time, it's not. A lot of the time, it's there simply to hijack your attention, hijack your desire for novelty, and you're not going to remember it, you're not going to benefit from it.

[01:50:37]

Just think about it. Go to any news page and just look at the top stories. A lot of the time, it's just not really stuff that... It might interest you for a couple of minutes. You might think, Oh, okay, that's okay. But most of the time, it's not really going to be that interesting. The exception to this would be news that's directly relevant to your chosen field. For instance, let's say you're a medical professional and you're interested in curing cancer or whatever. Then if there's a new vaccine for cancer, which there is, which is an amazing story, then that's obviously going to be interesting news, and you want to know about that. But that's rare, that's very rare. And you usually get that not from looking at the mainstream media. You usually get that from specialized news outlets. So you want to go to science news outlets, which will tell you about the latest breakthroughs in technology. The mainstream media is usually just generalized stuff that is just not really going to be of value to many people.

[01:51:48]

It's just going to be there to tickle your desire for novelty. So mainstream media news is generally not that useful. That's why I don't really read it much. I do read it, but only because a lot of people expect me to comment on it. If I wasn't a writer, I wouldn't check the news. I would only check information that's relevant to me. So maybe if I was an investor, I would check stock prices and stuff like that. But I wouldn't check the general news, because the general news is usually just worthless. People fall into believing that it's important because it's reported, but it's not. It's just what was chosen by a few editors.

[01:52:26]

Yeah, it's strange. What we click on, and what editors know will drive interest and engagement, often has absolutely no correlation with what's important. How many times have we seen a left-wing woman say that she can't get a man to hold the door open for her? And it goes super viral online and everyone's got the same take of, That's a conservative. And it's like, whatever. It's a slow, medium pitch.

[01:52:57]

It's there to fuel engagement. It's engagement farming. A lot of it's rage bait. They want to try and make you as angry as possible because they want to start a fight online. Because if they start a fight online, then the two factions that are fighting are going to be inadvertently promoting the story by fighting over it. Then there's also just stuff that's reported just generally. For instance, if you're an average person, you'll hear, Oh, 30 people died in a bombing in Gaza. It's bad. It's tragic. It's horrible news. But most people are not going to ever do anything about it. They're just going to read it and that's it. They're going to forget about it. It's like they may as well have not even learned about it, because it's not going to change their life in any way. They're not going to go out there and stop the bombing.

[01:53:48]

Apart from maybe being a bit more ambiently anxious about the world and the impending general sense of doom.

[01:53:55]

Exactly. It's just going to make them feel bad a lot of the time. There's a negativity bias in news reporting as well. It was interesting, because I think Steven Pinker recently posted a list of 66 news reports that were actually positive. They were positive developments, but they didn't get any traction because they were positive rather than negative. The negative stories always get way more engagement. If you're constantly consuming news, you're going to develop more cynicism, more pessimism. You're going to become depressed in a sense. You're going to feel bad, because you're just going to feel like the world's falling apart. Whereas if you go to these specialized news outlets, if you go to science reporting, then you'll find a lot of stuff about medical breakthroughs, which is actually a lot more interesting, because that will allow you to predict the future a little bit better. If there's been a breakthrough, then you can maybe do something about that. You can maybe invest in it. If you learn that there's a vaccine for cancer, you can invest in it, and you can help the people that are actually trying to make that happen.

[01:55:02]

So that's a lot more useful stuff. Positive news tends to be more useful overall than the negative, engagement-driven stuff that you see.

[01:55:12]

Two of my favorite websites that I go to are PsyPost and Psychology Today, both just phenomenal insights about human nature. If you're interested in that, a lot of the studies that I cite on this show come from PsyPost or Psychology Today. They're great. Do you know what The Browser is? Are you familiar with that?

[01:55:34]

The browser?

[01:55:35]

No, I've never heard of that. So The Browser is... It's been going for, I think, over a decade now. It is a daily email of five articles, and these articles have nothing in common at all other than the fact that the editor has found them to be interesting. And it's my favorite place to just get exposed to always new, new, new, new, new stuff. Here is the life story, in 3,000 words, of a boot polisher from 1800s Birmingham. And here is some new drone technology that's coming out of China. And here is a story about Genghis Khan and whatever. It's just so varied. And literally the only single thread between them all is that the guy, Robert Cottrell, I think, the dude that's in charge, just found it interesting. And on the whole, not every one's for me, but at least one to two per day are amazing. I think it's maybe like 40 bucks a year. And your Substack, something else that people should subscribe to, which they can go to at gurwinder.substack.com. I'm definitely a premium member, which I enjoy. What can people expect from you over the next few months?

[01:56:45]

What's coming up?

[01:56:47]

Yeah. So I'm working on my most ambitious article yet, which is going to be a long read. It's going to be about 5,000 words. I'm working on it for UnHerd, but I'm also going to be posting the longer version on my Substack. And it's about gamification and how it can be used to control us, but also how we can take advantage of it. That's going to be, I think, a very useful one for a lot of people. I've also got my book. I don't want to talk too much about my book yet, but it's coming, it's coming. There's something big in the works.

[01:57:16]

It's going to be... Is this the first one or the second one? Because there's two, right?

[01:57:19]

Yeah, there's two. The first one's coming out next year, so not long. Then the one after is probably going to come the year after, so that'll be in 2025. But yeah, there's going to be a book, hopefully next year. And I'm also going to be trying to actually start doing videos as well, because I've had a bit of demand for that. So I think by the time this comes out, I might actually have a YouTube channel. I don't know. But if you're watching this and you're interested in hearing me ramble more, then you might want to search my name on YouTube.

[01:57:51]

I can give you some more ramble. If people go to your Substack, everything will be on the Substack, I'm going to guess.

[01:57:56]

Everything's on the Substack, yeah. And also Twitter, I'm going to be more active on Twitter. I've got another mega thread coming up, actually, because I'm going to do one for winter 2024. The mega thread is going to be out in about a month or two. So that's going to be the next big thing on Twitter, but I'm going to be posting a lot more now, because the bulk of the work on my book is done. So, yeah, I'm hoping that 2024 is going to be a very productive year for me.

[01:58:21]

I look forward to it, man. I want to try and be as productive as you. You might want a bit more sleep than I get. But, yeah, dude, look, I really cherish these episodes. That's two hours that's gone by in literally no time at all. Once the next mega thread's up, you will come back on. We will talk about it again and we will have more fun. But for now, ladies and gentlemen, Gurwinder Bhogal. Thanks so much for your time, mate.

[01:58:43]

Thank you. Always a pleasure, Chris.