[00:00:16]

Yeah, good. Sweaty. I just had a hot bath; I was already in, it was too late, I couldn't back out. But yeah, good.

[00:00:26]

Judging from Twitter, you're a bath man more than a shower man. I love it.

[00:00:30]

I have, ah, two baths a day in the winter, or a bath and a shower, or two showers in the summer. Sometimes I do it because I'm bored. I think it's from my upbringing, where, you know, we could only have one bath a week when I was little. Sometimes second-hand water as well. It sounds like a Dickens novel, but it's true. That was hard.

[00:00:54]

You joke, but that is real deprivation. One bath a week, I mean, that's seventeenth-century stuff.

[00:01:00]

What I remember is, in the winter in our house, and this is actually true, it sounds like a joke, it sounds like a Monty Python sketch, but we had ice on the inside of the windows when I got up. I used to dream that I'd got up and got dressed, and then I'd wake up and go, oh fuck, I haven't got dressed. Anyway. I've got a question for you. Well, a question and then another question.

[00:01:28]

Yeah. I'm just not in the bath now. I've been thinking a lot about the brain, or rather my brain has, sort of. It's quite a long question, so stop me at any point if I've made some sort of fallacious leap. The brain I get. I totally understand evolution by natural selection; it's a no-brainer. And the brain is just an organ like anything else. It came, over three billion years, from a blob of reproductive protein to this most complex computer.

[00:02:10]

But it is just physical. You know, it obeys all the laws of the universe: chemistry, physics, energy, electricity, all that. But obviously this has the epiphenomenon of consciousness. We feel like we've got a self. We feel like we've got free will, even though that's an illusion. And this leads to imagination, the invention of philosophy, gods. So, a two-part question. One: a chimp, is it going through that? Do you think they've got all the rudimentary tools to invent gods or spirituality? You know, is what you need imagination and a decent brain, or even a sense of self?

[00:03:00]

And two, if that is true, if the brain is purely physical, it can be reproduced. So in the future, with a computer, will we have paranoid computers? Will we have computers that are nice and nasty, that don't want to die and want to murder someone? Yes, that's a great question, and there are so many questions contained in it. Here's what's not controversial. There are many places where one can try to find a foothold or a handhold to debate some materialist assumptions and then try to open the door to something that many people in science and philosophy at the moment would consider spooky or theological or...

[00:03:51]

Yeah, just unwarranted. But the central drift of your question is fairly uncontroversial in science, which is to say it's safe to assume that everything we know and notice about the mind from the first-person side, as a matter of experience, what it's like to be us, all of that is a product of what our brains, as physical information-processing systems, are doing. So, yes, our brains are essentially computers made of meat, although they're not computers that are all that similar to the computers we currently call computers.

[00:04:29]

I mean, they're different in important ways. Many people will point out that science has been repeatedly confounded by bad analogies: we used to make analogies to water pumps and steam engines, and now we no longer do that because we have a much better analogy, the computer. But many people would be tempted to argue that it's still not a perfect analogy, or not even a good one.

[00:04:54]

But no, the important thing is that intelligence is basically the ability to problem-solve, to negotiate the world. And obviously those things, if they were favored, got passed on, and it gets better and better, or it doesn't work and it's a dead end, or whatever. So, yeah, I get that, and it starts worrying me. I came from a science background and then went to do philosophy, so all those things like determinism and materialism I soaked up, and anything that felt a little bit new-agey, nonsense, mumbo-jumbo, magic, I sort of rejected, but kept in mind as a "prove it to me", you know.

[00:05:42]

And so I'm sort of hardwired that way: I need proof, I need physical proof. And so even consciousness freaks me out, because... Yeah, well, it should.

[00:05:58]

It should, because we really don't understand it physically yet. And there are impressive impediments to doing that, I think. I mean, the so-called hard problem of consciousness is genuinely hard, because it's not clear why anything we do as minds, all of our behavior, all of our mental behavior, everything, including our intelligence, needs to be associated with experience. Right. We could build robots, and we undoubtedly will build robots eventually, that pass the Turing test, that are indistinguishable from humans and, in fact, only become distinguishable from humans by their superhuman capacities.

[00:06:42]

They will be as intelligent as we are in every respect. They'll be conversant with our emotions and display emotions of their own, because we will very likely program them that way, or at least some of them. And I think it's true to say they're already as good, they might even be better, at facial recognition than humans are now. And that will eventually include detecting and responding to our emotions in images, and so much of what makes us effective social beings, you know, millions of years of evolution as social primates and 300,000 years or so of finishing school as Homo sapiens.

[00:07:24]

We're very good at this, and there's no question we're going to build machines that are better than we are. And then literally everything we do cognitively will be like chess, where it will be true to say that the best mind at it is now a computer mind, not a human one. Yeah. We will never be the best at chess ever again, right? And that's going to be true of looking at a person's face and deciding how they feel.

[00:07:53]

Will there be a robot, right, that's Beethoven, taller and stronger than me, made of steel, can see in the dark, and he's a better stand-up? Yeah, that's the robot.

[00:08:06]

They're coming for your job.

[00:08:09]

I'll always have... I'll fall over and the crowd will go wild, like, look at that fat bloke, he's dying, and the robot will go, can't compete with the Ricky... the steam engine. Yeah, but no, I think ultimately something like that has to be true. If intelligence, even comedic intelligence and comedic timing and everything that gets built into that, empathy... even if I learned it, it was still my brain.

[00:08:48]

Yeah, exactly.

[00:08:48]

If that's just information processing, there's just no reason why a human has to be the best at it forever. And in fact, there's no way one will be if we just keep making progress building intelligent machines. So I think... Yeah, I even buy it.

[00:09:04]

So I totally accept that. I suppose where my question comes down to is: this illusion of free will, is it the same as if it wasn't an illusion? What's the difference? That's my question. I totally accept it. But we are what we are, so what does it matter? What does it matter that there isn't free will?

[00:09:28]

I mean, the reason why it's important is that so much of our psychological suffering personally, and so much of our social suffering in terms of the ethical and legal decisions we make, is anchored to this illusion. Yes, the feeling that you are you, and really responsible for you. It's not that it's never useful; it's useful in certain cases. But the fact that we put people in prison for the rest of their lives, or even give them the death penalty in certain states in my country, and feel totally justified in doing it as a matter of punishment, not as a matter of social necessity, that we have to keep certain dangerous people off the streets, which is a different thing.

[00:10:17]

And I think that's quite different. Yeah, it is different. And I'd say this, and I know you're not saying it, but to say that no one is to be punished is a nonsense. Rather like if a machine breaks down in a factory, you don't go, well, it didn't mean to break down, we'll keep it on. You get rid of it and get a new one. It's not a punishment.

[00:10:39]

It's just that we've still got to protect the innocent. And I get that.

[00:10:43]

And I think, yeah, definitely. Punishment certainly still makes sense in many cases, but retribution doesn't, or, you know, the vengeance part of it doesn't, morally, once you swallow this pill of free will being an illusion.

[00:11:04]

Well, the three reasons for punishment: retribution, rehabilitation and, what's the other one, restitution. Yeah, there's Ted Honderich's book on punishment. I think it might be called An Eye for an Eye. No, I think it's just called Punishment. It's got a picture of an eye and a tooth on it. And he was my professor. He taught me about it years ago, and I was sold on what he said about it.

[00:11:33]

And yeah, he breaks down why that sort of punishment, punishment as retribution, doesn't work, which I totally agree with. And, you know, with the death penalty, you can't go back and say we were wrong; we know the worries about that. My point is, even if everyone understood that free will is an illusion, hard determinism, I don't think it should make any difference, because we're not saying, oh, he came from a tough background, or it was a crime of passion.

[00:12:05]

We're just saying we're all robots, let's do what we like, which we know isn't acceptable. That's what I mean: it doesn't make a difference. All the other caveats would still be in place, you know, a sympathetic judicial system, and act utilitarianism as opposed to rule utilitarianism. All those things would still be in place. But what I can never accept is people who say that, if hard determinism is true, no one is responsible for their actions on a societal level.

[00:12:38]

But the difference I'm making is, once you view people in this vein, as akin to malfunctioning robots... So evil people: if we built an evil robot, it would reliably produce evil. You know, nature has built evil robots for us, psychopaths and other people who just reliably create a lot of harm for everyone else. The question is how we should feel about that, and whether hatred is the right emotional response. I mean, it's a totally natural response, certainly, if you've been victimized by such a person.

[00:13:16]

But I think we should treat it like any other force. There is no fault. You don't go into the morality of an angry bear. Exactly, one who attacked you in the woods. Right. You might shoot it, though.

[00:13:32]

He came from a tough background... I love him. But if a bear is attacking me, I don't care about his home life.

[00:13:39]

But he did come from a tough background. He came from the background of being a bear. Right.

[00:13:44]

What else would you do? And I don't care whether... should I take this bear on, or should I get out? If I can't get out of it, I try and stop it. It's not a moral issue; it's the fact that I don't deserve to die by a bear yet. That's what it comes down to. Yeah. I love bears, I've never... you know, I love them, and good luck to them.

[00:14:13]

They've got to do what they've got to do. But as I say, if he's in my apartment, it's another matter. I don't care.

[00:14:23]

Yeah. I don't know where that analogy goes. What I'm saying is, the psychopath is part of nature, like the bear. I know it's not its fault it's a psychopath, just like it's not its fault it's a hungry bear. But that's no reason for me not to try and stop them.

[00:14:43]

You've got to stop it, oh yeah. But you don't have to hate it, and you wouldn't hate it in the same way you'd hate a person. And that is the crucial piece for me. That's a very good point. Ethically, it's like... Right.

[00:14:54]

It's like, even if it harmed you. I mean, I don't know if you got to that part in my... yeah, I know you heard some of the audio from Waking Up where I talk about free will. But just imagine the two cases. In one case you're attacked by a bear and, let's say, you lose a hand, right? You've had a terrifying encounter with near death, but you're saved and the bear gets tranquilized.

[00:15:22]

And let's say it gets put in a zoo, right? Yeah, that's one case. In the other case, you're attacked by an evil person and suffer the same injury. Right. So then the question is, what is your subsequent mental state for the rest of your life? I mean, you could be hating the person and fantasizing about killing that person with your bare hands, or hand.

[00:15:46]

Yeah. But with the bear, you might actually laugh about it, especially afterwards.

[00:15:51]

Yeah. And he could just play upon your hominid emotions, so that you would really hate him, you know, and want to kill him.

[00:15:59]

Infanticide too. Yeah. Because we've got a sense of self and morality, and we feel what's right and wrong, we impose that on another human. But we wouldn't do it to a bear, rather in the way that if I walk into a tree and break my nose, I do not hate that tree. You hate yourself. I hate myself. And I try, but I would want someone to blame.

[00:16:26]

I want someone to blame for the weather if it rains. Who didn't tell me? Whose job was it?

[00:16:35]

Yeah, that's true. That's a very good point. And it's hard to forgive another human who hurts you for fun. But I suppose, even if it is "fun" for them, in a naturalistic framework they "can't help it", and I'm putting quote marks around that, but we mean it literally as well if we're determinists.

[00:17:02]

And honestly, that does help me a fair amount psychologically now. I mean, there are so many people out there, on social media in particular, which is where I tend to see it, I don't see it in my life, who just maliciously attack me and attack people who are associated with me in any way, and it's... why am I even talking to that?

[00:17:27]

Good luck. Good luck on social media after this.

[00:17:29]

I don't know... I'm used to it, if anything. Well, so, guys, if you're listening... Yeah. That's a very, very good point.

[00:17:49]

It's much easier to process when you actually recognize that certain people are doing what they do because that's what they do. They're like bears.

[00:17:59]

Yeah, exactly. And there are lots of other factors on social media: getting noticed, wanting to be part of someone else's pile-on. And they're not like that in real life; they'd ask you for an autograph. All these things. You know, if someone sends a nasty tweet, and I think I've told you this before, I thought, well, they said that, and I looked back and they'd sent twenty nice ones, but I didn't notice them.

[00:18:29]

And it got me wondering, and I put this line in After Life as well: why would people rather be famous for being an asshole than not famous at all? What is the attraction of being famous, of saying "I was here"? Cavemen used to put their hand on the wall and blow pigment over it, you know, "I was here." And now it's obviously got out of hand. But there seems to be, I think, some sort of quest for eternal life.

[00:19:02]

I think that's a very human worry and quest. What's the point? What will happen after I die? Will people remember me? Will I myself carry on? Will I come back as a spirit? Is there a heaven? Have I led a good life? Was it worth it? Will I come back as a cow? I think all those things, as irrational as they all are, are very human, and I don't know why.

[00:19:25]

I don't know. Again, they could just be upshots of something else, but yeah.

[00:19:30]

All right. Well, we can work that out after you've had your third bath of the day. So, in conclusion, yes, robots and computers will soon be indistinguishable from humans. Final question: is there a chimp somewhere that has sat down and looked up and thought, where do we come from? Who did this? Where are we going? Has that happened yet? Has a chimp thought, what the fuck is going on here?

[00:20:01]

I would highly doubt that. But the interesting thing is that there are certain things we do that are really crucial to our being smart, like working memory, which chimps are better at, which is pretty amazing. And you can see this on display; you can find the video on YouTube, where they're given a memory task: there's a keyboard on a screen, and numbers and letters suddenly get illuminated, and then you have to recapitulate the sequence.

[00:20:31]

Sure. You have to press all the right keys. Yeah. Chimps are so fast and so much better at it than humans that it really is kind of terrifying.

[00:20:42]

Have you seen that experiment? The one that shows it's not just the arbitrary test, it's the reward that makes the difference. They did a thing with a chimp and beads. If it chose the small pile of beads, it got a jelly bean, and it got it right every time. But when they gave it the choice between the smallest pile of jelly beans and the big pile of jelly beans, because they were so good...

[00:21:17]

The experiment went out the window. It just went for it.

[00:21:20]

That's the big pile.

[00:21:22]

So there is that. Great, that's fantastic. That was genius. Now, I don't have a sense of self and I want to be a brilliant...