
Transcript

[00:00:00]

The Pat Kenny Show on Newstalk, with Mater Private Network. During current restrictions, don't ignore your health concerns. Our expert team is ready to help. His novels The Remains of the Day and Never Let Me Go achieved both commercial and critical success, and my next guest went on to win the Nobel Prize for Literature in 2017. Well, now Kazuo Ishiguro is back with a new book, Klara and the Sun, and he joins us on the line.

[00:00:30]

Good morning and welcome. Hi. Good morning. Yeah, it's an extraordinary book. I think it must be a first in fiction that you have a first-person

[00:00:41]

narrator who is a robot powered by artificial intelligence.

[00:00:46]

I don't think I'm the first. I mean, a lot of sci-fi writers have been doing this. I don't claim to be the first, but it's certainly a first for me. All right. Now, Clara is one of these AFs, an artificial friend, and many people have been commenting on your book and saying, look, this is maybe looking at a dystopian future, but it strikes me that it's not too far away from the present.

[00:01:16]

No, I wasn't really aware of delving into kind of futuristic stuff or anything like that. I had for some years been really fascinated by developments in A.I. and also in gene editing. There have been remarkable breakthroughs just in the last few years, and I don't think people have quite woken up to what's actually been happening. So I was just writing out of either things that we can already do today or things we will be able to do very, very soon.

[00:01:49]

But it's not all dystopian. You know, I have great hopes. I think this is going to make our lives better in many, many ways. But we have to be ready, and we have to reorganise our societies, I think, to meet the challenges. At the beginning of the book, Clara is, and I don't want to give too much away, but I don't think I'm spoiling it to say she is residing in a store where these artificial friends are sold. And I became, you know, quite fond of Clara, treating her as a fellow human being rather than some machine.

[00:02:28]

And maybe that is the way we are going, that our robots will become more humanoid.

[00:02:35]

That's possible. But of course, it's not surprising if you feel quite sympathetic to Clara, because she's artificial in more than one sense of the word. You know, she's artificial in that she's a fictional character; I've made her up. And of course, we're well used to weeping and crying, or getting furious, at fictional characters, whether they're on the screen or in our books, you know.

[00:03:00]

So I always felt confident that her being a machine isn't going to be an obstruction. And of course, we grew up reading books about teddy bears and animals who our hearts went out to. So the question that you pose there is an interesting one: once these machines are not just machines, and not just in books or stories, but are actually in our families and living in our homes, what kind of emotions are we going to have towards them?

[00:03:33]

But my book is more concerned with what it will do to our feelings towards each other as human beings when we live in a world that has algorithms and big data almost claiming to be able to map us out as individuals. Are we going to actually look at each other differently? Will relations between parent and child, or between spouses, change? Or will we actually change our very assumptions about what an individual human being is, how unique that human being actually is, and how irreplaceable somebody you love might be?

[00:04:18]

So these are the questions that actually occur to Klara. Ironically, they occur to this machine who is trying to understand and learn about the human world.

[00:04:28]

Now, at the beginning, Clara manages to do pretty much everything, navigating the store and so on. When she does end up in the home of the young girl, we become more aware that she's a machine, because she's got to learn a whole pile of new tricks, you know, even to navigate the island unit in the kitchen or the high stools that people sit on. And that kind of brings us back a step, almost, in understanding that this is not a person.

[00:04:56]

It's a machine. Yeah, but that's one of the advantages of having a narrator and central character that is an A.I. machine: she can be a mixture of very naive, very childlike in her knowledge and experience of some things, and super sophisticated about other things, because she can learn at an exponential rate. And my understanding of it is like this: very, very powerful machines that can defeat all the chess and Go grandmasters in a few minutes find it very difficult to make a cup of tea, because the kitchen is different, or the fridge is a different distance from the table in the kitchen, or something.

[00:05:42]

And that completely stops them making a cup of tea. So I find that kind of thing quite fascinating. And Clara actually does retain, as part of her personality, if you like, something very, very childlike. She holds onto a kind of childlike faith in, maybe not the goodness of people, but in there being something good, something very powerful that is good, watching over her. And she thinks this is the Sun, because she's solar powered.

[00:06:12]

And she holds on to a whole set of quite childlike beliefs, if you like, that don't become more complex, almost like an ideal.

[00:06:25]

You know, when she's in the shop window, she is bathed in sunlight, and so she is energetic. But if she ends up for too long in the back of the shop without natural sunlight, she can become lethargic. And I suppose, for any of us worried about our AFs, our artificial friends, taking over our lives: you know, lock them in a cupboard and deprive them of light. Well, there is no doubt about that. I think it's perfectly logical for Clara, you know, because she's solar powered.

[00:06:55]

She thinks the sun is a source of nourishment, not just for her and her kind, but for all the human beings you can see out in the street, because that's what it looks like. And so, at the beginning, she sees the human world through the lens of, on the one hand, the sun being this positive thing, and loneliness being this thing that human beings try to avoid, because she has been manufactured with the commercial purpose of preventing teenagers from becoming lonely.

[00:07:23]

So that means she goes out into the human world seeing things through the lens of loneliness. She asks herself, you know, do people behave like that on the street because they're lonely? Are they trying to avoid loneliness? Is that why they're like that? And that's the perspective she takes out into the human world. And this widens out into questions for her, like, you know, what do human beings mean when they say they love each other, stuff like that.

[00:07:53]

Now, a child needs an AF, an artificial friend, because that child has no brother or sister, perhaps. And you're looking maybe forward to a world of the single-child policy, which we saw in China for a while, although it seems to have been abandoned somewhat. The shortage of spouses, for example, where and when female children were being, shall we say, disposed of in some way.

[00:08:20]

But the world as it's going at the moment, I mean, I know you have spoken about these matters. If, for example, the old Soviet Union had had the tools available to China, what kind of world would we have today? Well, that's something that's started to worry me recently. You know, I've expressed a lot of optimism about artificial intelligence and gene editing and these new tools we have now. I suppose that would be the downside; those would be the concerns that I would have. I have more immediate concerns as well, about employment, the fact that perhaps a majority of us will be out of work, and that certain prejudices of the age will be kind of hard-baked inside the black boxes of A.I. So decisions are being made and we can't really figure out why.

[00:09:14]

And they may seem unfair, but we can't unpack those decisions in the way that we could with past institutions. But I think the one that you name there also worries me. I think the advantage that liberal democracies had, which enabled liberal democracies to effectively win the Cold War, because we could provide better supermarkets and give people a much more comfortable, open, free life, and be more successful economically than the communist states.

[00:09:51]

That advantage might be taken away with artificial intelligence, you know, both because of the surveillance powers of artificial intelligence and also because of the central-planning efficiencies that can be achieved with it. So liberal democracies like ours might have to work harder at setting ourselves up, because we might lose that vital economic advantage. We can't say liberal democracy is good because your country will be richer; that might not be the case anymore. And I think already China provides an alternative model: a very successful kind of economic miracle country that is actually a single-party, pretty authoritarian state.

[00:10:40]

And it looks like people are prepared to trade certain freedoms and be compliant in exchange for comfort, you know, with the exception maybe of the Falun Gong and the Uyghur people and so on, who will articulate their differences with the Chinese state.

[00:10:59]

But many people are just happy: they've got the latest technology, they've got a job, they've got comfort, they can buy luxury goods. So let's leave the state to do what it wants to do.

[00:11:11]

Well, absolutely. Yes. I mean, that is understandable up to a certain point. But I think over many decades, perhaps over many centuries, in the West we have tried to educate ourselves about these things. And we have come to understand the concept of human rights: that you cannot just have a more comfortable life at the expense of a minority who are not sharing that life. Now, that is something we are still struggling with in the West.

[00:11:42]

But, you know, we've got it a lot better than it was a century ago, even fifty years ago, I think. But I think there's a danger now that we could be complacent. And I think the events of the last few years, and particularly the events of this past year, have shown that the foundations of the liberal democratic system are much, much shakier than certainly I ever thought in my life.

[00:12:13]

You know, I'm of the generation that grew up thinking that we had these very, very firm foundations and everything was getting better. Just in the last few years, I've been wondering: oh, you know, things are much shakier. There aren't grown-ups upstairs making sure that everything's going to be fine. Yeah. And we've got to work harder at this.

[00:12:36]

It's interesting that there's nostalgia in Iraq for Saddam and the order that he brought. There's nostalgia, you know, for Colonel Gaddafi in Libya.

[00:12:50]

You know, the people are prepared to trade a lot of stuff in exchange for stability.

[00:12:55]

Well, this is the basic Hobbesian vision of political philosophy. I mean, the first thing that people demand is freedom from fear. You know, first of all, people want to be able to eat and go to bed and look after their families without the fear that the relatives will come and kill them and rape them. So that's the first thing that people ask of a government. And you trade some things in order to allow that government to give you order.

[00:13:33]

But that's a very basic thing. And it seems to me that, certainly in what you call the Western countries, we managed to build, since the Second World War, many things beyond that. And I think we have, at the heart of our way of building our societies, the idea that the individual human being is very important, and that that's the unit around which you have to build things. You don't start with a big idea and sacrifice individuals to it.

[00:13:59]

You start with the idea of the sacrosanct importance of the individual, if you like to call it that. That's the humanist, the liberal humanist view. And, just bringing things back to my novel, I guess it's asking: is that itself under threat, emotionally as well as intellectually, in a world that actually questions whether there is anything peculiar or unique about each individual? Do we actually have some sort of mysterious thing inside, like a soul, that makes each of us so important that the individual has to be the basic unit around which we build all our societies, our systems, our ethical systems? Or are we more replaceable than we think?

[00:14:51]

And so that is one of the central questions in Klara and the Sun. And it's one of the things that I feel we have to not be complacent about, living in these very privileged, comfortable countries such as yours or mine, where we presume that it's always going to be like this. Now, the idea of Klara and the Sun.

[00:15:19]

I mean, I read somewhere that it might have been a children's book, but you took advice from your daughter that perhaps this was not the way to go.

[00:15:29]

Well, my daughter wasn't a child. My daughter, Naomi Ishiguro, is a novelist, but she was then working in a bookstore, so she knew a lot about small children's books. But yes, there's something about Clara that is like a character from one of those picture books for children, you know, five or six years old. And I love those. I love the world that's created in those books, you know; I like the relationship between the drawings, the illustrations, and the little tender story that's usually offered to small children.

[00:16:00]

I love the way that the adult world is trying to protect a small child from the harsh truths in those stories and present a kinder world. But at the same time, you can sense the reluctance of the adult world to actually mislead the child. So you can see the hints of sadness and the dangers and darkness, often in the drawings as much as in the stories: behind the clouds, in the shadows of the trees, in the eyes of little animal characters, you can see traces of this darker world that we're not quite telling our children about.

[00:16:39]

Well, we want you to think it's lovely now, but you have to be a little bit ready as well. So I've always loved that, and I tried to put a lot of that into this novel: the visual images of kind of big skies and big suns. And, as I say, the vision of this robot, a lot of it remains like that of a small child. And, yeah, that kind of hope and trust that exists in that very fragile world of small children's books.

[00:17:15]

I wanted to have that existing in Clara's vision. Well, it's a wonderful book. Klara and the Sun is its title; it's available now from good bookshops. You can order online, of course; order locally, help out your local bookshop. It costs around 20 euro. And if you want to hear more from Kazuo Ishiguro, he'll be speaking on the 12th of March as part of the International Literature Festival Dublin. You can find out more about all of that on their website.

[00:17:43]

But Kazuo Ishiguro, thank you very much for joining us on our programme today.