[00:00:02]

From ABC, this is the Ten Percent Happier podcast. I'm Dan Harris. Hey, hey, a question for you: given that social media has been blamed for rising levels of anxiety, depression, loneliness and political polarization, is it possible to use this technology wisely? This has been an incredibly urgent question for a long time, but perhaps never more so than right now, as we head into an election. And it's the question we're going to dive into today with Randy Fernando, who's featured in a new Netflix documentary called The Social Dilemma, which is all about the many alleged pernicious impacts of Facebook, Twitter, Instagram, et al.

[00:00:47]

Randy is the co-founder and executive director of the Center for Humane Technology, and he's also a longtime meditator.

[00:00:54]

We start out by talking about what he sees as the dangers of social media, but then we get into a fascinating discussion where he ticks off a ton of techniques, informed by his knowledge of Buddhism, for using social media in a way that won't cause you to lose your mind. Here we go, Randy Fernando.

[00:01:12]

All right, Randy, nice to see you, thanks for doing this. Great to be here. Thank you so much.

[00:01:19]

And just by way of background, I'd love to hear a little bit about you.

[00:01:24]

How did you arrive at this confluence of meditation, mindfulness, and the perils of social media and technology generally? It's been an interesting journey. I was born in Sri Lanka to a very Buddhist family. My parents, both in theory and practice, are just very serious about Buddhism, and it's been very useful for all of us in terms of navigating our lives. It's the framework that I use, and that my wife and I use at home when we discuss life and its problems and challenges.

[00:02:01]

It's a great framework. It's been really helpful. I'm very grateful to my parents. They taught me the precepts when I was five. And so I've really not had to look for anything else to give me that guidance. So I actually learned basic meditation about age eight from my mother and programming from my father. So those are the two threads that kind of diverged and then came back together in quite unexpected ways, I would say, in my career.

[00:02:29]

I loved programming. I loved making pictures appear on the screen. And so I just followed that. I just happened to pick something that turned out to be really relevant as I grew up. And I ended up at Cornell to study computer graphics. And that same passion took me to NVIDIA in Silicon Valley. I got to manage a bunch of different software projects and to author three best-selling computer graphics books. So that experience, you know, is one of those weird things you get to do in Silicon Valley at a young age. I had just come out of my master's and got these great opportunities.

[00:03:05]

And around that same time, I got back into meditation, and more into the study. I was reading a lot of suttas and meditating much more seriously in my late 20s. And at the same time, I started volunteering, I started looking for things I could do and just started learning, and that process led me to help build a nonprofit called Mindful Schools, where I served as the executive director for seven years. And we ended up bringing mindfulness to nearly a million children globally.

[00:03:38]

And now I think that number is several million kids. At the same time, I started doing retreats regularly because of that work, one to two retreats every year, and I think that was all really helpful. And then I ended up cofounding the Center for Humane Technology with Tristan Harris and Aza Raskin. I had met Tristan right after he was on 60 Minutes, and he was getting this flood of interest. It was crazy.

[00:04:07]

And he asked me to help to kind of corral that and organize it. And that's what led here. And so throughout my career, I've really been exploring all of these different intersections. I'm very interested in helping the deeper Buddhist teachings survive in a world of mindfulness that often is very watered down. And I think there is a lot of wisdom there. And so I serve on the board of Spirit Rock Meditation Center and also of a small group called the Buddhist Insight Network, both of which are really dedicated to preserving these teachings.

[00:04:42]

I'm really pleased, and I'd say pleasantly surprised, that all of these things have come together with the work at the Center for Humane Technology in a way that I had never anticipated.

[00:04:54]

Can you say more about that? What is the Center for Humane Technology? Who is Tristan Harris? How do the Buddhist concepts get woven in there in practice, et cetera? Sure.

[00:05:05]

So the Center for Humane Technology is a nonprofit organization of deeply concerned technology and social impact leaders, and we are focused on addressing the harms of the social media platforms: outrage, polarization, addiction, depression, political manipulation, and ultimately the breakdown of shared truth, which is the one that keeps us from solving everything else. When you try to think about what humane technology is, what is it that you actually want? It's often easy to define what we don't want.

[00:05:38]

It's easy to define what's going wrong and to say, oh, here's the problem. Getting to the solution is often much trickier and more subtle. And I would say where the Buddhist concepts really come in for me personally is that humane technology, first and foremost, needs to reduce suffering. And from the Buddhist point of view, that largely is going to be related to addressing greed and hatred instead of perpetuating them, and reducing ignorance as well.

[00:06:09]

But a lot of technology right now actually does the opposite. Instead of reducing greed and hatred, it promotes them, because a lot of the time that's the way to sell more product. You want people to be dissatisfied with their situation, with what they have. A lot of bad marketing is all about that. Good marketing looks at the values someone holds and where there's a real problem. And often that's a much narrower spectrum, and it results in lower revenue.

[00:06:43]

So humane technology reduces suffering. It has to be values-centric. It has to look at what people need in their lives, what their values are. It has to recognize that technology is not neutral, that anything we build is an expression of our value system, and that when we place it into the world, it lands in the water, right, and everyone swimming in that water ends up conditioned by it. Humane technology has to be sensitive to human nature.

[00:07:16]

It has to recognize that we have certain physiological vulnerabilities, and that's just how we're built. And most of that is about survival, about helping the human race survive. I would say we're not really evolved toward happiness necessarily; we have to work at that. We're evolved to reproduce and perpetuate the species. But when we think about happiness and wellbeing and reducing suffering, often that requires some work. Another characteristic of humane technology is that it builds shared truth instead of dividing us.

[00:07:51]

This is a huge problem right now, where people are more and more divided. A lot of the technologies out there, especially social media platforms, are dividing us, often in unintended ways. But it's a big deal. The way they're designed actually perpetuates that division, and it's driven by the underlying business model as well. And the last thing is that humane technology has to account for the unintended consequences it generates and try to minimize them. And again, this is related to the Buddhist view, because we see everything through conditions.

[00:08:28]

One thing conditions the other thing, and it's just this endless stream of conditions that perpetuate. So when you have the privilege of designing software that is going to be used by millions or billions of people, you have to be aware that there are some serious conditions you are perpetuating out in the world. And so it makes all the difference when we're trying to design more humane technology. You're right that we will dive more deeply into solutions once we've walked through the problems associated with social media in a more granular way. Just not to let it hang, though: you referenced this individual, Tristan Harris.

[00:09:13]

He showed up on 60 Minutes. I remember that piece; it was an Anderson Cooper piece about how social media is designed to hook us. Tristan, if I recall, worked at Google, then left and became a critic from the outside. Can you say more about him and your work together?

[00:09:30]

Yes. As you said, he was at Google, and what he saw there was the immense responsibility these engineers were carrying: every keystroke, every product decision was shaping the way millions, and actually billions, of people were using products. And he started to see this attention economy game that was being played, this race to the bottom of the brainstem, as he likes to call it, where the companies are competing for our attention in order to monetize it.

[00:10:06]

That became very clear. And so he started to speak out more and more, and others were speaking out as well. He was very articulate in how he expressed these things, and he was able to translate the experience people are having on their phones, which is very resonant for them, into words that said: oh, that thing you're feeling on your phone is actually part of a bigger problem.

[00:10:34]

And that's what happened, in different versions. The bigger problem first looked like, oh, this attention thing is stealing our time, stealing our attention. And then it started to look bigger: oh, wait, it's impacting our relationships. It's impacting democracy. It's impacting polarization. It's impacting shared truth. It's impacting our ability to actually solve other problems. And at this level, it starts to become existential, because if you can't build common ground to solve problems, you're going to be in big trouble as a human race.

[00:11:09]

So you and Tristan are featured in a new documentary on Netflix called The Social Dilemma. I've watched it; it's very interesting and unsettling. What would you describe as the basic thesis?

[00:11:23]

I think what's special about this film is that it describes how social media works and the harms it creates, as told by the insiders who helped build the products, the people who were there and saw from the inside what's going on. There's a lot of credibility that comes with hearing from people who were on the inside, who saw what was going on, the decisions and the thinking, and who now, roughly 10 years later, can see the consequences, what has played out as a result.

[00:11:58]

And this is particularly true at a time when all of us, especially children, are on these platforms more than ever. We need to understand how they work: what are the implications, what are the mechanisms, how does the attention grabbing actually work, what are the different types of notifications, the infinite scroll, the way choices are presented to us in menus? All of these determine our behavior. And then we need to understand the process by which we are actually the product.

[00:12:30]

Right. This is something that Douglas Rushkoff said back in 2011: we are the product. It's actually our attention and our behavior that is being sold. It's access to our brain, access to our next thought, that is being sold, that is being auctioned. And it's not only being auctioned in a general way; it's being auctioned in a very specific, micro-targeted, highly optimized way, where a third party can say, hey, I want to buy this and I want to target this specific group of people.

[00:13:05]

They live here. Here are the demographics. Here's the gender. Here's the race. Here's the age. Here are their interests, all the fun stuff everyone shares on Facebook. That information is then used to target you, on Facebook and all the other platforms. And it is not only the information we shared directly; it is also information that is inferred. The algorithms infer information about you. So, for example, they can infer your wealth level.

[00:13:35]

So when you're an advertiser, you can choose. It's so detailed: you can do things like guess someone's net worth and say, hey, if they're high net worth, send this ad to them. And like that, there are all these things about what their interests might be. All of this gives a third party access to your brain at specific intervals: while you are scrolling through a feed or interacting with the product, something jumps out, right?

[00:14:03]

It's decided this is the moment you might be susceptible to a specific kind of ad, because the platform's goal is to deliver that ad in a way that makes you click on it; that's when the transaction takes place. And so I think this is a very dangerous model, and it's very, very easy for malicious third parties to hijack it and use it for different motives. So no question we've seen malicious third parties game the system, but just to play devil's advocate, isn't this creating an efficient market for advertisers to reach people with products or services that might be useful to them?

[00:14:46]

For example, take the targeted ads. I'm not on social media that much, for reasons that we'll get into, but sometimes a pair of shoes pops up and I'm thinking, I actually need a pair of shoes, and that's a good pair of shoes, I'm going to get that. And by the way, just full disclosure, Ten Percent Happier advertises on some of these platforms. And isn't it a good thing that we can figure out who our target customer is and reach them with something that's going to make them happier?

[00:15:11]

Yes, it's not always in conflict. I think a lot of it comes down, again, to the root principle: is it increasing greed and hatred and delusion, or reducing them? In some cases, if you're offering a meditation app to people and you're sending it to the right people, and it's a sincere app and the whole model behind it is sincere, that part of it is not a big problem. One of the things that would be really helpful is if the platforms let someone explicitly signal: I'm looking for a meditation app right now.

[00:15:45]

Instead, it's all inferred, and this is where it gets very muddy and very dangerous. If instead I were able to say, OK, I'm actually looking for a meditation app right now, can you help me find that based on what you as a platform know about me and about the world? Now, that kind of relationship, as sweet as it sounds, can only work effectively if there's a very high degree of trust, essentially some kind of a fiduciary relationship between you and the platform. Just as your lawyer has to protect you and your doctor has to protect you, the platform has to do what is in your best interest. And in return, for that to work,

[00:16:25]

you give them all kinds of information, they have access to everything, and that's the point. But in this case, we know time and time again that these platforms have failed to protect people. In fact, they sell that information. It's easy to game. Some of it is definitely their fault. Some of it is other people using the system as designed to manipulate in creative ways that were unanticipated. So then each time, the platforms go and patch up that part.

[00:16:53]

But one of the key premises that we make, and that the film makes, is that the whole stage, the whole platform, the whole ground is already tilted, and it's tilted because of this underlying business model. If you follow the money, you can see that the incentives are not aligned between the people who are on the platform, happily sharing their information and interacting, and the advertisers who are buying access to their thoughts and their behavior changes. That's the problem.

[00:17:26]

So when those incentives aren't aligned, naturally you're going to have this tilt that is not in favor of the actual person using the platform. And this becomes clear when you look specifically at each of the different areas where harms come from. So I'm going to mention a few factoids which I think are very relevant, just so everyone understands what's at stake here. In mental health: from 2016 to 2019, there was a quadrupling in the number of cosmetic surgeries done for the sake of looking good on social media.

[00:18:03]

For social relationships and values: the more someone treats an AI like it has human qualities, like when interacting with Siri, for example, the more they later dehumanize actual humans and treat them poorly. Children under age 14 spend nearly twice as long with tech devices as they do in conversation with their families; the device time is about three hours, 18 minutes per day. With truth and facts: fake news spreads six times faster. And that's because it has so many more degrees of freedom and is often so much more appealing.

[00:18:42]

And when you compare that with another fact, which is really troubling: people tend not to change their minds back. When they've been given a factoid and it's planted in their head, it's pretty hard to change their minds back. So even if you track down everyone and say, all right, wait, hang on, that thing you saw is not true, you can try to do it, but it's a lot harder. People are loyal to what they've heard.

[00:19:04]

This is the idea of first impressions. With children: children who have been cyberbullied are three times more likely to contemplate suicide than their peers. There was a study that tracked 200 children from the ages of two to five, and children with higher levels of screen time showed greater delays in development across a range of important measures, including language, problem solving and social interaction. With polarization and extremism: anger is the emotion that travels fastest and furthest on social media.

[00:19:41]

Every word of moral outrage added to a tweet increases the retweet rate by 17 percent. So these are some of the examples, right? I'm just trying to explain why we say that the platform is tilted by default, because if the game is attention, and anger travels furthest and fastest, and fake news spreads six times faster, what is going to dominate on these platforms? It becomes very, very obvious, and it's exactly what we end up seeing.

[00:20:12]

And I think that's extremely dangerous. You can read more about all of these at ledger.humanetech.com, and on our podcast, Your Undivided Attention, at humanetech.com/podcast. So, yeah, it's a problematic situation, I would say. Just on the anger and the fake news: in the film, the cavalcade of experts, all very credible, in my opinion prognosticated in quite horrifying ways about where we're heading as a global society, given the pernicious impact of social media, that we can't agree on a basic set of facts upon which to have a debate, that anger wins out.

[00:21:02]

Look what's happened in Burma with the Rohingya Muslims and the Buddhist majority. They're carrying out what appears pretty clearly to be a genocide, fueled, allegedly, by Facebook.

[00:21:13]

So the prognostications run toward civil war, all over the place. There's a lot to unpack here. People have always had their perceptions, their hatreds, their desire for fame; all of these are there. The film is not saying social media is causing all of these things to happen out of the blue. What it is, is an accelerator, because it is now the infrastructure. It is a big part of our communications infrastructure, and it's also a big part of how we make sense of the world.

[00:21:50]

It's taken over a lot of journalism, because credibility is now based on a different currency. It's not based on how well you researched your article; it's based on how many likes and comments and shares it's getting, how much influence it's having. And unfortunately, we just explained how bad that second metric is: much more related to sensationalism, to anger, to fake news. And so you end up in this world where everyone says things like, oh my God, I saw the greatest thing, the most amazing thing I have ever seen, yesterday.

[00:22:26]

That's the kind of language people use all the time, and it's the kind of language that has now become necessary to get attention, because everyone's competing. So you're saying, oh my God, this is the cutest cat video I've ever seen, you've got to see this, because everyone else goes: I've seen cat videos. I have seen cat videos. Yours is probably great, but I've seen them all. So to get someone to rise up and see the next one, you have to use language like that.

[00:22:52]

And with cat videos, the consequences are not that great. But when it comes to domestic civil war or disagreements, it becomes a major issue. And it turns out that protecting our physical borders, we've already got that right: the United States has many trillions of dollars invested in its military. But in contrast, the digital borders are not secure, and it's a lot easier to penetrate those. It's actually pretty cheap, on the order of hundreds of thousands or millions of dollars.

[00:23:26]

You can start dividing people, you can plant narratives, you can make fake groups and invite people to fake events. And all of these things have happened. So you end up with people physically showing up on two sides, both of whom were manipulated. And I think all of us have fallen for this stuff: because we care so much about the topic, we end up forwarding something without checking. We were just thinking, like, that has to be true, because we want it to be true.

[00:23:56]

We can all be easily manipulated. We can all be part of the problem all too easily with very good intention.

[00:24:04]

More of my conversation with Randy Fernando right after this.

[00:24:08]

Thanksgiving and Black Friday may look a little different this year, but there's still a lot to be thankful for, like being able to find the right people for your team when the holiday rush has you ramping up your small business. So when you're ready to make that next hire, LinkedIn Jobs can help by matching your role with qualified candidates so you can find the right person for your business fast. Getting started is easier than ever. Post a job with targeted screening questions and they'll quickly get your role in front of more qualified candidates. Manage job posts and contact candidates from a single view on the familiar linkedin.com, as functions are streamlined onto one simple screen.

[00:24:42]

And now you can do all this from your mobile device, too. When your business is ready to make that next hire, find the right person with LinkedIn Jobs. You can pay what you want and get the first fifty dollars off. Just visit linkedin.com/happier. Again, that's linkedin.com/happier to get fifty dollars off your first job post. Terms and conditions apply. Ten Percent Happier is supported by BetterHelp online counseling. We're in extraordinary times, and if you're struggling with stress, anxiety or depression, you're not alone.

[00:25:12]

BetterHelp offers online licensed professional counselors who are trained to listen and help. Simply fill out a questionnaire and get matched with a counselor in under 48 hours. Join more than a million people taking charge of their mental health with BetterHelp. BetterHelp is an affordable option, and our listeners get 10 percent off your first month with the discount code HAPPIER. Get started today at betterhelp.com/happier. That's betterhelp.com/happier. It's interesting, because I was watching the film and you're talking about QAnon and Pizzagate, and I'm thinking to myself, well, anybody who believes any of that stuff is, you know.

[00:25:56]

Gullible, right, to put it gently. But your last statement, and the film also, seemed to be arguing that even those of us who consider ourselves to be smart or whatever can be hijacked by fake information, even if it doesn't mean believing that they were running a pedophile ring out of the basement of a pizza parlor.

[00:26:17]

That's exactly right. I think it's just our own good intention, our own well-meaning desire to share: oh my gosh, this has to be true, this matches with my world view, therefore I want to share it with my friends, because everyone needs to know this. At least we should do the diligence of looking at the actual story that we're sharing, reading it, at least skimming it, and saying, OK, that seems credible, it's got reliable sources behind it.

[00:26:45]

When we share an interesting infographic, share the source so other people can take a look and say, oh, actually, that's been debunked, or, you know, they can have an intelligent discussion about it. So it's increasing the friction a little bit. This, then, is parallel to mindfulness: increase the space a little bit, increase the space before we react, take a look at what we're doing, and respond with a little more wisdom.

[00:27:10]

I think that's one way to do it. Now we're starting to get into practical solutions for wise use of social media. But before we really dive in on that, let me just ask about one other area of the pernicious impact of social media that you talk about in the film, which is mental health. And we've touched on this a little bit.

[00:27:28]

But how strong is the correlation between social media use and adverse mental health outcomes such as depression or anxiety?

[00:27:39]

The opposite of addiction is really well-being; it's love. You have to have the stability inside to be able to overcome a lot of these feelings. And when we are feeling vulnerable, when we are depressed or angry or anxious, that's exactly when we are most vulnerable to these algorithms. It's specifically those kinds of mental states that make us more vulnerable to doing all of the things we just discussed, like sharing something or going back to post, because we want that.

[00:28:09]

We want some support. We want some acknowledgement. We're feeling down, we're low on agency, and we post something because we want people to say, I like it, I like you, you're good, because we don't feel good about ourselves in that moment. Our own well-being reservoir is low. Not everyone is affected in a large way; everyone is affected in a small way. But the people who end up in more vulnerable situations get affected in a larger way.

[00:28:36]

It's easy for someone to get sucked into a rabbit hole, an extremist rabbit hole, for example, when they're looking in the middle of the night and their cognitive ability is lower. Right. You're tired, you're not as discerning, and you're more suggestible. You know, I see some negative impacts in my own mind when I use social media. As I mentioned, I try to be pretty sparing.

[00:29:01]

I will occasionally post on Instagram, but I will delete the app afterwards. I'm not really on Facebook. I do use Twitter, which I don't find messes me up too badly, but I use it pretty judiciously. But with Instagram in particular, I notice, if I'm using it regularly, there are two things that I see that are deleterious to my mental health. One is I get obsessive about how many likes I've gotten on whatever I've posted, and two is that I start comparing myself to these carefully curated images from the lives of my friends and colleagues, or of parties that I wasn't invited to, or whatever.

[00:29:40]

And so I can see how both can fuel depression and anxiety.

[00:29:44]

I'm just curious: are there studies that really show a correlation between social media use and depression and anxiety?

[00:29:54]

Yes, they're there. I think one way to understand this, beyond the studies, is to look at it from a training lens. Again, from a Buddhist point of view, I look at it very much as conditions: OK, how are the conditions affecting us from the moment we're born, or actually before that? Right. Our existence is highly conditioned: our genes, which is what we come in with, our environment, our relationships, our experiences.

[00:30:21]

All of these things are highly conditioned. So we have to be very thoughtful about the ways technology is training us, individually and as a society. We have this intuitive understanding of the laws of social physics: the thing that matters in terms of your voice and your ability to communicate effectively is likes, comments and shares. You start to put this together in your head, and kids are wonderful at doing this. They learn these kinds of patterns very, very quickly and start adopting them, but so do adults.

[00:30:55]

And so it's certain types of language, as you said, certain types of curated poses, curated pictures. That's what you learn, that's what you observe, and you sort of integrate it into your brain and say, OK, that's how this works. And I think that has very dangerous downstream consequences, because it ends up shifting our value system. So there have been studies recently on how being an influencer or seeking fame has moved much higher on the list of things kids care about than it was, let's say, 10 years ago.

[00:31:31]

And that's natural. This is just practical; it's not a subjective judgment. The fact is that in this economy, if you have something useful to say and you want to say it, you need to have more attention in order to achieve that. And because of all the things we said about the default tilt, it's pretty hard to achieve that attention if you follow a path of being modest, being humble, being simple, being accurate. Being really accurate is really hard.

[00:32:04]

As a journalist, I'm sure you've seen this trend. And so you end up in this era of clickbait headlines, because that's the pinnacle of the fight for attention: you even leave the headline out of the headline. You just put the dot, dot, dot. The one thing that you need to know about this election is... And you just wait for the click. It's crazy. So, given all of the harm, macro and micro, that you've just described from social media, is there a way to use this technology wisely, given that it's so pervasive?

[00:32:40]

People need it professionally; they need it to keep up with their friends. If you drop off of Facebook, you may not know what your family is up to. You may not hear about the family gatherings or the parties that you actually do want to go to, that would be healthy to attend. Is there a way to use this stuff in a healthy way that's conducive to human flourishing, or is the only answer abstinence, deleting these apps? Because certainly some of the people in the movie are suggesting: don't use this, and we certainly don't let our kids use it.

[00:33:07]

Yeah, I think there are a lot of facets to this. One is age, for sure, because I think for kids to use this at a young age is exceptionally dangerous. So that's one line to draw in the sand. One of the beautiful things about this film is that it allows everyone to have some shared understanding about this problem. There's a lot of safety in bringing up this topic: you're not going to be the weird one at your parent group or anywhere, because millions of people have watched it, including teens.

[00:33:37]

Right. So that safety that the film brings to allowing these conversations to happen is huge. That's the first piece. One of the fundamental problems, as we've talked about, is that the platforms are tilted. If you're on them, you are going to be subject to that tilt in one form or another. And you may think, oh, it's OK, I can overcome these things. And in many cases that's true.

[00:34:06]

We think we can figure out what's true or not, that we won't be too manipulated. But remember, the game is always progressing, and we've probably already made mistakes. For sure I know I have: I've shared things that weren't true because I was excited, or behaved online in ways that are not the way I would want to behave, because of these specific conditions. And all of this is only going to get worse with deepfakes.

[00:34:33]

We can fake video, we can fake audio, we can fake text very easily. There's all kinds of stuff that we are inevitably going to be vulnerable to. And I would ask the question more about why are you going there? Why are you using it? What are you seeking from it? So one example of where Facebook actually is useful: if I'm about to see someone I haven't seen in a while, or talk to them, you can go on Facebook, type their name, scroll through their feed, and see what they've been up to.

[00:35:06]

That works great. But the problem is most people go back to the platforms and they let the default feed be their experience. So when that's your experience, what happens is there's sort of this replacement of the timeline and the sequence of actions that you were intending to do, and that gets replaced with a different sequence of actions that you were not intending to do. And sometimes you can drift really far out. I think YouTube is the best example of that.

[00:35:35]

If you go to YouTube.com, just open an incognito window and go there, you'll see on the default homepage so many interesting random things that you want to click on. And I saw one which was really amusing. It was little coffee cups made with Lego, tiny micro things. And I was like, oh, that's really cute, I'd love to click on it. And you can see every single one of those is interesting in one way or another.

[00:36:02]

And the more it knows about you, the better it'll do at finding that. So that's not the experience you want. If you go to YouTube and you're looking to learn something, a new skill for example, YouTube is great at that. You can get that information, watch the video, and then you should close it. This is where it gets tricky: there's autoplay, there are recommendations, and we can't help seeing those things and saying, oh, well, that could be interesting, let me just click one more video. And then it's a bunch of time.

[00:36:34]

Right. That experience everyone is familiar with. And I think the challenge is the platforms are not well incentivized to solve that problem, to make it such that when you come, you get exactly what you want and then you leave. So if I'm hearing you correctly, one way to use these platforms wisely would be to be pretty specific and intentional about it. So if you want to go to YouTube to learn how to tie a tie or how to hang up a chin-up bar or whatever it is (I just hung one up),

[00:37:12]

then you can go there and look for that, but be aware that you are facing off against supercomputers that are really good at getting you to stick to the site.

[00:37:23]

So, yes, you can go and get what you're looking for and then turn it off. And with Facebook, same thing: I heard you say that if you want to find out what a specific friend is up to before you see them again, go look. But if you're just going to get sucked into the random timeline, then you're likely to get manipulated.

[00:37:42]

Yes, that's true. The trick, and I think one of the real challenges, is that, for example, Twitter has lots of good news from people who are experts in their fields. Many experts in many fields use Twitter to share their latest insights, stuff that hasn't made it into articles or into Wikipedia or any other reliable source yet. And so if you want to know the latest, you end up going to Twitter.

[00:38:08]

But you can make a pretty good argument that these very short, limited-character tweets and sequences of tweets are not the best way to communicate real insights or to have deep conversations. That's the problem: you're using it outside of the fundamental way it's being driven to operate. There are many well-intentioned technologists at these companies who are frantically trying to patch all of these things. They're trying to put out all the different fires, they're trying to patch all the different problems, but they are unable to change the actual default tilt of the platform, because they don't have access to the business model to change the way it actually works, to change those actual incentive structures.

[00:38:54]

So they're forced into what is in some ways an impossible battle, to just put out fires constantly and find new ways of putting out fires. One way to think of this is: is it a good idea for all of us to just have a megaphone, go into the public square, and everyone start talking? It's just not a natural way of interacting. And so you end up with a set of assumptions that just don't match what's actually healthy for people.

[00:39:20]

And that works on a lot of different axes. So let me just say a few words about Twitter. And just to be clear, I have no investment in Twitter, and I'm certainly open to the many, many critiques of Twitter, but I don't find it personally problematic.

[00:39:37]

Yes, I find it deeply unattractive that people I know who are otherwise sane are on there just basically spewing a lot of venom. I think it brings out the vituperativeness, if that's even a word, in otherwise calm, reasonable people. So I see that critique, and I do see it in my own timeline, but I don't feel that sucked into it. When I look at Twitter, I find it's very interesting for me on a couple of levels.

[00:40:03]

One is, I'm very interested in what's happening with the pandemic, and there are lots of epidemiologists on there, and I can go and read their threads. It's very interesting. I'm also very interested in the election, and I like to read perspectives from people on both sides. Sometimes the hot topics are too hot. But I like seeing what articles people are posting, because then that takes me into a deep dive into something that's well researched, and not just a few characters, et cetera.

[00:40:33]

And by the way, one other thing I like about Twitter is that random people can reach out to me. Sometimes they're saying nasty things, but usually they're saying really nice things. And they're asking me a technical meditation question, and I can take a few minutes to answer it. So I don't know, maybe I'm deluded. What do you think of my take on Twitter?

[00:40:51]

No, that's great. I think your take on Twitter is accurate for you. Again, I think the point is about the way the platforms are tilted by default. If you're careful, if you curate correctly, then the way you're using it is very different from the way a lot of people use it. And so I think that is helpful. There's a great website, by the way, called AllSides.com, that will show you the left, the center, and the right for many different articles.

[00:41:19]

It's a way to see all of the different views. I do this, too. It's actually one of our recommendations: don't go to the most sensational opposite sites, but go to the ones that represent different viewpoints and try to understand them. I actually go into the comments and I try very much to understand what is going on, because I think ultimately this is about understanding other people's viewpoints. That's what we're missing in the overall political landscape.

[00:41:46]

We don't spend enough time understanding the viewpoints of others.

[00:41:51]

Is there a way to affirmatively decide to use social media and to be a vector of positivity?

[00:41:57]

Just on a personal note, yesterday, the day before we were recording, this was my son's first day of in-person school. We're living in a little town where you have the option of doing remote or in-person. And we had let him do remote for the first couple of weeks and then he decided to go in person. And so I went to drop him off and he was freaking out. He was so scared. Everybody's in masks and it was really hard.

[00:42:24]

And his cousin, who's in first grade, was with me at the time and being very supportive, and she ultimately coaxed him to go into the school. And I got this picture of her carrying his jacket all the way down this long hallway that I was not allowed to go down, walking Alexander into the classroom. On his part, it was remarkably brave, and on her part it was extraordinarily kind. And I had this thought, which I haven't acted on yet just because I'm lazy.

[00:42:53]

Maybe I should post this, like: I hereby interrupt the political death spiral or whatever you're in right now to show you something really cool that a six-year-old did for a five-year-old yesterday.

[00:43:05]

So, all of that is a long way of asking: is there a way to use social media with the intention to be positive? Yes.

[00:43:14]

So let's talk about it. I think there's a lot of interesting discussion there, especially, as usual, using the Buddhist lens to analyze it. And I think this is exactly what we should be talking about. At the same time, there are a lot of tips on the Center for Humane Technology's site, humanetech.com, that are really helpful and actually somewhat unconventional, different from the typical tips that you hear. But for now, let's talk about the Buddhist analysis of this, or at least my version of it.

[00:43:40]

So let's start with what it feels like as we scroll. On one hand, there's the cat video, super cute. There's a political thing, super heavy. There's an inspiring picture of the two kids that Dan mentioned. Then there's an ad somewhere. Right? So your brain is really getting sliced around. And I definitely feel that when I'm scrolling through. I don't find that to be a pleasant experience. I find it to be a very sliced experience.

[00:44:08]

And paired with that, we have to look at the intention we came in with. What is our intention when we post? One really simple exercise we can all do: write your intention when you post, right along with the post, at the bottom: here's why I'm posting. When you do that even a little bit, you stop, because a lot of the time the posts are related to... well, the point is, I'm feeling bad about something.

[00:44:38]

I'm feeling anxious or depressed or angry, and I'm trying to make up for that. I'm not looking at the feeling itself; I'm making up for it in a different way. So then it takes you to this point where you say, all right, well, there are two options. One option is to create distance between us and the devices. I think that is wise. I don't think of these things as a binary; it's kind of a spectrum.

[00:45:04]

And, knowing yourself, you have to create the right amount of distance. Probably it's more than you think. And then the second part is to address the underlying instability: what's the problem underneath? How do we become more stable underneath? And actually, I think it's very closely related to the realm of the Brahmaviharas, which I think I've heard you talking about on the podcast as well. So this idea of loving kindness, compassion, sympathetic joy, and equanimity: when we are practicing those, we end up weakening our sense of a fixed self.

[00:45:43]

That's the thing that causes all of our problems, really: not understanding that the self, as we experience it, is highly conditioned. It's very much in flux, it's always changing, and it's shaped by the conditions outside of us. It's shaped very much by our relationships. And this is why it's so important, when we have a kid, to look at who their friends are, and important for us to look at who our friends are, because they have the largest footprint in terms of shaping us.

[00:46:18]

But now there's this invisible, unaccounted-for friend, which is your phone, and it's actually around you more than a lot of these other companions. And it's training you all the time. So this is also about more than just what you went to the platform for. It's about all of the different interactions. Every time you click a button and are instantly gratified, it reduces your tolerance for when things don't go your way.

[00:46:49]

So back to this connection idea. Deep connection with others is a very good solution, a strong way to develop protection for ourselves. And deep connection with others is best achieved in person, on video, or on the phone, not in online chat or texting, because online chatting and texting are highly interruptive. Say I'm chatting with you on Facebook. I'll send you a message, and then you will look and you'll start typing something. Meanwhile, I am looking at something else, and then it pings.

[00:47:24]

And then I'm brought back, and then I say, OK, so now I'm typing. And while you're seeing my dots, you're looking at something else again. And so this process keeps repeating, and in effect you're doing a great disservice to your friend, because you're doing this repeated-interruption thing, which from a mindfulness point of view is the exact opposite of what we're trying to cultivate: healthy, sustained attention, sustained mindfulness. We are slicing it up constantly.

[00:47:54]

And also, practically speaking, what you also find is that when you do this kind of chatting, you end up spending a lot more time for a very small number of words. It just takes a long time to do well.

[00:48:07]

Really appreciate you coming on and talking about wise, healthy ways to use social media. Going in, I wasn't sure there were going to be any affirmative answers to that, but you gave many, with the appropriate caveats. We will be putting links, everybody, to the various web pages that Randy cited in the show notes. If you want to go check out the Center for Humane Technology, we'll make that easy for you. And we'll put a link to the documentary on Netflix.

[00:48:33]

Randy, anything I failed to ask you before we close here? No, it's fantastic.

[00:48:38]

I think we covered a lot of things. Let me just take a quick scan and see if there's any other big thing we should touch on. Oh, there is one thing: the topic of what we do after watching this film. People say, I get it, I'm really concerned, what's next, and how do we fix the system? One thing we can do that's really helpful is watch the film with others and have conversations with them.

[00:49:09]

So then after that, there's this movement, a movement for humane technology, that's building up. You can sign up at humanetech.com. That's going to take us to the bigger impact, the bigger change that we all want to see.

[00:49:25]

Randi, thank you to you and your team for doing all this work. Really appreciate it. Thank you for coming on today. You're very welcome.

[00:49:33]

It's been a real pleasure. Thank you so much for the opportunity. Big thanks to Randi, really appreciate him coming on. One last thing before we go: we, as I hope you know, care deeply about supporting you in your meditation practice, and we feel that providing you with high-quality teachers is one of the best ways to do that. Customers of the Ten Percent Happier app say they stick around specifically for the range of teachers and the deep wisdom they impart to help them deepen their own practice. For anyone new to this app,

[00:50:04]

we've got a special discount just for you. And if you're an existing customer, we thank you sincerely for your support. To claim that discount, go to tenpercent.com/reward. That's tenpercent, one word, all spelled out, dot com slash reward. Finally, big thanks to the team who work incredibly hard to make the show a reality on the regular. Samuel Johns is our senior producer. Marissa Schneiderman is our producer. Our sound designer is Matt Boynton of Ultraviolet Audio.

[00:50:32]

Maria Wartell is our production coordinator. We get a ton of massively helpful input from our colleagues, including Nate, Toby, Jen Poyant, Ben Rubin, and Liz Levin. And finally, a big thank you, as always, to my ABC News longtime comrades Ryan Kessler and Josh Cohen. We'll see you all on Wednesday for a special post-election-day episode. We're going to record this one late at night, once we have, hopefully, some sense of what's happening, with Lama

[00:51:01]

Rod Owens, who you may remember from back in June, in one of the most popular podcasts we've ever done. So very excited to have Lama Rod back on at a time when we're going to need him. Thanksgiving and Black Friday may look a little different this year, but there's still a lot to be thankful for, like being able to find the right people for your team when the holiday rush has your small business ramping up.

[00:51:27]

So when you're ready to make that next hire, LinkedIn Jobs can help by matching your role with qualified candidates, so you can find the right person for your business fast. Getting started is easier than ever. Post a job with targeted screening questions, and they'll quickly get your role in front of more qualified candidates. Manage job posts and contact candidates from a single view on the familiar LinkedIn.com, as functions are streamlined onto one simple screen. And now you can do all this from your mobile device too. When your business is ready to make that next hire,

[00:51:57]

find the right person with LinkedIn Jobs. You can pay what you want and get the first fifty dollars off. Just visit LinkedIn.com/happier. Again, that's LinkedIn.com/happier to get fifty dollars off your first job post. Terms and conditions apply.