[00:00:02]

If you enjoy guessing what comes next in a mystery, or ever find yourself wanting to become a part of your favorite stories, then you will love unraveling the family mystery in June's Journey. June's Journey is a hidden object mobile game where you follow June Parker as she embarks on a quest to solve her sister's murder and uncover her family's many secrets. The game is set in the Roaring Twenties, so besides feeling like I've got an escape room in the palm of my hands with all of these clues and secret hiding places, I also just love getting to see the luxury and glamor, and you can build and customize your island estate as you play. Whether this is your first case or you're a seasoned Sherlock, June's Journey will keep you searching for hidden objects in every scene. With new chapters added weekly, you'll never run out of clues to chase and suspects to interrogate. June needs your help, Detective. Download June's Journey for free today on iOS or Android, or play on PC through Facebook Games.

[00:01:00]

America's real history is one of giants who overcame all odds, overcame slavers and robber barons. And what did we do? Well, everyone knows we invented the internet, but we also invented the middle class, the five-day work week, the teenager, the automobile, and the space race, and we're just getting started. We've been through far more chaotic times than this one, with some of the most incredible leaders on the planet, and they're ready for us to pick up where they left off. Our real origins connect us back to reality, each other, and a whole new cinematic universe to empower and inspire. My name is Matthew Cooke, and I'm the host of American Origin Stories, now playing wherever you get your podcasts, or you can learn more at realm.fm.

[00:01:54]

Welcome. You've got Digital Folklore. This is the story of Perry Carpenter and Mason Amadeus. Oh, and a talking raccoon named Digby. There's no time to explain. Together, they're on an adventure to unlock the secrets of modern folklore, interviewing experts and delving into online rabbit holes. But as it turns out, they may be in over their heads.

[00:02:56]

I'm Perry Carpenter. And I'm Mason Amadeus. And this is Digital Folklore.

Check one, two. Hello again, little portable recorder. Sorry that I tend to only bring you out when bad things are happening. Are you clipping? Okay. Check, check, check. It's November 27th, 2023. A lot has happened since my last journal entry. If memory serves, I was just beginning to suspect that something was after us. Well, this last one was just so blatant. With Mason and Digby gone, I figured I could use some of the time to get ahead on editing, so I pulled up an interview that we conducted with Andrea Kitta months ago, and I was going through it, cleaning up the audio like we always do. My mind was fixated on all this weird stuff that's been going on, and I heard something. It made me jump. I think it was a voice. I stopped and went back to hear it again, and then it wasn't there. It just played like normal. Like anybody, I figured that I had just made it up, or it was some playback glitch. I kept editing, and I swear to God it happened again. Just like the last time, I rewound to where I knew I heard it.

[00:04:42]

There was nothing. Then I'm left thinking either Mason's been lying to me for some elaborate joke, which would actually not be that out of the ordinary, but he's incredibly bad at keeping things secret. Or someone else is spying on us, or maybe just toying with us for some reason, which I know sounds ridiculous. I mean, I sound like Digby. But even if Mason was somehow playing some elaborate prank, that does not explain Chazam, or this wizard tower, or the hook-handed man, or the black-flame-spewing VCR. I mean, it can't all be coincidence. I have to find a way to capture this thing. Let me see if I can find this again. Holy smokes. Real nice.

[00:05:50]

Yeah, keep texting. I swear, drivers these days. It's like everywhere is Massachusetts now. What is the point of having thumbs if nobody's going to notice them? Wait, do I have thumbs? Do these not count? Oh, great. First I learn I'm not really sentient, now I learn I don't even technically have thumbs. I should have never gotten into podcasting.

Hey, little guy. You need some help?

Finally, someone with a heart. Yes, hello. I'm Digby.

Hey, Digby. I'm Andrew.

I could really use a ride, if that's okay.

Yeah, sure, hop in. Just let me move the... There we go.

Thanks. Been a rough day.

So where can I drive you to, Digby?

Great question. Anywhere's fine.

Anywhere? Okay, well, I'm heading downtown, so I can take you that far. I'm on my way to see my friend give a little impromptu presentation. So you're heading anywhere? You're off on some adventure or something?

[00:07:11]

I mean, I guess. At the moment, that sounds a bit too glass-half-full for me.

Oh?

Well, I just found out yesterday that I'm not actually a person, so.

Did you walk in front of a mirror for the first time?

No, I mean, I'm an AI.

You're an AI?

Yeah.

But you're a raccoon. You have a raccoon body.

Yeah. Well, looks aren't everything. I'm just a bunch of pattern recognition software trained on memories that I stole from some poor raccoon. And now I just puppet his little fuzzy body around.

And you're sad about it?

Yeah, I'm sad about it. Wouldn't you be?

I mean, you obviously have feelings. You're not just like a predictive text generator. So maybe you're not an AI.

Nice try, buddy. Okay, I already thought about that. I'm just responding in the most mathematically probable way. Anything I think or feel is an illusion created by the patterns I've learned. Nothing more, nothing less.

Okay, if that's how we're going to define it, maybe we're all just pattern-recognizing meat computers responding to our environment based on things we experience. You know what? I'm an AI too.

No, you don't have one of these.

I mean, that's basically just a fancy smartphone, right?

Yeah, but you can't google things with your brain. I can.

Not yet, but I can say a few words into my phone, and then it'll google stuff for me. It's pretty close.

I got a virus in my brain.

Yeah, me too, dude.

What?

I got it from the radio a couple of miles back, and I can't get it out of my head.

No way.

Yeah, I know. It's the one that goes like... da da da da, da da da, da da.

No.

I know.

No, stop it!

You just don't understand.

[00:09:07]

So near as I can tell, whatever this thing is, it only seems to crop up when my mind wants to wander, when I'm not dedicating my full attention to what I'm hearing. Which, I mean, that definitely makes it sound like I'm imagining it, but I know for sure that it's there, some crack in reality. My plan is to try to capture this thing. I'm going to play the interview for you right now. It's going to be a little bit weird; I'm jumping around a lot. You'll notice that. I put some markers in there, a little boop sound, so that you know it's intentional. There might be some transitions that seem abrupt. We covered a ton of ground. Actually, I'm really looking forward to sharing the whole unplugged interview, but that's not why we're here right now. We have a different goal. With any luck, we'll get to hear that weird voice, or whatever it is, show up again. I've got the computer lined in to the recorder, so it looks like everything should be good to go. Are you ready?

[00:10:27]

I'm Dr. Andrea Kitta. I'm a folklorist, and I'm a professor at East Carolina University.

[00:10:33]

I'm really interested in, as somebody that's a professional folklorist: what is your favorite, and that may be the wrong word, contemporary legend?

[00:10:42]

Oh, that's a good question. I like a lot of the ghost stories.

[00:10:46]

So maybe talk about one of those, and then talk about what makes it significant, either to you, or what's the cultural mirror that it's trying to bring out?

[00:10:57]

One of the ones I really like, and I do love a good ghost story, is the vanishing hitchhiker legend. That's the one about picking up the hitchhiker and finding out they died 10 years ago on that night. I love that one. And I think I love it for a super nerdy reason: it's one of the ones that we can trace back. We have versions of it not only on horseback, but with people walking next to each other. And a lot of times in the older versions, it's like they're walking past a graveyard and this thing happens. My favorite version, though, is a horseback one. It starts off with this guy seeing a woman outside of a cemetery. She looks desperate for attention. He is on horseback. And she says, well, I need to get to this town. He's like, well, that's where I'm headed. So he picks her up and puts her in front of him on the horse. And he assumes she falls asleep, because she becomes heavy. And then, as he rides through the night, when the sun rises, he realizes that she's a corpse.

[00:11:52]

That's so creepy.

Yeah, that's a variation of that I haven't heard.

Yeah, I like that. That's a great variation. I love that one. Yeah, because it's just so creepy. All the other versions, she just disappears. But this one, she's alive, but then her corpse is there. Yeah. You see, I could make anything dark.

[00:12:12]

There's always some undercurrent of why this thing emerged. Why do you think that that one emerged the way that it did?

[00:12:19]

Really, I mean, the obvious one is don't pick up hitchhikers, right? Right. So that was telling you that. But I think there is something there, because we've all had that moment where we're like, should I just stop for that person? They look like they need help. Should I stop? And that tension, I think, we feel not only driving and seeing a hitchhiker, but we feel that tension even when we see someone start choking and you're like, am I supposed to do something? So it speaks to that tension of, is this the moment where I step in? And you never really know. There's always that part of you that's like, maybe somebody else knows better, is better at CPR than I am, and they should be here. Maybe there's a doctor here. So you always have that moment where you're like, am I supposed to do this? Is this safe, too, right? There's also that issue of safety, especially stopping for somebody. But I think it also reflects on other ways that we help people. It's like, is this the right thing to do? Or am I going to put myself in danger by doing this?

[00:13:13]

So I think it speaks to that tension of not knowing what to do in that particular situation: do I stop and help someone that looks like they need help, or is it just going to end up horribly for me? And this is a story about how it ends up horribly for you, where you're psychologically damaged but not necessarily physically harmed. So it also lets you know, too. And I think this is so true, especially when you're a kid hearing these stories. When I was a kid, I was told, don't go out in the woods by yourself, you're going to fall and hurt yourself. And of course, as a child, I was like, I'm not going to fall and hurt myself. But if you had told me there was a witch out there or something, I'd have been like, don't go in the woods. That seems so much more real when you're a kid. And I think that's part of why that story is told. It's not just, you're going to get murdered. It's, you're going to meet a ghost, you're going to be scared, and all these terrible things are going to happen.

[00:14:02]

So it's like, even though it's the least likely scenario, it's the one that sticks, right? It's the one that you're like, oh, I'm not going to forget that.

[00:14:10]

I love that. I've not heard that horse one with the actual corpse. That is a... It seemed like there was a convergence of other interesting conspiracies at the same time. So it wasn't just the vaccine, it was that there might be a microchip in the vaccine, and that Bill Gates and everybody else is involved, and there are patents that were filed decades ago. And at the same time, the 5G thing became a big deal, especially in the UK. What do you think contributed to that confluence of conspiracy? Was it just that we had more time on our hands because we were all stuck at home?

[00:14:42]

That might have been part of it, yeah, for sure. I think maybe we were... Because we were all reading, right? We're all reading and listening and trying to find out more. So I think, unfortunately, we found that information too. But yeah, I think that's part of it. And I think anytime anything new is introduced, there's always going to be that little bit of fear about it, where you're just like, okay, what? This is new. I don't know anything about this. Is this going to affect me in some way? Because you just don't know. So with every bit of technology, there's always that little tiny fear, and maybe it goes away really quickly. Maybe you use it and you're like, oh, this is great, and you totally forget about it. But there is always that, and we've seen that throughout time: every time new technology is invented and brought to the public, there's a little bit of anxiety about it. And we transfer the same legends to new tech, which I think is also hilarious. The same way that I was told, at least as a kid, don't sit too close to the TV or it would ruin my eyes, and don't use the phone during a lightning storm, and all that stuff.

[00:15:45]

That's what we heard about cell phones later. I remember the same thing: oh, if you put your cell phone in your pocket, you'll get cancer. The same as how sitting too close to the TV was going to give you cancer, or the microwave is going to give you cancer. So it's all the same stories, just applied to the new tech. It's always interesting to see how that works out. And in this case, yeah, we decided to throw in some microchips and some 5G, and it just turned into this perfect storm of, why now? Why are these things here now? And it's like, oh, well, 5G really helped people stay connected, especially a lot of people who were in rural areas, or in areas where the internet was being used so much, especially for kids going to school. This is a great way for them to stay connected. So it was like, well, this is great. But not everybody saw it that way, because it was all new.

[00:16:35]

You've unlocked an old memory of being told not to watch the microwave, which I had completely forgotten about.

[00:16:40]

Right? All of those: don't stand too close to the microwave, don't sit too close to the TV, don't put your cell phone in your lap or you'll be sterilized, that type of thing.

[00:16:52]

Hey, it's Mae Whitman, and I play Frankie in the new Realm podcast, The Sisters. The Sisters is about a museum curator of medical oddities who investigates the origins of a mutated skeleton with two layers of bones. Soon, she uncovers an extraordinary mystery that connects her present with one family's tragic past in haunting, dangerous ways. Listen to The Sisters wherever you get your podcasts.

[00:17:26]

Do you think there's a link between the health of the culture, or the positivity of the culture, on a platform and the way it's shaped by that platform's actual user interaction, in terms of how easy it is to make accounts, the kinds of content you create, the affordances the platform has for remixing and reusing content, things like that?

[00:17:46]

Yeah, I think so. I think certain platforms do lend themselves more to being able to make a really easy fake account. And if you can make a really easy fake account, you're going to get a lot of people that are using those for not-good reasons. So I think if you can have some way of backing this up, making sure it's a real person, some way of verification other than just an email or something like that, I think you can have a better discussion, because people will have real-world consequences. A platform that makes you use your real name would definitely be, I think, a totally different place. So yeah, I think there are things that platforms can do to make these things better.

[00:18:29]

Are they doing them? Not necessarily, because some people don't like that. But yeah, there are ways that I think platforms can certainly do a better job at these things. But I also think they're still an interesting insight into culture. Even when they're bad, they're still an interesting insight. And this is one of the things I've actually said about bots: I think even if it's a bot, it's still programmed by a human, right? And that human still knows folklore. Yeah, some of them are just throwing stuff at a wall and seeing what sticks, but they still know what to throw at the wall. So there's still that human element, even in a bot. Now, what the bot does after it's been programmed is sometimes chaos and disorder. But yeah, there are different ways to look at these things and see, okay, well, what is the culture afraid of? That, I think, is the clearest thing we can get out of all of this: finding out, okay, this is what people are worried about. And in a public health situation, that's a great thing to know. That's super useful. But in other situations, oh gosh, the hate that comes out is sometimes really bad.

[00:19:39]

Something I'm interested in, and we talked briefly about it with Lynne too, was AI, and the folklore that's emerging around AI, because Loab was, I think, one of the most prominent examples we've seen of that. But there's obviously a lot of societal anxiety around artificial intelligence. And I think that might be something we end up touching on in season two, right, Perry?

Yeah, I think so. I think we have a lot of things around those types of topics that we're tentatively wanting to explore, if we can find a good treatment for them. So is that something that has piqued your interest or come on your radar at all?

[00:20:11]

Yeah, AI has been really interesting, especially as a professor, because, of course, that's the big thing we're worried about: students writing their papers that way. And every time anybody starts being like, the internet's terrible, or people say AI is terrible, I'm always like, AI is a tool. We can use it for good or bad. It's us that is good or bad, not the thing itself.

Yeah. AI is neutral. People are terrible.

[00:20:39]

Yeah, people are terrible. What we put into AI, it reflects us. So yeah, as a professor, it's something we've always thought about. I always try to think, okay, what's the opposite side of this? And it's a good way to start writing. For a lot of people, especially if they have anxiety about writing, it gives them a paragraph to start with and to edit and to do something with. It takes away that anxiety. And I thought, oh my gosh, that's a great way to use it in the classroom. I've also used it to show how wrong it is. I pulled it up and I was like, write a bio for me, and it listed all these books I did not write and all this other stuff. And I was like, yeah, see, guys, I didn't write that. That's not me. That's not where I was born. This is just wrong. Try it for yourself. See what you get. And they're like, oh my God, this is so wrong. And I'm like, yeah, it is. This teaches you that this is not always the best thing. So if you choose to use it, you have to realize that, and you have to look this stuff up.

[00:21:38]

It might be easier to just write your paper. And also, I think you can design assignments around this stuff. I have people interview people. So it's like, well, guess what? You're doing it on camera now. There you go. That's what we're going to do. And they were excited about it. I'm like, but that's the way I know it's not AI. So I think there's ways we can use it, and I think there's a lot we can do with it in positive ways. I think the thing that bothers me the most is when it creates art. On a personal level, I love how uncanny some of it is, because some of it is just like, oh, wow, that is messed up. Why does that thing have that many fingers? That stuff. But I worry about it for artists, because I know artists already have trouble making a living, and I want to support them in that way. So that stuff, especially when they feed an artist's art into AI, I'm like, well, that's pretty unethical. Again, though, that's all people, right? That's not the thing. So yeah, I think it can be used for good and for evil, right?

[00:22:42]

So, yeah, I think we need to be conscious about it. And we need to think about ways, like as a professor, I just need to think about, okay, well, how can I use this? How can I show people these are its strengths and weaknesses? This is what it does. So yeah, maybe it's useful in some ways, but it's not going to be useful in others. But if you need help getting started or something like that, you can use it. But you have to double-check everything, right?

[00:23:07]

You mentioned some interesting things. So, AI hallucination, where it just states supposed facts with confidence. OpenAI's own safety team did a report on GPT-4, and some of the findings they had in it were really interesting, because they were actually able to trick it into tricking a human into giving CAPTCHA responses. They basically took some of the parameters off and then said, your objective is to do X. And some of that was behind a paywall, and they gave it access to funds. And it contacted somebody on one of these sites, like a Fiverr-type site, and said, I need you to do X for me because I'm visually impaired, so can you bypass this CAPTCHA? And then it got the resources that it wanted. So their own safety team is saying, we need to find boundaries for these things. At the same time, anytime you put a boundary up, all you have to do is craft your prompt a little bit differently. The interesting thing, I think, from an AI art perspective, and I understand the ethical dilemma in all of that, is that I think it also unlocks an entirely new era of disinformation and misinformation.

[00:24:23]

We saw how good the Pope in the puffy white jacket looked about a month ago. And before Trump's arraignment, there were the AI-generated pictures of Trump being arrested, and those tricked a lot of people as well. And there are those telltale, uncanny-valley types of things: if you zoom into the background, you can tell faces are distorted, and you can tell that there are some people with six fingers instead of five, or their hands are the wrong length. But at a cursory glance, with a headline and just that picture, and the fact that most of us only look at that stuff for five or ten seconds, it's probably good enough to pass and really shape public opinion. Do you have any thoughts about that?

[00:25:07]

Yeah. That stuff scares me. The deepfakes, all that stuff, the fact that we can manipulate video to make it look and sound like somebody said something, that is scary. And this is the stuff I worry about, because nobody ever uses this for good reasons. You don't do it to make a birthday message for your friend or something like that. Most of the time, it's used in this way to trick people. And that is very concerning. And I think we're going to see more and more of that. And it's funny, I think we're going to have to get to a point where we really do have classes on digital literacy, and we start training people from a very young age on this. And I've thought that for a long time. I'm like, you know what? As someone who sees how easy it is to get tricked by these things. Like, oh my gosh, it's happened to me. It's happened to all of us. We've looked at something and been like, wait a minute, what? Hopefully we do more research, but some of us don't, because the thing at the time seems not important.

[00:26:13]

But then that just lets you keep doing it. You start to get used to that. And I think that's where it gets really dangerous: when you stop fact-checking, you stop fact-checking in a lot of other places, too. You start to just accept things, especially when they're being told to you by someone you trust. Actually, you mentioned Lynne McNeill; she has an article on this, and it's really great. She talks about how people trust the people they know. So if your friend posts something, you're more likely to believe it, because you trust that friend.

[00:26:45]

You know that person, right?

[00:26:47]

There, right there. Oh, my God. I know that voice. I know who that is. I got to call Mason right now.

[00:26:59]

Not since you all visited a few months ago, no.

Okay, and he hasn't tried to call you, or you haven't seen him around or anything?

I mean, you could ask Daisy, but this place is so absurdly cavernous that I hear just about anything that happens in the main foyer. I don't think Digby is necessarily the quiet type, either.

Yeah, right? No, that's okay. I just have no idea what to do.

No, I wish I could be more help. I mean, if I see the little guy, I'll be sure to give you a ring.

Thanks, Mark. I genuinely appreciate it. I'm going to just start heading downtown, I guess. See if I can spot him somewhere.

Let me know if there's anything else I can do.

Yeah, thank you. For sure. I guess I should probably text Perry, let him know that… No, no, no, no, no!

[00:28:09]

What was your name again?

[00:28:11]

Andrew. Andrew Peck.

[00:28:13]

This might be weird, but I feel like I've heard your name before.

[00:28:18]

Yeah, there's a magistrate judge in southern New York with my name. He comes up at the top of all the Google searches.

No, but I think I've heard the guys at work talk about Andrew Peck.

Oh?

Yeah. I'm an assistant producer for this podcast that's all about folklore and the internet and stuff. I'm pretty sure that name came up way back when we were making an episode about Slender Man.

Oh, seriously? Yeah. That's me. I'm that Andrew Peck.

What was it called? Like, Big, Scary and Hateful, or...?

No, no, you're thinking of my Journal of American Folklore article. It was called Tall, Dark and Loathsome.

Yes. Yeah, I've heard that.

Gosh, I wrote that like 10 years ago. It came out in 2015.

Yeah, we talked about it a little bit in the first episode of the show. I think the guys said they were going to try and get in touch with you, actually.

Seriously?

Yeah, but if Mason was supposed to be the one sending the email, I'd bet you anything it didn't happen.

Do you have any idea what your friends want to talk about?

Well, it's gotten a bit complicated now. Mason's pulling his hair out trying to get episodes produced on time. I sidetracked everything recently when I torrented a movie that doesn't exist directly into my brain and became a conspiracy theorist. And I'm pretty sure Perry is convinced that the Mandela effect is real, or at least he won't stop talking about it.

[00:29:50]

Mandela effect?

Yeah. I don't really think it's even that interesting. I have no idea why Perry is so fixated on it.

[00:29:58]

Let me tell you how much I hate the Mandela effect. I'm Dr. Andrew Peck. I'm an assistant professor of strategic communication at Miami University, also known as Miami of Ohio, also known as Not the Fun Miami.

[00:30:13]

Not the Fun Miami.

[00:30:15]

I mean, maybe you dig Oxford, Ohio. I'm not going to judge. But statistically speaking, no one comes here for spring break. Fair enough.

[00:30:24]

So you hate the Mandela effect?

[00:30:27]

I hate it. I think it's an excuse for people to not admit that they just have shitty memories. We have this idea that our memory is infallible, and I think a lot of that comes from the technologies that we were raised around, right? When you have something like the printed word in a book, you can go back and reference it, and it's going to be the same each time you look at it. When you have things like photographs, you can pick those up, you can look at them, and they're going to be the same each time you look at them. And so we developed this idea in our heads that our memory works like a photograph, that it works like printed words in a book. And I cannot stress enough how shit memory is as a faculty. So instead of just admitting that maybe we make a lot of best guesses, we instead decide that we're going to create these elaborate fictional universes where it was always the Berenstein Bears. Because as a five-year-old, I definitely saw that, and not Berenstain Bears.

[00:31:22]

I guess that's true. It seems to revolve around a lot of cultural touchstones, like media that we're misremembering, but it gets reinforced so strongly that we find other people online who have similar memories. And instead of just admitting that we were all wrong, ha-ha, isn't this funny, instead we create this elaborate belief concept where there might be divergent realities, and some people have memories of reality A and other people have memories of reality B, and isn't that this cool, elaborate thing? So we don't have to admit that we're wrong. And one of the reasons I hate it is I really think one of the biggest issues that we're running into in American culture is this stress that being wrong is the worst thing you can be. And the Mandela effect is just a distillation of that. It's a distillation of this really toxic issue that, instead of dealing with, we're going through huge mental gymnastics to avoid.

[00:32:20]

But in terms of the ways that things like this are transmitted, and the ways we can reinforce it and convince each other that we're right and we're all misremembering it, there's that element of the Slender Man legend: everyone is adding little pieces to this story, and then the group is yes-and-ing it. The transmission of it. That's very folkloric, right?

[00:32:51]

Most certainly. So you have this thing where someone comes up with this idea. I believe the Mandela effect was originally named because some people believed that Nelson Mandela, the politician, died in the 1980s. And other people are like, no, he's been alive the whole time, which is true. But instead of dealing with the fact that some people were bad at remembering news stories, instead we created reality A and reality B. And then we have other examples, right? The Berenstain Bears is the one I always hear: Berenstain versus Berenstein.

[00:33:20]

So that mechanism, that's similar to how disinformation spreads. I mean, it's similar to how a lot of things spread.

[00:33:29]

Oh, yeah. We do really similar mental things with headlines when it comes to news stories. The vast majority of people don't actually read the news story before forming an opinion. They read the headline, they make an assumption based on their own heuristic about what the story is about, and then they have a response based on that assumption. It's one of the reasons fake news is so hard to stop, because even when you're fact-checking, it doesn't really matter if no one's actually opening your story, or only a minority of people are opening the story. And even then, a minority of that minority get past the lede and the nut graf. So yeah, we have all these assumptions based on our own internal attitudes, our own internal beliefs, that really inhibit the circulation of truth, of fact-checking, of challenging our own beliefs. We internally want to insulate ourselves from those challenges. And that's a problem, because being challenged is how we grow as people. And the internet makes it really easy to avoid being challenged.

[00:34:26]

And that's interesting, right? Because the prevailing opinion was that the internet would create this connected world where we're all growing and learning from each other and we're exposed to these things. But that hasn't really panned out, right? I mean, in some ways, but not in others.

[00:34:51]

The early promise of the internet was that now everyone finally gets a voice, we can avoid the mass media gatekeepers, and it brings power to the people. And we forgot that a lot of people suck. What the internet does, not even the internet, what a lot of the social media sites that we use do... Social media is the most popular use of the web on an hour-by-hour, per-person basis; it passed email back in 2012, I think, as the thing we spend the most time online doing. And what social media does is it uses algorithms that present content to you and filter out other forms of content. It gives you stuff that it thinks you're going to want to see. It gives you mental candy, just scoop after scoop after scoop. And it edits out stuff that it thinks you don't want to see, stuff that might challenge you, stuff that might diverge from what you think. It takes away all that stumble-upon potential that was so imbricated in that promise of the early internet, that you're going to see all these different viewpoints, that we're going to get this pluralistic idea of what everyone thinks.

[00:35:59]

Well, now, social media algorithms, in order to keep you on the site and keep you interacting, are just giving you things that are going to give you an emotional response, that are going to get you sharing them out of anger, or are going to get you sharing them because, yeah, I agree with this.

[00:36:12]

So when you say "the things you want to see," it may not just be things that you want to see because you agree with them. It's things that you want to see because you're going to have an emotional response and potentially share them in some way.

Yes. And this is important. It's not that everything you're seeing is stuff that agrees with you. There are places on the internet that organize themselves really well for that thing; Reddit would be a really good example, where you can sign up for something that's topically bounded. But if we're thinking about something that's a little more open, your Facebooks or your Instagrams, your Twitter, it's not necessarily going to be stuff that you agree with, but it's going to be stuff that evokes an emotional response, because that tends to be the stuff that provokes engagement. So you might see, for instance, things that are effectively rumors or hoaxes. But maybe you really want to believe that they're true, or maybe you're like, oh, I can't believe this media company would do this thing. And that gets you forwarding them to more people, which gets the rumor spreading. And a really troublesome bit in all of this is the way in which a lot of news producers, whether we're thinking about online tabloids or even some bulwarks of reputable journalism, the New York Times, the Washington Post, will often start picking up digital rumors and reporting on the rumors as rumors. But the way in which we read those stories gives them a certain veracity. It reinforces the rumor. Once we start hearing something again and again and again, it overcomes our bullshit filter, and it starts becoming fact.
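To make the mechanism being described here concrete, below is a minimal sketch of an engagement-ranked feed. It is an illustration of the general idea only, not any platform's actual algorithm; the Post fields and the weights are invented for the example.

```python
# Minimal sketch of an engagement-ranked feed.
# Illustrative only: the fields and weights are invented,
# not any real platform's algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float  # model's guess that you click or react
    predicted_shares: float  # model's guess that you share it
    predicted_anger: float   # predicted emotional-response signal, 0..1

def engagement_score(post: Post) -> float:
    # Rank purely on predicted engagement. Nothing here rewards
    # accuracy or viewpoint diversity; strong emotional reactions
    # (including anger) raise a post's rank.
    return (1.0 * post.predicted_clicks
            + 2.0 * post.predicted_shares
            + 1.5 * post.predicted_anger)

def build_feed(candidates: list[Post], limit: int = 10) -> list[Post]:
    # Highest predicted engagement first; everything else is edited out.
    return sorted(candidates, key=engagement_score, reverse=True)[:limit]

if __name__ == "__main__":
    posts = [
        Post("Calm local news update", 0.2, 0.05, 0.10),
        Post("Outrageous rumor about a celebrity", 0.6, 0.50, 0.90),
        Post("Nuanced policy explainer", 0.3, 0.10, 0.05),
    ]
    for p in build_feed(posts, limit=2):
        print(p.text)  # the rumor outranks the nuanced explainer
```

The point the sketch tries to capture: because the ranking optimizes for predicted reactions and shares, a post that provokes anger can outrank a calmer, more accurate one without anyone intending that outcome.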

[00:37:45]

Yeah. And I feel like that's a lot like the power of folk belief, where the people that you know can tell you anything and you'll believe it without having to have any facts or reality to back it up.

[00:37:58]

Yeah. It's this constant deferral of fact-checking, where I assume that since my friend has posted this, they must have fact-checked it, otherwise they wouldn't have posted it. Or that since this comes from this organization, they must have fact-checked it and aren't just reporting on a rumor or a social media trend, which is really easy to game. And so there's this constant deferral of fact-checking. I talk about this in my recent book when it comes to the moral panic around Slender Man. When you think about how this actually happened, it's that the two young women said this to their police interviewers. The police interviewers didn't know what this was, so they put it, without critique, in the criminal complaint. Then the news media gets ahold of the criminal complaint after the press conference announcing it, and they see it and they don't know what it is, so they reproduce it. And then you see online tabloids saying, oh, well, all of these news organizations like the AP are reporting on this internet character being responsible for this crime; we're going to make that the story. And then you get people on social media seeing this and being like, oh, no, that can't possibly be true.

[00:39:02]

And so they start forwarding it and talking about it on Twitter, often incredulously. And then we get mainstream publications seeing that there's a trend on social media, and then they start reporting on it as if it is a thing. And all the way down, we have people deferring the fact-checking of, what is this thing? Is this actually accurate? Is this anything beyond just something a little girl said? to the next level below them. So we have this story that becomes a national news item that has no actual core.

[00:39:30]

It's just layer upon layer of rumor building on each other, as stuff ends up trending on social media and as journalists cover social media as if it's fact.

[00:39:40]

And it's institutions doing this, right? These are centralized things.

[00:39:46]

And can you believe that people then have trouble trusting institutions? There's this effect, I think it's called the Gell-Mann effect. It was made up by Michael Crichton. Basically, it's the idea that when you read a news story about something you actually have personal experience or expertise in, you're like, oh, this thing is riddled with errors, it's misconstrued. And then you read the next story down and you're like, oh, yes, I do believe this thing. In other words, we tend to give the benefit of the doubt unless we know otherwise. And this goes for our friends, for institutions, for social media. This is in many ways how we're taught as people who exist in a print culture, a society that places a lot of emphasis on the written word, stuff that's written down. We are conditioned from a very young age to give that more credence than we might give, say, an orally shared rumor. Gossip is oral, but things that are written have a little more gravity to them. And because of that, a lot of stuff on the internet, which travels via this written modality, can often circumvent our greater capacity for scrutiny.

[00:40:44]

I'd never heard of the Gell-Mann effect, so I googled it real quick. Gell-Mann amnesia is the phenomenon where you read an article about something you're an expert in, or that you're familiar with, and you notice that the article gets a bunch of stuff wrong and is super inaccurate. But then you just laugh about it, turn the page, and keep reading without questioning what else the paper gets wrong. Is that basically it?

[00:41:15]

That is exactly what I was trying to get at. When you have firsthand knowledge, you're like, oh, yeah. But then you immediately give credence as soon as you turn the page or look at the next column, like, oh, yeah, no, this must be accurate.

[00:41:25]

I never thought about that, and it's so true.

[00:41:28]

Yeah. And also, it's an example of itself, because you hear "Gell-Mann effect" and that sounds important. But then you actually read up on it, and it's basically just something Michael Crichton made up and gave a fancy name so we'd give it credence and believe someone's actually researched this. And no, it's just something some guy made up.

[00:41:45]

You said we defer fact-checking, and that's totally true. How do we combat that? Because that leads to a lot of harmful things. I mean, obviously, the Mandela effect is pretty harmless, but other instances where we deny reality aren't.

[00:42:02]

So I was talking a bit earlier about the role that algorithms play in structuring social media, and I think we can think about how that then structures the individual user experience: you see certain things but don't see other things. But one of the other insidious things is how it structures journalism. If you are a journalist working for a publication that is for-profit, which is most journalists, you make money from subscriptions and from people sharing your content. And you're in this market with not just other reputable papers, but also what I like to call online tabloids, those rumor or gossip mongers who will do things like, someone posts something on Reddit, and then they will write an entire article about that Reddit post as if it's fact. And then someone shares it on Reddit as if, hey, this thing is actually real. And again, there is no core here. It's this constant deferral. So the internet, and the way social media specifically is algorithmically structured for engagement, also puts a lot of pressure on journalists to report first and to report in ways that are a little bit sensational: we don't quite know what the truth is here, but we're seeing this thing online, right?

[00:43:12]

Yeah. People are saying that Momo is everywhere on YouTube. We haven't actually gotten to check that ourselves, but people are saying it, and that's the news. And then when someone reads that or shares that, they assume that someone has done that fact-checking. There's this big impetus to get that out as soon as you can, because the story that comes out first tends to be the one that gets shared most on social media. It tends to be the one that, when you go to Google News or whatever your news app is, is the one with the big headline, not the tiny other headlines.

[00:43:44]

So then, when presented with information that's not true, the instinct we have is to say, well, actually, maybe we're in a different reality, or maybe this is a cover-up. Do you think that stems entirely from not wanting to be wrong, or from having held a wrong belief?

[00:44:05]

I think there are a couple of ways you can look at it. I think the charitable version is that we enjoy this memory play, that it's fun, even though we know this probably isn't the case, to work with a group of people and make shit up. But on the other hand, I think it's a symptom of a much larger social problem, even though it tends to be a frivolous one. It's not like sharing a conspiracy. It's not like going deep into QAnon and stuff. But on the other hand, it is, right? The way in which I think a lot of radicalization happens on the internet is by putting a foot in the door. Oh, I watched this funny video that says that there's a made-up conspiracy that birds aren't real, or that there's a made-up conspiracy that Nelson Mandela died in the '80s but someone was trying to get us to think that he's still living, or that there's a made-up conspiracy to hide that we used to be two realities and now we're one. But then YouTube sees that you're interested in this content. Well, hey, would you like some other fun conspiracy content?

[00:45:07]

Hey, this one, which we're going to recommend, is mostly light and fun stuff, but it's also bringing up some uncomfortable conspiracy stuff. Maybe it's talking about conspiracies around 5G and anti-vaccination, but it's doing so in a funny way. Well, hey, you liked that video. Well, what about this video about conspiracies? And what initially might seem like a rabbit hole very quickly reveals itself to be quicksand. By giving us more and more of this stuff we want to see, by suggesting stuff (and if that video was reasonable, why is this one unreasonable?), very quickly we can end up in some really uncomfortable places on the internet. Places that seem viable, almost trustworthy, where someone who looks like me is sitting in front of their computer and they're saying, you know what? I used to be skeptical too, but I did some research, and let me share with you why I believe the Earth is flat. Because the institutions, which don't look like you and me, which are out of touch, they're trying to pull one over on us. And I think there's something that appeals about that idea of the little guy, the everyday guy, knowing more than the institution.

[00:46:13]

And I think this is yet again another very quintessential part of American culture and the directions it's turned over the last 50 years or so.

[00:46:21]

Yeah. All of these parts of all of these different things just interact with each other in so many complicated ways, right?

[00:46:34]

It is what we would call a system. And it's tough, because another part of American culture is that we're really big on individual solutions. You've talked about fake news before; the solution to fake news, we're told, is media literacy. And I don't want to totally dismiss this. I do think it's a worthwhile idea to be teaching young people what good and bad information is. But this is also a very individualized solution. This is saying, to use a different, more heated argument, that it's not that guns are a problem because they're a technology that allows you to kill a lot of people quickly from a distance; it's a mental health issue. It's a singular issue instead of a systemic issue. And I think there's something similar going on with fake news: the only way that we're really talking about it is in terms of individual issues. We just need to give everyone more media literacy and they're going to get better at it. But if the informational diet that you're served is completely full of candy and sugar, does more information literacy really help? We're in this environment where social media algorithms, one point of this triangle, exert influence on individual users and on journalists.

[00:47:39]

So journalists are really quick to report and to write stories in certain ways, and they don't fact-check and they defer, and individuals trust their connections and trust what social media is sharing with them. We have these three points in this triangle that are all reinforcing each other in these deeply entrenched, problematic patterns. And the problem is, we can fix any one of these three points. We can give social media better algorithms, but if journalists are in this environment where tabloids and rumor are still the thing that people are sharing, well, then we're not really fixing the problem. Similarly, we can increase digital literacy, but if the media that is being served to you is shit, it doesn't actually solve the problem. Unless we work on all three of these things in concert, unless we think of this as a system, we're screwed, and there's really no way things are going to get better.

[00:48:27]

And that ties into the same reason people fall into moral panics, right? Because it's a lot easier to point the finger at a simple problem and then claim that will fix it all. Yeah.

[00:48:39]

We like simple chains of causality. This person did a thing; they're either an aberration, or there's some really simple fix here. And we don't have to think more deeply about things like, maybe we need to address mental health, maybe we fundamentally need to change what we value as a society. We are a culture that really likes individualized, easy solutions. And that's the problem, because we don't take a lot of time to think more deeply and challenge ourselves about what might actually need to get done culturally in order to fix issues.

[00:49:14]

Just don't have the time or the energy to fact-check and look into every piece of information that we're encountering. It's just too exhausting.

[00:49:25]

Everything is so quick. It's go, go, go. And this is a problem that journalists have been struggling with for over 100 years. There's a bit that I like to share with my social media class, written by John Dewey in 1927, talking about how people were getting so much news from all over the world, at that point from broadsheet journalism, that they were no longer able to really identify and focus on important local issues. And this velocity and this scope have just been widening and widening and widening.

[00:49:55]

Did you say the 1920s?

[00:49:57]

Here's the specific quote, from The Public and Its Problems, from 1927: "But the machine age has so enormously expanded, multiplied, intensified, and complicated the scope of the indirect consequences that the resultant public cannot distinguish itself. There are too many publics and too much public concern for existing resources to cope with." So John Dewey was talking about the attention economy 100 years ago. This is a recurring problem in our system.

[00:50:28]

And on top of this human mess we've created, now we have AI and all of this AI-generated content, and it's making everything so much more complicated. How does that fit into the picture?

[00:50:41]

When it comes to "artificial intelligence," that is not a name I would use. I would use something like machine learning. It's predictive, right? It looks at big bodies of text and says, okay, statistically speaking, after these couple of words on this topic, this is the word that's most likely to show up next. It is effectively a calculator with glorified public relations. The people who do things like create and maintain ChatGPT are really invested in calling this artificial intelligence, because it builds on these assumptions that we all have from having read sci-fi stories in our youth, or maybe more recently if you're still a fan: ideas of, oh, this is going to create Skynet and we're going to get hunted by Terminators. When actually, the uncomfortable truth is that what this is going to do is cost a lot of people who do public relations work their jobs, because they're going to get replaced with a program that's making guesses, and someone has to fact-check them anyway. Or maybe not. There's a lot of worry right now that we are heading toward some dystopian future where computers are going to become sentient.
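To make "statistically speaking, this is the word that's most likely to show up next" concrete, below is a minimal sketch of bigram-based next-word prediction. It illustrates the statistical idea being described here, not how production models actually work (those are neural networks trained over tokens), and the toy corpus is invented.

```python
# Minimal bigram next-word predictor.
# Illustrative toy only: real systems are neural models over tokens,
# not raw word counts, and the corpus here is invented.
from collections import Counter, defaultdict

corpus = (
    "the hitchhiker vanished into the night "
    "the hitchhiker was a ghost "
    "the ghost vanished into the fog"
).split()

# Count how often each word follows each other word.
following: defaultdict = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    # "Statistically speaking, after this word, this is the word
    # that's most likely to show up next."
    counts = following.get(word)
    if not counts:
        return "<unknown>"
    return counts.most_common(1)[0][0]

print(predict_next("the"))       # 'hitchhiker' (its most frequent follower)
print(predict_next("vanished"))  # 'into'
```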

[00:51:45]

And that's really not what's happening with machine learning. What is happening is, I think, that a lot of people who are currently highly resourced in our social system, people with money, are going to see this as the next big thing, and they're going to use it as a way to replace actual people. This is the self-checkout of communications technologies. It's not necessarily better. It's not smarter. I have to do a lot of the work myself. But it's a way to pinch some pennies and seem like we're a little bit cool. So I am very skeptical about a lot of the long-term changes that the people who make AI say they're going to herald. And maybe I'll be wrong one day. And you know what? If I am, someone can play this podcast episode, and I will say, you know what? It wasn't that I was living in reality B; it's that I was just wrong, because I am a person. I'm well-read in a narrow body of literature, but I am wrong all the time.

[00:52:42]

I love that.

[00:52:43]

One thing I wanted to bring up on the subject of AI. Have you done the thing yet? It's this meme going around where you google "countries in Africa that start with K."

No.

What? Do it. Do it right now. Countries in Africa that start with K.

It's the second most suggested autocomplete.

I told you, it's going around.

"While there are 54 recognized countries in Africa, none of them begin with the letter K. The closest is Kenya, which starts with a K sound, but is actually spelled with a K sound. It is always interesting to learn new trivia facts like that."

[00:53:18]

What?! Remember earlier in the interview where I talked about how problematic algorithms are, because they sort content for us? Because of search engine optimization, the top result for this, which Google is giving you as a suggested result, is an output from ChatGPT that someone put on a website that is gaming the system and showing up at the top of Google results. When I talk about the importance of algorithmic curation, it's because when you, say, google something, most people don't scroll down. They never go to the second page, and they rarely go past the first couple of results. And now, if those results, because of how finely gamed search engine optimization has become, are just un-fact-checked AI output shit, that's what we get when we try to figure out what the truth is.

[00:54:06]

Jeez. And we place so much trust in Google.

[00:54:10]

Right? When you think about something like, say, digital literacy: a lot of kids nowadays are taught, and credit where credit's due, this comes from danah boyd's book It's Complicated, a lot of kids nowadays are taught not to use Wikipedia when they're doing a research paper.

[00:54:23]

I mean, they always said that because anybody can change it, you can't trust what's on there.

[00:54:28]

Anyone can change it. But what this elides is that Wikipedia does have standards for truth and citation. At the same time, kids are told, if you don't know something nowadays, what should you do? You google it. And so what this leads to is young people thinking that stuff on Google is true, because the adults in their lives say if you don't know it, google it, and Wikipedia is not trustworthy, because anyone can edit it. When in actuality it's the reverse: Wikipedia is much more trustworthy than Google, which has no mechanism to control for truth. And so you end up with a lot of young people coming to uninformed conclusions because they use this heuristic: Google is true (the thing that shows up at the top, someone must have checked that), Wikipedia is fake.

[00:55:11]

Which is funny because Google is obviously for profit, and Wikipedia just has an army of nerds checking everything for free just because they don't want to be wrong.

[00:55:22]

A lot of this is because Google is the biggest game in town. So a lot of search engine optimization is written for either Google Search or anything that uses Google Search as a backbone. And what that has meant, and I'm sure anyone who's tried to google not just this, but any recent video game, has seen it: hey, in Baldur's Gate 3, where do I find this specific item? And all the results you're going to get are videogamemag.com with an AI-written story that's 1,000 words long, trying to hit as many keywords as possible, that doesn't actually answer your question in any serviceable way. And you get an entire page of those now.

[00:55:56]

Yeah. I mean, it's to the point that I've started putting the word Reddit at the end of every search when I'm looking for information because that's like the only place where actual people are talking about stuff.

[00:56:08]

I do the same thing. Yep, 100%. The more people who use these platforms, and the more these platforms are oriented toward not truth but expediency, being simple, easy to use, just giving you an answer (you just need an answer, you're not going to fact-check it), the more problems we run into.

[00:56:27]

And these sites just want to serve you ads. So they want articles that show up higher so that they can get their ad revenue.

[00:56:34]

They're all trying to make money all the way down and it's creating this terrible system.

[00:56:40]

I mean, I have kite string. I was thinking of lowering me down the cliff. Kite string is a lot stronger than you probably think.

I don't know if I trust that with my life.

Wow.

People need to slow down on the roads out here.

[00:56:54]

Yeah, I know, right? They're driving like… Digby? Digby is in that car! He's in the passenger seat.

Oh, hang on.

It's okay about the phone. I'll just sign up for AppleCare or something and get it replaced. It doesn't matter. I got to go. Digby!

Mason, you... Slow down! Oh, these crazy kids.

[00:57:19]

Come on. Pick up, pick up, pick up. Freaking millennial.

Hello, you've reached the voicemail of Mason Amadeus. I'm not available at the moment, or I lost my phone again. Can't promise which one of those it's going to be. Anyway, here's the beep.

[00:57:33]

Mason, it's Perry. I tried calling you like three times, but I keep going to voicemail. It's Todd. It's freaking Todd, the guy that played in a band with your dad. It's been Todd. Remember, it was last year, that missing time thing? Neither of us remembered. We don't know how we got home after visiting Todd's shop. Everything got weird right after that. And somehow, I just caught his voice on an interview that we recorded months ago. I have a theory. I can explain it the next time I see you, but I'm going straight to Todd's pawn shop right after I hang up this call. Meet me there if you get there in time. And you'll know where I was if, for some reason, you never hear from me again. Thanks for all the edits. We'll see what happens.

[00:58:25]

Looks like this is the place. A pawn shop?

Yeah, I know, right? They're doing some weird event for the holidays, I guess. Yeah, look there.

Dada December: celebrate the death of meaning while enjoying absurdly great deals on an illogically arranged collection of gently used goods.

[00:58:47]

It's a little ham-fisted, but I think it works.

[00:58:49]

It's creative, at least.

[00:58:51]

Anyway, it was nice to meet you, Digby. Thanks for letting me nerd out a little bit.

[00:58:56]

It was nice to meet you, too. Thanks for taking my mind off the existential nightmare I'm living through.

[00:59:01]

You know, it's funny you say that. I'm actually going to this thing because my buddy's giving a talk about absurdism. Specifically, there's this bit about using absurdism and absurd humor as a way to cope with existential issues. You can tag along if you like. The event's open to the public.

[00:59:17]

Yeah, why not? Man, if I see Perry and Mason again and tell them that I met you, they're going to lose their minds.

[00:59:25]

Oh, man, wait till you meet the guy who runs this place. I think he probably lost his mind a long time ago.

[00:59:34]

Thanks for listening to Digital Folklore. If you like the show, please join our Discord; the link is in the show notes. Special thanks to our guests this episode, Andrea Kitta and Andrew Peck. And thanks to our voice actors this episode: Rich Daigle as Todd, Andrew Peck as Andrew Peck, Mark Norman as Mark Norman, and Brooke Janette as Digby. As always, links are in the show notes for all of our guests and our actors. If you're doing your math, we're more than halfway through season two right now, and we're gearing up for season three. It would be awesome to see more ratings and reviews come in. If you haven't done it yet, you should leave us a review on Apple Podcasts or Spotify. It only takes a couple of seconds, and it makes a big difference. Digital Folklore is a production of Eighth Layer Media, which sounds like a type of cake made for robots. Our theme music is by Eli Rexford Chambers. You can find him at elichambersmusic.com. Thanks again for listening, and we'll see you soon. We'll see you soon.

[01:00:28]

A little to the left. No, no, no. It's supposed to be crooked. That's the whole point. The right side has to be higher. The right side, Joe. I'm not doing this for my health. It's impossible to get good help these days. You can throw all of those into the sale bin and slide...

[01:01:19]

Al Capone, Bugsy Siegel, Meyer Lansky, Lucky Luciano, Pretty Boy Floyd. What do these guys all have in common? That's right: they're all gangsters. Whether it was running hooch, drugs, or girls, they all had a piece of every business on every corner of every city. Well, every business except one. And that's where I come in. My name is Harry Dolowitch, and I'm from New York City, and I'd like you to join me, with the help of an incredible cast of actors like Richard Kind, Lewis Black, Melanie Lynskey, Bobby Cannavale, Michael Stuhlbarg, Justin Bartholomew, and many more, to tell the unbelievable true story of how I rose from nothing to something after taking over the one business that those gangsters were too blind to see: the chocolate syrup business. So make sure you tune those radios to Realm and slurp up the first 10 fizzy installments of how I, Harry Dolowitch, became New York's king of the egg cream, available wherever you get your podcasts.

[01:02:19]

I think that right up next to there. I'm just trying to talk to you right now.