Transcribe your podcast
[00:00:00]

Start the weekend with Freebie Friday on Just Eat, with freebies from McDonald's, Café Nero, Sombrero, and more. Your faves, subject to availability and store serving times. Participating stores only. Minimum spend applies. Promotion runs on Fridays only. Participating brands and free items may vary weekly. See JustEat.ie for details.

[00:00:21]

Hello. This is Susie Essman and Jeff Garlin. I'm here. And we are the hosts of The History of Curb Your Enthusiasm podcast. Now, we're going to be rewatching and talking about every single episode, and we're going to break it down and give behind-the-scenes knowledge that a lot of people don't know. And we're going to be joined by special guests, including Larry David and Cheryl Hines, Richard Lewis, Bob Odenkirk, and so many more. And we're going to have clips, and it's just going to be a lot of fun. So listen to The History of Curb Your Enthusiasm on the iHeartRadio app, Apple Podcasts, or wherever you happen to get your podcasts. Welcome to Stuff You Should Know, a production of iHeartRadio.

[00:01:05]

Hey, and welcome to the podcast. I'm Josh. And there's Chuck. And Jerry's lingering, too. She's a lurker. And this is Stuff You Should Know, the education edition.

[00:01:15]

Yeah, I'm pretty excited about this after learning more about it.

[00:01:18]

Yeah, this is your pick. Where'd you come up with this?

[00:01:21]

I don't know.

[00:01:22]

Well, that's good. I was hoping that it wasn't like, well, I had a really bad experience with a teacher when I was a kid.

[00:01:29]

No, I always had good experiences, generally. But now I'm worried that it was a listener, because I've gotten a few of those lately. Like, hey, when you said you didn't know, it was me. I usually make a note, but, sure, I don't know.

[00:01:46]

Well, if you suggested the Pygmalion effect, you're probably the only one, and you can feel free to email and be like, hey, sorry, when you said you didn't know, it was me. Yeah, well, we're talking about the Pygmalion effect, and it does have to do with education, but it has to do with more than that, too. And for those of you who don't know, the Pygmalion effect is a kind of self-fulfilling prophecy. It's called an expectancy bias, I believe, and it basically says, in effect, that if you have high expectations for, say, a student or an employee or something, they're likely to perform better than other people. And it has something to do, in all sorts of different ways, it turns out, with that relationship, that high expectation. And it's pretty neat if you think about it. And Pygmalion, it's named after, I guess, a story from Ovid's Metamorphoses, right?

[00:02:42]

Yeah, I believe it was a statue. Isn't that right?

[00:02:47]

I think Pygmalion was the sculptor and the statue is Galatea.

[00:02:52]

Okay, I knew you'd know, because I'm not the art major.

[00:02:58]

Well, I'm not either.

[00:02:59]

I was the English major. So I read George Bernard Shaw's play Pygmalion in college, in a class. And then, of course, My Fair Lady was based on Pygmalion, in which I think her name was Eliza Doolittle. Sort of, hey, let's take this rough-around-the-edges young woman and make her into a fair lady.

[00:03:20]

I knew it as Trading Places.

[00:03:23]

Exactly. But sort of a classic story. The original play is great, and it all has to do, like you said, with this idea of the self-fulfilling prophecy, which had been around for a long, long time. But in the 1960s, of course, when psychology and doing studies on all kinds of things was really blossoming and just sort of exploding in all directions.

[00:03:47]

Super hip.

[00:03:49]

Well, I don't know about that, but maybe in those communities. But there was a psychologist named Robert Rosenthal who got pretty interested in this idea of how bias can affect something like performance or assumptions or thinking. Like, it moved out of the classroom eventually. But initially it was like, hey, this kid has promise, or this kid doesn't, and then they end up being like that.

[00:04:13]

Yeah, for sure. And there's a lot of implications, obviously, of, okay, well, then does that mean that there's kids who are not performing as well as they could because they're not being treated well by their teachers? Sure, there's a lot, and I think one of the things that I like about this is just how much debate and research and argument has gone into just this one segment of approaching education. It really just goes to show how seriously we take education, or have at least in the past.

[00:04:45]

Yeah, I think so. I mean, that certainly doesn't mean we figured it out, but. No, I think people have long studied and tried and argued and debated on the best way to help kids reach their potential. And that's a good thing.

[00:04:59]

Yeah. So there was a sociologist named Robert Merton, and he turns out to have been the person who coined the term self-fulfilling prophecy. I hope he copyrighted it, because I owe him some money. Just me. Anyway, that was back in 1948, and even by then, that was a good almost 50 years after experimental data started coming in that showed self-fulfilling prophecies existed. So I guess our kind of hero, or at least protagonist/antagonist, I guess it depends on how you look at him, Robert Rosenthal, in the 60s, he hit upon a pretty great study idea, along with a colleague of his named Kermit Fode, which is a great name in writing, out loud, blinked out in Morse code. It's a great name all across the board. But working together back in 1963, they took on running rats through mazes, which was already just so cliched back then that it was, like, a perfect thing to experiment on, because the people that they were actually experimenting on, the students who were running the research, were the ones who were being experimented on. But the rats and mazes was just so ubiquitous, they didn't question that at all.

[00:06:13]

It didn't even occur to them that they would be being experimented on.

[00:06:17]

Yeah. And they ended up coming up with what I think should be just these words on a t-shirt, and just don't even explain it. Because what they told the experimenters that were working with these rats, they said, all right, you got some really great rats in this group, and they were bred to be maze-bright. But those other ones, they're maze-dull.

[00:06:38]

Right.

[00:06:38]

And I just think that would be fun on a t-shirt. But actually, these rats were assigned randomly. But what they found out was that the dull rats, the maze-dull ones, hit their peak performance three days in and then started to go downhill, whereas these bred-to-be-maze-bright rats just kept on improving. And so the conclusion was, I think, these students are getting these rats that are maze-bright and are just, hey, little buddy, you can do it. I know you got it in you. Sure you're a smart rat. Like, they're handling them better. They're talking them up, they're encouraging them, and it's working.

[00:07:14]

Yeah, because, again, you said that they were assigned randomly, and there was no such thing as maze-bright or maze-dull rats. They were all just the same. So it had to have something to do with the researchers, because there was no difference between any of the rats that were assigned. I think in the worst interpretation, you could also suggest that the maze-bright rats' student experimenters could even have been fudging the numbers a little bit to meet their expectations. It's a possibility. And actually, that kind of led to one branch of study that came out of that maze-bright, maze-dull rat experiment: how much expectancy bias affects researchers in scientific studies. That was the first leap that it went to. But shortly after that, it ended up in the classroom, because the principal of Spruce School, an elementary school in San Francisco, read about this rat experiment. I think it was in American Scientist, in 1963-ish. And the principal, Lenore Jacobson, wrote to Robert Rosenthal and said, hey, if you ever want to replace rats and experimenters with students and teachers, I'm your person. And very quickly, Rosenthal took Lenore Jacobson up on that.

[00:08:38]

Yeah, by very quickly, I guess in science terms, a couple of years later. And they said, you know, we don't know it now, but this is going to end up being a very famous experiment called the Pygmalion experiment, again named for the art and the play. And what else was it?

[00:08:59]

Trading Places.

[00:09:00]

Trading Places. That's right.

[00:09:02]

I keep wanting to say 48 Hrs., but that's not it at all.

[00:09:04]

Yeah, but actually that came afterward, so you know what I mean.

[00:09:08]

Sure.

[00:09:09]

Great movie, though.

[00:09:10]

Which one?

[00:09:11]

Trading Places. I love it.

[00:09:13]

Okay. I've never seen 48 Hrs.

[00:09:15]

Oh, really? Yeah, it's a good one.

[00:09:18]

Okay, so which one's better, Trading Places or 48 Hrs.?

[00:09:22]

Well, I mean, they're both kind of great. One is just more of a straight-up comedy, which is Trading Places.

[00:09:27]

Sure.

[00:09:27]

48 Hrs. was sort of in that cop buddy action movie thing, like Lethal Weapon. Also has laughs. Yeah, for sure. But it's prime Eddie Murphy.

[00:09:36]

Okay, that sounds like more of a Trading Spaces person to me.

[00:09:40]

Trading Places.

[00:09:41]

Trading Places.

[00:09:43]

That's an HGTV show.

[00:09:45]

Yeah, it totally is.

[00:09:47]

So. Boy, that was a good sidetrack. Eddie Murphy's got a new Beverly Hills Cop coming out, by the way.

[00:09:52]

Oh, yeah, that's right. I wonder how that's going to be.

[00:09:55]

I wonder, too.

[00:09:56]

I don't feel like he's aging poorly. He doesn't seem to be getting less funny over time, although I haven't seen any of his stuff very recently.

[00:10:04]

We'll see. I haven't either.

[00:10:06]

Okay.

[00:10:06]

I'm reserving my opinion till later.

[00:10:08]

All right, that's fair.

[00:10:09]

All right, so Spruce School, San Francisco. The experiment was performed on these kids: white majority, Mexican American minority, but mostly working-class kids. This wasn't some, like... when I first heard Spruce School, I thought it was some super hoity-toity private school.

[00:10:23]

I did, too. Sounds like it.

[00:10:25]

It sure does, doesn't it? Especially in San Francisco.

[00:10:27]

But one more thing, very crucially, about this school: the kids were grouped by reading ability. So if you weren't a very good reader, you were in a group or a class with other kids who weren't very good readers, and so on and so forth.

[00:10:39]

Yeah. And we're going to talk a lot about grouping, because it's not a great thing to do, as it turns out. And it's got a lot to do with a lot of this, for sure. So students at the school were given a test, and the researchers told these teachers... and as we'll crucially find out, too, this test was not given by the researchers. It was given by the teachers, correct?

[00:10:58]

Yes.

[00:11:00]

So that's going to come into play as well. But they told the teachers, said, all right, we've got these results. You've got some bloomers, or quote-unquote growth spurters, in your class, and they're probably just like these maze-bright rats. They were like, they're going to really improve over the school year. Just you watch. We gave them this test. It was the Harvard Test of Inflected Acquisition, and it's supposed to assess their potential. Which was not true at all. What they actually took was an IQ test called TOGA, Flanagan's Test of General Ability. And there were some problems with that right off the bat, right, with this TOGA test.

[00:11:41]

So one thing, Chuck, about that Test of Inflected Acquisition: they made it up so that teachers, if they were possibly familiar with the Test of General Ability, wouldn't be like, wait, this isn't what you would use to find growth spurters or bloomers. They just made up a test name, because the results were supposed to be made up, too. Again, the teachers thought that they were administering a test and that the results were real-world, but they were being lied to. They were being manipulated in the exact same way those students were told that some of their rats were maze-bright or maze-dull. Exact same experiment, just with humans now.

[00:12:20]

Yeah, because the idea is to see, if teachers think that a kid is supposed to have a growth spurt intellectually, whether that will end up being a self-fulfilling prophecy. So these students were chosen at random. The teachers were given that information, and after months and months, the kids took this test again, the TOGA test, at the eight-month mark, the one-year mark, and the two-year mark.

[00:12:46]

Yeah. And so just as Rosenthal predicted in his hypothesis, I should say Rosenthal and Jacobson, the principal, the kids who had been identified as growth spurters or bloomers actually did bloom academically. They gained all sorts of IQ points over the course of the eight months and then a year and then two years, when they took and retook the test. And even though the effects were mostly pronounced among first and second graders, that was enough. That was enough to just kind of show, like, this is a real deal. These kids were no different than the other kids. The only difference was that these bloomers were the ones whose teachers were told, keep an eye on them, because they're going to be amazing kids.

[00:13:30]

Yeah. And what's interesting is, for, I think, the third, fifth, and sixth graders, they showed that the bloomers, the ones who were assigned that tag, actually improved at the same rate as the control group, or even at a slightly slower or lower rate. And the researchers, Rosenthal, basically said, well, that's because when you're younger, your mind is more malleable. And so that's probably it. And also, these teachers may have felt bad for the kids who didn't get the bloomer tag, so they may have paid more attention to them or something.

[00:14:15]

Right. Or the younger kids hadn't been at school long enough to establish, like, hey, I'm actually not that smart, or, hey, I'm actually really bright. So their reputation wasn't established. No big man on campus label had been applied yet.

[00:14:31]

Yeah. So if you're starting to sense, like, oh, wait a minute, he just immediately sort of explained away something that didn't agree with his finding, you will see that that kind of becomes part of the story.

[00:14:41]

Right. So they published a study in 1968 called Pygmalion in the Classroom. And again, they named it after Pygmalion because in that story from Ovid, the sculptor Pygmalion sculpts a beautiful woman, falls in love with her, and loves the statue so much that the goddess Venus says, I'm going to make you a real live person. So the attention that Pygmalion paid to Galatea, his statue, created a magical transformation from statue to human. There was some sort of magical intervention. So it actually is a really great name. And I can't think that that didn't have something to do with how much it exploded onto the scene, because it's really difficult to overstate just what a bomb this dropped, not just in academia, but in popular culture. It got picked up and talked about for years afterward.

[00:15:37]

That sounds like a great place to break, eh?

[00:15:40]

I thought you might say that.

[00:15:41]

All right, well, let's take a break and we'll be right back and talk about this explosion of understanding right after this.


[00:16:01]

Hi, I'm Susie Essman.

[00:16:02]

And I am Jeff Garlin.

[00:16:04]

Yes, you are. And we are the hosts of The History of Curb Your Enthusiasm podcast. We're going to watch every single episode. That's 122, including the pilot, and we're going to break them down.

[00:16:15]

And by the way, most of these episodes I have not seen for 20 years.

[00:16:19]

Yeah, me too. We're going to have guest stars and people that are very important to the show, like Larry David.

[00:16:24]

I did once try and stop a woman who was about to get hit by a car. I screamed out, watch out. And she said, don't you tell me what to do.

[00:16:31]

And Cheryl Hines, why can't you just.

[00:16:33]

Lighten up and have a good time?

[00:16:35]

And Richard Lewis, how am I going.

[00:16:36]

To tell him I'm going to leave now?

[00:16:37]

Can you do it on the phone? Do you have to do it in person? What is this, canceling cable? You have to go in. He's a human being. He's helped you.

[00:16:43]

And then we're going to have behind-the-scenes information. Tidbits.

[00:16:46]

Yes, Tidbit is a great word.

[00:16:48]

Anyway, we're both a wealth of knowledge about this show, because we've been doing it for 23 years. So subscribe now, and you can listen to The History of Curb Your Enthusiasm on the iHeartRadio app, Apple Podcasts, or wherever you happen to get your podcasts.

[00:17:02]

Hey, this is Dana Schwartz. You may know my voice from Noble Blood, Haileywood, or Stealing Superman. I'm hosting a new podcast, and we're calling it Very Special Episodes.

[00:17:14]

One week, we'll be on the case with special agents from NASA as they crack down on black market moon rocks.

[00:17:19]

H. Ross Perot is on the other side, and he goes, hello, Joe. How can I help you? I said, Mr. Perot, what we need is $5 million to get back a moon rock.

[00:17:28]

Another week, we'll unravel a 90s Hollywood mystery.

[00:17:32]

It sounds like it should be the next season of True Detective or something. These Canadian cops trying to solve this 25-year-old mystery of who spiked the chowder on the Titanic set.

[00:17:41]

A very special episode is stranger than fiction. It's normal people plopped down in extraordinary circumstances. It's a story where you say, this should be a movie. Listen to Very Special Episodes on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

[00:18:00]

One of the best shows of the year, according to Apple, Amazon, and Time, is back for another round. We have more insightful conversations between myself, Paul Muldoon, and Paul McCartney about his life and career.

[00:18:19]

We had a big bear of a man called Mal Evans. And I was coming back on the plane, and he said, will you pass the salt and pepper? And I misheard him. I said, what, Sgt. Pepper?

[00:18:34]

This season, we're diving deep into some of McCartney's most beloved songs: Yesterday, Band on the Run, hagered, and McCartney's favorite song in his entire catalog, Here, There and Everywhere. Listen to season two of McCartney: A Life in Lyrics on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

[00:19:04]

Fun with Josh and Chuck. All right, so where we left off: the study was called Pygmalion in the Classroom, published in '68 as the paper and then also, notably, as a full book. If it was just the paper, it may have just sort of been passed around through academia, but because it was a book, it became very popular. And all of a sudden, Barbara Walters is interviewing Rosenthal, and the New York Times has got it on the front page, and the mainstream media is all over this. And kind of like the media does with something like this, they're not digging into the data like academia will, as we'll see.

[00:19:52]

Right.

[00:19:53]

But they'll run big headlines and they'll say, this is really significant because we all knew that the way we teach our kids is wrong. And this kind of proves it.

[00:20:02]

Yeah.

[00:20:02]

So I think that was another reason why it had such a huge effect on the larger culture, because people had been suspecting for a while that putting kids into groups by reading ability was a bad idea, that it was doing a disservice to them. Now there was this paper that showed demonstrably that that was absolutely true, that it was a terrible idea. And, yeah, like you said, there were headlines all over the place. People were discussing it. And Rosenthal, he was basically the ringleader, the ringmaster, to all this stuff. He was very much on board with not pointing out, oh, actually, you guys are missing a lot of nuance, it's not quite that cut-and-dried. He was like, yes, absolutely, exactly what you're saying, this black-and-white thing where, yes, this is absolute proof. I'm totally going to go along with that. And he got criticized just for that alone, just not intervening in how his science and findings were being communicated to the larger public, and, in fact, kind of playing a role in making that happen, just kind of capitalizing on the general population's incomprehension of statistical analysis. We don't know what that is or how to do it, so we rely on scientists, or the press, to explain it to us in terms we can understand.

[00:21:25]

And if the scientist, as we've covered many times, Chuck, isn't forthright or honest, that stuff can get turned into all sorts of misunderstandings or overblown findings.

[00:21:38]

Yeah. And one of the big things, too, that we should point out was that fact that we told you before the break, that these tests really only kind of showed this effect for these younger kids in first and second grade, but not for the kids in the older grades. And in fact, it sometimes showed a negative correlation in some of the older grades. They didn't even put that in the book at all. So they're already sort of cherry-picking stuff. And the book was the thing that really blew up, more so than the paper. And so, of course, the press isn't covering that aspect of it, probably because they didn't even know about it.

[00:22:13]

Yeah. So there's two tracks. The popular press, like the New York Times or the Today show or whatever, they're covering it in glowing terms, like it's absolute proof of what everybody always suspected. The other track was a pretty wide river of criticism coming out of the halls of academia, from other psychologists of different stripes who were just teeing off on this paper. And even though it didn't necessarily capture the attention of the larger public, in academia there was a thorough debate that started right after the paper came out and went on for a good decade. And it actually turned out to be really healthy, not just for Rosenthal and Jacobson's paper, but for statistical analysis as a whole, I think, because Rosenthal ended up inadvertently creating the meta-analysis study. But before we get to that, Chuck, let's talk about some of the stuff that was wrong with the paper, statistically speaking.

[00:23:14]

So, yeah, I mean, the TOGA test, we should talk about right out of the gate, because this test was not supposed to be used on first graders or with kids with an IQ below 60. And that alone probably accounts for, or at least accounts for some of, the fact that these low results were coming in on these kids in the younger grades. And then they would obviously gain much more ground, because by then they'd aged into the test; they're taking it when they're really supposed to be.

[00:23:46]

Exactly. And Rosenthal responded to that and even said, hey, even if that test doesn't apply to these younger kids, the fact that the same kid is taking the same test over really renders that moot. It's still going to show accurate results.

[00:24:06]

I see what he's saying there, but it's just moot to me because it wasn't even supposed to be given to a kid that young.

[00:24:13]

Right. But also, he's totally full of beans right there. Those first initial findings, that first test, produced such totally skewed results that as those kids aged into the test and started getting normal results, and you compared those later results to the first results, you would see all sorts of crazy gains that were completely incorrect. Like, they just weren't true. That was a big part of it. That TOGA test was not set up for kids with IQs under 60, which is a big problem, because the first graders in the study, on average, had IQs of 58 on it. And you can see it reflected in some of those results. Like, some kid had an IQ of 18. That's almost impossible. And certainly they wouldn't be, like, reading at that point. Same with a kid, I think, with 30. And then one of those kids later went from 30 to 100, which is coming close to maybe even gifted level. The results were just terrible. And even worse than that, they didn't include any of the raw data in the book either, which an academic should have.
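To make the critics' point concrete, here is a toy sketch of the baseline artifact Josh is describing, with entirely made-up numbers: if the pretest reads far too low and only the retest is accurate, huge apparent gains show up even when nothing actually changed.

```python
# Toy illustration (made-up numbers): a pretest that misreads young kids'
# scores far below their true level, and a later retest that reads them
# correctly. The "gain" is an artifact of the broken baseline.
true_iq = [90, 95, 100]          # unchanged between pretest and retest
broken_pretest = [30, 45, 55]    # unreliable readings below the test's floor
accurate_retest = true_iq        # the kids have aged into the test's norms

for pre, post in zip(broken_pretest, accurate_retest):
    print(f"apparent gain: {post - pre:+d} IQ points (real gain: 0)")
```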

[00:25:21]

Yeah. So that means you can't go out as another researcher and sort of try and replicate it. It just kind of occurred to me what he was sort of saying with that initial defense: it's like, you have a broken scale that doesn't say what your true weight is, but it's still accurate, because you can see how much weight you gain or lose by using that same scale.

[00:25:44]

Right.

[00:25:44]

And you're like, yeah, but you still don't know how much somebody weighs.

[00:25:47]

Yes, that's true. But then also, the thing that makes him dishonest in that response is this: imagine it's broken the first time you weigh yourself, and then you fix it, and you weigh yourself after that. So those are the right results, but you're comparing them to that first broken result. It's completely useless.

[00:26:07]

Why does anyone even have scales anyway?

[00:26:09]

I don't know. Doesn't make any sense.

[00:26:11]

And why, for God's sakes, do they keep them at hotels, like at the beach?

[00:26:17]

I don't know, man. There was one in one of my rooms when we were on tour, and this one was in San Francisco. I'm like, why? I just glared at it a couple of times and it eased itself back under the...

[00:26:28]

Oh, I mean, hey, I weigh myself to keep track of things, but for God's sakes, don't weigh yourself on vacation.

[00:26:34]

No, I was going to say I do at home, but not on vacation. Not even on tour.

[00:26:39]

So anyway, scale diversion aside, like you said, he didn't include raw data. That means you can't come along afterward and try and replicate it. So that's a big problem.

[00:26:47]

Yeah.

[00:26:48]

Other people chimed in and said things like... I think Richard E. Snow was a psychologist who said, also, apparently teachers couldn't remember, and a lot of them reported that they didn't really even glance at this list of who was a bloomer or not, which is very strange. It sounds like some of these teachers didn't even fully realize or care much that they had an experiment going on.

[00:27:15]

Yeah, I thought that was kind of weird, too, because I didn't get the impression that these were anything but normal, dedicated teachers. But I don't know, maybe they suspected that this was made up or that maybe they were like, there's no test that can really pick that up, so I'm not even going to pay attention to that kind of thing.

[00:27:31]

Or they were busy teaching.

[00:27:33]

That could be it, too, for sure. Another one was that the teachers themselves administered the tests, the initial tests. So they weren't administered by professional child psychologists. They were administered by teachers who already had an impression of the kids they were administering the tests to because it was the previous year's teachers. So if you were, say, in second grade, your first grade teacher was the one who administered your test. I didn't get that. But that was another criticism from academia.

[00:28:03]

Yeah, absolutely. So people are debating this. They're starting to sort of argue positions for and against over time. There was a 2018 overview of a lot of these debates from Thomas L. Good, Natasha Sterzinger, and Alyson Lavigne. And hats off to Livia for getting all these names. There's a lot of people that did a lot of follow-up stuff, so nice job.

[00:28:29]

Yeah.

[00:28:29]

But they noted that the individual students' results varied a lot on the different post-tests, saying, basically, we just don't have a lot of evidence that these IQs really improved at all.

[00:28:46]

Yes, but here's the thing. This is what's astounding about it. I think it was Richard Snow, in his book review of it, who wrote that it's possible that the Pygmalion in the Classroom study actually did turn up evidence of this idea that we've all considered possible for a long time: that teachers' expectations affect student performance.

[00:29:12]

Right.

[00:29:12]

But if it did, it did it by accident, because he was just saying, like, the study was so poorly executed. And it seems like Richard Snow was correct in that guess, that somehow, some way, this study did show this is a real thing. And that came out of this ten-year-long debate over the results and the methodology and all that stuff. And hats off to Rosenthal. He didn't just throw the study out and run off with a big bag of money with a dollar sign on it. He stood there and he answered his critics. He engaged in the debate for a good decade, and over the course of that decade, more studies with better methodology and better execution were created and studied this same Pygmalion effect. And they found, no, he was right. Whether it was a bad study that somehow produced some sort of correct results or not, we do realize now the Pygmalion effect is real to some degree.

[00:30:14]

Yeah. And depending on who you were, you could come at it from a different angle, and each of you would have a point. Because, as Livia points out, just sort of politically, as far as being hard on teachers or not, it would play out in different ways. A writer for the San Francisco Chronicle said, like, see, here you go: these low expectations on these children of lesser income, that's what's causing them to fall behind and maybe even drop out of school later on. Whereas Albert Shanker, who was with the United Federation of Teachers, said, no, it's not the teachers' fault. It's poverty itself. And we have too many kids in these classes, and we don't have the right materials.

[00:30:58]

Yeah. And regardless of where you fall on it, there was still a big push to do away with, like, advanced placement classes or gifted tracks or even remedial stuff. They were like, just because you think a kid's remedial, do not put them in a remedial class. Put them in, like, a mixed-aptitude class, and they'll do way better than if you put them in a remedial class. That was a big deal. I guess it wasn't successful, because there were still plenty of AP classes and snotty little AP students in the 90s when I was in high school.

[00:31:32]

I just love to shove it in your face.

[00:31:35]

Oh, I'm in AP history.

[00:31:36]

Did you not take any AP classes? No, I took a couple. I took AP history and English. But looking back, I don't know. I can definitely see they were great classes, and I felt like the teachers were better. But it also may have been my own bias, because it was AP. And also, a student that they didn't think should have tested into there, if they had been thrown in there, maybe they would have risen to that level. So with adult eyes, I now look back at kind of how messed up all that stuff was. Well, if it

[00:32:12]

Was better teaching from better teachers with better material, then the argument for people who are against that would say, then all classes should be like that. Every history class should be taught like that. Don't just make it for the ones who you think are gifted or whatever. I think that still probably is a big deal. I'm not particularly up on the state of education today or early childhood education, so I don't know if they're still putting kids in classes, different separate classes or not. But if not, I'm sure there's still people arguing against it.

[00:32:48]

Yeah, I mean, I'm not sure yet, as far as upper grades. All my experience right now is with Ruby in the third grade at her little hippie-dippy private school, where, of course, everybody is treated equally and given the same opportunities.

[00:33:02]

Everyone wins or loses.

[00:33:06]

So one kind of cool thing came out of this, because it was so famous, and because there were so many people sort of criticizing it, so many people defending it, and so many people doing other studies. Because, as we'll see, this soon leapt into the private sector, with business, into the military. People started sort of applying this kind of thing to all kinds of stuff outside the classroom. It led to Rosenthal saying, well, hey, now I can look at all these studies together. And was that the literal birth of meta-analysis? Was he one of the first?

[00:33:42]

That's how I took it, yeah. In '78, he got together with a colleague, Donald Rubin, who was the head of Harvard's statistics, or statistical analysis, department. So this guy's, like, as good as it gets with statistics. And they got together 345 studies that looked at expectancy effects and found that there was a pronounced effect that was detected if you just looked at the high-quality studies on it.
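For anyone curious what "looking at 345 studies together" means mechanically, here is a minimal sketch of the core move in a fixed-effect meta-analysis: pool per-study effect sizes, weighting each study by the inverse of its variance so more precise studies count for more. The numbers are invented for illustration, and this is not a reconstruction of Rosenthal and Rubin's actual procedure.

```python
# Minimal fixed-effect meta-analysis sketch: inverse-variance weighting.
import math

def pooled_effect(effects, variances):
    """effects[i]: effect size from study i; variances[i]: its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% CI

# Hypothetical expectancy studies reporting standardized mean differences.
effect, ci = pooled_effect([0.30, 0.10, 0.25], [0.02, 0.05, 0.04])
print(f"pooled d = {effect:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```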

[00:34:14]

Right. Okay, so they're looking at this stuff. Like you pointed out earlier, people couldn't replicate it because there wasn't raw data. And there were psychologists and neuroscience researchers that were pointing this stuff out, like, hey, we can't even replicate this thing. There were also people pointing out that the people criticizing it and the people defending it sometimes weren't even looking at the same data.

[00:34:39]

Right. Yeah. So Rosenthal was like, hey, if we're looking at actual student progress based on teacher expectations and you're just looking at gains in IQ testing, that's not looking at the whole picture. Like, if you also take into account scores from year-end achievement tests, or teacher assessments on improvement, or how many books a kid can walk around with on their head without spilling them over because they have really good posture, if you take all this stuff into account, you get a much clearer picture of whether the student actually did improve or not, thanks to teacher expectation.

[00:35:19]

Yeah, exactly. I mentioned they did it outside the classroom. I think they found the biggest gains in military settings.

[00:35:27]

Yeah.

[00:35:27]

Which makes sense, the idea being that you probably just have more sway as a drill sergeant than you do as a teacher, maybe.

[00:35:34]

Yeah, for sure. You have that much more influence. And the more influence and control you have over somebody, the more effect your expectations can have on them.

[00:35:44]

Yeah. But they saw this play out, so it's not like this is an episode on how this isn't a thing, because it is a thing, whether they found it by accident or not. Like, there was one example that Livia found where there were employees putting together medical kits, and they brought in this group of new hires and told the managers, like, hey, these people, they're maze happy. What was it? Maze.

[00:36:09]

Maze-bright.

[00:36:11]

Yeah, they're maze-bright. Like, you ought to see them put together these kits. Like, you're going to do great. They've got a lot of potential here. And that group ended up breaking records for production levels.

[00:36:21]

Right. And so if you're into management science, you're teaching everybody this. Just go out and lie to your managers, and your employees will start actually producing way better than you would think for no reason other than their manager has higher expectations, thinks they're better at their job than other people. And that is a huge part of all of this. There doesn't seem to be the same effect if you are forthright and honest with the teacher, because the whole thing seems to be rooted in the idea that the teacher or the manager has to genuinely believe that this kid or this student or this employee is above average and expect above average results from them.

[00:37:11]

Yeah, I say we take another break and we dive a little bit more into that after this, eh?

[00:37:15]

Sounds good, man.


[00:40:34]

Josh and Chuck.

[00:40:43]

So, Chuck, one thing that I think probably everybody listening to this episode has come across so far, as a question, is: okay, if teachers' expectations actually influence student performance, how? What are teachers doing that can have that effect? And that's been a big thread of this study as well.

[00:41:04]

Yeah. Because to implement this is kind of the important thing.

[00:41:09]

Sure.

[00:41:09]

It's not just to sit back and say, well, we know all this stuff now, because hopefully the goal is to help kids learn better. So they did put together some broad categories over the years of how, if you're a teacher, you might be transmitting positive expectations, or you might not be and not even know it, by saying certain things. And they put together a four-point thing which, after I read it, I was like, oh my God, why weren't they already doing all this? I know it's kind of sad, but here it is. Climate, that is, giving a warm emotional environment. Input, giving more and tougher assignments to these students. Output, allowing the students more opportunity to engage with that material. And then the fourth one, feedback: give more detailed feedback.

[00:41:53]

Right. So, teaching, essentially. Ideally, teaching well. Yeah, exactly. So what they found was that, given that idea that some of their students were growth spurters or were going to really make some crazy good moves this school year, teachers did different stuff with that information. Like, they didn't all just follow what Rosenthal would have expected, which is creating these high expectations, a warm learning environment, for those growth spurters or bloomers. Instead, some of them were like, okay, well, then that kid's good. Let me go focus my attention on the lower students. Yeah, the maze-dull students.

[00:42:37]

Yeah.

[00:42:38]

And so what was interesting about that, too, because that actually is kind of sensible. It's a sensible strategy if you have a finite amount of time and attention to give to all your students. There was a psychologist, Rhona Weinstein, who found that when that was done, in some cases the low-performing students who got more attention actually still did worse than the higher-performing students.

[00:43:05]

Yeah.

[00:43:06]

And she hypothesized that that was because those kids were basically being patronized. And even though they're six, they still understand that on some innate level. And so they were still getting signals that the expectations for them were low.

[00:43:21]

Yeah. Or maybe they were already separated out, which kind of goes to that whole idea that putting kids in a group just labels them. And a lot of times... I remember my school, even, where my father was principal, the troubled kids program. And this wasn't necessarily academic; the behaviorally troubled kids were all put in a special group that had a label, I can't remember, an acronym that basically indicated kind of how great they were. Which, it's a good thing. You definitely shouldn't, like, call them the bad kids or whatever. So I think they would put labels on them that would hopefully give them an aspirational expectation or something.

[00:44:07]

Right.

[00:44:07]

Or they did. In the 70s, my dad's whole thing was outdoor programs. He was the first person, I think, in the state, definitely in the county, that started all these camping programs. And he really believed that getting kids out in nature, if they had behavioral problems, could really help. You could see gains there and stuff like that.

[00:44:25]

Yeah, that makes a lot of sense.

[00:44:26]

That was pretty cool. Great principal. Yeah. And full stop.

[00:44:34]

So your point is that if you separate kids, or you even talk about certain kids in certain ways, if you even have them separated mentally, it's going to be transmitted or telegraphed to both groups of students as a whole. Sure.

[00:44:52]

And they found that even if they weren't separated, just sort of the language that teachers would use in the class would divide them: the way they talk to certain kids versus other kids.

[00:45:02]

Right. Yeah, that's what I was saying.

[00:45:04]

Okay.

[00:45:04]

Which is pretty interesting. But again, all of it comes down to this. I shouldn't say again, because I haven't made this point yet. This is all predicated on the fact that teachers are human beings with biases, with prejudices, with just thoughts that they can't avoid, unconscious ways that you treat or act towards certain kids where you favor some over others. And then there was something that stuck out to me, because there is a researcher from New Zealand named Christine Rubie-Davies. When she talks, it sounds awesome. But she has set up a project called the Teacher Expectation Project, where she's like, hey, remember how you guys said a minute ago that for this to be effective, you have to lie to the teachers, you have to mislead them so that they genuinely believe that the students are gifted? I say nuts to that. I'm going to figure out a way to teach teachers to be high-expectancy teachers for everybody, right, so that they have those effects on everybody without them being duped. But one of the things she came up with, to me, was like, yes, I think that's 70% of it right there.

[00:46:20]

Teachers don't know all of their students equally well in the classroom. And if you've ever been one of those students who your teacher didn't really know very well, and clearly knew other students better, that is an isolating feeling. And it's not as easy to learn as it is when you're one of the students that the teacher knows. And so that's one of the things that Christine Rubie-Davies teaches: know all of your kids equally well. It's very important.

[00:46:52]

Yeah. I mean, I was well known by all my teachers. I didn't consciously make a point to be, but I was the class clown, and I was always involved in trying to crack jokes and being funny. And I may have been disruptive, but the teachers also loved me, because it wasn't usually, like, a super negative disruption. I would just see a good opportunity for a joke and run with it.

[00:47:17]

Well, plus your dad would have fired them if they gave you any back talk, right?

[00:47:22]

Yeah. Right. Well into high school, too. But, long story short, I was well liked by teachers, and so they paid me more attention. Livia also points out something really important about grouping kids: if you just throw kids in, you know, a maze-dull group, some of these kids may have dyslexia, some may have ADHD, some may have insecure housing and family issues and be stressed, some may have limited English fluency. So you're throwing all these different issues in as one group.

[00:47:58]

Right.

[00:47:59]

And of course that's going to be an issue.

[00:48:01]

Yeah. That's why they say use mixed groups, mixed-ability groups. That's one of the things they teach in the Teacher Expectation Project. Another thing that was touched on is creating a caring, non-threatening environment, where it's a warm environment for all students and you use respectful language. You can't be like, gosh, you're so dumb, you dumb-dumb. You shouldn't say that to students. Right. And this is another one, too: working with students to set their own goals, which a lot of teachers would be like, you can't actually do that. But apparently Rubie-Davies' research has shown, or some research out there that Rubie-Davies cites has shown, if you allow students to set their own learning goals, they will actually shoot for something that's challenging but doable. They probably aren't going to be like, well, I'm just going to learn to draw Huckleberry Hound this year. That's my learning goal. They're going to do something a little more challenging than that, and they'll learn along the way, and they will have a sense of agency and a stake in their learning. They'll take it that much more seriously. And if you plot and chart their learning through learning goals and allow them to track it themselves, they will know when they've learned, rather than having to look to the teacher to be like, yes, you just learned something.

[00:49:25]

Way to go.

[00:49:26]

Yeah. What year is the horse? If I remember correctly, you could draw.

[00:49:30]

A heck of a horse. I used to. I lost it, as I proved on Instagram.

[00:49:36]

No, everyone should go check that out. I thought it was a great drawing of a horse.

[00:49:40]

Thanks.

[00:49:41]

Another thing that they say, as far as the high-expectations teaching from Christine Rubie-Davies goes, is praising effort rather than accuracy. Very big deal. And working equally with all students. And I'm not going to name my daughter's school, for obvious reasons, but they're doing it right, and it's just great to see that happening. So just big props to her teachers and everyone at her school. And it's not just her school. It's happening at more and more schools, more than when we were kids, but, in the same breath, it's still not as much as it should.

[00:50:18]

Yeah. Two more just related effects that have to do with the Pygmalion effect: the golem effect, which is the opposite, where if you have low expectations, it leads to lower performance, which makes sense, too. And the Galatea effect, named after Pygmalion's statue, that what we expect for ourselves impacts our performance, mostly because it mediates how the people in authority, a teacher or a manager or something, see us. So the way that they see us impacts how we see ourselves, which impacts how we perform, which impacts how the manager or teacher sees us. And it's just, like, an ouroboros.

[00:50:58]

That's right.

[00:50:58]

Pretty interesting stuff, man. Good pick, Chuck. I think it's so great you just came up with this all by yourself.

[00:51:04]

Oh, man, I hope I did.

[00:51:06]

I hope you did, too. Well, since we both hope that Chuck came up with this by himself, it's time, of course, for Listener Mail.

[00:51:14]

Hey, guys. I live in Rhode Island, where I run Charter Books, an independent bookstore opened in the spring of 2021.

[00:51:20]

Nice.

[00:51:21]

We report to the New York Times bestseller list.

[00:51:23]

Nice.

[00:51:24]

And I can confirm that you guys really nailed just about everything about it. And I thought you might like a few more tidbits.

[00:51:29]

Yes, please.

[00:51:30]

Every week, we export a CSV document from our bookstore point-of-sale software and upload it to the bestseller list portal. And as mighty as they are, it's still amusing to see that it basically just comes down to us emailing them a spreadsheet, along with all the other booksellers. Of course, if we haven't done it by 11:00 a.m. on Monday, they send a gentle reminder. If we inadvertently miss a week, because they require that you report all 52 weeks, they send a message about how much they value our input and how disappointed they are that we forgot.
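As an aside for the curious, the weekly export Steve describes is about as simple as it sounds. Here is a purely hypothetical sketch of that step; the column names and the sales_last_week() helper are invented, since the actual reporting format isn't described beyond "a spreadsheet."

```python
# Hypothetical sketch of a weekly report: dump last week's point-of-sale
# rows to a CSV for upload. Field names are invented for illustration.
import csv
from datetime import date

def sales_last_week():
    # Stand-in for a query against the store's point-of-sale database.
    return [
        {"isbn": "9780000000001", "title": "Example Title A", "qty": 7},
        {"isbn": "9780000000002", "title": "Example Title B", "qty": 3},
    ]

with open(f"bestseller-report-{date.today()}.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["isbn", "title", "qty"])
    writer.writeheader()
    writer.writerows(sales_last_week())
```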

[00:52:03]

Oh, wow.

[00:52:04]

A little passive-aggressive. Yeah. And then every week, they also send an email asking about any bulk orders, which you explained very well in the episode. You are correct in implying how powerful it can be, the list, that is. Authors, publishers, publicists, and other entities in the industry frequently ask if we report to the Times. And years ago, when I was with another bookstore, we received a weird order for 20 copies of a random YA fantasy book. Turned out to be a bungled effort by an obscure publisher to do some book laundering, as Chuck would say.

[00:52:36]

Wow.

[00:52:36]

So hours after we took the order, we received a sternly worded message from the New York Times saying they wanted documentation of all orders, basically asking for our receipts. None of this is earth-shattering to you guys, probably, but it was fun to hear you talk about my day-to-day work. That is Steve from Charter Books. So hey, if you're near Charter Books in Rhode Island, support your indie bookstore.

[00:52:58]

Yeah, no matter where you live, support your indie bookstore friends, for sure. That was Steve, right?

[00:53:04]

Yeah, and he sent a picture. They had our book on display.

[00:53:07]

Awesome. Thanks, Steve. We love it when people round out information that we've talked about. And if you want to be like Steve and do something like that, you can do it via email. Send it off to stuffpodcast@iheartradio.com.

[00:53:23]

Stuff You Should Know is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
