[00:00:00]

Hi, I'm Kristen Holmes. I've covered campaigns, Capitol Hill, the White House and everything Washington for CNN. But nothing tops the importance of this upcoming election, and my job is to help you make sense of it all. Welcome to my new podcast, Election 101. We'll figure out the electoral process together. I'll talk to experts, historians and some of you. Yes, this election year is different, and this is a different kind of podcast.

[00:00:24]

Listen to Election 101 every Wednesday on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.

[00:00:30]

Hi, I'm Bethany Van Delft, host of a new podcast, The Ten News: ten minutes of news and fun for the new generation of curious thinkers.

[00:00:39]

We're here to help you make sense of it all, from current events to science, art and pop culture. We'll talk to experts and special guests and hear from young people just like you. Listen to The Ten News on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts, with new episodes every Tuesday and Thursday.

[00:01:01]

Hey, everybody, it's me, Josh. And for this week's SYSK Selects, I've chosen our guide to research tips. It's a surprisingly good episode that shares the ins and outs of keeping from being duped online by bad information, how to read between the lines of sensational science reporting, all sorts of stuff like that. And you might notice in this episode, Chuck sounds different than usual. That's because this is during the period that he was transitioning into a person with a full set of teeth.

[00:01:31]

So that adds to the hilarity of the whole thing. I hope you enjoy this as much as we did making it. Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks.

[00:01:50]

Hey, and welcome to the podcast. I'm Josh Clark, with Charles W. Chuck Bryant and Jerry, and this is Stuff You Should Know. Um, Josh, we're going to do something weird today: we're going to do a listener mail at the head of the podcast. I know. All right. What else is new? OK, this is from Bianca.

[00:02:12]

Do we have the listener mail music going? Oh, I don't know. Actually, we could go the whole nine yards, so let's do it.

[00:02:18]

People might freak out. And, uh. All right.

[00:02:21]

This is from Bianca Voice, which is what I'm going to say. I think that's great.

[00:02:27]

Hey, guys, I wrote you not too long ago asking about how you research your own podcast. I just got back from a class where we talked about research misrepresentation in journal articles. Apparently, journals don't publish everything that is submitted. A lot of researchers don't even publish their studies if they don't like the results. Some laws have been put into place to prevent misrepresentation, such as researchers having to register their studies before they get results, and journals only accepting preregistered studies.

[00:02:53]

But apparently this is not happening at all, even though it is now technically law. This ends with the general public being misinformed about what methods and drugs work. For example, if there are 25 studies proving a drug works and 25 that don't, it's more likely that 20 of the positive results have been published and only one or two of the negative.
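To make Bianca's arithmetic concrete, here is a minimal Python sketch of how that publication filter distorts the visible evidence. The 25/25 split and the 20-versus-2 publication counts come straight from her letter; everything else is our own illustration.

    # Publication bias sketch: the evidence is an even split, but the
    # published record is not. Counts mirror Bianca's example.
    positive_studies = [True] * 25    # studies finding the drug works
    negative_studies = [False] * 25   # studies finding no effect

    # Journals publish ~20 of the positive results, only ~2 of the negative.
    published = positive_studies[:20] + negative_studies[:2]

    print(f"All evidence:     {sum(positive_studies)}/50 studies positive (50%)")
    print(f"Published record: {sum(published)}/{len(published)} studies positive "
          f"({sum(published) / len(published):.0%})")
    # Published record: 20/22 positive (91%). The literature looks
    # near-unanimous even though the underlying evidence is an even split.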

[00:03:14]

And that is from Bianca.

[00:03:16]

And that led us to this article on our own website: 10 Signs That Study Is Bogus. Yeah, and here it is.

[00:03:25]

Nice, Chuck. Well, we get asked a lot about research from people, usually in college, like: you guys are professional researchers, how do I know I'm doing a good job and getting good info? And it's getting harder and harder these days.

[00:03:38]

It really is. You know, one sign that I've learned is if you are searching about a study and all of the hits that come back are from different news organizations, and they're all within like a two or three day period from a year ago. Copy, paste, nothing more recent than that. Then somebody released a sensational study, no one put any actual effort into investigating it, and there was no follow-up. Yeah. If you dig deep enough, somebody might have done follow-up or something like that.

[00:04:09]

But for the most part, it was just something that splashed across the headlines, which, more often than not, is the case as far as science reporting goes. So that's a bonus. That's the 11th, boom. How about that? Yeah. So should we just start banging these out? Let's do it, or do you have some other clever segue? Part and parcel with that, and I don't know if it's clever: you do come across people who, you know, can be trusted and relied upon to do good science reporting.

[00:04:39]

So, like, Ed Yong is one. Another guy named Ben Goldacre has something called Bad Science. I don't remember who he's with. And then there's a guy, I think with Scientific American, named John Horgan, who's awesome. Yeah.

[00:04:52]

There are some journalism organizations that have been around and stood the test of time that, you know, are really doing it right. Like Nature. Yeah. Scientific American. They're, like, real science.

[00:05:02]

Yeah. Like, I feel really good about using those sources. Yeah.

[00:05:06]

But even they can, you know... there's something called scientism, where there's a lot of, like, faith and dogma associated with the scientific process. And, you know, you have to root through that as well. Right. I'm done. Uh, the first one that they have here on the list is that it's unrepeatable, and that's a big one. The Center for Open Science did a study, uh, it was a project, really, where they took two hundred and seventy researchers and said, you know what, take these 100 studies that have been published already, psychological studies, and just pore over them.

[00:05:40]

And in 2015, just last year (it took them a while, took them several years), they said, you know what, more than half of these can't even be repeated using the same methods. They're not reproducible. No, not reproducible. That's a big one. And that means that when they carried them out, they followed the methodology. Scientific method, by the way: we have a podcast on that, you should listen to that one. That was a good one. Yeah.

[00:06:03]

They found that their results were just not what the people published. Not anywhere near them. Yeah. For example, they use one as an example where a study found that men were terrible at determining whether a woman was giving them some sort of, like, clues to attraction or just being friendly.

[00:06:26]

Yeah. Sexy, sexy stuff. Or just being friendly, or, yeah, good to meet you. Yeah. Or buzz off, jerk. Sure. Yeah. And they did the study again as part of this Center for Open Science study, or survey, and they found that it was not reproducible, or that they came up with totally different results. And that was just one of many. Yeah.

[00:06:47]

And in this case specifically, they looked into that study, and they found that one was done in the United Kingdom and one was in the United States. Right. That may have something to do with it.

[00:06:57]

But the point is, Chuck, if you're talking about humanity. I don't think the study was, like, the American male is terrible at it. It's: men are terrible at it. Right. So that means that whether it's in the UK, which is basically the U.S. with an accent and a penchant for tea (I'm just kidding, UK),

[00:07:16]

it should be universal.

[00:07:21]

Yeah. Yeah. Agreed. Unless you're saying, no, this just applies only to American men. Right. Or if it was only a hundred American men. Right.

[00:07:31]

Then it's not even a study. Yeah. Uh, the next one we have is: it's plausible,

[00:07:39]

not necessarily provable. And this is a big one because, and I think we're talking about observational studies here more than lab experiments, with observational studies, you know, you sit in a room and get asked 300 questions about something, and all these people get asked the same questions, and then they pore over the data and draw out their own observations.

[00:07:59]

Right. And very famously, an observational study that led to false results found a correlation between having a Type A personality and being prone to risk for heart attack. Yeah, and for a long time, you know, the news outlets were like, oh, yes, of course, that makes total sense. Right. This study proves what we've all known all along. And then it came out that, no, actually what was going on was a well-known anomaly where you have a five percent chance

[00:08:33]

that chance alone will produce something that looks like a statistically significant correlation, but it's not at all, really, it's just total chance. And science is aware of this, especially with observational studies, because the more questions you ask, the more opportunity you have for that five percent chance to create a seemingly statistically significant correlation. Right. When really it's not there. It was just random chance, where if somebody else goes back and does the same study, they're not going to come up with the same results.

[00:09:06]

But if a researcher is, I would guess, willfully blind to that five percent chance, they will go ahead and publish the study and be like, no, it's true. Here's the results right here. Go ahead and report on it and make my career.
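Here is a toy Python sketch of the five percent problem the guys are describing: survey enough unrelated questions and pure chance will hand you "statistically significant" correlations at the p < 0.05 level. The 300-question survey is from the episode; the simulated data and the specific numbers are our own assumptions.

    # Multiple comparisons sketch: 300 questions of pure noise tested
    # against an unrelated outcome still yield ~15 "significant" hits.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_people, n_questions = 200, 300

    outcome = rng.normal(size=n_people)                 # e.g. a heart-attack risk score
    answers = rng.normal(size=(n_people, n_questions))  # all answers are pure noise

    false_hits = sum(
        pearsonr(answers[:, q], outcome)[1] < 0.05 for q in range(n_questions)
    )
    print(f"{false_hits} of {n_questions} unrelated questions look significant")
    # Expect roughly 0.05 * 300 = 15 spurious findings, and a rerun with a
    # new seed will flag a different set of questions, which is exactly why
    # the replication attempt comes up empty.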

[00:09:22]

Yeah, well, and they also might be looking for something. In fact, chances are they are. Um, it's not just some random study, like, let's just see what we get if we ask a bunch of weird questions. Yeah. It's like, hey, we're looking to try and prove something, most likely. So that data mining thing might come into play, where you're kind of cherry-picking data. Yeah, that's a big problem that kind of comes up.

[00:09:43]

A lot of these are really kind of interrelated.

[00:09:45]

So totally. The other big thing that's interrelated is how the media reports on science these days. Yeah. You know. Yeah. Big deal. Yeah.

[00:09:53]

Like, John Oliver just recently went off on this, and NPR did a thing on it. That's great. Like, the researcher might even say "plausible," but it doesn't get portrayed that way. Oh yeah. In the media, sure.

[00:10:05]

Remember that poor kid who thought he found an ancient Mayan city? The media just took it and ran with it, you know?

[00:10:13]

Yeah. I think there was a lot of "maybe," or "it's possible, we need to go check" kind of thing. The media was like, no: he discovered an ancient Mayan city never known before. Yeah. And let's put it in the headline. And that's, I mean, that's just kind of the way it is these days. Yeah. You have to be able to sort through it. I guess that's what we're doing here, aren't we, Chuck? We're telling everybody how to sort through it.

[00:10:33]

At the very least, take scientific reporting with a grain of salt. Yes, right. Like, you don't necessarily have the time to go through and double-check that research and then check on that research and, you know. Right. So take it with a grain of salt. Yeah. Um, unsound samples. Here was a study that basically said how you lost your virginity is going to have a very large impact and play a role in how you feel about sex and experience sex for the rest of your life.

[00:11:07]

Yeah, it's possible. Sure, it seems logical. So we'll just go with it. But when you only interview college students, and you only interview heterosexual people, then you can't really say you've done a robust study now, can you? Plus, you also take out of your sample population anybody who reports having had a violent encounter. Yeah, throw them out. Yeah, throw that data out. Because that's not going to inform how you feel about sex?

[00:11:41]

Right, exactly. You're just narrowing it down further and further, and again, cherry-picking the data by throwing out of your population sample the people who would throw off the data that you want.

[00:11:52]

Yeah, and I'd never heard of this acronym: WEIRD. A lot of these studies are conducted by professors and academics, so a lot of times you've got college students as your sample, and there's something called WEIRD: Western, educated, from industrialized, rich and democratic countries.

[00:12:10]

Right. Those are the participants in the studies, the study subjects. But then they will say "men." Right. Well, what about the gay man in Africa? Right. Like, you didn't ask him.

[00:12:23]

So that's actually a really, really big deal. In 2010, these three researchers did a survey of a ton of social science and behavioral science studies and found that 80 percent of them used WEIRD study participants. So basically it was college kids for 80 percent of these papers they surveyed. And they took it a little further and said that people who fit into the WEIRD category only make up 12 percent of the world population, but they represent 80 percent of the population of these studies.

[00:13:01]

And a college student, Chuck, in North America, Europe, Israel or Australia is 4,000 times more likely to be in a scientific study than anyone else on the planet. Yeah, and psychology and the behavioral sciences are basing their findings about everybody else on this small tranche of humanity. Yeah, and that's a big problem. That's extremely misleading. Yeah. And it's also a little insulting, because what they're essentially saying is, like, this is who matters.
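If you want to see how lopsided those numbers are, here is the over-representation arithmetic worked through in a few lines of Python. The 12 percent and 80 percent figures are from the survey the guys cite; the odds-ratio calculation itself is just our illustration.

    # WEIRD over-representation: share of study subjects vs share of world.
    weird_world_share = 0.12   # WEIRD people as a share of world population
    weird_study_share = 0.80   # WEIRD people as a share of study subjects

    # Odds ratio: how much likelier a WEIRD person is to be a study subject
    # than a non-WEIRD person.
    over_rep = (weird_study_share / weird_world_share) / \
               ((1 - weird_study_share) / (1 - weird_world_share))
    print(f"A WEIRD person is ~{over_rep:.0f}x more likely to be a study subject")
    # Prints ~29x for WEIRD people in general; the 4,000x figure in the
    # episode is for the even narrower slice of Western college students.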

[00:13:36]

Well, also, yeah, but what's sad is: this is who I am willing to go to the trouble of recruiting for my study. Yeah. It's just sheer laziness. And I'm sure a lot of them are like, well, I don't have the funding to do that. I guess I see that. But at the same time, I guarantee there is a tremendous amount of laziness involved. Yeah. Or maybe, if you don't have the money, maybe don't do that study.

[00:14:02]

Yeah. It's that simple. I'm probably oversimplifying, I don't know. I'm sure we're going to hear from some people in academia about this one. Stop using WEIRD participants, or at the very least say, like, this applies to heterosexual, yeah, Dartmouth students. This applies to them, right? Not everybody in the world. I mean, 80 percent of these studies used those people as study participants, and they're not even emblematic of the rest of the human race.

[00:14:33]

Like, college students are shown to see the world differently than other people around the world. Yeah. So it's not like you can be like, well, it still works, you can still extrapolate. It's flawed in every way, shape and form. Right. Probably. Should we take a break?

[00:14:49]

Yeah. Let's take a break, because you're getting a little hot under the collar. I love it. Yeah. We'll be right back after this.

[00:14:56]

Just like the number of stars in the sky, there's so much stuff you should know. It's no secret that in Washington, D.C., corruption is everywhere. You could say it's gone viral, and I should know: my mom's the Speaker of the House. My name is James Parker. My friends are all in the same boat: daughters of the D.C. elite. When you're this close to power, there's nowhere to hide. And when my friends and I got a little too visible, our parents broke us up.

[00:15:33]

But now I need them back, because I'm in deep. You see, I'm a bit of a hacker, and here,

[00:15:40]

no one knows me as James Parker. They only know me as Storm Boy. And Storm Ally, well, she went poking around somewhere she shouldn't have. I'm James. I'm Peyton. I'm Celia. I'm Natalie. And we're the daughters of D.C. Join me and my friends for Daughters of DC, a new twelve-part scripted podcast political thriller from the team that brought you Lethal Lit: Einhorn's Epic Productions and iHeartRadio. Listen to Daughters of DC for free on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.

[00:16:12]

Hello, friends. Quick question: are you registered to vote at your current address? Well, get this: more than 60 percent of eligible voters have never been asked to register. And we at Stuff You Should Know are working with HeadCount.org to change that.

[00:16:27]

That's right. All you have to do is go to HeadCount.org right now and register to vote, or check your voter registration status, something I did just yesterday.

[00:16:36]

Nice. Make sure you're ready for Election Day. Visit HeadCount.org today and register to vote.

[00:16:42]

That's www.headcount.org. All right, what's next, buddy? Very small sample sizes, right?

[00:17:05]

If you do a study with twenty mice, then you're not doing a good enough study. No. So in the article, they use the idea of 10,000 smokers and 10,000 nonsmokers. Yeah. And they said, OK, if you have a population sample that size, that's not bad, it's a pretty good start. And you find that 50 percent of the smokers develop lung cancer, but only five percent of nonsmokers did.

[00:17:35]

Then your study has what's called high power. Yeah. Um, if you had something like ten smokers and ten nonsmokers, and two of the smokers developed lung cancer and one nonsmoker developed lung cancer as well, you have very little power, and you should have very little confidence in your findings. But regardless, it's still going to get reported if it's a sexy idea. Yeah, for sure. Um, and because these are kind of overlapping in a lot of ways,
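For anyone who wants to put numbers on that, here is a back-of-the-envelope power calculation in Python using statsmodels. The two-of-ten versus one-of-ten rates come from the example above; treating them as true rates of 20 and 10 percent, and the power analysis itself, are our own assumptions.

    # Statistical power sketch: the same 20% vs 10% difference in lung
    # cancer rates, tested with 10,000 per group vs 10 per group.
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    effect = proportion_effectsize(0.20, 0.10)  # Cohen's h for 20% vs 10%

    for n_per_group in (10_000, 10):
        power = NormalIndPower().power(effect, nobs1=n_per_group,
                                       alpha=0.05, ratio=1.0)
        print(f"n = {n_per_group:>6} per group -> power ~ {power:.2f}")
    # ~1.00 with 10,000 per group, ~0.10 with 10 per group: the tiny study
    # would miss a real difference about nine times out of ten, so its
    # findings deserve very little confidence.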

[00:18:06]

I wanted to mention this scientist named Ulrich, uh, Dirnagl. He and his colleague Malcolm Macleod have been trying, I mean, there are a lot of scientists who are trying to clean this up, because they know it's a problem. But he co-wrote an article in Nature that's called Robust Research, colon, Institutions Must Do Their Part for Reproducibility. So this kind of ties back into reproducing things, like we said earlier. Yeah, and his whole idea is, you know what? Funding.

[00:18:37]

They should tie funding to good institutional practices, like you shouldn't get the money if you can't show that you're doing it right. Yeah.

[00:18:46]

And he said that would just weed out a lot of stuff. Here's one staggering stat for reproducibility and small sample size: biomedical researchers for drug companies reported that only 25 percent of the papers they publish are even reproducible. Like an insider stat. And it doesn't matter, the drugs are still going to market. Yeah, yeah. Which is, that's a really good example of why this does matter to the average person. You know, like, if you hear something like "monkeys like to cuddle with one another because they are reminded of their mothers, study shows,"

[00:19:28]

right, you could just be like, oh, that's great, I'm going to share that on the Internet. It doesn't really affect you in any way. Yeah. But when there are studies being conducted that are creating drugs that could kill you or not treat you, or that kind of thing, and it's attracting money and funding and that kind of stuff, that's harmful. Yeah, absolutely.

[00:19:52]

I found another survey. Did you like that terrible study idea I came up with, like, monkeys like to cuddle?

[00:20:04]

One hundred and forty trainees at the MD Anderson Cancer Center in Houston, Texas. Thank you, Houston, for being so kind to us at a recent show. Yeah.

[00:20:12]

They found that nearly a third of these trainees felt pressure to support their mentors' work, like, to get ahead or not get fired. So that's another issue: you've got these trainees or residents, and you have these mentors, and even if you disagree or don't think it's a great study, you're pressured into just going along with it.

[00:20:35]

I could see that, for sure. There seems to be a huge hierarchy in science. Yeah, for sure. In a lab, you know, you've got the person who runs the lab; it's their lab, and you don't go against them. Right. But there are journals like Science and Nature, two great journals, updating their guidelines right now. They're introducing checklists. Science added statisticians to their panel of reviewing editors, not just other peer reviewers, you know, but actual hard-numbers people specifically.

[00:21:05]

Oh, gotcha. Because that's a huge part of the process, a huge part of a study: these mind-breaking statistical analyses. Yeah, they can be used for good or ill. And I mean, I don't think the average scientist is necessarily a whiz at that, although it has to be part of training. But not necessarily. I mean, that's a different kind of beast altogether. Yeah. Stats. We talked about it earlier. I took a stats class in college.

[00:21:31]

Oh, man. I had so much trouble.

[00:21:33]

I was awful at it. It really is a special kind of mind. Yeah, I didn't get it. I passed it, though. I passed it because my professor took pity on me.

[00:21:48]

Oh, that's nice. I bet Dirnagl wouldn't agree. He'd go over there and go, huh?

[00:21:57]

He's a big-time crusader; making sure that science is good science is his jam. Yeah. One of the things he crusades against, and remember that virginity study, where they just threw out anybody who had a violent encounter for their first sexual experience? Apparently that's a big deal with animal studies as well. If you're studying the effects of a drug or something. There's one in the article: if you're studying the effects of a stroke drug, and you've got a control group of mice that aren't taking the drug and then a test group that are getting the drug, and then, like, three mice from the test group die. Even though they're on the stroke drug, they die of a massive stroke.

[00:22:42]

And you just literally and figuratively throw them out of the study and don't include them in the results. That changes the data. And he's been on peer review of a paper before where he's like, no, this doesn't pass peer review: you can't just throw out what happened to these three rodents. They started with ten; there's only seven reported in the end. What happened to those three? And how many of them just don't report the ten at all? Yeah, they're like, oh, we only started with seven, you know?
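Here is a toy sketch, with made-up numbers, of why quietly dropping those three dead mice changes the conclusion: once the strokes are excluded, a drug that does nothing suddenly looks protective.

    # Exclusion bias sketch: the drug has no effect, but dropping the
    # treated mice that had strokes makes it look like a miracle cure.
    control = ["stroke"] * 3 + ["ok"] * 7   # 10 untreated mice
    treated = ["stroke"] * 3 + ["ok"] * 7   # 10 treated mice, same outcome

    def stroke_rate(group):
        return group.count("stroke") / len(group)

    # "We only started with seven": report the treated group minus its strokes.
    reported_treated = [m for m in treated if m != "stroke"]

    print(f"Honest comparison: {stroke_rate(control):.0%} vs {stroke_rate(treated):.0%}")
    print(f"After exclusions:  {stroke_rate(control):.0%} vs {stroke_rate(reported_treated):.0%}")
    # Honest: 30% vs 30%, no effect. After exclusions: 30% vs 0%.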

[00:23:11]

Well, I was about to say I get the urge. Well, I don't get it, because it's not right, but I think what happens is you work so hard at something. Yeah. Yeah. And, you know, like, how can I just walk away from two years of this because it didn't get a result? That's the point of real science, though.

[00:23:27]

Yeah. You have to walk away from it. Well, you have to publish that. Yeah. And that's the other thing, too, and I guarantee scientists will say, hey man, try getting a negative paper published in a good journal these days. They don't want that kind of stuff. But part of it also is, I don't think it's enough to just be published in, like, a journal. You want to make the news cycle as well. That makes it even better, right?

[00:23:50]

So I think there are a lot of factors involved. But ultimately, if you take all that stuff away, if you take the culture away from it: if you get negative results, you're supposed to publish them, so that some other scientist can come along and be like, oh, somebody else already did this using these methods I was going to use. I'm not going to waste two years of my career, because somebody else already did. Thank you, buddy, for saving me the time, trouble and effort it takes to learn that this does not work.

[00:24:17]

Yeah, you've proven this doesn't work. You set out to prove it does work, and you actually proved it didn't work. That's part of science.

[00:24:24]

Yeah. I wish there wasn't a negative connotation to a negative result, because to me the value is the same: proving something doesn't work is as valuable as proving something does work. Right. Again, it's just not as sexy. Yeah, but I'm not sexy either, so maybe that's why I get it.

[00:24:43]

Here's one that I didn't know was a thing: predatory publishing. I didn't know about it. You never heard of this? So here's a scenario: you're a doctor or scientist, and you get an email from a journal that says, hey, you got anything interesting for us? I've heard about your work. And you say, well, actually, I do have this study right here. They say, cool, we'll publish it. You go, great, my career is taking off.

[00:25:07]

Then you get a bill that says, where's my three grand for publishing your article? And you're like, I don't owe you three grand.

[00:25:15]

All right, give us two. And I can't even give you two.

[00:25:19]

And if you fight them long enough, maybe they'll drop it and never work with you again.

[00:25:24]

Or maybe they'll just be like, well, we'll talk to you next quarter.

[00:25:29]

Exactly. That's called predatory publishing. And it's a I'm not sure how new it is. Maybe it's pretty new. Is it pretty new?

[00:25:36]

But it's a thing now where you can essentially pay to get something published. Yes, you can. Um, it's kind of like a who's-who-in-the-behavioral-sciences kind of thing. Yeah. You know. Um, and apparently it's new because it's a result of open-source academic journals, which a lot of people pushed for, including Aaron Swartz, very famously, who took a bunch of academic articles and published them online and was prosecuted heavily for it. Persecuted, you could even say.

[00:26:09]

Yeah, but the idea that science is behind this paywall, which is another great article from Priceonomics, by the way, really just ticks a lot of people off. So they started open-source journals. Right. And as a result, predatory publishers came about and said, OK, yeah, let's make this free, but we need to make our money anyway, so we're going to charge the academic who wrote the study for publishing it.

[00:26:33]

Well, yeah. And sometimes now it's just a flat-out scam operation. Yeah. One hundred percent. Right. There's this guy named Jeffrey Beall, uh, who is a research librarian. He is my new hero, because he's truly, like, one of these dudes who is trying to make a difference. He's not profiting from this, but he's spending a lot of time creating, uh, a list of predatory publishers. Yeah, a significant list, too.

[00:27:03]

Yeah. How many? 4,000 of them right now.

[00:27:05]

Yeah.

[00:27:06]

Um, some of these companies flat-out lie. Like, they're literally based out of Pakistan or Nigeria, and they say, no, we're in New York. Oh yeah. A New York publisher. Uh, so it's just a flat-out scam. Or they lie about their review practices: like, they might not have any review practices, right, and they straight up lie and say they do. There was one called Scientific Journals International, out of Minnesota, that he found out was just one guy. Oh yeah.

[00:27:34]

Like, literally working out of his home. Yeah, just soliciting articles, charging to get them published, not reviewing anything, and just saying: I'm a journal. Yeah, I'm a scientific journal.

[00:27:47]

He shut it down, apparently, or tried to sell it.

[00:27:49]

I think he was found out. Um, and this other one, the International Journal of Engineering Research and Applications: they created an award and then gave it to themselves, and even modeled the award on an Australian TV award, like, the physical thing.

[00:28:07]

Wow, that's fascinating. I didn't know we could do that. We're going to give ourselves, yeah, the Best Podcast in the Universe award.

[00:28:16]

I like that. And it's going to look like the Oscar. Yeah.

[00:28:19]

OK, the Oscar crossed with the Emmy. This other one, MedNo Publications, actually confused the meaning of STM, science, technology, medicine. They thought it meant sports technology and medicine.

[00:28:32]

No. Well, a lot of science journalists, or scientists too, but watchdogs like to send, like, gibberish articles into those things to see if they'll publish them. And sometimes they do. Frequently, they do.

[00:28:46]

They sniffed them off the case big time. How about that callback? It's been a while. It has been. It needs to be a t-shirt. Should we take a break? Yeah. All right. We'll be back and finish up right after this.

[00:28:58]

Just like the number of stars in the sky, there's so much stuff you should know. Her with Amena Brown is a weekly podcast brought to you by the Seneca Women Podcast Network and iHeartRadio. I'm your host, Amena Brown, and each week I'm bringing you hilarious storytelling and soulful conversation, centering the stories of Black, Indigenous, Latinx and Asian women. Each week we are going to laugh, consider and reflect upon the times. Join me as we remind each other to access joy, affect change and be inspired.

[00:29:35]

Listen to Her with Amena Brown on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.

[00:29:44]

A pandemic, a ruinous recession, protests, riots, racial strife, police brutality and, yes, Donald Trump.

[00:29:52]

America in 2020 feels like apocalypse now, again. I'm John Heilemann, and on Hell and High Water,

[00:29:58]

I'll explore this moment in a series of raw and real conversations with the people who shape our culture. Hell and High Water is a podcast from The Recount.

[00:30:06]

Listen to Hell and High Water on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts. So here's a big one: you ever heard the term "follow the money"? Mm hmm. That's applicable to a lot of realms of society, and most certainly in journals. If something looks hinky, just do a little investigating and see who's sponsoring the work. Well, especially if that person is like, no, everyone else is wrong. Right.

[00:30:50]

Climate change is not manmade, kind of thing. Sure. You know, if you look at where their funding is coming from, you might be unsurprised to find that it's coming from people who would benefit from the idea that anthropogenic climate change isn't real. Yeah, well, we might as well talk about him. OK. Willie Soon. Yeah. Mr. Soon. Is he a doctor? He's a physicist of some sort, yeah. All right.

[00:31:16]

I'm just going to say Mister, or Dr. Soon, because I'm not positive. Uh, he is one of the few professionals on the planet Earth, that is, who deny human-influenced climate change. Like you said. You said the fancier word for it, though: anthropogenic. Yeah, it's a good word.

[00:31:39]

Um, and he works at the Harvard-Smithsonian Center for Astrophysics. So, hey, he's with Harvard, he's got the cred. Right. Right.

[00:31:48]

Um, it turns out, when you look into where he's getting his funding, he received 1.2 million dollars over the past decade from ExxonMobil, the Southern Company, and the Kochs, their foundation, the Charles Koch Foundation. Exxon stopped funding him in 2010, but the bulk of his money and funding, and I'm sorry, I forgot the American Petroleum Institute, came from people who clearly had a dog in this fight.

[00:32:17]

And it's just, uh, how can you trust this, you know?

[00:32:22]

Yeah, well, you'd choose to, because here's a guy, and he has a PhD in aerospace engineering, by the way. All right. He's a doc. He works with this, um, this organization, the Harvard-Smithsonian Center for Astrophysics, which is a legitimate place. It doesn't get any funding from Harvard, but it gets a lot from NASA and from the Smithsonian.

[00:32:41]

Well, and Harvard's very quick to point this out when people ask them about Willie Soon. Right. They're kind of like, well, here's the quote: Willie Soon is a Smithsonian staff researcher at the Harvard-Smithsonian Center for Astrophysics, a collaboration of the Harvard College Observatory and the Smithsonian Astrophysical Observatory. Like, they just want to be real clear: even though he uses a Harvard email address, right, he's not our employee.

[00:33:05]

No, but again, he's getting lots of funding from NASA and lots of funding from the Smithsonian.

[00:33:09]

This guy, if his scientific beliefs are what they are, and he's a smart guy, yeah, then, yeah, I don't know about him, like, getting fired for saying, you know, here's a paper on the idea that climate change is not human-made. Yeah.

[00:33:27]

He thinks it's the sun's fault. But he didn't reveal, in any of the conflicts of interest that should go at the end of a paper, where his funding was coming from. Yeah. And I get the impression that in academia, if you are totally cool with everybody thinking you're a shill, you can get away with it. Right. Well, a lot of this stuff is not illegal, right? Even predatory publishing is not illegal.

[00:33:58]

Yeah, just unethical. Right. And if you're counting on people to police themselves with ethics, a lot of times they will disappoint you. The Heartland Institute gave Willie Soon a courage award.

[00:34:09]

And if you don't care what other scientists think of you... If you've heard of the Heartland Institute, you might remember them.

[00:34:15]

They're a conservative think tank. You might remember them from the 90s, when they worked alongside Philip Morris to deny the risks of secondhand smoke.

[00:34:24]

Yeah, that's all chronicled in that book I've talked about, Merchants of Doubt. Really, just a bunch of legitimate, bona fide scientists who are up for being bought. Yeah. By groups like that. It is sad. And the whole thing is, they're saying, like, well, you can't say beyond a shadow of a doubt, right, with absolute certainty, that that's the case. And science is like, no, science doesn't do that.

[00:34:54]

Science doesn't do absolute certainty. But the average person reading a newspaper sees that: oh, you can't say with absolute certainty, so maybe it isn't manmade. Right. And then there's that doubt. And these people just go and get the money for saying that, for writing papers about it. Yeah.

[00:35:08]

For millions of dollars. Despicable. Yeah. It really is. Um, self-reviewed.

[00:35:16]

You've heard of peer review; we've talked about it quite a bit. Peer review is when you have a study, and then one or more, ideally more, of your peers reviews your study and says: you know what, you used best practices, you did it right, it was reproducible, you followed the scientific method, and I give it my stamp of approval and put my name on it. Not literally. Or is it? I think so. It says who reviewed it,

[00:35:38]

I believe, in the journal, when it's published. But not my name as the author of the study, you know what I mean? Right.

[00:35:45]

As the peer reviewer. Yeah, as a peer reviewer. And that's a wonderful thing.

[00:35:49]

But people have faked this and been their own peer reviewers, which is not how it works.

[00:35:56]

No. Who's this guy? Uh, well, I'm terrible at pronouncing Korean names, so all apologies, but I'm going to say Hyung-In Moon. Nice. Dr. Moon, I think. Yeah, Dr. Moon. OK, so Dr. Moon worked on natural medicine, I believe, and was submitting all these papers that were getting reviewed very quickly, because apparently part of the process of peer review is for the journal to say: this paper is great, can you recommend some people in your field

[00:36:29]

Right. Who can review your paper? And Dr. Moon said, I sure can.

[00:36:34]

Yeah, he was on fire. Let me go make up some people and make up some email addresses that actually come to my inbox. And he just posed as all of his own peer reviewers.

[00:36:44]

He was lazy, though, is the thing. Like, I don't know that he would have been found out if he hadn't been careless, I guess, because he was returning the reviews within, like, 24 hours sometimes. Yeah. A peer review of a real study should take, I would guess, weeks, if not months. Yeah. Like, the publication schedule for the average study or paper, I don't think, is a very quick thing. There's no quick turnaround.

[00:37:13]

Right. And this guy was turning them around in, like, 24 hours. Like: Dr. Moon,

[00:37:18]

I see your paper was reviewed and accepted by Dr. Mooney. It's like I just added a Y to the end. It seemed easy. Yeah.

[00:37:27]

Uh, if you Google "peer review fraud," you will be shocked at how often this happens and how many legit science publishers are having to retract studies. And it doesn't mean they're bad; they're getting duped as well. But there is one based in Berlin that in 2015 had 64 retractions because of fraudulent reviews. Oh, wow. And they're just one publisher of many. Every publisher out there has probably been duped. Um, maybe not every one. I'm surmising that.

[00:38:01]

But it's a big problem. We should do a study on peer review. It'll end up in the headlines, right? Every single publisher duped, says Chuck. And speaking of the headlines, Chuck, one of the problems with science reporting, or reading science reporting, is that what you usually are hearing, especially if it's making a big splash, is what are called the initial findings. Right. Somebody carried out a study, and this is what they found.

[00:38:30]

And it's amazing and mind-blowing, and it supports everything everyone's always known, but now there's a scientific study that says, yes, that's the case. And then if you wait a year or two, when people follow up and reproduce the study and find that it's actually not the case, that usually doesn't get reported on.

[00:38:50]

Yeah. And sometimes the scientist or the publisher is doing it right, and they say "initial findings." Right. And sometimes even the reporter will say "initial findings." But we, as the people who ingest this stuff, need to understand what that means. Right. Um, and the fine print is always like, you know, "more study is needed." But if it's something that you want to be true, you'll just say, hey, look at the study.

[00:39:22]

Right. You know, even though it's brand new and they need to study it for 20 more years.

[00:39:26]

But, hey, look what it says, right? And the more you start paying attention to this kind of thing, the more kind of disdain you have for that kind of offhand, um, sensationalist science reporting. Yeah, but you'll still get caught up in it. Like, every once in a while I'll catch myself, like, saying something like, oh, did you hear this? And then as I'm saying it out loud, I'm like, that's preposterous.

[00:39:49]

Yeah. There's no way that's going to pan out to be true. I got baited. I know. I mean, we have to avoid this stuff. It's tough. Yeah. Because we have our names on this podcast. Uh, but luckily we've given ourselves the back door of saying, hey, we make mistakes a lot.

[00:40:06]

Yeah, it's true, though. We're not experts. No, we're not scientists.

[00:40:11]

And then finally, we're going to finish up with, well, the header on this one is "It's a cool story." Yeah. And that's a big one, because it's not enough these days, and this all ties in with the media and how we read things as people, but it's not enough just to have a study that might prove something. Right. You have to wrap it up in a nice package. Yeah. And deliver it so people get it in the news cycle, and the cooler, the better.

[00:40:36]

Yep, yep. It almost doesn't matter about the science, as far as the media is concerned. They just want a good headline and a scientist who will say, yeah, that's cool, here's what I found. Yep. This is going to change the world. Mm hmm. The Loch Ness Monster is real. This kind of ended up being depressing somehow. Yeah, not somehow. It's kind of depressing.

[00:41:05]

I know. We'll figure it out, Chuck. Well, we do our best, I'll say that. Science will prevail. I hope so. If you want to know more about science and scientific studies and research fraud and all that kind of stuff, just type some random words into the search bar at HowStuffWorks.com and see what comes up. Yeah. And since I said random, it's time for listener mail. Oh, no. Oh yeah. You know what?

[00:41:28]

It's time for Administrative Details. All right, Josh. Administrative Details, if you're new to the show and don't know what it is, is a very clunky title for saying thank you to listeners who send us neat things. It is clunky and generic, and I've totally gotten used to it by now. Well, you're the one who made it up to be clunky and generic, and it stuck. Yeah. So people send us stuff from time to time, and it's just very kind of you to do so.

[00:41:59]

Yes. And we like to give shout-outs, whether it's just out of the goodness of your heart or if you have a little small business that you're trying to plug. Either way, it's a sneaky way of getting it in there. Yeah, but I mean, I think we brought that on, didn't we? Didn't we say, like, if you have a small business and send us something, we'll be happy to say something?

[00:42:17]

Exactly. Thank you. All right. So let's get it going here. We got some coffee from 1000 Faces, right here in Athens, Georgia, from Kayla. Yeah, delicious.

[00:42:28]

Yes, it was. We also got some other coffee, too, from Jonathan at Steamworks Coffee. He came up with a Josh and Chuck blend. Oh, yeah, it's pretty awesome. I believe it's available for sale, too. Yeah. That Josh and Chuck blend is dark and bitter.

[00:42:44]

Uh, Jim Simmons, he's a retired teacher who sent us some lovely handmade wooden bowls. Oh, yeah. And a very nice handwritten letter, which is always great. Thanks a lot, Jim. Uh, let's see. Chamberlain sent us homemade pasta, including a delicious savory pumpkin fettuccine. It was very nice. Yum.

[00:43:05]

Uh, Jay Kraft sent us a postcard from the Great Wall of China. It's kind of neat; sometimes we get those postcards from places we've talked about, and it's like, boom. Thanks, Jay. Uh, let's see. The Hammerpress team.

[00:43:19]

They sent us a bunch of Mother's Day cards that are wonderful. Oh, those were really nice. Really great. You should check them out, the Hammerpress team. Yeah, yeah. Uh, Misti and Jessica, they sent us a care package of a lot of things. There were some cookies, um, including one of my favorites: white-chocolate-dipped Ritz and peanut butter crackers.

[00:43:38]

Oh, yeah, man, I love those. Homemade, right? Oh yeah. Yeah. And then some 70s macramé, uh, for you, along with a 70s macramé magazine. Yeah. Because you're obsessed with macramé. We have a macramé plant holder hanging from my microphone arm, holding a coffee mug sent to us by Joe and Linda Hecht. Oh, that's right. And it has some pens in it.

[00:44:02]

And they also sent us a drawing, just a lovely little hand-drawn picture of us with their family, which was so sweet. That's very awesome.

[00:44:09]

Um, we've said it before, we'll say it again: huge thank you to Jim Ruane, I believe that's how you say his name, and the Crown Royal people for sending us all the Crown Royal. We are running low.

[00:44:21]

Uh, Mark Sahlberg of the Rocky Mountain Institute sent us a book called Reinventing Fire. Oh, yeah. They're great out there, and they know what they're talking about. And I think it's Reinventing Fire, colon, Bold Business Solutions for the New Energy Era.

[00:44:36]

Yeah, they're basically like green energy observers, but I think, um, they're experts in all sectors of energy, with a focus on green energy, which is awesome. Yeah, they're pretty cool. Um, John, whose wife makes Delightfully Delicious doggy treats (Delightfully Delicious is the name of the company): there are no artificial colors or flavors, and they got sweet little Momo hooked on sweet potato dog treats. I thought you were going to say hooked on the junk, the sweet potato junk.

[00:45:06]

She's crazy cuckoo for sweet potatoes. Nice. Oh, man. That's good for a dog, too. It is. Uh, Strat Johnson sent us his band's LP, and if you're in a band and your name is Strat, that's pretty cool. Sure. Uh, Diermeier, I think that's it. Yeah, I'm not sure if I pronounced it right. I owe him an apology.

[00:45:30]

Frederick, this is long overdue. Frederick, at the 1521 Store, that's 1521store.com, sent us some awesome low-profile cork iPhone cases and passport holders. And I was telling him Jerry walks around with her iPhone in the cork holder, and it looks pretty sweet. Oh, yeah. So he said: awesome, glad to hear it. Joe and Holly Harper sent us some really cool 3D-printed Stuff You Should Know things, like an SYSK, uh, you know, like a little desk logo, like after Robert Indiana's LOVE sculpture.

[00:46:04]

Yeah, that's it. I couldn't think of what that was from. Yeah. It's awesome. It's really neat. And, like, a bracelet, um, made out of 3D-carved Stuff You Should Know plastic. It's really neat. Yeah. They did some good stuff. Thanks, Joe and Holly Harper, for that. And then, last for this one, we got a postcard from Yosemite National Park from Lord Jex. And so thanks a lot for that. Thanks to everybody who sends this stuff.

[00:46:27]

It's nice to know we're thought of and we appreciate it. Yeah.

[00:46:31]

We're going to finish up with another set on the next episode of Administrative Details. You got anything else? No, that's it.

[00:46:39]

Oh, yeah. If you guys want to hang out with us on social media, you can go to SYSK Podcast on Twitter or on Instagram. You can hang out with us at Facebook.com/stuffyoushouldknow. You can send us an email to stuffpodcast@howstuffworks.com. And as always, join us at our home on the web, StuffYouShouldKnow.com.

[00:47:02]

Stuff You Should Know is a production of iHeartRadio's HowStuffWorks. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts or wherever you listen to your favorite shows. Hi, I'm Brian Huskey. I'm bald. And I'm Charlie Sanders. I'm also bald, and we want to talk to people about it. Charlie, did you know that the less hair you have, the more interesting you become? Yeah, of course, everybody knows that. Oh. Well, on our podcast Bald Talk, we interview people about being bald.

[00:47:30]

Brian, is this show just for baldies?

[00:47:32]

Charlie, no. Hairies will enjoy this, too. I mean, the show is about perception, insecurity, vanity, just, like, human stuff. And you wouldn't believe the things that come up. Listen to Bald Talk on the iHeartRadio app, Apple Podcasts or wherever you listen to podcasts.

[00:47:46]

I'm Jennifer Palmieri, host of a new podcast from The Recount, Just Something About Her. After working on five presidential campaigns, I thought women could achieve the same success as men if they played by the rules. Then 2016 happened. In my podcast, Just Something About Her, I'll talk with women CEOs, athletes, politicians and more, so together we can create our own rules. Listen to Just Something About Her on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.