This podcast is brought to you by the Albany Public Library main branch and the generosity of listeners like you. "God, Daddy, these people talk as much as you do!" Razib Khan's Unsupervised Learning.
Hey everybody, this is Razib Khan, and I am here with the Unsupervised Learning podcast - and I guess it's on YouTube now, so maybe I should say "the show" or whatever people say. I'm here with Adam Mastroianni. He is a postdoctoral researcher at Columbia Business School, and I think he's a social psychologist - we'll talk about that a little later. He also has a Substack, "Experimental History," which, Adam, you have talked about people sometimes misunderstanding. And I did too. Just so the listeners know, I've read some of your stuff coming off the internet - I first saw it off Twitter - and I didn't really pay attention to what your Substack was called. Then later I went back to your Substack independently and was like, wait, it's called Experimental History? What does this have to do with history? You're going to be a big enough deal that people are going to ignore that little misdirection. But anyway, I think I've introduced you a little; you can introduce yourself more if you want. Or we can just go into peer review: why it's a big deal, why your post on peer review became a big deal, and where we go from here with peer review. So take it away.
Yeah, I guess I am a social psychologist - I mean, I got my PhD in psychology working with a social psychologist, which I guess is what qualifies you to be one. During the day I teach negotiation to MBAs at Columbia Business School; that's my day job. But really, my love and my real job is writing that Substack called "Experimental History," which is a terrible name for a blog that's mainly about psychology. But that's sort of a purposeful misdirect - I'm trying to make a point that I think we can get into later.
Yeah. So, you know, a lot of people listening to this know about my Substack - some of them just subscribe on Spotify or something, and you illiterates just listen to podcasts, but let's put that aside; I know you exist, because you reach out to me. In any case, you're calling it a blog, and I have a blog too. I do have to say - do you know Gregory Benford, the science fiction writer? No? So he's written a lot of science fiction, but he was also, I think, a professor of physics at UC Irvine before he retired. He was once described, in terms of his prose style, as "what if a physicist swallowed an English professor?" And I feel that way about our mutual friend Eric - I'm going to praise him because he's not here, so he won't be embarrassed: what if a scientist swallowed a literary fiction writer? Which I think he kind of is, in a way. So yes, it's a blog, but he is a writer, and his work is very writerly. I try to do something a little more long form, with a little depth, and I think you try to do something similar. You were pretty explicit on your podcast with Eric - which I'll link to, so people can listen if they want - that you looked at what he did and thought, oh, this is cool. So I just want to be clear: you're trying to do something kind of different and related to science, and I don't know if we have a good word for what that is. I think that was clear in the conversation between you and Eric, because you were both like, well, what are we doing here? And I feel like I'm a little bit in the same vein as well. So I just want to put that out there so people understand - and everyone should check out your Substack, obviously, the link is there. But what do we mean by "blog"? A blog could be a scientist posting "this is the experimental method" on their lab website - that's a blog - and then there's what Eric is doing, and that's a blog. And your piece on peer review is also a blog post, but that's obviously a whole different thing. I don't know how much traffic you got, but people wouldn't shut up about it for weeks. You touched some nerves. You went viral, bro. So why did you go viral? Can you tell us what happened there?
Yeah, I think it's because I pulled the internet slot machine and it came up all stars, or however a slot machine works. It's funny - when I went to post that piece, I thought, ah, this is kind of inside baseball, it's a pretty niche topic. A lot of the people who read what I do aren't scientists, so this one probably won't get read that much. And then it became the most read thing I've ever done. I think it was because it touched a nerve - it said a thing that a lot of people believe but don't want to say out loud. It was sort of an emperor-has-no-clothes moment: the way we treat this thing we call peer review as some kind of quality assurance - it doesn't really do that, and in fact it might cost more than it benefits us, and it's historically weird, and it's worth doing something else. I wrote that piece a few weeks after I had written and posted a scientific paper online. I had written it like I would write in my normal voice: here's what we did, and if you want to know more, here's the data, here's the code, here are the materials. But the blog post itself, for 99.9% of people, is everything you need to understand what we did, why we did it, and why we think it matters. This post was sort of a follow-up to that, saying: that's why I did it. I don't see any benefit of putting that paper in a journal. I got everything I wanted from posting it on the internet, because it got a ton of attention and people responded to it. People sent me reviews, they sent me annotated PDFs, I heard from the people I cited in it. And I was like, well, what could a journal possibly give me that I didn't already get? So this piece is in a sequence with that one as well.
Yeah. And I actually have to say, I have been thinking about this. I was in academia before I entered the startup world about six years ago. Some of you might know this, but probably not: I wrote a piece with my friends David Mittelman and Laurie Goodman for Genome Biology about dragging scientific publishing into the 21st century. That was in 2014. My hypothesis was that the system's not sustainable, it's going to collapse, there's going to be a kind of preference cascade. But we're still here - almost ten years later, we're still here. Someday the status quo will change, and I still believe it will happen very fast, because many people are very, very unhappy. That's why your piece went viral. And just to describe peer review, the details of what it is: you have an editor at some publication - there are like 20,000 of them or whatever now, so we're not just talking Nature or Science, we're talking a lot of regular publications - you send in a manuscript, and they have to find people who are peers who can review it. Most of the time, I think, it's blind, although now there are various different ways of handling that. And often there are really only six people who are good candidates - you kind of know who's going to be looking at it. So that's your peer, who has a veto on you, and there are shenanigans that can happen. This really did happen about 20 years ago, so I think I can say it now without naming names: people sit on papers and keep them from getting through peer review, because they know someone in their own lab is working on something similar, and then that comes out, and everyone knows after the fact what happened. So peer review gets used in scientific politics. That one, frankly, was evil. Much more mundane was what a mentor of mine once told me: "I would love to do peer review, because I get to see the results ahead of time." But now, with preprints, the value of that is decreasing a lot. Preprints are where you write it up and just put it out there - eventually it usually goes into the peer review system anyway, and you can get some pre-publication, early feedback. The problem, I think, is that functionally there are too many preprints. Everything is subject to a power law: a few preprints get a lot of feedback and it works, but most of them are just out there. So that's fundamentally part of the problem. But you talk about the fact that if you're in academia now, it feels like peer review has been like this forever. It hasn't. Can you talk about the history of it a little bit?
Yeah. I mean, it's funny how short our memories are, because it feels like I came into this field and people were like, "and so did Pharaoh, and the scribes before us - it's been this way for centuries." Really, if you look at it, there are things like peer review that have been around for a long time, but this thing we have now, which is basically universal peer review, is pretty new. It's about a generation old; somewhere in the 1960s or '70s is when we got rid of most of the other stuff. Up until that point, scientific publishing had been a pretty diverse hodgepodge: there were scientific societies publishing the things their members were doing, there were scientists sending letters to one another, there were basically magazines reporting on science. All of these were ways you could publish your science. It's really hard to reconstruct what the prestige hierarchy was, but it doesn't seem like everybody knew, "ah yes, if you get it in over here that's prestigious, and over here it isn't." However prestige in science was administered, it was some way other than where you published your paper. These days it's not that way: everybody knows that getting a paper into Science or Nature is the best, after that there are a few other general science journals, and after that there's a clear hierarchy in your field. That's also new. So this idea that the only way a finding enters the scientific record - is socially co-registered, so that we all agree this has happened - is that you send it to a peer-reviewed journal, they send it out, and they publish it: that part is historically very strange. One anecdote that illustrates this is that Einstein only ever had one paper that was peer reviewed, and he didn't realize it was going to happen. He was so angry about it that he pulled the paper and published it somewhere else. So even 100 years ago, the idea that this would be what you do de facto was kind of crazy.
Yeah. So let's talk about this in a meta way a little bit. A lot of academics today, the way we're trained - let's say past tense, the way I was - treat the peer review system, the publication system, as science qua science. But that's not how it always was. Science has existed since at least the 1600s if you take a narrow view, and for thousands of years if you take a broad view. So what is science? You can say it's a method, but it's not one specific method - Popper's method didn't exist 300 years ago. Science has been around for a long time, and it has different instantiations, different enculturations; it's been embedded in culture in different ways. Science was professionalized, so there are professional scientists, and these professional scientists need metrics to justify their grants, and the metrics are the publications. So this has been subjected to Goodhart's law: when you have an explicit measure, people start aiming for the measure instead of what they're actually supposed to be doing, which is science. That's just the reality - I'm sure we could tell many stories; everyone within academia knows how this works. It's a little weird to people outside of academia. There are bizarre things I took for granted to such an extent that I didn't realize other people, like writers, don't know them. My friend Thomas Chatterton Williams, a prominent writer, was shocked that scientists don't get paid for their papers. Everyone who's a scientist, or has been in academia, is laughing right now that this would be a shocking reveal. In fact, you quite often have to pay - in certain journals, for open access to get it out there, or for color figures and all that. So this is all a local enculturation of how science is done; there's nothing fundamentally deep about peer review. And I'll let you comment on this, but I do want to say one thing that really annoys me about peer review - you said this in your podcast with Eric: once something is peer reviewed and published, it has the patina of truth. And it's really annoying when it just happens to be wrong - but not wrong in a way that's retractable. There are papers like that out there. I'll give the listeners a concrete example so it's not so abstract. There's a paper published in 2013 that showed Australian Aborigines in the Northern Territory had 10% Indian ancestry. It was published in the Proceedings of the National Academy of Sciences - so not Nature or Science, but just below; it's peer reviewed, it's published, it's out there. The statistics just got messed up in some weird way - I don't think there was any fraud, and there's nothing that says it should be retracted - but nobody really believes that result. So people will point to that result, and I just have to say nobody believes it. And I've been having to do that for 10 years now. Can I just put a note on top of the PNAS paper, "nobody believes this result"? I don't want there to be a retraction, I don't want to cause a problem for the authors - stuff happens, results turn out not to be robust. Maybe that's what I should say: "it's not robust." But normal people don't know what that means in that context.
I just say, look, the result's not a real result. Can you just stop telling me about this peer-reviewed paper in a prestigious journal? It's an awkward thing to say, but it's true.
Yeah, which I think is a great example that the way we form scientific consensus, the way we build progress, isn't exactly that the papers come out, they're all vetted, and then they go into the scientific record - because we know things exactly like this happen. We know a good chunk of these papers are fraudulent, or would never replicate, or have various other issues. In the long term, I do think it all generally works out. But in the short term - which is what we really want peer review to do for us, to tell us what's true and not true in the short term - it doesn't work out. Because to actually do the vetting it takes to really tell whether a paper is true is a ton of work, and virtually no paper actually deserves that amount of attention. And when we pretend that we've given it that amount of attention, I think we end up worse off, because we're like, well, it's been peer reviewed, so who is to say it's untrue - other than literally everybody, because you can read it for yourself and see whether it's true or not.
Yeah. I'm sure you've been a peer reviewer; I've been a peer reviewer. And sometimes - I guess I won't get in trouble for saying it - you don't put that much work into it. Sometimes you do. There's variance; there are trade-offs. I don't feel great about not digging as deep as I should have, but a lot of times - this was back when I was in graduate school, because I didn't stay in academia very long - they were desperate to find someone to review, say, a mammalian genomics paper. Concretely, this was happening. So I'm doing them a mitzvah, and I'm like, okay, I can't devote that much time, but I will do it. There's a lot of suboptimality here. But the issue is, it's still called peer review. You don't know how much effort the peer put in, you don't know who they are - there's all this variance going in. Seeing how the sausage is made makes it seem a lot less abstract and elegant and, you know, highfalutin. And people will say to me - often scientists - "this is a peer-reviewed paper." Come on. This is a form of Gell-Mann amnesia, because in your own field you would never say that.
Yeah,
because you know whose papers are real and whose aren't.
Yeah,
You know what I’m saying?
Whenever someone says that, basically they're saying, "someone else did my homework for me." And unfortunately, if you really want to know whether something's true or not, you do have to apply some labor. I think the solution isn't to apply ten times more labor and figure out whether each thing is true or not; I think it's actually to hold each truth much more loosely, and to understand that these things are speculative, a lot of them are going to be untrue, we're probably not going to know right away which ones are true or untrue, and a lot of them don't matter in the first place. So this is the stance I've developed toward any scientific claim that I see. Most of the time I'm like, "could be." A lot of times I'm like, "could not be." And very rarely do I feel like "definitely is." I think that's the appropriate stance for encountering all these scientific claims when you can't actually spend the time it would take to really evaluate them. That level of rigor should basically be reserved for the things that need to be true in order for your own work to be true.
Yeah. And another thing I have to say, somewhat unrelated to peer review - I just want to know what you think. Credentialism is often very, very selective too, by the way. Sometimes you'll hear, "oh, so-and-so is a biologist, they know about this topic." And I'm just like, there's really no such thing as "a biologist." No one studies all of "-ology"; they study one little domain in biology, and outside of that - okay, I know a little bit of developmental genetics, a little bit, but I don't know that much. I really feel uncomfortable evaluating or talking about any of that beyond what a random person on the street could; the difference, in my opinion, is marginal. Population genetics is a different thing. So there's this narrow specificity - that's fair. On the other hand, I'm going to be honest - and I don't know what you think, because I honestly don't know you that well, Adam; I follow you on Twitter, I read your stuff, but I don't know you as a person - I'm starting to get really annoyed recently about "oh, so-and-so is a PhD, you should give them the respect of a PhD." And I'm just like, what the hell?
No. No way.
And also, to be honest, I've never had a physicist say, "Well, you don't have a PhD." I'm just saying - physicists have a certain culture. Credentials don't matter; it doesn't matter who you are. So that's something I'm seeing recently, and I think it's related to the whole "it's peer reviewed," "oh, it's in that journal" thing. We're forgetting what science is about. Science is about what you say, not who you are, anyway.
Yeah. I think whenever someone makes an argument like that, they're basically telling on themselves. They're saying: I can't produce for you the arguments that would convince you, so instead I'm going to appeal to authority - which is basically no different from holding up a Bible and saying, "well, it's in this book, and therefore it's true." The whole point of being an expert is that you should be better at producing arguments that are convincing to people. And if all you can say is, "well, I spent six years, and now I have a document that says I spent six years doing this, and therefore you should trust me," that is a big red flag - I do not trust you. Because I feel like you don't actually take me seriously either, if that's all you can tell me. I understand that some things might take a long time for me to understand, but if you can't actually get me there, then I just have to have faith in you like I would a priest. And the whole point of us doing this thing is that we shouldn't have to do that - we don't have to rely on faith, we can verify.
Yeah. And I have sometimes thought that some scientists act like they are a priesthood. I mean - this is a little off topic - I was an early COVID hawk, and I think I've said this on the podcast, so some listeners will know: between the last week of January and the last week of February of 2020, I was extremely stressed out. A lot of people thought I was mentally unstable because of what I would talk about all the time - the coming pandemic. Later they apologized; I actually got texts from people saying, "we took some precautions because you seemed a little too crazy, and I guess you weren't crazy." Anyway. I follow mostly bio Twitter - mostly geneticists and people like that. These are PhDs, a lot of them are also microbiologists, and to them everything was fine. And so I was just like, am I insane? I'm not going to lie: after that, I stopped caring so much about the wisdom of the community, because it was pretty obvious, in my opinion, what was happening at the end of January. And I talked to other geneticists in my field - a friend who was a professor at Cornell, he's not there anymore so I can say this, though I won't say his name - and he started freaking out around February 10. All of us who were early remember the dates when we started freaking out. And people in his department were basically like, "you're insane, just get a grip." That sort of thing shakes you, because you see these experts, these people you trust to think deeply about things, and they're on autopilot a lot of the time. So, peer review - going back to what we were talking about - these scientific frameworks, the credentialing, the institutions, they've taken on a life of their own. In your conversation with Eric, I think you talked about getting a PhD as an apprenticeship, and how what it's really about is grasping the truth that's out there. I think sometimes we lose the forest for the trees here. And I think that's why your post about peer review elicited extremely strong opinions: the people who agreed with it were like, yes, yes, this is true; and the people who disagreed with it - I'm not going to lie - know that it's touching on something real. That's why they're enraged. Otherwise they'd ignore you.
Yes. This is, again, people telling on themselves. I literally had a tenured professor in my comments being like, "Adam, are you still a Resident Advisor at Quincy House at Harvard?" - which is a job I used to have - "I'm a Harvard alum, and I have worries about your ability to mentor undergraduates." Like, this is very serious, basically threatening my job on the internet - fortunately, a previous job. And I was like, man, in what world is this a good use of your time? How could a blog post by some random guy be worth this person descending from their tenured chair to say, "I want to get you fired"? I think it goes to show that these are assumptions people really don't appreciate having questioned, in large part because they benefit from them. A belief that I think underlies a lot of this - a funny thing I've wanted to write about but haven't gotten quite right yet - is the idea that we are at the end of science. I think it's an unspoken but implicit assumption a lot of people have: that all the stories we hear about the scientific consensus being overturned are in the past, that it won't happen to us now, that we pretty much have it down. Obviously we don't know everything, but we really are on the right track; our understanding of the universe is not going to be revolutionized in the future. That's just wrong, and every past generation thought the same thing. So when the scientific consensus says one thing and a few people are saying another, and people go, "well, why can't you get on board with the consensus?" - I mean, maybe the consensus is right. But maybe this is one of those cases where things are going to flip. There was a time when people thought it was crazy to say you should wash your hands between touching a cadaver and delivering a baby, and now we think it's crazy that you ever wouldn't. There will be things today that we feel that way about in the future; we just don't know which ones. And if you don't believe that - if you think we're pretty much at the end - I understand why you would love peer review, because you really want to make sure those t's are crossed and those i's are dotted, because we're just finishing things up here. But if you think there are big things left to discover, then I think you would favor strategies that actually increase the variance of what we're doing: some things are going to be radically wrong, but some things might be radically right, in a way that's at odds with our current understanding.
Yeah. I mean, I'm actually not a big fan of mantras like "trust the science." Look, I fly in airplanes; I trust that science, okay. But a lot of the time - going back to COVID stuff, masking - we flipped multiple times, and the science is really uncertain. It just is. And I'm not a pro- or anti-masker - I was an early masker, I'll tell you, but whatever - the science was uncertain. I remember a friend of mine, Iona Italia, was verbally abused by a scientist in early March for being a pro-masker, and then that scientist was verbally abusing anti-maskers in June, and she was like, what's going on here? The whole thing was just really weird, that people behave this way - like they're 100% sure. It doesn't matter what the opinion is; they're 100% sure. I think what you were saying earlier about uncertainty is very well taken. We all need to be more humble, and this is something we all need to think about, because I think it's human nature to dig in and take whatever opinion you have and be maximalist about it. So, in terms of uncertainty: as I said earlier, I'm a little uncertain what social psychology is. So tell me what it is.
Me too - I don't think anyone could really give you a good answer to this, but let me try. I think social psychology is experimental history, and that's why I call my Substack that. Which is to say: regular historians wait for things to happen and then try to understand what happened and why. I think social psychologists do exactly the same thing, except that we construct those situations rather than waiting for them to occur. We create a situation, whether it's asking someone a question or putting them in a scenario, we observe what they do, and we try to make sense of it. The processes are very similar; the only difference is whether you create the situation or not. And what we produce, much like what historians produce, is stories. I know a lot of my colleagues don't like that word, because it's become equated with "well, to publish a paper you've got to tell a good story," and telling a good story suggests you're glossing over the details, or maybe lying a little, or leaving something out. But when I say story, I mean it in the most literal sense: people doing things in a place is inherently a story, and that is actually the knowledge that psychology produces. We have people do things in a place, and to talk about it any other way is to obscure what's really going on. What happens a lot in psychology is that rather than talking about people doing things in places, we talk about constructs interacting with each other - like, "depression moderates the interaction of self-objectification and approach orientation" - all of which are just clumsy ways of describing things that people did. The real story behind all those words is that the way people bubbled in a bubble over here, when we asked them how sad they are, was statistically related to how they bubbled in a bubble over there, when we asked them how much they think of themselves as an object, and to this other question about how much they want to go toward some thing. Really, this is a story about people bubbling in bubbles on a computer screen. And when you put it that way, it becomes much clearer that, oh, actually, that story is not interesting, and it probably doesn't tell us much that's important about human behavior. That's not always the case - sometimes people bubbling things in matters a lot; that is literally what voting is, and it has a huge effect, so we do want to understand those behaviors. But I think when you think about psychology as the production of these stories, you get better at picking out which of these stories would matter in the first place.
So there's a question I think was once put to Paul Samuelson: what is a non-obvious, significant finding in your field? And I think Samuelson said, for economics, comparative advantage. Okay - is there anything like that in social psychology?
Yeah, this is going to be an unpopular answer, because the narrative on these studies changes over time - I'll get into that. I think the Milgram studies are this. So the Milgram studies: back in the '60s, a guy in a basement at Yale basically got people to think they were shocking someone to death when they really weren't. The level-one understanding of that study is: wow, people are so obedient to authority - look what people will do just because they're told to. The level-two understanding is: you know, if you look really closely at it, did that really happen? They were really pushing people, and a lot of stuff has come out about how much we can actually trust those results. The level-three understanding - which is of course the top level, and the one I inhabit - is that this actually does tell us something extremely insightful about human behavior. When they surveyed people and asked, "what do you think people would do in this situation - what percent of people do you think would shock someone all the way to the end?", people said basically no one. They thought it was implausible that you could construct a situation where people would do this thing where they believed they were killing someone. And it turned out to be fairly easy to construct a situation where at least some people did that - they ran a bunch of different versions of this, and in one version it was as high as two thirds. I think that tells us something: you can create situations in which people do things that seem impossible for them to do. That's actually one of the central findings of social psychology, this power of the situation. You might think, well, there are good people and bad people, and what's going to happen is the bad people come in and do the bad thing, and the good people come in and do the good thing. No - actually, the good people are going to come in and try really hard to be good, and if you create a situation where being good actually does something bad, they're going to do a bad thing. So that is one of them.
Well, so isn't that also the insight of history?
Yeah, though I think you can look back on history and go, well, the people who did the bad things were bad people, and that's why they did them. There was a hypothesis like: the Nazis did the Holocaust because that's German culture and the German people - you couldn't just have a Holocaust in any country, you need a certain kind of person to do it. And I think a fairer lesson is that anybody is capable of doing that in the right situation - not literally anybody, but far more people than we think. And you can't write the Milgram result off, because this was a sample of mild-mannered people who don't do this in their everyday lives, and they varied the setup in a bunch of different ways. Yeah.
Yeah. I mean, for me the best example would be the My Lai massacre, because those were American soldiers following orders. And there were women and children there - they could see them, so you can't say they were bombing from afar. These people were just regular Americans. The officer who was in command there - there was actually a campaign to exonerate him, led by Jimmy Carter, actually,
yeah,
in the early 1970s, if you check it out. So, you know, history is complicated, and to me that seems obvious from history. But what you're saying is that social psychology has given us these controlled experiments, and to some extent you're quantifying it. And that is, for me, the value of a lot of contemporary behavioral science: it dovetails well with the historical sciences, where we can see social situations in aggregate. You can give a quantity - what percentage of people are extroverts, what percentage are introverts - and we know from history that some people are this or that. We'll have hypotheses like "Stalin was a sociopath or a psychopath" - well, what percentage of people are sociopaths or psychopaths? What is the enrichment, what's the odds ratio of a dictator being this or that? I think that is definitely interesting. How do you feel about Joe Henrich and some of his colleagues' criticisms of psychology for focusing on WEIRD subjects? Because social psychology, from what I remember - maybe it's changed - is mostly the social psychology of white college students in developed nations, which is a very narrow segment. I know there are some universals, but still.
Yeah, I mean, these days it's mainly the psychology of people who are on Amazon Mechanical Turk - if you open up pretty much any study, it's probably mostly that. To that I would say: it's a fair critique when what you want to claim is "this is something humans do universally," which is actually a really difficult claim to make in psychology. In fact, most of what our studies are set up to do is create an existence proof - to show not that something happens, but that it can happen. This is actually why I think it's really hard to run a psychology study that is interesting: you need a really good null hypothesis, which has to be that this can't happen. And for most things that we have any possibility of testing, I think it's not that crazy to think they can happen. So you can create a situation where, okay, if you say these words to people, you can create a group difference. But that doesn't mean it actually happens in the world; it means you could construct a situation where it does. I think that's actually why the Milgram studies are useful: people have the null hypothesis that you couldn't create that, or that the situation wouldn't do that, and you can show that people have the wrong theory about that situation. It's also why the whole campaign of judgment and decision making - heuristics and biases - has been so successful since the 1970s: there's a built-in null hypothesis that people, on average, should get this right. Of course we'll accept that they're inaccurate, but they shouldn't be biased on average. That's a fair default assumption, because there's wisdom of crowds: so long as people are just noisily bad at this, their errors should cancel each other out. But if people consistently think that things that are easier to call to mind are actually more likely to happen, or occur more often, then that's a bias. I think that's why that work has been so successful - because it disproves a very reasonable null hypothesis. And that's also why a lot of psychology fails. Yeah.
Is that null hypothesis - that people would not do this or that - related to the fundamental attribution error?
Not exactly. So that's the idea that you tend to attribute other people's behavior to their dispositions rather than their situations. I think this is a little different. This is the idea that people can be plenty inaccurate at whatever task you give them, but they shouldn't judge things, on average, in a biased way - and whenever they do, that's actually kind of interesting, because there must be some reason something is pushing people in that direction. So if people consistently think you're more likely to die in a plane crash than in a car crash, when we know that's not the case - well, people should get that, on average, correct if they're just naturally observing the world, even though some of them will get it wrong in one direction and some in the other. But when we find that people consistently get it wrong in the same direction, that suggests something's going on there that's interesting.
All right. So - I think for most of the listeners we don't need to repeat the whole replication crisis, which is not just a thing in psychology; it's kind of a rolling wave through academia, because of the way statistics works and, not just statistics, the way the scientific setup works, with selection effects - just Google "funnel plot" and you'll know exactly what I'm talking about, in terms of effect sizes and p-values and all that. I feel like in a way psychology is quite healthy, because it went through the replication crisis really intensely about a decade ago, and it's still going through it, whereas other fields - biomedicine, for example - I think still have not really faced up to what's happening, and it's going to be pretty atrocious when they do, because we're not talking about college students pressing buttons on a computer; a lot of people, honestly, companies, are going to pay for what happens. But let's talk about psychology. How do you feel after the replication crisis? Do you think things have changed a lot? I think you probably started your academic career right at the beginning of it.
Yeah, I was just starting graduate school as it really got bad. And my take is, yeah, it does seem like things have changed a lot, but I have this feeling that we maybe learned the wrong lesson from it. Now people are really interested in getting to the bottom of "is this thing true or not? Is it a statistical artifact or not?" - which is actually the second question you should ask about anything. The first is: would it matter whether this is true or not? And I actually think, for the reasons we just discussed, the answer to that question for most psychology studies is no - it wouldn't matter, there's not a reasonable null hypothesis here, and I wouldn't pay any money to know whether this is true, because it wouldn't shift my understanding of the world at all. If you don't ask that first question, you can waste a ton of time and resources trying to answer the second question about a finding that doesn't matter. And I feel like that's what we've actually been doing a lot of for the past 10 years. One of the poster children for this is ego depletion - the idea that as you use your willpower, you drain it - which, again, could be trivially shown to be true; I don't think anybody thinks it's impossible that it's true, or that you couldn't create some situation in which it's true. But what we spent a lot of time doing is testing this one specific task where you cross out E's on a page and then do some other task where you need to use your self-control - you don't eat a cookie, or you eat a rutabaga instead, or whatever, various versions of this. And at the end of it we're all like, hmm, it seems like maybe it doesn't hold up. But what was the question we were trying to answer in the first place? That people get tired? Because obviously they do. And if what we really want to know is whether crossing out all the letter E's on a page leads you to do something else - obviously no one cares about that thing specifically, because nobody sits around all day crossing out a bunch of letter E's. So what did we actually stand to learn here? I think that's a question no one was really asking, because it's a harder question, and a little more uncomfortable. As hard as it is to say, "I don't trust this result, I think you p-hacked," it's a lot harder to say, "I don't think you could have provided useful information by doing what you did." Maybe philosophers can say that to one another; psychologists are far too nice to be able to say that. And so instead we'll rerun the statistics, but not question why you got out of bed and did this in the first place.
Well, a mentor of mine - he was an ecological geneticist - would say: you can find a lot of things that are true but trivial, and there's no point to that. Is that what you're saying? Like, okay, whatever, it's true, but there's no general inference you can make about the world; it doesn't enlighten, it doesn't fill in gaps in your epistemological framework or whatever words you want to use - I think we all know what we mean. Science is contingent, findings build on each other, it creates this network of information, and even adjacent fields feed into each other. For example, neural networks are becoming really big in population genetics now - obviously they did not start out in population genetics; they came out of computer science and neuroscience and fields like that. So science is progressive and cumulative. Are you saying that some of these findings are just kind of one-offs - they're there, and that's it?
Yeah. We have this idea that we're building this great castle of science, and each stone is placed on another stone. And I think mainly what we have in psychology is a bunch of stones sitting alone in a field. So when people go up to one and kick it to pieces and go, "aha, this stone wasn't solid, you can't build on this" - well, there wasn't anything built on it, so what was the point of kicking it in the first place? Don't you want to try instead to make a stone that other people could build on, or build on another stone that you think is actually worth building on? I don't really see us doing that. And I think it comes back to this idea of thinking that we're at the end of science: if you think that, then you think it's really important to figure out whether each thing is true or not, because we're getting close to the end and this is kind of all there is. So it makes sense to start spell-checking the document, because we're about to turn it in. Whereas the whole document could get deleted and we could start a new one tomorrow. This is especially true in psychology, partly because we don't really have a paradigm or an ontology - we don't have a strong way of knowing what is useful and what isn't. We don't have a big periodic table with some gaps missing, where we say, "well, we have 39 and we have 41, so someone should go out and find 40," and when you find 40 you go, "oh, okay, we know it's probably going to have these properties and probably be useful for these things." We don't have anything like that. And I don't think that's to our shame or discredit - we've only really been at this for a very short amount of time, maybe 100 to 200 years, but really a generation, truly in earnest. And that's what makes it exciting for me to do psychology: you still could, you know, hit Control-A, delete everything, and start over again. I don't think there are all that many things I would bet would survive the next paradigm shift.
Yeah. So here's a question I haven't thought about in detail. In evolutionary biology you have these big questions driving things: why are complex life forms divided into two sexes with different gamete sizes? If you rewound history and let it go again, would the diversification happen the same way - what are the convergent and deterministic patterns versus the stochastic ones? These are questions we're testing. What are the equivalents in psychology? Honestly, just for the listener or viewer: my experience with a lot of psychology - cognitive psychology is a little different, in my opinion, because I feel like, whether they're right or wrong, they have certain models - but a lot of psychology is these neat results, the kind of results you'd tell at a cocktail party: "there was a study, and I read a New York Times article about it," blah, blah, blah. But actually, now that you're talking, and having listened to your podcast with Eric, I don't really know: where does that go? What would it look like? What are the big questions you're trying to solve - do we have a soul? I mean, I don't know.
Yeah, I think honestly the big question is: what is the big question? If we could pose the question better, we might do a better job. I know that's kind of a cop-out, but I do think we're still at the point chemistry was at however long ago, when we didn't even really know what stuff was made out of, and it could have been a lot of different things. We didn't even really know what a thing was - if you can't see it, is it even there? Are there things small enough that you can't perceive them with the naked eye? I think we're at about that level. And until we figure out how we think the universe is organized, it's really hard to make a lot of progress. That doesn't mean it's impossible - I do think we can produce results that will ultimately end up in our understanding, because they're important, because they speak to some kind of truth. But finding those is basically a matter of applying intuition rather than applying pure logic. And I think that's actually what I was learning in my PhD: sitting in a room for a long time with a guy who I think had an intuition for doing exactly that. And now I try to do it myself. Yeah.
Yeah, it's weird listening to you - this is not super flattering, but I'm going to just say it. You're talking about chemistry, and to a lesser extent biology. Let's talk about chemistry, because chemistry, in a way, has been around forever. The alchemists were not really cranks in the way they're depicted - they were trying hard. They just didn't know the fundamental units of -
Yeah,
you know, atoms - they didn't have the theory. So they were doing all these experiments and mixing things together, but they were trying hard; they were definitely proto-scientists. And in biology you have similar things. If you read some of Aristotle's ideas about generation - how slime comes from mud and weird stuff like that - okay, they just didn't understand, but they were trying. And there were pre-Darwinian theories of evolution around, and they were kind of weird, though the Lamarckian theory was actually one of the more coherent ones. So people have always been interested in life, interested in chemicals, but they didn't have a framework to actually work through it. I mean, are psychologists alchemists - is that what you're saying? I'm trying not to be a dick; I like listening to you, and it sounds like you guys are doing a lot of stuff, and it's not fake. But the problem with alchemy was that they didn't have the periodic table; they didn't understand molecules and atoms, and if you don't understand molecules and atoms, it's going to be a problem. Similarly, Darwin didn't have a genetic theory, and that was a huge problem for evolution for a while. And then all of a sudden genes were "invented," quote unquote, or rediscovered, and then everything - the math worked. You could actually do math, you know?
Yeah.
So, I mean, we're kind of coming to the tail end, and I didn't need to say this, but let's keep it a little spicy. I was just like, bro, are you an alchemist? What are you doing with your life? How do you get to the next stage? Yeah - I understand why that person wanted you fired from Quincy House.
Yeah, I'm poisoning the youth by telling them that we know so little. But again, I actually think that's the reason I find this exciting. And look, in the work that I do, I'm trying to do that thing of: okay, what is a finding that I think could be useful somewhere? So the paper that I published on my blog was trying to answer the question: when we judge things as good or bad, what are we actually doing? I thought a reasonable way the mind might do this is that we judge things as good when it's easy to imagine ways they could be worse, and we judge things as bad when it's easy to imagine ways they could be better. And we're still early enough in the science that that's not a crazy hypothesis to have - it hadn't been shown one way or the other. So what my friend and I did was ask people a bunch of questions like: how could this be different? How could cars be different? How could your pets be different? How could your life be different? People typed it out, and then we asked: okay, if it were different in that way, would it be better, worse, or just neutrally different? And people said it would be better - for every single item that we tested. Okay, if we ask the question in a different way, does that change it? No. When we ask, "how could this be better or worse?", people tell us better. When we ask, "have you thought recently about how things could be better or worse?", people say yes, and then they say better. When we ask Polish people in English, they do exactly the same thing; when we translate it into Mandarin, we get exactly the same answers. So it really seems like people are naturally imagining how things could be better. It could be that that fact ends up being totally irrelevant to whatever our understanding of psychology becomes, but I think it's a useful one, because 90% of our participants show it, which is very rare for a psychological finding - normally it's maybe a little more than half, on average. It seems to be pretty hard to turn this off, so it seems to be something we do quite a lot by default. And I think the reason it could be useful is that this might be one of the reasons the hedonic treadmill works. We know that once you're pretty happy, it's hard to make you much happier, and I don't think anyone really has a good explanation for why that is. This could be part of that explanation: if you are, by default, thinking about how things could be better, then even when things get better, you don't necessarily feel better. And look, it could all get washed away one day as our understanding of psychology is revolutionized - totally possible. But I hope this is a block that's worth building on, or at least a block that, when you kick it, won't fall apart. The more important thing, though, is that I think it was worth finding out whether it was true or not. It didn't answer our original question - I don't think this is how people judge whether things are good or not - but I think we found something else interesting along the way. So am I doing alchemy? I mean, yeah, maybe. And I hope that one day we aren't doing it. But I think the only way that ever happens is to try to do the alchemy the very best you can, until you figure out why you can't turn lead into gold.
Yeah. Okay, so it was great talking to you. I'm going to be honest: I'm not going to title this what I want to title it, "The Last of the Magicians," because not everyone is going to get that reference, but that's what I'm thinking of. Hopefully you will be the last of the magicians. I think all of us here - the listeners, and you, and me, Adam - want to know how the world works. One reason psychology gets criticized a lot, at least from my perspective, is because people care about it. People care about how we behave in the world and how we think - we think, therefore we are - so this is actually one of the most important fields in the human intellectual endeavor, in scholarship. That's why something like the replication crisis, or the fact that the field is theory-poor, matters to people in a very deep way. I mean, solid-state physics also matters to people, through microchip development, but you can't have a conversation with a person on the street about that. This is ultimately very, very fundamental to us as human beings. We have theory of mind - it's trite to say we can relate to it, but yes, we can relate to it. We are thinking social creatures. So if you study what gets bracketed under social psychology, yes, it's going to go viral, yes, it's going to be interesting, but it's also incredibly important, and that means you should do it right and, as you said, do things that really matter, that make an impact. I was honestly already on the same page with you about peer review. I think a lot of people, especially younger people, know that if they have a decades-long career, they're going to spend most of it operating under a different system. They're just waiting. And sometimes someone just pops off and says it, and that annoys other people who are at different stages of their careers. But in general, with someone like you or Eric Hoel, I really love the fact that you're grasping for reality. It's not always easy, and it always feels good, because most of the time in science, you know, we fail.
Yeah.
It's like a litany of failures, but through those failures we learn - it's also a thing in the startup world, so there are similarities there. I learned a lot from talking to you, and I hope the listeners and viewers learned a lot from listening to and watching you. I don't know about the whole watching thing - I'm still skeptical of that - but I know that Zoomers love to watch people, with their hand motions and their faces and stuff, so I'm going with it, you know, I'm going with it. All right. Thank you, Adam.
Hey, thanks for having me.
Is this podcast for kids? This is my favorite podcast.