BJKS Podcast

86. Elisabeth Bik: Reporting scientific misconduct, the arms race between fraud & fraud detection, and the microbiome of dolphins

December 22, 2023

Elisabeth Bik is a science integrity consultant. In this conversation, we talk about her work on reporting scientific errors and misconduct, how one becomes a full-time scientific integrity consultant, her postdoc work on the microbiome of dolphins, reactions to her work (both positive and negative), how to deal with online abuse, the arms race between fraudsters and fraud detectors, and much more.

BJKS Podcast is a podcast about neuroscience, psychology, and anything vaguely related, hosted by Benjamin James Kuper-Smith.

Support the show: https://geni.us/bjks-patreon

Timestamps
0:00:00: How Elisabeth became a full-time science integrity consultant
0:04:45: The microbiome of dolphins
0:12:02: What should I do if I find errors or fraud in a paper?
0:28:58: Reactions to Elisabeth's work: awards, online abuse, and lots of silence from journals
0:52:23: Should you report misconduct if you're in a vulnerable position?
0:58:19: What problems are worth reporting?
1:05:51: How does one become a (full-time) research integrity consultant?
1:13:21: The arms race between people committing fraud and people detecting fraud
1:22:49: A book or paper more people should read
1:25:26: Something Elisabeth wishes she'd learnt sooner
1:29:09: Advice for PhD students/postdocs

Podcast links

Elisabeth's links

Ben's links


References & links

PubPeer: https://pubpeer.com/
COPE: https://publicationethics.org/
John Maddox Prize: https://en.wikipedia.org/wiki/John_Maddox_Prize
Episode w/ Joe Hilgard: https://geni.us/bjks-hilgard

Bik, Casadevall & Fang (2016). The prevalence of inappropriate image duplication in biomedical research publications. mBio.
Bik, Costello, Switzer, Callahan, Holmes, Wells, ... & Relman (2016). Marine mammals harbor unique microbiotas shaped by and yet distinct from the sea. Nature Communications.
Brown & Heathers (2017). The GRIM test: A simple technique detects numerous anomalies in the reporting of results in psychology. Social Psychological and Personality Science.
Reich (2009). Plastic Fantastic: How the Biggest Fraud in Physics Shook the Scientific World.

[This is an automated transcript that contains many errors]

Benjamin James Kuper-Smith: [00:00:00] Maybe I can just start by asking or first stating that I think you're the first person I'm talking to who does the kind of error detection, let's maybe call it that, or science integrity full time. I think everyone else I've talked to did it as a kind of slightly weird hobby. Um, maybe sometimes as part of like some sort of science journal, uh, like being editor kind of work. 
 
 

But I think you're the first person I'm talking to who actually kind of does this full time, and I'm assuming there aren't that many people in the world who do that. Yeah. Maybe could you just kind of briefly provide like a brief bio sketch from kind of your, your postdoc work to the point where you went full time, um, doing this kind of work. 
 
 

Elisabeth Bik: I did my PhD in the Netherlands. And after that, I worked in a clinical hospital for a couple of years, setting up a molecular lab. And then I moved to the US, working at Stanford for 15 years. And, uh, I was working, uh, on the microbiome of humans and dolphins. [00:01:00] And at some point I found by accident that somebody had plagiarized my paper, a paper I had written, and I was very mad when I heard that, and I just decided to look into plagiarism as a hobby and found tons of PhD theses and papers that had been plagiarized. And then, by another accident, I found a, an image that had been reused in a different PhD thesis. It was the same photo, but it had been used to represent a different experiment. 
 
 

It was used in mirror image also. So I recognized the little spots there. And I, uh, I just thought, why didn't nobody, why didn't anybody see this? It's the same photo, but it's a different experiment; that is cheating. And so that, uh, led me into this strange hobby of looking at images in scientific papers. And I did this for several years when I was still fully employed at Stanford. [00:02:00] I then switched to biotech. I worked, um, two years at a, uh, at a company that turned out to be a fraudulent company, but that's a whole different story. Then at some point I just, uh, I moved on to 
 
 

Benjamin James Kuper-Smith: So, I mean, I guess the relevant part is that you weren't actively involved in the fraud. Maybe that is something worth, worth saying. 
 
 

Elisabeth Bik: a very relevant part. Um, you know, my bosses or the bosses of my bosses were the founders of the company and they have been charged with insurance fraud, but none of the employees have. And so, 
 
 

Benjamin James Kuper-Smith: I mean, it's also not scientific fraud or anything like that. Right. It's 
 
 

Elisabeth Bik: it was insurance 
 
 

Benjamin James Kuper-Smith: yeah. 
 
 

Elisabeth Bik: um, double billing people who are on federal insurance, Medicare, Medicaid, which are, you know, that's a federal offense in the US. 
 
 

Benjamin James Kuper-Smith: Yeah. 
 
 

Elisabeth Bik: Yeah, so I, I, I still was working on, on fraud detection as a hobby or image, image, uh, fraud, I guess. And, [00:03:00] uh, then at some point I moved to another company where I only worked for a very short period. 
 
 

Uh, and I just decided I want to do this full time because I feel I am making a difference in science. If I can work on this full time, I don't have to do my day job; I just want to do this full time. I will also not have a boss who will tell me what to work on and what not. I hopefully can make more of a difference in science than if I do my regular job. 
 
 

So I just quit my job, wish them all the best, and I'm doing it now full time. 
 
 

Benjamin James Kuper-Smith: Yeah. It's interesting making a difference because it's, um, Yeah, I mean, you're also really changing kind of the position you occupy in the kind of scientific sphere in the sense of, you know, being one of many many scientists, let's say, to suddenly being like one of very few people who do that kind of stuff. 
 
 

So almost by being one of the few people who do it, it makes so much more of a difference, I would imagine, than being a regular scientist. 
 
 

Elisabeth Bik: Exactly. There's, there's many scientists [00:04:00] doing regular science and, um, and I think the work that I do is not actually making new science, but, but breaking science. So that's how I usually say it. I don't make science, I break science. I criticize other scientists' work. I find concerns with them and I report them and I talk about that on social media. I try not to name any names, but make it about the papers. And I do feel I'm trying to make a difference. I'm trying to make people aware that there are errors in science papers, but there could also be fraud and that we need to talk about that and correct it and that we need to take a stance against it, that we need to say, this is not okay. 
 
 

We need to address that and make people aware of that. 
 
 

Benjamin James Kuper-Smith: I wanted to just briefly ask about something you mentioned earlier in passing, um, which sounds like a very random statement, um, but I'm curious like how random it is or how this came about. Um, so just briefly, you said microbiome [00:05:00] of humans and dolphins, um, why dolphins? 
 
 

Elisabeth Bik: Well, I worked at the Stanford School of Medicine, so dolphins are not a topic you would typically be working on if you normally work on humans, right? But we were working first on the microbiome of humans. So which bacteria live inside our bodies, in our mouths, in our guts, on our skin, things like that. And then we were contacted by the US Navy, who, uh, they were managing a fleet of dolphins. So a group of dolphins that were trained to find underwater objects. And this is a program that has since then been retired. It's not secret. It's all online. These dolphins were trained to find things like sea mines in areas of the world where there has been a war and there are leftover mines on the bottom of the sea that could, you know, be triggered by a passing ship. And so they're like landmines. They need to be cleaned up. And so the dolphins were trained because these are animals that can [00:06:00] make very deep dives and can then come back up very quickly and report to their human trainer whether or not they had found something. So that's, that's what they were trained for, like sniffing dogs. 
 
 

They were trained to do that task. And the U.S. Navy was also doing a lot of research. So they had a group of veterinarians working with these animals and, and checking if they were healthy and checking if there were any health problems, but also doing a lot of basic research. So I guess things like, uh, how do you measure the blood pressure of a dolphin? 
 
 

I don't know, like, I'm just saying something silly, but they were, they were also interested in what makes these bacteria, what, what makes these animals sick, like which bacteria are they supposed to carry. Which would they carry when they're healthy? And what would they carry when they're sick? And a lot of that research had been done on stranded animals, stranded dolphins. 
 
 

Dolphins or whales that would end up on a beach and be [00:07:00] dead. And obviously that is not a very good topic to do your research on. And so they wanted to know, if we just sample these animals, their mouths, uh, their rectum or their blowhole. If we sample that with a little Q tip, would you, what would you normally find in a healthy animal? 
 
 

So that we can better recognize what would be a bacterium that makes them sick. And, and this is, this was a relatively new topic. So we were like, sure, sent in the samples. We, You know, they're in terms of research. They're actually the same as human samples. You just sent them in a tube and we'll do the rest. 
 
 

We extract the DNA and do the sequencing. It doesn't really matter from our point of view, whether or not it's a human or a dolphin sample. And so I actually almost never saw the animals. I was not allowed, obviously, to sample them myself, but I was allowed to once visit the facility and see the dolphins. 
 
 

Benjamin James Kuper-Smith: Okay. Yeah, it's just funny that, um, I don't think I've ever heard about [00:08:00] any, like. You know, sometimes you hear about research if you do psychology and neuroscience, you know, there's some like evolutionary stuff and, um, or what a comparative cognition or whatever, um, you know, you hear about different things, but I don't think I've ever heard about dolphins. 
 
 

So I was just really curious when you said that. Um. 
 
 

Elisabeth Bik: intelligent animals. And, um, so these were trained to open their mouths or like present their, their butts to, um, to, to the trainers and be sampled. And, and they were, uh, Yeah, they were, like I said, they were trained and so they were managed, but they never escaped because they, they could, you know, when they were on a mission, they could easily take off, right. 
 
 

But they came back like a dog, you know, you whistle, the dog comes back. So the dolphins did that too, but they're now all retired. So I, uh, I hope they live a nice life and I'm not sure where they are now, but the, they used to be in San Diego. 
 
 

Benjamin James Kuper-Smith: So, uh, can you say anything? I mean, I don't know anything about microbiology, but can you say something about the. I mean, I would assume they'd have quite a different [00:09:00] microbiome just based on that they eat very different things and are in a very different environment. Um, you know, just water versus air. 
 
 

Is that the main difference or are there kind of other differences? 
 
 

Elisabeth Bik: So they're mammals. And what we did is we looked at other studies and compared the dolphins to other studies by other groups on mammals in zoos, where, you know, you try to link how dolphins in an evolutionary sense are related to other mammals, and would their microbiome follow the same path. And it was very complicated because there's many factors that will determine which bacteria we have in our, in our bodies, uh, specifically in our gut. And one of them is what we eat. Uh, another one is living in the sea. Um, a third one is evolutionary distance to, to other mammals. So it seemed that it was a mix of all three. 
 
 

So we couldn't really figure out what was determining the most, um, what made these dolphins look very different in terms of their [00:10:00] microbiome. Uh, we also looked at their fish. So we were sent a couple of fish, uh, that they were given as food. So these frozen blocks. Well, when it's given to the dolphins, it's obviously not frozen, but when they sent it to me, it was frozen in a bucket of dry ice. And so I had to look at these fish. And this was one of the funniest moments because, you know, usually I would get like a swab or a little tube to extract the DNA off. But now I was getting a whole block of frozen fish. I'm like, how do I extract DNA? One of those moments where you wish you had somebody telling you what to do, but I was a postdoc. 
 
 

So I had to improvise, right? Yeah. And you had to come up with something. So I just thawed the fish, put it in a plastic bag with some, uh, some salt. And I, um, found this machine that could, um, it was called the Stomacher. So it has these two metal plates that will, will pulverize whatever you, you put in it. 
 
 

It's used a lot in the food industry. So I did that and ended up [00:11:00] with sort of a fish soup in a, in a baggie that I extracted the DNA out of. But, uh, the, the fish had very different bacteria in them than the, than the dolphins, so that in the end didn't provide the answer, but it was an interesting experiment to, uh, have a bag of, uh, frozen fish in the lab. 
 
 

Benjamin James Kuper-Smith: Yeah, well I'm glad now I can add dolphin as another species being studied by my guests. I always like when I can add a different species. Uh, not, no, I haven't written it down, but I, I've had some, whatever, bumblebees, uh, obviously humans, rats, mice, you know, the typical, 
 
 

Elisabeth Bik: are the 
 
 

Benjamin James Kuper-Smith: um, 
 
 

Elisabeth Bik: suspects. 
 
 

Benjamin James Kuper-Smith: Yeah, someone had, what did he, he did, um, um, god, what are they called? 
 
 

Praying mantis? Shrimp? No, no, not praying mantis, mantis shrimp. They're two different species. 
 
 

Elisabeth Bik: shrimp. Okay. 
 
 

Benjamin James Kuper-Smith: Mandestrump, 
 
 

Elisabeth Bik: okay. 
 
 

Benjamin James Kuper-Smith: um, dogs, parrots. Yeah. So I'm getting there. 
 
 

Elisabeth Bik: Nice. 
 
 

Benjamin James Kuper-Smith: can add dolphin. 
 
 

Elisabeth Bik: great. 
 
 

Benjamin James Kuper-Smith: [00:12:00] Anyway, um, okay. So we, you know, with a slight detour, reached the point where you're now doing science integrity full time. For, for most of the conversation, 
 
 

I'd like to talk about, you know, discuss scientific, not necessarily fraud, but anomalies, let's call it, uh, in, in more detail. And, uh, specifically also kind of the question of what you should do when you find something that looks a bit off or wrong and you just don't quite know what to do with it. And to kind of use that discussion as a kind of jumping off point for all sorts of topics about that. 
 
 

Um, and I thought the way we could do it is that I actually have something that I'm not sure what to do about. So I thought I'd just kind of in vague terms present kind of what I found. And then we can kind of. I can basically ask you from your experience what I should do, and yeah, use that to discuss other things. 
 
 

So Again, I don't want to be too specific, but basically it's, [00:13:00] it's from a field adjacent to my own, uh, so something I kind of once randomly read and I suddenly saw that like some of the stats just don't add up. Like there's a summary on one point and then like specified another point and they're basically mutually exclusive. 
 
 

Uh, the, the easiest way to put it is that, you know, if you have four points, the average of those four can't be higher than each of those four points. It's that kind of thing. Um, so I was like, okay, that's, that's weird. So then I started looking; I think at the time I had just heard about kind of, uh, James Heathers' GRIM technique and that kind of stuff. 
 
 

So I found like one GRIM inconsistency in the paper, uh, where the data points can't quite add up. Um, I also found like just some basic stuff. Like if you have a t-test, some of the internal values didn't quite add up from what they reported. So it's just basically a lot of kind of basic small errors that I found in the paper. 
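[A note on the GRIM test mentioned here (Brown & Heathers, 2017, listed in the references above): it checks whether a reported mean is arithmetically possible given the sample size, assuming the underlying scores are whole numbers. The Python sketch below is one minimal way to run that check, plus a similar recomputation of a t-test's p-value; the numbers are made-up illustrations, not values from the paper discussed in this episode.]

    import scipy.stats as st  # only needed for the t-test check

    def grim_consistent(reported_mean, n, decimals=2):
        # GRIM logic: with integer raw scores, the true mean must be a
        # whole-number total divided by n. Rebuild the nearest such total
        # and see whether it rounds back to the reported mean.
        implied_total = round(reported_mean * n)
        achievable_mean = round(implied_total / n, decimals)
        return achievable_mean == round(reported_mean, decimals)

    def t_p_consistent(t_value, df, reported_p, tol=0.005):
        # Recompute the two-tailed p-value from the reported t and df,
        # then compare with the reported p, allowing rounding tolerance.
        recomputed_p = 2 * st.t.sf(abs(t_value), df)
        return abs(recomputed_p - reported_p) <= tol

    # Hypothetical examples: a mean of 5.19 from n = 28 integer ratings is
    # impossible (145/28 rounds to 5.18, 146/28 to 5.21), while 5.21 is fine.
    print(grim_consistent(5.19, 28))      # False -> worth a closer look
    print(grim_consistent(5.21, 28))      # True
    print(t_p_consistent(2.5, 28, 0.02))  # True  (recomputed p is about 0.019)
    print(t_p_consistent(2.5, 28, 0.04))  # False (reported p is too far off)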
 
 

The paper is from a fairly prestigious journal, I'd say, um, and its main influence is because it's an area that's, [00:14:00] it's difficult to collect kind of data in that area, and there just isn't much like it. Yeah, maybe that's kind of the, the, the kind of rough description of the situation, and now I have, I'm basically wondering what I should do with this information. 
 
 

I've had this information for a while now. I feel like I should do something about it, but I'm not sure exactly what to do. 
 
 

Elisabeth Bik: Yeah, there's, there's several things you can do. So the official thing to do is to write to the editor and just send them an email saying, Hey, I'm concerned about the data in this paper because of these and these reasons. And I suspect, perhaps, that there's something not quite honest about the paper, or you can word it very neutrally. You could also write to the authors, and if you, if you really think it might be just an error, writing to the authors might be helpful. But, again, you don't know if they will answer. You can always ask for the raw data, because if you [00:15:00] are dealing with means and averages, there are probably raw measurements, original measurements, and you could ask what they were, and that could be helpful to determine whether or not they fit the reported data. Uh, and then you could post it online. So I use a website called PubPeer, and you can, you can ask the authors there, so it will actually connect with the authors, uh, through their email address. And you can ask either anonymously or under your full name what the deal is, like, Hey, this doesn't make any sense. 
 
 

Can you share the raw data here? Unfortunately, the authors do not usually reply. And I think that might be a sign that there's actually something, that you've discovered something that they are aware of. If it's really all in good faith, they might answer and say, Hey, yeah, you're right. Like, let me check the data. 
 
 

Here's the data. Oh yeah, we made an error. Things like that. For me, posting on PubPeer, what that, the [00:16:00] purpose that I feel it serves the most is to warn other people that there's a potential problem with the paper. And they might then discover even more concerns. But most importantly, you are warning other people, Hey, there is a problem with this paper. 
 
 

If you are basing your research on this particular paper, it might be worth knowing that the data looks like, you know, something is not right. Don't, don't put all your hypotheses on this particular finding. So, uh, people can install a plugin, uh, like a browser extension that will work with your browser of choice. Then you can see if a paper has a PubPeer comment; it will show a green banner, and you can click on it and see what other people had to say about it. 
 
 

Benjamin James Kuper-Smith: So in Google Scholar or in, where do it, where does it show that? 
 
 

Elisabeth Bik: It doesn't seem to work very well with Google Scholar. So how it works is that it will find the DOI, [00:17:00] the unique identifier of a paper, and Google Scholar doesn't normally show those. But if you go to the landing site, sort of the home page of a paper, that usually has the DOI, and you will see a green banner at the top. 
 
 

It's not perfect. Um, you know, it's all run by volunteers. So you know, you, you have to sort of know its limitations, but it works really well in PubMed, which is a website that most medical or biomedical researchers will use to do their literature search. So there you will see the DOI on the page, and so it will give green banners when a particular paper has a PubPeer comment. 
 
 

Benjamin James Kuper-Smith: You mentioned earlier, you know, I can contact the authors, I can contact the journal, I can, you know, post it on PubPeer, etc. That's kind of one of the questions I have because, I mean, I actually also wrote down, it seems to me like, if you think it's fraud, probably just contact the journal. If you think it's just an error, contact the authors. 
 
 

I don't know. Like, how do you kind of, how do you [00:18:00] make that decision between whether you think something is, I mean, for example, in this case, it seems to me like the person doing it was just very sloppy, to a point where that makes me question, like, can I trust anything about this? It's not like, you know, the, the 
 
 

statistical inconsistencies, I'd say. It's not like the t-test changes whether it's significant or not. It's not like that. It's just slightly off a lot of the time. And so it just makes me, like, for example, me, I just go like, I'm not sure I would trust the way they did anything in this paper. Because the numbers, like the one thing I can look 
 
 

at just doesn't add up a lot of the time. I don't think it's fraud, but you know, like, how do you make kind of that value call of who to contact? 
 
 

Elisabeth Bik: Uh, uh, yeah, I guess, uh, like I don't, I never write to the authors privately anymore. I just write either to the editor or post on PubPeer, or both. So, uh, those would be both good avenues. [00:19:00] Um, and it's, it is hard to make the decision. In, in some cases, I'm pretty sure it was fraud just because I do a ton of these investigations and you just get a feeling for, yeah, this is, this is not just sloppy. This is just like, you know, really trying to fool the reader. And this is intentionally done, which makes it misconduct. But yeah, in some cases when it's just one paper and, and like two images overlap, or there is a, you know, a number, the numbers don't quite add up to a hundred percent, things like that, that could just be sloppiness. Uh, things that could have been maybe picked up in peer review, but they weren't, and, and maybe it's an honest error. But yeah, at some point, if the number of honest errors gets too much, then you have to, you have to assume it's fraud. But it's, it's really a hard call to make, 
 
 

Benjamin James Kuper-Smith: And I think I also don't like, you know, just going to the journal directly feels like, you know, in school, just going to the teacher instead of like figuring it out with the person, you know?[00:20:00]  
 
 

Elisabeth Bik: But it is the official way, like that is the official way. So COPE guidelines, uh, which is, so COPE is the Committee on Publication Ethics, that is the official way, that you should, you should contact the editors. And decades ago you would write a letter to the editor, but very officially, that would be published, where you would raise these concerns. That does take a long time, and a lot of editors will not do it because it might make their journal look bad, so they might not 
 
 

Benjamin James Kuper-Smith: Yeah, this might. 
 
 

Elisabeth Bik: be willing to publish it, but reporting it to the journal is And then asking the journal to ask the authors for the original data. 
 
 

So that, you know, you're sort of kept out of the loop. The journal then functions as an intermediary. But it is true, I've done this for, for the first 800 papers or so that I found during my, uh, when I started doing this, and of the 800 papers I reported back in 2014 and [00:21:00] '15, only 50 percent now have been either corrected or retracted. 
 
 

So, and a couple of years ago, that was only 30%. So the, the, it takes years and years and years to address these, these problems if you are waiting for the editor to do something. So that is a frustrating thing. And I find it now a little bit more efficient to just post it on PubPeer and have at least warned other people that there's a problem with that paper. 
 
 

Benjamin James Kuper-Smith: Yeah, I had one question about time frame. Say I contact the authors, uh, or the journal. I don't know whether this makes a difference to your answer. You know, I contact them and you know, obviously, they're not going to respond within an hour. Probably not within a day, especially if it's like some sort of something I actually want to look at. 
 
 

You know, academics often don't respond within a week, but like how long, at some point, you know, do you just mail them again and again or 
 
 

Elisabeth Bik: You could, uh, you could try to look up the email [00:22:00] addresses of some of the other authors. It usually helps when you have multiple email addresses that you send the email to, so that one person cannot ignore the email and they know that the other people are also in the loop. 
 
 

And so if you include a couple of other people, then that might be helpful, to send them a reminder. But yeah, we, we all have overflowing inboxes. 
 
 

Benjamin James Kuper-Smith: Yeah, exactly. 
 
 

Elisabeth Bik: I do, I do recognize myself that there are just emails. You know, at the end of the day, I still have emails that I haven't answered, but the day is over, right? 
 
 

And if that happens too often, it moves to the bottom of the screen and you never answer it. 
 
 

Benjamin James Kuper-Smith: You know, when you contact a journal now, do you have like, I'm assuming because you do it so much, you have like pretty routine things you do, like, but you wait X amount of days or weeks until you send another, or kind of what's your process there, or 
 
 

Elisabeth Bik: I don't have a standard process, but I keep track of [00:23:00] when I sent an email for a particular paper. So I have a spreadsheet of almost 8,000 papers I'm dealing with, and most of these I haven't even reported to the editor because just every day I get new requests to look at papers. And, uh, so around two or three thousand I've reported to the editors, but the majority I've only reported online on PubPeer, because that's my main goal now, just warning other people there's a particular problem with this paper, you know, FYI, proceed with caution. And, um, the reporting to the editor is a lot more work than posting it online because you have to find the email address of the editor. And if you have 6,000 or so emails to send, that is, uh, that's a lot of work finding all these email addresses. But if you only have one 
 
 

Benjamin James Kuper-Smith: I was about to say, I don't have these scale issues. I just randomly found one paper where I was like, this just clearly doesn't add up. Um Uh, so just briefly, you mentioned when you get contacted. So how does, is that just [00:24:00] people going like, Hey, I think something is duplicated. Do you agree? Or is it what kind of requests do you get? 
 
 

Elisabeth Bik: I get all kinds of requests, usually per email or direct message on Twitter. Um, sometimes on LinkedIn, where I tend to ignore it because I just don't log in there very often. Um, so all kinds of, of, uh, channels that people come to me through, uh, with all kinds of problems. So, uh, for example, this morning I had, uh, an email, somebody asking me about a paper that, it looked like a bogus paper to me, but it was very outside of my field. It was in physics, and there was chemistry, and there were a lot of formulas, and those are the type of papers where I sort of have a sense that it might be completely fake, but I cannot really say much because, you know, these formulas all look like, you know, they try to dazzle the reader with lots of formulas, but whether they do make sense, I have no idea. 
 
 

So I cannot really help with those types of problems. So I write them back. I mean, I [00:25:00] found another problem with the paper, which I posted on PubPeer. But, um, I might also get requests where somebody says, Hey, I found an image that looks photoshopped, or it looks weird, or it has a line here, or blockiness, or can you look at it? 
 
 

And those are the things I specialize in, especially if it's molecular biology images, such as, um, Western blots, or DNA gels, or photos of, uh, tissues, or cells, or things like that. So I will look at those, and if I find a problem, I will write back, and I will post it on PubPeer. And each one of these problems that I might find might lead to a whole hairball of other problems. 
 
 

So I might follow, let's say, the papers from the first author or from the last author. And I might find more problems because, you know, if people are Photoshopping, they might do it again. And if I find a real problem, that could lead to maybe a week of, of me spending time hunting [00:26:00] down the 500 papers that these people have written and seeing if there's problems in them. And some other tips just lead to dead ends, and I'll see maybe one paper with a problem, but I cannot find anything else from those authors and all the other papers look fine. So if I don't find something in, let's say, the first 20 papers that I scan, I'll move on to the next topic. 
 
 

Benjamin James Kuper-Smith: Is that then for you also an indication that maybe this is just an error in this one paper? Like, do you use that kind of thing? And like, how much, how likely, I mean, do you report then everything where you find something, or do you just not have time, so you focus on the, you know, the big cases where it's like, here are 20 papers with duplications in it, or? 
 
 

Elisabeth Bik: I do report everything I find that I think is a, an inappropriate duplication, or things like plagiarism or, uh, all kinds of other problems that you might find in a paper. If I think there's a problem, I will report it. If it's a small problem, and I think it might be an honest error, I think if you report it online and you [00:27:00] tag the author, so they get an email, if it's an honest error, most of those people will actually reply pretty quickly and say, thank you for pointing it out. 
 
 

I will look into this. Or, uh, you are absolutely correct, we used the wrong blot, here is the correct blot. And those are fantastic answers. That is what we all hope to see. And it tells me that it was an honest error. But when the authors choose to not reply, or send me a very long, um, you know, reply of you're, you're a disillusioned person and who are you to question my work and I have 400 papers that I wrote and you're pointing out this little one error, those types of replies will tell me I hit a good spot. 
 
 

Like I actually found something and there might be more for me to find. So if they reply like that, I will probably dig deeper. that's all based on experience and having worked with thousands of these cases. 
 
 

Benjamin James Kuper-Smith: you're one of the few people who actually enjoys getting [00:28:00] hateful email because you're like, nice. Now I've got something. 
 
 

Elisabeth Bik: Well, I enjoy it when it's public, because at least other people can see it. Uh, it, no, I do not enjoy hateful emails. Um, you know, like the next person, I get a lot of hate online, and I don't enjoy it. But it, it's interesting to see how it seems that honest people making honest mistakes will reply in a very different way than people who know they have been caught. 
 
 

It appears that those folks try to divert the attention away from the error they might have made or the, maybe the, the fraud they did, and try to attack me, attack the messenger. And that tells me, oh, they have something that they're not willing to share with me. They try to divert attention, and it, it's, it's very interesting, but it's not, it's not fun for me, absolutely not, to receive, uh, you know, nasty and unkind messages. 
 
 

Benjamin James Kuper-Smith: Yeah, I mean, I thought we'd talk about this a little bit later, [00:29:00] um, kind of the reception of your work, but I guess we might as well just do that now a little bit if you don't mind. So, you know, to not make it look like you just get hate all the time: uh, I saw you did get, uh, the, the John Maddox Prize in 2021. 
 
 

So it's, you know, it's, it's not just abuse. So there's also some positives. By the way, do prizes like that, does that really matter to you? Or is it just like, yeah, whatever? I'm curious, I've never won an award like that. 
 
 

Elisabeth Bik: Oh, um, well, first of all, there's not a lot of awards to be won by being a science critic. Like, there's a lot of awards that will recognize breakthrough discoveries and, and, you know, fantastic science. There's not a lot of awards, obviously, for criticizing other people's science. So the John Maddox Prize, in that sense, is a unique one. 
 
 

That's one that really recognizes being critical of, of science, or being critical of certain things that could damage the integrity of science, and doing so in the face of a lot of backlash, in the face of hate, and doing it under very difficult circumstances. So this is the one prize I think I really was hoping to win, because I feel that's exactly what I'm doing. 
 
 

And I did win it. And that was just fantastic, because it's such a great recognition of the work that I do, that I do it in a way that is recognized by the scientific community. So this is an award that is, uh, awarded by, uh, Sense About Science, uh, and Nature, the journal. And so it's a fantastic group of other scientists who recognize that I do important work. 
 
 

And so there's, um, two awards every year, one for a more senior person, which is the one I won, and one for a junior person. So people winning these awards, there's only two a year. It's very rare. It was just one of the highlights [00:31:00] of my career 
 
 

Benjamin James Kuper-Smith: Okay. 
 
 

Elisabeth Bik: that. 
 
 

Benjamin James Kuper-Smith: Okay. Yeah. Do you have ? Is there, I mean, you said you, you were hoping for that award and also that there aren't many awards like that. Is that kind of like now there's nothing left to win now in this field? 
 
 

Elisabeth Bik: I'm sure there, I hope there will be other awards. No, no, I mean, there, I've won a couple of other awards that are equally, uh, you know, I have them here on, on, on my cabinet at home and I, I look at them and I think the John Maddox prize was a very big one, but it's it the work that I do. I work mainly by myself, right? 
 
 

I'm a, I'm a, I'm not employed, I work by myself, I don't get a lot of feedback on what I do, other than a lot of these hate tweets, uh, you know, uh, slung at me. So it is nice to hear from the scientific community that the work that I do is appreciated, and I think that makes it worth doing. And so whenever I get a lot of these hate emails or hate tweets, 
 
 

I will [00:32:00] look at my little cabinet with my, my couple of awards and think, okay, no, the work that I do is important. Like, my colleagues, my peers recognize it. And I shouldn't really think about these anonymous trolls who sling insults at me; those are not important. It's the scientists who recognize my work. 
 
 

So it is very important to have won that. And, and I will, I'm very grateful for all the support that I get from scientists. 
 
 

Benjamin James Kuper-Smith: Hmm. Uh, that means you, you already partially answered my next question, um, but one question was kind of how do you deal with online hate, online abuse, that kind of thing. I saw, uh, I don't know, a week ago or something, your Twitter feed, your Twitter was private or something like that, at least a few days ago when I wanted to, 
 
 

Elisabeth Bik: ago. Yeah, yeah, 
 
 

Benjamin James Kuper-Smith: um, was that related to something like that or, 
 
 

Elisabeth Bik: Yeah, there was one, uh, it's a long story, but, um, there's this, um, a set of papers that were criticized that had been used by a biotech company. That company is [00:33:00] also listed on the NASDAQ. So there's a stock of that company. And the company is developing a drug against Alzheimer's. So, you know, it's a very important quest. We, we all want Alzheimer's to end. We all have, we all are affected by Alzheimer's. We all have family members that we either care for or have seen deteriorating because of this disease. So it's a, it's a horrible disease that there's just no known cure for, so it's a very good cause. 
 
 

We all hope to cure Alzheimer's with this particular drug that is in development. So it's currently in phase three trials. But I found, um, uh, well, actually I wasn't the first to find it; there was a group of people who found problems in the papers that led to the development of the, the drug. So it was, uh, uh, a scientist working at, uh, a university in New York, and in his work, there seemed to be a lot of problems in the Western blots. So that's exactly what I do. So I was sort of [00:34:00] called in on Twitter: can you, can you look at these papers? And I'm like, yeah, no, I agree that these, these blots look of concern. Like, I'm concerned as well about these papers. And, you know, it doesn't mean that the drug will or will not work. 
 
 

We don't know, but the, the, the preclinical work, the work done in the lab, the work done with the proteins and, and patient samples or animal samples, that looks of concern. So I also wrote, I wrote a couple of, uh, blog posts about it. And long story short, of course, the, the stock was dropping a lot, not just because of my blog posts, but because of other people already talking about it and filing some citizen petition online. 
 
 

So the stock had dropped from 120 to 50 or so in one or two days. So that's, a lot of people, investors, lost a lot of money. So the hate, long story short, the investors of this company do not like what I did. They don't like my criticism. I mean, I [00:35:00] tried to warn them, like, maybe you should sell the stock and, you know, invest in something else, but they, they sort of formed this group, and there was one particular person, sorry, very long story, but this one particular person who is the biggest, uh, shareholder of this company. So I think he has like over 1 million shares. He, uh, started this whole rampage on Twitter accusing me of fraud because I did work at a fraudulent company. Uh, again, I was not involved in the fraud, but he sort of was starting to send emails to journalists, trying to pitch a story that I was in the middle of the fraud and that they should write a story about that. And I think what he was trying to do is discredit me as a science critic. And, you know, whether or not I worked at a fraudulent company in the past doesn't really, I think, have much to do with the problems I'm finding in current papers. It's focusing on the wrong things, and he was trying again to divert [00:36:00] attention. 
 
 

Benjamin James Kuper-Smith: yeah, it's diverting attention exactly, yeah. 
 
 

Elisabeth Bik: Yeah, I had to shut down my Twitter because of what I was receiving. At some point, he was just tagging all these people who, whose work I've criticized in the past, and, and some of them have fans, and I was getting death threats, and it just became very, uh, nasty to look at my Twitter and not see anything positive, just very negative things. 
 
 

And, just for my mental health, I will shut down my Twitter so I cannot see that for a couple of days until the whole thing dies down, and then I'll be back there again. But yeah, Twitter is becoming, you know, a lot of things are not moderated anymore, so anybody can just sling insults at other people and it all seems to be okay. So, uh, I'm not sure how long I will be there, but I do have a lot of followers on Twitter, so it's still where most of my audience is. 
 
 

Benjamin James Kuper-Smith: Yeah, I mean, one thing I found interesting there was that slight counterweight almost to the general picture that, you know, Twitter's deteriorating. One thing I found interesting there was the community notes: [00:37:00] actually, the last time I saw it, I think now the person has made their tweet private or something like that, but it actually had community notes that said something like, this person is one of the large stakeholders in this company that's been tanking. 
 
 

So yeah, that kind of thing. So it was actually kind of interesting to see like, okay, this, this thing that they're implementing seems to have done some sort of seems to have worked in this case, at least, but I don't know. 
 
 

Elisabeth Bik: Yes. And, and, and he was, of course, then rallying up his followers to say that the community note that had been added was not helpful, in an attempt to, to break down the community note. And he accused me of lying. And, um, yeah, so it, it became this, you know, he said, she said type of thing. And then yesterday he suddenly offered, um, sort of a, a peace, and he wanted to collaborate with me. 
 
 

I'm like, really? You want to collaborate with me? That's a big change. So I'm like, um, dude, you know, best of luck. But I'm not gonna, you, you, you're calling me a fraud for a month. And now you want to [00:38:00] collaborate. Uh, you know, best of luck, but I'm not going to collaborate with you. Find somebody else. Best of luck. 
 
 

Bye. 
 
 

Benjamin James Kuper-Smith: Yeah, I mean, you know, the reason I kind of wanted to ask about this is that, um, kind of the question is like, how do you deal with this kind of stuff? And, um, again, as I said, like I've, I have no experience with this kind of stuff, but you know, let's say someone, you know, it's always possible, I mean, we've definitely seen these, these online pile-ons where someone does something that was probably a bit stupid, but probably not as stupid as everyone pretended, and then you're in the middle of this kind of thing. 
 
 

So, um, what would you recommend for someone in that kind of situation? 
 
 

Elisabeth Bik: It's tough, because, yeah, like you said, some very minor thing that you said that was, you know, a little bit off topic or off color could lead to complete cancellation. If one person of influence will say you should be canceled, then you will be [00:39:00] canceled. And it's just really hard to get out of that. 
 
 

And, you know, in a situation like that, I'm not sure what to do. Like, I, uh, you know, hope that will never happen to me. And I know I have a lot of support, but it is nice to get support, even privately. So if I see a person being scrutinized for something minor, I try to send them a direct message and, you know, some, some words of support. And, like, you know, shutting down Twitter or setting it to private does help, because at least it sort of, uh, you're then alone with your followers, not, not random trolls who will, um, you know, add oil to the fire and, and, and blow up the whole thing. And it sort of calms down the whole Twitter storm a little bit. But, but it's tough; every situation calls for something different. And I know that the work that I do will make people unhappy. I criticize other people's papers and I might find errors, and I've been criticized before, and it's not fun if somebody finds an error in your own paper, [00:40:00] uh, I've been on the other side. But I've also been falsely accused of all kinds of things, you know, I've been accused of plagiarism and, and strange things in my, in my blot scans, um, which turned out to be artifacts. 
 
 

So I, you know, it's not fun to be on the receiving end of that. I totally get that. But I, again, I always tell myself, if, if these people lash out at me, the, the, the person who raises the concern, it means they have no scientific defense. Because they could take away my concerns by sharing original blots, saying, okay, well you, you, you say that these things look similar, but here's the original scan and look, they're, you know, in high resolution, these things are actually very different. Tell me that in a convincing way and I'll be happy. But yeah, if they lash out at me, that means they might have something to hide. 
 
 

So that is what I keep on telling myself. [00:41:00] 
 
 

Benjamin James Kuper-Smith: Are there, I mean, I agree in principle, I'm just curious right now, I'm just wondering whether there is maybe some situation, I mean, Yeah, as I said, I completely agree. Like if they, you know, if you made an error, just go over your data again, and maybe it takes a week or whatever, a month or, you know, sometimes you have stuff to do, but there are definitely other ways of dealing with it. 
 
 

I'm just curious. Is there a reason why you might not, 
 
 

you know, be able to, I mean, like if you don't have the originals and the paper is the only thing you have, is that maybe a reason why you just can't say anything other than, like, yeah, no, you're wrong? I mean, do you see what I'm trying to get at? Um, 
 
 

Elisabeth Bik: Yeah. I, I always hope that the author will answer. And I've had a couple of cases where an author, uh, immediately replies, let's say on PubPeer, where I've posted my concern, and, and the senior author writes in and says, Oh, this looks very serious, I'm going to look into this. And I think that's a good answer. 
 
 

That's a, you know, I know [00:42:00] these things might take time. Maybe the originals were already lost because they're old. But I feel that's, that sounds genuine, but that is very rare that that happens. 
 
 

Benjamin James Kuper-Smith: So, so what is the response then? Well, I guess you said you don't really contact the authors that much directly anymore, but I also had a question about the last part, in some sense, of reactions to your work: journals, and how they react to you, you know, flagging some stuff, uh, that, I mean, probably in most cases, the editors had actually nothing to do with because it happened in like the previous 10 years or whatever. 
 
 

I could imagine that happens often. Um, but kind of how, how do journals react to, you know, you being, to being sent an email saying like, Hey, you want to look at this fraud in your journal? Oh, not fraud. 
 
 

Elisabeth Bik: not how I 
 
 

Benjamin James Kuper-Smith: Yeah, yeah, I know, I know, I was gonna say, you wouldn't word it that way, but someone saying like, you know, here's some major concern, what's the kind of typical reaction you can expect? 
 
 

Elisabeth Bik: Uh, [00:43:00] nowadays I ask for confirmation that they at least got the email, and even that doesn't always happen. Uh, there seem to be some, some, some of the editors who are either very silent, or they don't want to deal with this little annoying email, or, or they will say, okay, thank you, we got the email, we'll look into that. 
 
 

And then nothing happens. And so that, that was the typical response for the, the set of papers, the initial set that I sent, uh, in 2014 and '15. In most cases, there was no response at all. And I sent them a reminder, like a year or so later, in 2016 or '17, asking them again for a reply, and, and that led to even more silence. 
 
 

So that was frustrating. But like you said, in, in many of these cases, the leadership of some journals slowly changes over time. There might be a new editor-in-chief, and there might be just a whole new set of people, uh, running the journal. [00:44:00] It feels that journals are now a little bit more open to these criticisms, and that they're starting to see that the concerns I'm raising and many others are raising are legit. 
 
 

And, you know, these are things that slip through the cracks and slip through the normal safeguards that were set up at the journal, and, and, and they, they have to then of course admit that some, some things were not caught during peer review or the editorial process in the past, which is not fun. I understand that. 
 
 

But I do think that there's a growing understanding at journals: yeah, there will be some small percentage of papers that we'll receive on our desks as manuscripts that will contain fraud, and we need to screen for that, and we just need to be aware that some of these might be fake. And I think, like with any credit card scam, you have to hear about a certain scam, and then you recognize it, and you say, [00:45:00] Aha, I did not win 200 million dollars from a prince in Nigeria. 
 
 

I know this is a scam. So similarly, if you know certain things are scams, you will recognize them. And so editors are increasingly recognizing that the problems we're raising are real, that part of this could be caused by honest error, but another part of that could be caused by an intention to mislead, and they should look into it. Because I do keep track of how journals respond, and that's sort of my final sentence now when I send my standard email: I do keep track of what you're gonna do with this. And then one editor wrote back, that sounds like you're blackmailing me. I'm like, dude, I'm just keeping track of your customer response. Like, that's, I'm not blackmailing you. If you handle it well, you'll end up well in my statistics. But that seems to motivate a couple of editors. 
 
 

But yeah, there's still a couple of very unwilling editors who do not, who do not see that these are problems, who will write to [00:46:00] me. One even wrote back saying, well, you're writing to me from a Gmail account, you're not an academic, and so I'm not going to listen to you. I'm like, I have a PhD. Yeah, I work as a consultant, but my concerns are real. 
 
 

It doesn't matter who I am, you know, 
 
 

Benjamin James Kuper-Smith: Yeah. 
 
 

Elisabeth Bik: if it does matter, I think I've earned recognition from the scientific community that I'm just not a crazy woman sending you these things. No, I, I. I have legit concerns about this paper, and it doesn't matter who sends it to you, you have to respond to it. 
 
 

Benjamin James Kuper-Smith: Yeah, I mean, that's such a disappointing response, particularly because I mean, I never use my institutional email to start with, because, because I'm in a PhD phase, you know, you move and that kind of stuff. And I've lost, so I think you mentioned at some point you lost all your correspondence from your Stanford email. 
 
 

Elisabeth Bik:
 
 

Benjamin James Kuper-Smith: Um, so I've had the same thing where, you know, often minor things, but sometimes it caused like some [00:47:00] minor problems because I switched email. So now I just always use my private email. But again, as I said, like it's, it's deflecting away from the actual criticism. Towards stuff like proxies of prestige and, or anything like that. 
 
 

That's 
 
 

Elisabeth Bik: Yeah, I, I, some of the people whose work I've criticized will say online that I'm a failed scientist of medium intelligence, and, uh, there was one particular person who said 
 
 

Benjamin James Kuper-Smith: not too bad, medium. 
 
 

Elisabeth Bik: uh, what's that? 
 
 

Benjamin James Kuper-Smith: Medium intelligence isn't too bad. 
 
 

Elisabeth Bik: Medium intelligence. Yeah, I dunno. Uh, this is a person who claims, uh, he has an IQ of, I dunno, 180 or 
 
 

Benjamin James Kuper-Smith: That's pretty high. 
 
 

Elisabeth Bik: yeah. 
 
 

Who can beat that? It's pretty high. Yeah. But, uh, you know, does that make him a better scientist? Not necessarily. 
 
 

Benjamin James Kuper-Smith: but yeah, but getting back to the criticism, what do you have to say about that? 
 
 

Elisabeth Bik: Exactly. So, so they're trying to deflect to how many papers they have published and, uh, what their h-index is, and that I've only published 40 papers. You know, I think that's [00:48:00] decent, especially since I left academia a couple of years ago. I think, uh, 40 papers, you know, it's not gonna end up in the top 10 of scientists, obviously, but I, I did do science and I have my PhD, and I think that's all that matters. And even if you don't have a PhD, if you see a big problem in a paper, it totally doesn't matter who you are, if you're a scientist or send it from an institutional email or not, that doesn't matter. It's, it's about the problem. So the deflection onto who sends these emails is all, in my opinion, an unwillingness to deal with the actual problem. So if people do that, I will bite my teeth even harder into the problem and I will search deeper, because that's just not the response I like to hear. And it's, in the end, not going to help them. 
 
 

Benjamin James Kuper-Smith: Yeah, I mean, the criticism that you only published 40 papers is actually pretty, it's, uh, It's a pretty impressively dumb statement to make just [00:49:00] because, as you said, like it, you know, it doesn't address the problem in itself. And even then, like, 40 papers shows you've clearly been in academia and science for a long time. 
 
 

Like, you know, you're not just a random person. I mean, yeah. Anyway, 
 
 

Elisabeth Bik: Well, and even a random person, again, uh, I mean, some people doing, doing this science criticism work are not scientists, and they, you know, they, they had to perhaps learn a little bit more of the background on how to interpret images and, and problems, but they're equally good at finding the problems. 
 
 

It doesn't matter who you are. 
 
 

Benjamin James Kuper-Smith: yeah. I just find the idea funny of deflecting towards something that doesn't really show what they want to show. I 
 
 

Elisabeth Bik: That it is a deflection from the, from the problem is a very important reason for me to keep on going. And, um, yeah, again, this, I think this technique of bullying and sort of creating this air of I'm an authority and you're a nobody, that might work in certain circumstances. It might work when you're [00:50:00] young and you're, uh, perhaps earlier in your career and you are very impressed by such a person. I just don't give a fork. 
 
 

And, um, and I think they sometimes forget that I, I have done this for a long time. And it's, it's like, I don't really care what you're trying to impress me with, because you cannot break my career, because I don't really have one. Like, uh, you know, I'm, I don't, I'm not dependent on your letter of recommendation. 
 
 

And that puts me in a lucky situation. Like, I'm not dependent on the judgment of these people to continue my career. I'm actually in a very late stage of my career and I already did my PhD and everything and I, you know, worked in academia, and I don't really care what you think about me. I care about the problem in your paper. 
 
 

Can we focus on this particular problem? And I think that some of these people are very frustrated that their normal technique of bullying and trying to impress me does not always work on me. Usually it doesn't [00:51:00] work. 
 
 

Benjamin James Kuper-Smith: mean, it seems to me, you've kind of already said this also that you're kind of outsider's perspective in the sense that, you know, you're not reliant on trying to publish in that journal or anything like that. That gives you a lot of, not exactly leverage, but it just takes away most of the bad strategies you might have to respond to this kind of criticism. 
 
 

Elisabeth Bik: Exactly. If I was still earlier in my career, or dependent on what a very prominent researcher thought of me, I would be in a much more vulnerable situation, because, you know, one letter from a very respected professor to another professor where I might want to work, that could damage my career greatly. 
 
 

And I can see that when I was much earlier in my career, I could not do the work I'm doing now. Or I would have a boss saying, uh, you know, don't touch that particular researcher because we're working, we're collaborating with them, and, you know, oh, she's such a nice woman or he's such a nice pal. No, I don't really have those worries to deal with. 
 
 

Like, if I see a problem, I don't really care who that person is, if they have won a Nobel Prize or if they are an early career researcher. I will focus on the papers and the problems, not on the people. I will, I will call that out. 
 
 

Benjamin James Kuper-Smith: Actually, a brief question here about when you find something, not you specifically, but other people, people who might be in this more vulnerable position. In my example it's not exactly that way: the people involved are very senior, but it's not exactly my field, so we're not going to come across each other much anyway, and it's not that much of a risk, I don't think.
 
 

But what should people do if they are in that position? Let's say you're a PhD student, it's in your field, and they're a very respected person. Would you even recommend they actually do anything about it? For example, I [00:53:00] talked to Joe Hilgard about this very early on in my podcast, and he basically said, you know what, it's probably not worth it.
 
 

From all his experience, he said it just takes lots of time, you potentially risk upsetting the wrong people, and even if that doesn't happen, there's not a high chance of anything actually changing.
 
 

Elisabeth Bik: And I think that's a decent point to make, because academia is hierarchical. We're so dependent on the opinions of people above us, and if we criticize those people, we will be seen as troublemakers. It could severely damage your career if you're early career. So I would definitely not recommend writing to the authors in that case, and I would not even recommend writing to an editor, anonymously or under your full name, because it's going to be a tough case. But you could still post on PubPeer anonymously. You can create an [00:54:00] anonymous account, and as long as the problems you want to raise are visible from the paper itself, not inside information but visible for anybody to look at, you can report them anonymously, make your point, and nobody will know who you are. Even if PubPeer were subpoenaed, they do not keep track of your IP address or things like that.
 
 

You will just be assigned a passcode, and with that passcode, which is sort of like your Bitcoin key or whatever you call it, you have to be really careful. You can then log in with that code, and you will always have that particular name.
 
 

You will be assigned the name of an organism from the tree of life. You cannot choose your username; it might be a fungus, I always say, or a bacterium. Then you can comment on that paper completely anonymously, and hopefully the author will [00:55:00] reply. As I already talked about, they usually don't, but if they do, hopefully they will have a good answer. The best you can do is at least warn other people, because now there is a public record, a public comment, that other people can look at.
 
 

Benjamin James Kuper-Smith: Yeah, I was curious whether PubPeer is actually completely anonymous if you want it to be. It sounds like you can also post under your own name.
 
 

Elisabeth Bik: Yes, you can choose between two types of comments. I usually comment under my full name. When I was a bit earlier in doing this type of work, I created a couple of anonymous accounts, so I have a couple of these fungus or bacteria names. When I have commented on a paper with one of them, I will keep using that account if I find another problem.
 
 

Because I don't want to do sock puppetry, where it looks like multiple accounts agree with me. No, I will keep that one [00:56:00] account to comment on it. And I use those accounts for some of the authors who are particularly litigious. I'm a little bit less worried about that now, even though I'm sure it will happen at some point, like we have seen with the Data Colada team, who are now being sued by one of the professors whose work they criticized. I have not been sued, but yeah, I'm careful with some of the authors, so I'll use an anonymous account.
 
 

Benjamin James Kuper-Smith: Let's say you put it on PubPeer. To be fair, I've heard of PubPeer for a while, but I've never really been on the website, or maybe once or twice. I don't think it's in the conscious awareness of most scientists that they can look up a given paper on there.
 
 

So let's say I want to contact the journal and make sure they see it. Would you recommend something like one of these short-term anonymous email addresses, or is that also just a silly idea that's not going to lead anywhere?
 
 

Elisabeth Bik: Yeah, it could be, because some journals [00:57:00] keep track of what gets posted about their papers on PubPeer and others don't, so it depends on the journal. I don't remember exactly, but I think PLOS ONE, for example, seems to check that every now and then, because even if I haven't reported a paper to them, they respond to it.
 
 

So they might check it, but other journals don't. It is definitely a good idea to send it to the journal as well. Unfortunately, if you send an anonymous email, let's say from a Proton Mail account or some throwaway Gmail account, some journal editors will not be inclined to take action, even though it is always worth pointing out the COPE guidelines.
 
 

Again, the Committee on Publication Ethics has guidelines for how to deal with anonymous reporters, so include a little link to that website: if your journal is a member of COPE, you have to [00:58:00] respond to this and take action, even though it comes from an anonymous email address. It might be worth pointing that out, but yeah, some editors are just not very willing to investigate, especially when it's an anonymous email.
 
 

Benjamin James Kuper-Smith: Yeah, it seems more like a last resort. So far we've talked fairly vaguely about the most common things: plagiarism, duplications, and the kind of statistical inconsistencies I found.
 
 

Not that we necessarily want to encourage people to go and report everything, but what are some other things worth pointing out that can be wrong in papers one might read, things that get a little less attention?
 
 

Elisabeth Bik: There are many things that can be wrong in scientific papers.
 
 

Benjamin James Kuper-Smith: I mean in a way that is [00:59:00] worth reporting, maybe. Obviously you might disagree with someone's interpretation, but that's not what I mean.
 
 

Elisabeth Bik: No. Some of the errors I've also reported on are about methodology. Let's say you have a paper where there's no control group. There's just a group of patients who have, say, long COVID or some other disease, and they're being treated with some magical cocktail, and they're treated for a year.
 
 

And after a year, a lot of these patients are feeling much better. Is that because they're slowly getting better from long COVID by themselves, or is it because of the magical cocktail? It's really hard to tell if there's no control group. You cannot really do good science if you don't have that control set.
 
 

So you have to have a set of people who you don't treat. That would be good; it's a very basic way [01:00:00] to do science. You do one thing to one group of patients and not to the other group. And of course sometimes it's not ethical to do nothing to the other people, so instead you have one set of patients treated the normal way you would treat these patients, and then the new device or new drug for the other set. But you have to have these two groups to compare to each other. Otherwise, you have no idea what the effect of your magical cocktail of drugs is; maybe they would have gotten better anyway if you had not treated them. So that's a very basic thing that could be wrong with a paper and definitely worth pointing out.
 
 

And then there could be a problem where there is a control group, but the control group is much sicker than the people you're treating. If there's already a difference between the control group and the treatment group at the beginning of the experiment, then you can also raise questions; that is not good either. If they're much older or much younger or much [01:01:00] sicker, or live in a different city, or are much more educated, or have a higher income, all those things could already make a big difference in a group of patients. Another thing that could be wrong is a lack of ethical approval.
 
 

If you do an experiment on animals or on human subjects, nowadays you have to follow the rules. You have to have the right permits and the right consents, and you have to have approval from your institution or a regional organization, which makes sure you're doing experiments according to all the ethical rules we have in place: to treat human subjects fairly and make them aware of potential risks, or to treat animals fairly and not do any unnecessary experiments. Those are things I've found wrong as well.
 
 

Benjamin James Kuper-Smith: I mean, where do you draw the line [01:02:00] there? For example, the control group is a classic case where there is basically no perfect control group in almost any study. There's always going to be some sort of difference between groups, and some control groups will be better than others.
 
 

It's a continuous problem, so at some point you have to make a cut-off and say, okay, this is so bad that something has to be pointed out. So first, where do you draw that line? And secondly, related to this, when is it something you might mention to someone you talk to, versus post on Twitter, versus make a PubPeer post, and that kind of thing?
 
 

Elisabeth Bik: Most scientists will have a decent sense for what is not ideal versus what is really complete sloppiness. So yeah, having no control group, or a control group that is very different from the treatment group, is usually very obvious. I'm not [01:03:00] talking about the age of one group being 39.7 years and the other 41.3, even though that is perhaps statistically different depending on the number of subjects. But if the difference is much larger than that, it seems like a huge difference that would be worth pointing out. So sure, there's a gray zone, but the examples I've been pointing out are so extreme that I feel it's very obvious we have a problem. I try not to nitpick. Usually, when I have a group of authors that I suspect are cutting corners or deliberately doing things wrong with an intention to mislead, you will find more problems that by themselves might not be that big. One example was in COVID times, where it seemed that ethical approval was given one day after the study had started, but it was in 2020,
 
 

when COVID had just started and everything was perhaps in [01:04:00] panic mode. By itself, I can see that is not a reason to criticize a paper; people were desperate, and okay, maybe some corners got cut. But there were many other problems in the paper, so I will include such a, by itself, smallish thing, because I think there's a larger picture where other things are wrong, and then we can learn from it and point out some other faults in the paper as well.
 
 

Benjamin James Kuper-Smith: Yeah. So once it reaches some sort of critical mass where you go, there's just something wrong with this paper, whether it's lots of individual small things, kind of similar to what I found: none of them by itself is too bad, but en masse, you go, what did they get right?
 
 

Elisabeth Bik: And then you could even point out a little typo, which by itself, we all know, is fine; we're not going to care about that. But like you said, once you've seen other problems, you're [01:05:00] just going to put all of them there, because it adds to the feeling that a paper is just not handled carefully, that there's sloppiness, maybe even an intention to mislead. But even sloppiness is worth pointing out, because we need to be careful with our studies. Especially when you work with human subjects or with animals, we have a responsibility to do our experiments correctly, not to waste materials, not to reduce quality of life, not to endanger patients.
 
 

We have a responsibility to do ethical research, and to do ethical reporting, and be as close to the truth as we possibly can. 
 
 

Benjamin James Kuper-Smith: So as we mentioned in the beginning, you now do this full time; you're not working full time as a scientist and then doing this [01:06:00] on a relaxed Sunday afternoon. I'm curious: who hires a research integrity consultant?
 
 

Elisabeth Bik: Well, I do sometimes work for universities, publishers, journals, or editors who are investigating very specific allegations of misconduct. For example, another person doing similar work to mine has perhaps criticized the work of a particular scientist at a university, or a set of papers at a particular publisher, and they might want to hire an expert as a second opinion: are these concerns that were raised legit? Do we have a case? They might want a report written by a person who is somewhat credible, and I hope that person could be me. It's similar to any other case that you want to investigate:
 
 

you [01:07:00] need an expert opinion, like a pathologist looking at a dead body to determine how the person died. You want to hire a person who can write a report that will help the whole investigation move forward, and I am sometimes hired to do that. I've also worked with lawyers in the past who had a client who was accused, and who perhaps wanted to make the case that other people do it too, that there was just a general atmosphere of cheating in that lab. So maybe they want to look at other papers from that lab and show that it wasn't the client, it was somebody else. And whether I find other problems or not, I'll just report what I find, very honestly. I've also given talks for universities, publishers, et cetera, sometimes at conferences, as keynotes or other speaking opportunities, so I get honoraria to give talks. And I've worked with [01:08:00] journals on guidelines, for example for image preparation: how can we, as a journal, have better guidelines so that we can prevent problems like cropping too tightly or changing the contrast, and be clear about what is and isn't allowed? Very clear guidelines help everybody move forward in a way that is closer to the truth. So those are all things I make money with. And then I have a Patreon account, which is sort of crowdfunding, where people, usually graduate students and postdocs, support my work.
 
 

I get a lot of people supporting me, and altogether that gives me a sort of basic income as well. So it's a combination of consulting work and crowdfunding that funds my work.
 
 

Benjamin James Kuper-Smith: How interested are journals and universities in this kind of thing? Part of me thinks that for them it's still a bit of a nuisance, but I also [01:09:00] feel they must see that this has increased so much over the past, say, 10 or 15 years that an intelligent journal editor would say, okay, let's make sure we have, as you already said, some of the right procedures in place. Is it something you feel they're really interested in, or is it just the occasional journal?
 
 

Elisabeth Bik: They're more and more interested in it, just because they want to prevent these things from being published in the first place. If you catch these things before they get published, that's much better for all the parties involved, including the authors; you want to catch them before they're in the public domain. So journals, publishers, and institutions are more and more interested in hiring staff who specifically look at papers from that point of view: could it be fraud? Is there a potential problem with the [01:10:00] manuscript? And then catch these things before they get published.
 
 

So there's a growing interest in hiring people for those roles, so that they're not dependent on an outside expert but have a person within their organization who will look at these things.
 
 

Benjamin James Kuper-Smith: Let's say someone listening is interested in this and thinks: maybe not exactly what you do with image duplication, but that kind of thing sounds very cool, I would also like to be someone who gets abused online, and then...
 
 

Elisabeth Bik: Yeah, why not? And gets paid very little.
 
 

Benjamin James Kuper-Smith: Exactly. But,
 
 

 
 

Benjamin James Kuper-Smith: how can they go about this? Are there already positions at universities, or how do you get into it? Is it just doing it on PubPeer and hoping something happens?
 
 

Elisabeth Bik: I would say you have to have some background in science and some experience before you can do this. I don't think it would be a good [01:11:00] idea to work as a research integrity officer or an online detective, as I do, straight out of grad school; it is good to have some experience doing research yourself under your belt. Having said that, I would not recommend doing what I do as an early career researcher, because you have to be in a position where you don't really care about publishing anymore, but about criticizing other people. If you want to do that, sure, start doing it in your free time. You could start just looking at PubPeer and what gets posted there.
 
 

You could just follow every paper as it appears online and what people have to say about it. I think that's a wonderful way of learning what types of problems to look at. If you recognize them, you can better spot them yourself and find more of them, so you learn from what other people are posting and how they [01:12:00] formulate their concerns. Yeah, do it as a hobby. But if you're really interested in having this type of work as a career, some publishers are hiring people who will look at papers from the point of view that they could be fraudulent: scanning papers for image problems, making sure the citations aren't excessive self-citations, making sure the peer reviewers are not fake accounts created by the authors themselves, all these types of fraud you can think of. So there are people at journals doing that. There are also research integrity officers at universities, whose role is more about providing guidance and teaching graduate students how to do research, writing codes of conduct and guidelines on how to deal with potential allegations of misconduct. So [01:13:00] that's more... for me, it doesn't sound like as much fun; it's more about protocols and developing online courses and things like that. But if that's what you like, there are definitely careers in that at universities.
 
 

Benjamin James Kuper-Smith: Okay. One question I had is a fairly generic one about crime in general: whenever methods for detecting fraud are published and talked about openly, there's in some sense the question of whether we're just teaching the criminals.
 
 

And now we can actually talk about a hypothetical criminal, not just someone who makes an error. Are we just teaching them to become better fraudsters and to hide what they do better? It seems there's always a bit of an arms race between the people committing fraud and the people detecting fraud.
 
 

So I'm just curious whether [01:14:00] you have any opinions on that. Should you maybe be less public about it, and, you know,
 
 

Elisabeth Bik: Oh, 
 
 

Benjamin James Kuper-Smith: can you even do that? I don't know.
 
 

Elisabeth Bik: In a way, yeah. I feel I'm only catching the really dumb fraudsters, but yes, I am also telling them how to commit fraud better. For example, there was a set of papers that multiple people discovered independently of each other, which all had Western blots, these protein blots, with different bands but all placed on the same background. It appeared they had all been photoshopped, and I wrote a blog post about it; I called it the Tadpole Paper Mill, because the protein bands look like little baby frogs, like tadpoles. The fraudsters had put all of these on the same background, so at some point you could recognize the background and say: aha, these were all fabricated in the same lab or the same studio.
 
 

And these papers [01:15:00] must have been sold to different authors working at different institutions. But of course, now we've told them: okay, if you do that, we're going to catch you. So you have to do your fraud better; you have to put a little more effort into your backgrounds and add some variety,
 
 

so that we can't catch you. Yeah, it's always going to be an arms race, and it's like online scams: the Nigerian prince scam, or phone scams like "your account has been hacked, give me your password and I'll take care of it". We know these are scams, and the point of telling other people is to warn them so they don't fall for it, but there's always going to be a new scam.
 
 

There's just no way to prevent it; if we want to warn other people, we have to tell them how we found it, and there will always be scammers. But we can work on other things, like preregistration, or having less emphasis on needing to publish X number of [01:16:00] papers at this point in your career. If we focus less on metrics and quantity, and more on quality and reproducibility, the incentives to commit fraud might go away. I hope we can move towards a system where we can, for example, put a smallish experiment online and then other people can try to replicate it. If it's replicable, that means my experiment was done correctly.
 
 

And we all get credit for that: not just me for doing the experiment, but also the other people for replicating it. If we can work towards a system that is less dependent on scientific publications and impact factors and all these very easy-to-measure metrics, which do not necessarily make me a better scientist, then we might slow down science a little bit, obviously, but we will make science better.
 
 

So I hope we can move towards that system, because it's going to be [01:17:00] much, much harder to commit fraud if your experiment needs to be replicated in order to move forward.
 
 

Benjamin James Kuper-Smith: I mean, I'm not even sure I'd agree that it slows science down, right? Science is being slowed down now by lots of people trying to replicate things they thought were great findings because they appeared in a famous journal. I know people who spent months trying to replicate something that, for whatever reason, just didn't replicate.
 
 

Elisabeth Bik: There are many reasons why a paper cannot be replicated. You can think about the secret sauce that we add, or the secret little flick we give to our tube that actually does make a difference. Or, I've heard about the amount of ozone in the air: you have a printer somewhere in the admin office next door to your lab that is creating a lot of ozone, which might inhibit your experiment, and you have no idea why it doesn't work in your lab. So you don't always know the reason, and it is definitely not always [01:18:00] fraud. It could be many different things that you just don't realize might make a difference. It is very hard to replicate experiments, as we have all experienced, and you're right, that also slows down science, but it's not very visible because we tend to only report the positive things, not the many hours we spent trying to replicate something that didn't work.
 
 

Benjamin James Kuper-Smith: Going back to the initial question about teaching people how to commit better fraud, I always find that so interesting. I think I started my episode with Joe Hilgard by asking him whether he could commit fraud and get away with it, and he was like, oh yeah, it's easy.
 
 

Elisabeth Bik: You could,
 
 

Benjamin James Kuper-Smith: yeah, 
 
 

Elisabeth Bik: easily do that.
 
 

Benjamin James Kuper-Smith: You really don't need to know that much. Just
 
 

Elisabeth Bik: Yeah, 
 
 

Benjamin James Kuper-Smith: don't be a complete moron, basically, and then you can get away with it.
 
 

Elisabeth Bik: And most scientists are smart, right? I might be catching the dumb fraudsters, but most scientists are pretty smart. So for every paper we [01:19:00] catch with a visible problem, like a repeated row in a table or an overlapping set of microscopy images, there must be a lot more that we don't catch. We're really only catching the tip of the iceberg. And again, some of these might be honest errors, but there are a lot of ways to commit fraud, unfortunately, and those I'm not going to tell you. I think we can all come up with ways to commit fraud that would not be visible in any way, fraud that isn't even visible from looking at our lab books; only maybe if you looked over my shoulder while I did the experiment could you catch me. We can all think of ways of doing it.
 
 

But I know that most scientists are honest. We're not committing fraud. Some of us might massage the data a little bit or tweak it, but we're not in science because we want to commit fraud. We're in science because we have this idealistic [01:20:00] feeling that we want to improve the world, make us healthier, and make the world a better place to live.
 
 

At least, I feel that very strongly, and I hope most scientists do too. I don't think most people are in the field of science to commit fraud.
 
 

Benjamin James Kuper-Smith: I think we can say most people aren't in science to commit fraud. That's a fairly broad blanket statement we can safely make. And if they were, they could make a lot more money committing fraud in other fields.
 
 

Elisabeth Bik: And there are people doing that. We haven't even talked much about paper mills; well, a little bit, I mentioned the Tadpole Paper Mill. But there are people who are creating and selling fake papers, selling the authorships to people who need them. The authors on those papers just pay those paper mill scammers some money, and now their name is [01:21:00] on that paper. Now they have a paper and they can continue their career: maybe they can get a PhD position, or go to medical school, or finish medical school, because they have a paper, and they don't really care whether the paper is fake or even in their own field. So you see these papers with multiple authors from multiple countries and multiple institutions who would have been very unlikely to find each other in real life, and whose affiliations do not really match the paper. Let's say it's a paper about, I don't know, electric vehicles, and they work in a department of psychiatry. Why would you publish a paper about electric vehicles? It doesn't make a lot of sense, so those are signs of papers being completely fake and being sold to whoever wants to pay. Yeah.
 
 

Benjamin James Kuper-Smith: A friend of mine who works in science publishing, [01:22:00] who worked for Scientific Reports, for example, said they had a massive problem with paper mills, because the fraudsters were being intelligent: they didn't try to get their papers into Nature.
 
 

They tried to get them into the kind of journal that's good enough but where no one's going to pay too close attention if something's slightly off. So unless you're, was it Diederik Stapel who was publishing made-up stuff in Science and the like? As long as you don't do that, as long as you avoid the really dumb mistakes, you can get away with it.
 
 

And I think that's the nice positive note we're going to end on: if you want to commit fraud in science, you can.
 
 

Elisabeth Bik: But you might get caught. So, yeah.
 
 

Benjamin James Kuper-Smith: Yeah, maybe.
 
 

Elisabeth Bik: As long as you don't leave any traces for us to find. Yeah.
 
 

Benjamin James Kuper-Smith: Yeah, don't duplicate images, I think that should do for now. I have a few final questions. The first is: a book or paper that you think more people should read. It can be basically anything, something old, new, [01:23:00] very famous, or something no one has ever heard of. Just something you think should be read more.
 
 

Elisabeth Bik: I really like Plastic Fantastic. It's a book about one of the big science frauds, about Jan Hendrik Schön, a physicist who worked at Bell Labs and was found to have faked results. He published in Nature and Science. He was caught because he published the same graph a couple of times, and he was never seen in the lab.
 
 

But then the next day he would suddenly have this beautiful graph, and people thought he was a fantastic researcher, and he won some awards. Eventually, though, people started asking: wait, are you really doing those experiments or just drawing a graph? It turns out he did the latter. The book is by Eugenie Reich, who is a journalist and now also a lawyer working on science misconduct.
 
 

It's a really [01:24:00] nice story. I don't read a lot of books, but what I liked about it is that you can see how a person was allowed to keep cheating because everybody wanted him to be a brilliant scientist, but also how hard it was for several whistleblowers to raise the alarm and how they were not believed by the leadership at Bell Labs.
 
 

They were sent away with, well, you might not see him in the lab, but maybe he does the experiments at night. It took multiple rounds of people raising the alarm before there was an actual investigation, and it turned out he was a fraud. The frustration of the whistleblowers being sent away and not believed was something I could relate to a lot. So Plastic Fantastic, I really enjoyed that book. It's about, I think, semiconductors in plastic, and I don't quite understand the science behind it, but the whole story about a person cheating, the whistleblowers trying to raise the alarm, and finally the detective work, I think that's a very interesting story that applies to a lot of cases of misconduct.
 
 

Benjamin James Kuper-Smith: Great. I think someone mentioned that one before; I can't remember who, but this book has definitely been recommended before. So yeah, definitely go read it. I haven't read it yet, but I guess now I have to, now that it's been recommended twice. Maybe that's the test.
 
 

Second question: something you wish you'd learned sooner. This can be from any realm of life you want. Just something where you went, oh, if I'd learned that a bit sooner, that would have helped me out quite a bit.
 
 

Elisabeth Bik: I think just that there is cheating in science. I was shocked the first time I found out. I know that people cheat, although I still remember being completely shocked, when I was 8 or 9 years old, that people were cheating on their tests in elementary school.
 
 

Like, why do [01:26:00] people do that? So learning that scientists are cheating too, that was a shock to me. Now, of course, I work in this field, so I know it's happening.
 
 

Benjamin James Kuper-Smith: Yeah. I hope you're aware of it now. 
 
 

Elisabeth Bik: I was just shocked. Like I said before, I just thought that everybody working in science was doing it for some idealistic reason, and that you shouldn't be cheating in science.
 
 

For me, science is about finding the truth, and cheating is just not the way to do that. So I wish I had learned that sooner. And I guess also that people who lie will lie again. They might provide you with an "original" image and say, see, the original is different, but the original might actually also be photoshopped, as I've learned. If people lie in their papers, they're going to lie with the evidence they present to you.
 
 

So I think those are things that I've learned along the [01:27:00] way, and I wish I had known them a little bit earlier. 
 
 

Benjamin James Kuper-Smith: By the way, just a brief question: does working on scientific fraud all the time make you a bit cynical, or take a negative view of things?
 
 

Elisabeth Bik: I've always been very cynical, I think.
 
 

Benjamin James Kuper-Smith: No, but you know what I mean, right? Because you see so much and you have to question, as you just said: well, are they telling the truth now?
 
 

Or is this another lie? 
 
 

Elisabeth Bik: I probably am a little bit more cynical, but I still believe in science. I still know that scientists are doing this out of a passion for science, and we need science to solve all the big problems in the world. So even though I'm working on that perhaps smallish percentage of scientists who do bad things, I'm still very passionate about science in general.
 
 

And I think that's something important that you might not get from listening to me talk about science fraud for a long time. I am a [01:28:00] scientist myself, and I have met so many great and honest scientists who also sometimes struggle with questions, like we all do: can we leave out this outlier?
 
 

If we take out this experiment and everything looks better, is that allowed or is that science fraud? I think all of us struggle with that fine line: when do we cross it, and when are we still fine? So yes, I believe in science and I will defend science till my last breath, but yeah, I'm a little bit more cynical.
 
 

And I just, you know, trust but verify. I like to verify things. If people make statements online, I like to ask them for references, and if they're from certain sources, I might not tend to believe them.
 
 

Benjamin James Kuper-Smith: Yeah, I mean, my question wasn't a response to me getting that impression while talking to you. But I guess there's always this question of how one's work changes how one views the world. [01:29:00] It's nice that you've managed not to let that seep into you.
 
 

Elisabeth Bik: Yes, 
 
 

Benjamin James Kuper-Smith: Yeah. Final question: any advice for people like me, advanced PhD students, maybe early postdocs, people at that kind of transition? Anything you would like to tell those people?
 
 

Elisabeth Bik: Well, very basically, if you're thinking about joining a particular lab, I would do a quick PubPeer search. If it's a lab with a lot of PubPeer comments and potential misconduct, don't join that lab. That's pretty obvious.
 
 

Benjamin James Kuper-Smith: Unless you want to be an investigative fraud finder.
 
 

Elisabeth Bik: Yeah, but it's not going to end well for any of the people involved,
 
 

Benjamin James Kuper-Smith: Okay. 
 
 

Elisabeth Bik: that's for sure. But yeah, pick your next lab carefully. Interview with people, not just the person who will be your manager or PI, but also other people working in that lab. What is the atmosphere, the [01:30:00] culture of that lab? Is it one of respect, support, and mentoring, or is it one of bullying, high expectations of working weekends, working day and night, never taking a day off, never being allowed a sick day, and having to supply the PI with the results they expect? I think if you work in a lab like that, it's just not good for your mental health. You want a lab where there is support from your PI even if the results you bring him or her are not the ones they had hoped for. How does the PI deal with that?
 
 

Are you expected to give them the results they want, or is it okay to fail and to retry? Are you mentored? Are you supervised along the way? Of course, as you get [01:31:00] further along in your career, you need a little less mentoring and supervising, but we all need to be helped and supervised.
 
 

If you are dealing with a boss who is a bully, that is just not a good situation. And if a person of authority tells you, this is how we all do science, we all cheat: that is actually incorrect. We don't all cheat. Get out of that lab if you can, as soon as you can.
 
 

Benjamin James Kuper-Smith: Or don't get into it in the first place. 
 
 

Elisabeth Bik: Don't get into it at all. So interview people. 
 
 

Benjamin James Kuper-Smith: Do you commit fraud in this lab? 
 
 

Elisabeth Bik: Well, that's perhaps not a question you'll get a good answer to. But if you can, interview people who have worked in that lab recently or have moved on. Are these people who also start their own labs, or did everybody who worked in that lab leave academia altogether? You can sort of work out whether the leader of the [01:32:00] lab, the group leader, the professor, is a good mentor for his or her students and postdocs. Do they get the supervision they need so they can also become successful further on in their careers? Or do they all leave science and academia altogether, not even ending up in what we call alternative careers, which are very decent careers with a PhD as well? But if everybody starts a coffee shop after working in this lab, then maybe there's something wrong.
 
 

And 
 
 

Benjamin James Kuper-Smith: Yeah. 
 
 

Elisabeth Bik: that is not a hypothetical example, either.
 
 

Benjamin James Kuper-Smith: Okay, well, with that: thank you very much.
 
 

Elisabeth Bik: You're very welcome. It was my pleasure to be here. And yeah, that was wonderful. Thank you so much.
