BJKS Podcast

105. Eugenie Reich (Part 1): Plastic Fantastic, scientific fraud, and institutional norms

Eugenie Reich is an attorney who represents scientific whistleblowers, and a former investigative science journalist. We talk about her previous work as a science journalist, in particular her book Plastic Fantastic about one of the biggest fraud cases in physics, the case of Jan-Hendrik Schön. We'd planned to also discuss Eugenie's current work as an attorney, but spent all our time on the Schön case. Eugenie kindly agreed to do another interview, in which we cover the legal aspects of fraud, which will be the next episode (#106).

BJKS Podcast is a podcast about neuroscience, psychology, and anything vaguely related, hosted by Benjamin James Kuper-Smith.

Support the show: https://geni.us/bjks-patreon

Timestamps
0:00:00: One of the biggest fraud cases in physics/all of science
0:05:47: How and why Eugenie started writing about the Schön case
0:09:26: Why did Schön commit fraud?
0:19:30: Schön's PhD: he never saved any original data
0:30:05: Bell Labs vs. Schön's PhD lab: long-term revolutions vs. short-term applications
0:36:42: Schön's first work at Bell Labs was 'unpublishable'
0:41:42: How to get away with fraud: pretend you collected data in another lab
0:47:45: Bertram Batlogg and the role of the supervisors of fraudsters
0:56:20: How the bursting of the Dot-Com Bubble and 9/11 may (indirectly) have exacerbated Schön's fraud
1:01:09: How to use your colleagues' ideas to commit better fraud
1:05:05: How Schön's fraud unraveled
1:13:45: What is Schön doing now?
1:18:11: A book or paper more people should read
1:20:20: Something Eugenie wishes she'd learnt sooner
1:22:58: Advice for PhD students/postdocs

References and links

Episode with Simine Vazire: https://geni.us/bjks-vazire
Episode with Elisabeth Bik: https://geni.us/bjks-bik

Bell Labs (2002). The Schön report: https://media-bell-labs-com.s3.amazonaws.com/pages/20170403_1709/misconduct-revew-report-lucent.pdf
Reich (2009). Plastic fantastic: How the biggest fraud in physics shook the scientific world.
Shapin & Schaffer (1985). Leviathan and the air-pump: Hobbes, Boyle, and the experimental life.

[This is an automated transcript that may contain errors]

Benjamin James Kuper-Smith: [00:00:00] Yeah, you know, it's funny, I have this recurring question at the end about what book or paper people should read. And as I wrote to you in the email, your book Plastic Fantastic has been recommended twice. And, um, usually, the funny thing is, I ask for these recommendations and I almost never take people up on them.

I just somehow, now it's this really nice list of literature that, you know, I just don't get around to reading. But because yours was recommended twice, uh, I did, and I'm really glad I did, because I think it's a great book. And I think the best kind of evidence for that is that your book is, in a way, about the border between physics and chemistry, an area of science I don't care about at all, uh, and I still really enjoyed it. So congratulations, that's high praise.

Eugenie Reich: Thank you.

Benjamin James Kuper-Smith: Um, but yeah, I thought that maybe to start off and to illustrate, I mean, we'll be talking about, in part, Jan Hendrik Schön's case, uh, which is the fraud case in physics from about 20 [00:01:00] years ago that you wrote about.

And, uh, it was almost 25 years ago by now. Um, and then also how you then transitioned into being a lawyer representing scientific whistleblowers. And I thought, maybe to start, to get people familiar with the extent of the Schön fraud: I checked out a few of the papers that you mentioned, and one of them is really kind of crazy. It was published in 2000; in 2001 it was cited 100 times, in 2002 it was cited 100 times, and after that, almost never again. And at the end of the paper, it now lists eight Science papers by Hendrik Schön from 2000 and 2001 alone that have been retracted.

And I think, uh, you know, this is just Science magazine, eight papers in two years, and there's much more in other journals. Um, and I think, is it fair to say that this is the biggest fraud case in physics, or [00:02:00] does anything rival it?

Eugenie Reich: I think it is, even 20 years on, fair to say that this has been the biggest fraud case in physics. I would say that it's not really a competition. I don't want people who experience scientific integrity lapses in physics or chemistry to feel that, you know, they're upstaged and they can't bring their complaints forward, or to feel that they have to sort of measure it by that benchmark.

Um, I think it's the biggest, quote, "biggest" case of fraud in the sense that there were a remarkable number of fraudulent papers published in Nature and Science. I think it's 13 between the two top journals. There was a remarkable sort of productivity of fraudulent data. Schön was submitting fraudulent manuscripts at a rate of one every eight days or so at his peak [00:03:00] activity, which is remarkable even considering that the data in them was fabricated.

Um, quite a sort of writing achievement on its own. And when I tried to sort of measure, and measure may be the wrong word, but when I tried to estimate the amount of harm done, I found dozens and dozens of people whose lives had been derailed trying to follow up on his claims. Um, and we don't really have a way to actually quantify it in dollars, but there was at least one major NSF-funded center, the funding decision for which I think was arguably swung by the idea that we're going to follow up on this research, and it might have gone to a different center.

That was at Columbia University, where, um, there was a senior [00:04:00] scientist who had a connection to Bell Labs, who was going to work on organic crystals. And that was substantial, dozens of millions of dollars, I think, in federal funding, not going to Schön himself, but going to research that might not otherwise have been the top choice of the funding agencies to pursue.

So, you know, you can't measure some of these intangible harms against other intangible harms from more recent cases very well. Uh, so I shouldn't really say that it's the biggest, but by a lot of metrics it certainly, um, sort of is. Um, and it's been 20 years, and I think there are reasons why it made such an impact.

Yeah,

Benjamin James Kuper-Smith: When I read the book, there's this, in some sense, and [00:05:00] this might sound like a criticism of you, but it's not: at some point it gets repetitive, because you have this major paper that he publishes in a major journal, and then you keep getting these, you know... Because you know from the beginning of your book that this is a fraud and it's all going to blow up, and you always ask, when is it going to start falling apart?

And you always think, ah, this is the one now, you know: there's another major paper, people are starting to question it, they can't replicate it, whatever. And then it just, you know, happens again and again and again. But yeah, what you also mentioned earlier about the kind of harm done to people trying to replicate stuff that just isn't true.

Uh, I hope we'll get to an example or two of that later in the discussion. But maybe, um, yeah, just to kick things off: how did you, I mean, you were working as a science journalist at the time. So I think you wrote that you read the Schön investigation report and called it "the most [00:06:00] riveting piece of technical writing I've ever come across."

Uh, what was so riveting about it? And then maybe also, what made you want to write the book?

Eugenie Reich: That's correct. So in 2002, I was a science journalist, and I'd been working in that area for about three years since graduating college. And so, um, sort of my bread and butter was to read technical writing and see if there was a story or a news article behind it. Not for the general public, but maybe for the scientifically curious public.

So I worked for a popular science magazine, New Scientist. And I read the Schön investigation report when it was publicly released. And it was different to most pieces of technical writing that I would read, um, because unlike scientific articles, it was attempting to assess the mental state of the data producer, [00:07:00] which was Schön.

And what I found very interesting about the report, and what continues to interest me today, is how you can look at data processing in a scientific context and infer something about the thinking of the data producer. Which is not what you want; you want data to be objectively collected, and that's what we hope and assume.

But there are certain kinds of analyses of data, or reviews of data processing steps, that give rise to an inference that the data producer was acting on their own beliefs, and not just being sort of the blank slate. There are types of data processing reviews and critiques that assess the [00:08:00] mental state of the data producer. In other words, you are looking at how they processed the data, and their intentionality pops out of your analysis, and you see it and you feel it viscerally.

And people who investigate fraud and detect fraud in science know that feeling: like, oh boy, there's something very weird behind this data analysis procedure, which should have been an objective series of steps. So I think that's what fascinated me: to see the psychology pop out of the data analysis critique that the investigators undertook.

Benjamin James Kuper-Smith: Yeah, I mean, I hadn't read the report, but having read your comment on it, I thought I should at least check it out before we speak. And I did find it kind of interesting, just from scanning over a few pages, what they say in their summary, you know, as you said earlier, independent of the fraud itself.

I mean, I find it kind [00:09:00] of funny that this is a report where they clearly say that this was fraudulent data and that these things can't be true, and they also start out by saying Jan Hendrik Schön is a very hard-working and intelligent scientist, and that it is extraordinarily impressive, as you said earlier, even if you consider that the data was made up, to have published all those papers in such a short time. And, um, yeah, I liked your comment about the psychological aspects, because I feel like I still don't really know what to make of Schön. Um, he seems to be, uh... I don't know whether that was exactly your intention there, but to me, I think he actually believes what he says, that he didn't commit any fraud.

Um, but I don't know, I find him kind of hard to read, in the sense that, yeah, maybe because he just [00:10:00] denied it, even though it was at such an enormous scale. But also, um, we'll get later into exactly how the fraud happened, that kind of stuff, because I think that's what's really interesting about the whole procedure: how it kind of gradually happened.

Um,

Eugenie Reich: Right? So something that's important about the scope of my project is that it was not an effort to psychoanalyse Schön; it wasn't an effort to understand, you know... I do recite facts about his childhood as I learned them, but it wasn't an attempt to find out if he was proving a point to anyone, if he'd been through any trauma that contributed, or if he, um, had any sort of inner desires that produced his fraudulent behavior.

So it wasn't a psychological inquiry into him as a character. However, in any fraud case, when you're discussing intent, you do have the [00:11:00] idea of a motive, and I did collect information about his likely motive, which I think was very clearly to please his supervisor, to make career progress in the lab he was in and in the field he was in, to get a good job, to try and fit in with scientific community expectations, both in terms of career success and in terms of

the theoretical expectations of his field. And I really would want to demystify his psychology by saying, and I've said this before, it's not a mystery why a person would misstate things.

It's not a mystery why someone might be tempted to, uh, misrepresent an experiment that isn't working very well, [00:12:00] in order to maintain a permanent job in a competitive lab, which Bell Labs was. So I think that scientists are, and this is the same today, too much over-mystifying normal human behaviors. If you worked in banking, or in finance, or even just in a supermarket chain, you'd understand that people engage in certain, um, behaviors: you know, they might appear to clock in earlier than they really did, or forget to clock out, to add a few more hours that day.

These behaviors are everywhere. And I think when we go into the psychology of how it happens in science, it's very curious because of the effort in science to do objective data collection. But we don't want to over-mystify why people might do it. Um, I completely [00:13:00] disagree with the statement, which I sometimes hear, that he did not know he was faking data or that he truly believed his results.

Um, and I'm surprised that some people who read my book have somehow come away with that message. You know, they're free to do so, because I mostly present facts; I don't tell the reader what to think. But my own private assessment, which I didn't, and am not going to, force on the reader, is that he knew perfectly well that the data he presented was fabricated and that he had not seen the effects.

Had he seen the effects, had he believed he saw the effects, I think he'd have wanted to characterize them in an honest way. It's because he didn't see the desired effects that he fabricated data. Because those effects were desired, and they were expected, and they were very much a goal of his supervisor in [00:14:00] his lab, knowing that he had not seen those effects provided a motive to fabricate data suggesting that they were there.

So I think there's a complete lack of the, sort of, genuine scientific curiosity and interest that he really should have had. Um, but, you know, some people read the book and they see the facts I've collected and they come away thinking, poor guy, he really thought he saw it, and what a shame he sort of fabricated the evidence.

Benjamin James Kuper-Smith: Yeah, I mean, I'm not that extreme, but I think I am a bit more, uh, sympathetic. Not sympathetic, maybe, but what I came away with was this sense that he convinced himself that there was something correct in what he did. Because, I mean, now I'm speaking for all of science,

um, but I think what you assume when you write a paper and publish it is that, like, if you write complete [00:15:00] nonsense, I mean, sure, you can maybe do it once, right? It doesn't replicate or whatever. Sure, you can do that once. But he did it at such a scale that you just feel like he must have known he was going to be found out if he didn't actually believe what he wrote.

I think that's kind of why, um, at least I, yeah, I mean, it would just be insanity to believe that you can publish all these major breakthroughs and then just hope to get away with it, even though it's complete nonsense.

Eugenie Reich: Well, so first of all, I'll explain why I don't think Schön believed he saw the effects he reported. If a scientist believes they've observed a remarkable effect, then they will try to persuade others using the evidence that persuaded them. Now, when somebody starts hiding information about what they saw, or misrepresenting what they saw, to [00:16:00] others, uh, that's telling you that they don't believe it.

Um, and there are examples where people do believe they've seen something they haven't seen, and they give an account that is unpersuasive to others. They really, really believe it, and they can be laying on super strong the parts of the evidence that persuaded them, and being uncritical about the reasons why they're wrong.

But in the end, if they're honest, they're going to produce their data when others ask for it, and let others observe their experiment. In the end, it's going to come out that they're wrong. Um, unless, and this is where they cross the line to fraud, they hide something from others that they know will undermine their position.

And it's the hiding behavior, or the lying, [00:17:00] that tells you. Lying not about the effect, because anyone can be wrong about what the final conclusion is, but lying about the facts, lying about the experimental circumstances, or hiding data, or not saving data, that, I think, tells you that this person knows there's a problem with their scientific conclusions.

Um, I do think it's a good point that Schön made so many remarkable claims; he surely must have known he would get caught. Um, I think when I break it down in the book, you see that it takes a lot to catch him. His fraud was detected only because of some rather low-hanging fruit that he left for skeptics, which was the pattern of duplicated data, whereby he presented the same graphs as the results of [00:18:00] different experiments, down to noise that would not be the same in different experiments.

And the number of duplications was very large. If Schön had slowed his pace of submissions, if he'd fabricated noise ab initio each time, if he hadn't duplicated, he would not have been detected as a fraud. And we know that because of the many examples where people couldn't replicate his work but didn't find him to be fraudulent until those smoking guns for data mishandling were found.

So I think he, uh, correctly assessed that he could get away with it. Um, and he made some very clumsy mistakes, which, had he not made them, he would have gotten away with it. So, um, that's [00:19:00] troubling, because we can imagine that there are people like him who are a bit smarter and who are evading detection.

Benjamin James Kuper-Smith: Yeah, definitely. I mean, I've had a few episodes now about fraud in science, and reporting it and finding it and all this kind of stuff. And I think the general consensus is that we only find the dumb fraudsters. If you're actually smart about this, it should be relatively straightforward to get away with it.

Uh, which is not a good thing to know, but that seems to be, I think, most people's opinion. Um, but I wanted to tell the story a little bit, um, to provide a little bit of context about him and what he did and how he got away with it. And, you know, why I found this book in particular so interesting is because a lot of the things that it describes are maybe more extreme versions of what I typically see.

And especially, you know, I grew up after the, uh, replication crisis, or, what do you call it, "credibility revolution" being the nicer framing, uh, in psychology, where, [00:20:00] um, well, actually there are still lots of people who don't share their data, but it's now very common to share all your data, or for people to ask for it in the review process.

Um, so, uh, anyway, my point is, a lot of the stuff that happened in his career, you know, it's not a million miles away from stuff I see every day. And that makes you go, oh yeah, maybe we should do this in a better way. So I want to tell the story in a little bit more detail, to, uh, yeah, show, I guess, how someone can get away with fraud for quite such a long time.

Um, shall we maybe start with his PhD? I believe in his PhD, everything started off, I guess, very unremarkably. After doing his kind of German equivalent of a master's thesis at the time in the lab, he was given a PhD project by the lab. And yeah, do you maybe want to describe a little bit of what he did in his PhD? And, uh, you know, this is obviously the first [00:21:00] question, always: what his relationship to truth and veracity and reporting was in his PhD.

Eugenie Reich: So during Schön's PhD, there's evidence that he was collecting data from an instrument he knew how to use, which was primarily photoluminescence equipment, where he was measuring, um, light coming out of a sample of a novel material that, it was hoped, would be used for solar cells.

So they were trying to determine whether this material can absorb some of the visible light wavelengths with high efficiency to create electricity. He knew how to use the equipment, because I spoke to the people who taught him how to use it. He was seen using the equipment, and he had collected some data that looks like the type of data, um, that would come off that kind of instrument.

[00:22:00] He says, by his own admission, that he only saved processed data from these data runs. He didn't save original data. And at that time, that lab, like many labs, was transitioning from the sort of hard-copy notebook. This was in the late 90s, and they were transitioning from having a laboratory notebook that would be a physical notebook to saving things electronically.

And so, um, they did not have protocols for making sure everyone saves everything to the cloud or to a shared system. The instrument should have kept a copy, but they didn't have a sort of policy or practice of making sure that it did. And he says that he only saved processed data.

And then when you compare his early PhD figures

Benjamin James Kuper-Smith: So just to briefly interrupt, [00:23:00] um, just to clarify on the processed data: I believe what we mean by this is that he collected the original data, but then what he saved was basically the result of his analysis, right?

Eugenie Reich: Yeah, so Schön collected data from his photoluminescence equipment or from his electrical measurements. And then he brought the data into a data analysis program, which enabled him to plot graphs and calculate things like coefficients of correlation, or run other kinds of fairly basic statistical tests on the data he'd collected. And he reported those sorts of outputs: various physical quantities that he was trying to determine through his measurements.

So then he would save the output of his [00:24:00] calculations in that program, including the graphs and figures that it spat out, but he didn't save what he put in, which came straight off the instruments. So he's making certain changes to his data, and if someone else had a question whether he analysed it correctly, or if he had a question whether his past self had analysed it correctly, he couldn't tell.

Because he didn't have a copy, he couldn't repeat his own work and just check he did it right. Um, so his self-accountability went out the window along with the original data, way before accountability to anyone else came into question.

Benjamin James Kuper-Smith: Yeah, I mean, he said that he didn't learn how to... I mean, anyone who's done an analysis, it just sounds crazy to, you know, not save the raw data, [00:25:00] but he claimed that he never learned to save the data properly, or something like that. Do you buy that? I don't know. I mean, in a way, I don't think I've ever really been taught that I have to save the original data, because, well, today you also have to, because it's digital.

Um, that's how we collect data on a computer. Um, so maybe in, like, psychology now, if you do a task on a computer, it's automatically saved; that's how you have to save it. But it just seems, I mean, it seems like the kind of thing you might not tell someone because it's so blatantly obvious.

Uh, do you buy what he says there? Or do you, yeah, I just found it so baffling that someone would, you know, finish a PhD and then say, oh yeah, well, I never learned that I had to save the original data.

Eugenie Reich: I don't know, um, whether I believe that. Schön encountered a laissez-faire attitude to data retention in his [00:26:00] PhD lab, not because his supervisor or others didn't care about original data, but because of the transition from hard-copy notebooks to electronic records, and there was no real standard for electronic notebooks,

uh, like some fields have now. And I don't think it should be something that is left to the individual. I think all data that is acquired should go immediately to somewhere where you cannot mess with it, even if you want to, and you cannot lose it, even if you want to. To some central repository, directly. There's all this emphasis on data sharing, um, but let's just back up: save the data, and save it so that the people who are going to work with it can't later change what was saved.

And then let's have a discussion [00:27:00] later about who gets to share it and when they get to see it. Just make sure it's there and you can't tamper with it. Um, and that's missing to this day in many labs and many institutions, including institutions with the storage space and the money and the resources to save anything they want.

So, um,

Benjamin James Kuper-Smith: I mean, just to add to that: again, like I said, in the stuff I do, everything's saved on a computer automatically. Um, but it's pretty much always on a local computer, and then you have to download it and save it on your own computer or on a server or something like that.

But yeah, from what I've seen so far, I don't think there's anything like that, where there's always a central copy made that you can't touch. Sounds like something that should be done, but I actually hadn't heard of it, or even thought about it, until you just mentioned it.

Eugenie Reich: Right. And I mean, there are, and this gets into [00:28:00] my legal work now, but there are reasons why I think institutions prefer to keep responsibility for the data with individuals. And, um, you know, from that perspective, I feel sorry for junior researchers, including Schön at that time, who don't get a clear enough message that they've got to save their data.

And even if they do, even if they're explicitly told, "save your data," the fact that the institution itself isn't providing that resource and making it super easy to make sure everything gets saved, that is in itself a message to them about how important this is. So there are different ways that you communicate what's important to people.

And only one of them is by telling them. There are other things you can do to make sure that people get a message about how important something is. Um, if you keep telling [00:29:00] someone such-and-such is important, but you never check it, you never call anyone out who doesn't do it, um, an intelligent person, which scientists are, will quickly figure out that this isn't as important as they say it is.

So, you know, I don't think that's a good thing, that Schön thought like that, but I think it's not only his fault.

Benjamin James Kuper-Smith: Yeah, I mean, I guess we still, I mean, this is the major thing also that we'll come back to later. It's just, number one, the assumption that people don't commit fraud, and, uh, also then, yeah, not trying to offend people by suggesting they might be doing it. But maybe we can discuss that later a little bit.

Um, maybe, I mean, his PhD, from what I can tell, was alright. From what I understand, it was a decent PhD: he published a few papers, um, but he wasn't a standout, I think you said. Um, maybe you can [00:30:00] tell the story of how he ended up at Bell Labs, and that he wasn't the first choice.

Um, but I think one thing I wanted to get to, and that's important when he moves to Bell Labs, and maybe you can tell me whether this is a correct impression I have or not, is that one thing that kind of saved him a little bit in his PhD, or made sure that he didn't do too much, he did some stuff, but nothing too crazy, at least compared to what he did later,

is that it seemed to me that his PhD lab was a fairly solidly working lab, in the sense that they didn't, unlike Bell Labs later, have this need for breakthrough stuff and to always be at the absolute cutting edge.

But it seemed to me it was kind of, I mean, I grew up in Germany, it seemed fairly German: do a decent job and produce some small papers, and you make small progress cumulatively that way. Um, am I correct in that assumption? Because that seemed to me like a big step, [00:31:00] when he went from there to Bell Labs, where suddenly it was like, you know, we win Nobel prizes here.

Eugenie Reich: I think it's correct that the, um, environment in the PhD lab that Schön was in, although it had some shortcomings, which we talked about, like the failure to sort of retain all the data centrally, um, one thing it did have going for it is that they weren't too interested in flashy publications for their own sake.

The, um, PI, Ernst Bucher, had set up a number of companies, and the lab was interested in translating its findings to industry. So when they created a material that they thought had certain properties for the purposes of capturing energy from light, from sunlight, they did [00:32:00] actually want to make solar cells out of it, rather than to get a publication in Nature about it.

Like, if they had to choose, they'd rather make a solar cell and put it on the market in Germany, where this was a big business. So in the end, it had to work. And, um, I think that created a certain amount of cross-checking of people's work, and figuring out who's done something solid and who hasn't. Um, not so much based on "we're going to investigate anyone for fraud," but based on: if someone's done something solid, let's have some other people also do work on that.

And then let's unpack it, and then let's discuss it with, you know, our corporate sponsor or our corporate interest, and see if we can start putting this into product development. So I think that it's both the fact that they weren't too interested in publicity and that they were [00:33:00] quite interested in technology translation that provided a sort of check on Schön's impulses and pushed him in a direction where he was less likely to make outlandish claims.

Bell Labs was an environment where the priority was blue-skies, curiosity-driven research.

Benjamin James Kuper-Smith: Yeah, I guess having a bunch of German engineers tell you your solar cell doesn't work, or knowing that they will probably tell you in no uncertain terms pretty quickly, that probably does ground you in some sort of reality. And it seems to me, I mean, Bell Labs, I guess we don't have to go into it too much, but they were also, you know, owned by or part of a company, and this is also going to be somewhat relevant, I guess, because the way that company fell apart or struggled at crucial times, I think, also played a little bit into this. But it seems to me that [00:34:00] maybe in his PhD lab the applications were pretty immediate or fairly soon, whereas at Bell Labs, you know, the big story, I think, was that the transistor, something that he would then later work on in a slightly different way, was invented at Bell Labs.

And then, I mean, in a way Bell Labs created the computer age, in a sense, with the transistor and with information theory with Claude Shannon, fifty years ago. So I guess they were happy to wait a few decades in principle for the big stuff to come out, whereas in his PhD lab it was pretty much like five years or whatever.

I mean, you know, a pretty short-term kind of impact.

Eugenie Reich: I think that's right. I mean, Bell Labs at heart was an industrial lab; they too wanted to contribute to applications, and people I interviewed would argue with me that there was no true distinction between blue-skies research and research with applications. They would [00:35:00] try to merge those two ideas and say curiosity-driven research leads to knowledge, leads to applications, and it's all part of the same thing.

But what they did have as a luxury at Bell Labs, particularly in the physics division, was that they didn't have to deliver applications soon; they could have a long time frame, more than five to ten years, more than that. At the same time, they were still part of a corporate organization that needs to please shareholders and to show that they are doing something that will lead to value for the corporation.

So they have to show their worth, and it was publicity that they turned to as a way to show their [00:36:00] worth in the short term, given that their applications would take some time to emerge.

Benjamin James Kuper-Smith: Yeah, exactly. I think that's a good clarification. Like, there was obviously supposed to be some sort of application. But I think, as you said, a lot of how the physics department created shareholder value was by showing: we are going to create the next enormous breakthroughs and paradigm shifts in this field.

And that's how, long-term, the company is going to... It's kind of funny to think about it like that: in some senses, he was part of a publicity division, just one that would potentially have applications in the future. Um, one thing I thought was quite interesting,

and I think that also maybe tells you something about the difference between Bell Labs and his PhD lab. I think early on, when he arrived, basically the main people they wanted couldn't do it at the time, and in the end someone suggested him, and he came over because he had some relevant skills. And if I remember correctly, he did some work, and that was deemed not publishable

by Bell Labs, not good enough. Do you want to say a little bit more about that, about what counts as publishable at Bell Labs? It seems to me that that was the first thing that pushed him towards more outlandish claims.

Eugenie Reich: When Schön arrived at Bell Labs, he was working on organic crystals, trying to measure their transport properties, that is, how electrical charge traveled through them. And the idea of the project was to have very pure crystals that they would make at Bell Labs, and to see charge transport that would make them suitable for gating like a transistor, for switching charge behavior using an electric field, like happens in a transistor.

But they [00:38:00] wanted to do it with the organic crystals, not the inorganic materials like silicon or gallium arsenide and so on that are used in the computer chips of today and of that time. So it was a...

Benjamin James Kuper-Smith: Yeah, so why, that's one thing I didn't quite understand: why did they want to do it with organic stuff, not inorganic stuff? I mean, I have no knowledge at all about chemistry, but it just seemed to me, I don't know, basically: what's wrong with silicon? Or what did they hope they could get out of organic materials?

Eugenie Reich: So, organic materials include plastics, and there was excitement about plastic electronics at that time, and there still sometimes is, because plastics have certain qualities that the inorganic materials used in computer chips don't: they can be very flexible, they can be very colorful.

We could use them in fabrics, we [00:39:00] could use them in very flexible designs like paper or magazines, and they could still be electronics. And right now we're still actually limited to a computer as a hard box that you kind of don't want to drop, that you put on a desk, and even if it's a laptop,

you've got to be careful with it on your lap. So people were interested in having something that can perform computing tasks but doesn't have all these other properties. And organic crystals, they are still hard and rigid and so on, but they were the pure alternative. They were the way to measure the fundamentals of organic materials and to probe where things might go in the future with plastic chips.

And then, do you want me to answer the question you asked before, which was where did he go wrong?

Benjamin James Kuper-Smith: Yeah, I mean, I was just kind of curious about why this stuff was not deemed publishable, and how did that maybe make him [00:40:00] become a bit more extreme in what he was trying to do?

Eugenie Reich: So, as I remember it, Schön was trying to bring organic crystals into different electrical regimes, i.e. to show that charge transport behaved in a number of different ways, particularly to get to a situation where the electrical charges are moving without encountering disorder and impurities in the material, and he couldn't get that data.

And so he was having difficulty publishing. And you can see that he manipulated the tail end of his data sets at that time to get the crystals into the trap-free, the disorder-free regime, to show a purity that likely didn't exist for real. Because you see that the data's manipulated: he's added a tail to it to give the desired physical effect.

And at the same time, there are [00:41:00] reports of his frustration that he couldn't get published, followed by his great satisfaction when he did get published. So we put those together and say that data manipulation is contemporaneous with him overcoming his frustrations at being unable to get published, by nefarious means.

Benjamin James Kuper-Smith: Yeah, and I guess this realization, like: oh, I make a little bit of a change here, a little bit of a change there, and publish in Science, and everyone pats me on the back and is very happy for me, because they obviously think I didn't commit any fraud. But yeah, one thing I'd like to focus on, because this is, I think, one of the big ways in which he managed to get away with the fraud so much,

is that, I don't actually know whether this happened before what we just described or after, but to me one of the big moments was when he, so I [00:42:00] don't, don't ask about the details here, but in his PhD, I believe he was supposed to try and do a certain, uh, electrical thing, I don't know, it's chemistry.

But he was supposed to do a certain thing that they were hoping would be like a big breakthrough, and he couldn't do it in his PhD. And then, relatively shortly after he arrived at Bell Labs, he told his old supervisor in Constance in Germany that he had done it.

And I think that was pretty much his... I mean, I think how he got away with a lot of fraud was basically telling the people in Constance he'd done stuff at Bell Labs that he hadn't done, and, much more importantly, telling all the people at Bell Labs that most of the recordings he'd done, he'd actually done in Constance.

Eugenie Reich: That's right, he did benefit from having a foothold in two labs on two continents. And all the way through 2001, he was reporting breakthroughs as having happened in a different lab. And I think this [00:43:00] is of interest because academics continue to work across labs, particularly junior people. We do have them spending two to three years in one lab and then going to another lab.

So they often are bringing their work from one to the next, and that's all he was doing. But he did take advantage of that to say: oh, my samples are over there and my equipment's over there, so I can't show it to you now. And somehow he never did. So, you know, that's on those institutions, that they didn't pin him down on: where's your equipment, where's your samples, can we have a look, and where's your data?

So these things all went together. I mean, there are funny portions of the book where people are like: oh, what happened to your samples? Oh, they broke in my luggage when I was flying over, and things like that. So he's blaming the airline for not having his samples.

Benjamin James Kuper-Smith: A classic excuse you can still use. [00:44:00] Well, what I find funny is that he kept being able, or people thought he was able, to do these things with this, like, sputtering machine, or whatever it's called, that no one else was able to do, and they thought he was this real wizard. And at some point, I think the supposed sputtering machine in Constance also achieved a kind of mythical acclaim for all the things you could do with it.

And then, you know, once the people at Constance found out, they were like: oh, it's just this regular, you know, random machine here; there's nothing special about it. And I thought it was quite amusing how he basically managed to create an aura not only for himself but also for this random piece of equipment in Constance, where he claimed to have done all these things.

Um, but I just wanted to add a little bit to what you said, that it's still common for people to work between different labs and that kind of stuff. I mean, especially in my phase now as a postdoc, it's so common to, you know, go somewhere for two years and come back again. Or, I mean, there are fellowships that explicitly ask you to be in two countries: learn expertise in this place, then move over to the other, and sometimes come back, and this kind of stuff. So these are some of those things that I felt would be just so [00:45:00] easy to still do today.

This kind of thing of, you know, let's say I run a behavioral experiment, it doesn't work here, and then I start another job in a year, which happens often, and then suddenly I say: oh yeah, we did another version here, and now it works. I mean, that's one-to-one what people can just do so easily these days, and I don't think anything has changed about that.

Eugenie Reich: I think that's right. And it's a very common problem that people join a lab where a previous postdoc or a previous student achieved something and then left, and they can't reproduce it in that same lab, and the person's gone, and they either don't communicate or they don't communicate well, and they can't get them back to demonstrate.

So people are picking up... people are [00:46:00] joining labs. Then and now, people are joining labs based off seeing a Nature publication or a Science publication: wow, this lab does great work, I'll go work there. Arrive, and oops, it turns out that result is known not to replicate in the lab. And they're just sort of left holding the poisoned chalice.

And they're going to be blamed for being incompetent, whereas the person who's long since departed, without sufficient directions or data to enable replication, is sort of the golden child who never comes back to show their work. So that is still happening. And, you know, I think we think that science is replicable, and so we think we can move people around like pawns on a chessboard and everything's going to be okay.

Um, but we're losing [00:47:00] accountability when we lose the staff. So it's the short-term positions that I think are a problem, both because people evade accountability and because people aren't committed and invested in the setup and the lab where they are. They just want to get their ticket out to their next job.

Benjamin James Kuper-Smith: Yeah, there's a lot to be said about short-term positions in academia, and I think there's also a fair amount of benefit to them. But I think the thing that they can do, and that Jan-Hendrik Schön, I think, in general was really, really good at exploiting, is basically this kind of strategic uncertainty between what person A knows and person B doesn't know.

And yeah, that's super easy to... I mean, that just naturally increases when you go from one place to another and back and forth, as he did all the time. Um, I think there's at least one other person we definitely have to talk about. I mean, so Schön, you know, publishes the first Science paper, but it takes a while for people to gain a reputation and these kinds of things, and I think one [00:48:00] reason why Schön's work was taken so seriously is because of a supervisor at Bell Labs. I didn't write his name down. Is it Bertram Batlogg?

I can't remember his first name...

Eugenie Reich: Bertram Batlogg, yeah.

Benjamin James Kuper-Smith: Actually, he's, uh, not exactly down the road, but I'm in Zürich now, and I guess he's retired now, but it's funny to see. I think Schön's PhD supervisor also studied in Zürich. It's kind of random to see. Um,

Eugenie Reich: That's right, he's Swiss. Ernst Bucher.

Benjamin James Kuper-Smith: So, um, maybe: who was Batlogg, and how did... It seems to me he's also an interesting character in this whole thing. Yeah, maybe tell us a little about who he was, his reputation, and how that helped create Schön's reputation to some extent.

Eugenie Reich: So Batlogg, um, Bertram Batlogg, built his reputation in the late 80s, publishing work in a small collaboration of three physicists, [00:49:00] or two physicists and a chemist, on high-temperature superconducting materials. There was a kind of bubble around those materials, and he was at the forefront of that field, among others. It was very competitive, and a little slapdash in the way that everyone in the field went about things.

Um, but he excelled at that time, and he was well known for that work. So he had a great reputation, and Bell Labs had a great reputation, and that caused people to give Schön a pass when others couldn't replicate his work initially, because they figured: he's at a better institution than me.

He's got a superb supervisor, better than my supervisor. His materials are probably better. His institution, as I said, is better. And eventually, as you've alluded to, it became: and his equipment's better. You [00:50:00] know, and his methods are better. Maybe he has golden hands, magical hands that are better.

So it becomes a sort of, what's now known as, the virtuoso defense. Like: why can't I sing an opera like you? Because you're not a virtuoso. So Bertram Batlogg was part of Schön's virtuoso defense: well, I have Batlogg, Batlogg signed off. And it mattered at the journals, it mattered when papers went out for review, it mattered at conference presentations.

If he'd been at a small university with a supervisor no one had heard of, I don't think he would have got into these major journals, certainly not after replicability issues were known.

Benjamin James Kuper-Smith: I found Batlogg interesting in the sense that, I mean, I guess by this stage in his career, as many academics do as they become more senior, he had become largely a manager, it seems to me. I mean, correct me if I'm wrong, but I don't think he really did much active science in that [00:51:00] sense. I think he, you know, supervised his postdocs,

and then they did, or in Schön's case didn't, actually do the science. Um, yeah, how should I put it? I find his role really interesting because it seems to me that he just trusted Schön. I guess maybe that's a positive way of framing it. It seems to me that it took a really long time for him to actually start critically looking at what Schön was doing.

I think he basically just assumed that Schön was really good and everything was true. And basically, I guess to some extent I'm asking about the role of the supervisor of a fraudster in this sense. Because, I mean, there's always the question: why didn't he get caught earlier? And the people closest to him, why didn't they notice earlier?

Um, yeah.

Eugenie Reich: So Batlogg in some sense is an archetypal big PI, [00:52:00] of the type we have in some labs now, where he's not doing hands-on experiments anymore. He did do, or he says he did, some hands-on work with Schön at the very beginning, to see that Schön knew how to put electrical leads on a sample and that he seemed competent in the lab.

He was very busy presenting research and data and giving feedback on data to Schön, but he wasn't very involved in observing Schön in the lab after a certain point. And I think by the time he started raising questions of his own with Schön, it was out of his control. He had thirteen, twelve, ten major papers with his name on them that had gone out the door.

So he didn't have evidence, he wouldn't have had [00:53:00] evidence, to retract them or disown them, which is a big step, especially doing that to your co-authors; but he didn't have strong enough evidence to stand by them either. And I think there was a point when he would have been well advised to hold off on promoting any more, and, until it had been replicated, to hold off on submissions.

And there's some evidence that in 2001 he started leaning in a bit more to another collaboration he had, which I think was connected to ETH, where he got his position that year, and just sort of diversifying his activities, his research, away from Schön, in a way that is more troubling, because people would have liked to see him go into the lab and work full time himself to replicate it, and put the word out that he agreed it was a problem. And I think he could have helped slow the downward motion a little bit of the rolling ball that was Schön.

But, you know, one understands that he didn't have evidence for fraud either. I think there's evidence that he somewhat misstated things when he presented the work publicly. I did find a recording of him where he's asked about an experiment and says in response something like: well, when I increase the field... or something like that.

Well, that experiment wasn't something that he, the "I", did. He meant when Schön did it, or when we do it. But he would say "I". And I think he probably thought he was slightly misstating to be efficient in that conversation. He didn't think he was covering up a major fraud, but he would probably have known he was somewhat simplifying the picture he was giving.

And he would have [00:55:00] thought it justified to slightly simplify what he was saying, because he's speaking to a group and they need to get the message quickly, and so on. But there are all kinds of complicated credit issues when you do that. And yes, it takes a while to explain who's doing what in the research and how you're involved every time you present the work;

a lot of PIs don't do it, but it probably is the proper way, because then you're not inadvertently endorsing something you can't personally endorse. So there are behaviors like that that I think contributed to what happened.

Benjamin James Kuper-Smith: Yeah. I think he definitely didn't mind taking credit for a lot of the stuff that Schön did, which is not completely unexpected. Anyway, I mean, I don't want to judge people too much, but he did seem quite negligent, happy to take the positives of it, and then, [00:56:00] uh,

also happy to let Schön take the fall afterwards. Um, but I'm starting to see that we're slightly running out of time. So, well, I mean, people who are interested can read your book, and I would recommend it. I guess we have to be a little quicker on some of the topics now.

Um, one thing I found also kind of interesting is the general atmosphere... no, that's the wrong word. The general setting of Bell Labs, and how the parent company Lucent Technologies was struggling financially. I thought it was really interesting how that potentially also influenced this whole thing.

So, I mean, to put it into context: all of Schön's major publications come from like 1998 to 2001, something like that, [00:57:00] a period in which we had the dot-com crash, or bust, or whatever it's called. Then 9/11 happened, and I believe Bell Labs wasn't far away from it, and Lucent Technologies really struggled.

And in the midst of that, Jan-Hendrik Schön was doing research that was getting a lot of attention and cost them practically no money, because he was basically doing all of the stuff in Constance in Germany. Um, yeah, I was hoping you could expand a little bit on the whole setting, and how Schön was maybe hyped a bit more by Bell Labs and their publicity team than he would have been had Lucent Technologies been a very profitable company at the time.

Eugenie Reich: It's definitely true that with the dot-com collapse, I think it was around 2000, the dot-com companies began to experience slides in their stock price, including Lucent [00:58:00] Technologies, when the promise of the internet boom turned out to be inflated relative to what the companies were able to deliver at that time, and investors that had snapped up stock in these companies began to change their minds.

So Lucent stock was falling. And the background is that Lucent Technologies was a daughter company of AT&T, which was a telephone monopoly, really the national telephone company of the United States. But it had been privatized and broken up into lots of smaller companies, which made it much less economically sustainable to run and generously fund and support curiosity-driven research that wouldn't see applications for fifteen, twenty years.

So just as the physicists were trying to justify themselves in terms of publicity, [00:59:00] the parent company began to need content that it could use to tell investors to hold the stock. Like: we're not a fly-by-night dot-com company, we are a company with stuff in the hopper for later, so don't sell your stock, hold it.

We should be one of the buy-and-hold companies. So I think that's the type of message that they wanted to put out. And so even though Schön's work was known to be far from applications, they still wanted to promote it as part of: here's the value that we might deliver in the future. And he did end up

being profiled in late 2001 on page three of the annual report to investors. Um, I don't think there's probably a single investor who was influenced by that. You know, I [01:00:00] don't think investors were selling the stock because they knew something the physics community didn't.

And I don't think there were investors who weren't selling the stock because they were significantly influenced by the idea that there'd be a plastic electronic chip in fifteen years. But I do think that in the minds of people under pressure, and the economic pressure was such that two out of three lights in the corridors didn't have bulbs, and people were not able to get the equipment they wanted, and budgets were being cut...

People under economic pressure, managers under pressure in the research division, would sort of grasp for that. So, you know, I don't think any investor could really have brought a securities fraud lawsuit because of what Schön did, but it certainly affected things. And I think that's important, because right now we have a research community that is under funding pressure all the time, [01:01:00] under pressure to overstate its results, just as the Bell Labs managers were then.

Benjamin James Kuper-Smith: We've talked a little bit about how he managed to get away with the fraud, part of it being, you know, that he could do a lot of the studies himself. We didn't mention it, but lots of it he supposedly did at night. Um, we mentioned the two countries, going back and forth, and lots of ways in which he managed to get away with it.

But I thought the most genuinely ingenious way that he managed to get away with fraud was the way in which he used his colleagues to improve his papers. I can't remember exactly what the quote was, I didn't write it down, but there was this quote somewhere from one of his close colleagues, who said something like: I couldn't believe someone could have committed this much fraud on his own; then I realized he hadn't, we had all contributed to it, or something like that. Um, yeah, can you maybe say how exactly he used his [01:02:00] unique listening skills, basically just sitting there and taking notes whilst people talked about science, to improve his papers and basically commit fraud that was genuinely believable and seemed very good?


Eugenie Reich: You could think of Schön as a sort of mirror of the fantasies of the theoretical physicists of Bell Labs, because he would imbibe their ideas and then generate data to please them. And he would shop his data around amongst them to get feedback and criticism that he would use to refine the data and then show it to someone else.

And people would say: well, it can't be fraud, because he himself doesn't understand the effect in the data. Well, he'd spoken to someone else who had told him how it should look, and then he'd take it to someone else and say: I got this, you know, what does it mean? Um, so, you know, [01:03:00] I think the theorists created a monster by generously giving of their time

and critiques to him. And I think the same happened, unfortunately, in the experimental community. As people started replicating the work, they would tell him: well, we've tried these settings and we don't see anything. And he'd say: well, try this. And then someone else would say to him: I've tried this and it doesn't work.

And he'd say: well, don't do that. So he would use information from others as feedback for someone else. And so he was like a clearinghouse for feedback on the topic of his claims. Even criticism, even sharp criticism, was still constructive feedback for him. And so that's quite relevant when you're thinking about strategies of engagement with people whose work you're very suspicious of.

[01:04:00] Giving private feedback could backfire on you, because they can fix the problem before anyone else sees that problem. And then you're left saying: well, I told him this couldn't be right in this private conversation last year. Sometimes I'll be saying: well, wait till it gets into the journal.

But, you know, that's about how to deal with it, for sure. He was using everybody.

Benjamin James Kuper-Smith: mean, it basically seemed to me that he was, he was, As a lot of people commented on, and some people didn't like about him, he was also like completely non confrontational, when people asked him about a problem, he would just like not say much, and let other people speak, and I guess he just absorbed the information, and then by the next day, had supposedly done another experiment to show exactly, you know, I mean, it's, it's, it's fascinating that then people go like, well, if we do this precise experiment this way, then we can find out why is this or that, and then, you know, next day, like, well, I did it.

Here it is. And then, you know, another Nature paper. Here we go. Um, as I said, it's just a pretty, pretty good way to do it. Yeah, so to bring the Schön story to an end, I mean, I guess the downfall, or how it came about... In a sense, there were individual moments, but it seemed to me it was kind of just this growing mass of people becoming aware that something wasn't quite right. Um, yeah, do you want to maybe summarize it briefly? It's like what I said earlier: whilst I was rereading the book to prepare for this interview, I kept going, okay, we really have to write this down, because now it's going to come to an end, but then there were another three stories in between where it just kept growing and growing.

Uh, maybe pick your favorite moments, or what you think is most symbolic of how it fell apart.

Eugenie Reich: Obviously, it's notorious, even before my book, that the way the Schön fraud came to light [01:06:00] was that independent scientists noticed duplicated data spread throughout his work. Before that, some of the things that people did that contributed to that outcome were things like asking for more data, and asking for more data of a certain type.

There was a reviewer of one of his Nature submissions who wanted more data from more organic molecules, and Schön added it in review, and that turned out to be an instance of data duplication in the same paper. So the bad story there is that the reviewer was satisfied and the paper was published.

The good story there is that the review process generated some low-hanging fruit for someone with a different mindset to come along and find. And I think the lesson of the way, um, the situation built up is that people were pressing their questions, asking for data, asking [01:07:00] for details, trying to get things on record.

In a manuscript, he claimed he sputtered aluminium oxide in a way no one else could do, and they had him write that down, put detail in, and send it around to people in the community. These things may seem like they have nothing to do with fraud detection, but the more information someone has to share about their work, the harder it becomes not to make a mistake and produce something that's just a complete smoking gun for, for lying.

So he was telling a web of lies, but I think the message for the community is: ask enough questions, either to replicate work if it's replicable, or to, uh, you know, find the problem if it isn't. And sometimes that problem is going to be fraud. So once someone has given enough statements, if they are lies, it's a web of lies [01:08:00] that's easier to catch.

Um, and I think that when people give each other a pass on details and data, and leave things vague, this is really where fraud can flourish and succeed and go undetected. So, yes, it was found by a sort of accidental discovery of a type of duplication that was very stupid for him to do.

On the other hand, he was being made to perform so many tasks and answer so many questions that he was probably more likely than not to produce something stupid at some point.

Benjamin James Kuper-Smith: Yeah. Um, so there was one scene that really stayed with me, uh, that you described towards the end when it all fell apart. And I'm curious what you make of this, because I kind of felt really sorry for him in this moment and I'm not entirely sure why. So you wrote... what was it? Uh, I think they'd already said they were going to form this committee to figure it out, you know, basically they had him [01:09:00] already. Um, but I think one of his supervisors or something like that took him aside and said, like, hey, you know, if you can replicate these things, that might be the only way you can save yourself.

And then you wrote, um: on May 14th, Schön arrived in the next-door lab of David Muller. He was carrying a sample, about two years after Muller had first asked him for one. In Schön's hand were several pieces of a glass slide onto which he had deposited a molybdenum electrode and sputtered a layer of aluminum oxide.

And there was just something about this image of this man, his entire life falling apart, with these, you know, pieces of glass in his hand, just desperately trying to get something out of it. Um, I don't know. And this is also in part, I think, why I felt like maybe he actually believed some of the stuff he reported, like genuinely, because, I mean, why even try at this point?

I don't know. I was just curious what you made of that scene, because for me, I don't know, when I, when I read it, I somehow felt kind of really sorry for him at this moment.

Eugenie Reich: Well, I mean, this is an act of [01:10:00] desperation. To give someone like David Muller, who has since become very famous as an electron microscopist at Cornell, a sample of a material and try to tell him it's something other than it is, is certainly an act of desperation. Um, I don't know if, you know, I share your sort of sympathy, uh, for him.

I suppose it's not nice to see anybody drowning. Um, but, you know, you do seem to have a certain sympathy for him, which is interesting, because, I didn't know you prior to this interview that you asked me to do, but you've said a couple of things along the lines of sympathy with Schön, and that's very curious, but you're not the only reader of my book to have those ideas.

Benjamin James Kuper-Smith: I think it's because you also wrote, fairly early on, right, that he just seemed incredibly [01:11:00] naive about how science worked. And there's also the question of to what extent that is him crafting an image, or just again trying to tell a story to get away with it. But, what was it?

I think, like, very early on you described that someone criticized him as just thinking that if something was in a science textbook then it was correct and that's the way it was. And, I don't know, I guess, also because you described a few times that, in terms of personality, he isn't the stereotype of what we might think of as a fraudster, someone who's, you know, bombastic and always pushes himself into the foreground, these kinds of things. He just seemed like a very quiet guy who was incredibly naive. But maybe I'm just being naive in believing that story of him.

Eugenie Reich: Well, I, I think, uh, I think Schön told lies because he was lacking a certain set of scientific skills that we want scientists to have, and that many, maybe most, scientists do have, which [01:12:00] is things like, uh, curiosity about the way science really is, and the dedication and the sort of ability to dig in and keep turning rocks over, looking for, um, more information to explain what they see rather than assuming they know.

So when someone doesn't get the result they're expected to get, the ability to find out why, to interrogate what one did oneself, and to go back over one's workflow and modify and incrementally improve. And these are things that it appears people learn best on the job, with a slightly more, or very much more, experienced scientist helping them do that.

And if they don't get that training, and they don't learn how to modify their own work to troubleshoot their own errors and check their own statements, um, and if they don't learn how to investigate and [01:13:00] interrogate nature, then they're kind of lost in science, and fraud is something that they might turn to, you know, instead of failure, instead of allowing themselves to fail.

So I do feel sorry for people in the scientific community who never get that training and don't get that capability and don't become competent. It's not even about excellence; it's about becoming, you know, competent as a scientist and getting to grips with the bread and butter of what scientists do.

And it is a very scary place to be in the scientific community, especially somewhere like Bell Labs, if you don't have those skills.

Benjamin James Kuper-Smith: Yeah, definitely. Um, basically, what happened to Schön, who I somehow seemed to feel some sympathy for? Um, initially, I mean, the report was very clear about what he did. Um, I believe he lost more or less everything. [01:14:00] Um, do you know what he did afterwards? It's been

Whatever, 20 years since this happened.

Eugenie Reich: So I, I followed Schön into a couple of other companies, in Germany and then I think Norway, mostly through online sources, where he was working, I think, in an engineering capacity, but I don't know what he's doing now. I find it fascinating that people are so fascinated by what he's doing. Uh, some people who spend their lives in the scientific community, you know, really cannot contemplate how you can continue to exist after being kicked out of it.

Um, but amazingly, you can, and there are plenty of jobs available for someone who does, uh, what he did. So, you know, people who say they really want rehabilitation, you know, don't be too harsh if someone commits fraud, and who set a high bar for bringing allegations, [01:15:00] um, they sometimes talk about.

You know, the level of proof needed to demonstrate scientific fraud, as if it's higher even than to prove murder in a court of law. And I do wonder about people setting such a high bar, whether they realize that, you know, there are other jobs that people can go on to and do quite well if they're not suited to, um, you know, scientific work.

Benjamin James Kuper-Smith: Well, I guess, uh, so this is actually one thing where, even though, as I said, I did occasionally feel some sympathy with him, uh, I think the reason I did is because he just seemed so incompetent in many crucial ways, in, you know, just the fundamentals of how science works. And I'm genuinely surprised that anyone would hire him, because it seems to me, I mean, he's clearly shown that.

Yeah, I don't know, like, I mean, maybe as like a press release or if someone checks that he didn't include anything that's blatantly wrong, but, um, Yeah, I'm just surprised that [01:16:00] people would trust him if he works in some sort of engineering capacity or something like that. Um,

Eugenie Reich: Well, well, I mean, hang on. We know that he was engaged in some data manipulation, but less extreme, in his PhD. He knew people in his PhD who knew his work and knew that he could do some work. It's a relatively small period, from 1999 until 2002, that he was provably misbehaving in. I believe he did lose his PhD, uh, through the German courts, and so he's not going to be at the level that a German PhD would be at.

On the other hand, um, I don't think there's evidence that, you know, he cannot do anything productive. And there's no reason why he couldn't get another qualification of a technical kind. So, would I want to sort of ride in a vehicle that, that he designed and tested? Um, probably not. [01:17:00] Um, but do I think he can't ever be productive in, in a, in a functioning organisation with appropriate checks and balances?

Actually, I wouldn't say that either. Uh, he was clearly very intelligent, he wanted to be a team player, and he was very hardworking.

Benjamin James Kuper-Smith: Yeah, no, exactly. Yeah, I mean, yeah, I didn't want to say that I thought he was completely incompetent. It just seemed to me that, like, it would be hard to, to trust him and that, like, I mean, again, someone who says, like, I didn't know how to save my data. I don't know. Yeah, there's just, like, some fundamentals missing, but I don't know.

I guess if he has some checks and balances, then that's that, and then it can work. Okay, so, I mean, I think I promised in the beginning of the episode we would talk about the legal stuff, uh, your current legal work, because you now, uh, work as a lawyer

defending whistleblowers. Um, we ran out of time, but luckily, uh, you've kindly agreed to do another episode, so I guess we'll do that [01:18:00] at a different time. So there'll be another episode where we'll talk all about, uh, those things. Um, yeah, I thought I'd just end this episode with the recurring questions that I always ask my guests.

Uh, the first is what is a book or paper you think more people should read? Can be famous, not famous at all, uh, old or new. It doesn't really matter, just something that you would recommend people to read.

Eugenie Reich: So I really enjoyed, um, a chapter in a book called Leviathan and the Air-Pump, which is by Shapin and Schaffer, who are two historians of science. And Leviathan and the Air-Pump, um, I do talk about it in my book. The air pump was, uh, the device that was used to demonstrate the relationship between pressure and volume in a gas.

And it's really the first time that the scientific community in Europe started to work on replicability, trying to understand why air pumps, [01:19:00] which were huge apparatuses filling up a big room, sometimes operated differently in the Netherlands than they did in London, England. And, uh, people got super interested in the idea that there are universal laws affecting the behavior of gases that are the same in both of those countries.

So, you know, at a time when there was all this sort of nationalistic conflict, which I guess there still is, but even more so among colonial powers, um, people were beginning to realize that science was a more universal thing, that scientific phenomena were the same, um, in different places. And replicability was a way to ensure the universality of the knowledge that comes through scientific endeavour.

So I love that book and I don't remember which chapter it is, but there's a chapter describing all the people building the air pumps and how they communicate with each other and [01:20:00] really want to find out if the air pumps are doing the same thing in the two places.

Benjamin James Kuper-Smith: Yeah, I remember, yeah, when you, when you mentioned that, so that seems familiar, and then I remember, yeah, you, you mentioned it, and I, I think I put it in my Amazon cart for later, uh, because it did seem very interesting when you described it. Yeah, sounds interesting. Uh, second question is, uh, what's something you wish you'd learned sooner?

This can be from your work life or your private life, I don't really care, just something where you think, if I'd learned that sooner, then that probably would have helped me out quite a bit. And, uh, if you're willing to share, maybe how you found it out, and what you're doing about it, how you, you know, solved it or dealt with it or whatever.

Eugenie Reich: So I think the thing I wish I'd, um, taken on board sooner is the extent to which, um, money, um, and institutional interests around fundraising and maintaining funding [01:21:00] influence people working in institutions. Um, including myself, when I was in any organization or company, or in education. And I think everybody else too, because, um, if you have come from a background where you've been lucky, and I did, where you've never wanted for any of the basic needs of life, where you've had enough

You know, enough money to have food and shelter, and be quite comfortable, and pursue an education as you want. If you've had that, then I don't think it's visceral to you the extent to which money and accumulations of capital are affecting people's behavior. And then, at some point when I went to law school, which was as a much older person, I began to see the extent to which [01:22:00] concentrations of wealth that we see in, you know, some of our institutions, including bastions of academic freedom like universities

That does affect the thought that goes on in them, um, and the power dynamics between people. So that's a very general thing, and it's something that we all kind of think we know, but, you know, you're kind of naive when you haven't had to grapple with poverty. I'm lucky I hadn't.

And you kind of think that you're going to rise above that as an intellectual and as a free thinker. But so often people think they've risen above those material things and they haven't, and they come smacking back into it when they try to have a thought that is directly in conflict with the interests of institutions that they're a part of.

Benjamin James Kuper-Smith: Yeah. Uh, and the final question is... oh, [01:23:00] yeah, for a second I forgot what it was. Uh, yeah, usually, because I just finished my PhD, I ask about, uh, you know, advice for people in this kind of period, who are latter-stage PhD students or early postdocs. Uh, but usually also I interview people who are actively in academia right now.

Uh, so feel free to take this question maybe a bit more metaphorically, in the sense of, maybe, people in, uh, you know, this kind of transitory period where you can maybe change from one thing into the next, or, uh, where it's possible to change directions a bit, um, or question whether you want to stay on the path at all. Um, or take it literally, whatever you want.

Eugenie Reich: So I think you're asking if I have advice for postdocs and early-career scientists. Yeah. It's along the lines of the answer I just gave, which is to remember that your personal interests are not the interests of the institution you're in, they're not the interests of your PI and your supervisor, and [01:24:00] those interests are also not the same as the important interests of scientific integrity and knowledge creation. Um, those are all different things.

And I think the scariest thing that you can encounter as a young person is well-meaning advice, because, uh, well-intended advice will come from the perspective of someone you like and trust, and they may be saying: keep your head down, don't stick your neck out, pick your battles, this is not the thing, uh, to put everything on the line for. And they'll be telling you to modify

your thoughts this way or that way. And of course there's a huge amount of wisdom in advice that you follow all the time. But just remember that, you know, well-meaning advice is sometimes, uh, far more dangerous than caustic criticism, [01:25:00] which at least is being honest about the fact that it's adverse to you. So.

Benjamin James Kuper-Smith: Yeah, now I'm confused. I am assuming your advice was what I intended, but I guess sometimes it can get.

Eugenie Reich: my advice? I know, uh, did you say my advice? Yes.

Benjamin James Kuper-Smith: Well, there's a slight, uh, paradox that I always try to,

Eugenie Reich: Yeah, there is a paradox there. And I worry about it as a person who's in the legal advice business now, that, you know, it's very easy for a lawyer to be way too conservative and to be giving out keep-your-head-down advice, because that is the way to reduce risk. Um, so, you know, many people in the advice business are basically, in a well-meaning way, going around sort of chilling what anybody would ever do.

Yeah, it's a problem. But no, I'm thinking about maybe a mentor, who, you know, a good mentor, they help you, they guide you, [01:26:00] uh, you know, they instruct you, they train you, and there's a certain point when the right thing for them to do is to let you fly, because it's the time when it's not right to listen to them anymore.

So that's the kind of idea I'm, I'm thinking.

Benjamin James Kuper-Smith: And like recognize when the interests of the mentor and you might not be exactly identical.

Eugenie Reich: Right.

Benjamin James Kuper-Smith: Okay. Well, thank you very much. This was great. Uh, as I said, we'll try and schedule another one to actually talk about the other half of the stuff we wanted to talk about. Um, so yeah, thank you very much.