Some thoughts after reading “Bad Blood: Secrets and Lies in a Silicon Valley Startup”

I just read the above-titled John Carreyrou book, and it’s as excellent as everyone says it is. I suppose it’s the mark of any compelling story that it will bring to mind other things you’ve been thinking about, and in this case I saw many connections between the story of Theranos—a company that raised billions of dollars based on fake lab tests—and various examples of junk science that we’ve been discussing for the past ten years or so.

Before going on, let me emphasize three things:

1. Theranos was more fake than I’d realized. On the back cover of “Bad Blood” is this quotation from journalist Bethany McLean:

No matter how bad you think the Theranos story was, you’ll learn that the reality was actually far worse.

Indeed. Before reading Carreyrou’s book, I had the vague idea that Theranos had some high-tech ideas that didn’t work out, and that they’d covered up their failures in a fraudulent way. The “fraudulent” part seems about right, but it seems that they didn’t have any high-tech ideas at all!

Their claim was that they could do blood tests without a needle, just using a drop of blood from your finger.

And how did they do this? They took the blood, diluted it, then ran existing blood tests. Pretty obvious, huh? Well, there’s a reason why other companies weren’t doing this: you can’t do 100 tests on a single drop of blood. Or, to put it another way, you can’t do 1 test on 1/100th of a drop of blood: there’s just not enough there, using conventional assay technology.

I think that, with care in data collection and analysis, you could do a lot better than standard practice—I’d guess that it wouldn’t be hard at all to reduce the amount of blood needed by a factor of 2, by designing and analyzing your assays more efficiently (see here, for example, where we talk about all the information available from measurements that are purportedly “below detection limit”). But 100 tests from one drop of blood: no way.
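
To give a sense of what I mean about measurements “below detection limit,” here’s a minimal sketch of a censored-data fit (a toy model with made-up numbers, not the analysis from the linked post): readings below the limit are kept in the likelihood as censored observations rather than thrown away, and the estimate comes out much less biased than if you just drop them.

```python
# Sketch: measurements censored at a detection limit still carry information.
# We fit a normal model by maximum likelihood, treating "below limit" readings
# as censored observations rather than discarding them. All numbers are made up.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
true_mean, true_sd, limit = 2.0, 1.0, 1.5
y = rng.normal(true_mean, true_sd, size=200)
observed = y[y >= limit]          # values the assay reports numerically
n_censored = np.sum(y < limit)    # readings reported only as "< limit"

def neg_log_lik(params):
    mu, log_sd = params
    sd = np.exp(log_sd)
    ll = stats.norm.logpdf(observed, mu, sd).sum()        # fully observed values
    ll += n_censored * stats.norm.logcdf(limit, mu, sd)   # "below limit" values
    return -ll

fit = optimize.minimize(neg_log_lik, x0=[observed.mean(), 0.0])
naive = observed.mean()  # estimate that simply drops the censored readings
print(f"true mean {true_mean:.2f}, censored-likelihood estimate {fit.x[0]:.2f}, "
      f"drop-the-censored estimate {naive:.2f}")
```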

So I’d just assumed Theranos was using a new technology entirely, maybe something with gene sequencing or microproteins or some other idea I’d never heard of. No, not at all. What they actually had was an opaque box containing several little assay setups, with a mechanical robot arm to grab and squeeze the pipette to pass around the blood. Unsurprisingly, the machine broke down all the time. But even if it worked perfectly, it was a stupid hack. Or, I should say, stupid from the standpoint of measuring blood; not so stupid from the standpoint of conning investors.

You’ve heard about that faked moon landing, right? Well, Theranos really was the fake moon landing. They got billions of dollars for, basically, nothing.

So, yeah, the reality was indeed far worse than I’d thought!

It would be as if Stan couldn’t really fit any models, as if what we called “Stan” was just an empty program that scanned in models, ran them in Bugs, and then made up values of R-hat in order to mimic convergence. Some key differences between Stan and Theranos: (a) we didn’t do that, (b) Stan is open source so anyone could check that we didn’t do that by running models themselves, and (c) nobody gave us a billion dollars. Unfortunately, (c) may be in part a consequence of (a) and (b): Theranos built a unicorn, and we just built a better horse. You can get more money for a unicorn, even though—or especially because—unicorns don’t exist.

2. Clarke’s Law. Remember Clarke’s Third Law: Any sufficiently crappy research is indistinguishable from fraud. Theranos was an out-and-out fraud, as has been the case for some high-profile examples of junk science. In other cases, scientists started out by making inadvertent mistakes and then only later moved to unethical behavior of covering up or denying their errors. And there are other situations where there is enough confusion in the literature that scientists could push around noise and get meaningless results, possibly without ever realizing they were doing anything wrong. From the standpoint of reality, it hardly matters. The Theranos story is stunning, but from the perspective of understanding science, I don’t really care so much whether people are actively cheating, deluding themselves, or something in between. For example, when that disgraced primatologist at Harvard was refusing to let other people look at his videotapes, was this the action of a cheater who didn’t want to get caught, or just a true believer who didn’t trust unbelievers to evaluate his evidence? I don’t care: either way, it’s bad science.

3. The role of individual personalities. There are a lot of shaky business plans out there. It seems that what kept Theranos afloat for so long was the combination of Elizabeth Holmes, Ramesh Balwani, and David Boies, three leaders who together managed to apply some mixture of charisma, money, unscrupulousness, and intimidation to keep the narrative alive in contradiction to all the evidence. It would be nearly impossible to tell the story of Theranos without the story of these personalities, in the same way that it would be difficult to try to understand the disaster that was Cornell University’s Food and Brand Lab without considering the motivations of its leader. People matter, and it took a huge amount of effort for Holmes, Balwani, and their various cheerleaders and hired guns, to keep their inherently unstable story from exploding.

That said, my own interest in Theranos, as in junk science, is not so much on the charismatic and perhaps self-deluded manipulators, or on the disgusting things people will do for money or prestige (consider the actions of Theranos’s legal team to intimidate various people who were concerned about all the lying going on).

Rather, I’m interested in the social processes by which obviously ridiculous statements just sit there, unchallenged and even actively supported, for years, by people who really should know better. Part of this is the way that liars violate norms—we expect scientists to tell the truth, so a lie can stand a long time before it is fully checked—part of it is wishful thinking, and part of it seems to be an attitude by people who are already overcommitted to a bad idea to protect their investment, if necessary by attacking truth-tellers who dispute their claims.

Bad Blood

OK, on to the book. There seem to have been two ingredients that allowed Theranos to work. And neither of these ingredients involved technology or medicine. No, the two things were:

1. Control of the narrative.

2. Powerful friends.

Neither of these came for free. Theranos’s leaders had to work hard, for long hours, for years and years, to maintain control of the story and to attract and maintain powerful friends. And they needed to be willing to lie.

One thing I really appreciated about Carreyrou’s telling of the tale was the respect he gives to the whistleblowers, people who told the truth and often were attacked for their troubles. Each of them is a real person with complexities, decisions, and a life of his or her own. Sometimes in such stories there’s such a focus on the perpetrators that the dissenters and whistleblowers are presented just as obstacles in the way of someone’s stunning success. Carreyrou doesn’t do that; he treats the critics with the respect they deserve.

When reading through the book, I took a lot of little notes, which I’ll share here.

p.35: “Avie had asked more pointed questions about the pharmaceutical deals and been told they were held up in legal review. When he’d asked to see the contracts, Elizabeth had said she didn’t have the copies readily available.” This reminds me of how much of success comes from simply outlasting the other side, from having the chutzpah to lie and just carrying it out, over and over again.

p.37: various early warnings arise, suspicious and odd patterns. What strikes me is how clear this all was. It really is an Emperor’s New Clothes situation in which, from the outside, the problems are obvious. Kind of like a lot of junk science, for example that “critical positivity ratio” stuff that was clearly ridiculous from the start. Or that ESP paper that was published in JPSP, or the Bible Code paper published in Statistical Science. None of these cases were at all complicated; it was just bad judgment to take them seriously in the first place.

p.38: “By midafternoon, Ana had made up her mind. She wrote up a brief resignation letter and printed out two copies . . . Elizabeth emailed her back thirty minutes later, asking her to please call her on her cell phone. Ana ignored her request. She was done with Theranos.” Exit, voice, and loyalty. It’s no fun fighting people who don’t fight fair; easier just to withdraw. That’s what I’ve done when I’ve had colleagues who plagiarize, or fudge their data, or more generally don’t seem to really care if their answers make sense. I walk away, and sometimes these colleagues can then find new suckers to fool.

p.42: “Elizabeth was conferenced in by phone from Switzerland, where she was conducting a second demonstration for Novartis some fourteen months after the faked one that had led to Henry Mosley’s departure.” And this happened in January 2008. What’s amazing here is how long it took for all this to happen. Theranos faked a test in 2006, causing one of its chief executives to leave—but it wasn’t until nearly ten years later that this all caught up to them. Here I’m reminded of Cornell’s Food and Brand Lab, where problems had been identified several years before the scandal finally broke.

p.60: Holmes gave an interview to NPR’s “BioTech Nation” back in 2005! I guess I shouldn’t be surprised that NPR got sucked into this one: they seem to fall for just about anything.

p.73: Dilution assays! Statistics is a small world. Funny to think that my colleagues and I have worked on a problem that came up in this story.

p.75: “Chelsea’s job was to warm up the samples, put them in the cartridges, slot the cartridges into the readers, and see if they tested positive for the virus.” This is so amusingly low-tech! Really I think the best analogy here is the original “Mechanical Turk,” that supposedly automatic chess-playing device from the 1700s that was really operated by a human hiding inside the machine.

p.86: “Hunter was beginning to grow suspicious.” And p.86: “The red flags were piling up.” This was still just in 2010! Still many years for the story to play out. It’s as if Wile E. Coyote had run off the cliff and was standing midair for the greater part of a decade before finally crashing into the Arizona desert floor.

p.88: “Walgreens suffered from a severe case of FoMO—the fear of missing out.” Also there was a selection effect. Over the years, lots of potential investors decided not to go with Theranos—but they didn’t matter. Theranos was able to go with just the positive views and filter out the negative. Here again I see an analogy to junk science: Get a friend on the editorial board or a couple of lucky reviews and you can get your paper published in a top scientific journal. Get negative reviews, and just submit somewhere else. Once your paper is accepted for publication, you can publicize. Some in the press will promote your work, others will be skeptical—but the skeptics might not bother to share their skepticism with the world. In that way, the noisiness and selection of scientific publication and publicity have the effect of converting variation into a positive expectation, hence rewarding big claims even if they only have weak empirical support.
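
Here’s a toy simulation of that last point—selection converting pure noise into a positive published “effect.” The setup is entirely invented; it’s just to show the mechanism:

```python
# Toy simulation: a true effect of zero, noisy studies, and a publication
# filter that only lets through large positive estimates. The published
# record then shows a solidly "positive" effect. All settings are invented.
import numpy as np

rng = np.random.default_rng(0)
true_effect = 0.0
se = 1.0                           # standard error of each study's estimate
estimates = rng.normal(true_effect, se, size=10_000)

published = estimates[estimates / se > 1.96]   # only "significant" positives appear
print(f"share of studies published: {published.size / estimates.size:.1%}")
print(f"average published estimate: {published.mean():.2f} (true effect is 0)")
```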

p.97: “To be sure, there were already portable blood analyzers on the market. One of them, a device that looked like a small ATM called the Piccolo Xpress, could perform thirty-one different blood tests and produce results in as little as twelve minutes.” Hey: so it already existed! It seems that the only real advantage Theranos had over Piccolo Xpress was a willingness to lie. I guess that’s worth a lot.

p.98: “Nepotism at Theranos took on a new dimension in the spring of 2011 when Elizabeth hired her younger brother, Christian, as associate director of project management. . . . Christian had none of his sister’s ambition and drive; he was a regular guy who liked to watch sports, chase girls, and party with friends. After graduating from Duke University in 2009, he’d worked as an analyst at a Washington, D.C., firm that advised corporations about best practices.” That’s just too funny. What better qualifications to advise corporations about best practices, right?

p.120: It seems that in 2011, Theranos wanted to deploy its devices for the military in Afghanistan. But the devices didn’t exist! It makes me wonder what Theranos was aiming for. My guess is that they wanted to get some sheet of paper representing military approval, so they could then claim they were using their devices in Afghanistan, a claim which they could then use to raise more money from suckers in Silicon Valley and Wall Street. If Theranos had actually received the damn contract, they’d’ve had to come up with some excuse for why they couldn’t fulfill it.

p.139: Aggressive pitbull lawyer David Boies “had accepted stock in lieu of his regular fees”! Amusing to see that he got conned too. Funny, as he probably saw himself as a hard-headed, street-smart man of the world. Or maybe he didn’t get conned at all; maybe he dumped some of that stock while it was still worth something.

p.151: Theranos insists on “absolute secrecy . . . need to protect their valuable intellectual property.” This one’s clever, kind of like an old-school detective story! By treating the MacGuffin as if it has value, we make it valuable. The funny thing is, academic researchers typically act in the opposite way: We think our ideas are soooo wonderful but we give them away for free.

p.154: “Elizabeth had stated on several occasions that the army was using her technology [Remember, they had no special technology—ed.] on the battlefield in Afghanistan and that it was saving soldiers’ lives.” Liars are scary. I understand that people can have different perspectives, but I’m always thrown when people just make shit up, or stare disconfirming evidence in the eye and then walk away. I just can’t handle it.

p.155: “After all, there were laws against misleading advertising.” I love the idea that it was the ad agency that had the scruples here, contrary to our usual stereotypes.

p.168: “Elizabeth and Sunny decided to dust off the Edison and launch with the older device. That, in turn, led to another fateful decision—the decision to cheat.” The Armstrong principle!

p.174: An article (not by Carreyrou) in the Wall Street Journal described Theranos’s processes as requiring “only microscopic blood volumes” and as “faster, cheaper, and more accurate than the conventional methods.” Both claims were flat-out false. I’m disappointed. As noted above, I find it difficult to deal with liars. But a news reporter should be used to dealing with liars, right? It’s part of the job. So how does this sort of thing happen? This is flat-out malpractice. Sure, I know you can’t fact-check everything, but still.

p.187: “To Tyler’s dismay, data runs that didn’t achieve low enough CVs (coefficients of variation) were simply discarded and the experiments repeated until the desired number was reached.” Amusing to see some old-school QRPs coming up. Seems hardly necessary given all the other cheating going on. But I guess that’s part of the point: people who cheat in one place are likely to cheat elsewhere too.

p.190: “Elizabeth and Sunny had decided to make Phoenix their main launch market, drawn by Arizona’s pro-business reputation and its large number of uninsured patients.” Wow—that’s pretty upsetting to see people getting a direct financial benefit from other people being in desperate straits. Sure, I know this happens, but it still makes me uncomfortable to see it.

p.192: “Tyler conceded her point and made a mental note to check the vitamin D validation data.” Always a good idea to check the data.

p.199: “George said a top surgeon in New York had told him the company was going to revolutionize the field of surgery and this was someone his good friend Henry Kissinger considered to be the smartest man alive.” This one’s funny. American readers of a certain age will recall a joke involving Kissinger himself being described as “the smartest man in the world.”

p.207: “He talked to Shultz, Perry, Kissinger, Nunn, Mattis and to two new directors: Richard Kovacevich, the former CEO of the giant bank Wells Fargo, and former Senate majority leader Bill Frist.” This is interesting. You could imagine one or two of these guys getting conned, but all of them? How could that be? The key here, I think, is that these seven endorsements seem like independent evidence, but they’re not. It’s groupthink. We’ve seen this happen with junk science, too: respected scientists with reputations for careful skepticism endorse shaky claims on topics that they don’t really know anything about. Why? Because these claims have been endorsed by other people they trust. You get these chains of credulity (here’s a particularly embarrassing example, but I’ve seen many others) with nothing underneath. This is all interesting in that there’s a real statistical fallacy going on here, where multiple endorsements are taken as independent pieces of evidence, but they’re not.
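
Here’s a toy version of that fallacy, just to show how much the evidence deflates once the endorsements share a common source (the numbers and the model are invented):

```python
# Toy calculation: seven endorsements look like seven independent pieces of
# evidence, but if the endorsers are mostly reacting to one shared impression,
# unanimous endorsement is only mildly informative. Numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n_sims, n_endorsers = 200_000, 7
threshold = 0.0   # an endorser signs on if their impression exceeds this

def prob_all_endorse(shared_weight):
    # impression = weighted mix of one shared signal and idiosyncratic judgment,
    # for a company whose true quality is only so-so (centered at zero)
    shared = rng.normal(0, 1, size=(n_sims, 1))
    own = rng.normal(0, 1, size=(n_sims, n_endorsers))
    impression = shared_weight * shared + (1 - shared_weight) * own
    return np.mean(np.all(impression > threshold, axis=1))

print(f"P(all 7 endorse a so-so company), independent judgments: "
      f"{prob_all_endorse(0.0):.4f}")   # about 0.5**7, roughly 0.008
print(f"P(all 7 endorse a so-so company), mostly shared impression: "
      f"{prob_all_endorse(0.9):.4f}")   # far larger: close to a single coin flip
```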

p.209: “President Obama appointed [Holmes] a U.S. ambassador for global entrepreneurship, and Harvard Medical School invited her to join its prestigious board of fellows.” Harvard Medical School, huh? What happened to First Do No Harm?

p.253: “It was frustrating but also a sign that I [Carreyrou] was on the right track. They wouldn’t be stonewalling if they had nothing to hide.” This reminds me of so many scientists who won’t share their data. What are they so afraid of, indeed?

p.271: “In a last-ditch attempt to prevent publication [of Carreyrou’s Wall Street Journal article], Boies sent the Journal a third lengthy letter . . .” The letter included this passage: “That thesis, as Mr. Carreyrou explained in discussions with us, is that all of the recognition by the academic, scientific, and health-care communities of the breakthrough contributions of Theranos’s achievements is wrong . . .” Indeed, and that’s the problem: again, what is presented as a series of cascading pieces of evidence is actually nothing more than the results of a well-funded, well-connected echo chamber. Again, this reminds me a lot of various examples of super-hyped junk science that was promoted and faced no serious opposition. It was hard for people to believe there was nothing there: how could all these university researchers, and top scientific journals, and scientific award committees, and government funders, and NPR get it all wrong?

p.278: “Soon after the interview ended, Theranos posted a long document on its website that purported to rebut my reporting point by point. Mike and I went over it with the standards editors and the lawyers and concluded that it contained nothing that undermined what we had published. It was another smokescreen.” This makes me think of two things. First, what a waste of time, dealing with this sort of crap. Second, it reminds me of lots and lots of examples of scientists responding to serious, legitimate criticism with deflection and denial, never even considering the possibility that maybe they got things wrong the first time. All of this has to be even worse when lawyers are involved, threatening people.

p.294: “In January 2018, Theranos published a paper about the miniLab . . . The paper described the device’s components and inner workings and included some data . . . But there was one major catch: the blood Theranos had used in its study was drawn the old-fashioned way, with a needle in the arm. Holmes’s original premise—fast and accurate test results from just a drop or two pricked from a finger—was nowhere to be found in the paper.” This is an example of moving the goalposts: When the big claims get shot down, retreat to small, even empty claims, and act as if that’s what you cared about all along. We see this a lot with junk science after criticism. Not always—I think those ESP guys haven’t backed down an inch—but in a lot of other cases. A study is done, it doesn’t replicate, and the reply from the original authors is that they didn’t ever claim a general effect, just something very very specific.

Dogfooding it

Here’s a question to which I don’t know the answer. Theranos received reputational support from various bigshots such as George Shultz, Henry Kissinger, and David Boies. Would these guys have relied on Theranos to test their own blood? Maybe so: maybe they thought that Theranos was the best, most high-tech lab out there. But maybe not: maybe they thought that Theranos was just right for the low end, for the schmoes who would get their blood tested at Safeway and who couldn’t afford real health care.

I think about this with a lot of junk science, that its promoters think it applies to other people, not to them. For example, that ridiculous (and essentially unsupported by data) claim that certain women were 20 percentage points more likely to support Barack Obama during certain times of the month: Do you think the people promoting this work thought that their own political allegiances were so weak?

Summary

“Bad Blood” offers several take-home points. In no particular order:

– Theranos’s claims were obviously flawed, just ridiculous. Yet the company thrived for nearly a decade, even after various people inside the company had realized the emptiness of its efforts.

– Meanwhile, Theranos spent tons of money: I guess that even crappy prototypes are expensive to build and maintain.

– In addition to all the direct damage done by Theranos to patients, I wonder how much harm arose from crowding-out effects. If it really took so much $ to build crappy fake machines, just imagine how much money and organization it would take to build real machines using new technology. You’d either need a major corporation or some really dedicated group of people with access to some funding. Theranos sucked up resources that could’ve gone to such efforts.

– There are lots of incentives against criticizing unscrupulous people. People who will lie and cheat might also be the kind of people who will retaliate against criticism. I admire Carreyrou for sticking with this one, as it can be exhausting to deal with these situations.

– The cheaters relied for their success on a network of clueless people and team players, the sort of people who would express loyalty toward Theranos without knowing or even possibly caring what was in its black boxes, people who just loved the idea of entrepreneurship and wanted to be part of the success story. I’ve seen this many times with junk science: mediocre researchers will get prominent scientists and public figures on their side, people who want the science to be true or who have some direct or indirect personal connection or who just support the idea of science and don’t want to see it criticized. The clueless defenders and team players can do a lot of defending without ever looking carefully into that black box. They just want some reassurance that all is ok.

The stakes with Theranos were much bigger, in both dollar and public health terms, than with much of the junk science that we’ve discussed on this blog. Beauty and sex ratio, advice on eating behavior, Bible codes, ESP, ovulation and voting, air rage, etc.: These are small potatoes compared to the billions spent on blood testing. The only really high-stakes example I can think of that we’ve discussed here is the gremlins guy: To the extent he muddies the water enough to get people to think that global warming is good for the economy, I guess he could do some real damage.

Again, I applaud Carreyrou for tracking down the amazing story of Theranos. As with many cases of fraud or self-delusion, one of the most amazing aspects of the story is how simple and obvious it all was, how long the whole scheme stayed afloat given that there was nothing there at all but a black box with a robot arm squeezing pipettes of diluted blood.

93 thoughts on “Some thoughts after reading ‘Bad Blood: Secrets and Lies in a Silicon Valley Startup’”

  1. > easier just to withdraw. That’s what I’ve done when I’ve had colleagues who plagiarize, or fudge their data

    Aren’t the obligations of a tenured professor at an Ivy League university greater than this? Those “colleagues” are (probably!?) still cheating. Why should the public trust Science in general if individual academics won’t (meaningfully) police their own ranks?

    • D:

      I don’t know what my obligations are here. I do know that sometimes when I call out misconduct, I get attacked. So it doesn’t always seem to be worth the effort and costs to do this.

      • It is a hard question. If you are a member of the American Statistical Association, you should avoid “condoning or appearing to condone statistical, scientific, or professional misconduct.” I would say that, by failing to report research misconduct you appear (to me, at least) to condone it.

        Again, if you were an adjunct, untenured or somehow at professional risk (and perhaps you are — if so, sorry!), I would cut you some slack. But if a senior, tenured, Ivy League professor won’t even report misconduct . . . then just how much faith should any of us have in the published literature?

        • D:

          The first time such behavior happened to me, I was untenured, looking for jobs, and wanted to avoid trouble. I was in the middle of being attacked for unrelated reasons (some of my colleagues at Berkeley didn’t want me around, and it seems that lying about my work was the easiest way for them to get to that goal), so I didn’t feel like picking another, unrelated, fight.

          The other times, my connections were a bit more distant, and I can’t really say with any confidence that there was scientific misconduct. It was more that I felt that these people were doing sloppy work, and I didn’t want to be involved with it.

        • In the latter case (sloppy work) I think post-publication review is the best remedy, and no one can fault your blog’s role in promoting this!

        • Thanks for the color. Again, I agree that this is a hard question, one that all of us confront one way or another.

  2. Hard not to notice many similarities with the Bernie Madoff scam. The stonewalling of those asking for information, the years of lies, the faking of data / statements. The reliance on authority. The willingness to believe “too good to be true” results / returns.

    > easier just to withdraw. That’s what I’ve done when I’ve had colleagues who plagiarize, or fudge their data

    You can only fight so many battles in life, and need to choose them wisely.

    > By midafternoon, Ana had made up her mind. She wrote up a brief resignation letter and printed out two copies.

    I remember going to a meeting with the company president and a bunch of other executives; I was trying to derail a really stupid (not fraudulent, just stupid) project. I’d printed out and signed a letter of resignation in case the decision went against me. This was scary, but I found that once I signed the letter and put it inside my folder of meeting notes it gave me a certain calm and clarity of purpose because I knew what I was going to do under what circumstances. [My arguments carried the day and I didn’t use the letter.] I can fully understand Ana not taking Elizabeth’s call.

  3. Andrew – this must have been both exasperating and fun to write – it certainly was both exasperating and fun to read! I do have a small overarching quibble, and that is just to remind everyone that hindsight is 20/20. After you know how the story ends, it is much easier to see the details that matter. Still, crazy that it took so long.

  4. “He talked to Shultz, Perry, Kissinger, Nunn, Mattis and to two new directors: Richard Kovacevich, the former CEO of the giant bank Wells Fargo, and former Senate majority leader Bill Frist.” This is interesting. You could imagine one or two of these guys getting conned, but all of them? How could that be?

    Perhaps this group was chosen because they were old and doddering? How much do you think Kissinger knows about modern technology?

  5. Does the book go into the affirmative action angle, that people were especially eager to support a woman shattering the glass ceiling?

    > There are lots of incentives against criticizing unscrupulous people.

    > “President Obama appointed [Holmes] a U.S. ambassador for global entrepreneurship, and Harvard Medical School invited her to join its prestigious board of fellows.” Harvard Medical School, huh?

    Sounds like this might be similar to Ellen Pao and her ex-husband Buddy Fletcher.

    https://www.vanityfair.com/style/scandal/2013/03/buddy-fletcher-ellen-pao

    • I’ve read the book recently and didn’t have the sense that this was a major (or even minor) issue, although I may have missed it.

      In addition to what Andrew has pointed out, what struck me is how important Elizabeth Holmes’ connections to powerful and wealthy people were. Someone who, unlike her, did not have family friends who were powerful and well-connected, and who did not attend Stanford or another elite institution, likely would not have been taken as seriously, and almost certainly could not have raised even a small fraction of the funding without providing proof of at least preliminary success. It was also interesting to note how much time and effort was spent deceiving investors and hiding failures. Elizabeth Holmes worked incredibly hard, according to the book, and also is clearly talented, but chose to put this hard work and talent towards deception and grandiose pie-in-the-sky ideas.

      • Vince:

        Yes, one thing that Theranos had in common with some recent science scandals was the role of powerful connections in propping up obviously bad work. Perversely, one thing that impresses me about Brian “Pizzagate” Wansink is that he got so far without relying on any powerful mentor figures. He really pulled himself up by his own bootstraps! He hit the sweet spot by working in a field that was somewhat of an academic backwater but where there was a lot of interest by funders and the press.

      • Gregor:

        If you read Carreyrou’s book, you’ll see that lots and lots of people were skeptical of Theranos. But Holmes/Balwani/Boies managed to control the information flow and intimidate whistleblowers for a long time. It seems that this active management was necessary, otherwise the house of cards would’ve tumbled many years earlier.

  6. I always thought it was a fraud because the descriptions of what they claimed they could do as tests made no sense. My issue – and the issue a lot of people had – is that such a fraud is impossible to maintain, so why do it? That is the kind of ‘evidence’ that tilts people toward belief. This is a statistics blog, right? You might think ‘it’s screwy to believe a company can do this’ but you have actual evidence that affects your beliefs that look, Walgreens is buying in and so are all these other people. So you adjust: maybe it’s not completely fraudulent or screwy but only partly screwy so they could possibly pull it off. Take yourself through the reasoning: it makes no sense but why would companies invest in it? You assume the companies would do some checking and you have to trust their checking because you generally trust those companies. How this gets pulled off is a neat story of how to manipulate beliefs. It’s not Madoff: he kept paying out returns, sometimes in cash and always on paper, that indicated an actual business existed. It took someone familiar with forensic accounting and stuff like Benford’s Law to not only dig into available data but to get any traction with the public … and that traction actually only happened because Madoff eventually ran out of fund-raising enough so the cracks appeared. The difference is Madoff wasn’t promising anything other than a return, nothing that required ‘replication’ or other scientific verification. Theranos is weirder: when you claim science, at some point you have to show science. That imparts a degree of ‘well, why would they lie’ which goes beyond financial returns.

    • It was obvious from the start to anyone who had any biotech background, and anyone who sees the way things are manipulated these days esp in startups could believe “why do it” was just “because there’s easy money these days”.

      The big problem was there was no way to bet against them. Theranos wasn’t public, the “valuation” of $10 billion was purely notional. You couldn’t short the stock, and you couldn’t even short the VC investor companies, and even if you could they weren’t actually risking anything like 10 billion.

      When the only way to put your money where your mouth is is to exert one-sided upward pressure, you will find that, unsurprisingly, the valuation only goes up so long as you can generate enough interest in the company.

      • Daniel:

        The “can’t bet against it” issue is interesting. This works with junk science, too. For example, if I really thought that ESP, as measured by Daryl Bem et al., was real, I could invest money into some ESP scheme involving people predicting the future. But if I think it’s B.S., I can’t “short” the idea. Or, to put it another way, I can invest in iffy science, but the investment is not about the science, it’s all about the idea that others might be suckered into believing it. For example, I could perhaps invest in the “risk eraser” program of Marc “Evilicious” Hauser, but my investment would have little to do with whether his ideas actually work and everything to do with the possibility that Hauser could make money by convincing other investors and government bodies into funding his organization.

        This seems similar to the Theranos story: Bigshots were investing in Theranos, not necessarily because they thought it had revolutionary technology, but because they thought Theranos had what it took to get others to invest in it.

        • Universities don’t hire Wansink because they think he provides valuable information about diet and health and so forth. They hire him because they think he’s likely to bring in $1M/yr for at least 10 years, which they will then skim the overheads on, which is probably an additional $500,000 on top of the $1M that goes direct to Wansink’s lab.

          This same effect is true at other levels of science as well. If you study a hot topic and are good at writing grants (have what it takes to get others to invest in you) you will do well in science. There is no way for anyone to bet against this stuff….

        • Daniel:

          I don’t think that’s quite right. I think they hire Wansink (or Brad Bushman, or other such big-dollar stars) because they think they’re top researchers, where “top” includes publications, grants, academic respect, and fame. I agree that probably nobody in the university administration is actually evaluating these people’s ideas, but I don’t think the motivation is pure dollars. I think the administrators think of the dollars as a sign that these researchers are at the top of their game. After all, to get the dollars you need to impress peer reviewers.

          Regarding “betting against this stuff” . . . one question is whether the reputations of Cornell and Ohio State, for example, are harmed by association with high-profile faculty who’ve made questionable research decisions. Or, to take an example closer to home, is the association with Dr. Oz a plus or minus for Columbia? I just don’t know.

        • “I think they hire Wansink (or Brad Bushman, or other such big-dollar stars) because they think they’re top researchers, where “top” includes publications, grants, academic respect, and fame.”

          Notice that true discoveries or teaching prowess or actual impact on human knowledge doesn’t really enter except if it indirectly results in increases in “publications, grants, … respect, and fame” (which certainly it sometimes does but also often isn’t needed for publications, grants, respect and fame)

          The recursive nature of “reputation begets reputation” turtles-all-the-way-down with no real final foundation in actual production of science and knowledge is more or less too convenient for driving money and power at universities to get a pass. It’s just not ok for universities to say “hey, it’s not our fault, it’s his peers who gave him respect he didn’t deserve, we were just bamboozled.” The universities know, or suspect, or should have suspected, or carefully looked the other way, and it’s not just in this one case. Going along with it all is collusion, just like all those guys who kept selling the mortgage backed securities, even though they had to know that tons of them were backed by liar loans etc.

          Science has been increasingly selling a bill of goods for 40+ years, and the only possible bet is one-sided. You can’t “make money” (i.e., get any real benefit) from pushing back. There’s no “department of pushing back against bullshit with careful scientific audits” to get a lifetime tenure appointment to or whatever, in the same way that there’s no “startup in figuring out which other startups and VC firms are just fishing for money in a sloshing stew of newly printed Fed dollars trying to keep the potato hot until they can eventually plop it all on everyone else’s 401k and fly off to Belize.”

        • From my biased insider info, I am going to strongly side with Daniel on this one – it’s the dollars that senior admin usually appreciates or at least can’t pass up.

          Even calling out homeopathy when there are grant funds coming into the university is challenging. Of course, it gets confounded with academic freedom.

        • Andrew, I have to beg to differ with you. Funding is all research universities care about (admin, that is). One of the reasons why I left the medical field as a statistician is that my salary at a medical school would have to be solely derived from grant funding, even tenure-track. I didn’t get into this to be in sales. And even in social science, there is no way to make full professor at my university without at least $1 million in grant funding obtained after promotion to associate professor and we’re not even a top 150 school.

          Moreover, I couldn’t even get my university to waive its indirect for a tiny $25k grant last year. There was no way for me to fund myself and a student for 3 months in the summer on what would be left after admin took their Mafia cut (which, by the way, is more than the mob charges. At least they have the decency to only take 10% of the vig.).

          This is the perversity that creates the opportunity and motivation for people like Wansink. Hell, LaCour nearly scammed Princeton into giving him a gig because it looked like he was already a cash cow.

          Greed is at the center of all of this and universities and faculty are not immune to base impulse.

        • Inadequate Equilibria (https://equilibriabook.com/) talks about this concept. We should only expect public consensus to be true when there are real incentives to be correct; Theranos didn’t have those incentives. Although even then there will always be exceptions (Enron was public but managed to maintain a fraud for a while).

        • It may be possible to make a little pocket change on this, but from a societal standpoint billions are being bet on Theranos scams and maybe hundreds on some obscure mispriced low-liquidity prediction bet. The problem isn’t that I personally want to make some kind of money on this; it’s that society is wasting huge quantities of resources on scams. I don’t think prediction markets as they stand today will solve this problem.

        • Given that a lot of the Theranos dynamics appear to be ‘information cascades’/’common knowledge’/signaling dynamics, a prediction market doesn’t necessarily have to be enormous in order to puncture bubbles and reveal that the emperor has no clothes and trigger serious investigation by outsiders – what brought Carreyrou into the Theranos picture, as he describes in _Bad Blood_, was really quite a small thing, a minor patent lawsuit spat. Already, futures markets are often low-volume and yet serve important informational and revelational purposes. (For example, it is claimed by some that the absolutely tiny Bitcoin futures market which launched in December 2017 is what shut down the latest speculative frenzy.)

        • It’s not about making money. It’s the signal.

          A strong bet predicting a bust leads to a lot of people digging deeper or taking notice.

    • > such a fraud is impossible to maintain, so why do it? (Jonathan)

      There’s a lot of money sloshing around. Maybe some of it makes it into offshore bank accounts, and you think you can catch a plane in time.

        • The problem with that is that as Carreyrou describes, Holmes passed up opportunity after opportunity to sell tens or hundreds of millions of dollars of stock, and instead, kept doing her best to amass as much Theranos stock as possible (even going deep into debt to acquire more). Far from trying & failing to ‘bury the body’ in a pump-and-dump, she was speeding in a racecar in the opposite direction from the cemetery with her foot pressed firmly to the floor.

          • Perhaps Holmes, from ignorance/self delusion, figured her staff would figure it out. Saw this at my former employer (Fortune 50 company). Estimated the chances at ~10% of successfully pulling off a project in a suddenly and drastically reduced time frame – the relevant VP commented: we’ll go for it, you R&D guys are always too pessimistic. The 1 in 10 did not happen in the time demanded.

  7. Great post, Andrew. One of your best. As to Boies, whose law firm I’ve done a little work with, the ability to represent clients whose bona fides you may not completely understand is not an ethical problem — for lawyers.

  8. “A study is done, it doesn’t replicate, and the reply from the original authors is that they didn’t ever claim a general effect, just something very very specific.”

    Sadly, I have observed a similar thing in online exchanges with Ph.D. students in psychology. They cite studies as illuminating some general truths about human nature, but when you point out methodological problems, they retreat to, “Sure, but this particular study is still very suggestive about how people act under certain conditions. Of course we need to do more studies.” They are already learning the two-step.

  9. I wonder whether this story is really an outlier when it comes to marketing discoveries that don’t eventually pan out. Elizabeth Holmes seemed the most unlikely entrepreneur to snooker anyone. Glamorous, Stanford educated, etc, etc.

    • Sameera:

      Interesting point. There must be lots of companies that raised millions of dollars based on bullshit and then crashed. The difference with Theranos was that they raised billions (at least on paper), not just millions, and they behaved really badly (not just lying, but also attacking critics and whistleblowers), and someone wrote a really good book about it.

      In that way this is also similar to the problem of junk science, where there are lots and lots of cases, with a few of the extreme examples getting the most attention.

      Also, in both business and science, the question arises of how much to blame the system and how much to say that these are just a few bad apples. With junk science: Is the whole system broken, or is the revelation of replication failures an example of how the system works and is self-correcting? With business: Is the whole system broken, or is the revelation of corporate failures an example of how the system works and is self-correcting, just an example of some rich suckers being parted from their money? I don’t know: it’s a lot easier to think about individual cases than to assess how well the system performs compared to various hypothetical alternatives.

      • >>> With business: Is the whole system broken<<<

        How can this be even an option?! I am relying on the outputs of businesses all the time in day to day life.

        How can a wholly broken system work so well? On net, I'm quite happy with the products & services I receive for the price I pay.

      • Andrew,

        Thanks for raising those questions. I speculate some or all of the above have been in operation concurrently and asynchronously in nearly every high-stakes product development effort. The Australians and Brits, during the ’90s, chronicled some of these systems questions. At least by those who were also proponents of the evidence-based and managed care movements. The proponents’ names elude me at the moment.

        I’m looking forward to reading the book, particularly after the commentary posted here. The author was in my neighborhood a month ago. I missed his book talk.

    • “Elizabeth Holmes seemed the most unlikely entrepreneur to snooker anyone. Glamorous, Stanford educated, etc, etc.”

      I would think that being glamorous would put one in the “likely to snooker” category. After all, isn’t glamor a type of snookering? (Indeed, the original meaning of glamor was “enchantment, magic”)

      • Martha,

        I don’t think a ‘glamorous’ person is any more likely to ‘snooker’. It’s that marketing itself employs all sorts of strategies and tactics, including having attractive [groomed] men and women on the front lines. I meant really to stress the Stanford connection that is obviously viewed as prestigious and career-enhancing. I’m sure her being glamorous was a plus.

        I want to be surprised that she and her business partner behaved the way they did. But in fact such behavior is not atypical in some enterprises.

  10. On the theme of recent, excellent, infuriating books about contemporary tech companies, I highly recommend Disrupted: My Misadventure in the Start-Up Bubble by Dan Lyons, a memoir of a former Newsweek journalist who ended up at the internet marketing company HubSpot. There’s less of a connection to fraud, or statistics, than in the Theranos story, but there are some similarities — ridiculously high valuations, a culture of hype and delusion, and low-tech methods hidden behind a curtain of high-tech gloss. Also, it’s hilarious.

    • Today there’s tons of funding available for technology ideas that will generate dazzlingly spectacular returns with a million-eyeballs traction.

      But if you have a great, relatively lower risk idea that will generate great (much better than industry average) percentage returns and maybe tens of millions in revenue, getting funding is very hard.

      The incentives to lie and overstate are huge and perverse. There’s little funding left for good but not mind-bogglingly large business plans. Everyone is running after only the high-risk, high-reward ideas.

      • Right, this is a big problem, and this kind of hype is fraudulent even if not somehow technically illegal.

        Worst of all though is it’s killing real productivity. Los Angeles has shantytown encampments on every highway footbridge overpass and under every bridge… Twice as many people kill themselves with opioids as die in car accidents nationwide…

      • The last time I checked, the European Research Council explicitly encourages high risk projects (which apparently were found to have a stunning 99% success rate), and they derisively call other research “incremental”. As an ERC reviewer I get to read overhyped claims that either amount to nothing or are literally tiny extensions of the last 30 years of the author’s work (which is fine of course). I complained to the ERC about this and the president wrote to me that they would take my comments into consideration. Due to the secrecy around the review process, we will never know if things will change.

  11. “Rather, I’m interested in the social processes by which obviously ridiculous statements just sit there, unchallenged and even actively supported, for years, by people who really should know better. Part of this is the way that liars violate norms—we expect scientists to tell the truth, so a lie can stand a long time before it is fully checked—part of it is wishful thinking, and part of it seems to be an attitude by people who are already overcommitted to a bad idea to protect their investment, if necessary by attacking truth-tellers who dispute their claims.”

    Too true squire!! BIG time. All of it. God I could write a book on that. Oh – wait…

  12. Great post. Content like this is why I keep coming back (and the price is right too!)

    One compound observation. You state “But 100 tests from one drop of blood: no way.” You also note that a similar device, the Piccolo Xpress, already existed.

    The Abbott web page describing the Piccolo Xpress states that it can perform tests on 100 uL of blood (2 or 3 drops). Their basic metabolic panel measures the level of 15 different quantities. See https://www.pointofcare.abbott/us/en/offerings/piccolo-xpress-chemistry-analyzer and https://www.abaxis.com/medical/piccolo-xpress.

    I’m not a biologist or chemist, but it does not seem unreasonable that if 15 tests can be done on two drops, 100 could be done on one drop. That’s a little bit more than an order of magnitude decrease in blood volume/test. That must be getting near the limit of such testing. One source on the web states that there are about 7,000 to 25,000 white blood cells in a drop of blood. Dividing a drop into 100 portions for testing would give a mean of 70 white blood cells per portion. I’m not sure what the proper model for the variance of the white blood cells per sample is—but it seems to me that it cannot be negligible. So, some tests would probably have a significant variance due to the sampling issue alone—something analogous to shot noise in electrical circuits. Testing smaller amounts of blood would increase this problem.
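
    A quick check of that sampling-noise point, using the low-end figure of 7,000 white cells per drop and assuming simple Poisson counting statistics (both are assumptions, not measurements):

    ```python
    # Rough check: if a drop with ~7,000 white blood cells is split into 100
    # portions, each portion holds ~70 cells on average, and Poisson counting
    # noise alone gives roughly 12% relative variation between portions.
    import numpy as np

    rng = np.random.default_rng(0)
    cells_per_drop = 7_000        # low-end figure quoted above
    portions = 100
    mean_per_portion = cells_per_drop / portions   # 70 cells

    counts = rng.poisson(mean_per_portion, size=100_000)
    cv = counts.std() / counts.mean()
    print(f"mean cells per portion: {counts.mean():.1f}")
    print(f"coefficient of variation from sampling alone: {cv:.1%}")  # ~1/sqrt(70)
    ```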

    This case brings to mind another fraud that has managed to secure something like 10 times the funding that Theranos did. Of course, I am referring to LIGO and other “gravitational-wave detectors”. One does not have to be an informed scientist to understand that the technical performance characteristics they claim for their instruments are literally incredible. For example, they claim to be able to measure displacements equal to about 1/10,000 of the width of a proton in an interferometer arm 4 km long. That’s like performing 10,000 diagnostic tests on 1/10 of a drop of blood. But, in spite of its obviously fictional performance, the LIGO charade marches on—gaining kudos and dollars.

    Bob
    PS—just in case it’s not clear, my last paragraph is written tongue-in-cheek to show the challenge of correctly evaluating strong scientific claims. I don’t think the LIGO team are frauds—but if it turns out that they are, everybody will say, “In retrospect, it is obvious that they were frauds.”

      • Maybe the book answers this question but…

        By and large venture capitalists are not obviously reckless or hasty about how they invest money (except sometimes for trivial seed investments). An investment with a technological claim would be run by some supposed expert in the field. If 100 tests from one drop is such an obvious ‘no way’ to a non-expert, why didn’t the expert catch it too? Or was there no expert (unusual, but perhaps those that did normal due diligence were precisely those that didn’t invest)? Or was the expert overruled?

        I’m reminded of the Madoff scandal. After the fact, it seemed to become conventional wisdom that no one could possibly have believed that the returns his funds had shown were real – they were just too good for any legitimate investment.
        In fact, if you take his public one-sentence description of the investment strategy and mimic it as naively as possible, you get basically the same return over the relevant period (one paper found better, ‘earning’ 11.5% vs Madoff’s 10.5% annually) – but Madoff had _vastly_ lower month-to-month variation. To find this obviously implausible, you need to ignore the (very plausible) returns, focus on the variation, and assume he was doing no return smoothing and had no smarts beyond the blandest statements of his methods (let alone that he was doing something rather different than advertised). But no, now it’s “no investor could possibly have believed steady 10.5% was real; so everyone must have realized this was a Ponzi scheme”.
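
        A toy sketch of what return smoothing alone can do (all numbers are invented and have nothing to do with the actual funds): the reported series keeps roughly the same long-run mean but shows far less month-to-month variation.

        ```python
        # Toy sketch of return smoothing: reported returns are an exponentially
        # smoothed blend of recent true returns. The long-run average barely
        # changes, but the month-to-month variation observers see shrinks a lot.
        import numpy as np

        rng = np.random.default_rng(0)
        months = 240
        true_monthly = rng.normal(0.105 / 12, 0.03, size=months)  # ~10.5%/yr, noisy

        alpha = 0.15   # fraction of this month's true return reported immediately
        reported = np.empty(months)
        carry = true_monthly[0]
        for t, r in enumerate(true_monthly):
            carry = alpha * r + (1 - alpha) * carry   # smooth the return stream
            reported[t] = carry

        for name, series in [("true", true_monthly), ("reported", reported)]:
            print(f"{name:>8}: annualized mean {12 * series.mean():.1%}, "
                  f"monthly sd {series.std():.2%}")
        ```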

        • Bxg:

          Theranos’s claims could have been real, if they really had new technology. But they didn’t have new technology, and this was known by many Theranos employees and former employees. So, to keep the scam going, Theranos had to control the information flow.

    • Bob:

      Fair enough. With new technology of some sort, or even very careful tolerances and improved assay design and analysis, a factor of 20 improvement in efficiency might be possible. Unfortunately, Theranos had none of this—it had no serious improvements in methods or technology at all—but, if it did, who knows, maybe something would’ve been possible. I assume that assay companies are already working on this, and I’m guessing that one limiting factor is a need to not let accuracy levels decrease. Theranos was not constrained in that way, since they were willing to cheat.

      • To be precise, the super-dilution was done with standard commercial analyzers because the Theranos ones didn’t work. They wanted to still do the finger prick trick, but use real Siemens instruments behind the scenes. As a result they were operating them well outside their intended limits.

  13. > “George said a top surgeon in New York had told him the company was going to revolutionize the field of surgery and this was someone his good friend Henry Kissinger considered to be the smartest man alive.”

    Did Kissinger recommend Theranos, or did Kissinger recommend the opinion of some NY surgeon who recommended Theranos?

  14. Theranos is impressive for the size of its fraud and for having amassed a roster of big names speaking up for it. But its investors surely must have understood what ‘caveat emptor’ means. And, not being in the venture-capital business, I am not sure how often VCs fund fraudulent stuff based on junk science. Maybe it’s commonplace but not often visible. And maybe VCs are mindful of Keynes’s beauty contest–the stock market is not about picking winners but about picking what other people think will be winners.

    Many programs funded by Federal and state governments are not supported by evidence, billions of dollars a year spent on programs that have not shown effects (one need only look at the history of research on (non)effects of the Title I program in education, which now spends about $15 billion *a year*). This is not called fraud because a democracy can choose to spend its money any way it wants. But it’s discouraging to think how little science is at work here.

  15. My biggest takeaway from the book is just how ignorant our supposedly high-powered, wealthy elite actually are. I kept thinking “somebody thought it was smart to have this knucklehead run a billion dollar company?” Heck, I’m surrounded by relative geniuses down here in the trenches.

    • This. I recently deposed a “global-head-of-something-or-another” in a trade theft lawsuit. I was working my way through the documents his employer had produced in response to my subpoena duces tecum when the witness suddenly said “Look, it’s all B.S.”

      His employer was a company founded upon a technological advance and worth tens of billions. Its CEO was making tens of millions annually and yet they had an eight-figure budget line item that nobody had ever done more to understand than to calculate a mean and an occasional regression. “The managers just wanted the line’s slope to tilt in the right direction because that’s how they got paid”; and the slopes indeed correlated nicely with their pay packages.

      I’ve often wondered (elsewhere) whether Gosset’s greatest gift to middle management was the ability to stand before the man with (seemingly) more than just a hunch in your hand.

  16. “2. Clarke’s Law. Remember Clarke’s Third Law: Any sufficiently crappy research is indistinguishable from fraud. Theranos was an out-and-out fraud, as has been the case for some high-profile examples of junk science. In other cases, scientists started out by making inadvertent mistakes and then only later moved to unethical behavior of covering up or denying their errors. And there are other situations where there is enough confusion in the literature that scientists could push around noise and get meaningless results, possibly without ever realizing they were doing anything wrong. From the standpoint of reality, it hardly matters”

    There should be a distinction. It’s like planned vs. accidental homicide. The punishment depends on the intention in the case of homicide. Why should the two kinds of crappy research (unintentional vs. intentional) be indistinguishable?

    • I understand Clarke’s Law as saying that the *results* of sufficiently crappy research are indistinguishable from the *results* of fraud. (Analogously, the *results* of planned vs. accidental homicide are the same: the death of the victim. But additional information may prove which one occurred, so the punishment needs to take that into account.)

      • PS The original version of Clarke’s Third Law was “Any sufficiently advanced technology is indistinguishable from magic.” This also needs to be interpreted as talking about the results, not the process by which the results were obtained.

  17. Regarding your note on page 151, I actually think academic researchers behave the same.

    Theranos and researchers both give away their ideas for free. Theranos told everyone they could about their idea for doing tests on small amounts of blood; they just didn’t tell anyone how they did it. This is similar to researchers. They have papers with amazing results, but they don’t share their methods or data, so no one knows the secret sauce needed to get those results. This uncertainty makes it difficult to reproduce or question the work.

    • Jordan:

      Your description of science researchers doesn’t match what I’ve typically seen in statistics and political science. No doubt there are lots of papers in both these fields that are useless, but people are pretty good at sharing their secret sauces. For example, consider that Electoral Integrity Project that we criticized a while back. The study was terrible—but, on the plus side, they were open and stated exactly what they did. The news media and the political science profession have only themselves to blame for not reading the documentation and twigging to all the problems.

        • This just proves what we already knew: reproducible research is impossible **in principle** when studying cancer. It doesn’t really matter if the methods sections are complete, since each cancer is a unique disease and every cell lineage in each lab has its own unique adaptations.

          The “cancer reproducibility project” was warned this was a waste of critical funds beforehand. Funds that could have otherwise gone to legitimate cancer researchers rather than lab-coated barnacles.

        • I agree that this project and most of what the COS does is a waste of money, but I think it at least highlights the lack of detailed method sections in biology, and the lack of an open culture.

        • Jordan:

          I agree that biology does have an issue with lack of detail in methods sections. For example, my wife, a PhD in biology who left academia because she did not like the environment, told me how a PI was bragging to her about how they wrote a paper in which they were intentionally vague about a key step in a process. This would allow the PI’s lab to use this methodology for themselves while other labs had to spend significant time trying to refine this key step.

          With that said, I’ve been exposed to research in several fields and none of the others seem to have the same level of cut-throat attitudes as biology. Couldn’t put my finger on the “why”, but I would guess it has to do with the structure of handling money.

          By far, math and statistics seem to be the fields that are the most laid back of what I’ve seen.

        • Why do you want to “highlight the lack of detailed methods sections” and “open culture” in cancer research? Do you think these are somehow negative attributes?

          Cancer is too complex to expect legacy scientific approaches like reproducibility to work. It’s just a waste of valuable time and resources to write up a detailed methods section, since the study will never actually be replicated anyway. Every cancer is its own unique disease; have you really wrapped your head around what that means?

          And let's say the raw data is "open": what is anyone else going to do with it? Interpreting the data requires knowing exactly what was done to generate it, but it's a waste of time to even try conveying this.

          Only the people who were there and did the hard work can properly interpret the results. Attempts by anyone else to do so will be a joke. That isn't to say sharing data is always bad. Sometimes it is helpful to release some curated, summarized data in the supplement or extended-data sections that supports the paper's overall narrative.

        • If reproducibility is impossible, then how does the research help? What you found is unique to your circumstance. What's the point in disseminating it?

          Why not just let every oncologist treat according to his own intuition?

        • “If reproducibility is impossible then how does the research help?”

          – Later research extends the results of earlier research using more sophisticated techniques. This enhances our understanding of cancer and leads to the development of more advanced treatments.

          This isn’t going to happen if cancer researchers are recording every excruciating detail so that my dog walker can pop into the lab, set everything up, and do the same experiment in a weekend. The conditions faced by those studying cancer are extremely difficult already; adding extra burden is not going to lead to improved performance.

        • P53:

          I don’t know anything about cancer research and maybe you’re right there. For other research I’ve seen, yes there’s an “extra burden” in ensuring reproducibility, but I’d argue that this extra burden pays off in better research. I think the requirement of reproducibility has helped my own work, and I can think of lots of projects of mine that I think would’ve gone better had I kept a reproducible workflow throughout.

        • Calling this the “cancer reproducibility project” is actually kind of a misnomer. We’re not dealing with oncologists here; we’re dealing with basic lab experiments involving cell cultures, mice, etc. So it really should have just been called the “biology reproducibility project,” but I guess they wanted to make it sound more important than it is.

        • @p53

          I assumed we were talking about other specialists and not your dog walker.

          > Later research extends the results of earlier research using more sophisticated techniques. This enhances our understanding of cancer and leads to the development of more advanced treatments.

          Doesn't a treatment axiomatically imply reproducibility (within bounds of variation, of course)? Is there such a beast as a "non-reproducible medical treatment"?

  18. @Rahul
    There is reproducibility but not in the sense used by the “cancer reproducibility project”.

    For example, a treatment target may be discovered in cell line A grown in media M1, but if another lab tries to reproduce this they’ll usually get different results.

    However, with sufficient effort and skill, a lab should be able to build on the discovery. They achieve this by improving/optimizing the methods. For example, they may discover that a treatment designed for the target found in the first study works in cell line F grown in media M4.

    That is a novel, publishable result because it extends the first study to advance knowledge. The “unreproducible” result is just due to low effort. Mindlessly repeating what someone else already did adds nothing.

    @Andrew: “I think the requirement of reproducibility has helped my own work”

    Cancer research is much more difficult than math-based fields like statistics and physics.

    • > However, with sufficient effort and skill, a lab should be able to build on the discovery. They achieve this by improving/optimizing the methods. For example, they may discover that a treatment designed for the target found in the first study works in cell line F grown in media M4.

      This does not sound like “building on discovery.” Rather, it seems that the second lab disproved the hypothesis from the first lab and introduced its own hypothesis.

  19. I have been slowly trying to read this book for a while. I find it impossible to read too much of it at a time. Not because it is poorly written or uninteresting, but because of the unending and incredible amount of harm done to so many different people with so many different ties to the company; the ability to use NDAs to cover up lies, deceit, fraud, and wrongdoing; and the fact that large companies, with their deeper pockets, can use these to bankrupt anyone who wants to tell the truth (and that there are lawyers unethical enough to do so). And just remember, if my memory of recent news is correct, there are persons in even higher positions who also have a slew of NDAs and have also threatened lawsuits if anyone discloses anything.

    We truly appear to have lost our moral compass both in our personal behavior and our laws. I truly hope that Ms. Holmes and Mr. Balwani spend a long time in jail, and that all the people hurt by them sue them for all they are worth. The book to me is inordinately depressing.

    • Roy:

      Holmes seems to have been an out-of-control liar, and it sounds like Balwani watched Glengarry Glen Ross a few too many times. But what really got me angry were their lawyers: they seemed to know exactly what they were doing, and they seemed to have no problem playing the role of hired guns harassing whistleblowers. I wonder if they came home from work every night satisfied with what they’d been doing, or if they felt bad about it.

      • What is worse, there is likely no way to sue the lawyers either. Unless you can prove that they had knowledge of the deceit (and maybe not even then), they can argue that they were just aggressively protecting their client’s interests. They walk off with hefty fees, leaving behind a suicide and some really shattered careers and lives, but hey, they were just doing their jobs. The epilogue was interesting: one of the lawyers who made most of the threats was promoted after it was all done. Perhaps we should start giving promotions to all the people who fudged data, produced results that don’t replicate, etc.

  20. I am two-thirds of the way through the book. The biggest thing for me is how this corroborates what I know about big (and small) business. This company is not unique in any way. I have found ‘Elizabeths’ in almost every company I have worked for. It still astounds me how these people who are willing to lie and cheat are masters at fooling what you would think are intelligent people. While not always the case, these types of people tend to rise to the highest levels in corporations. The rest of us suffer as a result.

    • Mark:

      My experience with people who are willing to lie and cheat is not that they’re so great at fooling people, at least not in the medium to long term. My experience is that these liars and cheaters are good at short cons, and then the people in their orbit either play along (because the liar/cheater brings in money or some other tokens) or withdraw (because who wants to deal with the liar/cheater).

      The liar/cheater is surrounded by a buffer of accomplices and dependents, who are themselves surrounded by a buffer of people who are well aware that the liar/cheater is a bullshitter but is good at lying and cheating, and then there’s the outside world of people who are just aware of the liar/cheater’s stellar reputation. The key is that the people in the buffer, who know all about the problem, don’t blow the whistle, because blowing the whistle would be costly.

      Theranos is an example. Lots of people on the inside knew there was a problem, but information flow was controlled in such a way that the knowledge didn’t make its way to the outside. I bet there were lots of rumors, but rumors are, rightly, not generally considered enough reason for action.

  21. Very small point, in relation to the comments about NPR above: it’s useful to distinguish NPR as a producer of content and a distributor of content. They produce Morning Edition, All Things Considered, Planet Money, among others. They distribute Fresh Air, Car Talk (well, distributed) and TechNation/Biotech Nation to local public radio stations, but NPR does not produce the content. I personally wouldn’t hold them responsible for an interview on a show they distributed, but reasonable people will differ.

    See https://en.wikipedia.org/wiki/Tech_Nation and https://en.wikipedia.org/wiki/NPR#Programs_distributed_by_NPR

    Relatedly, just because something is broadcast on your local public radio station does not mean it was distributed, let alone produced, by NPR, as there are other distributors out there, like PRX and APM.
