Stasi’s back in town. (My last post on Cass Sunstein and Richard Epstein.)

OK, I promise, this will be the last Stasi post ever.

tl;dr: This post is too long. Don’t read it.

Some sincerity

Let me be sincere for a moment before lapsing into the sort of counterproductive sarcasm that you all love or hate so much.

Cass Sunstein, like Richard Epstein, is a prolific law professor with public policy interests. Sunstein and Epstein are well connected, they’re plugged in, they’re willing to write about just about anything, they have the ear of powerful government officials, and they seem to have strong feelings that if people just followed their recommendations, the world would be a better place.

OK, to be fair, just about all of us feel that the world would be a better place if more people listened to us. I know I feel that way.

So what’s my beef? My beef with the Sunstein/Epstein twins is that they really really really don’t like to admit their mistakes, even when said mistakes are in everyone’s face.

This bothers me. As a statistician, I’m attuned to uncertainty, and as a social scientist and student of social science, I’m strongly aware of how much we learn from anomalies. I’m with Popper and Lakatos and Jaynes on this one: Build strong models, make big assumptions, issue strong statements, and then when (inevitably) you’re proven wrong, when you’re embarrassed in front of the world, own your errors and figure out what went wrong in your reasoning. Science is self-correcting—but only if we self-correct.

Richard Epstein is a buffoon; the less said about him, the better. Sunstein bothers me more, for the same reason that David Brooks bothers me: on one hand, they make strong and often obnoxious statements and then later don’t go back and admit they were wrong; on the other, this is entirely in contrast to their stated doctrines (Sunstein’s shtick about cognitive illusions; Brooks’s shtick about humility).

I can’t imagine I’m going to change Sunstein’s beliefs or his behavior. He seems to have a combination of the advocate’s approach of never acknowledging a counterargument, along with the journalist’s attitude that yesterday is gone and we should look toward tomorrow. And I’ve already tried and failed with Brooks (including lots of polite emails, once upon a time). So you can spare me the honey-works-better-than-vinegar advice. My audience right now is you, not them.

So, in all sincerity: I’m bothered and I’m angry. Epstein and Sunstein have some social influence. They could do better. I don’t agree with all their political positions, but that’s a separate point. They could argue their cases in a way that allows for understanding and discovery, rather than in a closed way in which they never examine their own adjustments, they never face their own anomalies. They’re preening to the world, and they should spend some time holding their arguments up to a mirror.

A solid argument could be made that I should shut up about this—not because it’s boring, not because I’m being mean, but because by giving good advice to Epstein and Sunstein, I’m actually giving them the opportunity to be better thinkers . . . and thus to do more damage. Maybe we’re actually better off that the foremost proponent of nudging keeps embarrassing himself, as this discredits that policy a bit.

Ultimately, though, as a statistician and social scientist, I have some sympathy for Epstein and Sunstein in their quests for evidence-based policy. Not in all the details that they can’t seem to get right, nor in the attitude that we should follow the guidance of rich people and celebrity law professors, but in the Bill Jamesian idea that we can do better through systematic analysis of the social-science variety.

It’s ventin’ time

OK, now that we got the sincerity out of the way . . . I made the mistake of reading Cass Sunstein’s latest column, “Why Coronavirus (and Other) Falsehoods Are Believable,” which states:

The broader phenomenon is something that psychologists call “truth bias”: People show a general tendency to think that statements are truthful, even if they have good reason to disbelieve those statements. If, for example, people are provided with information that has clearly been discredited, they might nonetheless rely on that information in forming their judgments. . . .

OK, fine. But now we’ll get some of Sunstein’s special expertise:

The underlying problem goes by an unlovely name: “meta-cognitive myopia.” The basic idea is that people are highly attuned to “primary information” . . . By contrast, we are less attuned to “meta-information,” meaning information about whether primary information is accurate. . . .

Always good to get a spoonful of jargon with our factoids. The jargon serves a similar function to the stuff in the toothpaste that gives you that tingly feeling when you brush: it has no direct function, but it conveys that it’s doing something.

Sunstein then describes the result of a recent psychology experiment.

What’s frustrating here is that Sunstein does not give any specific examples of erroneous evidential claims that people have believed, even after they’ve been refuted. So I thought I could help out.

Here’s a false claim that got spread around the world:

That’s pretty wrong, actually. As we discussed here, that passage exhibits the scientist-as-hero fallacy, neglect of variation, and a piranha violation—along with the even simpler error that it’s reporting some experiments that never existed.

But if you put the phrase “another Wansink (2006) masterpiece” in a bestselling book, people will keep remembering it—even after the work in question has been refuted.

Here’s another false claim that got some attention:

Here’s another claim, albeit one that hasn’t been disproved, at least not yet:

Then there was this:

Knowing a person’s political leanings should not affect your assessment of how good a doctor she is — or whether she is likely to be a good accountant or a talented architect. But in practice, does it? Recently we conducted an experiment to answer that question. Our study . . . found that knowing about people’s political beliefs did interfere with the ability to assess those people’s expertise in other, unrelated domains.

The study in question never said anything about doctors, accountants, architects, or any professional skills. Shoot . . . I hate when that happens. A false or misleading claim gets out there in the national media, the authors don’t correct it, and it can hang around forever.

Or, hmmmm, anybody remember this:

At this stage, no one can specify the magnitude of the threat from the coronavirus. But one thing is clear: A lot of people are more scared than they have any reason to be. . . . Many people will take precautionary steps (canceling vacations, refusing to fly, avoiding whole nations) even if there is no adequate reason to do that. Those steps can in turn increase economic dislocations, including plummeting stock prices.

You spread an idea like that in public, and people might believe it—even after it’s been refuted.

It might seem odd that Sunstein cares so much about stock prices, but then there’s this from last January:

A simple measure of presidential performance takes account of just two variables: approval rating and the Dow. The argument for APDOW, as we might call it, is that public opinion matters, because it captures the wisdom of crowds, and that the performance of the stock market matters, because it provides one measure of how the economy is doing.

A simple measure, indeed. To be fair, this was in the Opinion section of the website, not the News section.

A way forward

This all seems to be a big problem, the idea that falsehoods can circulate even after they’ve been refuted.

I think what we all need is some Harvard professors to nudge us to doing what’s good for ourselves.

So far, our nudges are:

– Follow the dietary advice of Brian Wansink; the man is brilliant.

– Don’t selfishly be scared about the coronavirus. Be public spirited. Think about the stock market.

And, of course:

– Never ever admit that you made a mistake.

If we can all get nudged in that direction, all should be fine with the stock market and coronavirus and everything else in the world. It will all be rainbows and unicorns with no methodological terrorists, no Stasi, no “ill-considered and graceless” bloggers. Just a bunch of happy celebrities giving Ted talks to each other about nudges, life hacks, and all the ways in which ordinary people need to be saved from their own poor judgment.

Sincerity again

What the hell??? Sunstein writes a whole essay on “truth bias” and how falsehoods can stay afloat even after being refuted—and he doesn’t even once refer to his own extensive history in spreading falsehoods. Involuntarily spreading falsehoods—no shame in that: if you write enough things, you’ll make some mistakes—I know I do!—the problem is not in making mistakes, it’s in not admitting the mistakes. It’s hard to learn from your errors if you refuse to acknowledge them. This is a guy who held a high government post, whose whole shtick is that he designs research-based interventions, but (a) he has a track record of believing bad research, and (b) he doesn’t seem to go back and correct his mistakes or, even more importantly, figure out what went wrong with the reasoning that let him get fooled in the first place. I don’t want the government to put this guy in charge of our soup bowls—or our coronavirus policy.

26 Comments

  1. Kien says:

    I thought Bloomberg was supposed to fact-check claims made by its columnists. If they don’t, maybe they could indicate upfront that the article may contain claims that have not been verified.

    • Andrew says:

      Kien:

      I don’t know about Bloomberg, but when I contacted David Brooks and the New York Times about errors in published op-eds, I got the impression that the newspaper only holds itself responsible for factual claims in their news pages, not their opinion columns.

  2. jim says:

    I love the calorie label thing! That’s great, man, if we could all just believe in shit like that, wouldn’t the world be a better place?

    What if we just spell “cancer” with a “k”? Wow!! Instant remission!!

    And if we just all recognized our cognitive biases, maybe our dead covid friends will come back!

  3. Yan Zhang says:

    My model of the problem: The problem here is cultural. Admitting you are wrong is “weak.” Weak means you aren’t “strong.” And people who aren’t “strong” don’t end up in the positions they do (or at least get outcompeted by people who do). Why would a “winner” like Trump trust you? It seems that cultures everywhere have this as a very fundamental thing (for the PRC CCP, if you make a single mistake, not only are you wrong, it makes all your previous actions wrong =D), so that’s not going to change. The only ones who will positively reward admitting wrong are those people who took the time to learn this set of values, and most of the world just naturally plays by a different set of values. Fundamentalists (on both ends of the political spectrum) also know this, so they’ll never admit they are wrong, since that’s where power comes from.

    Constructive (?) ideas: I guess I’ve gotten cranky about this over a long time, but I really believe now ideals are not enough, we need to learn from economists and align incentives. Force published papers to have predictions. If those predictions are wrong (or right), the institutional memory of that paper should keep track of this, and the person should be rewarded/punished accordingly (knowing that in the long run, it should average out, even though good papers will obviously get unlucky sometimes). Otherwise academic papers are just going to be Sybil attacked to death by unreplicable things with no power. Maybe one can start with an ArXiV overlay with a “replicability” index, and reward negative results? The idea is to take the good parts of systems with feedback loops, such as industry and finance, where many such things are actually eradicated because bad ideas lose money (and their problem is more in long-term alignment).

    • Andrew says:

      Yan:

      Perhaps there’s some disincentive for being publicly and obnoxiously wrong. Consider Epstein and Sunstein: these two guys will keep their tenured professorships and speaking gigs, they’ll continue to have plenty of rich friends, but there’s now a large chunk of people who won’t take them seriously for a long time, if ever. They’ve lost some influence. Being ideological is one thing. Being ideological and foolish is another.

      • Yan Zhang says:

        Andrew:

        I’m always surprised at how *little* clout people seem to lose when they’re publicly and obnoxiously wrong. I hope the chunk you describe is actually big! (not that I’ve done a better study than you on this =D)

        As a point for you and against me: at least on the Twittersphere their takedown has been pretty strong in the circles of people who seem to be getting COVID-19 right. (relatedly, one of the main people involved in the takedown is Harry Crane, who wrote some things that are very in line with the predictability thing I talked about in the previous document. I just did a review of one on Twitter for https://www.researchers.one/media/documents/122-m-FPP-resone.pdf , which relates to the idea of changing academia to make predictions, at least in science).

        Keep up the good fight.

        • Andrew says:

          Yan:

          I clicked on the link. Crane recommends that we switch to upper and lower probabilities. It’s an interesting position. I’d modify his claims slightly—he requires that if you state a probability, “then you must be willing to accept a bet offered on the other side,” but I’d think there have to be some restrictions on that “other side” or else you could be slammed by people with secret knowledge.

          I can’t imagine doing all my statistical analysis using upper and lower probabilities—it’s hard enough doing things using regular probabilities—but I guess that Crane would recommend that I do some sort of post-processing to go from statistical inferences to external probability statements, kind of like what we did for polls here.

          • Yan Zhang says:

            Yes, you probably want to bound the downside to lower information-asymmetry risk. Also, there’s an EV/variance trade, so “you must be willing to accept a bet offered on the other side” is only fair if you gain some EV. Very few people like to eat free variance.

            In summary, the actual implementation needs more work (I may actually bug him about it and see what we come up with =D), but I think the direction is very refreshing (synergizing with something something more appreciative of negative results and replication something something). There may even be an actionable version. In a weird full circle to our original conversation, we need people to lose clout when they’re (repeatedly) wrong, and we need people to gain clout when they’re (repeatedly) right, and right now our only weapons are blog posts like yours and peer review, which, AFAIK from sitting on *both* sides of it, is a black box. (you know how it is =D)

            Thanks for the link! Will check it out when I have time.

    • Martha (Smith) says:

      “Force published papers to have predictions. If those predictions are wrong (or right), the institutional memory of that paper should keep track of this, and the person should be rewarded/punished accordingly (knowing that in the long run, it should average out, even though good papers will obviously get unlucky sometimes). Otherwise academic papers are just going to be Sybil attacked to death by unreplicable things with no power. Maybe one can start with an ArXiV overlay with a “replicability” index, and reward negative results? The idea is to take the good parts of systems with feedback loops, such as industry and finance, where many such things are actually eradicated because bad ideas lose money (and their problem is more in long-term alignment).”

      Interesting idea(s)

      • Yan Zhang says:

        Thank you! Again, I’m going to shill Crane a bit, because he’s actually trying to do something, where I’m just theorycrafting. I hope we can extract something good for academia out of this COVID-19 by being acclimated to doing something different.

  4. Keith says:

    Stuart Ritchie wrote an article that criticizes Sunstein and other psychologist/behavioral-economics writers who cobble together pronouncements on COVID-19 from the hodge-podge of judgment and nudge literature.

    https://unherd.com/2020/03/dont-trust-the-psychologists-on-coronavirus/

    There is a lot in this post that I agree with, as a psychologist. I think those in our field who are qualified to do clinical practice would do best to focus on providing appropriate mental health services.

  5. Bob76 says:

    Andrew wrote “OK, I promise, this will be the last Stasi post ever.”

    Of course, if he posts again on Stasi, he can admit that the above statement turned out to be wrong.

    Bob76

  6. Its not just Cass anymore.. says:

    A whole class of scholars have learned the lessons of success from Cass Sunstein, and now perpetuate them in their own scientific work.

    It is widespread among the new “successful” scholars in areas such as climate psychology/communications & behavioral science writ large, regrettably. Take a look at the “gateway belief model” of (climate) science communication by van der Linden et al. (https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0118489) and Dan Kahan’s response (https://jcom.sissa.it/archive/16/05/JCOM_1605_2017_A03).

    Basically, their premise is that you can nudge people into believing and acting on climate change simply by telling them that there is a 97% consensus among scientists (demand effect much?). Kahan and many others have voiced complaints, both theoretical and about the really flawed stats/methods, which they have ignored and dismissed as baseless because of a so-called “replication.” Lots of op-eds in all kinds of news outlets have come out (they do at least 2 for every publication, I believe). They’ve also now indicated that they’ll only respond to criticism in the form of something written to an academic journal, in order to ensure they get to publish a response in print (for their CVs, of course). Blog posts are “not scientific” to them.

    In spite of many scholarly objections, they’ve turned it into a research enterprise with lots of high impact publications in Nature, PPNAS, etc (and they use your favorite SEM’s with flawed causal interpretations). They put punditry ahead of actually getting the science right. Unwillingness to admit fault is considered a strength, not a weakness.

    I’m worried that we’re just at the tip of the iceberg on this type of science punditry. Many have learned the success that a Sunstein model can offer.

  7. AER says:

    As a lawyer, I think lawyers are particularly vulnerable to these kinds of errors. These guys truly are experts in the law. And the law touches many other topics in lots of subject areas across a huge number of cases. When you are a lawyer, it’s easy to fool yourself–and others–into thinking that you are an expert in anything and everything, not just in the law.

    Of course, if you start admitting you were wrong (even to yourself), the illusion starts to wear thin.

    • Very good, AER. This trend toward laundering uncertainty is most concerning.

      I think special interests prevent experts from showing exceptional integrity and challenging what is patently wrong with the analysis. The reality now is that there are deep crises of knowledge in several domains, crises that go beyond the expertise of any one discipline.

  8. Peter Dorman says:

    I’ve worked in some of the same areas as Sunstein for about two decades and have probably spent too much time reading his work and observing his public persona. He strikes me as an example of a certain type, especially concentrated in academia.

    Behind his particular intellectual commitments is one underlying sense: the vast majority is cognitively flawed, and only a small, elect group, of which he is a member, has the mental firepower to see the true state of things. The practical problem is how to limit the damage caused by “them”. Cost-benefit analysis is a “discipline” that constrains it. Nudges chip away at it. Sunstein’s big struggle is figuring out ways to contain the force of mass idiocy without directly violating the noncoercive strictures he admires in classical liberalism. He’s squaring a circle, how to be a mostly-libertarian, which means granting agency to others (as individuals), while maintaining a smug superiority. He takes delight, which you can see directly in his facial expressions, in the clever ways he does this.

    There are many ways to criticize people like him. Andrew is on to a core contradiction: the “I’m so much smarter than them” principle undermines self-correction and results in Sunstein being more in error than the masses he disdains.

    Also, the toothpaste analogy was a stroke of genius.

    • Peter is right in the main: Cass Sunstein is juggling different values and interests. He wrote Why Societies Need Dissent. I commend his effort b/c it was a theme that my father elaborated extensively in his field, comparative religions & cultures. Dissent is punished in many societies, as we know. The subject was especially poignant for me since I was raised to believe that dissent could be punishable by death in the Middle East and South Asia. That is a pretty scary construct to live with.

      Anyway, I have learned a great deal from his work. I think that he has been able to recast the work of Irving Janis, Tversky & Kahneman, and others in ways that we as eclectic non-conformists have found useful.

      The modern university environment is highly competitive. And to come up with riveting perspectives is a premium. And I think it is much harder to do so. It’s an observation that some academics have shared with me.

    • Martha (Smith) says:

      “Sunstein’s big struggle is figuring out ways to contain the force of mass idiocy without directly violating the noncoercive strictures he admires in classical liberalism. He’s squaring a circle, how to be a mostly-libertarian, which means granting agency to others (as individuals), while maintaining a smug superiority. He takes delight, which you can see directly in his facial expressions, in the clever ways he does this.

      There are many ways to criticize people like him. Andrew is on to a core contradiction: the “I’m so much smarter than them” principle undermines self-correction and results in Sunstein being more in error than the masses he disdains.

      Also, the toothpaste analogy was a stroke of genius.”

      Well put.

  9. Michael Nelson says:

    I’m confused: As recently as August 2019, you wrote “When Sunstein learns he made a mistake, he corrects it.” Your criticism then was that he called people Stasi and that he keeps promoting poor studies, but you gave him credit for admitting mistakes. I think I missed a post in this series? Also, the link you give above for not admitting mistakes is dead (or at least won’t open for me).

    • Andrew says:

      Michael:

      It seems that my memory is not so great! I guess Sunstein admits mistakes sometimes, but not as often as I’d like.

    • Andrew says:

      Michael:

      I guess I’d also distinguish between correcting a mistake (as Sunstein kinda did with his recent op-ed saying that, yes, it’s ok to be scared about coronavirus) and admitting a mistake (which would be Sunstein wrestling with his earlier errors rather than just kind of ignoring that they ever happened). Correcting a mistake is better than nothing, but I think it’s hard to really learn if you don’t look your past mistakes in the eye and figure out what went wrong.

      I’m not always the biggest fan of Nate Silver but I appreciate that, more than once, he’s acknowledged his mistakes, come to terms with them, and learned from the experience. I’d like to see Epstein and Sunstein do this too.

  10. Stephen Olivier says:

    You joke about moving the blog to Twitter, but it ain’t so bad; sometimes it produces gold like this from Kieran Healy:
    https://twitter.com/kjhealy/status/1245418314769235968?s=20

  11. jim says:

    I stumbled across this today, I’m sure you’ve seen it but what the hell, I loved it so much!

    ” Sunstein simply cannot see the world around him for what it is, a failure of inquiry that seems especially ridiculous when you consider the frequency with which he harangues his readers to anchor action and perception in the cold currency of facts rather than mere heuristics and intuitions. “

    https://newrepublic.com/article/154236/sameness-cass-sunstein
