
Crowdsourcing data analysis: Do soccer referees give more red cards to dark skin toned players?

Raphael Silberzahn Eric Luis Uhlmann Dan Martin Pasquale Anselmi Frederik Aust Eli Christopher Awtrey Štěpán Bahník Feng Bai Colin Bannard Evelina Bonnier Rickard Carlsson Felix Cheung Garret Christensen Russ Clay Maureen A. Craig Anna Dalla Rosa Lammertjan Dam Mathew H. Evans Ismael Flores Cervantes Nathan Fong Monica Gamez-Djokic Andreas Glenz Shauna Gordon-McKeon Tim Heaton Karin Hederos Eriksson Moritz Heene Alicia Hofelich Mohr Fabia Högden Kent Hui Magnus Johannesson Jonathan Kalodimos Erikson Kaszubowski Deanna Kennedy Ryan Lei Thomas Andrew Lindsay Silvia Liverani Christopher Madan Daniel Molden Eric Molleman Richard D. Morey Laetitia Mulder Bernard A. Nijstad Bryson Pope Nolan Pope Jason M. Prenoveau Floor Rink Egidio Robusto Hadiya Roderique Anna Sandberg Elmar Schlueter Felix S Martin Sherman S. Amy Sommer Kristin Lee Sotak Seth Spain Christoph Spörlein Tom Stafford Luca Stefanutti Susanne Täuber Johannes Ullrich Michelangelo Vianello Eric-Jan Wagenmakers Maciej Witkowiak SangSuk Yoon and Brian A. Nosek write:

Twenty-nine teams involving 61 analysts used the same data set to address the same research questions: whether soccer referees are more likely to give red cards to dark skin toned players than light skin toned players and whether this relation is moderated by measures of explicit and implicit bias in the referees’ country of origin. Analytic approaches varied widely across teams. For the main research question, estimated effect sizes ranged from 0.89 to 2.93 in odds-ratio units, with a median of 1.31. Twenty teams (69%) found a significant positive effect, and nine teams (31%) observed a nonsignificant relationship. The causal relationship, however, remains unclear. No team found a significant moderation between measures of bias in referees’ country of origin and red card sanctionings of dark skin toned players. Crowdsourcing data analysis highlights the contingency of results on choices of analytic strategy, and increases identification of bias and error in data and analysis. Crowdsourcing analytics represents a new way of doing science; a data set is made publicly available and scientists at first analyze separately and then work together to reach a conclusion while making subjectivity and ambiguity transparent.
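For readers less used to odds-ratio units, here is a minimal sketch of how such an estimate is computed from a 2×2 table. The counts below are invented for illustration and are not the study's data:

```python
def odds_ratio(exposed_events, exposed_nonevents,
               control_events, control_nonevents):
    """Odds ratio from a 2x2 table of counts."""
    odds_exposed = exposed_events / exposed_nonevents
    odds_control = control_events / control_nonevents
    return odds_exposed / odds_control

# Hypothetical counts of red cards vs. no red cards (NOT the study's data):
# dark-skin-toned players: 40 red cards in 1000 player-referee dyads,
# light-skin-toned players: 30 red cards in 1000 player-referee dyads.
estimate = odds_ratio(40, 960, 30, 970)
print(round(estimate, 2))  # 1.35 -- odds about 35% higher in this toy table
```

An odds ratio of 1.0 means no difference between the groups; the reported median of 1.31 corresponds to roughly 31% higher odds of a red card under the teams' typical specifications.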


  1. EJ Wagenmakers says:

    This project nicely showcases the diversity in statistical analyses across the different teams. In other words, model specification *sans priors* is a matter of personal preference and prior knowledge; moreover, the details of such model specifications can influence the results much more than any reasonable specification of prior distributions.
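Wagenmakers's point, that specification choices can move the estimate more than any reasonable prior, can be made concrete with a toy confounding example (all counts invented): whether the analyst pools over a stratifying variable or adjusts for it changes the odds ratio drastically, with no priors involved at all.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (a / b) / (c / d)

# Invented counts in two strata (say, two leagues); within each stratum
# the odds ratio is exactly 1.0, i.e. no skin-tone effect.
strata = [
    # (dark_red, dark_none, light_red, light_none)
    (20, 80, 100, 400),   # stratum 1: odds 0.25 vs 0.25 -> OR 1.0
    (5, 495, 1, 99),      # stratum 2: odds ~0.0101 vs ~0.0101 -> OR 1.0
]
for s in strata:
    assert abs(odds_ratio(*s) - 1.0) < 1e-9

# A specification that ignores the stratum pools the tables instead...
pooled = [sum(col) for col in zip(*strata)]  # (25, 575, 101, 499)
print(round(odds_ratio(*pooled), 2))  # 0.21 -- far from the stratified 1.0
```

This is just Simpson's paradox, but it illustrates the mechanism: a spread like the 0.89–2.93 range across teams can arise from covariate and pooling choices alone.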

    • Keith O'Rourke says:

      Very oddly, this is like science was supposed to be:
      “scientists at first analyze separately and then work together to reach a conclusion while making subjectivity and ambiguity transparent.”

      Except for the absence of “a [all] data set[s] is [are fully] made publicly available”

      And the soft restriction to qualified others.

      I’ll have a look later, but I am curious whether qualified others could credibly identify a subset of better analyses, or do something better such as quantitatively modeling the biases referred to.

  2. Rahul says:

    Isn’t it better to ask: How much more likely are referees to give a red card to a dark skin toned player?

    i.e., rather than a yes/no framing, shouldn’t such questions focus on the size of the effect?

  3. Anders_H says:

    Thanks for posting this, this is just mindblowing. The estimated effect size ranges from 0.89 to 2.93 for a simple question about a correlation? My confidence in published papers that make use of applied statistics just took a major hit (from a low baseline).

  4. Steve Sailer says:

    Interestingly, Brazil’s top soccer player, the fairly light-skinned Neymar, used to be dark-skinned before he got rich. Before and after pictures here:

  5. Eli Rabett says:

    Referees are less likely to do so than individual referees,

  6. Nate Breznau says:

    Inspired by this study, we ran the Crowdsourced Replication Initiative, bringing together 188 researchers in several teams. The results are even more ‘all over the place’, although we can learn some interesting things (see the specification curve in the middle of the APSA poster, and read the executive report for more details if you like).

