Course: INFO 198 / BIOL 106B. University of Washington
To be offered: Spring Quarter 2017
Credit: 1 credit, C/NC
Enrollment: 160 students
Instructors: Carl T. Bergstrom and Jevin West
Synopsis: Our world is saturated with bullshit. Learn to detect and defuse it.
The course will be offered as a 1-credit seminar this spring through the Information School at the University of Washington. We aim to expand it to a 3- or 4-credit course for 2017-2018. For those who cannot attend in person, we aim to videotape the lectures this spring and make video clips freely available on the web.
Our learning objectives are straightforward. After taking the course, you should be able to:
- Remain vigilant for bullshit contaminating your information diet.
- Recognize said bullshit whenever and wherever you encounter it.
- Figure out for yourself precisely why a particular bit of bullshit is bullshit.
- Provide a statistician or fellow scientist with a technical explanation of why a claim is bullshit.
- Provide your crystals-and-homeopathy aunt or casually racist uncle with an accessible and persuasive explanation of why a claim is bullshit.
We will be astonished if these skills do not turn out to be among the most useful and most broadly applicable of those that you acquire during the course of your college education.
Each of the lectures will explore one specific facet of bullshit. For each week, a set of required readings is assigned. For some weeks, supplementary readings are also provided for those who wish to delve deeper.
Week 1. Introduction to bullshit. What is bullshit? Concepts and categories of bullshit. The art, science, and moral imperative of calling bullshit. Brandolini’s Bullshit Asymmetry Principle.
Week 2. Spotting bullshit. Truth, like liberty, requires eternal vigilance. How do you spot bullshit in the wild? Effect sizes, dimensions, Fermi estimation, and checks on plausibility. Claims and the interests of those who make them. Forensic data analysis: GRIM test, Newcomb-Benford law.
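The two forensic checks named above fit in a few lines of code. As a minimal sketch (function names are illustrative, not from the course materials): the GRIM test asks whether a mean reported to a given number of decimals is arithmetically possible for n integer-valued scores, and the Newcomb-Benford law predicts how often each leading digit should appear in many naturally occurring datasets.

```python
import math

def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """GRIM test: could n integer scores average to reported_mean (as rounded)?

    Meaningful only when 10**decimals > n, so at most one candidate total exists.
    """
    target = round(reported_mean, decimals)
    # The underlying sum of scores must be an integer near reported_mean * n.
    for total in (math.floor(reported_mean * n), math.floor(reported_mean * n) + 1):
        if round(total / n, decimals) == target:
            return True
    return False

def benford_expected(leading_digit: int) -> float:
    """Newcomb-Benford law: expected frequency of a leading digit 1-9."""
    return math.log10(1 + 1 / leading_digit)
```

For example, a mean of 5.19 reported for 28 participants on an integer scale fails the GRIM check (no integer total divided by 28 rounds to 5.19), and Benford's law predicts that roughly 30% of leading digits should be 1.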
Week 3. The natural ecology of bullshit. Where do we find bullshit? Why news media provide bullshit. TED talks and the marketplace for upscale bullshit. Why social media provide ideal conditions for the growth and spread of bullshit.
Week 4. Causality. One common source of bullshit in data analysis arises when people ignore, deliberately or otherwise, the fact that correlation is not causation. The consequences can be hilarious, but this confusion can also be used to mislead. Confusing causality with necessity or sufficiency. Regression to the mean pitched as a treatment effect. Milton Friedman's thermostat. Selection masked as transformation.
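"Regression to the mean pitched as a treatment effect" can be made concrete with a short simulation. This is a hedged sketch with arbitrary illustrative numbers: each person has a stable true skill plus independent measurement noise on each test, and with no intervention at all, the people selected for the worst first-test scores improve on average the second time.

```python
import random

random.seed(0)

# Illustrative model: true skill ~ N(50, 10), independent test noise ~ N(0, 10).
true_skill = [random.gauss(50, 10) for _ in range(10_000)]
test1 = [s + random.gauss(0, 10) for s in true_skill]
test2 = [s + random.gauss(0, 10) for s in true_skill]

# "Enroll" the bottom 10% of test-1 scorers in a do-nothing treatment.
cutoff = sorted(test1)[len(test1) // 10]
worst = [i for i, t in enumerate(test1) if t <= cutoff]

mean1 = sum(test1[i] for i in worst) / len(worst)
mean2 = sum(test2[i] for i in worst) / len(worst)
# mean2 sits well above mean1: the group regresses toward the population
# mean of 50 with no treatment effect whatsoever.
```

Anyone selling an intervention to the worst performers gets this improvement for free, which is exactly why it makes such convincing bullshit.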
Week 5. Statistical traps and trickery. Bayes' rule and conditional probabilities. Base-rate fallacy / prosecutor's fallacy. Simpson's paradox. Data censoring. Will Rogers effect, lead-time bias, and length-time bias. Means versus medians. The importance of higher moments.
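The base-rate fallacy yields to a single application of Bayes' rule. A minimal sketch with made-up numbers (a 1-in-1000 condition, 99% sensitivity, 5% false-positive rate): even after a positive result from a decent-looking test, the probability of actually having the condition stays under 2%, because false positives from the huge healthy population swamp the true positives.

```python
def posterior(prior: float, sensitivity: float, false_positive_rate: float) -> float:
    """P(condition | positive test) via Bayes' rule."""
    # Total probability of testing positive: true positives + false positives.
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Illustrative numbers only: a rare condition, a seemingly accurate test.
p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.05)
# p is roughly 0.019 -- about a 2% chance, not the 95%+ people intuit.
```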
Week 6. Data visualization. Data graphics can be powerful tools for understanding information, but they can also be powerful tools for misleading audiences. We explore the many ways that data graphics can steer viewers toward misleading conclusions.
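One of the simplest steering tricks is the truncated y-axis. A hedged back-of-the-envelope sketch (numbers are arbitrary): the ratio of drawn bar heights depends on where the axis starts, so a 2% difference can be made to look like a factor of three.

```python
def apparent_ratio(a: float, b: float, axis_min: float = 0.0) -> float:
    """Ratio of bar heights as drawn when the y-axis starts at axis_min."""
    return (a - axis_min) / (b - axis_min)

# Values 102 vs 100: with an honest zero baseline the bars look nearly equal...
honest = apparent_ratio(102, 100)
# ...but start the axis at 99 and the first bar is drawn three times taller.
truncated = apparent_ratio(102, 100, axis_min=99)
```

The data are identical in both cases; only the presentation changes, which is what makes the trick so effective.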
Week 8. Publication bias. Even a community of competent scientists all acting in good faith can generate a misleading scholarly record when — as is the case in the current publishing environment — journals prefer to publish positive results over negative ones. In a provocative and hugely influential 2005 paper, epidemiologist John Ioannidis went so far as to argue that this publication bias has created a situation in which most published scientific results are probably false. As a result, it's not clear that one can safely rely on the results of some random study reported in the scientific literature, let alone on Buzzfeed. Once corporate funders with private agendas become involved, matters become all the more complicated.
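The mechanism behind Ioannidis's argument can be illustrated with a small simulation. This is a hedged sketch, not a model from the paper: run many studies of an effect that is truly zero, and let journals publish only the "statistically significant" ones. About 5% of null studies clear the p < 0.05 bar by chance, and if only those appear in print, every published effect is a false positive.

```python
import random
import statistics

random.seed(1)

# 1000 studies of a TRUE NULL effect, each drawing 25 samples from N(0, 1).
published = 0
for _ in range(1000):
    sample = [random.gauss(0, 1) for _ in range(25)]
    # z-statistic for the sample mean (population sd known to be 1).
    z = statistics.mean(sample) * (25 ** 0.5)
    if abs(z) > 1.96:  # the journal accepts only "significant" results
        published += 1

# published lands near 50: roughly 5% of null studies look significant,
# and those are the only ones the literature ever sees.
```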
Week 9. Predatory publishing and scientific misconduct. Predatory publishing. Beall's list and his anti-Open Access agenda. Publishing economics. Pathologies of publish-or-perish culture. Pursuit of PR instead of progress.
Week 10. The ethics of calling bullshit. Where is the line between deserved criticism and targeted harassment? Is it, as one prominent scholar argued, “methodological terrorism” to call bullshit on a colleague's analysis? What if you use social media instead of a peer-reviewed journal to do so? How about calling bullshit on a whole field that you know almost nothing about? Pubpeer. Principles for the ethical calling of bullshit. The Dunning-Kruger effect. Differences between being a hard-minded skeptic and being a domineering jerk.
Week 11. Fake news. Fifteen years ago, nascent social media platforms offered the promise of a more democratic press through decentralized broadcasting and a decoupling of publishing from advertising revenue. Instead, we have ended up with sectarian echo chambers and, lately, a serious assault on the very notion of fact. Not only did fake news play a substantive role in the November 2016 US elections, but recently a fake news story actually provoked nuclear threats issued over Twitter.
Week 12. Refuting bullshit. Refuting bullshit requires different approaches for different audiences. What works for a quantitatively skilled professional scientist won't always convince your casually racist uncle on Facebook, and vice versa.
Exercise 1: A bullshit inventory. How much bullshit are you dealing with, anyway? Keep track of your encounters with bullshit over the course of a week, and come up with a way to visualize your results.