Learnings from Carl T. Bergstrom, Professor of Biology at the University of Washington and co-author of Calling Bullshit: The Art of Skepticism in a Data-Driven World.[1]
These insights cover his work in evolutionary biology, information theory, metascience (the science of science), and his extensive public work on combating misinformation.[1]
I. The Nature of Bullshit
1. Brandolini’s Law (The Bullshit Asymmetry Principle)
"The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it."
— Source: Calling Bullshit (attributed to Alberto Brandolini, but popularized and central to Bergstrom’s course).
- Learning: It takes seconds to create a fake graph or false claim, but hours of research to debunk it. This asymmetry is why you cannot debunk everything; you must pick your battles.[1]
2. The Definition of Bullshit
"Bullshit involves language, statistical figures, data graphics, and other forms of presentation intended to persuade or impress an audience by distracting, overwhelming, or intimidating them with a blatant disregard for truth."
— Source: Calling Bullshit Website
- Learning: Bullshit isn't necessarily a lie (lying is defined by falsity).[1] Bullshit is focused on persuasion without caring whether the statement is true or false.[1]
3. "New-School" Bullshit
"New-school bullshit uses the language of math and science and statistics to create the impression of rigor and accuracy. It’s a lot harder to parse than old-school rhetoric."
— Source: Calling Bullshit (Book)
- Learning: We are trained to question rhetoric, but we are trained to defer to numbers. This makes quantitative bullshit (charts, algorithms) more dangerous.
4. Performative Utterance
"Calling bullshit is a performative utterance... When I call bullshit, I am not merely reporting that I am skeptical... I am explicitly and often publicly pronouncing my disbelief."
— Source: Calling Bullshit (Book)
- Learning: It is an act of public rejection of a claim's validity, which serves to reinforce norms of truth-telling in a community.
5. Bullshit vs. Lying
"The liar knows the truth and tries to lead you away from it. The bullshitter doesn't care about the truth; they just want to impress you or sell you something."
— Source: Lecture at UW (Based on Harry Frankfurt's philosophy)
- Learning: Intent matters. A bullshitter is often more dangerous than a liar because they operate completely outside the framework of truth.
6. The "Black Box" Defense
"One of the most effective ways to bullshit an audience is to claim that the complexity of your method prevents you from explaining it."
— Source: Calling Bullshit (Course)
- Learning: If someone cannot explain their data or algorithm in simple terms, they are often hiding something.
7. Intelligence Won't Save You
"You can be smart and still be bullshitted. In fact, smart people are sometimes easier to bullshit because they are good at rationalizing things to fit their existing beliefs."
— Source: Interview with The Guardian
- Learning: Critical thinking is a habit, not just raw processing power.
8. The Goal is Impressiveness
"We tell stories to create impressions of ourselves in the eyes of others."
— Source: Calling Bullshit (Book)
- Learning: Much of the misinformation shared online is not about "informing" others, but about signaling "I am the kind of person who cares about this."
9. Skepticism vs. Cynicism
"Be skeptical, not cynical. Skepticism asks for evidence; cynicism rejects everything."
— Source: Twitter/X
- Learning: Cynicism is a lazy intellectual shortcut. True skepticism requires engagement with the evidence.
10. Hanlon’s Razor
"Never assume malice or mendacity when incompetence is a sufficient explanation, and never assume incompetence when a reasonable mistake can explain things."
— Source: Calling Bullshit (Book)
- Learning: Most "fake news" is actually just sloppy reporting or honest errors, not a grand conspiracy.
II. Data Literacy & Statistics
11. Goodhart’s Law
"When a measure becomes a target, it ceases to be a good measure."
— Source: Calling Bullshit (Book)
- Learning: Once you start managing by a specific metric (e.g., number of citations, standardized test scores), people will game the system to maximize that number, destroying its value as a metric.
12. Selection Bias
"Selection bias is the most pernicious statistical error you will encounter."
— Source: Calling Bullshit (Course)
- Learning: If the data you are looking at wasn't selected randomly, the results are likely skewed. (e.g., "The people who fill out an online poll are not representative of the whole country.")
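The mechanism can be seen in a tiny simulation (all numbers hypothetical): if supporters of a policy are three times as likely to answer an online poll as opponents, a genuinely split population looks like a landslide.

```python
import random

random.seed(42)

# True population: exactly 50% support the policy.
population = [1] * 5000 + [0] * 5000

# Assumed response rates: supporters answer the poll 3x as often.
def responds(opinion):
    return random.random() < (0.30 if opinion else 0.10)

respondents = [p for p in population if responds(p)]
poll_estimate = sum(respondents) / len(respondents)

print(f"true support:  {sum(population) / len(population):.0%}")
print(f"poll estimate: {poll_estimate:.0%}")  # close to 75%, not 50%
```

The poll's sample size is large, yet the estimate is badly wrong: no amount of extra data fixes a non-random sample.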
13. Correlation and Causation
"Correlation doesn't imply causation—but apparently it doesn't sell newspapers either."
— Source: Calling Bullshit (Book)
- Learning: Media headlines love to turn weak correlations into "X causes Y" stories because they are more exciting, even if they are scientifically wrong.
14. The "Too Good to Be True" Rule
"If a result seems too good (or too bad) to be true, it probably is."
— Source: Calling Bullshit (Book)
- Learning: Extraordinary claims require extraordinary evidence. A slight deviation is normal; a massive deviation usually indicates a data error or fraud.
15. Data Visualization Manipulation
"To tell an honest story, it is not enough for numbers to be correct. They need to be placed in an appropriate context."
— Source: Calling Bullshit (Book)
- Learning: You can lie with a "true" graph by manipulating the axes (e.g., not starting at zero) or cherry-picking the time range.
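A quick arithmetic sketch (hypothetical values) shows how much a truncated axis can exaggerate: two bars of 50 and 52 drawn on an axis that starts at 49 differ 3x in height despite a 4% difference in value.

```python
a, b = 50.0, 52.0
axis_start = 49.0  # truncated axis instead of starting at zero

actual_ratio = b / a                                 # 1.04
visual_ratio = (b - axis_start) / (a - axis_start)   # 3.0

print(f"actual ratio: {actual_ratio:.2f}x, apparent ratio: {visual_ratio:.1f}x")
```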
16. The "Flatten the Curve" Graphic
"This graph is changing minds, and by changing minds, it is saving lives... it simplifies and highlights what matters."
— Source: Twitter Thread, March 2020
- Learning: A good data visualization strips away irrelevant complexity to show the mechanism of action. The "flatten the curve" gif was effective because it showed why slowing the spread helped (keeping it under healthcare capacity).
17. Undeserved Weight of Numbers
"Quantitative evidence generally seems to carry more weight than qualitative arguments. This weight is largely undeserved."
— Source: Calling Bullshit (Book)
- Learning: Just because someone put a number on it doesn't mean it's a fact. Numbers are often estimates, proxies, or guesses dressed up as data.
18. P-Hacking
"If you torture the data long enough, it will confess to anything."
— Source: Course Lecture on P-values (Adapting Ronald Coase)
- Learning: Scientists can manipulate their analysis (p-hacking) to find a "statistically significant" result that is actually just a coincidence.
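A minimal sketch of why this works: under the null hypothesis, p-values are uniformly distributed, so a researcher who quietly tests 20 unrelated hypotheses has roughly a 64% chance (1 − 0.95²⁰) of finding at least one spurious "significant" result.

```python
import random

random.seed(0)

def run_studies(n_tests=20, alpha=0.05):
    """True if at least one of n_tests null tests comes up 'significant'."""
    return any(random.random() < alpha for _ in range(n_tests))

trials = 10_000
false_positive_rate = sum(run_studies() for _ in range(trials)) / trials
print(f"chance of >=1 spurious hit in 20 tests: {false_positive_rate:.0%}")
```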
19. Practical vs. Statistical Significance
"Statistical significance is not the same as practical importance."
— Source: Calling Bullshit (Course)
- Learning: A study might prove that a drug works "better" than a placebo with 99% certainty, but if the improvement is only 0.001%, it is practically useless.
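A sketch of the arithmetic, using a hypothetical two-proportion z-test: with ten million patients per arm, an improvement of a tenth of a percentage point is overwhelmingly "significant" yet practically meaningless.

```python
import math

n = 10_000_000                    # patients per arm (hypothetical)
p_placebo, p_drug = 0.500, 0.501  # outcome rates (hypothetical)

pooled = (p_placebo + p_drug) / 2
se = math.sqrt(pooled * (1 - pooled) * (2 / n))
z = (p_drug - p_placebo) / se

print(f"z = {z:.1f}  (|z| > 1.96 means p < 0.05)")
print(f"absolute improvement: {p_drug - p_placebo:.1%}")
```

Huge samples shrink the standard error toward zero, so any nonzero effect, however trivial, eventually clears the significance bar.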
20. Fermi Estimation
"Always check the numbers on the back of a napkin."
— Source: Calling Bullshit (Course)
- Learning: Use rough mental math (Fermi estimation) to see if a claim is even plausible. (e.g., if a headline says 100 million Americans died of X last year, you know it's false because only ~3 million die in total per year).
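The napkin math for that example can be made explicit (all inputs are round Fermi-style guesses): dividing the US population by life expectancy gives a rough ceiling on total annual deaths.

```python
us_population = 330_000_000   # rough
life_expectancy_years = 79    # rough
# Steady-state approximation: deaths/year ~ population / lifespan.
annual_deaths_estimate = us_population / life_expectancy_years

claimed_deaths_from_x = 100_000_000
plausible = claimed_deaths_from_x < annual_deaths_estimate

print(f"estimated total US deaths/year: ~{annual_deaths_estimate / 1e6:.1f} million")
print(f"claim plausible? {plausible}")
```

The estimate (~4 million) is within shouting distance of the true figure (~3 million), which is all Fermi estimation needs: the claim is off by more than an order of magnitude either way.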
III. Science, Metascience, and the "Prestige Economy"
21. The Prestige Economy
"Academia is a prestige economy. Scientists seek status and respond to incentives just like anyone else."
— Source: UW News Interview
- Learning: Scientists are human. They are motivated by publishing in top journals (Nature, Science) to get tenure, which can warp their research priorities.
22. False Canonization of Facts
"Publication bias can lead to the 'false canonization' of facts. If only positive results are published, we might come to believe something is true just because we never see the evidence that it isn't."
— Source: Paper: The canonization of false facts (eLife, 2016)
- Learning: If 20 studies are done and only the 1 successful one is published, the scientific record is distorted.
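A toy simulation (illustrative parameters) makes the distortion concrete: even when a claim is false, a 5% false-positive rate plus a publish-only-positives filter yields a literature that is unanimously "positive."

```python
import random

random.seed(1)

studies = 200
# The claim is false, so every "positive" result is a false positive (rate 5%).
positive = [random.random() < 0.05 for _ in range(studies)]

# File drawer: negative results never get published.
published = [r for r in positive if r]

print(f"studies run:        {studies}")
print(f"positive (by luck): {sum(positive)}")
print(f"published record:   {len(published)} positive, 0 negative")
```

A reader of the published record alone sees only confirmations and no way to tell that ~190 contrary results were never submitted.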
23. The "File Drawer" Problem
"We need to share negative results more than we are doing today."
— Source: eLife Paper
- Learning: Knowing what doesn't work is as important as knowing what does.
24. Science is a Human Endeavor
"Science is not a list of facts; it is a process. And it is a human process, subject to all the biases and flaws of humans."
— Source: Calling Bullshit (Course)
- Learning: Trust the scientific consensus over time, not necessarily every single individual paper.
25. Gamification of Metrics
"The Impact Factor was intended to help librarians decide which journals to buy. Now it is used to decide which scientists to hire. This is a misuse of the metric."
— Source: Eigenfactor.org (Bergstrom's project)
- Learning: Using a journal's reputation to judge an individual paper is a logical fallacy (Ecological Fallacy).
26. Predatory Publishing
"Today, a rudimentary understanding of Web design and a willingness to defraud people is all it takes to become a predatory publisher."
— Source: Calling Bullshit (Book)
- Learning: There are thousands of "fake" scientific journals that will publish anything for a fee. Just because it looks like a paper doesn't mean it was peer-reviewed.
27. Zombie Statistics
"A number, once created, takes on a life of its own."
— Source: Calling Bullshit (Course)
- Learning: False statistics (like "humans only use 10% of their brains") circulate forever because they are catchy, even after they have been debunked.
28. Algorithms Perpetuate Bias
"Machines are not free of human biases; they perpetuate them, depending on the data they’re fed."
— Source: Calling Bullshit (Book)
- Learning: AI is trained on human history. If history was racist or sexist, the AI will be too unless explicitly corrected.
29. The Gender Gap in Science
"Gender disparities appear to be decreasing... but a large-scale analysis reveals understated and persistent ways in which gender inequities remain."
— Source: Paper: Gender differences in the authorship of invited commentaries
- Learning: Even as overt discrimination drops, subtle structural barriers (like who gets invited to write commentaries) keep the gap open.
30. Open Access is Vital
"We need a system where scientific knowledge is a public good, not locked behind paywalls."
— Source: Eigenfactor project mission
- Learning: Taxpayers fund research; they should have access to the results.
IV. Misinformation, Social Media & The Pandemic
31. The Firehose of Falsehood
"The goal of disinformation is often not to convince you of a lie, but to exhaust you so you stop looking for the truth."
— Source: Tweet on Russian Disinformation strategy
- Learning: When bad actors flood the zone with noise (conflicting theories), people check out. This is "censorship through noise."
32. Insidious Confusion
"The idea is to put out so much bad information that people feel as if they can't get to the truth."
— Source: Washington Post Interview
- Learning: Disinformation campaigns (like those during COVID) succeed by creating apathy, not necessarily belief.
33. Think More, Share Less
"When you are using social media, remember the mantra: think more, share less."
— Source: Calling Bullshit (Book)
- Learning: We are all part of the problem. Slowing down your own sharing is the single best thing you can do to clean up the information environment.
34. Social Media is for Bonding, Not Truth
"Participating on social media is only secondarily about sharing new information; it is primarily about maintaining and reinforcing common bonds."
— Source: Calling Bullshit (Book)
- Learning: We share things to say "I'm part of this tribe," which makes us share false things that signal loyalty.
35. Algorithmic Radicalization
"We are each being fed our own reality from these algorithms."
— Source: UNSW Interview
- Learning: Social media feeds are personalized to keep you engaged, which often means showing you increasingly extreme content that confirms your biases.
36. Emotional Headlines
"Successful headlines don't convey facts; they promise you an emotional experience."
— Source: Calling Bullshit (Course)
- Learning: If a headline makes you feel angry, scared, or vindicated, check your emotions before you click share.
37. The "Data Void"
"Search engines are vulnerable to 'data voids'—terms where there is very little content, allowing manipulators to fill the void with propaganda."
— Source: Course discussion on Data Voids (concept by Michael Golebiewski)
- Learning: Manipulators create new terms (e.g., "crisis actor") so that when people Google them, they only see the manipulator's content.
38. Deepfakes and Trust
"The danger of deepfakes isn't just that we will believe fakes, but that we will stop believing reality (the Liar's Dividend)."
— Source: Calling Bullshit (Course)
- Learning: If anything could be fake, guilty politicians can dismiss real incriminating video as "AI-generated."
39. Brandolini's Corollary for Twitter
"Twitter is a machine for generating context collapse."
— Source: Twitter/X
- Learning: Information on social media is stripped of the context (who said it, when, why), making it easy to misinterpret.
40. The Attention Economy
"In an information-rich world, the limiting factor is attention."
— Source: Calling Bullshit (Course) (Referencing Herbert Simon)
- Learning: Bullshit is designed to steal your attention. Refusing to give it attention is the only way to kill it.
V. Evolutionary Biology & Honest Signaling
41. Costly Signaling Theory
"Talk is cheap. Honest signals are usually expensive."
— Source: Paper: "The evolution of costly signaling"
- Learning: In nature (e.g., a peacock's tail), signals are honest because they are costly to produce. A weak peacock simply cannot grow a giant tail.
42. Honesty in Communication
"How can honest information-sharing transpire despite incentives to deceive? This is the central problem of animal communication."
— Source: Research Profile
- Learning: Deception is the norm in nature (mimicry, camouflage). Honesty is the exception that requires specific constraints (like cost or relationship) to exist.
43. Evolution and Medicine
"If evolution is so smart, why am I such a mess? The field of evolution and medicine aims to answer this question."
— Source: Evolution & Medicine Course
- Learning: Our bodies aren't perfect machines; they are a bundle of evolutionary compromises (e.g., back pain is the cost of walking upright).
44. The Arms Race
"We are locked in an evolutionary arms race with pathogens."
— Source: Evolution (Textbook co-authored with Lee Dugatkin)
- Learning: Bacteria evolve faster than we can invent antibiotics. We cannot "win" the war against disease; we can only manage the race.
45. Mimicry
"Deception is a fundamental part of the biological world."
— Source: Evolution (Textbook)
- Learning: Just as a non-poisonous butterfly mimics a poisonous one to survive, humans mimic expertise (using jargon/suits) to survive socially.
VI. Critical Thinking & Education
46. The Purpose of Education
"Nothing that you will learn in the course of your studies will be of the slightest possible use to you, save only this: that if you work hard and intelligently you should be able to detect when a man is talking rot."
— Source: Calling Bullshit (Quoting John Alexander Smith, adopted as the course motto)
- Learning: The ultimate skill of an educated person is bullshit detection.
47. Lateral Reading
"Don't just read the website 'About' page. Open a new tab and search for what others say about that website."
— Source: Calling Bullshit (Course)
- Learning: This technique (from the Stanford History Education Group) is the most effective way to verify sources. Vertical reading (staying on the page) is how you get scammed.
48. Democracy and Truth
"Adequate bullshit detection is essential for the survival of liberal democracy."
— Source: Calling Bullshit (Book)
- Learning: If citizens cannot distinguish truth from lies, they cannot vote in their own interest, and democracy crumbles.
49. Check Your Sources
"Who is telling me this? How do they know it? What are they trying to sell me?"
— Source: Calling Bullshit (Checklist)
- Learning: The three fundamental questions of media literacy.
50. The Scientific Spirit
"Science is not about being right. It's about being less wrong over time."
— Source: Twitter/X
- Learning: Embracing uncertainty and being willing to change your mind when data changes is the core of the scientific worldview.