
Proof Bias

The rigid belief that only things that can be "proven" according to a narrow and often undefined standard are real. It’s the intellectual sibling of Computational Bias, but focuses on the act of proving rather than the act of measuring. It creates a catch-22 in which the proof demanded is only achievable within the skeptic's own framework: if you can't prove it to their satisfaction, in their language, it doesn't exist. It’s the ultimate tool for dismissing anything inconvenient.
Example: "Despite years of historical documentation, his Proof Bias made him claim the event never happened because we didn't have a video recording from the 1700s."
by Dumu The Void, March 11, 2026

Proof Bias

A bias where one demands “proof” in contexts where absolute proof is impossible or inappropriate, and uses the inevitable failure to provide it as grounds for dismissal. Proof bias often appears in debates about religion, history, or consciousness, where one side demands mathematically certain evidence for claims that can only be probabilistic or experiential. It weaponizes the concept of proof to avoid engaging with reasonable evidence. Proof bias is a close relative of the burden of proof fallacy, but focused on the impossible standard rather than who carries the burden.
Example: “He demanded proof that her childhood trauma affected her adult decisions, as if psychological causality could be demonstrated with mathematical certainty—proof bias, demanding the impossible to dismiss the real.”

Evidence Bias

A bias where one selectively privileges certain kinds of evidence over others based on form rather than substance—typically preferring quantitative, experimental, or “hard” data while dismissing qualitative, experiential, or “soft” evidence as inherently inferior. Evidence bias is common in fields where scientific authority is used to police boundaries: qualitative research is dismissed as “mere anecdote,” personal testimony as “unscientific,” and cultural knowledge as “folklore.” The bias ignores that different questions require different forms of evidence.

Example: “He dismissed her ethnographic fieldwork as ‘just stories’ because it wasn’t a double‑blind study—evidence bias, mistaking one legitimate method for the only legitimate method.”

Slothful Proof Bias

The cognitive error of accepting a convenient, low-effort piece of evidence as definitive proof, while ignoring the mountain of complex, contradictory, or difficult-to-obtain evidence. It’s the mental shortcut that prefers a simple, lazy answer over a complicated truth. This bias allows people to "prove" their point by pointing to a single, easily digestible factoid, a meme, or a headline, while dismissing nuanced studies or expert consensus as "too complicated."
Example: "He 'proved' vaccines were dangerous with one Facebook post about a friend's cousin, totally succumbing to Slothful Proof Bias."

Scientific Proof Bias

A specific form of proof bias that equates “real” knowledge exclusively with what can be proven by scientific methods—particularly those modeled on the physical sciences—and dismisses any other form of evidence or reasoning as inherently invalid. Scientific proof bias treats absence of randomized controlled trials as proof of falsehood, ignores historical, experiential, or qualitative evidence, and often pathologizes those who rely on other knowledge systems. It is the epistemological engine behind scientific bigotry, using “scientific proof” as a gatekeeping tool to exclude non‑Western, indigenous, spiritual, or experiential ways of knowing from serious consideration.
Example: “She presented decades of ethnographic observation; he dismissed it as ‘not scientific proof.’ Scientific proof bias: treating qualitative research as worthless because it doesn’t mimic physics.”

Hasty Proof Bias

A bias where one demands immediate, definitive proof at the very start of an inquiry or discussion, treating the inability to produce instant evidence as proof that the claim is false. Hasty proof bias conflates “not yet proven” with “disproven” and ignores the time, resources, and iterative process required to gather evidence. It is often used to shut down exploration of novel ideas, emerging research, or complex topics that cannot be summarized in a soundbite. In debates, it appears as “if you can’t prove it right now, it’s not true.”
Example: “He asked her about a recent preprint and demanded proof on the spot. When she said the study was still being replicated, he declared ‘so it’s false.’ Hasty proof bias: treating provisional knowledge as debunked.”

Impossible Proof Bias

A form of proof bias where one demands evidence that is, in principle, impossible to provide—such as proof of a negative, absolute certainty, or evidence that would require violating the very phenomenon being studied. The goal is not to be convinced but to create an unattainable standard that ensures the opponent always fails. Impossible proof bias often appears in debates about historical events, subjective experience, or metaphysical claims: “prove you weren’t there,” “prove you’re not dreaming,” “prove God doesn’t exist.” It weaponizes the limits of human knowledge to dismiss any position the biased party wishes to reject.
Example: “He demanded she prove that her childhood trauma actually happened—as if memory worked like a video recording. Impossible proof bias: using unrealistic standards to invalidate lived experience.”

Selective Proof Bias

A bias where one applies rigorous proof standards only to claims they disagree with, while accepting weak or no evidence for claims they favor. Selective proof bias is the hallmark of motivated reasoning: the same person who demands double‑blind studies for acupuncture will accept anecdotal testimonials for their preferred supplement, and the one who insists on “proof of harm” before accepting environmental regulations will accept mere speculation about the economic benefits of deregulation. The bias lies not in the standards themselves but in their inconsistent application.
Example: “He rejected climate models as ‘unproven’ but accepted a single op‑ed as proof that deregulation boosts growth. Selective proof bias: rigor for opponents, credulity for allies.”