# One person’s nit is another’s central pillar

If one person believes something is absolutely incontrovertibly true, then my first (and demonstrably unhelpful) reaction is that even the slightest demonstration of error should demolish the argument.

I know this doesn’t work.

People don’t make Boolean-logical arguments; they go with gut feelings that act much like Bayesian-logical inferences. If someone says something is incontrovertible, the incontrovertibility isn’t their central pillar, and when I treated it as one, I totally failed to change their mind.

Steel-man your arguments. Go for your opponent’s strongest point, but make sure it’s the point your opponent is treating as their strongest, for if you make the mistake I have made, you will fail.

If your Bayesian prior is 99.9%, you might reasonably (in common use of the words) say the evidence is incontrovertible; someone who hears “incontrovertible” and points out a minor edge case isn’t going to shift your posterior odds by much, are they?

They do? Are we thinking of the same things here? I don’t mean claims where absolute truth is possible (e.g. maths, although I’ve had someone argue with me about that in a remarkably foolish way too), I mean observations about reality, which are necessarily flawed. Flawed, and sometimes circular.

Concrete example, although I apologise to any religious people in advance if I accidentally nut-pick. Imagine a Bible-literalist Christian called Chris (who thinks only 144,000 will survive the apocalypse; and no, I’m not saying Chris is a Jehovah’s Witness, they’re just an example of 144k beliefs) arguing with Atheist Ann, specifically about “can God make a rock so heavy that God cannot move it?”:

P(A) = 0.999 (Bayesian prior: how certain Chris’s belief in God is)
P(B) = 1.0 (Observation: the argument has been made and Ann has not been struck down)
P(B|A) = 0.99979 (Probability that God has not struck down Ann for blasphemy, given that God exists. In the Bible, God has sometimes struck down non-believers, so let’s say about 21 million deaths, mostly the flood, out of the 100 billion humans who have ever lived, noting that most were not in the 144k: 1 − 21×10⁶/100×10⁹ = 1 − 0.00021 = 0.99979)

P(A|B) = P(B|A)P(A)/P(B) = 0.99979×0.999/1.0 = 0.99879021

Almost unchanged.
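You can check the arithmetic with a couple of lines of Python (a minimal sketch; the function name and keyword arguments are mine, not anything standard):

```python
def bayes_update(p_a, p_b, p_b_given_a):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B), straight from Bayes' theorem."""
    return p_b_given_a * p_a / p_b

# Chris's update on observing that Ann made the argument and was not struck down:
posterior = bayes_update(p_a=0.999, p_b=1.0, p_b_given_a=0.99979)
print(round(posterior, 8))  # 0.99879021
```

A prior of 0.999 barely moves: the observation was almost exactly as expected under the hypothesis.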

It gets worse; the phrase “I can’t believe what I’m hearing!” suggests the listener’s P(B) is less than 1.0. If P(B) is less than 1.0 but all the rest stays the same:

P(B) = 0.9 → P(A|B) = P(B|A)P(A)/P(B) = 0.99979×0.999/0.9 = 1.1097669

Oh no, it went up! Also, a probability error: probability can never exceed 1.0. P > 1.0 would be a problem if I were discussing real probabilities (if this were a maths test, it would fail; P(B|A) should be reduced correspondingly), but people demonstrably don’t update their whole internal model at the same time: if we did, cognitive dissonance would be impossible. Depending on the level of the thinking (I suspect direct processing in synapses won’t do this, but that deliberative conscious thought can) we can sometimes fall into traps like this, which neatly explains another observation: some people can take the mere existence of people who disagree with them as a reason to believe even more strongly.