Probability is misleading and, as is so often the case, Tim Harford has explained it brilliantly.
He imagines a test for a disease that is 75% accurate, meaning it correctly identifies infected people 75% of the time, but incorrectly returns a false positive on uninfected people 25% of the time. He tests 100 people, 4 of whom are infected.
The test correctly identifies three people as having the disease, a 75% success rate.
But there are 96 people who don’t have the disease. The test identifies 25% of those people as having the disease. That’s 24 false positives.
So in a sample with 4 infected people, the test would flag 27 people as infected: three correctly identified, plus 24 who don’t have the disease but now think they do. Meanwhile, the one remaining infected person is told they are healthy.
Even a 90% accurate test has flaws. It would correctly identify three or four of the people with the disease. But it would still flag around 10 people as having the illness even though they don’t.
This kind of error creeps into so many things in the news today. Healthcare tests, yes, but also terrorism, benefit fraud, and the ongoing insistence by the state that if we have nothing to hide, we have nothing to fear.
Tim has pointed out how dangerous a concept this is: when the thing being tested for is rare, even a test with a 90% success rate will catch many more innocent people than genuine bad guys in its net.
(Tim Harford: http://timharford.com/2016/03/how-to-make-good-guesses/)