False Positives and False Negatives: Compare How You Fail

“The point of modern propaganda isn't only to misinform or push an agenda. It is to exhaust your critical thinking, to annihilate truth.” ― Garry Kasparov

THINKING TOOL


False positives and false negatives are classifications of error. A false positive is when a test result incorrectly indicates the presence of a condition, such as a disease when no disease is present. A false negative is the opposite: the test indicates the absence of a condition that is actually present. This model is used in statistics, healthcare, computer science, and countless other fields.

You will run into false positives and negatives throughout your life. Criminals will be acquitted, innocent people convicted. Diseases diagnosed in the healthy, illnesses missed in the sick. People falsely identified by their fingerprints, and people unrecognized by their own biometric data. Undoubtedly, you have run into them at the airport. Your keys, belt buckle, loose change, mobile phone, or replaced hip set off the security check. That’s a false positive: the alarm sounds because the system thinks you’re carrying a weapon onto the aircraft. The ratio of false positives to true positives, flagging innocent travelers as threats versus detecting an actual would-be attacker, is incredibly high; the system cries wolf nearly every time. But it’s worth it, because the cost of a false negative, missing a terrorist about to bring a bomb onto the plane, is enormous, while the cost of a false positive is a few minutes of inconvenience.
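The airport trade-off above can be sketched with simple expected-cost arithmetic. All costs, rates, and traveler counts below are made-up illustrative assumptions, not real security statistics:

```python
# All numbers below are illustrative assumptions, not real security data.
COST_FP = 1           # minor delay for one wrongly flagged traveler
COST_FN = 10_000_000  # catastrophic cost of one missed weapon

def expected_cost(fp_rate, fn_rate, positives=10, negatives=1_000_000):
    # negatives: harmless travelers; positives: actual threats (assumed counts)
    return negatives * fp_rate * COST_FP + positives * fn_rate * COST_FN

# A jumpy detector: many false alarms, almost no misses.
jumpy = expected_cost(fp_rate=0.30, fn_rate=0.001)

# A lax detector: few false alarms, but it misses 10% of threats.
lax = expected_cost(fp_rate=0.01, fn_rate=0.10)

print(f"jumpy detector: {jumpy:,.0f}  |  lax detector: {lax:,.0f}")
```

Under these assumptions the jumpy detector, despite annoying hundreds of thousands of travelers, is far cheaper overall, which is why such systems are deliberately tuned toward false positives.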

In the literature, you’ll run into true positives and negatives as well as false positives and negatives. It’s exactly as it sounds. True positives and negatives are correct detections: the hacker is correctly identified and their spamming behavior is shut down; the innocent emailer is let through with their legitimate message. False positives signal issues where none exist. They result in unnecessary actions and interventions, and are most tolerable where a false negative carries high risk. A test might falsely indicate cancer, causing emotional distress and unneeded treatment. A legitimate email might be marked as spam, resulting in miscommunication. A security system can be triggered by harmless activity, wasting resources on a non-threat. Lastly, a false negative misses a real issue. This can be a matter of life and death: misdiagnosing a heart condition delays treatment, letting spam emails through exposes people to phishing and malware, and a burglar or terrorist goes undetected.
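The four outcome types can be sketched in a few lines of code, here using a hypothetical spam filter as the example (the data is invented for illustration):

```python
# Hypothetical spam-filter results: (predicted_spam, actually_spam) pairs.
results = [
    (True, True),    # true positive: spam correctly caught
    (True, False),   # false positive: legitimate email wrongly flagged
    (False, False),  # true negative: legitimate email let through
    (False, True),   # false negative: spam slips past the filter
]

def classify(predicted, actual):
    """Name the outcome of a single prediction."""
    if predicted and actual:
        return "true positive"
    if predicted and not actual:
        return "false positive"
    if not predicted and not actual:
        return "true negative"
    return "false negative"

# Tally how often each outcome occurred.
counts = {}
for predicted, actual in results:
    label = classify(predicted, actual)
    counts[label] = counts.get(label, 0) + 1

print(counts)
```

Every binary test, from airport scanners to medical screenings, produces exactly these four outcomes; the mental model is about weighing their relative costs.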


Real-life implications of false negatives and false positives:

  • Healthcare: a false positive can be a false detection of cancer, resulting in undue stress for the patient; a false negative is a missed early-stage tumor, potentially worsening outcomes by giving it a chance to grow and spread;

  • Finance: a false positive could be a bank’s decision to flag legitimate transactions as fraudulent, resulting in customer inconvenience and loss of reputability; a false negative might be a fraud system that misses unauthorized transactions and leads to lost money for the customers and bank;

  • Technology: a false positive might incorrectly match a person to a database record, resulting in false accusations or privacy violations; a false negative could be a self-driving car that fails to recognize a pedestrian and hits them;

  • Hiring: a false positive is hiring a candidate who turns out to be unqualified, decreasing the team’s performance and wasting resources; a false negative is rejecting a qualified candidate, missing out on the opportunity to hire talent.

How you might use false positives and false negatives as a mental model: (1) evaluate which error, the false positive or the false negative, carries higher risk in your context—such as in criminal justice, where wrongly convicting an innocent person is more serious than letting a guilty person go free; (2) adjust the system’s threshold, trading precision against sensitivity—a more sensitive setting catches more genuine cases but raises false positives, while a more precise setting cuts false alarms but may miss real cases of concern; (3) iterate and refine to reduce both types of errors in your thinking; (4) use the model in your systems and procedures of day-to-day living—like weighing the risks of accepting or rejecting a career offer based on uncertain signals.
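Step (2), trading precision against sensitivity, can be sketched by sweeping a decision threshold over some made-up model scores (both the scores and labels below are invented for illustration):

```python
# Made-up data: model confidence that a condition is present, and ground truth.
scores = [0.1, 0.3, 0.45, 0.6, 0.7, 0.9]
labels = [0,   0,   1,    0,   1,   1]

def errors(threshold):
    """Count false positives and false negatives at a given decision threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return fp, fn

# Lowering the threshold trades false negatives for false positives, and vice versa.
for t in (0.2, 0.5, 0.8):
    fp, fn = errors(t)
    print(f"threshold {t}: {fp} false positives, {fn} false negatives")
```

On this toy data, a low threshold (0.2) produces only false positives, a high threshold (0.8) only false negatives, and the middle splits the difference; which setting is “right” depends entirely on which error costs you more.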

Thought-provoking insights. “It’s better to err on the side of caution” reflects a preference for false positives in high-stakes scenarios such as healthcare and airport security checks. “Absence of evidence is not evidence of absence” highlights the danger of false negatives in areas like security and scientific research. Balance the scales. Different industries and contexts can tolerate different rates of false positives and negatives. It boils down to your priorities, risk aversion, and resource availability. Use this mental model whenever you need to minimize risk and objectively assess a system.

Grab the worksheet and figure out your false positives and negatives.