Survivorship bias


By focusing on those who overcame an obstacle, you forget those who did not. Those who did not are usually not visible.

From *How to Take Smart Notes*:

> One of the most famous figures to illustrate this skill is the mathematician Abraham Wald (Mangel and Samaniego 1984). During World War II, he was asked to help the Royal Air Force find the areas on their planes that were most often hit by bullets so they could cover them with more armour. But instead of counting the bullet holes on the returned planes, he recommended armouring the spots where none of the planes had taken any hits. The RAF forgot to take into account what was not there to see: All the planes that didn’t make it back.
>
> The RAF fell for a common error in thinking called survivorship bias (Taleb 2005). The other planes didn’t make it back because they were hit where they should have had extra protection, like the fuel tank. The returning planes could only show what was less relevant.

This example is also given in *How Not to Be Wrong*.

This is a type of Availability bias.


Tags: Mental Models
Reference: How to Take Smart Notes
Related::