I find the following story to be a fascinating lesson as we deal with daily cybersecurity issues. Survivorship bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions.
During World War II, the military examined planes that returned from bombing missions shot full of holes. The natural conclusion was to add armor to the sections that had been hit, to provide more protection in the future. Statisticians who looked at the damaged planes developed an alternate theory: the damage was most likely distributed evenly across the plane. The holes on returning planes indicated where a plane could be hit without suffering catastrophic failure. The undamaged areas were where catastrophic failure was more likely to occur. Therefore, it is the undamaged areas you need to protect.