Most events labelled as unforeseeable were foreseeable — by someone, somewhere, before they happened.
The Convenient Label
The black swan label, applied to low-probability, high-impact events that were supposedly beyond prediction, has become a convenient institutional alibi. The financial crisis was a black swan. The pandemic was a black swan. The infrastructure failure was a black swan. In each case, the label exempts the institutions involved from accountability for failures of anticipation: if the event was genuinely unforeseeable, then failing to prepare for it was not an error of judgement but an unavoidable condition of operating in an uncertain world.
The difficulty with this exemption is that most events labelled as black swans were not genuinely unforeseeable. They were foreseen, by specific analysts, researchers, practitioners, or risk managers who lacked the institutional standing to convert their foresight into institutional action. The financial crisis was predicted in detail by multiple analysts whose warnings were systematically disregarded by the institutions most exposed to it. The pandemic was anticipated by public health experts who had been issuing specific, repeated warnings for years. The infrastructure failure had been flagged in audit reports that were filed and never acted upon.
What Failure of Anticipation Actually Looks Like
The black swan that wasn't is not a failure of prediction. It is a failure of institutional processing — a failure to convert available predictive intelligence into institutional action. This failure has a specific structure: the intelligence existed, it was held by actors with less institutional status than the actors who controlled the resources required to act on it, and the gap between intelligence and action was too large to be bridged by the institutional processes that were supposed to connect them.
This structure — available intelligence, insufficient institutional transmission, absent action — is far more common than genuine unpredictability. And it has a different corrective implication. Genuine black swans require resilience — the ability to absorb and recover from shocks that cannot be anticipated. The black swan that wasn't requires better intelligence transmission — the institutional processes that convert distributed foresight into collective action before the predicted event materialises.
Building Transmission Capacity
Building the institutional capacity to act on foresight before events force action requires solving a specific problem: creating conditions under which junior or peripheral actors who hold valuable predictive intelligence can transmit it, with enough credibility to produce action rather than deferral, to the actors who have the authority to act on it. This is an institutional design problem with known solutions: red team processes, pre-mortem analysis, anonymous reporting channels, and structured contrarian review. These mechanisms are under-implemented precisely because they are most useful when nothing has yet gone wrong.
The black swan that wasn't is the most expensive category of institutional failure — expensive because it was preventable, and labelled unforeseeable specifically because preventing it would have required acting on intelligence the institution chose not to process.
Discussion