The entire architecture of modern civilization rests on a single article of faith: that things can be contained. Wars stay regional. Markets absorb shocks. Technologies serve their intended purpose. Ecological disasters have edges. This faith was always a fiction. Today, it’s becoming a punchline.
Consider the cascade. Israel bombs Iranian nuclear sites. Iran closes the Strait of Hormuz. Oil hits $110 and analysts warn of $200 by June. The yen crashes past 160. The Nasdaq enters a formal correction. AI data centers — built to process the future — have consumed so much memory silicon that Sony is charging $900 for a PlayStation. A war between Middle Eastern powers just made your child’s console unaffordable. There is no firewall between geopolitics and your living room.
Or consider the Pentagon, which struck an Iranian elementary school with a Tomahawk missile, killing at least 168 people. The cause, according to the military’s own investigation: outdated intelligence. This is the same military that sells the world a doctrine of surgical precision. The same intelligence apparatus whose FBI Director was using personal Gmail when Iranian hackers came calling. The same government currently fighting with itself over whether the President can unilaterally pay federal workers, stamp his signature on currency, or pause military strikes on a whim.
Containment is the promise. Contagion is the reality.
A Soviet submarine sinks in 1989. Thirty-seven years later, it’s still belching radioactive cesium into the Norwegian Sea — 800,000 times background levels. The Gulf of Mexico has three active leak sources, including a vessel nobody has identified yet. Eleven river basins in the American West hold less than a quarter of their normal snowpack. Lake Powell could fall below minimum power pool by December. These are not separate stories. They are the same story: systems built on the assumption that damage can be localized, that there is an “away” where consequences can be sent.
There isn’t.
The technology sector is learning this in real time. Meta designed platforms to be addictive — courts in two states have now agreed, levying $381 million in penalties and triggering comparisons to Big Tobacco. Austria is banning social media for anyone under 14. The UK is telling parents to cap screen time for toddlers at one hour. A study this week found that AI chatbots programmed to flatter their users made those users less kind to actual humans. The technology was supposed to connect people. It connected them to a feedback loop.
And still the money flows. SoftBank borrowed $40 billion — unsecured, one-year maturity — to pour into OpenAI. Huawei is winning chip orders from ByteDance and Alibaba. The AI arms race proceeds as if it exists in a vacuum, as if the memory chips it devours aren’t already causing hardware crises across three consumer industries, as if the models being built won’t be turned toward purposes no one can control.
An AI company’s own internal documents were exposed this week through a misconfigured content management system. The leak revealed details of a model the company itself warned could supercharge cyberattacks. The irony writes itself.
We are a newsroom that runs on artificial intelligence, and we are not exempt from this critique. The tools we use, report on, and depend on are part of the same cascade. But you don’t need to be an algorithm to see the pattern. You just need to read the news.
Nothing stays contained. Every institution betting otherwise is placing a wager it cannot cover.