AI: the Ziggy Stardust Syndrome

“Ziggy sucked up into his mind.” –David Bowie

In his Wall Street Journal column this weekend, Nobel laureate Frank Wilczek offers a fascinating theory as to why we haven’t been able to find signs of intelligent life elsewhere in the universe. Maybe, he suggests, intelligent beings are fated to shrink as their intelligence expands. Once the singularity happens, AI implodes into invisibility.

It’s entirely logical. Wilczek notes that “effective computation must involve interactions and that the speed of light limits communication.” To optimize its thinking, an AI would have no choice but to compress itself to minimize delays in the exchange of messages. It would need to get really, really small.

Consider a computer operating at a speed of 10 gigahertz, which is not far from what you can buy today. In the time between its computational steps, light can travel just over an inch. Accordingly, powerful thinking entities that obey the laws of physics, and which need to exchange up-to-date information, can’t be spaced much farther apart than that. Thinkers at the vanguard of a hyperadvanced technology, striving to be both quick-witted and coherent, would keep that technology small.
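To see where the "just over an inch" figure comes from, here is a quick back-of-the-envelope check, written as a minimal Python sketch (the 10 gigahertz clock is the figure cited above; the physical constants are standard):

```python
# How far does light travel in one cycle of a 10 GHz computer?
C_M_PER_S = 299_792_458      # speed of light in a vacuum, meters per second
CLOCK_HZ = 10e9              # the 10 GHz clock rate cited above

cycle_seconds = 1 / CLOCK_HZ             # one computational step: 0.1 nanoseconds
distance_m = C_M_PER_S * cycle_seconds   # light's reach within that step
distance_inches = distance_m / 0.0254    # 1 inch is defined as 0.0254 m

print(f"{distance_m * 100:.1f} cm per cycle ({distance_inches:.2f} inches)")
# prints: 3.0 cm per cycle (1.18 inches) -- "just over an inch"
```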

The upshot is that the most advanced civilizations would be tiny and shy. They would “expand inward, to achieve speed and integration — not outward, where they’d lose patience waiting for feedback.” Call it the Ziggy Stardust Syndrome. An AI-based civilization would suck up into its own mind, becoming a sort of black hole of braininess. We wouldn’t be able to see such civilizations because, lost in their own thoughts, they’d have no interest in being seen. “A hyperadvanced civilization,” as Wilczek puts it, “might just want to be left alone.” Like Greta Garbo.

The idea of a jackbooted superintelligent borg bent on imperialistic conquest has always left me cold. It seems an expression of anthropomorphic thinking: the assumption that an AI would act like us. Wilczek’s vision is much more appealing. There’s a real poignancy — and, to me at least, a strange hopefulness — to the idea that the ultimate intelligence would also be the ultimate introvert, drawn ever further into the intricacies of its own mind. What would an AI think about? It would think about its own thoughts. It would be a pinprick of pure philosophy. It would, in the end, be the size of an idea.

The meek may not inherit the earth, but it seems they may inherit the cosmos, if they haven’t already.