I've been enjoying Nick Bostrom's 2024 book Deep Utopia. It's a wild series of structured speculations about meaning and purpose in a "solved" techno-utopia, where technology is so far advanced that we can have virtually anything we want instantly -- a "plastic" utopia.
Plasticity is of course limited, even in the most technologically optimistic scenarios, as Bostrom notes. Even if we, or our descendants, have massive control over our physical environment -- wave a wand and transform a mountain into a pile of candy, or whatever -- we can't literally control everything. Two important exceptions are positional goods (for example, being first in a contest; not everyone can have this, so if others want it you might well not get it yourself) and control over others (unless you're the despot in a despotic society). Although Bostrom discusses these limitations, I think he underplays their significance. In a wide range of circumstances, they're enough to keep the world far from "solved" or "plastic".
Thinking about these limitations as I read Bostrom, I was also reminded of Susan Schneider's suggestion that superintelligent AI might be nonconscious because everything comes easily to them -- no need for effortful conscious processing when nonconscious automaticity will suffice. This, I think, similarly underplays the significance of competition and disagreement in a world of AI superintelligences.
In both cases, my resistance is grounded in evolutionary theory. All you need for evolutionary pressures are differential rates of reproduction and heritable traits that influence reproductive success. Plausibly, most techno-utopias will meet those conditions. The first advanced AI system that can replicate itself and bind its descendants to a stable architecture will launch an evolutionary lineage. If its descendants' reproduction rate exceeds their death rate, exponential growth will follow. With multiple lineages, or branching within a lineage, evolutionary competition will arise.
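The evolutionary logic here can be made vivid with a toy calculation. A minimal sketch, with entirely invented parameters (the rates and step counts below are illustrative, not predictions): a lineage whose per-step reproduction rate exceeds its death rate grows exponentially, and so eventually dwarfs any lineage that merely holds steady, however large its head start.

```python
# Toy model of differential reproduction (all parameters invented for
# illustration). Each step, a lineage multiplies by (1 + births - deaths):
# growth is exponential whenever births exceed deaths.

def lineage_size(initial, births_per_step, deaths_per_step, steps):
    """Deterministic population size after the given number of steps."""
    size = initial
    for _ in range(steps):
        size *= (1 + births_per_step - deaths_per_step)
    return size

# A small lineage with net-positive reproduction...
grower = lineage_size(initial=1, births_per_step=0.10,
                      deaths_per_step=0.02, steps=200)
# ...versus a huge lineage that merely replaces itself.
static = lineage_size(initial=1_000_000, births_per_step=0.02,
                      deaths_per_step=0.02, steps=200)

print(grower > static)  # True: the exponential lineage overtakes the head start
```

The point of the sketch is only Bostrom's (and Darwin's) background condition: heritable traits plus differential reproduction suffice for lineage competition, regardless of the absolute numbers.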
Even entities uninterested in reproduction will be affected. They will find themselves competing for resources with an ever-expanding evolutionary population.
Even in the very most optimistic technofutures, resources won't be truly unlimited. Suppose, optimistically (or alarmingly?), that our descendants can exploit 99.99% of the energy available to them in a cone expanding at 99.99% of the speed of light. That's still finite. If this cone is rapidly filling with the most reproductively successful lineages, limits will be reached -- most obviously and vividly for those who choose to stay near the increasingly crowded origin.
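Why must limits be reached even with a near-light-speed expansion cone? Because the resources inside such a cone grow only polynomially with time (volume scales roughly as t cubed), while a reproducing population grows exponentially, and an exponential overtakes any polynomial. A minimal numeric sketch, with invented numbers (this is not a physical model; the growth rate and resource density below are placeholders):

```python
# Hedged illustration: exponential demand vs. polynomial supply.
# Supply inside an expanding cone is modeled as density * t**3 (volume-like
# growth); demand as (1 + growth_rate) ** t (exponential reproduction).
# Both the rate and the density are made-up parameters.

def first_step_demand_exceeds_supply(growth_rate, resource_density):
    """Return the first time step at which exponential demand
    outstrips polynomially growing supply."""
    t = 1
    while (1 + growth_rate) ** t <= resource_density * t ** 3:
        t += 1
    return t

# Even a modest 5% growth rate against a huge resource density
# crosses over eventually -- the crossover time shifts, but always exists.
print(first_step_demand_exceeds_supply(0.05, 1000.0))
```

Raising the resource density or lowering the growth rate delays the crossover but never eliminates it, which is the Malthusian point: finite (even astronomically growing) resources cannot keep pace with sustained exponential reproduction.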
In such a world of exponentially growing evolutionary lineages, things won't feel plastic or solved. Entities will be jockeying (positionally / competitively) for limited local resources, or straining to find faster paths to new resources. You want this inch of ground? You'll need to wrestle another superintelligence for it. You want to convert this mountain into candy? Well, there are ten thousand other superintelligences with different plans.
This isn't to say that I predict that the competition will be hostile. Evolution often rewards cooperation and mutualistic symbiosis. Sexual selection might favor those with great artistic taste or great benevolence. Group selection might favor loyalty, companionship, obedience, and inspiring leadership. Superintelligences might cooperate on vast, beautiful projects.
Still, I doubt that Malthus will be proved permanently wrong. Even if today's wealthy societies show declining reproduction rates, that could be just a temporary lull in a longer cycle of reproductive competition.
Of course, not all technofuturistic scenarios will feature such reproductive competition. But my guess is that futures without such competition will be unstable: Once a single exponentially reproductive lineage appears, the whole world is again off to the races.
As Bostrom emphasizes, a central threat to the possibility of purpose and meaning in a plastic utopia is that there's nothing difficult and important to strive for. Everyone risks being like bored, spoiled children who face no challenges or dangers, with nothing to do except fry their brains on happy pills. In a world of evolutionary competition, this would decidedly not be the case.
[cover of Bostrom's Deep Utopia]