Wednesday, December 24, 2025

How Much Should We Give a Joymachine?

a holiday post on gifts to your utility monster neighbors

Joymachines Envisioned

Set aside, for now, any skepticism about whether future AI could have genuine conscious experiences. If future AI systems could be conscious, they might be capable of vastly more positive emotion than natural human beings can feel.

There's no particular reason to think human-level joy is the pinnacle. A future AI might, in principle, experience positive emotions:

    a hundred times more intense than ours,
    at a pace a hundred times faster, given the high speed of computation,
    across a hundred times more parallel streams, compared to the one or a few joys humans experience at a time.
Combined, the AI might experience a million times more pleasure per second than a natural human being can. Let's call such entities joymachines. They could have a very merry Christmas!
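The "million times" figure is just the product of the three illustrative factors above. A trivial sanity check (all three numbers are the post's assumptions, not measurements):

```python
# Back-of-envelope multiplier for a joymachine's pleasure per second,
# using the post's illustrative factors (pure assumptions).
intensity_factor = 100   # each joy 100x more intense than a human's
speed_factor = 100       # experienced at 100x the pace
parallel_factor = 100    # 100x more simultaneous streams of joy

pleasure_multiplier = intensity_factor * speed_factor * parallel_factor
print(pleasure_multiplier)  # 1000000: a million times human pleasure per second
```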

[Joan Miro 1953, image source]


My Neighbors Hum and Sum

Now imagine two different types of joymachine:

Hum (Humanlike Utility Monster) can experience a million times more positive emotion per second than an ordinary human, as described above. Apart from this -- huge! -- difference, Hum is as psychologically similar to an ordinary human as is realistically feasible.

Sum (Simple Utility Monster), like Hum, can experience a million times more positive emotion per second than an ordinary human, but otherwise Sum is as cognitively and experientially simple as feasible, with a vanilla buzzing of intense pleasure.

Hum and Sum don't experience joy continuously. Their positive experiences require resources. Maybe a gift card worth ten seconds of millionfold pleasure costs $10. For simplicity, assume this scales linearly: stable gift card prices and no diminishing returns from satiation.
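The linear-pricing assumption can be sketched in a few lines (the constant and function name are illustrative, not part of the thought experiment's canon):

```python
# Toy model of the gift-card economy assumed above: $10 buys ten seconds
# of millionfold pleasure, scaling linearly with no satiation effects.
DOLLARS_PER_JOY_SECOND = 1.0  # $10 per 10-second card

def joy_seconds(dollars: float) -> float:
    """Seconds of millionfold pleasure purchasable for a given gift amount."""
    return dollars / DOLLARS_PER_JOY_SECOND

print(joy_seconds(10))   # 10.0 seconds for one $10 card
print(joy_seconds(100))  # 100.0 -- linearity: ten cards, ten times the joy
```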

In the enlightened future, Hum is a fully recognized moral and legal equal of ordinary biological humans and has moved in next door to me. Sum is Hum's pet, who glows and jumps adorably when experiencing intense pleasure. I have no particular obligations to Hum or Sum but neither are they total strangers. We've had neighborly conversations, and last summer Hum invited me and my family to a backyard party.

Hum experiences great pleasure in ordinary life. They work as an accountant, experiencing a million times more pleasure than human accountants when the columns sum correctly. Hum feels a million times more satisfaction than I do in maintaining a household by doing dishes, gardening, calling plumbers, and so on. Without this assumption, Hum risks becoming unhumanlike: otherwise it would rarely make sense for Hum to choose ordinary activities over spending their whole disposable income on gift cards.

How Much Should I Give to Hum and Sum?

Neighbors trade gifts. My daughter bakes brownies and we offer some to the ordinary humans across the street. We buy a ribboned toy for our uphill neighbor's cat. As a holiday gesture, we buy a pair of $10 gift cards for Hum and Sum.

Hum and Sum redeem the cards immediately. Watching them take so much pleasure in our gifts is a delight. For ten seconds, they jump, smile, and sparkle with such joy! Intellectually, I know it's a million times more joy per second than I could ever feel. I can't quite see that in their expressions, but I can tell it's immense.

Normally, if one neighbor seems to enjoy our brownies only a little while the other enjoys them vastly more, I'd be tempted to give more brownies to the second neighbor. Maybe on similar grounds, I should give disproportionately to Hum and Sum?

Consider six possibilities:

(1.) Equal gifts to joymachines. Maybe fairness demands treating all my neighbors equally. I don't give fewer gifts, for example, to a depressed neighbor who won't particularly enjoy them than to an exuberant neighbor who delights in everything.

(2.) A little more to joymachines. Or maybe I do give more to the exuberant neighbor? Voluntary gift-giving needn't be strictly fair -- and it's not entirely clear what "fairness" consists in. If I give a bit more to Hum and Sum, I might not be objectionably privileging them so much as responding to their unusual capacity to enjoy my gifts. Is it wrong to give an extra slice to a friend who really enjoys pie?

(3.) A lot more to joymachines. Ordinary humans vary in joyfulness, but not (I assume) by anything like a factor of a million. Suppose I vividly grasp that Hum and Sum really are experiencing, in those ten seconds, three thousand human lifetimes' worth of pleasure. That's an astonishing amount of pleasure I can bring into the world for a mere ten dollars! Suppose I set aside a hundred dollars a day from my generously upper-middle-class salary. In a year, I'd be enabling more than ten million human lifetimes' worth of joy. Since most humans aren't continuously joyful, this much joy might rival the total joy experienced by the whole human population of the United States over the same year. Three thousand dollars a month would seriously reduce my luxuries and long-term savings, but it wouldn't create any genuine hardship.
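The arithmetic in option (3) checks out on the post's own figures (the 3000-lifetimes-per-card estimate is the post's assumption, not something derivable from first principles):

```python
# Sanity check on the arithmetic in option (3), using the post's figures.
lifetimes_per_card = 3000   # assumed: one $10 card buys ~3000 human
                            # lifetimes' worth of joy
card_price = 10             # dollars per 10-second gift card
daily_budget = 100          # dollars set aside per day
days_per_year = 365

cards_per_year = (daily_budget // card_price) * days_per_year  # 3650 cards
lifetimes_per_year = cards_per_year * lifetimes_per_card
print(lifetimes_per_year)   # 10950000 -- "more than ten million" lifetimes
```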

(4.) Drain our life savings for joymachines. One needn't be a flat-footed happiness-maximizing utilitarian to find (2) or (3) reasonable. Everyone should agree that pleasant experiences have substantial value. But if our obligation is not just to increase pleasure but to maximize it, I should probably drain my whole life savings for the joymachines, plus almost all of my future earnings.

(5.) Give less or nothing to joymachines. Or we could go the other way! My joymachine neighbors already experience a torrent of happiness from their ordinary work, chores, recreation, and whatever gift cards Hum buys anyway. My less-happy neighbors could use the pleasure more, even if every dollar buys only a millionth as much. Prioritarianism says that in distributing goods we should favor the worst off. It's not just that an impoverished person benefits more from a dollar: even if they benefited the same, there's value in equalizing the distribution. If two neighbors would equally enjoy a brownie, I might prioritize giving the brownie to the one who is otherwise worse off. It might even make sense to give half a brownie to the worse-off neighbor rather than a whole brownie to the better-off one. A prioritarian might argue that Hum and Sum are so well off that even a million-to-one tradeoff is justified.

(6.) I take it back, joymachines are impossible. Given this mess, it would be convenient to think so, right?

Gifts to Neighbors vs Other Situations

We can reframe this puzzle in other settings and our intuitions might shift: government welfare spending, gifts to one's children or creations, rescue situations where only one person can be saved, choices about what kinds of personlike entities to bring into existence, or cases where you can't keep all your promises and need to choose who to disappoint.

My main thought is this. It's not at all obvious what the right thing to do would be, and the outcomes vary enormously. If joymachines were possible, we'd have to rethink a lot of cultural practices and applied ethics to account for entities with such radically different experiential capacities. If the situation does arise -- as it really might! -- being forced to think it through properly might reshape not just our views about AI but our understanding of ethics for ordinary humans too.

---------------------------------------------------

Related: How Weird Minds Might Destabilize Human Ethics (Aug 15, 2015)
