I have a new science fiction story out in Clarkesworld (text version, audiocast). One of my philosophical aims in that story is to reflect on the conditions under which it would make sense to upload into a simulated world.
"Mind uploading" is the hypothetical process of copying your mind into a computational device. If futurists such as Ray Kurzweil are correct, we might be only a few decades away from mind uploading as a real technological option.
When mind uploading is presented in science fiction, usually the following are assumed:
(1.) The uploaded mind retains all the important psychological features of the original mind (including, of course, a genuine stream of conscious experiences).
(2.) The uploaded mind experiences a world as rich as the world of the original mind. If the mind is uploaded into a robot, that world might just be the same world as the world of the original mind. If the mind is uploaded into a Sim or a Matrix world -- that is, an artificial environment of some sort -- that artificial environment is usually assumed to be as rich and interesting as the mundane environment of everyday Earth for normally embodied humans.
Under these conditions, if we also assume that uploading has substantial advantages in improving longevity, allowing backups, improving one's cognitive powers, or giving oneself access to a new rich vein of experiences and possibilities, then it probably makes sense to upload, unless one is strongly committed to traditional human embodiment or to projects or relationships that would no longer be possible post-upload.
However, it seems natural to suppose that if uploading does someday become possible, the first uploads will not have features (1) and (2). The first uploaded people (or "people"), even if genuinely conscious, might be seriously psychologically impaired and unable to access a full, rich world of experiences.
There might, however, still be advantages of uploading in terms of longevity and experienced pleasure.
In "Fish Dance", the narrator is presented with the option of uploading his mind under these conditions:
(a.) The world is tiny: a single dance floor, with no input from the larger world outside.
(b.) His mind is limited: he will have some memories from his pre-uploaded self, but he won't fully understand them, and furthermore he won't be able to lay down new memories that last for more than an hour or so.
(c.) His dance-floor experiences will be extremely pleasurable: ideal experiences of dancing ecstasy.
(d.) He will experience this extreme pleasure for a billion years.
Also relevant, of course, are the relationships and projects that he would be leaving behind. (In our narrator's case, a recently deceased child, a living teenage child who wants him to upload, a stale marriage, and an okay but not inspiring career as a professor.)
I say the relationships and projects "he" will leave behind, but of course one interesting question is whether it makes sense to call the uploaded being "him", that is, the same "him" as the narrator.
If it seems obvious to you what one should do under such conditions, the parameters are of course adjustable: We can increase or decrease psychological function, psychological similarity, and quality of memory. We can increase or decrease the size of the world and the range of available input. We can increase or decrease the pleasure and longevity. We can modify the relationships and projects that would be left behind.
You or your descendants might actually face some version of this decision.
----------------------------------------------
"Fish Dance" (Clarkesworld #118, July 2016)
Related blogpost: Goldfish Pool Immortality (May 30, 2014)
13 comments:
I'll tell you one condition that is absolutely non-negotiable: an emergency "jack out" back to meat space or at least the absolute ability to shunt my mind to a remote private server. I don't want to be trapped in the Matrix after some nutjob turns my billion year dance floor paradise into a billion year journey through a Hieronymus Bosch painting.
I think you can no more upload a mind than you can upload a droplet of water.
I mean really, how would you upload a droplet? Well, you know you couldn't - the best you could do is simulate it, and A: you're going to go to a lot more effort simulating the droplet than just having an actual droplet, and B: it's a xerox - it's not an 'upload', it's just a copy.
When the mechanical properties of your brain ARE YOU, how can you upload them any more than you can upload a droplet of water?
But this is the last remnant of common religious thinking to be stripped away by science (ironic, given such 'uploads' are invocations of science) - the belief that there's this 'thing' that can just be punted over to a computer and be there now. The hard problem is manifest in the notion that uploading transports anything (it doesn't even do so when uploading information to the web; it just takes a copy).
Best I can think of is migratory translation - synthetic synapses and original synapses allowed to integrate, with the latter programming the former. Then when the latter dies, perhaps some remnant of the inertia of the person is in the former.
A billion years? Is that in realtime or as subjectively experienced?
A billion years ago, the most advanced life form on Earth was the amoeba. I wouldn't trust anyone to maintain the server for a billion years.
But if it is subjective time, then if the computer is fast enough, it means that, say, twenty years from now in realtime my contract is up and the sysadmin deletes my program. Ten years later, they discover how to do a proper simulation of reality. Bummer.
So, from Marketing 101: there are people who are out for a thrill, for the new, for an adventure.
There's even a name for such consumers.
Just like in traditional scifi, there's a group out there who want to make it out of the solar system first.
This recalls for me Nozick's Experience Machine from 1974.
Are there elements here that lead down a different path than that particular thought experiment?
Thanks for the comments, folks! I'm off touring colleges with my son. An emergency escape seems like a good idea. I'm not so sure about the identity thing, though, Callan. It's a complicated issue -- are you thinking that you need neurons for consciousness (as Searle says) or could there be consciousness that in some sense seems to be "you" but isn't really, or...?
Howie: You might like Mandik's recent "Metaphysical Daring as a Posthuman Survival Strategy" which is about that sort of thing.
Pilot: Arguably, it wouldn't be deceptive, and arguably you could have real communications with others who uploaded (or others outside) depending on the set up.
Eric,
I think you're seeing consciousness as kind of the baby, whereas I see consciousness as the feedback informational bathwater that's a result of the baby/neurons.
So you remove the baby, then the baby is gone of course. The bathwater is no longer generated/soiled by that baby. Sure you can soil it some other way, but that's soiling it another way, obviously.
And good luck on the tour! :)
I am reminded of Orson Scott Card's short story "Fat Farm." If I remember correctly, the main character regularly visits a clinic that somehow copies his mind into the brain of a thinner, younger, healthier clone of himself. At the beginning of the story he goes into the clinic expecting to "transfer" over to the clone, but after the procedure is performed he realizes with terror that he still occupies the same body. The new version of himself blissfully leaves the clinic while the old version is forcibly dragged off to be used for slave labor.
If I could copy my mind over to a machine, I might take some interest in its future — much like I might take interest in the future of my children, if I ever have any — but it's never been clear to me why I should consider it a form of immortality.
Beautiful story, once again, congratulations. I notice that the story complicates the original premise very considerably by having the dancing be enjoyable because it's dancing with somebody he loves - I would say that loving someone requires a lot of your mind/personality, so the dancer can't be too impaired if that survives.
The smallness of the world is also very important, I think. If the dancer were going to be able to interact with real people, then it would feel much less like death. In the story, he is leaving the world, in that he will never see or speak to his wife and son again. And since so much of our identity is socially constructed, that means he's losing a significant part of himself.
Callan: Yes, consciousness is the baby! Or at least one of the babies.
Dylan: Considerations like that lead some philosophers to favor "destructive" uploading so that there's no question of "you" continuing in your original form post upload. It partly depends on what the criteria of identity are and how important strict identity is -- maybe you're already familiar with all of this! I find myself broadly Parfitian, in thinking that strict identity isn't what matters most.
chinaphil: Yes, the story is more complicated than just the dimensions highlighted in the post. I do think the dancer leaves behind some very important things -- perhaps the most important things in his life. And yet (if we accept the general "upload" premise) the amount of pleasure he receives in exchange is vast, and its simplicity is appealing to him. That's what makes it a difficult/interesting choice for him.
Michel: Sorry I missed the approval on your comment earlier! (I was traveling.) Subjectively experienced is the most important thing -- though as a thought experiment, we could make it actual time if we want.
I thought the various concerns of the character in "Fish Dance" coming away in separate slices was you proposing a moderate position, Eric? As in proposing that the sliced brain is perhaps separating the various components of what generates consciousness - slicing up the baby that generates consciousness. Certainly seemed to me like a speculative nod toward the idea that consciousness is just the bathwater.
Also, ahead of their time ('scuse the pun) : https://www.youtube.com/watch?v=JKDtUzRIG6I