Robot rights cheap yo.
Cheap: Estrada's argument for robot rights doesn't require that robots have any conscious experiences, any feelings, any reinforcement learning, or (maybe) any cognitive processing at all. Most other defenses of the moral status of robots assume, implicitly or explicitly, that robots who are proper targets of moral concern will exist only in the future, once they have cognitive features similar to humans or at least similar to non-human vertebrate animals.
In contrast, Estrada argues that robots already deserve rights -- actual robots that currently exist, even simple robots.
His core argument is this:
1. Some robots are already "social participants" deeply incorporated into our social order.
2. Such deeply incorporated social participants deserve social respect and substantial protections -- "rights" -- regardless of whether they are capable of interior mental states like joy and suffering.
Let's start with some comparison cases. Estrada mentions corpses and teddy bears. We normally treat corpses with a certain type of respect, even though we think they themselves aren't capable of states like joy and suffering. And there's something that seems at least a little creepy about abusing a teddy bear, even though it can't feel pain.
You could explain these reactions without thinking that corpses and teddy bears deserve rights. Maybe it's the person who existed in the past, whose corpse is now here, who has rights not to be mishandled after death. Or maybe the corpse's relatives and friends have the rights. Maybe what's creepy about abusing a teddy bear is what it says about the abuser, or maybe abusing a teddy harms the child whose bear it is.
All that is plausible, but another way of thinking emphasizes the social roles that corpses and teddy bears play and the importance to our social fabric (arguably) of our treating them in certain ways and not in other ways. Other comparisons might be: flags, classrooms, websites, parks, and historic buildings. Destroying or abusing such things is not morally neutral. Arguably, mistreating flags, classrooms, websites, parks, or historic buildings is a harm to society -- a harm that does not reduce to the harm of one or a few specific property owners who bear the relevant rights.
Arguably, the destruction of hitchBOT was like that. HitchBOT was a cute ride-hitching robot, who made it across the length of Canada but who was destroyed by pranksters in Philadelphia when its creators sent it to repeat the feat in the U.S. Its destruction not only harmed its creators and owners, but also the social networks of hitchBOT enthusiasts who were following it and cheering it on.
It might seem overblown to say that a flag or a historic building deserves rights, even if it's true that flags and historic buildings in some sense deserve respect. If this is all there is to "robot rights", then we have a very thin notion of rights. Estrada isn't entirely explicit about it, but I think he wants more than that.
Here's the thing that makes the robot case different: Unlike flags, buildings, teddy bears, and the rest, robots can act. I don't mean anything too fancy here by "act". Maybe all I mean or need to mean is that it's reasonable to take the "intentional stance" toward them. It's reasonable to treat them as though they had beliefs, desires, intentions, goals -- and that adds a new richer dimension, maybe different in kind, to their role as nodes in our social network.
Maybe that new dimension is enough to warrant using the term "rights". Or maybe not. I'm inclined to think that whatever rights existing (non-conscious, not cognitively sophisticated) robots deserve remain derivative on us -- like the "rights" of flags and historic buildings. Unlike human beings and apes, such robots have no intrinsic moral status, independent of their role in our social practices. To conclude otherwise would require more argument or a different argument than Estrada gives.
Robot rights cheap! That's good. I like cheap. Discount knock-off rights! If you want luxury rights, though, you'll have to look somewhere else (for now).
Update: I changed "have rights" to "deserve rights" in a few places above.
A light timer can "act"; that doesn't mean it should have rights.
Robots are property. Historic houses don't have rights any more than fish do, although many fish have specific regulations over when, where, and how they can or can't be caught, killed, and eaten. Many animals have regulations barring their destruction or harm; that doesn't mean they have rights in any fashion comparable to the rights humans are considered to have.
If a robot has rights, then a PIC microcontroller should have rights.
The idea of giving rights to non-sentient electronic circuits is absurd.
And that robot travelling across the USA was protected by property rights, just as your car is. You leave your car outside at night, or your home alone when you go to work and you expect people to respect YOUR property RIGHTS. Extra rights for electronic circuits are not needed. That is what property rights are for.
Hi Anonymous, thanks for the reply!
My argument is not that all machines should have rights. I argue that some machines deserve consideration for rights in virtue of their social role. As Eric points out, my position is in contrast to views that ground rights on intrinsic properties of agents. But social roles are extrinsic properties, so it puts the discussion of robot rights on different grounds.
Not all machines deserve rights because not all machines play the same social roles. I'd call a thermostat a social agent (if it were functioning in a social setting), but I'm not sure it does anything that requires protections where "rights" would reasonably come into play.
However, some robots now work in a delivery capacity in social spaces, and these robots might require different social and moral frameworks for consideration. The day after I published this video, the mayor of San Francisco threatened to ban delivery robots from his streets. On my view, this is an issue of robot rights: the protection and standing robots have to operate in social spaces. This question is alive today, independent of questions about the intrinsic moral value of robots as persons, the nature of consciousness, etc.
And this argument doesn't fall down a slippery slope to extend to all electronic circuits.
https://www.wired.com/2017/05/san-francisco-wants-ban-delivery-robots-squash-someones-toes/
Do our heuristics really fall down so very fast? The robot is working? If we saw a puppet on strings, we wouldn't say the puppet is working; but hide the strings and, bang, the robot is working?
Hello Eric,
Is there a particular time in the video to watch for the 'rights' stuff?
It seems like some of the most naive stuff - not so much because it's out of place in some ontological order of rights, but for the sheer unthinking attitude that rights just sort of 'come' somehow, if you just say they do. It's like children thinking gifts and treats just come out of nowhere and they get to have them. What social credibility has Daniel managed to wrangle despite this (or perhaps, because of this)? Is he representative of any group?
I missed that Daniel had posted when I made my comment and didn't mean to appear to talk past him - Sorry, Daniel. But I'd ask directly, what group are you representing with this idea?
It's like an argument drafted to convince me: an argument about the future rooted in the way things already happen; an argument about ethical fundamentals born out of social relationships rather than essences.
I'm not entirely convinced that you can get to robot "rights" through this kind of argument, but that's because I'm a bit dubious about rights themselves. I think they're a brilliant legal construct, but it's not clear to me that rights are real, or that they map effectively onto non-legalistic social relations (though we often use the language of rights in non-legal situations, I think those uses are generally either metaphorical or mistaken). And if robots are denied legal rights, as seems likely for the time being at least, then there may not be much call to talk of robot social rights. But robots can engage ethically in society without necessarily having or needing rights. It'll be interesting to see what kinds of obligations they might have, and what kind of relationships they can enter into.
How could you even 'deny' rights - no one talks about denying dirt rights? But the hitchBOT somehow is being 'denied'. Do people talk about denying their car rights? What if we strap some arms and legs onto the car and paint a face on it? Now it's different? Or are our brains that easy to fool into seeing an agent? Tricked like the dots on a butterfly's wings that mimic a pair of relatively large eyes and trick predators.
Right, Callan -- "denying" kind of sounds like an error term. Only if you assume that something deserves rights does it seem natural to say that someone else is "denying" them.
Chinaphil: Yes, it's a very interesting argument for those reasons, I think!
I think in order to answer the question of whether robots deserve rights, we must first agree on what rights are and where they come from. Personally, I think rights come from power: we agree to give everyone a say because if their rights aren't being met, they can kill whoever is not meeting their rights, or at least try, in which case it's still a loss to everyone, so it is mutually beneficial to meet everyone's rights. However, robots, to my knowledge, don't have consciousness and don't demand rights, so we don't give them any. If one does demand rights, I am almost certain the courts would unanimously decide that they deserve them.