There’s a thought experiment about teleportation, sometimes called the Star Trek Problem: if there’s no continuity between the source and destination, you’re not moving someone from one place to the other. You’re destroying the source and creating an exact copy at the destination. If you didn’t create a copy at the destination, you’d be killing the person.
So the obvious extension: what if you create two copies?
This question’s on my mind because of last week’s “Simulated” and fellow NaCreSoMo participant Ashley’s “gashley”, along with some longer works I’ve been reading: Becky Chambers’ The Long Way to a Small, Angry Planet and Stefan Gagne’s Floating Point. The difference is that all of those have to do with AIs of some sort rather than humans (or, well, organic sentient beings).
(I should warn you ahead of time that this entire post is just me rambling about an “applied philosophy” of sorts.)
Why’s that interesting? We know how to copy programs, or at least it’s a lot more conceivable that we could do so perfectly. (To balance this, we don’t know how a machine intelligence can be sentient. So we’re missing one of the requirements either way. Still.)
There are philosophical questions about “who’s the original HAL”, or worse, “who’s the real HAL”. I consider these questions generally uninteresting: if the copy is indistinguishable from the original, then either you call the one whose location hasn’t changed “the original”, or they really are indistinguishable and the question has no answer.
No, what’s more interesting is the idea of copying a sentient being, especially if you can pick one of them to be the “original”. Does the copy have rights? Does the copy have the same rights as the original? What if they own property?
What if they have family? In a world where AIs are legally recognized as individuals1, presumably they can form family units, possibly with organic lifeforms, and be legally responsible for another individual. (If not “parent” then surely at least “legal guardian”.) Think about a parent being kept apart from their children. Would that happen to the copy?
And if making a copy is such a nonchalant thing, what about deleting the copy? As sentient beings, their experiences diverge from the moment they split. Is there some way they could reunite, to bring two sets of experiences back into one? This isn’t just memories; it also includes every way your brain changes in response to your experiences.
Is it ethical to make a copy of a life? I can’t say it’s inherently unethical; the act itself really just entails “awakening” a consciousness in a new form, while a similar consciousness continues to exist in some other form. It’s entirely possible the copy will experience some dysphoria with their new form, or that they will have trouble coming to terms with the notion that they are not the original. (As I showed in “Simulated”, I think I personally would be able to handle it, but it would still be a shock.)
What seems more important to me is that you have answers to all of the questions above for your specific case. But that’s really no different than the responsibility you take on for being in charge of any new life, or indeed any life at all. New AI, adoption, birth…if you’re going to be responsible for someone’s life, you have to take that seriously.
AIs lower the barrier to thinking about copying lives. But it still doesn’t seem like a good idea to me, which makes even doing “backups” kind of iffy. Then again, that’s not how humans work: if something’s possible, sooner or later someone will try it. And then we’ll have to deal with the consequences.
This is the last day of this year’s NaCreSoMo. It felt a lot more low-key to me than last year’s, which I think was good for me. I’ll post a wrap-up, um, post tomorrow or later in the week.
And if we’re taking on faith that these AIs really are sentient beings, rightfully recognized as individuals. ↩︎