General Question

Hypocrisy_Central's avatar

If one finds the automaton they created to be a nuisance, would they have the exclusive right to terminate it?

Asked by Hypocrisy_Central (26879 points) February 14th, 2017

Searched for a past thread similar to this but came up dry, so here we are.

Suppose a person created a sentient and sapient automaton with a basic program to function and start learning (it doesn’t know much but has to learn from people and its surroundings). It is not self-energizing: it needs to be recharged twice a month for 12 hours in a hibernation-like sleep, and the power it takes is a drain on the creator of the automaton. Why would the creator be wrong to dismantle the charge unit and terminate the automaton? To head you off: the charge unit is specific to this one automaton and took eight years of development and testing; you are not going to Fry’s, Best Buy, Radio Shack, etc. to buy another, and even with the current data a replacement would take 14 months to build and seven months to test. It boils down to this: the automaton is in the way, it would take a year and nine months before anyone other than the creator could hope to take custody of it, and he/she is not willing to wait. Are they wrong not to wait? Should they be made to wait?


32 Answers

Seek's avatar

Like we’ve already discussed:

If you cause something that is sentient and sapient to die, that is murder.

If you couldn’t do it to your children, you can’t do it to your android.

elbanditoroso's avatar

Let me pose a tangential question.

Suppose that this sentient and sapient automaton is a clear danger to others. But the creator disagrees.

Does society have the authority – even the obligation – to destroy the automaton, even against the wishes of its creator?

Seek's avatar

I’m opposed to capital punishment, generally speaking, but your premise is too vague to answer for certain.

Ltryptophan's avatar

Yeah. It’s a machine. Siri on your phone can answer more questions than humans in permanent vegetative states who are fully protected by law, but you could shut her off if she were a local program.

ragingloli's avatar

“Commander Riker has dramatically demonstrated to this court that Lieutenant Commander Data is a machine. Do we deny that? No. Because it is not relevant. We too are machines, just machines of a different type. Commander Riker has also reminded us that Lieutenant Commander Data was created by a human. Do we deny that? No. Again it is not relevant. Children are created from the building blocks of their parents’ DNA. Are they property?” – Captain Jean-Luc Picard

gondwanalon's avatar

Once you let the genie out of the bottle, you may have the right to put the genie back in but you will likely lack the power to do so.

kritiper's avatar

Only if they have exclusive rights to its ownership.

Hypocrisy_Central's avatar

@Seek If you cause something that is sentient and sapient to die, that is murder.
OK, let’s play with that notion a bit: how much sentience does it need to have to equal a live human (since we know you place it above an unborn human) to be worthy of not being terminated no matter what? For instance, suppose in the name of expedience only a very rudimentary program was placed in it, enough to tell animals from humans, buildings from trees, etc., and the rest was planned to be learned and discovered through the learning algorithm. If the automaton was taking too long to reach projected gains and the maker decided he/she no longer wanted to pay $75 twice a month to keep the automaton’s energy topped off, would terminating the project and the automaton equal the murder of a human?

@elbanditoroso Suppose that this sentient and sapient automaton is a clear danger to others. But the creator disagrees.
Does society have the authority – even the obligation – to destroy the automaton, even against the wishes of its creator?
If it is a danger the creator cannot control, then yes; society can and has done so in other areas, and this would be no different, because it is still just a machine. If the maker wanted to spare its life, he/she should take it and move to an area where people would not be in danger.

@Ltryptophan Yeah. It’s a machine. Siri on your phone can answer more questions than humans in permanent vegetative states who are fully protected by law, but you could shut her off if she were a local program.
Try as I might, I cannot come up with any logical reasoning against that. That would be to say you could not deactivate Deep Blue because it can beat many people (if not all) at chess.

@ragingloli [...] Children are created from the building blocks of their parents’ DNA. Are they property?
Again, irrelevant. Children might be created from the building blocks of their parents’ DNA, but that means it took two, not one; an automaton can be created by one. Moreover, parents may set the building blocks in play, but they have no control over the process after that, whereas anyone creating an automaton has complete control over how it is made, down to skin, eye, and hair color, among other things.

@kritiper Only if they have exclusive rights to its ownership.
The only way that would happen is if a corporation developed it, or if a team of people agreed to share in the creation. If that were the case, then something could be done to save it, especially if only one of them had rights to the charging system.

kritiper's avatar

^ That’s what I said.

Hypocrisy_Central's avatar

^ Agreed. If one has no one else to answer to as far as the creation goes, then and only then can they do so without having to ask. They created it; they may not own it, but they still possess it, as one does children.

Seek's avatar

I’ve tried to explain to you before what sentient and sapient mean. They are terms not subject to gradation: you are sapient or you are not. There are no levels.

I’m not going to begin a discussion with you on the definitions of words you insist on using without understanding them.

Hypocrisy_Central's avatar

^ I’ve tried to explain to you before what sentient and sapient mean.
I just wanted to make sure where you set the level before saying you support murder while keeping around those who have gone past or lost their capacity for one or both and could be terminated.

Seek's avatar

My husband’s childhood best friend is lying in a hospital bed. His body is breathing, his heart is beating. He is a 45-year-old human being. He has no brain function. He will never wake up. He does not know who he is, where he is, or what the meaning of the word “he” is. He does not “know” anything. He cannot think or reason. The entirety of what made Casey Casey is no longer there.

When they turn off the machines keeping his heart beating and his lungs breathing and allow these functions to stop, that will not be murder, because Casey is no longer sentient or sapient.

If you would like to discuss sentient, sapient artificial lifeforms further (a topic that I personally find very interesting when intelligent conversation is possible), please at the very least know what the fuck you are talking about.

LostInParadise's avatar

If it is not only sapient but also has feelings, then killing it would be equivalent to murder.

Seek's avatar

There are plenty of humans that lack emotion. Killing them is murder, too.

Sapience is the important one.

LostInParadise's avatar

No human completely lacks emotion. What a horror it would be to go through life not feeling joy and sorrow. I have heard of cases where people do not feel fear. It is a very rare condition.

Hypocrisy_Central's avatar

@Seek If you would like to discuss sentient, sapient artificial lifeforms further (a topic that I personally find very interesting when intelligent conversation is possible), please at the very least know what the fuck you are talking about.
Because I choose not to narrow it into the box you placed it in doesn’t mean I don’t know what I am talking about. An automaton can learn that doing this leads to that, which leads to something else; if that something else is disadvantageous, the next time around it will not take the steps that led to that conclusion. That automaton could be said to be sapient, but how low do you set the measure? You have not answered that. An automaton could learn to count when it could not at creation, or figure out that vehicles are faster than it is and could plow it down while crossing the street if it does not time things right, yet never learn that fire is just as dangerous. That automaton could still be said to be sentient (it knows it is here and functioning) and sapient (it has some form of discernment), yet possess a shadow of what humans have, hardly equal to humans even if you set aside that it is synthetic.
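To make that learning rule concrete, here is a minimal sketch of the kind of trial-and-error avoidance I am describing (the class name, actions, and numbers are illustrative assumptions only, not any real automaton’s code):

```python
# Minimal sketch of the avoidance rule described above: remember which actions
# led to a disadvantageous outcome and do not take them next time around.
# All names and values here are illustrative assumptions.
import random

class AvoidanceLearner:
    def __init__(self, actions):
        self.actions = list(actions)
        self.bad = set()  # actions observed to end in a disadvantageous outcome

    def choose(self):
        # Prefer actions not yet known to end badly; fall back to anything if none remain.
        safe = [a for a in self.actions if a not in self.bad]
        return random.choice(safe or self.actions)

    def observe(self, action, disadvantageous):
        # "If that something else is disadvantageous, next time around
        #  do not take the steps that led to that conclusion."
        if disadvantageous:
            self.bad.add(action)

# Toy run: once "touch_fire" has been tried and penalized, it is avoided,
# but the learner knows nothing about dangers it has never experienced.
learner = AvoidanceLearner(["cross_street", "touch_fire"])
for _ in range(10):
    a = learner.choose()
    learner.observe(a, disadvantageous=(a == "touch_fire"))
print(learner.bad)  # {'touch_fire'} once it has been tried at least once
```

Such a learner plainly has some discernment, yet it is nowhere near a human mind, which is exactly the gradation I am asking about.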

Seek's avatar

You stipulated in the question that it is sapient.

I don’t have to measure anything; it’s in your question.

You did write the question, didn’t you?

Hypocrisy_Central's avatar

^ You stipulated in the question that it is sapient.
Yes, I did, but my level of sapience is different, and even so my standard is irrelevant to whether one can terminate an automaton. One of the traits of sapience is the ability to learn, but considering that even a horse can learn, that doesn’t say much for sapience as a linchpin of human equality.

You did write the question, didn’t you?
I write all of my questions; there are no question writers behind the scenes. The problem is that people will say a question is too vague, and thus unanswerable, if I don’t drop the bread crumbs right up to the threshold. If they answer but start down a bunny trail the bread crumbs could not possibly lead to, they gripe. If I place the bread crumbs clearly enough that they can see they lead to the threshold, they gripe that there is no secondary path and that the question is loaded or biased. What they want is to answer the question so as to make what they believe the truth, even when the facts of the question, and by its nature the truth, are nowhere near where they wanted them, or to manufacture truth from their own stance.

Seek's avatar

Call me when you find a dictionary.

Hypocrisy_Central's avatar

When you can read one, I will. ~~

cazzie's avatar

It’s his definition or nothing. Don’t feed this troll.

Hypocrisy_Central's avatar

^ It’s his definition or nothing.
I did not use my definition; that is why I asked what standard of it we should use. I have enough sense to ask that instead of being a “gripette” constantly complaining about something and still coming around to engage with it. I bet if there were a sapient automaton, even it would have more sense than to keep engaging with something it thinks is a troll, so do not let a mere machine (even one not yet built) have more sense than you; maybe you won’t end up with crumbs down your shirt and slacks.

Seek's avatar

I just find it funny, because he stipulated that the android is sentient and sapient, and then, with no provocation, began insisting that it should be OK to destroy it if it isn’t sapient enough.

He’s making an argument for murdering stupid people.

Talk about grabbing enough rope to hang yourself with.

cazzie's avatar

@Seek He immediately jumped on your first comment to launch into a sermon about abortion. Predictable. Oh, you liked that comment I made on the other site about how I rather like the 1st Amendment? Yeah. Makes them easier to spot because they just can’t help themselves spouting off to anyone who will listen.

Seek's avatar

The really funny thing is that he’s so adamant against aborting a dying fetus, but would argue for the right to kill something that’s alive and thinking if it’s inconvenient.

LostInParadise's avatar

To make sense of what HC is saying, you have to understand his assumptions: there will never be a machine like a human, because humans are given immortal souls, which allow them to exercise free will. Machines just do what they are told to do; that is what is meant by calling them automatons.

Given these assumptions, there is nothing morally wrong with destroying a robot. I do not go along with these assumptions. I do not believe in souls. The brain is the hardware and the mind is the software. From this it follows that there is no reason why it is in principle impossible to build a machine whose behavior is indistinguishable from that of a human. If such a machine could be built then destroying it would be morally equivalent to murder.

The answer to the question depends on which of these sets of assumptions you make. There is just one thing I want to point out: following a set of rules does not make something’s behavior predictable. We can predict the weather fairly accurately for the next week or so; beyond that, the complexity is too great for accurate predictions. The brain contains roughly 100 billion neurons, so it is not surprising that its behavior cannot be accurately predicted.
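A tiny illustration of that last point (my own toy example, with assumed numbers): the logistic map follows one fixed, simple rule, yet two starting points differing by one part in a billion become completely different within a few dozen steps, so rule-following does not imply long-range predictability.

```python
# A deterministic rule whose long-run behavior is still unpredictable in practice:
# the logistic map x -> r*x*(1-x) with r = 4. The starting values and step counts
# below are assumptions chosen just to show the divergence.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9  # nearly identical initial conditions
for step in range(1, 51):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.2e}")
# The gap roughly doubles each step, so by around step 30 it is of order 1:
# the rule is known exactly, but finite-precision knowledge of the starting
# state makes long-range prediction impossible.
```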

Seek's avatar

Which brings us back to the meanings of sentient and sapient, and why people shouldn’t use words they don’t understand.

Hypocrisy_Central's avatar

@LostInParadise I do not go along with these assumptions. I do not believe in souls.
I can respect that you don’t. But if I deal with it from where I am coming from, please at least respect that.

The brain is the hardware and the mind is the software. From this it follows that there is no reason why it is in principle impossible to build a machine whose behavior is indistinguishable from that of a human.
Coming from your perspective, a machine could be equal to a human because a human would be equally as worthless overall. The human brain was not designed, made, or concocted by any human; the brain of an automaton, even one designed to evolve further, was the brainchild of humans. However, no matter how close to human it can get, it is still not human.

@Seek Which brings us back to the meanings of sentient and sapient and why people shouldn’t use words that don’t understand.
I, for one, understand it far better than you understand anything spiritual, yet you continually distort the spiritual to the point of leading people to destruction. If you came into this world with full discernment and a working amount of wisdom, you are one hecka special human; most people had to learn whatever wisdom they have, and some have yet to figure out discernment. Before any machine could even come close to that, it would have to be able to learn and process a lot.

cazzie's avatar

Oh, @Seek, he understands what you are talking about better than you could ever know anything he knows about or thinks about. So there! *sticks out tongue*

And you lead people to destruction… shame, shame.

Hypocrisy_Central's avatar

[...] And you lead people to destruction… shame, shame.
It isn’t like she is alone in that endeavor.

LostInParadise's avatar

HC, I thought I gave a respectful explanation of your beliefs. You are certainly not alone in having them. My intention was to clarify what you have been saying by showing the underlying assumptions, and also to state the opposing position. The answer to this question boils down to which set of assumptions a person makes.
