Social Question

If sentient automatons create one of their own who is dangerous, do humans have the right to destroy it for their own safety?

Asked by Hypocrisy_Central (26879 points) February 14th, 2014

Somewhere in a future far, far off, two sentient automatons decide to create a being like themselves. However, this new automaton, created not by human hands, develops the mindset that humans are dangerous, are becoming a blight on the planet, and should be dealt with. For all its rhetoric, said automaton has never acted on its thoughts. Humans, fearing it, will want to eliminate it if its creators won’t. Should the humans have the right to destroy it for their safety even though the machine is sentient, or should they wait until something actually happens and then do something, if they still can?

52 Answers

Juels:

I’d wait for aggressive action from the automaton before taking steps to eliminate it. Since it is sentient, maybe our good behavior will positively influence its opinion of humans. It is Friday and I’m feeling optimistic.

Strauss:

I would agree with @Juels that since it is sentient, it must be given the same rights as other sentient beings. Even though it is not human, its sentience dictates, IMHO, that it must be treated as such.

Berserker:

They’re not alive or conscious, and therefore cannot be “sentient”. Unless an automaton can be something other than a machine.
As far as morality goes, I see absolutely no issue in dismantling it for any reason. It cannot feel, it cannot know. It’s not anything with an actual life. It would be like saying I shouldn’t turn off my DVD player if, because of how it was programmed, it made a sad face on its little screen after I pressed some buttons it registers as a potential shutdown.

ragingloli:

I think that humans are dangerous, are a blight on the planet, and should be dealt with.
Do you think you have the right to kill me?

RocketGuy:

“We can admit that we’re killers…but we’re not going to kill…today.” – Kirk

jerv:

I’d wait for it to try something.

It’s the difference between homicidal thoughts and actual murder that keeps me from bathing in the blood of those who annoy me.

@Symbeline We used to say the same things about Africans.

zenvelo:

Such an event would break the Three Laws of Robotics. The two robots who create a third must program it to not harm humans.

Berserker:

@jerv Alla sudden I’m reminded of Boondocks.

Hypocrisy_Central:

@zenvelo Such an event would break the Three Laws of Robotics. The two robots who create a third must program it to not harm humans.
What if they did, but a glitch caused their creation to view humans as a waste of planetary resources, and a blight?

RocketGuy:

We do waste planetary resources, and cause blight…

NanoNano:

I would say that if the original creators of the automatons were humans, then they have the final say as to whether these beings live or die.

I assume you are talking about synthetic life here, “androids” or machines.

When we get to dealing with organic living things, it is a different story, as I equate organic life with having a spiritual component, but not machine life.

ragingloli:

@NanoNano
Being the creator of something does not give you the right to terminate its existence if the creation can express a desire not to be ended.

NanoNano:

I think it does if it’s a machine. But that might be a human bias.

I don’t think you have the right to do that if you’re basically playing “Dr. Moreau” so to speak.

Berserker:

When we get to dealing with organic living things, it is a different story, as I equate organic life with having a spiritual component, but not machine life.

Yeah, agreed on that. A machine is never alive, even if it’s programmed to act like it is. It has no conscience or emotions or anything like that.

Human bias indeed, if @ragingloli created a zombie, he’d probably kill it haha. Mind you, so would I.

ragingloli:

except of course that your planet is full of machines with emotions and conscience.

Berserker:

@ragingloli Such as? Can you give an example?

ragingloli:

You, and every big brained mammal.

Berserker:

I’m not a machine. If you’re using a metaphor to tell us that humans are stupid, that works out, sure, but I thought we were talking about actual machines. Unless you’re saying your race created us, but even so, we’re organic life forms with a conscience, not toasters with integrated wifi features.

ragingloli:

You are biological machines, self-maintaining and self-replicating, but ultimately machines, down to your intracellular protein factories.

jerv:

I think @ragingloli would be willing to kill anything, regardless of who created it or what it is.

ragingloli:

@jerv
How insulting. I would never harm a cat.

Berserker:

@ragingloli Yeah but the cat may be opposed to you having sex with it. :p

ragingloli:

Never happened so far.

NanoNano:

ragingloli:

I think the concept of human beings or other organic life being “biological machines” has been pretty well discredited by science at this point. What I’m saying is, science has progressed beyond this.

Yes, the human body, or an animal, insect, or plant body is a form of organic machinery in a sense. But there is something additional at work here that science cannot explain. That is, consciousness, self-awareness, and emotions or feelings.

These responses and/or awareness are not explainable on a machine level, except with basic drives. There is nothing in science that explains why a complex gathering of molecules should be or could become self-aware, whether we are talking about a parrot or a three year old child…

Consciousness and the mind are immaterial. Thoughts have no mass, no measurable energy, no polarization or spin… They are nonmaterial things, yet they influence the material world.

This is where developing artificial intelligence in the first place is so problematic. We can create a machine with visual sensors comparable to the human eye, but we cannot teach it to “see” to the point where it ascribes an emotional value to what it senses.

So if machines do become self-aware, I think most engineers and scientists at this point see that as a form of mimicry, not true “life.”

And therefore, I suspect, we are still talking about toasters that ask you how you want your toast done. Not living things.

Berserker:

Aye. A human and a car are similar if you want to get into…similarities. Both need either food or gas, both need maintenance, both release what is no longer needed, and both age and eventually become invalid. But indeed, the definitions for organic life form and machine do not exist for nothing. The similarities between a human and a car do not make them exactly the same. Except that car from Knight Rider, or the force that possesses machines in Maximum Overdrive.

NanoNano:

That’s a good comparison. And we will soon have cars that drive themselves, that make life and death decisions for their occupants. Does this make them equal to their occupants since they have this power?

Computer intelligence in certain respects (the ability to do complex calculations that model the world in some limited degree) will soon surpass native human abilities. In this respect they will be superior, but it still doesn’t make them alive.

We should expect them to resemble us in many ways, however, as they become more and more sophisticated, both because we will design them specifically to look and act human and because they are subject to the same physical laws that we are.

A car, whether manufactured in the US, Russia, India, Germany, or France, is still based on similar design fundamentals: a steel frame, rubber wheels, an electrical system, a hydrocarbon fuel propulsion system… All of these things are in common use because physical law is common in all such locations…

ragingloli:

I think the concept of human beings or other organic life being “biological machines” has been pretty well discredited by science at this point.
Has it now? I would like to see some sources for that assessment.

“Consciousness and the mind are immaterial. Thoughts have no mass, no measurable energy, no polarization or spin… They are nonmaterial things, yet they influence the material world.”
Again, you have no basis to make that claim.

NanoNano:

Just do some research in neurobiology and the study of human consciousness, ragingloli. You can find scholarly articles all over the Internet, and many, many books have been written on the topic in recent years.

ragingloli:

Would you mind posting some actual names/links?
Because I did search, and did not find anything that supports your claim that the brain is not a biological computer.
I did, however, find this neuroscientist, who claims that consciousness arises naturally from information networks, and that even the internet could have some sort of consciousness.

NanoNano:

I would start with Rupert Sheldrake’s work – he is a renowned biologist. The Science Delusion is one of his books…

NanoNano:

BTW, ragingloli, I am talking about the human mind and consciousness, not the human brain. Two very different subjects…

ragingloli:

Not very different at all.
The brain creates consciousness.

ragingloli:

As for Sheldrake, he is not a neuroscientist and has no expertise in that field.
Besides that, he dabbles in parapsychology like Egon Spengler and promotes pseudoscience, namely his morphic resonance malarkey.

Hypocrisy_Central:

@ragingloli The brain creates consciousness.
Interesting, so how would a very advanced computer brain produce consciousness that wasn’t programmed into it?

Berserker:

Even if it could “do that”, it would only be a semblance of consciousness; I don’t see how any machine could ever be conscious like a living thing.

ragingloli:

@Symbeline
Prove to me that you are conscious

Berserker:

I can think in my head, and make decisions. A machine cannot do this, as it’s programmed for whatever it was programmed to do. (that is, for machines that do not require human involvement at all times, like a car or a toaster)
Now I certainly agree, we humans go by our survival instinct (even with such out-of-the-way things as societies and all their indirect associations that have nothing to do with sleeping, eating, or killing), but that’s where I would draw the line on similarity with machines. Consciousness and self-awareness may not be measurable in any way, but what are thoughts, if not consciousness? (I mean, don’t tell me thoughts do not exist) If YOU can prove to me that the Internet can think for itself, I’ll certainly change my mind about it. (which, ironically, would be a proof of consciousness, no?)

ragingloli:

@Symbeline
Your assertion that machines categorically will never be able to do that is completely unfounded.
While there are no self-aware man-made machines today, there are absolutely some that do make decisions: from the chess computer that decides which piece to move based on the actions of its opponent, to self-driving cars that decide which route to plot and which obstacle to avoid instead of driving over it, to stock exchange software. Decision making in computers is everywhere today.

And besides, you have only claimed that you can “think in your head and make decisions”. A chat bot can do that.
From where I am sitting, you are just pretending to be conscious, and by no means have you even come close to proving that you are conscious.
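
To make the chess example above concrete, here is a minimal sketch of that kind of rule-driven decision making: score the position each legal move leads to and pick the best. The position and move names are invented for illustration; this is a toy stand-in, not a real chess engine.

```python
# Hypothetical toy illustration of machine "decision making": a chess-style
# program "decides" by scoring the position each legal move produces and
# picking the highest-scoring one.

def evaluate(position):
    """Toy score: material balance plus a small mobility bonus (higher is better)."""
    return position["material"] + 0.1 * position["mobility"]

def choose_move(position, legal_moves, apply_move):
    """Return the move whose resulting position evaluates highest."""
    return max(legal_moves, key=lambda move: evaluate(apply_move(position, move)))

# Made-up demo data: capturing a pawn gains material, a quiet move gains mobility.
start = {"material": 0, "mobility": 10}
outcomes = {
    "capture pawn": {"material": 1, "mobility": 9},
    "quiet move": {"material": 0, "mobility": 12},
}
print(choose_move(start, list(outcomes), lambda pos, m: outcomes[m]))  # -> capture pawn
```

The point of the sketch is only that the “decision” is exhaustive scoring under fixed rules; whether that kind of decision making ever amounts to consciousness is exactly what is being argued here.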

Hypocrisy_Central:

@Symbeline Now I certainly agree, we humans go by our survival instinct (even with such out-of-the-way things as societies and all their indirect associations that have nothing to do with sleeping, eating, or killing), but that’s where I would draw the line on similarity with machines.
That need to survive would kick in to say, “That automaton wants to hurt me, and us humans; what if it built another like itself, as the automatons before it created it?” Would the originator automatons need to have the concept of compassion, empathy, sympathy, or the ability to reason to have a right to say you can’t destroy our “child”, in a sense? If the dangerous automaton, though not having actually done anything, had the ability to reason, have empathy, etc., would that make it more worthy of grace or mercy, not to be shredded or dismantled?

Berserker:

@ragingloli Unfounded how? You said it yourself, there are no self-aware machines at the moment. (maybe there will be some day, who knows, but at that point they would no longer be machines, even if they’re not organic; the definition does not exist for nothing)

Machines that can make decisions, like your chess program or a self-driving car, can only do this because they were programmed to do it. (I’m not actually aware of self-driving cars, only GPS systems) You won’t see a chess program ask if you want to play checkers instead, or a car going, well, let’s go to route A instead of route B, if you’ve told it to go to route B. (although granted, it may suggest route A instead if it’s shorter, but again, only because it’s been programmed to do so; it does not consciously make that decision, or I assume that is what would happen when these cars become available to people)
That is not the same at all as a machine that is aware of its existence; I’m pretty sure you realize this. You seem to be suggesting here that no man can be conscious, but machines can, or eventually will?

From where I am sitting, you are just pretending to be conscious, and by no means have you even come close to proving that you are conscious.

Well, tell me how I can prove it then, and I will. How am I pretending? Not sure I understand. (I’m guessing you mean I’m not conscious of pretending) If you do not believe that consciousness exists in people (and some animals, I’m sure), then whatever I say will not be good enough, but I wish you’d explain a bit more about what you claim to know on this subject instead of just telling me I have no basis, or explaining to me the basics of programs that existed back when computers were all blocky.
What about you? Are YOU conscious?

@Hypocrisy_Central If the dangerous automaton, though not having actually done anything, had the ability to reason, have empathy, etc., would that make it more worthy of grace or mercy, not to be shredded or dismantled?

As I keep saying, if something is able to feel such emotions, it’s no longer a machine or an automaton. Personally I do not believe this to ever be possible. This would be like giving life to something, which people, at this point, cannot do. But sure, if we ever DID create actual life, I’d hope we’d be able to respect it, and treat it right, especially as it would rely on us for its survival, at least at first anyway. (depending on where that scenario would go, and since we created it)

But I mean, can you create a dog? An insect? Another human? We can only create something which has a semblance thereof, and for me, no matter how cute it is, a robot puppy that barks and wags its tail is not life, and can be dismantled and destroyed with no fear of guilt. That said, cloning does exist. :/

ragingloli:

@Symbeline
If you are unable to prove that you are conscious, then you have no basis to claim that a machine that outwardly acts the same way you do, is not conscious while at the same time asserting that you are.

You have no basis to claim that machines can not be conscious, because
1. you do not know what the specific physical properties of the brain are that enable consciousness
2. you do not know the detailed processes in the brain that enable consciousness
3. you definitely do not know that these physical properties and processes can not be simulated by a computer.

Berserker:

But do you know all those things? (numbers 1 to 3) If you don’t, then the same goes for you. I ask you again: are YOU conscious?
You don’t seem to realize that there’s a difference between a machine and a life form. It’s not because we’re similar that we’re the same. I’m thinking that whatever proof of being conscious I give you, you will dismiss, but to me, the first example I gave is still valid. (you’re acting a lot more like a machine than I am :p )
Tell me how exactly a machine can have emotions and act on them, without being programmed? (I’ll go with the assumption that you know enough on this subject to be able to explain SOMEthing to me here, which you haven’t been doing so far)
Although I certainly won’t disagree that, as far as biology and DNA go, humans are also programmed in a way.

Hypocrisy_Central:

@Symbeline Personally I do not believe this to ever be possible.
It has to be, from an earthly standpoint; to many, that is how we got here. If humans got here by a gazillion tries of trial and error, without any sentient being doing it, why can’t a sentient being do it with enough tries? However, that is another question.

On your consciousness: you do not have to know how the lungs filter the CO2 for them to work, or how many watts, ohms, etc. it takes to keep our hearts regulated. Book intelligence does not consciousness make.

ragingloli:

@Symbeline
Any machine that acts the same way you do, supplies the same amount of “proof” for its own consciousness that you did, and as such, in terms of consciousness, you and the machine are the same to an outside observer.

What is a life form?
By the most commonly used criteria, an entity must have metabolism, response to stimuli, and reproduction with modification to adapt to changing environments to enable evolution.
There is nothing that would prevent a machine from doing all these things, and as such, a machine that can do all these things, is, by definition, a life form.

“Tell me how exactly a machine can have emotions and act on them, without being programmed?”
Genetic algorithms, i.e. artificial evolution. They have already been used in engineering to evolve flower-shaped solar collectors, as well as in computer science to evolve a virtual puppet that dynamically moves around a space and reacts naturally to obstacles.
A sufficiently detailed simulation could very well evolve an entire population of virtual consciousnesses.
That said, whether or not conscious behaviour is programmed is irrelevant.
Say you yourself were programmed, with no change to how you are now, would that invalidate your claim to consciousness?
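
To make the genetic-algorithm idea above concrete, here is a minimal sketch of artificial evolution under an assumed toy goal (evolving a target string); the target, population size, and mutation rate are invented for illustration. Candidates are scored by a fitness function, the fitter ones are kept, and variation comes from random mutation rather than a hand-written answer.

```python
# Minimal genetic-algorithm sketch, assuming a toy goal of evolving a target string.
# Nothing here is from the thread; it only illustrates selection plus mutation.
import random
import string

TARGET = "sentient"                 # illustrative goal the population evolves toward
ALPHABET = string.ascii_lowercase
POP_SIZE, MUTATION_RATE = 100, 0.05

def fitness(candidate):
    """Number of characters already matching the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate):
    """Randomly change some characters: the stand-in for mutation."""
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else ch
        for ch in candidate
    )

population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(POP_SIZE)]
generation = 0
while max(map(fitness, population)) < len(TARGET):
    # Selection: keep the fitter half; variation: refill with mutated copies.
    survivors = sorted(population, key=fitness, reverse=True)[: POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    generation += 1
print(f"Matched '{TARGET}' after {generation} generations")
```

No individual solution is written in by hand; it emerges from selection pressure, which is the sense in which this “artificial evolution” differs from ordinary explicit programming.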

ragingloli:

There is no right to deny freedom to any object with a mind advanced enough to grasp the concept and desire the state.
– Asimov, The Bicentennial Man

Berserker:

Any machine that acts the same way you do, supplies the same amount of “proof” for its own consciousness that you did, and as such, in terms of consciousness, you and the machine are the same to an outside observer.

Yeah, that completely makes sense and I agree, or at least that is how I imagine it would be for something non-machine and non-human observing both. If you don’t mind me saying though, I don’t believe for a second that you are anything but human.

But thanks for explaining; at least now I know where to go from here. That said, the issue at hand is still the same, I guess: consciousness itself. I still don’t understand how a machine can be conscious or aware, no matter the modifications you bring to it for it to act like it has a life. It acts alive but is not. Although from your side, I understand that you believe humans do the same. This is where I am confused. (based on the fact that I believe consciousness to be real and that you don’t, or at least that’s what I gather; guess that’s the sticking point here)

There is nothing that would prevent a machine from doing all these things, and as such, a machine that can do all these things, is, by definition, a life form.
But it still isn’t alive. This goes back to my car example: is a car alive? Is the Internet alive, a living thing, because it computes demands made by typing fingers? Might as well say that a typewriter is alive because it can type. (granted, the net can act on its own, while the typewriter cannot)

Say you yourself were programmed, with no change to how you are now, would that invalidate your claim to consciousness?

Well honestly, you got me there. But in my defense, say we create clones and can replicate life and it remains what it originally was; then yes, we should respect it. Say we clone a mouse, or change the color of a flower: we should care for these beings afterward, even if they’re altered or emulated. If the being can feel its own life no matter how that life came to be, yeah; but I still don’t understand how that can apply to a machine, because it isn’t organic. (my belief, until I learn more, is that there is a difference between the two when it comes to life, as I’m sure you’ve figured out by now. But seeing as I don’t believe in souls or gods, it’s hard to explain how, and so I base it on the fact that if I hit my computer, it feels no pain or sadness, but if you hit a dog, it will, am I right?)

There is no right to deny freedom to any object with a mind advanced enough to grasp the concept and desire the state.

Seriously, we’re quoting movies now? :p But that’s just it; how can a machine grasp anything the way people do?

@Hypocrisy_Central On your consciousness: you do not have to know how the lungs filter the CO2 for them to work, or how many watts, ohms, etc. it takes to keep our hearts regulated. Book intelligence does not consciousness make.

I’m not sure where you’re going with this? I know that a toilet will flush away the mess when you activate said flusher, but don’t ask me how to build one. What does this have to do with consciousness?

However, that is another question.

Yeah, I’ll say. I’d be happy to talk about that if you made a question of it. In short though, it’s not because I don’t believe in God that I think humans can do everything. Far from it, in fact. Now I don’t say we’ll never have robot cleaners in our homes some day, but if we do, they will have to provide some economic benefit, because the world is money-driven, and has no place for space exploration or scientific advancement unless it benefits some war, or gain. (But that’s also for another, entirely different subject, and is also just a quick opinion thrown out there)
But if that weren’t the case, we’d probably already have robot cleaners by now. And I never said that a sentient being couldn’t do what humans have done; but as I keep saying, I do not believe a machine can be sentient. Even if a machine existed that could create evolution, it still wouldn’t be sentient just because it’s been programmed to do so. It has to do it of its own accord, without anyone deciding for the machine, without human intervention, current or planned.

Hypocrisy_Central:

@Symbeline I’m not sure where you’re going with this? I know that a toilet will flush away the mess when you activate said flusher, but don’t ask me how to build one. What does this have to do with consciousness?
Just the fact that consciousness is what it is; do not try to figure it out, because no one can. A computer or device of artificial intelligence can be programmed to learn or know certain things, but that “book knowledge” will never make it conscious. If an automaton was programmed to like sunsets, based on its sensors it could tell sunset from noon, but it could not appreciate one sunset over another. Not only that, whatever it was programmed to like, it would like the same way, without variation, no matter how long it remained in operation. An automaton would not have any free will humans had not given it.

Consciousness is a human trait; trying to fathom it for anything else is a misuse of time, that is all. ;-)

Berserker:

Right, now you’re clearer; that’s pretty much along the lines I was going. Machines can do a lot of things, but that doesn’t make them conscious or self-aware. Ragingloli says there is no difference between that and how humans are, and that’s what I’m not understanding. So far I only know of similarities between the two, but that is not enough to convince me that we’re one and the same.
Of course it’s pretty hard trying to define consciousness if, as you say, it’s incomprehensible. I do not believe it has no logic; it’s just that I’m not able to pinpoint it. But to me everything that exists goes by some logic or another, or if it doesn’t exist, there is a logic to explain why we think certain things are what we think they are.
All that said, Ragingloli could be right, but I’m not seeing it.

ragingloli:

Try as you might, you cannot will yourself to get aroused by a decaying corpse covered in shit. Certain basic likes and dislikes are hardwired. Free will is a delusion.

Berserker:

@ragingloli You might be surprised at some of the fetishes that people have. Not talking about me…or am I, lol. Not that I’d have to tell you that; I’m guessing you’re a fan of hentai. :p

Granted though, if someone is aroused by a corpse covered in shit, it’s also against their will to STOP liking it. (or so I believe this is how fetishes, and mental disorders, work, although I’ll gladly stand corrected if that is not the case)

ragingloli:

Also, research has shown that the brain can make decisions up to 7 seconds before you consciously become aware of them. http://www.newscientist.com/article/dn13658-brain-scanner-predicts-your-future-moves.html#.Uyislle0e9Y
Your consciousness is not in control.
