General Question


22 Answers

cwilbur's avatar

In this case, it’s not murder; it’s suicide. The method is just a more complicated machine than usual.

This is like asking “if a gun commits murder, who is guilty?” The person who pulled the trigger.

If we could establish that the robot was intelligent, the answer might be different.

bgdbeatnik's avatar

A robot without artificial intelligence cannot commit murder. It could be used as a tool in a murder, but it would be the person wielding the tool who would be guilty. It is like asking, “If a gun commits murder, who is guilty?”

squirbel's avatar

So let’s take it out of the context of this particular article and assume the robot has been developed with artificial intelligence. This article was just a starting point for a thought I had :)

Vincentt's avatar

@squirbel – then you could compare it to “if a dog commits murder, who is guilty?” Though if the robot in question has emotions, then I suppose you can blame and punish him (it?).

chaosrob's avatar

Robots can’t have motive, so you can’t make a murder conviction stick. Conceivably, the programmer or owner might have used the robot as a murder weapon, in which case they’d be guilty of something.

FlutherMother's avatar

I would say that if the robot thought of the murder (as in artificial intelligence) and carried it out against the wishes of the victim, it would be a murderer. But considering this was just a mindless machine programmed to pull a trigger, it is the fault of whoever programmed it – in this case, the suicide victim. Have you watched “I, Robot” with Will Smith? Watch it, if you haven’t. It deals with the same issues in a future setting. And the movie wasn’t too bad either.

simone54's avatar

WTF? I just woke up. How the fuck long have I been sleeping? Since when did we have robots?

Sparkie510's avatar

I guess that sometime in the future, when robots inevitably form a much more significant part of society – and are more sophisticated than they are now – the laws will be changed to allow for such a situation. Then what? Would they be destroyed or just re-programmed? Can’t imagine robots in prison!

As FlutherMother said, watch “I, Robot” if you get the chance – it’s a pretty good movie.

Besafe's avatar

Sorry to tell you folks, but our military services already have robotic weapons that are autonomous – I know because I do the system safety engineering on some of the ones in development.

Like any other weapon, they can be used for good or for evil – like money, a robot is a neutral object or technology. Which way they are used depends on the morality and ethics of those who control their use. The real question should be whether they can reproduce themselves and take control at some point, or otherwise take over society. BTW, one of my favorite movies, TRON, might be good to watch – LOL

squirbel's avatar

@Sparkie510: That’s exactly what I was aiming for, thanks for saying it!

It’s almost inevitable that they will form a part of our society – and I’m wondering how [or if] our laws will change.

This is also linked to another question:

Yes, “I, Robot” was a great movie! :)

afghanmoose's avatar

Boston Dynamics’ BigDog – YouTube it. We’ve got the structure; now it needs a brain.

FlutherMother's avatar

And, unfortunately, we are probably going to have the same clashes we have always had over whether or not they are “deserving” of all the basic rights before we change those laws. Deserving is in quotes because it was meant to be sarcastic. We have always subjugated one type of person or another – be it women, minorities, homosexuals – so do you think we will be any different with robots just because they can “feel” or make decisions on their own without someone pushing a button? Or is that another question for our Flutherites? I would be very interested in the outcome when that time comes.

afghanmoose's avatar

the matrix is another perfect movie

lozza's avatar

Great question. It’s going to be an ethical dilemma we face at some point in the future for sure.

I would imagine we will end up with laws that state the inventor/seller/programmer or company director must take overall responsibility. The problem will be when such machines are used in wars. I’ve always thought the Terminator films could turn out to be more of a prediction than pure fictional entertainment.


Perchik's avatar

Well, everyone knows that the First Law of Robotics is that a robot may not harm a human or, through inaction, allow a human to come to harm. Therefore the robot was breaking the law and must be shut down.

Besafe's avatar

We already have liability laws that can be, and have been, enforced against companies or individuals who did not exercise due diligence to ensure their products do not harm another individual – I expect the laws will just be modified and expanded as required.

HoodatWhatzit's avatar

::sigh:: I love Will Smith movies as much as anyone but give credit to Isaac Asimov for posing this question in his story long before Hollywood gave us its version. My own feeling… As long as computers (robots) need us to program them, a computer-caused death would be either accidental or intentional on the part of a programmer.

squirbel's avatar

But isn’t the purpose of an AI to grow and change, and to make decisions on its own – based on a ruleset?

And yes, Asimov (the author of I, Robot) deserves recognition, among many other futurists.

steelmarket's avatar

The question should be, “Does the robot have a value system, and what is its position on fellow sentients?” Does it know (what we term) right from wrong? Courts call this competency.

squirbel's avatar

But that’s not the question I asked. smirk

I understand your aim, however – and your point is valid.

Kraken's avatar

The robot. Send it to robot jail.
