Social Question


Are armed drones a violation of the three rules of robotics?

Asked by RedDeerGuy1 (24462 points) March 8th, 2016

That is, the rule that robots must not harm humans.


13 Answers

canidmajor

The 3 Rules of Robotics, as laid out by Asimov, are fiction. They have no bearing on reality.

Hawaii_Jake

Good question. I don’t think it violates that rule, because the drones don’t do any thinking of their own. No artificial intelligence is involved; they are controlled by a human who directs their path.

Strauss

If the drone is ever controlled by any type of intelligent software (AI), then the three rules, IMHO, should apply. They probably wouldn’t, though, for the reasons cited above by @canidmajor.

XOIIO

Well, the 3 laws apply to robots that are more or less completely autonomous, or in many cases in Asimov’s work, sentient. Drones, however, are basically remote-control planes.

jerv

Drones don’t kill people; humans piloting drones kill people.

flutherother

@canidmajor Asimov’s rules may have been fictional, but reality is catching up fast.

canidmajor

@flutherother: my post states that the 3 Rules are fiction, not the advances in robotics. There has been a lot of talk about concern over the development of AI and how it would relate to humans, but to my knowledge (which is not absolute, I get that) no one has been programming 3 Laws restrictions into AI attempts. And wouldn’t that essentially negate AI parameters?

flutherother

As far as I know, which isn’t very far, no one has been programming AI to ensure humans can’t be hurt, but it’s something that will have to be borne in mind as AI develops. Asimov’s fiction is becoming our reality.
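
Just to make the idea concrete, here is a rough, purely hypothetical sketch in Python of what a hard-coded “don’t harm humans” constraint might look like: a hand-written check that vetoes any candidate action whose estimated risk to humans is too high. All the names (Action, assess_harm_risk, choose_action) are invented for illustration; no real system is built this way, and the genuinely hard part, estimating harm, is simply waved away here.

    # Hypothetical sketch of a hand-coded "first law" style constraint.
    # Every name below is invented for this example.
    from __future__ import annotations

    from dataclasses import dataclass

    HARM_THRESHOLD = 0.01  # arbitrary tolerance, chosen only for illustration


    @dataclass
    class Action:
        name: str
        harm_risk: float  # estimated probability the action injures a human


    def assess_harm_risk(action: Action) -> float:
        # In a real system, producing this estimate would be the hard part;
        # here it is just a stored number.
        return action.harm_risk


    def choose_action(candidates: list[Action]) -> Action | None:
        """Return the first candidate whose estimated harm risk is acceptable."""
        for action in candidates:
            if assess_harm_risk(action) <= HARM_THRESHOLD:
                return action
        return None  # refuse to act rather than risk harming a human


    if __name__ == "__main__":
        options = [Action("fire", 0.9), Action("hold position", 0.0)]
        print(choose_action(options))  # -> Action(name='hold position', harm_risk=0.0)

The point of the sketch is only that such a rule sits outside whatever the AI has learned, which is part of why bolting the three laws onto an AI after the fact is easier said than done.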

canidmajor

I think there is a fundamental disconnect between true AI and programming of specific values.

If you’re interested, a realistic take on this concept was presented in the film Ex Machina. It’s worth the watch no matter how you feel about this topic.

flutherother

It was one of my favourite films of the year.
PS: AI has just beaten the world’s best player at Go. There are four games still to play.

zenvelo

Asimov’s “Laws” are a fine idea, but it’s not like they were ever agreed upon by any society. Given there is a lot of research in AI funded by various militaries, there will be a lot of AI that harms.

You can call the Laws of Robotics rules, but who says anyone is playing by the rules?

ARE_you_kidding_me

I loved that film too; creepy ending.
AI has come a lot farther than many people realize, but it’s still way short of what most people consider “AI”.

elbanditoroso

No. For one thing, armed drones are not robots in any sense of the word.

Although unmanned, a drone is still piloted, electronically, by army personnel from a military base. There is still human command and control, and the decision to fire (and kill) is still a human one.

A robot, in contrast, is self-controlled and works according to its own thought processes.

The key fallacy in your question is thinking that drones are robots. They are not.
