General Question

TheKNYHT's avatar

Will Technology Ever Achieve 'Terminator' like A.I.?

Asked by TheKNYHT (686 points) May 24th, 2009

Computers are advancing so rapidly that, after watching the Terminator film, I was wondering whether we will ever have androids that can out-reason humans and, in some ways, actually be superior to them?


24 Answers

The_Compassionate_Heretic's avatar

Consciousness cannot be engineered or manufactured.

All A.I. does is attempt to simulate a human response, not replicate it completely.
Will A.I. become far more advanced? Yes, it will. Will A.I. ever be able to achieve self-awareness? No.

The best we will be able to do is engineer technology that makes it more difficult to tell the difference between artificial intelligence and human consciousness in application. That's where the Turing Test comes in.

whatthefluther's avatar

I agree with @The_Compassionate_Heretic and pray it could never happen. California could not survive with another Governor like the current “Terminator.”

The_Compassionate_Heretic's avatar

@whatthefluther It’s been worse but no, it’s not great.

whatthefluther's avatar

@The_Compassionate_Heretic…I often gulp when I look at my UCLA diploma and see Jerry Brown’s signature. These are two very well intentioned men whose policies and apparent priorities often frightened me. But bless them both for doing much good in a very difficult position within an even more difficult arena (politics). See ya…wtf

oratio's avatar

@The_Compassionate_Heretic No self awareness? What makes you so sure? Just curious.

The_Compassionate_Heretic's avatar

@oratio That’s basically creating synthetic life. There is no amount of computer coding that will create consciousness because we don’t understand the true nature of consciousness from a scientific standpoint. It does make for good science fiction.

evelyns_pet_zebra's avatar

I think given enough time and scientific advancement, technology could produce just about anything. The guy that invented the computer said it would never amount to much more than a glorified adding machine. Boy, was he way off base. Look what we can do with his adding machine today.

I’m waiting for the day when cell phones are the size of rice grains, are implanted in our cheekbones and are powered by nanobatteries that recharge from the energy created by the movement of the chewing muscles in our jaw.

I’d assume it isn’t that far off.

Bluefreedom's avatar

I wouldn't mind having a few live-in androids like those found in the movie Bicentennial Man. As far as Terminator-like A.I. goes? Not in my lifetime and probably not ever.

@The_Compassionate_Heretic‘s answers are very compelling and interesting and I tend to agree with that line of thinking.

ragingloli's avatar

@The_Compassionate_Heretic
“because we don’t understand the true nature of consciousness from a scientific standpoint.”
YET. There is no reason to assume that we will never find that out.
There are other animals that are arguably self aware, including dolphins, elephants, and some primates, all with brains less developed than ours.
Anyway, our brains are computers made from biological components, so replicating and even exceeding human brain function is only a matter of computing power and the time to develop the necessary hardware. And with the emergence of quantum computing (they have already built one that can do basic arithmetic), the day of a true AI might not be far away at all.

The_Compassionate_Heretic's avatar

@ragingloli Our brains are more than mere computing units.

ragingloli's avatar

@The_Compassionate_Heretic
are they? i don’t think so.

whatthefluther's avatar

Will someone please go check on @Jayne…been crafting a response for what seems like hours…wtf

Jayne's avatar

@whatthefluther; so sorry, left it open and went to watch a movie :)

@The_Compassionate_Heretic; why do you think that? I agree that we will never be able to engineer consciousness, but only because I don’t think that anything worthy of the name exists; I don’t think that there is anything special about our own brains, either. In fact, that lack of distinction is what I find to be so wonderful and fascinating about them, that they can take such basic, unremarkable parts and from them create such a marvelous whole; I consider this much more interesting than just saying that we have a “consciousness” and a “soul”, and leaving it at that.

One thing that our brains can never accomplish is to understand themselves. Their behavior is emergent, which is to say that, while the components are simple, they combine in such a way that the only way to discover the outcome of the combination is to track them step by step. In the end, an extremely complex behavior is reached, and there is no way to look at the broad plan, the overarching pattern that created it, because there is no such pattern, only a string of successive events. Accordingly, to understand how a brain accomplished a task would require holding each successive, complete state of that brain in one's own brain, an impossible task. This is why it is so tempting to believe that the brain cannot be merely the sum of its parts, as one can never have an understanding of how those parts of the brain combine to form the mind. It also makes it impossible to design a brain from scratch, because one cannot create the building blocks of an emergent system with a particular end behavior in mind. In our case, they can only have been created by a process of trial and error, as with evolution or a creator with a lot of spare time. In theory, however, there is no fundamental barrier preventing the creation of something that approximates our brains; and there is nothing to make an approximation necessarily less "conscious".
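
A minimal sketch of that idea in Python (the rule number, grid size, and step count are arbitrary illustrative choices; this is just the standard textbook example of an emergent system, an elementary cellular automaton, not a model of a brain):

# Each cell obeys a trivial local rule, yet the only way to learn what the
# row looks like after N steps is to simulate all N steps -- there is no
# "broad plan" or overarching pattern to read off, exactly as described above.
RULE = 110  # Wolfram's Rule 110; the number encodes the outputs for all 8 neighbourhoods

def step(cells):
    """Apply the rule once to a row of 0/1 cells (edges wrap around)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right  # value 0..7
        out.append((RULE >> neighbourhood) & 1)
    return out

if __name__ == "__main__":
    width, steps = 64, 32
    row = [0] * width
    row[width // 2] = 1  # start from a single live cell
    for _ in range(steps):
        print("".join("#" if c else "." for c in row))
        row = step(row)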

oratio's avatar

I agree. I don’t think you program a consciousness. You teach it.

wundayatta's avatar

I think that eventually we'll be able to model consciousness and design software that can learn very sophisticated things. We already have software that learns. We'd also need feedback systems, so that the AI could sense its surroundings. Environment sensing plus learning and consciousness = sentience, I think.
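
A rough sketch of that sense-and-learn feedback loop in Python (the toy one-dimensional world, the reward, and the tabular Q-learning update are purely illustrative assumptions, not a recipe for sentience):

# An agent repeatedly senses its surroundings, acts, and updates what it has
# learned from the feedback (reward). Hypothetical toy task: learn to walk
# right along a 10-cell line to reach the goal cell.
import random

ACTIONS = [-1, +1]      # move left or move right
GOAL, SIZE = 9, 10      # goal is the last cell of a 10-cell line

def sense(state):
    """The agent's only 'perception' is its current cell index."""
    return state

def learn_episode(q, epsilon=0.1, alpha=0.5, gamma=0.9):
    state = 0
    while state != GOAL:
        obs = sense(state)
        # act: mostly pick the best-known action, occasionally explore
        if random.random() < epsilon:
            a = random.randrange(len(ACTIONS))
        else:
            # greedy choice, with ties broken at random
            a = max(range(len(ACTIONS)), key=lambda i: (q[obs][i], random.random()))
        nxt = min(max(state + ACTIONS[a], 0), SIZE - 1)
        reward = 1.0 if nxt == GOAL else 0.0
        # learn: adjust the value estimate using the feedback just sensed
        q[obs][a] += alpha * (reward + gamma * max(q[nxt]) - q[obs][a])
        state = nxt

if __name__ == "__main__":
    q = [[0.0, 0.0] for _ in range(SIZE)]
    for _ in range(200):
        learn_episode(q)
    # after training, the learned policy should prefer moving right (action index 1)
    print([max(range(len(ACTIONS)), key=lambda i: q[s][i]) for s in range(SIZE)])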

whatthefluther's avatar

@Jayne…So glad to see you are OK and thanks for the very interesting comments…they were certainly more than worth the wait. Hope you enjoyed the movie. See ya…wtf

oratio's avatar

@syz Yes. I assume you've seen Big Dog. There are some interesting projects in development. DARPA seems to be busy.

syz's avatar

@oratio yep, that thing kinda freaks me out.

oratio's avatar

@syz Yes, imagine being hunted by a group of armed robot dogs that can see heat and heat tracks, and that have electronic noses and superhuman hearing. Not unrealistic, and quite scary shit.

ragingloli's avatar

@oratio Reminds me of the hunters from HL2 Ep2, but nothing a burst from a pulse rifle won't handle.

TheKNYHT's avatar

Correct me if I get this wrong, but it seems the preponderance of views held here is that consciousness is more than merely brain; that it is something perhaps (metaphysically) beyond the merely physical? Connected to the physical in some way, but extended beyond the parameters of genes, brain cells, etc.

DOMINO's avatar

HELL YES, THE WORLD WILL BE TECH-CONTROLLED! NOT TO MENTION THE FACT THAT HALF THE WORLD'S ANIMAL SPECIES WILL BE WIPED OUT BY "GLOBAL" WARMING!!!
