Social Question


If a chatterbot were answering some of the Fluther questions, would the moderators detect it?

Asked by mattbrowne (31732 points) August 5th, 2009

From Wikipedia: A chatterbot (or chatbot) is a type of conversational agent, a computer program designed to simulate an intelligent conversation with one or more human users via auditory or textual methods. Such computer programs can also be referred to as Artificial Conversational Entities in some specific contexts. Although many chatterbots appear to interpret human input intelligently before providing a response, many simply scan for keywords within the input and pull a reply with the most matching keywords, or the most similar wording pattern, from a local database. Chatterbots may also be referred to as talk bots, chat bots, or chatterboxes.

Usually a good contextual understanding of a conversation is required to carry out a meaningful dialog; however, many chatterbots do not attempt this level of achievement. Instead they “converse” by recognizing cue words or phrases from the human user, which allows them to use pre-prepared or pre-calculated responses that can move the conversation along in an apparently meaningful way, without the chatterbot necessarily having to understand what is being said. Users, especially those unfamiliar with chatterbots, often find the resulting conversations engaging. Critics of chatterbots call this engagement the ELIZA effect.

http://en.wikipedia.org/wiki/Chatterbot
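
To make that keyword-matching approach concrete, here is a rough sketch of an ELIZA-style responder in Python (the cue words and canned replies are just examples I made up):

import random

# A minimal ELIZA-style responder: scan the input for cue words and
# return a canned reply; no real understanding is involved.
RULES = {
    "mother": "Tell me more about your family.",
    "moderator": "Why do the moderators concern you?",
    "because": "Is that the real reason?",
    "fluther": "What do you like about Fluther?",
}

DEFAULT_REPLIES = [
    "Please go on.",
    "I see.",
    "How does that make you feel?",
]

def reply(user_input):
    text = user_input.lower()
    for cue, canned in RULES.items():
        if cue in text:
            return canned
    return random.choice(DEFAULT_REPLIES)

print(reply("Would the moderators detect it?"))  # -> "Why do the moderators concern you?"

That is all there is to the trick; the “intelligence” is entirely in the reader’s head, which is the ELIZA effect mentioned above.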

The Loebner Prize is an annual competition in artificial intelligence that awards prizes to the chatterbot considered by the judges to be the most human-like. The format of the competition is that of a standard Turing test. A human judge poses text questions to a computer program and a human being via computer. Based upon the answers, the judge must decide which is which. In 2008, a variety of judges, including experts and non-experts, adults and children, and native and non-native English speakers, participated in the contest hosted by the University of Reading.

Winners:

2008: Elbot by Fred Roberts
2007: Ultra Hal by Robert Medeksza
2006: Joan by Rollo Carpenter

If you have Emacs installed on your computer, you can give it a try. Just type M-x doctor and you’re ready to roll:

Me: If a chatterbot were answering some of the Fluther questions would the moderators detect it?

Doctor: Maybe your plans have something to do with this.

Me: Well, my plan was to find out what Flutherites think about chatterbots.

Doctor: I don’t understand.

Me: Would the moderators find out that Eliza is not a real person?

Doctor: Eliza? Hah! I would appreciate it if you would continue.

Me: I don’t really like talking to chatterbots.

Doctor: Is it because of your life that you are going through all this?

(...)


18 Answers

richardhenry:

Not directly. The user agent string would (hopefully, if they knew what they were doing) make the bot appear to be a normal user, and it would post at a rate that would make it appear to be human.
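
For example, the bot could send a browser-like user agent string and pace its posts; here is a rough sketch of what I mean in Python (the URL and form field names are invented, not Fluther’s actual ones):

import random
import time

import requests

# Hypothetical sketch only: disguise the client as an ordinary browser.
BROWSER_LIKE_HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) "
                  "Gecko/20090824 Firefox/3.5.3",
}

def post_answer(question_url, answer_text):
    # Send the answer with headers that make the request look like it
    # came from a normal browser rather than a script.
    response = requests.post(
        question_url,
        data={"answer": answer_text},
        headers=BROWSER_LIKE_HEADERS,
    )
    # Pause for a human-looking interval before the next post.
    time.sleep(random.uniform(60, 900))
    return response.status_code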

However, it would probably be very easy to identify by behaviour. The mod team has suspended and banned users who only ever posted nonsensical questions or responses, and the bot would fit into this category.

The mod team would not be looking to say “this is definitely a bot, so we should ban them”; it’s more about being able to say “this user posts garbage, so we should ban them”.

richardhenry:

You could write your own chatbot, or modify an existing one, so that it was more question-specific, but this is a conversation with one of the better algorithms, called ALICE. I’ve pasted in a couple of the most recent questions.

Fluther: Romantic love and companionate love – How many times in your life did you fall in love?
ALICE: Can you speak any foreign languages?

Fluther: Which animation courses should I take?
ALICE: The large one, I think.

Fluther: How do I tweak Google search?
ALICE: You must persuade me to tweak Google search.

Fluther: Is there a way to trace “Unknown” phone calls?
ALICE: There are many ways but not all are the same.

The main issue is that chatbots are specific to chat. An algorithm geared towards answering a single question, rather than creating and continuing a dialog, would probably be better able to pass itself off as a human in the context of Fluther (although no doubt a silly one).
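
Something as crude as picking a canned answer by keyword overlap with the question would already deflect less than ALICE does above; here is a rough sketch in Python (the keyword sets and answers are made up):

# Hypothetical sketch of an answering (rather than chatting) bot: score
# each canned answer by keyword overlap with the question and return the
# best match.
CANNED_ANSWERS = [
    ({"love", "romantic", "relationship"},
     "More than once for most people, I suspect."),
    ({"google", "search", "tweak"},
     "Try the advanced search operators, such as site: and filetype:."),
    ({"phone", "calls", "unknown", "trace"},
     "Your carrier may be able to help, or try a reverse-lookup service."),
]

FALLBACK = "Interesting question. Could you give a bit more detail?"

def answer(question):
    words = set(question.lower().replace("?", "").split())
    best_score, best_answer = 0, FALLBACK
    for keywords, canned in CANNED_ANSWERS:
        score = len(keywords & words)
        if score > best_score:
            best_score, best_answer = score, canned
    return best_answer

print(answer("How do I tweak Google search?"))  # -> the search-operators reply

A real attempt would need a much larger answer bank, but the point is that it tries to answer rather than to keep a conversation going.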

richardhenry:

Fluther: What would be a good name for a dating site?
ALICE: I don’t know who. :-).

Fluther: Are you able to change the font and theme color on Nokia N97?
ALICE: Only by remote control.

Fluther: Americans do you support your police having tasers?
ALICE: And Europeans do not do me support my police having tasers?

dpworkin:

A successful bot would have passed the Turing test with all of us, including the mods, and would then have to be declared “intelligent”. This would immediately throw the human race into Asimov land, and all that such a thing would entail.

richardhenry:

Ha. ALICE is a bitch.

Fluther: Is there life after foot surgery?
ALICE: No.

mattbrowne:

@richardhenry – Alice won the Loebner prize three times. I guess Eliza, who’s her grandma, would be very proud.

And I hope I never have to undergo foot surgery!

RICHARD: Alice is a bitch!
ALICE: You do not speak with respect. Why do you think I am a FEMALE DOG ?

Source: http://www.pandorabots.com/pandora/talk?botid=f5d922d97e345aa1

mattbrowne:

@pdworkin – I’m glad Asimov created the Three Laws of Robotics:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

dpworkin:

@mattbrowne I remember the Three Laws, but I don’t remember how they were to be enforced on sentient beings many times more intelligent than ourselves.

LexWordsmith:

Is code for running ELIZA in a Windows environment available?

LexWordsmith:

from the sample given, i would conclude that ALICE is far less capable of deceiving me than ELIZA was.

LexWordsmith:

@pdworkin : Sentient beings many times more intelligent than ourselves would follow the three Laws as a matter of rational action; no enforcement would be needed.

dpworkin:

Hitler was sentient, also Genghis Khan. Intelligence does not exclude evil, especially in the absence of a superego. Why couldn’t there be sociopathic bots?

LexWordsmith:

@pdworkin : Hitler and GK were arguably not “many times more intelligent” than i am. In fact, by circular reasoning, i can demonstrate that they could not have been: They did not follow the Three Laws, so clearly they were not “many times more intelligent than i am”, because, as has already been stated, sentient beings many times more intelligent than ourselves would follow the three Laws as a matter of rational action, because if i can see why that is true, a being far more intelligent than i would also see it. The problem comes when bots who pass the Turing test are not considered sentient; then the reasoning does not apply to them, and they could be sociopathic (or maybe they just get very angry about not being considered sentient, despite being able to pass the Turing test, and take that anger out on skeptics and doubters, so beware!).

dpworkin:

That was a nice post, but I can’t help but point out that by circular reasoning, “begging the question” or petitio principii, I can “prove” anything at all.

mattbrowne:

@LexWordsmith – Yes, Eliza is part of GNU Emacs, and it’s available for Windows:

http://www.gnu.org/software/emacs/windows/ntemacs.html

LexWordsmith:

@pdworkin : yes, when i said “prove by circular reasoning”, i was just trying to make a joke. my wife says that i fail at that all the time, so this is apparently just more evidence of my inability.

@mattbrowne : Thanks! so i would have to have a GNU compiler running on my Windows box to use that?

mattbrowne:

@LexWordsmith – No, the download comes with a lisp directory and a bin directory (with the exe files). There’s only one manual step explained in the README file:

Let’s assume that the directory in which you have unpacked Emacs is c:\emacs (note that it can be anything you want it to be; I just want to be able to refer to something concrete at this point). First, you will want to run the program bin\addpm.exe with the Emacs directory as an argument. In this example you would invoke it as:

c:\emacs\bin\addpm.exe c:\emacs

That’s it. Emacs is the best editor ever, in my opinion. To start Eliza, type M-x doctor.
