Welcome to Mannionville

  • Politics, art, movies, television, books, parenting, home repair, caffeine addiction---you name it, we blog it. Since 2004. Call for free estimate.




Doug K

See also Whimsley on talking to robots.


David Brin's "Kiln People" was interesting like that.

John Sundman



Apropos: a two-part article I did a decade ago for Salon about the Loebner contest. Siri has changed things somewhat, but I think it's still pretty current:

And then of course there's Cheap Complex Devices, about which more when it may be appropriate.


Earl Bockenfeld

John, in his comment above, has quite a complete history of the "Turing test" in the form of the Loebner contest, which is about communicating with a computer that seems as real as HAL 9000, the character in Arthur C. Clarke's science fiction "Space Odyssey" saga: a computer as a friendly companion, not an adversary or enemy.

When I began seeing the Siri TV ads, I was reminded of my early use of ELIZA/DOCTOR on one of my first PCs. Of course, when the "patient" exceeded the very small knowledge base, ELIZA/DOCTOR might provide a generic response: for example, responding to "My head hurts" with "Why do you say your head hurts?" The response to "My mother hates me" would be "Who else in your family hates you?" ELIZA/DOCTOR was implemented using simple pattern matching techniques, but was taken seriously by several of its users, even after Weizenbaum explained to them how it worked. It was one of the first chatterbots in existence.

Weizenbaum said that ELIZA, running the DOCTOR script, provided a "parody" of "the responses of a nondirectional psychotherapist in an initial psychiatric interview." He chose the context of psychotherapy to "sidestep the problem of giving the program a data base of real-world knowledge", the therapeutic situation being one of the few real human situations in which a human being can reply to a statement with a question that indicates very little specific knowledge of the topic under discussion. For example, it is a context in which the question "Who is your favorite composer?" can be answered acceptably with responses such as "What about your own favorite composer?" or "Why does that question interest you?"
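Those "simple pattern matching techniques" can be sketched in a few lines of Python. This is a hypothetical toy, not Weizenbaum's original MAD-SLIP DOCTOR script: the rules and the pronoun-reflection table here are invented for illustration, but they reproduce the two exchanges quoted above.

```python
import re

# Toy ELIZA/DOCTOR-style rules (illustrative only, not the original script).
# Each rule pairs a regex with a response template; \1, \2 stand for the
# captured fragments after pronoun reflection.
RULES = [
    (re.compile(r"my (.+) hurts", re.I), r"Why do you say your \1 hurts?"),
    (re.compile(r"my (mother|father|sister|brother) (.+)", re.I),
     r"Who else in your family \2?"),
    (re.compile(r"i am (.+)", re.I), r"Why do you think you are \1?"),
]

# Flip first- and second-person words so echoed fragments read naturally.
REFLECTIONS = {"me": "you", "my": "your", "i": "you", "am": "are"}

GENERIC = "Please tell me more."


def reflect(fragment):
    """Swap pronouns word by word: 'hates me' -> 'hates you'."""
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())


def respond(statement):
    """Return the first matching rule's response, else a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            reply = template
            for i, group in enumerate(match.groups(), start=1):
                reply = reply.replace("\\%d" % i, reflect(group))
            return reply
    return GENERIC
```

Anything outside the rule set falls through to the generic prompt, which is exactly where the illusion breaks once the "patient" exceeds the tiny knowledge base.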

Siri is an intelligent personal assistant and knowledge navigator that works as an application for Apple's iOS. The application uses a natural language user interface to answer questions, make recommendations, and perform actions by delegating requests to a set of web services. Apple claims that the software adapts to the user's individual preferences over time and personalizes results, performing tasks such as finding recommendations for nearby restaurants or getting directions.

Siri includes the combined work of DARPA research teams from Carnegie Mellon University, the University of Massachusetts, the University of Rochester, the Institute for Human and Machine Cognition, Oregon State University, the University of Southern California, and Stanford University. This technology has come a long way in dialog and natural language understanding, machine learning, evidential and probabilistic reasoning, ontology and knowledge representation, planning, reasoning, and service delegation. As taxpayers who funded that research, we are, or should be, Apple's partners in its huge pot of cash, which is larger than the US Treasury's holdings.

Siri was met with a very positive reaction for its ease of use and practicality, as well as its apparent "personality". However, Siri was criticized by organizations such as the American Civil Liberties Union and NARAL Pro-Choice America after users found that it would not provide information about the location of birth control or abortion providers, sometimes directing users to anti-abortion crisis pregnancy centers instead. It was suggested that abortion providers could not be found in a Siri search because they did not use "abortion" in their descriptions. At the time the controversy arose, Siri would suggest locations to "buy illegal drugs", "hire a prostitute", "dump a corpse", or "find a viagra source", but could not find birth control or abortion services. Apple responded that this behavior was not intentional, a glitch that would improve as the product moved from beta to final release.


