INTRODUCTION

While some were introduced to the idea of talking to computers through Arthur C. Clarke's book "2001: A Space Odyssey", I was weaned on Star Trek. I make no apology for never having finished reading 2001, nor for never sitting all the way through the film that followed. The fact is I find them both rather tedious! However, an interesting point emerges. Both readers of 2001 and audiences at the film made few criticisms of HAL, the talking computer. It would appear that most people take for granted the idea of a computer understanding their natural language. In her book "Artificial Intelligence and Natural Man", Margaret Boden is somewhat scathing about people's naivety in accepting the idea that a computer can understand human language, with all its subtle variations. But in fairness to the public, Margaret, how many people in 1977 had seen a computer, let alone learnt how to program one? The reason people didn't wonder at HAL's quite extraordinary powers of comprehension was that they saw nothing strange in them. "If I can understand English, and a computer is an electronic brain, then of course it can understand," might go the reasoning.

Later audiences were treated to the idea of confusing computers. This idea was used in Star Trek to defeat several malevolent computers whose logic had got the better of their human companions, and in Doctor Who, where a particularly unpleasant computer (in "The Green Death") was destroyed after being asked a paradoxical question. The public were awakening to the idea that computers may be bright, but they're not brilliant!

Enough of science fiction; what of the reality? The truth is that people are fascinated by the idea of "thinking machines". And what better sort of thinking machine than one that can converse in your own language? During the 1960s a good deal of work was undertaken by the psychological and psychiatric professions on modelling neurosis with computer simulations.

In 1962 K.M. Colby, a psychoanalyst, attempted to model free association in psychotherapy with a computer system he called a "simulation of a neurotic process". The computer models a woman who believes that her father has abandoned her, but cannot consciously accept that she hates him. The computer operator plays the role of the psychotherapist. In his paper "Experimental Treatment of Neurotic Computer Programs", Colby details the following interaction between the computer and the therapist:

computer:  FATHER IS INDIFFERENT TO ME.
           MOTHER SAYS FATHER ABANDONED US.
           FATHER PREFERRED SISTER.
           I AVOID FATHER.
           SISTER ADMIRED FATHER.
           I FEEL TENSE NOW.

therapist: YOU ADMIRE FATHER.

computer:  NO.
           I DO NOT ADMIRE FATHER.
           I ADMIRE AL.
           MOTHER OUGHT NOT TO ADMIRE AL.
           MOTHER DETESTS FATHER.

Colby's hope in developing this model was that it would be of use in studying the ways in which belief systems and defensive habits of thought can be modified by particular conceptual interventions from, say, a therapist. I wonder?
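
It is worth seeing how little machinery a dialogue of this sort needs. The sketch below, in Python, is entirely my own toy version; the beliefs, charges, threshold and "deflection" rule are invented for illustration and are not Colby's actual model. It captures the general idea, though: each belief carries an emotional charge, and any belief too painful to admit is transformed by a defensive rule before it may be spoken.

# A toy "neurotic process", assuming a pool of beliefs each carrying
# an anxiety charge. Beliefs, charges, threshold and the "deflection"
# rule are all invented for illustration; this is not Colby's model.

THRESHOLD = 0.5

# (belief, anxiety charge): the highly charged belief is the one the
# simulated patient cannot consciously accept.
beliefs = [
    ("I HATE FATHER", 0.9),
    ("FATHER IS INDIFFERENT TO ME", 0.2),
    ("SISTER ADMIRED FATHER", 0.1),
]

def deflect(belief):
    # Defensive transform: soften the verb, so "I HATE FATHER"
    # is uttered as "I DO NOT ADMIRE FATHER".
    return belief.replace("HATE", "DO NOT ADMIRE")

def express(belief, charge):
    # Beliefs whose charge exceeds the threshold pass through the
    # censor before they are spoken; innocuous ones pass untouched.
    return deflect(belief) if charge > THRESHOLD else belief

for belief, charge in beliefs:
    print(express(belief, charge))

Run as it stands, this prints I DO NOT ADMIRE FATHER for the charged belief and utters the innocuous ones untouched: a crude echo of the deflection in the transcript above, where the patient denies admiring father and shifts the talk to Al.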

Another artificial intelligence computer system with a medical flavour was ELIZA. ELIZA was developed by Joseph Weizenbaum to model a psychiatrist. The computer operator may then take the role of a patient and talk with the program. In his 1966 paper describing ELIZA, Weizenbaum details the following interaction between ELIZA and a human operator:

operator:  MEN ARE ALL ALIKE

eliza:     IN WHAT WAY

operator:  THEY'RE ALWAYS BUGGING US ABOUT SOMETHING OR OTHER.

eliza:     CAN YOU THINK OF A SPECIFIC EXAMPLE

operator:  WELL, MY BOYFRIEND MADE ME COME HERE.

eliza:     YOUR BOYFRIEND MADE YOU COME HERE

operator:  HE SAYS I'M DEPRESSED MUCH OF THE TIME.

eliza:     I AM SORRY TO HEAR THAT YOU ARE DEPRESSED

operator:  IT'S TRUE. I AM UNHAPPY.

eliza:     DO YOU THINK COMING HERE WILL HELP YOU NOT TO BE UNHAPPY

operator:  I NEED SOME HELP, THAT MUCH SEEMS CERTAIN.

eliza:     WHAT WOULD IT MEAN TO YOU IF YOU GOT SOME HELP

operator:  PERHAPS I COULD LEARN TO GET ALONG WITH MY MOTHER.

eliza:     TELL ME MORE ABOUT YOUR FAMILY

operator:  MY MOTHER TAKES CARE OF ME.

eliza:     WHO ELSE IN YOUR FAMILY TAKES CARE OF YOU

operator:  MY FATHER.

eliza:     YOUR FATHER

operator:  YOU ARE LIKE MY FATHER IN SOME WAYS.

eliza:     WHAT RESEMBLANCE DO YOU SEE

operator:  YOU ARE NOT VERY AGGRESSIVE BUT I THINK YOU DON'T WANT ME TO NOTICE
           THAT.

eliza:     WHAT MAKES YOU THINK I AM NOT VERY AGGRESSIVE

operator:  YOU DON'T ARGUE WITH ME.

eliza:     WHY DO YOU THINK I DON'T ARGUE WITH YOU

operator:  YOU ARE AFRAID OF ME.

eliza:     DOES IT PLEASE YOU TO THINK I AM AFRAID OF YOU

operator:  MY FATHER IS AFRAID OF EVERYBODY.

eliza:     WHAT ELSE COMES TO MIND WHEN YOU THINK OF YOUR FATHER

operator:  BULLIES.

eliza:     DOES THAT HAVE ANYTHING TO DO WITH THE FACT THAT YOUR BOYFRIEND
           MADE YOU COME HERE
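
Responses like "YOUR BOYFRIEND MADE YOU COME HERE" call for nothing cleverer than keyword-spotting and pronoun reflection. The Python sketch below shows the technique in miniature. The patterns and reply table are my own inventions for illustration; they are not Weizenbaum's script, which used a far richer set of decomposition and reassembly rules.

import random
import re

# Keyword rules: (pattern, canned replies). "%s" is filled with the
# matched fragment after its pronouns have been reflected. These
# rules are illustrative only.
RULES = [
    (r"I AM (.*)",    ["I AM SORRY TO HEAR THAT YOU ARE %s",
                       "HOW LONG HAVE YOU BEEN %s"]),
    (r"MY (.*)",      ["YOUR %s"]),
    (r"YOU ARE (.*)", ["WHAT MAKES YOU THINK I AM %s"]),
    (r".*MOTHER.*",   ["TELL ME MORE ABOUT YOUR FAMILY"]),
    (r".*",           ["CAN YOU THINK OF A SPECIFIC EXAMPLE",
                       "PLEASE GO ON"]),
]

# First-person words are swapped for second-person ones and vice
# versa, so "MY BOYFRIEND MADE ME COME HERE" reflects to
# "YOUR BOYFRIEND MADE YOU COME HERE".
REFLECTIONS = {"I": "YOU", "ME": "YOU", "MY": "YOUR", "AM": "ARE",
               "YOU": "I", "YOUR": "MY", "ARE": "AM"}

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word, word)
                    for word in fragment.split())

def respond(sentence):
    sentence = sentence.strip().rstrip(".!?")
    # Take the first rule whose pattern matches, reflect the
    # captured fragment and slot it into a canned reply.
    for pattern, replies in RULES:
        match = re.match(pattern, sentence)
        if match:
            reply = random.choice(replies)
            if "%s" in reply:
                reply = reply % reflect(match.group(1))
            return reply

Feed it the opening exchanges of the transcript above and it produces much the same replies: respond("MY BOYFRIEND MADE ME COME HERE.") returns YOUR BOYFRIEND MADE YOU COME HERE, and respond("I AM UNHAPPY.") returns one of its two sympathetic forms. The trick is that nothing here understands anything; the program merely matches patterns and parrots fragments back.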

Exponents of ELIZA point out that it can talk about any subject. Critics may notice that it does not always talk very intelligently! But ELIZA did trick a salesman at a computer company. Not a very difficult task, perhaps, but here anyway is a transcript of the conversation that is reported to have taken place by teletype.

man:      DO YOU THINK THAT I CAN USE THE COMPUTER THIS MORNING?

teletype: WHY DO YOU ASK?

man:      I MIGHT BE ABLE TO MAKE SOME ADDITIONAL TELCOMP SALES.

teletype: WHY AREN'T YOU SURE?

man:      MY PROSPECTIVE CUSTOMERS HAVE NEVER SEEN A DEMONSTRATION OF THE
          SYSTEM.

teletype: DOES THIS MEAN VERY MUCH TO YOU?

man:      OF COURSE IT DOES.

teletype: YOU SEEM VERY POSITIVE ABOUT THAT.

man:      THIS IS RIDICULOUS.

teletype: WHAT DO YOU MEAN THIS IS RIDICULOUS?

man:      IS THERE ANY REASON WHY I CAN'T USE THE COMPUTER NOW?

teletype: TELL ME MORE ABOUT YOUR PROSPECTIVE CUSTOMERS.

man:      PLEASE DIAL ME UP ON 491-1850.

From the anger apparent in the salesman's responses, he may not have realised that he was talking to a computer. But he certainly knew that whoever he was talking to was fairly obtuse.