Expect

I recall reading a paper, probably a tech report, from the Rand Corporation in the mid 1970s about a little AI program they had written which would watch a user interact with a time-sharing system and then attempt to extract a script to automate that interaction.  Later in that decade I used Interlisp, whose read-eval-print loop, or REPL, included a feature known as DWIM, or Do What I Mean.  DWIM was yet another primitive AI: it would look over your shoulder as you worked and try to help.  It was amusing, though in the end the consensus was that it was more party trick than useful tool.

A while later, on Unix, a serious problem emerged.  A delightful game, Rogue, appeared which we all played far too much.  When Rogue fired up it would randomly set up a game for you to play, and some of these games were better than others.  This gave rise to a serious need: automation to find the good games.  So people wrote programs to do that.

When the Mac came out countless hours were wasted complaining about how it lacked a command line.  Interestingly, one of the things it included, right from the start, was a record/playback mechanism.  Developers used this to automate testing.  (I should note here that the Mac had 128K bytes of RAM.)

All these systems are workarounds for a general problem: given an interface designed for human users, what can we do to bring computers into that interface?  We have this problem writ large these days; since most of the content on the web is targeted at humans, it is continually frustrating how hard it is to get websites to interact with computers.

It’s a “who do you love?” question.  Unsurprisingly, most website designers love the human users.  They labor to serve them well.  That crowds out efforts to serve the computers well, which has the perverse side effect of making it hard for the computers to help the humans.  This in turn, of course, plays into issues like RSS, XML, RDF, REST, etc.

Interfaces designed for humans are, unsurprisingly, different from those designed for computers.  A good list of what separates the two would be extremely useful!  For example, a human interface is likely to be more visual, more asynchronous, more multi-threaded, more decorative, more commercial.  That list would be useful because we build lots of tools to bridge the difference.  Web spiders are one example.  Screen scrapers are another.  Automated testing tools are a third.

All this was triggered by my delight at discovering that my Mac, which has the Unix tool set installed, comes bundled with a program called ‘expect’.  Expect is a tool for just this kind of bridging.  It is the direct descendant of the tool written to get you a good game of Rogue; in fact its examples include a script to do just that.  Expect is designed for writing scripts that manipulate command-line interfaces which were designed for humans.  The examples include all kinds of slightly perverse things.  Editing a config file on N machines simultaneously, for example.  It’s a hoot.
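For the curious, here is a minimal sketch of what an expect script looks like: it spawns an interactive program, waits for the prompts a human would normally answer, and types the responses itself.  The host and file names below are invented for illustration, not taken from expect’s own examples.

    #!/usr/bin/expect -f
    # Sketch: drive an interactive ftp session that was designed for a human.
    # ftp.example.com and README are made-up names, purely illustrative.
    set timeout 30
    spawn ftp ftp.example.com

    expect "Name*:"    { send "anonymous\r" }
    expect "Password:" { send "guest@example.com\r" }
    expect "ftp>"      { send "get README\r" }
    expect "ftp>"      { send "quit\r" }
    expect eof

The charming part is the interact command, which lets the script hand the keyboard back to the human at any point, so the automation and the person can share the same session.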

It seems to me that there are powerful reasons why the dynamic that leads to tools like these spans so many decades.  For lots of reasons, implementors love humans more than computers.  In some cases implementors hate the computers while wanting to reach the humans.  Because of this the human-facing interfaces will always be richer than the computer-facing ones, and we will forever be writing tools to bridge the gap.
