Re: PUSHing a human factors perspective
- To: Shwartz at YALE, Riesbeck at YALE, Fisher at YALE, Ruttenberg at YALE, Miller at YALE
- Subject: Re: PUSHing a human factors perspective
- From: John R. Ellis <Ellis at YALE>
- Date: Wed, 20 May 82 15:17:00 EDT
- Cc: Rees at YALE, T-Discussion at YALE, Nicolau at YALE, Vanleunen at YALE
- In-reply-to: Shwartz's message of 20 May 1982 0852-EDT
Rees can decide to do whatever he wants about PUSH. But there is a more
serious issue here that will continue to crop up again and again, not
just in T, but in programming in general. This is a long message, so
the weak-minded will want to type ^O and flush it. It is required reading
for Fisher and Nicolau.
From: Steven Shwartz <Shwartz>
Subject: PUSHing a human factors perspective
... By deciding for "consistency" and against "naturalness" you will
please the purists but alienate the majority of potential users. I
think it is a mistake to put the user-interface anywhere but first
on the list of design decision criteria.
-------
From: Chris Riesbeck <Riesbeck>
...There's no benefit in reusing an old name if the old name is going
to confuse people -- AND IT WILL!!!
-------
From: Josh Fisher <Fisher>
The whole point of using the name PUSH is to suggest the English
meaning of the word. How can it be ignored when people have such
a strong intuition about it?
...Almost every programmer is going to get it wrong most of the time,
especially for the first 1000 lines, and it's going to be a constant
irritant.
-------
Has it ever occurred to anyone that "methodology", "consistency" and
"convention" have as much to do with "human factors" or "user interfaces"
as "naturalness"? My endearment to one particular methodology has nothing
to do with "purity" or "idealogy". The very purpose of programming
methodology is to find ways of programming that reduce the human cost.
Current programming languages are NOT natural languages, they are formal
engineering notation systems with many complicated rules. We all want
to make those rules as simple, as easy to learn and remember as possible.
Other more mature disciplines have evolved highly formal notation systems
that are divorced from natural, casual language: music, drafting,
mathematics, business, etc. Why shouldn't programming? (If I could talk
to the computer in English, great, but that's not what we're discussing here).
But those notations still use English words, often in "unnatural" ways.
"Naturalness" is only one small criterion to be used in designing a
programming notation (or any notation, for that matter). Consistency
and convention are generally more important, because they allow (and
encourage) programmers to treat the programming language as a formal model,
not as a bag of special cases that must all be memorized. The assumption
is that, though possibly more painful to learn at first, formal models
and conventions are cost effective in the long run, because they can be
confidently applied to new situations, whereas a body of special cases
cannot so easily be applied or communicated.
In a large program, there are many hundreds of different "objects" or
constructs, each with its own set of operations. If no conventions were
obeyed and the operations used any naming and argument conventions that
seemed "natural" at the instant the programmer implemented them, you would
end up with a big mess that only the original programmer could understand.
But if you follow consistent conventions, the OTHER programmers reading
the code don't have to understand and remember one person's quirky
understanding of "naturalness"; they only have to learn the conventions
and formal models.
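To make the contrast concrete, here is a sketch (the names are made up
for illustration; they are not actual T functions) of "natural at the
instant" naming next to a consistent object-first convention:

    ;; Without a convention -- each author does whatever felt natural:
    (ADD-TO-TABLE KEY TABLE VALUE)    ; the table comes second
    (QUEUE-INSERT QUEUE ITEM)         ; the queue comes first
    (INSERT-IN-TREE ITEM TREE)        ; the tree comes last

    ;; With an object-first convention -- the object being modified
    ;; always comes first, so the reader never has to guess:
    (TABLE-ADD TABLE KEY VALUE)
    (QUEUE-INSERT QUEUE ITEM)
    (TREE-INSERT TREE ITEM)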
Systems programmers routinely look at and modify code written by dozens
of other people. Communication of intent to other programmers is at least
as important as communication to the machine. The hardest programs to
read and modify are those that don't follow any quickly discernible
conventions in program surface structure (argument order, naming, pretty
printing, etc.). A formal convention provides a uniform surface structure,
allowing the programmer to concentrate on more important issues.
Picking names for operations is always a tricky business. We want the
names to do a lot of the mental drudge work for us, helping us to remember
the semantics of the thing the name represents. Unfortunately, the
ambiguity and extreme richness of English confuses the issue, as we've
all seen with the PUSH example. Anyone can apply the same arguments about
the naturalness of PUSH to many other functions with English names. For
example, try to explain (APPEND X Y). Using the naturalness approach,
I sure want to read that as "APPEND X to Y", which actually implies a
destructive operation! But we all know that it should be read as "APPEND
Y to X", yielding a new list, without destroying X or Y.
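For the record, a minimal sketch of what (APPEND X Y) in fact does:

    (DEFINE X '(A B))
    (DEFINE Y '(C D))
    (APPEND X Y)    ; => (A B C D), a freshly built list
    X               ; => (A B), untouched
    Y               ; => (C D), untouched

The English reading suggests mutation; the operation performs none.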
By using a consistent argument and naming convention, we can eliminate much
of the ambiguity implied by the names of operations. The programmer
KNOWS for certain that the object being manipulated or changed comes
first in the argument list. The operation name describes ONLY the
operation being done, NOT the order of arguments.
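For illustration (the argument order of T's PUSH is exactly what Rees is
deciding, so take this as a sketch of the convention, not a spec, and
TABLE-ADD is a hypothetical name):

    (PUSH STACK ITEM)        ; the object being changed comes first
    (POP STACK)              ; ditto
    (TABLE-ADD TABLE K V)    ; made-up name, but the same shape

Whatever order the English name suggests, the reader always knows where
to look for the object being modified.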
When I first started using Lisp (I came from a PL/I background), one of
its most enjoyable aspects was the rich library of powerful operations.
The most irritating thing was the total chaos in naming and argument
conventions. I still can't remember them; a reference manual by my side
is a must. Rees and Adams and Pitman and the other "purists" have done
an admirable job in making T less of a reference manual language, and
more of a notation that can be written without resort to page flipping.
I can't think of any more important "human factors" issues than these.
-------