
[Re: early error detection]



    Date: Tue, 28 Feb 89 09:01 CST
    From: ai.gooch@mcc.com (William D. Gooch)

    Apparently you missed some of my mail.  Steele states that type-checking
    in such cases is a decision left up to the implementors.  Symbolics has
    documentation too, by the way.  <Select> D Show Doc defstruct.

No one is claiming that Symbolics is violating the CL spec.  What is
being said is that Symbolics isn't providing the kind of environment we
expect from them.  In general, we expect the Symbolics environment to
do better error checking than other Lisp implementations.  But there are
a number of areas where it falls short in this regard.

	    My complaint was about a patently false claim
	submitted by a symbolics employee -- namely, that symbolics "architecture"
	detects type errors at the earliest possible moment.

    You are beating an imaginary dead horse.  As I said in other mail to
    you, I don't believe anyone has made this patently silly claim.  

I'm pretty sure I read it, too.  Luckily, I save all the mail I receive
(never say anything in email that you don't want quoted):

   Date: Wed, 15 Feb 89 18:40 EST
   From: Reti@stony-brook.scrc.symbolics.com (Kalman Reti)
   Subject: Re:  [not about] gc-by-area and host uptimes

   Type checking also acts as a way to trap certain classes of errors early,
   e.g. trying to do arithmetic on character strings.   This operation
   might never be detected and could simply provide wrong results to a computation
   on stock hardware.  This feature strikes me as being entirely analogous
   to parity on memories;  it lets you know at the earliest moment that something
   has gone wrong.

Personally, I don't think the parity analogy holds up very well.  He
later compares turning off type checking in production code to doing
parity checking only during software development, but memory hardware
failure is generally unrelated to software maturity.  In a perfect
software world, programs with bugs would never be shipped, so checking
for software errors in production code would be wasted effort; parity
checking, on the other hand, would still be necessary because memory
can be corrupted by outside forces such as stray gamma rays.

Of course, the software world isn't perfect, bugs are in shipped code,
and the Symbolics architecture catches many of the errors that might go
unnoticed or cause strange failures on most other systems when code is
compiled with low safety.

	    It does not detect
	them at the earliest possible moment, and it does not necessarily detect them
	at all.  It detects a subset of all type errors -- namely, those involving
	a selected set of predefined types (e.g., number) used as operands to a
	selected set of predefined operations (e.g., +).

    I would be interested to see an example of the system "not detecting"
    such an error for reasons other than hardware breakage.  

This example has nothing to do with declarations:

(defstruct foo a b c)      ; FOO has three slots
(defstruct bar d e f g)    ; BAR has four slots
(bar-d (make-foo))         ; a BAR accessor applied to a FOO instance

In Lucid, both interpreted and compiled with default optimization
settings, this signals an error about a wrong type argument to BAR-D.
Symbolics doesn't even detect this error when the code is interpreted!

(bar-g (make-foo))         ; BAR's fourth-slot accessor applied to a FOO

Both Lucid and Symbolics detect this error, but Symbolics claims that it
is an array subscript error, even though there are no obvious array
operations in the code.
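
My guess -- an illustration of one plausible representation, not a
description of the actual Symbolics internals -- is that structure
instances are laid out like vectors of slots, and the accessors simply
read a fixed slot index without checking the argument's type.  BAR-D
reads slot 0, which a FOO happens to have, so it "works"; BAR-G reads
slot 3, which is past the end of a three-slot FOO, hence the array
subscript error:

(let ((foo-like (vector nil nil nil)))  ; a FOO has three slots: A, B, C
  (aref foo-like 0)   ; analogous to BAR-D: slot 0 exists, no error
  (aref foo-like 3))  ; analogous to BAR-G: index 3 is out of bounds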

							     It's not at all
    clear to me what you think happens in lieu of an arg type error - I've
    certainly never observed this myself in several years of using Symbolics
    systems.  In Unix-based lisps, on the other hand, it's typically quite
    simple to obtain trash-your-lisp-environment-and-die behavior by passing
    the wrong type argument to a function.

As I pointed out in a previous message, most of the error checks
Symbolics provides automatically protect against environment trashing.
But that's a pretty arbitrary level of error checking.  Also, there are
many classes of error checks that have nothing to do with environment
trashing; if an implementation passes two floats to an integer
arithmetic instruction, and the hardware simply processes them as if
they were integers, it will get a valid (but meaningless) integer
result.
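
For what it's worth, here is a sketch -- portable CL, nothing
Symbolics-specific, and UNSAFE-ADD is just a name I made up -- of how
that kind of silent wrong answer can come about on stock hardware: with
type declarations and safety 0, many compilers emit raw machine
arithmetic and never look at the actual argument types.

(defun unsafe-add (x y)
  ;; At safety 0 the declarations are a promise, not a check; a
  ;; stock-hardware compiler will typically emit untyped integer adds.
  (declare (fixnum x y)
           (optimize (speed 3) (safety 0)))
  (the fixnum (+ x y)))

;; (unsafe-add 1.5 2.5) is then free to return a meaningless integer,
;; or worse, rather than signal a type error.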

Except for the error checks that protect the environment, most of
Symbolics's other error checks come as a side effect of something else.
For instance, checking arithmetic operation arguments is a side effect
of generic arithmetic, which must already do type dispatching.  Checking
flavor message and generic function applicability comes as a side effect
of method lookup.
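
To make the side-effect-of-dispatch point concrete, here is a toy
version of generic addition (my own simplification, not how any real
implementation is written): selecting the right addition routine
already requires looking at the argument types, so an unexpected type
falls into an error branch at no extra cost.

(defun generic-add (x y)
  ;; The ETYPECASE dispatch that picks fixnum vs. float addition is the
  ;; same mechanism that catches a wrong-type argument.
  (etypecase x
    (fixnum (etypecase y
              (fixnum (+ x y))
              (float  (+ (float x) y))))
    (float  (etypecase y
              (float  (+ x y))
              (fixnum (+ x (float y)))))))

;; (generic-add 1 "two") signals a TYPE-ERROR from the dispatch itself;
;; no separate check was added to get that behavior.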

                                                barmar