Re: small integers
- To: firstname.lastname@example.org
- Subject: Re: small integers
- From: email@example.com (James E. O'Dell)
- Date: Tue, 13 Oct 1992 22:33:27 -0400
- Cc: Scott_Fahlman@SEF-PMAX.SLISP.CS.CMU.EDU, firstname.lastname@example.org
- Organization: Fort Pond Research
- Reply-to: email@example.com
On Tue, 13 Oct 92, firstname.lastname@example.org (David A. Moon) wrote:
>> Date: Sat, 10 Oct 92 02:33:59 -0400
>> From: Scott_Fahlman@SEF-PMAX.SLISP.CS.CMU.EDU
>> .... At times you
>> seem to favor the "correct" Common Lisp approach, and at other times
>> seem to accept the idea of an overflow error under certain conditions.
>I think I have consistently said that I favor an overflow error when it is a
>type error, that is, when it can be shown that if a bignum were created
>instead of the overflow it would cause a type error before the next observable
>side-effect.
I think that you'll find almost every scientific computer designer is in favor
of that approach. I think that what you'll find is that in a machine with a
complex architecture (vector or multiple instruction issue) you may
not be able to catch the error when you desire.
>I think (but I'm not sure about this part) that an overflow in the middle of
>a complex arithmetic expression can often be shown either to cause an error
>at the end (and thus we can reason backwards that it is a type error) or to be
>handled by using increased precision for the intermediate results in the
>arithmetic expression, without going beyond the precision the machine
>supports. Think about (* (+ x y) z) where we know they are all small integers
>and z is not 0 or -1, and the result is being fed to something that only
>accepts small integers.
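The increased-precision idea in the quoted example can be sketched in C. This is a hedged illustration, not Dylan's actual implementation: it assumes a hypothetical 30-bit "small integer" (fixnum) range and uses a 64-bit intermediate so the sum (+ x y) can exceed the fixnum range without ever allocating a bignum; only the final result is range-checked. The names `FIXNUM_MIN`, `FIXNUM_MAX`, and `fixnum_mul_add` are illustrative, not from any real system.

```c
#include <stdint.h>

/* Hypothetical small-integer (fixnum) range: 30-bit signed. */
#define FIXNUM_MIN (-(1 << 29))
#define FIXNUM_MAX ((1 << 29) - 1)

/* Compute (* (+ x y) z) with a 64-bit intermediate, so an overflow of
   (+ x y) past the fixnum range never forces a bignum.  Returns 1 and
   stores the result if it fits a fixnum; returns 0 (the type error in
   the quoted reasoning) otherwise.  Note: when z is 0 or -1, an
   intermediate overflow can still yield an in-range final result --
   which is exactly why the quoted example excludes those values. */
static int fixnum_mul_add(int32_t x, int32_t y, int32_t z, int32_t *out) {
    int64_t sum  = (int64_t)x + (int64_t)y;  /* |sum| < 2^31: no 64-bit overflow */
    int64_t prod = sum * (int64_t)z;         /* |prod| < 2^61: still safe */
    if (prod < FIXNUM_MIN || prod > FIXNUM_MAX)
        return 0;                            /* final result doesn't fit a fixnum */
    *out = (int32_t)prod;
    return 1;
}
```

The point of the sketch is that the machine's native double-width arithmetic covers the intermediate, so the only question left is whether the final value fits, and that check can be deferred to the end of the expression.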
Isn't the problem here that "often" isn't good enough? You either have to catch
the arithmetic error for sure, or, if you can't catch it for sure, you might just
as well ignore it like C and let the programmer deal with it as best he can.
Do you intend that Dylan have arithmetic that is "as good as" FORTRAN, or will
it be "almost as good"? I would argue that for general acceptance of the
language, for both political and technical reasons, only "as good as" will do.