Issue: DECLARE-ARRAY-TYPE-ELEMENT-REFERENCES
- To: pierson%mist@MULTIMAX.ARPA
- Subject: Issue: DECLARE-ARRAY-TYPE-ELEMENT-REFERENCES
- From: Kent M Pitman <KMP@STONY-BROOK.SCRC.Symbolics.COM>
- Date: Fri, 7 Oct 88 18:54 EDT
- Cc: CL-Cleanup@SAIL.Stanford.EDU
- In-reply-to: <8810072119.AA03303@mist.UUCP>
This is inconsistent with current practice in that implementations are
currently permitted to ignore type declarations altogether -- not just
for arrays but for anything.
The only consistent proposal would be to say that in all cases where
declarations are made and violated, an implementation should signal an
error -- not just for arrays.
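To make this concrete, here is a small hypothetical example (mine, not
from the proposal) of the kind of situation at issue:

    (let ((x 0))
      (declare (type fixnum x))
      ;; This assignment violates the declaration.  Under current
      ;; practice an implementation may ignore the declaration and
      ;; proceed silently; a consistent version of the proposal would
      ;; require an error to be signalled here -- and likewise for the
      ;; analogous violation of an array element-type declaration.
      (setq x 'not-a-fixnum)
      x)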
Either way, you're talking major work. Although not technically an
incompatible change, some implementations (like the Lisp machine family)
currently don't track these declarations at all, since doing so would
mostly just slow things down. Some other (stock hardware) implementations
only track the types they know how to optimize. E.g., they might
ignore SYMBOL declarations because they don't have an optimization that
pertains to symbols but they might track FIXNUM declarations because
they know lots of cool things to do with that.
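As a hypothetical sketch of that asymmetry (not drawn from any
particular implementation):

    (defun bump (n s)
      (declare (fixnum n)    ; often tracked: permits unboxed arithmetic
               (symbol s))   ; often ignored: no optimization applies
      (values (1+ n) s))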
As such, your cost to implementors seems a little understated. I
do not personally maintain that aspect of any implementation, so I'm
just guessing, but it looks to my naive eye like it's not just
a matter of "extending some checking" -- it might also mean some
pretty sweeping changes to the kinds of information that have to
be tracked by interpreters and compilers, and perhaps in some cases
even to the modularity of the interpreter and/or compiler.
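To give a feel for what such tracking would entail, here is a
hypothetical sketch (again just my guess, not anything in the proposal)
of the check an interpreter would have to interpose on every assignment
to a declared variable, and analogously on every binding and array store:

    (defun check-declared-assignment (variable value declared-type)
      ;; Hypothetical helper: the interpreter would have to remember
      ;; DECLARED-TYPE for each declared variable and run this check
      ;; on every SETQ of that variable.
      (unless (typep value declared-type)
        (error "Value ~S for ~S is not of declared type ~S."
               value variable declared-type))
      value)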