- To: email@example.com
- Subject: proposal LOAD-TIME-EVAL:REVISED-NEW-SPECIAL-FORM
- From: Kent M Pitman <KMP@STONY-BROOK.SCRC.Symbolics.COM>
- Date: Tue, 20 Sep 88 22:41 EDT
- Cc: Gregor.pa@Xerox.COM, eb@SAIL.Stanford.EDU, Moon@STONY-BROOK.SCRC.Symbolics.COM, firstname.lastname@example.org, email@example.com, kmp@STONY-BROOK.SCRC.Symbolics.COM, firstname.lastname@example.org
- In-reply-to: <8809210219.AA20490@bhopal>
It doesn't make any sense to talk about the number of uses that a primitive has.
PROGV isn't used that much either -- many users never use it. Yet it's generally
acknowledged to be a good thing just for those few places where it's needed.
The situations Gregor and I are talking about are rare by one metric: lines of
code. Maybe #, will occur only once in a portable implementation of CLOS. (Though
my guess is much higher -- perhaps even 2 or 3 times. :-) But without it, it may
be hard to make such a portable implementation.
I also needed the feature for my Fortran->Lisp translator, 10 years ago in Maclisp.
It allows one to emulate ``linking.'' If I wrote the program again today in
Common Lisp, I couldn't do as well because at least the Maclisp facility provided
for an interface to macros.
Are you going to count the number of lines of code (fewer than 10, probably),
the number of programs that write DEFMETHOD or Fortran SUBROUTINE (probably in
the thousands), or the number of times those programs are called (conservatively
in the hundreds of thousands) when you count the frequency of use?
If #, is not used a lot, it's not because it's some low-level grungy thing that only
a few people have the poor taste to use. It's because the language designers haven't
gotten their act together enough to give it a portable definition that people can use.
Currently, #, is defined in a non-portable way that makes it a low-level grungy
thing, but in fact it is an abstract concept that I think should be fundamental
to everyone's knowledge of programming.
Programming is about decision-making. Some decisions are delayed longer than others.
One key to making decisions at the right time is being able to name the times at
which decisions might be made. It's ludicrous to think that only
at toplevel would there be decisions that need to be postponed until runtime and
that EVAL-WHEN should suffice. It's ludicrous to say that every postponed decision
should need a first-class name. Some decisions in the middle of a program body
need to be postponed without sacrificing anonymity. #, provides a natural
mechanism for, at any point in a program, saying "I won't know what goes here
at compilation time, but the instant I get ahold of the runtime environment I can
tell you the object I'm talking about." There's nothing low-level or shameful about
that.
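To make the idiom concrete, here is a minimal sketch (assuming the revised semantics
under discussion, in which #, inside a form being compiled to a file causes its
subform to be evaluated exactly once, at load time, with the result substituted in
place -- rather than being evaluated at read time as in older definitions):

```lisp
;; A per-definition cache whose identity is fixed at load time.
;; The MAKE-HASH-TABLE call runs once, when the compiled file is
;; loaded -- not at compile time, and not on every call to LOOKUP.
(defun lookup (key)
  (gethash key #,(make-hash-table :test 'equal)))
```

This is exactly the "anonymous postponed decision" described above: the table needs
no toplevel name and no EVAL-WHEN, yet its creation is deferred until the runtime
environment exists.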