Re: [not about] gc-by-area and host uptimes

About the "Genera philosophy": Right.  We kind of know all that.
(This IS SLUG, after all.)  The question is, how relevant is it?

Two things to notice: (a) Symbolics hasn't been routinely giving out
sources for a long time now.  (b) Reusing code over multiple
hardware systems has nothing to do with the underlying hardware.
Anybody remember the Interlisp virtual machine?  Once you had it
running you could port over everything else pretty easily, since all
the tools were written in lisp.  [Whether or not you would want those
tools is yet another question :-)]

All the hardware arguments boil down to "unless you have special
hardware it isn't fast enough".  Compared to what?  A lot of the
baselines assume the generic stuff is just as slow as the specialized
stuff.  But what if it is faster (indeed, a whole lot faster)?  IF you
built your specialized machine to have basic cycle rates the same as
the fastest generic box THEN these arguments would be correct.  But if
the specialized box is slower, then it becomes dubious which way the
performance tradeoff actually lies.

When I benchmark my code on generic hardware, I DO NOT turn off type
checking, I DO NOT install extra declarations, etc.  I have enough on
my hands w/o trying to make up for deficiencies in the lisp
environment.  And the generic stuff is really cleaning up in terms of
performance on my code.
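
[To make the comparison concrete: the kind of hand tuning I'm
refusing to do looks roughly like the sketch below.  The function
name and types are made up for illustration; the point is the
(optimize (speed 3) (safety 0)) and type declarations that many
stock-hardware benchmarks quietly assume.]

    ;; Hypothetical example of the declarations I do NOT add
    ;; when benchmarking.  (safety 0) turns off runtime type
    ;; checking; the type declarations let the compiler open-code
    ;; the float arithmetic.
    (defun sum-floats (v)
      (declare (type (simple-array double-float (*)) v)
               (optimize (speed 3) (safety 0)))
      (let ((s 0.0d0))
        (declare (type double-float s))
        (dotimes (i (length v) s)
          (incf s (aref v i)))))

My numbers come from code compiled with default safety and no
declarations, and the generic boxes still win.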

I've said this before: I wish Symbolics had concentrated on rewriting
their software for performance instead of DW.  I'll bet their sales
would be a lot better now.