

re:  (A)  Jonl may correct me, but I think the experience at Lucid has
	  been that the overhead per constant for detecting circularities
	  is minor, perhaps down in the noise.

What you say is generally true, but of course there is at least one  painful
counterexample that can legitimately come up -- FASDMP.

Normally, file compilation spends its time in things like file-io,
macroexpansion, source-code analysis, code-generation, etc.; so the
hash-tabling time to EQify data (and to stop circularities) is down in 
the noise.  [Well, that's assuming that the implementation in question has
a reasonably efficient version of "hashing" hash-tables, rather than merely 
random "tables" that happen to be called hash-tables -- at the recent X3J13 
meeting, a couple of people who really ought to have known better claimed 
they didn't understand at all what that means.]
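[For concreteness, here is a minimal sketch of the table-look-up idea --
invented names, nothing to do with Lucid's actual FASDMP code: walk the
object graph once, recording each cons in an EQ hash-table, so shared
structure is dumped only once and circular references bottom out in a
back-reference instead of recursing forever.

  (defun dump-object (obj table stream)
    ;; TABLE must be an EQ hash-table: (make-hash-table :test #'eq)
    (let ((seen (and (consp obj) (gethash obj table))))
      (cond (seen
             ;; Already dumped: emit a back-reference, stopping circularity.
             (format stream "#ref~D " seen))
            ((consp obj)
             ;; First visit: register before descending, so a cycle
             ;; through this cons finds it in the table.
             (setf (gethash obj table) (hash-table-count table))
             (dump-object (car obj) table stream)
             (dump-object (cdr obj) table stream))
            (t
             (format stream "~S " obj)))))

The point of the earlier aside is that the (gethash obj table) probe had
better be a genuine O(1) hash lookup; if the "hash-table" degenerates to a
linear scan, this walk goes quadratic and the overhead is no longer in
the noise.]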

But for FASDMP, one can construct an arbitrarily bad case:  no time 
spent in macroexpansion, code analysis or generation, and practically
no time spent in file-io;  and no sharing or circularity whatsoever in
the datastructures.  Thus *all* the time spent in the table-look-up 
routines (read: "hash" routines) is wasted.   Still, this is such a
special case that I can't see devoting time trying to optimize it
unless more than one weirded-out user falls upon it as follows:
  (1) It is a natural case of some money-earning software product, and
      not just another exercise to show how bad someone's compiler is;
  (2) The time saved by a special-case switch to turn off the tabling
      algorithms can be translated into a real dollars-and-cents profit
      for that company -- i.e., not just "is frequently 3% slower", or
      "runs 300% slower once in the lifetime of the company".

re: I think any serious implementation of COMPILE-FILE will quietly handle
    circular data, just as any serious garbage collector is expected to.

Well-said, Jim!

-- JonL --