Date: 21 Nov 88 21:27:00 GMT
From: uxg.cso.uiuc.edu!uicbert.eecs.uic.edu!wilson@uxc.cso.uiuc.edu
Subject: large or long-running T pgms?
Message-Id: <82000003@uicbert.eecs.uic.edu>


I am trying to figure out how large and varied a body of T programs there
is, for the purpose of gathering dynamic statistics.  (These statistics
would be useful in designing garbage collectors, among other things.)

I am particularly looking for programs that are large and/or long-running,
especially those that allocate more than about 5 megabytes of data over
the course of a run.  Real, heavily-used programs are somewhat preferable
to throwaway prototypes, but all kinds would be helpful.

The point of this is that I may want to implement a special garbage
collector, instrumented to gather statistics on object survival, locality
of reference, and locality of state changes.  That is where the empirical
data mentioned above would come from.
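
To make "instrumented" concrete, the flavor of the bookkeeping I have in
mind is roughly the sketch below (in C).  It is purely illustrative: the
header layout, hook names, and histogram are invented for the example,
and none of it is actual T runtime code.  The idea is just that each
object gets stamped with the collection number current when it was
allocated, and at every collection the survivors are tallied by age.

/*
 * Hypothetical survival-statistics hooks for a copying collector.
 * Nothing here is taken from the T runtime; it only illustrates the
 * kind of data an instrumented collector could gather.
 */
#include <stdio.h>

#define MAX_AGE 32          /* ages beyond this are lumped together   */

static unsigned long gc_count;               /* collections so far     */
static unsigned long survivors[MAX_AGE + 1]; /* survivor counts by age */

/* Header an instrumented collector might prepend to every object. */
typedef struct obj_header {
    unsigned long birth_gc;  /* gc_count at allocation time */
    /* ... ordinary header fields (size, tag, forwarding word) ... */
} obj_header;

/* Called once per live object while scavenging, before it is copied. */
static void record_survivor(const obj_header *h)
{
    unsigned long age = gc_count - h->birth_gc;
    if (age > MAX_AGE)
        age = MAX_AGE;
    survivors[age]++;
}

/* Dumped at the end of the run (or after each collection). */
static void dump_survival_histogram(FILE *out)
{
    unsigned long age;
    for (age = 0; age <= MAX_AGE; age++)
        if (survivors[age] != 0)
            fprintf(out, "age %2lu collections: %10lu survivors\n",
                    age, survivors[age]);
}

/* Tiny driver showing how the hooks would be called. */
int main(void)
{
    obj_header a = { 0 }, b = { 0 };

    gc_count = 3;            /* pretend three collections have happened */
    record_survivor(&a);     /* allocated back before collection 0      */
    b.birth_gc = 2;
    record_survivor(&b);     /* allocated just before this collection   */

    dump_survival_histogram(stdout);
    return 0;
}

The same sort of hooks would also log the addresses touched and updated
between collections, which is where the locality-of-reference and
locality-of-state-change numbers would come from.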

If there is not a large body of *varied* programs, that weighs against
using T and in favor of a Common Lisp such as Kyoto Common Lisp.

Anyway, if you have large programs that you would be willing to send
me (ideally with easy startup instructions), please drop me a note
with a short description of what the program does, how long it runs,
and (if possible) a guess at how much memory it uses.


Thanks prematurely,

Paul


Paul R. Wilson                         
Human-Computer Interaction Laboratory
U. of Illin. at C. EECS Dept. (M/C 154)   wilson%uicbert@uxc.cso.uiuc.edu
Box 4348   Chicago, IL 60680