gc-by-area and host uptimes
Date: Thu, 9 Feb 89 10:46:31 +0200
From: luria%sel.technion.ac.il@RELAY.CS.NET (Marc Luria)
Could someone explain techniques for using gc-by-area?
Do you do it to all areas, including "static" ones?
If not, how do you choose?
Incant the command ":Start GC :Immediately By-Area". It gives
you a menu of areas and their sizes. You get to choose which
ones to GC and which ones not to. It will tell you when the
sum of the areas which you selected is less/greater than the
amount of remaining memory and thus whether the GC has the potential
of running out of copy space. You have to use your intuition to
select the proper subset of areas that 1) is small enough and
2) will likely free up the most storage. You must bear in mind
that storage in an area you GC will not be reclaimed if it is still
pointed to from areas you did not GC, so pick a set of areas whose
pointers mostly run back and forth among themselves rather than
coming in from outside the set.

What I do is run several iterations of this: each iteration frees up
more storage, which lets me GC a larger and larger subset of the
defined areas, until I regain enough storage to do a normal
":Start GC :Immediately". I seem to remember doing three to five
iterations, which took about 12 hours. At the end I had reduced my
memory usage from over 95% down to 60% (at least by informal
observation of my GC thermometer). One hint worth knowing is
that the presentation/dynamic window system creates a lot of
garbage, so if you say yes when the GC asks about clearing the
history and then GC those areas, I find
that I recover a significant portion of memory.
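
In case it helps to see the selection logic spelled out, here is a
rough sketch, in plain Common Lisp rather than anything
Symbolics-specific, of the kind of greedy choice I make by intuition.
The area names, sizes, and garbage estimates are all made up for
illustration; the real numbers come from the menu that
":Start GC :Immediately By-Area" shows you.

;;; Toy model of picking areas for GC-by-area.  Everything here is
;;; illustrative; only the reasoning matches what I do by hand.

(defstruct area
  name          ; symbol naming the area
  size          ; total words allocated in the area
  garbage)      ; my guess at the fraction of it that is garbage

(defun choose-areas (areas free-words)
  "Greedily pick areas, best expected reclaim first, keeping the
total size of the chosen set within FREE-WORDS so the copying GC
has room for copy space."
  (let ((chosen '())
        (budget free-words))
    (dolist (a (sort (copy-list areas) #'>
                     :key (lambda (x) (* (area-size x) (area-garbage x)))))
      (when (<= (area-size a) budget)
        (push a chosen)
        (decf budget (area-size a))))
    (nreverse chosen)))

;; Example: with 5 million words free and three areas to consider
;; (numbers invented), pick what to collect first.
(choose-areas
 (list (make-area :name 'dynamic-window-stuff :size 3000000 :garbage 0.8)
       (make-area :name 'my-hash-tables       :size 4000000 :garbage 0.2)
       (make-area :name 'compiled-functions   :size 6000000 :garbage 0.05))
 5000000)

Each iteration frees storage, which raises the free-word budget and
lets the next iteration take on a larger subset of the areas.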
What is this slow GC, and how slow is it?
According to Michael Greenwald (informally), it is a mark-and-sweep-like
GC which requires only a very small amount of memory to do a
full GC. Although I understand the benefits of a real-time copying GC
for interactive use of a large virtual-memory Lisp, I have for a
long time wished that Symbolics would offer an offline GC that required
a minimal amount of free storage, for two reasons: 1) it would allow
unattended operation of long compute-bound jobs that use a lot of memory,
and 2) it would let me recover from situations like the one I
am currently in without having to baby-sit the machine through
GC-by-area.
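
To make the two space requirements concrete, here is a
back-of-the-envelope comparison. All the numbers, including the
one-mark-bit-per-word overhead, are invented for illustration and are
not measurements of the Symbolics GC; the point is only that copy
space scales with the data being collected, while a mark bitmap is a
tiny constant fraction of it.

;;; Rough headroom needed to collect SIZE words of storage, of which
;;; LIVE-FRACTION survives.  Illustrative figures only.

(defun copying-gc-headroom (size live-fraction)
  ;; A copying collector needs free space for everything that survives,
  ;; since live objects are copied out before oldspace is reclaimed.
  (ceiling (* size live-fraction)))

(defun mark-sweep-headroom (size)
  ;; A mark-and-sweep collector needs roughly one mark bit per word
  ;; (plus a traversal stack), independent of how much survives.
  (ceiling size 32))

;; Collecting 10 million words that are 60% live:
;;   (copying-gc-headroom 10000000 0.6)  =>  6000000 words of copy space
;;   (mark-sweep-headroom 10000000)      =>   312500 words for mark bits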
In terms of host uptimes: when I was finishing my thesis, and was worried
that if I rebooted nothing would ever work again, I kept
my machine up for a couple of months. Unfortunately, some areas
kept on growing, and I wasn't aware of gc-by-area.
Solution: Add Paging File
As I got closer to breakdown, I just wrote over another world load.
That won't work for me. I currently have no world loads on my machine
as I netboot. My two disks (140 Mbyte and 190 Mbyte) are nothing but
paging space. So the good old "spare-tank.page" solution won't work.
Jeff