over a billion consed
Only a billion? :-) I just checked my machine's current state, and it's
consing up cells even faster than the world is consing up humans.
The scavenger is busy finding old objects that are still in use and transporting
them. Currently it is finding ephemeral objects referenced by pages in main memory.
Scavenging during cons: On, Scavenging when machine idle: On
The GC generation count is 19018 (1 full GC, 2 dynamic GC's, and 19015 ephemeral GC's).
Since cold boot 5,163,969,312 words have been consed, 5,150,597,121 words of garbage have
been reclaimed (at most 326,820 words of garbage might be reclaimed during
this GC), and 20,048,567 words of non-garbage have been transported.
The total "scavenger work" required to accomplish this was 18,430,443,396 units.
Use Set GC Options to examine or modify the GC parameters.
This is from running an implementation of a standard graph-coloring algorithm
(the Brelaz algorithm). The consing comes from a function that returns a list
of possible nodes to color next.
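That selection step can be sketched generically; this is an illustration of the Brelaz/DSATUR "which node next?" rule rendered in Python rather than the poster's Lisp, and the adjacency-dict representation and function names are assumptions:

```python
# Sketch of the DSATUR selection step: among uncolored nodes, find those
# with the highest saturation (number of distinct colors already on
# neighbors), breaking ties by degree.  Returning a fresh list of the
# equally good candidates is what allocates on every call.
def candidates(graph, coloring):
    """graph: {node: [neighbors]}, coloring: {node: color}."""
    def key(n):
        saturation = len({coloring[m] for m in graph[n] if m in coloring})
        return (saturation, len(graph[n]))
    uncolored = [n for n in graph if n not in coloring]
    if not uncolored:
        return []
    best = max(key(n) for n in uncolored)
    return [n for n in uncolored if key(n) == best]
```

With nothing colored yet the highest-degree node wins outright; once coloring starts, saturation ties produce multi-element lists, which is where the branching below comes from.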
Since we're gathering statistics on the behavior of the algorithm, instead of
having it return the first solution, I have it try every equally-good choice
whenever confronted by a decision. This made it take over three days
exploring ways to color a single 30-node graph, and that's where most of those
5-billion cons cells came from.
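Branching on every tie turns the greedy algorithm into a tree search, and the tree's size is the product of the tie counts at each decision. A self-contained sketch (hypothetical names, Python for illustration) that counts how many coloring orders such a search visits:

```python
# Sketch of the exhaustive variant: at each step, recurse on EVERY node
# tied for best (saturation, degree) instead of picking one.  Each k-way
# tie multiplies the number of explored orderings by k, which is how a
# single 30-node graph can keep a machine busy for days.
def color_all_ways(graph, coloring=None):
    coloring = {} if coloring is None else coloring
    uncolored = [n for n in graph if n not in coloring]
    if not uncolored:
        return 1                      # one complete coloring order reached
    def key(n):
        saturation = len({coloring[m] for m in graph[n] if m in coloring})
        return (saturation, len(graph[n]))
    best = max(key(n) for n in uncolored)
    total = 0
    for n in (u for u in uncolored if key(u) == best):  # every tied choice
        used = {coloring[m] for m in graph[n] if m in coloring}
        color = next(i for i in range(len(graph)) if i not in used)
        total += color_all_ways(graph, {**coloring, n: color})
    return total
```

On a triangle, where every step is a full tie, this explores all 3! = 6 orders; on larger graphs the count, and the allocation behind it, grows multiplicatively.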
I'm not sure what this proves. I could probably cut the consing a thousandfold just by changing one function so that it doesn't cons up a list when that list would contain only a single element.
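That one-element-list fix might look like the following; a hypothetical Python sketch, not the poster's code, showing the idea of returning the lone winner directly instead of wrapping it in a list:

```python
# Sketch of the "don't cons a singleton" optimization: when exactly one
# node is tied for best, return the node itself rather than building a
# one-element list around it.  Callers then handle both shapes.
def next_nodes(graph, coloring):
    def key(n):
        saturation = len({coloring[m] for m in graph[n] if m in coloring})
        return (saturation, len(graph[n]))
    uncolored = [n for n in graph if n not in coloring]
    best = max(key(n) for n in uncolored)
    tied = [n for n in uncolored if key(n) == best]
    return tied[0] if len(tied) == 1 else tied  # node, not (list node)
```

In the Lisp original the payoff is larger than it looks here, since the common unique-winner case would then allocate no fresh cons cell at all.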