Re: [Spock@SAMSON.cadr.dialnet.symbolics.com: IFU]
Date: Fri, 2 Dec 88 08:46 PST
From: DE@PHOENIX.SCH.Symbolics.COM (Doug Evans)
[ stuff deleted ]
Also, it's pretty difficult to squeeze performance out of a 40-bit
computer with a 32-bit address space when it has to use a bus structure
with 36-bit datapaths and 24-bit address paths.
I thought the address path was 28 bits, since the word organization is
(I'm told) 28-bit address + 6-bit tag + 2-bit cdr code or 32-bit
immediate data + 2-bit tag + 2-bit cdr code. If the address path is
24-bit, how do you deal with the extra 4 bits?
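To make the word layout above concrete, here is a small sketch of how such a word might decode. The bit positions are my own assumption (cdr code in the top 2 bits, tag in the next 6, data/address in the low 32) and are only meant to illustrate the layout as described, not the actual Ivory encoding:

```python
# Hypothetical field positions for a 40-bit word:
#   bits 39-38: cdr code, bits 37-32: tag, bits 31-0: data
CDR_SHIFT, CDR_MASK = 38, 0x3
TAG_SHIFT, TAG_MASK = 32, 0x3F
DATA_MASK = 0xFFFFFFFF
ADDR_MASK = 0x0FFFFFFF  # 28-bit address field per the layout quoted above

def decode(word40):
    """Split a 40-bit word into (cdr, tag, data, address) fields."""
    cdr = (word40 >> CDR_SHIFT) & CDR_MASK
    tag = (word40 >> TAG_SHIFT) & TAG_MASK
    data = word40 & DATA_MASK
    addr = data & ADDR_MASK  # pointer case: only the low 28 bits are address
    return cdr, tag, data, addr
```

On this reading, a 24-bit external address path would indeed leave 4 of the 28 address bits with nowhere to go, which is the crux of the question.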
While I'm asking questions:
How does the MacIvory do 40-bit (or 48 -- see next question) words with
the 32-bit NuBus?
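One conceivable answer is that 40-bit words get marshalled across the 32-bit bus in groups. The packing below (four 40-bit words into exactly five 32-bit bus words) is purely a hypothetical scheme to show the arithmetic works out evenly; the MacIvory may well do something simpler, like padding each word to 64 bits and taking two transfers per word:

```python
W40 = (1 << 40) - 1  # mask for a 40-bit word

def pack4(words40):
    """Pack four 40-bit words into five 32-bit bus words (4*40 == 5*32)."""
    assert len(words40) == 4
    big = 0
    for w in words40:
        big = (big << 40) | (w & W40)
    return [(big >> (32 * i)) & 0xFFFFFFFF for i in reversed(range(5))]

def unpack4(bus_words):
    """Inverse of pack4: recover four 40-bit words from five bus words."""
    big = 0
    for bw in bus_words:
        big = (big << 32) | bw
    return [(big >> (40 * i)) & W40 for i in reversed(range(4))]
```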
I may be confused by this, but the literature I've read about the XL400
implies that its memory organization includes 8 bits of error
correction/detection, and that its Ivory board handles the ECD itself.
Is the MacIvory set up that way too? Since this is a substantial amount
of memory overhead (and maybe silicon, too), I was wondering what the
design argument was to add it in. Is the memory potentially bad enough
to require a lot of error-fixing overhead?
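For scale, the 8 bits is close to the textbook minimum rather than an extravagance: a standard Hamming single-error-correct code over 40 data bits needs 6 check bits, plus one overall parity bit for double-error detection, i.e. 7 bits of SECDED minimum. Whether that arithmetic is actually why Symbolics chose 8 is my guess, not something from the literature:

```python
def sec_check_bits(data_bits):
    """Smallest r with 2**r >= data_bits + r + 1 (Hamming SEC bound)."""
    r = 1
    while 2 ** r < data_bits + r + 1:
        r += 1
    return r

def secded_check_bits(data_bits):
    # One extra overall parity bit upgrades SEC to SECDED.
    return sec_check_bits(data_bits) + 1
```

So 8 check bits on a 40-bit word is a 20% memory overhead, only one bit above the SECDED floor.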
I think that the 3600 architecture allowed for up to 34 tags. The
Ivory, with a full 6-bit tag field (assuming you're still using 2 bits
out of the 40 for a cdr code), lets you have up to 64. Isn't there
some big potential for incompatibility here?