
Sun and Symbolics development environments



    Date: Mon 27 Jul 87 10:21:54-CDT
    From: "Chunka Mui" <C.AITCMUI@chip.uchicago>


    There's been a lot of discussion recently comparing and contrasting
    the Sun and Lispm development environments but this discussion has
    focused primarily on the software side of the issue.  How do they
    compare from a strictly hardware (and maybe very low level system
    software) standpoint?  Is there anything that one can do that the
    other can't because of implicit restrictions?  Is this a naive
    question?

    Chunka

    -------

There are some important differences that relate to the underlying
hardware architecture.  All of them relate to your ability to implement
large integrated applications.  These applications depend on a large
shared address space for their integration.  If you have a truly large
application database, the timesharing style of writing the data out to
a disk file and reading it back in when switching between functions in
the application becomes unmanageable.  So you need a large, long-lived,
shared-memory database to provide a good level of integration between
all the aspects of the application.
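
To make the shared-memory style concrete, here is a rough sketch in
Lisp (the names are my own, invented purely for illustration): every
module of the application works on the same in-core objects, so
switching from one function of the application to another is just a
matter of calling another function on the same data, rather than
writing a file out and parsing it back in.

(defstruct design-object
  name          ; symbol naming the object
  properties)   ; property list shared by every module that touches it

(defvar *design-database* (make-hash-table :test #'eq)
  "One long-lived, in-core database shared by all application modules.")

(defun find-object (name)
  "Return the object named NAME, creating it on first reference."
  (or (gethash name *design-database*)
      (setf (gethash name *design-database*)
            (make-design-object :name name))))

;;; Two different "modules" annotating the same object; neither one
;;; ever serializes the database to a disk file to hand it to the other.
(defun annotate (name key value)
  (setf (getf (design-object-properties (find-object name)) key) value))

(defun lookup (name key)
  (getf (design-object-properties (find-object name)) key))

In the timesharing style, each of those two calls would instead be a
write and a re-read of the whole design file.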

Our VLSI CAD system is a good example of this type of application.  It
provides an environment that allows the designer to move easily up and
down the design hierarchy from block level architecture design through
logic design to device and mask design.  Thus when the designer is
trying to optimize a portion of the mask layout he can easily see
exactly where in the logic and architecture he is to guide his efforts.
When the system does a circuit level simulation of a portion of a chip
it can drive that simulation from an efficient architecture level
simulation of the rest of the chip.  After a circuit level simulation
the waveforms are easily available by pointing at nodes in a logic
drawing.  This package was used to design the Ivory chip with about ten
times the designer productivity of traditional design techniques.  Since
there is effectively only one database, the architecture, logic, gate,
and mask level descriptions are always kept in sync.  Given a 400,000
gate chip, the long-lived, shared-memory approach is the only one that
could have dealt effectively with this type of problem.
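
As an illustration only (this is not the actual CAD system's
representation), the cross-level integration comes down to the fact
that a node in the logic drawing can simply hold Lisp references to
its mask geometry and to its simulated waveform, and the mouse
handler just follows those references:

(defstruct logic-node
  name
  mask-geometry      ; reference into the mask-level description
  waveform)          ; filled in directly by the circuit-level simulator

(defun show-waveform (node)
  "Called when the designer points at NODE in the logic drawing."
  (let ((w (logic-node-waveform node)))
    (if w
        (format t "~&Waveform for ~A: ~S~%" (logic-node-name node) w)
        (format t "~&~A has not been simulated yet.~%"
                (logic-node-name node)))))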

I can imagine this technique being very effective in providing large
increases in productivity to users in many different application
areas.  I can imagine an intelligence analyst integrating large amounts
of data from many sources, using multiple techniques, producing separate
reports for different users.  Similarly, I can imagine a plant general
manager modeling his entire plant's operations to determine the impact
of unexpected demand for a new product or a part shortage.


There are two technical issues that must be dealt with regarding memory
management in this kind of system.  The first is "large", and the second
is "long-lived".  To build a large integrated database for such an
application requires tools such as object-oriented programming
to manage the complexity that the programmer must deal with in
developing a part of the application.  This is really the Genera story.
The second part of the problem is the need for the environment to be
long lived.  The size of the database means that it is not possible to
be writing it out to disk every time the user switches from one
application module to the next.  It is clear to me that strong hardware
data integrity checking is necessary for building large, long-lived
applications that continue to survive despite the inevitable bugs.  The
Symbolics 3600 architecture provides hardware that supports objects and
structures as the basic data model.  This provides data type checking in
hardware.  It also means array bounds checking.  It means hardware
support for garbage collection integrity as well as performance.
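
As a rough sketch of what that checking buys you (any Lisp with
checking turned on behaves this way; the point of the 3600 is that the
tag and bounds checks happen in parallel in hardware rather than
costing extra instructions):

(defun checking-demo ()
  (let ((v (make-array 4 :initial-element 0)))
    ;; An out-of-bounds index signals an error immediately instead of
    ;; quietly reading or clobbering a neighboring object's storage.
    (handler-case (aref v 10)
      (error (e) (format t "~&Bounds check caught: ~A~%" e)))
    ;; A data-type mismatch is trapped by the type check rather than
    ;; being silently treated as a raw machine word.
    (handler-case (+ (aref v 0) 'not-a-number)
      (error (e) (format t "~&Type check caught: ~A~%" e)))))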

It is also important that the operating system as well as application
modules written in C or FORTRAN get the benefit of this checking, as they
are at least as likely to eventually trash memory as the Lisp portion of
the code.  This is the Symbolics 3600 hardware story.