
Hardware/Software/People Costs (was Risc vs Symbolics)



    Date: Tue, 14 Aug 90 05:48 EDT
    From: RWK@fuji.ila.com (Robert W. Kerns)

	Date: Mon, 13 Aug 90 19:08 CDT
	From: lgm@iexist.att.com

[...]

    Actually, I see the iron that runs the software to
    be an ever-shrinking portion of the total cost.  

The common fixation on hardware costs dates back to the '50s and '60s, when
the machine really was the biggest cost.  Timesharing helped cut those costs.
But the equation has changed, and old habits die hard.

In 1982 I attended a panel on "The Future of Seismic Computing" at the
Society of Exploration Geophysicists convention.  Seven of the eight
panelists had basically the same message: "Data explosion!  More MIPS!!"
I'm not denying that it is a critical concern, but -

One man got up and explained the economics of computer systems, talking
about how under-utilized equipment costs money, etc. etc.  He then said,
"We should already be looking at the end-user as an under-utilized
peripheral."  This caused gasps as attendees objected to being compared
to machines; the point was missed.

It still seemed to be unrecognized in 1987, when we visited the Houston
Area Research Council to try to interest them in Symbolics user-interface
front-ends to their six-month-old NEC SX-2 supercomputer.  There were
supposedly users networked in from universities and industry sites in
the region; we also saw a few there, working at ASCII terminals.  The
saddest part of our tour was the % Utilization meter on the main
console: it spent most of its time at zero, and every few seconds it
would make a feeble attempt to reach 5%.  Wasted people, wasted hot-iron.

    I've spent many times the $$$ on software for my Mac than I
    have for hardware, and I have a quite hefty configuration.