Symbolics Germany prices

    Date: Mon, 26 Aug 1991 15:11:33 +0000
    From: baechler%liasun4.epfl.ch@Warbucks.AI.SRI.COM (Emmanuel Baechler)

		     How can you blame an administrator 

See below.
						       who refuses you the
    credit for a Symbolics when you can get a Sparc Station for $6,000 and
    when you get a Lisp environment almost for free because we have a
    site license? I know that the environment of my Symbolics is still
    fundamentally better than what I can find on a standard Unix box, but the
    price difference is so big that this argument becomes irrelevant to
    the people who hold the money.

Not irrelevant, just a little harder to understand.

    Emmanuel Baechler

It all depends on how long you intend to be programming, or how long
management intends to stay in business.  If management only makes
decisions based on absolute cost, maybe they don't know how to make
Return-On-Investment calculations.  For instance (plug in your own
numbers):

Programmer Salary:     $50,000

Machine A
Cost:                   $5,000
"Productivity Factor":       1

Machine B
Cost:                  $50,000
"Productivity Factor":       5

The productivity factor is the hard part.  It depends on the user, the
complexity of the application, and the quality of the implementation
(maintainability).  It also seems to be impossible to measure in
practice, because of the lack of controls.  (E.g., you can't have the
same programmer "benchmark" her implementation times in two
environments, since she'll learn from the first experience and do
better the second time.)  I suppose that according to classical
management theory, since productivity cannot be measured, it's not real.

Still, 5 or so years ago, there were a couple of papers about an
application that had been attempted unsuccessfully in conventional
environments, and then successfully so in Genera.  You would think that
would mean there's an infinite productivity factor, but the paper was a
little more conservative than that, and came up with a figure that
seemed to agree with commonly stated subjective estimates of the value:
5.  It's probably less, now, but I'm ignorant of machine prices and
configurations at the moment, so I'll let it stand.  Plug in your own
figure.

In any case, you can probably get away with using something meaningless,
like "lines of debugged code" as a productivity metric for purposes of
selling your argument to management.  So you've got (t in years, say)

Cost(A,t) =  5000 + (50000 * t)
Cost(B,t) = 50000 + (50000 * t)
Lines(B,t) = 5 * Lines(A,t)

For each machine, figure the cost per line of code as a function of
time, and solve for the crossover between the two machines.  In this
case, it turns out that if you intend to keep your machines for longer
than about 1.25 years, B is more economical, and after that point, it's
as though you've got 5 programmers for the price of one.  If you
consider the nonlinear effects of intra-team communication (see "The
Mythical Man-Month" by Brooks), you're also dropping the cost of that
overhead.
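The back-of-the-envelope crossover is easy to check in a few lines of
Python.  The prices, salary, and productivity factor below are the
illustrative figures from this message, not real quotes; "crossover"
here means equal total cost for the same amount of finished code.
(Keeping machine A's $5,000 in the equation puts the exact crossover at
1.125 years, in the same ballpark as the year-and-a-quarter quoted.)

```python
# Illustrative figures from the message above (not real quotes):
SALARY = 50_000.0      # programmer salary, $/year
COST_A = 5_000.0       # machine A purchase price, $
COST_B = 50_000.0      # machine B purchase price, $
SPEEDUP = 5.0          # "productivity factor" of B over A

def total_cost_a(t):
    """Cost of output that takes t programmer-years on machine A."""
    return COST_A + SALARY * t

def total_cost_b(t):
    """Cost of the same output on machine B: it only takes t/SPEEDUP years."""
    return COST_B + SALARY * t / SPEEDUP

# Crossover: COST_A + SALARY*t == COST_B + SALARY*t/SPEEDUP
t_cross = (COST_B - COST_A) / (SALARY * (1 - 1 / SPEEDUP))
print(f"crossover at {t_cross:.3f} years")    # crossover at 1.125 years
print(total_cost_a(2.0) > total_cost_b(2.0))  # True: past the crossover, B wins
```

Past the crossover the $45,000 price difference has been paid back out
of salary savings, and every further year of use is pure gain for B.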

A year AND a quarter is kind of unfortunate, since most publicly-traded
companies these days seem to be run as though they're only going to be
in business for a year OR a quarter.

I think the problem is a common blindness to the recurring costs of
salaries that dates back to the days when computers were MUCH more
expensive than programmers.  That's no longer the case, and managers
who continue to look only at hardware costs ...