Re: Y combinator derivation

In article <27390@shemp.CS.UCLA.EDU> pierce@lanai.UUCP (Brad Pierce) writes:
\$
\$I think the previous poster was asking for a derivation of the
\$*applicative-order* Y combinator, not the normal-order as given
\$here. ``The Little Lisper'' (as someone else mentioned) has a
\$good presentation, but its complexity points out the price one
\$pays for using an applicative-order language.
\$
\$In normal-order languages, like standard lambda calculus, the
\$derivation is straightforward. The motivation for the Y-combinator
\$in lambda calculus is to allow the use of recursive definitions.
\$Recall that there is no such thing as assignment in lambda calculus,
\$but that this elegant and mathematically tractable model of
\$computation is rich enough to support recursion anyway.
\$
\$The standard example goes as follows:
\$
\$ [A quite proper description of normal-order Y deleted.]
\$
\$is the so-called Y-combinator given above. It is easy to verify
\$the definition, as long as you remember that it is to be
\$evaluated in normal-order. Finding a fixed-point combinator
\$is not so easy when using applicative-order evaluation. Please see
\$the ``Little Lisper''.
\$
\$Maybe it's time to start thinking seriously about whether the
\$current performance benefits of applicative-order evaluation
\$are worth the theoretical and human factors price tag.

Astounding. The applicative-order Y is not prohibitively more
complex than the normal-order one. One need only replace, in

\$> Y = lambda h.(lambda x. h(x x))
\$>              (lambda x. h(x x)) .

both occurrences of (x x) by (lambda z. (x x) z). Note also that the
new Y shares the elegance and lack of assignment that you claim for
the old Y.
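The claim is easy to check in any applicative-order language with
first-class functions. Here is a sketch in Python (itself
applicative-order); the names Y and fact are mine, not from the
thread:

```python
# Applicative-order Y: both occurrences of (x x) in the normal-order
# combinator are eta-expanded to (lambda z. (x x) z), which delays
# the self-application until it is actually called.
Y = lambda h: (lambda x: h(lambda z: x(x)(z)))(lambda x: h(lambda z: x(x)(z)))

# A recursive definition written without assignment or self-reference:
fact = Y(lambda f: lambda n: 1 if n == 0 else n * f(n - 1))
print(fact(5))  # 120
```

Without the eta-expansion, (x x) would be evaluated eagerly and the
definition would loop forever before h is ever applied.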

Re the current acceptance of applicative-order evaluation in the
world's programming languages, I hardly think it can be attributed
solely to performance benefits. Applicative-order is _understandable_
in the presence of side-effects in a way that normal-order fails to
be.  You may protest that assignment-type side-effects should be
eliminated from the language, but then things like input and output
cannot be totally wished away.

Assuming you have a 'let' defined in the usual way in terms of lambda,
we have, in a potential normal-order real-life programming language
with input/output, among other things,

(let ([y (printf "y~n")])
  ((lambda (x) #t) y))

never printing y at all;

(let ([y (printf "y~n")])
  (or y y))

printing y twice (assuming printf returns nil);

whereas in applicative order, y is just printed once (as expected,
IMHO), regardless of the body of the let.
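The contrast can be mimicked in an applicative-order language by
wrapping the argument in a thunk that is forced at each use. A
sketch in Python, where printf_y is a hypothetical stand-in for the
(printf "y~n") call above:

```python
effects = []

def printf_y():
    """Stand-in for (printf "y~n"): records the side effect, returns nil."""
    effects.append("y")
    return None

# Applicative order: the argument is evaluated exactly once, up front.
y = printf_y()
_ = (y or y)
assert effects == ["y"]        # printed once, as expected

# Normal order, simulated with thunks: each use of y re-forces it.
effects.clear()
y_thunk = lambda: printf_y()
_ = (y_thunk() or y_thunk())
assert effects == ["y", "y"]   # printed twice

# And a body that ignores its argument never forces it at all.
effects.clear()
_ = (lambda x: True)(y_thunk)  # thunk passed but never called
assert effects == []           # y is never printed
```

The thunks are exactly the eta-expansions from the Y derivation,
applied this time to a side-effecting argument, which is what makes
normal order hard to reason about in their presence.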

--dorai
-------------------------------------------------------------------------------
Everything is correct, even its opposite. Only: "granted ... but" --
that is never correct.				       --Kurt Tucholsky
-------------------------------------------------------------------------------