[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

Re: #/char



1) People who have any code using #/... generally have code that
   does arithmetic on these things.  You guys who want to make NIL
   incompatible with the rest of the world, by not having #/...
   return a fixnum, will either make it impossible for NIL to
   absorb code written by J-random user of LISP/MACLISP, or else
   you aspire to convincing *everyone* who's ever used #/... to go
   back and check their code for fixnum-dependencies.
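   To make the dependency concrete, here is a sketch of the sort of
   idiom that's all over existing code (UPCASE-CHAR is a hypothetical
   name, not from any particular program); it only works if #/a and
   friends read in as fixnums:

   ```lisp
   ;; Upcasing by character arithmetic -- depends on #/a, #/z, #/A
   ;; being fixnums (the ASCII codes), so that - and the fixnum
   ;; comparisons apply to them directly.
   (DEFUN UPCASE-CHAR (C)
     (COND ((AND (NOT (LESSP C #/a)) (NOT (GREATERP C #/z)))
            (- C (- #/a #/A)))   ; shift down into the upper-case range
           (T C)))
   ```

   If #/a returned some non-fixnum character object instead, every
   such arithmetic form would have to be found and rewritten.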
2) Is this whole issue the hopeless morass of not enough graphic
   characters again?  Martin, Szolovits, and Sussman have designed
   a "logical" character set scheme which allows 7- or 8-bit graphics
   for information interchange (in files, over nets, etc), but permits
   a user-tailored mapping to-and-from a larger set (say 8-bit or
   even 12-bit, a la APL).  Similar "mapping" schemes have been in
   use for a long time in the IBM world, where EBCDIC is only one of
   the several mappings into the 8-bit alphabet.  This sort of solution
   is necessary in the long run, and is clearly superior to forcing
   every possible new character usage into some # format.  By the bye,
   I've already programmed up the Martin-Szolovits-Sussman scheme in
   the NIL reader.
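   The shape of such a scheme (array names and sizes here are my own
   illustration, not the actual Martin-Szolovits-Sussman tables): a
   pair of translation tables giving an invertible map between the
   large internal set and the small interchange set:

   ```lisp
   ;; A 12-bit internal ("logical") set mapped to-and-from a 7-bit
   ;; interchange set.  Each user (or site) fills in his own tables.
   (ARRAY INTERNAL-TO-INTERCHANGE FIXNUM 4096.)
   (ARRAY INTERCHANGE-TO-INTERNAL FIXNUM 128.)

   ;; Applied on output to a file or net connection ...
   (DEFUN MAP-OUT (C) (INTERNAL-TO-INTERCHANGE C))
   ;; ... and on input, restoring the internal code.
   (DEFUN MAP-IN (C) (INTERCHANGE-TO-INTERNAL C))
   ```

   Files and net streams then carry only the small-alphabet codes,
   while the reader and printer see the full tailored set.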
3) More against the conservative-minded approach of making every new
   application take a # approach:  The NIL vector syntax works
   because there is a simple extension of any paired bracket -- 
   namely for { and }, do #{ and };  or in the vector case, we have
   "#(" paired with ")".  Yes, this is a very limited approach, but
   it works for 1 or 2 new data types.  The idea of making #. be
   the general approach appears attractive, e.g.
      #.(MAKE-FOO-DATUM ...)
   but it takes something more than a sophomoric mind to see the
   worthlessness of this "hook" as a general information interchange
   standard.
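   Concretely (continuing with the hypothetical MAKE-FOO-DATUM from
   above), the reason #. fails as an interchange standard is that the
   text of the file carries no meaning by itself -- it is only an
   instruction to the *reading* environment:

   ```lisp
   ;; Written into a file at site A:
   #.(MAKE-FOO-DATUM 1 2 3)
   ;; Read back at site B:  unless site B's reader already has a
   ;; MAKE-FOO-DATUM defined, and one that behaves identically to
   ;; site A's, the read either errs out or constructs the wrong
   ;; datum.  The "standard" is really "ship your whole environment
   ;; along with the file."
   ```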