
Re: Multiple-font strings and char encodings in CLX

   > What is the relation between the 16-bit strings and multiple-font-strings?

You need to manage and implement any such relationship yourself.  The idea is
that you provide a :translate function which is smart enough to accept your
(implementation-dependent) encoding of a multiple-font string (perhaps a
sequence of characters with font bits, where the font bits are an
implementation-dependent font identifier) and to output a sequence of 8- or
16-bit glyph indexes or an xlib:font change.  The draw-glyphs function can
then automatically construct the appropriate PolyText8/16 request.
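
For concreteness, here is a minimal sketch of such a :translate function.  It
assumes a hypothetical encoding (a multiple-font string as a sequence of
(font-id . character) conses) and a hypothetical *font-table* mapping the
implementation-dependent identifiers to xlib:font objects; the six-argument
calling convention and the return values follow xlib:translate-default:

    ;; Hypothetical: *font-table* maps implementation-dependent font
    ;; identifiers to xlib:font objects.
    (defvar *font-table* (make-hash-table))

    (defun translate-multi-font (src src-start src-end font dst dst-start)
      ;; SRC elements are (font-id . character) conses; DST receives
      ;; glyph indexes.  Returns the index of the first untranslated SRC
      ;; element and, when translation stopped for a font change, the
      ;; new font as a second value.
      (do ((i src-start (1+ i))
           (j dst-start (1+ j)))
          ((>= i src-end) i)            ; translated the whole chunk
        (let* ((item (elt src i))
               (wanted (gethash (car item) *font-table*)))
          (unless (eq wanted font)
            ;; Stop before element I; draw-glyphs switches fonts and
            ;; calls us again starting at I.
            (return (values i wanted)))
          ;; Assumes char-code is the glyph index (true for fonts
          ;; indexed by ASCII).
          (setf (aref dst j) (char-code (cdr item))))))

Note that the function only returns the new font; as described above,
draw-glyphs takes care of building the font shift into the PolyText request.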

Your :translate function is thus responsible for converting
implementation-dependent font identifiers (e.g., font bits) into xlib:font
objects.  The :translate function interface allows it to return a font change,
but *not* a change in glyph index size.  (See the source for
xlib:translate-default.)  That is, draw-glyphs assumes that, regardless of any
font change, the objects in the sequence of "characters" will always be
translated into glyph indexes of the same size (given by the :size argument).
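
A hypothetical call, then, with font-a and font-b standing for xlib:font
objects you have already opened, might look like:

    (setf (gethash :roman *font-table*) font-a
          (gethash :bold  *font-table*) font-b)

    (xlib:draw-glyphs window gcontext 10 20
                      (vector (cons :roman #\H) (cons :roman #\i)
                              (cons :bold  #\!))
                      :translate #'translate-multi-font
                      :size 8)          ; every glyph index fits in 8 bits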


This reminds me of some related questions that have been bothering me lately.
I'd like to design an international text displayer which could handle text in
multiple character-set encodings.  Many of the pieces needed to do this are in
place in X11 R3.  For example, given a character-set identifier and a font name
string, I can open an xlib:font which accepts glyph indexes in the given
encoding.  Given a sequence of such glyph indexes, I can then display them.
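
For example (a sketch; the display, window, gcontext, and font name are just
placeholders), drawing with 16-bit glyph indexes from such a font might look
like:

    (let ((font (xlib:open-font
                  display
                  "-*-fixed-medium-r-normal--16-*-*-*-*-*-jisx0208.1983-0")))
      (setf (xlib:gcontext-font gcontext) font)
      (xlib:draw-glyphs window gcontext 10 40
                        (vector #x2422 #x2424) ; indexes in the font's encoding
                        :size 16))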

	Question 1: What char encodings are used by various Lisps? char-code is
	not guaranteed to return an ASCII code (although it does in my Lisp).
	How can my text displayer ask a Lisp string what encoding it's using?

	Question 2: Given a character-set identifier, how do I know whether I
	need to draw with 8- or 16-bit glyph indexes?