Not sure this is the proper place for these questions, but still ...
I'm a total newbie to programming for MULE, though I have been Lisping in
XEmacs for over 5 years. So I have several questions.
I introduce the new charset with the following code:
(make-charset 'experimental-charset
              "Experimental charset"
              '(dimension 1
                registry "ISO8859-1"
                chars 96
                columns 1
                direction l2r
                final ?3
                graphic 1
                short-name "EXPERIMENTAL CHARSET"
                long-name "EXPERIMENTAL CHARSET: CHARSET NEWBIE"))
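(As a quick sanity check of my own setup, I evaluate the following; as far
as I know, find-charset and charset-id are standard XEmacs Mule functions,
and both give me sensible results:)

;; Sanity check that the charset has actually been registered.
(find-charset 'experimental-charset)   ; returns a charset object, not nil
(charset-id 'experimental-charset)     ; returns 172 in my session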
Investigating how characters from this charset appear in CCL programs, I
found that each character is essentially three octets. For example, a
character created as (make-char 'experimental-charset 32) appears as the
sequence (in decimal) 158 172 160.
I can understand 172 -- it is equal to (charset-id 'experimental-charset) --
and 160 is 32 with the highest bit set. BUT where does the magic number 158
come from?
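For reference, this is roughly how I am poking at the character from Lisp
(make-char, char-charset, split-char and charset-id are, as far as I know,
standard XEmacs Mule functions):

;; Inspect the character from Lisp rather than from inside CCL.
(let ((ch (make-char 'experimental-charset 32)))
  (list (char-charset ch)                    ; charset the char belongs to
        (split-char ch)                      ; charset symbol plus position code
        (charset-id 'experimental-charset))) ; the 172 seen in the CCL view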
The second question relates to font selection. Under Windows, when I try to
display a character from this charset, I get a warning:
(font/notice) Unable to instantiate font for charset experimental-charset, face default
So, what is the method to associate a particular font (possibly from a font
family different from the default one) with a charset?
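My naive guess is that it involves adding a font instantiator to the default
face's font specifier, something like the sketch below, but I am not sure the
arguments are right and the font name is only a placeholder:

;; Guess, not a verified recipe: attach an extra font instantiator to the
;; default face so XEmacs can match it against the charset's registry.
(set-face-font 'default
               "-*-Courier New-medium-r-*-*-*-120-*-*-*-*-iso8859-1"
               nil       ; locale (placeholder)
               nil       ; tag set (placeholder)
               'append)  ; keep the existing instantiators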
Nick.