kifer(a)cs.sunysb.edu (Michael Kifer) writes:
Could somebody point me to the part of the code where XEmacs decides,
depending on the OS, which bits can be stolen from the address space
for tagging?
There's the old code and there's the new code. In the old code, this
was done in the s&m files -- look at the "DATA_SEG_BITS" defines. It's
explained in the "Internals" manual:
[...]
Also, note that there is an implicit assumption here that all
pointers are low enough that the top bits are all zero and can
just be chopped off. On standard machines that allocate memory
from the bottom up (and give each process its own address space),
this works fine. Some machines, however, put the data space
somewhere else in memory (e.g. beginning at 0x80000000). Those
machines cope by defining `DATA_SEG_BITS' in the corresponding
`m/' or `s/' file to the proper mask. Then, pointers retrieved
from Lisp objects are automatically OR'ed with this value prior to
being used.
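To make the scheme above concrete, here is a minimal, self-contained
sketch of a "tag in the high bits" layout with the DATA_SEG_BITS
correction applied when the pointer is extracted. The macro names
(EMACS_UINT, VALBITS, XPNTR, ...) are only in the spirit of the real
lisp.h; the tag width and the zero default for DATA_SEG_BITS are
assumptions for illustration, not the actual XEmacs definitions.

    /* Sketch: tag stored in the top GCTYPEBITS bits of a word;
       the pointer's own top bits are chopped off and OR'ed back
       with DATA_SEG_BITS on extraction, as described above.  */
    #include <stdio.h>
    #include <stdint.h>

    typedef uintptr_t EMACS_UINT;

    #define GCTYPEBITS 3                                   /* assumed tag width */
    #define VALBITS    (8 * sizeof (EMACS_UINT) - GCTYPEBITS)
    #define VALMASK    (((EMACS_UINT) 1 << VALBITS) - 1)   /* low-bit mask */

    #ifndef DATA_SEG_BITS
    #define DATA_SEG_BITS ((EMACS_UINT) 0)                 /* most machines */
    #endif

    /* Pack a tag and a pointer into one word; top pointer bits are dropped.  */
    #define XMAKEOBJ(tag, ptr) \
      (((EMACS_UINT) (tag) << VALBITS) | ((EMACS_UINT) (ptr) & VALMASK))

    #define XTYPE(obj)  ((obj) >> VALBITS)
    /* Strip the tag, then OR the data-segment bits back in.  */
    #define XPNTR(obj)  ((void *) (((obj) & VALMASK) | DATA_SEG_BITS))

    int
    main (void)
    {
      int datum = 42;
      EMACS_UINT obj = XMAKEOBJ (2, &datum);   /* tag 2 = hypothetical type */
      printf ("tag = %lu, value = %d\n",
              (unsigned long) XTYPE (obj), * (int *) XPNTR (obj));
      return 0;
    }

On a machine whose data space really starts at, say, 0x80000000, you
would define DATA_SEG_BITS to that value and the OR in XPNTR restores
the chopped-off bits.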
In the new code, which you get by configuring with
`--with-minimal-tagbits', we silently assume that the two least
significant bits of a pointer are always available for tagging.
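The assumption holds because Lisp objects are allocated at least
4-byte aligned, so the low two bits of any object pointer are zero.
Here is a rough sketch of that idea; the names and the particular tag
values are illustrative only, not the actual XEmacs macros.

    /* Sketch: two low bits used as the tag, relying on >= 4-byte
       alignment of all heap pointers.  */
    #include <assert.h>
    #include <stdio.h>
    #include <stdint.h>

    typedef uintptr_t Lisp_Object;

    enum tag { TAG_INT = 0, TAG_CHAR = 1, TAG_PTR = 2 };  /* 2 bits -> 4 tags */

    #define TAG_MASK ((uintptr_t) 3)

    static Lisp_Object
    make_pointer_object (void *p)
    {
      /* The whole scheme silently relies on this alignment guarantee.  */
      assert (((uintptr_t) p & TAG_MASK) == 0);
      return (uintptr_t) p | TAG_PTR;
    }

    static void *
    object_pointer (Lisp_Object obj)
    {
      return (void *) (obj & ~TAG_MASK);   /* strip the two tag bits */
    }

    int
    main (void)
    {
      static int cell = 7;                 /* statics are suitably aligned */
      Lisp_Object obj = make_pointer_object (&cell);
      printf ("tag = %lu, value = %d\n",
              (unsigned long) (obj & TAG_MASK), * (int *) object_pointer (obj));
      return 0;
    }

Because the tag lives in bits every architecture guarantees to be
zero for aligned pointers, no per-OS DATA_SEG_BITS mask is needed in
this scheme.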