[XeTeX] Use of Apple Symbols font in XeLaTeX

Bruno Voisin bvoisin at mac.com
Fri Sep 17 11:34:39 CEST 2004

On 15 Sept 04, at 18:35, Bruno Voisin wrote:

> I've been trying to use characters from Apple Symbols in XeLaTeX (to 
> complement and replace those already in AMS fonts), by mimicking the 
> way AMS symbols are used in amsfonts.sty and amssymb.sty (and trying 
> to digest all the information in fntguide.dvi!):
> [...]
> Alas this does not work, and I only get characters from the main text 
> font instead of Apple symbols. For

On 15 Sept 04, at 23:16, Jonathan Kew wrote:

> I'm not going to try and understand LaTeX's \DeclareMathSymbol just now

I've been making some progress in analyzing this problem, I think. 
Surely \DeclareMathSymbol requires some modification for XeLaTeX, as it 
appears to assume that character numbers can be represented by two 
hexadecimal digits and are hence lower than 256. Specifically, the 
character number supplied as the 4th argument of \DeclareMathSymbol is 
decomposed into two hexadecimal digits stored in \count0 and \count2:

       \advance\count\tw@ -\count@
           \expandafter\set@mathsymbol
              \csname sym#3\endcsname#1#2%
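For reference, the decomposition that precedes this fragment can be 
sketched as follows (my reconstruction of the idea, not the literal 
latex.ltx code; it assumes \makeatletter and that \count2 initially 
holds the character number):

       % Split a character code below 256 into its two hex digits:
       \count\z@=\count\tw@          % \count0 = character code
       \divide\count\z@ 16           % \count0 = high digit
       \count@=\count\z@
       \multiply\count@ 16
       \advance\count\tw@ -\count@   % \count2 = low digit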

I've attempted to cure this by using four counters \count0, \count2, 
\count3 and \count4 instead:

       \divide\count0 by 4096
       \multiply\count@ by 4096
       \advance\count2 by -\count@
       \divide\count2 by 256
       \multiply\count@ by 256
       \advance\count3 by -\count@
       \divide\count3 by 16
       \multiply\count@ by 16
       \advance\count4 by -\count@
           \expandafter\set@mathsymbol
              \csname sym#3\endcsname#1#2%
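As a sanity check on the arithmetic, here is the intended four-digit 
split applied to a concrete value:

       % "321C = 12828 decimal:
       %   12828 / 4096 = 3 (class),  remainder 540
       %     540 /  256 = 2 (family), remainder  28
       %      28 /   16 = 1,          remainder  12
       % giving the digits 3, 2, 1, C as expected.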

Alas this does not work. I suspect the problem is actually more 
serious: in XeTeX, a two-digit text character declaration such as "1C 
becomes a four-digit one, "001C (I think); but what does a four-digit 
math character declaration such as "321C become, where the first digit 
refers to the class (relation, binary operation, opening delimiter, 
ordinary character, etc.) and the second digit to the font family (1 
for math italic, 2 for math symbols, 3 for math extension, etc.)? Is it 
simply "32001C?
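The underlying constraint, as I understand it, is that classic TeX 
packs a math code into 16 bits, which is exactly what makes slots above 
"FF unrepresentable:

       % mathcode = class*"1000 + family*"100 + slot
       % "321C    =     3*"1000 +      2*"100 + "1C
       % The slot field spans "00-"FF only, i.e. 256 positions --
       % too small for most of a Unicode font such as Apple Symbols.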

I am asking this because, apparently, \DeclareMathSymbol works 
essentially as follows, for a definition of the form 

-> \set@mathsymbol{sym#3}{#1}{#2}{#4}
-> \mathchardef{#1}="{#2}{sym#3}{#4}

where the arguments have been replaced by their hexadecimal numeric 
equivalents wherever necessary. Hence, what the \mathchardef syntax 
becomes in XeTeX seems crucial to the issue.
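To make the schematic concrete: on this reading, a standard declaration 
such as the one for \leq reduces to a classic \mathchardef (the numbers 
below are those of the standard LaTeX math set-up, where the symbols 
family is family 2 and class 3 is the relation class):

       \DeclareMathSymbol{\leq}{\mathrel}{symbols}{"14}
       % would boil down to
       \mathchardef\leq="3214   % class 3, family 2, slot "14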


Bruno Voisin
