[XeTeX] Use of Apple Symbols font in XeLaTeX

Bruno Voisin bvoisin at mac.com
Fri Sep 17 12:29:48 CEST 2004

Le 17 sept. 04, à 11:34, Bruno Voisin a écrit :

> Alas this does not work. I suspect the problem is actually more 
> serious: in XeTeX, a two-digit text character declaration such as "1C 
> becomes a four-digit one "001C (I think); but what does a four-digit 
> math character declaration such as "321C become, where the first 
> digit refers to the class (relation, binary operation, opening 
> delimiter, ordinary character, etc.) and the second digit to the font 
> family (1 for math italic, 2 for math symbol, 3 for math extension, 
> etc.)? Is it simply "32001C?
> I am asking this because, apparently, \DeclareMathSymbol works 
> essentially as follows, for a definition of the form 
> \DeclareMathSymbol{\applewhitesquare}{\mathord}{applesymbols}{"25A1}
> \DeclareMathSymbol{#1}{#2}{#3}{"#4}
> -> \set@mathsymbol{sym#3}{#1}{#2}{#4}
> -> \mathchardef{#1}="{#2}{sym#3}{#4}
> where the arguments have been replaced by their hexadecimal numeric 
> equivalents wherever necessary. Hence, what the \mathchardef syntax 
> becomes in XeTeX seems crucial to the issue.
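
As a sanity check on the packing described in the quote, here is a sketch (in Python, purely to show the arithmetic, not TeX itself) of how a classic \mathchar combines the class, family, and slot digits; the example numbers are the ones from the quoted message:

```python
# Classic TeX \mathchar packing: 3-bit class, 4-bit family, 8-bit slot.
# The slot being only 8 bits is exactly why "25A1 cannot fit directly.

def mathchar(cls, fam, slot):
    """Pack class, family and slot into a classic \\mathchar value."""
    assert 0 <= cls <= 7 and 0 <= fam <= 15 and 0 <= slot <= 0xFF
    return (cls << 12) | (fam << 8) | slot

# "321C: class 3 (relation), family 2 (math symbol), slot "1C
print(hex(mathchar(3, 2, 0x1C)))
```

Note that simply widening the slot to four digits, as in "32001C, would require 24 bits, well beyond what \mathchardef accepts.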

Well, I should just have tried. Doing this:

	\DeclareFontShape{U}{appsym}{m}{n}{<-> "Apple\space Symbols"}{}
	\DeclareSymbolFont{applesymbols}{U}{appsym}{m}{n}
	% defines a new math font family \symapplesymbols
	\mathchardef\applewhitesquare="0\hexnumber@\symapplesymbols25A1
	% \hexnumber@ is LaTeX's command for extracting an hexadecimal
	% value between 0 and F
I get the error message:

	! Bad mathchar (730529).
	l.208 ...esquare="0\hexnumber@\symapplesymbols25A1

which seems to indicate that \mathchar values larger than four hex 
digits (actually larger than standard TeX's maximum "7FFF) are simply 
not allowed. There's certainly something I've figured out wrongly 
somewhere, something I've implicitly assumed that I shouldn't have, 
but I can't see what.
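
The rejected value is consistent with that reading: assuming LaTeX allocated the applesymbols family as number 11 (hex B, which \hexnumber@ would produce), the assembled digits "0B25A1 read as one hex number give exactly the 730529 reported in the error, far above the classic ceiling. A quick check of the arithmetic:

```python
# "0" + hexnumber@(family 11) + "25A1" assembles the hex digits "0B25A1".
# Family 11 is an assumption; it is the value that reproduces the error.
value = int("0B25A1", 16)
print(value)            # the number quoted in "! Bad mathchar (730529)."
print(value > 0x7FFF)   # overflows classic TeX's \mathchar maximum
```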

Bruno Voisin
