[XeTeX] Font scaling problem

Jonathan Kew jonathan_kew at sil.org
Mon Oct 18 09:32:10 CEST 2004


On 18 Oct 2004, at 12:48 am, Will Robertson wrote:

> Hi all
>
> I'm trying to add scaling to my fontspec package, but I'm getting an 
> error. Can anyone explain to me why my appended example fails? If the 
> scaling string s*[0.5] is removed I have no problem.
>
> Basically it works if the commands are entered manually, but not if 
> they're put into a command. I'm stumped - help would be greatly 
> appreciated!

In a word: catcodes.

To explain a little more: NFSS uses special catcode settings when 
parsing the arguments to \DeclareFontShape (this is why, for example, 
spaces in font names disappear if they're entered directly there). I'd 
guess '*' is one of the characters that gets a special catcode; maybe 
'[' and ']' as well.
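
If you want to see exactly what it changes, you can ask TeX to show 
the macro that does the setup (its name contains '@', hence the 
\makeatletter); the definition is printed on the terminal and in the 
log:

\makeatletter
\show\nfss@catcodes
\makeatother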

In normal use, the command \DeclareFontShape doesn't actually take 
parameters the way it appears to; instead, it changes the catcode 
settings to the NFSS ones and then chains to an internal macro that 
actually reads and interprets the parameters. It has to be done this 
way because TeX "freezes" the catcode of a character token when it is 
first scanned.
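
Schematically, the trick looks something like this. This is only a 
sketch of the general pattern, with made-up macro names, not the real 
NFSS code:

\makeatletter
% The outer macro takes no argument itself; it only adjusts catcodes
% inside a group and then hands over to an inner macro.
\def\demo@declare{%
  \begingroup
    \catcode`\*=12 % make '*', '[' and ']' ordinary "other" characters
    \catcode`\[=12 % *before* the real argument is ever tokenized
    \catcode`\]=12
    \demo@declare@aux}
% The inner macro is the one that actually reads the argument; by the
% time #1 is tokenized, the catcodes above are already in force.
\def\demo@declare@aux#1{%
  \message{got: #1}%
  \endgroup}
\makeatother

\DeclareFontShape works along these lines, with \nfss@catcodes doing 
the actual catcode setup before the arguments are read.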

Your test fails because the catcodes of the characters in the font 
shape description get frozen when the description is first read, 
during the **definition** of \mkfont. So by the time \mkfont is 
**expanded** and \DeclareFontShape gets to set up the catcodes for 
NFSS parameters, it's too late.
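
I'm guessing the appended example boiled down to something like the 
following, i.e. the same declarations inside a plain \newcommand (a 
reconstruction for illustration, not your actual code):

\newcommand\mkfont{%
  \DeclareFontFamily{U}{fontB}{}%
  \DeclareFontShape{U}{fontB}{m}{n}{<-> s*[0.5] "Didot"}{}%
}
% The '*', '[' and ']' in the body were tokenized with the normal
% document catcodes, so using \mkfont later fails even though
% \DeclareFontShape changes the catcodes before reading its arguments.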

To demonstrate that this is the problem, you can wrap your \mkfont 
**definition** so that the proper NFSS catcodes are in effect:

{\makeatletter \nfss@catcodes   % switch to the NFSS catcodes, and...
 \globaldefs=1                  % ...make \newcommand act globally, so
                                % \mkfont survives this group
  \newcommand\mkfont{%
    \DeclareFontFamily{U}{fontB}{}%
    \DeclareFontShape{U}%
       {fontB}{m}{n}{<-> s*[0.5] "Didot"}{}%
  }
}

Then it will work as expected.
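
For example, assuming a font named "Didot" really is installed, 
something like this should now come out at half the natural size:

\mkfont                    % declares the U/fontB family and shape
{\usefont{U}{fontB}{m}{n}  % select it (locally, inside the group)
 Half-size sample text.}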

(How to actually build this into your larger package is left as an 
exercise....)

HTH,

JK


