[OS X TeX] Computer Modern in a Mac program
Doug McKenna
doug at mathemaesthetics.com
Sun Oct 28 23:18:58 CET 2012
All -
I made a fair amount of progress before reading your latest round of
helpful responses, so here's my summary of reality.
I found the Latin Modern and Latin Modern Math fonts on the GUST website,
downloaded them, and installed them in my ~/Library/Fonts folder; they
show up in Font Book.
Using Get Info on them in Font Book, I determined their official
PostScript names. Using these names, one can then use Core Graphics
calls in one's source code to make one of the Latin Modern fonts the
current font, and then draw a single glyph at a given position on the
screen. E.g.,
CGContextSelectFont(context, "LatinModern-Regular", 10.0,
                    kCGEncodingMacRoman);
CGGlyph chCode = 'x';
CGContextShowGlyphsAtPoint(context, x, y, &chCode, 1);
places an 'x' at the given coordinates (x,y). So far so good!
I then found a discussion on the web that explained that Latin Modern
were Unicode fonts, and eventually determined at
http://www.unicode.org/charts/PDF/U2200.pdf
that the official Unicode code point for the summation symbol is, as
Peter Dyballa also explained, U+2211. Which led to this code snippet for
testing the math font:
CGContextSelectFont(context, "LModernMath-Regular", 10.0,
                    kCGEncodingFontSpecific);
CGGlyph chCode = 0x2211;
CGContextShowGlyphsAtPoint(context, x, y, &chCode, 1);
(Note that the math font's official PostScript name doesn't include the
prefix "Latin", just a single "L", which, unlike the text fonts, makes
the name next-to-impossible to guess.)
But setting |chCode| in the above to 0x2211 didn't work. Nothing drew on
the screen.
I then found the utility |ttfdump| on my Mac, and tried using it in
Terminal to look at information inside the Latin Modern Math font file:
ttfdump lmodern-math.otf
ttfdump is not very good: it crashes with a segmentation fault just
after it tries to dump an area called the 'loca' table. But fortunately
there's a fair amount of information to peruse prior to that crash. In
particular, searching the dump text for "2211" finds a mapping to glyph
index 2822 (decimal, not hex). So I tried that in my code above and
voilà, the sigma summation sign appeared on the screen! (So
CGContextShowGlyphsAtPoint evidently wants glyph indices, not Unicode
code points.)
So that solved my immediate problem, but now I've got one more: the
glyph's baseline is obviously not the same as it is in the "lmex10.tfm"
(or "cmex10.tfm") metrics files, where I determined that the glyph is
(peculiarly) nearly all below the baseline (15 times as much below as
above). In Latin Modern Math, the baseline of the summation symbol
appears to be about a quarter of the way up the glyph, with
three-quarters of the glyph above the baseline.
In my various searches, I found statements saying that Latin Modern is,
with a few exceptions, the same as Computer Modern. It seems I unluckily
picked one of those few exceptions. Does anyone know of an official
(stable?) compendium of precisely what these metric differences are?
Doug McKenna
More information about the macostex-archives mailing list