Hello again,

I am rapidly getting addicted to XeTeX :-) ...

I want to continue using UTF-8 input encoding even when certain glyphs (basically
characters with diacritics from the Latin Extended Additional block, which I need to
transliterate Oriental languages) are not present in the font used to typeset the
document. To do so I make those UTF-8 characters \active and define for each a macro
that checks whether the glyph is available in the current font: if it is, the glyph is
used directly; otherwise another macro composes the diacritics as in traditional LaTeX.
So yes, this is indeed the reverse of the idea behind the xunicode package. I wrote a
little package of no great sophistication for my own use, which I called xdiacomp
(for DIAcritic COMPosition). Here are a few extracts to give the general idea (see the
attachment for the full version):

[...]
% various definitions of accents and diacritics
\gdef\textsubbreve#1{\hmode@bgroup\o@lign{\relax#1\crcr\hidewidth
\vbox to.2ex{\hbox{\ifnum\fontdimen1\font=0 %
\kern-0.0em\else\kern-0.40em\fi\ifnum\XeTeXcharglyph"02D8 > 0\char"02D8\relax%
\else\fontencoding{T1}\selectfont\char8\fi}\vss}\hidewidth}}
% this used to be set with $\lhook$, but the tipa glyph is better, I think:
\newcommand{\ain}{\raisebox{.8ex}{\fontencoding{T3}\fontfamily{ptm}\selectfont\char21}}
[...]
% now we redefine the catcodes
\catcode `ḍ = \active
\catcode `ḫ = \active
[...]
% and we associate those with the following macros:
\def ḍ{\ifnum\XeTeXcharglyph"1E0D > 0\char"1E0D\relax\else{\d d}\fi}%
\def ḫ{\ifnum\XeTeXcharglyph"1E2B > 0\char"1E2B\relax\else{\textsubbreve h}\fi}%
\def ʿ{\ifnum\XeTeXcharglyph"02BF > 0\char"02BF\relax\else{\ain}\fi}%
[...]

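With those definitions in place, the characters are simply typed as UTF-8 in the source,
and either the real glyph or the TeX composition comes out depending on the font. A
minimal test file would look something like this (the font is just an example, and I am
assuming xdiacomp.sty is somewhere TeX can find it):

\documentclass{article}
\usepackage{fontspec}
\setmainfont{Times New Roman} % example font; may or may not have the precomposed glyphs
\usepackage{xdiacomp}
\begin{document}
ḍ ḫ ʿ % typed directly as UTF-8; composed by TeX if the font has no glyph
\end{document}
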
Now come my questions:

(1) Is this a sound way to solve the problem I face? If so, would it make sense to
extend my package to cover a range of characters with diacritics that many OpenType
fonts are likely to lack but which are easily composed by TeX macros? Note that if a
glyph is actually present in the current font, it will be used instead of the TeX
composition. What I propose may be a little bit sinful (from a pure Unicode/OpenType
perspective), but at least it tries to minimize those little sins... ;-)
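
If I do extend it, the repeated test could probably be factored into a single helper,
roughly along these lines (an untested sketch; \xdiafallback is just a name invented
for the occasion, with #1 the code point in hex and #2 the traditional composition):

\def\xdiafallback#1#2{%
  \ifnum\XeTeXcharglyph"#1 > 0\char"#1\relax\else{#2}\fi}%
% the individual definitions would then reduce to, e.g.:
\catcode `ḍ = \active
\def ḍ{\xdiafallback{1E0D}{\d d}}

That would keep the list of covered characters easy to maintain.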

(2) My package (xdiacomp.sty) is incompatible with xunicode.sty, which is quite logical
since it is its exact antipode. Yet there might be situations where a user would need
both packages. For instance, since I type ṭ (t with dot below) very often and have
defined a keyboard shortcut for it (RIGHTALT-t), this character is always input directly
as UTF-8 in my LaTeX files. But for seldom-used characters I might prefer to type, say,
\textrangle or \textrightarrow rather than spend time inputting them as UTF-8 by
whatever means... I tried with \UndeclareUTFcharacter ... but it does not seem to work.
Can someone help with this? Does it actually make any sense to try to make the two
packages compatible?

Thanks
François

PS: To follow up on a recent message of mine, I have now pretty much succeeded in
implementing a fully-featured ArabTeX-like user interface for XeTeX by means of TECkit
font mappings. More on this very soon!