FMi on text symbol encodings

Hilmar Schlegel hshlgaii@mailszrz.zrz.tu-berlin.de
Tue, 2 Mar 1999 09:02:09 -0500


Frank Mittelbach wrote:
> 
> Ulrik writes:
> 
> just picking on one remark
> 
>  > \unfakable{hyphendbl}
>  > \unfakable{hyphendblchar}
>  >
>  > Could hyphendbl be replaceable by a normal hyphen?
> 
> I'm not really sure what is better here: should you try not to fake with
> essentially different glyphs, or should you try to make the glyph set larger so

The question is, of course: what is the purpose?
There are different tasks which require different approaches. For making
a given DVI file readable/printable as far as possible, fakes are
useful (indeed necessary). For working with locally available fonts,
only acceptable fakes are useful.
For example, the double-hyphen slot should indeed print at least a normal
hyphen if the font has nothing better to offer. This is reasonable for
markup as well: the source merely prints differently depending on the
locally available font sets (just as MS-Win prints a Word document
differently depending on the printer driver used :-)
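As a rough sketch of how such a fallback could be set up with fontinst
(the file name below is made up; \ifisglyph, \setglyph and \glyph are the
standard fontinst commands): if the target font provides no hyphendbl of
its own, reuse the plain hyphen so that the slot at least prints something.

   \relax                       % fakehyphendbl.mtx (file name assumed)
   \metrics
   \ifisglyph{hyphendbl}\then
      % the font has the real glyph, leave it alone
   \Else
      \setglyph{hyphendbl}
         \glyph{hyphen}{1000}   % fake: reuse the ordinary hyphen at full size
      \endsetglyph
   \Fi
   \endmetrics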


> that fonts that have the real glyph and those that have only a faked one can
> live together and claim to have the same encoding (or support the same
> glyph set)?

The idea of having only one encoding for zillions of fonts which provide a
wide spectrum of different character sets (beyond the Adobe Standard
Roman Character Set) usually causes some headaches. At the very least,
individual fonts usually differ in the ligatures they provide. The key
problem is that LaTeX has no real concept with respect to encodings:
there is a single "standard" encoding and that's it. The conceptual
problems of changing uc/lc codes imply some inherent difficulties in
changing this in the long run.
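For comparison, the LaTeX2e NFSS interface does allow a per-encoding
declaration together with a document-wide fallback default; a hedged
sketch (the command name \texthyphendbl and the slot number are invented
for illustration and are not actual textcomp commands):

   \DeclareTextSymbol{\texthyphendbl}{TS1}{127}    % real glyph, slot assumed
   \DeclareTextCommandDefault{\texthyphendbl}{-}   % crude fake elsewhere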

> I tend towards the former for things like TS, but that also needs more thought.
> 
>  > > This isn't a question of particularly current interest however,
>  > > so we may well leave things as they are, for now.
>  >
>  > If the LaTeX team has some interest in cleaning up the textcomp
>  > package and TS1 encoding, this might well be an issue to consider for
>  > the next fontinst release.  We could just pretend that fontinst's 8c
>  > and 9c encodings have always been what might be called TSA and TSX.
> 
> It was discussed but never got resolved due to manpower problems.
> 
> Yes, I think we do have an interest, or rather I think the LaTeX community
> should have an interest, as the current situation isn't very good, is it?

The present situation is that one must tell people not to use the EC/TC
fonts (which exist only as Metafont bitmaps) if they want to produce
reasonable PDFs.
What is even more problematic is the way the TS1 encoding is "designed",
i.e. how its slots are occupied by auxiliary characters/symbols: switching
fonts breaks ligatures and kerns, and symbols of questionable usefulness
for the average user are placed between the necessary ones.
My very personal opinion is that restricting the setup to the CM style
for the sake of some very exotic Metafont-only signs is not very helpful
for TeX's future in a world where Type 1 font technology is practically
accessible to everybody.
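As a concrete illustration of the workaround, a preamble along the
following lines avoids the bitmap EC/TC fonts by switching to Type 1 text
fonts; the choice of Times via the PSNFSS times package is just one
possibility, and the TS1 symbols only help if a Type 1 substitute for the
TC fonts is actually installed:

   \documentclass{article}
   \usepackage[T1]{fontenc}   % T1 text encoding instead of OT1
   \usepackage{times}         % Type 1 outline fonts instead of EC bitmaps
   \usepackage{textcomp}      % TS1 symbols (needs a Type 1 TC substitute)
   \begin{document}
   A PDF produced from this document contains only scalable outline fonts.
   \end{document}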

Hilmar Schlegel