Re: What's the relationship between vfs and tfms?

Hi:

   >That is, the psfonts.map has two entries to force reencoding of the `raw'
   >font to anything you want, no need to add additional reshuffling using VF.

   But that does mean you need different encoding vectors for each occurrence
   of a different encoding *and* each different dvi driver.  The nice thing
   about the vf approach is that you need one 8r encoding vector for each dvi

Not sure I see the difference.  The DVI processor has to reencode the
font in any case.  Whether it reencodes to 8r or to the encoding the
user actually wants (LY1, say) doesn't make much difference (other
than not needing a VF in the latter case).
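
For concreteness, the two routes look roughly like this as psfonts.map
entries (the TFM and file names here are only illustrative - check your
own installation for the real ones):

   ptmr8r  Times-Roman "TeXBase1Encoding ReEncodeFont" <8r.enc
   ptmr8y  Times-Roman "TeXnANSIEncoding ReEncodeFont" <texnansi.enc

The first reencodes the raw font to 8r and then relies on a VF to get
from the TeX encoding to 8r; the second reencodes the raw font straight
to the encoding the TFM itself uses (LY1 in this sketch), so no VF is
involved.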

	Also, the encoding vector is a *constant* --- it does not depend
	on the platform.  Keep in mind it completely *replaces* whatever
	encoding the font is set up for.  You are perhaps thinking of
	`remapping' done by VF, which *does* have to take into account 
	the target encoding.

Maybe it helps to see what an encoding vector looks like conceptually:

32	space
33	exclam
34	quotedbl
35	numbersign
36	dollar
37	percent
38	ampersand
....

It is *NOT* a mapping from one set of numbers in the range 0-255 to
another set of numbers in the range 0-255 (which is all a VF can do).
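
In the physical form DVIPS reads (a PostScript array in a `.enc' file)
the same fragment looks roughly like this - a trimmed sketch, not a
complete file, and the name /MyEncoding is made up:

   /MyEncoding [
     % slots 0-31 would each be /.notdef here
     /.notdef /.notdef
     % ... and so on through slot 31, then:
     /space /exclam /quotedbl /numbersign /dollar /percent /ampersand
     % ... one glyph name per slot, 256 entries in all
   ] def

The position in the array is the character code; the entry is the
*name* of the glyph you get at that code, not another number.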

   driver (rather than OT1, T1, and any others that are in use); and the
   OT1->8r and T1->8r mapping is handled by a single encoding vector (for each
   encoding) that works for all TeX installations.  If everyone used dvips and
   the same PostScript founts, then this non-vf approach wouldn't have any
   drawbacks.  But what of those of us who use TrueType founts on Macintoshes,
   without using dvips?

Then you are in trouble in any case, since there is no way to reencode
a TrueType font on the Mac (other than changing the actual font file).
You are stuck with what is in Mac standard roman encoding.  This means
you won't have access to 21 of the `standard' 228 glyphs (like eth, thorn).

(Unlike Windows NT, where some software can reencode TrueType encoding
vectors just as for Type 1 fonts, and make even the f-ligatures
accessible in TT fonts.)

   >   Anyway, the actual encoding the real live Type 1 printer fount file uses
   >   can be pretty much anything, and this varies according to computer.  It

   >Well, for text fonts, the actual font file *always* says Adobe Standard
   >Encoding.

   Surely not?  I was under the impression that the printer fount files on my
   computer say they use Macintosh text encoding.  They certainly behave as if
   this were the case.

The actual Type 1 font file *always* says /Encoding StandardEncoding def 
in the case of a text font.  In fact some system software and
printer drivers depend on this.  The operating system reencodes the
font to a platform specific encoding (remember we are now talking
about systems with system level support for scalable fonts :-).
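
You can check this by looking at the clear-text header of any Type 1
text font; trimmed, and with made-up values apart from the /Encoding
line, it looks something like:

   %!PS-AdobeFont-1.0: Times-Roman 001.007
   /FontName /Times-Roman def
   /Encoding StandardEncoding def
   /PaintType 0 def
   /FontType 1 def
   /FontMatrix [0.001 0 0 0.001 0 0] readonly def
   ... (the encrypted charstrings follow after eexec)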

   Until 8r encoding was invented, PS founts were installed `raw', so this
   re-encoding step wasn't needed.  As Alan Jeffrey puts it in fontinst.tex:

   >Finally, you should tell your {\tt dvi}-to-PostScript driver about the
   >fonts.  This will depend on your driver, for example with {\tt dvips}
   >you should add the following lines to your {\tt psfonts.map} file:
   >\begin{verbatim}
   >   ptmr0      Times-Roman
   >   ptmri0     Times-Italic
   >   ptmb0      Times-Bold
   >   ptmbi0     Times-BoldItalic
   >\end{verbatim}

   In other words, Adobe standard encoding has been used, and re-encoding in
   the dvi driver isn't essential.

Sigh.  Which means you do not have 58 accented characters and about 30
other special characters!  No hyphenation in `foreign' languages in
TeX.  Not sure what you are getting at here.  Surely nobody used it
that way.  (In prehistoric times, DVIPS had its own hard-wired,
unadvertised internal encoding which it applied to text fonts -
so at that point in history you wouldn't mention an encoding vector
in psfonts.map.)

   > Which was my point.  If you have to reencode
   >it anyway, why bother with an additional shuffling of character codes?

   Can you think of a way in which TeX documents could be made properly
   portable, and dvi drivers could be made easy to set up, given that we'd
   need dozens of different output encodings for each different dvi driver?

See above.  The encoding vector is not *different* for different drivers,
since it does *not* depend on the underlying encoding of the text font
(or whatever the operating system reencodes the text font to).
The encoding vector *replaces* whatever was there.  The original encoding
might as well be a null vector for all it matters.

For example, many people happily use LY1 (TeX 'n ANSI encoding) - even
on Unix.  All you need to do is \usepackage[LY1]{fontenc}, and there is
no need for `text companion' fonts.  These people all use the *same*
encoding vector.  (Modulo different drivers requiring it in slightly
different formats: DVIPS wants an actual PostScript array, OzTeX wants
a `charnumber glyphname' format.)
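
In sketch form (names illustrative, and assuming LY1-encoded TFMs for
the font are installed), the whole setup is just: in the document

   \usepackage[LY1]{fontenc}

and in psfonts.map a line of the same kind as sketched earlier, e.g.

   ptmr8y  Times-Roman "TeXnANSIEncoding ReEncodeFont" <texnansi.enc

One encoding vector file (texnansi.enc here), identical on every
platform.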

   >   The sensible thing to do
   >   is to have the dvi driver handle this part of the interface, because the
   >   dvi driver set-up is *supposed* to be system-dependent.
   >
   >Agreed.  But that still doesn't address the question of why yet another layer
   >of remapping is needed (VF).

   Can anyone who helped make this particular design decision explain it?  As
   I see it (in my ignorance), the vf re-mapping is used because it's portable
   and works on all TeX systems.  

Yes, all systems that use DVIPS :-)  And even there it is not needed for 
this purpose.

   The extra re-mapping to 8r strikes me as
   more of a necessary evil, and is only put up with because you only need one
   single re-encoding vector file for each dvi driver, so it doesn't add too
   much to the mess of system-dependent files we've got anyway.

No.  As explained above, the encoding vector is always the same.
It simply says what glyph you are going to get for each numeric
character code.  (The physical expression of the encoding vector
may differ because different drivers may like to see it in different
forms - but the mapping from number to glyph is constant).
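
For instance, the OzTeX form of the small fragment shown earlier would
be something like (layout guessed from the `charnumber glyphname'
description above):

   32 space
   33 exclam
   34 quotedbl

- a different physical expression, but exactly the same mapping from
character code to glyph name as the PostScript array.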

   > is the thing that actually contains
   >the character programs.  Hence a TFM file is not a font (despite nomenclature
   >used in the TeX book); neither is a VF file.

   What about pk files?  By your definition, these aren't founts, which puts
   TeX users in the interesting position of being able to produce printed
   output with no founts at all (by using tfm, vf, and pk files only).  I

OK, PK files contain the character shapes, so they are fonts.  (Since I
never use PK fonts, I simply forgot to mention them specifically.)
Similarly, BDF files, FON files, F3 files, Speedo files, etc. are fonts.
But AFM files, TFM files, PFM files, etc. are not.

Regards, Berthold.