
Re: Unicode and math symbols



Berthold wrote --

>    > (1) Which is why we have the `alphabetic presentation forms' 
>    > ff, ffi, ffl, fi, fl, slongt, st etc. in UNICODE.
> 
>    They are in the compatibility section. 
> 
> Well, they were put in *somewhere* because they are needed,

No, they were put in for compatibility, and their use is not advised.

Also "etc" are not there: these are the only Latin "aesthetic
ligatures" that are there, eg there is no ck ligature.

> since (i) we do not have a usable and widely accepted glyph
> standard, and (ii) because most software wants to be able to have
> *some* way of telling what glyph in a font is to be used.

These do seem to be the two problems driving this issue.

But not just "some way": it seems that the only object some software can
use is a fixed-length number; and *only one* correspondence from
these numbers to ... what? ... to glyphs (in all fonts, in some fonts, or
what??) or to characters (ie units of information) or to both (ie
a one-one relationship between glyphs and characters?).
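
A small sketch of that "fixed-length number" model (Python again,
illustrative only): in a 16-bit encoding every character is one
fixed-width code unit, whatever glyph a given font supplies for it,
while a one-glyph ligature such as U+FB01 packs two characters' worth
of information into a single number.

    import unicodedata

    # Each character is one fixed-width 16-bit number, independent of
    # whatever glyph any particular font renders for it:
    s = "A\u2211"                        # 'A' and N-ARY SUMMATION
    print(s.encode("utf-16-be").hex())   # 00412211: two 16-bit code units

    # A one-glyph ligature blurs the glyph/character distinction by
    # standing for a two-character sequence:
    print(unicodedata.normalize("NFKD", "\uFB01"))   # fi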

>  But do I really need - in English - to make a distinction
> between the characters A-Z and the glyphs A-Z?  Or, beyond that,
> most of the glyphs in the ISO Latin X tables (if we ignore the
> mistakes and bad choices made).

I am not well qualified to answer this, but it may well be that you
do not need to make much of the difference in these cases.

But the point of Unicode is to remove just such cultural dominance of
modern European languages over IT.

> But anyway, meantime we need to make life easier!  And despite all the
> explanations and arguments I don't see a whole lot wrong with using
> UNICODE as essentially a glyph standard for Latin X, Cyrillic, Greek,
> and yes, most math symbols, relations, operators, delimiters etc.

Let us suppose that some such encoding would be practical: one
containing considerably more than 256 slots but using far fewer than
the 2^16 slots available.

If, as I mentioned above, there must be only one encoding, then surely
it will have to be whatever Adobe or Microsoft decides, and will
therefore cover only some subset of the glyphs they are interested in:
and, knowing them, it will be almost but not quite "the same as"
Unicode.

So even if a 16-bit encoding is set up with all the math symbols "we"
want, it will not be used as the *only one*.

> Except that unfortunately they don't cover enough of math to be 
> really useful...

And it never will, on some definitions of useful; but it will also
never cover my ck ligature, or all the lovely twirly things in
Poetica and similar fonts.
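
A present-day footnote on that coverage (Python, illustrative only):
many math symbols did get code points, but no ck ligature is to be
found anywhere in the tables.

    import unicodedata

    # A few math symbols that do have code points:
    for name in ("FOR ALL", "CIRCLED PLUS", "CONTOUR INTEGRAL"):
        print("U+%04X" % ord(unicodedata.lookup(name)), name)

    # ... and the missing ck ligature:
    try:
        unicodedata.lookup("LATIN SMALL LIGATURE CK")
    except KeyError as err:
        print(err)   # undefined character name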

So set up a 16-bit glyph encoding if you wish, but do not try to
change Unicode because you want it to parallel your glyph encoding.
Leave Unicode itself to do what it is intended for.


chris