Re: Unicode and math symbols
On Sat, 22 Feb 1997, Chris Rowley wrote:
> One big problem with discussing this is that Unicode contains such a
> lot of rubbish, eg box-drawing glyphs, so it is always possible to
> argue for putting more in.
The "rubbish" parts are usually there for backwards-compatibility reasons:
some national standard, or some industry or company encoding, already
contains these characters.
In general, I agree with Chris that systematic form changes in the
alphabet should be expressed with additional information (such as font
information on a lower level, or structural information on a higher
level). On the other hand, if there is a well-used math symbol that
isn't in Unicode, I would suggest making a formal proposal for adding
it, with all the necessary supporting documentation.

One thing that is not really clear in the math area is the distinction
between semantics and abstract form. For example, should there be one
codepoint for "set difference", which could look like "-" or like "\"
depending on the font (and maybe other settings), with another "-" for
subtraction, one for hyphen, and so on? Or should there be one and the
same "-" for various purposes, and one and the same "\" for various
purposes, with the slight differences in shape, size, and placement
dealt with depending on circumstances (e.g. math or not)?
I guess we certainly need some amount of both (the latter e.g. to
distinguish between a hyphen and a dash), but in general, on the level
of character encoding, it is easier for most people to deal with
"one shape, one code", and so that will probably prevail in the long run.
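As a concrete illustration of how Unicode ended up handling some of these
look-alikes, here is a small Python sketch (added for illustration, not
part of the original mail) using the standard unicodedata module; the
particular set of codepoints shown is just a sample:

```python
import unicodedata

# Visually similar characters that carry distinct Unicode codepoints,
# each with its own name and intended semantics:
lookalikes = [
    "\u002D",  # HYPHEN-MINUS (the ASCII "-")
    "\u2010",  # HYPHEN
    "\u2212",  # MINUS SIGN (the mathematical operator)
    "\u005C",  # REVERSE SOLIDUS (the ASCII "\")
    "\u2216",  # SET MINUS (the mathematical operator)
]

for ch in lookalikes:
    # Print the codepoint and its official Unicode character name.
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")
```

Running this shows that Unicode separates, for instance, the ASCII
hyphen-minus from the mathematical minus sign and the ASCII backslash
from the set-minus operator, i.e. it leans toward the semantic side for
these particular cases.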