
*To*: bkph@ai.mit.edu
*Subject*: Re: Unicode and math symbols
*From*: Chris Rowley <C.A.Rowley@open.ac.uk>
*Date*: Sun, 2 Mar 1997 14:00:54 GMT
*Cc*: mduerst@ifi.unizh.ch, BNB@math.ams.org, tex-font@math.utah.edu
*In-Reply-To*: <199702282012.PAA08397@kauai.ai.mit.edu>
*References*: <199702281916.TAA21792@fell.open.ac.uk> <199702282012.PAA08397@kauai.ai.mit.edu>

Berthold,

It seems unlikely that Microsoft and whoever: (a) will take any notice of Unicode when they do not wish to; (b) intend (or wish) their OS support for fonts to be used for typographic purposes.

So the automation of high-quality typography (on paper, screen, balloons, etc.) in an OS-independent way should not use such OS support for 16-bit fonts. If it does, it will always have to change as whoever decides.

Note that this does not imply that Unicode is irrelevant; its use as the internal coding, as in Omega, seems sensible and fits in with its becoming a standard for information exchange.

Neither does this mean that there are no other reasons for trying to change Unicode because it has been usurped in this way; but that need not, and perhaps should not, be related to Omega/TeX uses of 16-bit character encodings. And there are certainly reasons for trying to improve it as a character encoding.

chris

**References**:

- **Re: Unicode and math symbols** *From:* Chris Rowley <C.A.Rowley@open.ac.uk>
- **Re: Unicode and math symbols** *From:* "Berthold K.P. Horn" <bkph@ai.mit.edu>
