viewing long pages, evince seems to choke on simple test file
doug at mathemaesthetics.com
Mon Sep 9 19:46:05 CEST 2019
Mike Marchywka wrote:
>| Was there some reason known for this
>| or as the other response suggested this
>| was back in the day of 32 or 16 bit integers?
TeX uses a 32-bit word for its integers, with the upper 16 bits holding the signed integer part of a fixed-point "Dimension" and the lower 16 bits the fractional part. The connection to the real world is that the low-order 16 bits measure 1/65536th of a point, at 72.27 points per inch. Had those bits measured 1/10000th of a point instead, Knuth would have avoided some input/output round-off problems, and would have increased the dynamic range of a Dimension by a factor of 6 or so, with an attendant loss of precision. C'est la vie.
But TeX also artificially prevents a Dimension from using the high-order (non-sign) bit of those 16 bits, so that any two TeX-legal Dimensions can always be added without checking for two's-complement overflow. So the dynamic range of a Dimension is halved; it's really a form of [15:16] fixed-point arithmetic, which tops out at plus/minus 18.8921 feet. Worse, there are hacks in TeX's source code (see, e.g., running widths in Rules) that exploit that otherwise "free" bit as a flag (avoiding a second argument to a subroutine by passing an otherwise illegal dimension value), which in turn makes changing things messier than it would otherwise be.
On today's 64-bit machines, one (ahem!) might configure a TeX-language interpreter at compile time to use [48:16] fixed-point arithmetic, which would allow a single page to be about 30 million miles long. Sadly, that unrolled scroll would make it only about 20% of the way to Mars.