Barbara Beeton

(completed 2005-11-24)

Barbara Beeton has been editor of TUGboat for 22 years and a member of the TUG Board since it was called the steering committee. She also serves as a liaison for TeXnical issues between Donald Knuth and the TeX community.

Dave Walden, interviewer:     Please tell me a bit about your personal history independent of TeX.

Barbara Beeton, interviewee:     After graduating from college (applied math and German literature), I worked for a year as gofer at the Brown University Computing Lab, then found, serendipitously, a position at the American Math Society, as assistant to the assistant to the Executive Director. This position had much to recommend it — independence in carrying out varied assignments, several of them associated with experimental projects including a pioneer effort to implement computer typesetting of math. Although that project ended without producing a complete production system, many of the ideas and techniques for encoding data became key components of later projects.

In spite of interesting projects, I came to feel that my brain was beginning to rot, and returned to school (still working full time) to pursue a master's degree in structural linguistics (completed and filed under `hobbies' in my resumé).

When it became obvious that future management of the Society's records would best be served by keeping them on a computer, I was assigned to that project, being the only person at the Society with hands-on computer experience, however lowly. The initial task was to design the system — an amazing learning experience! In these early days, the AMS approach was basically to take people who knew how things worked at the Society, and provide the means to learn the necessary computer techniques. The first applications were business oriented, but because of the particular audience — professional mathematicians — a non-business project was a pioneer SDI (selective dissemination of information) effort, the Mathematical Offprint Service (MOS). This involved the presentation, including typesetting, of bibliographic data, and the choice of computer platform (an RCA Spectra) was ultimately made based on the presence of a typesetting program, PAGE-2.

This adventure involved more than just the mainframe. Along the way, we became adept at manipulating a minicomputer (first a BIT, then a Data General Nova) to which was attached a paper tape punch; paper tape, in turn, was used to drive a Photon 713 phototypesetter (the only way to communicate with it), loaded on a reader for transmission to a remote location (mostly for business data), or sent by courier to a typesetting service bureau. This motley array of mechanical devices was great fun to work with, if on occasion frustrating when things went south under deadline pressure.

This success in handling the bibliographic data from MOS, resulting in the publication of several indexes, was noticed by the Mathematical Reviews staff. MR had been publishing reviews of the mathematical literature since 1940; the volume of material was growing steadily, as was the cost of Monotype composition. PAGE-2 could be programmed to handle indexes, but the formatting of bibliographic information interspersed with review text was too complex, and a different approach was needed.

At about this time, the developer of a new math typesetting system, Science Typographers, Inc. (STI), approached AMS. Their software was found to meet the Society's needs for setting math text, although it lacked pagination capabilities, so pages had to be cut manually from galleys, mounted on boards, and headers and footers added by hand. However, this system could be run in-house, and mag tapes sent to a service bureau with a Harris Fototronic to produce photographic galleys. My contribution to this project was development of the local user interface, training the first keyboarders, and communicating with the developers. Input media included 80-column punched cards, OCR mini-barcodes, and more paper tape; we still hadn't reached the era of direct magnetic input.

DW:     How did AMS get involved with TeX, and were you part of that effort from the beginning?

bb:     The STI system I mentioned served AMS well for quite a few years, but in 1978, Donald Knuth was invited to deliver the Gibbs lecture at the AMS annual meeting. The topic he chose was “Mathematical Typography” — TeX and Metafont. The then-chair of the AMS Board of Trustees, Dick Palais, listened to Don's lecture, and interpreted what he heard as meaning that TeX was ready to run, out of the box. And, unlike the STI system, which was nearly inscrutable to anyone but a highly trained keyboarder, TeX was comprehensible to an ordinary mathematician. It sounded like just the thing for production of AMS books and journals.

In addition to needing new composition software, the AMS was also looking for new hardware. We had been using an IBM 360 clone, an RCA Spectra 70 (later a Univac Series 70), which still depended on punched card input, had no native upper- and lowercase distinction, and was certainly no longer “state of the art”. TeX was written for a DECSystem 10 at the Stanford Artificial Intelligence Lab (SAIL) and also ran on a DECSystem 20, an interactive time-sharing system with a scrutable operating system. This looked very promising. After making sure that all the existing business applications could be converted to run on this system, the smallest machine of the line was acquired and installed in the computer room alongside the Series 70, and the programming staff was sent off for training.

Although I had become a specialist in the composition activities, the first inkling I had of how deeply I would be involved in TeX was made clear in late spring 1979, when my boss, Sam Whidden, handed me a small book with a yellow and green cover (the first TeX manual), a plane ticket, and a list of addresses. He told me to “go to Stanford, learn TeX, bring it back, and make it work.”

After I finished reading the manual, I gathered samples of both typical material and some especially nasty problems encountered with earlier composition software — running heads on multi-column pages, bad breaks at ends of lines and at the bottoms of columns, uneven columns, and the like. These were mostly in administrative publications, not math journals, since that is where the need for a new composition system was greatest. I got on the plane with all of this, and during the flight, read the manual again.

A small group of TeX initiates gathered at Stanford, and settled into a rented house on the campus. The group included Dick Palais, Mike Spivak, and several others, who were to address particular tasks such as creating an AMS-specific user interface, which became AmSTeX.

I was put in the care of David Fuchs (DRF), one of Don's graduate students in the TeX project. He sat me in front of a terminal on the SAIL computer, brought up the Emacs tutorial, and told me to learn it. (I've never looked back.)

While at Stanford, I had relatively free access to Don, and with his help, addressed the problems I had brought with me. He patiently worked through them with me, occasionally stopping to make modifications to the TeX source code when something exceeded its capabilities. My little green and yellow manual has, in Don's handwriting, the first description of the \firstmark command, which is needed to get the proper material into the running head for the left-hand column of an index. And many of the other problems are addressed in Appendix D of The TeXbook (dirty tricks). (I really am quite proud of my ability to choose “good bad examples” which stretch a program and not only help in the creation of good specs, but also are useful for ongoing testing during development.)
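The running-head problem \firstmark solves can be sketched in plain TeX. This is an illustrative reconstruction, not the actual AMS index code; the entry macro and the entry words are invented for the example:

```tex
% Plain TeX sketch: record each index entry with \mark, then use
% \firstmark and \botmark in the headline so each page displays the
% range of entries it contains (dictionary-style running heads).
\headline={\ifnum\pageno>1 {\bf\firstmark--\botmark}\hfil\folio\fi}

% Hypothetical entry macro: marks the entry word, then typesets it.
\def\entry#1{\mark{#1}\noindent{\bf #1}\par}

\entry{aardvark}   % example entries, for illustration only
\entry{abacus}
\entry{zenith}
\bye
```

On each page, \firstmark gives the first \mark inserted on that page and \botmark the last, which is exactly what a left-hand index column's running head needs.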

I also became familiar with the various output devices that were available, since the whole purpose of typesetting is to generate documents that can be printed.

After the month at Stanford, I brought back a tape containing the current version of TeX, and installed it on the new DEC 20. We didn't have any output device yet, but later that summer DRF came to Providence to install a newly arrived Benson Varian 9211, an electrostatic printer which used roll paper that felt rather slimy, and liquid toner that adhered to spots energized by a comb apparatus through which the electric charge was fed selectively. Since this device had a resolution of 200 dots per inch, all output was generated at a 130% magnification, to be photographically reduced to final size for the press.

The first publication generated by TeX was the 1980–1981 edition of the Society's Combined Membership List. This was a multi-column document with data generated from a database — a perfect task for a batch system. For this first iteration, the Computer Modern fonts were used, but they proved to be too space-hungry. For the next edition, a “tphon” font was developed; this was an especially compact and narrow font based on the Computer Modern sans serif, but adding serifs to the capital I so it could be distinguished from lowercase ell and digit one. (Later, the commercial Bell Centennial fonts, designed specifically for use in directories, were obtained for this use, but that's another story.)

By the end of 1980, a slick new typesetter was installed — an Alphatype CRS, with a resolution of 5333 dpi. DRF was responsible for the driver software; this was a duplicate of the machine on which the camera copy for the five volumes of Computers & Typesetting series was generated. For quite a few years, we replicated the setup at Stanford to ensure that someone else would be familiar with the hardware in case something went wrong.

We didn't start to produce journals with TeX until TeX82 was stable. For much of this period, I was the principal, though no longer the sole, developer of style files.

DW:     Wow! What a fascinating story.

My impression is that you have been involved with a lot of other TeX and TUG activities: TUGboat editor since its very early days (I'm not clear what the division of responsibility was between you and Robert Welland in the earliest issues), on the TUG board from its inception and on the steering committee before that, conduit to Donald Knuth for TeX bugs, keeper of the hyphenation exception list, at least close to the TUG office when it was at the AMS, etc. (I'm sorry if I missed a major area of TeX or TUG activity.) Can you tell us something about each of these, and any others, and how it came about?

bb:     Sam Whidden, who was head of the Information Systems Development department at AMS (my boss), was concerned that, since there wasn't a commercial organization with a stake in TeX, there would come a time when Don returned to other research, and there would be no one to go to for support. Users would be essentially on their own. For this reason, it was in the users' interest to band together for the general good. An exploratory meeting was held at Stanford in February 1980, attended by about 50 people, and the framework of the TUG organization was established.

Because the AMS had such a strong interest in the success of TeX, and because TeX had been unveiled at an AMS annual meeting, the facilities of the Society were offered for the TUG headquarters and AMS staff time was allocated for production of the newsletter. Sam accepted the position of Treasurer. Since I had been involved from the start, and was volunteered to be the newsletter production crew, I was appointed to the Steering Committee.

Bob Welland, as the initial editor of TUGboat, collected material and shipped it to Providence for production of the newsletter. (TUGboat didn't graduate to being a “journal” until 1988. Even now, it's subtitled the “Communications of the TeX Users Group”.) Bob didn't intend this to be a permanent position, and he “retired” after three years. In the absence of other obvious candidates, I assumed the editor's position as of the second issue of volume 4. Although the length of my tenure can be ascribed partly to inertia, and lack of planning for succession, it's been a challenging and enjoyable ride. I've learned a very great deal from this experience, and met many really wonderful people.

Probably because of my position on the TUG board and as editor of TUGboat, which enabled me to meet nearly all the key people in the TeX community, and because I had become a fixture at AMS, in little danger of disappearing, Don designated me to be what I like to refer to as his “TeX entomologist”, i.e., bug collector. He has my address listed on his TeX web page, along with a schedule of when he will next look at accumulated reports. I collect reports as they come to me, and, after a cursory check to see if the topic has been submitted previously, farm them out to volunteers (approved by Don) who vet them. This process involves verifying that the problem is really a bug, and not a feature or user error, and the resulting report is returned to the submitter; often, when a bug really is found, the vetter provides code that might be used in repairing the relevant program.

When Don is ready to look at the collection, he has his secretary request them. I organize the reports by topic, distill them to eliminate redundancy, and ship the file off by e-mail to his secretary. She, in turn, prints them out, and Don reviews and annotates the paper copy, making changes in his source code as necessary, or explaining briefly why something isn't going to be changed, and compiles a list of rewards due. He writes out whatever checks are required, and his secretary ships the annotated paper and the checks back to me for distribution.

Before returning the annotated reports (and possibly checks) to the submitters, I transcribe the notes (always handwritten in pencil) into a copy of the file I sent out, so there is a history that can be passed on to the other implementors and researched when future bug reports are submitted. When I've finished the transcription and mailing, I send the report to a list of TeX implementors for their information.

The hyphenation exception list just sort of happened. Problems encountered with books and journals set at AMS were reported to the TeXnical staff, and we started keeping a list. Since I also like to play with words, it became a bit of a game to see if I could identify similar words that had problems, and figure out why some did and some didn't. The mathematical words yielded a list that was included in the AmS-LaTeX document classes, and it seemed useful to publish the entire list in TUGboat. Once people saw the list, they added to it, and we've ended up with what's there today.
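In TeX terms, such exceptions are supplied with the \hyphenation primitive, which overrides the pattern-based algorithm for the listed words. A brief sketch; these particular break points are illustrative, not taken from the published list:

```tex
% Each word's permissible break points are marked with hyphens;
% a word listed with no hyphens will never be broken.
\hyphenation{
  man-u-script   % illustrative break points, not from the official list
  man-u-scripts  % inflected forms must be listed separately
  TUGboat        % no hyphens: never break this word
}
```

Since exceptions apply only to exact word forms, plurals and other inflections each need their own entry, which is part of why a maintained communal list is so useful.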

As for interaction with the TUG office, it started out in a corner of the AMS headquarters, and used the AMS computer hardware; avoiding contact was next to impossible. When AMS needed the space, the TUG office moved up the hill to a rehabbed fire station, where it occupied the second floor and the old hose tower. After several years in that location, and changes in personnel, there was a need for someone new in charge of the office; the office moved from Providence to be where the new manager was located. With more changes in elected TUG officers and office personnel, the physical location of the office is now in Portland, Oregon. Although I remain in Providence, I've kept in close contact with the TUG office, and still hold some of the ancient archives, both electronic and on paper. It's useful when various ideas are discussed to be able to say, well, this was tried before, and here's what happened. We can learn from history, to decide whether to repeat something in different circumstances, or go in another direction. I hope and believe that this “institutional memory” is useful.

DW:     As someone who has been so broadly and deeply involved in the TeX community for so much of its history, I'm interested in what you see as the major eras of development of TeX and the TeX community.

bb:     Good question. I've never actually given that much thought; I've just accepted developments as they've come along.

As for TeX itself, there are several distinct tracks. First, there are the changes made by Don himself. These seem to me to be the following:

Now that TeX itself is frozen, extensions to the core program may no longer be called TeX, but there are some significant ones:

There were great hopes for NTS (the “new typesetting system”), but it looks at the moment like a dead end.

In the complex superstructure built up around TeX, I think LaTeX is the most obvious success story. Although I have some reservations about its details, the concept of logical structuring is a winner, and the fact that it's been so widely adopted means that there's a critical mass of users that makes it worthwhile for publishers to support its use and undertake further development.

Similarly, ConTeXt, although with a shorter history, has made a stunning entrance in large part based on pdfTeX.

It's perhaps obvious, but many of these developments couldn't have taken place without the arrival of some important technologies:

The personal computer also made it possible for anyone, not just the well-connected or well-heeled, to have access to these tools.

The original TeX community consisted of relatively few individuals, widely scattered, mostly in academic settings. The banding together of users into many formal groups, often based on a common language, has been a major force toward ensuring the continued viability of TeX. In turn, these groups have promoted TeX's adoption in environments other than the academic one. We all know that there are still many users who don't belong to any group. Encouraging their participation and sharing of their skills and experience can only widen everyone's horizons.

DW:     As we have been communicating about this interview, you mentioned your involvement in the STIX fonts project. I gather that you are doing this as part of your job at AMS. While it is not exactly part of TeX, it is still related to typesetting, and I am interested in what dealing with Unicode has been like.

bb:     I had represented AMS in an ISO working group, trying to create an international font standard, for almost ten years; this was the result of a recommendation by one of Don Knuth's colleagues at Stanford. So when the STIX project was started, I was familiar both with standards bodies and with how those bodies viewed the difference between characters (an encoding) and glyphs (in fonts).

In fact, the Unicode Technical Committee (UTC) operates in a manner very similar to an ISO working group, except that their work is able to be done without formal mail ballots by national standards groups, as there is only a single Unicode committee involved. However, since the character content of Unicode ultimately becomes the ISO encoding standard 10646, the rules for accepting and encoding new characters are just as stringent. The members of the UTC have varying backgrounds — linguistic, scientific, computing — but they are all very smart and very knowledgeable in their fields.

Although by Unicode version 3 there was a fairly large complement of characters for math and other technical areas, it didn't include all the symbols that come “standard” with TeX, much less the many additional ones included in the fonts used by the STIX organizations.

The first group of symbols submitted to the UTC was just “symbols”. However, the scientists in the group realized that was only part of the problem. Letters, in different styles and alphabets, are used to represent variables, and substituting a roman letter for the “same” script letter will change the meaning of a formula quite radically. These “mathematical alphanumerics” were accepted into Unicode, but placed in plane 1, where they aren't as likely to be misused for, e.g., wedding invitations. This Unicode work has been reported at http://www.ams.org/STIX.
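The distinction between a plain letter and its mathematical-alphanumeric counterpart is easy to see from any Unicode-aware environment. A small Python sketch (the character names come from the Unicode standard itself):

```python
import unicodedata

# ASCII 'A' and the plane-1 mathematical script 'A' are distinct
# characters: substituting one for the other changes a formula's meaning.
ascii_a = "A"              # U+0041, in plane 0 (the Basic Multilingual Plane)
script_a = "\U0001D49C"    # U+1D49C, in plane 1

print(hex(ord(ascii_a)))            # 0x41
print(hex(ord(script_a)))           # 0x1d49c
print(unicodedata.name(script_a))   # MATHEMATICAL SCRIPT CAPITAL A

# Plane 1 starts at U+10000, beyond the BMP:
assert ord(script_a) > 0xFFFF
```

Placing these alphabets in plane 1 keeps them available for mathematical use while making them less convenient for casual decorative text.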

The STIX project has now progressed to the point where most of the new symbols have been created and placed in preliminary fonts for review. (I've been the person at AMS responsible for reviewing the material and providing comments when adjustments are needed.) Only a couple hundred more symbols are still awaiting delivery from the font contractor, after which a full design review will be undertaken, followed by final corrections, packaging, and creation of (La)TeX support. This phase of the project is reported at http://www.stixfonts.org.

DW:     Do you have thoughts you'd care to share on your view of TeX's future, both at the AMS and in general?

bb:     TeX isn't about to go away. There isn't anything anywhere close to the horizon that can handle publication-quality math as well. Publishers such as the AMS will continue using TeX (or a related successor) for a long time.

TeX is also very suitable for batch processing and other applications (such as preparing custom documents from a database) where it can be used “under the covers”. The latter approach is used by the German railroads to generate custom timetables, as reported in my column in TUGboat 24:3. It is also the basis of a new free web office suite, gOffice. And there have been presentations at several TUG meetings about similar uses in the insurance industry. These kinds of applications will continue to use TeX, at least in part because it's free and reliable.

Free, reliable and eminently math-aware are also the reasons that graduate students in math, the hard sciences, and linguistics will continue to use TeX to prepare their dissertations.

One ever-present conundrum is how to make these audiences more aware of the existence of TeX, and how to encourage them to be active in TeX organizations. I don't know the answer. But I do know that if this can be accomplished, it will help ensure that TeX remains alive for quite a long time to come.

DW:     Unless you think there is a question I should have asked or that you'd like to answer that I haven't asked, I will stop asking questions and close now.

bb:     I suspect that everyone who's read this far is bored to tears, so I'll mention just one question that I get asked from time to time — why I usually answer e-mail in all lowercase. I've been using computers since the days of punched cards. When terminals with upper/lowercase distinction and e-mail became available, so many people were used to the single-case mode that they automatically switched on the caps lock. This seemed to me too much like shouting. Although I'm a person of strong opinions, I try not to shout, so I adopted the all-lowercase mode as (over)compensation. (That doesn't mean I don't pay attention to proper spelling and grammar.) These days, it still serves a useful purpose: I will answer with my own opinions in lowercase, but for really “official” communications, when I am speaking on behalf of AMS or TUG, I will use both upper- and lowercase. Anyone who's familiar with my idiosyncrasies will therefore know when I'm making an official “pronouncement”.

DW:     Thank you very much for taking the time to participate in this interview, and thank you for your detailed and fascinating descriptions. I greatly admire and appreciate all that you have done for the world of TeX over so many years.


Web site regeneration of January 5, 2010.