Idris Hamid

[completed 2009-11-01]

Idris is best known in the TeX community for his work in developing better typesetting for Arabic-script languages such as Arabic, Persian, and Urdu.

Dave Walden, interviewer:     Please tell me a bit about yourself outside of the world of TeX.

Idris Hamid, interviewee:     I was born on the East Coast of the United States to multiethnic/multicultural parents. My background includes a heritage that goes back to the founders of the US, as well as Native American, Arab, African, and East Indian. I have lived in Saudi Arabia, Pakistan, and the East Coast as a child. My teenage years were spent in the Caribbean Islands, such as Trinidad, Barbados, and Dominica. My father was an internationalist who did a lot of non-governmental work for oppressed peoples of the Western Hemisphere. I did a lot of hiking in the mountains, was home-schooled for most of my junior- and high-school years. Went to Georgia State University for a BSc in Physics, followed by an MA in physics at State University of New York, Buffalo, followed by a PhD in Philosophy at that same institution.

DW:     When and how did you first encounter TeX?

IH:     When I was a physics grad student, a Chinese fellow grad student who shared my office printed a paper of his that caught my eye. I was floored by the quality of the mathematics, and asked him, “How in the world did you do that?!?” And he told me about TeX. I then bought Borde's “TeX by Example”, after which an Indian colleague convinced me to forget about Plain and learn LaTeX. He gave me a few pointers and I became a LaTeX user for the rest of my grad school years, even after going into the Philosophy Department.

While in the Physics Department I mostly used the mainframe, which already had TeX installed. After getting my own PC I tried VTeX, Y&Y TeX, and teTeX on various Linux distros, before finally settling on MiKTeX and WinEdt. This was all part of my pre-ConTeXt history :-)

DW:     Did you use TeX/LaTeX before you got involved in creating a critical edition for your PhD thesis?

IH:     Sure, for my MA in physics, as well as all papers I wrote for my philosophy course work.

DW:     You said at TUG'09 that you wrote your PhD thesis using EDMAC and Klaus Lagally's ArabTeX. What are these, and were there other TeX tools you used, e.g., is LaTeX also somehow involved? And please tell me a bit more about what your thesis was.

IH:     EDMAC is a critical edition package by John Lavagnino and Dominik Wujastyk. It's a pretty good package, but it was originally only a Plain TeX package. Having used LaTeX for a while, and not being a programmer myself, I was more comfortable with LaTeX than with Plain TeX. Now, EDMAC did provide a sty file to at least load the LaTeX2e NFSS (New Font Selection Scheme), making font control a bit friendlier. On the other hand, with NFSS one loses some Plain TeX functionality as well as some LaTeX functionality.

ArabTeX was an exceptionally useful package by Klaus Lagally. The font is not particularly beautiful but it is quite “scholarly” in the sense that it imitates and captures the spirit of efforts of orientalist works — such as Lane's Arabic-English Lexicon — prior to and just after the turn of the 20th century.

My PhD was done in LaTeX and ArabTeX, with the critical edition portion done in EDMAC and ArabTeX. One thing that neither EDMAC nor ArabTeX could provide is global right-to-left typesetting. So the footnotes were still numbered on the left side of the footer, etc.

The title of my thesis is The Metaphysics and Cosmology of Process According to Shaykh Aḥmad al-Aḥsāʾī: Critical Edition and Translation of Observations in Wisdom.

Shaykh Aḥmad was the last major original philosopher of traditional Muslim civilization. He is not very well known, even in Islamic philosophy circles, but his importance for metaphysics, cosmology, theology, and eschatology can hardly be emphasized enough.

DW:     You moved from physics to philosophy, and your CSU department website says your research, and presumably teaching, now includes Islamic philosophy, cosmology, metaphysics, and mysticism. What led you from studying physics to studying these other subjects and why this combination of subjects?

IH:     I get this question a lot! All of these interests are captured by the concept “cosmology”. My studies in physics were focused on cosmology, and my MA thesis involved a solution/interpretation by the philosopher Kurt Gödel of Einstein's field equations. As I got deeper into the philosophy of physics issues, I realized that my immediate interests in cosmology needed a better philosophical grounding. The Muslim contribution to cosmology was especially interesting to me, so I chose that as a specialty. In the end, however, I would like to bring all this historical research in Islamic cosmology back to a 21st century relevance and context.

I had been trained in Arabic language, syntax and morphology, from a young age, so the transition to Islamic cosmology was a very natural one. In addition, I already had a degree in Arabic and Islamic Studies as well.

DW:     Your website also says you are an editor-in-chief of the International Journal of Shīʿī Studies. What are Shīʿī Studies?

IH:     Islam has two main schools of thought: Shīʿī and Sunni. Sunnis are the majority, although in the arena of the intellectual heritage of Islam the Shīʿī persuasion is just as rich, if not richer. Most of the great scientists and philosophers of Islam were either Shīʿī or Shīʿī-leaning. The main difference between the two involves the nature of the presence and continuation of the Prophet's spiritual, charismatic, and political authority over the Muslim community after his passing.

DW:     And you've written on “Islam Dynamic: the Cosmology and Spirituality of Walayah”. Is this a paper or a book, and what is Walayah?

IH:     Islam Dynamic is a book-length project that aims for a unified, 21st-century presentation of the various elements of the Islamic framework. This work is based on the thesis, attested to in the Qurʾān and a number of traditions, that the most fundamental activity of Islam is walayah, that is, dynamic loving, which involves the twin poles of cherishing-lordship — from God to Creation — and of adoration-service — from Creation to God. This field of walayah echoes throughout the cosmos as infinite-yet-bounded dyads of walayah, or walayah-relationships: sun-earth, earth-moon, mother-daughter, leader-follower, and so forth. The health of the cosmos depends on the health of these walayah-relationships, which in the end are meant to bring us into cosmic cognizance of the Source and Destiny of our being.

Due to its length, we will probably divide this work into two books, which we hope to release later this year.

By the way, my latest book, Islam and Development: The Institutional Framework, was released in October of this year. It is coauthored with Dr. Abbas Mirakhor, a former Executive Director and Dean of the Executive Board at the International Monetary Fund, and the foremost expert in the West on the rapidly burgeoning field of Islamic Economics. More information is available at the on-line store of Global Scholarly Publications. The entire book was produced using LuaTeX and ConTeXt MkIV.

DW:     Who is Neo, as described in your chapter “The Cosmological Journey of Neo: an Islamic Matrix”?

IH:     That was a fun article to write. Basically, I situated the character Neo of the Matrix movie trilogy within a cosmological, spiritual journey of self-knowledge to enlightenment.

DW:     I was fascinated by your presentations at the TUG 2009 conference at the University of Notre Dame: “Arabic typography: Past, present, and TeX” and “Dynamic Arabic: Towards the philosopher's stone of Arabic-script typography”. I had no idea that good typesetting of Arabic involved such variety and complications: stretching of individual letters, placement of the vowels and ornaments, cultural considerations, and so forth. What motivated you to take on trying to create the best Arabic typesetting system, and why did you choose TeX, ConTeXt, and LuaTeX as the basis for it?

IH:     To answer the first question: As good as Lagally's system is, it left a lot to be desired both aesthetically (cultural authenticity) and technically (bidi control, etc.). The more I studied the issue, the more I realized that much more needed to be done. I figured that if I was going to do this, I might as well try to have the best system possible.

Also: By this time I had come to appreciate the power of TeX compared to other typesetting systems. Even if some commercial systems had the potential to do most of what I needed, there would invariably be some feature that would be unavailable. With TeX I could potentially have something that I could control and extend as needed, without waiting forever or fruitlessly begging a vendor.

Around the year 2000 I closely followed the Omega project and all the research that Yannis Haralambous was doing with the Arabic script. I was exceedingly hopeful that Omega would provide the path forward. Yannis and I worked together briefly, and his help gave me a boost in my research. However, he soon got busy with other things and I had to struggle on my own to figure out how to edit OTPs (Omega Translation Processes) and build OVPs (Omega Virtual Property Lists). The documentation was scattered and often vague, and the utilities frequently broken. I finally cobbled together a working set of utilities that could function with Omega 1.15 — the most stable version — and even put together a guideline for how to make an OVP: http://www.dtek.chalmers.se/~d97ost/omega/font-instructions.txt

I also made contact with Alan Hoenig — author of TeX Unbound — who was also interested in developing an Arabic typesetting system based on TeX (and Metafont as well in Alan's case). He was concerned about Omega's stability as a platform, concerns which proved all the more serious when Omega 1.24 was released. It was such a disaster: much, much slower than Omega 1.15 and broken in a number of areas. I complained LOUDLY to anyone who would listen, and lobbied to not have 1.15 replaced in TeX Live.

That event was the impetus for the eomega project, eventually Aleph. Giuseppe Bilotta took the Omega 1.15 code, fixed some bugs, and added the eTeX extensions needed by ConTeXt and by many LaTeX packages. The aim was to provide something immediate, practical, stable, and solid for users, so that the Omega team could continue to experiment with new ideas until it was ready to release a truly usable and reliable system. Put another way, Aleph was meant to provide a practical bridge for those needing a stable version of what Omega already provided.

Aleph fulfilled its bridge role quite well. However, the bridge turned out to lead to, not a future version of Omega, but rather a new version of TeX, one that would unite aspects of Omega with pdfTeX and a scripting engine. Thus LuaTeX was born.

I applied for and received a significant grant from my home institution, Colorado State University. The grant was processed under the auspices of a new project called Oriental TeX. It allowed us to accelerate development of LuaTeX; in particular, it funded a big block of development time for Taco Hoekwater.

A critical element that made the entire development possible was the dynamicity of the ConTeXt typesetting system. Core LaTeX development has been pretty static, and I lost hope that my needs for Arabic-script typesetting could be met within the LaTeX framework in a timely manner. The memoir package gave me some hope in the early stages, but a communication from Peter Wilson around 2001–2 caused me to lose hope that, e.g., EDMAC could be fully ported to memoir or LaTeX. Giuseppe Bilotta was active on the comp.text.tex newsgroup promoting the Gospel of ConTeXt: whenever I would ask if X could be done in TeX, I generally got the best answers from Giuseppe. After joining the ConTeXt mailing list I found a dynamicity and movement there that, to be blunt, simply did not exist in the LaTeX world. Hardly any problem came up that was not solved or patched within days, or even hours or minutes. There was also an openness to dramatic new ideas that I found quite refreshing. The monolithic and consistent interface of ConTeXt — no clashing packages — was also enjoyable.

To be fair, switching to the ConTeXt way of thinking and doing things was not an overnight process — in particular, it took a while to master the typescript system of font control. But once I got used to it, I could not imagine going back to LaTeX. I'll go even further and say that, in my view, ConTeXt is the future of TeX.
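For readers unfamiliar with ConTeXt's typescript mechanism, here is a minimal sketch of the kind of font setup it involves. The typescript name, typeface name, and font file below are hypothetical placeholders of my own, not taken from any real distribution:

```tex
% Hypothetical minimal typescript: map the Serif synonym to a font file,
% then build a typeface from the typescript and activate it.
\starttypescript [serif] [myserif]
  \definefontsynonym [Serif] [file:MySerif.otf] [features=default]
\stoptypescript

\definetypeface [mydoc] [rm] [serif] [myserif] [default]
\setupbodyfont  [mydoc,11pt]
```

The indirection — synonym, typescript, typeface — is what takes getting used to, but it is also what lets one whole font family be swapped for another with a single line.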

Ironically, a couple of years later Peter Wilson actually did port EDMAC to LaTeX, and added some good functionality for parallel bilingual typesetting as well (where, e.g., Arabic script is on one page and Latin on the other). But for me there is no turning back. Accomplishments of the Oriental TeX and LuaTeX projects — in conjunction with the new experimental version of ConTeXt, called ConTeXt MkIV (see http://www.pragma-ade.com/general/manuals/mk.pdf) — include advanced OpenType typography which rivals and, in some cases, surpasses what is available in commercial systems like InDesign. For example, we can access variants within alternate-substitution lookups. See http://scripts.sil.org/cms/scripts/page.php?site_id=nrsi&id=arabicfonts#b78d09ff, under “ArabAyah and subtending marks”, where it says,

Additionally, Scheherazade includes two simplified alternates for ArabAyah under the Stylistic Alternates (salt) feature, but at this time we know of no OpenType-based applications that can access these.
Well, ConTeXt can!
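To illustrate how one might switch on such stylistic alternates in ConTeXt MkIV: the feature-set name below is my own invention, and the font file name is an assumption about the SIL distribution, so treat this as a sketch rather than a recipe.

```tex
% Sketch: a feature set that enables Stylistic Alternates (salt)
% alongside the usual Arabic shaping features, attached to a font.
\definefontfeature
  [arab-salt]
  [mode=node,script=arab,language=dflt,
   init=yes,medi=yes,fina=yes,liga=yes,calt=yes,mark=yes,
   salt=yes]
\definefont [ScheherazadeAlt] [file:Scheherazade.ttf*arab-salt]
```

Because MkIV interprets the OpenType tables itself, any lookup present in the font can be addressed this way, whether or not mainstream shaping engines expose it.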

I have written an ambitious new framework for critical editions typesetting, one much more general than that provided by EDMAC, and which will have far more general application — it's called CriTeXt. Implementation will require further LuaTeX development, but it is my hope that some version of it will be implemented in the near future.

DW:     My understanding is that the Arabic alphabet/script is used for several different languages. Although Arabic is written in a cursive style, are there still words made up of letters?

IH:     When you say “are there still words made up of letters”, I'm not sure what you mean. Arabic script is perfectly alphabetic; it's just that the shape of each character is a function of its contextual position within a cursive string — assuming the character is one that is supposed to connect cursively at all. For example, the word for spirit (pronounced rūḥ) contains no cursive connections at all:

روح

And fātiḥ or opener is a single word composed of two cursive strings:

فاتح

And the name Muḥammad is a single word composed of a single cursive string:

محمد

So when we say “cursive” we mean something slightly different from what is meant for the Latin script. Not every letter connects cursively in a given context, and an individual Arabic-script word may contain zero, one, or more cursive strings.
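This letters-versus-cursive-strings distinction can be sketched in code. The toy model below is mine, not from the interview: it uses the fact that a handful of Arabic letters join only to the letter before them and so always terminate a joining run, and treats a “cursive string” in the sense above as a run of two or more connected letters. A real shaper would consult the Unicode joining-type property rather than this hard-coded table.

```python
# Simplified model: these letters join only to the preceding letter,
# so they always end a joining run; hamza does not join at all.
# (Illustrative only -- a full implementation would use the Unicode
# joining-type data from the Unicode Character Database.)
RIGHT_JOINING_ONLY = set("اأإآدذرزوؤ")
NON_JOINING = set("ء")

def joining_runs(word):
    """Split a word (in logical order) into maximal joining runs."""
    runs, current = [], ""
    for ch in word:
        if ch in NON_JOINING:
            if current:
                runs.append(current)
                current = ""
            runs.append(ch)
        else:
            current += ch
            if ch in RIGHT_JOINING_ONLY:
                runs.append(current)
                current = ""
    if current:
        runs.append(current)
    return runs

def cursive_strings(word):
    """Runs of two or more letters, i.e. cursive strings in Idris's sense."""
    return [run for run in joining_runs(word) if len(run) > 1]
```

Applied to the three example words, this yields zero cursive strings for rūḥ, two for fātiḥ, and one for Muḥammad, matching the discussion above.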

DW:     I think I understand what you are saying. I also now understand that the Arabic alphabet is used with many languages — the second most used alphabet after the Latin alphabet, according to what I read on Wikipedia. Which of these languages do you know?

IH:     I adapted an illustration of the domain of the Arabic script from a graphic I found on Wikipedia. The Arabic script was in even more widespread use before the colonial powers discouraged its use by the populations within their empires. For example, Swahili and Malay used the Arabic script, but the Europeans encouraged the use of the Latin alphabet. In Malaysia, the Arabic-script writing system for Malay, called Jawi, is making a comeback after being banned by colonialists. There is even a dialect of Chinese, Dungan, that used to use the Arabic script; its speakers call that writing system Xiao'erjing. The Soviets banned it for the Chinese Muslims living in their territory, but it barely survives among some small communities of Chinese Muslims. Throughout former Soviet Central Asia the Arabic script was banned, so that in Tajikistan, whose national language is Persian (the Tajik dialect), people were forced to abandon the beauty of Persian calligraphy for the Cyrillic script. Today there is a movement to restore the Arabic script for the Tajik dialect of Persian.

I am primarily familiar with the Arabic, Persian, and Urdu languages. Note that “Arabic” is the name of a language as well as of the script.

DW:     Can you point me to a good article summarizing the issues and challenges of typesetting Arabic?

IH:     I am writing a monograph, Towards an Ontology of Arabic-script Typography, that will summarize much of this. I intend to include much of what I presented at TUG 2009 in the Ontology, as well as in a planned monograph on OpenType in TeX.

DW:     I've heard about an Orientalism period in American culture and art and about Orientalism as an area of cultural study. You call this project the Oriental TeX Project. What does Oriental mean in this context?

IH:     “Orientalism” is not a welcome term among many people these days (there is too much of a connection with 17th to 20th century imperialism). But “oriental” doesn't have the same problem.

In academia, the adjective “oriental” may be used to mean “pertaining to the Middle East or South Asia”. Of course, it is also used more commonly to refer to matters pertaining to East Asia. But “oriental” is also a translation of the Arabic and Persian word “ishrāqiyy”, a philosophical term meaning “pertaining to the orient of rising light”. The ishrāqiyy philosophical tradition was a symbiosis of Hellenistic rationalism and “oriental” mysticism. “Oriental” was a more generic term to use in the project name than words such as “Arabic” or “Islam”.

DW:     To get Oriental TeX built, I understand you have had to do fund-raising.

IH:     I have sought the usual sort of academic grants that come to university researchers. The first major grant was from Colorado State University. The university saw the value of working on a solution to improve global intercommunication and scholarship. I have also sought grants from private institutions interested in Arabic typesetting. And of course I have sought funding within the world of TeX.

DW:     How do you coordinate with Taco, and with Hans Hagen, regarding the Oriental TeX Project?

IH:     It's often done by email and pretty informal. Skype sessions are perhaps the most important method of communication these days. Hans catches me here in Colorado when I've barely awakened, and I catch him in Holland often at dinner time or right before sleep. When one of us sees a problem or need, he communicates with the other two. We converse until we have settled on a solution. Sometimes we involve someone else in our discussion. Sometimes we have to postpone working on something until better options emerge.

DW:     Are there areas of particular difficulty?

IH:     Kerning for vowels and diacritical marks is at least an order of magnitude harder with the Arabic script than with the Latin alphabet. The relationships between base glyphs, vowels, and identity marks need to expand and contract horizontally and vertically — thus there are more degrees of freedom, including rotational degrees of freedom. I suspect a really super Arabic font of this sort can never be truly complete; it will always require iterations and updates as clashes between marks and bases are discovered in practice.

An important point: Taking TeX to the next level has been extremely time-intensive. On my side, the development of a prototype super Arabic-script font — code-named Husayni — has turned me into a caveman. This is not do-it-in-your-spare-time work. Fulfilling the flexible requirements of Husayni takes constant, focused, 12-hour days over months. We already have about 70 GSUB variant lookups, probably more than just about any other OpenType font out there! And Hans has to continually fight a fuzzy OpenType specification to make sure that MkIV does the right thing and that we are both interpreting the spec correctly. At times we spend long Skype sessions struggling to interpret the spec.

Then there is Microsoft VOLT, which I use for the GSUB and GPOS tables. Husayni taxes VOLT to the max, and it is not infrequent that the font will not compile and I have to spend hours optimizing to make VOLT happy. Although we use FontForge code within LuaTeX to actually interpret OpenType tables, doing the tables themselves in FontForge was abandoned early on, due to its incomplete implementation of features, a somewhat buggy GUI, the lack of a fully bidirectional proofing window, etc.

I could submit more bug reports, and George Williams (the developer of FontForge) would almost certainly fix things that need fixing. However, we have more than enough time constraints already, and fighting compilers to build every patch of FontForge on Windows is just out of the question. VOLT hides more of what's really going on inside the OpenType tables, so it's less instructive, but it is much more efficient to use. And LuaTeX can dump the actual OpenType tables as interpreted by FontForge into a text file, so we can examine that directly when we need more information.

These days we have roughly settled on Microsoft VOLT as a standard for the GSUB and GPOS implementation in MkIV. (This is not the same as Uniscribe, which adds a lot of extras, including Microsoft's partly controversial interpretation of what the actual implementation of a given language in type should look like.) For one thing, most advanced multilingual typography fonts are made with VOLT, so it has de facto status. We also have good relationships with VOLT guru John Hudson and the main VOLT developer Sergey Malkin, so if and when we find a bug in VOLT, it can be fixed, or we can choose a different interpretation of the spec.

So fighting imperfect tools and fuzzy specs takes up a lot of our time. This is the kind of project one only wants to do once in a lifetime, sort of like Knuth and his decade developing TeX to begin with. But we expect the results to be worth it!

DW:     Do you encourage your students to learn TeX?

IH:     If I could, I would. But, practically speaking, I prefer that my students spend their time on the course content rather than struggling to surmount TeX's learning curve. I do ask for PDFs rather than .doc files.

If a reliable interface to TeX were available — one with a completely graphical user interface where the user never had to see a backslash — I would be tempted to require the use of TeX for some courses. On the other hand, once LuaTeX, ConTeXt MkIV, and the Oriental TeX projects are mature, I may require the use of TeX by my graduate students in any case.

DW:     Thank you, Idris, for taking the time to participate in this interview. I enjoyed seeing you in person at TUG'09 in South Bend, Indiana, and enjoyed learning more about your background and work. I probably will never use the results of your work with Arabic typesetting; however, I definitely will use what I learned from reading your TUGboat paper about the Minion Pro fonts.


Interview pages regenerated January 26, 2017.