\newcommand{\Lcs}[1]{\mbox{\normalfont\ttfamily\bs#1}}
\title{The 15th Annual \TeX\ Users Group Meeting}
\author[Michel Goossens]{Michel Goossens\\CERN, 
Geneva\\\texttt{m.goossens@cern.ch}}

\begin{Article}

\section{Introduction}
July 31st, Santa Barbara, California, USA.\footnote{For another view
of TUG94, see \emph{Malcolm's Gleanings} later in this \BV.}
Just the right combination 
of sunshine, temperature, and sea breeze. The mountains in the
background, the beach nearby, the food nearly perfect. The ideal
setting for a conference. And here we were, some 120 \TeX{} 
enthusiasts, coming from many countries and cultures, to meet
each other, and talk about and listen to presentations of the
latest developments in the area of high quality typesetting.

We were not disappointed. The quality of the presented papers was
uniformly good, often outstanding, and so many Birds of a Feather
(BoF) sessions were going on in parallel that it was impossible to
keep track of the many hot topics being debated by specialists and
users in these informal meetings, which took place when there were no
formal presentations.

The formal theme of the conference was ``Innovation''.  Malcolm Clark
and Sebastian Rahtz brought together a tremendous programme that
clearly showed how \TeX{} is now making inroads in many areas of book
production, like colour support, more flexible page layouts, scholarly
and non-Latin alphabet editions.  Several groups are working on
extending \TeX{} or \LaTeX{} so that these tools become ever better
adapted to the demands of present-day document handling and are
integrated more readily into electronic distribution networks or
databases.  Several new approaches introduce object-oriented
programming techniques, and hence show that \TeX{} forms an integral
part of a modern computing development environment.

I hope that the following detailed overview will give you a flavour of
all these developments, and that it will make you want to know more
about one or more of them. You can obtain the proceedings of
the Conference by becoming a TUG member for \$60, which entitles you
to four issues of TUGboat and of \TeX{} and TUG News, or else for \$30
you can obtain a copy of the Proceedings only. For more details
contact the TUG office.
% at the following address
%
%\begin{tabular}{@{}l@{\quad}ll}
%\TeX{} Users Group            &     Phone:   +1 (805) 963-1338\\
%P.O. Box 869                  &     Fax:     +1 (805) 963-8358\\
%Santa Barbara, CA 93102, USA  &     E-mail:  tug@tug.org
%\end{tabular}

It all started on Saturday July 30th in the evening with the
traditional Welcome Party. This is where one meets old friends and
colleagues or discovers new faces; the latter are at first looking
around with somewhat anxious eyes, but are quickly surrounded by
reassuring oldies, shaking hands, and being welcomed to the
``Family''. The Californian wine, beer, or lemonade flowed freely, and
by the end of the evening all ice was broken and the atmosphere was
one of harmonious warmth and unity.

The Conference was formally opened the next day by TUG's
Executive Director and local organizer, Patricia Monohon, and
Christina Thiele, TUG's President, also spoke a few words of welcome.

\section{Publishing, languages, literature and fonts}
It was Charles (Chuck) Bigelow who had the honour to present
the first paper.  He started by looking back at letter forms over the
past 2500 years or so, and then discussed work---together with Kris
Holmes---on the Lucida Sans Unicode font, which contains at present
some 1700 alphabetic and mathematical symbols and is or will be
available with the multi-byte operating systems Windows/NT, Apple GX
and AT\&T Plan 9.

Frank Mittelbach then discussed some of the dos and don'ts that he
learned while preparing the \emph{\LaTeX{} Companion}. From the
discussions following the talk it seemed that his impressions were
shared by many other authors/editors who are in the publishing
business.

Just before tea it was Yannis Haralambous who showed off his artistic
talents using \MF{} when he presented his work on
typesetting the Holy Bible in biblical Hebrew using his \emph{Tiqwah} 
system, which will make it possible, for the first time, to use the
typographic powers of \TeX{} to typeset high-quality Bible editions.
Together with his work on typesetting the Holy Koran using several
thousand ligatures, and his font developments for many other scripts
(as described at earlier conferences, and later in the present one),
this will allow scholars in many disciplines to typeset their works
at affordable prices using \TeX{} and any computer.

Michael Cohen, an American teaching at the University of Aizu in
Japan, explained how his \emph{Zebrackets} system of meta-\MF{}s 
can generate striated parenthetical delimiters on demand. This offers
the reader a more complete graphical picture of the relationship 
between various document elements by augmenting the information 
content of their representation.

Yannis Haralambous then came back on stage to present ``Humanist'',
his new system to ``humanize'' \LaTeX. 
Document input, markup and editing are performed using any word
processor that supports RTF output (such as Word or WordPerfect);
the RTF file is then turned into \LaTeX{} code by the Humanist system.
A user can thus work on a text in the most friendly and natural way
(\ie without a single \LaTeX{} command), but will get syntactically
correct \LaTeX{} output so that the powerful \TeX{}
engine can be used to obtain high-quality typeset output.

The final paper of the Sunday was by Basil Malyshev, on converting \MF{}
fonts automatically into PostScript Type~1 outlines. 
It was read by Alan Hoenig in the author's absence.
Various techniques to perform the conversion in question were presented
and the one chosen for the creation of the \emph{Paradissa Fonts
Collection} was described. This collection offers a freely available
set of PostScript Type~1 renderings of all Computer Modern, Euler, CM
Cyrillic and \LaTeX{} fonts.

\section{Colour, and \LaTeX}
Leslie Lamport started the presentations of the second day.  He gave
us his ideas on ``\LaTeX4'', a \textsc{wysiwyg}-like, though
structured text editor, well integrated into the user environment.

James Hafner gave a short historical overview of how colour was 
first implemented in Tom Rokicki's dvips {\tt.dvi} driver to provide an 
efficient and simple method for specifying colour with \TeX.
Tom Rokicki then discussed a new implementation of colour
support and proposed a standard way for specifying colour and
colour-like specials, implemented by modular C-code, that can be
easily integrated into the {\tt.dvi} drivers.
Angus Duggan described his program DVISep,
a simple colour separator for {\tt.dvi} files, as well as some other tools
for working with {\tt.dvi} files.
Sebastian Rahtz provided an introduction to the colour commands available
in \LaTeXe{} and showed some interesting examples. 
Michel Goossens discussed some of the more basic issues concerning the use
of colour in documents. He emphasized that the colour dimension has to
be used with great care, so as not to distract the reader from the
main message. Colour, like typography, has a set of rules that have
to be learnt and applied for greater effectiveness.
Friedhelm Sowa presented his original and device-independent  approach to
colour support and showed some results obtained using BM2FONT on a Hewlett
Packard inkjet printer.
Michael Sofka gave an overview of the various stages in the
production of a colour book. He addressed the issues involved in
professional colour separation, and demonstrated how \TeX{}, with a
suitable driver, can be used to produce high-quality custom and process
colour books.
Then Sebastian Rahtz returned to the spotlight,
with a presentation of PSTricks, a paper by Denis
Girou and Timothy van Zandt, who could not be present.
Sebastian, in his usual clear style, showed how PSTricks provides a 
convenient interface to PostScript from within \TeX. It allows one to
draw any kind of graphics object, such as circles, polygons, curves,
and springs. It offers several drawing tools and grids, and has various
commands to place text along a path. Objects and text can be rotated,
scaled and tilted, and 3-D effects are available. Framing and clipping
are supported, as is a general tree-drawing package.
A package for generating slides, \texttt{seminar}, exists, and an early
version of a plotting package is also ready.
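To give a flavour of the interface, here is a minimal sketch (not taken
from the talk; it assumes the \texttt{pstricks} package is loaded and a
PostScript-capable driver such as dvips is in use):

```latex
% Minimal PSTricks sketch (assumes \usepackage{pstricks} and a
% PostScript-capable driver such as dvips).
\begin{pspicture}(0,0)(4,3)
  \psline{->}(0,0)(4,0)                  % an arrowed line
  \pscircle[linestyle=dashed](2,1.5){1}  % a dashed circle of radius 1
  \psframe(3,2)(4,3)                     % a framed rectangle
  \rput{45}(1,2){rotated text}           % text rotated by 45 degrees
\end{pspicture}
```

Each object is placed in the coordinate system of the surrounding
\texttt{pspicture} environment, which is what makes combining drawn
objects with ordinary \TeX{} text so convenient.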

After the presentations on colour our attention turned to the subject 
of general \LaTeX{}-related developments.
First, Jon Stenerson showed us his system for creating
customized \LaTeX{} style files via a graphical user interface,
composed of menus, windows, and dialog boxes. It is at present closely
linked to the Scientific Word text processor, although, in principle,
it could be used with any \LaTeX{} environment.
Johannes Braams provided a clear introduction to 
classes and packages in \LaTeXe.
He started by relating
the \LaTeXe{} packages and classes to \LaTeX~2.09 major and minor
styles. Then he discussed how old styles can be most easily upgraded.
In the last part of his talk he gave a
concise overview of the document
classes and packages that come with \LaTeXe. 
The last talk of the day was by Alan Jeffrey, who covered the
subject of using PostScript fonts with \LaTeXe. He described the
\LaTeXe{} font packages \texttt{psnfss} and \texttt{mathptm} and some
of the design decisions made in their development.

Before the dinner ``on the beach'' several BoF sessions took place.
One was on ``colour'', coordinated by David Carlisle, another on
``practical indexing'', coordinated by Nelson Beebe, and one on ``font
encoding'', coordinated by Alan Jeffrey.
Many of the discussions in the BoFs carried over into the beach
dinner time, but, as families were also present, other more
mundane subjects were also addressed. It was one more golden 
occasion to get to know each other in a more personal context,
without reference to glue, (coloured) boxes or other \TeX{} speak.

\section{\TeX\ Tools}
Tuesday morning was devoted to ``Tools'', and started with a
presentation by Oren Patashnik, the author of \BibTeX{}.
He first took a look back and explained why some of the
design decisions of \BibTeX{} were made. Then he discussed some of the
features that he plans to include in the new version, such as an
easier interface to create non-standard bibliographies, support for
national languages and the possibility of multiple bibliographies 
in a single document.
The next talk was by Pierre MacKay, who presented his typesetter's 
toolkit, which includes tools for remapping fonts and generating
composite glyphs, and a program for generating AFM PostScript metric
files for the Computer Modern fonts.
Michael Barnett described a remarkable application where a combined
use was made of electronic typesetting and symbolic computations. 
His work seems to indicate that a considerable amount of time and
effort can be saved when complex formulae are obtained symbolically by a
computer program, like \textsc{Mathematica}.
Minato Kawaguti, of Japan, proposed a new and efficient method to edit
\AllTeX{} source files by combining an emacs-type editor and a special
version of \texttt{xdvi}, where the two windows (emacs and \texttt{xdvi}) are
displayed simultaneously, and pointing to a portion of the document in
the \texttt{xdvi} window positions the text in the editing window in
the same region.

After coffee Yannis Haralambous showed his work on the Indica system,
and a completely new \TeX{} system for Sinhalese. The Indica system is
a generalized preprocessor for Indic scripts (scripts of languages
used on the Indian subcontinent, plus Sanskrit and Tibetan).  Urdu,
where the Arabic script is used, is not supported.  Various input
encodings are accepted and with the help of \texttt{flex}, a {\sc
  unix}-based lexical analyser generator, are translated into \TeX{}
commands. Identical input encodings can be used for different
languages, thus minimizing user retraining when inputting in different
languages. The Sinhalese \TeX{} system is a complete typesetting
workbench for that language, containing specially designed fonts.
Jean-luc Doumont explained how pretty-printing of Pascal programs can
be done entirely within \TeX{}, without the need of a preprocessor.
He showed how this approach of ``preprocessing within \TeX{}'', using
two-token tail-recursion, can also be applied to other situations,
\eg, for an elementary chemistry mode.
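The flavour of such ``preprocessing within \TeX{}'' can be conveyed by a
deliberately simplified plain \TeX{} sketch (not Doumont's actual code):
a macro examines one token at a time and ends by calling itself, so the
recursion never deepens.

```latex
% Hypothetical sketch of token-by-token processing via tail
% recursion, in the spirit of "preprocessing within TeX".
\def\endscan{\endscan}% unexpandable sentinel marking end of input
\def\process#1{#1}%    placeholder: here one would, e.g., embolden
                %      keywords or insert chemistry-mode markup
\def\scan#1{%
  \ifx#1\endscan
    \let\next\relax   % sentinel reached: stop the recursion
  \else
    \process{#1}%     % handle exactly one token of the input
    \let\next\scan    % tail-recurse: \scan picks up the next token
  \fi
  \next}
% usage: \scan begin\endscan  processes b, e, g, i, n in turn
```

Because \verb|\next| is invoked after the conditional has closed, the
recursion is a true tail call and \TeX{}'s input stack does not grow
with the length of the material being scanned.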

After lunch we had the afternoon off and most of us spent it in
the nice town of Santa Barbara.
In fact, during the Tuesday afternoon we were supposed to
go and have a look at the Santa Barbara Channel Islands, which provide
a shelter for the area between the islands and the mountains, thus giving
Santa Barbara its unique sub-tropical climate. The plan was to go and
spot a few whales, but the sea was somewhat rough, and the captain
preferred to take us on a 3-hour tour along the coast. Even so, quite
a few of our passenger-colleagues felt sick, and it was with some
relief that many of us set foot ashore again around 7 pm, and set off
to go and pick a restaurant to enjoy the local food.

\section{Futures}
The next day's theme was ``Futures'', and Joachim Schrod thought that
interactivity was the way forward. He emphasized that, very early on,
Knuth thought that an interactive \TeX{} would be useful. Many
\TeX{} systems have been built that contain some interactivity.  To
better understand the actions of \TeX{} he proposes that a formal
approach should be used since, according to his views, informal
descriptions have failed. As part of a solution he presented, after
developing an abstract decomposition, a formal description for
\TeX{}'s macro language. The latter can be interpreted by a Common
Lisp system and the resulting Executable \TeX{} Language Specification
(ETLS) can be used as the basis for a debugger of \TeX{} macros.
Chris Rowley then reviewed some of the investigations of the \LaTeX3{}
team in the area of modeling and specifying page layouts.  One of the
questions that they asked themselves was how well \LaTeX{} can cope
with that job compared to other text processing software systems, and
whether a complete redesign of the system is needed.  He also
mentioned the wider question of how these aspects should be addressed
in future typesetting systems.  Don Hosek gave an overview of various
page layouts he had tried for his new magazine \textit{Serif}, and
showed how he could massage \TeX{} into doing (almost) everything he
wanted, mainly using code from the infamous Appendix~D of \textit{The
  \TeX book}.  John Plaice then reported on the present status of the
Omega project, which is a series of extensions to \TeX{} to improve
its multi-lingual abilities. It supports multiple input and output
character sets and allows any input encoding. Transformations from one
coding to the other are supported.  Even scripts requiring a very
complex contextual analysis, such as Arabic or Khmer, can be handled
elegantly using 16-bit or 32-bit virtual fonts.

After a short break Arthur Ogawa showed ways of combining
within \TeX{} the descriptive markup and object-oriented
programming (OOP) paradigms. He discussed an extension to \LaTeX's markup
scheme that more effectively addresses the needs for a production
environment, and for implementing such a system he heavily relied on
the use of OOP techniques, where \LaTeX{}
environments can be thought of as objects, and several environments
can share functionality of a common, more general object.
In his companion talk to Ogawa's, William Baxter went on to describe
the actual implementation of an OOP system in \TeX{}, where formatting
procedures and markup are strictly decoupled, so that, indeed,
designers can fully benefit from the OOP techniques available.
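As a loose illustration of the idea (not the authors' actual
implementation), \LaTeX{} environments can already share set-up code in
a way reminiscent of inheritance, with specialized environments built
on a common base:

```latex
% Hypothetical sketch: a generic "displayblock" environment whose
% set-up is shared by more specialized environments, loosely
% mimicking class inheritance in an OOP system.
\newenvironment{displayblock}[1]
  {\par\medskip\noindent\textbf{#1}\begin{quote}}
  {\end{quote}\medskip}
\newenvironment{theoremblock}
  {\begin{displayblock}{Theorem}\itshape}   % specializes the base
  {\end{displayblock}}
\newenvironment{remarkblock}
  {\begin{displayblock}{Remark}}            % reuses the same base
  {\end{displayblock}}
```

A change to the base environment then propagates to every environment
built on it, which is precisely the kind of reuse the OOP approach aims
to make systematic.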

The afternoon started with the TUG Business meeting, where decisions
taken by the TUG Board of Directors for the coming year were
presented, explained, and discussed. These decisions will be
presented elsewhere.
The Knuth Scholar was also announced:
Shelly Lee Ames of the University of Manitoba,
where she works for the Canadian Mathematical
Society (Soci\'et\'e math\'ematique du Canada) preparing
formats and proofing all papers published by the society in their
Journal and Bulletin.
This involves handling submissions in many different flavours of \TeX,
and initiating the development of macros to implement their formatting
requirements.

After the meeting Yannis Haralambous, in a companion paper to Plaice's
on the Omega project, showed a few applications for fully
diacriticized scholarly Greek, vowelized Arabic, properly kerned
Khmer, and for Adobe's calligraphic Poetica font.  Then Phil Taylor
reported progress on the NTS project. This project was started in 1992
by the German-speaking \TeX{} users' group, DANTE, and has as its main
task the development of a successor to Donald Knuth's now frozen
\TeX{} system. In fact two paths, one evolutionary, with e-\TeX{}, and
one more revolutionary, with NTS (New Typesetting System) are at
present being investigated. As the \TeX{} typesetting system consists
of a rather complex set of tools, the group proposes to define a
``canonical \TeX{} kit'', which is assumed to be present at every
installation.  The status of the e-\TeX{} project was reviewed by
Peter Breitenlohner.  At present this involves improved control over tracing,
additional math delimiters, improved access to the current interaction
mode, checking for the existence of a control sequence, alternative
ligature/kerning, extensions to the set of valid prefixes for macro
definitions (\eg, \Lcs{protect} and \Lcs{bind}), and support for colour.
Finally it was Ji\v{r}\'{\i} Zlatu\v{s}ka who told us about the team's
present thinking on the more ambitious NTS project.  He sees
essentially a two-phase approach, namely first a re-implementation in
a rapid-prototype language such as CLOS or Prolog, so that one can
experiment easily with various modular representations of the present
\TeX{} engine. Using this model one will try and identify functionally
independent units, for which various alternate ways of extensions can
then be proposed and tested.  Based on the knowledge gained in phase
one, the second phase will then see the step-by-step re-implementation
of the functional units in a more efficient and widely available
programming language, such as C++.  Initially only e-\TeX{} will be
implemented in NTS, but later on alternate algorithms can be included
to perform some of the typesetting tasks better.  The long-term aim of
NTS is thus to make maximum use of the phase-1 test bed to investigate
and evaluate possible approaches to overcome various of \TeX{}'s
perceived shortcomings.  A lively discussion followed these
presentations, and then the participants went off into one of the
three BoF sessions. The first was on WWW servers, coordinated by Peter
Flynn and Norman Walsh, where the latter discussed at some length his
paper describing his WWW interface to the CTAN archive, which provides
an attractive means to combine different views of the archive into a
single view.  Marko Grobelnik coordinated a BoF on database
publishing, while Oren Patashnik discussed extensions to \BibTeX{} in
his BoF.  At the Banquet, which started at 19:30, all participants had
one last chance together with their families to socialize, and enjoy
the good food, wine (some had original 16 year old cask Caol Ila malt
whisky\ldots), and the music.

\section{Publishing and design}
It was a little difficult for some of the participants 
to get up on time for the last morning.
Yannis Haralambous and Maurice Laugier discussed some of the tools
used at the Louis-Jean Printing house in Gap (France) to typeset
books. The Trad\TeX{}-SGML program was introduced. It is used
to convert \TeX{} and \LaTeX{} files into SGML. The tool is presently
implemented on a Macintosh and is in real-life production. eDVItor is
a program that allows interactive editing of a {\tt.dvi} file, using a
mouse-driven cursor to move blocks of text, insert illustrations,
change colours, etc. It runs on both DOS and Macs.
Michael Downes stated that the American Mathematical Society produces
almost all its publications (a couple of dozen journals and book
series) with \TeX{} using AMS-developed macro packages. About two
years ago a major overhaul of the macro packages was decided upon, one
of the goals being to ease revisions to the visual design.
In this new approach the design specifications are kept outside of the
\TeX{} code in an element specification template that is relatively easy to
understand and modify by traditional book designers.
Alan Hoenig then showed us some examples of
visually pleasing page layouts, which most \TeX{} users thought
possible only with PageMaker or QuarkXPress.
His secret is to turn off some of \TeX{}'s features, such as vertical
glue or tall characters, and to assume that all lines have the same
height and depth. This arguably restrictive set of conditions still
allows one to typeset probably at least 99\% of all printed material
in the world. And, indeed, the model is not as limited as it seems,
since with some work one can include section heads, display material,
and so on.
Just before the coffee break, Malcolm Clark presented Jonathan Fine's
paper in his absence. He described first some historic aspects of the
\TeX{} typesetting program, leading to a discussion of strategies for
possible future extensions. He strongly believes that with improved
macro packages and {\tt.dvi} processors many of the present problems will
be solved. Also imposing a more rigorous syntax for input
compuscripts should help. This will not only allow the source to be
used with a  possible future successor of \TeX{}, but also ensure
re-use with other, not-necessarily typesetting, applications.

Marko Grobelnik presented a \TeX-based system developed in Slovenia
for publishing dictionaries, lexicons and encyclopedias. The \TeX{}
macros are augmented with many specially written editing tools to
assist the editor, who looks after the contents and form of the
publications. The final talk was by Henry Baragar, who showed how
special-purpose (``small'') languages can be used for documenting
knowledge bases, so that \LaTeX{} can be augmented with added
expressiveness for specific tasks. He introduced the language TESLA,
which allows expert-system analysts to mark up groups of rules as
tables so that the logical structure of the database becomes clear.
The system generates \LaTeX{} tables, which can be typeset in tabular
form for use by expert-system programmers, or as text for use by
domain experts, thus yielding presentation forms adapted to the
targeted audience.

The conference was brought to a close by Christina Thiele, but not
before Mimi Burbank, coordinator of next year's TUG meeting, gave us a
short outline of plans for the 1995 meeting, to be held during
the week of July 24--28th 1995 in the Trade Winds Hotel in Florida.
It was also the occasion to honour the winners of the trophies for
the best papers, namely Alan Hoenig, Yannis Haralambous and Tom
Rokicki, who were presented with EPODD CD-ROMs by Nelson Beebe.

\section{Conclusion}
I think that I can safely suppose that at the end of our five-day
conference all participants left the University of California, Santa
Barbara Campus satisfied to have taken part in this unique event. Even
though most of us Internet addicts were a little surprised to find
only very limited access to the Internet, this fact might indeed have
been more of a blessing than a shortcoming, since in this way we were
not distracted by having to answer e-mail or otherwise respond to
``urgent requests'' from home. In any case it certainly benefitted
contacts between the participants and hence contributed to the
friendly atmosphere.  Another positive factor was the hard work of
John Berlin and Janet Sullivan of the TUG office, who did their best
almost 24 hours per day to help solve problems, or better, trying to
prevent them before they occurred. Their kindness and helpfulness were
truly appreciated by all those present. Thanks once again to John,
editor of \emph{The~TUGly~Telegraph} (and his partner in crime,
Malcolm Clark), which kept us informed of the latest conference news,
and to Katherine Butterfield, Suki Bhurji, and Wendy McKay for helping
with staffing the on-campus TUG office.
\end{Article}
\endinput
%%%
%%% omitted from Baskerville to save space
%%%
 (the town itself has a population of
86,000, while the county counts about 360,000 inhabitants). The town lies
about 150 km north of Los Angeles, and 530 km south of San Francisco.
The climate is sunny and temperately warm (average temperature is
about 12 degrees Celsius in December, and 20 in July). The architecture of
the town offers a unique blend of Chumash, Spanish, Mexican and
American heritage. State Street, Santa Barbara's main
and most famous street, whose lower lying part was rebuilt after the
1925 earthquake as a beautiful Spanish-style avenue, lined with trees,
plants, benches, and lamps, is the town's favourite shopping area.
At the end of State Street, Stearn's Wharf, built in 1872 and the
oldest operating wharf on the west coast, offers many restaurants, gift
and souvenir shops, wine tasting, a seafood market and other small
shops. From the wharf one has a marvelous view of the mountains, the
ocean and the yacht harbor. 

Santa Barbara also has an interesting history, and many of its
buildings and museums will give the visitor an overview of what happened
since 1542, when the first European, Juan Cabrillo, set foot in the
area.
But it was not until 1782 that the Spaniards came to stay, and also
around that time they established a military presidio and the now
famous Mission, established in 1786. Its beautiful setting, unique
twin bell towers and lovely facade have earned it the title ``Queen of
the Missions''.
Santa Barbara, heart of the American Riviera, also offers white-sand
beaches, whale watching, mountain biking, sailing, and many good
restaurants. 