Chapter 2 

The Technological Basis of Visualization

M. Firebaugh 

© Wm. C. Brown Communications, Inc. 

The purpose of [scientific] computing is insight, not numbers.
Richard Hamming

A number of technological developments and economic trends emerged in the early 1990s that resulted in the integration of visualization tools into capable multimedia workstations. In this chapter, we discuss the technology widely available in computer graphics laboratories and consider the economic and institutional forces that have shaped the computer graphics industry. An appreciation of these forces will help computer graphics students prepare themselves for this maturing industry.

The primary drivers of technological growth in computer graphics can be classified as:

All three areas are in a rapid state of flux, and to follow developments in the industry requires essentially a full-time effort reading journals and scanning trade magazines. Some of the most useful sources are listed in the Endnotes.

In addition to the purely technological developments listed above, there are a number of interesting cultural/institutional forces at work in shaping present and future graphics workstations. These include:

It should be stressed that no single one of these technological or institutional forces is responsible for the rapid development of visualization capability. Rather, we are observing a synergistic effect of all these factors.

The following example illustrates this synergy. The National Center for Supercomputer Applications (NCSA) of the University of Illinois at Urbana-Champaign has developed a network of supercomputers linked to Sun and Macintosh workstations for the use of researchers in scientific visualization.

Consider these four elements. The workstations provide excellent computer graphics and image processing capability for analyzing the computations performed on the supercomputer. In an hour, the supercomputer can number crunch complex simulations that would take days or weeks on the workstations. Porting the supercomputer results to the workstations requires a high-bandwidth network to handle the large data flow and, eventually, to interactively control the simulation experiment. Finally, the interactive analysis of graphical data requires sophisticated image processing software running on the workstations. As isolated components, none of these four elements is particularly noteworthy; working together, they constitute a system of exceptional power and capacity.

Increasing Performance/Price Ratios

The computer industry, and in particular the personal computer industry, seems to be governed by two immutable laws:

These observations stem from the author's personal experience. In 1980, he paid $2,700 for an Apple II Plus (1 MHz clock rate, 64 KB memory). This was followed four years later by a Zenith 158 (IBM XT clone) for the same price; it ran at 8 MHz and had 640 KB of memory. About the same time, the original Mac 128 (8 MHz, 128 KB) was purchased for almost exactly the same price. After a series of upgrades, this initially weak machine evolved into a 4 MB Mac Plus running at 16 MHz.

Two years after this system enhancement, a new machine, the Mac IIcx, was added. This unit, with 1 MB of memory and a 16 MHz 32-bit 68030 processor, cost almost precisely what the original Apple II had nine years earlier. Even after adding a 16-inch color monitor, 7 MB of RAM, and a 200 MB hard drive, nine years of inflation left the system's cost in constant dollars equivalent to that of the original Apple II. The present Mac IIcx system has 48 times the speed, 125 times the RAM, 20 times the ROM operating system memory, and 1,200 times the disk storage capacity of the original Apple II Plus.

The next addition, a 33 MHz 486 system with 8 MB of memory, a 200 MB hard drive, and a 1024 x 768 resolution VGA screen, continues this trend. Such systems are available from a number of vendors at well under $3,000 in 1993 dollars.

Exponential Growth

What we are describing is a classic example of exponential growth in the ratio of the performance to the price. Whenever the doubling time of a quantity remains constant, the result is exponential growth in that quantity over time. To illustrate this growth, the performance trend of the CPU clock speed of typical personal computers is indicated schematically in Figure 2.1.
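The arithmetic behind constant-doubling-time growth can be sketched directly. The starting value and doubling time below are illustrative assumptions, not measured industry data:

```python
# Illustrative sketch of exponential growth with a constant doubling time.
# The 1 MHz starting clock and two-year doubling time are assumptions
# chosen for illustration only.

def grow(initial, doubling_time_years, years):
    """Value after `years` of growth with a fixed doubling time."""
    return initial * 2 ** (years / doubling_time_years)

# A 1 MHz clock doubling every two years:
for year in (0, 4, 8, 12):
    print(f"year {year:2d}: {grow(1.0, 2.0, year):6.1f} MHz")
```

Plotted on a logarithmic ordinate, as in Figure 2.1, this growth appears as a straight line whose slope is set by the doubling time.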

Figure 2.1
Growth of clock speed for PCs over the years. Note logarithmic scale on ordinate.  

Additional anecdotal evidence on the rapid growth of performance/price ratios emerged during the process of building the personal workstations used in the preparation of this book. Uncompressed color computer graphics images require vast amounts of storage. To accommodate color images, a 45 MB removable hard drive was added to the system. In the three years since the arrival of the $799 drive, prices have fallen to less than $400. The media for this system cost about $65 per 45 MB cartridge as this is being written and most likely will cost less than $1 per MB as this is being read. Such systems provide hard drive performance (25 ms average access time) at floppy drive prices.

Even the cost of floppy disk storage reflects exponential behavior. The original Apple II disks cost $5 for 0.140 MB, while high-density disks now cost $.50 for 1.4 MB, a performance/price increase of one hundred in ten years. CD-ROMs (compact disc read-only memory) provide GB capacities at a price of $.50/MB.

The area of graphics displays has shown similar improvements. The original Apple II Plus displayed 280 x 192 pixels in 8 colors. Standard Macs and PCs feature 640 x 480 pixels in 256 colors, and true color (24-bit) monitors are now available in 1024 x 768 pixels in 16 million colors. Color scanners capturing an 8 x 10 inch picture at 300 dots per inch (dpi) in true color generate an image with over 20 MB of information. Thus, the information content of graphical images has grown by a factor of more than a thousand in about ten years. This corresponds to a doubling time of almost exactly one year.
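The scanned-image figure quoted above follows from simple arithmetic, which can be checked directly (the function name is ours, not the book's):

```python
# Checking the scanned-image arithmetic from the text: an 8 x 10 inch
# picture scanned at 300 dpi in true (24-bit) color, i.e. 3 bytes/pixel.

def scan_size_bytes(width_in, height_in, dpi, bytes_per_pixel=3):
    """Uncompressed size of a scanned image in bytes."""
    return int(width_in * dpi) * int(height_in * dpi) * bytes_per_pixel

size = scan_size_bytes(8, 10, 300)      # 2400 x 3000 pixels, 3 bytes each
print(size, "bytes =", round(size / 2**20, 1), "MB")
```

The result, about 20.6 MB, confirms the "over 20 MB" figure in the text.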

The total throughput of a computer depends just as importantly on availability of high-speed RAM as it does on clock speed. In Figure 2.2, we show the trends in addressable memory capacity of standard personal computers over the last few years.

The list continues. If we plotted the number of MIPS (million instructions per second), MFLOPS (million floating point operations per second), the size of the resident ROM operating system, the storage capacity of hard drives per dollar, the resolution of output hardcopy devices, or most any other measure of performance of graphics workstations, the curves would all exhibit the same exponential growth in performance/price ratios. The only variation would occur in the doubling time (slope) of the curves.

Figure 2.2
The growth of addressable random access memory (RAM) for standard PCs over the years. Note logarithmic scale on ordinate.

Emerging Applications

A remarkable aspect of exponential growth in any area of technology is that, if the performance you want is not yet available, just wait, and it will be very shortly. Two applications from engineering design illustrate this point: computer-aided design (CAD) and finite element analysis (FEA). Both applications are heavily computation-bound and require high-resolution graphics interfaces. As a result, conventional wisdom long held that mainframes (or at least powerful minicomputers) would always be required for "serious work" in these areas.

Two of the real number crunching tasks of 3D CAD programs are the hidden line removal problem and the generation of realistic shading. Recently, a number of CAD products with these features have emerged for personal workstations. In Figure 2.3, the output of one such system is shown for the design of an automobile camshaft.

While the finite screen resolution is still apparent in Figure 2.3, the image resulting from the application of hidden line removal and shading algorithms is a great improvement over more conventional wire-frame models. Until recently, constructive solid geometry and shading features were available only on mainframe-based CAD systems costing in the $100,000 range.

Figure 2.3
Design for an automobile camshaft illustrating the use of solid modeling and shading.  

The second application that has recently migrated from mainframes to personal workstations is finite element analysis. FEA may be considered an extension of CAD in which material properties are added to the model and the resulting physical system is subjected to user-specified forcing functions and boundary conditions. The system's response to the applied forces is then computed and displayed graphically. The heart of an FEA program consists of algorithms for discretizing the model into simple polygons (for 2D models) or volume elements (for 3D models) and applying physical laws to generate a set of equations relating the elements to each other and to the forcing function. The FEA program then solves the resulting set of coupled differential equations and presents the user with a solution, typically in the form of displacements or electromagnetic potentials at each node of the finite element model.
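The discretize-assemble-solve pipeline just described can be sketched in miniature for a 1D chain of spring elements. The stiffness k, load F, and element count below are illustrative; real FEA packages such as COSMOS/M carry out the same steps with 2D and 3D elements:

```python
# Toy 1D analogue of the FEA pipeline: discretize a bar into spring
# elements, assemble the global stiffness matrix, apply a force and a
# boundary condition, and solve K u = f for the nodal displacements.

def assemble_stiffness(n_elements, k):
    """Global stiffness matrix for a chain of identical springs."""
    n = n_elements + 1
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elements):          # element e couples nodes e, e+1
        K[e][e] += k;     K[e][e + 1] -= k
        K[e + 1][e] -= k; K[e + 1][e + 1] += k
    return K

def solve(K, f):
    """Gaussian elimination (no pivoting; adequate for this SPD system)."""
    n = len(f)
    A = [row[:] + [f[i]] for i, row in enumerate(K)]
    for col in range(n):
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= m * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

# Bar fixed at node 0, end load F at the free end.
k, F, n_el = 100.0, 10.0, 4
K = assemble_stiffness(n_el, k)
K_red = [row[1:] for row in K[1:]]       # boundary condition: u0 = 0
f = [0.0] * n_el
f[-1] = F
u = solve(K_red, f)                      # displacement grows linearly: F*i/k
print(u)
```

Since each spring in the chain carries the full load F, the displacements grow linearly along the bar, which provides a handy check on the solver.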

Postprocessing modules are available for viewing the resulting field solution using several of the visualization modes described in the previous chapter, including contour maps, wire-frame topographic maps, pseudo-color images, and animation sequences of such representations. Postprocessors also provide the user with global problem solution parameters, often through integration over components of the model, including information such as the stored field energies and frequencies of the normal modes of vibration. Figure 2.4 illustrates a typical FEA problem, the displacement of a beam under an applied force.

Figure 2.4
Prototype finite element analysis problem: displacement of a beam subjected to the force F.

The advantages of porting applications such as FEA from mainframes to graphics workstations are the flexibility and interactivity obtained by local control. One might expect to pay a serious penalty in performance as the price of these advantages. However, benchmark tests indicate that personal workstations perform very well against larger computers. In Figure 2.5, the results of running the same FEA model with the same program on a variety of minicomputers and personal workstations are compared.

Figure 2.5
Benchmark test of COSMOS/M FEA program running on several computers. The model consisted of 1020 3D solid elements representing the junction of two mechanical pipes. 

The benchmark problem consisted of two intersecting pipes modeled by 1020 3D solid finite elements. The program COSMOS/M was run on a variety of computers, and some of the timing results are shown in Figure 2.5. All of the computers shown in Figure 2.5, except the VAX and Mac IIx, ran at 25 MHz and used math coprocessors whenever possible.

The most remarkable performance was turned in by a garden-variety 25 MHz 386 PC equipped with a 4-node, 25 MHz transputer. This machine ran the benchmark in 254 seconds compared to the 3452 seconds required for the VAX 750. This same problem required 22 seconds running on a CRAY XMP/416 with the ANSYS finite element system. Thus, a standard 386 PC equipped with a transputer parallel processor (system cost, $10K-$15K) performs finite element analysis at about one-tenth the speed of a multimillion dollar supercomputer. Since over 50 percent of the CPU time used on supercomputers is devoted to some kind of finite element analysis, this result has significant implications.
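The speed ratios quoted above follow directly from the benchmark times (note that the CRAY figure comes from the ANSYS run, not COSMOS/M, so the comparison is approximate):

```python
# Ratios implied by the benchmark times in the text (seconds for the
# 1020-element pipe-junction model; the CRAY time is for ANSYS).

times = {"VAX 750": 3452, "386 PC + transputer": 254, "CRAY XMP/416": 22}

vax_vs_pc = times["VAX 750"] / times["386 PC + transputer"]
pc_vs_cray = times["CRAY XMP/416"] / times["386 PC + transputer"]
print(f"transputer-equipped PC vs VAX 750: {vax_vs_pc:.1f}x faster")
print(f"transputer-equipped PC speed relative to CRAY: {pc_vs_cray:.2f}")
```

The second ratio, roughly 0.09, is the basis for the "one-tenth the speed of a supercomputer" claim.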

With this background in some of the cultural/institutional forces shaping the computer graphics marketplace, let us turn specifically to the hardware and software readily available in most computer graphics laboratories.

Hardware Systems

A modern integrated graphics workstation consists of one or more hardware components from each of the following categories:

The general architecture of a graphics workstation is diagrammed in Figure 2.6.

The general pattern of data flow in a graphics workstation deserves some comment. At the lowest level, graphical information may be generated by the stand-alone workstation, using the keyboard to type in a program which calls the built-in graphics capabilities of the workstation to display graphical output on the workstation monitor. The next higher level of integration involves using interactive devices such as the mouse or data tablet to control a graphics program (possibly a paint or CAD program) resident in the workstation, displaying the graphical results on the monitor and printing a copy on a dot matrix, inkjet, or laser printer. The third level of graphics system integration involves obtaining image information directly from an image transducer or storage device (scanner, video, or CD-ROM), transforming the image to solve the problem, and routing it to any of the output devices listed.

The highest level of graphical system integration involves embedding the workstation in a network which provides access to additional graphics processing resources such as file servers and supercomputers. File servers allow access by multiple workstations to common graphical databases. The role of the supercomputer is to provide additional processing power for solving problems infeasible on the workstation. The supercomputer serves as an analysis engine. Let's look at the properties and capabilities of workstations in a little more detail.

Graphics Workstations

We define a graphics workstation as a computer with associated high resolution imaging screen, keyboard, and pointer device, whose purpose is to process graphical information. By this definition, the SAGE air-defense system of the 1950s probably qualifies as the first large scale implementation of graphics workstations. The purpose of the SAGE system, a precursor of modern air traffic control systems, was to map radar signals onto a cathode ray tube (CRT) in order to represent the positions and velocities of the planes under observation.

The primary graphics workstations during the 1960s were vector display devices. Given the (x,y) locations of the two end points, these expensive terminals would draw precise, continuous lines on the face of the CRT. However, this period preceded the arrival of minicomputers and microprocessors, so the cost of a system capable of computing anything interesting and refreshing the screen fast enough to generate a stable image was in the $100K range.

A breakthrough in technology occurred in 1968 when Tektronix introduced the direct view storage tube CRT. This device used two independent electron guns, one to write the initial image on a finely meshed storage grid and the second to read the stored image and display it on the output screen. These devices had severe limitations, including: (i) selective object erase was impossible, ruling out animation; (ii) full-screen erase required the greater part of a second; (iii) the terminal screen displayed only one color--green; (iv) the early versions were dumb terminals--that is, they contained no independent computer and had to be used in a network containing a minicomputer or mainframe.

However, in spite of these weaknesses, Tektronix terminals had a number of attractive features and contributed greatly to progress in computer graphics.

The road to the present high-resolution, color-graphics workstation was paved by the introduction of personal microcomputers. These computers initially used ordinary B/W or color television monitors to display alphanumeric and graphic information. The breakthrough occurred when low-cost memory became available and the enormous advantages of bit-mapped displays were recognized. Bit-mapped graphics involves assigning the bits in a particular region of memory to corresponding pixels on the screen. Image display then simply involves mapping this portion of memory directly to the screen. A "1" in memory means "turn the pixel ON"; a "0" means "turn it OFF." By hardwiring this mapping process, great efficiency and speed may be obtained. This advance has led to steady progress towards greater resolution both spatially and chromatically. Because bit-mapped graphics has become the standard of the computer graphics industry, we will restrict our discussion of display devices to bit-mapped displays.
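The bit-mapped scheme just described can be sketched in a few lines. The tiny 16 x 4 "screen" and the helper names below are illustrative conventions, not anything from a real display driver:

```python
# Minimal sketch of a bit-mapped display: one bit of memory per screen
# pixel, so drawing is just setting bits, and "display hardware" is a
# routine that maps the memory region straight onto the screen.

WIDTH, HEIGHT = 16, 4
framebuffer = bytearray(WIDTH * HEIGHT // 8)     # 1 bit per pixel

def set_pixel(x, y, on=True):
    index, bit = divmod(y * WIDTH + x, 8)
    if on:
        framebuffer[index] |= 1 << (7 - bit)     # "1" means pixel ON
    else:
        framebuffer[index] &= ~(1 << (7 - bit))  # "0" means pixel OFF

def scan_out():
    """What the video hardware does each refresh: memory -> picture."""
    rows = []
    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            index, bit = divmod(y * WIDTH + x, 8)
            row += "#" if framebuffer[index] & (1 << (7 - bit)) else "."
        rows.append(row)
    return "\n".join(rows)

for x in range(WIDTH):                           # draw a horizontal line
    set_pixel(x, 1)
print(scan_out())
```

Hardwiring the `scan_out` step is exactly what gives bit-mapped displays their speed: the processor only touches memory, and the display circuitry does the mapping.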

Figure 2.6
Graphics Workstation Architecture. The top line indicates the range of input devices. The bottom line shows possible output devices. The supercomputer and network provide additional resources for computationally intensive applications. 


Computer graphics is a unique area in computer science in which at least one aspect of the performance goal is well defined and already in our grasp. The goals of traditional computer design have involved a never-ending search for faster computation, larger word size, and larger memories. In computer graphics, on the other hand, when the screen or some other output device produces a "picture perfect" image, we have reached our goal. Present displays of 1,000 x 1,000 pixels in sixteen-million colors come very close to achieving that goal.

Display Devices

Although the cathode ray tube (CRT) is the oldest of all display devices, it is still the best and most cost-effective. The only serious competitors are plasma-display devices and liquid crystal display devices (LCDs). Both of these have a number of features that make them very attractive as graphical output devices, and we shall return to them shortly. Meanwhile, consider the operation of a typical CRT. The mechanism for displaying information on a CRT screen is shown in Figure 2.7.

The basic elements of a CRT are an electron gun, a deflection system, and a fluorescent screen which emits light when it is struck by electrons. Two kinds of deflection systems are commonly used--magnetic and electrostatic. Magnetic deflection systems use two current-carrying coils to create magnetic fields that force the electrons either up and down or right and left. More easily understood are electrostatic systems in which deflection plates are charged by external voltage drivers.

By applying the proper voltage waveforms to the horizontal and vertical deflection plates (in principle, a sawtooth to the horizontal plates and a stairstep wave to the vertical plates), the electron beam traces out a pattern on the screen as shown in Figure 2.8. On the path AB in Figure 2.8, the voltage on the vertical deflection plates remains constant while the horizontal voltage is a linear ramp.

On the flyback path, BC, a blanking signal cuts off the beam, while the vertical deflection voltage is incremented by a step. The cycle repeats, from C to D and so on down to E, where another blanking signal turns off the beam on the flyback path EA. This whole process must occur in under 1/30 second to avoid irritating flicker. Typical frequencies for the complete screen refresh cycle are 60-70 Hz.
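The timing constraint can be made concrete using a 512 x 342 pixel display (the B/W Macintosh format) refreshed at 60 Hz. Retrace time is neglected in this sketch, so the real per-pixel budget is somewhat tighter:

```python
# Rough timing budget for the raster scan described above:
# 342 scan lines of 512 pixels, refreshed 60 times per second.

lines, pixels_per_line, refresh_hz = 342, 512, 60

frame_time = 1.0 / refresh_hz            # must beat the 1/30 s flicker limit
line_time = frame_time / lines
pixel_time = line_time / pixels_per_line

print(f"frame: {frame_time * 1e3:.1f} ms, line: {line_time * 1e6:.1f} us, "
      f"pixel: {pixel_time * 1e9:.0f} ns")
```

The per-pixel budget of roughly 95 ns shows why the beam-control and video circuitry must be hardwired rather than driven pixel by pixel in software.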

Figure 2.7
Cathode Ray Tube. The basic elements are an electron gun for accelerating a beam of electrons, a deflection system for bending the beam, and a fluorescent screen that emits light when struck by the electron beam. 


On the B/W Macintosh (SE/30 and Classic), for example, the full-screen display consists of 342 horizontal scan lines, each composed of 512 pixels. The detailed specifications for these displays are given as:

Figure 2.8
Beam Pattern of a CRT Display. The electron beam follows the path ABCD…E once every 1/30 second or less. Dotted/dashed lines indicate a blanked beam; a solid line means the beam is active, plotting a pixel, either white or black.

On color monitors, the complexity is increased by the requirement of three independent electron guns and three independent phosphors at each pixel, one each for red, green, and blue. The structure of the electron guns, phosphors, and the color mask that controls the geometry is shown schematically in Figure 2.9.

As we show in the chapter discussing color, the proper mixture of red, green, and blue light can reproduce any color, according to the additive RGB model, and in practice the model works very well. A black pixel is generated by turning off all three beams at the electron gun; the three primary colors are generated by turning on a single gun; and intermediate hues are generated by mixing red, green, and blue in various proportions. To get yellow, we mix roughly equal intensities of red and green light, while violet is generated by mixing red and blue. By mixing approximately equal intensities of red, green, and blue light, white light is produced. You can verify this statement experimentally while sitting in front of a color terminal displaying a white screen by wetting your finger and lightly touching the screen. The droplets of liquid act as tiny magnifying glasses and sparkle brightly in red, green, and blue.
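These additive mixes can be written out directly. The 0-255 channel scale and the helper name below are illustrative conventions, not part of any particular display standard:

```python
# Sketch of additive (RGB) color mixing: light from the red, green, and
# blue phosphors simply adds, channel by channel, saturating at full
# intensity.

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

def add_light(*colors):
    """Superpose light sources; each channel saturates at 255."""
    return tuple(min(255, sum(c[i] for c in colors)) for i in range(3))

yellow = add_light(RED, GREEN)          # equal red + green
violet = add_light(RED, BLUE)           # equal red + blue
white  = add_light(RED, GREEN, BLUE)    # all three guns at full intensity
black  = add_light()                    # all guns off
print(yellow, violet, white, black)
```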

A second device that continues to capture an increasing share of the graphics display market is the LCD (liquid crystal display). The reason for this rapid growth is the improvement in the device's price/performance ratio, through both falling prices and greatly improved performance. The introduction of LCDs with color capability has accelerated this trend.

LCD displays are the standard output device for essentially all calculators and are found on many children's games. The heart of the device consists of a smectic liquid whose long-chain molecules exhibit polarization properties that can be controlled by an electric field. By segmenting the fluid into tiny cells (pixels), each of which can be switched electrically from transparent to opaque, we have the basis for a high-resolution graphics display. Black-and-white LCDs are now available with resolutions comparable to CRT monitors. Essentially all laptop and portable computers are equipped with LCDs as their display monitor.

Figure 2.9
Shadow mask color CRT. Color registration is achieved by use of a physical mask with holes or vertical slots through which each color beam is focused on its respective color phosphor. Note: the electron gun angles are exaggerated for purposes of illustration.  


The advantages of LCD displays include:

With attractive features such as these, what has kept LCDs from sweeping away the ancient CRT? There have been two serious drawbacks. The first is well on the way to being solved, and the second problem is slowly receding.

The final display device we discuss is the plasma display. This device had its origin in the Nixie tubes of Burroughs Corporation and was developed as a graphical output device by Donald Bitzer and Hiram Slottow of the PLATO Project of the University of Illinois. It has been marketed at various times by Owens-Illinois, Magnavox, Control Data, Fujitsu, Toshiba, and IBM. It shares many of the advantages and the disadvantages of LCDs and is becoming a standard display device on high-end portable computers.

A plasma display screen is essentially a square matrix of tiny neon glow lamps, each individually addressable. Neon glow lamps have an "L" shaped current-voltage characteristic curve similar to that shown in Figure 2.10.

Figure 2.10
Voltage-current characteristic curve for a neon glow lamp, the heart of the plasma display terminal. An applied voltage of > 120 V will flip an OFF state ON, while a voltage of ~90 V will maintain a cell in its present state.

In the region below 120 volts, the neon lamp is OFF, that is, nonconducting and not glowing. As the voltage is raised above the threshold of 120 volts, an autocatalytic discharge occurs and the lamp begins conducting and giving off light. A voltage of approximately 90 volts is then required to maintain conduction, and the discharge continues until the voltage is driven below this value.

The architecture of a plasma display panel is shown in Figure 2.11. By maintaining a voltage of about 3/4 of the threshold voltage, all OFF cells remain OFF and all ON cells remain ON. When a pixel needs to be turned on, the voltages on the two lines addressing its cell, e.g., B2, are both increased to just over one-half the threshold voltage, with opposite polarity. Thus, the voltage across cell B2 exceeds the threshold voltage and the cell turns ON. However, no other cell along line B or along line 2 receives more than the threshold voltage, so no other cells are turned on.
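This half-select addressing logic can be sketched as follows. The voltage model (each row and column line carries half the sustain voltage, with opposite polarity) and the exact numbers are illustrative:

```python
# Sketch of plasma-panel half-select addressing: sustain all cells at
# ~3/4 of the firing threshold, then drive one row and one column at
# just over half-threshold with opposite polarity, so only the
# addressed cell exceeds the threshold and fires.

THRESHOLD = 120.0                    # firing voltage for a cell
SUSTAIN = 90.0                       # ~3/4 threshold holds current states
HALF_SELECT = THRESHOLD / 2 + 2      # just over half threshold

def cell_voltage(row, col, sel_row, sel_col):
    """Voltage across cell (row, col) while (sel_row, sel_col) is written."""
    v_row = HALF_SELECT if row == sel_row else SUSTAIN / 2
    v_col = -HALF_SELECT if col == sel_col else -SUSTAIN / 2
    return v_row - v_col             # opposite-polarity drives add up

# Writing cell ("B", 2): only that cell sees more than THRESHOLD volts.
for cell in [("B", 2), ("B", 3), ("A", 2), ("A", 3)]:
    v = cell_voltage(*cell, "B", 2)
    print(cell, v, "fires" if v > THRESHOLD else "holds")
```

Cells sharing only the selected row or only the selected column see an elevated but still sub-threshold voltage, which is the whole point of the scheme.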

Some very attractive features characterize plasma display panels.


Figure 2.11
Plasma Display Panel. A clear plastic panel with neon-filled cells is addressed by the solid horizontal conductors and the vertical shaded conductors. 

There are two serious limitations of plasma displays.


The two main classes of microprocessors which have made high-quality graphics readily available are the "sixes" and the "eights," manufactured by Motorola and Intel, respectively. The genealogy of the sixes originated with the MOS 6502 (Apple II, Atari, Commodore 64) and includes the Motorola M6800, M68000, M68020, M68030, M68040, and PowerPC chips (Macintosh, NeXT, HP-Apollo, and Sun). The eights genealogy includes the I8088 (IBM PC and XT), the I80286 (IBM AT), and the I80386, I80486, and Pentium chips (IBM PS/2 and clones).

For purposes of developing some understanding of the evolution of microcomputers and their various strengths and weaknesses as graphics systems, the following classification system may be helpful.

In this text, we will confine ourselves primarily to the discussion of Level 1, 2, and 3 systems because of their performance/price advantage and value as instructional tools. Level 4 and 5 systems provide exceptionally powerful graphics workstations, but their improved performance comes at a price ranging from a factor of two to more than ten times that of Level 1-3 systems. Level 1 systems are available in the $0.5K - $1.5K price range and are capable of carrying out most of the algorithmic computer graphics of the first two sections of the book. Level 2 and 3 systems cost $1.5K - $15K and are capable of all algorithmic calculations plus the image processing and higher-level graphics applications in the rest of the book. The final advantage of orienting the text toward Level 1-3 systems is their plentiful supply--most laboratories are well stocked with PCs or Macs or both.

Another consideration favors Level 2-3 machines: new technologies are becoming available to enhance their performance as graphics workstations, bringing them into direct competition with Level 4 and 5 machines. These technologies include special graphics accelerator cards and networking systems for harnessing workstations as parallel graphics processors. Sluggish graphics display behavior is a particularly serious problem for machines like the Macintosh, in which a single microprocessor CPU must handle all interrupts and graphical display tasks in addition to its main computing assignments. Application Specific Integrated Circuits (ASICs), such as the Texas Instruments TI34010, are designed specifically to speed the processing of graphical information for display. The Radius Quickcad board for the Mac II is an example of the improvement achieved through the use of special graphics cards: it reduced the time required to display an AutoCAD file of 110,000 vectors from 83 seconds to three seconds.

Table 2.1
Levels of Graphics Workstations

Level  Microprocessor (No. bits)   Example systems       Clock rate     RAM
1      I8088, 80286 (8-16)         IBM PC, XT, AT        4.8 - 16 MHz   640 KB
       M68000 (16)                 Mac Plus, Classic     7.8 MHz        1 MB
2      I80386 (16-32)              IBM PS/2 line         16 - 33 MHz    4 - 16 MB
       M68030 (16-32)              Mac II series
                                   Sun, Apollo
3      I486 (32)                   IBM PS/2, clones      25 - 66 MHz    8 - 64 MB
       M68040 (32)                 Mac Quadras           [25 MIPS]
                                   HP-Apollo 9000
4      Sparc RISC                  Sun Sparcstation      25 - 66 MHz    8 - 256 MB
       M88000 RISC                 SGI Iris Indigo
                                   IBM 6000
5      Proprietary ASIC            Stardent 3000         [128 MIPS]     32 MB
                                   IBM 6090              [85 MIPS]      32 MB
                                   SGI Iris Crimson                     16 - 256 MB


Clever networking technology is available to convert unused machines into parallel graphics processors. One particularly impressive example is NetRenderMan, a UNIX-based workstation enhancement of MacRenderMan, the high-quality rendering system we study in the chapter on realistic rendering of 3D graphics. In one test, a rendering task that took 35 minutes on a Macintosh Quadra using MacRenderMan was completed in four minutes and thirty seconds using NetRenderMan in a network containing the Quadra, a Sun Sparcstation, and an Iris Indigo workstation.

A final note on Level 2-3 machines--the richness and variety of software available for Level 1-3 machines far surpasses that available on Level 4-5 machines. For the same functionality, the price of software running on Level 1-3 machines is typically one-half to one-tenth that of the Level 4-5 equivalent. In particular, in a multi-station instructional environment, the software price advantage may outweigh the hardware price advantage as the critical consideration.

Figure 2.12
A Zenith 158 Computer. The system contains 640 KB of RAM and runs at 4.77 or 8 MHz. It has one 5.25" floppy, a 40 MB hard drive (segmented as two 20-MB partitions), a NEC MultiSync EGA monitor, and a PC mouse. (Video image captured with Computer Eyes.)


Level 1 Example - I8088 System

These machines, when equipped with standard EGA or VGA color adapters and monitors, provide medium resolution color graphics capabilities at a low price. A typical setup built around a Zenith 158 XT with NEC MultiSync monitor and a PC Mouse is shown in Figure 2.12.

The chief drawbacks of 8088-based systems are:

In spite of these handicaps, some very elegant application programs have been developed for PCs, XTs, and their clones. Many of these problems have been overcome in the M68000-based Macintosh level-1 systems.

M68000-Based Machines

By moving to a 16-bit memory bus and a 32-bit register structure, these machines achieve a factor of two in processing power over the 8/16-bit 8088-based machines. In addition, the Macintosh Classic implementation runs at 7.8 MHz, nearly another factor of two improvement over the 4.77 MHz PC. However, this theoretical factor of four in speed is not realized in practice because of the heavy workload assigned to the Mac's 68000 processor.

The chief advantage of the Macintosh implementation for graphics applications is in the firmware (ROM). The Macintosh User Interface Toolbox firmware consists of the following tools:

The Toolbox provides high-level functions and procedures for performing complex tasks. For instance, the task of opening a window which lists all files available in a folder and the control buttons necessary for selecting and opening a given file from the list is performed by a single function call. In addition to saving programmers much time in building such tools from scratch, the toolbox routines ensure familiarity and consistency of application programs for the user.

The Toolbox tools of most interest for users of this text are the QuickDraw routines. QuickDraw tools are provided to specify graphical objects such as rectangles, circles, and polygons; graphical attributes such as cursor, pen, and text characteristics and transfer modes; and drawing environments, with GrafPort routines for creating and manipulating independent graphical windows. There are, for instance, five routines for handling the cursor, fifteen for manipulating GrafPorts, thirteen for pen and line drawing, twelve for text drawing, five each for operating on rectangles, ovals, arcs, and rounded-corner rectangles, and so on.

The key to the effectiveness of the toolbox paradigm is that all of these tools are built into the system at the hardware level through firmware. As we will see shortly, they permit quite sophisticated graphics to be generated by short and elegant programs. In the chapter on designing GUIs, the features of graphical toolkits are discussed in detail and several application programs are developed using them.

What are the problems with the Macintosh Plus/Classic?

Even with these handicaps, the Mac Plus/SE/Classic line of computers has played an important role in computer graphics as the first low-cost, widely available system supporting a windows/icons/menus/pointer operating system. The popularity and acceptance it won for the graphical user interface (GUI) concept may be its single most important contribution.

Level 2 Examples

Next consider the two leading classes of Level 2 systems. By solving most of the problems of Level 1 systems, these machines have demonstrated that personal computers can serve as serious graphics workstations.

I80386-Based Machines

IBM and Compaq introduced the fast and capable Intel 80386 line of computers, and many companies have followed suit. These machines have broken the 640 KB memory limit of earlier PCs and support a wealth of B/W and color monitors that match the requirements of many user application programs. The addition of Microsoft Windows or the OS/2 Presentation Manager operating system makes the 80386 computer a credible graphics workstation.

Although I80386 machines offer considerable computing power per dollar invested, they are not without problems. These include:

Although the problems presented here make the development of application programs more complex for developers and the selection of system components more problematic for users, there are real advantages to open architectures. The flood of application programs running under Windows demonstrates the potential of the 386 machines as graphics workstations.

M68030-Based Machines

The Motorola 68030 has become a popular microprocessor and is used in HP, Apollo, Sun, and NeXT workstations as well as several models of the Macintosh. It is a full 32-bit processor that runs at 16, 25, 40, and even 50 MHz. Next we consider the Mac IIcx, IIsi, and IIci line as examples of 68030-based machines.

The whole Macintosh II line has returned to the open architecture pioneered by the Apple II and adopted by the IBM PC. As with the PC, the CPU comes in a chassis in which a hard drive, additional memory, and a variety of peripheral cards may be installed. Cards are available for such functions as color monitor drivers, video frame grabbers, and experimental control modules.

The Mac IIcx-IIci line supports all of the toolbox routines of the Mac Plus, plus additional Color QuickDraw functions to support color graphics manipulation. The author's Mac IIcx workstation is shown in Figure 2.13.

Figure 2.13
Macintosh IIcx workstation with peripherals. (Video image captured with Computer Eyes.)


The components of this workstation consist of the following elements.

This system has all of the elements of the general graphics workstation shown in Figure 2.6 except the data tablet and video recorder. The digital scanner, dot matrix printer, and laser printer are not shown but are available over a network. A modem provides access to a VAX minicomputer and Cray supercomputers.

The principal limitation of the Mac IIcx-IIci line of workstations is speed--or lack thereof. Because the 16-25 MHz processor must handle all interrupts and I/O as well as computation, speed of processing becomes a problem for applications such as CAD and finite element analysis. One solution to this problem is to add 40-50 MHz accelerator boards based on both M68030 and M68040 processors. With or without accelerators, the Mac IIcx, IIsi, and IIci line provides very capable and flexible workstations. All of the graphics in this text were generated or reproduced by this computer graphics system.

Level 3 Machines

Machines built around the i486 and M68040 resemble their lower-model-number counterparts but have accelerated processing capability. This comes primarily from ingenious caching schemes that fetch and store locally those items from memory on which the processor will operate in the immediate future. Caching helps overcome one of the big bottlenecks in sequential von Neumann machines--the fetching and storing of information to and from memory.
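The payoff of caching comes from locality of reference: programs tend to re-use the same few memory items in the near future. The toy model below (a deliberate simplification -- the real i486 and M68040 on-chip caches are set-associative with multi-byte lines, not one-word direct-mapped slots) shows how a tight loop converts almost all memory accesses into fast local hits:

```python
# Toy direct-mapped cache: each address maps to exactly one slot
# (address mod n_slots). A hit avoids a slow fetch from main memory.

class DirectMappedCache:
    def __init__(self, n_slots):
        self.n_slots = n_slots
        self.tags = [None] * n_slots   # which address each slot holds
        self.hits = 0
        self.misses = 0

    def access(self, address):
        slot = address % self.n_slots
        if self.tags[slot] == address:
            self.hits += 1             # found locally: fast
        else:
            self.misses += 1           # slow memory fetch, then cache it
            self.tags[slot] = address

cache = DirectMappedCache(8)
# A loop re-reading the same three addresses -- exactly the locality
# that caching exploits: 3 cold misses, then every access is a hit.
for _ in range(10):
    for addr in (100, 101, 102):
        cache.access(addr)
```

After the first pass through the loop, all 27 remaining accesses hit in the cache and never touch main memory.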

For purposes of this text we consider the i486 and M68040 as high-speed, downwardly compatible versions of the I80386 and M68030, respectively. They represent the state-of-the-art processors for the current generation of standard personal workstations. The author's 486/33 machine is shown in Figure 2.14.

There are a number of innovations which will greatly enhance the power available on graphics workstations. These include:


Figure 2.14
Gateway 2000/486/33C Computer. The system contains 8 MB of RAM and runs at 33 MHz. It has one 5.25" floppy, one 3.5" floppy, a 200 MB hard drive, a 1024 × 768 Super VGA monitor, and a Microsoft mouse. (Figure scanned from a photograph using Microtek 300G.)

Input Devices

Every stand-alone graphics workstation has at least the following two input devices: a pointing device and a keyboard. In addition, it is essential to have a sizable hard drive for intermediate storage of images and image processing programs. In order to build an integrated graphics workstation, however, additional graphical input and output devices are desirable. That is, at least some components from the top and bottom row of Figure 2.6 must be available to the workstation for capturing and exporting images. Below we consider standard input devices in detail and discuss briefly additional transducers for capturing images.

Figure 2.15
Three mice: PC Mouse on left, Apple Mouse, old style, in center, and Apple Mouse, new style, on right. (Video image captured with Computer Eyes.)



Pointing devices have evolved from light pens and joysticks to data tablets, track balls, and touch-sensitive screens. Except for certain specialized applications such as cartography and some areas of CAD requiring data tablets, the pointing device of choice for graphics workstations is the mouse. Invented by Douglas Engelbart at the Stanford Research Institute and refined at Xerox PARC, the mouse has the sensitivity, accuracy, and intuitiveness to make it an ideal pointing device. In Figure 2.15 we show three different mice. The mouse on the right is, in fact, alive and controlling the shutter on the video camera which took its picture.

Mice can be classified by two modes of operation: optical and mechanical. The PC Mouse, on the left in Figure 2.15, works in the optical mode with a reflective pad beneath it. The pad has a square grid of lines ruled at 1 mm intervals imposed on its reflective surface; the horizontally ruled lines are of a different color (and hence reflectivity) from the vertically ruled lines. Mounted internally are an LED light source and a lens that focuses light reflected from the ruled mouse pad, via an internal mirror, onto a photodetector.

Figure 2.16 shows the physical arrangement of the active elements of the PC Mouse. As the mouse is moved, a series of pulses is generated as the alternating silvered and darker colored bands reflect the beam into the detector. By appropriately decoding the train of pulses generated, the computer can interpret the mouse motion as up, down, right, or left.

The advantage of the optical PC Mouse is that it has no moving parts to wear out or gum up. The only disadvantages are that it must be used with the reflective mouse pad, and the system has a fairly coarse resolution.

Figure 2.16
Internal construction of optical mouse. As the mouse moves right or left, the intensity of the reflected light is modulated by the ruled mirror mouse pad, and the photo-diode produces a train of square waves.


The two mice on the right of Figure 2.15 work in a coupled mechanical/optical mode. A weighted rubber ball protrudes slightly from the bottom and rolls as the mouse is moved on any surface. Inside the mouse, the ball bears against two mutually perpendicular rollers that drive shafts on which slotted shutter disks are mounted, as shown in Figure 2.17. On one side of each disk are mounted two LED light sources, and opposite them on the other side are two photodetectors. As the mouse ball rolls, the shafts turn, rotating the slotted shutters and producing trains of square-wave voltages from the photodetectors. By properly interpreting the number and relative phases of the four wave trains generated by the mouse, the computer can compute the direction and distance the mouse has moved.

The advantage of the mechanical mouse is that it can be used on almost any smooth surface and has a very smooth, precise action. The disadvantage is that the internal rollers tend to get dirty and behave erratically. As a mechanical device it is more prone to problems than purely optical mice.

Figure 2.17
Mechanical mouse assembly. As the mouse is moved, the rubber ball protruding from the bottom turns, causing one or both contact rollers to turn. As they turn the shaft on the optical shutter detector, a train of square electrical pulses is generated which can be decoded to read motion.


A consensus is evolving among users of graphics workstations on the value of extended keyboards. Keyboards with 101 or 105 keys provide function keys in which complex but repetitive operations may be encoded. The first six function keys on the keyboard in Figure 2.13, for instance, are encoded with the operations undo, cut, copy, paste, information, and select all. Many second-source suppliers are now producing keyboards with more keys and functions than the original equipment.

Hard drives

One or more large hard drives are essential for any workstation on which serious computer graphics work is contemplated. The COSMOS/M finite element program specifies that at least 8 MB of hard disk space must be available for loading the program. B/W video images, such as those in Figures 2.12, 2.13, and 2.15, require 300 KB - 1.5 MB for storage. A workstation performing image processing operations will require 30 MB - 50 MB for system and application programs. Since it is useful to have at least that much disk space available for temporary storage of images, a hard drive of at least 100 MB - 300 MB is recommended.
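These storage figures follow directly from image dimensions and bit depth. A quick calculation (assuming an uncompressed 8-bit grayscale frame with no file-format overhead; the ranges quoted above reflect varying formats and resolutions):

```python
# Uncompressed image size = width x height x (bits per pixel / 8).

def image_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

# A 640 x 480 frame at 8 bits/pixel: 307,200 bytes, about 300 KB.
frame = image_bytes(640, 480, 8)

# Fifty such images held for temporary storage: about 15 MB,
# which is why 30-50 MB of free space disappears quickly.
working_set = 50 * frame
```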

Figure 2.13 illustrates an optimal configuration of graphics workstation hard drives. The 200 MB internal hard drive contains all of the basic systems, languages, graphics, and word processing tools. The removable drives provide individualized cassettes for specialized programs and their associated data files. A second important function of the removable drives is rapid backup of the internal drive, which is performed at hard disk I/O speeds.

Digital Scanners

One of the most useful graphics input devices is the digital scanner. The two main applications of digital scanners are scanning text for conversion into ASCII character files (i.e., reading text) and capturing images (paintings, drawings, photographs, and so on) for subsequent use in advertisements, documents, papers, and books.

Scanners are available in both B/W and color with spatial resolution of 200 dpi - 600 dpi and color resolution from 1 bit - 24 bits. Several models are now available that offer 24-bit, 300 dpi resolution at a price comparable to B/W scanners.
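The jump from 1-bit B/W to 24-bit color multiplies data volume dramatically. Assuming a full 8.5 × 11 inch page scanned edge to edge with no compression (a simplified estimate for illustration):

```python
# Scanned-page size = (dpi * width) x (dpi * height) pixels x bits/pixel.

def page_bytes(dpi, width_in, height_in, bits_per_pixel):
    pixels = int(dpi * width_in) * int(dpi * height_in)
    return pixels * bits_per_pixel // 8

# 300 dpi, 8.5 x 11 inch page = 2550 x 3300 = 8,415,000 pixels.
one_bit = page_bytes(300, 8.5, 11, 1)    # about 1 MB for 1-bit B/W
color = page_bytes(300, 8.5, 11, 24)     # about 25 MB for 24-bit color
```

A single uncompressed 24-bit page can therefore fill a quarter of the 100 MB hard drive recommended earlier, which explains the appetite of scanning applications for disk space.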

Video Scanners and Digitizers

The marriage of video media with computer image processing has spawned a whole new generation of multimedia technology. Sources of graphical information in a video format include:

Video input devices typically operate in one of two distinct modes: frame grabbers and scanning digitizers. Frame grabbers capture whole video frames and save them for later image processing. Some frame grabbers can even capture and store sequential frames, turning the workstation, in effect, into a video recorder for short sequences. The Moonraker system, for instance, can capture and play back video frames at 30 frames per second with a resolution of 645 x 484 pixels in 8-bit color. At 15 fps, it can display up to four separate windows from four distinct video sources simultaneously. Since frame grabbers must transfer large amounts of information rapidly, they are typically packaged as boards plugged into the internal bus structure which provides for parallel data flow.
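The need for a parallel bus connection is easy to see from the raw data rate. For the frame size quoted above, and assuming one byte per pixel for 8-bit color:

```python
# Sustained data rate = width x height x bytes/pixel x frames/second.

def video_rate_bytes_per_sec(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps

# 645 x 484 pixels, 1 byte/pixel, 30 frames/second:
rate = video_rate_bytes_per_sec(645, 484, 1, 30)
# About 9.4 million bytes per second -- far beyond what a serial
# port of the era could sustain, hence the plug-in bus board.
```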

Scanning digitizers such as Computer Eyes shown in Figure 2.14 operate on a somewhat different principle. The standard NTSC (National Television System Committee) composite video signal consists of 480 horizontal scan lines. A scanning digitizer samples each of the scan lines at a number of evenly spaced intervals along it (640 for Computer Eyes), sending the digitized values to the computer which gradually builds up a complete video image. The process takes about 20 seconds to scan and generate a complete image. At each of the 307,200 sample points the analog video signal is digitized to 8-bits of precision. Control of the scanning process is exercised by a program resident in the workstation.
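The scanning digitizer's figures can be checked directly. Assuming one byte per 8-bit sample (a straightforward back-of-the-envelope calculation from the numbers in the text):

```python
# A scanning digitizer samples each NTSC scan line at fixed intervals.
lines = 480             # NTSC visible scan lines
samples_per_line = 640  # sampling intervals per line (Computer Eyes)
scan_seconds = 20       # approximate time for a complete scan

total_samples = lines * samples_per_line   # 307,200 sample points
bytes_per_image = total_samples            # one byte per 8-bit sample
rate = bytes_per_image // scan_seconds     # roughly 15 KB per second
```

The contrast with the frame grabber is the point: a scanning digitizer trickles the same image to the computer over tens of seconds at rates a serial connection can handle, rather than in one thirtieth of a second over the bus.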


Because of their large storage capacity, CD ROMs have become a popular medium for archiving graphics images. A number of public domain CD ROMs are available which contain a vast amount of graphical data. Commercial clip art disks are marketed specifically for artists and desktop publishers. Finally, some very significant scientific visualization images are available on CD ROM.

For the purposes of this book, we consider CD ROMs as a sizable resource of prepackaged images that are available for downloading into the workstation, transferring to other media (floppies and hard drives), integrating into new images, and providing raw data for the many image processing programs. For instance, the detailed workstation subimage in Figure 2.6 was downloaded from a public domain disk and edited by the addition of input devices, output devices, and a supercomputer by the program Canvas to produce the final figure. Archival graphics libraries such as this are a valuable resource to graphics artists and designers.

The specifications and features of the Toshiba TXM-3201-A1-Mac include:

The last two features add another pleasant medium to the multimedia workstation--high quality stereo sound. When the user is not busy examining graphical data from the CD ROM, she/he can eject the disk, select a musical CD, and play it uninterrupted in background mode while computing on the workstation with no degradation in either musical quality or computational performance.

The only two problems with CD ROMs are their relatively slow speed compared to magnetic media and the "read only" nature of the media. Although the read only nature of CD ROMs restricts the range of applications, it does provide iron-clad security for information resources such as encyclopedias, the collected works of William Shakespeare, and Roget's Thesaurus.
