The evolution of computer displays

We've come a long way since the days of blinking lights and teletypes. Ars …

Current computer graphics are fairly well known and understood. But how did we get here? The evolution of computer graphics is intertwined with textual display, and it is difficult to consider the two separately.

An old saying has it that a picture is worth a thousand words. The exact quantification of the value of imagery versus text appears to vary somewhat with subject matter, and is probably better left to psychologists and social scientists. But there is little question of the kernel of truth in the saying, and it has been a driver of computer architecture for many years.

Computer graphics are taken for granted today. But it has been a long and painful struggle, with hardware rarely keeping up with the demand for better images. Text in English is built from a relatively small set of characters; the same is not true of images, which are computationally intensive and always seem to consume as much speed and memory as is available. But demand was high enough that early computer graphics could be fairly crude and still find eager users.

From blinking lights to plotters

Getting computers to type text was, in comparison, a simple process. Even in the early days of computing, there were existing devices which could translate a simple binary pattern into text. The military, for example, had used teletype machines for many years. Programming a computer to output the code for a textual character to a teletype machine is relatively simple.

Early computers used mostly flashing lights, with punched cards or paper tape for input and output. When there is only room for a few hundred instructions, you take input and output in its simplest form. But sometimes the available technology drives applications, and sometimes, the need to do something becomes a driver of seeking new technology.

The potential for getting a computer to produce a picture of the data wasn't missed. It would be more valuable if the picture were produced rapidly enough for the user to interact with it, but even producing an image of some sort that represented the computer's contents or recent calculations had its merits.

IBM was offering an output printer on its 701 model in 1952. It also offered a primitive graphics solution (the model 740 “Cathode Ray Tube Output Recorder”) in 1954.

The 740 demonstrates just how big the demand for graphics was, and how minimal a capability was considered meaningful. The 740 was a cathode ray tube to which a camera could be attached. Digital-to-analog converters drove the cathode ray tube, slowly drawing lines, based on the digital outputs of the computer. This method gradually came to be known as “vector graphics,” to distinguish it from other technologies.

Lines were plotted one point at a time. IBM justifiably bragged (at the time) that points were plotted at a rate of 8,000 per second, though the position of any given point was accurate only to within 3 percent, with good repeatability. You couldn't reliably scale measurements off the resulting image, but it had at least conceptual value.

Typically, the camera shutter was opened when the drawing started and closed when it finished. The film could then be developed and the image viewed later the same day. Needless to say, this tech wasn't suitable for playing video games. Of course, one could maintain a simpler image on the display by repeating the drawing instructions at a fairly high rate, but this used most or all of the CPU time and limited the detail which could be drawn.

The 740 had a sister display, the 780, which used a long-persistence phosphor (20 seconds). While not as precise, when run alongside the 740 it allowed the operator to verify that the image being produced was indeed the one desired. When you have to wait several hours for the film image to be developed, that's a good idea.

But there is another way to get an image with a slow computer, and one that will yield an image to the users much sooner: a plotter. Gene Seid and Robert Morton, two of the founders of CalComp, developed the idea in 1953, but lack of funding kept the device off the market until 1959.

The idea is simple: drive a pen on two axes. That takes a pair of stepping motors, and something to put the pen down at the start of a line, and lift it again at the end. Software can calculate when the pen should be stepped in either axis to draw straight lines between two points, curved lines, or whatever.
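To make that concrete, here is a minimal sketch (in Python, not taken from any real plotter driver) of the stepping decision: a Bresenham-style error term decides at each moment whether the X motor or the Y motor should take the next step. The motor and pen functions are hypothetical placeholders for whatever interface an actual plotter exposed.

```python
# Sketch of plotter-style line drawing: decide which stepper motor to pulse
# next so the pen traces an approximation of the straight line from
# (x0, y0) to (x1, y1). step_x, step_y, pen_down, and pen_up are hypothetical
# callbacks standing in for the real motor and pen controls.

def draw_line(x0, y0, x1, y1, step_x, step_y, pen_down, pen_up):
    pen_down()
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 >= x0 else -1          # direction to step in X
    sy = 1 if y1 >= y0 else -1          # direction to step in Y
    err = dx - dy                       # running error between the two axes
    x, y = x0, y0
    while (x, y) != (x1, y1):
        e2 = 2 * err
        if e2 > -dy:                    # error says it's time to step X
            err -= dy
            x += sx
            step_x(sx)
        if e2 < dx:                     # error says it's time to step Y
            err += dx
            y += sy
            step_y(sy)
    pen_up()


if __name__ == "__main__":
    # Print the step sequence for a short line instead of driving real motors.
    draw_line(0, 0, 6, 3,
              step_x=lambda d: print(f"step X {d:+d}"),
              step_y=lambda d: print(f"step Y {d:+d}"),
              pen_down=lambda: print("pen down"),
              pen_up=lambda: print("pen up"))
```

The same error-accumulation idea, published by Bresenham in 1965, later became the standard way to rasterize lines on pixel displays as well.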

In time, fairly sophisticated software packages were developed which isolated the users from the plotter, and allowed them to describe the image in more human-friendly terms such as “m=1, plot y=mx+b for x=1 to 6,” and so on. If the plotter drew a shape, the shape could be filled with solid, dashed, or dotted lines to simulate black or grayscale. Calculating the bounds of the lines within the shape became a part of early plotting packages. Unless one is doing ray tracing, or something similar, the bounds for shading or color within a surface must still be calculated today, and the techniques still draw on these early methods.
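As a rough modern illustration of that bounds calculation (my own sketch, not code from any early plotting package), the snippet below finds where each horizontal hatch line enters and leaves a simple polygon, so that only the interior segments are drawn; closer hatch spacing simulates a darker gray.

```python
# Sketch of scanline-style shading for a plotter: for each horizontal hatch
# line, compute where it crosses the polygon's edges, then pair up the
# crossings to get the spans that lie inside the shape.

def hatch_spans(polygon, y):
    """Return (x_start, x_end) spans where the horizontal line at height y
    is inside the polygon, given as a list of (x, y) vertices."""
    crossings = []
    n = len(polygon)
    for i in range(n):
        (ax, ay), (bx, by) = polygon[i], polygon[(i + 1) % n]
        if (ay <= y < by) or (by <= y < ay):      # edge crosses this scanline
            crossings.append(ax + (y - ay) * (bx - ax) / (by - ay))
    crossings.sort()
    return list(zip(crossings[0::2], crossings[1::2]))  # inside between pairs


if __name__ == "__main__":
    triangle = [(0, 0), (10, 0), (5, 8)]
    for y in range(1, 8, 2):                      # a hatch line every 2 units
        for xs, xe in hatch_spans(triangle, y):
            print(f"pen line from ({xs:.2f}, {y}) to ({xe:.2f}, {y})")
```

The same even-odd crossing rule is still at the heart of many polygon-fill routines today.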

In time, plotters gained additional pens for drawing in color. But the number of colors was limited, and the drawings were still vectors.

91 Reader Comments

Excellent article, this sure took me down memory lane as well. I do agree with the end though, that it seems progress in displays has leveled off a bit over the last 5-10 years. As far as evolutions of existing technology go, I'm hopeful that the boost in PPI found in recent mobile devices translates into a long-awaited boost for traditional computers (both laptops and desktops) as well, along with the resolution independence we've been waiting a long time for. An increase in dynamic range and color rendition would be welcome too, although that may be pushing things; even just a push up to high PPI on large screens would be nice.

As far as the future, I personally believe the next real revolutionary leap forward will be found with virtual retinal displays or some equivalent technology. The ability to have full desktop real estate or beyond in a portable form factor that can not only be connected to anything but is conducive to augmented reality as well will enable some true leaps forward in my opinion. Beyond that of course are totally integrated neural implants, but given how primitive those are right now I think it'll be at least a decade or two before there's major progress there, so VRDs should get a solid chance for a good run in the mean time.

Incidentally, the Atari 800's graphics system was designed by the same guy that designed the Amiga's graphics system- Jay Miner. Who was a true genius with that stuff.

It's interesting to think that with modern CPUs, 2D graphics are basically a solved problem. You can do almost anything you want in 2D, using just the CPU. Eventually that will be the case for 3D graphics, too, which will be interesting. 100% software-driven graphics is a GOOD thing as far as graphics programming goes.

> Incidentally, the Atari 800's graphics system was designed by the same guy that designed the Amiga's graphics system- Jay Miner. Who was a true genius with that stuff.
>
> It's interesting to think that with modern CPUs, 2D graphics are basically a solved problem. You can do almost anything you want in 2D, using just the CPU. Eventually that will be the case for 3D graphics, too, which will be interesting. 100% software-driven graphics is a GOOD thing as far as graphics programming goes.

Really? I remember seeing a few demoscene works some years back which used software pipelines. This let the coders do some interesting stuff which 3D accelerators at the time couldn't do, but isn't it really just more of a convergence thing at the moment? With technologies such as CUDA, you just offload the stuff that makes sense for a CPU or GPU to calculate based on its strengths.

Still, on that sort of thought pattern of awesome stuff that used to be on the CPU, I can't wait until high-density, high-resolution voxels are finally able to be used without their often massive performance hits.

> It's interesting to think that with modern CPUs, 2D graphics are basically a solved problem. You can do almost anything you want in 2D, using just the CPU. Eventually that will be the case for 3D graphics, too, which will be interesting. 100% software-driven graphics is a GOOD thing as far as graphics programming goes.

Or do 2D using a GPU, not to mention dedicated sections dealing with video.

Unfortunately, other than the pseudo 3-D that's being developed right now, we probably won't see any major breakthroughs from now until holograms (though of course screens will keep getting thinner and more efficient, not necessarily with higher resolution because of text size limitations)

The description of VGA in the table is a bit wrong. VGA is very specific in its capabilities; it is the video system of the IBM PS/2.

It has a 256 kB framebuffer and 6 bits per channel of colour, and offers 256 colours on screen at any one time (from a total of 262,144) in 320x200, or 16 colours in 640x480. A 640x350 mode is also available.

VGA was a major, major breakthrough, as it was fully addressable by the programmer right out of the box. Before VGA the programmer had to screw around with character maps or low-res EGA rubbish; with VGA he could control the position and colour of every pixel. VGA was the first to provide acceptable bitmap video on the IBM platform.

@ Sunflowefly, the demand isn't there yet, and the software isn't there yet. A higher DPI monitor will cost more, and make text smaller. I know Windows 7 has the DPI independence thing but not everyone knows about it. I for one would love a doubled resolution screen, like the iPhone 4 did. But the software has to be there first.

/rant: However, I would just like to take this opportunity to shake my fist at the trend of glossy displays that have proliferated faster than cockroaches in a dumpster. They need to die a sudden and horrific death, preferably involving a blowtorch and much screaming.

My 5 year old laptop is 1920x1200 (about 140 dpi?), new phones are 300 dpi, yet I can not find a nice big LCD monitor for my desktop much over 1080p. Where are the higher dpi computer monitors?

HDTV ate them, as far as I can tell (I understand that for TV buyers, HDTV resolutions can be impressive; it'd be nice if computer monitor manufacturers stopped condescending by pretending that they represent anything but lowered expectations, though, from the computer end of things). See also the move from 4:3 to 16:10 (okay, arguable, has some advantages) to 16:9 (near worthless for computer monitors relative to 16:10). It's just economy of scale.

Loved the article, although the technical detail I loved in the earlier parts was sadly missing from the later bits - although that would likely have made for a much longer article. It is a huge subject. There are some fascinating details that home computers brought to graphics as we know them today. Games really have pushed the state of the art in so many ways, from sprites and messing with interrupts and timing of the 8 bit era, to the bitplane shenanigans of the Amiga, and then the 3D geometry and shaders of the more recent accelerator cards. It's a fascinating and incredibly deep subject.

What I am really fascinated by are the different approaches used and the advantages and pitfalls they present. The evolution of computer graphics is littered with dead-ends or ideas that ended up outliving their usefulness (vector-based displays, planar graphics, 4-sided polygons, CRTs?) and it's these that actually tell us a lot about the subject and the lessons learned.

Now where did I put that copy of Michael Abrash's Zen Of Graphics Programming.

BTW, the one omission from that PC display mode table is the PCjr modes. Which would probably have disappeared along with that machine were it not that they were used in Sierra's King's Quest and derived games.

I remember drooling over 20" 1600x1200 and 24" 1920x1200 YEARS ago. I fantasised about those 30" ones too. Where have we gone since then? In terms of resolution that is, I know that there have been other improvements, and the price has dropped.

LCD is very disappointing and a huge step backwards IMO. I like the space/power/flicker advantages but they just don't compare to a professional/high-end CRT. Hurry up OLED! I am not going to miss LCD one bit. I honestly blame consumers though, they are far less critical of HDTVs and LCD panels than they were of CRT displays.

> I remember drooling over 20" 1600x1200 and 24" 1920x1200 YEARS ago. I fantasised about those 30" ones too. Where have we gone since then? In terms of resolution that is, I know that there have been other improvements, and the price has dropped.

Until HDMI 1.3 in 2006 or dual-link DVI, no common computer video connector could support a resolution much larger than 1920x1200 at 60 Hz.

> I remember drooling over 20" 1600x1200 and 24" 1920x1200 YEARS ago. I fantasised about those 30" ones too. Where have we gone since then? In terms of resolution that is, I know that there have been other improvements, and the price has dropped.
>
> Until HDMI 1.3 in 2006 or dual-link DVI, no common computer video connector could support a resolution much larger than 1920x1200 at 60 Hz.

You could buy a certain Sony Trinitron that pushed 2048x1536 through a BNC connector. And this was 10 years ago. It also had a VGA port but all our AVID equipment used BNC. I forgot the model name but it was a huge, heavy beast and we had a few of them in the production truck.

I started work with Control Data, back in 1981, as a batch terminal operator. My first programme? I taught myself fortran (I was bored), and proceeded to port "Midway" from the TRS-80 (written in Basic) to the mainframe at work (written in Fortran), using a Tektronix 4014 graphic terminal. Different sprites defined for the cursor (crosshairs, natch), sprites defined for the different aircraft carriers, etc, and using the trackball to pick the thing you wanted to give orders to. Major pain in the arse, trying to output 8-bit ascii to control the 4014, when working with a machine that had 10 6-bit characters in a 60 bit word - I still remember my brain exploding trying to do all the translations, and pack them into words appropriately.

I remember finishing the game - probably about 5 or 6 hundred lines of code - while staring at the Fortran-4 manual, and not being able to think of a good reason why those SUBROUTINE statements existed .....

It even became a money maker - I had been yelled at a few times by the office manager for burning so many CPU cycles, and using an expensive terminal, with my screwing around. But then one day, a client engineer came into the office, saw what I was doing, and started to play the game (on HIS company's account).

> My 5 year old laptop is 1920x1200 (about 140 dpi?), new phones are 300 dpi, yet I can not find a nice big LCD monitor for my desktop much over 1080p. Where are the higher dpi computer monitors?

Yeah, very odd. My old 20" imac blew up, and I built a replacement hackintosh. I am currently using some 17" $100 Samsung that I had laying around and it totally stinks. I started checking out the specs on my imac's old screen and found out it was pretty dense for something about 4 years old. Same panel as the 20" aluminum "cinema display" and the Dell 2007WFP, IPS and 1680x1050 resolution. I wanted something that size with at least the same resolution, and was sort of shocked to see that there wasn't some huge resolution jump in that panel size in the past 4-5 years. I noticed that unless you unload an ass-ton of money, you don't get any more pixels when going to 23". That's pretty sad, but I guess this is for the crowd that run every application maximized...

Still looking for a cracked 20" cinema or Dell 2007WFP on Ebay that I can swap my old panel into.

Several mix-ups in the table on page 4. I'm not sure if the colour depth started out in bits and then changed to colours, but MDA etc. have 1-bit colour depth for 2 colours; nothing displays 1 colour. VGA was 256 colours from the start, but I can't remember if the palette was selected from 6 or 7 bits per channel.

And don't forget Hercules Mono - a full 64 kB on the card for the frame buffer. Lovely sharp text (for the time).

> LCD is very disappointing and a huge step backwards IMO. I like the space/power/flicker advantages but they just don't compare to a professional/high-end CRT. Hurry up OLED! I am not going to miss LCD one bit. I honestly blame consumers though, they are far less critical of HDTVs and LCD panels than they were of CRT displays.

The only LED screens I've seen that are worth buying are Apple's Cinema Displays: nice wide viewing angle (178° h/v) and a good contrast ratio. They do have some issues of course; gloss can be a pain in the backside if you happen to have an office with windows. But the picture quality is still better than anything else I've ever seen.

Although I've heard rumors that Dell makes good displays, I've yet to see any evidence of that.