
Alan Shutko asks: "I was playing with different resolutions recently, and got confused.
640x480, 800x600, 1024x768, 1152x864, 1400x1050, 1600x1200, they all have a 4x3 aspect ratio. But 1280x1024 has a 5x4 aspect ratio.
What's up with this? Somewhere in the annals of computing history, someone must have come up with 1280x1024. Why did they choose such an odd aspect ratio?"

Ideally, you'd have 1280x960, but since 960 isn't a multiple of 128 (or 256), it messes up various hardware blitting methods.

Is this still true, or did it only apply to older hardware? I've run X in 920x690 mode with no problems, and 690 isn't a multiple of 128.

One of the popular VGA modes was 320x256 (which is in the 5:4 ratio). It meant that you could have an array of scanlines, and index them with a single byte (with no wastage).
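The byte-indexing point is easy to sketch. This assumes a hypothetical 1-byte-per-pixel linear layout (the function name is my own, just for illustration):

```python
# Why 256 lines is convenient: the scanline number occupies exactly one
# byte (0..255) with no unused values -- "no wastage".
WIDTH, HEIGHT = 320, 256

def pixel_offset(x: int, y: int) -> int:
    assert 0 <= y < HEIGHT  # y always fits in a single byte
    return y * WIDTH + x

print(pixel_offset(0, 255))  # start of the last scanline: 81600
```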

What programs used that screen mode? 320x200 was the standard mode for games (mode 13h). 320x240 was available with a hack ("Mode X"). Neither of these are multiples of 128, and I've never heard of 320x256 being used.

Back when desktop publishing was really heating up on the Macintosh, Apple came out with the Full Page Monochrome (later RGB) Display that supported one resolution:

640x870

!!!

Yes, a "backwards" display ratio - 3:4 - 1/3 taller than it was wide. This was designed to show a single sheet of paper at 100%, and it worked, too: you could just about line up a sheet of paper against the display.

And others will remember the Radius Pivot display, which could "swing both ways", so to speak; you could tilt it over horizontal for spreadsheets and tilt it back vertical for page layout. The neat thing on the Mac was the fact that the computer would automagically detect this and resize your desktop to accommodate.

No, it does. I noticed the discrepancy when using Xvnc for the first time. I remembered that the root window size on the target machine was 1152x?, so I set it to 1152x864 by doing the 4:3 math.

But then I couldn't figure out why I couldn't fit the same number of windows on the screen as I do when using X on the target's console. I had to force vncviewer to open at 1152x900 in order to get the same results. [*]

Others are using the same X setup as me, with similar boxen (older sparcs mostly) and they have the same res and effects.

[*] I actually had to walk upstairs to the target and run X on it, check the root window size, close X, and walk back downstairs to reopen the XVNC client. It was silly. Luckily it was a temporary necessity.

Ya know, I never could figure out why reverse-aspect monitors never caught on. When I hack I always end up fullscreening emacs and running follow mode (kicks butt!) with a horizontal split, which turns it into a two-column display.

The scrolling isn't perfect, which is why I want a reverse aspect ratio monitor.

Does anyone have a software solution? Like an X driver (server?) that just rotates the display 90 deg? Then you just stand the monitor on its side...

I'm going to say that 1280x1024 came about because of the need for a more "square" work area, namely in CAD environments. That's the first thing that popped into my head when I saw this question.

Could it also have been brought about due to memory constraints? 1280x1024 will fit into 1Meg of video memory at 4-bit color depth, maybe 1364x1024 (close to the 4:3 aspect ratio) doesn't fit into one meg of video RAM as well?
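For what it's worth, the numbers can be checked directly (a quick sketch; the helper name is my own):

```python
# Bytes needed for one frame at a given resolution and bits per pixel.
def frame_bytes(w: int, h: int, bpp: int) -> int:
    return w * h * bpp // 8

MEG = 1024 * 1024
print(frame_bytes(1280, 1024, 4))  # 655360  -> fits in 1 MB
print(frame_bytes(1364, 1024, 4))  # 698368  -> also fits at 4-bit
print(frame_bytes(1280, 1024, 8))  # 1310720 -> just over 1 MB at 8-bit
```

Interestingly, 1364x1024 at 4-bit would still fit in a meg, so the 4-bit case alone doesn't settle it; at 8-bit, though, 1280x1024 is already a hair over 1 MB.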

Again, I'm guessing here, but isn't 1280x1024 the first "XGA/PGA" resolution? Could this resolution be held to a different standard than the "(S)VGA" resolutions?

All I know is, I like my resolutions high, and 1280x1024 suits me just fine on medium monitors (17", high end 15").

You want a messed up resolution? Dell's 1400x1050 LCD screens (SXGA+)... I've got one of those puppies, and MAN, are they nice! :-) X on that laptop is almost heaven at 16-bit color depth! Oh, and it's a 4:3 aspect ratio. Probably for good DVD playback, but since I'm boycotting DVDs, I'll never know.

Hmm: lspci thinks it's a 'ATI Technologies Inc 3D Rage P/M Mobility AGP 2x (rev 64)'. I've got to agree, 1400x1050 is a great resolution. I'd have bought a non-Dell laptop if I could have gotten better than 1024x768 in December, but at the time the only one I could find that did SXGA was the 7500, and it did SXGA+ (Dell's name for 1400x1050) for just a little bit more money. Well worth it.

BTW, if you're looking at the 5000 (slimmer, not as expandable version of the Inspiron 7500) think very carefully about a Celeron or something else that runs cooler. My 7500 is incredibly stable and stays reasonably cool, but the PIII/650 Inspiron 5000s my company bought have real heat problems. Your lap getting uncomfortably warm is one thing, but having drive and CPU flakiness that trashes a filesystem when you run it too hot is a little outside what I'd consider acceptable. If you can hack the extra weight, grab the 7500.

If you do want the 5000 look at Sceptre [sceptre.com]. They source the chassis from the same manufacturer Dell does.

I believe the Hercules monochrome cards ran at 720x348 (only mentioning that so you don't calculate an incorrect aspect ratio, of course ;) ). Just for a laugh, one of these days I'm going to break mine out of storage and try it in my Windows 98 box to see if it'll be autodetected. I guess I'd better do it soon - it seems unlikely that the next motherboard I own will be physically able to support an 8-bit ISA card.

It had better resolution than CGA, but I have to say it sucked being the only one amongst my friends without color. I was so happy the day I got a copy of "SIMCGA" - a TSR program that allowed us Hercules users to fool games into thinking they were running on a computer with a CGA card installed; a requirement for most PC games back then. I owe whoever wrote "SIMCGA" (and who released it as public-domain) a big thank-you.

Didn't the Hercules cards allow you to have a VGA card installed simultaneously?

Hmmm... after a quick search at the PC/Blue Disk Library [oakland.edu] hosted by the awesome folks at the OAK Software Repository [oakland.edu], if anyone cares (yeah, I know) here are the first few sections from the SIMCGA manual:

SIMCGA - Simulate CGA with Hercules Monochrome Card

Written in September 1986 by
Chuck Guzis
153 North Murphy Ave.
Sunnyvale, CA 94086

This memory-resident utility allows you to "fool" most software requiring a Color Graphics Adapter into using your Hercules (or compatible) monochrome adapter in the graphics mode. Graphics images are reproduced in normal aspect ratio, using as much of the available screen area as is possible.

The trick used here is to program the HGC to display 3 lines per character time instead of 4 (the CGA displays 2). A service routine hooked into the hardware timer interrupt (int 8) copies one line to the third displayed line to give a filled-out image.

I've actually always wanted a resolution between 1280x1024 and 1600x1200 for my 19-inch monitor. 1400x1050 seems to fit the job well, but it would appear my current video board (V3 3000) doesn't support it. Do any current video boards support this mode?

The BBC micro had a virtual screen resolution of 1280x1024 - it had a number of different screen modes (colour depths, resolutions), but the graphical modes all had a virtual resolution of 1280x1024. This meant drawing a line (in BBC BASIC) from the bottom left to the top right was always:
MOVE 0,0: DRAW 1279, 1023
(Yes, (0,0) was bottom left, as opposed to top left)

Okay, so it isn't a nice multiple of good numbers, but here's the resolution I've come up with: 1320x992. The number of pixels on the screen is within approximately 0.1% of 1280x1024, so it should theoretically work with any monitor that supports 1280x1024.
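The "within approximately 0.1%" claim checks out:

```python
# Total pixel counts for the two modes:
a = 1320 * 992    # 1,309,440 pixels
b = 1280 * 1024   # 1,310,720 pixels
print(abs(a - b) / b)  # 0.0009765625, i.e. just under 0.1%
```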

I have pasted a few modelines below. The most important number for a lot of people is the dotclock (120 in the example). You can bring that down or up, depending on how high a refresh rate your system can handle. IIRC, this runs at about 60 Hz, but it may be a bit higher (65 or so). Please also realize that xvidtune may be of use.

Also note that I'm not a genius when it comes to this stuff, and it could cause bad things to happen (though most modern displays can shut off when fed a bad signal..)
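The relationship between the dotclock and the refresh rate is simple to sketch. Note the blanking totals below are made-up placeholder values for illustration, not taken from a real 1320x992 modeline:

```python
# refresh = dotclock / (htotal * vtotal), where the totals include
# the blanking intervals around the visible 1320x992 area.
def refresh_hz(dotclock_mhz: float, htotal: int, vtotal: int) -> float:
    return dotclock_mhz * 1e6 / (htotal * vtotal)

# With a 120 MHz dotclock and these (hypothetical) totals:
print(round(refresh_hz(120, 1776, 1040), 1))  # ~65 Hz
```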

``...so it should theoretically work with any monitor that supports 1280x1024.''

Whoops.. If you have a fixed-freq monitor, this probably won't work, but it should work for any multisync monitor. I came up with this because my monitor is only spec'd to do 1280x1024, and I didn't want to try my luck at a higher resolution (which would have probably forced me to use lower refresh rates).--Ski-U-Mah!

Matrox G200/G400 will do that mode nicely. Your monitor may not like it, however. I had to tweak my Sony 21/XF86Config for an hour to get it to sync up correctly. I had some luck with it on an ATI Rage Pro, but the card has so many non-redeeming qualities I'd shy away from it, even if it is cheap.

OT, but I ran some benches of my new V3 3000 (replacing a burned out ATI) against my old SLI V2s.. The V2's kicked its ass, by up to 40%. I sincerely hope the 'Bigger, Badder' V3 models really are.. (Scariest part? I have 64M of video subsystem memory in there now)

Most games don't list all the other resolutions. With a GeForce 2 you can't play Homeworld or Half-Life at 1280x960, when 1280x1024 runs basically the same and gives a better view. GeForce 2s still support 1280x960, but with that kind of video card, why not go all the way up to what your monitor supports? If you have a crazy high-res monitor, you're doubling that res, though.

The comments about memory addressing make sense for the horizontal size, but not necessarily for the vertical. That said, it is definitely easier to get a 4040-like counter to reset every 1024 ticks than every 960 ticks - but this is trivial.

I think the reason may be that older systems actually used 24-bit color, often with no underlay/overlay/alpha channel. This gives 3 bytes per pixel, for a total of 3.75 Meg for the display. This fits comfortably within a 4M framebuffer. The next higher multiple of 128 for the horizontal is 1408, with a 1056 vertical (for 4:3), and that is too much for a 4M framebuffer.
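The framebuffer arithmetic spelled out (3 bytes/pixel, no extra channels):

```python
MEG = 1024 * 1024
print(1280 * 1024 * 3 / MEG)  # 3.75  -> fits comfortably in 4M
print(1408 * 1056 * 3 / MEG)  # ~4.25 -> too big for a 4M framebuffer
```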

I know that back in 1987 (and probably before), the SGI Iris (with a whopping 25MHz R3000 and 32M RAM!) had a 1280x1024x24bpp display. (of course, that was 24 bits color, 24 bits z-buffer, 2 bits overlay, 2 bits underlay, and another 24bit+24bit rendering buffer, for a total of 100 bits/pixel!)

This may be another one of those "They're doing it, so we might as well, too" kind of things.

The simple answer is that while TVs have a 4:3 aspect ratio, PC monitors have a 5:4 aspect ratio. That's why DVDs always seem just a little stretched. Some decoder cards actually letterbox the display to retain the approximate ratio of actual TV screens, which inevitably results in scaling artifacts unless you're cranking it out at 1080i resolution.

Anyways.. 1280x1024, in addition to offering memory-aligned scanlines as stated in every other comment, provides perfectly square pixels, which comes in pretty handy for graphics work.
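Whether the pixels actually come out square depends on the tube matching the mode's ratio; a quick check (function name my own, just a sketch):

```python
from fractions import Fraction

# pixel aspect = (phys_w / xres) / (phys_h / yres)
def pixel_aspect(phys_w, phys_h, xres, yres):
    return Fraction(phys_w, xres) / Fraction(phys_h, yres)

print(pixel_aspect(5, 4, 1280, 1024))  # 1     -> square on a 5:4 tube
print(pixel_aspect(4, 3, 1280, 1024))  # 16/15 -> slightly squat on 4:3
```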

Various Macintosh computers have supported many of the popular resolutions, but the one I always run into is Apple's 1152x870. When I render a full-size 3D desktop picture, I must go out of my way to support 1152x870. I have to trim off 6 scan lines when moving it to a PC.

Another is 800x600 (4:3) and Apple's 832x624. Apple supports 800x600 on some Macs, 832x624 on others, and both on other models.

For mostly obscure hardware reasons, early DRAM/VRAM graphics controller implementations favored horizontal resolutions that were a multiple of the row size. 1280 is divisible by 256, the row size of a 64K DRAM.

One of those obscure reasons was address translation. If you form the linear framebuffer address as (2048*y)+x, it made doing blt hardware much easier: just map x and y onto the appropriate row and column bits.
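A sketch of why the (2048*y)+x trick makes the blitter cheap (constant and function names are my own):

```python
# With a power-of-two pitch, the linear address is just y shifted left
# 11 bits OR'd with x, so x and y map straight onto column/row address
# bits -- no multiplier in the blitter datapath.
PITCH = 2048  # bytes per scanline; wide enough for a 1280-pixel row

def linear_addr(x: int, y: int) -> int:
    assert 0 <= x < PITCH
    return (y << 11) | x  # identical to PITCH * y + x

print(linear_addr(100, 3))  # 3*2048 + 100 = 6244
```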

Another of those reasons was being able to load the video shift registers at the same times each line. This made the timing control easier to do in the logic of the day (think MSI counters and gates.)

Modern graphics controllers refresh the display using periodic burst DRAM accesses instead of actual shift registers, and they have hardware to help deal with the x-y to linear address translation. So the whole issue of row size pretty much goes away.