The first two were In-game cutscenes, not CG. I don't even have to explain it if you actually have played the game, but whatever.

And the RE5 screen you posted is a bullshot capcom released before even the game came out.

I wasn't suggesting they were CGI, but pre-rendered. Even a lot of "in-game" cut-scenes are pre-rendered using the same game engine and then run as a video file, rather than rendered in real-time on the console. This is a common technique with Blu-ray games in particular (e.g. TLOU or Ryse). The best way to tell whether it's real-time is whether alternate costumes show up in cut-scenes, like in MGS4 or RE5.

As for RE5, it didn't seem like a marketing bullshot to me, because I remember it looking like that in the game. Nevertheless, I'll look for another screenshot just in case...

But there was no black/white screen to load the pre-rendered file; it just zooms in to show the cutscene and zooms out to give the player control of the character again. Plus you can clearly see that I was using a custom character skin, which further proves it's 100% in-game.

Fair enough. I haven't gotten far enough in the game to know how that scene plays out, but I'll take your word for it. Still, it's worth keeping in mind that nowadays video files can be loaded almost seamlessly from Blu-ray discs. If the camera switched to a different angle, then it's possible a video file is playing, but if it zoomed in, then that's unlikely. By the way, by the custom character skin, I assume you mean the latter two screenshots? Because the first pic just looks like default Nathan Drake.

Seriously though, have you even played the game? Because that part plays out in chapter 2, which is like 15-20 minutes into the game. And no, I mean all the screenshots I posted are custom character skins apart from the Uncharted 3 ones. The first pic is also a custom skin, because Drake would be wearing his heist gear in that section of the game, but I changed it to default Drake.

I only remember playing it for like 15-20 minutes. The first pic does look kind of familiar to me, but I don't remember the second one with the old man. Back when it came out, I didn't really have the time to play games much. Now I do have a bit more free time though, so I might pick it up again some time.

It was, up until some time around the turn of the millennium, when PCs began overtaking the arcades as the golden standard of the video game industry. This was partly because the 3D graphics accelerators that were previously exclusive to arcade manufacturers started becoming available to PC consumers, and partly because arcade games were no longer as profitable as they used to be. Nowadays, high-end arcade machines are usually based on PCs, while low-end and mid-range arcade machines are usually based on consoles.

Awesome thread TC! I agree with pretty much everything you had there. However, Unreal was the Crysis of 1998; there is no WAY Half-Life was better looking than Unreal. Half-Life used the Quake II engine, and it wasn't as good as the very first Unreal Engine. I remember when Unreal came out in the summer of 1998, it got rave reviews for its graphics, with reviewers saying R.I.P. Quake II.

I am glad that you included arcades, as those machines were some of the first to use 3D GPUs like the Voodoo Graphics in San Francisco Rush and Mace: The Dark Age. You could tell because they had the 3DFX logo; I remember being awed by the graphics when I first saw it. Speaking of San Francisco Rush, you got it wrong for 1996, as Virtua Fighter 3 didn't use 3DFX's Voodoo Graphics, whereas San Francisco Rush and Mace: The Dark Age did. They easily beat Virtua Fighter 3, which didn't run on 3DFX's Voodoo Graphics.

I'll let you decide which one looked better:

San Francisco Rush using 3DFX Voodoo graphics:

http://www.youtube.com/watch?v=2WLlXaTdLFI

Mace The Dark Age running on 3DFX Voodoo Graphics:

http://www.youtube.com/watch?v=YR_9C0A3CEs

Mace: The Dark Age may have a slight advantage in the links I posted, as its video is in HD (even though the original game wasn't), so it might look cleaner. If I remember correctly, Mace: The Dark Age probably looked slightly better than Rush, even though I have fond memories of San Francisco Rush, as it was the first game I saw running on 3DFX hardware (or 3D hardware for that matter) and it blew my mind.

As for 1997, San Francisco Rush The Rock: Alcatraz Edition was significantly better looking than Sega Bass Fishing, as it also used 3DFX's Voodoo Graphics chip, which was better than the Real3D chip used in Sega Bass Fishing.

As for 1998, you got it right as I remember Daytona USA 2 looked awesome back in 1998 using the Real 3D chip.

FEAR was 2005 and it looked better than CoD 2. It can be argued that Doom 3 or Half-Life 2 were the best looking games in 2004.

I also love how once dedicated GPUs for the PC were released, the PC quickly started becoming the go-to platform for best graphics.

Crysis 3 is also a 2013 game, and played on a proper high-end PC it looks better than Ryse. They do run the same engine, only you can push a lot more graphics on a PC than you can on an Xbox One. Ryse has a better art style though, with more vibrant colors.

I wasn't sure whether to go with FEAR or COD2. To me, both looked just as good as each other. But trying to limit my list to just one game per year makes it really difficult. Same goes for Crysis 3 and Ryse.

By the way, Ryse uses a 4th generation CryEngine, while Crysis 3 uses CryEngine 3. While Crysis 3 has higher resolutions and more texture details, Ryse has higher polygon counts for the assets as well as improved cinematic lighting & shading effects. Ryse could have looked a lot better if it was a PC game though.

Higher-resolution assets don't mean much when you're limited to 30fps at 900p. If we want to go by pure asset fidelity, then modded Skyrim blows everything else away, no competition. The limited resolution combined with the lower framerate reduces the overall picture quality compared to that of Crysis 3 running on a PC at recommended settings. Furthermore, Ryse has smaller environments, though that could very well be a product of the game's design, not the limitations of the hardware.
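
The resolution and framerate point can be put into rough numbers. Here's a quick back-of-the-envelope sketch; the 1080p/60fps figure for Crysis 3 on a recommended-spec PC is my assumption for illustration, not something stated in the thread:

```python
# Rough pixel-throughput comparison (figures are illustrative assumptions).
def pixels_per_second(width, height, fps):
    """Pixels a renderer must fill per second at a given resolution and framerate."""
    return width * height * fps

ryse = pixels_per_second(1600, 900, 30)       # Ryse: 900p at 30fps
crysis3 = pixels_per_second(1920, 1080, 60)   # Crysis 3 on PC: assumed 1080p at 60fps

print(ryse)      # 43,200,000 pixels per second
print(crysis3)   # 124,416,000 pixels per second
print(round(crysis3 / ryse, 2))  # the PC target pushes ~2.88x as many pixels
```

That near-3x gap in raw pixel throughput is one concrete way of seeing why per-asset detail alone doesn't settle the comparison.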

With a graphics king, you're looking at the best overall picture quality you can get; that has to combine both the art and technical aspects of graphics. It is biased towards upgradeable platforms like the PC because of the hardware limitations of a closed, fixed platform. If Ryse was on the PC it would probably look better than Crysis 3 at its best, but as it stands, Ryse's best is still not as good as Crysis 3 at its best.

Yes I have seen it, yes it is a fantastic looking game, no it's not the best looking game of last year. It's highly doubtful any console game will ever be awarded that again unless somebody breaks the mold and builds a console that is on par with the best PCs of the time.

The last console game to ever be crowned graphics king will be Gears of War. At the time, the Xbox 360's unified-shader GPU was ahead of anything on PCs. The next year the PC far surpassed the Xbox 360, but for those few months Gears of War was the best looking game. It's just not possible anymore for a $400-500 machine to match the graphics of $1000 GPUs. Even with the native advantage of a low-level API that gets more out of the hardware, the gap in power is too great for any sort of optimization to fully overcome.

Crysis 3 and even Battlefield 4 on ultra settings on the PC look better than Ryse at its best. That's also why I'm not talking about Killzone Shadow Fall: despite being 1080p and looking very good, when you play it there are a lot of unfortunately obvious cuts they had to make that hurt the final picture quality (mostly distant LoD models).

No it's not fair, but that's how it is.

Actually, I wasn't referring to higher-resolution assets (I'm not sure whether it has those), but higher polygon counts for those assets. According to this chart, for example, Ryse has a higher character polygon count than Crysis 3. However, my main point was about the engine: Ryse uses a 4th-generation CryEngine, while Crysis 3 uses a 3rd-generation CryEngine. Ryse is the first game to demonstrate Crytek's next-gen CryEngine, but it is obviously bottlenecked by the limited X1 hardware. In terms of resolution, picture fidelity, and texture details, you're right that Ryse is no match for Crysis 3, because of the limited hardware. But the advantage Ryse has is that it's using a next-gen CryEngine with a more advanced feature set, such as improved cinematic lighting & shading, as well as higher polygon counts for the characters. I'm not sure if you'd consider that to be an "art" advantage, but I'd consider it a technical advantage.

EDIT: By the way, I've now updated the OP to replace COD2 with FEAR for 2005.

You made the right call; FEAR was definitely better looking than COD 2. You have to remember FEAR pushed graphics tech and used "soft shadows"; even high-end GPUs like the 7800 GTX struggled to run FEAR with soft shadows enabled.

Good suggestion with Unreal. I was actually considering it, but ended up going for the more popular option with Half-Life, without spending much time comparing them side-by-side (since there's so many games to consider). I'll compare them again and maybe update the OP if necessary.

However, I'd have to disagree with your comparison between 3dfx and Real3D. The Real3D/Pro-1000 chipset used by the Sega Model 3 was by far the most powerful GPU around 1996-1997. And the Model 3 didn't just use one, but used a dual GPU setup. While the 3dfx Voodoo was capable of around 300,000 triangular polygons/sec in real-world performance, the Model 3's dual Real3D/Pro-1000 chipset was capable of over 1 million quadratic polygons/sec, and 2 million triangular polygons/sec, in real-world performance, with more advanced anti-aliasing and specular lighting & shading effects to boot. Even the Real3D/100 chip later used with the Intel740 in 1998 was significantly toned down and doesn't come close.
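
Using the figures quoted above (around 300,000 triangles/sec for the Voodoo versus over 1 million quads/sec for the Model 3's dual Real3D/Pro-1000), the gap can be sketched with a quick calculation; the quad-to-triangle step is just the standard two-triangles-per-quad conversion:

```python
# Back-of-the-envelope throughput comparison using the figures quoted above.
def quads_to_triangles(quads_per_sec):
    # Each quadrilateral polygon can be split into two triangles.
    return quads_per_sec * 2

voodoo_tris = 300_000                         # 3dfx Voodoo, real-world triangles/sec
model3_tris = quads_to_triangles(1_000_000)   # dual Real3D/Pro-1000, 1M quads/sec

print(model3_tris)                            # 2,000,000 triangles/sec
print(round(model3_tris / voodoo_tris, 1))    # roughly 6.7x the Voodoo's throughput
```

So even before counting the Model 3's extra anti-aliasing and lighting features, the raw geometry throughput gap alone is several times over.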

There were other Sega Model 3 games I could have chosen. The graphics of SCUD Race, for example, look far superior to San Francisco Rush, as far as 1996 arcade racing games go. However, I went with Virtua Fighter 3 because it was a major breakthrough in terms of 3D character model graphics in 1996. And I'm sure you'd agree if you compare it side-by-side with the 3dfx-based arcade fighting game you mentioned, Mace: The Dark Age. And for 1997, I chose Sega Bass Fishing because of the advanced water effects, which again were unrivalled by any other game that year.

Anyway, I'm glad you agree with my other choices though.

@jun_aka_pekto said:

@kemar7856 said:

wait so arcade>pc :P

Back in the 80's, the arcade version was the golden standard.

If we talk generic computers as PCs, it's pretty close because of computers such as the Amiga, Atari ST, and Apple IIGS. Games on those three looked great. I recall some arcades were in fact Amigas without the case.

Now if we take the PC as strictly IBM-compatibles, no. Back then, IBM-compatibles were often limited to 4-color CGA or 16-color EGA. Many of them didn't even have sound except for bleeps and bloops. When I had my Tandy 1000SX, I was able to game with 16 colors and 3-channel sound because some games supported the Tandy hardware. But the majority of them defaulted to CGA and really crappy, obnoxious sound.

While most IBM-compatibles still made do with CGA/EGA and lousy sound, the Amiga was already doing this. All from two floppies:

Newtek Demoreel 1

The Amiga and Atari ST were powerful in their time because their 68K architecture was more or less modelled after mid-range arcade machines. This made them ideal for ports of 16-bit arcade games. However, I'm not sure about there being any arcade machines actually based on the Amiga. The only 80's computer I know of being used as the basis for an arcade machine was the Sharp X68000, which was the basis for Capcom's CPS arcade system.

But yeah, the x86-based, IBM-compatible PC had a long way to go before it could catch up to consoles or other home computers. 68K-based computers like the Amiga, Atari ST and Sharp X68000 were far superior to the IBM-compatible PC in terms of graphics and sound in the 80's.

Wow, for about 8 years there was just no progress until the coloured games arrived. Seems the more recent games had the biggest jumps, which I did not exactly expect. Could have used a better selection of games, or even just better images of them, but it was fun to glide through them.

Hmm... didn't know that the Real3D/Pro-1000 was the most powerful GPU in 1996-1997. I was under the impression that the 3DFX Voodoo was the best GPU running games arcade/PC-wise, but now it makes sense. And you are right about the Real3D/100 chip being used in the Intel740; it had pretty good performance/image quality compared to the early generation of PC graphics cards.

The question then becomes: why didn't Real3D come out with a PC graphics card with the full-blown specs of 2 million triangular polygons back in 1996-1997? They would have smacked 3DFX/nVidia/ATI. I was under the impression that 3DFX had the lead in terms of technology. I guess poor business decisions led to it.

By chance, did you follow GPUs back then? I am interested to know what others think caused the demise of 3DFX, because as I stated, they were pioneers and leaders in PC graphics from 1996 to the late 90's.

Anyways, I am glad you included games from the 70's and 80's as I never thought that graphics stagnated for nearly a decade as someone has mentioned.

It seems like graphics started to pick up in the mid-to-late 80's and proceeded on from there with Wolfenstein 3D and Doom.

When the Real3D/100 was announced in 1996, it was initially hyped up to be the most powerful GPU on the PC market, but then it just kept getting delayed until Intel released it in 1998, by which time other GPUs had already caught up and some had overtaken it. Yes, it was down to poor business decisions from Real3D, who, despite having the most powerful GPU at the time, were newcomers to the PC market and failed to make the right moves at the right time.

The reason why the Real3D/100 was significantly toned down was because the Real3D-Pro/1000 would have been way too expensive. Considering how expensive a Sega Model 3 arcade machine was back then, I would imagine that a graphics card based on the Real3D-Pro/1000 may have cost in the thousands rather than the hundreds, so the Real3D/100 was a more affordable cost-reduced solution, while still maintaining a lead over competitors in 1996.

I did follow GPU's back then, and remember always wanting a 3dfx Voodoo or a NEC-VideoLogic PowerVR, but couldn't afford them (since I was barely a teenager back then). However, I'm not entirely sure why 3dfx declined. I think a major factor that contributed to their decline was the arrival of the nVidia GeForce 256, which was the first commercially released PC GPU with hardware T&L (transform & lighting) capabilities (previously limited to arcade machines) and no longer relied on CPU power like 3dfx did.

To be fair to the 70's, graphics were actually progressing quite rapidly, but the progress wouldn't be that noticeable to modern audiences. For example, the arrival of sprite graphics and vertical scrolling in 1974 were major breakthroughs, and those sprites started getting more detailed around 1975-1976. Then you also had the arrival of sprite scaling in 1976 and of course colour graphics in 1978, which were also major breakthroughs. But yes, it was in the 80's, the arcade golden age, that the pace picked up greatly.

Amazing job, TC. I don't agree with a few of your picks, but overall this is a great thread to see.

Edit:

For example, the 1992 computer graphics king was Comanche: Maximum Overkill imo.

Just noticed your edit. The pseudo-3D voxel graphics look impressive, but I still stuck with Star Cruiser 2 because it uses true 3D polygon graphics. But as far as pseudo-3D graphics go, Comanche's voxel graphics look more impressive than Doom's ray casting, so I replaced Doom with Comanche: Over the Edge for 1993.

@kemar7856 said:

@happyduds77: RE5 had pretty good graphics at the time; I was impressed with the character models, but it wasn't as good as Uncharted 2.

Like earlier on with what I was saying about Ryse and Killzone SF, each have their own advantages. Uncharted 2 looks more vibrant and may have better texture details too, but on the other hand RE5 was a technical breakthrough in terms of lighting and shading for a console game, and that's why I gave the edge to RE5, kind of like why I gave Ryse the edge over Killzone SF.

But you did end up with a very crappy list, mostly with crappy shots. It makes it seem like some games looked a lot worse than they did. On the other hand, you presented a few games as gifs, which makes us inclined to believe they are your preferred ones for the period.

All in all, a list that's full of bad decisions.

If you had just named it an evolution in graphics it would be ok. But you had to make the mistake of "naming" all those "graphics kings"...

Anyway, congrats again on the thread, it's still very, very interesting.

Some of the screenshots I posted were rushed, which is inevitable given the 100+ images in the OP. Nevertheless, I've gone through some of them again and replaced a couple with better-looking screenshots.

And by the way, the "graphics kings" in the title was intentional. If I was going to showcase the evolution of video game graphics, it makes more sense to pick out the best I could find (or at least what looks the most technically impressive to me) for each platform type in each year.

@560ti said:

Pretty solid list.

There's no way in hell I would list GoldenEye over Final Fantasy 7 (Final Fantasy 7 is debatably the biggest graphics/size jump in the ENTIRE history of console gaming).

We went from a 32MB cartridge to 3 discs (not only that, but the graphics were better than GoldenEye's, ALL while being a much, much bigger game).

I actually did want to include FFVII, simply because it had the best pre-rendered CGI work I had seen in a video game up until that time, but the problem is that its in-game 3D graphics were held back by the PS1's technical limitations. On the other hand, GoldenEye had superior in-game 3D graphics, since the N64 had superior graphical capabilities. While FFVII had the superior production values overall (in terms of art design, CGI cut-scenes, pre-rendered backdrops, game world size, and soundtrack), GoldenEye had the better in-game 3D graphics.

wait... did you just put RE5 above Uncharted 2??? lol another BIG fail on your list

I've already addressed this point above: "Like earlier on with what I was saying about Ryse and Killzone SF, each have their own advantages. Uncharted 2 looks more vibrant and may have better texture details too, but on the other hand RE5 was a technical breakthrough in terms of lighting and shading for a console game, and that's why I gave the edge to RE5, kind of like why I gave Ryse the edge over Killzone SF."

@flashn00b said:

Wait a minute, shouldn't Sinistar get an honorable mention for 1982?

I've decided not to do any honourable mentions, or otherwise the list would become way more bloated than it already is, with 100+ screenshots already there. 1982 was a competitive year though, because it was a big leap forward in terms of graphics, with games like Zaxxon, Sinistar, SubRoc-3D, Buck Rogers and Pole Position all pushing boundaries.

@jun_aka_pekto said:

Back in the 80's, the arcade version was the golden standard.

If we're talking generic computers as PCs, it's pretty close, because of computers such as the Amiga, Atari ST, and Apple IIGS. Games on those three looked great. I recall some arcade machines were in fact Amigas without the case.

Now if we take the PC as strictly IBM-compatibles, no. Back then, IBM-compatibles were often limited to 4-color CGA or 16-color EGA. Many of them didn't even have sound beyond bleeps and bloops. When I had my Tandy 1000SX, I was able to game with 16 colors and 3-channel sound because some games supported the Tandy hardware. But the majority of them defaulted to CGA and really crappy, obnoxious sound.

While most IBM-compatibles still made do with CGA/EGA and lousy sound, the Amiga was already doing this. All from two floppies:

Newtek Demoreel 1

Oh yeah, there's something I forgot to ask you before. About the HAM mode showcased in the video, do you know any Amiga games that actually utilized the HAM mode's higher colour count?

2003 should definitely be F-zero Gx. It's even more impressive than Panzer dragoon orta, and also more consistent.

In fact I think this game was the most impressive during the whole generation.

I listed Panzer Dragoon Orta for 2002 though, since that's when it first released in Japan. For 2003, F-Zero GX's more direct competition would be fellow GameCube game Rebel Strike, which is often considered one of the best-looking games of that generation. F-Zero GX also looks very impressive, but I'm not quite sure if I'd call it a Rebel Strike beater.

Yeah, I know Orta came out in 2002. I mentioned it though because that game was also very impressive, but F-Zero is more so.

Even if Rebel Strike was doing more with the core graphics (I don't think it is; it's only slightly enhanced over Rogue Squadron), F-Zero runs at an unflinching 60fps.

but on the other hand RE5 was a technical breakthrough in terms of lighting and shading for a console game

Don't know about shading, but RE5 definitely does not have superior lighting to U2. Visually the whole game is blurred as fvck. Yes, it was a nice-looking game at the time, but the fact is it's visually very mediocre by today's standards, while U2 still holds up very well, which says a lot about which one is the better looking. Try to spin it as you may, but your opinion is just factually wrong. U2 was the superior-looking game, and that's not even debatable. And I know, I platinum'ed both games (snake01972 if you want to check it), so I spent a good amount of time with both. U2's graphics were the most praised in 2009. RE5's graphics were only relevant until U2 came out. After that you heard almost no one praising RE5's graphics again, and the vast number of "best graphics" awards U2 won, not only over RE5 but over several other great-looking games, should be enough proof that RE5 does not belong on your list.

What do you mean by "blurred"? The lighting or the textures? Either way, "technical breakthrough" is the term I used. Uncharted 2 didn't really do anything new in terms of lighting, whereas RE5's cinematic sunlight effect was something almost never seen before in console games.

The game overall felt blurred, from the textures to character movement and aliasing. Cinematic sunlight? I have no idea what you're talking about, but I don't remember being impressed by RE5's lighting. Still, one or two "innovative" features is hardly a good reason when choosing which game had better graphics that year, when U2 was factually superior both technically and visually. And this is not a situation where RE5 has its strengths and U2 has its own. U2 factually does so much better than RE5 ever did. The fact that it's still an impressive game today, while RE5 isn't, says a lot.

So you mean the motion blur? Even that was a new graphical technique for console games at the time, and plenty of games use it today. As for the cinematic sunlight, what I mean is the intense sunlight effect that gave it the cinematic look of a movie set in Africa. In addition, the dynamic lighting was also something new for console games. When a game features innovative graphical techniques, that to me makes it more deserving of mention in a graphical evolution list. Also, the fact that all the cut-scenes in RE5 are rendered in real-time is another advantage. So yes, I think this is a situation where RE5 has its strengths and U2 has its strengths, in terms of real-time graphics. If we're talking about anything other than real-time graphics, though, then I'd agree that U2 trumps RE5 in almost every other way (in terms of gameplay, story, art design, production values, pre-rendered cut-scenes, voice acting, etc.).
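
For what it's worth, the simplest form of the motion blur being argued about is just blending each new frame with the previous blurred result. This is a toy frame-accumulation sketch of that idea, not Capcom's actual implementation (modern engines, RE5 included, typically sample along per-pixel velocity vectors instead):

```python
# Toy frame-accumulation motion blur: blend each new frame with the
# previous blurred result so fast motion leaves a decaying trail.
def motion_blur(frames, weight=0.6):
    """weight = how much of the *new* frame shows through each step."""
    blurred = frames[0]
    history = [blurred]
    for frame in frames[1:]:
        blurred = [weight * new + (1 - weight) * old
                   for new, old in zip(frame, blurred)]
        history.append(blurred)
    return history

# A bright pixel flashes for one frame, then trails off over later frames.
frames = [[0.0], [1.0], [0.0], [0.0]]
trail = [round(f[0], 3) for f in motion_blur(frames)]
print(trail)  # the flash leaves a decaying "ghost": [0.0, 0.6, 0.24, 0.096]
```

The "ghosting" this produces is exactly the soft, smeared look that gets described as "blurred" above; whether that reads as cinematic or muddy is the whole disagreement.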

I own both and there is no way anyone with eyes can say Killzone looks better than Ryse. There are certainly some breathtaking parts in Killzone, but there are tons of spots that would look just okay on a PS3.

"Either way, it looks to me that Ryse had the superior lighting, shading, and polygon counts, while Killzone SF (and of course Crysis 3) had higher resolutions, frame rates, and texture quality. In the end I gave it to Ryse, because it's experimenting with new kinds of lighting and shading techniques, in addition to the high polygon counts, making it feel like a technical breakthrough, achieving the look and motion of a pre-rendered CGI movie in real-time."

"2001: MGS2 looked incredible for its time, but could have looked a lot better if it wasn't bottlenecked by the PS2's lack of anti-aliasing, pixel shading, and bump mapping. And among the Xbox and GameCube launch titles that did have these graphical features, it was Rogue Leader that impressed me the most."

Yeah, gaming has come so far. I remember those '70s and early '80s games. My brothers and I thought they were the business. That's why I can't stand talk of the PS4 destroying the X1 or Wii U. It's such an exaggeration. The PS4 destroys late-1990s games, not games from the same era.

Good list. While there are a game or two where I might disagree, this list just shows how far we have come, and in my opinion complaining about graphics is pretty pointless nowadays. Over 20 years ago, when I started gaming, I had to use my own imagination to fill in some gaps in the graphics.

You mentioned all the genuine greats I was thinking about that are usually criminally ignored, like Virtua Fighter 3 and Daytona USA. Excellent thread.

Yup, these games had incredible graphics for their time. I still remember the first time I saw screenshots of VF3 in 1996: I thought they couldn't be real-time graphics and had to be CGI cut-scenes, before realizing it really was real-time when I saw it at the arcades.

@zaraxius said:

Good effort! It would've also been nice to see the evolutions of handhelds, but I imagine this took enough time as it is.

I was actually thinking that myself before. But I didn't go through with the idea because the list is already 100+ images big as it is.

Awesome thread, TC! I agree with pretty much everything you had there. However, Unreal was the Crysis of 1998; there is no WAY Half-Life was better looking than Unreal. It used the Quake II engine, which wasn't as good as the very first Unreal Engine. I remember when Unreal came out in the summer of 1998, it got rave reviews for its graphics, with reviewers saying R.I.P. Quake II.

I am glad that you included arcades, as those machines were some of the first to use 3D GPUs like Voodoo Graphics for San Francisco Rush and Mace: The Dark Age. You could tell, as they had the 3DFX logo, and I remember being awed by the graphics when I first saw them. Speaking of San Francisco Rush, you got 1996 wrong, as Virtua Fighter 3 didn't use 3DFX's Voodoo Graphics whereas San Francisco Rush did. It easily beat Virtua Fighter 3, as games running on 3DFX's Voodoo Graphics looked vastly better than any games that didn't, like Virtua Fighter 3.

As for 1997, San Francisco Rush: The Rock Alcatraz Edition was significantly better looking than Sega Bass Fishing, as it too used 3DFX's Voodoo Graphics chip, which was better than the Real3D chip used in Sega Bass Fishing.

As for 1998, you got it right, as I remember Daytona USA 2 looked awesome back in 1998 using the Real3D chip.

Good suggestion with Unreal. I was actually considering it, but ended up going for the more popular option with Half-Life, without spending much time comparing them side-by-side (since there are so many games to consider). I'll compare them again and maybe update the OP if necessary.

However, I'd have to disagree with your comparison between 3dfx and Real3D. The Real3D/Pro-1000 chipset used by the Sega Model 3 was by far the most powerful GPU around 1996-1997. And the Model 3 didn't just use one, but used a dual GPU setup. While the 3dfx Voodoo was capable of around 300,000 triangular polygons/sec in real-world performance, the Model 3's dual Real3D/Pro-1000 chipset was capable of over 1 million quadratic polygons/sec, and 2 million triangular polygons/sec, in real-world performance, with more advanced anti-aliasing and specular lighting & shading effects to boot. Even the Real3D/100 chip later used in the Intel740 in 1998 was significantly toned down and didn't come close.
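
To make those throughput figures concrete, they can be turned into per-frame polygon budgets. A quick sketch using the numbers as quoted in this thread (real-world rates varied a lot by workload, so treat these as order-of-magnitude estimates):

```python
# Per-frame polygon budgets implied by the throughput figures quoted
# above (thread's numbers, not official benchmarks).
def polys_per_frame(polys_per_sec, fps):
    """How many polygons per frame a given sustained rate allows."""
    return polys_per_sec // fps

voodoo = polys_per_frame(300_000, 30)     # 3dfx Voodoo at 30 fps
model3 = polys_per_frame(2_000_000, 60)   # Model 3 triangles at 60 fps

print(voodoo)  # 10000 triangles per frame
print(model3)  # 33333 triangles per frame
```

So even running at double the frame rate, the Model 3's quoted numbers leave it with roughly three times the per-frame triangle budget of the Voodoo.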

There were other Sega Model 3 games I could have chosen. The graphics of SCUD Race, for example, look far superior to San Francisco Rush, as far as 1996 arcade racing games go. However, I went with Virtua Fighter 3 because it was a major breakthrough in terms of 3D character model graphics in 1996. And I'm sure you'd agree if you compare it side-by-side with the 3dfx-based arcade fighting game you mentioned, Mace: The Dark Age. And for 1997, I chose Sega Bass Fishing because of its advanced water effects, which again were unrivalled by any other game that year.

Anyway, I'm glad you agree with my other choices though.

Hmm... I didn't know that the Real3D/Pro-1000 was the most powerful GPU in 1996-1997. I was under the impression that the 3DFX Voodoo was the best GPU running games, arcade- or PC-wise, but now it makes sense. And you are right about the Intel740 being based on the Real3D/100 chip; it had pretty good performance/image quality compared to the early generation of PC graphics cards.

The question then becomes: why didn't Real3D come out with a PC graphics card with the full-blown specs, pushing 2 million triangular polygons, back in 1996-1997? They would have smacked 3DFX/nVidia/ATI. I was under the impression that 3DFX had the lead in terms of technology. I guess poor business decisions led to it.

By chance, did you follow GPUs back then? I am interested to know what others think caused the demise of 3DFX, because as I stated, they were pioneers and leaders in PC graphics from 1996 to the late 90's.

Anyways, I am glad you included games from the 70's and 80's, as I never thought that graphics stagnated for nearly a decade, as someone had mentioned.

It seems like graphics started to pick up in the mid-to-late 80's and proceeded on from there with Wolfenstein 3D and Doom.

When the Real3D/100 was announced in 1996, it was initially hyped up to be the most powerful GPU on the PC market, but then it just kept getting delayed until Intel released it in 1998, by which time other GPUs had already caught up and some had overtaken it. Yes, it was down to poor business decisions from Real3D, who, despite having the most powerful GPU at the time, were newcomers to the PC market and failed to make the right moves at the right time.

The reason the Real3D/100 was significantly toned down was that the Real3D/Pro-1000 would have been way too expensive. Considering how expensive a Sega Model 3 arcade machine was back then, I would imagine that a graphics card based on the Real3D/Pro-1000 would have cost in the thousands rather than the hundreds, so the Real3D/100 was a more affordable, cost-reduced solution, while still maintaining a lead over competitors in 1996.

I did follow GPUs back then, and remember always wanting a 3dfx Voodoo or a NEC-VideoLogic PowerVR, but I couldn't afford them (since I was barely a teenager at the time). However, I'm not entirely sure why 3dfx declined. I think a major factor in their decline was the arrival of the nVidia GeForce 256, which was the first commercially released PC GPU with hardware T&L (transform & lighting) capabilities (previously limited to arcade machines), and so no longer relied on CPU power the way 3dfx cards did.
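
For readers unfamiliar with the term, "T&L" refers to the per-vertex math that pre-GeForce-256 PC cards left to the CPU. A minimal, simplified sketch of what a fixed-function T&L unit computes for each vertex (a matrix transform plus a Lambert lighting term; real hardware does this in 4D homogeneous coordinates with full material and light state):

```python
import math

# Per-vertex T&L in miniature: (1) transform the vertex by a matrix,
# (2) light it with a simple Lambert (N . L) diffuse term.
def transform(matrix, v):
    """Apply a 3x3 matrix to a 3-component vertex."""
    return [sum(matrix[r][c] * v[c] for c in range(3)) for r in range(3)]

def lambert(normal, light_dir):
    """Diffuse intensity: cosine of the angle between normal and light."""
    n_len = math.sqrt(sum(x * x for x in normal))
    l_len = math.sqrt(sum(x * x for x in light_dir))
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot / (n_len * l_len))   # clamp back-facing to zero

# 90-degree rotation about the Z axis, applied to a vertex on the x-axis.
rot_z = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
print(transform(rot_z, [1, 0, 0]))    # [0, 1, 0]
print(lambert([0, 1, 0], [0, 1, 0]))  # 1.0 (light hits the surface head-on)
```

Doing this for every vertex of every frame is exactly the workload that moved off the CPU when hardware T&L arrived, which is why the GeForce 256 mattered so much.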

To be fair to the 70's, graphics were actually progressing quite rapidly, but the progress wouldn't be that noticeable to modern audiences. For example, the arrival of sprite graphics and vertical scrolling in 1974 were major breakthroughs, and those sprites then started getting more detailed around 1975-1976. You also had the arrival of sprite-scaling in 1976 and, of course, colour graphics in 1978, which were also major breakthroughs. But yes, it was in the 80's, the arcade golden age, that the pace picked up greatly.
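
The sprite-scaling mentioned above is, at its core, nearest-neighbour resampling: redraw the same sprite larger or smaller each frame to fake depth. A sketch of the idea (the concept only, not any particular chip's implementation):

```python
# Nearest-neighbour sprite scaling: for each output pixel, pick the
# nearest source pixel. Scaling a sprite up frame by frame is how
# early arcade hardware faked objects approaching the camera.
def scale_sprite(sprite, factor):
    h, w = len(sprite), len(sprite[0])
    new_h, new_w = int(h * factor), int(w * factor)
    return [[sprite[int(y / factor)][int(x / factor)]
             for x in range(new_w)]
            for y in range(new_h)]

sprite = [[1, 2],
          [3, 4]]
for row in scale_sprite(sprite, 2):   # each source pixel becomes a 2x2 block
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```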

Thanks for the interesting tidbit about the Real3D/100; it sounds like it was a really powerful chip. Which, if you think about it, makes sense. Real3D was a division of Lockheed Martin, which used to make graphics tech for simulators to train fighter jet pilots. I guess it wouldn't be surprising that Real3D's hardware was the most powerful GPU tech in the world at the time, as you would have to render real-life environments in real time. I guess 3DFX's initial success came from bringing incredible graphics at a price point people could actually afford, instead of the $1000+ mentioned for the Real3D hardware. Thus, 3DFX gave off the impression (at least for me) that they were making the world's most powerful GPUs, simply because they brought consumer-grade graphics that people could buy.

As for 3DFX's decline, I totally agree with you about the GeForce 256 being a major factor (or the start) of it. The GeForce 256 not only had the features you mentioned but performed better too. I think it was the first nVidia GPU that actually beat 3DFX's best, or got close to it. I like the analogy that the GeForce 256 was the punch that made 3DFX dizzy, and the GeForce 2, released in 2000, was the knockout punch that 3DFX never recovered from. In other words, they fell behind in the graphics race and failed to bring competitive GPUs to market in a timely manner. By the time they brought out the 3DFX Voodoo 5 5500, it had already been beaten by the GeForce 2, which came out months earlier, and it was struggling to beat ATI's Radeon. The rule still exists today: nVidia and AMD would lose market share if they released one generation of GPUs that wasn't competitive with the other's. Look at the GeForce FX 5800 Ultra, or the somewhat weak HD 2900XT from ATI (which, despite being weak, was nowhere near the disaster that the GeForce FX 5800 Ultra was).

In any case, I am glad that you changed the graphics king award for 1998 to Unreal. That game was jaw-dropping when it came out. The thing that really "wowed" me was the beginning of Unreal, where the camera flew through the castle and wound around it, and you could see the beautiful reflections on the main entranceway leading to the castle. And as you progressed through the game, it was a sight to behold. I remember finally completing the game in 2008 and being amazed at how good the graphics looked for a game that came out in 1998.

An overall great thread! I think you might be the first person in SW that I totally agree with on the PC part of Graphics Kings, and you even managed to get me to agree with the Arcade Graphics Kings. :P

Yes, Lockheed Martin previously worked with the military, which is why the Sega Model 3's Real3D/Pro-1000 chipset was so powerful for its time. Interestingly, arcade rival Namco also worked with a rival military simulator company, Evans & Sutherland, which produced the Namco System 22 arcade system's TR3 graphics chipset; that was also powerful for its time in the early 90's and easily rivalled the Sega Model 2's Fujitsu TGP graphics chipset. All of these GPUs were very expensive at the time though, hence why they weren't available to consumers.

As for 3dfx, I agree. The problem with 3dfx was that they weren't able to keep up with the rapid pace of GPU development we saw from Nvidia and ATI in the early 2000's. The same goes for other GPU companies at the time. Real3D, as we discussed above, released the Real3D/100 too late, by which time 3dfx had caught up. Likewise, NEC and VideoLogic produced the powerful PowerVR2 chipset for the Dreamcast, but released it too late on the PC market, by which time Nvidia had caught up. Rendition, Hercules and Fujitsu also produced the first non-CPU-reliant PC GPU with hardware T&L (transform & lighting) capabilities, the Thriller Conspiracy (utilizing the Fujitsu FXG-1 Pinolite T&L chipset, similar to the Sega Model 2 arcade system's Fujitsu TGP chipset), but it was eventually cancelled, leaving the GeForce 256 to become the first commercially released GPU with hardware T&L. And then with GPUs like the GeForce 2, and especially the Radeon 9700, other PC GPU companies just couldn't compete. In the end, the PC GPU market has been dominated by ATI and Nvidia ever since the early 2000's.

Good recommendation with Unreal. I wasn't too sure whether to go with Unreal or Half-Life, but Unreal is definitely the better-looking game of 1998. Also, Unreal Tournament 2004 just so happens to be my favourite FPS of all time...

For the most part, there have hardly been any disagreements over my Arcade and PC choices, besides 1983 (the Dragon's Lair snub) and 1998 (where you recommended Unreal). Most of the disagreements come from the console crowd, specifically Sony fans, due to several high-profile Sony snubs (FFVII, MGS2, Uncharted 2, Killzone SF).