Vigile writes "Few people will doubt that PC gaming is in need of a significant shot in the arm with the consistent encroachment of consoles and their dominating hold on developers. Today AMD is releasing the Radeon HD 5870 graphics card based on the Evergreen series of GPUs first demonstrated in June. Besides offering best-in-class performance for a single-GPU graphics board, the new card is easily the most power efficient in terms of idle power consumption and performance per watt. Not only that, but AMD has introduced new features that could help keep PC gaming in the spotlight, including the first DirectX 11 implementation and a very impressive multi-monitor gaming technology, Eyefinity, which we discussed earlier this month. The review at PC Perspective includes the full gamut of gaming benchmarks in both single- and dual-GPU configurations as well as videos of Eyefinity running on three 30" displays."

There are some videos of Eyefinity at work in this article; here is a direct link as well:

Hrm, this reminds me of playing AH-10 Attack on my Mac IIvx with a couple of NuBus video cards added and some old monitors. The front screen got the front view; left screen, the left window; right screen, the right window. It was really quite fun for the time (c. 1992?). I had no idea that a generation later the technology had gone missing (after college, my game time dried up).

Ah, that's a great idea actually. I'd take it a step further and hook up a webcam. Let the monitor you're looking towards render at a higher DPI, and the others at progressively lower ones the farther away they are. Have it adjust on the fly during gameplay, and it'd be much more like real vision.

But what makes Eyefinity so different from, or better than, Nvidia's tech? For years Nvidia has had tech that lets you make 2 (or more - Quadros support 16) monitors look like one seamless display to the OS - that's called "span". Span behaviour is undesirable for most people, though.

In most usage scenarios, it is counterproductive to have X monitors look like one seamless display to the OS when the real-life image produced isn't seamless - the non-display edges of the panels, etc. So it's very silly to have yo

Time to move up from 1280x1024 displays finally? My hard drive size and processor speeds have gone up 10x in the last 10 years. My screen resolution is unchanged. I think for games, doubling pixel density would be more than noticeable. From there, screen sizes could increase. A 15~21 inch standard over 15 years is a pretty sad change for the computer industry.

Time to move up from 1280x1024 displays finally? My hard drive size and processor speeds have gone up 10x in the last 10 years. My screen resolution is unchanged.

As are your eyes. Beyond a certain point, FSAA will increase perceived quality as much as higher DPI.
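
Rough numbers to back that up (my own arithmetic, assuming a typical viewing distance of about 60 cm): a 96 DPI pixel has a pitch of roughly 0.26 mm, so it subtends

\[ \theta = \arctan\!\left(\frac{0.26\ \text{mm}}{600\ \text{mm}}\right) \approx 1.5\ \text{arcmin}, \]

while normal visual acuity is around 1 arcmin. So somewhere short of double today's pixel density the eye stops noticing extra pixels, and antialiasing becomes the better use of GPU cycles.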

I think for games, doubling pixel density would be more than noticeable.

Supporting ClearType-style subpixel rendering in your FSAA might help too. But the big problem with doubling pixel density is that so many Windows applications other than games are hardcoded for 96 dpi, ignoring the DPI setting under Display Properties > Settings > Advanced > General.
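
And doing it right is not hard; here's a minimal Win32 sketch (my own illustration - the function name is made up) of how an app can query the user's DPI setting instead of assuming 96:

    #include <windows.h>

    /* Scale a size designed at 96 dpi to the user's actual DPI
       setting (the one under Display Properties > Settings >
       Advanced > General). */
    int scale_for_dpi(int size_at_96dpi)
    {
        HDC screen = GetDC(NULL);
        int dpi_x = GetDeviceCaps(screen, LOGPIXELSX); /* 96 unless the user changed it */
        ReleaseDC(NULL, screen);
        return MulDiv(size_at_96dpi, dpi_x, 96);
    }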

A 15~21 inch standard over 15 years is a pretty sad change for the computer industry.

For one thing, desks haven't gotten much bigger. For another, past a certain point the amount of glass and other materials in a display outweighs the number of pixels in determining price. I went to Walmart* and saw a 32" 720p-class Vizio TV for $399 and an otherwise identical 1080p TV for $499. Compare that to the price difference between 32" and 42" TVs.

Beyond a certain point, FSAA will increase perceived quality as much as higher DPI.

Absolutely.

As an example, I use a 24" 1920x1200 monitor as my primary display. I can run GTA Vice City (I know it's an old game, bear with me) perfectly at 1920x1200, 32-bit color, maximum draw distance and effects. But I think it actually looks better if I run it at 1280x800 with maximum settings plus 8x AA and 16x anisotropic filtering.

Of course, the game was originally designed for the PS2 and meant to run at SDTV reso

My ThinkPad has 1920x1200 on a 15" screen, so they exist. You might want to get those 75% more pixels before you complain. Honestly though, the biggest difference in gaming is what's *in* the pixels. That's pretty important too, and today's games render in SPF (seconds per frame) instead of FPS on ten-year-old graphics cards - or rather on your CPU, because it's doing everything in software emulation. 1920x1200 is more pixels than a Blu-ray (1920x1080), which is pretty damn good; if my games looked like that I'd be extremely impressed.

I've been using a 1600x1200 21" LCD for over 7 years now. I recently upgraded to a 2560x1600 30" HP LP3065 monitor and I love it.
I think 1440x900, 1680x1050, 1920x1080 and 1920x1200 monitors (combined) are more common than those with lower resolutions nowadays.

Probably just penny-pinching to get the sticker price down. But perhaps you've been looking at the wrong laptops. Those in most stores have crappy resolutions, even with large displays. I just hate it when they say 18" widescreen LCD without saying how many pixels. As like as not it's just 1440x800 or something equally pathetic, but you might have to corner a sales droid to find out.
At home I've got a 6-year-old laptop with a 17" WUXGA (1920x1200) display. Since I got it, I've been unwilling to accept any less.

They used to. 1280x1024 is pretty damn decent, especially for a laptop, and was perfectly normal a few years ago. Then some genius decided to chop the top two hundred pixels off so they could manufacture five screens using the amount of material that previously yielded only four.

His boss pointed out that no one's going to want screens that have about two hundred pixels less resolution than they could get before. So the guy thou

I would much, much prefer two 960x1200 monitors to a 1920x1200 widescreen. It just helps me group stuff better.

Then get a window manager that will do that for you. All versions of Windows since Windows 95 can do it: Ctrl-click several windows' buttons on the taskbar, right-click one of them, and choose "Tile Horizontally".
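
For the curious, the same layout is exposed programmatically through the Win32 TileWindows() call; a minimal sketch (my own illustration, not anything from the parent post):

    #include <windows.h>

    int main(void)
    {
        /* Tile all tileable top-level desktop windows horizontally,
           much like the taskbar's "Tile Horizontally" menu item.
           Passing NULL/0 for the window list means "all of them". */
        WORD tiled = TileWindows(NULL, MDITILE_HORIZONTAL, NULL, 0, NULL);
        return tiled ? 0 : 1;
    }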

Reading the PC Perspective review and a couple of others, the 5870 seems to be a bit faster than the GTX 285, but not by much, and certainly not by the margin one would expect from a new generation of parts vs. the old.

Admittedly this is all DX9/10 stuff, and there's probably a lot of the transistor budget allocated to new DX11 features, but I would have expected ATI's latest offering to utterly destroy NVIDIA's last-gen part. The GTX 295 is really 2 GPUs, so it's not a fair comparison.

I have to amend this. I hadn't looked at the Guru3D review (http://www.guru3d.com/article/radeon-hd-5870-review-test/), which has a different selection of games and paints the 5870 in a better light than the PC Perspective article. From the PCPer article I got the impression the GTX 285 was in some cases faster and in most cases only 10-15% slower than the 5870; the Guru3D test shows a more noticeable performance boost for the 5870, 25% or thereabouts. Better, but not stunning. I would have still expected a 50% or so

"Reading the PC perspective reviews and a couple of others the 5870 seems to be a bit faster than the GTX285 but not by much, and certainly not by a margin one would expect from a new generation of parts vs old."

...I really wish they'd come out with decent Linux support. I mean, come on guys, 1280x1024@75Hz is the max screen size you can do with fglrx in your driver?

Would you even begin to contemplate a 20% to 30% increase in R&D expenditure to address what is effectively a non-existent market?

We have a severe chicken-and-egg problem here. Until games support Linux directly, the market will stay small. That won't happen if the drivers suck, and nobody will put money into better drivers if there is no market.

There is a de facto standard that works relatively well (and keeps getting better); until that changes, gaming on Linux is kind of screwed. But don't go blaming the

And today, they're running OpenGL 1.4 and lots of games work great (OpenArena, Nexuiz, Doom3), and waiting for Gallium3D to change the X acceleration architecture so they can get GLSL and such going. Check the Phoronix forums [phoronix.com] for the current state of affairs. Things change fast in open-source land.

Um, change fast? Right... I bought a Radeon HD 4830 partly because AMD had opened their specs and I figured an open source driver wouldn't be that far off. But the radeonhd driver still pretty much sucks. Installing fglrx doesn't improve things much either (if anything). Part of the reason may be that they are both using DRI. I didn't expect my Linux desktop to be crawling when I bought that 4830, especially since I used to have an NVIDIA 6600GT, and KDE with Compiz was pretty spiffy even with that old card! Run

RadeonHD and Radeon both have the same functionality these days. I don't know what you're doing with your system that it's slow, or what you've tried to do, but out of the box Ubuntu runs quite peppy on even my Radeon HD 3200. Really... did you look through the link I posted?

If you're still running into problems, it could be that your distro is misconfiguring X, and you need to change something in the config [linuxinsight.com].
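
If it is the config, the usual suspect is the Modes line in xorg.conf. Something along these lines - an illustrative stanza only, your Identifier/Device names and mode list will differ:

    Section "Screen"
        Identifier   "Screen0"
        Device       "Radeon"
        DefaultDepth 24
        SubSection "Display"
            Depth  24
            Modes  "1920x1200" "1680x1050" "1280x1024"
        EndSubSection
    EndSection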

It takes time for driver development. Good things are in the pipeline. If you don't mind compili

Took the industry long enough to deliver great 3D performance and 3 monitor outputs on one card*. If the Linux support is on par with nVidia's, I look forward to replacing my current nVidia dual-head card and Matrox DualHead2Go box with one of these beauties. Especially since the DualHead2Go "Digital" edition uses an analog VGA input and I can see some faint ghosting on text/sharp lines.

* I know that Matrox had the Parhelia line of 3-monitor cards, but the 3D performance was sadly lacking in those.

The summary misses the point of why consoles are gaining so much ground in the gaming world. The main reason consoles are so popular is that the hardware never changes. Most people (like myself) don't want to have to go out and buy the latest and greatest graphics card to run a new game. With an Xbox 360 or PS3 I know that if I buy a title for that platform, it will work. Yes, there are certain exceptions like hard drive requirements, etc., but for the most part it is true. The stability also allows developers to get the most out of the hardware, and generally by the end of a console's life expectancy the games are getting very, very good.

There will probably always be a market for the hardcore gamers, but the average, casual gamer would rather play an Xbox 360 at 720p on their big screen than play at double the resolution on a screen a quarter the size.

It's cyclical. When the consoles first come out, they look good. But for how long? At some point PC tools and PC hardware make consoles look antiquated. This new hardware is 4 generations past console hardware, but some of those generations were more die shrinks than anything consumers care about. For the moment we're fluttering around parity between PC and consoles; the race to the bottom in PC prices and hardware has meant that trying to make a decent PC-only game which both takes advanta

I see two kinds of PC cases at Best Buy: slimline and mid-tower. Mid-towers look out of place next to a TV. Slimline PCs look better, more like an Xbox 360 or an old PS3, but they typically don't come with a graphics card. Instead, they tend to have Intel's Voodoo3-class GMA chipset on the motherboard, because they're designed for web and office apps.

A number of companies make HTPC or Media Center cases [newegg.com]. Some of those have room for normal PCIe cards, and some of them require a riser. The real acid test, though, is whether the heat generated by a decent graphics card is compatible with a quiet living room.

Most people (like myself) don't want to have to go out and buy the latest and greatest graphics card to run a new game.

Well that's good, because you don't. You only need the latest and greatest card if you want to play a game at 1920x1200 and still get 120 fps. As long as you don't have to have all the knobs turned up to the max, you can stay one or two card generations behind, and your games will still look better than anything you can get on a console.

The summary misses the point of why consoles are gaining so much ground in the gaming world. The main reason consoles are so popular is that the hardware never changes. Most people (like myself) don't want to have to go out and buy the latest and greatest graphics card to run a new game.

Stop spreading the FUD. If you don't understand PC gaming, which you clearly don't, don't throw around your baseless opinions. Why do you insist that people need the latest and greatest hardware to run the newest games? Where did you ever get that idea in the first place?

When Crysis came out, I ran it on medium settings and the game worked fine and looked fine. So where was my need for the latest and greatest hardware there? The fact is, people routinely play newer PC games on dated hardware too. T

The reason consoles are gaining so much ground is that no one wants to waste money on the PC. Why spend millions of dollars developing a title when 25% of the user base is going to pirate it anyway?

Not every developer is interested in "spend[ing] millions of dollars on developing a title". If you don't have millions of dollars, the PC can prove more profitable because there's a lot less overhead in obtaining a PC devkit than a console devkit.

Do you really think the dev kit cost is significant alongside code/resources/marketing?

Console makers want to see a "secure facility" and "industry experience" before they'll even talk to a developer. A "secure facility" is at least a leased office, not your basement, attic, or garage. "Industry experience" is either a previous commercial PC title or an internship at a major video game developer in another state. A team of part-time developers with day jobs outside the video game industry is unlikely to have those.

The reason they are gaining ground is that there are a lot of people like me. My computer: P4 3GHz, AGP video card slot, 2 gigs of memory (not sure of the type anymore), etc. I think it is about 5 years old or so. It does all my work without a problem, but it can't play any PC games that have come out in the last few years at any reasonable performance level.

So do I:

A) buy a new computer plus invest all my time to get it up and running with my applications, settings, etc, or

B) buy a console (PS3 and 360 have had good price drops), plug it into my TV, and be playing a game in 10 minutes, saving at least a couple hundred bucks in hardware and a lot more in time.

C) Go to Dell.com, order an Inspiron slimline desktop PC with NVIDIA graphics and no monitor, and plug it into an HDTV. Use this PC only for gaming, video playback, and web surfing; use your existing PC for your work. There are numerous worthwhile games for PC that will never be ported to any console due to console makers' policies against part-time development.

It's a bit old, but the "Call of Duty 2" game had a "high quality" rendering mode that was leaps and bounds above normal quality. Graphically, that is. I'm not sure that it provided any benefit except to look pretty.

Yes, $380 for a video card that provides graphical performance well beyond the capabilities of the PS3, and possibly even the PS3's successor. Or you can actually compare a video card with very similar capabilities to the card in the PS3 - the NVIDIA RSX "Reality Synthesizer", with a 550MHz core and 256 MB of GDDR3 - which would be an NVIDIA GeForce 9400 that you can pick up for about $50.

He's just saying that if you want to experience games as they are on a PS3 or Xbox 360 - in all their low-resolution, non-AA, 30fps glory - you can just buy a $50 card and stick it in your $200 PC. I have quad-SLI GTX 295s because I want my games to look their best and run at 60fps.

You seem to be contradicting yourself here. First you say that you were not talking about image quality, then you go and try to refute his argument that a PC card can cost just $50 by saying that he will not be getting good picture quality.

So with a PC you have a choice: spend a little, like a console (the $50 card), and get the low-quality option, or spend on a $380 card and all the other costs for a really good machine which is much better than the console. So in summary wi

Anandtech http://anandtech.com/video/showdoc.aspx?i=3643 [anandtech.com]: "At the end of the day, with its impressive performance and next-generation feature set, the Radeon HD 5870 kicks off the DirectX 11 generation with a bang and manages to take home the single-GPU performance crown in the process. It's without a doubt the high-end card to get."

Techreport http://techreport.com/articles.x/17618 [techreport.com]: "Well, Sherlock, what do you expect me to say? AMD has succeeded in delivering the first DirectX 11 GPU by some number of months, perhaps more than just a few, depending on how quickly Nvidia can get its DX11 part to market. AMD has also managed to double its graphics and compute performance outright from one generation to the next, while ratcheting up image quality at the same time. The Radeon HD 5870 is the fastest GPU on the planet, with the best visual output, and the most compelling set of features. Yet it's still a mid-sized chip by GPU standards. As a result, the 5870's power draw, noise levels, and GPU temperatures are all admirably low. My one gripe: I wish the board wasn't quite so long, because it may face clearance issues in some enclosures."

"Few people will doubt that PC gaming is in need of a significant shot in the arm with the consistent encroachment of consoles..."

I know I don't count, but I resent the assumption that everyone cares. I don't care. I'd never buy a console to play games other than Wii Sports.

I assume GPUs will get better and better, as will CPUs, and I'll benefit. But I'm still playing StarCraft 1, and I just want a higher-resolution interface for the same screen -- I know people think it affects the balance, but I'd like to see the zerglings when they're a little further away.

I don't think PC gaming needs a shot in the arm. I think it needs well-designed games that stand the test of time.

But it would be nice if the kind of power you can get today for a reasonable price (sub-$1000 PC including graphics) could run cool without fans.

What Nvidia card is this? I have a (stock) passively cooled 7950 GT, but I haven't been able to find any newer cards that come passively cooled stock and aren't junk (i.e., low- to mid-end cards at best). Unless you mean watercooled, but I'd rather stay away from that.

I think it needs well-designed games that stand the test of time.

But it would be nice if the kind of power you can get today for a reasonable price (sub-$1000 PC including graphics) could run cool without fans.

This. 1000 times this.

I have no interest in playing on a platform where I am forced to invest thousands of dollars every couple of years.

I would be interested in buying a fun game that runs on the computer I already have!

Seconded. I played lots of games with my 4670 at 1680x1050. Now that I have two of them, there aren't a ton of games I can't play at that res, even on decent settings (AA, and so on), and I paid less than $100 total for the pair of 'em.

Thirded (is that a word?). I have a fanless 4670. It plays the games I want fine. Not everything is maxed, but I play at 1680x1050 on the monitor fine. I have hooked the computer via HDMI to a 1080p TV and it looked great there as well.

If you do your homework before buying, you're usually better off. With this machine I was going for TV HDMI connectivity, since it doubles as a DVR. So I got a low-power (no extra power plug) 4670, and a fanless one for less noise. It plays my games as well as the 8800 GT I have.

Most of the world has a computer barely capable of 3D graphics. If you want a game like that, look into World of Goo, Braid, Mahjongg, a number of board games, or something similar. There are new games; what in the hell are you bitching about?

If your physical hardware doesn't support 3D, THERE IS NO MAGIC THAT WILL CHANGE THAT! That said, new machines are coming out with better graphics all the time. Even entry level machines have more capable GPUs in them. My main point was that it is cheap as shit to ge

It turns out the card has the same power consumption as the 4870 at load (3DMark), not to mention exceptional idle power. Way to go, ATI. I could have sworn TSMC's 40nm bulk CMOS (no metal gates) would have raised leakage, but this proves me wrong!

My takeaway from the reviews is that it is significantly cheaper than Nvidia's current top-of-the-line single-card solution while offering slightly better performance with a more modest power draw. In another year or two, we'll all be able to play Crysis with all the eye candy turned on. :)

Since you mention it... about a year ago, a friend built a new gaming PC and put a Radeon 4850 (basically one generation older than this) in it with 1GB of VRAM. His "test app" for the new system was Crysis... he enabled DX10, maxed every setting, and played through the whole thing. At 1280x1024 his display isn't exactly super-high-res, but the framerate was high enough to be completely unnoticeable in almost every part of the game. A 4870 (same GPU at a faster clock speed) could probably have handled the whole

Nvidia seems to be between a rock and a hard place. AMD is nudging it out of the limelight in the graphics marketplace, and Intel and AMD are nudging it out of the market for motherboard chipsets... with Intel doing so more aggressively.

A middle-tier ARM SoC provider competing against TI, Freescale, Qualcomm and Samsung for the media player market, with a sideline in high-end compute and graphics boards that exist as a technology testbed for said SoC products?

Yeah, I have to agree: I don't see Nvidia dying anytime soon, but I have to say that (barring some impressive new market), their days of growth are over.

Intel has locked Nvidia completely out of the Intel chipset business, destroying one of Nvidia's major market segments (who buys Nvi

"Yeah, I have to agree: I don't see Nvidia dying anytime soon, but I have to say that (barring some impressive new market), their days of growth are over."

Doubtful. No one other than AMD is able to successfully compete in the graphics market. Also, AMD does not have any GPUs in ultra-mobile devices, and that market is simply enormous; Nvidia is hoping with their next-gen mobile chip to get into everything from phones to portable video players. That is a growth market: high-perf, low-power graphics chips

I've had exactly the opposite: if I ran anything on the GPU, my ATI card machine BSoD'd. I sent the card back twice to the manufacturer, Sapphire. I didn't have problems until about 30s-5 minutes in, and didn't realize it until way past when I could return it to the store (normal VGA tasks - my life at the time was VPN and then remote desktop to virtual machines - worked great; I really do hate crunch time in the computer world). I've planned to check driver stability for months now, but haven't re

PC gaming may or may not need a shot in the arm, but it isn't because of graphics. It's most likely the plug-and-play nature of console games and the dumbing down of controls. Not that WASD is particularly complex, but if you throw in mouse-look... it's basically like rubbing your tummy and patting your head. Not everyone's up to the challenge.

Yes, on consoles we only use one hand... no? Then we use two hands to do the same thing... no? Oh, that's right: we use one hand to control a joystick and the other to press buttons.

I hardly see that as much easier than WASD + mouse-look. On the contrary, even though I started out playing on an N64, I have always found mouse-looking much more intuitive than joystick-looking.

All those reviews, and not one of them that I could find tested a Lynnfield chip to see if the dual 8x PCIe slots get pinned when running DX11 cards in CrossFire. Not one review used the typical midrange computer someone would currently own.

So after all those 'reviews' *cough, advertisements* we still don't know if someone with a Core 2 Duo at 3GHz can even feed that card effectively. No DDR2 systems, no quad-core Core 2 running DDR3... just the usual i7 Extremes, which don't tell typical consumers anything. We don't know, after all those reviews, whether it's even worth buying for a typical machine. ZZZzzzz....

If anyone can find a Core 2 system tested with this new card, let the rest of us who don't own $1000 processors know whether we'd get any benefit...

It's called augmentation. You buy the top-of-the-line CPU/motherboard one generation, then the GPU the next generation. There was a time when those Core 2 Quads were top-notch... and they'll still beat the 920 in a few tests, so nothing to worry about there. On the other hand, the Duos haven't been top-notch since about 2006, so it's hard to say how much that CPU will hold back your new graphics card.

I consider the P55 limitation to be there intentionally, to avoid solid SLI or CrossFire performance. It would still b

They use the high-end i7s to try to isolate the GPU. They aren't testing the CPU but the GPU. If you look at most benchmarks at low settings, even with an i7 the GPUs all look the same. If you turn on AA and such, the GPU gets to shine and the CPU backs off.
Simply put, if you CPU-limit your benchmark then you have not benchmarked your GPUs.

Personally, I think DDR2 vs DDR3 has WAY too many variables, like RAS, CAS, RAS-to-CAS delay, burst memory sends, etc. When I had my brother do this analysis for me (he designs RAM...), he found that in many cases DDR2 was smoking early DDR3 in random access and was only slightly slower in burst. That was until I found a CAS20 DDR3 chip running at 1600MHz in my price range (which sealed the deal for me going with DDR3). Just letting you know, though - if you have fast DDR2, it may be faster than early DDR3.
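
To put rough numbers on the random-access point: first-word latency is just CAS cycles divided by the I/O clock (half the DDR data rate). A quick sketch with illustrative timings (my numbers, not the parent poster's modules):

    #include <stdio.h>

    /* latency_ns = cas_cycles / io_clock_mhz * 1000, where the
       I/O clock is half the DDR data rate. Illustrative timings
       only; check your own modules' SPD data. */
    static double cas_ns(int data_rate_mt, int cas_cycles)
    {
        double io_clock_mhz = data_rate_mt / 2.0;
        return cas_cycles / io_clock_mhz * 1000.0;
    }

    int main(void)
    {
        printf("DDR2-800  CL5: %.2f ns\n", cas_ns(800, 5));   /* 12.50 ns */
        printf("DDR3-1066 CL9: %.2f ns\n", cas_ns(1066, 9));  /* ~16.9 ns - early DDR3 loses */
        printf("DDR3-1600 CL9: %.2f ns\n", cas_ns(1600, 9));  /* 11.25 ns */
        return 0;
    }

Which is exactly the pattern described: mainstream DDR2 beat early DDR3 on latency until DDR3 clocks caught up.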

The review/source contains no information that's even remotely useful to those of us who look for video cards that are quiet, do not reach absurd temperatures (anything above 60C under load is considered absurd; do people realise just how hot 60C is?), and do not have excessive power requirements.

All I've seen after reading the review is a bunch of snapshots lifted from a PowerPoint presentation touting said "technological improvements", and some graphs indicating the card draws fewer watts than competing cards.

Given the size of the HSF (it's full-length -- look at that sucker!), I'm inclined to believe it runs hot. Given the size of the HSF, I'm also inclined to believe the card sounds like a Mack truck barrelling down the highway when under load. Finally, the card has two -- count 'em, two -- PCIe 6-pin power connectors, so it can draw far more than the 75W the slot alone supplies, and God only knows what its amperage requirements are. Then take a look at its price.
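
Do the math (PCIe spec numbers; my arithmetic, assuming the card can actually pull the maximum):

\[ P_{\max} = 75\ \text{W (slot)} + 2 \times 75\ \text{W (6-pin)} = 225\ \text{W}, \qquad 225\ \text{W} / 12\ \text{V} = 18.75\ \text{A}. \]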

I feel like the only one on this planet who cares about the amount of heat hardware emits, the amount of power it draws, and the amount of noise it makes. Instead, it appears that the "i gota haf 50829fps in WoW!!!!1!!" gamers have taken over technological evolution and turned it into what Intel was during the days of the original Pentium 4. Are there others here who have the same reservations about this kind of hardware as I do?

You may just be the only one who still bothers to read the article hoping for useful info.

New video card tech reviews are, almost always, all about vicarious genital measurements. Benchmarks for FPS, raw computational capacity, shader support, etc. all abound - as if it were 1996 and the high-end was still competing for mere adequacy.

It's not that 'the 1337' have taken over the tech evolution, it's that they're the only readers left for those publications as their focus became less relevant for the normal m

The kind of shot in the arm that PC gaming needs isn't at the high end but at the low end. If something better than Intel graphics became common on slimline PCs (as opposed to bulky towers), that would open up the market for gaming on home theater PCs.

The kind of shot in the arm that PC gaming needs isn't at the high end but at the low end. If something better than Intel graphics became common on slimline PCs (as opposed to bulky towers), that would open up the market for gaming on home theater PCs.

The really great news about this card is that it's relatively inexpensive compared to what most top-end cards cost at launch. The 5870 is going for $380 just about everywhere, while typical high-end cards launch closer to $500. I hope this is an indication that prices will drop across the board and therefore affect the low end as well. As far as better graphics getting into SFF PCs, we've long since left the realm of the "sane" when it comes to thermal requirements on decent graphics chips, but if you

Uh-huh. And a burgeoning community of console hackers, software mods, firmware hacks, millions of modchips sold, Nintendo DS flashcarts selling faster than actual Nintendo DS consoles, and a modchip installation store in just about every organized flea market and farmers market in the country prove that this is a problem endemic only to PC gaming.

I must be one of those few people, as I doubt PC gaming needs a shot in the arm. The way I see it, PC gaming has its market and consoles have theirs. For single-seat games it is still (and always will be) the shit, and except for a short period around the release of new consoles it is not lacking in the hardware department. The same way the Wii didn't eat into "real" console sales, I doubt the consoles are eating into PC game sales; what they are doing is being played by a huge market of people who regularly enjoy playing with friends in the same room. It could be argued that PCs lack the software to play multiplayer in the same place (the hardware is there to do it, with emulators), but tbh if you're going to do that you need to plug it into a TV, so it's either expensive (laptop) or pointless (if you have a dedicated gaming box connected to your TV, why not just call it a console?).

If you don't play with local mates -> PC gaming
If you play with local mates -> Console gaming
If you only play with local mates -> Casual gaming

Despite these categories overlapping in terms of both games and players, they do not directly compete much.

If I play with local mates, but I also want to play (and possibly even make) mods, then what?

Those aren't mutually exclusive, you're just in 2 groups.

But are there any games that serve the two groups? Or are you thinking of one set of games to play with local mates and a separate, disjoint set of games to mod and play by myself or with remote mates?

I don't know about you, but I think the "shot in the arm" PC gaming needs is a serious divergence from console gaming in terms of titles, but it needs to take a big cue from console games in terms of fitting game design to the platform at hand.

Here's a useless anecdote: Need for Speed Shift just came out. Yay me, I love Need for Speed. So I bought it for my PC, which has an SLI pair of not-too-terribly-old nVidia graphics cards and should be perfectly capable of playing Shift. Surprise! It doesn't work. Pres