
My significant other has been away a bit lately, and being the wanton reprobate that I am, I immediately took advantage of this unmonitored freedom. Then I got bored of wandering around the shops in my underpants while bellowing David Bowie and Bing Crosby’s Little Drummer Boy/Peace On Earth at pensioners, so I devised some other way to indulge myself. But what?

I had it. The sofa was mine, all mine. The television was mine, all mine. I need no longer be banished to my tiny, airless ‘study’ to play PC games. I lugged my brute of a system (purely in mass, not in power, alas) to the living room, shivered at humiliating recollections of abortive, time-wasting similar attempts past, and set to work.
This time around, it was an immediate success. In the past, I've either had a horribly fuzzy picture because I was trying to feed a high-resolution PC video signal into a standard-definition TV via a composite or S-Video connection, or I've spent ages dicking around with scaling and custom resolutions because the graphics card and the HDTV didn't see eye-to-eye on picture size. Then there was the problem of getting sound from the PC to the TV or amp, and the mouse and keyboard, and the resultant mess of cabling I had to put away every time the PC was restored to its dingy place of origin.

A recent (but cheap) graphics card upgrade to an NVIDIA 560 (non-Ti) had disappointed me by not being up to Serious Sam 3 with all bells and whistles belling and whistling, but its HDMI output entirely made up for the performance shortfall. No delving into the darkest depths of NVIDIA’s drivers to get rid of over or underscan, and as this card also had its own basic onboard soundcard, the HDMI carried an audio signal too: no more cables required. Most non-budget cards of that kind of age (and the preceding generation – so really we’re talking NVIDIA 400 and 500 series and AMD 5x and 6x series cards) should offer similar.

And that, basically, was it. I wot-I-thunked Assassin’s Creed: Revelations this way, with the console port trend of supporting 360 pads out of the box making controls a cinch, and it looked absolutely magnificent. The prettiness of PC, the scale of console.

I happen to have a console copy of the game around too, and even sat back a few feet on a sofa, the addition of ramped-up antialiasing, anisotropic filtering, ambient occlusion and all the other PC version pimp-ups running at 1080p, made for a game that looked dramatically better on computer. I’ve also been playing Skyrim this way, with a bunch of tweaks and mods applied, and I’m pretty sure any console-only Skyrim player would weep at the sight of it looking like this on a 40″ screen.

That’s the full and only strength of my point, really. Just wanted to recommend you try it if you haven’t already. It’s been a possibility, one way or another, for years now, but it feels like the tech’s finally reached the sweet spot wherein it’s as easy as connecting any old set-top box or console to your HDTV.

A few things to watch out for:

Fonts will seem squinty-small at 1080p, especially if your sofa’s a fair way back. You can bump up the Windows font size to 125 or 150% without altering overall resolution in Control Panel – Display.
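To put a number on the squintiness, here's a back-of-envelope sketch of how large one pixel appears from the sofa versus at a desk. The screen sizes and viewing distances below are my own illustrative assumptions, not figures from the article:

```python
import math

def pixel_arcminutes(diagonal_in, width_px, height_px, distance_m):
    """Angular size of one pixel, in arcminutes, at a given viewing distance."""
    aspect = width_px / height_px
    # physical panel width derived from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_m = (width_in * 0.0254) / width_px
    return math.degrees(math.atan2(pixel_m, distance_m)) * 60

# a 40" 1080p TV from a 3m sofa vs a 22" 1680x1050 monitor from 60cm
tv = pixel_arcminutes(40, 1920, 1080, 3.0)
monitor = pixel_arcminutes(22, 1680, 1050, 0.6)
print(f"TV pixel: {tv:.2f} arcmin, monitor pixel: {monitor:.2f} arcmin")
```

On those assumptions each pixel looks roughly a third the size it does at a desk, which is exactly why the 125–150% bump helps.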

Only more recent (and often pricier) graphics cards have HDMI outputs. And not many tellies have DVI, which may mean you'll end up having to use VGA, which offers lesser picture quality and a greater risk of resolution irritations. So, your best bet is to pick up a DVI to HDMI adaptor and use that. Here's the frankly overpriced Amazon offering, but with a bit of searching you should find somewhere that has 'em for only a couple of quid/dollars. Bear in mind this won't magically add audio to the output though – that only happens on cards that natively have an HDMI output and their own sound chip onboard. So you'll still need a separate cable from your standard sound output to the TV or amplifier.

If you're getting over- or underscan problems – where the picture just isn't fitting right on the screen – have a look in your graphics card's driver control panel, which has options for tinkering. This may mean you end up running Windows at weird resolutions, like 1920×996, and in turn some games may not support this. PowerStrip is another option for precise screen size/resolution tweaking.
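If you're wondering where resolutions like that come from: the driver shrinks the desktop by roughly the TV's overscan percentage. A quick sketch – the 2.5% figure is just an illustrative assumption, and some drivers (as with the 1920×996 above) only compensate on one axis:

```python
def underscanned(width, height, overscan_pct):
    """Desktop resolution after the driver shrinks the picture to
    compensate for a TV that overscans by overscan_pct percent."""
    scale = 1 - overscan_pct / 100
    # drivers typically snap to even pixel counts
    return (round(width * scale) // 2 * 2, round(height * scale) // 2 * 2)

print(underscanned(1920, 1080, 2.5))  # -> (1872, 1052)
```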

A wireless keyboard and/or mouse are the neatest and most practical way of controlling your setup given the likely distance between you and your PC, but if you don’t want to buy new hardware you can pick up USB extension cables cheaply that will help stretch your wired keyboard and mouse as far as required.

If the PC can’t reach its router from its new living room position, you can try wi-fi – again, cheap USB adaptors are easy to come by. I’m personally a fan of Powerline Networking though – it generally offers a more stable, faster connection than wireless. This cheapie should do the job.

Oh, and you might want to buy or create some sort of lap tray to balance your mouse and keyboard on. You will of course look like a toothless old person about to tuck into their watery TV dinner, but it’s much better than trying to waggle the mouse across your thigh or the arm of the sofa.

133 Comments

I originally hooked my graphics card (GTX 480) up to my normal 22″ LCD monitor with an HDMI Cable. Doing so incurred a framerate penalty of about 10 FPS under all circumstances. I run with VSYNC on because I can’t stand tearing and the best I ever got was 50 FPS over HDMI, even in games in which I’m normally capped at 60.

I’m not sure if this is a function of the transmission format (HDMI vs. DVI) or the display itself and its limitations regarding receiving HDMI signal. Did you experience any kind of framerate degradation going over HDMI?

Assuming you're using a PAL/SECAM HDTV: those standards top out at 50Hz. If you're running with VSYNC on with a Euro TV, you'd be limited to that 50Hz refresh rate. That's my guess, at least. I don't know of any performance penalty in using an HDMI out on a graphics card.
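For what it's worth, that guess is consistent with how double-buffered vsync behaves: the display's reported refresh rate caps the frame rate, and missing a refresh drops you to an integer fraction of it. A crude model (ignoring triple buffering and variable frame times):

```python
import math

def vsync_cap(render_fps, refresh_hz):
    """Crude model of double-buffered vsync: each finished frame waits
    for the next vertical blank, so the effective rate is the refresh
    rate divided by how many refresh intervals a frame takes."""
    intervals = math.ceil((1.0 / render_fps) / (1.0 / refresh_hz))
    return refresh_hz / intervals

print(vsync_cap(90, 50))  # GPU can do 90, display reports 50Hz -> locked to 50.0
print(vsync_cap(40, 50))  # each frame just misses a refresh -> halves to 25.0
```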

I think this will be the wave of the future. I did this when the power supply of my old monitor finally burned out and I wasn't sure what to replace it with – I was thinking about a 3D monitor, but wasn't sure which were out there. Anyway, it works great plugged into my TV.

Yes, it did cross my mind that I haven’t done that since I owned a Vic-20. Ah, good times. The more things change…

On topic: I use HDMI, and I think it looks great, plus it's convenient having sound and video through the same cable. With full HDMI, the TV goes into an alternate video mode when I run video full screen, which looks a lot better than the text mode the computer uses otherwise.

The problem with HDMI is that the computer won't send a BIOS signal through the HDMI cable (I think the reason is the HDMI signal does not support 640×480), so for adjusting fan speeds and overclocking, I had to get a separate DVI -> VGA cable just to access the BIOS. Comparing the two, HDMI looks generally better, I think, at least on my setup.

While towers fit great under the desk, they kind of stick out in the living room. But playing games and surfing the internet (Google Chrome comes highly recommended for its zoom feature) on a recliner with the mouse on a side-table works great, so I think we will see it more and more, and I wonder what will be the shape of PCs to come.

I believe whether or not you can see your BIOS over an HDMI cord is dependent upon your motherboard. I recently built a new PC and noticed in some of the reviews the comment about being able to see the BIOS, but with my Gigabyte motherboard and HD 6850 connected with an HDMI cord to my 26″ TV, I've been able to do anything I need within my BIOS.

Euro TVs are perfectly capable of accepting and displaying 60Hz signals just as well as 50Hz ones; just because that's the broadcast television frequency doesn't mean televisions are locked to that and that alone. Blu-rays, games consoles (e.g. Xbox 360), region-free DVD players etc. all play on "PAL/SECAM" televisions at 60Hz with no issues whatsoever (and at 24Hz too).

Moreover, it's been like that for years – I bought CRTs in the 1980s that supported 60Hz, and every set since. If anything, it's when I've been in the USA that I've had problems, because the TVs there would only work at 60Hz and that alone.

What is a problem though is Microsoft.

The problem is that they mandated that HDTV drivers must use the default refresh rate in the EDID data the HDTV returns to the computer when 3D gaming – and in the case of European displays that's 50Hz, though not on the desktop. Furthermore, you cannot override it in the driver: you can only set the desktop refresh rate to 60Hz, and as soon as 3D kicks in it's back to 50Hz.

My gaming PC has a 37-inch Pioneer Elite LCD plugged into it and I use it for my gaming via HDMI connection — as well as two 30-inch monitors for when I’m working.

Fortunately some games (such as the CoD series) let you select refresh rate as well as resolution, which fixes the problem. Some people play 3D games in a window, which means the 60Hz desktop refresh stays.

I use ATI Tray Tools and force my PC to always output 60Hz, which gets round Microsoft's stupid limitation that 3D games must only obey the default EDID mode and not the other supported modes listed in the EDID.

I run my daily setup on my desk through an HDMI connection and it shows the boot screen just fine. I'm using an Nvidia 560Ti, but the biggest issue might be the motherboard: I'm using one of those newer boards with UEFI, the replacement for the traditional BIOS.

Ok, you need one of these particular thumb-trackballs to solve all your mouse tray problems. You can set this thing comfortably on your stomach while reclining with more-than-questionable posture. The wireless range is pretty good (better than any of the wireless keyboards I’ve tried, anyway…I’ve gone back to a wired keyboard and this mouse — might need one of those USB extender-thingies you mentioned!)

I have to agree. Since my impending divorce, I've become quite acquainted with my PC being on the floor. The M570 by Logitech has served me well! Granted, I'm still gaming on my monitor (23″ Samsung PX2370), but considering the arrangement, I'm more at a television posture/distance.

Now if only they would make a wireless G13, I’d wager I’d be a VERY happy man.

Been using my 46-inch LCD as a monitor for almost 3 years. I've thought about going the trackball route, but for me it's a middle ground I don't need. I use a 360 controller and an Xpadder profile for the vast majority of my Windows control, and my mouse-pad is clipped to a clipboard with the mouse cord strung through the top of the clip. It's all wired, plugged into a powered 4-port USB hub and daisy-chained on a 10 ft USB extension cable. A 360 controller is less accurate but more convenient than even a wireless trackball. For me, a trackball takes up more space and is less versatile in use than a controller, and I would probably still use a mouse for FPS and other games where I need tight control, as my skill with a mouse far outweighs my skill with a trackball; so for me, it's kind of a useless middle ground.

I have friends who use this, and if it works for you, go for it; but I doubt I’m the only one who prefers controller and mouse to trackball.


I actually have my PC hooked up to my TV, but it’s a bit different. I have a 24-inch TV in my room that I use as my monitor. I have a gaming chair in front of it. I have the keyboard on my lap. And I have a stack of books with a mousepad on top, for obvious reasons. It’s awesome even though it’s kinda hard to get up from the chair.

OK, which new cards don't come with HDMI outputs? My ancient GeForce 8600GT had two of those. My new GT440 – not a gaming card, I know, but my finances were really thin after I bought a new car – has one of those ports as well. What is your new card, Alec?
On the other hand, this weekend I will try it, and I hope it looks fantastic compared to my 17″ monitor.

I've only got a laptop able to play games at the moment, so I'll second this, as I have to use a pad for pretty much everything. Couldn't believe how easy it was – just stick in an HDMI cable and you're away – although sadly I can't max out the res.

I know they can’t be used on existing accounts, only to create new trial accounts.

If you want one though I can hook you up – just drop me a PM on the RPS forums (username MiniMatt there too) so I don’t have to publish an email addy and see it plagued forever more with penis extension pills and the financial affairs of sub-saharan royalty.

I've used Powerline networking for about five years now – definitely recommend it, and it's far easier than having to faff around with multiple wifi signals/configurations from my Xbox, PC and whatever else I start plugging in.

The only issue I've had in five years came this summer, during one of the hottest days of the year: one of my plugs seemed to overheat and freeze on me while watching streaming video. I was going to buy some new ones (I've had the same set for the full five years, to be fair, and I know a few people who change their wireless routers more often than that), but after making sure the area they were in was better ventilated, they haven't done it since.

I have the same powerline adapters posted and they are AMAZING. I've gone from a stingy one bar of wireless to ethernet-esque bliss. My flatmates tend to steal them if I go away, too, as they leave their wireless connections in the dust.
Best investment I've made in a number of years! (I live on the 2nd floor of a thick-walled student house.)

We've actually got something like this set up, too. Only ours is done through an HDMI switch in our study, which then leads to our fancy 46″ TV through a 20m HDMI cable I've led through our walls.
It’s mainly used for guffawing at those old fashioned predetermined game thingies, though: moving pictures, I think they’re called.

I did have a lovely experience playing Bastion and BioShock 2 with a 360 gamepad that way, though. And watching my wife occasionally trying to competitively play FPSes against people using proper control methods is always fun, be it through her failure or success.

I’m considering this. Thinking about getting a home theatre receiver with lots of in-ports; HDMI and others. So all sources (decoder, DVD, computer) go to that, and then there is only one cable from it to the TV, and all audio goes from it to speakers. Instead of my current setup where there is a baffling number of different cables going between sources, TV and stereo.

Eagerly awaiting the rumoured Steam update which handles gaming on a second TV screen – adjusting the resolution automatically, piping the game to that screen and so on. Last time I had to fiddle around with all that manually.

I really want to experience BioShock Infinite on a big screen with full surround sound. It remains to be seen whether I'll have to use an Xbox controller, or whether I can get a wireless mouse/keyboard positioned so I can use them on the sofa without hunching over or keeping my arms in an awkward position.

I love having PC games available on my telly, and having the telly wired up to my speakers for fab audio. I used to do it with separate VGA and audio cables (and in my previous apartment these ran out of my office window and back in through the games room window, so I had to get super-long ones), but a similar upgrade to my PC (ATI 6770) has allowed me to ditch them in favour of a single HDMI cable, with a nice boost in picture quality as a result. I've now taken to keeping a wireless mouse and keyboard next to the sofa.

The only downside I've encountered – and it's one that Alec mentions – is that some games have teeny tiny text. Civ 4 is the chief offender for me in this regard. It's lovely to play strategy games from the comfort of a sofa, but I feel like I need opera glasses to read some of the text. I had expected Civ's font sizes to be exposed somewhere in its mass of XML config files, but it wasn't to be.

I used to be just Batman on Kotaku cause of a dumb joke. Someone asked a rhetorical question “who are you to say ___?” So I switched my screen name to Batman and posted the pic. Then people kept asking questions to Batman and I started joking that I’m not Batman, but I play one on the Internet. You know, rephrased that ancient joke? Anyone? *Crickets*

You just need a proper armrest or large enough cushion. That way you can use your regular mousepad and a position as if laid back on your office chair. All our LAN sessions are like that, 3 guys on couches and one in the chair.

Oh, but this one does show up, to make me look silly.
Very well, I’ll paraphrase my initial post:

We’ve actually got something similar set up, permanently, while maintaining our usual PC screens and location. I’ve set up an HDMI switcher in the “study” for us both, which feeds into a 20m cable, which leads to our fancy 46″ LED TV, through a wall.
We use it mainly to watch old fashioned predetermined games, though: moving pictures, I think they’re called.

I did have a lovely experience playing BioShock 2 and Bastion with a 360 'pad there, though. And it's always fun to see my wife occasionally trying to competitively manshoot online that way, against people with proper control methods, be it through her success or failure.

Then I tried to paraphrase my post as a reply to this one, and it still won’t show up. I’d decided to ctrl-C it in advance and retry, but now it just says it’s a duplicate comment, without the original being there.

My system is hooked up to a smallish HD TV as standard, although bizarrely I have to use either the VGA or a DVI-HDMI converter. Hooking things up directly via the HDMI out results in the TV getting all confused when I try to feed it its native resolution (1440×900). It's all very bizarre, as the input used on the TV doesn't change, and my card claims it's outputting 1440×900 via HDMI just fine.

I have my PC connected to a 40″ as well. Skyrim supports a 360 controller natively (it’s in the options) or you can use a Dualshock 3 and Motioninjoy to spoof a 360 controller, which is what I am doing. Works great!

After reading some of the comments, I'm planning on investigating Powerline Networking for my parents, whose 3-storey house (with the modem, router and desktop in the basement) isn't particularly conducive to wi-fi.

Does this PLN stuff turn your entire house into a big router? That is, can I have two PLN adapters on separate outlets going to different devices? Or if you plug your modem directly into the wall, do all of the outlets become basically one connection?

It doesn't turn it into a big router; it just means the ring main is one big patch cable. You plug one end in next to the router, and then you can use any plug socket to connect – or you can buy extra homeplugs to add more.

I’ve been using a 32″ 1080p TV as my main monitor for a while. My old 4870 only had DVI out, but had a built-in sound card and came with a DVI -> HDMI adapter that added the sound by magic (or clever wiring).

The major malfunction was AMD's always-awful drivers never remembering overscan settings, often even between alt-tabs, which was solved by this little tray program: link to forums.amd.com
And by it occasionally deciding that 1920×1080 meant interlaced and not progressive, which was fixed by force-adding 1080p as an option in the relevant window (despite the fact it would already use it in most places).

But since I’ve upgraded to a 560ti everything has been lovely and hassle-free.

Been doing this for a year now (with the added bonus of 7.1 surround). My computer’s set up near the TV too though so if the girlfriend wants to watch telly a quick press of Win+P and I can go back to the monitor & hunch over the keyboard with my headphones on until XFactor is over.

Got XBMC running off it too – my XBox has been all but retired since setting this up.

I recently bought a 570-based system specifically for the sitting room. With PC games looking far better than console games (I have a PS3 and 360), I thought it was time to dive back into the world of drivers/patches/crashes etc. ;-) My 42″ Panny plasma really makes Skyrim look epic, and together with the home cinema it really is immersive (although the g/f insists on wearing headphones when she is about :-( ). Until the next gen of consoles shows up, this is my platform of choice.

I’ve had my PC connected to the TV for the past 2 years, now. At first it was a bother to get sound to come out on the TV due to some Nvidia bug, but now, with Windows 7, a spiffy Pioneer amp and a 5.1 speaker system those sound problems are a thing of the past.

Also, if you don't like the unsightliness of a PC desk smack dab in the middle of your living room, you can get a closed PC desk/cabinet. I have a corner one from The Swedish Furniture Store.

I can guarantee the only people who care about the differences between the PC and console versions of a game are PC owners who feel the need to justify why they have to wait an extra month to play a sloppy port. I speak purely as a PC gamer. "Significantly better" is in the eye of the beholder – or in the mind, rather. I'm sure it looks better, but nothing anyone would get excited about.

The only real advantage PC games have are mods. And price of software (as long as you never buy new games on Steam or in a physical retailer). And spending hours trying to troubleshoot driver and hardware problems, if that’s your kind of thing.

The games already look great on consoles. On PC they look the same with some minor extras. I hate PC elitism, which is rife even amongst PC gamers: 'Oh, I'm surprised you can even load the game with that "rig"! You need to spend £3000 like I did, because I can't cope with the reality that there was literally no need to spend that kind of money to get a good experience. My sense of self-worth is tied directly to my PC specs.'

PC Games could, should, and would look better if publishers weren’t too busy pandering to the console audience. A lot of the frustration with the current video game market comes from the fact that consoles are holding EVERYONE back, being restricted to technology that is nearly a decade old.

Sorry, but that's a bunch of bollocks. Even in games that have NO ADDITIONAL PC FEATURES WHATSOEVER, the difference in base settings, resolution, texture resolution and FPS should be noticeable to most people. Most console games run at 720p or below, often with any kind of anti-aliasing disabled just to run "stably", and are simply upscaled to your TV's native res. Here is a list of almost all Xbox 360 and PlayStation 3 games and what "settings" they actually run at: link to forum.beyond3d.com (second and third posts)

Also, most of the powers that be consider 30 FPS a "stable" framerate to aim for when optimising and scaling everything down, with some games even dipping below that. It's rarer to see developers go for 60 FPS, as with games like RAGE.

And that’s not even mentioning things like loading times and view distances…

Gaming PCs don't cost $3000 unless you are buying from a boutique store. This is the same tired troll from console gamers we've heard for ten years – as if their entire self-worth weren't tied to an arbitrary number called a game score.

Who are you trying to kid?
Current-gen console games look fugly compared to games on PC (unless you are playing at low resolutions), and yes, I do have a console that I fire up occasionally. Anyway, it's pointless debating this with some people.

I have my PC hooked up to a 46″ TV. Here's the peculiarity of it: they are not in the same room. The PC is in the workroom (which is really a gaming room, to be honest) and the TV is in the living room (which is, I guess, for living?).

Fortunately, these two rooms are separated only by a single wall. A thick, load-bearing wall. So I drilled a hole in it (a mean task which caused a lot of frustration, as it is, yes, load-bearing and not easy to drill through). Then I ran LAN, HDMI and USB cables through that hole. HDMI to send the picture from my computer to the TV, obviously. USB is there to hook up a keyboard and a mouse in the living room. And the ethernet cable is there because, hey, I'm already putting cables through walls, so why the fuck not. The HDMI cable is currently connected to the DVI output on my graphics card, because the only other output is a mini-HDMI port and I never thought to check before buying the cable – though HDMI to mini-HDMI connectors do exist, I'm told.

All these cables are 10m long.

Let me tell you something. A ten-metre USB extension cord is not your friend. It hates you. (The USB 2.0 spec only allows about 5m per cable segment, so this is hardly surprising.) The signal degrades A LOT, and my Razer Lachesis attached at the end of it has its signal picked up by the computer about 50% of the time. Not fun at all. I still haven't tried putting a self-powered USB hub at the end of the extension cord and then plugging the mouse into that. It might create a more stable signal.

Other than this glitch, the setup works perfectly. I'll be able to game easily on it if the self-powered hub helps. As it is, ignoring the slight annoyance from USB signal degradation, playing movies is a breeze.

I sort of do this, except I use a 32″ TV as a monitor on my desk. I have both that and a 21″ monitor. The latter is used mostly for computery things, while the TV is used for videos and gaming primarily. I’ve been a cable cutter for a long, long time, and I live alone, so I decided there clearly wasn’t really any need for my TV to not be sitting on my desk.

The only problem is, I never actually use my couch, so I completely miss out on that benefit. I mean, why would I want to be further from the screen? Still, enjoying some Skyrim or BF3 on a 32″ screen on my desk is sure fun. I don't have to worry about fonts being too small, either.

This is exactly my situation, after cutting the cable I don’t even use the regular monitor anymore.

I built a custom desk so I can adjust to different distances, and the 42″ TV is now the main monitor. I plan to experiment with gaming in 3D for games like Batman. I remember playing Thief II: The Metal Age in 3D was quite nice back in the CRT days.

Yup, I’ve tried it myself and set it up for three family members’ comps all in the last year. I guess things are cheaper and the effects of that LCD price-fixing thing are fading.

For myself, I just got an HDMI/DVI cable from AmazonBasics (not an adapter; there’s a difference as johnpeat mentions above) to set up a second monitor last week. That was also a breeze — plug-and-play. The second monitor was 4″ larger and $30 less than the first (which I got bundled with a computer only a year ago).

RE: Overscan. You may have to adjust your television. There'll be a menu option for picture size, which does things like 4:3, 16:9, stretch, etc. There should also be an option called something like "overscan" or "just scan" – switch to that. It should be a mode that maps pixels to the actual resolution of the screen and doesn't stretch the image.

I had my PC upstairs in a normal desk setting, but via the magic of a 10m HDMI cable and an 8m USB cable (2×4m with a powered hub in the middle – cabling fans?), it was also on the TV downstairs with an Xbox 360 wireless receiver and a wireless keyboard/mouse combo.

I could play either upstairs or downstairs; I had a PC wireless Xbox controller and played Just Cause 2 in glorious 42″ full detail… Now I've moved and can't be bothered to do the cabling again – too many doorways.

I have a TV next to my gaming desk, and recently discovered as well the joys of HDMI cables for PCs. After years of wonky “dual screens” setup, it’s really refreshing that it “just works”.

However, I can't really play regular PC games on it. Text is hard to read if I'm not next to the screen, and I can't really have a proper setup for my keyboard and mouse. So the only games I play on it are mostly console ports (or at least gamepad-friendly). I recently played Sonic Generations exclusively in this setup, and it was really pleasant :P

The only thing that would rekindle my interest in playing PC games on TV would be that rumoured SteamTV set top box but other than that the consoles do the job fine for the kind of games I want to play on TV.

If Microsoft beat Valve to the punch and you could use the Xbox 360 as an output for your PC games, that would be a genuinely useful feature, but I can't imagine they have the foresight to do it.

I've done this a few times, mostly with games like Assassin's Creed that work better with a pad. That said, I found my 44″ living room TV to effectively look smaller than my 30″ monitor, due to how far from the TV I sat. So I wasn't really tempted to do it that often.

Also my “office” chair is just as comfy as my couch. I think a nice $200 office chair should be a requirement for PC gamers!

It's funny that you write this only days after my Black Friday computer parts arrived, and I had hooked the new beast up to a 48″ Samsung I got only days before.

I experienced literally all of the issues you outlined. I had to find a DVI to VGA connector, since my video card does not have HDMI, and I simply brought my computer speakers out, since I didn't have an audio connection.

My mouse is plugged into the front of my computer since it's not wireless, and I had to make my screen 125% larger in the settings – and I still have to zoom in on most websites to be able to read them.

I recently managed to do this too, only I had a bit of an extra step to take.

My standard setup is dual monitor, which uses both the DVI outputs of my GTX 460. I used a DVI switch to allow me to switch between having a second monitor and a TV. It worked a treat, even though I was using the switch in reverse.

The really, really nice bit is that games start up on my main monitor when in dual-monitor mode, and on my TV when that's plugged in and my main monitor becomes the second monitor. And Windows 7 remembers and implements each setup for me :) The only trouble is with icons and desktop gadgets not repositioning themselves properly.