So I want to connect my PC to the TV, but I want audio from my home theater speakers. How can I do this?

I'm using a GTX 260 with an SPDIF cable connected to the mobo so it outputs audio. The GTX 260 has two DVI ports, so can I put DVI-to-HDMI adapters on them and connect one HDMI to the TV and the other to the receiver?

That will work just fine. However, knowing what you have for a receiver and TV may offer other options...

Thanks! I have a Pioneer VSX-920k receiver, and I'm not exactly sure on the TV model, but it's an older Sony Bravia XBR9.

Great, the 920 has HDMI, so why not just run your DVI-to-HDMI adapter from the PC to the receiver, then HDMI out to the TV? Of course, you will still need to keep the SPDIF connected to the receiver as well... but your way works just as well.

If you look at my gallery, there is a schematic at the top showing that I have my HTPC connected the same way you proposed: DVI-to-HDMI adapter from the PC directly to the TV, then SPDIF out to the receiver.

Using my PC right now and the speakers work great. But for some reason, every time I play a 1080p file (even a YouTube stream), the monitor turns purple. Hope it isn't the receiver.

DVI does not carry audio, so he would need that going to the TV and the digital coax going to the back of the receiver.

I'm not sure if Nvidia offers any special DVI-to-HDMI adapters that handle sound, but ATI does. (Ordinary DVI-to-HDMI adapters don't.) If your card supports such an adapter, it may also support LPCM over HDMI, which will give you lossless audio to the receiver--though not HD bitstreaming. *ONLY* in that situation would I recommend DVI/HDMI from your current card to the AVR; otherwise, keep it SPDIF to the AVR and run DVI/HDMI direct to the TV.

I suggest you get a better video card--one that will support HD bitstreaming and maybe even 3D (both of which your Pioneer will handle), though video cards with full 3D support are rather expensive right now and are generally limited to AMD/ATI's 6000 series. AMD's 5000 cards (like my 5450) will do HD bitstreaming, and have an intermediate 3D solution that works with most TVs; I believe the latest Nvidia cards will do the same thing. Even in that case, however, since your Pioneer has 3D passthrough, I would only run HDMI from PC to AVR, then from AVR to TV; that's how most HDMI AVRs are designed. (Dual HDMI outputs are usually only for older AVRs that can't handle 3D signals.)

Assuming you're stuck with SPDIF to your AVR (more than likely), I'd run the DVI/HDMI adapter straight to the TV, so the video card can read your TV's EDID info directly and use that for proper configuration.

More than likely, your TV's maximum resolution is 720p or something close to it; it won't handle 1080p, but your AVR isn't conveying that to the video card.
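As an aside, the "proper configuration" the card derives comes from the display's EDID: the first 128-byte block advertises the manufacturer and the preferred video mode, and if the AVR sits in the middle, the card parses the AVR's EDID rather than the TV's. Here's a minimal Python sketch of how those fields are packed, using synthetic example bytes rather than data read from real hardware:

```python
def parse_edid(edid: bytes):
    """Pull the manufacturer ID and preferred resolution out of a base EDID block."""
    # Every EDID block begins with the fixed 8-byte header 00 FF FF FF FF FF FF 00.
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not an EDID block")
    # Bytes 8-9: manufacturer ID, three 5-bit letters (A=1) packed big-endian.
    word = (edid[8] << 8) | edid[9]
    mfg = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    # Bytes 54-71: first detailed timing descriptor, which holds the preferred mode.
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # horizontal pixels
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)   # vertical lines
    return mfg, h_active, v_active

# Synthetic 128-byte EDID: manufacturer "SNY", preferred mode 1280x720.
edid = bytearray(128)
edid[0:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
edid[8], edid[9] = 0x4D, 0xD9                    # "SNY"
edid[56], edid[58] = 0x00, 0x50                  # 1280 horizontal active
edid[59], edid[61] = 0xD0, 0x20                  # 720 vertical active
print(parse_edid(bytes(edid)))                   # → ('SNY', 1280, 720)
```

This is why connecting the adapter straight to the TV fixes the mismatch: the card then reads the TV's own preferred mode instead of whatever the receiver reports.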

I'm using a 1080p Sony Bravia, and I'm connecting the DVI-to-HDMI adapter to the receiver. The SPDIF cable is connected to the mobo and GPU so it outputs sound over the DVI-to-HDMI. Maybe I should just buy an optical/digital cable for sound to the receiver and connect the DVI-to-HDMI to the TV?? I'm a noob at this stuff :P

If you are able to connect both video and audio from your PC to your TV and home theater system in this fashion, you can then use your PC's internet access to watch online content, or play stored images and video, on your TV and listen to the audio through either your TV speakers or your home theater speakers.

The Nvidia 430, 450, and I believe 460 cards are all HDMI 1.4, and they all bitstream as well. If you're not a gamer, you can get a 1GB GT430 for under $70 at Newegg. I'm doing the same thing: I run my computer to my receiver (via HDMI) and my receiver to my TV (also via HDMI).

Hi, can someone please guide me through this? I have a problem with my BD-ROM.

I recently purchased an LG BH10 Blu-ray drive and a copy of Inception.

I think my hardware is up to the task of playing movies(?).

CPU: i7 950
GPU: GTX470

I got my rig hooked up through the HDMI port of my GPU.

Now the problem is that my disc isn't detected. Whenever I try to play it in PowerDVD 9, it says there's no disc in the drive. I don't have any other BD movies to test with, since this is the only copy I have. One additional detail for my case: whenever I insert a blank Blu-ray disc, it is detected.

I'd appreciate it if anyone could help me get through this. I was planning to watch this movie with the family during the holidays.

Thanks in advance and sorry for probably derailing the topic with my issue.

Your hardware is MORE than enough to play Blu-rays. I have to ask: when you say the disc isn't recognized by PDVD, what about Windows itself?

Well, that was weird... I did multiple reboots and it didn't want to work. Then my rig turned off suddenly a while ago, and after turning it on again, it's suddenly working. It finally detected the disc.

EDIT:

Is there some possible scenario where Daemon Tools is interfering with my drive? I'm running an old version of it.

You must connect the cables so they match the colors on the jacks located on the subwoofer. After that, connect the remaining cables, such as power, subwoofer, and speakers. It's very easy if you pay attention to the subwoofer connections and follow the polarity written on the connectors. In the end, all the cables should be connected correctly.

(posting this here because someone thought it would be smart to not let new posters make threads....sry)

So for Christmas I received a laptop with a Blu-ray player, Avatar and How to Train Your Dragon on Blu-ray, and an HDMI cable.

We currently have a 720p TV in our family room, and I was going to watch a movie, but it won't let me. When it's plugged into the TV, it says it can't play in the current display mode, and I can't find anywhere in my player to change it.

I am using an HP laptop with its built-in Blu-ray player.

Any help would be much appreciated.

You need to take the player off 1080p and put it on 720p/1080i. The player should have a menu for picture settings.

You're going to have to look in the graphics section of the laptop for video-out settings. There should be something under HDMI output where you can tell it to output only 720p, or whatever the max is that your TV can support. Also make sure it's not outputting 24 fps if your system supports that but your TV doesn't.
Remember: don't look in the player, look in the video/graphics card settings.