Now that we've walked through installing all our hardware in the previous article, we're ready to install our operating system and configure it to get the most out of Media Center. As I mentioned before, I was originally planning to do this build with Windows 8 Professional and the Windows 8 Media Center Pack, but there are a few things Microsoft has removed from Media Center in the new version that make it a non-starter for a dedicated media center machine, in my opinion. Not being able to boot directly into Media Center (you have to boot into Metro first and then launch Media Center) is a deal breaker for me, so I fell back to what's been working great for me these past few years: Windows 7 Home Premium.

Before we even start the operating system install, there are a few settings you'll want to configure in your BIOS/UEFI for the best system performance and stability.

First, ensure that your hard drive/SATA controllers are in "AHCI Mode" as opposed to IDE or Legacy IDE mode. AHCI stands for "Advanced Host Controller Interface" and offers features and performance improvements over the old IDE interface, such as hot swapping of drives and Native Command Queuing (NCQ).
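One gotcha worth flagging: if Windows 7 was originally installed with the controller in IDE mode, simply flipping the BIOS/UEFI over to AHCI afterward usually produces a STOP 0x7B blue screen at boot. A common workaround (the standard Windows 7 procedure, not something specific to this build) is to enable the Microsoft AHCI driver in the registry before changing the BIOS setting; a sketch:

```shell
:: Run from an elevated Command Prompt BEFORE switching the BIOS/UEFI to AHCI.
:: Setting the msahci service's Start value to 0 tells Windows 7 to load the
:: Microsoft AHCI driver at boot, avoiding the STOP 0x7B blue screen.
reg add "HKLM\SYSTEM\CurrentControlSet\services\msahci" /v Start /t REG_DWORD /d 0 /f
```

After rebooting into the BIOS and switching the controller to AHCI, Windows should pick up the new driver on the next boot. If you set AHCI mode before installing Windows, as described above, none of this is necessary.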

Some motherboards can detect whether a mouse and/or keyboard is connected and will stop the boot process if they don't see one. Since we'll likely not be running the media center with a mouse or keyboard attached, make sure to disable this check.

Set the primary hard drive (your SSD, or the big spindle drive if you don't have an SSD) as the first boot device. You may want to temporarily set the CD/DVD drive as the first boot device to complete your Windows installation, then go back in and change the first boot device back to your primary hard drive.

There was a time in the not-too-distant past when having a cable or satellite TV subscription was just a given. Like water, phone, or electricity, if you wanted to watch anything other than a few local networks or crazy UHF stations, you had to pay your local cable/satellite conglomerate a tidy little sum to pump the channels into your TV.

That's where I was back in January of 2010. Staring at a $150 bill from Time Warner Cable for the "Basic Package + HD" and a pair of TiVos, I began to wonder if I was just wasting money, since 80% of our regular viewing consisted of a dozen or so shows scattered across only four or five channels. Within a month, and after some deliberation, we decided we'd try to 'cut the cord,' and since that time I've happily saved nearly $5,000 that would otherwise have been lining the pockets of Time Warner and TiVo executives. Ponder that for a moment: $5,000 spent on television. Even after I factor out the cost of the hardware I needed to buy and set up, that's enough money to buy a new big-screen TV every year and then some.
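For the curious, the "nearly $5,000" figure works out with simple back-of-the-envelope math. The month count and hardware cost below are assumptions for illustration; the article itself only gives the $150 monthly bill and the rough total:

```shell
# Back-of-the-envelope cord-cutting savings. Assumes the $150/month bill
# stays flat and roughly 33 months elapse after cancelling (both assumptions).
monthly=150
months=33
gross=$((monthly * months))   # 4950 -- the "nearly $5,000"
hardware=600                  # hypothetical one-time HTPC/antenna spend
net=$((gross - hardware))
echo "gross savings: \$${gross}, net after hardware: \$${net}"
```

Your own numbers will differ, but the point stands: a triple-digit monthly bill compounds into serious money over just a few years.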

Regardless of what the big cable and satellite companies say, between 2008 and 2011, 2.65 million households dropped their cable/satellite subscriptions. A recent survey found that 9 percent of respondents had cancelled their cable subscriptions in the last year, and Time Warner Cable alone has posted 10 straight quarters of pay-TV customer losses.

This multipart series on PC Perspective will walk you through the process of becoming a "cord cutter" yourself. Starting with some thoughts on whether or not cutting the cable is right for you, we'll walk you through everything from start to finish.

I'll also include a few personal tidbits from my own experience in my quest to cut the cord and stay that way over the last few years.

To Cut, or Not to Cut, That is the Question…

While dropping your cable or satellite subscription can save you some serious money, it's not for everyone. Television is a central part of entertainment in many households, and you need to look at it from all angles before you call your provider and tell them you want out. Cutting the cord may require some concessions and serious changes to the way you get your television content. While you might not mind some inconvenience, your significant other or children may have a meltdown if they can't get their regular fix of Honey Boo Boo or Yo Gabba Gabba the moment it airs.

Regardless, with some consideration and pre-work you can determine if cutting the cord is right for you and make the transition nice and smooth if you decide to kick your cable or satellite provider to the curb.

We saw AMD at CES, and they showed off some hardware; however, it seems they forgot to mention something. Anand managed to get a sneak peek at a Thunderbolt competitor that AMD is calling "Lightning Bolt." While it at first resembles a cable with Mini DisplayPort connectors, the AMD technology is able to pass DisplayPort video, power, and USB 3.0 over a single cable.

Imagine the Lightning Bolt cable looking like this miniDP-to-miniDP cable.

The company is currently working to integrate Lightning Bolt technology into laptops and ultrathins as a cheap, single-cable dock connection. The current implementation uses a muxer to combine the USB 3.0, DisplayPort, and PSU power signals and pass them over a single miniDP cable. This cable will resemble current miniDP cables but will be electrically different, with two pins on the connectors altered. The dock that the Lightning Bolt cable connects to then splits out, or demuxes, the signals into a Mini DisplayPort connection and one or more USB 3.0 ports. AMD is aiming for Lightning Bolt docks to cost about as much as current USB 3.0 hubs, which run about $40 USD at the time of writing. Unfortunately, there are some caveats to the technology, including (possibly) limited power delivery and limits on the USB 3.0 connection. The company stated that while Lightning Bolt transfers between the computer and USB 3.0 devices would be faster than USB 2.0 speeds, the connection would not support the full 5 Gbit/s maximum speed of USB 3.0.

More information can be found here. Personally, I'm happy that AMD is stepping in, despite the tacky name. At the very least, I can see Lightning Bolt connectors being featured on AMD notebooks and providing useful competition to bring down the cost of Intel's Thunderbolt cables and hardware. It may also push Intel to reduce any licensing fees involved when OEMs build Thunderbolt into computers. Although the AMD technology is all electrical (no fancy optics), and thus inherently slower than Intel's theoretical maximum speeds, the cheaper hardware means OEMs will be more likely to integrate it into computers and consumers will be more likely to buy into it. Assuming, of course, that they can pull it off, "Lightning Bolt" sounds like a connection technology that is "fast enough," at a price I wouldn't mind paying a bit extra for in a laptop.

Apart from the name, which is a bit... let's say unoriginal, what do you think of the AMD tech?