June 13, 2009

Finally, some updates to this blog. New laptop in hand, the Asus G50VT-X5. I am currently playing the Fallout 3 DLCs on it. It runs the game on Very High with AA turned down to 2X and AF at 8X, and still gets very good framerates at the native 1366×768 resolution. Left 4 Dead also runs very well with all settings on Very High and 8X CSAA, but such is the case for a Source engine game.

Its specifications are:

CPU: Intel Core 2 Duo P7450 2.13 GHz
Chipset: Intel Mobile PM45
Memory: 4GB DDR2-800 (6-6-6-18)
Primary Graphics: NVIDIA GeForce 9800M GS
Disk Storage: 320GB 7200rpm HDD
Optical Media: DVD-RW
Primary Monitor: 15.6″ 1366×768
Operating System: Windows Vista Home Premium x64


On power-saving mode, which is still viable for watching HD video files, the battery lasts over three hours. On performance mode for gaming, it lasts over an hour and a half.

Its WEI (yay for random numbers) under Vista:

Processor: 5.1
Memory: 5.9
Graphics: 5.9
Gaming Graphics: 5.8
Primary Hard Disk: 5.8

I'm currently in the Middle Kingdom (China), and I haven't gotten around to doing any more fiddling with this machine (I'm still running the pre-installed OS, which comes with a load of horseshit that I disabled immediately upon first boot). Once I get around to wiping the machine and maybe dual-booting Vista and the Windows 7 RC, I will see what else I can do with it, overclocking- and benchmark-wise. Overall I am very satisfied with it for the $800 I paid.

April 23, 2009

Apparently, NVIDIA's next core, the GT300, set to introduce DirectX 11 compatibility into NVIDIA's GPU lineup, will feature no fewer than 512 (!) processing cores arranged into sixteen 32-core clusters. In comparison, NVIDIA's current single-core champion, the GTX 280/285, using the GT200(b) core, features 240 stream processors in ten 24-core clusters. To put this into perspective: on the 65nm process, the GTX 280's core was a gargantuan 1.4 billion transistors over a 576mm² die. Here it is compared to a dual-core Intel Penryn (what the current crop of Core 2 Duos are based on):

It's like comparing any of us to a professional porn star. Image courtesy of AnandTech.

The 55nm GT200b revision found in the GTX 285 shrunk this down to 470mm². The GT300 will reportedly use a 40nm process for further size reduction and power savings while allowing for higher clock speeds. However, it's not just the number of processors that differs. The GT200 and all previous NVIDIA GPUs using the unified shader architecture since the G80 (e.g. the 8800 GTX) used SIMD (Single-Instruction, Multiple-Data) units, while the processors found in the GT300 will be MIMD (Multiple-Instruction, Multiple-Data).

What this might mean is that while each cluster (actually a group of stream multiprocessors, as each cluster is further divided into SMs with their own caches) in the previous architecture could only operate on a single instruction at a time, the stream multiprocessors in the GT300 will be much more versatile, able to work on different instructions from their caches asynchronously. While the GT200 and its predecessors had fairly fine instruction granularity, measured in clusters of stream processors, the GT300 will take it one step further and achieve granularity on a per-processor basis, so potentially every processor on the GT300 could be operating on a different instruction if the situation called for it. Of course, I'm just pulling this out of my ass based on a few lines from an online rumor report.
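To make the SIMD-versus-MIMD distinction concrete, here is a toy cost model in Python. This is not how any real GPU counts cycles; the path lengths and lane counts are made up, and it only captures the one idea above: a lockstep SIMD cluster hitting a divergent branch has to serialize every path anyone takes, while independent MIMD processors just each run their own path.

```python
def simd_steps(lane_paths, path_lengths):
    # Lockstep SIMD: on a divergent branch, the cluster executes every
    # path taken by at least one lane, masking off the other lanes.
    return sum(path_lengths[p] for p in set(lane_paths))

def mimd_steps(lane_paths, path_lengths):
    # MIMD: each processor runs its own path independently, so the
    # cluster's wall time is just the slowest lane.
    return max(path_lengths[p] for p in lane_paths)

# Hypothetical shader branch: path "A" is 4 instructions, path "B" is 6.
paths = {"A": 4, "B": 6}
divergent_cluster = ["A"] * 16 + ["B"] * 16  # a 32-lane cluster, split 50/50

print(simd_steps(divergent_cluster, paths))  # 10: both paths, serialized
print(mimd_steps(divergent_cluster, paths))  # 6: just the longer path
```

When every lane takes the same path, both models cost the same, which is why divergence-free shader code has always run fine on SIMD hardware.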

What this means for your gaming is even better load balancing between the different computations necessary for rendering bleeding-edge graphics, such as pixel and vertex shading. With the advent of on-GPU physics processing through NVIDIA's PhysX, this becomes even more important (bouncing boob physics): the GPU needs to divide its attention between rendering graphics and calculating physics, for all the jiggling and flopping around you could desire at the fastest framerates.

What this might mean for general-purpose computing on GPUs and CUDA or OpenCL applications is finer control over how the GPU issues and executes threads. Currently, calling a CUDA-enabled function to run on the GPU means issuing it in blocks of threads, which the GPU then manages, giving the programmer only abstract thread IDs and global synchronization instructions to work with. Since all the clusters across the GT300 will be identical, the programmer might now be able to work within the cluster, and perhaps order different processors in the cluster to execute different functions, maybe through something like a processor ID, while the GPU still maintains control over which cluster to issue that batch of instructions to.
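The "blocks of threads plus abstract IDs" model described above can be sketched on the CPU. This is a plain-Python stand-in for a CUDA-style launch, not real CUDA code: the `launch` and `saxpy` names are my own, but the shape (pick a block size, the runtime carves the work into blocks, and the kernel only ever sees block/thread IDs, never which processor runs it) mirrors how CUDA kernels are actually written.

```python
def launch(kernel, n, block_size, *args):
    # CPU stand-in for a CUDA-style launch: the grid is carved into
    # blocks, and the kernel sees only abstract IDs, not the hardware.
    n_blocks = (n + block_size - 1) // block_size  # ceiling division
    for block_idx in range(n_blocks):
        for thread_idx in range(block_size):
            gid = block_idx * block_size + thread_idx
            if gid < n:  # bounds guard, standard in real CUDA kernels
                kernel(block_idx, thread_idx, gid, *args)

def saxpy(block_idx, thread_idx, gid, a, x, y, out):
    # Classic SAXPY kernel body: one element per "thread".
    out[gid] = a * x[gid] + y[gid]

x = list(range(10))
y = [1.0] * 10
out = [0.0] * 10
launch(saxpy, 10, 4, 2.0, x, y, out)
print(out)  # [1.0, 3.0, 5.0, ..., 19.0]
```

The rumored per-processor MIMD granularity would be like letting different `thread_idx` values inside one block run entirely different kernel bodies, something the current block model does not expose.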

Since this is all rumor, though, who knows what final product NVIDIA has prepared; it is also rumored to appear in Q4 2009. This particular tidbit about MIMD processors on the GT300 is from TechConnect Magazine.

April 16, 2009

Demigod, a game developed by Gas Powered Games, makers of Dungeon Siege, and published by Stardock, of Sins of a Solar Empire fame, was just released yesterday. Many GameStops began selling it early, about which the Stardock boss had this to say in his blog regarding the subsequent potential piracy. A class act, I say. The game itself can be described as an action/RPG/RTS hybrid, or a massive ripoff of the Defense of the Ancients mod for Warcraft III. Although that would be presumptuous; DotA has way more heroes than Demigod has...uh...demigods (only 8 as of now)! It also runs a lot worse than DotA! I jest, though, considering the graphics engine powering Demigod is much more advanced than the geriatric Warcraft III engine. Although the gameplay might feel similar, Demigod is much more refined, and each demigod has a lot more depth in its skill tree, abilities, and powers. There is also a lot more to do on the maps than just destroy trees or pyramids with lightning rods.

The graphical in-game options are fairly broad, without ridiculous amounts of control over the specific shaders that color a demigod's demibutt or the fire effects shooting out of the Torchbearer's torch. Resolution is pretty obvious; if you don't know what this is, you are probably one of those people who game at 1024×768 on their widescreen TV and wonder why everybody looks so wide. Vertical Sync is there as well to control tearing, and the refresh rate you are actually running at is in parentheses after the resolution. Fidelity Presets is a general setting controlling how the game looks overall; it just governs all the subsequent settings. The options are Low, Medium, and High. I am not quite sure what Fidelity is, although I'm sure it's not how often you'll be tempted to cheat on your significant other with the sexy demigods in this game.

mmm…polygonal ass

Shadow Fidelity presumably controls how good the shadows look. Anti-aliasing offers the standard gamut of options, from 2 to 16Q. 2 and 4 are probably multisample AA, along with 8Q and 16Q, while 8 and 16 use NVIDIA's coverage sampling AA. Texture Detail and Level of Detail also run from Low to Medium to High. The settings you see in the screencap are the settings I used for the benchmarks.

I then used FRAPS to measure 60 seconds of gameplay, from the start of a skirmish game with myself and 7 computer-controlled demigods. The camera initially does a flyover of the level and then settles in place behind your demigod. I then moved the camera forward and backward at regular intervals to capture as much of the action as I could while maintaining some consistency between benchmarks. As there is no dedicated benchmark option in-game, this was the best I could do for now. I first enabled SLI on my GTX 280s and measured performance, and also took a screenshot with Task Manager and temperature-monitoring programs open on my other monitor (which is powered by a third GPU, so there is no performance penalty on the SLI setup). You can see the rest of my machine specs on the About page of this blog.
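For the curious, the minimum/maximum/average figures FRAPS spits out can be reproduced from a per-frame frametime log. This is a sketch with made-up numbers, not FRAPS's actual code; the one subtlety worth knowing is that the average is total frames over total time, not the mean of the per-frame FPS values (which would overweight the fast frames).

```python
def fps_stats(frametimes_ms):
    # Instantaneous FPS for each frame, from its frametime in milliseconds.
    per_frame_fps = [1000.0 / ft for ft in frametimes_ms]
    # Average the way benchmark tools report it: frames over elapsed time.
    avg = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    return min(per_frame_fps), max(per_frame_fps), avg

# A made-up log trimmed to three frames for illustration:
print(fps_stats([10.0, 20.0, 40.0]))  # (25.0, 100.0, ~42.86)
```

Note the naive mean of (100, 50, 25) would be 58.3 FPS, noticeably higher than the correct 42.9, which is exactly why one slow frame drags the reported average down so hard.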

The results (using NVIDIA's 182.50 WHQL drivers) were:

Minimum: 4 fps
Maximum: 101 fps
Average: 50.483 fps

As you can see here, Demigod is still largely limited to one CPU core, and it seems one of my GPUs in SLI is under considerably less load (although it could be that GPU0 is behind GPU2 in the physical arrangement of the PCI-E slots). The benchmark also shows a very low minimum framerate, which can mean stuttering (although I was watching the game during the benchmark, and there really wasn't any noticeable stutter).

I then disabled my SLI and proceeded to bench the game running on a single one of my GTX 280s.

Minimum: 25 fps
Maximum: 91 fps
Average: 34.017 fps

Although the average framerate has gone down from the SLI setup, showing that there is a performance boost from using SLI, the minimum framerate is much higher, suggesting there might be driver issues in the 182.50 drivers for SLI under Demigod that cause the occasional stutter. Neither benchmark was really taxing on my system, though, and although Demigod looks good, I can see why it is not so taxing: the levels are not that big and detailed, and the game is missing a lot of en vogue effects like depth-of-field and motion blur. The demigod character models are the high point of this game, barring that close-up of Demislut up there that shows a fairly low polygon count. But screenshots like this and this show that it is definitely a modern game, although maybe not with the bells and whistles that the next Unreal or id Tech engine might have. One extra note: there is no 3D Vision profile for this game yet either, and I couldn't run the game with 3D enabled on the NVIDIA shutter glasses and have it display correctly. I haven't tried renaming the exe to other games to test whether another 3D Vision profile would work with the game.

If you liked DotA or Warhammer 40K: Dawn of War II, or want a quick pick-me-up action RPG that doesn't require an enormous time investment to get into, Demigod is a definite recommendation.

April 9, 2009

Intel is now adopting a star rating system for their lineup of processors, so that the average consumer can differentiate between a Core 2 Duo E8400 and a Core 2 Quad Q9300 without having to learn the arcane witchcraft of microprocessor architecture and understand terms like "cores", "cache size", "memory controllers", "front side bus", and various other pagan chants that, for the standard mouth-breathing Best Buy Christmas shopper, might as well summon mythical Scandinavian beasts as describe the latest Intel chip.

It used to be a pure clock speed race, and you knew that a Pentium III at 500 MHz was better than a Pentium II at 350 MHz. Then AMD had to go be assholes and innovate by introducing a more efficient x86 architecture, so their chips ran at lower clock speeds without sacrificing performance against their Intel counterparts. In order to compete, AMD then had to make up magical numbers to match up evenly against Intel's lineup, calling a 2.2GHz chip an Athlon XP 3200+ because it performed closest to a Pentium 4 at 3.2GHz. Thus the megahertz race became a "pull numbers and letters out of your ass and make shit up" dick-waving contest.

Intel then responded in kind and made even bigger numbers than AMD’s lineup and stuck letters in front of them as well so they couldn’t be accused of false advertising when it was discovered that a Core 2 Duo E6700 wasn’t actually clocked at 6.7 GHz. Now that this has gotten completely out of hand (Intel has 30 desktop processors and 57 notebook processors in their active roster), Intel has decided to go the route of Zagat or Hotels.com and start giving their processors star ratings.

So a Pentium single core or Celeron gets a 1 star, a Core i7 or Core 2 Quad Extreme gets 5 stars, and everything else is in between. Comparing this to restaurants, if you get a computer with a Celeron processor, you are essentially eating here.

While I am dining in style with my Core i7 965 EE. Suck it, folks.

However, as you can see, that star chart is only viable for a certain time frame, because unlike restaurants and hotels, microprocessors advance and change faster than the pants of a fellow with permanent sexual arousal syndrome. If Intel's star ratings applied to real-life hotels, a hotel that looks like this when you check in for your month-long honeymoon:

Will look like this when you check out.

Now if it were a Vegas bachelor party I could understand (there should be a dead hooker and a few exotic animals somewhere in that second pic if that were the case), but now customer service will have to explain to the angry and clueless consumer why the 5-star processor they purchased four years ago is now slower than constipated shits while watching the newest video response to 2 Girls 1 Cup on YouTube.

AMD hasn’t gotten in on this act, which means that their processors are unrated. Generally, when hotels are unrated, that means you are getting this:

So Intel is trying to say: if you decide to purchase an AMD processor, you will have to pay to use the computer by the hour, and your system might be infected by a virus by the time you are done. Of course, the best counter to this move is for AMD to just label all their processors 5-star and say it's because they use elven magic and leprechaun blood in their manufacturing process. Your average slack-jawed yokel will still be suckered in.

April 7, 2009

The man building this must be going through the tech-geek equivalent of a mid-life crisis. He has linked 17 NVIDIA GeForce GTX 295 graphics cards in a single server rack, with all the hardware communicating with each other. The total of 23 he comes up with in the video comes from adding the two in his home computer plus the four he is currently waiting on power supplies for. Watch the video and literally feel your e-peen shrink and retreat back into your geek gape-hole.

Unfortunately, seeing as NVIDIA's display drivers only support up to Quad SLI (in other words, just two of these beasts) to render a game, this setup is really only useful for non-gaming applications such as Folding@home and various CUDA implementations. So it still won't run Crysis at acceptable framerates maxed out across a triple-head setup of three 2560×1600 displays. Still, the sheer number of transistors here outclasses the computing power any single person has in their possession, but at $500 for a single GTX 295, this is also out of the price range of most of us peons. Makes a great space heater, at the very least.

Mercifully for our geek pride, though, it turns out this is a hoax, according to the EVGA forums, and the rack is actually a Folding@home cluster. Looks like the creator of the hoax has some problems he is trying to compensate for, although he does it in quite an unorthodox manner, unlike most of us, who brag about non-existent hot-rod cars and the various fictional women we've conquered.

April 5, 2009

Completed my Home Theater PC build last night after I got my SATA Blu-ray drive, and installed Windows Vista and all the necessary software to turn it into a media center. It's currently off right now, as I am watching the Villanova-UNC game. I'm sure a lot of people's brackets got completely shat on by Michigan State's victory over UConn earlier. Damn NCAA athletes don't have any consideration for our betting pools and our over-analyzed predictions. Of course, the winner of your office pool will still be the person who put the least thought into their bracket and just randomly filled it out, probably your boss's wife. Don't worry, it'll still be your boss collecting the pool money. He'll use it to buy himself one of those tacky digital picture frames that will feature a slideshow of all his ugly children, and you'll have to compliment it every time you go to his office and see it.

It is really too bad CBS is covering these games, as I am noticing a distinct lack of Dick Vitale in the announcing (he only has a contract with ESPN and ABC). A college basketball game just doesn't have that same air of pageantry without his ridiculous voice and equally ridiculous comments about the game.

The face, voice, and occasional queen of NCAA College Basketball.

Anyways, after you’ve washed your eyeballs out with lye after seeing that image, it’s time to get back to my HTPC build.

Here is the Foxconn RS233 mini-ITX case, taken apart to prepare for the guts of the computer. It has space for one 5.25″ drive and one 3.5″ drive, plus two front USB ports and two audio ports. The assembly holding the 5.25″ drive is removable, which is necessary in order to mount both the 3.5″ drive underneath it and the motherboard itself.

Here is the FSP 150W power supply that came with the Foxconn case. It only has power connections for the motherboard and two SATA power connectors, which perfectly match the two SATA ports on the ZOTAC motherboard, so that really isn't a limitation for my setup. The only limitation I see in the future is the lack of PCI-E power connectors: I won't be able to take advantage of the ZOTAC motherboard's full PCI-E x16 gen 2 slot for a high-end GPU without a power supply upgrade, as most such cards require extra PCI-E power. Although, seeing that my GTX 280 has a TDP of around 230W, trying to pair a card like that with this power supply would be idiotic in the first place, and would detonate the power supply in spectacular fashion and set my pants on fire. True story.
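The back-of-the-envelope power math here is worth spelling out. The component wattages below are my ballpark guesses, not measurements, but the arithmetic shows why a 150W PSU is fine for this build and comically doomed against a GTX 280:

```python
def psu_headroom(psu_watts, component_watts):
    # Crude power-budget check: sum the estimated draws, see what's left.
    return psu_watts - sum(component_watts.values())

# Ballpark figures for this HTPC build (guesses, not measurements):
htpc = {"E5200 CPU": 65, "mobo + 9300 IGP": 30, "HDD": 10, "Blu-ray": 10}
print(psu_headroom(150, htpc))  # 35W to spare: workable

# The same 150W PSU against a GTX 280's ~230W TDP alone:
print(psu_headroom(150, {"GTX 280": 230}))  # -80W: pants on fire
```

Real PSU sizing also cares about rail distribution and efficiency, but for "will this literally not work," summing TDPs gets you the right answer.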

Take that hazardous area warning seriously. I didn’t and my pants have regretted it to this day.

Here is the Western Digital Caviar Black 1-terabyte 7200rpm hard drive I am using for this HTPC's storage. It's got enough space for a fuckton of Blu-ray/DVD rips, H.264 video, music, and whatever else I can throw at it.

That’s a lot of storage for perfectly legal and non-pornographic media.

Mounting the HD into the 3.5″ "bay" in the case was a little strange. I put bay in quotes because it's literally a little sheet of metal sticking out of the bottom of the case that just prevents the HD from shuffling around too much once the screws are in. The screw holes in the case also don't line up with the holes on the side of the HD, unless it's by design to have the hard drive crooked like this:

I hope my file icons don't slide to the left of the screen in Windows Explorer.

I then mounted the ZOTAC motherboard into the case. The power supply was removed and put back in during this process to allow for more elbow room while putting the screws through the motherboard and into the base.

That 80mm case fan lines up pretty well with the CPU’s HSF, which should help in keeping temperatures down in these cramped conditions. The PSU also has its own fan as well, so this completed build should have decent airflow. The loose wires in these tight quarters don’t help though.

The new Haruhi and Mikuru baseball/cheerleader Figmas make their appearance here, along with another view of the internals of the HTPC. Mikuru is cheering me on as I continue the build, while Haruhi is about to go to town on the computer with a bat and a baseball. You can also see the abundance of ports in the back of the PC. 6 USB ports here and the two in the front make for a total of 8 USB connections, and the HDMI can carry both audio and video to my TV.

Here is the LG Black 6X Blu-ray/HD DVD-ROM and DVD burner. This is the retail version and comes with a lite version of Cyberlink PowerDVD to play Blu-ray or HD DVD discs. There are currently no free players for those formats, which is why I went with the retail version instead of the OEM, which would not have included the player software. The HD DVD playback is pretty worthless now, as Sony finally got their shit together and pushed a successful format (unlike Betamax, MiniDisc, UMD, etc.), and they did it without even the support of the porn industry, which backed HD DVD (lack of porn support being the main reason Betamax failed). I'm sure you could do a complete case study on this and the changing tastes of the viewing public, as people have moved from sneaking into brick-and-mortar porn stores, hiding their faces, and purchasing legitimately produced porn, to watching free amateur videos online. I guess people just aren't interested in a well-directed and finely acted film with high production values anymore while plowing through boxes of tissues and jars of Vaseline in front of their monitors and TVs. I blame the lack of an Adult Film category at the Academy Awards.

Here is a top shot of the internals of the case with the Blu-ray drive mounted in the assembly and the assembly mounted in the case. And with all this talk of porn, I believe mounting is the proper term to use here.

I then plugged the HDMI cable into the TV, plugged in the USB Bluetooth receiver for the wireless keyboard and mouse, connected power to the PSU, and crossed my fingers.

POST, bitches.

I then proceeded to install Windows Vista, and I don’t want to hear anything about my operating system choice. I like the shiny and transparent Aero interface. I’m a very shallow person.

1000 does not equal 1024, unlike what the hard drive companies would have you believe.
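The arithmetic behind that caption: drives are sold in decimal bytes (1TB = 10^12), while Windows divides by 2^30 and labels the result "GB," so the very same drive shows up smaller in Explorer.

```python
# A "1TB" drive as sold: 10^12 bytes (decimal, per the drive makers).
advertised_bytes = 1_000_000_000_000

# Windows reports capacity in binary gigabytes (2^30 bytes each).
windows_gb = advertised_bytes / 2**30
print(round(windows_gb, 1))  # 931.3, which is where the "missing" ~69GB went
```

Nobody is lying, exactly; the two sides just picked the definition of "giga" that makes their number look better.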

And here is a final shot of the completed machine, while it is on.

Amazingly, this does not have an LED light that turns on to tell you that it is off (unlike every other piece of electronics coming out nowadays).

I haven't done much performance testing yet. The CPU idles at around 40°C and goes up to 55°C under moderate load. Blu-ray playback and video playback through Media Player Classic Home Cinema seem fine too: no dropped frames, no sound issues over HDMI. I ran the Windows Experience Index rating on the HTPC, and these were my results:

I will be doing some benchmark testing later next week and posting the results, but right now everything looks very promising. As I complete this, UNC just won their Final Four game. Now for the inevitable ESPN circle-jerk around these teams before the championship.

April 2, 2009

VS
Whoever wins, we lose a shitload of money buying new graphics cards

Today sees the hard release of the ATI Radeon 4890 and the paper launch of the NVIDIA GTX 275 to fill in that mid-range segment of the high-end GPU market, although calling these cards mid-range is like calling an Audi RS6 a “decent” performer compared to an Audi R8 or D-cups “moderately sized” because that girl can’t use her rack as a flotation device, unlike the other girl with E-cups. These video cards will still own the shit out of most games at the resolutions most people run at, even Crysis.

The ATI card is a re-engineering of the core found in the 4870 to allow for higher clocks, while the GTX 275 is just the GTX 295, NVIDIA's flagship single-card dual-GPU solution, cut in half. Both will start at around $250, although manufacturers' mail-in rebates will make you think you are only paying $220; most of the time, though, the bastards will either claim they didn't receive your rebate request in the mail, or "lose" it because they are currently "transitioning to a new rebate database system," or, more likely, just used it as toilet paper.

Rebate Customer Service: Yes, I have your rebate request in my hand right now. Don’t worry, we were just about to “process” it. *flushes*

Anyways, now that the reviews are in, you can expect all the fanboys to come out of the woodwork to cherry-pick benchmark results showing how much better the product from their brand of choice is. If NVIDIA or ATI really wanted blowjobs, I'm sure they could do much better than nerds who get involved in internet dick-waving matches. The reviews will talk about the high overclocking headroom of the 4890, even though most people still think overclocking means setting your system time two minutes ahead, and others will talk about how NVIDIA supports PhysX, even though like one game uses it (Mirror's Edge), and the chick in that game doesn't even have the breast size to use the physics engine to its fullest extent.

Who cares about glass???

Physics Engines should be for boobs

Except these boobs

Anyways, here's a link to a fuckton of reviews of both these cards, but even I couldn't be bothered to sift through them and look at every last benchmark. Especially now, with AMD/ATI's supposed DX11 part coming at the end of summer and NVIDIA's following soon after (probably to coincide with Windows 7's release, and the inevitable DX11 or Windows 7 exclusive game that will get hacked within days to run on Windows XP). If you already have a 4870 or GTX 260 or something comparable, this shit is worthless, unless you leave your computer on 24/7 downloading porn or something, in which case the power savings of moving from the 4870 to the 4890 might be worth it (the 275 sucks up more power than the 260).

April 1, 2009

I got part of my HTPC build today. It came in a decent-sized box with Newegg.com emblazoned on the side. I've gotten a lot of Newegg-labeled packages in my time living here, and they get delivered straight to the front desk of my apartment building. I wonder if my doorman thinks I'm some sort of dairy product reseller. Probably not, because he always looks at me askance when handing me these packages, so he probably thinks I'm smuggling drugs or small children for my own sick, sick pleasure.

What’s really inside this innocuous looking box?

Why hello, NAMBLA care package

Or that I'm getting mail-order brides, and that I go through them like I go through my dirty laundry looking for unstained underwear to wear until laundry day finally comes.

Hello sexy, where is your RMA form if you came with the clap? (You should see the ones I get DOA)

Well, here is what was actually in the package, much to some of your disappointment.

The 2x2GB OCZ Fatal1ty DDR2-1066 RAM and the 2x1GB Patriot DDR2-800 RAM you see pictured, I already had lying around my apartment. One of the sets will finally be put to use in this machine.

Here is the ZOTAC motherboard rocking out with its slots out. It has no sense of shame.

It comes with the motherboard, an aluminum backplate for the ports, two SATA cables, a Molex power cable, a couple of manuals, a driver disc, a separate wireless card that installs on the mobo, and a long black tubular dealie for the wireless antenna. The CPU socket is LGA775 with support for up to a 1333MHz FSB, making it compatible with all of the latest Intel CPUs except the Xeon server processors and LGA1366 Core i7 Nehalems. It has two 240-pin DDR2 memory slots for dual-channel memory and a single PCI-E x16 slot to support even the latest hardcore discrete graphics cards, although the onboard GeForce 9300 is no slouch either, especially for HTPC use in video playback. It even has an HDMI port, so you don't need to dick around with converters and shit to connect it to the HDMI port on your HDTV.

Unfortunately, it only has SATA connectors and no IDE ports, which, as I will explain later, is the reason for this build's coitus interruptus.

Here is the Silverstone NT07 LGA775 CPU cooler. It's extremely low-profile (cue Apple Bottom Jeans: low, low, low, low!), even shorter than the already small Intel stock HSF that came with the CPU. This was a necessity, because the mini-ITX Foxconn case this is going into has very limited clearance above the CPU socket area.

Ying-Yang Twins would be proud

I went with the Intel E5200, as it is based on the latest 45nm Core 2 Duo Wolfdale architecture but is also the cheapest in its class, with only an 800MHz FSB and 2MB of L2 cache. It's still a sight better than the weaksauce Intel Atoms, especially for HD and Blu-ray playback. Although if NVIDIA's Ion platform, which pairs an Atom with the GeForce 9300/9400 chipset, comes to fruition and manages to evade Intel's cock block, I might migrate to that platform, pending final performance and power consumption reports.

Here is the processor all cozy in its CPU socket on the ZOTAC. The Figma versions of Haruhi from Melancholy of Haruhi Suzumiya and Konata from Lucky Star are there for size comparison and to keep the lonely caseless motherboard and CPU company (the Foxconn mini-ITX case is coming in a separate package).

Konata is pointing and laughing at the minor cut I received from not reading the warning label on the heatsink and dumbly removing it before I put the processor in.

A small line of Arctic Silver 5 thermal compound. A pretty piss-poor application job, but this shit's expensive and I don't have a lot of it, so I don't feel like cleaning it up and re-applying for a potential 1-2 degree difference.

Screw you, I sneezed.

I then mounted the Silverstone HSF on top of the socket and plugged it in. Flame-haired and black-haired Shana oversaw the push-pin mounting.

Loliconputer

I decided to use the two 2GB sticks of OCZ Fatal1ty DDR2-1066 memory for the build for now. Despite the uber-1337 naming of the RAM after the famous pwnage "cyber-athlete," its default timings at 1066MHz are 7-7-7-20, which for DDR2 memory is looser than the two-bit whores that Fatal1ty bangs, because "athlete" is a title that still gets you laughed at by the ladies if it's preceded by cyber, videogaming, or Halo. In comparison, the OCZ Platinum DDR3 RAM in my main machine runs at 1600MHz with 7-7-7-24 timings.
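For anyone wondering what those timing numbers actually mean in wall-clock terms: CAS latency is measured in memory clock cycles, and the memory clock is half the DDR data rate, so you can convert to nanoseconds. A quick sketch (standard DDR arithmetic, nothing vendor-specific):

```python
def cas_ns(data_rate_mt_s, cas_cycles):
    # CAS latency in nanoseconds: cycles divided by the actual clock,
    # which is half the DDR data rate (hence "double data rate").
    clock_mhz = data_rate_mt_s / 2
    return cas_cycles / clock_mhz * 1000  # convert 1/MHz to ns

print(round(cas_ns(1066, 7), 1))  # ~13.1 ns for this DDR2-1066 at CL7
print(round(cas_ns(1600, 7), 2))  # 8.75 ns for DDR3-1600 at CL7
```

Same CL7 on paper, but the DDR3 kit's faster clock makes its absolute latency about a third lower, which is why comparing raw CAS numbers across speed grades is misleading.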

Unfortunately, it was at this point that my build stalled. I had an extra 500W power supply I was planning to use to test out the machine, along with a standalone power/reset switch to turn the thing on, but then I realized that the DVD-ROM drive I was planning to borrow from my secondary machine was an IDE drive, and this motherboard only has SATA connectors. Since my Vista is on a disc and I don't have a spare USB key with a boot sector lying around to install from, I couldn't install an OS onto the 1TB HDD. (I could have installed the HDD in another computer and installed the OS from there, but I would have been completely aggravated by having to open up one of my other machines again, install the HDD, install the OS onto it, migrate it back to the ZOTAC motherboard, and have the OS re-detect the different hardware setup.) I decided to wait until I get the SATA Blu-ray drive to install the OS. So this build is currently on hold while I wait for my next Newegg package and the strange look from my doorman.

March 31, 2009

Right now this blog blows, because the author just started it on a whim at around 2 AM on a work day. Hopefully the first post of substance will come when the author gets all the parts for his mini-ITX home theater PC in the near future from the hardware overlords at Newegg. The author will be posting pics of his build and subsequent benchmarks, reactions, etc. So hold on to your butts until then. Or not. Playing games of auto-grabass has its appeals though.