Hey man, I need a new toaster. You know all about kitchen stuff. Have any suggestions?

The KitchenAid4000 series just came out.

Are those good?

I have a KA4510, and it’s really good.

Does it have 4 slots?

Oh you want 4 slots? Well, the KA4510 XN goes up to four slots, but it only toasts one side.

Let’s pretend I want to toast both sides.

Then you probably don’t want a KitchenAid. Their 4000 series 4-slicers aren’t very good. You could get one of the old KA3510 XN or XNS for cheap these days, but they take like, twenty minutes to toast the bread.

Er. What else is there?

The Cuisinart 7000 series is comparable to the KA 4000 series. The 7420, 7520, and 7620 all do four slices. Just don’t get any of the SIP models because they can’t do bagels.

SIP?

“Slim Insertion Port”. The units are small, but only regular sliced bread will fit. KA has the same thing on many of their units. Actually, if you want to do bagels with a KA you’ll need the ASI units.

Which is?

“Adaptable Slot Interface”. It just means it can handle bread of varying widths.

So I should get a Cuisinart ASI?

No no no. That’s nonsense. In Cuisinart the units all handle wide bread unless they are SIP.

My head hurts. So I want a Cuisinart 7000 series, but not a SIP, right?

Pretty much. Now, the 7000 series is actually two generations. You don’t want anything before the 7400, because the pre-7400 units actually took up two wall plugs. The 7100 and 7200 four-slotters were actually two dual-slot units strapped together, so they had two cords. Plus, they didn’t have a timer so you had to stand over them yourself.

All I want is to toast bread! Four slices! Both sides!

Then the C7520 T series is for you. You can pick one up at Wal-Mart for about $400 these days.

FOUR HUNDRED DOLLARS! I could buy an oven for that! I could just go out to eat every morning for that kind of money!

Ah, if you’re worried about price then the KitchenAid 4510 ES is a good pick. It’s only got three slots but it’s retailing for about $90.

I’m looking in the Wal-Mart flyer, but I don’t see that model.

Sure you do. Right here: The “Magitoast 7”. See how underneath it says “KA4510 Ex”? That means it’s the KitchenAid 4510 ES or the KitchenAid 4510 EP, just with a brand name slapped onto it.

…?

KitchenAid and Cuisinart don’t actually sell models directly. They make the inside parts of toasters, then other companies buy them, put a fancy shell on them, and give them a new brand name. But if you want to know what you’re getting, you have to look at which design the unit is based on.

Ah! I get it! Then why don’t I get this “TastyToast 2000”, which is like that 7520 you mentioned earlier. This one is only $50.

Er. That’s not the same thing. That’s a 7520 OS. The OS means “One Slice”. Total bargain unit for suckers. Same goes for the 6000 series and anything with an MRQ after it.

I’m shopping for a graphics card, and this is exactly what I’m going through, except I don’t have a know-it-all to help me out. I have never seen such rampant ineptitude at marketing products. I’m even savvy enough to know what I’m looking for, but the endless chipset numbers and sub-types and varying configurations make it impossible to get any sort of handle on the thing. It’s actually worse than my example above, since higher numbers aren’t always better. I’ve searched around, and I have yet to find a breakdown as clear as the conversation above. What is the difference between these two generations of cards? What does this suffix mean? Why am I seeing this chipset in one place for $119.99 and elsewhere for $299.99? Is this the same product with a huge markup, or is this second unit different in some way I can’t discern?

Features get added in the middle of numeric series. Like, an NVIDIA 7800 supports 3.0 pixel shaders, and earlier 7000 models don’t. (Or don’t list it among their features.) So it’s impossible to do any real comparison shopping until you’ve memorized all the feature sets for all the chipset numbers for both NVIDIA and ATI. Yeah, let me get right on that.

Game developers who keep cranking up the system specs are killing themselves. They’re making sure that their only customers are people who are willing to wade through this idiocy, fork over hundreds of bucks, and then muck about inside of their computers to do the upgrade. You shouldn’t need to be Seth Godin to realize most people would rather drop that same $400 on a console and have done with it. In fact, it’s pretty clear that this is exactly what people are doing by the millions.

The main advantage of the PC as a gaming platform was its sheer ubiquity. But while PCs are probably more common than televisions, PCs which are equipped with the latest hardware are pretty rare, and graphics card manufacturers seem to be doing their level best to keep it that way.

This is the second time this year I looked into upgrading, and both times it seemed like such a stupid, pointless hassle. Like our toaster-buying friend above, I know what I want, but it’s the seller’s job to tell me what they’ve got. Offering someone a Fargleblaster 9672 XTQ is stupid and meaningless.

It really is a shame to watch this aggregate stupidity suck all of the fun out of this hobby. Buying other electronics is fun, but buying graphics hardware is homework. ATI and NVIDIA need to adopt a policy of sensible naming of product lines, fewer products, greater differences between products, and (most importantly) clearly delineated graphics generations, so that consumers can look at a product and know what it is without needing to read the long list of specs. In an ideal world, they shouldn’t even need to understand the meaning of things like DirectX 9.0c and 3.0 pixel shaders. They should know that X is better than Y, and buy accordingly.

I agree with you that graphics cards could be better marketed than they are. However, the “x is better than y” approach doesn’t work, because different graphics cards are better for different things. Some support more features than others but have lower specs, which means they can support more effects of games but at lower polycount/framerate/resolution. Which graphics card is better depends on what game you’re playing.

I heard that! The PC as a gaming platform is being killed by the very companies you’d think depend on it. NVIDIA and ATI made deals with Intel to put built-in graphics chips that don’t do games into most laptops. That’s a big reason why most people would have to upgrade to run any “real” PC game.

Meanwhile Microsoft became a console company. They seem to have forgotten that games are the big reason to get a PC over a Mac. As soon as Windows stops being a gaming platform, there is no reason to choose a PC over a Mac.

Yes, finding someone to get information on video cards from is a royal pain. I have a bit of a clue with NVIDIA cards but I lost track of ATI cards a while ago. I now know what the differences between the various specs are and why a card with 256MB of VRAM may be better (and more expensive) than one with 512MB, but it was a long process to learn. Only getting worse, too.
Best bet is going to a place like Tom’s Hardware and looking at their VGA charts, but even then you have to know what card you’re looking for.

See, this is why I have been pushing the “flex computing” meme. Rather than do desktop computing as one big box of components, you go modular. An external graphics card would be cheaper and easier to deal with.

Sorry for double-posting, but I forgot to mention: some hardware makers have handy choosers on their websites. You tell them what you want to do and your price range, and they recommend graphics cards.

If it helps, I’m just flipping through the latest issue of Games For Windows and they’re singing the praises of Nvidia’s GEForce 8800 GT, which they claim delivers the performance of a current $400 graphics card for $250. Might be worth looking into.

I used to work for a company that did video slot machines. A couple of years ago they were settling on a video card for the platform, and at the time the nVidia 5200s were the best ‘reasonably priced’ cards.

They settled on a manufacturer. This manufacturer (eVGA I think, but I might be misremembering) had two models of 5200s that were priced $10 apart. I think one was $70, the other $80. If you looked at the retail boxes they were identical except for the model number. Literally 100% of the listed specs and descriptive text on the box were the same. You couldn’t find anything that would tell you why one was better. But it turned out that the one that was $10 more expensive had a wider memory bus, or something, and was literally twice the performance in our application.
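That “wider memory bus” detail is easy to put numbers on. Here is a rough sketch of the arithmetic; the bus widths and clock below are illustrative assumptions (the FX 5200 did ship in both 64-bit and 128-bit variants), not the actual specs of those eVGA boards:

```python
def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per transfer
    times effective transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = effective_clock_mhz * 1_000_000
    return bytes_per_transfer * transfers_per_sec / 1e9

# Two hypothetical FX 5200 boards, identical except for bus width:
narrow = peak_bandwidth_gbs(64, 400)   # 3.2 GB/s
wide = peak_bandwidth_gbs(128, 400)    # 6.4 GB/s
print(f"{narrow:.1f} GB/s vs {wide:.1f} GB/s")
```

Doubling the bus doubles the peak bandwidth at the same clock, which is consistent with a roughly 2x gap in a memory-bound workload. Nothing on the retail box hints at any of this, which is exactly the problem.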

Since then I haven’t bought an expensive graphics card, because frankly I don’t have the time to do the research to track down stuff like that.

Couldn’t agree with you more, Shamus! I just built a new system last summer and I looked at so many different parts and numbers that I’m no longer certain what I have anymore. I’ll tell you, I was THOROUGHLY confused when I couldn’t buy the card from the chipset manufacturer.

I’m just glad I have a friend who owns an IT business. When I need new hardware, I just tell him my budget and rely on his judgment to get me the best deal.
If I had to do that myself, I’d have given up on this stuff long ago (BTW, it gets a lot harder when you have to factor in Linux compatibility for all your hardware).

Oh, come on…
3D games are the single most complicated implementation of applied mathematics in the history of the world. You don’t want to make choosing hardware for it easy, now do you?
It used to be pretty easy, but I guess once the “hobby” became more widespread the marketing people started to get their grubby little paws into it and now you can’t tell one toaster from another.

Same problem here. I have no desire or will to research all the different varieties of graphics cards.
Fortunately, at this point in time, it seems like the 8800GT is overwhelmingly the favorite in terms of performance and price. The prices seem to run about $200 to $250 for these cards, which seems quite a bit cheaper to me than $400 to $600 for a pure gaming peripheral. It’s an option I’m definitely looking into, although I think you mentioned previously that you don’t have PCI-E, so that might be out of the question for you.

One is 320MB of memory and has a few fewer pipelines (if I remember correctly); the other is 640MB of memory and a bit better hardware-wise. If you want to go cheaper, get the 320MB one; it’s considered good value for money on a lot of hardware sites.

Also, someone already referred you to Tom’s Hardware VGA charts: they’re actually helpful, and if you look through their archives you’ll see explanations for most of the technologies present in graphics cards today.

Like kamagurka, I’ve a friend who’s a sysadmin. When I was building a machine last fall, I spent a couple hours in a Fry’s getting his advice via phone. I gave up tracking this stuff around 2002-03-ish. Can’t say I saved any money that way vs. buying a built system, but I mostly got better components and a nice, warm sense of self-satisfaction when I got it up and running a couple days later.

I feel your pain. With this upgrade I ended up looking at Overclockers (after working out what chipset I wanted (GeForce 8800 GTS), and then finding out that the supposedly worse chipset (GeForce 8800 GT) was actually better in every possible way: cheaper, smaller, less power-hungry, runs cooler, faster, etc.), trying to work out if the different manufacturers’ names with only £2 to separate them actually meant anything, and then getting the OcUK own-brand one (which was actually branded as Leadtek) as it was about the only one in stock.

Meh, it works, and it’s a massive jump from the aging GeForce 4 Ti4600 I used to have. I’m actually able to run my games with all the settings at max, without having to worry about dropping frames in graphically-intensive scenes (though the 2.4GHz quad-core Intel might be helping a little).

One thing that’s nice is that when they do a review of a new thing, they include a bunch of old things of the same type for comparison purposes. I particularly like Tech Report’s way of doing benchmarks.

*Shrugs* Maybe it’s just me, but I never really have much trouble shopping for video cards and actually find it quite fun. It usually only takes me a couple hours of research on Tom’s Hardware and shopping on Newegg to find what I want. Like, right now for my dream system, I want an Nvidia 8800GT. To replace the blown-up Radeon X700 in my older computer, I got an ATI HD2600Pro (AGP). It did take me a little bit to figure out why two different 8800GTs from the same manufacturer were priced about $20 apart (one was overclocked, which I don’t need to pay the manufacturer to do for me) and what the difference between the HD2600Pro and HD2600XT was (memory speeds, mostly).

To arrive at the 8800GT, I just looked at the charts on Tom’s Hardware, read a few paragraphs that indicated Nvidia was dominating currently, then selected the fastest Nvidia card in my price range. That narrowed it down to 8800GTS vs. 8800GT. Reading one more article indicated that the GT was the newest card, designed for that mid-range graphics card market, and was better than the GTS.

For the HD2600, I just determined that Nvidia is no longer supporting AGP in its current generation, checked out ATI’s latest offerings (noting the AGP support), then just went for price point and position on Tom’s Hardware’s VGA charts. Since it was to replace a burnt-out card on a 4-year-old machine, I decided sub-$100. It was merely a choice between X1650, HD2600Pro, and HD2600XT. I chose the middle one as the best speed without going over budget.

I definitely do see the appeal of consoles, but being primarily an FPS gamer, I will not move to consoles until they start supporting mice as controllers. I know there’s a mouse peripheral for the PS3, but that console is lacking in the “3D shooters I want to play” market. I know there is a mouse adapter for the Xbox, but having used it, it’s still not as good as PC control interfaces.

Wow… I’m forwarding this to my dad. He does computer set-up/ordering/fixing, and we went through something like this when I wanted to buy a new video card. Incidentally, I gave up and used an older one given to me by the wife of a guy who just died. That’s one way to get a card, anyway…

(Oh, and this is my first post, but I’ve been reading your game design posts and finished DM of the Rings. Nice blog!)

When I got my first PC a couple of years ago, I was blessed to have a tech-savvy friend help me sort it out so I could do things like play the Half-Life games. Otherwise my brain would have glazed over just thinking about the process.

Yeah, I did this a year ago, and I’m so glad I’m done with it. Endless hours of reading reviews and all!
But if you want real advice you’ll have to delve into a forum. Try these; these sites are pretty good:

And you’re still using an AGP motherboard? I feel really bad for you now, Shamus. Looks like you need a mobo upgrade too.
I believe they have a do-it-yourself guide (or links) on how to build a kick-butt rig for really cheap. And don’t be afraid to ask them for help. Be very specific about your needs and wants and they will generally at least give you a link to where you should look.

Now, if you’re like me: I decided that I would be on top of the game this time, and was prepared to spend two grand on a system, and I’m extremely happy with it in the end. Everything can be turned up to max. I figure I should be safe for at least five years and then have to do it again. =(

The best graphics cards for the money, December 2007. Use that chart as your starting point – you should consider any card listed on that chart to generally be better than any other card for the same price, and should only consider wavering from that list if you do further research to make up your own mind.

That link above also has a chart at the end that groups cards together. I believe Shamus has a 6200, so you can look at that chart and see which other cards by nVidia/ATi are about the same, worse, or better than what he currently has.

This means, for example, that if you can find an ATi X800 AGP card somewhere, for example on eBay, then that may be a better buy for you than one of the other cards (which are current models only) earlier in the chart. The X800XT PE for AGP is an excellent choice; I recently sold mine second-hand for $200 NZ.

The problem is that the hardware market wants it to be hard to understand and wants to put out 15 graphics cards a month. Because sadly they don’t make their money from reasonable consumers who upgrade every few years. They make their money from Joe I’mlivingrentfreeinmyparentsbasement and Timmy MyparentscanonlyshowmelovebybuyingeverythingIwant, who both go out and buy the next best card every few months because it has 10 more diddledues per thingybob than the last card they got.

It’s all confusing and costs too much money. For the cost of one of these super PCs nowadays that can run these new games, you can buy all three major platforms and a big-screen TV to play them on. PC gaming just isn’t worth the hassle anymore.

I think it’s pretty clear what they’re trying to do…drive you to the higher priced graphics cards.

As you have discovered, no mere mortal can successfully navigate the venomous snakepit that is “mid-range” graphics cards, but any idiot can pick out a “high-end” card, because there are usually only 2 to choose from.

They are hoping that you will give up on your hunt for perfectly serviceable year-old technology and buy their brand new generation of products.

Best advice for anyone in this position: look for backdated articles, find out what the best card was a year ago, look for the EXACT model number, and buy it on newegg.com. The price will be extremely fair, and your order will be handled flawlessly. Also they take pictures of every item inside and outside the box from multiple angles, so you won’t buy the wrong one accidentally.

I usually start by finding out which price range I’m in, and then research a little in various forums to find the ideal product. It seems that in each generation of cards, or within each price range, there is a card that’s the most cost-efficient. It doesn’t really matter if it has whatever shaders or what kind of RAM, as long as you know you’re getting the most bang for your buck.

I just bought a graphics card, and luckily, it wasn’t such an ordeal for me. But that’s just because my computer is 3 or 4 years old with a “stock” video card (so ANYTHING is an improvement), I really only want to play WoW (which isn’t very demanding compared to many games), and Futureshop had one on sale (an Nvidia 5500, I think) for $40 (a third the price of the next cheapest model). It should arrive any day now. :)

I know my hardware inside and out, and because of that, I love PC gaming. I took a $300 computer and turned it into the monster I’m using at home now. However, if I didn’t know so much, I would run to consoles in a heartbeat.

These are the only two I’ve found that have what I need: DX10, AGP, and under $120. Sadly, neither one is a safe buy. The comments are filled with horror stories about getting the drivers working. I’m not going to plonk down that kind of money and then spend hours hacking away at config files trying to coax the thing to life.

Looks like the issue here is that when it comes to AGP, NVIDIA has nothing available and ATI has stuff available with dodgy drivers and no official support.

“Buying other electronics is fun, but buying graphics hardware is homework. ATI and NVIDIA need to adopt a policy of sensible naming of product lines, fewer products, greater differences between products, and (most importantly) clearly delineated graphics generations, so that consumers can look at a product and know what it is without needing to read the long list of specs. In an ideal world, they shouldn't even need to understand the meaning of things like DirectX 9.0c and 3.0 pixel shaders. They should know that X is better than Y, and buy accordingly.”

-Shamus

That entire sequence of statements is exactly what a console is, ha ha. I stepped into the world of the console a long time ago. Yes, you lose some things in the translation, but I have never had to wonder if my PS3 will play something. I have never had to wonder about compatibility. I have never had to look into chipsets, serial numbers, or strange features.

I like my PC, I really do. I keep it up to date for what I want to play, which is mostly MMOs, when I need to, but not for everything. Add to the long list of reasons why consoles are winning the fact that on a console the game will look great. You don’t have to scale it back, lower the resolution, and turn off all the features. Yes, you can get a better-than-console look on a top-of-the-line PC, but that is what we are arguing against. For the price of the awesome graphics card, you could get a console that would play most of the same games you want to play on your PC, and do it looking awesome. Instead, for the price of an awesome graphics card, a PCI Express motherboard, a quad-core proc, and a cooling system, you could have all 3 major consoles and some games to boot.

DX10? Are you using Vista? I just kind of assumed that you were running XP; must have been the implied age of the machine.

If you are not using Vista or planning to upgrade to it in the near future, there is no point in limiting yourself to a DX10-compatible card since XP can’t use it anyway. (Naturally, Vista itself is a demanding taskmaster on the hardware in any case…)

When I wanted to upgrade my graphics card, it took me and a friend two days to figure out I needed to build a new computer for it. The only things left from my old computer are the hard drive and processor. Then I had to spend an hour on the phone with Microsoft tech support to re-activate Windows because it didn’t like my new hardware.

Well, I’m not sure how long your expected lifespan of this card will be. A year? Two years? Longer? At this point, AGP support is a crapshoot since both Nvidia and AMD are not exactly making it a priority.

Without knowing the rest of your system specs and your expected hardware lifecycle, I’m not sure whether to suggest you cheap out, upgrade substantially, or just hold off for a new system altogether. It could be that swapping out your motherboard may be an option, but then I have to wonder if the OS is OEM or retail. :)

I’m expressing significant interest in this since I assembled 2 machines for myself back in August so I went though the whole product research song-and-dance myself. Since then, I’ve been keeping up so all this crap is relatively fresh in my head.

The comments are filled with horror stories about getting the drivers working.

I have a system with an ATI X300 and can tell you for sure that the drivers totally suck for it. I’ve had numerous blue screens on this system solely because of the graphics card. Let’s not even get into the occasional graphical corruption I get just within Windows.

I had very similar issues with the X600 that I used to use at my old job, using several different driver versions. ATI makes fine chipsets (*nods toward the Xbox 360 and Wii*) but their drivers are crap.

A while ago, I got the Orange Box; of course, on coming home I found out the PC had too little graphics card RAM, yada yada. My father, thankfully, works in the computer industry, so he set about changing settings in the BIOS and trying various other graphics cards (there was one called the All-in-One 128, which is fine as a name, but it only had 32MB).

Anyway, the story has a happy ending: he had a more recent laptop and I played it on that instead.

ATi is actually pretty easy to sort out. The number that matters is the third one from the right.

A 9800, an X800, and an X1800 are the same level of card (enthusiast) with different levels of DX9 support: 9.0 (SM 2.0), 9.0b (SM 2.0b), and 9.0c (SM 3.0) respectively. Thank Microsoft for ramping up the shader models in the middle of a DX9 build for that confusion. SM 3.0 should have been DX10.

Think of the X as “10”, the X1 as “11”, and all of a sudden it makes some degree of sense. Then they go to the “2” series and the model numbers go to hell again (since they had just left the 8, 9, 10, 11 progression), but the third-from-the-right rule still applies. If it ends in “500” you’re getting a bargain card. If it ends in “300,” regardless of how many numbers come before it, you’re getting crap.
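The “third digit from the right” rule is mechanical enough to write down. Here is a sketch in Python; the tier labels and digit cutoffs are my own paraphrase of the rule above, not official ATI terminology:

```python
def ati_tier(model: str) -> str:
    """Guess the market tier of a pre-HD/early-HD ATI Radeon from its
    model number, using the third-digit-from-the-right rule."""
    digits = "".join(ch for ch in model if ch.isdigit())
    if len(digits) < 3:
        raise ValueError(f"no usable model number in {model!r}")
    tier = int(digits[-3])  # the digit that actually matters
    if tier >= 8:
        return "enthusiast"
    if tier >= 6:
        return "mid-range"
    if tier >= 5:
        return "bargain"
    return "crap"

# 9800, X800, and X1800 all land in the same tier, as described above:
for model in ("9800", "X800", "X1800", "X1650", "X1300"):
    print(model, "->", ati_tier(model))
```

The point of writing it out is how absurd it is that a shopper effectively has to run this algorithm in their head to know whether a card is any good.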

DX10 is pretty much a farce anyway. The few companies that do use it don’t seem to focus a lot of energy on it. Crysis runs better, even in Vista, using the DX9 executables. The fact that the Xbox 360 is a DX9 machine pretty much guarantees that will be the standard for a couple of years at least.

You should get a decent AGP card now and save your cash for a PCI-E system down the road. Though odds are good that a top-end AGP card will see you all the way to the death of PC gaming, the way things are going. Let’s hope not, though.

I can generally make sense out of NVIDIA’s at a glance (since the FX series the first number from the left is the series, and the second number from the left is level within the series, starting at 2, for low-end, to 8/9, for the best). The tags at the end are a bit more confusing, but I’ve found that LE is a bit of a lower end version of the card (sort of like how the GF3 Ti200 is a bit slower than the regular GF3), then the GS, GT, GTX, and Ultra models are progressively better. Thankfully, in my experience lower-end chips with nice-looking tags don’t generally perform better than higher-end chips without tags. For instance, my 6600GT might be clocked higher than a 6800, but the 6800 generally performs better than mine due to its 256-bit bus width.
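That reading of NVIDIA model numbers can be sketched the same way. The decoder below is my own illustration; the suffix ordering (LE below the plain model, then GS, GT, GTX, Ultra ascending) is the rough ranking from this comment, not anything NVIDIA documents, and the tiebreak rule encodes the observation that a 6800 beats a 6600GT:

```python
import re

SUFFIX_ORDER = ["LE", "", "GS", "GT", "GTX", "ULTRA"]  # rough, assumed ranking

def decode_geforce(model: str):
    """Split an FX-era-or-later GeForce name like '6600GT' into
    (series, tier, suffix)."""
    m = re.search(r"(\d{4})\s*([A-Za-z]*)", model)
    if not m:
        raise ValueError(f"can't parse {model!r}")
    digits, suffix = m.group(1), m.group(2).upper()
    return int(digits[0]), int(digits[1]), suffix

def probably_faster(a: str, b: str) -> bool:
    """Crude same-generation comparison: higher tier digit wins, and the
    suffix only breaks ties between otherwise-identical chips."""
    def rank(x: str) -> int:
        # Unknown tags are treated like the plain model; this is a guess.
        return SUFFIX_ORDER.index(x) if x in SUFFIX_ORDER else 1
    sa, ta, xa = decode_geforce(a)
    sb, tb, xb = decode_geforce(b)
    return (sa, ta, rank(xa)) > (sb, tb, rank(xb))

print(probably_faster("6800", "6600GT"))  # True: tier beats suffix
```

Of course, exceptions like the 8800 GT outperforming the older 8800 GTS are exactly why no simple decoder like this can be trusted, which is the whole complaint.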

Zaghadka’s explanation for ATI cards helped, because before I had no idea what was going on with them. :P I have to admit to still being very confused about some things as well. For instance, I have no idea what the hell’s going on with the RADEON HDs, and the fact that they placed the X1050 — an R300-family device roughly comparable to the X550 — between the R400 and R500 chipsets doesn’t help matters. I guess they were trying to put it “where it belongs,” but it would be far less confusing if they kept the different chipset families separate. For instance, give the R300s the X prefix, the R400s the X1 prefix, the R500s the X2 prefix, etc. That would do a better job of separating the major chipset revisions and of preventing confusion (like how the X1300 supports Shader Model 3.0 while the X1250 doesn’t).

I find the best site to go to is either Tom’s Hardware or AnandTech. Google either and it will be the first link. Tom’s has a very useful “best card for the money” article which tells you which card to get in each price range and is updated when new cards come out. If you are looking for under $200, the new 3000 series ATI is pretty killer; if you are looking for around $250, someone earlier mentioned the 8800GT. Those would be my recommendations right now. You will have a card that will last a good 4-6 years.

I just read that you have AGP; well, that changes your choices, but the 7800 GS is in the $140 price range nowadays. I had that card previously and I liked it a lot. There is a rumor of an ATI 3800 series AGP version; that would be the king of the hill, but probably way over $200.