
crookedvulture writes "Filling the gap between mid-range graphics cards around the $200 mark and high-end excess that costs upward of $500, Nvidia has added a $350 GeForce GTX 570 to its stable of graphics cards. Based on the company's latest GF110 GPU, the GTX 570 offers equivalent performance to last year's flagship GTX 480 with lower power consumption and a cheaper price tag. The value proposition is strong with this one, although as The Tech Report's review points out, it would be wise to hold out until AMD's "Cayman" graphics card breaks cover, which it's expected to do next week."

I'm using an HD 4350. The biggest news I've noticed lately is that it now shows up in lm_sensors output, and the Gallium driver can render Minecraft properly, albeit slowly. At this rate of progress it'll probably have fast 3D in a few months, then the sugar on top like OpenVG. And it won't suddenly stop working on the whim of one company a few years down the line; that's why I don't buy Nvidia any more.

"We're fucked"? Whatever gave you the idea that any American car company was fucked? They don't even need to do well in the marketplace any more; all they have to do is make some crappy cars, watch them sell poorly, then whine to the government and ask for a big bailout.

GPU sales don't work like that; these cards aren't the latest Xmas toy. Prices simply drop depending on the competition, which could happen after Xmas or before it. Right now, if you have $350 to spend, you can get a 570 and get 80% of the much more expensive 580. Or, if you don't mind CrossFire, you could spend ~$380 on two HD 6850s and get even closer to the 580 in many games.
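A rough way to compare those options is performance per dollar, using the figures quoted in this thread. This is only a sketch: performance is normalized to the GTX 580 = 1.0, and the 0.90 for two HD 6850s is an assumed stand-in for "even closer to the 580", not a benchmark result.

```python
# Rough performance-per-dollar sketch using the thread's figures.
# Performance normalized to GTX 580 = 1.0; prices are approximate.
setups = {
    "GTX 580":    (500, 1.00),
    "GTX 570":    (350, 0.80),
    "2x HD 6850": (380, 0.90),  # assumption, not a measured number
}

def perf_per_dollar(price, perf):
    """Normalized performance bought per dollar spent."""
    return perf / price

for name, (price, perf) in sorted(
        setups.items(), key=lambda kv: -perf_per_dollar(*kv[1])):
    print(f"{name}: {perf_per_dollar(price, perf) * 1000:.2f} per $1000")
```

On these (assumed) numbers the dual-6850 setup comes out ahead on value, with the 570 second and the 580 last, which matches the poster's point.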

Two people each plan on spending $150 per year on video cards. Person (A) buys a brand-new $150 video card every year, and Person (B) buys a brand-new $300 video card every two years.

Person (A) ends up spending half his time with better performance than Person (B), and vice versa, of course. The difference is that at the end of two years, Person (A) has accumulated two extra video cards while Person (B) has accumulated only one.
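The equal-budget arithmetic above can be sketched directly; the prices come from the post, and `cards_and_cost` is just an illustrative helper.

```python
# Equal-budget upgrade cadences: Person A buys a $150 card every year,
# Person B a $300 card every two years. Over any even horizon the
# spending matches, but A ends up having owned more cards.
def cards_and_cost(price, interval_years, horizon_years):
    """Cards bought and total spend over the horizon."""
    n = horizon_years // interval_years
    return n, n * price

a_cards, a_spend = cards_and_cost(150, 1, 2)   # Person A over 2 years
b_cards, b_spend = cards_and_cost(300, 2, 2)   # Person B over 2 years
assert a_spend == b_spend == 300               # identical total spend
print(a_cards, b_cards)                        # 2 cards vs 1 card
```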

Or you could realize that the performance difference between this card and a decent card from two years ago is something like 5%, that there's not a single game on the market that would really tax a card at a third of the price, and finally realize, like the rest of us did years ago, that there's absolutely no point in spending more than $200 on a video card unless you're doing professional 3D rendering.

That's the issue I have. I got a 470 a few months ago and I can't find anything that gives it a workout. I guess the good thing is that, as long as it doesn't fail, I have a graphics card that will max out (or nearly max out) any game for the next 3-5 years.

The performance difference is most certainly larger than 5% (consider that it can be upwards of 50% faster than the 470, which launched earlier this year), and you fail to consider that this performance is delivered at fewer watts, saving power both in itself and through the reduced need for cooling. Benchmarks from AnandTech [anandtech.com] show that Crysis will give this card a workout at 2560x1600 with high settings, so it's somewhat disingenuous to claim there's nothing out there that will tax it. It's a card for enthusiast gamers who want the highest resolutions and graphics settings, so it's definitely not something the mainstream will care about.

The new cards also have significant compute advantages over previous-generation cards. The 570 has 4x the performance of a 285 in some benchmarks [anandtech.com]. The 285 came out less than two years ago and cost significantly more at release. OpenCL is giving graphics cards the opportunity to do a lot of things besides 3D rendering; for some workloads, investing in these powerful graphics cards is a lot better than buying better CPUs.

I'd argue that a decent card two years ago was the long-lived 8800 GTX. These cards are head and shoulders above a $150 card from two years ago. Jumping from a wheezy 8600 GT to a GTX 460 1GB is like night and day, even when also jumping from 1680x1050 to 1920x1080 (about 0.3 megapixels more to render). I went from barely 25fps in BFBC2 to a solid 40fps with the GTX 460, and that's about half the speed/"power" of the new 570. I'd wager the difference is closer to an order of magnitude than a mere 5% difference.
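The resolution claim above is easy to check with raw pixel counts; nothing is assumed beyond the two display modes mentioned.

```python
# Pixel-count check for the jump from 1680x1050 to 1920x1080.
old = 1680 * 1050   # 1,764,000 pixels
new = 1920 * 1080   # 2,073,600 pixels

extra = new - old
print(extra)                     # 309,600 extra pixels (~0.31 MP)
print(round(extra / old * 100))  # ~18% more pixels to render per frame
```

So the jump is closer to a third of a megapixel than a quarter, and roughly 18% more work per frame at the same settings.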

I have a problem with manufacturers' model-number naming when I'm shopping for toys. I'm trying to research graphics for notebooks, and Nvidia just released some new notebook chips too, including the GT 540M.

The GT 540M should not be named that when it's exactly the same architecture and hardware as the GT 435M, with a minor 22MHz bump in GPU clock speed and a 12% increase in memory clock speed. Maybe name it the GT 440M? Pushing out an entire new 5xx series number for a tiny incremental change probably due only to better

Heh, I've almost given up on following the model numbers, and just head directly to http://www.videocardbenchmark.net/ [videocardbenchmark.net] to get a *general* idea of where a card falls in the grand scheme of things.

I've sort of been toying with the idea of another gaming notebook, but I was kinda disappointed in my last one (Inspiron 7200 with a Geforce 4200 Go)... it seemed great for a couple of years, but still went out of date before its time... Dell never released drivers for anything newer than WinXP, and even under Linux

A 5970 is sorta like two 5870 chips in CrossFire on one board, and they had to downclock it slightly to make it stable. Yeah, the benchmark/driver doesn't appear to make use of the additional shader units, but then that means most games probably don't either. At least now you know where to go from here.

The Nvidia corporation is not about to allow you or your family access to any kind of documentation, code, or anything else for that matter. You can use their cards on free software systems, but you have to submit to their Binary Blob world order to do so, and if you're willing to do that then you might as well run Windows. I've heard it's improved somewhat since 3.1, and people seem to like it. AMD are, on the other hand, barely making an effort to help free software driver development by publishing

You can use their cards on free software systems, but you have to submit to their Binary Blob world order to do so and if you are willing to do that then you might as well run Windows

That quote makes no sense at all. Most desktop users of open-source software don't really care about source availability. They just care that Linux/BSD/whatever is better at supporting the jobs they need their computer for. Just think about how few of the users who run desktop Linux actually have the ability to modify the source of anything.

As a longtime Linux user, I couldn't care less that I need a closed-source binary blob to run my graphics card. You know what? I'd rather trust the guys at NVIDIA to write a solid, well-performing driver for their own hardware than have a buggy, less capable equivalent, even if it's open source. I have had multiple Linux boxen at home running many different generations of NVIDIA hardware, with few problems over the years. Apart from forced driver obsolescence (old hardware not supported in the latest drivers), I've not had any problems with the binary drivers from NVIDIA under Linux. They just work.

[...]I've not had any problems with the binary drivers from NVIDIA under Linux. They just work.

On the other hand, nobody could pay me enough money to run AMD graphics cards in Windows and suffer their Windows drivers. Admittedly, ATI made major driver quality and stability improvements around the R300 launch, but not much has improved since then. And that was eight long years ago.

You neglected to mention the ATI/AMD Linux driver situation. The proprietary driver works for about 1 out of 32 models available and it will panic the kernel faster than Steve Ballmer can do a monkey boy dance. They are orders of magnitude worse than the Windows drivers.

The value-proposition analysis at The Tech Report that's mentioned in TFA has its tongue so deep up Nvidia's bottom that it's hard to keep a straight face. They have a scatter plot that puts price against performance. There are 16 setups in the plot; one of them is Nvidia's GTX 580. According to the plot, only 2 of the other 15 cards have worse performance per dollar. Despite this fact, the article says: "Nvidia's newest is actually pretty well positioned on the scatter plot, with only the mid-range multi-GPU solutions [...]"