
MojoKid writes "When AMD announced the high-end Radeon HD 7970, a lower cost Radeon HD 7950 based on the same GPU was planned to arrive a few weeks later. The GPU, which is based on AMD's new architecture dubbed Graphics Core Next, is manufactured using TSMC's 28nm process and features a whopping 4.31 billion transistors. In its full configuration, found on the Radeon HD 7970, the Tahiti GPU sports 2,048 stream processors with 128 texture units and 32 ROPs. On the Radeon HD 7950, however, a few segments of the GPU have been disabled, resulting in a total of 1,792 active stream processors, with 112 texture units and 32 ROPs. The Radeon HD 7950 is also clocked somewhat lower at 800MHz, although AMD has claimed the cards are highly overclockable. Performance-wise, though the card isn't AMD's fastest, pricing is more palatable and the new card actually beats NVIDIA's high-end GeForce GTX 580 by just a hair."

IDK what you're smoking, because you are just plain wrong. The benchmarks show the 7950 as faster than the 6970, the fastest last-gen AMD card (except for the dual-GPU monstrosity that is the 6990, of course). Unless you actually meant the dual-GPU card, which is not in any way, shape, or form a fair comparison.

The 6000 series mostly was the 5000 series. The high end may be a bit different, but the upper midrange (6770, 6870 stuff) was largely the same chips with some minor stuff tacked on: 3D and some more advanced video support, mainly, IIRC.

More RAM too. There were not many 2GB models in the 5000 series, but there are a bunch of 2GB and 3GB models in the 6000 series. Nvidia did the same thing with the 8000 and 9000 series: the 9000 series was the 8000 series with a few more features. I have an 8800GT 512MB card and a 9800GT 1GB card. The 9800GT doesn't take any extra power, while the 8800GT needed a 6-pin power plug to function correctly. The difference? Nothing that I can see. The 8800GT is actually 'faster' according to tests.

So when will there be cards affordable by normal people? Also for me the biggest thing to come out of the new design is that we should be able to get a passively cooled card with more performance than the HD5750.

When Kepler comes out expect all these cards to significantly drop in price.

GCN was a huge cost on AMD's part, and Kepler will be a refinement of Fermi, so Nvidia will price the 600 series aggressively (especially since they won't launch for another 2 months) and still make a profit on them. And expect AMD to take a loss on the architecture investment, but not on the returns from fabricating the 7900 series (assuming they fab the 7800 and lower cards on their old VLIW architecture like last year's roadmap said they would).

So when Kepler comes out, it will probably be aggressively priced, and AMD will drop prices to match. For now they are the only maker of "next gen" GPUs after 2010's 500 and 6000 series, and Kepler is 2 months away, so AMD is milking it.

According to MSI via Fudzilla, the 77xx series will launch in two weeks at $139/$149 and the 78xx series in March at $249/$299. After that the ball is in nVidia's court, but the current guesses are they're not ready until April, sometime around Intel's Ivy Bridge processors. I think it's working: I've looked at the 7950s and am tempted, but will probably wait until then to see if they bring better performance or lower prices, if nothing else to get a better price from AMD. Currently the 7950 costs about dou

The 7700 series will definitely be interesting if you want to build a quiet computer that can still handle most games (albeit not at the highest graphics settings). My latest PC upgrade a few months ago used a 6770, and so far it has handled everything I've thrown at it.

Quiet computer? Well, I bought a 4870, which just burned up, thank god. I got tired of leaving my computer on all night because I couldn't get through boot-up without it sounding like a jet plane taking off and waking the whole house. At least I could control the fan in Windows for gaming, but in Linux it just sat there at 5k RPM. Well, at my old age I went back to school and gave up gaming; no time or money for a video card that costs as much as a console. BTW, playing even WoW or Rift at ultimate cau

Well, the sweet spot is usually about 8-9 months after the release of a new card. That gets all the major bugs out of the manufacturing and all the driver issues hammered out. And the prices have pretty much bottomed out by then too.

I was mostly making a joke, but it is true that eight months from now some will start to wait for next gen rather than buy current gen at a good price. The way I do it is to just buy whenever I think I need it. That way there is no remorse.

Game what at 1080p? 4-year-old games? 1080p is a resolution (1920x1080) with progressive scan; it says nothing about the quality of the image being drawn, it just defines the pixel dimensions and scan mode of the image. You could run the original Xcom at 1080p on a 25-dollar air-cooled card, but that's not what you mean, is it?

If you want to play Arkham City with something close to max settings at 1080p you need a better card than the 5750. If you're willing to take crappy settings then it *might* be possible for an

Not very likely to happen. Most modern games put a lot of stress on the GPU, which means that you either forgo quality or fps, or you install a proper active cooling solution.

The market for functional "silent" solutions is generally an expensive one, as it either uses expensive fans with high-end bearings and bigger blades (allowing slower rotation speeds for the same airflow), or liquid cooling at the high end. You're not going to enter it with a sub-$150 card with passive cooling - these cards are notorious for b

Well, the 6870/6850 was pretty much the bang-for-your-buck card in the last gen, with the 6770/6670/6570 being really affordable for most any aspiring gamer - so I'd assume you'll need to wait for a 7870/7770/7670... shouldn't be all too long now. I'm waiting for the 7770 (or the 7 series equivalent of the 6770) myself - should be a nice reduction in power consumption and noise, coming from an 8800GT.

I sure hope so. I have a 6950 that I flashed. I wanted to get another, but I could not find any more. Still, I might just get the normal 7970 because it just overclocks so well. Still waiting on nvidia so that we can get some price drops, though.

Probably. Though since the cards have a very uniform architecture, with many repeats of the same thing, my guess is that they bin the chips according to the number of stream processors which are defective. This allows them to fab nothing but top-end chips and still get good yields of slightly-below-top-end cards.
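The binning idea being described can be sketched as a toy model. The 2,048/1,792 stream processor counts come from the article summary; everything else here (the classification rule, the names) is an invented illustration, not AMD's actual process:

```python
# Toy model of die binning: Tahiti's full configuration (HD 7970) has 2048
# stream processors; a die with a few defective units can still ship as an
# HD 7950 with 1792 active SPs.
FULL_SP = 2048   # HD 7970: all stream processors enabled
CUT_SP = 1792    # HD 7950: partially disabled configuration

def bin_chip(defective_sps: int) -> str:
    """Classify a die by how many stream processors survived fabrication."""
    working = FULL_SP - defective_sps
    if working >= FULL_SP:
        return "HD 7970"   # perfect die
    if working >= CUT_SP:
        return "HD 7950"   # enough working units for the cut-down SKU
    return "scrap"         # too many defects for either SKU

print(bin_chip(0), bin_chip(200), bin_chip(400))  # prints: HD 7970 HD 7950 scrap
```

The point of the scheme is that a single mask set serves both SKUs: defects that would otherwise scrap a die instead downgrade it.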

GPU manufacturers certainly used to disable non-"pro" level features in cheaper cards (which could be re-enabled by various hacks), though the car

Hate to break the news to ya, but the E350 has been supported OOTB by Ubuntu since 10.04, so I'm sure the others are up and running as well. You seem to forget AMD actually hired coders for the free FOSS drivers as well as handed out the specs; go look that chip up on Phoronix, they say it works just fine.

It has nothing to do with AMD, and frankly you will NEVER get those bits, because it would be illegal to give them to you. AMD has already said there is nothing they can do about HDCP and protected path, as that technology is owned by the HDMI consortium, and giving out that information would break the DMCA as well as get every AMD card blacklisted. If you want those bits you can use the blob, which again Phoronix ran full tests on [phoronix.com] and found runs just fine on Ubuntu 10.04, and Ubuntu 11 runs OOTB; it also smokes Atom + ION on their benches. For a board that costs just $142 for the barebones kit [newegg.com], complete with PSU and case, that makes it a hell of a cheap Linux box, especially when you figure in the fact that you are getting dual core plus Radeon plus the ability to run 8GB of RAM.

But FOSS users are simply gonna have to accept the fact that unless you wanna do like RMS and hop on chinamart for some funky-ass Loongson MIPS netbook, there are NO machines that you are gonna have complete access to, because if it has even slightly modern video output it'll have protected path, and if it has wireless it'll have non-FOSS firmware. Hell, even the Raspberry Pi has Broadcom binary blobs; welcome to reality. In the end what should matter is "does it work", and as Phoronix shows, yes it does, and it beats Atom + ION while having better graphics and often a lower price. Seems like a win/win to me, but if you really have your heart set on Nvidia, the board has a PCIe slot, and there is an open-box GT210 on Newegg for less than $20; knock yourself out. Even with the discrete card it'll still be cheaper than an Atom + ION board.

So far, 3D acceleration is also significantly slower than in the closed-source Catalyst driver. Some of that technology may also be owned by third parties, but it is not as clear-cut as in the case of HDCP.

I suspect AMD's reasons for not releasing that stuff are part legal and part not wanting to give away the latest know-how. But the latter seems a bit silly, as NVidia drivers already have the better reputation and probably the better code. AMD's advantage seems to be on the hardware side, with their chips cran

But it's not like you have an either/or choice here, friend: if you want Nvidia graphics you can slap in a $17 Nvidia discrete card and it'll still stomp Atom + ION, because the Zacate chip is simply the better CPU. But considering how fast the open team is catching up, and the fact that AMD is paying extra coders to help them, I'd say the safe bet is to use the closed driver now and the open one in a year, maybe less. After all, with each release they are closing the gap, and now even Nvidia is going OpenCL, which AMD fully sup

In my experience (OpenSuse, though, not Ubuntu), install first, add extra monitors later, especially if they run at different resolutions. If you use the official/proprietary drivers, be sure the open drivers are completely removed from your system or you'll have a conflict.

The Catalyst drivers just landed on the 27th of January, I think; before that there was a hotfix release for real enthusiasts. Open source support is, as far as I know, still missing, but basic support should not be far away. They've consistently come closer to release date with each release; last time it took 2.5 months and I expect less this time. If you want it the moment it's released, expect to compile your own kernel/xorg/driver though. Don't expect any miracles from the OSS drivers though, as I understand it

But does it run Linux?
No, seriously... last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly.
The restricted drivers exist, but are unstable, awkward and painful. Linux and Nvidia are a bit better in my experience.

I have been doing dual monitor with an ATI/AMD X300 (Benq Joybook 5200G), HD3470 (Toshiba Satellite M300), and HD5650 (Sony Vaio VPCEA36FG). The only time that dual monitor failed me is when I was using Ubuntu 8.10. Currently I'm using 10.10, with a Samsung 43" LCD TV as secondary monitor via HDMI. Mirror and splitscreen work

Actually, the OSS ATI drivers aren't too bad on Linux. I hadn't really messed with any of that stuff in KDE before, so when I did my new Arch install, I was surprised by how easy it was to configure it all. I was kind of irritated that hitting Apply didn't save my settings, and it took me quite some time to figure out there was a separate "save" button somewhere in the display dialogs, but other than that... it's not bad. The only thing that's kind of annoying is the power control. You have to manually se

People expect AMD to be cheaper, even when they are competitive from a performance standpoint. AMD usually aims more for the mid-range market, so seeing a top-end card from them (at top-end prices) is a little surprising.

The market is changing, and the reviewer is reflecting that. People don't want to spend 600 dollars on a top end card, even if 5 years ago the 'top end' cost 800 dollars (or whatever it was).

The perception (rightly or wrongly) is that all of these things should be getting faster and cheaper at the same time. That's not entirely wrong, but it's not entirely right either. A die shrink should mean lower cost for the chip itself, depending on yields, but it has nothing to do with any of the other parts on the PC

Maybe that the first of the 28nm process generation costs about the same as the last of the 40nm process generation released a year and a half ago? Currently the effect on the price/performance ratio has been almost nothing, they've offered higher performance at a higher price. Yes, the 7950 is now beating the GTX 580 in most ways but it's not exactly a massively better deal. Hopefully nVidia will be a bit more aggressive but if they're both held back by TSMC's delivery capacity the duel can get a bit lame.

...well, let's clear things up: I was always an AMD fan. Their CPUs rocked. I had a seriously great time overclocking my SS7 gear until it boiled.

The graphics cards sucked though. I'm talking about the old Radeon AGP cards. Put down your paddles, lads, 2006 was the last time I bought an ATI branded card (an X1800) and IMHO it sucked monkey balls. I couldn't even get it to perform at low resolution on Unreal 2002. That's why I went straight back to the store and swapped it for an NVidia 7600GT. Oh, yeah, life was sweet after that.

A couple weeks ago I bought a secondhand Sapphire HD3650 with 512MB DDR2. OK, it's a bloody old and very low-spec card by tech standards, but it blows my GF 7600GT right out of the water, even on a slower, single-core 64-bit processor running a 32-bit OS. That made me a fan of ATI/AMD graphics right there. The old machine (Core Duo) with the NVidia is now collecting dust.

Lol. You replaced one old outdated card with another :) My personal experience has been that NVidia has excellent drivers. ATI/AMD have better hardware and better visual quality (NVidia often has strange visual artifacts). The downside of ATI is that their drivers are dodgy. It is always a risk upgrading an ATI driver. Sometimes new drivers can break your favourite game until a hotfix comes out (usually takes a fortnight or so). So, whether you go NVidia or AMD depends on what you want: NVidia (ease of use) or ATI (mo

ATi drivers aren't just dodgy... they are awful. I've had a 4870x2 for a while now and I've seen issues ranging from buggy games, to crashing video drivers while playing flash, to green video in flash. I did a completely clean install for the last release and got about 2 days of being able to use Youtube before the green video started again. It is truly incredible that they can make a video driver that can't properly play fucking YOUTUBE VIDEOS with hardware acceleration while at the same time being able to

yea, my 9600GT kicked my 7600GT right square in the nuts; actually just about any card after the 7600GT would have rocked it. You're comparing a sports car to a Yugo. The 7600GT was the absolute worst waste of money I have ever spent on a video card, as my 6600GT actually performed just as well

ATI has better hardware, but their drivers are pure, total crap. It's like buying a Ferrari and putting a mediocre driver behind the wheel. After many problems with drivers I gave up on buying a new Radeon and now I use a GTX580

I'm on a 6950; it is clocked at 810 MHz, but it can do 910 MHz just by using the ATI Catalyst slider. No fancy stuff. If you go into serious overclocking, you can approach 1000 MHz easily if you play with the voltages and stuff.

Moreover, x950s are generally unlockable. For example, I unlocked the 6950 I'm sitting on, enabling 24-30 or so shaders, basically making it a 6970. I could also flash a 6970 BIOS and make it a full 6970, but that's totally unnecessary, since I can get more than that by overclocking.

You don't say. Must not have factored in Nvidia's history of selling and shipping GPUs that were known to be defective and then conspiring with the purchasers to hide this fact from the users until after their warranties ran out.

If they had, this new GPU would outperform Nvidia's by huge leaps and bounds.

Really, what is the point of this any more? 90+% of your games are optimised for consoles first, giving you at best a GeForce 8800GT; computer monitors are not getting any higher resolution; and they still have not come up with a cooling system that doesn't clog with dust in a month!

Lucky you. On my box, regardless of distro, kernel, and catalyst drivers, VLC always segfaults when trying to play accelerated video. It works fine with nvidia, so I have to conclude that the drivers for the AMD card are worthless.

That's me 40 years ago, but you wouldn't know it today, eh? I was in high school, I was a football star. All the girls wanted to dance with me. And I had a Diamond Viper. It was the fastest on the block.

(not the GP) You wouldn't like it. Seriously. Speaking truthfully, my senior year in high school (25 years ago) I -was- a football star at a large Texas school, and drove an older corvette with a big-block. I -also- was a grade-A, unadulterated, shallow, ass-blossom douchebag with absolutely no values worth speaking well of. It's an over-rated experience that set me back years growing up and becoming an actual person. If I had to do it over once again and given a choice, I'd rather lose an arm.

You should probably still be cheering, because that means the last-gen stuff will drop like crazy! Hell, the HD4850 I've got in here now retailed for $240 at release; know how much I paid for it a year and a half ago? $60. And frankly it still cranks out the purty on my 1600x900 monitor.

Of course that gets to the heart of the matter and why they are having to push 3D and GPGPU and Eyefinity: simply because games don't keep up anymore. With the exception of a few games I call "benchmark bait", like Crysis, frankly most of the games are console ports, and all that extra power is sitting there twiddling its thumbs.

So while I'm hoping this will mean I'll find a steal on a 5850 or 6850, just because they crank out less heat, honestly I doubt I really NEED it for any of the games I'm playing. What you'd actually use this card for, except winning benches and showing you have the biggest epeen, is beyond me. Is there even a game that would stress this bitch?

Try a flight simulator like DCS:Black Shark 2 or DCS:A-10C. They will work out any video card pretty hard. So while you may play first person shooters with 300 meter horizons that don't stress your card out, when you get up in the air and have a 20 km horizon your card will be working its guts out.

There are console first person shooters, and then there are PC first person shooters.

Try running BF3 on high/ultra in high resolution. My reasonably overclocked GTX 560Ti can just barely handle high in 1080p, ultra utterly murders it with clear jerkiness present in many situations. On the other hand, it eats MW3 for breakfast in pretty much any resolution/quality I could throw at it. You don't need to crank out a "20 km horizon" to overload a modern card.

And frankly, if a game makes your card render 20 km of ground at a level of detail that actually affects performance, of which you will literally see only a few hundred meters, it's doing it wrong. Badly wrong.

Actually, when at altitude the horizon is far more than 20 km distant. Of course it is not rendered at the same level of detail as the close terrain, but the polygons etc for mountains, lakes etc do need to be processed as you can see them 50 nautical miles away from 50 thousand feet. Also remember that the rendering area goes as the square of the distance: double the range means four times the rendering area (yes, you often look behind you when in an aircraft - TrackIR is wonderful). Incidentally, what is
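The geometry behind this claim is easy to check. The distance to the geometric horizon grows with the square root of altitude, and the visible disc grows with the square of that distance. This is a back-of-the-envelope sketch ignoring refraction and terrain, not taken from any particular simulator:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def horizon_distance_km(altitude_m):
    """Distance to the geometric horizon from a given altitude (no refraction)."""
    return math.sqrt(2 * EARTH_RADIUS_M * altitude_m) / 1000

def visible_area_km2(altitude_m):
    """Area of the visible disc; scales as the square of the horizon distance."""
    d = horizon_distance_km(altitude_m)
    return math.pi * d * d

# At 50,000 ft (~15,240 m) the geometric horizon is roughly 440 km away,
# far beyond the 20 km draw distance discussed above.
print(round(horizon_distance_km(15_240)))  # ~441 km

# Halving the horizon distance (quartering the altitude) quarters the area,
# i.e. doubling the range means four times the rendering area.
print(visible_area_km2(15_240) / visible_area_km2(3_810))  # ~4.0
```

In practice visibility and haze limit the useful range well below the geometric horizon, which is why the 50 nm figure above is plausible even though the geometry allows much more.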

Again, if your game renders the background the way you suggest, it's doing it TERRIBLY WRONG. I once again present the case study, Battlefield 3. It often renders huge backgrounds without a catastrophically increasing impact on either video memory or GPU load (i.e. the view from a plane looking over the entire map vs. the view of a foot soldier looking at his spawn).

This is done using various LOD techniques and is called "optimization". Notably, the end result looks worlds better than any of the games you presented as exa
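The distance-based LOD selection being described can be sketched in a few lines. This is a generic illustration of the technique, not how BF3 or any specific engine actually implements it, and the thresholds are invented:

```python
# Pick a mesh level of detail from camera distance: nearby objects get the
# full-polygon mesh, distant ones progressively cheaper approximations.
LOD_CUTOFFS_M = [100, 500, 2_000]  # invented boundaries between LOD 0/1/2/3

def select_lod(distance_m):
    """Return the LOD index (0 = highest detail) for an object at this distance."""
    for lod, cutoff in enumerate(LOD_CUTOFFS_M):
        if distance_m < cutoff:
            return lod
    return len(LOD_CUTOFFS_M)  # beyond the last cutoff: impostor/billboard

# A foot soldier's view and a pilot's view hit very different LOD mixes:
print(select_lod(30))      # prints 0 (full detail at spawn)
print(select_lod(15_000))  # prints 3 (cheapest mesh on the horizon)
```

The win is that triangle and memory cost stop scaling with draw distance: each extra kilometer of visible terrain lands in an ever-cheaper LOD bucket.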

lol. The maps in BF3 are *tiny*, just a few km by a few km. The backgrounds are merely animated 'sky domes'.

The equivalent sky dome in a flight simulator is much, much more distant than that. NB: BF3 is a 'game' and doesn't cut the mustard in *simulation* terms (that's ok, it's not trying to be a sim, but let's call it what it is). Even Arma2, which is vastly better than BF3 in terms of simulating ground combat (which is ok, since Arma2 is a sim and BF3 is merely a game), is weak when it comes to air

No offence, but the way graphics are handled in most simulators nowadays is an afterthought at best. And it shows. BF3 can produce beautiful scenery for several kilometers, while the utterly ugly (both aesthetically and technically) graphics in most modern simulators can eat almost as much of both GPU and memory, and the end result looks horrible in comparison to BF3.

Has it ever occurred to you that most of graphics engine design is not about looking as realistic as possible, but

> No offence, but the way graphics are handled in most simulators nowadays is afterthought at best. And it shows.
No offense taken. You are wrong, however. Graphics are not an afterthought at all in sims (assuming you have actually used anything modern - oops, that's right, your card can't handle them, which was my point). Go and check out the in-cockpit shadows on the A-10C or the Ka-50. BF3's cockpits are lame in comparison (they have nice textures but are essentially static). The soldiers, foliage and

Uhhh... I didn't say there wouldn't be ANY guys who could use this; I said there would be very few, because frankly most are playing BF3 and not a hardcore FS. Have you LOOKED at the Steam hardware survey lately? I mean, you may have triple 2500-resolution monitors, but that just makes you 0.01 percent of the population. The biggest settings are 12x10, 16x9, and 16x10 last I checked; the majority were dual cores, but quads were climbing, while we 6-core players are still a tiny minority (which when you can

These are the circles I move in; here is someone talking about their setup: http://forums.eagle.ru/showthread.php?p=1390782#post1390782 [eagle.ru]
It is good that lots of people like BF3; it is a good game. However, the original argument was not a lame "is BF3 the bestest game out there" debate, but an assertion that an old video card is good enough. I said that there are people out there (e.g. myself and the other folks playing in my genre, like the person linked above) for whom video cards can't ever have enough

You want distance, pal? Try Just Cause 2. You can climb on top of a mountain and... wow, the view is just stunning, with the snow whipping, and you can then jump and free-fall alllllll the way through the different climate zones down to the jungle floor. I actually tied a bike to the back of a 707, let the 707 pull me up to about 25,000 feet, then cut the line and did the wickedest free-fall bike stunt you'd ever seen... played just fine on my HD4850, BTW.

Actually, what really matters to performance these days is the amount of video RAM you have. It is good that you are content with low-end graphics. Many of us are not (with the right game it is easy to tell the difference between a low-end and a high-end PC gaming system).

That's what happens, though. Expect that the Xbox 3 and PS4 will have something on par with a 7000-series Radeon or a 600-series (not yet commercial) Nvidia card, at which point, to keep up with a console, you'll need something new. (I don't have any insider information here, but that would be consistent with the projected timelines and everything that has happened in the past.)

That's how the market has worked for a long time. The consoles come in and reach performance parity with PCs by being sold at a loss f

Take any shooter or MMO with really large maps and corresponding memory requirements.

For instance, "All Points Bulletin" comes to mind. After a few minutes, it always brought my PC (AMD dual core, 2GByte RAM, NVidia 8600 GT) to its knees by requiring 2GByte of memory or more for itself. The CPU and GPU seemed to have no problem, as the game ran fine until the memory limitation kicked in. So I guess the CPU and GPU in current-gen consoles might be able to handle the load as well. But memory-wise, they would r

Memory on consoles is a completely different beast than on a PC. On a console you know exactly how quickly you can pull data in from the optical drive, and you have a good idea about the hard drive. On the PC you figure most people have a couple of gigs of RAM, so you may as well use it, and you have no control over what else is using those resources on the system, so you're better off using RAM than relying on disk access. You also have very different memory-space requirements with the GPU (you might be mirrorin

Dude, you ought to snatch one of the Thubans while Tiger has them so cheap. Hell, they have the 1035T for $105 after rebate, and you can get the full kit for less than $300; it's sweet, dude, real sweet. If your board can't take the Thuban and you have DDR2, I'd get the ASRock A770DE+; that's what I picked up, since I had 8GB of DDR2 from my ECS quad (damned lying ECS bastards, saying their board would take a Thuban when it wouldn't). Or if you've only got 2GB you can get an ASRock DDR3 board for like $30 a

Thanks for the hint, but that was several months ago and I have since upgraded one of my other computers (an old P4). The upgrade consists of a new AM3 board, a Phenom II 910e quad core, 4GByte of DDR3 ECC RAM, a Radeon HD6670 and a new hard disk. While not the very fastest, this system is easy on the electricity bill (only about 80 watts under light duty) and should last me a few more years.

The ex-P4 is now my primary PC, and the dual core I tried APB on has been demoted to secondary.

Okay, I gotta ask... why? The E series is a horrible chip; it loses in every metric, and you could have simply underclocked a standard chip: not only would it have been cheaper, you would have saved on electricity while keeping the ability to ramp up if needed. Frankly, the only E series I touch is when a customer has a Socket AM2 board and wants a cheap upgrade, as Starmicro [starmicroinc.net] has some E-series quads dirt cheap, and if the customer just wants a render box or video converter, better to have 4 slower cores than 3

I mean this one: http://products.amd.com/pages/DesktopCPUDetail.aspx?id=623 [amd.com] The 910e is a "Deneb" core. At 2.6 GHz, it is not that bad in performance, and the official TDP is half that of most standard AMD quad cores at the time I bought it (65 W vs. 125 W). You may get lucky with a standard chip that happens to be close to the 910e in power consumption,

With the plummeting prices on monitors (at least for those of us blessed with undemanding taste for the finer details of color reproduction and perfectly uniform luminosity; I can't speak for the poor fellows who have to buy the good stuff), I am appreciating the increasing number of video outputs that some of AMD's newer cards offer. When you can get a 1920x1080 panel in the 21-ish inch range for ~$120, more video outputs means more sweet, sweet screen area without the hassle of accommodating mu

Unless he's running Linux, for which I will admit ATI/AMD's driver support has been nearly or completely non-existent for quite some time, he's totally full of shit.

I have never gone more than a total of two weeks with a driver-related problem on an ATI card on a windows based system for any game. As opposed to NVidia from whom I haven't purchased a card since the debacle where I couldn't play SW:KotoR for over a month due to their problems.

I meant ever. There was a problem with SW:KotOR 2 crashing like clockwork after an hour of play that took 13 days to resolve. Other than that, and one 6-hour period where there was a conflict with WoW on my 4890 (which I heard about after it was already fixed), there haven't been any major problems that I know of that could possibly have affected me.