Instead of creating new threads every time AMD/ATi, nVidia or Intel try to blow us away with the latest technology, I thought we might as well keep it all in the same thread and keep things nice and tidy, since most of these topics are related in some way or another...
My main sources are The Inquirer and CustomPC, but I don't check them all the time or always feel like sharing info (because it's not always worthy of its own thread lolz). This thread might be a good way to keep each other up to date, I believe.

I'll go first

AMD loses $396 mill
^ Funniest part is this:
"Rival Intel made $1.86 billion in a simlar period, but still decided to cut costs by axing 2,000 staffers."
I am never buying another Intel product, no matter how much better their benchmarks might be...

"Rival Intel made $1.86 billion in a simlar period, but still decided to cut costs by axing 2,000 staffers."
I am never buying another Intel product, no matter how much better their benchmarks might be...

I applaud you for your righteous fury.

I hope Intel buys AMD.

"No, Mama. You can bet your sweet ass and half a titty whoever put that hit on you already got the cops in their back pocket." ~Black Dynamite

The only reason Intel has been so competitively priced and has brought out good-value chips is because AMD is their main competitor; without that they'd become complacent and probably scr*w the customer over even more.
*cough*look at M$*cough*

It'd more likely be a mega-conglomerate interested in entering the CPU market, like Sony for example. They've always said they're going to do it; I'm not sure what they're currently up to...

I can't see it happening anytime soon though. Even if Phenom doesn't take off with the desktop/gaming crowd, the server market, ATI products and project Fusion will keep them around for a while yet... For the same reasons as negsun, I'd never purchase an Intel product again... pull them out of PCs in the dumpster, yes (like the dual-core Pentium D powering my media center PC).

Edit:
To keep things balanced and unbiased, here's the (alleged) first review of the 8800GT!
Very good results I must say, and very consistent too. Shame for all the GTS owners out there, cause this puppy owns it lolz

Still, this is a perfect opportunity to start a rant about hardware manufacturers taking the enthusiast crowd for a ride.

I like tech as much as the next guy, but putting everything into context, and considering the prices of the higher-end items, there are some very important provisos here...

This applies to CPU and GPU hardware in particular. People should think about...

1. What do you actually do with your computer?

2. What monitor do you have, and what are its max resolution and refresh rate?

I'm lucky enough to have a monitor that runs at 1600p, but it outputs this at 60Hz. This means that anything I can EVER hope to see is capped at 60fps. At lower resolutions it allows a higher refresh rate and hence higher fps... Hence, I can play Bioshock, Oblivion etc. at 1600p with maxed-out settings and keep a steady and pretty 60fps with no problems on my 8800GTS. This is why I never forked out that extra $200+ for the GTX. Until a monitor that does 1600p at 120Hz comes out, I wouldn't bother with a higher-end card.

Most monitors people have lurk around the 70-75Hz mark, so they need to keep this in mind when considering their purchases. Do they need the GTX, which gets 103fps in <insert game here>, or could they get by with the GTS or ATI equivalent, which does 82fps with all effects on? That extra 21fps reported by Fraps etc. doesn't make it to your eyes, so why should you bother? I've always asked this of gaming tech-heads and have never gotten a response that makes any sense. It's also difficult to talk about 'future-proofing' GPUs: with DX10 revisions coming, not many of the 'enthusiast' crowd are investing in a graphics card now that they'll still be running in three years' time.
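Just to put rough numbers on that argument, here's a minimal sketch in Python. The fps figures are the hypothetical ones quoted above, and 75Hz is just an assumed typical monitor of the day; the point is only that frames rendered beyond the refresh rate never get displayed.

Code:

# frames rendered beyond the monitor's refresh rate are never shown
def visible_fps(rendered_fps, refresh_hz):
    """fps your monitor can actually display."""
    return min(rendered_fps, refresh_hz)

monitor_hz = 75  # assumed typical refresh rate

for card, fps in [("GTX", 103), ("GTS", 82)]:
    shown = visible_fps(fps, monitor_hz)
    print(f"{card}: renders {fps}fps, you see {shown}fps, "
          f"{fps - shown} frames/sec never reach your eyes")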

There are a few 1080p screens out which operate at 120Hz, but they are very expensive and actually oriented towards the home theater crowd atm. I haven't seen a 1600p monitor that does 120Hz yet, though I'm sure they aren't too far away.

Quite simply, the GPU manufacturers do a really great job of duping the higher-end purchasers out of their cash, which is quite annoying.

This can even be argued with CPUs as well nowadays. With the constant revisions Intel and AMD are making, a three-year-old CPU will never perform as well as its current contemporary.

For those working with applications that make high-end CPU and GPU demands, e.g. graphics, HD video editing/compositing and animation apps, the argument for going high-end is stronger, though those people will more likely have custom-built (often Mac-based) machines running a Quadro card, simple as that.

So, at the end of it all, buying the highest-end tech should be a purchase you think about carefully. It's a high price to pay for bragging rights!

As for myself, the 8800GTS is handling itself OK at 1600p considering its 60fps cap... until a bunch of games come out that I really want to play and that actually challenge this, I now have the opportunity to squander cash on something other than my PC!! (actually saving for a holiday atm)

That's totally true. I mean, here I am playing Jade Empire and NWN2 on my OC'ed Barton 2600+ at 2.1GHz and a GeForce 6800... All this stuff is years old technology-wise, but does it still do the trick? Hell yeah.

That's why even if I do get around to upgrading, next time I'll probably be going for the X2 5000+ Black Edition and a cheapo X1950Pro. Why? Cause that's all I'll ever need. My monitor atm is 16"; when I ever upgrade to a new monitor the max I'd go for is probably a 19" or 22" widescreen one... which caps out at about 1680x1050.

Consider the fact that you'll never see me playing Oblivion or FEAR or STALKER or even Crysis (not a huge FPS fan, though I might give Crysis a go at the lowest settings in the future), so I don't really need much more than that.

It's the same with overclockers these days: they buy a ridiculously expensive chip to OC it even more, whereas the whole point of it (IMO) is to buy a cheap chip and pwn people who paid lots more for the same performance. [/rant #2]

That, and our critical flicker frequency (where we see something as a continuous light instead of as a series of flashes) maxes out below 50Hz for most people, unless you're looking at an extremely bright, big spot of light. Some people can go up to 65Hz (and a few with super-eyes may perceive higher frequencies), but most of us looking at a monitor won't perceive anything different past 60Hz anyway. Now, if you have a game that won't keep the fps up around 60 unless you have a humongous graphics card, then it might be worth it.

Quote:

Originally Posted by Jae

That, and our critical flicker frequency (where we see something as a continuous light instead of as a series of flashes) maxes out below 50Hz for most people, unless you're looking at an extremely bright, big spot of light. Some people can go up to 65Hz (and a few with super-eyes may perceive higher frequencies), but most of us looking at a monitor won't perceive anything different past 60Hz anyway. Now, if you have a game that won't keep the fps up around 60 unless you have a humongous graphics card, then it might be worth it.

Thanks Jae. I was going to go digging for a similar article. 60 is indeed the magic number, it seems. There are some who say aiming for 80fps in games at the top end is good, as it means your lower-end or mean fps can sit around 60... but this is very variable depending on which game, which settings and what hardware.

I wonder about those 120Hz home theater TVs. Not even the highest-definition video formats clock in at 120fps! Heck, Blu-ray doesn't even go anywhere near it...


I smell a sales gimmick aimed at the cashed-up home theater crowd.


Well, the general perception is that bigger/faster/stronger/whatever is better (I smell a Ray comment coming.... ), but I bet most people have never heard of critical flicker frequency, much less know that there's actually a cap to what the eye can perceive. So I can't _entirely_ blame the sales force. My guess is that most companies don't have vision specialists on staff to point these things out, and with the drive to improve fps, even if some specialist did point that out, Joe Public wants the higher fps even though it doesn't make a difference. The companies risk being left behind if they don't keep up with what the market wants, or perceives it wants.


When I was working in optics I found it funny that people spent loads on a fancy HDTV but didn't care to buy glasses when their distance visual acuity was below par lolz... You just spent thousands on a TV but your sight isn't even sharp enough to make out the difference from your old one!

Once all of the price-gouging dust settles after the initial launch it should be a good deal. Hell, it outperforms the GTS (640MB) for less money (even with the blatant price-gouging!) and gets quite close to GTX levels in some games. It also uses less power than either one, and it has all of the video encode/decode features of the 8600xx cards, which both the GTS and GTX lack.

I can't wait to see how the HD3800 stacks up to it.

"They should rename the team to the Washington Government Sucks. Put Obama on the helmet. Line the entire walls of the stadium with the actual text of the ACA.

Fix their home team score on the board to the debt clock, they can win every game 17,000,000,000,000 to 24. Losing team gets taxed by the IRS 100%, then droned."

I'm happy to see that the 8800 GT made progress in reducing power consumption, probably due almost entirely to the shrink to a 65nm process. I'm also pleased that NVIDIA is now supporting more video decoding formats on the hardware side, like the lower-end 8xxx cards do. After buying my 7800 GTX back in 2005, I learned that it's oftentimes better to wait for the new mid-range card to show up (7950 GT), as that often has a better performance/price ratio. The 8800 GT confirms my theory: Anandtech says it should be priced in the US$200-250 range, and that is my sweet spot for a graphics card.
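You can see the mid-range sweet-spot theory in one bit of arithmetic. A quick Python sketch, with made-up placeholder prices and fps (not real benchmark numbers), just to illustrate the ratio:

Code:

# made-up placeholder numbers, purely to illustrate fps per dollar
cards = {
    "high-end launch card": {"price": 599, "fps": 100},
    "mid-range refresh":    {"price": 225, "fps": 80},
}

for name, c in cards.items():
    # performance/price ratio: frames per second per dollar spent
    print(f"{name}: {c['fps'] / c['price']:.2f} fps per dollar")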

Is AMD's new RV670 chip going to be produced as the HD3800?


Seems like it. I read in an article that they wanted to step away from the HD2000 name because it didn't impress much... True, false? I dunno. But that seems to be the name they're going for.

In other news:
There's an article on the first Intel Penryn (45nm) chip here

It's the Quad Extreme version though, so not your average Joe's chip. They did a few benchies and an OC as well, and they seem quite impressed with the first 45nm chip out there... It runs quite cool and uses less juice apparently. Now let's just see AMD get their finger out and actually roll out something good, hm?

lolz... after being reminded of the workings of the human eye, thanks to Jae's very informative post, I've all of a sudden become very disinterested in the GT, or anything for that matter. I'm strictly applying a "does it do 60fps at 1080-1600p" rule to any game I buy now... my guess is my GTS will serve me for a while to come.

I think I'm slowly breaking free of my CPU upgrade obsession... we're planning to go see the pyramids next year, so more spending money for me

Quote:

I think I'm slowly breaking free of my CPU upgrade obsession... we're planning to go see the pyramids next year, so more spending money for me

The way I see it, nothing will bottleneck you as quickly as your RAM or HDD or GPU will; the CPU is rarely the first thing to hold you back. So unless you're running an absolute dinosaur, you should be good for at least a few more years (and I recall you've got an X2 6000+, so that'll cut it for a wee while yet I'm sure... plus socket AM2 is supposed to hold all AM2+ and AM3 CPUs as well, so even then you wouldn't have to change your mobo).

OT: Watch out in Egypt, it absolutely scorches there, and you'll be surrounded by armed guards when you visit the pyramids (Israel is just around the corner after all, and they loves to shoot the Arab peoples, yes they do) [/strong political opinion]

I'm not being facetious when I say it's good to see Linux being utilised in such a way. Software support will most likely be provided by the vendors, and as it's designed for general use it shouldn't encounter massive issues, I'd imagine. No "OMG... I'm not getting full DX10 lighting effects in Crysis with nvidia driver version 16x.xx..."

well, Nvidia is about to lose a lot of wind out of their sails with the 8800 GT. the "new" Radeon 3800 series launch is soon upon us (November 19), and prices will range from $150-$179 for the 3850 and $200-$230 for the 3870. this will put the new Radeons in an excellent price range for mainstream buyers, and performance is supposed to be on par with or better than the current 2900 XTs. and since these cards use a 55nm GPU, it's my guess that overclocking is going to be a dream.

Asus seem keen on stepping up to play in the big league, and quite right too! I personally haven't used Asus, but I've never heard a bad word about them either. (I've been using MSI products for many years now and will continue to do so, cause I'm familiar with their BIOS and all that. It doesn't have all the latest tricks and maybe isn't as popular, but it works for me and I like them )

Quote:

Originally Posted by stingerhs

well, Nvidia is about to lose a lot of wind out of their sails with the 8800 GT. the "new" Radeon 3800 series launch is soon upon us (November 19), and prices will range from $150-$179 for the 3850 and $200-$230 for the 3870. this will put the new Radeons in an excellent price range for mainstream buyers, and performance is supposed to be on par with or better than the current 2900 XTs. and since these cards use a 55nm GPU, it's my guess that overclocking is going to be a dream.

Keep in mind that it's Fudzilla, and therefore should be taken with a grain of salt (just like the Inquirer), but the pictures look legit. Looks like ATI's taking the single-card, multi-GPU plunge first.

"They should rename the team to the Washington Government Sucks. Put Obama on the helmet. Line the entire walls of the stadium with the actual text of the ACA.

Fix their home team score on the board to the debt clock, they can win every game 17,000,000,000,000 to 24. Losing team gets taxed by the IRS 100%, then droned."

^^^^
well, the 790X/FX boards are impressive enough for me to recommend them over Intel's X38, which is currently the only real competition feature-wise. so i think it's kind of a so-so idea to put a multi-GPU card on the market when you can just get two 3850's on a 790X/FX board, probably for less money, and still have very similar performance.

granted, that's not going to help if you want a C2D proc, but i still think it's the best solution. besides, it's my guess that ATI desperately needs to put out a card that can truly challenge the 8800 GTX. so i guess this card fills that void, but i still don't think it's the best solution.

I dunno... even if it did work out 'cheaper than an 8800' to get an SLI/X-Fire setup going... it has been proven *over* and *over* that such linked setups only really deliver at high resolutions... (except Crysis!) If someone wants to be ''1337 pwnage'' at 1280x1024, they should save their money (and electricity) and stick to a single decent card.