Jon Peddie Research has released its findings for the graphics market in Q3 2012, with bad news for the market as a whole, though not so bad for NVIDIA. The downward trend in PC sales has dragged on the overall graphics market: the number of units sold dropped 5.2% from this time last year, and only NVIDIA saw a rise in units sold. AMD saw a 10.7% drop in units shipped, driven by a 30% quarter-over-quarter decline in desktop APUs and a drop of just under 5% in mobile processors. Intel's overall sales dropped 8%, with both segments falling roughly equally. NVIDIA's strictly discrete GPU business, meanwhile, saw a 28.3% gain in desktop market share and 12% for notebooks compared to last quarter.

Worth noting is what JPR includes in this research beyond what we used to think of as the graphics market: any x86-based processor with a GPU is counted, from tablets to desktops, as are IGPs and discrete cards. ARM-based devices, cell phones, and all server chips are excluded.

"The news was terrific for Nvidia and disappointing for the other major players. From Q2 to Q3 Intel slipped in both desktop (7%) and notebook (8.6%). AMD dropped (2%) in the desktop, and (17%) in notebooks. Nvidia gained 28.3% in desktop from quarter to quarter and jumped almost 12% in the notebook segment.

This was not a very good quarter: shipments were down -1.45% on a Qtr-Qtr basis, and -10.8% on a Yr-Yr basis. We found that graphics shipments during Q3'12 slipped -1.5% from last quarter, as compared to PCs, which grew slightly by 0.9% overall (however, more GPUs shipped than PCs due to double attach). GPUs are traditionally a leading indicator of the market, since a GPU goes into every system before it is shipped, and most of the PC vendors are guiding down for Q4."

AMD is having a string of successes with its 28nm 7000 series graphics cards. While it was dethroned by NVIDIA's GTX 680, the AMD Radeon HD 7970 is much easier to get hold of. The company certainly seems to be having an easier time manufacturing its GPUs than NVIDIA is with its Kepler cards. AMD has been cranking out HD 7970s for a few months now and has refined its binning process to the point that a good number of chips come out with healthy headroom above the 7970's stock speeds.

And so enters Tahiti 2. Tahiti 2 represents GPU silicon that bins not only at HD 7970 speeds but can push the default clock speed higher while running at lower voltage. As a result, these GPUs can stay within the same TDP as current 7970 cards but run faster.

But how much faster? SemiAccurate is reporting that AMD is seeing as much as a 20% clock speed improvement over current Radeon HD 7970 graphics cards. This means cards able to run at clock speeds up to approximately 1075MHz, quite a bit above the current reference clock speed of 925MHz!
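A quick back-of-the-envelope check on these rumored figures (a sketch using only the clocks quoted above; the final Tahiti 2 speeds are still speculation):

```python
# Numbers taken from the article; final Tahiti 2 clocks are still rumored.
reference_clock_mhz = 925   # current Radeon HD 7970 reference clock
rumored_clock_mhz = 1075    # rumored Tahiti 2 clock ceiling

# Relative improvement of the rumored clock over the reference clock
uplift = (rumored_clock_mhz - reference_clock_mhz) / reference_clock_mhz
print(f"Uplift at 1075 MHz: {uplift:.1%}")                       # ~16.2%
print(f"A full 20% bump: {reference_clock_mhz * 1.2:.0f} MHz")   # 1110 MHz
```

Worth noting: 1075 MHz works out to roughly a 16% uplift over the 925 MHz reference clock, so the full 20% figure would land closer to 1110 MHz.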

The AMD 7970 3GB card. Expect Tahiti 2 to look exactly the same but run at higher clock speeds.

They further report that, because the TDP has not changed, no cooler, PCB, or memory changes will be needed. This should make it that much easier for add-in board partners to get the updated reference-based GPUs out quickly and with minimal cost increases (we hope). You can likely count on board partners capitalizing on the 1,000MHz+ speeds by branding the new cards "GHz Edition," much as the Radeon HD 7770 has enjoyed.

With 7970 chips binning higher than needed, an updated, lower-power refresh may also be in order for AMD's 7950 "Tahiti Pro" graphics cards. Heck, maybe AMD can refresh the entire lineup with better-binned silicon while keeping the same clock speeds, reducing power consumption across the board.

Although there were quite a few rumors leading up to AMD's Radeon 7000 series launch, the Internet has been very quiet on the greener side of the graphics market. Finally, however, we have some rumors to share with you on the NVIDIA front. As always, take these numbers with more than your average grain of salt.

Specifically, EXP Review managed to uncover two charts that supposedly detail specifics about a range of GeForce 600 series Kepler cards from the number of stream processors to the release date. Needless to say, it's a lot of rumored information to take in all at once.

Anyway, without further ado, let's dive into the two leaked charts.

Model    | Code Name | Die Size | Core Clock (TBD) | Shader Clock (TBD) | Stream Processors | SM Count | ROPs | Memory Clock (effective, GDDR5) | Bus Width | Memory Bandwidth
GTX690   | GK110x2   | 550mm2   | ~750 MHz         | ~1.5 GHz           | 2x1024            | 2x32     | 2x56 | 4.5 GHz                         | 2x448bit  | 2x252GB/s
GTX680   | GK110     | 550mm2   | ~850 MHz         | ~1.7 GHz           | 1024              | 32       | 64   | 5.5 GHz                         | 512bit    | 352GB/s
GTX670   | GK110     | 550mm2   | ~850 MHz         | ~1.7 GHz           | 896               | 28       | 56   | 5 GHz                           | 448bit    | 280GB/s
GTX660Ti | GK110     | 550mm2   | ~850 MHz         | ~1.7 GHz           | 768               | 24       | 48   | 5 GHz                           | 384bit    | 240GB/s
GTX660   | GK104     | 290mm2   | ~900 MHz         | ~1.8 GHz           | 512               | 16       | 32   | 5.8 GHz                         | 256bit    | 186GB/s
GTX650Ti | GK104     | 290mm2   | ~850 MHz         | ~1.7 GHz           | 448               | 14       | 28   | 5.5 GHz                         | 224bit    | 154GB/s
GTX650   | GK106     | 155mm2   | ~900 MHz         | ~1.8 GHz           | 256               | 8        | 24   | 5.5 GHz                         | 192bit    | 132GB/s
GTX640   | GK106     | 155mm2   | ~850 MHz         | ~1.7 GHz           | 192               | 6        | 16   | 5.5 GHz                         | 128bit    | 88GB/s

From the chart above, we can see the entire lineup of Kepler cards, from the GTX 640 up to the dual-GPU GTX 690. The die size of the higher-end GeForce cards is approximately 50% larger than that of the AMD Radeon HD 7970, but not much bigger than that of the GTX 580. If only we knew the TDP of these cards! In the next chart, we see an alleged performance comparison versus the AMD competition.
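The bandwidth column in the chart above follows directly from the other two memory columns: peak bandwidth is the effective data rate multiplied by the bus width in bytes. A quick sketch checking a few of the leaked rows:

```python
def gddr5_bandwidth_gbps(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective data rate x bus width in bytes."""
    return effective_clock_ghz * bus_width_bits / 8

print(gddr5_bandwidth_gbps(5.5, 512))  # GTX680 row: 352.0 GB/s
print(gddr5_bandwidth_gbps(5.0, 448))  # GTX670 row: 280.0 GB/s
print(gddr5_bandwidth_gbps(4.5, 448))  # one GPU of the GTX690: 252.0 GB/s
```

The leaked numbers are at least internally consistent, for whatever that is worth when judging a rumor.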

Model    | Bus Interface | Frame Buffer | Transistors (Billion) | Price Point | Release Date | Performance Scale
GTX690   | PCI-E 3 x16   | 2x1.75 GB    | 2x6.4                 | $999        | Q3 2012      | N/A
GTX680   | PCI-E 3 x16   | 2 GB         | 6.4                   | $649        | April 2012   | ~45% > HD7970
GTX670   | PCI-E 3 x16   | 1.75 GB      | 6.4                   | $499        | April 2012   | ~20% > HD7970
GTX660Ti | PCI-E 3 x16   | 1.5 GB       | 6.4                   | $399        | Q2/Q3 2012   | ~10% > HD7950
GTX660   | PCI-E 3 x16   | 2 GB         | 3.4                   | $319        | April 2012   | ~GTX580
GTX650Ti | PCI-E 3 x16   | 1.75 GB      | 3.4                   | $249        | Q2/Q3 2012   | ~GTX570
GTX650   | PCI-E 3 x16   | 1.5 GB       | 1.8                   | $179        | May 2012     | ~GTX560
GTX640   | PCI-E 3 x16   | 2 GB         | 1.8                   | $139        | May 2012     | ~GTX550Ti

If these numbers hold true, NVIDIA will handily beat the current AMD offerings; however, I would wait for reviews to come out before making any purchasing decisions. One interesting aspect is the amount of GDDR5 memory. It seems that NVIDIA is sticking with 2GB frame buffers (or less) per GPU while AMD has really started upping the RAM. It will be interesting to see how this affects gaming in NVIDIA Surround and/or at high resolutions.

What do you guys think about these numbers? Do you think Kepler will live up to the alleged performance scale figures?

Galaxy, a popular maker of NVIDIA graphics cards, recently announced that it is extending the warranty on its graphics card products to three years. "Galaxy has listened to the enthusiast market and we are glad to move from a 2 year warranty to a 3 year warranty by registration." The new extended warranty applies to all graphics cards purchased after August 1st, 2011 that are then registered with Galaxy. Qualifying products will also bear the seal shown below to let customers know the graphics card is covered.

Seeing warranties being extended is always a good thing, especially in a world where the once popular lifetime warranty is rare. What do you think of the extended warranty? Will this be enough to push you towards a Galaxy branded card on your next purchase?

At Computex 2011, NVIDIA plans to showcase the latest addition to its mobile graphics lineup, the GTX 560M GPU. Powered by a mobile version of the 500 series desktop GPU, the graphics card will bring support for NVIDIA's Optimus, 3D Vision, and PhysX technologies. On launch, there will be two notebooks from Asus and Toshiba, the G74sx and a Qosmio gaming laptop respectively, with many more to follow.

The Asus G74sx and Toshiba Gaming Notebook

From a performance aspect, the GTX 560M purports to deliver twice the performance of the current GTX 540M mobile chip. According to GeForce.com, in Crysis: Warhead the GTX 560M pulls a respectable 30-40 FPS at 1080p resolution with "Gamer" detail settings. This is in contrast to the older GTX 540M, which can only maintain 30-40 frames per second at 1080p at the lowest detail settings. In 3DMark Vantage, the GTX 560M scored 10,000 points whereas the older 540M only pulled off approximately 4,200 points. Andrew Coonrad of NVIDIA's Technical Marketing department further stated that the graphics card would play both The Witcher 2 and Duke Nukem Forever at approximately 50 frames per second.

GeForce states that if you are a mobile gamer looking for an easy-to-carry gaming notebook that offers Optimus' battery-saving technology and 3D Vision's gaming features, laptops with the GTX 560M are the way to go, as the older GTX 480M is not nearly as power efficient (and thus less portable). Laptops with the new graphics card are in stock now at several online retailers.

Not to be left out of the slew of NVIDIA GeForce GTX 560 releases, KFA2 announced two new graphics cards in its lineup. Both are based on the GeForce GTX 560 GPU, but one card is overclocked and fitted with an aftermarket heatsink and fan combo (the other uses a standard single, centered, shrouded fan design). Labeled the KFA2 GeForce GTX 560 1GB 256bit and the KFA2 GeForce GTX 560 EX OC 1GB 256bit, the DirectX 11 cards offer the following specifications:

Specification     | GeForce GTX 560 1GB 256bit | GeForce GTX 560 EX OC 1GB 256bit
CUDA Cores        | 336                        | 336
GPU Clock         | 810 MHz                    | 905 MHz
Shader Clock      | 1620 MHz                   | 1810 MHz
Memory Clock      | 2004 MHz                   | 2004 MHz
Memory            | 1 GB GDDR5 on 256-bit bus  | 1 GB GDDR5 on 256-bit bus
Memory Bandwidth  | 128.3 GB/s                 | 128.3 GB/s
Texture Fill Rate | 45.3 Billion/s             | 50.6 Billion/s

The two new cards appear to be positioned (specification-wise) between purely reference cards and competitors' highest-clocked GTX 560 cards. Street price will ultimately determine whether they are worth picking up versus other brands with higher clocks, or reference-clocked cards with aftermarket cooling. KFA2 states that the cards will be available online and in retail stores throughout Europe, backed by a two-year warranty.
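For what it's worth, the two derived rows in KFA2's table check out against the base specs. A small sketch (assuming GDDR5's data rate is twice the quoted 2004 MHz clock, and the GTX 560's 56 texture units, a known GF114 figure not stated in the table):

```python
def memory_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s; GDDR5 moves data at twice the quoted clock."""
    return mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

def texture_fill_rate_gts(gpu_clock_mhz: float, texture_units: int) -> float:
    """Bilinear texture fill rate in billions of texels per second."""
    return gpu_clock_mhz * texture_units / 1000

print(round(memory_bandwidth_gbs(2004, 256), 1))  # 128.3 GB/s, both cards
print(round(texture_fill_rate_gts(810, 56), 1))   # ~45.4; the table truncates to 45.3
print(round(texture_fill_rate_gts(905, 56), 1))   # ~50.7; the table truncates to 50.6
```

The fill rate scales linearly with the GPU clock, which is why the EX OC's 95 MHz overclock shows up in that row but nowhere in the memory figures.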

Coinciding with the NDA lift on the NVIDIA GeForce GTX 560, Gigabyte announced its enthusiast-class Overclock Edition graphics card based on the new GTX 560 GPU.

The new Overclock Edition replaces the reference design's cooler with Gigabyte's own WindForce 2X variant, which the company claims reduces the card's noise under full load to 31 dB. Further, the heatsink uses direct heat pipe technology, meaning the heat pipes that carry heat away from the GPU and into the fins physically contact the GPU itself. The two fans produce 30.5 CFM of airflow to quickly dissipate the heat of the overclocked GPU, which allowed Gigabyte to set factory clocks of 830 MHz on the GPU and 4008 MHz on the memory. Gigabyte also claims a 10% to 30% improvement in overclocking capability thanks to its "Ultra Durable" copper PCB technology and power switching enhancements.

The full specifications of the GeForce GTX 560 Overclock Edition are as follows:

Model Number: GV-N56GOC-1GI
Core Clock: 830 MHz
Shader Clock: 1660 MHz
Memory Amount: 1 GB
Memory Type: GDDR5
Memory Bus: 256-bit
Card Bus: PCI-E 2.0
Process Technology: 40 nm
Card Dimensions: 43mm (h) x 238mm (l) x 130mm (w)
Power Requirements: Minimum 500 Watt PSU
DirectX Support: 11
Outputs: 1x mini HDMI, 2x DVI, 1x VGA (via adapter); HDMI and DisplayPort via adapter(s)

Gigabyte is a popular motherboard manufacturer for enthusiasts and it seems that they are striving to gain that same level of consumer brand loyalty with their graphics cards. Do you have a Gigabyte graphics card in your rig?

VR-Zone reports that NVIDIA is gearing up to deliver a revised edition of the GTX 590 in June to combat the overheating problems that some overclockers fell victim to using certain drivers. PC Perspective did not run into the issue when overclocking its card; however, VR-Zone stated in an earlier article that:

"NVIDIA has sent out a cautionary to their partners regarding possible component damage due to high temperature when running Furmark 1.9 as it bypasses the capping detection. . . . This is something that cannot be fixed through drivers, nor is it applicable just to the GeForce GTX 590."

Fortunately for overclockers, NVIDIA is planning to re-engineer aspects of the design, including new inductors, which should help with the over-current protection issues. This new design will also affect the size and dimensions of the GTX 590 PCB, which means that third-party heatsinks and water blocks made for the current GTX 590 will not fit.

It is nice to see NVIDIA sticking by its technology and updating its hardware to fix issues. Overclockers, especially, will benefit from this updated model.

The NVIDIA GeForce GTX 560 is due to release on May 17th. As the release date approaches, speculation and rumors have flooded the Internet. GeForce.com has stepped up to preview what the card looks like and how it fares in three soon-to-be-released PC games versus the 9800 GT at the popular 1080p resolution. GeForce chose the 9800 GT for comparison because it found the card to be one of the most popular in use on Steam. As games become more graphically advanced and 1080p monitors more common, NVIDIA wanted to show what the GTX 560 is capable of versus a card that many people are familiar with.

While they were unable to share exact hardware specifications and performance numbers (due to NDA), they were able to show which graphics detail settings the card could run at 1080p while maintaining at least 35 frames per second. The stated "Optimal Playable Settings" for the GTX 560 were then compared to the 9800 GT in three games. These three soon-to-be-released games were each chosen for their ability to showcase high resolution, high PhysX detail, and NVIDIA Surround. The GTX 560 handled all three of those features with ease, whereas the older but popular 9800 GT struggled to play games with those features smoothly. The system configuration used to test both cards is as follows:

Motherboard: ASUS P8P67 WS Revolution
CPU: Intel Core i7 2600K @ 3.4GHz
RAM: 8GB DDR3 1333MHz, 9-9-9-24
Operating System: Windows 7 x64

The first game they showcased was Duke Nukem Forever. GeForce states that Duke Nukem will support both NVIDIA 3D and PhysX. The graphics details they were able to achieve with Duke Nukem Forever are:

Resolution: 1920x1080
Texture Detail: Medium
Shadow Detail: Medium
Shadows: World & Characters
Motion Blur: On
AA: Off
Film Grain: On
Post Special Effects: On
Stereoscopic 3D: On

The GTX 560 managed to pull off at least 35fps. Conversely, the game was not playable at these settings with the 9800 GT. Specifically, the 3D feature was not practical with the 9800 GT.

Alice: Madness Returns was the second game GeForce showed off. One interesting aspect of Alice is its usage of PhysX: the graphics quality is much improved by the textures and particles PhysX adds, as you can see in the comparison screenshot below.

The GTX 560 managed to run the game at the following settings:

Resolution: 1920x1080
AA: On
PhysX: High
Post Processing: On
Dynamic Shadows: On
Motion Blur: On

The 9800 GT they compared against was a "slide show" by comparison, with the demands of PhysX largely responsible for the reduced performance. The 9800 GT simply was not capable of processing both high resolution graphics and heavy PhysX calculations. The GTX 560, however, was capable of running the game at maxed-out settings (at 1080p).

GeForce finally showcased the GTX 560 running Dungeon Siege III. In this test, they utilized three monitors in an NVIDIA Surround configuration. The graphical settings they were able to get out of the GTX 560 included:

Resolution: 5760x1080
Motion Blur: On
Shadow Quality: Insane
Texture Quality: High
Shader Quality: High
Visual Effects Quality: High
AF: On
MSAA: 8x

Their results are as follows:

"On these settings, which were near maximum aside from anti-aliasing which tops off at 16x, the average framerate was again consistently smooth and playable. Here, the ultra-wide experience allowed us to immerse ourselves into some deep dungeon crawling. Unfortunately for the 9800 GT, the GPU in SLI does not support NVIDIA Surround, making it impossible to play at the 5760x1080 resolution. "
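The Surround arithmetic above is worth spelling out: 5760x1080 is simply three 1920x1080 panels side by side, so the GPU has to render three times the pixels of a single 1080p display every frame.

```python
# Pixel counts for a single 1080p panel vs. a three-panel Surround setup
single = 1920 * 1080     # one 1080p monitor: 2,073,600 pixels
surround = 5760 * 1080   # three panels side by side (3 x 1920 wide)

print(surround // single)  # 3 - triple the rendering workload per frame
```

That tripled workload is why Surround has historically been the domain of SLI setups, and why a single mid-range card handling it at near-max settings is notable.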

The GeForce GTX 560 is reported to be positioned between the GeForce GTX 460 and GTX 560 Ti on the NVIDIA side, and between the 6870 and 6950 (1GB) on the AMD side. At 1080p, it has so far been a toss-up for many DIY enthusiasts between the AMD 6950 (2GB) and the NVIDIA GTX 560 Ti for maximum performance. If GeForce's preview holds true for other games, the GTX 560 may well provide another option for enthusiasts after bang-for-the-buck price and performance at 1080p resolutions.

As for speculation and rumors about the graphics card's hardware, plenty have floated around the Internet. For example, Tech Connect states that the GTX 560 will feature 336 CUDA cores, 56 texture units, and 1GB of GDDR5 memory on a 256-bit bus, and maintains that the card is rumored to be priced at approximately $200. Given NVIDIA's statement that the card will be positioned between the GTX 460 and GTX 560 Ti in performance, the GPU will likely be clocked somewhere between the GTX 460's 675 MHz and the GTX 560 Ti's 820 MHz, with the memory clocked slightly below the GTX 560 Ti's 4008 MHz.

Unfortunately, until the NDA is lifted, only NVIDIA can tell us what the real specifications of the GTX 560 will be, and they are not talking. You can, however, find further details as well as a video of the soon-to-be-released card in action over at GeForce.com, and PC Perspective will have a review up with benchmarks galore and the official hardware specifications as soon as the NDA is lifted on May 17th.