The fruity tax-dodging cargo-cult has finally admitted that it has pulled the plug on its self-driving car project.

For months the Tame Apple Press has been claiming that Apple technology would be driving users to work rather than just driving them crazy. However, we warned in September that Jobs’ Mob had killed the project off.

At the time we said that the reason was that it could not find any partners in the car industry who were dumb enough to meet all of Apple’s demands.

Now Apple has confirmed that the hardware side of the somewhat stupidly named Project Titan has been axed and its staff have been re-allocated to other departments. Titans were so old hat anyway, having been out-evolved and replaced by the Olympic gods. A ship named after the Titans smacked into an iceberg and sang in what came to be known as a “Céline Dion” incident.

Apple is still hoping the Titan software can be flogged to car makers for use in their own vehicles. In fact, it has told those in charge of Titan to produce something feasible before the end of 2017, or else.

According to Bloomberg the whole project had been a Titanic failure of leadership for years until Bob Mansfield took over the team. Mansfield looked at the project and realised that doing a Tesla when you are used to making expensive gadgets for rich kids with more money than sense was impossible. It was not as if Apple was innovative or anything. He thought it was better to kill the idea of being a Tesla competitor and concentrate on a technology platform that could be sold to third parties.

Another important reason was that Apple normally forces its partners to buy huge volumes of components to obtain economies of scale. The car industry does not work that way and told Jobs’ Mob it was their way or the highway. Then the talented people Apple hired to get the project working saw the writing on the wall and left for Tesla.

Apple tried to partner with other companies who were just not used to being arrogantly told what to do by someone new to the industry who did not know their arse from their elbow.

Nvidia’s Titan X Pascal is out on the streets from today and the chip is being talked up on the GeForce.com site.

When we say talked up, we mean it: apparently the chip has an “irresponsible amount of power”.

“We packed the most raw horsepower we possibly could into this GPU. Driven by 3584 NVIDIA CUDA cores running at 1.5GHz, TITAN X packs 11 TFLOPs of brute force. Plus it’s armed with 12 GB of GDDR5X memory – one of the fastest memory technologies in the world,” the site enthuses.

Titan was supposed to bridge the gap between the consumer-centric GeForce lineup and pricier Quadro professional cards. Nvidia launched this card during an AI meetup in San Francisco, which is odd really. With 44 TOPS of INT8 performance courtesy of a new deep learning inferencing instruction, it should be headed to neural networks and machine learning rather than AI.

The Titan X Pascal has 3,584 CUDA cores with a 1,417MHz base clock and a 1,531MHz boost clock. That is half a gig faster than the older Maxwell GPU-based Titan X and more than 1,000 CUDA cores more than the GeForce GTX 1080.

It has 12GB of next-gen GDDR5X memory clocked at 10Gbps, connected to the GPU over a 384-bit bus providing 480GB/s of memory bandwidth, aided by Pascal’s new delta colour compression system.
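As a sanity check, those headline figures hang together arithmetically. Here is a quick sketch, assuming the usual two FLOPs per CUDA core per clock (one fused multiply-add) and bandwidth as per-pin data rate times bus width in bytes:

```python
def peak_tflops(cuda_cores, clock_ghz, flops_per_core=2):
    """Theoretical peak throughput in TFLOPS, assuming one fused
    multiply-add (2 FLOPs) per CUDA core per clock."""
    return cuda_cores * flops_per_core * clock_ghz / 1000

def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits // 8 * data_rate_gbps

# 3,584 cores at the 1,531MHz boost clock
print(round(peak_tflops(3584, 1.531), 1))  # 11.0 -> Nvidia's "11 TFLOPs"
# 10Gbps GDDR5X on a 384-bit bus
print(memory_bandwidth_gbs(384, 10))       # 480 GB/s
```

These are theoretical peaks, of course; sustained throughput in real workloads is always lower.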

This is a little surprising because it lacks the super-fast high-bandwidth memory first seen in AMD’s Fury cards, which the Titan X was expected to get. This means that we will not see second-generation high-bandwidth memory in Nvidia gear until the end of the year.

Nvidia’s not providing any additional architectural details about the new “GP102” processor inside the Titan X. What it is saying is that Titan X Pascal will offer 60 percent greater performance than the previous Titan X. This should mean that it can play most top games at 60fps with high settings at 4K resolution.

The Titan X Pascal uses 250 watts of power, which is pretty much the same as the original Titan X. It needs 8-pin and 6-pin power connectors. The two-slot card measures 10.5 inches long by 4.376 inches and has a DVI-D port, an HDMI 2.0b port, and three DisplayPort 1.4 connections.

There appears to be the same vapour chamber cooling with a blower-style fan as seen in the GTX 1080. This time the angular metal shroud is black.

If you ever wondered why Blizzard cancelled Titan, the follow-up to World of Warcraft, the answer is pretty much why you might have expected – nothing worked at all.

This was a little surprising given that in 2010, Blizzard announced that it had a team of crack developers working on an MMO that would prove to be the successor to the highly popular World of Warcraft. However, in September 2014, without giving any reasons, the studio announced that the game had been cancelled. It went on to support WoW with regular updates.

Blizzard designer Jeff Kaplan told Gamespot that the project was a complete failure and totally missed the mark in the design stages.

According to Kaplan, that fact hit the Titan development team hard, too. Kaplan didn’t need to get too much more specific, but he went on to say that the failures were across the board, in “every way that a project can fail.”

He added that this was especially hard to stomach for the team as, being a part of team Blizzard, they had only known success in the past. Kaplan said the pressure of being responsible for one of the studio’s only failures led to a sense of embarrassment. That, in turn, helped forge the team into a stronger unit, one that felt it had something to prove.

When everyone shifted over to Overwatch, Kaplan and the team saw it as their last chance to prove themselves and show they were still capable of making a great game. The beta for that looks good, and it does not sound anything like what happened with Titan.

According to the latest report, it appears that Nvidia is on the verge of releasing a new dual-GPU graphics card that will place two GM200 GPUs on the same PCB.

According to a report from Wccftech.com, the upcoming dual-GPU graphics card has not only been already showcased to a couple of select members of the press at a secret briefing in New York City, but some of those have managed to score a sample and are wrapping up their reviews.

According to the same report, the upcoming flagship graphics card could bear the GTX Titan branding. A rather surprising piece of information is that the upcoming dual-GPU graphics card will not be based on the GM204 GPU, which was behind the GTX 980 and GTX 970 graphics cards, but rather on two GM200 GPUs, the same chip that is behind both the GTX 980 Ti and the GTX Titan X.

The precise specifications are still unknown, as it is unclear if we are looking at two fully enabled GM200 GPUs with 3072 CUDA cores each, or the cut-down version with 2816 CUDA cores. Since the GTX 980 Ti has a TDP of 250W, it will be quite interesting to see the final TDP on such a dual-GPU graphics card and the final clocks for each GPU.

It is also quite surprising that Nvidia managed to keep such a graphics card secret for so long, and although it was quite obvious that Nvidia would release a dual-GPU graphics card in order to counter AMD's upcoming dual Fiji GPU based graphics card, we did not expect it to be ready so soon.

The price of the upcoming dual-GPU graphics card from Nvidia, which is also the most important piece of information, is still unknown, but if these rumors are true, we should see it quite soon.

In a desperate bid to demonstrate how good the H.265 decode is on Skylake, Intel was a little creative about its facts.

At Intel's press conference at IFA, Kirk Skaugen, Senior Vice President and General Manager of the Client Computing Group, and his assistant "Doug" kept comparing how good the H.265 decode is on Skylake against a "$2000 Titan Nvidia card."

There is one problem with this. There is no such thing as a "$2000 Titan Nvidia card".

We went on Newegg and a few other places and saw that the Nvidia Titan X from EVGA sells for $1,029. This is not cheap, but it is not a "$2000 Titan Nvidia card" either.

The second card, which has two Titan GPUs and is called the Titan Z, sells for $1,549.99 – again not a "$2000 Titan Nvidia card", but not cheap either. It might be true that Intel can process H.265 faster than the Maxwell generation GPU, but at the same time, the Titan X and Z will obliterate Intel in games.

Intel showed that a Skylake GPU could process H.265 better and more smoothly than Nvidia, but it is inescapable that there is no such thing as a "$2000 Titan Nvidia card". Intel could have been comparing Skylake against any Nvidia GPU, or for that matter it could have been testing it against something different – like a banjo.

Just minutes after bashing the world's largest GPU manufacturer, Skaugen started praising the gaming market and how big an opportunity it was.

They need each other. Nvidia's GPUs need a decent CPU and Intel needs Nvidia or AMD to provide a GPU for serious gamers. Intel claims that Iris Pro is enough, which is far from reality. Every serious gamer we know owns a discrete GPU, and Iris Pro might be enough for World of Tanks and casual gamers, but not for everyone.

We knew all about it, including its specifications as well as the US $2,999 ex. VAT price tag, but we did not know when it would actually be available. According to a report from Techpowerup.com, the new graphics card should hit retail on the 29th of April and stick to the same price announced back at GTC 2014.

The end-user price tag will depend on the country's tax, and while US $2,999 sounds quite expensive, the sheer amount of compute performance coming from two 28nm GK110 GPUs will be enough to attract professionals, scientists and a few wealthy gamers.

Fully enabled dual GK110 GPUs

In case you missed it back when it was announced, the Geforce GTX Titan Z features two fully enabled 28nm GK110 GPUs with 2880 CUDA cores, 240 TMUs and 48 ROPs per GPU. The GPUs are connected to 6GB of GDDR5 memory each via a dual 384-bit memory interface. Back at GTC 2014, Jen-Hsun Huang described the GTX Titan Z as a "supercomputer in a PCI-Express form-factor".

While AMD had to use a bulky AIO water cooling solution in order to keep its two Hawaii-XT GPUs in check, Nvidia managed to stick with a standard air cooler on its dual-GPU Titan Z graphics card.

Clash of titans

While AMD currently reigns supreme with its dual-GPU Radeon R9 295X2 graphics card, Nvidia's Titan Z is a different beast in a league of its own. Just after GTC 2014, we wrote that e-tail players and PC system integrators like Maingear are quite keen on getting their hands on the Titan Z and believe that they can sell it without problems, even with a US $2,999 price tag. Even AIB partners had no problem with the price, as both the Titan Black and the original Titan were selling well at US $999.

Both the Radeon R9 295X2 and the Titan Z are niche products, but they surely have their market and we guess that there are many buyers willing to burn a lot of money in order to get the best possible UHD/4K gaming experience with all details dialed up to 11.

Nvidia’s GTX 780 Ti NDA expires at 3 pm and we should see the first reviews in a matter of hours. However, some partners and retailers decided to jump the gun.

For example, stuff-uk.net already has MSI’s GTX-780TI-3GD5 in stock for £582. Zotac’s reference card is listed in several DACH shops with prices starting at €677 – but it’s not available just yet. The card was announced a few weeks back and there’s nothing new to report on the spec front. It is based on a GK110-425-B1 GPU, clocked at 875MHz (Boost 928MHz), with 3GB of GDDR5 on a 384-bit memory bus. The memory clock is 1750MHz.

It’s got 2880 cores, 240 texture units and 48 ROPs, up from 2304/192/48 on the original GTX 780. In terms of performance, it churns out 5040GFLOPS (Single), 210GFLOPS (Double). The “old” GTX 780 could manage 3977GFLOPS (Single), 165GFLOPS (Double). For a bit of perspective, here are the Titan numbers: 4500GFLOPS (Single), 1311GFLOPS (Double, 732MHz).
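Those GFLOPS figures fall straight out of the core counts and clocks. A quick back-of-the-envelope sketch, assuming two FLOPs per core per clock, Kepler's 1/24 FP64 rate on the GeForce cards, and the Titan's 1/3-rate FP64 units running at their reduced 732MHz clock:

```python
def gflops(cores, clock_mhz, flops_per_core=2):
    """Theoretical peak throughput in GFLOPS: cores x FLOPs/clock x clock."""
    return cores * flops_per_core * clock_mhz / 1000

# GTX 780 Ti: 2880 cores at the 875MHz base clock; FP64 at 1/24 rate
print(gflops(2880, 875))            # 5040.0 single precision
print(gflops(2880 // 24, 875))      # 210.0 double precision
# Titan: 1/3-rate FP64 on 2688 cores, with FP64 units clocked at 732MHz
print(int(gflops(2688 // 3, 732)))  # 1311 double precision
```

Which is exactly why the Titan, despite losing on single precision, still crushes the 780 Ti for double-precision compute work.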

On paper it looks like a powerhouse and of course we are already playing with it on our test rig. Naturally we can’t share the results right now, but a preview should be coming shortly.

In theory it should give the R9 290X a run for its money, but then again it does end up a bit pricier – $699 is the official MSRP, making it $150 more expensive than AMD’s flagship. We still don't understand what Nvidia plans to do with the Titan now. Early retirement sounds like a good idea.

We have heard from multiple independent sources that the upcoming Geforce GTX 780 Ti will end up faster than the Titan, and obviously significantly faster than the original Geforce GTX 780.

The Radeon R9 290X is giving Nvidia’s Titan a run for its money, at least in its noisy Über mode, but apparently the 780 Ti can put some distance between these cards. Since the GTX 780 currently sells for $649 in most US etail stores and in Europe it costs just over €500, we can only assume that GTX 780 Ti performance will come at a high price. However, with the Titan-like cooler we saw showcased at Nvidia’s The Way It’s Meant To Be Played event in Montreal, the card could end up really quiet.

Let's not forget UK etail, where the GTX 780 sells for an average of £499.99 with VAT.

We are not sure if GTX 780 Ti beats the Titan in all benchmarks, but it will definitely be faster in most of them. It remains to be seen what happens to the Titan, currently priced at $999, as the GTX 780 Ti launch will render this pricey card obsolete and uncompetitive.

One can only assume that there might be a Titan price drop happening after the 780 TI launch. The other possibility is that the Titan will be discontinued, if the Ti ends up significantly cheaper to produce.

In the US Titan cards sell for $999 and up, in the German market you can buy a Titan for €800 and in the UK it’s £779.99 on sale, or £800+ on an average day. These cards make sense for people with 2560x1600 or higher resolution monitors and all settings cranked up to max.

The eagerly awaited Hawaii GPU got its official brand last month and now it’s finally official and shipping for $549. The big kahuna, the faster of the two cards based on Hawaii, is the Radeon R9 290X, while the Pro version will appear in a week or so.

The new Hawaii XT chip is the first significant “big core” from AMD since the launch of the Radeon 7000 series in Q4 2011. It took a while before the Tahiti 28nm chip got a successor that could step on the Geforce GTX 780’s toes. Not only that, but in some cases the R9 290X can even bring the fight to Nvidia’s pricey Geforce Titan.

The AMD Radeon R9 290X is a 28nm chip with 6.2 billion transistors. The engine clock goes up to 1GHz, the primitive rate is 4 primitives per clock, and its 2816 stream processors churn out 5.6 teraflops. It’s got 176 texture units, 64 ROPs, a texture fill rate of up to 176 GT/s and up to 64Gpixels a second. It has a new 512-bit memory interface coupled with 4GB of GDDR5, ensuring a data rate of up to 5Gbps and 320GB/s of memory bandwidth.
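Those headline numbers all follow from the unit counts and the 1GHz engine clock. A quick sketch, assuming two FLOPs per stream processor per clock and one texel or pixel per TMU or ROP per clock:

```python
def teraflops(stream_processors, clock_ghz=1.0):
    """Peak single-precision throughput, assuming 2 FLOPs per SP per clock."""
    return stream_processors * 2 * clock_ghz / 1000

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth: bus width in bytes times per-pin data rate."""
    return bus_width_bits // 8 * data_rate_gbps

print(teraflops(2816))        # 5.632 -> the quoted 5.6 teraflops
print(176 * 1.0)              # 176.0 GT/s peak texture fill rate
print(64 * 1.0)               # 64.0 Gpixels/s from 64 ROPs
print(bandwidth_gbs(512, 5))  # 320 GB/s
```

As ever, these are "up to" figures: as our throttling tests further down show, the card rarely holds its 1GHz peak clock under sustained load.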

If you look at the table below, this means that the chip is roughly twice as powerful as the R9 270X. Its main competitor is Nvidia’s GK110, which has 48 ROPs (16 fewer than Hawaii XT) and 2880 shader processors, or 64 more than Hawaii, but it can process 196 texels in integer and FP16 precision while Hawaii can deal with 176 integer and 88 FP16 texels.

AMD has a 512-bit memory bus while the GK110 has a 384-bit bus, but Nvidia has 7.1 billion transistors due to its better precision capabilities. The Hawaii chip is 438mm2, while the GK110 is 551mm2, so Nvidia’s chip is much bigger and costlier to produce. Specification-wise you can see that these chips are practically evenly matched, but we will leave that to the benchmarks.

The card has two dual-link DVI outs, one standard HDMI and standard DisplayPort out. Next generation 4K (4096x2160) and UHD (3840x2160) resolutions are supported. You can also use any combination of display connectors.

The Radeon R9 290X draws power via a 6-pin power connector and an 8-pin power connector.

AMD dropped the dual-BIOS feature in the last generation, and you will not find it on Rx200 series cards, with the exception of the R9 290X. The VBIOS switch allows you to choose between “Quiet mode” and “Über mode”. The default BIOS position runs the card a bit quieter with some performance penalty, while BIOS position two makes the cooler run faster and pushes the clock and card settings to the maximum, giving you a few extra frames accompanied by more noise. If you want to change from normal to Über mode you should shut down, adjust the switch, boot up and hit defaults in Catalyst Control Center.

Note that the card lacks a Crossfire connector. All the communication between chained cards is now handled via PCIe 3.0.

AMD has some new driver settings in Overdrive, but we will write more about this in the review. Now let’s check out some of the numbers we got after a few hours of testing.

You’ll probably notice the high GPU temperature first. It can go up to 95 degrees Celsius, but rest assured 95C is a perfectly safe temperature. There is no technical reason to reduce the target temperature below 95C. By running at 95C, AMD is both maximizing the performance and minimizing the acoustics of the product. This is achieved by increasing clocks/voltages and/or reducing the fan speed until the GPU runs at the temperature target. By having the GPU target a lower temperature, you sacrifice either performance or acoustics. However, you can change the fan speed at any time in Overdrive.
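The behaviour described above – push clocks up until the GPU sits at its temperature target, then back off – can be illustrated with a toy control loop. To be clear, this is only a sketch of the idea, not AMD's actual PowerTune algorithm, and the clock limits and step size here are made up for illustration:

```python
def adjust_clock(temp_c, clock_mhz, target_c=95,
                 floor_mhz=727, ceiling_mhz=1000, step_mhz=15):
    """One iteration of a toy temperature-target loop: nudge the clock
    down when over the target, up when under it, within the limits."""
    if temp_c > target_c:
        return max(floor_mhz, clock_mhz - step_mhz)
    if temp_c < target_c:
        return min(ceiling_mhz, clock_mhz + step_mhz)
    return clock_mhz

print(adjust_clock(97, 1000))  # 985 -> throttling under sustained load
print(adjust_clock(90, 865))   # 880 -> headroom, so it boosts back up
print(adjust_clock(95, 935))   # 935 -> sitting at the target, holds steady
```

Run repeatedly, a loop like this settles the GPU at whatever clock keeps it pinned to the temperature target, which is exactly the clock-drop behaviour we measured below.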

Here you can see the temperatures and settings in idle mode. When the card is in 3D mode, the fan’s target speed is 40% of max RPM. At this setting the fan is not very loud, you can hear it, but it’s not a distraction.

Our tests indicate that the card’s default Quiet BIOS setting reduces the GPU clock when the temperature hits 92C or 95C. After a few runs in Unigine Heaven, we saw the GPU clock drop to 865MHz.

Quiet BIOS: Readings during Unigine Heaven test after 2 minutes.

Quiet BIOS: Readings during Unigine Heaven test after 3 minutes.

Quiet BIOS: Readings during Unigine Heaven test after 15 minutes.

It is obvious that in demanding applications the algorithm drops the GPU clock to well below 1000MHz, which could result in slightly lower performance after an extended gaming session. With that in mind, we ran a few tests and moved on to Sleeping Dogs. However, the throttling didn’t have much of an effect on overall performance.

Even when it gets really hot, the R9 290X easily beats the GTX 780 and in Uber mode it even outpaces the Titan.

Uber mode accelerates the fan to 55% of max RPM, but then things get pretty loud, even if you are used to plenty of noise.

So what makes Uber mode so different? Well, it forces the card to run at 1000MHz most of the time and the clock never drops below the 935MHz mark. In quiet mode it spends most of its time at or around 865MHz.

3DMark Extreme tells a different story. Even with the quiet BIOS, the card delivers peak performance. It does not overheat and the Uber BIOS didn’t yield any extra performance.

The R9 290X consumes a bit more power than the GTX Titan. However, given the performance and very attractive price, we really can’t hold this against it.

We got the $549 price right and it turns out that the Battlefield 4 edition might be just a tad more expensive. Our sources claim that in the US the Radeon R9 290X Battlefield 4 Edition sells for $30 more, or $579.

Since the game is coming out next Wednesday, October 30, we would expect these pre-orders to start shipping shortly. The bundle is limited – we never got the exact number of bundles – but with a small premium on top of the non-bundle card, the price is right and it might make sense to buy the more expensive one and get the game that so many gamers want.

AMD is definitely putting a lot of heat on Nvidia and its AIBs, as the 290X performs really well and is priced to hurt Geforce GTX 780 sales. In Uber mode it can even give the Titan a run for its money, and since the Titan still costs $999, we are talking about a lot of money.

Nvidia will soon fire back with the Geforce GTX 780 Ti, a card that is meant to mess with the Radeon R9 290X and could end up on par with it, but pricing remains a problem for Nvidia. We would not be surprised if the Geforce GTX 780 gets a price cut to make it more competitive, but only if bundles don’t do their job.

Ideally, Nvidia would like to take on the R9 290X with the GTX 780 Ti, while the older GTX 780 should battle the R9 290, but to do that Nvidia would have to sacrifice a lot, killing its margins in the process.

As for the $999 Titan, we have no idea what Nvidia plans to do. It would have to get a massive price cut to remain competitive and that might not be an option for Nvidia.