Preventing Espionage at AMD: How The Eyefinity Project Came to Be

There’s one more thing Carrell Killebrew has done for the world. He’s single-handedly responsible for getting Eyefinity included in the Evergreen stack.

It started like this. All GPU vendors go to their customers (OEMs) and ask them for features they’d like to have. The notebook vendors wanted a total of 6 display outputs from the GPU, although they only needed two to be active at the same time. Two paths could be used for LCD panels, two could be used for external outputs (VGA + DVI/HDMI) and two routed to a docking station connector.

Carrell thought it would be a shame to have all of these output pins but not be able to drive all six at the same time. So he came up with a plan to be able to drive at least 3 displays on any Evergreen card. The high end cards would support 6 displays simultaneously.

His desire to do this wasn’t born of pure lunacy; Carrell has a goal in mind. Within the next 6 years he wants to have a first generation holodeck operational. A first generation holodeck would be composed of a 180 degree hemispherical display with positionally and phase-accurate sound. We’ll also need the pixel pushing power to make it all seem lifelike. That amounts to at least 100 million pixels (7 million pixels for what’s directly in front of you, and the rest for everything else in the scene), or almost 25 times the number of pixels on a single 30” display.
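The pixel math behind that claim checks out. Here’s a quick sketch; the panel resolution is my assumption (a 30” display at its native 2560 x 1600), not a figure from the article:

```python
# Quick check of the holodeck pixel estimate.
# Assumption (mine): a 30" display runs at its native 2560 x 1600.
holodeck_pixels = 100_000_000   # ~100M pixels for the full hemisphere
foveal_pixels = 7_000_000       # detail directly in front of the viewer
panel_pixels = 2560 * 1600      # one 30" panel = 4,096,000 pixels

ratio = holodeck_pixels / panel_pixels
print(f"{ratio:.1f}x the pixels of one 30-inch display")  # ~24.4x
```

At roughly 24.4x the pixels of one panel, “almost 25 times” is right on the money.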

We’re not quite at 2016, so he had to start somewhere. And that somewhere happened to be with enabling a minimum of 3 and a maximum of 6 displays, per card, for all members of the Evergreen family. Today we know the technology as Eyefinity, but internally Carrell called it SunSpot.

Carrell didn’t want anyone knowing about SunSpot, so he kept it off the Cypress PRS. Through some very clever maneuvering he managed to keep it off the radar while engineering hammered out the PRS, and even managed to keep it off the chopping block when the GPU was cut down in size. He knew that if anyone got wind of it, they’d ask him to kill it while the chip was being scaled down. To make matters worse, if anyone outside of a trusted few became aware of it, there was the chance that NVIDIA would have time to copy and implement the feature. It then became Carrell’s goal to keep SunSpot as quiet as possible.

It began with a list. On this list were the names of the people who needed to know about SunSpot. If your name wasn’t on the list, not only did you not know about SunSpot, but no one who did know about the project was allowed to talk about it near you. An internal website was created listing everyone who needed to know about SunSpot.

Along with the list came rules.

As I just mentioned, no one on the list could talk about SunSpot in a place where someone not on the list could overhear. And if you wanted to get someone added to the list, it had to be approved - the final say was in the hands of none other than Carrell Killebrew.

The SunSpot engineers went to work on the feature, bringing in others only when absolutely necessary. The team grew one person at a time and eventually plateaued. The software engineers weren’t made aware of SunSpot until the last minute. Carrell only gave them enough time to enable SunSpot; they didn’t get the luxury of advance knowledge.

Carrell went to David Glenn, head of software engineering at ATI, and asked him for the latest possible date by which someone in software needed to be working on this stuff. David gave him a date. Carrell asked for a list of names of people who needed to know. David gave him three names. On that date, the SunSpot team called up those three people and said “we need to tell you something”. Needless to say, no one was happy about Carrell’s secrecy. Some of the higher ups at ATI knew Carrell had people working on something; they just had no idea what it was.

It's the software that ultimately made Eyefinity

When in his own cube, Carrell always spoke about SunSpot in code, calling it Feature A. Carrell was paranoid, and for good reason. The person who sat on the other side of Carrell’s cube wall left to work for NVIDIA a couple of months into the SunSpot project. In all, ATI had three people leave for NVIDIA while SunSpot was underway. Carrell was confident that NVIDIA never knew what was coming.

Other than the obvious, there was one real problem with Carrell’s secrecy. In order for Eyefinity to work, it needed support from external companies. If you’ll remember back to the Radeon HD 5800 series launch, Samsung announced thin-bezel displays to be sold in 1, 3 or 6 panel configurations specifically for Eyefinity setups. There was no way to keep SunSpot a secret while still talking to OEMs like Samsung; it was simply too big a risk. The likelihood of someone within ATI leaking SunSpot to NVIDIA was high enough. But a leak from an employee at an OEM that deals with both companies? That was pretty much guaranteed.

For a feature like SunSpot to go completely unnoticed during the development of a GPU is unheard of. Carrell even developed a rating system. The gold standard is launch; if SunSpot could remain a secret until the launch, that’s gold. Silver is if they could keep it a secret until they got chips back. And the effort would get a bronze if they could keep it a secret up to tape-out; at that point NVIDIA would be at least one full product cycle behind ATI.

Eventually, Rick Bergman, GM of graphics at AMD, committed to keeping SunSpot a secret until bronze, but he told Carrell that when they got to tape-out they were going to have a serious talk about it.

Time went on, SunSpot went on, and Carrell and crew made it to bronze. The chip had taped out and no one knew about Carrell’s pet project. A little past bronze, Rick asked Carrell to have that talk. There were three customers that would really benefit from being told about SunSpot, and then the killer: it would also help ATI competitively.

Carrell didn’t want to risk tipping off the competition to SunSpot, but he knew that in order to make it successful he needed OEMs on board. The solution was to simply add those at the OEMs who needed to know about SunSpot to the list. The same rules applied to them, and they were given a separate NDA, distinct from the existing NDAs in place between AMD and each OEM. AMD legal treated SunSpot as proprietary IP; if anyone else within an OEM needed to know about it, they first had to ask for permission to discuss it. To make sure that any leaks would be traceable, Carrell gave SunSpot a different name at each of the three OEMs involved.

A few weeks prior to the Cypress launch, the CEO of one of the OEMs saw Eyefinity and asked to show it to someone else. Even the CEO’s request needed to be approved before he could share it. Surprisingly enough, each of the three OEMs abided by its agreement; to Carrell’s knowledge the tech never leaked.

NVIDIA's Surround driven off two cards

While NVIDIA demonstrated its own triple-display technology at this year’s CES, it’s purely a software solution; each GPU is still limited to two display outputs. I asked Carrell what he thought about NVIDIA’s approach, and he was honest as always.

Eyefinity allows for 3 outputs from a single GPU

ATI considered a software-only approach a while ago, but ultimately vetoed it for a couple of reasons. With a software-only solution you need a multi-GPU capable system. That means a more expensive motherboard, a more powerful PSU and a little more hassle configuration-wise. Then there were the performance concerns.

In one scenario you have very noticeable asymmetry: one card drives one display while the other card drives two. This can cause some strange problems. In the other scenario, all three displays come off of a single card, and on alternating frames one GPU sends its rendered frame data to the other either via PCIe or a CF/SLI connector. With 6 displays, Carrell was concerned that there wouldn’t be enough bandwidth to do that fast enough.
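Carrell’s bandwidth worry is easy to put rough numbers on. The figures below are my assumptions, not anything from ATI: six 2560 x 1600 panels at 60Hz and 4 bytes per pixel, with the second GPU rendering every other frame and shipping it across the link:

```python
# Rough, hypothetical estimate of the interconnect bandwidth an
# alternate-frame, software-only multi-display setup would need.
displays = 6
width, height = 2560, 1600
bytes_per_pixel = 4   # 32bpp framebuffer
refresh_hz = 60

frame_bytes = width * height * bytes_per_pixel       # one frame, one display
total_per_sec = displays * frame_bytes * refresh_hz  # all frames, all displays

# With alternate-frame rendering the second GPU produces every other
# frame, so roughly half the frame data must cross PCIe or the bridge.
link_bytes_per_sec = total_per_sec / 2
print(f"~{link_bytes_per_sec / 1e9:.1f} GB/s over the interconnect")  # ~2.9 GB/s
```

For context, a PCIe 2.0 x16 slot offers roughly 8GB/s per direction in theory (and considerably less in practice), while a CrossFire bridge carries far less than that, so a sustained multi-gigabyte-per-second stream of finished frames is a plausible squeeze.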

There were also game compatibility concerns that soured ATI on the software approach, although I was quick to point out that FOV and aspect ratio issues are apparent in many games today with Eyefinity. Carrell agreed, but said that it’s a lot better than they expected - and better than it would have been had they used a software-only solution.

None of this is to belittle the efforts of ATI’s software engineers. While Carrell was one of the three people originally responsible for SunSpot, they weren’t the ones who made it great. In Carrell’s own words: “In the end, I’d say the most key contributions came from our software engineering team. SunSpot is more a software feature than a hardware one”. ATI’s software team, despite not being clued into the project until it was implemented in hardware, was responsible for taking SunSpot and turning it into Eyefinity.

As for the ridiculous amount of secrecy that surrounded SunSpot? It wasn’t just to keep Carrell entertained. AMD has since incorporated much of Carrell’s brand of information compartmentalization into how it handles other upcoming features. I have to wonder if Carrell somehow managed to derive Apple’s equation for secrecy.


132 Comments

I fully subscribe to the point raised by a few previous posters. Namely, the article being such a worthy read, it actually justifies the creation of an account for the sheer reason of expressing appreciation for your fantastic work, which stands out in the otherwise well-saturated market of technology blogs.

"I almost wonder if AMD’s CPU team could learn from the graphics group's execution. I do hope that along with the ATI acquisition came the open mindedness to learn from one another"

It would be a true concern if based on mere observation, but the hard facts are so much worse: AMD fired tons of ATI personnel, hence ATI drivers are years behind NVIDIA’s - we are still begging for centered timings on ATI cards, a feature that NVIDIA has offered for 6 generations! ATI produces cards that are gameless. DirectX 10.1?! There was a single game with DirectX 10.1 support, and NVIDIA made the game developer REMOVE DirectX 10.1 features with a game patch that "increased" performance. DirectX 11?! ATI has to put money into its driver development team and spend TONS of cash on game development.

I would be a happier customer if the raw performance of my 4870X2 was paired with the seamless driver experience of my previous 8800GT.

And another market AMD was too late to is netbooks and ultra-low voltage mobile. A company with expertise in integrated graphics and HTPC GPUs has ZERO market share in this segment?! Give me a break!

Funny... after the heaps of problems I had with drivers, stability and whatnot with my old 8800GTS (the original one, 320MB), I decided to switch to ATI with a 4870. Don't regret doing that.

My only gripe with my current 5870 is the drivers and the stupid giant mouse cursor. The Catalyst 9.12 hotfix got rid of it, but it came back in 10.1... go figure. Other than that, I haven't had problems with it and have been getting great performance.

I think the reason he had issues with the X2 is that it's a dual card. I think most gfx card driver problems come from dual cards in any configuration (dual, crossfire, sli).

The reason you had issues with the 320MB card is that it had some real issues because of the half-memory. The 320MB cards were originally intended as GTX cards, but were binned as GTS cards and then binned again as 320MB cards instead of 640MB cards. Somehow NVIDIA didn't test these cards well enough.

Are you kidding me? Become informed before you spread FUD like this. I've been able to choose centered timings in my CCC since I've had my 2900 Pro back in fall 2007. Even today on my CrossFire setup you can still use it.

As for your DX10.1 statement, thank NVIDIA for that. You must remember that THEY are the 600lb gorilla of the graphics industry - I fail to see how the exact instance you cite does anything other than prove just that.

As for the DX11 statement, if NVIDIA had it today I bet you'd be singing a different tune. The fact that it's here today is because of Microsoft's schedule which both ATI and NVIDIA follow. NV would have liked nothing more than to have Fermi out in 2009, believe that.

It's a shame that AMD doesn't have its driver department firing on all cylinders like the hardware department is.
The 5000-series are still plagued with various annoying bugs, such as the video playback issues you discovered, and the 'gray screen' bug under Windows 7.
Then there's OpenCL, which still hasn't made it into a release driver yet (while nVidia has been winning over many developers with Cuda and PhysX in the meantine, while also offering OpenCL support in release drivers, which support a wider set of features than AMD, and better performance).
And through the months that I've had my 5770 I've noticed various rendering glitches aswell, although most of them seem to have been solved with later driver updates.
And that's just the Windows side. Linux and OS X aren't doing all that great either. FreeBSD isn't even supported at all. Reply

Anand, these types of articles (RV770, 'RV870', and SSD) are beyond awesome. I hope the series continues for Northern Islands and beyond. Everything from the RV870 jar tidbit to the original die spec to the SunSpot info. It's great that AMD/ATi allows you to report this information, and that you have the journalistic chops to inquire/write about it. I cannot provide enough praise. I hope Carrell and his colleagues (like Henri Richard) continue this awesome 'engineering honesty' PR into the future. The more they share, within understandable reason, the more I believe a person can trust a company and therefore support it.

I love the little dropped hints, BTW. Was R600 supposed to be 65nm, but early TSMC problems caused it to revert to 80nm as was rumored? Was Cypress originally planned as ~1920 shaders (2000?) with a 384-bit bus? Would sideport have helped the scaling issues with Hemlock? I don't know the answers, but the fact that all of these things were indirectly addressed (without upsetting AMD) is great to see explored, as it affirms my belief that I'm not the only one interested in them. It's great to learn the informed why, not just the unsubstantiated what.

If I may preemptively pose an inquiry: when NI is briefed, please ask whoever at AMD whether TSMC canceling their 32nm node and moving straight to 28nm had anything to do with redesigns of that chip. There are rumors it caused them to rethink what the largest chip should be, and perhaps revert to the original Cypress design (as hinted in this article?) for that chip, causing a delay from Q2-Q3 to Q3-Q4, not unlike the 30-45 day window you mention about redesigning Cypress. I wonder if NI was originally meant to be a straight shrink?