
58 Comments

From a Reuters article: "Nvidia is also open to licensing its soon-to-launch Long Term Evolution (LTE) modem technology, Huang said."

"When that happens, NVIDIA (and AMD, and others) believe that opportunities for continued growth will appear in new markets (e.g. TVs, wearables, other connected compute devices)."

Those devices are not that different, though, from mobile SoCs. A huge market for them - one they could serve with their own silicon, no need to license - should be autonomous and semi-autonomous robots, cars included.

Ultimately it's up to NVIDIA whether they want to license any given technology to any given business. However, what NVIDIA has told us is that they're looking to do whatever makes financial sense for the company. So if CUDA made sense - in other words, if there was enough money on the table - then that certainly is a possibility.

Thanks! GCN has good compute power (as you've shown here at AT); together with an OpenCL 1.2 driver, this flexibility would be an advantage. The same can be said about NVIDIA launching an OpenCL 1.2 driver (as many have asked for).
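A side note on the OpenCL 1.2 point above: the OpenCL specification requires the `CL_PLATFORM_VERSION`/`CL_DEVICE_VERSION` strings to begin with "OpenCL &lt;major&gt;.&lt;minor&gt;", so an application can check what version a vendor's driver actually exposes by parsing that prefix. A minimal sketch of that check - `supports_opencl` is a hypothetical helper, and the sample version strings are only illustrative of the format real drivers report:

```python
import re

def supports_opencl(version_string: str, major: int, minor: int) -> bool:
    """Return True if a CL_PLATFORM_VERSION-style string reports at least
    the given OpenCL version. Per the spec, the string has the form
    "OpenCL <major>.<minor> <platform-specific information>"."""
    m = re.match(r"OpenCL (\d+)\.(\d+)", version_string)
    if not m:
        return False
    reported = (int(m.group(1)), int(m.group(2)))
    return reported >= (major, minor)

# Illustrative version strings (not taken from any specific driver):
print(supports_opencl("OpenCL 1.2 AMD-APP (1214.3)", 1, 2))  # True
print(supports_opencl("OpenCL 1.1 CUDA 4.2.1", 1, 2))        # False
```

In a real application the string itself would come from `clGetPlatformInfo(..., CL_PLATFORM_VERSION, ...)`; the tuple comparison also accepts any newer version (e.g. a 2.0 driver satisfies a 1.2 requirement).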

Kepler isn't any better than Southern Islands, neither in terms of absolute performance nor in performance per watt. Tesla (GK110) is, but licensing that isn't on the table. Sea Islands (HD 7790) is more efficient than nVidia's current offerings.

Well, I'd imagine licensing Kepler from nVidia might be cheaper, since I imagine nVidia would welcome the opportunity to eliminate some GPU tech competition and would price it attractively enough that AMD would be glad to get out of the GPU market. Meanwhile, AMD wouldn't have to spend time designing, testing, or worrying about the tech coming back defective or just plain bad.

All they'd have to do is integrate it into their chips and move on. They wouldn't have to do the design work or a lot of the more expensive parts of design. It seems like it could wind up a lot cheaper, depending on how cooperative nVidia was in the whole thing.

AMD's done a lot of things lately to save a buck. This one seems like a potentially great way to save some money, too. And AMD's all about saving money as of late.

Still, I doubt they'll do it, if only because being wholly dependent on nVidia for GPU tech advances would be a whole lot of crow to eat all at once.

Who are they going to license to?
Intel: Nope, just dumped the last company they were licensing GPU tech from.
Apple: No way, Apple is moving closer and closer to fully in-house design on their SoCs.
Samsung: Nope, Samsung has their own graphics core and nothing to gain licensing from Nvidia.
Qualcomm: Nope, they make their own GPU cores, which are currently faster than Nvidia's.

What about driver support, since that's a very important part of GPU performance beyond just architecture? Does nVidia plan to keep their drivers entirely proprietary to ensure a competitive advantage, provide reference drivers to licensees, or perhaps even go so far as to fully support third-party Kepler products if they are a direct implementation with no architectural changes?

Wow, a bit of a shocker, but once again Jensen Huang shows he's a great forward-thinking mind. I don't know if they had to make this move now, but it's clearly a pre-emptive gambit to break into the SoC and HPC/compute markets from all angles.

Exactly. As someone else put it (I said this earlier in the comments and forgot to say it was not me who "came up" with this):
Who are they going to license to?
Intel: Nope, just dumped the last company they were licensing GPU tech from.
Apple: No way, Apple is moving closer and closer to fully in-house design on their SoCs.
Samsung: Nope, Samsung has their own graphics core and nothing to gain licensing from Nvidia.
Qualcomm: Nope, they make their own GPU cores, which are currently faster than Nvidia's.

1) Intel - actually already licenses Nvidia graphics IP, but this is the most basic IP, legacy status from the IGP days. Much improved, but clearly MUCH room for improvement. Kepler would give Intel integrated graphics instant credibility and massive performance gains. It would also address the APU elephant in the room with AMD if they also licensed CUDA.

2) Apple - uses IMG/PowerVR for graphics and a custom ARM CPU for their SoC. They can go as custom as they like, but it's still licensed tech on both fronts, so why not Nvidia if they can offer a better solution?

3) Samsung - same deal, but they license GPU tech from both ARM (Mali) and IMG/PowerVR. They will once again license GPU tech from somebody, and Nvidia is now an option.

4) Qualcomm - with Adreno (thanks AMD), they are the only one on the list that truly has no reason to license GPU IP from Nvidia.

Sorry, but the absence of a deal does not indicate the absence of interest. If the price was right, there are many who would benefit immediately from the world-class performance and efficiency of Kepler. Samsung, Apple, and Intel all come to mind, but beyond that, you have other players who have moved away from ASIC design and manufacturing that might take a shot at licensed architectures for their own custom uses again, like IBM, Sony, or Motorola (Google).

As for the Nvidia PR hype vs. substance, I'd tend to disagree; they're one of the few companies that consistently creates something out of nothing and actually has product to show for it. Recent examples being CUDA, PhysX, Tesla, Tegra, GRID, and Shield. All products/technologies that didn't exist 6-7 years ago, all products that were not originally a part of their core business of building graphics processors.

I get the part that Nvidia does not want folks to jam on the fact that today's world's fastest supercomputer uses Xeon Phi. However, Nvidia's release would have read better if they had identified a partner that wants to license this technology and enter the fray on Nvidia's side.

I am not sure whether channel conflict is the correct term, but customers sometimes get skittish when their vendor wears multiple hats and directly competes with them. Was there any indication of Nvidia's strategy to address this issue?

Yes, but it would take 1.5-2 years to make it, and if Broadwell is already being designed, it means it will not be there. It'd be a VERY LATE 2015 product at the earliest (unless Intel decided to scrap Broadwell and start over with Kepler/Maxwell in it).

So Nvidia sees a financial problem in their future, and this is also a big PR move to cover that (it's not all financial). They want support for their Denver/Echelon software stack, they want extra money, and they want to spread CUDA.

Time will tell how much they are willing to license their IP. If they keep saying "no, I won't license this to you" (e.g. CUDA/PhysX to AMD), it will be more or less like the "CUDA is now open source" lie/hype (which it isn't - only the backend).

AMD is already using ARM IP in its products, so Nvidia's IP will also be interesting. Nvidia's willingness and pricing will tell us how much it will change things.

Intel is already dependent on Nvidia IP for their GPUs; they pay them $250M annually for the privilege of using stripped-down, base-level GPU IP. From Intel's perspective, why not pay more for the privilege of using a modern, world-class IP like Kepler?

What if Intel raised their licensing payment to $1B for Kepler performance today? That would certainly put a damper on AMD's APU plans, as graphics is really the only area where Intel is deficient relative to AMD at the moment. Win-win for Intel/Nvidia: Intel wins CPU and GPU, Nvidia doubles their revenue overnight, and profits/margins would skyrocket as well.

Not a native English speaker, but I was wondering: the article has multiple mentions of "very sensible". To me this implies this had been known (to them), or that they would have guessed it was going to happen. But then they say something about how they didn't know this was coming. Wouldn't "makes sense", "sensible direction", or "logical step" be a better phrase to use? Just wondering.

Back to Nvidia. The market right now only has three groups of players: Apple, Samsung, and others, with Apple and Samsung taking 95%+ of profits and 85%+ of market share. And there are only three GPU IP vendors: ARM, Nvidia, and IMG. ARM's GPU is... crap. The good thing about Nvidia is that its IP can be used across desktop and mobile, which fits Apple perfectly, but I am not sure if Apple wants to put all its eggs in one basket.

Nvidia's software-defined 4G modem: I know Samsung, LG, and ST-Ericsson all have their own 4G modem stacks, but none of them license them out. So as far as I know this is the only 4G baseband available for licensing on the market (correct me if I am wrong). And since Qualcomm's 4G baseband pack is the second most expensive component in a smartphone, it makes sense for vendors to actually make it themselves, or even integrate it into their own SoC.

I wonder about that... it makes little sense to me that Apple would make their own GPUs when they can already choose ARM or PowerVR... and now Nvidia. My guess is the engineers they hired are just there to deal with integrating existing tech into their chips, not reinvent the wheel.

Who knows, though... they did develop their own CPU core. On that note, I wonder if there will be a Swift 2. The only point of Swift (and Krait) seems to be that ARM left a gap between A9 and A15, which they've since filled with the A12 anyway. Swift and Krait might fit in that gap (though I suspect they're closer to A9 than A15), but the next logical jump is A15.

Apple isn't designing their own GPU; those design teams are for optimizing and tuning the PowerVR architecture, or whatever GPU IP they decide to use. Designing a GPU from the ground up is hard, and it takes a lot of time.

The reason why Nvidia fits Apple is just like you mentioned: vertical integration. Apple could have one GPU technology across Mac and iOS. The cost of writing GPU drivers is increasing, and since Apple develops the drivers themselves, it makes sense for them to have less variation in GPUs.

"But I was wondering: the article has multiple mentions of "very sensible". To me this implies this had been known (to them), or that they would have guessed it was going to happen. But then they say something about how they didn't know this was coming. Wouldn't "makes sense", "sensible direction", or "logical step" be a better phrase to use? Just wondering."

Your suggested alternate phrases ("makes sense, sensible direction, logical step") are perfectly fine usages for the application you suggest. I suspect from your question, however, that your understanding of the phrase "very sensible", in this particular usage, may invest or credit the author with either more (insider) knowledge than has been established, or perhaps more foresight/prescience than may be warranted.

You are correct in your use of "very sensible" when a particular decision or course of action is being discussed, & the benefits of that same decision/action could be predicted, or even have been suggested by others, in advance of the event. In this use case "very sensible" is a way of saying, "Given the known variables of a, b, & c, & present market conditions, it is obviously very sensible for X to do Y."

As the author has used "very sensible" in this article/post, he is discussing a course of action taken by Nvidia that he (Anand) had no advance knowledge of, nor had he any real cause to suspect that Nvidia was going to embark upon this path. In this use case, that decision/action is being reviewed in retrospect. It's the same as saying, "Though I never had reason to consider the prospect of such a thing happening, now that the decision is made & the reasoning has been scrutinized, it is clearly a very sensible path to follow."

Your inference of advance knowledge is not necessarily incorrect, but neither can it be assumed.

Wouldn't it be crazy if AMD decided it was cheaper/the better option to license nVidia GPU tech than continue their own? Heads would explode all over the internet, forums would burn down from all the flames, and chaos would ensue.

I'm curious to see where this goes. I think it's a smart move to do ahead of any major slam from Tegra failures this year and it gives them some cushioning in case the market moves in a direction they don't anticipate.

I'd love to see Intel license and use nVidia cores instead of their own IGP, but it would be years before such a product came to fruition. I wonder how the driver scenario would wind up with a product like that. Intel has been known to license third-party GPUs before.

Or maybe AMD decides to license out their GPU IP as well? Using AMD GPU IP could make games easier to port from PS4/Xbox One to your mobile phone. Surely this is a much easier sell than Nvidia's?

I suppose this is smart? Not sure what the implications are for Tegra and their discrete GPUs...

Also, I wonder: had they done this earlier, would they have been used in the two next-gen consoles? Not sure how much that's worth to AMD though, as I assume the deals are pretty good for Sony/Microsoft.

"Had Tegra 4 been out and available, I think it’s safe to say that the SoC would likely have been used in at least some previous Tegra 3 design wins."

This has to be one of the most ridiculous statements I've seen from Anand. If S800 were out and available, I'm guessing it would be used in some S600 design wins. I can say this every year for EVERY product ever produced. If Intel's Broadwell were out now, I'm sure it would be selling in Haswell models instead of Haswell. Do we see how dumb this is?

"The cynic in all of us can point to NVIDIA's struggles with getting Tegra 4 into devices and out the door as motivation behind wanting to license its GPU IP."

What struggles? The first device they are selling to is THEMSELVES. And it isn't out yet... LOL.

http://venturebeat.com/2013/06/03/toshiba-launches...

2 Toshibas with T4 (and a 3rd with a T3). Vizio has a T4 (and T3) coming also that they showed at CES. Also the Asus Transformer Pad Infinity and HP SlateBook x2, both T4, not to mention Mad Catz MOJO (they said they are SEEKING a T4 deal). BungBungame's Kalos tablet uses T4 also.

Couple those with Ouya, GamePop, Wikipad, etc. and I think Tegra has a pretty good year lined up even before the T4i. HP, Toshiba and Asus are all showing Tegra support. Who's left besides Samsung/LG/Apple (who do their own stuff, but may go NV next year with a T5, or a Kepler GPU at least? Who knows)? I guess you guys missed Computex? All 3 tablets with T4 (all but the Kalos) were on display there 3 weeks ago. Is this the AMD love showing again? No mention of a T4 device, or did I miss it in your Computex coverage? How about their award, top in handhelds (so no Vita or 3DS victories, or anyone else... LOL)?

"NVIDIA today received its fifth consecutive Best Choice Award at Computex, as the SHIELD open-platform gaming device won the top Golden Award in the smart handheld category, as well as a special Media's Choice Award, voted on by editorial staff of local publications."

And the Media's Choice award too... LOL. But this T4 thing has trouble? You guys were all over Computex and NV doesn't even get a mention... The president of Taiwan handed it to them... LOL. I hope you guys know Alexa will be showing you further down this year. How long will you keep hiding NV products, hiding any articles about NV making record quarter after quarter, making excuses for the lack of FCAT results, and BSing the public about AMD CPUs (an A8-5600 for all single GPUs is good in your 1440p articles?... ROFL)? Does AMD really pay enough to have your Alexa traffic off by a good 1/2 in the last 9 months? Sheesh. Wasn't Shield a centerpiece at Google I/O? :) Just Glass, GS4 and Shield, right? Missed that too? I can't believe you guys covered all the Intel stuff at Computex but didn't mention any of the 3 tablets from Nvidia on display (but had time to SPOT THE KABINI... LOL). Whatever. More NV hate from Anandtech. Even the GPUs got an award at Computex; nothing about that either from Anandtech.

It's comic that you guys have the only 780M laptop with CPU throttling issues, so the GTX 780M can't shine... LOL. "Except, either due to our particular MSI GT70 sample being a lemon and having CPU throttling issues (or it may be a design problem that affects all of MSI's current GTX 780M notebooks - we're not entirely sure), the 7970M really isn't that far off the GTX 780M performance." That's from the top article at Anandtech on the 7970 Enduro junk (yeah, Enduro runs like junk).

http://www.laptopmag.com/reviews/laptops/msi-gt70-...

You're the only one with a crappy lemon laptop running the 780M; the exact same model is kicking some butt elsewhere. You're the only ones with problems with FCAT: the data is just too large, drivers caused invalid data, blah blah blah. Anything to wait until AMD fixes their prototype drivers :) Still no part 2 of Ryan's FCAT articles, etc.

T4i announcements can't be far away, and that reaches a whole other market via $100-$400 phones (aimed at $100-200), but until then Tegra 4 is basically modem-less and thus basically PHONE-LESS. I've said it before and I'll say it again: you can't really announce product launches without knowing exactly HOW MANY NV can allocate to you. I don't suspect anyone will know that until NV sees how well Shield sells in the first month (hence the July ship date for T4 to everyone else, which gives them a month of sales to test with). If Shield sells 10mil (joke of course), why would I even allocate a single chip to a tablet? I'd sell 10mil Shields and tell you tough luck, buddy, we make more on a Shield than on a SoC, until Shields are stuck on shelves :) Well, DUH...

I'm always confused by the ridiculous claims that T4 is having trouble selling into devices... LOL. I'm kind of shocked Anand himself is doing this. Seriously? You can't be this dumb or ignorant. If a company like HP announced something, and then Shield sold like hotcakes, forcing a few million HP pre-orders (insert amount here, whatever) to be delayed for months until Shield slowed down, there would be a ton of HP hate, and probably some NV hate. Nobody will risk this hate, and SMART business would delay saying a word until allocations are known and product is shipping. You'd have to know this, Anand. So why make these comments? Any proof of T4 being turned away by device makers? And I mean literally an OEM saying "T4 sucks, we're running away and have no plans because it's bad"? I don't see any S800 announcements yet either, for the same reasons no doubt (hiding power use surely doesn't help). But do we think S800 isn't selling because of no announcements? No, that's SILLY. If a chip isn't even SHIPPING yet, you can't announce squat.

http://blogs.nvidia.com/blog/2013/05/31/big-games-...

25+ Tegra 4 optimized games, with ARMA and Hawken (according to Kotaku) being exclusives so far, and also Dead Trigger 2 if memory serves. If they are telling the truth, it's a pretty good launch for something they dev'd for 10mil... LOL. If it's a bunch of lies, well, it will deservedly bite them in the butt. Anyone want to place a bet as to whether they'll be out of stock in a week or less at $300?

The list of features making the 3DS and Vita worthless now is long (never mind GDC 2013 showing only 2% making a 3DS game and 5% planning Vita games vs. 60% for mobile; 2,500 devs don't lie): a stonking SoC, 720p, HD movies out to the TV (playing anything Android can), 4x or more main memory (Vita has 512MB, this has 2GB, 3DS has less than both), 16GB included (and a slot free), the entire Android library of apps/games, TV browsing, PC streaming, GPS (the 3G Vita has this too, right?), and on and on. At $350 it was a tougher sell, but at $300 I think they'll be gone in short order.

I'm sure a Kepler-based T5 will be in it next year, and a new one every year. I wouldn't stop as long as they made a $1 profit. Proliferating Tegra games is the name of the game here, and selling GPUs to stream from. Slowly kill DirectX, making Windows pointless, with Google's Android help (if I don't need Office or DirectX games, why do I need Windows?). HTML5, OpenGL, WebGL, OpenCL, etc. FTW :) Kill DirectX now, thanks.

IP licensing note: I wouldn't want to be Imagination right now. They're broke already; this will kill them, slowly or fast. They're dead. They had to borrow $20M for a measly $100M MIPS purchase. Maybe Apple buys them? Otherwise, welcome to the list of defunct GPU makers. Since both AMD and NV already did this to them in the desktop market years ago, does this mean they killed them twice?... ROFL.

For a company that wants CUDA to be as popular as possible (more than OpenCL), and a company that likes to have exclusive games tied to its GPU technology, this strategy makes PERFECT SENSE.

I doubt Apple will consider Kepler, but maybe Maxwell. I could definitely see Samsung trying it out, though. They've been pretty confused about what to use in their SoCs lately. Is it Mali? Is it PowerVR? Hell, just go with Qualcomm.

I'm also not sure if Samsung is already their client or not, but Nvidia definitely has a pretty big manufacturer lined up for this. Maybe it's Huawei or someone like that. Nvidia wouldn't have announced this if they didn't already have customers. It would be pretty embarrassing to announce it and then have no one want it.

I wouldn't be surprised to see both Samsung and Apple license. Gaming is a very profitable market that both have publicly expressed interest in exploiting. If done right, Apple could capitalize on an all-purpose console of their own in the next year or two. Nvidia made a very shrewd maneuver in recognizing the sheer hate and competitiveness Samsung and Apple have with each other, instilling the "what if" factor: what if the other out-maneuvers them by licensing the IP? They will have no choice but to license if they want to stay relevant in a very profitable and competitive gaming market that will keep advancing at a pace that Imagination, or any other in-house GPU attempt, will not be able to match.