Dead wrong. A GPU13 slide stated Mantle was being implemented by an AMD team.

My "guess" is that AMD will continue to waste money on hyping up the gimmicky, proprietary features which don't even work for majority of games.

I'll believe the several world-class developers who see massive potential in Mantle and consider it worth implementing over your 'guess' that it's gimmicky.

hrmm where to start here... oh yeah.

2x the software development cycles, to make 2 completely different engines.
2x the DEBUG cycles, to debug two completely different render engines.
2x the patch cycles, because again we have 2 engines. Worse, you will have 2 different code paths, so patches will NOT be cross-compatible, and you only know how to program 1 of them from experience; for the other, you need to PAY AMD for help.
2x the beta cycles.

all of that is 2x the COST...

and all that mess for an API that will only be usable on less than 5% of the entire CPU market??? no... world class devs will use OpenCL, DirectX and OpenGL... standards that have been around for decades and despite their bloat are INFINITELY cheaper to develop for and will be used by 95% of the market.

Look at it this way: if your engine costs 10 million dollars to make, are you going to spend 20 million porting it? JUST to get another 5% of the market, for cheap people who MISTAKENLY think Kaveri will allow them to play games at anything more than LOW settings (without a dedicated GPU)?

Nope. You will cave in to the bean counters and investors and stockholders; you will only code *1* engine. Now which one?

5%... or 95%? oh yeah... let's code for INTEL, NVIDIA AND AMD... not *just* AMD in the *HOPE* that the user has an AMD CPU AND GPU (now dead and buried with the official discontinuation of Opterons, FX and all AM3+ sockets)...

yeah, sure, it makes sense to AMD... after all, the APU is the only CPU they are developing, and they have dropped all AM3+ plans and all Opteron development.

but to a software guy who is trying to make money without doubling his workload? nope... not Mantle. Not going to happen, and not for the limited gains that will be seen on entry-level, pseudo-mainstream APUs that will end up in $400 Dells and $300 Acers...

Cybert said: Capitlization and periods are hard for you, aren't they? I've given over $100 to techforums. I should have you banned for my money.

I hope TR does a really rigorous test when Mantle finally ships. Same hardware setup, one with a Radeon testing D3D/OGL/Mantle, and then the same setup, but swap in an equivalent Nvidia card for D3D/OGL.

I'm still wondering if the different multi-GPU stuff they talked about will be in the first available build or not.

My A10-7850K should be here pretty soon, but the R9 280X has to wait for the price to come back down.

Multi-GPU like that Hydra Stuff Intel was using, or do you mean X-fire and SLI?

I've got an A10 in my laptop, but yours seems to have a newer GCN component to it. Hope it works well for you!

When I was in a Windstream CO installing aggregators, it was nice being able to turn off as much unnecessary fluff as possible, and have the laptop last 8hrs on a single charge. It was fun watching my coworkers compete for the single free outlet, or rush to their van and waste gas powering up their inverters to try and charge up their gear.

A GPU13 slide said Mantle was being implemented primarily by an AMD team with assistance and feedback from DICE/Frostbite.

Just imagine the level of histrionic outrage we'd be hearing if Nvidia were to have its own developers basically rewriting the software for a major AAA game release to specifically target Nvidia hardware. If this is what is happening, then frankly I don't care if there is a 60-80% performance boost in BF4; I don't think that's a good metric for judging whether this whole Mantle circus is real or just a glorified tech demo.

Oh, you mean like GameWorks and, cough, PhysX?

Hypocrites.

Can't wait to see you eat a bucket of humble pie when Mantle makes your NV cards look like integrated graphics.

And just like Havok... /riposte/ /thrust/ /touche~!/

Mantle's execution model extends to multiple GPUs. Developers have access to all of the engines on all of a system's Mantle-compatible GPUs, and they can control those GPUs and handle synchronization themselves. "Synchronization between the GPUs," Riguer explained, "becomes a natural extension to the mechanism we exposed . . . on synchronization between multiple queues. In fact, we make [the] multi-GPU model exactly like a single-GPU model scaled up to multiple devices."

As a result, developers have much more flexibility in the way they split up workloads between GPUs, and they can "try to make [their games] scale a lot better" than what's possible with CrossFire right now. Techniques superior to today's alternate frame rendering (AFR), whereby each GPU renders a different frame in the animation, can be developed, and asymmetric configurations—such as those with slow integrated graphics and fast discrete graphics—can be more readily exploited.

Moving beyond AFR is particularly important. While that technique works reasonably well with current games, Riguer said future titles will run more workloads with lots of frame-to-frame dependencies, such as compute-based effects. To handle those, "You would need to either duplicate the workload across GPUs or serialize across the GPUs. In either case, your scaling suffers."
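The scaling penalty Riguer describes can be sketched with a toy Amdahl-style model (illustrative only, not Mantle code): the frame-to-frame-dependent portion of each frame forms a serial chain that extra GPUs cannot overlap.

```python
# Toy Amdahl-style model of AFR scaling (illustrative numbers only,
# not Mantle code).

def afr_throughput(num_gpus, frame_time_ms, dependency_fraction):
    """Frames per millisecond under alternate frame rendering.

    dependency_fraction is the portion of each frame's work that must
    wait on the previous frame's result (e.g. compute-based effects).
    """
    serial = frame_time_ms * dependency_fraction
    parallel = frame_time_ms * (1.0 - dependency_fraction)
    # The dependent portion chains frame-to-frame across GPUs, so it
    # cannot overlap; only the independent portion scales with GPU count.
    effective_frame_time = max(serial, (serial + parallel) / num_gpus)
    return 1.0 / effective_frame_time

# Independent frames: a second GPU roughly doubles the frame rate.
print(afr_throughput(2, 16.7, 0.0) / afr_throughput(1, 16.7, 0.0))  # close to 2.0
# Heavily dependent frames: the second GPU barely helps.
print(afr_throughput(2, 16.7, 0.8) / afr_throughput(1, 16.7, 0.8))  # well below 2.0
```

In this toy model, two GPUs double throughput only when frames are independent; at 80% dependent work the second GPU adds about 25%, which is exactly the scaling loss the quote is pointing at.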

That article is probably the best info you're going to find. There should be more info in the coming weeks (CES and the BF4 Mantle port being imminent) but I don't think it'll be much that wasn't said already.

...That article is probably the best info you're going to find. There should be more info in the coming weeks (CES and the BF4 Mantle port being imminent) but I don't think it'll be much that wasn't said already.

Ah, cool! Thanks! That cleared a good bit up for me. Now I'm looking forward to CES!

You really need to take a look at how games are made, because the workload doesn't suddenly double because you are using a different rendering path. You realise that is already done in part with the various versions of DX, PhysX effects, and OGL vs. DX in games that support both.

In DICE's case the engine developers are a fully separate and self-contained team of something like 40 people, so if you were talking about engine development only, then perhaps you double that workload, but a ton of a game's development time goes to many things other than pure rendering.

And if these things actually free up a lot of resources so you can make a game that looks better, runs better, and is easier to optimize, since you are likely to have fewer constraints, then it might actually end up being easier and faster to use it for certain markets.

You really need to take a look at how games are made, because the workload doesn't suddenly double because you are using a different rendering path. You realise that is already done in part with the various versions of DX, PhysX effects, and OGL vs. DX in games that support both.

In DICE's case the engine developers are a fully separate and self-contained team of something like 40 people, so if you were talking about engine development only, then perhaps you double that workload, but a ton of a game's development time goes to many things other than pure rendering.

And if these things actually free up a lot of resources so you can make a game that looks better, runs better, and is easier to optimize, since you are likely to have fewer constraints, then it might actually end up being easier and faster to use it for certain markets.

I remember an old quote from Carmack where he said a full game engine is around 30% of the entire game if you count development hours, and the graphics rendering part is around 30% of the full engine. So of the total workload for an entire game, the graphics engine is around 10%.

It's an old quote, probably from around the time he built the D3 or Rage engine, but I think it still holds, considering the art, level and design assets are even more detailed today.
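The arithmetic behind that estimate, and what it implies for the "2x the cost" claim earlier in the thread, is easy to check (illustrative numbers taken straight from the quote):

```python
# Back-of-the-envelope check of the quoted Carmack estimate
# (illustrative arithmetic only).
engine_share = 0.30      # engine as a share of total game development hours
renderer_share = 0.30    # renderer as a share of the engine work
renderer_of_game = engine_share * renderer_share
print(f"{renderer_of_game:.0%}")  # 9% of the whole game

# So even fully duplicating the render path inflates the total
# project by roughly one renderer's worth of work, not by 2x:
print(f"{1 + renderer_of_game:.0%} of the original budget")
```

Under these assumptions, a second render path is closer to a ~10% overhead on the whole project than a doubling.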

I haven't been following the whole Mantle discussion very closely and, while I can see the benefit of such software to the developer, I was thinking about Mantle and its benefits to the end user. It seems to me that the end result will be higher frame rates and the ability to run higher video settings.

But what benefit are higher frame rates if you still have tearing? In FPS games, a genre that really benefits from high frame rates, you would have to enable V-Sync, which then introduces lag. The lag in some games is at times horrible, so much so that my KDR drops dramatically with V-Sync enabled. Maybe higher frame rates aren't the main benefit, but I just wonder what Mantle will bring if it doesn't address this issue of V-Sync lag in FPS games and produce smoother gameplay overall. I guess that's why I think G-Sync will do more for me than Mantle. The two together would be awesome, but we'll have to see if the green and red teams play nice and share, or if they go out and develop their setups in competition.

It seems to me that a better way to go is to clean up the existing standards and fix the issues that DirectX has, rather than scrapping the whole system.

I haven't been following the whole Mantle discussion very closely and, while I can see the benefit of such software to the developer, I was thinking about Mantle and its benefits to the end user. It seems to me that the end result will be higher frame rates and the ability to run higher video settings.

But what benefit are higher frame rates if you still have tearing? In FPS games, a genre that really benefits from high frame rates, you would have to enable V-Sync, which then introduces lag. The lag in some games is at times horrible, so much so that my KDR drops dramatically with V-Sync enabled. Maybe higher frame rates aren't the main benefit, but I just wonder what Mantle will bring if it doesn't address this issue of V-Sync lag in FPS games and produce smoother gameplay overall. I guess that's why I think G-Sync will do more for me than Mantle. The two together would be awesome, but we'll have to see if the green and red teams play nice and share, or if they go out and develop their setups in competition.

It seems to me that a better way to go is to clean up the existing standards and fix the issues that DirectX has, rather than scrapping the whole system.

Since I have a very large monitor that is capped at 60Hz, I run with V-Sync on, because tearing makes everything look really bad, but I haven't noticed any additional lag from it compared to how bad things can be with V-Sync off because of tearing. On the other hand, I run with settings such that any slowdown below the V-Sync FPS is extremely rare. If there were tons of slowdowns below 60, that might be another matter entirely.

On the other hand, that also involves other factors and how the game is made. A prime example would be the new NFS that was ported to PC from console with a locked frame rate: you can actually hack it to get a higher FPS, but the simulation is locked to the FPS, so the game then runs twice as fast. By contrast, in a game like Battlefield, the server tick rate seems very much decoupled from the FPS from a simulation standpoint.
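The difference being described, a simulation tick coupled to the render rate versus decoupled from it, is the classic fixed-timestep pattern. A minimal sketch (illustrative only, not any shipping game's actual loop):

```python
# Sketch of a fixed-timestep loop: the simulation advances at a
# constant tick rate no matter how fast frames render, so doubling
# the frame rate does not double the game speed. Illustrative only.

TICK = 1.0 / 32.0  # simulation tick; a power of two keeps this toy's float math exact

def run(frame_times):
    """Advance the simulation over a list of real frame durations (seconds)."""
    accumulator = 0.0
    ticks = 0
    for dt in frame_times:
        accumulator += dt
        # Consume whole ticks; leftover time carries into the next frame.
        while accumulator >= TICK:
            accumulator -= TICK
            ticks += 1
        # render() would interpolate state using accumulator / TICK here
    return ticks

# One second of 32 fps frames and one second of 128 fps frames both
# advance the simulation by the same 32 ticks.
print(run([1 / 32] * 32), run([1 / 128] * 128))  # → 32 32
```

A game that instead ran one simulation step per rendered frame (the hacked-NFS behaviour described above) would return 128 ticks for the second list, i.e. the game world running 4x too fast.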

All in all, I would love to have G-Sync. Based on past behavior, while AMD seems to embrace the idea that others could and should use Mantle, I bet nVidia would die before letting anybody else use G-Sync if there is any way to do so.

I remember an old quote from Carmack where he said a full game engine is around 30% of the entire game if you count development hours, and the graphics rendering part is around 30% of the full engine. So of the total workload for an entire game, the graphics engine is around 10%.

It's an old quote, probably from around the time he built the D3 or Rage engine, but I think it still holds, considering the art, level and design assets are even more detailed today.

That sounds believable, although I would hazard a guess that it depends on how generic or specific the engine is to a certain game. In the old days there was almost a new engine, or major engine development, for each game, but the engines of today are so advanced and so much more generic that I'd guess, once they have the implementation of an engine down, the assets and the actual game overtake any game-specific engine coding by a huge margin. Especially since most engines now seem to get used across a lot of games. Frostbite is a good example of that: the BF series, consoles, Mirror's Edge, Battlefront, Plants vs. Zombies 2, the next Mass Effect, NFS: Rivals (which also has the stupid thing of locking the simulation to FPS and capping at 30).

I bet nVidia would die before letting anybody else use G-Sync if there is any way to do so.

Personally I don't think that AMD would have much of a problem coming up with a similar solution.

Given the two tasks: Either cloning Mantle or re-implementing G-Sync (be it an outright copy or just a functional equivalent), I can assure you that G-Sync is by far the technically easier challenge. It's really not super-complicated (the monitor refresh is now driven by the GPU instead of by the internal monitor controller at a fixed rate) and most of Nvidia's advantage is that the monitor needs an updated controller and Nvidia has made the first prototypes. However, assuming that Nvidia doesn't want to play ball, Intel & AMD could easily band together with monitor manufacturers to get their own solution out on the market.
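A toy latency model of the difference (illustrative only, not how any actual driver or scaler works): with a fixed refresh, a finished frame waits for the next scan-out tick; with a GPU-driven refresh it does not.

```python
# Toy model of display latency: fixed 60 Hz refresh vs. GPU-driven
# (G-Sync-style) refresh. Illustrative only.
import math

REFRESH = 1000.0 / 60.0  # fixed refresh interval in milliseconds

def wait_fixed(finish_ms):
    """Ms a finished frame waits for the next fixed refresh tick (v-sync-style)."""
    next_tick = math.ceil(finish_ms / REFRESH) * REFRESH
    return next_tick - finish_ms

def wait_variable(finish_ms):
    """GPU-driven refresh: the panel scans out as soon as the frame is ready."""
    return 0.0

# A frame finishing 1 ms after a tick waits ~15.7 ms on a fixed display,
# and not at all on a variable-refresh one.
print(round(wait_fixed(REFRESH + 1.0), 1), wait_variable(REFRESH + 1.0))  # → 15.7 0.0
```

This is the whole trick: the hard part of G-Sync is not this logic but getting monitor controllers that let the GPU drive the refresh, which is why the poster argues the barrier is commercial rather than technical.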

Reimplementing Mantle, well... Nobody outside of EA/AMD has even seen Mantle documentation, much less the information needed to "clone" Mantle. Not to mention all the parts that are GCN-specific, where the other manufacturers would have to gut their microarchitectures (ain't gonna happen) or come up with performance-killing software kludges.

Duplicating Mantle would be more time-consuming than G-Sync. That being said, I don't think it would be years away if Nvidia thought it would be something that would sell. My hope would be that Nvidia would buy into Mantle, or an open standard based on Mantle, rather than yet another competing standard. Based on the little research I've done, DirectX isn't going to go much higher than 11, so something will have to take its place.

With regards to G-Sync, what I really fear isn't the technical solution, which, as you said, isn't that hard to replicate, but having separate proprietary standards competing, or a new patent-troll war, neither of which I would put past Nvidia in any way. That is not a situation we want to be in. So the question is how it's currently implemented and what can be used without falling into either of those.

I would rather Mantle be made into an open standard instead of having a competing one emerge, since competition there would probably mean the two coexisting for a while, with a few years of uneven support and not much progress. But with so little specific information out about Mantle, there is very little to say about how much of Mantle is actually GCN-specific, or how much of it could be mapped to Nvidia hardware. While it's closer to the iron than DX, it is still more integrated with the game engine, and seems to rely on a thin API/abstraction layer to the hardware. Considering how much similar work both architectures already do to support DX, if built properly it might not be that much work, as long as the thin driver between Mantle and the hardware can offer the same API upwards. So my layman's take is that if they can support DX, they can probably support Mantle, or big parts of it; if not in the current generation, then at least in the next, probably without massive reengineering.

I think the key to Mantle's success is having the developers use it and push it, and getting Nvidia on board if that's ever gonna happen. If they can do that, it might only be a question of AMD having a one-architecture-iteration head start, which is a while, but not that far away.

But with so little specific information out about Mantle, there is very little to say about how much of Mantle is actually GCN-specific, or how much of it could be mapped to Nvidia hardware.

Mantle is designed to be a thin hardware abstraction
‒ Not tied to AMD’s GCN architecture
‒ Forward compatible
‒ Extensions for architecture- and platform-specific functionality

Mantle would be a much more efficient graphics API for other vendors as well
‒ Most Mantle functionality can be supported on today’s modern GPUs

Want to see future versions of Mantle supported on all platforms and on all modern GPUs!
‒ Become an active industry standard with IHVs and ISVs collaborating
‒ Enable us developers to innovate with great performance & programmability everywhere

Johan Andersson gave the presentation this slide accompanied. It is likely there is no man on this earth with a deeper simultaneous knowledge of building a game, building a game engine and implementing Mantle.

When it comes to people trying to make money, nothing's valid until proven.

Strawman argument. Motivations differ widely. Uber-techs like Andersson tend to be far more motivated by technology advancement and enabling creation than by money, as in... "‒ Enable us developers to innovate with great performance & programmability everywhere."

EA is into Mantle for its future profitability potential.

Andersson is obviously passionate about advancing gaming technology and by all indications is into Mantle for its future gaming potential.

That rather lengthy set of slides has lots and lots of grandiose speculation. You know what the most important factual statement in that entire presentation is? Here's a hint: it's on the very first slide (not slide 34!)... can you guess what it is? Still don't know? OK, here's the answer:

Johan Andersson ‒ Technical Director, Frostbite, Electronic Arts

A whole bunch of promises made by a guy who is NOT AN AMD EMPLOYEE, has exactly zero authority over how AMD treats Mantle, and is not making decisions about Mantle for AMD.

Here's what an actual employee of AMD has to say about Mantle in his official capacity as an AMD employee:

@GnrlKhalid: "@Thracks is MANTLE open source?"
@Thracks: "@GnrlKhalid No. It is an API for the industry-standard GCN Architecture and its specific ISA, done at the request of game developers."

Rule 1: when you accuse other people of being anti-AMD for agreeing with on-the-record statements from AMD employees who AMD pays to say exactly what I just quoted, while you "prove" them wrong with statements from people who have never worked at AMD and do not represent AMD, maybe you should reconsider the words you are putting in AMD's mouth.

That rather lengthy set of slides has lots and lots of grandiose speculation.

To offhandedly dismiss an official slide-accompanied presentation from a person widely acknowledged as among a handful of the most accomplished and brilliant technical minds in the gaming industry as 'grandiose speculation' reflects not on Mr. Andersson, but on the person making such an assertion. I won't bother to ask you for the slightest evidence to back it up, as I know none exists. If you can prove otherwise, I'll profusely apologize.