Nvidia The Way It's Meant to Be Played 2013 (Day One)

Nvidia discusses next-generation graphics, new development SDKs, and games at its Montreal event

We had the chance to check out Nvidia's The Way It's Meant to Be Played 2013 event in Montreal, Canada. The two-day editors' event officially kicked off today and centered on the company's promising new game development tools, which the green team asserts will usher in truly next-generation graphics. New graphical features like FlameWorks, FleX, and GI Works were announced, along with improvements to existing tools such as PhysX.

Today covered more of the software and game side of development, but Nvidia assures us that day two of the event will focus more on graphics technology. Hopefully we'll hear some new hardware announcements, too! (A tech geek can only hope!)

Until then, you can check out all of today's slides below and let us know what you think of Nvidia's offerings so far in the comments!

Nvidia says many of these tools will work on its upcoming Tegra SoC, which will use the company's Kepler architecture, the same one that made its debut with Nvidia's GeForce 600-series video cards. In the background here we have the new Tegra chip, codenamed Project Logan, running the company's facial rendering demo, which we've seen running on dedicated desktop GPUs in the past.

To demonstrate its new FleX system, Nvidia shows off a demo of stacked blocks being knocked over by flowing water. As the water comes crashing down, the blocks topple into one another, creating a realistic domino effect.

The last unified GPU PhysX demo has water filling up a small room with little boxes and bouncing balls inside. The water knocks the objects around, and they all realistically collide with one another.
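The "unified" part is the interesting bit: FleX treats water, boxes, and balls alike as collections of particles, so any pair of objects can collide through a single code path. Below is a minimal, illustrative position-based sketch of that idea in C++; the names and the single shared radius are our own simplifications, not Nvidia's implementation.

```cpp
// Toy version of the unified-particle idea: every material is just
// particles of one radius, so one contact rule handles all collisions.
#include <cmath>
#include <vector>

struct Particle { float x, y, z; };

// Push apart any two particles that overlap (same rule for water, boxes, balls).
void resolveContacts(std::vector<Particle>& p, float radius) {
    const float diameter = 2.0f * radius;
    for (size_t i = 0; i < p.size(); ++i) {
        for (size_t j = i + 1; j < p.size(); ++j) {
            float dx = p[j].x - p[i].x;
            float dy = p[j].y - p[i].y;
            float dz = p[j].z - p[i].z;
            float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
            if (dist < diameter && dist > 1e-6f) {
                // Split the overlap evenly along the contact normal.
                float push = 0.5f * (diameter - dist) / dist;
                p[i].x -= dx * push; p[i].y -= dy * push; p[i].z -= dz * push;
                p[j].x += dx * push; p[j].y += dy * push; p[j].z += dz * push;
            }
        }
    }
}
```

A real solver would cull pairs with a spatial hash rather than testing every pair, but the one-rule-for-everything structure is the point.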

The specific type of ambient occlusion Nvidia is pushing is called HBAO+. Ambient occlusion refers to a game engine's ability to mimic the way ambient light interacts with objects in the real world. In real life, light may hit an object and bounce around corners to illuminate other objects that would otherwise sit in shadow; ambient occlusion approximates how much of that surrounding light actually reaches each point on a surface.
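As a rough illustration of the concept (not of HBAO+ itself, which works in screen space from the depth buffer), here's a hedged C++ sketch: sample directions over the hemisphere above a point and count how many are blocked by nearby geometry. The occlusion query is a hypothetical stand-in for whatever scene representation you have.

```cpp
// Conceptual ambient-occlusion estimate: the more of the hemisphere above
// a point that is blocked by nearby geometry, the darker the point.
#include <cmath>
#include <cstdlib>
#include <functional>

// Hypothetical scene query: does a ray from `origin` along `dir` hit
// anything within `maxDist`?
using OcclusionQuery =
    std::function<bool(const float* origin, const float* dir, float maxDist)>;

// Returns ~1.0 for an open point, approaching 0.0 for a fully occluded one.
float ambientOcclusion(const float point[3], const float normal[3],
                       int samples, float radius, const OcclusionQuery& blocked) {
    int open = 0;
    for (int i = 0; i < samples; ++i) {
        // Random direction in the unit ball, flipped into the hemisphere
        // above the surface normal (crude but serviceable sampling).
        float d[3], len;
        do {
            for (int k = 0; k < 3; ++k)
                d[k] = 2.0f * std::rand() / RAND_MAX - 1.0f;
            len = std::sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2]);
        } while (len < 1e-4f || len > 1.0f);
        if (d[0] * normal[0] + d[1] * normal[1] + d[2] * normal[2] < 0.0f)
            for (int k = 0; k < 3; ++k) d[k] = -d[k];
        if (!blocked(point, d, radius)) ++open;
    }
    return static_cast<float>(open) / samples;
}
```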

Another graphical tool Nvidia is pushing is contact-hardening shadows. In real life, shadows are darkest and sharpest nearest the object casting them, and they soften and lighten the farther they stretch out. With contact-hardening shadows, Nvidia says developers will be able to integrate this aesthetic easily.
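The geometry behind the effect is simple similar triangles, the same relationship used by percentage-closer soft shadows (PCSS): the penumbra grows in proportion to the gap between the blocker and the receiver. A small worked C++ example, with made-up numbers:

```cpp
// Similar-triangles penumbra estimate for an area light: the farther the
// shadow falls from the object casting it, the wider (softer) its edge.
#include <cstdio>

float penumbraWidth(float lightSize, float blockerDepth, float receiverDepth) {
    return lightSize * (receiverDepth - blockerDepth) / blockerDepth;
}

int main() {
    // Blocker fixed at depth 1.0; slide the receiving surface away from it.
    for (float receiver = 1.1f; receiver < 2.05f; receiver += 0.3f)
        std::printf("receiver at %.1f -> penumbra width %.2f\n",
                    receiver, penumbraWidth(0.5f, 1.0f, receiver));
    return 0;
}
```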

In this image, GI Works is enabled, and Nvidia demonstrates how easy it is for developers to move a dynamic light source. The little cut-out square at the bottom left of the image shows the viewpoint of the light source behind the T-rex. As the programmer controlling the demo shifts the direction of the light source, we can see that the light is truly dynamic, illuminating different parts of the room and the objects within.

Similar to ambient occlusion, another benefit of GI Works is that it allows light to bounce and reflect off of different surfaces. Here you can see the television sets mirrored on the floor below, something that wasn't visible when GI Works was disabled (see: two slides ago).
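The underlying idea is one extra "bounce": a lit surface re-emits some of the light it receives, tinted by its own color, onto whatever can see it. Here's a deliberately tiny C++ sketch of that single-bounce term; it's the textbook concept, not the actual GI Works algorithm, and the form factor is left as an abstract coupling value.

```cpp
// One-bounce indirect light: patch A reflects part of its incoming light,
// tinted by A's own color, toward a second surface.
struct Color { float r, g, b; };

// formFactor in [0,1] says how strongly the two surfaces "see" each other
// (distance, angles, and visibility all folded into one number here).
Color oneBounce(Color lightAtA, Color albedoA, float formFactor) {
    return { lightAtA.r * albedoA.r * formFactor,
             lightAtA.g * albedoA.g * formFactor,
             lightAtA.b * albedoA.b * formFactor };
}
```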

Nvidia then runs a live FlameWorks demo of a dragon breathing fire around a metal ball. As the fire streams out of the dragon's mouth, it doesn't merely engulf the ball but realistically interacts with and bends around the contraption.
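Nvidia hasn't walked us through FlameWorks' internals, but the bending behavior is what you'd expect from a grid-based fluid simulation with solid boundaries: cells inside the ball are marked solid, and any flow into them is cancelled, so the flame slides along the surface instead of passing through. A hedged C++ sketch of that boundary rule, under those assumptions:

```cpp
// Grid-fluid solid boundary: cancel velocity through faces that touch a
// solid cell, so hot gas flows around the obstacle rather than into it.
#include <vector>

// u holds horizontal velocities on the faces between grid cells; face i
// sits between cells i-1 and i. solid[] marks cells inside the metal ball.
void enforceSolidBoundary(std::vector<std::vector<float>>& u,
                          const std::vector<std::vector<bool>>& solid) {
    for (size_t i = 1; i + 1 < u.size(); ++i)
        for (size_t j = 0; j < u[i].size(); ++j)
            if (solid[i - 1][j] || solid[i][j])
                u[i][j] = 0.0f;   // no flow through the obstacle
}
```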

In a live demo of Batman: Arkham Origins, we see Batman walking through the snowy streets of Gotham. As he steps through the snow, we can see the footprints he leaves behind, an effect achieved using Nvidia PhysX.
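Nvidia didn't detail the implementation on stage, but a common way to get persistent footprints is a deformable heightfield: keep a grid of snow depths and press it down wherever a foot lands. A small, hypothetical C++ sketch of that approach (our assumption, not the Arkham Origins code):

```cpp
// Deformable snow as a heightfield: each cell stores how much snow is left,
// and a footstep flattens the cells under it.
#include <vector>

struct SnowField {
    int w, h;
    float cell;                 // world-space size of one grid cell
    std::vector<float> depth;   // remaining snow height per cell

    SnowField(int w_, int h_, float cell_, float snowHeight)
        : w(w_), h(h_), cell(cell_), depth(w_ * h_, snowHeight) {}

    // Press the snow flat under a footprint centered at world (x, y).
    void stamp(float x, float y, float radius) {
        for (int j = 0; j < h; ++j)
            for (int i = 0; i < w; ++i) {
                float dx = i * cell - x, dy = j * cell - y;
                if (dx * dx + dy * dy < radius * radius)
                    depth[j * w + i] = 0.0f;   // compressed to the ground
            }
    }
};
```

The renderer then simply displaces the snow mesh by `depth`, and the prints persist for free.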

The next game to take the stage in Nvidia's lineup is the upcoming Assassin's Creed IV: Black Flag. Here an Ubisoft rep talks about how the game will use dynamic foliage: as you walk through the forest, your character will realistically interact with the branches and leaves in your path.
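One simple way to fake that interaction (an assumption about the general technique, not Ubisoft's code) is to give each branch a spring-damped bend angle: push it aside while the player overlaps it, and let the spring pull it back to rest afterward. A minimal C++ sketch:

```cpp
// Spring-damped branch: bends away while the player is close, then eases
// back to its rest pose once the player has passed.
struct Branch {
    float bend = 0.0f;      // current bend angle, radians
    float bendVel = 0.0f;   // angular velocity
};

// Call once per frame with the player's distance to the branch.
void updateBranch(Branch& b, float playerDist, float pushRadius, float dt) {
    // Bend harder the deeper the player is inside the branch's radius.
    float target = (playerDist < pushRadius)
                       ? 0.6f * (1.0f - playerDist / pushRadius)
                       : 0.0f;
    const float stiffness = 40.0f, damping = 6.0f;   // tuning constants
    float accel = stiffness * (target - b.bend) - damping * b.bendVel;
    b.bendVel += accel * dt;
    b.bend += b.bendVel * dt;
}
```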

Another graphical trick Black Flag has up its sleeve is its improved rain and wet-surface system. In the demo we're shown, clothing looks naturally damper and darker as it gets soaked, and wooden surfaces take on a slight, appropriate sheen in the rain under the moon's light.
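Those two observations map onto a standard shading trick, which we'd guess is roughly what's happening here (this is our sketch, not Ubisoft's shader): as a wetness value rises, darken the diffuse color and push the glossiness toward that of a water film. In C++:

```cpp
// Wet-surface shading rule of thumb: soaked materials get darker and shinier.
struct Material {
    float r, g, b;   // diffuse color
    float gloss;     // 0 = matte, 1 = mirror-like
};

Material applyWetness(Material m, float wetness /* 0 = dry, 1 = soaked */) {
    float darken = 1.0f - 0.5f * wetness;        // soaked cloth looks darker
    m.r *= darken;
    m.g *= darken;
    m.b *= darken;
    m.gloss += (1.0f - m.gloss) * wetness;       // water film adds sheen
    return m;
}
```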

Finally, Nvidia closed out the first day of its event by announcing GPU-game bundles through which purchasers can get Batman: Arkham Origins, Assassin's Creed IV: Black Flag, and Splinter Cell: Blacklist for free, in addition to $100 off Nvidia's Shield handheld gaming console.

The AMD marketing dept. really are a bunch of trolls. Who can forget the Trinity "laptop in a desktop" joke, or the chips they handed out at what I seem to remember being IDF (or some other Intel event)?

I am actually writing this from my hospital bed in the ER, high as a kite on painkillers, so I have no clue if this makes any sense to those reading it, but here goes nothing:

Proprietary. Proprietary EVERYWHERE! It's sickening to see the lengths that Nvidia has gone to (as well as other companies) where open standards are concerned: CUDA vs. OpenCL, proprietary-only Linux drivers instead of open and proprietary ones, closed-down standards such as PhysX. Not cool, green team, not cool. There are reasons I chose AMD. This is one.

When the alternative doesn't work as well as said proprietary standards, or at all, then as a developer I'd rather not play this game of moral mumbo jumbo and just pick something that lets me get the job done.

And openness doesn't automatically mean freedom. The source code Apple exposes for its OS kernel, if I read it correctly, basically says: "You're free to view and edit this, but any changes you make are the property of Apple." Hm.

The only reason CUDA seems to work better than OpenCL is that it's more mature. It's a few years older than the OpenCL standard, which makes it better documented and more widely supported. Also, generally speaking, proprietary standards have a tendency to die very slow, painful deaths (although DirectX seems to have bucked that trend very well).

I'm not disputing that proprietary standards produce better results. Generally they do, at least until an open standard takes over, and that takeover almost always comes eventually.

Well, AMD had every opportunity to buy the company that created PhysX, and didn't.

I believe it was Ageia. AMD is its own worst enemy. They can't decide whether they want to be a budget maker or a top performer. They put out cards at the same price points as Nvidia, but the cards perform worse, except for whatever the current top-of-the-line card is.

Sorry to say, but its proprietary technologies are what's keeping Nvidia on top. PhysX blows AMD's "realistic hair tessellation" out of the water, CUDA transcodes video worlds beyond any AMD solution I've used (though Intel's Quick Sync is king), and AMD's Catalyst suite is crap in my experience.

AMD cards are great for my wife's computer or my HTPC, but I go with Nvidia on mine.

At least AMD is being smart with its processor line. They realize that, hey, our chips aren't the fastest, but you'll pay a third of the price. If they marketed their video cards that way, maybe they'd stop hemorrhaging money. Buying ATI was a terrible mistake, because now they're fighting two top-tier brands while scrambling to hold on to second place.

Not sure if I'm reading your question right, but if you're talking about PhysX, it's an Nvidia-only tech. There used to be a hack a while back that would let you run it on a mixed-card system, but I think its lone developer couldn't keep up.