To answer that, let’s take a quick look at the history of water in video games. Realistic water is common now, but how did it get that way?

Water was just a blue blob in some of the first games I played. Not compelling. Not believable. Just part of the background. Its interactivity was limited. You were in it or out of it. But if you saw the blue blob in Pitfall—an adventure game released in 1982—you knew to stay out or stand on the gators’ heads.

In 1999, I first saw what developers could do with rendered water. That’s when NVIDIA introduced the world’s first GPU, the GeForce 256. It featured a transform and lighting engine for geometry calculations that let developers create more interactive objects, like water.

NVIDIA released a pair of demos to show what was possible. The Bubble demo showed a reflective bubble, which would ripple when poked. The Crystal Ball demo showed a transparent bubble, another great water effect.

But real breakthroughs didn’t arrive until 2002. That’s when GeForce 4 hit the scene with second-generation programmable vertex and pixel shaders. DirectX 8 had evolved along with these new programmable shaders, and we began to pour resources into our game developer program.

Those were all great technology advancements for 2002. But our demos showed that we at least had the ability to do those effects in 1999. Why did realistic water take so long to spread throughout the industry?

What changed? The way we introduce cutting-edge new effects to developers.

ILM for Games

We call this effort GameWorks. GameWorks encompasses all the game-related technologies we’ve invented and refined over the years. It’s a robust suite of tools and graphics technologies backed by 300+ visual effects engineers who create libraries, developer tools and samples. They work closely with developers—often onsite—to enhance their games.

Back in 1999, we would write and deliver piles of code to developers to add to their code. But that created more work for developers. As we tackled increasingly challenging visual effects problems, just giving away code samples proved even less effective.

So, we adopted a more production-oriented approach and turned our library of special effects into middleware. Think of us as Industrial Light & Magic for games.

The result: Developers can adopt new techniques by just dropping that middleware into their games. It’s a proven formula for success. PhysX, one of our key GameWorks technologies, is now one of the most widely used pieces of middleware in the industry.

Four years ago, we introduced middleware we call NVIDIA Turbulence. Less than a year later it appeared in Dark Void. Then we added support for Unreal Engine 3, Unreal Engine 4 and CryEngine.

Turbulence effects are now common in games. You’ve seen them in Unreal Engine 3 games like Batman: Arkham Origins, Hawken, and Warframe. They are in Unreal Engine 4 games such as Daylight. The CryEngine game Warface just added them. Even games based on proprietary game engines, like Assassin’s Creed IV: Black Flag, Call of Duty: Ghosts, Metro: Last Light and Planetside 2, have added Turbulence.

With the middleware in place and support built into the key game engines, adding these effects to games is significantly less challenging for developers. It took just four weeks to integrate Turbulence into Daylight and only six weeks to add it to Batman: Arkham Origins.

In short, we’ve gone from telling developers what they can do to showing them how to do it. That’s shortened the time it takes for new technologies to reach games and accelerated the pace of innovation. Isn’t that what it’s all about?