ND2Dx update: faster, more flexible, new rendering techniques, 3D transformations and new dynamic shader system

After all these months without an update for ND2Dx, I decided to take some time and package all the modifications I have made while working on different projects over the past few months.
It was not an easy task, as a lot has changed and not everything has been incorporated into the new release. There may be some bugs and/or leftovers from the last release, but things should get updated more often over the following weeks.

For the moment, I’m glad to announce a huge performance boost thanks to a new direct GPU drawing technique (thanks, Genome2D), the return of 3D transformations to ND2Dx, and a totally new dynamic shader system that lets you easily create custom shaders and change their textures and properties on the fly.

Using the direct GPU drawing technique, ND2Dx can now render up to 55,000 rotating sprites at 55 fps on the same machine I used for the previous tests.
Using the scene graph (display list), it renders around 23,000 rotating sprites (compared to roughly 18,000 before, if I remember correctly).

Demos

Use the Left and Right arrow keys to switch between scenes/demos. Also, for some strange reason, Chrome seems to be much slower than Firefox on my computer (even with the original Flash plugin from Adobe).

Things that have changed

No more Materials

Those are gone (sorry to those of you who were using them to create new effects/filters). With the incorporation of the new batching system, they became an unnecessary step in the rendering process. Removing them also improved performance when using the scene graph (display list).

No more RenderSupports objects

After testing the new direct GPU drawing system, it became clear that it should become the new, centralized way of rendering. On top of that, I quickly realized that I was only ever using one batching system at a time, and switching between two or more batching systems during the same rendering pass made no sense. This led to the new Renderers objects.

Added the new Renderers objects

These objects are the new way of rendering meshes on screen.
Each renderer offers two ways to render a mesh:

textured quad: draws a textured quad. This is the faster path, as the geometry is always a simple quad.

textured mesh: draws a textured mesh. This is slower but more flexible, as it can draw any kind of mesh.

There are currently two different Renderers:

TexturedMeshCloudRenderer: batches textured quads and custom meshes together, but only supports 2D transformations.

TexturedMesh3DCloudRenderer: also batches textured quads and custom meshes together, and additionally supports 3D transformations (so both 2D and 3D).

Of course the first one is faster than the second, as it does not need to compute the 3D transformations. But the second still remains faster than the old ND2Dx rendering system.
It is also important to note that I’m using the “faster cos/sin” technique from http://lab.polygonal.de/?p=205. It trades a little precision for faster cos/sin computations (the error is negligible in 99% of cases).
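For reference, the low-precision trick from that article fits a parabola to the sine curve, then blends the result with its own square for extra precision. Here is a sketch in TypeScript (the original is ActionScript 3, but the syntax is nearly identical); the caller is assumed to have already wrapped the angle into [-π, π]:

```typescript
// Fast sine approximation (lab.polygonal.de): fit a parabola to
// sin(x) on [-PI, PI], then blend with its square for extra precision.
// The input x must already be wrapped into [-PI, PI].
function fastSin(x: number): number {
  // low precision: (4/PI) * x - (4/PI^2) * x * |x|   (max error ~0.056)
  let y = 1.27323954 * x - 0.405284735 * x * Math.abs(x);
  // extra precision pass with P = 0.225               (max error ~0.001)
  y = 0.225 * (y * Math.abs(y) - y) + y;
  return y;
}

// cos(x) = sin(x + PI/2), re-wrapping when the shift leaves the range
function fastCos(x: number): number {
  x += Math.PI / 2;
  if (x > Math.PI) x -= 2 * Math.PI;
  return fastSin(x);
}
```

The speedup comes from replacing a transcendental function call with two multiplications and an addition per pass, which matters when transforming tens of thousands of sprites per frame.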

Added 3D transformations

The new Renderers system made it possible to add 3D transformations without losing the ability to batch everything together. The only thing needed to use these 3D transformations is the specific TexturedMesh3DCloudRenderer.

So there is no need for special objects or containers to transform a node in 3D. Everything can be done with a simple Node2D object.
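To give an idea of the extra per-vertex work the 3D renderer has to do, here is a sketch of rotating a point around the Y axis and applying a simple perspective divide. The function name and the focal-length constant are illustrative assumptions, not ND2Dx code:

```typescript
// Sketch of the extra work a 3D-capable batcher does per vertex:
// rotate around the Y axis, then scale by a perspective factor.
interface Point2D { x: number; y: number; }

function rotateYAndProject(
  x: number, y: number, z: number,
  angleY: number,
  focalLength: number = 500 // assumed projection constant
): Point2D {
  // standard rotation around the Y axis
  const cos = Math.cos(angleY);
  const sin = Math.sin(angleY);
  const rx = x * cos + z * sin;
  const rz = -x * sin + z * cos;
  // perspective projection: points further away shrink toward the center
  const scale = focalLength / (focalLength + rz);
  return { x: rx * scale, y: y * scale };
}
```

This is also why the 2D-only renderer is faster: with no rotation around X/Y and no perspective divide, each vertex only needs a 2D affine transform.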

Node2D is now the only display object

This is not an obligation but rather something I have decided for myself in order to make things clearer and more flexible. Everything that extends the functionality of a Node2D object will be moved into components. This will also be true for the BitmapFont2D object, which is not yet implemented as a component.
I also took this decision to make my life easier when adding new things to my WorldGameMaker editor.

Added a new dynamic shader system

This new system lets you use custom shaders (typically, and most easily, written in an XML file) to render a mesh on screen. It has not yet been integrated into the new Renderers system, as I’m still thinking about how to do that in a clean and constructive way, but it works and you can use it right now. The only downside is that it can’t be batched yet.

The good thing about it is that, first, you can use your own shaders quite easily, and second, you can create properties (straight from the XML file) that can be used inside your shader and modified outside of it in a very easy and efficient way.
In other words, it’s dynamic 🙂
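The core of the “dynamic property” idea can be sketched as a registry that maps each named property declared in the shader definition to a slot in the constants buffer uploaded to the GPU. The XML parsing is elided here, and all names are illustrative assumptions rather than ND2Dx’s actual API (the sketch is TypeScript, close to ActionScript 3 syntax):

```typescript
// Each named property maps to an offset in the flat constants buffer.
// One vec4 register (4 floats) per property, since Stage3D shader
// constants are uploaded as vec4s.
class DynamicShaderProperties {
  private slots = new Map<string, number>(); // property name -> float offset
  readonly constants: Float32Array;

  constructor(propertyNames: string[]) {
    this.constants = new Float32Array(propertyNames.length * 4);
    propertyNames.forEach((name, i) => this.slots.set(name, i * 4));
  }

  // modify a property from outside the shader, by name
  set(name: string, x: number, y = 0, z = 0, w = 0): void {
    const offset = this.slots.get(name);
    if (offset === undefined) throw new Error(`unknown property: ${name}`);
    this.constants.set([x, y, z, w], offset);
  }
}
```

At render time, the whole `constants` array would be uploaded in one call, so changing a property between frames costs only a few array writes.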

SignalDispatchers

A signal dispatcher works in much the same way as an EventDispatcher, but it is much simpler and up to 7 times faster. I now use it to dispatch mouse signals from Node2D objects. I find it easier to use than having all sorts of different signals inside the class, and much more flexible. This way, events/signals can be sent between nodes and components in a simple and effective way.
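A minimal version of the idea can be sketched as follows: listeners live in a linked list (cheap add/remove, no allocation on dispatch), and a dispatcher centralizes signals by string type, much like events. The sketch is TypeScript and the names are illustrative, not ND2Dx’s API:

```typescript
type Listener = (...args: any[]) => void;

interface SlotNode { listener: Listener; next: SlotNode | null; }

// A signal is just the head of a linked list of listeners.
class Signal {
  private head: SlotNode | null = null;

  add(listener: Listener): void {
    this.head = { listener, next: this.head };
  }

  remove(listener: Listener): void {
    let prev: SlotNode | null = null;
    for (let node = this.head; node; node = node.next) {
      if (node.listener === listener) {
        if (prev) prev.next = node.next; else this.head = node.next;
        return;
      }
      prev = node;
    }
  }

  dispatch(...args: any[]): void {
    for (let node = this.head; node; node = node.next) node.listener(...args);
  }
}

// Centralizes signals by type string, like an EventDispatcher would.
class SignalDispatcher {
  private signals = new Map<string, Signal>();

  addListener(type: string, listener: Listener): void {
    let s = this.signals.get(type);
    if (!s) { s = new Signal(); this.signals.set(type, s); }
    s.add(listener);
  }

  dispatch(type: string, ...args: any[]): void {
    this.signals.get(type)?.dispatch(...args);
  }
}
```

The speed difference over Flash events comes largely from skipping event-object allocation and capture/bubble phases: dispatching is just a walk down the list.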

Things that are not there yet and/or need a few adjustments

BitmapFont2D

I still need to incorporate this one into a new component.

UI system

I already have a UI system that is working but I need to convert it into components. I’m also changing a couple of things to make it more flexible.
I don’t think it is like anything you have seen before. I really wanted to keep things simple (more or less) and very flexible, so you’ll have to build your own buttons, checkboxes, etc. But more on that later, when I actually have something to show.
There is also a slice-3 and slice-9 system (and I’m thinking of a slice-5 one… :))
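For those unfamiliar with the technique, slice-9 splits a rectangle into a 3×3 grid where the corners keep their size and the edges/center stretch, so a skin can scale without distorting its borders. A sketch of the layout math (TypeScript, names and signature are my own, not ND2Dx’s):

```typescript
interface Rect { x: number; y: number; w: number; h: number; }

// Split a target rectangle into 9 sub-rectangles: fixed-size corners,
// stretching edges and center. Margins are the fixed border sizes.
function slice9(target: Rect, left: number, right: number, top: number, bottom: number): Rect[] {
  const xs = [target.x, target.x + left, target.x + target.w - right];
  const ws = [left, target.w - left - right, right];
  const ys = [target.y, target.y + top, target.y + target.h - bottom];
  const hs = [top, target.h - top - bottom, bottom];
  const quads: Rect[] = [];
  for (let row = 0; row < 3; row++)
    for (let col = 0; col < 3; col++)
      quads.push({ x: xs[col], y: ys[row], w: ws[col], h: hs[row] });
  return quads;
}
```

Slice-3 is the degenerate case with only one stretching row (or column), which is enough for bars and handles.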

Resource system

Everything, from meshes to textures to shaders, is a resource. By grouping all of these together, I wanted to be able to load and unload them on the fly. I’m now less fond of that ability, so I might change and simplify a couple of things there.

Dynamic mesh modification

This one will be a component that lets you modify a mesh at runtime. I’m thinking of control points, each having an influence on specific vertices of the mesh.
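One plausible way such a component could work (everything here is an assumption, not existing ND2Dx code): each control point drags nearby vertices by its own displacement, weighted by a falloff with distance. Sketched in TypeScript:

```typescript
interface Vec2 { x: number; y: number; }
interface ControlPoint { pos: Vec2; offset: Vec2; radius: number; }

// Displace each vertex by the offsets of all control points within
// range, weighted by a linear falloff (1 at the point, 0 at the radius).
function deform(vertices: Vec2[], controls: ControlPoint[]): Vec2[] {
  return vertices.map(v => {
    let dx = 0, dy = 0;
    for (const c of controls) {
      const dist = Math.hypot(v.x - c.pos.x, v.y - c.pos.y);
      if (dist < c.radius) {
        const w = 1 - dist / c.radius;
        dx += c.offset.x * w;
        dy += c.offset.y * w;
      }
    }
    return { x: v.x + dx, y: v.y + dy };
  });
}
```

A smoother falloff (e.g. squaring the weight) would avoid visible creases at the edge of a control point’s radius; the linear version keeps the sketch short.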

SignalDispatchers

I still need to work on its implementation across the whole framework. My goal is to avoid signals where performance is not really needed (SignalDispatchers are still very performant, much more so than events).

Conclusion

This new release is stable enough to use, but as you have read, some features are missing (the most obvious one being BitmapFonts).
I plan on updating the framework more often now that the base has been cleaned. You’ll also find a couple of classes inside the framework that I haven’t written about here. Those are still experiments.

Hey Kevin
thanks for the input, we are actually using the exact same approach 🙂! If you take only the Signal and SignalListener classes (SignalLite and SlotLite in your case, respectively), it is indeed much faster than events and Robert Penner’s Signals.
I then added a SignalDispatcher that takes a type parameter and lets me centralize all the signals in one place instead of having them spread across different objects, just like events do. It is ~11 times quicker than events (with no parameters).
On a comparison note, when used alone, the Signal class is ~18 times faster than events and ~2.5 times faster than Robert Penner’s Signals. Also, and very importantly for me, adding and removing listeners takes far less time than Robert Penner’s Signals, thanks to the use of linked lists.
I have a couple of benchmarks from all those tests; I can put them online when I find some time, if you want.