
Passing values from the vertex shader to the fragment shader is very common. The simplest use case is sampling a texture: this operation can only be done in the fragment shader, but the texture coordinates to sample are accessible as vertex attributes in the vertex shader. In the end, any non-trivial shader will require passing values from the vertex shader to the fragment shader.

The actual shader is pretty simple, but I recommend you read the entire tutorial to understand it properly:

public class RGBShader extends ActionScriptShader
{
    private var _color : SValue = null;

    override protected function getOutputPosition() : SValue
    {
        var xy : SValue = float2(getVertexAttribute(VertexComponent.XY));

        _color = getVertexAttribute(VertexComponent.RGB);

        return float4(xy, 0., 1.);
    }

    override protected function getOutputColor() : SValue
    {
        return interpolate(_color);
    }
}

Another interesting thing is optimization. Because values are linearly interpolated between the vertex shader and the fragment shader, it is possible to perform some computations per-vertex and reuse the results in the fragment shader. The only requirement is that those values must be linearly interpolable. In other words: every linearly interpolable value computed in the fragment shader could be computed in the vertex shader instead. This gives a very important performance boost, as the vertex shader is much faster and runs on a much smaller data set: we always have far fewer vertices than pixels. I will soon write an article about this very optimization.
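The "linearly interpolable" requirement can be checked on the CPU. The sketch below (illustrative, not Minko code) shows why: the GPU linearly interpolates vertex-shader outputs across a triangle, so moving a computation to the vertex shader is only safe when interpolating its results gives the same value as running it on the interpolated inputs.

```typescript
// lerp models the GPU's linear interpolation of vertex outputs.
const lerp = (a: number, b: number, t: number): number => a + (b - a) * t;

// A linear computation (scaling by 2): safe to move per-vertex.
const linear = (x: number): number => 2 * x;

// A non-linear computation (squaring): must stay per-fragment.
const nonLinear = (x: number): number => x * x;

const a = 1, b = 3, t = 0.5;

// Linear: interpolating the results equals computing on the interpolated input.
console.log(lerp(linear(a), linear(b), t) === linear(lerp(a, b, t)));          // true
// Non-linear: the two differ, so per-vertex evaluation would be wrong.
console.log(lerp(nonLinear(a), nonLinear(b), t) === nonLinear(lerp(a, b, t))); // false
```

This is exactly why, say, a diffuse lighting term computed per-vertex looks slightly different from the same term computed per-fragment: the interpolation is linear while the lighting equation is not.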

Last but not least, this tutorial uses the latest additions to Minko. Among them is an easier way to declare vertices and fill a vertex stream using a VertexIterator. Indeed, you can now use an (inlined) Object to initialize an entire vertex at once. This was done by overloading the flash_proxy::setProperty() method in the VertexIterator dynamic class. You can now choose between 3 different approaches to fill a vertex stream:

var vstream : VertexStream = new VertexStream(null, format);
var vertices : VertexIterator = new VertexIterator(vstream);

// the inline version (new!)
vertices[0] = {x: 0., y: .5, r: 1., g: 0., b: 0.};
vertices[1] = {x: -.5, y: -.5, r: 0., g: 1., b: 0.};
vertices[2] = {x: .5, y: -.5, r: 0., g: 0., b: 1.};

// the dynamic version
vertices[0].x = 0.;
vertices[0].y = .5;
vertices[0].r = 1.;
vertices[0].g = 0.;
vertices[0].b = 0.;
vertices[1].x = -.5;
vertices[1].y = -.5;
vertices[1].r = 0.;
vertices[1].g = 1.;
vertices[1].b = 0.;
vertices[2].x = .5;
vertices[2].y = -.5;
vertices[2].r = 0.;
vertices[2].g = 0.;
vertices[2].b = 1.;

// the fast version
vstream.push(
    0., .5, 1., 0., 0.,
    -.5, -.5, 0., 1., 0.,
    .5, -.5, 0., 0., 1.
);

All three versions do exactly the same thing, each in a different fashion. You can use whichever one fits what you are doing.
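The "fast version" above hints at the underlying memory layout. The sketch below (illustrative, not Minko's actual VertexStream) shows what that implies with an (x, y, r, g, b) format: one flat, interleaved array of 5 floats per vertex, which is why push() takes whole vertices as consecutive components.

```typescript
// One flat array, 5 interleaved components (x, y, r, g, b) per vertex.
const COMPONENTS = 5;
const stream: number[] = [];

// Mimics the "fast version": push whole vertices as consecutive floats.
function pushVertices(data: number[]): void {
  if (data.length % COMPONENTS !== 0)
    throw new Error("data must contain whole (x, y, r, g, b) vertices");
  stream.push(...data);
}

// Reads vertex i back as an object, like the dynamic version exposes it.
function vertex(i: number): { x: number; y: number; r: number; g: number; b: number } {
  const o = i * COMPONENTS;
  return { x: stream[o], y: stream[o + 1], r: stream[o + 2], g: stream[o + 3], b: stream[o + 4] };
}

// The same triangle as in the tutorial.
pushVertices([
   0.,  .5, 1., 0., 0.,
  -.5, -.5, 0., 1., 0.,
   .5, -.5, 0., 0., 1.,
]);

console.log(vertex(1)); // { x: -0.5, y: -0.5, r: 0, g: 1, b: 0 }
```

The convenience APIs trade speed for readability: the inline and dynamic versions go through property lookups per component, while the fast version writes straight into the flat buffer.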

As always, if you have questions or suggestions, you can post them on Aerys Answers.

Building a flexible and extensible 3D framework has always been our priority. This goal has a huge influence on the way we work and on what we deliver as a final result. It is also a major difference from other 3D engines/frameworks, which tend to provide everything in a single monolithic library.

One of the side effects of this policy is Minko's plugin system. Here is how it works:

All core APIs live in the actual Minko project

Any additional feature will be implemented as a “plugin”

About the core APIs

Those APIs are the foundations of the framework and will be released as an open-source project. They are very loosely coupled, so that even Minko's core can be split into distinct entities. There are two main APIs:

The Scene Graph API: it provides interfaces and base classes to build a scene using a data structure called a “graph”. In the end, building a Minko 3D scene is just as simple as building a 2D scene with the good old Flash display list API.

The Rendering API: it provides everything you need to perform rendering operations using the Molehill API and shaders.

We define a simple mechanism to pass data from the Scene Graph API to the Rendering API. We also provide basic scene and rendering algorithms to make it possible to build a simple 3D application from scratch.

About the additional features

So what are additional features? They are features that rely on the core APIs but that are not essential or required in every project. They include (the following list is not exhaustive):

Dynamic lights

Dynamic shadows

Animations

Skinning

Dynamic reflections

File format parsers (3DS, Collada …)

Physics

Particles

Mouse interactivity

Every new feature will be implemented as a “plugin” in a separate project. If you don’t want it, don’t use it! If you want to provide a different/better way to do it, feel free to do so! Does Minko lack a feature you need? You can create your own plugin and share it as you like.

Those plugins also make it possible to create bindings between Minko and third-party projects such as jiglibflash or Stardust.

Why?

The main goal is to ensure the core project exposes the right APIs. This way, it will be extensible enough that anyone can build against it to provide more and more plugins. We also want people to be able to pick what they need and build a community around the projects they like the most. And clearly, some plugins will be a lot more technical than others.

This video demonstrates what can be achieved using Minko, the next-gen 3D engine for Flash “Molehill” I’m working on. The environment is a Quake 3 map loaded at runtime and displayed using high definition textures. The demo shows some technical features such as:

Quake 3 BSP file format parser

Potentially Visible Set (PVS)

complex physics for spheres, boxes and convex volumes using BSP trees

light mapping using fragment shaders

materials substitution at runtime

dynamic materials using SWF and the Flash workflow

first person camera system

Quake 3 BSP file format

Quake 3 uses version 46 of the BSP file format. It's fairly easy to find unofficial specifications for the file format itself. Yet parsing a file and rendering the loaded geometry are two very different things. With Minko, it is now possible to do both without any knowledge of the specific details of the BSP format and its algorithms. Still, those are interesting algorithms, so I wanted to explain how they work.

Binary Space Partitioning (BSP) is a very effective algorithm for subdividing space in two according to a "partition". To store the subdivided spaces, we use a data structure called a BSP tree. A BSP tree is just a binary tree plus the partition that separates the two children. This picture shows the BSP algorithm in 2 dimensions:

In that case, each node of the BSP tree stores the partition line. When working in 3D, the partition is a plane. We can then walk the tree by simply comparing the location of a particular point to the partition. A good use case is 3D z-sorting: we simply compare the location of the camera to the planes stored in the BSP nodes to know which child should be walked (or rendered) first. But in Quake 3, the BSP tree is used for:

physics: the tree makes it possible to optimize collision detection by performing only the required computations and tests

rendering optimization: by locating the node where the camera is, we know the Potentially Visible Set (PVS) of polygons that should be rendered and can ignore the rest

Each "*.bsp" file stores the BSP tree itself, the partition planes, the geometry, the PVS data and even bounding boxes/spheres for each node to perform frustum culling. This data is very precious and is compiled directly into the file by tools like GTKRadiant.
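The z-sorting walk described above can be sketched in a few lines. This is an illustrative toy (not Minko's or Quake 3's actual code): each node stores a partition plane as a normal and a distance, and the traversal emits the child on the opposite side of the plane from the camera first, since it is farther away.

```typescript
type Vec3 = [number, number, number];

// A BSP node: a partition plane (dot(normal, p) = d), two optional
// children, and some payload stored at the node (here, polygon names).
interface BSPNode {
  normal: Vec3;
  d: number;
  front?: BSPNode; // child on the positive side of the plane
  back?: BSPNode;  // child on the negative side of the plane
  polygons: string[];
}

const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Walks the tree back-to-front relative to the camera: the subtree on the
// opposite side of the partition from the camera is rendered first.
function backToFront(node: BSPNode | undefined, camera: Vec3): string[] {
  if (!node) return [];
  const cameraInFront = dot(node.normal, camera) - node.d >= 0;
  const near = cameraInFront ? node.front : node.back;
  const far  = cameraInFront ? node.back  : node.front;
  return [...backToFront(far, camera), ...node.polygons, ...backToFront(near, camera)];
}

// Tiny example: the plane x = 0 splits space into a front ("F") and a
// back ("B") half, with the root's own polygons ("R") on the plane.
const root: BSPNode = {
  normal: [1, 0, 0], d: 0, polygons: ["R"],
  front: { normal: [1, 0, 0], d:  2, polygons: ["F"] },
  back:  { normal: [1, 0, 0], d: -2, polygons: ["B"] },
};

console.log(backToFront(root, [5, 0, 0]));  // [ 'B', 'R', 'F' ]
console.log(backToFront(root, [-5, 0, 0])); // [ 'F', 'R', 'B' ]
```

The same point-vs-plane test drives the other uses listed above: collision queries only descend into the half-spaces a moving object can touch, and locating the camera's leaf is what indexes into the PVS data.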

Credits

I was at “Retour de MAX” (“Back from MAX”) 2010 – an Adobe France event – to present Molehill, the new 3D API for the Flash Platform, alongside Adobe’s web consultant David Deraedt. The goal was to present the technical details about Molehill and the pros and cons of the API. We also wanted to demonstrate how Minko, our 3D engine, adds all the features Flash developers might expect when dealing with 3D.

We also presented a worldwide premiere demonstration of the new Flash 3D capabilities right in the browser (Internet Explorer 8 in this case). This experiment follows my work on rendering Quake 2 environments with Flash 10 using Minko. Of course, the new Molehill 3D API and the next version of Minko built on top of it make this kind of work a lot easier. For this demonstration, we chose to present a Quake 3 environment viewer using HD textures provided by the ioquake3 project. Here are some screenshots (more screenshots on the Aerys website):