PNG files are handy for Stage3D games since they offer great compression and alpha channel support for transparent textures. If you make your PNG files in a program like Photoshop and try to render them, you might notice ugly black 'halos' or 'outlines' around any transparent areas, and even the solid colours may not look quite right.

The reason you're seeing blackness behind transparent pixels is that Photoshop saves PNGs with pre-multiplied alpha, and that really messes them up if not handled properly.

Luckily, we can fix this by 'un-multiplying' in the fragment shader quite easily.
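As a rough sketch, the fix is a single div instruction after sampling. The register assignments here (fs0 for the texture sampler, v0 for the UV varying) are assumptions; substitute whatever your own shader uses:

```actionscript
// Sketch: undo pre-multiplied alpha after sampling.
// fs0 (texture) and v0 (UVs) are assumed register choices.
fragmentShader =
    "tex ft0, v0, fs0 <2d,clamp,linear> \n" + // sample the pre-multiplied texel
    "div ft0.xyz, ft0.xyz, ft0.www \n" +      // 'un-multiply': rgb / alpha
    "mov oc, ft0";                            // output the corrected colour
```

One caveat: where alpha is exactly 0 the division is undefined, but those pixels are fully transparent anyway, so it rarely matters in practice.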

b) Navigate to ...\Common\Configuration\ActionScript 3.0 and create a new folder named "FP11beta2".

c) Place the playerglobal.swc file from step 1b into the "FP11beta2" folder you just made.

d) Place the XML file you downloaded in step 1d into ...\Common\Configuration\Players

e) If you used any name other than "FP11beta2" for step 2c, open the XML file and adjust this line:

3. Compiling and Debugging

a) Flash Player 11 Beta 2 should now appear as an option in the Publish Settings of your FLA.

b) Select that option, make sure Flash is set to export an HTML file as well, and build the project.

c) If it throws errors during the build process: double-check you've followed all instructions correctly so far.
If it builds but crashes immediately from a "VerifyError": you're fine, move on to the next step.
If it builds correctly and runs: you're either fine or magic, move on to the next step.

d) So now you've successfully built a SWF for Flash Player 11, but Flash Pro won't run it once you start mucking with Context3D, and you don't have a standalone player either. Instead, open the HTML file you just published in step 3b.

e) If it runs: you're fine, move on to the next step. Otherwise? *shrug*

f) We need to modify the HTML file so Flash can actually use your GPU for hardware acceleration.
First, turn off the HTML option in Publish Settings from now on so it never gets overwritten.
Now open the HTML file in any editor and set all 'wmode' parameters to the value 'direct'.
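For reference, the published HTML usually sets wmode in two places (a param tag for object embeds and an attribute on the embed tag); both need to read 'direct'. The filename here is just a placeholder:

```html
<param name="wmode" value="direct" />
<embed src="YourGame.swf" wmode="direct" ... />
```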

g) Before you go test it, boot up Vizzy. No special configuration or installation; just start it up.

h) When you run your SWF, Vizzy should automatically pick up any trace() calls and errors, and Context3D.driverInfo should be able to list your video card and the renderer being used (DirectX/OpenGL).
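A quick way to confirm you're hardware accelerated is to trace driverInfo once your Context3D is ready ('context' here is an assumed variable name for your Context3D instance):

```actionscript
// If the resulting string contains "Software", you're stuck in software
// rendering; otherwise it should name your GPU and the renderer in use.
trace("Driver: " + context.driverInfo);
```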

Congrats! You're done and ready to start making awesome stuff!

4. Additional Troubleshooting

If Vizzy isn't picking up any traces, make sure you installed the Content Debugger version of Flash Player in step 1a and not the release version.

If driverInfo says you're in software mode, check the HTML and make sure wmode is ALWAYS set to 'direct'.

If that doesn't help, make sure you aren't running any browser extensions that change the wmode when you load the page. HoverZoom on Chrome caused problems for me; it's likely not the only one.

If that doesn't help, try a different browser.

If that doesn't help, contact Adobe with your OS/Video Card/Browser versions so they can work on better compatibility.

If that doesn't help, throw a hissy-fit and run down the street naked.

Let's say you have a 256x256 texture but are viewing an object that's so far away it only takes up 16x16 pixels of your screen. That's pretty wasteful, so mip-mapping tells the renderer that it should automatically choose the most appropriate texture size. You can make all the sizes by hand, or do it automatically with this code:

/*Note: All code is by McFunkyPants (http://www.mcfunkypants.com)

Keep an eye out for his upcoming book on Stage3D game programming, too!*/
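The code block itself didn't survive here, but a minimal sketch of automatic mip-map generation (assuming a square, power-of-two 'sourceBmd' BitmapData and a 'texture' created at the matching size) might look like:

```actionscript
import flash.display.BitmapData;
import flash.geom.Matrix;

// Upload progressively halved copies of the bitmap as mip levels 0, 1, 2...
var level:int = 0;
var current:BitmapData = sourceBmd; // assumed: square, power-of-two source
while (current.width >= 1) {
    texture.uploadFromBitmapData(current, level++);
    if (current.width == 1) break;
    var half:int = current.width / 2;
    var smaller:BitmapData = new BitmapData(half, half, current.transparent, 0);
    var m:Matrix = new Matrix();
    m.scale(0.5, 0.5);
    smaller.draw(current, m, null, null, null, true); // smoothed downscale
    current = smaller;
}
```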

This is a quick tutorial on how to set up Texture objects for rendering models in Stage3D. I'll cover how to:

1) create a Texture from some BitmapData
2) create a Texture from an image file in the library (Flash Pro)
3) tell the GPU which Texture to render with
4) sample the texture in your fragment shader
5) optimize your texture usage

//Note: 'repeat' allows you to tile textures on a surface while 'clamp' locks the texture to the 0-1 UV range
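For example, the same tex instruction tiles or edge-clamps depending on that flag (register names here are assumptions):

```actionscript
"tex ft0, v2, fs0 <2d,repeat,linear> \n" + // a UV of 2.5 wraps: the texture tiles
"tex ft1, v2, fs1 <2d,clamp,linear> \n"    // UVs outside 0-1 stick to the edge pixel
```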

#5. Optimization Tips:

Texture swapping is expensive; group models that share the same texture so you only need to swap once for the whole group.

If you have an animated character, put all their frames in the same texture sheet ('sprite sheet') and adjust the UV coordinates to point to the right section rather than using a bunch of separate image files.

If possible, combine textures for environments as well, or just try to use the same textures on as much as you can get away with (you can even combine textures in a BitmapData at load time!).
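Combining at load time can be as simple as a couple of copyPixels calls into a shared atlas; 'bmdA' and 'bmdB' are assumed to be already-loaded 256x256 BitmapData objects:

```actionscript
import flash.display.BitmapData;
import flash.geom.Point;

// Pack two 256x256 images side by side into one 512x256 atlas,
// then adjust each model's UVs to point at its half.
var atlas:BitmapData = new BitmapData(512, 256, true, 0);
atlas.copyPixels(bmdA, bmdA.rect, new Point(0, 0));
atlas.copyPixels(bmdB, bmdB.rect, new Point(256, 0));
```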

Always make sure your texture sizes are powers of two (e.g. 256x256). GPUs are designed to take advantage of these dimensions for extra performance.

Keep your texture size as low as possible! Painting textures at high resolution is a good idea, but always scale them down as far as you can stand before exporting them from Photoshop (or whatever you use). You'd be surprised how small you can get things before losing detail!

Don't waste rendering power; use different fragment shaders for textures that are fully opaque and textures that have transparency. You can even use BitmapData.transparent (boolean) to check for alpha transparency data dynamically!

Use 'repeat' in fragment shader texture sampling to save on performance (render one model with a tiling texture vs. multiple models with the same texture) and load time/file size (one small texture repeated vs. a large texture).

I've been meaning to update the old Ambient Lighting tutorial for a while now as I found that the effect it accomplished was not exactly what I intended. It could still be useful in its own right, but let's look at how to do ambient lighting the way you would expect it to work.

And while we're at it, let's throw in a three-point lighting setup, because I'm lazy and that's what's in the copy-pasted shader I'm grabbing from one of my projects. Besides, having multiple lights is really useful!

Much of this tutorial is identical to the last one, with some additional constants (multiple lights!) and a new fragment shader. Don't be daunted by the length; much of it is just copy/paste and tweaking values.

/* Now the real work: going through each light (including ambient) and adding all of their effects together.
Note the reuse of temporary registers (especially ft1) in this shader. Why are we doing this?
Temporary registers are limited in number so if you don't reuse them you will run out! */
fragmentShader =
"nrm ft0.xyz, v1 \n"+ //normalize the fragment normal (v1)
"mov ft0.w, fc0.w \n"+ //set the w component to 0
"tex ft2, v2, fs0 <2d,clamp,linear> \n"+ //sample texture (fs0) using uv coordinates (v2)

Update July 14th: Fixed missing part of the fragment shader and added a note about binding a texture

It's been a while since my last Molehill tutorial. Working on five games at once doesn't leave much time for writing shaders and blogging. Still, I've snuck in some time to play with it and thought I would share my new lighting setup with everyone.

/* Set up our normal transformation constant. This is an improvement over how I did it in the previous lighting tutorial */
var invmat:Matrix3D = mod.clone();
invmat.transpose();
invmat.invert();
context.setProgramConstantsFromMatrix("vertex", 4, invmat, true); //vc4

It's really easy to add more lights, too: just add more fragment constants for direction and colour, then repeat those middle four steps of the fragment shader for each light and add them all together at the end (note: not the best solution ever, but it works well enough for now).
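Registering an extra light's constants might look like this; the register numbers (fc5/fc6) and the direction/colour values are placeholders, so use whatever registers are free in your shader:

```actionscript
// Hypothetical second light: direction in fc5, colour in fc6.
context.setProgramConstantsFromVector("fragment", 5,
    Vector.<Number>([0.5, -1, 0, 1]));   // light 2 direction
context.setProgramConstantsFromVector("fragment", 6,
    Vector.<Number>([1, 0.8, 0.6, 1]));  // light 2 colour (warm white)
```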

This tutorial is deprecated. I've left it up for reference, but I highly recommend checking out the more up-to-date tutorials.

A previous tutorial outlined how to do vertex lighting in your vertex and fragment shaders. It had one major flaw: the normals for each vertex had to be transformed in AS3 every frame if the object they were part of was rotating.

The GPU is much faster at doing that sort of calculation than AS3, so let's offload the math to the GPU! Luckily, we don't need to add very much.
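The gist is a single m33 instruction in the vertex shader. In this sketch, vc4 is assumed to hold the inverse-transposed model matrix and va1 the per-vertex normal; adjust both to your own register layout:

```actionscript
vertexShader +=
    "m33 vt0.xyz, va1, vc4 \n" + // rotate the normal on the GPU
    "nrm vt0.xyz, vt0.xyz \n" +  // renormalize after the transform
    "mov vt0.w, va1.w \n" +      // carry the w component through
    "mov v1, vt0 \n";            // pass it to the fragment shader as v1
```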

I've written several tutorials on how to achieve some basic effects in your Flash Player 11 shaders using the Adobe Graphics Assembly Language (AGAL). I thought it would be a good idea to put up a quick reference guide for what the various niggly little registers and things actually are.

Consider it AGAL 101:

In vertex shaders:
va[x] = values from a vertex buffer
set with context.setVertexBufferAt(x, buffer, ...);
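For instance, a position stream of three floats per vertex might be bound to va0 like this ('positions' is an assumed VertexBuffer3D):

```actionscript
// Bind 3 floats per vertex, starting at buffer offset 0, to register va0.
context.setVertexBufferAt(0, positions, 0,
    Context3DVertexBufferFormat.FLOAT_3);
```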

This tutorial is deprecated. I've left it up for reference, but I highly recommend checking out the more up-to-date tutorials.

Continuing on with my series of Molehill experiments/tutorials, we come to vertex lighting (+the distance fog from before).

At this point I've started building up my code as an actual engine, so it would be confusing to copy and paste whole sections of code with engine function calls and everything. What I will do is just try to focus on the most important snippets and trust you'll know where to plug them in.

/* set up a directional light vector
In this case it's directly down the -Z axis into the screen */
context.setProgramConstantsFromVector("fragment", 1,
Vector.<Number>([0, 0, -1, 1])); //fc1

/* set up an inverting vector or else our light direction will be backwards */
context.setProgramConstantsFromVector("fragment", 2,
Vector.<Number>([-1, -1, -1, 1])); //fc2

For this example, I:
-calculate the vertex normals when creating the cube
-then calculate the transformed vertex normals each frame based on the cube's rotation in AS3
-and pass those transformed normals into the vertex buffer.
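Step two of that list can be done with Matrix3D.deltaTransformVector, which applies rotation/scale but ignores translation ('rotation' and 'normal' are assumed variables here):

```actionscript
import flash.geom.Matrix3D;
import flash.geom.Vector3D;

// Rotate the stored normal by the cube's current rotation each frame.
var n:Vector3D = rotation.deltaTransformVector(normal);
n.normalize(); // keep it unit length for the lighting dot product
```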

My next step is to figure out how to get that transforming done on the GPU instead as it would likely be faster.

I'm starting to get the hang of Adobe Graphics Assembly Language (AGAL) a bit now. I've got a rough sense of what all the registers are and how to use them. As a result I have a massively improved distance fog shader which works per-vertex and doesn't care what your camera orientation is (as it uses the final vertex depth after being transformed through your view matrix).

This tutorial is deprecated. I've left it up for reference, but I highly recommend checking out the more up-to-date tutorials.

There isn't much information out there right now on the Molehill API and how to program for it, so unless you work on a 3D engine framework like Away3D you are really flying blind right now. I thought I would throw my meager experiments up here in the hopes that someone can run with them and do much cooler stuff (and hopefully share code of their own!)

Per-Polygon Distance Fog

Disclaimer: I'm making this shit up as I go. I am far from being an experienced 3D programmer and don't really know what I am doing.
I'm not going to go over how to set up Flash CS5 or Flex to compile Flash Player 11 builds, because there are a number of tutorials on that already. I'm going to assume you've been able to get something like this great starter source code by Ryan Speets running.

/* Setting up camera/model matrices is for another tutorial.
I'm assuming you have a model matrix 'mat' ready to go */
context.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, mat, true);

/* Now for each polygon you are rendering,
figure out a fog intensity value based on its depth.
To keep things simple I'm just using the Z coordinate of each object.
This assumes you are looking straight down the Z axis.
If you had a rotating camera, you would need to do some more fancy math */
private const FARPLANE:Number = 10000;
var depth:Number = 1+(obj.position.z / FARPLANE);

/* Now we set up the fog fragment constant 1 (fc1) vector structure is (r,g,b,a)
In this case I'm going to use a fog that ADDS RED as objects get farther away. */
context.setProgramConstantsFromVector(Context3DProgramType.FRAGMENT,
1, Vector.<Number>([ depth, 1, 1, 1 ]));