I want to develop a game with an 8-bit feel. Since this game is mostly for my personal enjoyment, I've set a requirement that it look as close as possible to the 8-bit systems of yore, as displayed on a lovely CRT TV.

This is also my first Cocos2d project, so I don't know which approach is better. I'd like the community's general advice on how to render my game. I've come up with two possible approaches; please advise which is better.

Approach #1

Create images for backgrounds and sprites at the "8-bit" resolution of my game. After I'm done making an image, I scale it up and save two variants: the standard-sized one and the "@2x" variant.

When I render the game, everything looks very 8-bit, but I still need to apply math to every animation so that my virtual pixels always shift by multiples of 4 screen pixels (or 8 on Retina displays) when they move.
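To illustrate the kind of bookkeeping this approach requires, here is a minimal Python sketch (the 4x scale factor and function names are assumptions for illustration, not part of any Cocos2d API): every screen-space position must be quantized to a multiple of the scale factor so sprites only ever land on virtual-pixel boundaries.

```python
SCALE = 4  # screen pixels per virtual pixel (8 on Retina); assumed value

def snap_to_virtual_grid(screen_pos):
    """Quantize a screen-space coordinate down to the nearest
    virtual-pixel boundary."""
    return (screen_pos // SCALE) * SCALE

def move_sprite(virtual_x, dx):
    """Do movement math in virtual pixels, then convert to screen
    pixels; the result is always a multiple of SCALE."""
    virtual_x += dx
    return virtual_x * SCALE
```

Every animation, scroll offset, and particle position would need to pass through math like this, which is exactly the tedium the question anticipates.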

This approach doesn't sound difficult but it seems like it could become a bit annoying and tedious.

Approach #2

Generate all content in my image editor at the exact resolution of my virtual screen. When I build my scenes in Cocos2d, I build them all at the "virtual resolution". When the scene is displayed on the iOS screen, I scale the entire scene up to the resolution of the device.

I think this could be the simplest approach, since all of my sprite and background movement math can just behave as normal. However, I'm not sure that this approach is even possible. Furthermore, I know that within iOS, graphics scaling applies an anti-aliasing effect to graphical objects as they scale up. Obviously, I'd want to turn this effect off and just use a "nearest neighbor" algorithm to scale up my scene.

If I can't disable the scaling anti-aliasing, I don't want to use this method. However, if I can set up such a rendering system, I think this approach would be best.
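For what it's worth, nearest-neighbor upscaling by an integer factor is just pixel replication, which a short Python sketch (pure lists, no libraries; the function name is my own) makes concrete:

```python
def upscale_nearest(pixels, factor):
    """Nearest-neighbor upscale of a 2D grid of pixel values by an
    integer factor: each source pixel becomes a factor x factor block."""
    out = []
    for row in pixels:
        # Replicate each pixel horizontally...
        expanded_row = [p for p in row for _ in range(factor)]
        # ...then replicate the whole row vertically.
        out.extend([expanded_row[:] for _ in range(factor)])
    return out

# A 2x2 "virtual screen" upscaled 2x becomes 4x4 with hard pixel edges.
virtual = [[1, 2],
           [3, 4]]
screen = upscale_nearest(virtual, 2)
# screen == [[1, 1, 2, 2],
#            [1, 1, 2, 2],
#            [3, 3, 4, 4],
#            [3, 3, 4, 4]]
```

In cocos2d-iphone itself, the usual route (if I recall the API correctly; verify against the version you're using) is to render the scene into a CCRenderTexture at the virtual resolution and call setAliasTexParameters on its texture, which switches sampling to GL_NEAREST and avoids the anti-aliased blur.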

So, my question is simple. Can I use approach #2 and, if so, which approach is really easier to work with in Cocos2d and iOS?

2 Answers

Use Approach #2 (render to a small offscreen texture, then display that texture onscreen scaled up using nearest-neighbor texture sampling), because:

The asset workflow is simpler. You can edit individual pixels, without having to quadruplicate them, or manually scale everything up.

The assets will probably take up less space on disk. This is particularly important when distributing mobile apps.

Your offscreen rendering code will have fewer pixels to push, which will probably improve performance and/or consume less battery. Rendering a game screen typically involves overwriting a lot of pixels (drawing the background first, then drawing sprites on top of it). By rendering to a smaller texture offscreen, you overwrite far fewer pixels than you would when rendering at the larger pixel scale. Rendering the game screen is also typically fairly expensive per pixel, since you'll probably be using complex shaders, whereas upscaling a single texture with nearest-neighbor filtering is very cheap on modern OpenGL hardware.
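To put rough numbers on the "fewer pixels to push" point, assuming a hypothetical 256x192 virtual resolution (the exact figure depends on the game) against a Retina iPad's 2048x1536 screen:

```python
# Hypothetical virtual resolution vs. a Retina iPad screen.
virtual_px = 256 * 192    # 49,152 pixels written per offscreen frame
retina_px = 2048 * 1536   # 3,145,728 pixels at native resolution
ratio = retina_px / virtual_px  # 64x fewer pixels touched offscreen
```

The expensive per-pixel shading happens on the small buffer; the final upscale is a single textured quad, so the full-resolution pass is nearly free.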

How is pushing pixels to an offscreen buffer better? You still have to display a full-size image in the end, and interpolating textures is costly. It would be better to scale the images once they're loaded and use the scaled copies with regular drawing. This is all unnecessary anyway, since it's far easier to just scale your images offline, either with tools (most graphics editors have this built in) or with programs that support upscaled pixel-art creation, like GrafX2.
– dreta Jan 26 '13 at 16:26

@smokris That's a good point, the answer would probably benefit if you included that in it.
– dreta Jan 28 '13 at 15:03

Would scaling really affect the timing of my application all that much if the scaled images aren't being anti-aliased? I would think that a "nearest neighbor" implementation is simply a large copy of pixel values from one array to another. No math = fast implementation. Correct?
– RLH Jan 25 '13 at 14:37

If I understand him right, the scale would be determined only upon first run of the game, when the hardware capabilities can be assessed. I don't know anything about Cocos2d, but I'd imagine it can scale stuff, and a nearest-neighbour texture upsample shouldn't be very expensive.
– melak47 Jan 25 '13 at 23:35