It only took me about 3 days to get my framework working with all my targets: Windows, Mac, Linux, iOS and Android. I took an extra day to write my own simple mixer for WAV and OGG audio. And there might have been another day in there for other cleanup / fixes. So maybe 1 week total.

You can get SDL2 here. If you’ve used SDL before, you’ll find that it’s pretty similar, only it seems to work better, and now it’s got slick iOS and Android support.

You will want to follow the SDL 1.2 to 2.0 Migration Guide. And then for Android and iOS, read the README-android.txt and README-ios.txt included in the source zip. I'm just writing up some tips here to help you along the way. I'm writing this with OpenGL ES 1.1 in mind.

SDL2 for Windows, Mac, Linux

– Initializing your window must be done in the right order. Basically, I call SDL_CreateWindow first, then SDL_GL_CreateContext. If SDL_CreateWindow fails with my preferred settings, I choose more fail-safe settings. After setting the mode I use SDL_GetWindowSize and SDL_GetWindowFlags to see if I got what I wanted. If things aren't quite right, I use SDL_SetWindowFullscreen and SDL_SetWindowSize to request them again. (I always go out of fullscreen, set the size, maybe go into fullscreen, then set the size again.)

– As of SDL 2.0.1, Mac Retina displays do not work consistently. There is a flag for this, SDL_WINDOW_ALLOW_HIGHDPI, but I've found that if you ever change the display mode, things seem to fall apart. If you aren't messing around much, it might work well enough for you.

– glu doesn't seem to work anymore, so I had to switch over to using glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE); to generate my mipmaps. I check glGetString(GL_VERSION) to make sure it's >= 1.4 before doing that. And if I'm not on a new enough GL, I just don't use mipmaps at all.

– Use stb_image to load your images. (I didn’t do that just for SDL2, but it’s a great tip!)
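The GL version check above can be sketched as a little string parse. The helper name here is my own, not part of GL or SDL:

```cpp
#include <cstdio>

// Hypothetical helper: glGetString(GL_VERSION) returns something like
// "1.4.0 NVIDIA 96.43.01"; parse the leading "major.minor" and compare.
bool gl_version_at_least(const char *version, int want_major, int want_minor) {
    int major = 0, minor = 0;
    if (!version || std::sscanf(version, "%d.%d", &major, &minor) != 2)
        return false; // unparseable: play it safe and skip mipmaps
    return major > want_major ||
           (major == want_major && minor >= want_minor);
}
```

Then before enabling GL_GENERATE_MIPMAP you'd check something like gl_version_at_least((const char *)glGetString(GL_VERSION), 1, 4).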

SDL2 for Android

I can’t believe how smoothly this went. (Recall, 2 months for NDK, 1 week for Marmalade .. this port took 1 DAY!!!)

– Follow README.android for the details.

– In AndroidManifest.xml, in the "activity" tag, add android:configChanges="orientation"; this keeps your app from crashing when the device rotates. You may use other orientation attributes to lock the app to a single orientation, or whatever.

– If you use sockets, be sure to add the "android.permission.INTERNET" permission.

– Use SDL_RWops to read your own data files (which you have placed in the "assets" folder of your Android project).

– If you want to use SDL2_mixer, you may need to edit SDL2_mixer/Android.mk and disable a few things.

– If you are using single touch, your mouse SDL code might work already; otherwise, add support for the SDL_FINGER* events (and filter the touch-generated events out of your mouse code with something like: if (e.motion.which == SDL_TOUCH_MOUSEID) { break; }).

– Be sure to call SDL_SetTextInputRect before SDL_StartTextInput if you are using key input. SDL_SetTextInputRect lets you specify where on the screen the text appears so that SDL2 can shift the screen to keep the virtual keyboard from overlapping it.

– On suspend / resume I had to pause my audio.

SDL2 for iOS

– Follow README.ios for the details.

– SDL_SetTextInputRect doesn’t work. So you’ll need to capture UIKeyboardWillShowNotification and shift your screen to keep the virtual keyboard from overlapping it on your own.

– Unlike in Android, the orientation won’t change between portrait and landscape UNLESS you add SDL_WINDOW_RESIZABLE to your SDL_CreateWindow flags.

– Even though I added an SDL_iPhoneSetAnimationCallback, I got crashes when suspending my app. I also had to use SDL_SetEventFilter to capture SDL_APP_WILLENTERBACKGROUND and SDL_APP_DIDENTERFOREGROUND, setting a flag that tells my callback to start / stop doing its thing.

Custom Mixer?

You might be able to use SDL2_mixer on all platforms. I did try it out for Android, and got it working pretty easily. However, I decided to write up my own mixer using SDL2 to stream the output. (This ended up only taking a few hours, and it keeps me from having to have SDL2_mixer as an additional dependency on all platforms.)

I used SDL_LoadWAV to load wav files and stb_vorbis to load and stream ogg files.
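The heart of a simple software mixer is just summing samples and clamping. Here's a minimal sketch of that idea; Voice and mix_into are invented names, and a real mixer also handles volume, looping, resampling, and streaming from stb_vorbis:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// One playing sound: a pointer into decoded 16-bit mono PCM data.
struct Voice {
    const int16_t *samples;
    size_t length;
    size_t pos = 0; // current playback position
};

// Sum every active voice into the output buffer, accumulating in 32 bits and
// clamping so loud overlapping sounds saturate instead of wrapping around.
void mix_into(int16_t *out, size_t frames, std::vector<Voice> &voices) {
    for (size_t i = 0; i < frames; ++i) {
        int acc = 0;
        for (Voice &v : voices)
            if (v.pos < v.length)
                acc += v.samples[v.pos++];
        out[i] = (int16_t)std::clamp(acc, -32768, 32767);
    }
}
```

With SDL2, something along these lines would run inside the audio callback you pass to SDL_OpenAudio, filling the stream buffer each time it's called.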

Good luck!
-Phil

Posted in C++, development | Comments Off on SDL2 Tips, Tricks, and Workarounds

Through a bit of Googling and whatnot, I got in touch with Joe Delgado who was able to provide a patch to fix ENet to work with Marmalade. These are modifications to unix.c. When I am making a Marmalade build of my games, I always have PLATFORM_MARMALADE defined. So you can replace that with whatever you do.

After the #ifdef __APPLE__ segment, add this to set up the correct defines, etc.

And that seems to get ENet working 100% with Marmalade. I applied this patch to ENet 1.3.6 … The original patch was written for ENet 1.3.3, and I suspect it would work for the latest version of ENet. I built this with Marmalade 6.3.2. I’ve passed on a request to the Marmalade team to fix sendmsg so that the larger chunk of the code would no longer be necessary.

So – I went Universal with Dynamite Jack just today! Yay! This involved a lot of "blah blah" messing with resizing all the menus for iPhone users, which wasn't very interesting, though it came out really well. The interesting bit was when I realized that "going Universal" meant that my retina iPad assets were going to be put on everyone's phones. And that meant that the 50 MB OTA limit was going to hit me. Quite a few devs advised me against going over the limit, so I heeded their warning.

I was at 62 MB. I had to take the size down AT LEAST 12 MB to hit the 50 MB limit. However, I also know that iTunes Connect pads things a bit when distributing your game, so leaving another 2 MB of margin is a good idea. So my target was 48 MB.

Okay, so I'm at a good place: I know exactly what is eating my space. But now what to do? Let's check it out!

Video

So I decided not to do anything about the video. I include two IVF files, one for the retina iPad and one for everything else. They are as lossy as I want them to be, so further altering would start to eat up the quality. I could probably get away with re-encoding them a few % lighter and save 1 or 2 MB if I really had to.

Sound Effects

Stereo 16-bit WAV files sound great, but they are huge. I took some advice and converted them to IMA4 files, which are 25% the size of WAV files. This saved me about 5 MB. However, when testing the game on my decent computer speakers, I found that the IMA4 format adds a lot of noise to the sound; that was not acceptable to me. I decided to re-encode them with the AAC encoder at 96k instead, which resulted in perfect-sounding sound effects and saved me 6 MB!

iMac$ afconvert -d aac -f caff -b 98304 in.wav -o out.caf

I am using CocosDenshion for my audio engine, and it seems to handle this just fine.

Music

My music was already encoded as AAC files at 128k. I tried a variety of lower bit rates to see if I could save a few bytes. I found that at 64k it was really obvious that I was cutting corners. At 80k I couldn't tell any difference, so I decided to go with 96k, since that would give me a bit of a margin just to be sure the music sounded perfect. I used the same command line as for converting the sound effects. Going from 128k to 96k saved me 5 MB!

So far I had saved 11MB, but I was 14MB over and I knew I needed to trim a bit more fat to make this work.

Images

Previously I had tried a ton of variations on 16-bit “4444” dithered style images. These, unfortunately, looked horrible in Dynamite Jack. So I wasn’t able to use that trick.

I did find out about ImageOptim which takes forever to pack PNGs but it did manage to pull me back 2MB getting me down to 13MB total saved, which was really going to cut things close. I decided to investigate one other option.

I had heard that Amazing Breaker had used JPGs for the RGB component of images and a PNG file for the alpha component. I really require high-quality images in my game, so I found that at 98% quality and 1×1 sampling I was able to get really great looking images.

I pre-blitted them (which gave me premultiplication) onto a black background to get the JPG. Then I made a grayscale PNG file of the alpha channel. I created my own mini format “.cuz” to combine these into single files and loaded them in my game. I found that the game looked perfect! This saved me 6MB!
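The pre-blit is simple math: compositing a pixel over black leaves rgb * alpha, which is exactly the premultiplied color that goes into the JPG (with the alpha channel split off into the grayscale PNG). A sketch, assuming 8-bit RGBA and an invented function name:

```cpp
#include <cstdint>

// Blit each pixel onto a black background: out_rgb = rgb * alpha / 255.
// The alpha byte is left untouched so it can be saved separately as a PNG.
void premultiply_onto_black(uint8_t *rgba, int pixel_count) {
    for (int i = 0; i < pixel_count; ++i) {
        uint8_t a = rgba[i * 4 + 3];
        for (int c = 0; c < 3; ++c)
            rgba[i * 4 + c] = (uint8_t)(rgba[i * 4 + c] * a / 255);
    }
}
```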

Afterwards, I found that some of my pre-baked font images got larger using my format, so I left those as straight PNGs.

Finally …

So all said and done, I had saved 6 + 5 + 6 = 17 MB! This got my IPA down to around 46 MB, which is a nice distance below the 50 MB limit. I'm quite pleased with the results. Some bonus tips:

– The AAC sound effect trick will only work on iOS 3.0 and higher, which shouldn't be a problem nowadays. I hear that this won't work out-of-the-box with OpenAL, so maybe check out CocosDenshion.

– Definitely check your own music to find what bit rate starts to degrade the sounds. Playing on your iPhone or iPad speaker isn’t enough. Playing on earphones isn’t either (unless they are really nice). I recommend playing on your computer speakers so you can be sure the sound IS really good before deciding.

– The JPG+PNG image trick can get great results – but definitely keep the quality high. I was able to go down to 98% without any visible artifacts in my game. Be sure to experiment and find the sweet spot for your game images. Also be sure to test on all the device resolutions you support to check for artifacts. I tested on all 4 iOS screen resolutions to be sure things were perfect.

Anyway, I hope you find this helpful in your quest for saving bytes! And don’t compromise on quality! Nobody wants to hear or see compression artifacts.

-Phil

P.S. If you need to cut your App in half after that, here’s what you could do:

– Change all SFX+music from stereo to mono
– Set JPG quality to 95% or something even less

That would probably cut mine back another 15MB or so, and very few people would notice. I would notice a tiny bit, and in my case, I don’t need to compromise any more, since I’m already under 50MB.

Bonus: a bit more detail on my image file format

First, I used a python script to figure out the best way to compress each image. It does a variety of conversions and keeps whichever result is the smallest.

So, for example, fonts ended up working best as “original PNGs” and most everything else ended up being JPG-RGB + PNG-A. There should be a 4th option of just JPG-RGB with no alpha, but I didn’t bother, since I don’t have any fully opaque textures.

So in October of 2011, Ludum Dare hosted a second October Challenge. I had so much fun the previous year, despite canceling my game, that I decided to give it another go. I was really attached to the idea I had been approaching with Stealth Target, so I wanted to give it another try. Since I realized the aesthetics and UI were the biggest problems, I decided to take the game back to "Glorious 2-D" and use the aesthetic of my earlier Ludum Dare game Anathema Mines as the starting point for this game.

Here are cut-down versions of the blog posts I made during the October Challenge 2011. Additional commentary is included below the quotes.

I’m doing brute-force ray casting here and it works great. It’s really nice to be targeting the desktop using C, so I can do stuff like that. (The older LD version was in python so I had to code it smart, and if I were targeting mobile I’d have to be more optimized.) Anyway, my goal is to have this game selling on the Mac App Store before the end of the month for a few bucks.

TECH: I’ve done a fair bit of optimization here, but really, the main gist is that I raycast from the center of the light until I hit something. I have a few optimizations and whatnot that help make this faster, but nothing super clever. A win for the component object system was that I’m able to change the size of the shadows each object has, which helps for the fine tuning of the look. If you look carefully you can see the size of the player’s shadow get larger when he dies and falls down.
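Brute-force raycasting really can be this simple: step from the light's center along each ray until you run into a solid tile. This sketch is my own illustration of the idea (tiny grid, fixed step size), not the game's actual code:

```cpp
#include <cmath>

const int W = 8, H = 8; // tiny illustrative map size

// Step along the ray in small increments until we hit a solid tile (or run
// off the map), returning the distance travelled. A real implementation
// would use a smarter grid traversal, but brute force works fine.
float cast_ray(const int grid[H][W], float x, float y,
               float angle, float max_dist) {
    const float step = 0.05f;
    float dx = std::cos(angle) * step;
    float dy = std::sin(angle) * step;
    for (float d = 0.0f; d < max_dist; d += step) {
        int tx = (int)x, ty = (int)y;
        if (tx < 0 || ty < 0 || tx >= W || ty >= H || grid[ty][tx])
            return d; // hit a wall or left the map
        x += dx;
        y += dy;
    }
    return max_dist; // the light faded out before hitting anything
}
```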

BIZ: I changed my mind about the Mac App Store before the end of the month. I soon realized that this game was coming out really good and that it was going to be worth taking the extra time to really polish it up before releasing it for sale.

I re-did my lighting systems in the game so now I can have various colored lights and I can add ambient light to corners of the caves.

TECH: Each tile on the map is given an RGBA "lighting" component. Each frame I color where light falls on the map, and then I blur the coloring of the map. Then I draw the flooring and tiles using the lighting values. I use a different averaged color for each corner so that the shading is nice and smooth. When the player walks you can see the lighting jump ahead by tiles; it's a technical shortcoming, but it "feels okay" because it feels like the light is flickering a little.
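The blur pass can be sketched as a simple neighbor average over the tile grid. One channel is shown here (the game stores RGBA per tile), and this is my guess at the idea rather than the actual code:

```cpp
const int MW = 4, MH = 4; // tiny illustrative map size

// One blur pass: each tile becomes the average of itself and its
// orthogonal neighbors, smearing light into adjacent tiles.
void blur_lighting(float light[MH][MW]) {
    float out[MH][MW];
    for (int y = 0; y < MH; ++y)
        for (int x = 0; x < MW; ++x) {
            float sum = light[y][x];
            int n = 1;
            if (x > 0)      { sum += light[y][x - 1]; ++n; }
            if (x < MW - 1) { sum += light[y][x + 1]; ++n; }
            if (y > 0)      { sum += light[y - 1][x]; ++n; }
            if (y < MH - 1) { sum += light[y + 1][x]; ++n; }
            out[y][x] = sum / n;
        }
    for (int y = 0; y < MH; ++y)
        for (int x = 0; x < MW; ++x)
            light[y][x] = out[y][x];
}
```

Drawing each tile with a color averaged from the tiles around each corner then smooths the result further.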

Some new goodies today. Well, the explosions I've had for a while, but I just added in the technology that you have to destroy in order to defeat the evil overlords or whatever. The technology is RED: that's how you know it's EVIL technology.

DESIGN: If you remember back to Dynamite, the core game mechanic was exploding the load-bearing pillars in the game so that the building would collapse. I decided that collapsing the cave like that didn't make much sense, and that glowing alien technology would just look way cooler. I had to come up with a way for blowing up the tech to have a purpose, so requiring the user to explode all the tech of a single color to unlock some doors seemed like a straightforward design choice.

DESIGN: You can see the black “pit” below the explosion. In the prototype of the game, the explosions actually created holes in the floor that were impassable. I decided I wanted my game to never back the player into a corner, so I now have the explosions only break down walls and give the player more area to move in, instead of less.

So, here’s my level editor thing. Right now I’m trying to figure out how to set up the level entrances / exits / pathways throughout the level. Sort of some kind of cryptic code system. I’m not sure how complicated I want it to be. Depends on if I will have the level editing open to the general public or not.

DESIGN: I was thinking about some really bad ideas at that point …

That said, I think I want it to be editable by normal people. So I think I’ll probably pass on using those weird codes. But at least now I have those cool hex icons for no reason.

DESIGN: I quickly came to the conclusion that if the editor was going to be too hard for a “normal person” to use, I would also eventually get sick of it. So I made sure to only include things in the editor that I felt everyone could use, not just myself. This really helped me when creating the levels for the game. Since I’m not hugely into creating levels, having a super easy to use editor was what made it possible for me to create the 28 levels for the game.

UPDATE: Using my cool-sauce edge generation script, with just a few minutes of graphics work I can get a totally different look for my game. This is going to be super helpful in giving my low-budget game the appearance that it has art in it (maybe).

TECH: This is the one place that I really used some fun python code. I created these interesting mini drawings of the walls in the GIMP; one of the ones I use in the final game looks like this:

TECH: I then use a python script that takes sub-sections of that image and faces them in all different directions to generate the 200+ possible wall tiles for that style of wall. It took a fair bit of messing around to get this to work perfectly, and in fact the "red technology" has two separate layers to give it the look it has. I also save alpha data about each of these 200 sub-tiles, which I use for the light ray-casting collision detection. I use the same data for plain collision detection as well.

Here’s my gameplay demo video. I’m attempting to “monetize” the game as of Oct 31st, so I’ll report back on how well that goes.

BIZ: I didn’t report back, but I will now. I sent the video to Valve along with some of what I was planning. They were interested! Had they said no, I would not have spent more time working on the game. This was my way of attempting to “fail early” on this project by seeing if the game looked good enough to have mass market appeal.

DESIGN: You can see how the guards reacted to seeing your flashlight in the distance in this video. I changed this later on in development as it made the game too hard. Also the other “scientist” characters had that ability, so I decided it would give the game more variety if they behaved differently. You can also see how the guards turn around counter-clockwise in this video. This was somewhat random at one point, but now they always turn clockwise when going between two points. This makes tracking their paths much easier when playing.

It’s been a great month working on this. The game is coming along super-well, I imagine it’ll actually be released publicly in about a month now.

BIZ: I obviously have some rather poor time estimation skills. It is now six months later and the game is finally coming out this week! The amount of work and polish that went into this game was way beyond what I imagined, but it's been totally worth it! I'm super pleased with how this game came together.

The game is coming out on Thursday, May 10th! Be sure to check it out then!

-Phil

P.S. The prototype was named "Anathema Mines". I almost named the final game "Escape from Anathema Mines", but enough people couldn't pronounce or remember the name that I decided to change it. A TON of ideas were thrown around, but eventually Dynamite Jack stuck.

Hey there… This happens to me sometimes. I want to be able to turn on a debug mode or a tool that will crash my code during runtime. I can use Xcode, MSVC 2008, or GCC under Linux. Anything. Though Xcode is preferred. This must work with C++, not just C code.

I’m gonna dig into some of the tech behind Dynamite Jack, so hold onto your seats.

So, a lot of devs are into Component Object game design. That article is by a friend of mine, who gives a good overview of the model. Anyway, you could probably spend all day reading the resources he links to. The short version (as I see it): ideal Component Object game design is like a Normalized Database. Basically, it's kind of a chore to get it working, but it's super flexible and powerful.

I thought that was interesting. So I went to a talk at 360iDev by Gareth Jenkins. He demonstrated how to do component architecture with real concrete examples, which really helped me get my head around the idea. But the most profound bit of the talk was the end, where I remember him saying something like:

“It doesn’t matter HOW you do this, as long as you DO IT.”

That was super freeing for me. The idea that it didn’t matter how hacky I did it, but it would still give me all the benefits was very helpful. So here’s how I do the component object model in Dynamite Jack:

Now, you'll see the cool bit is that I have access to all the data without allocating anything, so no free / delete is needed. The simple bit is that activating a feature is just a matter of setting "has_light" on or off. That's how I actually turn the player's flashlight on and off in the game.

In the game code, I just loop through all the items and dispatch to various functions for each component type.
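To make that concrete, here's a minimal sketch of this style of component object model: one big struct, a boolean per component, a fixed-size pool, and an explicit dispatch loop. All the names are invented for illustration; the real Dynamite Jack code certainly differs:

```cpp
// One struct holds the data for every possible component; a flag per
// component says whether this entity "has" it.
struct Entity {
    bool active = false;
    float x = 0, y = 0;
    bool has_light = false;  // toggling this is how a flashlight turns on/off
    bool has_action = false; // a pure timer component: fires a future event
    float action_timer = 0;
};

// Fixed pool: no new/delete, and the whole game state is one flat block of
// memory, which makes "save to file" serialization trivial.
const int MAX_ENTITIES = 256;
Entity entities[MAX_ENTITIES];

// The per-frame loop dispatches each component in a fixed, visible order,
// so dependencies between components are easy to see and control.
void update_all(float dt) {
    for (Entity &e : entities) {
        if (!e.active) continue;
        if (e.has_action) {
            e.action_timer -= dt;
            if (e.action_timer <= 0)
                e.has_action = false; // the triggered event would run here
        }
        if (e.has_light) {
            // cast light rays from (e.x, e.y) ...
        }
    }
}
```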

And I have a similar function for handling events, and painting the screen. It made it really great for prototyping the game and changing features. The “cave trolls” in the game (see at the start of the trailer) had their AI replaced quite a few times, and being able to just hack new things on and change things around without breaking my other entities was really nice!

One of the other nice things about having it explicitly in code, as opposed to “registering” things magically, is that I can see exactly in my code what order the components are being run in, so if something depends on something else, I can be sure they are in the right order.

Dynamite Jack has 20 different "components", but they all live in the same big structure. Another cool one is the "action" component, which doesn't even exist in the game; it's just a timer to trigger some future event. With everything being an Entity, it just gets its own loop called and is able to do whatever.

So, yeah, my method gives me only a single class, as many components as I like, and the flexibility to mix and match stuff. The power to toggle components on and off. And a fixed amount of memory used, which makes "save to file" style serialization of the game state easy.

Anyway, that’s the tech for the day. I definitely recommend component systems, it makes it SO EASY to try new things with your game, and be able to tweak them until they work “just right”.

-Phil

P.S. If you're looking for the code on how to do this, it's all in this post. It really can be THAT SIMPLE.

Disclaimer: this tutorial covers how to render IVF / VP8 / libvpx video in an OpenGL libSDL / SDL window. IVF video only includes video, not audio. For game developers, it’s trivial to play audio via their own audio system. So you’ll have two files per movie “movie.ivf” and “movie.ogg” or whatever. As an exercise to the reader, you could easily jam both into a single file if you really wanted to.

The Problem We’re Trying To Solve

So you’re an indie game developer and you want to show a clip of video in your commercial cross-platform (PC/Mac/Linux/other?) game! Obviously you want a patent-free open source unrestricted license to do it.

Wait, can’t I go commercial?

Better than that, you could just use the built-in codecs on a platform! I’d suggest this if you are targeting a single platform, iPhone / iOS for example.

Otherwise, you’ll be using Bink, a commercial solution at $8500 / platform. I emailed about their “indie licenses” and never heard back.

The Open Source Options I didn’t like much

Here’s what we have for patent free open source codecs .. and their various problems.

Xiph Theora – Probably the best-known codec. To get it working you have to have libogg, libvorbis, and libtheora all built for your target platforms. To me, that seemed like a lot to ask. Also, the libtheora API is a MONSTER. playtheora is an SDL example (similar to this one) that covers up some of that ugliness, so I'd recommend checking it out if you want to use Theora.

Dirac / Schroedinger – the BBC funded codec. I couldn’t get this one to build. It doesn’t seem to be all that popular.

Motion Jpeg – This isn’t so much of a codec as an idea. Make your own movie file with a ton of .jpg’s in it. I tried this. The files get really huge really fast. I wouldn’t recommend this.

Motion JPEG 2000 – This implementation was also pretty confusing. I couldn’t find where to start. And, yeah, this isn’t all that popular either.

libvpx .. why I chose it

WebM / libvpx – Backed by Google, this is a new contender on the block. The thing that sold me was the sample encoder, which was pretty simple. It also depends on nothing. Also, building it on OS X and Linux was trivial. Also, they offer a pre-built Windows binary. Also, they just had their 1.0.0 release a few days ago.

So, yeah, having a supported, up and coming, easy to build codec was key to me.

How to encode for IVF / libvpx

Since it’s a new codec, not much supports it right now. I used a fresh build of ffmpeg under linux that I built with this configure command:

./configure --enable-encoders --enable-libvpx

Then I was able to use ffmpeg to encode ivf files pretty easily:

ffmpeg -i Untitled.mov -vcodec libvpx -b 1000k -s 1024x512 movie.ivf

Note: we’re not dealing with WebM files. WebM files are container files that also contain audio. Again, you’ll have to store your audio separately, or create your own container file, or figure out what WebM is on your own time.

So .. what’s the bottom line? Do we get any code?

Yes! I created a libSDL player that plays back the video at max speed possible and it converts the YUV data to RGB data and loads it as a texture. Here are the functions I provide:

void playvpx_init(Vpxdata *data, const char *_fname);

Just init your Vpxdata with a filename “movie.ivf” .. It’ll try and get libvpx up and running for you.

bool playvpx_loop(Vpxdata *data);

Call this once per frame to have it decode a frame of video. It will return false once it has run out of frames. If you want to mess with the libvpx YUV data yourself, it’s data->img. See the playvpx.cpp source or the libvpx example above to see what that structure provides. It’s pretty simple.

int playvpx_get_texture(Vpxdata *data);

Call this once per frame to have it convert the YUV data to RGB and upload the texture to OpenGL. It will return 0 on failure or an OpenGL texture ID on success. I convert on the CPU, so it's not super fast, but it should work fine on modern computers. If anyone cares to provide a shader version of this function, or a SIMD / MMX / SSE version .. well, that would be faster!

void playvpx_deinit(Vpxdata *data);

Call this function when you’re done to cleanup.
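For reference, the per-pixel YUV-to-RGB math in a CPU converter like the one playvpx_get_texture uses is typically the standard BT.601 fixed-point approximation. This is a generic version of that formula, not the actual playvpx code:

```cpp
#include <algorithm>
#include <cstdint>

// Convert one YUV pixel (video range: Y 16..235, U/V 16..240) to 8-bit RGB
// using the common fixed-point BT.601 approximation.
void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                uint8_t *r, uint8_t *g, uint8_t *b) {
    int c = (int)y - 16, d = (int)u - 128, e = (int)v - 128;
    *r = (uint8_t)std::clamp((298 * c + 409 * e + 128) >> 8, 0, 255);
    *g = (uint8_t)std::clamp((298 * c - 100 * d - 208 * e + 128) >> 8, 0, 255);
    *b = (uint8_t)std::clamp((298 * c + 516 * d + 128) >> 8, 0, 255);
}
```

A converter just runs this over the decoder's planar image (data->img), remembering that the U and V planes are usually half-resolution, then uploads the RGB buffer with glTexImage2D.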

Conclusion and Source Code

Okay, here’s playvpx for you to check out. It’s a C-style API, but I’m sure I use some minor C++ in there. Probably wouldn’t be hard to make it C-only if you require C for your project.

Oh, and I include the libvpx binary for Windows, OS X, and Linux. So you may not have to build it for any platforms!

The code is licensed under the libvpx BSD-style license. My code here is a gutted version of their sample_decoder.c, so .. that seems to make most sense to me.

Posted in C++, development | Comments Off on How to create and play IVF / VP8 / WebM / libvpx video in OpenGL

A few weeks ago I entered the Ludum Dare game development contest and whipped together a fun game about fending off invading blobs using a spray can.

I spent another week getting it polished up so it works on a ton of platforms. The game is “The Invasion of the Blobs” (iBLOBS for short). You can get it here. It’s available on iPhone/iPad, Android, PC, Linux, Mac, and pretty soon the Mac App Store.

The reason for the porting frenzy with this game is I’m working towards releasing an open source C++ toolkit for supporting all these platforms (and maybe a few more). This is my first game release with this kit. It uses code from all my recent games, but it finally puts that code into a clean and organized re-usable structure. This is going to be super helpful for reducing bugs and improving game code across each platform.

Anyway, I hope you enjoy iBLOBS. It's totally free, so you might as well give it a whirl. If you want to help out, please post a message here if there are any crashes or support issues on any platform; I want to get those ironed out as best I can.

Have fun!
-Phil

UPDATE: The Android port has been giving quite a few people trouble. If you are a dev with the Android dev kit, please do an “adb logcat” and post the results here, that would be a huge help!