I have used this function numerous times without problems, so this game must be using it incorrectly.

Even if the game is using the function incorrectly, I still wouldn't expect a native crash, since LWJGL is supposed to check ranges. glDrawArrays is a complicated function, though, so it probably isn't possible for LWJGL to validate everything. Debugging would be easier with the source -- doable without, but probably not worth the trouble.
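To make the range issue concrete, here is a sketch of the kind of bounds check a binding would need before a glDrawArrays(mode, first, count) call. The class and method names are hypothetical (LWJGL's real checks live in its glue code); the point is that if `count` claims more vertices than the client-side array actually holds, the driver reads past the end of the data and you get a native crash rather than a Java exception.

```java
import java.nio.FloatBuffer;

/**
 * Hypothetical sketch of a glDrawArrays bounds check.
 * A real binding would run something like this against the
 * currently bound vertex pointer before calling into the driver.
 */
public class DrawArraysCheck {

    /** Returns true if drawing vertices [first, first+count) would
     *  read past the end of the supplied vertex data. */
    public static boolean wouldOverrun(FloatBuffer vertices,
                                       int componentsPerVertex,
                                       int first, int count) {
        // glDrawArrays reads (first + count) vertices' worth of floats
        long floatsNeeded = (long) (first + count) * componentsPerVertex;
        return floatsNeeded > vertices.remaining();
    }

    public static void main(String[] args) {
        // 4 vertices, 2 floats each (x, y)
        FloatBuffer quad = FloatBuffer.allocate(8);
        quad.put(new float[] {0, 0,  1, 0,  1, 1,  0, 1}).flip();

        System.out.println(wouldOverrun(quad, 2, 0, 4)); // false: fits exactly
        System.out.println(wouldOverrun(quad, 2, 0, 5)); // true: native crash territory
    }
}
```

With interleaved or multi-attribute data the arithmetic gets hairier, which is presumably why not every case gets validated.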

I wouldn't chalk it up to deprecated GL paths -- since these testers haven't run into issues with vertex arrays before, it seems more likely that you are doing something wrong. Which means that when you go to refactor to VBOs/shaders, you may end up with the same error. Maybe post some of your rendering code?

Vertex arrays are deprecated but they should still work fine if you are not requesting core profile.

I know learning OpenGL is a good idea... but writing applications against OpenGL 3 is certainly not! Many people out there still have low-end or old hardware and are restricted to OpenGL 2.1 or even lower.

So to get the most people playing your game, I suggest targeting OpenGL 2.1 hardware, since it seems to strike the best balance between not being deprecated and being usable by everyone.

I haven't got the links at hand, but Minecraft, Steam and Wolfire Games have published stats about their users' hardware. They showed that at the moment only about 50% of all users have OpenGL 3+ cards (and drivers), and about 80-90% have OpenGL 2.1+ hardware. I myself only got my OpenGL 4.2 hardware this summer; before that I had OpenGL 2.1 hardware.

davedes for example still has 2.1 hardware.

I suggest learning the OpenGL 3 specification, but not really using it.

Doesn't break it down by OpenGL version, but you can expand the list of GPUs, and it overwhelmingly shows OpenGL 3+ capable cards. Consider that Steam installations already tend to bias toward higher-spec machines (I don't have Steam on my crappy laptop) and that the survey doesn't include specifics on Apple hardware at all, and 50% doesn't sound implausible.

If your game's free, then frankly you probably shouldn't need to care. Learn and use whatever helps you the most.

Just got ninja'd... right a second before I found the link... Didn't find the link about Minecraft, though...

About 91% of Minecraft users have computers that support OpenGL 2.0+, meaning we can write games fully with the programmable pipeline (GLSL shaders) and start to safely forget about supporting or having a fallback for the old fixed function pipeline.

51% of the Minecraft user base have computers with graphics cards capable of OpenGL 3.0+.

38.8% of the Minecraft user base have computers with graphics cards capable of OpenGL 3.2+.

34.2% of the Minecraft user base have computers with graphics cards capable of OpenGL 3.3+.

19.6% of the Minecraft user base have computers with graphics cards capable of OpenGL 4.0+.

8% of the Minecraft user base have computers with graphics cards capable of running the latest OpenGL version 4.2.

Intel cards are crap (yes everybody already knew that) and account for the majority of the 9% that don't support OpenGL 2.0+.

Java 5 use has pretty much died with very few users still on that version of Java.

And of course, OpenGL ES is pretty much just GL 2 with a few more features. This is why many of the "modern" GL 3+ tutorials are not great if you are looking to develop games for today's casual market, and why I started my lwjgl-basics API and tutorial series -- i.e. learning the programmable pipeline in a GL 2.0-compatible context.

Okay, lots of the code in this project was for things I had never done before. Ever. In my Ludum Dare entry, I rewrote the whole engine and actually wrote BETTER code. So, I am going to clean up the Ludum Dare entry, remove anything that was not part of the engine, and rebuild the game on top of that. This will take a few days if things go smoothly.

Also: Switched rendering to VBOs only. Apparently, VBOs DO work well with constantly changing data. Hopefully that will fix the previously reported crashes when I release the next prototype.

There is not much. Basically you just defend the tower for as long as possible, collecting coins to help. If you have 50 coins, you can heal fully by pressing H. If you have 150 coins, you can summon another Guardian by pressing the character select buttons while you are alive (1 = Warrior, 2 = Archer, 3 = Assassin). If the goblins get enough coins, they can summon orcs and trolls. Remember that you can steal coins from the goblins.

I will tweak a few things if people want, but no new content will be coming, and if it does, it may be a while.

Don't worry. The art will continue!

At the start of this project, people loved the pixel art. I will not let them down. I am not stopping work on the spritesheet (or the engine). I will continue making small, similar games until the engine is polished enough to do a big project and the spritesheet is complete.

If anyone requests it, I will make the spritesheet available for free and the engine open source, so that you can make the same kind of games (or just peek at the source) and use the art for your own projects -- as long as I get some credit, because I don't want the rare-but-still-possible situation where your game gets super popular and I get fanboys screaming that I stole your art. Have fun, people, and my next project will be shown soon.

CAN A MOD/ADMIN PLEASE MOVE THIS TO SHOWCASE

If you want a more complete-feeling game with similar gameplay (made with what is now the new engine) and similar art, go here to play my Ludum Dare entry.

Yup, you need to pass all entity data in one go to the GPU -- aka create a "sprite batcher".

Other optimizations: use STREAM_DRAW, don't create new float arrays every frame, and make sure your transformation code isn't slowing anything down. Also remember to glEnableClientState and glDisableClientState for position, texcoord and colour, since you are using fixed-function. And I'm not sure why you are deleting the VBO each frame...
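Here is a rough sketch of what a batcher's accumulation side can look like -- the class and field names are my own invention, not from the poster's code. To keep it runnable without a GL context, the actual GL calls are only indicated in comments; with LWJGL they would be glBufferData(GL_ARRAY_BUFFER, vertices, GL_STREAM_DRAW) followed by one glDrawArrays, plus the client-state enables/disables mentioned above.

```java
import java.nio.FloatBuffer;

/**
 * Minimal sprite-batcher sketch (hypothetical names). All quads are
 * accumulated into ONE FloatBuffer that is allocated once and reused
 * every frame, then drawn with a single upload + draw call.
 */
public class SpriteBatch {
    // 2 triangles * 3 vertices * (x, y, u, v)
    private static final int FLOATS_PER_SPRITE = 6 * 4;

    private final FloatBuffer vertices; // allocated once, never per frame
    private int spriteCount;

    public SpriteBatch(int maxSprites) {
        vertices = FloatBuffer.allocate(maxSprites * FLOATS_PER_SPRITE);
    }

    public void begin() {
        vertices.clear();
        spriteCount = 0;
        // here: glEnableClientState for vertex and texcoord arrays
    }

    /** Append one axis-aligned quad as two triangles. */
    public void draw(float x, float y, float w, float h) {
        if (vertices.remaining() < FLOATS_PER_SPRITE) {
            flush(); // buffer full: draw what we have and keep going
        }
        vertices.put(new float[] {
            x,     y,     0, 0,   x + w, y,     1, 0,   x + w, y + h, 1, 1,
            x,     y,     0, 0,   x + w, y + h, 1, 1,   x,     y + h, 0, 1,
        });
        spriteCount++;
    }

    /** Upload and draw everything accumulated so far in ONE call.
     *  Returns the number of sprites drawn (handy for testing). */
    public int flush() {
        vertices.flip();
        // here: glBufferData(GL_ARRAY_BUFFER, vertices, GL_STREAM_DRAW);
        //       glDrawArrays(GL_TRIANGLES, 0, spriteCount * 6);
        int drawn = spriteCount;
        vertices.clear();
        spriteCount = 0;
        return drawn;
    }

    public void end() {
        flush();
        // here: glDisableClientState for the arrays enabled in begin()
    }
}
```

The key point is that begin/draw/draw/.../end produces one GL draw call per batch instead of one per sprite, and no allocation happens inside the frame loop.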

Unless I'm mistaken, glDeleteBuffers works like glDeleteTextures: it tells OpenGL to delete the object once it's no longer in use. So in this case it's a useless call, since you are still using the buffer every frame. Still, it might screw things up depending on how the driver implements it.

Just create one buffer (glGenBuffers) at the start of your game, give it data every frame, and then once you are done (i.e. when your game is closing), delete the buffer with glDeleteBuffers.
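That lifecycle can be sketched like this. The small `GL` interface below is a stand-in of my own so the sketch compiles and runs without a real context; with LWJGL the three methods correspond to GL15.glGenBuffers, GL15.glBufferData and GL15.glDeleteBuffers.

```java
import java.nio.FloatBuffer;

/**
 * Buffer lifecycle sketch: generate once, re-specify data every frame,
 * delete once at shutdown. The GL interface is a hypothetical stand-in
 * for the three entry points the pattern actually touches.
 */
public class BufferLifecycle {
    interface GL {
        int genBuffer();                           // ~ glGenBuffers
        void bufferData(int id, FloatBuffer data); // ~ glBufferData(GL_ARRAY_BUFFER, data, GL_STREAM_DRAW)
        void deleteBuffer(int id);                 // ~ glDeleteBuffers
    }

    public static void run(GL gl, FloatBuffer frameData, int frames) {
        int vbo = gl.genBuffer();          // once, at startup
        for (int i = 0; i < frames; i++) {
            gl.bufferData(vbo, frameData); // every frame: new data, same buffer
        }
        gl.deleteBuffer(vbo);              // once, when the game is closing
    }
}
```

Compare that to generating and deleting a buffer inside the frame loop: the driver-side churn disappears, and the delete happens exactly once, when nothing is using the buffer anymore.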