A) The topic title says most of what I'm looking to do. Obviously, most character movement, and even (non-physics) object movement, is done via character meshes with pre-made animations loaded and ready to be played on certain button presses or events. What I'd like to do, in addition to that, is programmatically create animations in-game via some algorithms of mine, without prior creation of said animations. I'd like to have certain parts of the character meshes "tagged", even down to individual vertices, so my algorithms can take controller input and create animations on the fly, making heavy use of the mesh "tags". I don't know if this is common, uncommon, or not done at all, but I'd like to be able to do it. I'm using Unity and a mixture of K-3D and Blender for some of the meshes. I'm also generating meshes in real time and would like to be able to run these animation algorithms on them.

B) Also... is there a technique by which I can have a standard set of really basic meshes and then run custom algorithms in-game to deform them in real time into various objects, as determined by the code logic?

Creating animations from skeletons can be done. A company I am working with, called Natural Motion, built a system for exactly that a few years ago. They were featured on UK television because the results were so impressive. However, they have since dropped it, as no one wanted to buy it. I want it.

It works via a genetic algorithm (GA) routine. The code lets the skeleton try to walk; at first, all the generations fall over in a snotty heap. After about 20 generations it can walk, and after a few more it can run. I don't know how long the code took to run, but I am going to have to do this myself in a few months, so I guess I will find out.
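To make the GA idea concrete, here is a minimal sketch of that evolve-a-gait loop in plain Python. Everything here is hypothetical: the genome is an arbitrary per-joint parameter vector, and `simulate_walk` is a stand-in fitness function — in a real system it would run the physics simulation and return how far the skeleton got before falling over.

```python
import random

POP_SIZE = 30
GENOME_LEN = 8       # e.g. amplitude/phase per joint oscillator (hypothetical)
MUTATION_RATE = 0.1

def simulate_walk(genome):
    # Stand-in fitness: a real system would simulate the skeleton and
    # return distance walked. Here we just reward genomes close to an
    # arbitrary "good gait" target so the loop has something to optimize.
    target = [0.5] * GENOME_LEN
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def mutate(genome):
    return [g + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else g
            for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def evolve(generations=20):
    pop = [[random.uniform(-1, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=simulate_walk, reverse=True)
        elite = pop[:POP_SIZE // 4]          # keep the best quarter as-is
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(POP_SIZE - len(elite))]
        pop = elite + children
    return max(pop, key=simulate_walk)
```

The expensive part in practice is the fitness evaluation (a full physics rollout per genome per generation), which is presumably why the original runs took so long.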

On your other question: I have a library of parts I combine to create meshes. Things like tubes, cylinders, boxes, quad patches, bezier patches, etc. All of these generate a single vertex buffer and a single triangle buffer, which can just be thrown at the rendering engine.
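As an illustration of the "part generators appending into shared buffers" idea, here is a hedged Python sketch of one such part, an open-ended cylinder. The function names and buffer layout are my own invention, not the poster's actual library:

```python
import math

def add_cylinder(vertices, triangles, radius=1.0, height=2.0, segments=16):
    """Append an open-ended cylinder to shared vertex/triangle buffers.

    Returns the index range of the vertices it added, so later code
    (e.g. blend patches between parts) can find the rim points.
    """
    base = len(vertices)
    for i in range(segments):
        a = 2 * math.pi * i / segments
        x, z = radius * math.cos(a), radius * math.sin(a)
        vertices.append((x, 0.0, z))       # bottom ring
        vertices.append((x, height, z))    # top ring
    for i in range(segments):
        b0 = base + 2 * i                  # bottom vertex of this slice
        t0 = b0 + 1                        # top vertex of this slice
        b1 = base + 2 * ((i + 1) % segments)
        t1 = b1 + 1
        triangles.append((b0, t0, b1))     # two triangles per side quad
        triangles.append((t0, t1, b1))
    return base, len(vertices)

vertices, triangles = [], []
add_cylinder(vertices, triangles)
# vertices/triangles can now be handed to the renderer as single buffers
```

Each part generator works the same way: it only ever appends, so combining tubes, boxes, and patches is just a sequence of calls against the same two lists.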

On top of these basic shapes, you need a couple of blend shapes to clean up the joins. I use bezier surfaces for this, and it works really well. You have a set of rules for combining the objects. Say you have a box that joins onto a cylinder: you can take each edge of the box and create a number of points that matches the number of points on the corresponding arc of the cylinder. Use these as the ends of the patch, then interpolate the normals to create a smooth patch that links them together.
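The point-matching step above can be sketched as follows. This is a simplified stand-in, with a straight linear blend between the box edge and the cylinder arc; the real technique described uses the interpolated end normals to bow the grid into a proper bezier patch. All function names are hypothetical.

```python
import math

def lerp(p, q, t):
    """Componentwise linear interpolation between two 3D points."""
    return tuple(pi + (qi - pi) * t for pi, qi in zip(p, q))

def sample_segment(p0, p1, n):
    """n points evenly spaced along a straight box edge."""
    return [lerp(p0, p1, i / (n - 1)) for i in range(n)]

def sample_arc(center, radius, a0, a1, n):
    """n points along a cylinder arc in the XZ plane, matching the edge count."""
    pts = []
    for i in range(n):
        a = a0 + (a1 - a0) * i / (n - 1)
        pts.append((center[0] + radius * math.cos(a),
                    center[1],
                    center[2] + radius * math.sin(a)))
    return pts

def blend_patch(edge_pts, arc_pts, rows):
    """Grid of points blending edge -> arc. A bezier version would use the
    end normals as tangent handles instead of straight interpolation."""
    return [[lerp(e, a, r / (rows - 1)) for e, a in zip(edge_pts, arc_pts)]
            for r in range(rows)]

edge = sample_segment((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 5)
arc = sample_arc((2.0, 0.0, 0.0), 1.0, math.pi / 2, -math.pi / 2, 5)
patch = blend_patch(edge, arc, rows=4)
```

Because the edge and arc sample counts match, the patch rows triangulate into a clean strip that can be appended to the same shared buffers as the base shapes.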

The only real limitation of this technique is texturing. It's really, really difficult to generate texture coordinates that make sense for some of the shapes, and impossible to pack details into a single texture the way most people do these days. Take a texture for a plane as an example: typically the upper wings are placed as a separate region on the texture, with details such as roundels or swastikas painted in, and the sides of the body are two more separate regions. You can't do that procedurally with any sort of good results.

However, if you are on a fast enough machine you can animate the patches as well; it's really fun stuff to play with.

Wow, totally kick-butt Stainless. Thanks a ton for that. I've already downloaded the LE or SE or whatever edition. What a great concept.

That leads me to another question regarding a similar thing: is there a way to take meshes and assign "tags" to certain vertices, and to whole sections of a mesh as well? I'd like to be able to do that so I can run custom algorithm logic on different parts of the environment, objects, and characters and manipulate their meshes and behavior in real time.
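To illustrate what I mean by tagging, something like this plain-Python sketch (all names hypothetical; in Unity you could persist the table per mesh, e.g. alongside a ScriptableObject, rather than as a loose dictionary):

```python
# Hypothetical tag table: tag name -> set of vertex indices in the mesh.
def tag_vertices(tags, name, indices):
    tags.setdefault(name, set()).update(indices)

def displace_tagged(vertices, tags, name, offset):
    """Apply a runtime displacement to every vertex carrying a tag --
    the kind of hook an on-the-fly animation algorithm could drive."""
    dx, dy, dz = offset
    for i in tags.get(name, ()):
        x, y, z = vertices[i]
        vertices[i] = (x + dx, y + dy, z + dz)

vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tags = {}
tag_vertices(tags, "left_arm", [1, 2])
displace_tagged(vertices, tags, "left_arm", (0.0, 0.5, 0.0))
```

The idea is that the animation code never touches raw indices directly; it asks for a tag ("left_arm", "jaw", etc.) and operates on whatever vertices the mesh author, or the mesh generator, put under that tag.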

Wow, now that's a charitable post. Thanks Stainless, that's very kind of you.

Actually, this thread is sparking a train of thought for me; I had already started to develop algorithms of my own that would do what Natural Motion's software does, and that raises a new question for me.

Basically, the reason I'm on these forums is that I decided to adapt my own comic book story into game form late last year. Not knowing how in the heck to get it published by large studios, with no background in games and no desire to have other artists wrestle my dear characters and story around, I thought I was stuck. However, over the course of a few days after realizing that games were my final frontier, I rapidly developed a programming algorithm based on (really) Einstein's equations and subatomic particle science that can intelligently generate worlds, buildings, vehicles, creatures, and, well... games themselves. I, by myself, could sit down and create a game in a week or two that would normally take a hundred people, millions of dollars, and several years to create. It would also represent a huge leap forward in AI technique.

At any rate, being a total neophyte in game programming, I'm having to pick it all up now, which is why I'm on this forum and others. I realize that sounds a tad extreme, and I'm certainly not looking for advice along the lines of "that can't be done" or "what are you thinking", as it most certainly does work. The hang-up is, I either need to sell my development to a company and get grafted on as a developer of it, or start my own company myself. I'm currently working on the latter, but regarding the former, I would be up for it; I just don't have a clue in Hades as to how to get a quality word with a good company. I'm also, obviously, very protective of it. I'm willing to talk with any good, decent company anywhere.