I like the look of the character, but there are many similar characters out there. So, if it is to be a logo or mascot for your brand... something more distinctive or different might help.

Yeah.. it has been a tricky design process. A lot of people have mentioned he looks like a "Marshmallow Man" or the child of "Michelin". Though, for argument's sake, an ambiguous character is more successful at creating empathy with the audience when it comes to story - more people can identify with him through the story rather than through the character design itself. That is the lead goal. The secondary characters I've half-designed for the branding are much more characteristic in form. His personality should become quite defined through animation, though.

The head seems to have innumerable bones. It is as if the mesh edges were converted to bones. Even so, it seems like an "atomic fly swatter" - overkill, too complicated, especially for the simple shape of this character. That's a question, not a critique: why so complicated? What is gained that is worth so much effort?

It's complicated because I won't/can't use morph shapes to get the expressions - and the expressions are very cartoony: squash, stretch, deforming all over the place. I need a system that can push and pull deformer bones while somehow retaining some of the original shape design and volume. I also need a system that gives me "shape"-like editing, or "sculpting", with a "lattice" feel - all without actually sculpting, using a lattice, or using shapes. What I gain is a rig that is light on the machine (fewer drivers, fewer vertex deformations) and compatible with realtime fur in Unreal Engine through Gfur and FBX export, to render in realtime.
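To give an idea of the volume preservation mentioned above: most squash-and-stretch bone setups approximate constant volume by shrinking the two secondary axes as the primary axis stretches. A minimal sketch of that formula (the function name is hypothetical, not from any rig):

```python
import math

def squash_stretch(scale_y):
    """Volume-preserving squash/stretch: when the shape is stretched
    along Y, the X and Z axes scale by 1/sqrt(scale_y) so the product
    of the three scales (the volume) stays at 1.0."""
    s = 1.0 / math.sqrt(scale_y)
    return (s, scale_y, s)

# Stretch the head to 2x its height; the sides pinch in to compensate.
sx, sy, sz = squash_stretch(2.0)
print(round(sx * sy * sz, 6))  # volume stays 1.0
```

This is the same behavior Blender's "Stretch To" style constraints give for free, which is one reason a pure-bone rig can fake what morphs would otherwise do.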

Makes sense! Thanks for the explanation. When you have some animations done, could you show them and explain how this rig/system/method is better for them? Like: "this rig has X, which is better for this expression because ..."

Well, to be honest, I don't know why you are doing it that way.
Am I missing something????
What would I have done?
Use a biped rig.. a normal one!
Make the facial movements with morph targets!
Very easy to do, e.g. in Poser!
You have all the necessary dials.. and if you use the standard Poser/Daz bone setup, you can even use software like Mimic Pro for perfect lip sync!
Daz even lets you control your morph targets with a joystick-like feature!
Why doesn't that work in Unreal, even with fur???
Normally it should!

Many people have asked why I'm doing it this way. I won't argue that there aren't better methods. But there are very specific reasons not to use them here.

Here are the reasons:

1. Technical limits of new tech. I can't use morph targets because the Gfur addon has limited morph target support. Gfur gives me realtime fur, so I can output a high-quality furred character without the render cost - for quick production turnaround, viral videos, and weekly animations. The end goal is production cost: a bit more pre-production that pays off in the long run.

2. Third-party compatibility. I have to export everything to Unreal Engine 4, meaning I can only export what FBX supports. Many constraints don't export, lattices don't export, scale exports incorrectly because UE only handles scale in "Local" space, etc. I work around that with a shadow rig constrained to the deformer rig. UE has a 150 deformer bone limit with 8 influences per vertex, so having smaller bones influence areas that in turn influence other bones works around it: fewer bones on the same vertex, same flexibility. It's also cheap on performance, both in Blender and in UE.

3. Non-destructive flexibility. I need the same flexibility as a lattice, morph shapes, and fine animation "drawing" control - exporting cleanly and non-destructively - but without using any of those features, to stay compatible with UE and the Gfur addon. If I edit the mesh with blend morphs in play, the shapes and poses are destroyed and I'd have to rebuild hundreds of shapes. That's expensive and time-consuming whenever I want to rebuild the rig on another character or make any change to the mesh. With this rig, any pose or shape I build is non-destructive and can potentially transfer (via animation libraries) to other characters, while still letting me change the mesh. I'm building once, non-destructively, for hopefully many characters.

4. Performance. Morphs are baked in stone and require heavy vertex evaluations all the time, plus drivers to drive them all - which are harder to program (they require some code) and harder to set up (especially in Blender; for the same control and effect I'd need hundreds of them, then boil them down to a few user-friendly controls with fancy code hacks). Creating all those drivers really hogs Blender's performance. I can get 30fps out of this rig with OpenSubdiv and very few drivers, but an equivalent morph-driven rig drops to 15-20fps. I once built this a simpler way through nodal programming with Animation Nodes, but that was also a performance hog, down to 10-15fps. Believe it or not, creating hundreds of drivers for hundreds of shapes takes about the same time as this rig setup, or more.
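The "8 influences per vertex" limit in point 2 boils down to a prune-and-renormalize step: when a vertex carries more influences than the engine allows, only the strongest weights survive and the rest are discarded. A toy sketch of that idea, with a hypothetical helper name (not Gfur or UE API):

```python
def prune_weights(weights, max_influences=8):
    """Keep only the strongest bone influences on a vertex and
    renormalize so the kept weights still sum to 1.0 - roughly what
    a game engine import does when a vertex exceeds its limit."""
    kept = sorted(weights.items(), key=lambda kv: kv[1], reverse=True)[:max_influences]
    total = sum(w for _, w in kept)
    return {bone: w / total for bone, w in kept}

# A vertex influenced by 10 small helper bones; only the top 8 survive.
v = {f"helper_{i}": 1.0 / (i + 1) for i in range(10)}
pruned = prune_weights(v)
print(len(pruned), round(sum(pruned.values()), 6))  # 8 influences, summing to 1.0
```

This is why the layered "bones driving bones" approach matters: the detail bones never all touch the same vertex at once, so nothing gets silently pruned on import.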

So there you go.. This is experimental, yes. Not a lot of people do it. It's not conventional. But it's built with purpose, for a lot of justifiable reasons.

I want fur. I want realtime. I want Unreal Engine. I want non-destructive animation controls. I want cartoon squash and stretch, and I want it animated quickly and exported quickly to render quickly. Because I want weekly animations. And I want many characters with similar rigs, so my animators don't have to re-learn controls, animation stays cross-compatible, and everything exports to a high-quality realtime render engine.

And.. I didn't want to auto-build the rig, because I want to understand how to make it work properly - so if an auto-rig breaks, I know how to fix it.

This is why I chose not to use blend morphs/shapes and to make this bone rig work instead.

Hope that makes sense! And yes, it is a crazy ambition, but so far quite rewarding.

# The Update
I am this close to getting this rig going. Work has been rather mellow the past few months, but I have been using that as fuel to move forward and make a plan - a sales campaign. This character is part of it, and also useful for the portfolio. I have been recording my work on this guy and will release some videos about him. Check back and maybe I'll have them published!

I feel I need to build rig body parts that can join together later for future characters - build rigs like Lego blocks. This is taking too long, and auto-riggers never do quite what I want them to. I imagine myself so frustrated one day that I'll practically write my own auto-rigger.

# The Progress

- Mouth is now fully deformable.
- Jaw IK is working like a charm; it works with head squash/stretch/bend and is shape-free *(meaning it's non-destructive on the mesh).*
- Lips are now Spline IK and also squash and stretch relatively all right. *Still missing zip locks and curl.*
- Cheeks inflate and have controls that work nicely.
- Eyes have Spline IK.
- Mouth has an asymmetrical offset too.
- Eyes can blink and currently squash and stretch relatively OK. *Offset on the eyes is still missing, though.*
- Eyeballs now squash and flatten, with auto look-at eyelid offset.
- Face now has proper squash-and-stretch ribs for volume preservation through bones.

# The Roadblocks
I forgot how Blender doesn't give clear definition or control of "Global" and "Local" transforms for bone constraints and hierarchy. So I had to rework many constraints that were set up incorrectly in "World" space, routing them through new "Local" proxy constraint bones. That took some time - I had the whole head working, but when I rotated the neck, it all broke spectacularly.
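The world-vs-local breakage above can be shown with a toy 2D bone chain: a constraint evaluated in world space copies the parent's motion along with the child's, so rotating the neck doubles up on the face bones. A local proxy bone effectively subtracts the parent's contribution back out (this is a simplified angle-only sketch, not Blender's full matrix math):

```python
import math

# Toy 2D chain: world rotation of a child bone is the parent's world
# rotation plus the child's own local rotation.
def world_rot(parent_world, child_local):
    return parent_world + child_local

neck = math.radians(30.0)        # the neck is rotated 30 degrees
face_local = math.radians(10.0)  # the face bone's own pose is 10 degrees

# A constraint copying in WORLD space grabs the full 40 degrees,
# dragging the neck's rotation into the face bone a second time.
copied_world = world_rot(neck, face_local)

# A local proxy bone re-expresses that value relative to the parent,
# recovering the intended 10 degrees regardless of the neck's pose.
copied_local = copied_world - neck

print(round(math.degrees(copied_local), 6))
```

In Blender terms this corresponds to setting a constraint's target/owner spaces to Local, or parenting a proxy bone so the conversion happens through the hierarchy.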
I have a dependency loop that makes undo and reset lag: the curves are controlled by the bones, and the bones are then constrained by Spline IK to the curve. It isn't a real technical loop, but Blender thinks it is. Hopefully in 2.8 this won't be a thing.
The eye gave me a lot of trouble when I tried to get it squashing and stretching. Ultimately I had to mimic the mouth setup.

Yup, I understand that you can't use morph targets in the final!
But why not set up the character with the bones attached and animate with morphs - yet let the animation transfer through the bones in the final? Wouldn't that be much easier???

Track the bone transformations to blend shapes? Then export the bones?
Hmm... if I use any blend shape and then change the mesh vertex count, it will break the vertex order. I'm in Blender, unfortunately, not Softimage, so to keep things non-destructive I think I'll avoid them. In Softimage this would have been a piece of cake, as all modeling and rigging there is very much non-destructive. Gosh, I miss that... I could possibly use morphs for corrective shapes.

I'm not clear on how that might work.

Method 1 (the traditional method):
- Create morph shapes with the displaced vertices in a shape library
- Create deformer bone constraints to nulls on hooks, or hook bones to vertices
- Create controller bone drivers for the morphs so I can animate the shapes in pose mode
- Bake the hooked deformer bones to animation
- Export the baked bones
- Edit the vertex count of the mesh = break all the shapes; repeat the first and second steps
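The last step is the killer, and it comes down to how shape keys are stored: as per-vertex offsets keyed by vertex index. A toy sketch of why a mesh edit corrupts every shape (hypothetical names, not Blender data structures):

```python
# A shape key stores offsets against vertex indices of the base mesh.
base = ["nose", "lip_L", "lip_R", "chin"]      # vertex order of the base mesh
smile = {1: (0.0, 0.2), 2: (0.0, 0.2)}        # "smile" lifts lip_L and lip_R

# A mesh edit inserts a new vertex, shifting every index after it.
edited = ["nose", "brow", "lip_L", "lip_R", "chin"]

# The shape key still targets indices 1 and 2 - which now name the
# wrong vertices, so the smile deforms the brow instead of both lips.
moved = [edited[i] for i in smile]
print(moved)
```

Multiply that by hundreds of shapes and the "edit the mesh = rebuild everything" cost of the traditional method becomes clear, which is the non-destructive argument for the bone-only rig.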