I'm writing an animation system and I have basic bones and animation blending working. How does bone animation cooperate with outside factors like looking direction on player models, or leg/ankle inverse kinematics against terrain?

If a player is looking up or down at an angle how does the looking angle factor into the player model posture with other animations happening? If the player is running and holding a weapon, how does the weapon jostle back and forth and still reflect the angle the player is looking?

Is this just a blob of animations blended together, or is there special-case stuff going on?

Weapon pointing and looking can be implemented by blending in extra animations on top. These animations tend to be 'additive' in nature (as opposed to the usual animation blending, which averages or lerps bone transforms). An alternative is to hard-code some sort of 'posable' bone solution, whereby the code directly manipulates key bones, such as the various spine and neck sections, to control look direction.
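To make the distinction concrete, here's a minimal sketch of both blend types. I'm assuming per-bone rotations stored as (pitch, yaw, roll) Euler angles purely for brevity; a real engine would use quaternions and slerp/nlerp them, but the structure is the same: an additive layer applies a *delta* on top of the base, while a normal blend averages two poses.

```python
# Additive blending sketch. Bone names and the Euler-angle representation
# are illustrative assumptions, not any particular engine's API.

def additive_blend(base_pose, additive_pose, reference_pose, weight):
    """Layer an additive animation on top of a base pose.

    The additive delta is (additive_pose - reference_pose), where the
    reference is typically the first frame of the additive clip.
    'weight' scales how strongly the layer is applied (0 = base only).
    """
    result = {}
    for bone, base_rot in base_pose.items():
        add = additive_pose.get(bone, reference_pose.get(bone, (0.0, 0.0, 0.0)))
        ref = reference_pose.get(bone, (0.0, 0.0, 0.0))
        result[bone] = tuple(
            b + weight * (a - r) for b, a, r in zip(base_rot, add, ref)
        )
    return result

def lerp_blend(pose_a, pose_b, t):
    """Normal (non-additive) blend: interpolate between two full poses."""
    return {
        bone: tuple(a + t * (b - a) for a, b in zip(pose_a[bone], pose_b[bone]))
        for bone in pose_a
    }
```

So a "run" cycle plus an "aim up" additive clip gives you a character who runs *and* aims up, whereas lerping "run" with "aim up" would give you a half-run, half-aim mush.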

With both solutions it can be tricky to get a decent look when combined with an energetic animation, without the character ending up running like a camel. For instance, if you twist the spine bones to the right because that's how you're aiming, but your character is bent forward a little during the run, then the twist comes out a bit 'wonky'. A previous place I worked at programmatically inserted an extra bone into the hierarchy which stayed flat, to give a good flat basis for rotation around the Y axis.
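A related trick that helps here is to spread the aim twist over the whole chain rather than dumping it all on one joint. Here's a hypothetical sketch (bone names made up, same Euler-angle convention as before): each spine bone takes an equal share of the yaw, applied about a common up axis, which is effectively what that extra 'flat' bone buys you, since it isn't tilted by the run animation.

```python
# Illustrative only: distribute the aim yaw evenly across a spine chain
# so no single joint has to twist the full amount.

SPINE_CHAIN = ["spine_lower", "spine_upper", "chest", "neck"]

def apply_aim_yaw(pose, aim_yaw_degrees, chain=SPINE_CHAIN):
    """Add an equal share of the aim yaw to each bone in the chain."""
    share = aim_yaw_degrees / len(chain)
    result = dict(pose)
    for bone in chain:
        pitch, yaw, roll = result.get(bone, (0.0, 0.0, 0.0))
        result[bone] = (pitch, yaw + share, roll)
    return result
```

In practice you might weight the shares (more twist in the upper spine than the lower), but equal shares already look far better than twisting a single bone 60 degrees.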

Then there's IK. This is usually applied last and attempts to correct any remaining error. If you're using additive animations or posable bones, you'll likely find that hands can creep off guns, so you'd use IK to pin them back on. The same idea covers your leg/ankle question: raycast down to find the terrain under each foot, then run IK on the leg so the foot actually lands on the ground rather than floating or clipping.
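For a two-segment limb (upper arm + forearm, or thigh + shin) the IK step doesn't need an iterative solver; the classic closed-form solution is just the law of cosines. This is a sketch under simplifying assumptions: it works on distances only and returns the two angles, leaving out the pole vector a real solver needs to choose which plane the elbow/knee bends in.

```python
import math

def two_bone_ik(upper_len, lower_len, target_dist):
    """Analytic two-bone IK via the law of cosines.

    Returns (root_angle, joint_angle) in radians: how far to rotate the
    upper bone off the line to the target, and the interior angle at the
    middle joint (pi = limb fully straight).
    """
    # Clamp so an out-of-reach target just straightens the limb
    # instead of producing a math domain error.
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))
    # Interior angle at the elbow/knee.
    cos_joint = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    joint_angle = math.acos(max(-1.0, min(1.0, cos_joint)))
    # Angle between the upper bone and the root-to-target line.
    cos_root = (upper_len**2 + d**2 - lower_len**2) / (2 * upper_len * d)
    root_angle = math.acos(max(-1.0, min(1.0, cos_root)))
    return root_angle, joint_angle
```

You'd run something like this per foot after the final blended pose is computed, with the target coming from a terrain raycast, then rotate the ankle so the sole matches the ground normal.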

It's a huge subject area, really, and I'm no expert, but I've tried to answer your questions with some of the basics.