Eventually, I do want to support having each animation in a separate file. The way Morrowind has all animations in a single track makes it impossible to replace individual animations since you'd have to provide all animations together... at best you can supply a single override animation nif on a per-character basis, but that's not scalable or generalizable. However, that's a separate issue from the textkey map, which maps textual tags to animation times for the engine to act on (for things like playing a sound, indicating weapon attack state, or loop points). If the osgAnimation kit (or any of the desired model formats) doesn't support an equivalent object, then we'll need to handle it ourselves.

How does Blender handle things like this when exporting? From the perspective of someone making a new animated model with timestamped text tags "the proper way" in Blender, then exporting to osgt/b, OpenGEX, or glTF 2.0, what happens?

> Eventually, I do want to support having each animation in a separate file. The way Morrowind has all animations in a single track makes it impossible to replace individual animations since you'd have to provide all animations together... at best you can supply a single override animation nif on a per-character basis, but that's not scalable or generalizable. However, that's a separate issue from the textkey map, which maps textual tags to animation times for the engine to act on (for things like playing a sound, indicating weapon attack state, or loop points). If the osgAnimation kit (or any of the desired model formats) doesn't support an equivalent object, then we'll need to handle it ourselves.

I'm still learning here, but it seems like the text key map is doing double duty: it's used both for key events within animations and to mark the start and stop of each animation on the continuous track. As far as I can tell, baseline osgAnimation doesn't support it, but if it ever does, it's trivial to add the keys back into the main format, as they're just text -> float time pairs.
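To make the "text -> float time pairs" point concrete, here's a minimal sketch of what such a map amounts to. The entries and the helper below are illustrative only, not Morrowind's actual on-disk layout:

```python
# A textkey map is conceptually just an ordered set of (text tag -> time) pairs.
# The specific entries below are made up for illustration.
textkeys = {
    "idle1: start": 0.0,
    "idle1: loop start": 0.25,
    "idle1: loop stop": 2.75,
    "idle1: stop": 3.0,
}

def keys_between(textkeys, start, stop):
    """Return tags whose times fall within [start, stop], in time order."""
    hits = [(t, tag) for tag, t in textkeys.items() if start <= t <= stop]
    return [tag for _, tag in sorted(hits)]

print(keys_between(textkeys, 0.2, 2.8))  # ['idle1: loop start', 'idle1: loop stop']
```

The engine-side lookup during playback is essentially a range query like `keys_between`: as the animation time advances, fire every tag whose time falls inside the elapsed interval.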

> How does Blender handle things like this when exporting? From the perspective of someone making a new animated model with timestamped text tags "the proper way" in Blender, then exporting to osgt/b, OpenGEX, or glTF 2.0, what happens?

Not sure which formats will actually export them, but markers have a few benefits:
1. They integrate with Blender's UI on the timeline, so you can just click to add them.
2. They should shift if you add new frames before or after where you already set the marker, although I haven't confirmed this.

Compare that to the current anim.txt generated when importing a NIF, which is just a text list mapping text keys to times. If you add one new frame at the start of your timeline, that entire text file becomes wrong, since the times aren't anchored to your animation in any way.
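The difference can be sketched in a few lines. Absolute times in a sidecar file go stale as soon as frames are inserted, whereas frame-based markers can be shifted mechanically (this is the behavior Blender's timeline markers are expected to have; the data model here is invented for illustration):

```python
# Illustrative: frame-based markers can be rebased when frames are inserted,
# unlike absolute times baked into a text file.
FPS = 30.0

def shift_markers(markers, insert_at_frame, num_frames):
    """Shift frame-based markers at/after insert_at_frame by num_frames."""
    return {
        name: (frame + num_frames if frame >= insert_at_frame else frame)
        for name, frame in markers.items()
    }

markers = {"start": 0, "loop start": 10, "stop": 90}
# Insert 5 new frames at the very start of the timeline:
shifted = shift_markers(markers, insert_at_frame=0, num_frames=5)
print(shifted)  # {'start': 5, 'loop start': 15, 'stop': 95}
print({name: frame / FPS for name, frame in shifted.items()})  # frames -> seconds
```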

I suppose my approach would be to rig my character and get him ready for animation, then copy him into 30 separate .blend files and start animating and exporting them one by one.

I guess the other option would be to animate everything in one file and then cut it up into 30 pieces, which I'm less inclined toward.

If we come up with a naming scheme like animation_skeleton_name+animation_name.osgt, we could support both options (one monolithic animation file with track splits vs. one file per animation) without too much effort.
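A loader could distinguish the two cases with a trivial filename check. This is a hypothetical helper for the naming scheme proposed above, nothing more:

```python
import os

# Hypothetical helper for the proposed "skeleton+animation.osgt" naming scheme.
def split_animation_filename(path):
    """Split 'base_anim+idle1.osgt' -> ('base_anim', 'idle1'), or None."""
    stem, ext = os.path.splitext(os.path.basename(path))
    if ext != ".osgt" or "+" not in stem:
        return None  # monolithic file: fall back to textkey-based track splits
    skeleton, _, animation = stem.partition("+")
    return skeleton, animation

print(split_animation_filename("meshes/base_anim+idle1.osgt"))  # ('base_anim', 'idle1')
print(split_animation_filename("meshes/base_anim.osgt"))        # None
```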

> I'm still learning here, but it seems like the text key map is doing double duty.

Sort of. When an animation plays, the engine says which animation (i.e. animation group) to play, like "idle1" or "weapontwohand", and which tags to start and stop at, like "start" and "stop" (the most common) or "weapon slash start" and "weapon slash max". When looping is desired, the "loop start" and "loop stop" tags are implicitly used if found.

Each textkey entry is composed as "group: tag". There are also special textkeys like "soundgen: foo", which plays the given creature's soundgen, or "sound: foo", which plays the specified Sound record from the actor's position.

With individual animation files, you might be able to omit the "group: " portion of the textkey (if the group can be inferred from which animation file is loading, though the special "soundgen:" and "sound:" markers would still be needed for emitting sounds), and the "start" and "stop" tags could be implied by the start and end of the animation, unless there's some technical reason you'd want lead-in or lead-out frames (non-linear interpolation or something, perhaps). The other tags a given animation needs would still have to be specified for it to work, though.
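The "group: tag" convention described above, including the special "sound:" and "soundgen:" entries, can be sketched as a small classifier. The names come from the thread; the handling itself is a sketch, not OpenMW's actual implementation:

```python
# Illustrative parser for the "group: tag" textkey convention, including the
# special "sound:" / "soundgen:" entries. A sketch, not the engine's real code.

def classify_textkey(text):
    prefix, _, rest = text.partition(": ")
    rest = rest.strip()
    if prefix == "sound":
        return ("play_sound", rest)      # play a specific Sound record
    if prefix == "soundgen":
        return ("play_soundgen", rest)   # play the creature's soundgen
    return ("anim_tag", prefix, rest)    # (group, tag), e.g. ("idle1", "loop start")

print(classify_textkey("sound: Footstep"))                     # ('play_sound', 'Footstep')
print(classify_textkey("weapontwohand: weapon slash max"))     # ('anim_tag', 'weapontwohand', 'weapon slash max')
```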

That's the important bit. If we're going to properly support more modern model and mesh formats, we need to know how the timeline markers or whatever they're called are handled when exporting to said formats (followed by how they're handled when OSG loads those formats).

> That's the important bit. If we're going to properly support more modern model and mesh formats, we need to know how the timeline markers or whatever they're called are handled when exporting to said formats (followed by how they're handled when OSG loads those formats).

I tried exporting to osgt from Blender with markers added, but a quick search through the resulting file showed that the markers didn't export. I'll try OpenGEX / glTF 2.0 and see what happens, but considering my first target is osgt, we may have to brainstorm a simple parallel file format to accompany the animation file for the text keys.
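One possible shape for such a parallel file: a plain-text sidecar next to the .osgt, with one "time text" entry per line. This format is entirely hypothetical, just a starting point for the brainstorm:

```python
# Hypothetical sidecar format: one "time text" entry per line, sorted by time,
# stored next to the animation file (e.g. "idle1.osgt.textkeys").

def write_textkey_sidecar(path, textkeys):
    with open(path, "w") as f:
        for time, text in sorted(textkeys):
            f.write(f"{time:.6f} {text}\n")

def read_textkey_sidecar(path):
    keys = []
    with open(path) as f:
        for line in f:
            time_str, _, text = line.rstrip("\n").partition(" ")
            keys.append((float(time_str), text))
    return keys

keys = [(0.0, "idle1: start"), (3.0, "idle1: stop")]
write_textkey_sidecar("idle1.osgt.textkeys", keys)
assert read_textkey_sidecar("idle1.osgt.textkeys") == keys
```

Being plain text, it has the same "absolute times go stale" weakness as anim.txt, so it would be a generated artifact of export rather than something hand-edited.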

> How does Blender handle things like this when exporting? From the perspective of someone making a new animated model with timestamped text tags "the proper way" in Blender, then exporting to osgt/b, OpenGEX, or glTF 2.0, what happens?

Are you planning to write a plugin for Maya? Are you familiar with Maya HIK?

> How does Blender handle things like this when exporting? From the perspective of someone making a new animated model with timestamped text tags "the proper way" in Blender, then exporting to osgt/b, OpenGEX, or glTF 2.0, what happens?

In Blender, all animation data for objects is stored in actions. Actions hold the keyframes, and they're where you do your animation work. The same action can be applied to multiple objects, and a .blend file can hold multiple actions, so it's not limited to one chunk of animation data per .blend file.

The actions can further be organized in any order in the NLA (non-linear animation) editor, where they can be scaled, mixed, repeated, etc.

It then depends on the exporter how this is handled. For example, Blender's FBX exporter supports multiple ways of picking which animation data gets exported and how:
1) Export whatever animation you see in Blender (what you see in the 3D view is what you get), creating a single chunk of animation in the exported file.
2) Export all the action clips assigned to an object in the NLA editor (unless the track is muted). Scaling, repeating, and cropping are all taken into account. Every action clip becomes a separate animation in the exported file.
3) Export all the actions in the .blend file. Every action becomes a separate animation in the exported file.

For my workflow I use 2), as it allows nice control over exactly what gets exported, even when I have various unfinished or test animations residing in the same file. Let me know if this makes the export process clear enough or if you need any more info.
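Option 2) boils down to a filter over NLA strips. This toy model only illustrates the selection logic; the real implementation lives in Blender's exporters (bpy), and the data model here is invented:

```python
# Toy model of option 2): export every action clip on an un-muted NLA track as
# its own animation. Strip data is invented for illustration.
strips = [
    {"action": "idle1", "muted": False},
    {"action": "test_wip", "muted": True},   # muted track: skipped
    {"action": "walkforward", "muted": False},
]
exported = [s["action"] for s in strips if not s["muted"]]
print(exported)  # ['idle1', 'walkforward']
```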

> In Blender, all animation data for objects is stored in actions. Actions hold the keyframes and it's where you do your animation work. The same action can be applied to multiple objects and the .blend file can hold multiple actions, so it's not limited to one chunk of animation data per .blend file.

I'm going to look into this since it seems to be a different way of organizing multiple animations without just adding markers to a single track.

Currently, though, I'm stuck on the problem below for the conversion work (hopefully scrawl or Chris can help). I don't see bounding sphere data for the bones in the osgAnimation format, and I need it to assign the osg equivalent of the nifloader line below:

I noticed a function called computeBound in Transform that seems to compute a bounding sphere on the fly without needing data from the format file, but I'm not sure whether that would be a fair replacement for this.

If I had to guess, the bounding sphere of a bone is either the region of space occupied by the things the bone influences, or the region of space those things may end up in as the bone moves. The second would be infinite if bone lengths weren't fixed (bone length is, I think, usually a per-bone property in most animation systems), so that definition probably isn't helpful, and the first one should be preferred.
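The first definition can be sketched directly: take the vertices a bone influences (in the bone's local space) and fit a sphere around them. Centroid-plus-max-distance gives a simple, non-minimal bounding sphere, similar in spirit to what an on-the-fly computeBound would produce. The vertex data below is made up:

```python
import math

# Sketch of the first definition above: a bounding sphere around the vertices a
# bone influences. Centroid + max distance is simple but not the minimal sphere.

def bounding_sphere(points):
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    radius = max(math.dist((cx, cy, cz), p) for p in points)
    return (cx, cy, cz), radius

verts = [(0, 0, 0), (2, 0, 0), (0, 2, 0), (0, 0, 2)]
center, radius = bounding_sphere(verts)
print(center, radius)
```

Comparing a sphere computed this way against the one serialized from the original NIF would show whether on-the-fly computation is a fair replacement.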

Do you have anything in both formats? If so, you could temporarily add an osg::ShapeDrawable with a sphere in it to represent the existing and computed bounding spheres to check they match.

> Do you have anything in both formats? If so, you could temporarily add an osg::ShapeDrawable with a sphere in it to represent the existing and computed bounding spheres to check they match.

The original NIF has the correct bounding sphere to check against, which I'm already serializing as my golden spec for the conversion work. If I'm understanding you, you believe this represents the sphere swept by rotating the bone through any angle, with the bone length as the sphere's radius, correct?