I'm going to write this down and post it while it's fresh in my mind...
Having spent several months (I'm ashamed to say) writing my own class for parsing x files and displaying skinned meshes, I've realised that the thing that held me back the most was not being sure exactly how the data is stored in the x file.

I read the excellent chapter on skinned meshes in Frank D. Luna's "Introduction to 3D Game Programming with DirectX 11", which explained the mechanism of animating the mesh about as well as any article I've read, and there are plenty more tutorials out there. The one problem with it is that it uses a custom file type to hold the mesh data. That means you either have to write your own file exporter for your modelling software (Blender in my case) or learn to interpret one of the file types it already exports. I experimented with Assimp to import the data but got inconsistent results (certainly my fault, looking back at it!) so decided to bite the bullet and start from scratch.

I'm not going to write a tutorial on creating the whole class here. Like I say, there are plenty of tutorials explaining where and when to use the offset matrix, lerping between quaternions etc. I realised that most of my time had been wasted trying to work out why my final bone transformations were all at crazy angles when in fact I hadn't loaded the correct matrices and quaternions in the first place!

My biggest hurdles were finding out which of the many matrices listed in the x file were the offset matrices I needed, and converting from Blender's "Z is up" coordinate system to DirectX's "Y is up" system. This is straightforward enough for static meshes but becomes a bit more of a mental challenge with matrices and quaternions.

There are some parts of the x file that I ignored completely. I'm sure they are there for a good reason but I couldn't figure it out! (and I got my class to work without them)

So, this is to be about how I got the correct data out of my x file rather than how to animate the mesh....

To start with some notes on static meshes...

When exporting I used the following settings ticked in the Blender 2.67 x file exporter... (the 2.67 is important as it has a new script for exporting x files)
Export Meshes
Export Normals
Export UV Coordinates
Export Materials

Everything else is unticked - but you must make sure your Blender scene does not contain anything else such as cameras or lights. Also, I set the mesh object's shading property (Object tools) to flat rather than smooth. That gives us the same number of normals as faces in our mesh. I then wrote a function to smooth the normals once the mesh was loaded.

I skipped the FrameTransformMatrix and started reading data at "576;". This is the number of vertices in the mesh. Each vertex's coordinates are then listed in batches of three. Now, here's the important bit... We are changing from Blender's coordinate system to DirectX so you need to read the three numbers in x, z, y order. So vertex 0 will be at:

x = 0.195090, y = -1.000000, z = 0.980785

I read these straight into my vertex data one vertex at a time.
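Reading a vertex line with the swap can be sketched like this (a minimal sketch; parseVertexLine and Vec3 are hypothetical names of my own, not D3DX types):

```cpp
#include <cassert>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Parse one vertex line such as " 0.195090; 0.980785;-1.000000;,".
// The second and third values are swapped so Blender's Z-up coordinates
// become DirectX's Y-up.
bool parseVertexLine(const char* line, Vec3& out)
{
    float a, b, c;
    if (std::sscanf(line, "%f;%f;%f", &a, &b, &c) != 3)
        return false;
    out.x = a;  // first value is x in both systems
    out.z = b;  // second value (Blender's y) becomes DirectX's z
    out.y = c;  // third value (Blender's z) becomes DirectX's y
    return true;
}
```

Feeding it the first vertex line from the file above gives x = 0.195090, y = -1.000000, z = 0.980785, matching the example.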

At the end of the vertex data the x file lists the number of faces in the mesh...
.
.
.
-1.000000;-3.000000; 1.000000;,
0.000000;-2.000000; 1.500000;;
192;
3;0,1,2,,
3;7,6,5,,
4;11,10,9,12,,
4;15,14,13,17,
.
.
.

In this case 192 faces. The following list becomes your index buffer, but you can't read them straight in like with the vertex buffer. This is because Blender stores faces made from 3 or 4 vertices. The first number in each line tells you how many vertices make up the next face.

This means the first face of my mesh has three vertices (vertex 0, vertex 1 & vertex 2)
The second face also has three vertices (vertex 7, vertex 6 & vertex 5)
The third face has four vertices and needs splitting into two triangles.
Happily this is easy enough...
The first one is made of vertex 11, vertex 10 and vertex 9.
The second is made of vertex 9, vertex 12 and vertex 11.
And so on... So my index data looks like: 0, 1, 2, 7, 6, 5, 11, 10, 9, 9, 12, 11, 15, 14, 13, 13, 17, 15, ...
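The quad-splitting above can be sketched as follows (buildIndexBuffer is a hypothetical helper name of my own):

```cpp
#include <cassert>
#include <vector>

// Expand Blender's mixed triangle/quad face list into a triangle-only index
// buffer. Each inner vector is one face exactly as read from the x file; a
// quad (v0, v1, v2, v3) becomes triangles (v0, v1, v2) and (v2, v3, v0).
std::vector<unsigned> buildIndexBuffer(const std::vector<std::vector<unsigned>>& faces)
{
    std::vector<unsigned> indices;
    for (const auto& f : faces)
    {
        indices.push_back(f[0]);
        indices.push_back(f[1]);
        indices.push_back(f[2]);
        if (f.size() == 4)  // quad: add the second triangle
        {
            indices.push_back(f[2]);
            indices.push_back(f[3]);
            indices.push_back(f[0]);
        }
    }
    return indices;
}
```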

The number of mesh normals should match the number of faces, 192 in this case. If you didn't set the Object shading property in Blender to flat these numbers won't match. I read these into a temporary std::vector<D3DXVECTOR3>, again in x, z, y order.

This means the normal for our first 3 vertices in our vertex buffer is the D3DXVECTOR3 stored in myvector[0],
the normal for our next 3 vertices is the D3DXVECTOR3 stored in myvector[1],
the normal for our next 4 vertices is the D3DXVECTOR3 stored in myvector[2],
the normal for our next 4 vertices is the D3DXVECTOR3 stored in myvector[3],
... and so on.

Once we have these copied to our vertex data we can delete our temporary std::vector. We won't need it again.
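The copying step above can be sketched like this (a minimal sketch; expandNormals and Vec3 are hypothetical names of my own):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Copy each per-face normal onto every vertex of its face. faceSizes holds
// the vertex count (3 or 4) of each face in file order; faceNormals is the
// temporary per-face list described above.
std::vector<Vec3> expandNormals(const std::vector<int>& faceSizes,
                                const std::vector<Vec3>& faceNormals)
{
    std::vector<Vec3> vertexNormals;
    for (std::size_t i = 0; i < faceSizes.size(); ++i)
        for (int v = 0; v < faceSizes[i]; ++v)
            vertexNormals.push_back(faceNormals[i]);
    return vertexNormals;
}
```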

I ignored all of these matrices but used this structure to load the names of each bone (Armature_Bone, Armature_Bone_001 & Armature_Bone_002) and work out the hierarchy.
Armature_Bone is our root bone.
In this case Armature_Bone_001 & Armature_Bone_002 are both children of the root bone.
If Armature_Bone_002 was listed before the "// End of Armature_Bone_001" then it would have been a child of Armature_Bone_001.
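One way to track that nesting while reading the file is a simple stack: when a bone's frame opens, its parent is whichever bone is currently open above it, and the "// End of ..." comment (or closing brace) closes it again. A minimal sketch, with HierarchyBuilder and its members being hypothetical names of my own:

```cpp
#include <cassert>
#include <string>
#include <vector>

struct Bone { std::string name; int parent; };  // parent == -1 for the root

struct HierarchyBuilder
{
    std::vector<Bone> bones;
    std::vector<int> openStack;  // indices of bones whose frames are still open

    void open(const std::string& name)
    {
        int parent = openStack.empty() ? -1 : openStack.back();
        bones.push_back({name, parent});
        openStack.push_back(static_cast<int>(bones.size()) - 1);
    }
    void close() { openStack.pop_back(); }
};
```

With the layout described above the call order is open("Armature_Bone"), open("Armature_Bone_001"), close(), open("Armature_Bone_002"), close(), close(), which leaves both child bones with the root as their parent.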

Further down the skin weights are listed like so:
.
.
.
SkinWeights {
"Armature_Bone";
192;
0,
1,
2,
.
.
.
This indicates that Armature_Bone influences 192 vertices, and then goes on to list which ones.
At the end of this list is another list of float values
.
.
.
190,
191;
1.000000,
0.032200,
0.480000,
1.000000,
1.000000,
.
.
.
...which tell us how much Armature_Bone influences each of the listed vertices (the skin weights).
So Armature_Bone has an influence of 1.0 over vertex[0] but only 0.0322 over vertex[1] etc.
Once we have this list for each bone we need to loop through them all to work out which are the four most influential bones for each vertex (and to store those bone IDs and skin weights in each vertex's data).
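Picking the four most influential bones for one vertex can be sketched like this (topFourInfluences and Influence are hypothetical names of my own; the renormalisation keeps the kept weights summing to 1):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

struct Influence { int boneId; float weight; };

// Keep only the four largest (boneId, weight) pairs for a vertex and
// renormalise so the remaining weights still sum to 1.
std::vector<Influence> topFourInfluences(std::vector<Influence> in)
{
    std::sort(in.begin(), in.end(),
              [](const Influence& a, const Influence& b)
              { return a.weight > b.weight; });
    if (in.size() > 4)
        in.resize(4);
    float sum = 0.0f;
    for (const Influence& i : in)
        sum += i.weight;
    if (sum > 0.0f)
        for (Influence& i : in)
            i.weight /= sum;
    return in;
}
```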

"AnimationSet Global {" tells us we are looking at the start of animation data, and the following animation data is for {Armature}. We can ignore this and skip forward to the animation data for our bones, so we search on for...
.
.
.
49;3; 0.000000, 0.000000, 0.407193;;;
}
}
Animation {
{Armature_Bone}
AnimationKey { // Rotation
0;
50;
0;4;-0.668282,-0.331680,-0.466460,-0.475187;;,
1;4;-0.668557,-0.331353,-0.466652,-0.474719;;,
2;4;-0.669386,-0.330367,-0.467231,-0.473306;;,
.
.
.
This is a list of key frames for the rotation of Armature_Bone (our root bone).
The "0;" tells us that the key frames are for rotation (as opposed to position or scale).
There are "50;" key frames. Each key frame is a time position (the frame number in Blender) and then "4;" numbers representing the quaternion of the rotation.
Again, we are converting Blender's coordinate system to DirectX so we read each quaternion in w, x, z, y order. So the first quaternion we store for Armature_Bone above would be w = -0.668282, x = -0.331680, y = -0.475187, z = -0.466460, and so on for the rest of the key frames.
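Parsing one of those rotation key lines can be sketched like this (parseRotationKey and Quat are hypothetical names of my own, not D3DX types):

```cpp
#include <cassert>
#include <cstdio>

struct Quat { float x, y, z, w; };

// Parse one rotation key such as
//   "0;4;-0.668282,-0.331680,-0.466460,-0.475187;;,"
// reading the four values in w, x, z, y order to convert from Blender's
// Z-up system to DirectX's Y-up system.
bool parseRotationKey(const char* line, int& frame, Quat& q)
{
    float a, b, c, d;
    int count;
    if (std::sscanf(line, "%d;%d;%f,%f,%f,%f",
                    &frame, &count, &a, &b, &c, &d) != 6 || count != 4)
        return false;
    q.w = a;  // first value is w
    q.x = b;  // second value is x
    q.z = c;  // third value becomes z
    q.y = d;  // fourth value becomes y
    return true;
}
```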

Scale is read in the same way, remembering to swap the order of y and z.
.
.
.
AnimationKey { // Scale
1;
50;
0;3; 1.000000, 0.100000, 1.000000;;,
1;3; 1.000000, 0.100300, 1.000000;;,
.
.
.
The "1;" tells us that these key frames are for scaling. There are "50;" key frames with a time position and "3;" numbers to represent the scaling in x, z, y order.

All this key frame data then needs sorting so that each key frame consists of a single time position, a rotation quaternion, a position and a scale.
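Assuming the exporter writes the same time positions for rotation, scale and position (true for my files), that sorting step is just zipping the three lists together. A minimal sketch; Keyframe, mergeKeys and the small structs are hypothetical names of my own:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };

// One combined keyframe per time position.
struct Keyframe { int time; Quat rotation; Vec3 scale; Vec3 position; };

// Zip the three per-channel key lists (already in time order and of equal
// length, 50 each in this case) into one list of combined keyframes.
std::vector<Keyframe> mergeKeys(const std::vector<int>& times,
                                const std::vector<Quat>& rotations,
                                const std::vector<Vec3>& scales,
                                const std::vector<Vec3>& positions)
{
    std::vector<Keyframe> keys;
    for (std::size_t i = 0; i < times.size(); ++i)
        keys.push_back({times[i], rotations[i], scales[i], positions[i]});
    return keys;
}
```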

Each bone animation contains all the keyframes (50 in this case) for one bone.

Therefore we should have the same number of bone animations as bones in our mesh.

This has become quite a bit longer than expected, but hopefully it might save someone else struggling along with the wrong matrices for months!

I'm reluctant to post my source files as my work-around solutions mean the code will probably not work with all x files, possibly from earlier versions of Blender or other modelling software. I'd rather have it more finished and stable before posting. Sorry, but I hope this post is of some use!