No more need to keep models within some 256-unit cube à la 3ds2unr, or to worry excessively about maximizing scale to avoid Unreal's vertex rounding behavior. UnrealFBX handles this for you.
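For context on where that 256-cube figure comes from: Unreal's classic vertex mesh format packs each animated vertex into a single 32-bit word, with X and Y stored as 11-bit signed fixed-point values (3 fraction bits) and Z as a 10-bit signed value (2 fraction bits). Here's a rough Python sketch of that quantization, based on the commonly documented _a.3d layout and not on UnrealFBX's actual code:

```python
def pack_vertex(x, y, z):
    """Quantize one vertex the way Unreal's _a.3d format stores it:
    X and Y as 11-bit signed fixed-point (3 fraction bits),
    Z as 10-bit signed fixed-point (2 fraction bits)."""
    xi = max(-1024, min(1023, round(x * 8)))
    yi = max(-1024, min(1023, round(y * 8)))
    zi = max(-512, min(511, round(z * 4)))
    return (xi & 0x7FF) | ((yi & 0x7FF) << 11) | ((zi & 0x3FF) << 22)

def unpack_vertex(p):
    def sext(v, bits):  # sign-extend a bitfield
        return v - (1 << bits) if v & (1 << (bits - 1)) else v
    return (sext(p & 0x7FF, 11) / 8.0,
            sext((p >> 11) & 0x7FF, 11) / 8.0,
            sext((p >> 22) & 0x3FF, 10) / 4.0)
```

Round-tripping a vertex through pack/unpack shows the precision you actually get: X and Y land on a 1/8-unit grid within roughly ±128 units, Z on a 1/4-unit grid, which is why maximizing scale used to matter so much.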

Special triangle modes (for rendering, among other things) can be hinted to the converter by embedding certain substrings in triangles' material names. The presence of these substrings will produce the desired result once in engine. The substrings are as follows:

Does not handle multiple meshes in an FBX file's scene properly. For now, artists should use a single mesh per .fbx, combining multiple meshes in their tool(s) of choice before exporting.

"Still" animation sequence may not represent the actual scale and orientation of the mesh once it's in motion. Given this is an issue for your mesh, suggest using a real animation sequence to preview in the level.

Some sequences don't appear to animate all of their mesh's vertices properly (e.g. the animations in the "Zombie_Anim.fbx" example file). The reason for this is currently unknown.

How anim sequence timing relates to Unreal's animation system is an unknown.

FBX mesh orientation data is not being used, so meshes are likely to come in strangely rotated unless you modify the '#exec' directives in the .uc file.

Exceeding Unreal's implicit limits isn't checked for, so issues will arise when those limits are violated.

Unreal Limitations - Things You Should Know

All of these are unavoidable:

Unreal vertices will always 'snap' around a little during animation. UnrealFBX strives to greatly mitigate this, keeping this side-effect as subtle as possible.

Unreal UVs are rounded to 256 possible positions (0-255), so they may look incorrect in the engine. Artists are advised to snap to a 255x255 (I need to verify this) grid when creating their UV maps before texturing. This should ensure Unreal's rounding doesn't noticeably affect your UV maps after conversion.
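Concretely, the rounding amounts to something like this. This is a sketch only; whether the right divisor is 255 or 256 is exactly the detail that still needs verifying, and this assumes 255:

```python
def quantize_uv(u, v):
    # Unreal's mesh format stores each UV coordinate as a single byte,
    # so only 256 distinct positions (0-255) exist per axis.
    # Inputs are floats in [0, 1]; out-of-range values are clamped.
    to_byte = lambda t: max(0, min(255, round(t * 255)))
    return to_byte(u), to_byte(v)
```

If your UVs already sit on that grid before texturing, this rounding becomes a no-op and nothing shifts after conversion.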

Unreal's data formats impose implicit limits on the data you can convert. While the converter doesn't check or warn about these (yet), they are there. Examples include triangle/vertex/frame counts, among other things.

For those who want to try this out, please feel free to ask questions, and throw any bugs or feature requests here as well. I'd like to get this even cleaner and usable in future, so any helpful feedback is appreciated.

UBerserker wrote:So I'm really dumb at this stuff but this is mostly suited for bipedal pawns and the likes? Man how I still wish we had Drakk in U1/UT.

Yesterday, I tried converting a weapon model I made for Proph. Without further testing, my understanding is that if you animate your meshes in a straightforward manner, you can animate and convert just about anything without too much difficulty. You just have to animate with skeletons/bones in your 3D tool, which might be considerably easier than morphing vertices by hand? At least that's my current thinking.

Blender, and I assume other major 3D software, has constraint systems that let you control certain bones with other bones, do weird shit, etc. I did weird shit in the aforementioned model's animations, and the exported model only had issues where I did very strange things. Had I done it in a more straightforward way, I wouldn't have had any major problems.

But this is what testing is for. I may demo a few tests of my own here in the future, since Blender, my preferred modeling software, is one of the more feature-rich applications out there, and it has had consistent problems working with Unreal's vertex mesh format.

So this is a modern replacement for 3ds2unr with no need for awkward keyframe duplicates? That's pretty massive news! Modeling and animation were always the biggest hurdles for truly groundbreaking Unreal SP projects, and this sounds like a wonderful remedy to make artists' lives easier.

This comes right in time for me, as I've actually been actively playing with Blender this last month-or-so.

Do you have any Blender-specific steps or recommendations on what to avoid, what scale to keep the model in etc.? You said there should be only 1 mesh, but the Armature is still a separate object, right?

Also, have you tested this with other tools that export to FBX, most prominently 3DSMax/Maya?

This sounds awesome.

However, I cannot help but wonder why vertex animation was the first thing you supported? I didn't try it yet, but Unreal supports skeletal animation, and skeletal meshes do not seem to suffer from the same hardcoded limitations that vertex ones do.

Skeletal meshes do not seem to have that boundary and snapping limit, since everything about them (including their UVs) seems to be declared as floats (at least in the native headers). So all the problems you have with vertex meshes seem to be non-existent in skeletal ones, at least theoretically. Converting from FBX to skeletal meshes also sounds like it would actually be easier, as it would be closer to a 1:1 conversion than converting to vertex meshes.

Don't get me wrong, vertex animation support sounds awesome and is still useful for a few cases, but skeletal animation seems to be more important and easier to support.

ividyon wrote:Do you have any Blender-specific steps or recommendations on what to avoid, what scale to keep the model in etc.? You said there should be only 1 mesh, but the Armature is still a separate object, right?

Also, have you tested this with other tools that export to FBX, most prominently 3DSMax/Maya?

These may have exceptions, but they're good guidelines:

The model and armature are the two objects you need. In Blender, I tend to select my model object and armature, and export with the "Selected Objects" flag.

Blender's .fbx exporter has a scale multiplier. Since Blender units are the same as Unreal units, you can export your models "to-scale". So for example, if your model is roughly within a 4x2x3 space in Blender, you can multiply by 64 to get a 256x128x192 model with no issues.
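If you'd rather compute that multiplier than eyeball it, here's a tiny helper. It is hypothetical, not part of UnrealFBX, and the 254-unit target is my own safety margin under the roughly 256-unit cube mentioned earlier:

```python
def export_scale(bbox_size, target_max=254.0):
    """Pick a Blender-to-FBX scale multiplier that makes the model as
    large as possible while keeping its longest extent under
    `target_max` Unreal units (the vertex format effectively caps
    meshes near a 256-unit cube). `bbox_size` is the (x, y, z)
    extent of the model in Blender units."""
    return target_max / max(bbox_size)
```

For the 4x2x3 example above this gives 63.5, right around the suggested multiplier of 64.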

I'd suggest that the model and armature objects in any software have their local origins at the global scene origin. You can see Blender object origins as little dots, usually near the objects themselves. In Blender, Ctrl-A brings up the transform "Apply" menu, so you can apply location, rotation, scale, etc., which I suggest you do for both your model and armature.

You'll also want to think about where the "net-center" of your model lies across every frame of every animation combined, since keeping that center at the origin will maximize the precision you have on each axis.
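If you want to compute that net-center rather than guess it, here's a rough sketch (a hypothetical helper, not UnrealFBX code): it pools every vertex of every frame, finds the combined bounding box, and re-centers the whole animation on its midpoint.

```python
def net_center(frames):
    """Combined bounding-box center of a mesh across every frame of
    every animation. `frames` is a list of frames, each a list of
    (x, y, z) vertex tuples."""
    pts = [p for frame in frames for p in frame]
    lo = [min(p[i] for p in pts) for i in range(3)]
    hi = [max(p[i] for p in pts) for i in range(3)]
    return tuple((a + b) / 2.0 for a, b in zip(lo, hi))

def recenter(frames):
    """Shift all frames so the net-center sits at the origin,
    maximizing usable precision on each axis."""
    cx, cy, cz = net_center(frames)
    return [[(x - cx, y - cy, z - cz) for x, y, z in frame]
            for frame in frames]
```

The same idea works on whatever per-frame vertex data your exporter hands you; only the bookkeeping differs.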

And no, I haven't tried this with other tools yet. I'm sure there are many ways for UnrealFBX to explode, but my hope is that most major tools won't have much trouble. I'll test at some point, and anyone else who wants to is invited to report bizarre results here.

Feralidragon wrote:This sounds awesome.

However, I cannot help but wonder why vertex animation was the first thing you supported? I didn't try it yet, but Unreal supports skeletal animation, and skeletal meshes do not seem to suffer from the same hardcoded limitations that vertex ones do.

Skeletal meshes do not seem to have that boundary and snapping limit, since everything about them (including their UVs) seems to be declared as floats (at least in the native headers). So all the problems you have with vertex meshes seem to be non-existent in skeletal ones, at least theoretically. Converting from FBX to skeletal meshes also sounds like it would actually be easier, as it would be closer to a 1:1 conversion than converting to vertex meshes.

Don't get me wrong, vertex animation support sounds awesome and is still useful for a few cases, but skeletal animation seems to be more important and easier to support.

No, that's a valid point. I had a neurotic vendetta against the vertex animated format, so it was at the top of my hit-list and I did that first. Also, IIRC bone scaling is ignored in Unreal's skeletal animations, and with vertex animation this is a non-issue. The intent is to eventually support both.

Skeletal animation shouldn't be difficult; I just need to familiarize myself with Unreal's skeletal mesh file formats. That might be a good target for the next build, along with some command line options so you can choose vertex animation, skeletal, or both, plus other options.

To further explain the choice of vertex animation: in my last post I mentioned a fully skinned and animated model I did for a Red Nemesis project several years ago. Compatibility with vanilla Unreal 225 and UT99 was imperative. Because I had so many issues for so long, someone with 3DS Max and ActorX finally did the conversion for me, and the experience left me relieved but frustrated.

Last edited by Jet v4.3.5 on 09 Jun 2018, 17:37, edited 1 time in total.

That makes sense. For all intents and purposes, vertex meshes are indeed the ones guaranteed to work everywhere, even in the oldest versions of both Unreal and UT99.

Yeah, I recall bone scaling not being supported (although, to be honest, I had forgotten until you pointed it out just now), which is indeed a limitation of skeletal meshes to keep in mind.

Either way, I am looking forward to further development on this, especially since I intend to use it myself eventually, once I get back to creating more content I have planned for the game(s). This game has really been in dire need of something like this, so we can reliably use Blender without worrying about how to convert later. So, thank you.

Thanks for this tool. Coincidentally, I was doing some research of my own into converting Blender's FBX output to standard .T3D for Unreal. This will hopefully make Blender an all-around alternative for creating meshes and brushwork without UED's grid constraints.

I decided to work on some command line features before getting to skeletal mesh conversion, and wrote a few wiki pages on the command line and the "preference system". You can get to the wiki from here and from the first post.

It's a work in progress, but I invite feedback, especially on the preference files and on how convenient people think they'll be.