Actually, the vbftool1.1 by tek547 works just fine. It's slow (unpacking and repacking on a single HDD takes around 2.5 hours; it's much faster with two separate drives or an SSD), but it's fully functional for now. I'm currently working on my own tool that can reinsert files much faster. I'll update you once it's ready.

I was also looking for a different solution; though the old tool works, it takes quite some time. I'd love to see what you come up with, ffgriever. If you post an update, will it be on these forums or elsewhere?

I've looked into it. Check this Steam thread to see what we know right now: https://steamcommunity.com/linkfilter/? ... 21&t=17648 - a License Board editor mod has been made, and I'm researching the character model files to see if I can play as unplayable characters. Join the modding discussion on that page. Also, has anyone opened any of the game's character Phyre files? I don't know how to do it.

As promised, a tool to quickly insert changed files into FF12 vbf archive. Please note it's still WORK IN PROGRESS. Keep a copy of your original file, just in case.

So what's the problem? Repacking a 30 GB archive can take up to two hours (e.g. with both the unpacked directory and the vbf on the same HDD).

With this tool it takes a few seconds at most (though that depends on the number of files and their size).

The general idea of vbf:
1. All files are compressed (with uncompressed chunks wherever compression wasn't viable).
2. Static sections are written one after another (header, MD5s, file data, names, block data, compressed files).
3. If a file is modified and recompressed, a few issues are possible:
a) the modified file is the same size uncompressed, but compresses to a larger size;
b) the modified file is larger uncompressed than the original.

In case a) there is a problem, because the file doesn't fit into the space available. My solution is to move the file to the end of the VBF archive and leave the original space unused.
In case b) there is a problem, because the file has more blocks (64 kB each), so the block data is larger and no longer fits into the space available. My solution is to move the first few files in the archive to the end of the VBF to make room for all the additional header data (trying to free at least 10 MB, though whole files always have to be moved).
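The two cases above boil down to a simple placement decision per file. Here's a minimal sketch in Python; the field names (orig_offset, orig_space, blocks) and the return convention are my own invention, not the actual tool's code:

```python
# Hypothetical sketch of the relocation decision described above.
CHUNK = 64 * 1024  # vbf stores file data in 64 kB blocks

def plan_insert(entry, new_comp_size, new_raw_size, eof_offset):
    """Return (write_offset, new_eof) for a modified file.

    entry: dict with the ORIGINAL file's orig_offset, orig_space
    (compressed bytes reserved) and blocks (64 kB block count).
    """
    new_blocks = (new_raw_size + CHUNK - 1) // CHUNK
    if new_comp_size <= entry["orig_space"] and new_blocks <= entry["blocks"]:
        # Fits into the space the original occupied -> overwrite in place.
        return entry["orig_offset"], eof_offset
    # Case a)/b): doesn't fit -> append at the end, leave old space unused.
    return eof_offset, eof_offset + new_comp_size
```

In the real archive, case b) additionally forces the first few files to be relocated so the grown block table has room; the sketch only covers where the modified file's data goes.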

Now, the file data contains sizes and offsets, but once a modified file compresses to a smaller size, we no longer know what the original available space was (you can't just subtract the offsets of two consecutive files, as they don't have to be placed one after another). Say that after further modifications the file is slightly larger, but would still fit within the original boundaries. Since we don't know what those boundaries were, the file would be relocated to the end anyway.

To solve that, I've added an additional file to the VBF archive that keeps track of the original offsets and sizes, so this issue won't happen (relocated files also get some extra padding to further alleviate it). If the VBF doesn't contain this "vbf_extra.bin" file, it will be created automatically. If it exists, it will be read and used during reinsertions. This extra file is stored uncompressed to make it easier to read/modify.
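The post doesn't specify the actual layout of "vbf_extra.bin", but an uncompressed table of (file index, original offset, original space) records is all the idea needs. A sketch with a made-up record format:

```python
# Illustrative record format for a vbf_extra.bin-style tracking file;
# the real file's layout may differ. Stored uncompressed, as described.
import struct

RECORD = struct.Struct("<IQQ")  # file index, original offset, original space

def write_extra(path, entries):
    """entries: {file_index: (orig_offset, orig_space)}"""
    with open(path, "wb") as f:
        for idx, (off, space) in sorted(entries.items()):
            f.write(RECORD.pack(idx, off, space))

def read_extra(path):
    entries = {}
    with open(path, "rb") as f:
        while chunk := f.read(RECORD.size):
            idx, off, space = RECORD.unpack(chunk)
            entries[idx] = (off, space)
    return entries
```

On reinsertion, the tool can look up a file's original boundaries here instead of guessing from neighbouring offsets.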

Basically, only the headers are recalculated each time. Only files present in the patch directory are compressed and inserted.

As you can see, it lacks a "creation" mode (full repack). I will add that later for completeness' sake, but I'm actually trying to avoid the hassle of rebuilding the whole archive - that's the whole point of the tool: to save time.

A quick find I made while working on the Polish translation. The files are properly inserted into the VBF, but if a file's size is greater, you also have to modify the appropriate entry in FileSizeTable_US.fst (or any other of these, if necessary). Otherwise, depending on the situation, the excess memory used might be overwritten by another file, or the game will stop loading the file at a block boundary. It's actually very surprising that so many modifications DO work without changing sizes (pure luck, I guess).
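Patching one such entry is a tiny in-place write, assuming the layout described later in this thread (fixed-size 20000-byte sections of 32-bit little-endian values, indexed by file number). The section and file numbers below are purely illustrative:

```python
# Sketch of updating one size entry in a FileSizeTable_US.fst-style file.
# Layout assumption: sections of 5000 x 32-bit little-endian entries.
import struct

SECTION_BYTES = 20000  # 5000 entries * 4 bytes

def set_size(fst_path, section, file_no, new_size):
    """Overwrite the 32-bit size entry for (section, file_no) in place."""
    offset = section * SECTION_BYTES + file_no * 4
    with open(fst_path, "r+b") as f:
        f.seek(offset)
        f.write(struct.pack("<i", new_size))
```

The new size should be the file's uncompressed size after modification; which (section, file) pair a given game file maps to has to be worked out per file.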

ffgriever wrote:A quick find I made while working on the Polish translation. The files are properly inserted into the VBF, but if a file's size is greater, you also have to modify the appropriate entry in FileSizeTable_US.fst (or any other of these, if necessary). Otherwise, depending on the situation, the excess memory used might be overwritten by another file, or the game will stop loading the file at a block boundary. It's actually very surprising that so many modifications DO work without changing sizes (pure luck, I guess).

I see you've already coped with "FileSizeTable_US.fst". Just for future reference, then.

The file consists of sections. Each section is 20000 bytes long and contains 32-bit elements. In theory each section holds 5000 entries, but in practice most wrappers (there is a bunch of them) of the "getSize(file, section)" function will return -1 for file numbers greater than 4096.

A value of -1 for an element means the file is not present/not used (if needed, the game will try to read it from other languages that are considered higher level: e.g. if a file is not present in the "es" Spanish dir, it will search the "us" dir, then either "in" or "jp", etc.).
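A lookup following the description above could look like this; the function and variable names are mine, not the game's, and the 4096 cutoff mimics the wrappers' behaviour:

```python
# Sketch of a getSize-style lookup over the fst data described above.
import struct

SECTION_BYTES = 20000  # 5000 x 32-bit entries per section
MAX_FILE = 4096        # most wrappers reject higher file numbers

def get_size(fst_data, section, file_no):
    """Return the stored size, or -1 for missing/out-of-range files."""
    if file_no > MAX_FILE:
        return -1
    offset = section * SECTION_BYTES + file_no * 4
    (size,) = struct.unpack_from("<i", fst_data, offset)
    return size  # -1 means not present; the game then falls back
                 # to a higher-level language directory
```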

Most files are in pretty much the same order they were on the PS2. Files within one directory are (most of the time) close together, but folders that are close together often aren't close in the table. Here is an example (file;offset):

Still, it's consistent with the PS2 order (on the PS2 there were no files, there wasn't even a munge/bigfile - just raw sectors on the DVD and a raw sector table that links file numbers to sector locations and sizes).

AFAICT, many file size requests are hardcoded (with both the file number and section number directly in code), but some are read from other tables.

I've attached a list of all the files I had to change for the Polish translation of the game (I'm pretty sure everything is translated; I omitted offsets for bitmaps and other gfx files, as their size doesn't change unless you change the resolution, which I didn't do).


@ffgriever, shift all the data +10 MB and rewrite the entire TOC. That makes space for more entries in the compressed-chunks table. Then you can inject any file at the end regardless of whether it's larger than the original. It only takes a one-time rewrite of the archive; after that, all injects are instant and you don't have to worry about the file size or whether it has more 64 kB chunks than the original.

EcheloCross wrote:@ffgriever, shift all the data +10 MB and rewrite the entire TOC. That makes space for more entries in the compressed-chunks table. Then you can inject any file at the end regardless of whether it's larger than the original. It only takes a one-time rewrite of the archive; after that, all injects are instant and you don't have to worry about the file size or whether it has more 64 kB chunks than the original.

Shifting all the data by 10 MB would mean shifting 30 GB+ of data... That could take over 10 minutes on a typical HDD, so it's a big no.

My tool still does something similar. It just moves the first few files (relocates them to the end) to free at least a given amount of space at the beginning, with no need to rewrite the entire archive. On an unmodified vbf it will move a few small files and one movie. Further relocations are done only for the particular files that won't fit into their original space. The tables are rebuilt each time anyway, and there is plenty of space for the TOC.

In my case (over 1000 files, most of which are larger than the originals, so they have to be relocated), the difference between the first run (including freeing up space for the new TOC and moving over 100 MB of data, since you cannot move files partially and the movie near the beginning is over 100 MB) and each consecutive run on an HDD is 10 vs 7 seconds. On an SSD it's CPU-bound and pretty much the same in both cases on a solid rig.

The whole point was to not move the entire 30 GB file.

As for tracking space usage: if you just keep adding everything at the end, you'll soon end up with the 30 GB file growing to 100 GB. In my case it's ~400 MB per try/test for all the files, and I've probably done that over a hundred times while modifying and testing. Unless you restore a 30 GB backup each time... which takes A LOT of time.