I want to write my own pak file creator. I've searched the internet for virtual file systems and found TrueZIP. While TrueZIP would fulfil my needs, it has one big drawback: it only stores zip files up to 4 GB, whereas my files tend to be a factor of 10 larger. Since these files are stored in a layout like /textures/..., /pointclouds/..., etc., I'm wondering how difficult it would be to wrap them in a nice pak file and browse inside that file as if it were a normal file system.

Besides polygons, models and textures, I'm also storing point clouds, where a single point of such a cloud consists of (x, y, z), (r, g, b), (u, v) and (normal.x, normal.y, normal.z). Since this data comes from a laser scanner, it amounts to billions of points (which are then divided into an octree), so 40 GB is not that much.

Wouldn't it be better to split the data? I assume you don't need all of it at once or all the time. Also keep in mind that not every file system supports files that big: FAT32, for example, caps files at 4 GB, while NTFS and modern Linux/Unix file systems handle far larger ones. And compressing your data, if not already done, will decrease the overall size.

But on topic: if you're not happy with TrueZIP or any other zip format, use your own. You could use the GZIP/Deflater output and input streams of the java.util.zip package to compress individual files and store them in a file format of your choice. Just build a file table.
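A minimal sketch of that idea: each entry is GZIP-compressed and written sequentially, then a file table (name, offset, compressed length) is appended, followed by an 8-byte trailer pointing at the table. The class name and on-disk layout here are my own invention, not any established pak format:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class SimplePak {
    // name -> {offset, compressed length}
    private final Map<String, long[]> table = new LinkedHashMap<>();

    // Write all entries, then the file table, then an 8-byte pointer to the table.
    public void write(File pak, Map<String, byte[]> entries) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(pak, "rw")) {
            raf.setLength(0);
            for (Map.Entry<String, byte[]> e : entries.entrySet()) {
                long offset = raf.getFilePointer();
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
                    gz.write(e.getValue());
                }
                byte[] compressed = buf.toByteArray();
                raf.write(compressed);
                table.put(e.getKey(), new long[] { offset, compressed.length });
            }
            long tableOffset = raf.getFilePointer();
            raf.writeInt(table.size());
            for (Map.Entry<String, long[]> t : table.entrySet()) {
                byte[] name = t.getKey().getBytes(StandardCharsets.UTF_8);
                raf.writeShort(name.length);
                raf.write(name);
                raf.writeLong(t.getValue()[0]); // offset
                raf.writeLong(t.getValue()[1]); // compressed length
            }
            raf.writeLong(tableOffset); // trailer: where the table starts
        }
    }

    // Seek to the table via the trailer, find the entry, decompress just that one.
    public static byte[] read(File pak, String name) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(pak, "r")) {
            raf.seek(raf.length() - 8);
            long tableOffset = raf.readLong();
            raf.seek(tableOffset);
            int count = raf.readInt();
            for (int i = 0; i < count; i++) {
                byte[] nameBytes = new byte[raf.readUnsignedShort()];
                raf.readFully(nameBytes);
                long offset = raf.readLong();
                long length = raf.readLong();
                if (new String(nameBytes, StandardCharsets.UTF_8).equals(name)) {
                    raf.seek(offset);
                    byte[] compressed = new byte[(int) length];
                    raf.readFully(compressed);
                    try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
                        return gz.readAllBytes();
                    }
                }
            }
            throw new FileNotFoundException(name + " not in pak");
        }
    }
}
```

Since offsets and lengths are stored as longs, nothing in this layout caps the archive at 4 GB; only individual entries must fit in memory while (de)compressing, which you could avoid by streaming instead of buffering.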

Besides polygons, models and textures, I'm also storing point clouds, where a single point of such a cloud consists of (x, y, z), (r, g, b), (u, v) and (normal.x, normal.y, normal.z). Since this data comes from a laser scanner, it amounts to billions of points (which are then divided into an octree), so 40 GB is not that much.

Perhaps it would make sense in this case to use a database instead of a file-structure?

How should I achieve this? Right now the algorithm to load files goes like this: determine visible nodes -> load files -> memory-map them -> render. But I have no idea how to do that in, say, SQL or with other databases.
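For comparison, the file-based step of that pipeline can be sketched with java.nio: mapping a node file lets the OS page point data in on demand, so only the regions actually read for rendering touch RAM. The class name, the node path, and the little-endian assumption are mine, not from any existing engine:

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class NodeLoader {
    // Memory-map an octree node file read-only. The mapping stays valid
    // after the channel is closed; the OS pages data in lazily as the
    // renderer reads from the buffer.
    public static MappedByteBuffer mapNode(String path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r");
             FileChannel ch = raf.getChannel()) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            buf.order(ByteOrder.LITTLE_ENDIAN); // assumption: scanner writes little-endian
            return buf;
        }
    }
}
```

If you moved to a database instead, the "load files" step would become a query for the node's blob, something like `SELECT data FROM nodes WHERE id = ?` (schema hypothetical, keyed by octree cell id), and the memory-mapping step would disappear, since the driver hands you a byte array; whether that beats the OS page cache for billions of points is something you'd have to measure.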
