Greetings!
I am writing some converters for the Alias|Wavefront Maya .OBJ file format. The problem is that I get the geometry as a flat vertex list: for every face I get 3 vertices, not vertex indices. A small example of two faces:
3.2 5.2 6.1
2.1 2.3 4.5
6.7 8.2 1.2
3.2 5.2 6.2
3.4 5.6 8.7
6.7 8.2 1.2
That would be 2 triangles; the vertices are passed as float x, y, z.
I load those vertices into my:
struct vertex_t { float x, y, z; };
and push every vertex into a vector like this:
std::vector<vertex_t*> vertices;
vertices.push_back(new vertex_t{x, y, z});
In such a vector some vertices repeat themselves. Is there an automated way to generate a list of only unique vertices, plus index-based face information? I see myself writing this kind of parser every month. Is there a more elegant way? I don't want to sort that vertex container over and over again myself.

Insert it all into a std::map<Vertex, unsigned short> object, with the unsigned short being the index of the vertex.

So, load a vertex from the file, then check whether it already exists in the map. If it doesn't, insert it with an index equal to the map's current size(). If it does exist, don't insert it again; just use map[vertex] as the index for that corner of the triangle.

Ow, ow, ow. Please don't store them by pointer in your vector; there's absolutely no reason to. It's already a small type with value semantics and you don't need polymorphic behavior. Storing on the heap will increase your memory use by at least a quarter ( possibly more, depending on padding and your heap's block size ), make your program slower ( since heap allocation isn't cheap and you'll pay an extra indirection penalty ), and make your program harder to code ( because you'll have to remember to delete all those structs at some point ).

I second Nitage's code, as modified by me22 to actually remove the duplicates. I bet it'd be much faster than using std::set or std::map. People always seem to use maps even when a sorted vector fits the problem better. Sets and maps are more useful when you're inserting/removing elements frequently. In this case, though, all your inserts are up-front and then you never insert again. Go with a sorted vector here.

I also recommend you follow me22's advice of working with a container of vertex objects rather than a container of pointers to vertices.

And yes, for anything performance-critical, sets are basically only useful for HUGE amounts of data ( where their algorithmic complexity finally wins over vector's tiny constants ) or where you really need set's iterator-invalidation guarantees ( in which case a vector of shared_ptrs might be better anyway, especially since that gives you weak_ptrs ).