It's mostly a question of good practice. You allocate, you free. In some cases, this can save you a bunch of headaches if you ever want to change a number of global constructs to ephemeral ones.

The other main use is memory leak tracking. If your application is supposed to delete everything before quitting, then anything remaining right before quitting is necessarily a leak, QED. With full memory freeing, there is no possible confusion between a leak and global memory.

sohta Wrote:It's mostly a question of good practice. You allocate, you free. In some cases, this can save you a bunch of headaches if you ever want to change a number of global constructs to ephemeral ones.

The other main use is memory leak tracking. If your application is supposed to delete everything before quitting, then anything remaining right before quitting is necessarily a leak, QED. With full memory freeing, there is no possible confusion between a leak and global memory.

Hm, those are fairly convincing arguments.

Practically, how would you check the memory left right before quitting the app?

Practically, how would you check the memory left right before quitting the app?

There are a number of different ways of doing it.

In object-oriented languages, you can keep a static counter per class that gets incremented in constructors and decremented in destructors.

In C++, replace the global operator new and operator delete to maintain an allocation table.

In C, do the same thing by piping your malloc/free calls through your own functions.

Some places I've worked at have amazingly complex memory management systems whose sole purpose is to attach chunks of metadata to allocations in order to track them, and to maintain a lightweight linked list of currently allocated blocks so that a memory dump can be produced at any time.

I will respectfully disagree with OSC on this one. If destruction takes any time whatsoever for you, then you have other issues on your hands.

Also, I am not sure what you mean by static destructor. If you refer to destructors being called on static variables, then yes. However, I will add that anything doing dynamic allocations before the start of main() is quite evil as well.

I dunno, I used to have this mentality as well until I got burned pretty badly by it. Mind you, there is no reason not to have your cake and eat it too. The fact that your program cleans itself up properly in no way prevents you from calling exit(0) in the version that ships to the consumer.

sohta: Did you know that every Mac OS X Cocoa application out there does not free all of its memory before it quits? After a certain point in the termination process is reached, the whole process just exits rather than release every single object first.