I have a few programs that exhibit a strange behavior: the number of Gen0 and Gen1 collections is almost the same throughout their lifetime. That is, Gen0:Gen1:Gen2 has an approximate ratio of 10:10:1. Other programs run with a Gen0:Gen1:Gen2 ratio of roughly 100:10:1, which from what I understand is more typical.
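For reference, here is how I measure the ratio (a minimal sketch; the `GC.CollectionCount` API is standard .NET, but the workload placeholder is hypothetical):

```csharp
using System;

class GcRatio
{
    static void Main()
    {
        // ... run the workload of interest here ...

        // GC.CollectionCount(gen) returns how many times that generation
        // has been collected since the process started. Note that a higher-
        // generation collection also collects the lower generations, so
        // Gen0's count includes every Gen1 and Gen2 collection as well.
        int gen0 = GC.CollectionCount(0);
        int gen1 = GC.CollectionCount(1);
        int gen2 = GC.CollectionCount(2);
        Console.WriteLine($"Gen0:Gen1:Gen2 = {gen0}:{gen1}:{gen2}");
    }
}
```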

When I use .NET Memory Profiler to trigger a Gen0 collection, I see in the real-time graph that a Gen1 collection is also triggered.

What could cause such behavior? Has anyone else encountered this and figured out what caused it? Could it be related to the way I use TCP and UDP sockets, file streams, or locks? How can I diagnose this using .NET Memory Profiler (4.6)?

The ratio between the collection generations can differ substantially depending on the allocation pattern and the lifetime of the allocated instances. However, having the same number of gen #0 and gen #1 collections seems a little unusual.

I assume that this could happen if you mainly allocate instances that need to be finalized. If the runtime performs a gen #0 collection when only finalizable instances have been created, then all instances will be placed on the finalization queue and nothing is actually reclaimed; the survivors are promoted to gen #1 instead. So the runtime has to perform a gen #1 collection to actually collect any instances. By the time of the gen #1 collection, the instances that were queued for finalization have most likely been finalized and can now be collected.
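You can see this effect in a small sketch (the `Finalizable` class is a made-up example; exact behavior can vary with build configuration and GC mode, so I have hedged the expected output):

```csharp
using System;

class Finalizable
{
    // An empty finalizer is enough to register the instance on the
    // finalization queue, which keeps it alive across the collection
    // that first discovers it is unreachable.
    ~Finalizable() { }
}

class Demo
{
    static void Main()
    {
        // No strong reference is kept; only the weak reference tracks it.
        var weak = new WeakReference(new Finalizable());

        GC.Collect(0); // force a gen #0 collection

        // Typically still alive: the pending finalizer kept the instance
        // from being reclaimed, and it was promoted to gen #1.
        Console.WriteLine(weak.IsAlive);

        GC.WaitForPendingFinalizers();
        GC.Collect(); // a later, full collection can now reclaim it

        Console.WriteLine(weak.IsAlive); // typically False by this point
    }
}
```

In other words, the finalizer turns what would have been a cheap gen #0 reclamation into a promotion plus a second collection, which is consistent with the 1:1 Gen0:Gen1 ratio you are seeing.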

If you wish to see whether there are a lot of instances being finalized, you can add the "Finalized instances" band to the types list (using the command View->Layout->Column chooser).