G1 targets roughly 2048 regions, but you can set the region size explicitly with -XX:G1HeapRegionSize.

The four refinement zones (white, green, yellow, red) control how much work G1 does to fold queued pointer changes into the remembered sets. In the white zone, nothing is done (the queue is drained at the beginning of a GC); in the green zone (-XX:G1ConcRefinementGreenZone), refinement threads are activated to reduce the queue back to the white zone; in the yellow zone (-XX:G1ConcRefinementYellowZone), all refinement threads are fully active to reduce the queue; in the red zone (-XX:G1ConcRefinementRedZone), the application threads themselves have to update the remembered sets on each pointer change, which slows down the application until the queue is back in the yellow zone.

G1 starts a concurrent marking cycle of the old generation when the whole heap is -XX:InitiatingHeapOccupancyPercent full (default 45%).

After the concurrent marking cycle completes, G1 sets a mixed flag, so that subsequent young GCs also evacuate some old regions (mixed GCs): the live data in largely empty old regions is copied into other old regions, freeing them up. This is G1's compacting phase, spread over roughly -XX:G1MixedGCCountTarget mixed GCs (default 8, i.e. about 1/8th of the candidate regions per mixed GC). Candidate regions are those whose live data is below -XX:G1MixedGCLiveThresholdPercent (default 85%). The mixed flag is turned off once enough space has been reclaimed, i.e. when the reclaimable space falls below -XX:G1HeapWastePercent of the heap (default 5%).

G1 full GCs are single-threaded (in Java 8) and very slow, so they should be avoided. Use -XX:+PrintAdaptiveSizePolicy to find out the reason for a full GC.

Cache coherency during concurrent writing has a potentially high performance cost.

False sharing is when the memory cells being accessed and updated are different but they fall physically into the same cache line.

One technique to eliminate false sharing is to pad the memory space between concurrently used fields with intermediate unused fields, so that the actively used fields fall into separate cache lines. Note that the JVM can eliminate dead code, so there needs to be some artificial use of the padding fields.
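A minimal sketch of manual padding; the class and field names are illustrative, and the assumption is a 64-byte cache line (so seven unused longs, 56 bytes, separate the two hot fields):

```java
// Hedged sketch of manual cache-line padding (names are illustrative).
// Two counters updated by different threads are separated by unused long
// fields so they are unlikely to land in the same 64-byte cache line.
public class PaddedCounters {
    volatile long counterA;          // updated by thread A
    long p1, p2, p3, p4, p5, p6, p7; // padding: 7 * 8 = 56 bytes
    volatile long counterB;          // updated by thread B

    // Artificial use of the padding fields so the JVM cannot treat them
    // as dead and eliminate them.
    long preventElimination() {
        return p1 + p2 + p3 + p4 + p5 + p6 + p7;
    }
}
```

Note that the JVM is free to reorder fields within a class, so manual padding is not guaranteed to hold; that is one reason the @Contended annotation described next is the more reliable mechanism.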

From Java 8, the @Contended annotation (sun.misc.Contended in Java 8, jdk.internal.vm.annotation.Contended from Java 9) lets the JVM lay out the annotated fields so as to eliminate false sharing, even taking prefetching into account. Use -XX:-RestrictContended to allow classes outside the core JDK to use this annotation, and -XX:ContendedPaddingWidth to change the padding width from the default of 128 bytes. In Java 9, @Contended is not exported by default, so it needs explicit module access.

You don't need to worry about every memory leak; instead, factor in the size of the leak and the program's lifetime to see whether the leak is serious enough to be worth fixing.

Reference objects let you reduce memory leaks and respond to memory pressure, but you can be at the mercy of the garbage collector. They are appropriate for objects like listeners where you need to retain a reference to the listener as long as it is live, but you don't want to hold on to the listener beyond that.
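A minimal sketch of the listener case, assuming a hypothetical registry (the Listener and ListenerRegistry names are illustrative, not a real API): holding the listener through a WeakReference means the registry alone never keeps it alive.

```java
import java.lang.ref.WeakReference;

// Hedged sketch: a registry that holds its listener weakly, so the
// listener can be collected once the application drops its own reference.
public class ListenerRegistry {
    interface Listener { void onEvent(String event); }

    private WeakReference<Listener> listenerRef = new WeakReference<>(null);

    void register(Listener l) {
        listenerRef = new WeakReference<>(l);
    }

    // Fires the event only if the listener is still strongly reachable
    // elsewhere; after collection, get() returns null and we skip it.
    boolean fire(String event) {
        Listener l = listenerRef.get();
        if (l == null) return false; // listener was collected (or cleared)
        l.onEvent(event);
        return true;
    }
}
```

The "at the mercy of the garbage collector" caveat shows up here: exactly when get() starts returning null after the last strong reference is dropped is up to the GC.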

Classloader leaks are difficult to identify and eliminate. Jetty provides a set of "leak preventer" classes for the most common types of classloader leaks at https://www.eclipse.org/jetty/documentation/9.4.x/preventing-memory-leaks.html

Closing resources, streams, etc, is an important mechanism for avoiding memory leaks.

Memory leaks can occur when objects are no longer being used by the application, but the garbage collector is unable to remove them from working memory because they're still being referenced unintentionally.

Static fields holding on to objects are one common type of memory leak. Pay attention to the lifetime of objects referenced (directly or indirectly) from static fields, especially collections, since these are prime candidates for causing memory leaks.
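A minimal sketch of the static-collection leak (the SessionCache name and byte[] payload are illustrative): a static map that is only ever added to keeps every entry reachable for the life of the class loader.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch of the classic static-collection leak: entries put into
// a static map accumulate forever unless explicitly removed.
public class SessionCache {
    private static final Map<String, byte[]> CACHE = new HashMap<>();

    static void put(String sessionId, byte[] data) {
        CACHE.put(sessionId, data); // reachable from a static field: never collected
    }

    // The leak is avoided only if the application removes entries when
    // they are no longer needed (or uses a bounded/expiring cache).
    static void remove(String sessionId) {
        CACHE.remove(sessionId);
    }

    static int size() { return CACHE.size(); }
}
```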

Strings interned into the JVM string pool (by calling String.intern()) are potential memory problems, especially before Java 7, when the pool was stored in PermGen.
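A small demonstration of what interning does (the "pooled" literal is illustrative): intern() maps a runtime-built string onto the single canonical instance in the pool, which is why heavy interning pins strings in memory.

```java
// Demonstrates that intern() returns the canonical pooled instance: a
// string built at runtime is a distinct object, but intern() maps it to
// the same instance as the compile-time literal.
public class InternDemo {
    static boolean sameAsLiteral(String s) {
        return s.intern() == "pooled"; // identity comparison is intentional here
    }
}
```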

Forgetting to close streams is a common type of resource leak. Favour using the try-with-resources syntax wherever possible so that possible routes for unintentionally leaving streams open are minimized.
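A minimal try-with-resources sketch (the FirstLine class and in-memory reader are illustrative): the resource declared in the try header is closed automatically on every exit path, so it cannot be leaked.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;

// Try-with-resources: the reader is closed automatically when the block
// exits, whether normally or via an exception.
public class FirstLine {
    static String readFirstLine(String text) {
        try (BufferedReader reader = new BufferedReader(new StringReader(text))) {
            return reader.readLine();
        } catch (IOException e) { // cannot happen for an in-memory reader
            throw new UncheckedIOException(e);
        } // reader.close() is called here automatically
    }
}
```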

A very common memory leak is adding objects to hashed collections when the objects have missing or incorrect equals() and hashCode() implementations.
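A minimal sketch of that leak (BrokenKey and GoodKey are illustrative names): without equals()/hashCode(), every instance looks distinct to a HashSet, so logically duplicate keys accumulate without bound.

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

// Hedged sketch: a key type relying on identity equals()/hashCode()
// makes a HashSet treat every instance as unique, so repeated inserts
// of the "same" key keep growing the set.
public class HashedKeyDemo {
    static class BrokenKey { // inherits identity equals()/hashCode() from Object
        final String id;
        BrokenKey(String id) { this.id = id; }
    }

    static class GoodKey {
        final String id;
        GoodKey(String id) { this.id = id; }
        @Override public boolean equals(Object o) {
            return o instanceof GoodKey && ((GoodKey) o).id.equals(id);
        }
        @Override public int hashCode() { return Objects.hash(id); }
    }

    static int countAfterInserts(int n, boolean broken) {
        Set<Object> set = new HashSet<>();
        for (int i = 0; i < n; i++) {
            set.add(broken ? new BrokenKey("same") : new GoodKey("same"));
        }
        return set.size(); // grows without bound for BrokenKey
    }
}
```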