I am fairly experienced with electronics work, but I keep coming back to this problem time and time again.

I want to design a power supply and have it pass safety certification (e.g. UL). I have a solid understanding of the physical concepts of isolation, creepage, clearance, parasitics, etc. However, when I read through design guidelines, for example here, I always get a little turned around when reading about working voltages and insulation types (Functional, Basic, Supplementary, Reinforced, Double).

Obviously, if you're building something with no electrical outputs (power, control, whatever), you can meet the required level of safety by wrapping the entire thing in an insulating enclosure, or in a grounded conductive enclosure. But in a case where you have mains-powered electronics with accessible electrical outputs (for example, the pins on an RS-232 connector, or a banana jack carrying a regulated supply voltage), how the heck do you get the two layers of insulation you need between the mains and the outputs? See Figure 3 in the above link if it's not clear what I'm asking about.

The first layer obviously comes from the galvanic isolation within the transformer that steps the mains voltage down. It's my understanding that these transformers are always double insulated themselves, so single faults within them don't have to be considered (right?). But what about the single layer of insulation between the primary and secondary conductors provided by the circuit board itself? If the board faults (however unlikely that may be), you could have full mains AC on the secondary. That would most likely fry whatever low-voltage electronics you have on your external connectors and energize the external components.

Is the idea that the PCB can provide isolation so reliable (with a sufficiently wide trace gap) that it counts as two layers of insulation? Is that what Reinforced insulation refers to?