Over the centuries many great minds have pondered the meaning of probability, trying to apply it to profound questions such as "is the harvest likely to be good this year?", "how likely is my stock portfolio to appreciate?", "will I get a hangover after drinking this bottle?", and so on. Some even went as far as to claim that the term "probability" is undefinable in terms of simpler concepts, and nevertheless fearlessly proceeded to derive all the correct rules for relating one probability to another without ever revealing the secret of determining the value of either one.

Thanks to the power of the Interwebs and Inkscape, you no longer have to wonder alone. Probability is just glorified counting and taking ratios (e.g. counting things of one type within the set of things of another type). In the end, even the supposedly more general Bayesian view of probability reduces to just one elementary operation: counting. This means that with enough perseverance one can reduce any probabilistic problem to counting balls in an imaginary urn. Like so (click to enlarge):
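To make the "counting and taking ratios" claim concrete, here is a minimal sketch with a made-up urn of colored, sized balls (the contents are purely illustrative): a conditional probability is literally a count of one type of thing inside the set of things of another type.

```python
from fractions import Fraction

# A hypothetical urn: each ball is a (color, size) pair.
urn = (
    [("red", "large")] * 3 + [("red", "small")] * 1 +
    [("blue", "large")] * 2 + [("blue", "small")] * 4
)

# "Counting things of one type within the set of things of another type":
# P(red | large) = (# red large balls) / (# large balls)
large = [ball for ball in urn if ball[1] == "large"]
p_red_given_large = Fraction(sum(1 for b in large if b[0] == "red"), len(large))
print(p_red_given_large)  # 3/5
```

Using `Fraction` rather than floats keeps the result an honest ratio of counts, which is the whole point.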

By the way, when you hear talk about "prior information", what is really meant is "counts". The natural question to ask is: which counts exactly? If they can't tell you, they are safe to ignore. Also, keep in mind that some counts don't count as much as others. Probably...
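The "priors are counts" remark can be sketched as pseudo-counts: a prior is just balls already sitting in the urn before you look at any data, and updating on observations means dropping in more balls. The particular numbers below are assumptions for illustration only.

```python
from fractions import Fraction

# Hypothetical prior: balls already in the urn before any data arrives.
prior_red, prior_blue = 2, 2

# Newly observed draws, simply added to the urn as more balls.
observed_red, observed_blue = 7, 1

# The updated probability of "red" is again just a ratio of counts.
p_red = Fraction(prior_red + observed_red,
                 prior_red + observed_red + prior_blue + observed_blue)
print(p_red)  # 3/4
```

This is exactly Laplace-style smoothing: a stronger prior means more pre-existing balls, which "count" more against the same amount of data.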

More seriously, it may be helpful to realize that every probabilistic model corresponds exactly to some such urn-based setup. Any reasoning using the urn model can be mapped back to the situation described by the probabilistic model, and vice versa. Moreover, urn-based setups may be formally transformed into one another while preserving their meaning. While it's difficult to juggle probabilistic formulae in one's mind, ball-filled urns are easy to visualize and quick to check for surprising contents. The continuous case also fits in nicely if one imagines "going to the limit", that is, shrinking the balls ad infinitum.
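The shrinking-balls idea can be sketched numerically: approximate a continuous distribution by an urn of n tiny balls, each weighted by the density at its position, and let n grow. The density f(x) = 2x on [0, 1] below is an assumed example, chosen because the exact answer P(X < 0.5) = 0.25 is easy to verify.

```python
# Approximate a continuous distribution by an urn of n tiny balls.
# Density f(x) = 2x on [0, 1]; ball i sits at midpoint x_i and its
# "count" (weight) is proportional to f(x_i).
def p_below(threshold, n):
    xs = [(i + 0.5) / n for i in range(n)]
    weights = [2 * x for x in xs]
    hit = sum(w for x, w in zip(xs, weights) if x < threshold)
    return hit / sum(weights)

for n in (10, 100, 10000):
    print(p_below(0.5, n))  # converges to the exact value 0.25
```

As the balls shrink (n grows), the count ratio converges to the integral of the density, which is all that "continuous probability" means here.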