The Wikipedia page on risk aversion states that "Constant Relative Risk Aversion implies a Decreasing Absolute Risk Aversion, but the reverse is not always true". Let me decompose this statement into two parts: first, that CRRA implies DARA, and second, that the converse can fail.

Remember that these measures are mainly discussed for utility functions with $u' > 0$ and $u'' < 0$, so that both coefficients are algebraically positive. With this in mind, we can deduce which relations hold with certainty and which do not.
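Using the notation of the article you quoted, the two coefficients are
$$A(c) = -\frac{u''(c)}{u'(c)}, \qquad R(c) = c\,A(c),$$
where $A(c)$ is the absolute and $c\,A(c)$ the relative risk aversion; both are positive whenever $u' > 0$ and $u'' < 0$.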

Let me turn my comment into a quick answer:
If $A(c)$ is decreasing, the preferences fulfill DARA; if $c\,A(c)$ is constant, they fulfill CRRA. Now, if CRRA holds, say $c\,A(c) = k > 0$ for all $c$, then $A(c) = k/c$, which is strictly decreasing in $c$. So CRRA implies DARA, which is the first part of the statement.
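A concrete instance is the standard CRRA utility: for
$$u(c) = \frac{c^{1-\rho}}{1-\rho}, \qquad \rho > 0,\ \rho \neq 1,$$
we have $u'(c) = c^{-\rho}$ and $u''(c) = -\rho\,c^{-\rho-1}$, so $A(c) = \rho/c$ is decreasing while $c\,A(c) = \rho$ is constant.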

For the second part, take any positive $A(c)$ such that $c\,A(c)$ is decreasing. Differentiating gives $\frac{d}{dc}\left[c\,A(c)\right] = A(c) + c\,A'(c) < 0$, hence $A'(c) < -A(c)/c < 0$, so $A(c)$ is also decreasing. Such preferences are both DARA and DRRA, i.e. DARA holds without CRRA. You asked whether DARA implies CRRA in almost all cases; from this analysis, CRRA is rather the exceptional case.
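As an illustration (my own choice of coefficient, not from the article): take $A(c) = 1/c^{2}$ on $c > 0$. Then $A(c)$ is positive and decreasing (DARA), while $c\,A(c) = 1/c$ is decreasing rather than constant (DRRA, not CRRA). One utility generating this coefficient has marginal utility $u'(c) = e^{1/c}$, which indeed satisfies $u' > 0$ and $u'' < 0$.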