Is the U.S. as religious as it is because everything in our culture is commodified to such an extent that religion is just another thing to own, rather than a tenet of life for many Americans? It seems like something of an economic and social herd instinct at work, as opposed to simple faith.

It seems that in so very many churches (especially Protestant ones), it is more important to be "seen" supporting the church, while the measure of one's actual faith is discreetly ignored in direct proportion to the size of one's tithe.

This is obviously nothing new (indulgences and so forth), but the religious mood here these days seems somehow distinctly "American" — something hucksterish and dishonest about it. If America is the "flower of enlightenment," as so many people seem to think (but obviously isn't), then is it possible that atheism, or at least secularism, will never become the basic social ethos here simply because there is no money to be made from it?