Feminists have pointed out that women are socialized into a culture where they must buy products that make them young, beautiful, and attractive to men, and they often advocate rejecting consumer culture as a way of fighting patriarchal domination. We all know the theories positing an unholy fusion of capitalism and patriarchy: men control the high-powered positions within capitalism and exploit the sexual insecurity that the patriarchal system creates in women in order to turn a profit. But what happens to this feminist theory of gender-role construction when men are sold the same, or a very similar, bill of goods? Is patriarchy then dominating itself? Or is capitalism really the ultimate dominator?