Feminism - Research Article from St. James Encyclopedia of Popular Culture

Feminism

Feminism, the ideology that advocates advancing the status and rights of women, has been one of the most influential political ideas of the nineteenth and twentieth centuries. Since its inception, it has been both hailed as a profound liberation of society and condemned as a philosophy of victimhood, blamed for the breakdown of the nuclear family and the degradation of society in general. There is no doubt, however, that the work of feminist activists and reformers has produced enormous improvements in the position of women in the United States over the past 200 years. Equally indisputably, a glance at the power structure of most of the world's governments and businesses shows that male dominance is still very much a reality. In spite of this, feminism has changed the American social order at every level, from the superficial, such as media portrayals of women, to the deepest underlying assumptions...