by DisgracedWoman

Feminism used to be about women getting the same rights as men, such as the right to vote and equal pay at work.
Now feminism is a movement full of women who seem to think that their ability to push a baby out of their vagina entitles them to bigger and better everything.

I am a young woman and I disagree with feminism.
It is pretty hypocritical to expect special treatment because you are female (such as tighter laws concerning rape or more leniency if a woman becomes pregnant), yet get upset when you aren't treated exactly like a man. Feminists need to realize that women are not better than men. They are EQUAL. You don't see any laws that provide special treatment for men, do you?

Example: I once saw a feminist say that abortions should be legal AND free. I am all for abortion rights, but demanding that they be free because your ass couldn't use proper birth control before spreading your legs is absurd.

And before you pull the rape card on me, keep in mind that if you go to the hospital after being raped, they give you medications to prevent pregnancy. I see no reason for free abortions.