Feminism, at its most basic, means equal rights for women and men. Yet it has evolved into a dirty word over the years, with countless opponents likening it to "man hating" rather than a fight for equality. Do these strong, successful women really not believe in it, or do they simply not understand what it actually means?