Jessica Valenti

What does feminism mean to you? Over the years, the definition of the word “feminism” has changed. For the record, that definition, according to Webster’s Dictionary, is “the belief that men and women should have equal opportunities.” That seems simple enough, but for some, feminism has become a controversial, even unnecessary, concept.

Whatever feminism means to you, it’s worth taking a look back at how and why the movement developed, beginning as far back as the early 1900s, and at the writers and feminist books that continue to influence our lives today, whether we know it or not. With so much feminist literature out there, this list is not exhaustive; add your go-to feminist book in the comments.

Together We Rise, The Women’s March…