Feminism is defined by Merriam-Webster as "the theory of the political, economic, and social equality of the sexes." At its core, the definition means equality for both men and women.

In 2017, feminism became more than just a word; it became a movement. The year began with the Women's March, in which an estimated five million people participated worldwide, and ended with the #MeToo movement, in which women came forward with their stories of sexual assault. Throughout the year, feminism was at the forefront of the news.

Feminism has been discussed in media and entertainment as well. Merriam-Webster reported "increased lookups after the release of both Hulu's series The Handmaid's Tale and the film Wonder Woman."

However, many people associate the word feminism exclusively with women and, by extension, with women's rights alone. Many men and women are hesitant to call themselves feminists because of the word's negative connotation of excluding men or being anti-male. During an interview, Kellyanne Conway said, "It's difficult for me to call myself a feminist in a classic sense because it seems to be very anti-male, and it certainly is very pro-abortion, and I'm neither anti-male or pro-abortion."

The high volume of dictionary lookups reveals that many people are still coming to terms with the word feminism. Because feminism is a complex idea that takes many forms in different contexts, understanding it will take time.

Recently, the growing list of men charged with sexual harassment or assault has kept women's stories in the news. So while feminism can be interpreted differently by different people, one thing is certain: it has allowed women's voices to be heard, and that is critical for change.