An article on the main page just reminded me of a beef (or confusion, at least) that I have with mainstream feminism. We have that quote in ***Flawless (do I have enough stars?) that feminism is the social, economic, and yada yada equality between the sexes.

Rebecca on Jezebel defined feminism as:

feminism (noun): the advocacy of women's rights on the grounds of political, social, and economic equality to men.

So... I'm just sayin. How useful is this very simplified version of feminism? Because you know what else feminists believe in? Patriarchy. And misogyny. Rape culture. And (often) that it isn't just men and women, so a definition about men versus women is going to automatically exclude a lot of the gender work that feminists can do.

I think this is why I didn't find Joss Whedon's argument, from a few months back, about 'genderism' or whatever it was, all that compelling. Because feminism isn't just about equality between the sexes, it's a specific understanding of our current society and why things aren't equal. And that understanding is kind of crucial to actually overcoming it, y'know? Feminists have all of these concepts to describe and understand the dynamics we see in our everyday lives, and it's about a lot more than a dictionary definition of equality.

It can be useful, as an entry point, maybe, but I feel like this baby-definition of feminism is being taken too far. Like, we actually have people - and Jezebel - saying to someone "no, you are a feminist actually! Because you don't think women are inferior to men!" and I kinda think there is more to it than that? I mean, how many of us have had to deal with friends and family who think women should get equal pay and voting rights and maternity/paternity leave, but roll their eyes when you talk about systemic sexism or representation in movies? So many people seem to think that the gender inequality we have in our societies is somehow an accident, that women just coincidentally don't have as many rights as men, and we just need to remedy that through assimilation. Is it really feminist to think that women should simply have the things men do (without actually changing the associated social structures), and leave it at that?

What do you think? I am so curious about this. How important is patriarchy - and associated concepts - to feminism?

(PS I find it hilarious that Jezebel, a website that has explicitly said it is not a feminist website, is calling out Salma Hayek for explicitly not being a feminist.)

ETA: I apologize for the very white-centric/Western-centric comment there, about the needs of white men in particular being a driving force of patriarchy. My education (academic and otherwise) is almost entirely about Canada, the USA, Western Europe, or Western imperialism/colonialism that ends up subjugating PoC in other countries. There are definitely other forms that patriarchy takes!