There are so many articles about feminism in literature, feminism in politics, feminism in society... I've never thought of women as being more or less important than men, but whenever I see these articles I wonder, "What is so special about women that there are articles about them in every field? Why are there no articles about men? Do people just not care about men as much, or are women making a big deal about their gender?"
I don't mean to offend anyone, I'm just curious.