I find that society prefers women who act more and more like guys. Think about the kind of women presented as the ideal in media, for example in movies: doing "masculine" things like fixing cars, fighting, drinking, and in general acting like men. On the other hand, I don't understand why men would want women who act more and more like themselves.
Personally, I feel that society wants me to act more like a man than I really want to. Maybe I'm the exception, and all other women like acting masculine.
Why is it like this? Do women really prefer this, or is femininity simply less desirable?