Do women conspire in encouraging others to see them as sex objects?
I grew up in the time of feminism, when women were angry about being seen and treated as sex objects who had nothing more to offer than their looks and what was between their legs. These days, feminism seems to have become a bad word; almost no young woman would label herself that way. And yet, more than ever, women seem to dress to kill (at least in the US).
I’m sure they believe the feminist revolution has been won, and that they can be sexy as well as smart and powerful. I’m not so sure. I think that if you want to be taken seriously, in all but a few industries, you can’t afford to accentuate your sexiness. In fact, you have to play it down and dress for business, unless, of course, you work in a business that trades on sex appeal, like the restaurant industry (how else do servers maximize tips?).
It seems to me that this must be a very complex thing to navigate: trying to be taken seriously on the one hand, and trying to be attractive and sexy on the other. I don’t know if it can be done, as long as the two are seen as opposed to each other. I’d hate to live in a world where women didn’t dress in a way that makes my dick grow hard. But when it is hard, it’s damn difficult to think about anything else.