Are there such things as gender-specific roles anymore?
We are all aware of how gender roles in the West have changed over time. Long gone are the days when men were the breadwinners and women stayed at home. This got me thinking specifically about the role of men in society, and I'm rather inclined to accept the 'Fight Club' outlook: we've tried and failed to be like our generally useless fathers, so now what? And how have these changes affected women? Do some feel threatened by male incursion into traditionally female roles such as child rearing? Do women feel overwhelmed by the expectation that they should be able to do it all, combining the roles of working woman, mother, wife, sex symbol, carer, and so on? Don't feel you have to answer all the points I've brought up; this is just an outline of my thinking.