Does society dictate what we should do in terms of clothing?
I realize that this question makes no sense whatsoever, and I apologize in advance.
Alright. I wasn’t sure what to title this question because I’m a little confused myself. Recently I was talking to a guy who asked me for a picture of my breasts, a request I quickly refused. He said I refused because I was “a victim of society”: I had been “taught” to be ashamed of my body and not to show it off.
I said it wasn’t really a matter of being a victim of society; it was more a matter of privacy and personal comfort levels. He replied, “Native people walk around without clothes and they don’t wear clothes until missionaries tell them to. Your argument of nature is denied by logic.”
What do you think? Is it just society that tells us what we should do and what is socially acceptable?