OK, I know this may sound like a dumb question at first, but the only reason I hide mine is because I was raised to hide them and keep them private. We feel comfortable showing our arms and other body parts (even our backsides in some children's movies!) because we all have the same thing and we know what they look like, yet we hide the one thing that sets man and woman apart.

I'm sure most women know what a man's genitals look like, and I'm pretty sure most men know what women's genitals look like - so why don't we make these visible in town the way we do with our arms, since we all know what an arm looks like? Why would it be so disgusting if someone in town wore something that showed off their genitals? Meanwhile, people wear short tops that show plenty of skin and no one objects to that, even when it's not a flattering sight.

I just don't get why we hide our only difference when most people know what it looks like anyway. Thanks in advance for any answers, and please no nasty replies.