OOPS!
I didn't know we couldn't talk about
sex.
--
We all enjoy sex.
It doesn't hurt anybody.
It's a part of life.
It even creates life, depending on who's having it.
Why are we made to believe that to be sexual is to be "nasty" or "dirty"?
It's the reason we're all here as living, breathing, feeling human beings.
Why is it such a taboo?
Why do we shame other people for being sexually liberal
and keep our children in the dark about it?
Why do we teach ourselves to be uncomfortable with naked bodies?
It's only our own anatomy. Why should it be offensive?
Why is it so shameful for people to wear revealing clothing
or clothing that emphasizes their form?
What is honestly so wrong with sex?
Discuss.