I would like to consider myself a feminist. I have so much respect for women. I think it's because, since I'm gay, I've never looked at women as potential sexual partners, which has allowed me to see them more fully as human beings.
However, I've struggled with one issue: when women take their clothes off, does it help or hurt the cause? On one hand, women have for decades been taking off their clothes to reject the expectations the world places on them. On the other hand, I feel that when women take off their clothes, men objectify and degrade them even more, which sets the cause back even further.
What do you guys think?