DISCLAIMER: I'm NOT being literal with the title.
I've noticed a lot of gay subject matter has become more prevalent on TV lately. Shows like AHS, Nashville, Scandal, OITNB, How to Get Away w/ Murder, etc. have been pushing a lot of gay plot lines into the homes of middle America in primetime. Hell, I never thought we'd get gay sex scenes on basic cable primetime like they show on HTGAWM.
The music industry also seems to slowly be growing more accepting of gay acts again. For a while there were no out gay men in the industry, but now we have Frank Ocean, Sam Smith, and a few other up-and-comers hopefully paving the way. Hell, I've noticed even hip-hop is coming along, with guys like Young Thug making androgyny a bit cool. Also, more and more male celebrities have been doing the "sex sells" photoshoots and ish to appeal to the gayz.
Gay lingo has also been picked up by the media a bit.
Are gay people taking over pop culture? Are we the ones everyone wishes they could be but can only dream of? Is our Emancipation of Penis era coming?
