Quote:
Originally posted by Weezy
Really no flamebait or anything, but don't you guys think these types of shows portray black women in such a bad way? I saw a part of this last year, when Nicki talked about it in Yasss Bish!, and it annoyed me so much. Black women being on a show that thrives on them fighting with each other... I didn't watch it, but there was a three-part reunion episode, just with them arguing, arguing and fighting (pulling each other's weaves etc.). I feel like that's so stupid, like why would things like that be our entertainment? That is disgusting in my opinion.
There are so many things going on, and shows like these are what people talk about... I just don't get it. With Jersey Shore, I can see the entertainment value because those people are young and it's just them on a trip, but this is real life and actually portrays these women as they are LIVING, which is just completely stupid.
Do y'all kinda get my point, lol.
|
I get your point with the wig pulling, and I thought that went TOO far, but in my opinion these ladies usually deal with conflict with their words, and they do it well. It's definitely not the same as, let's say, Love and Hip Hop.
For the most part, these ladies are all very intelligent and successful, more successful than their men, and that's shown in the series.