Are suntans and olive skin considered beautiful among white women?
So white women are allowed to have darker skin? And use spray tan and various skin creams? Teach me, I'm genuinely wondering.
Well, I mean, if white girls want to be darker, then it's only logical that they would be against making your skin whiter... I mean, why do it, when black is already beautiful?