Racism is so ingrained in societies... it's always straight > gay, natives > immigrants, whites > blacks. Whatever the majority is, it views the minorities with distrust.
It's become so normal for minorities to be paid less, hold worse jobs, and be treated unequally, and honestly it's just sad.
Will this mess ever end? Should governments be more involved in defeating institutionalized racism by, for example, forcing companies to hire a certain percentage of minorities and pay them the same as whites?
Discuss and keep it all the way cute.
