Do you really think America is the most racist country?
Or is it that America is progressive enough to call out racism wherever it happens, so we hear about it more, while other countries are just as racist but nobody pays attention?
Maybe it's because we have so many minorities and different groups, unlike some homogeneous countries that appear to have little racism simply because there aren't many minorities there?