As I watch the Opening Ceremony of the London Olympic Games, I constantly hear that nations X, Y, and Z have yet to earn any medals. Ever.
In all their histories. While not terribly surprising given the size of some countries, the thought is still rather peculiar to me, as the United States
always performs well, not only in the Olympics but truly at everything its people dedicate themselves to.
At any rate, this led me to a related thought: "How do other countries view the USA?"
As an "insider", if you will, I rarely hear outside opinions of the United States—save for a few news stories and my best guess—and, because Americans are invariably patriotic, it's rare to hear a negative word here. So I ask, what is the general opinion of the United States in your country? Does the USA appear as some "Big Brother"? Protector? Menace? Friend? Is American involvement elsewhere in the world viewed negatively? Positively?
Please begin each post with your country and proceed to summarize the general opinions regarding the United States from your nation's view. I'd very much like to hear what the world itself thinks.
Also, please, no bigoted rants. Nobody wants to hear that.