America dominates most of the media we get in Western countries (TV, movies, celebrities, music, etc.), so we know America pretty well.
If I'm going to stereotype, then it's pretty much like you said in the OP: Americans think America is the greatest country on Earth... they seem to think they have so much more freedom than all other countries.
I think the "freedom to bear arms" is ridiculous, btw; it seems like one step away from "freedom to injure" or "freedom to kill".
It's hard to generalise about the country as a whole anyway, as it's so divided: Republicans and Democrats, North and South, races, etc.
I'm from NZ, by the way.