Thanks to George W. Bush, the United States of America left a bad taste around the world. We were all supporting the country because of 9/11, but after the Iraq war, the worldwide consensus became that the USA was an imperialist and oppressive nation hiding behind a fake shield of "democracy" and free speech (see: the Dixie Chicks case). With Obama, things changed for the better (though he didn't do much), but after Trump's victory I feel the USA is once again a concern for the worldwide community.
So, do you hate the USA? Is the USA playing the same role as Germany did in World War II?
I voted yes because I hate most of my country right now.
But I'm so happy that I was born and live in California, the best state in the U.S., which deserves to be its own country. I feel safe around everyone, everyone is friendly and open-minded, and I won't ever have to visit any other state again.