Based on a debate in another thread, this question popped into my head...
I've spent several summers in the US, and I always feel caged; everyone calls the police on your ass like it's nobody's business FOR ANYTHING!!!! It's ridiculous.
Not only that, but there's all the homophobic "family values" conservative ******** that almost the entire country preaches...
It's also the most routine-bound lifestyle I've seen, scary and sad (work > mall > house).
People my age (17 back then) went to the mall for fun; it was "the place," and it was all there was to do.
In other parts of the world, like Europe, Brazil, or Mexico, life felt less routine, more carefree, more fun, with everyone doing their own thing. FREER!
So what do you think... is America really the land of the free, or has the government just sold the idea perfectly?
This country was never, ever free. They said it was free when my ancestors were slaves, when my grandparents couldn't vote, and now when I can't marry who I love. This country was built on atrocities, and yet they claim to be the gods of Earth. They rape, torture, and kill in the name of freedom, but it's all lies. Deadly lies at the world's expense.
I'm actually from the U.S., and I feel caged most of the time. I've traveled mostly to Mexico, and I've seen people on the streets there who lead happier lives than some people in the U.S.
Yes, living in the U.S. is pretty neat. But if you actually think about it... it's school, work, debt, house, more debt, car, more debt, kids, family, money, more debt. It turns into a routine.
One of my history professors once said, "America is the land of OPPORTUNITY; nothing is guaranteed."