How does it feel to know that the U.S. is the bad guy? I'm a bit conflicted. I can't imagine living in any other country, but it's a weird feeling knowing you're on the side of the supervillains, and everything you do supports their stranglehold on the rest of the globe.

But no government has ever been like Peter Parker. As someone with a decent background in history, maybe I'm cynical, but I feel like this is just how the world works. It's not something I'm proud of. We watched a film on Vietnam in my class today, and it was upsetting stuff, but I understand it's not something to be seriously shocked about; it's just another straw on the camel's back.

The American Exceptionalism idea has long since faded from my mind. We do what is in our interests.

As a Canadian neighbor, I don't think America is all that bad. The most powerful nations have always conquered new lands, but the US has done it far less than other nations in the past. The British were far worse in their day. The US will always put its interests ahead of everything else, but it generally lets the world be as it is. If your country weren't drowning in debt and wanted to take over the world, it would probably succeed, or at least come close. The US navy alone is vastly larger and more advanced than the rest of the world's combined.

"It's a funny feeling being taken under the wing of a dragon. It's warmer than you'd think."

LOL modern empires no longer need to conquer land to gain power, so the fact that the US isn't doing it means absolutely jack shit. It's through corporate power and economic manipulation that they gain true control. In fact, countries don't even matter anymore; it's elite interests that prevail. And the US is very bad and very, very dangerous. Maybe that doesn't translate into huge death tolls and physical oppression, but the results are even more sinister. They've created a system of manipulation that the people have accepted and are unable to change even if they wanted to.

The changes are made so gradually that we don't even notice them being done before our very eyes. That is far worse than an outright dominating presence, because ordinary human minds cannot detect it and therefore cannot revolt against it. Any being with human sensibilities can be disgusted by pictures of a concentration camp, but a complex system of lies and deceit does not immediately get an emotional rise out of someone.

Honestly, it doesn't matter one bit to me if people in other countries have some senseless bias against me. I myself do not hold that same bias against them because I realize that all men are created equal, and such a senseless conviction is simply ignorant.

Also, I, and most people I know, never plan to permanently reside outside the US, so this bias will never directly affect me. I realize there are foreign policy implications that indirectly affect me, but hopefully the foreign powers in a position to decide such things do not hold such ignorant biases.

Basically - Dey mad cuz we stylin on dem! And dey cant do shyt and dey still mad!