At what point in history did the USA become so despised by other nations and why?
I’ve lived in the US of A for the majority of my formative years (going on ten years in just a few months), and I can’t think of a time when I didn’t hear of another nation wanting to harm the US and its citizens. Where I’m from, the US is revered by pretty much everyone (so much so that the vast majority of my countrymen have relatives who immigrated here), and yet there are all these other countries that seem to have nothing but unrelenting (bordering on obsessive) contempt for the US.
With the recent events in Boston, I’ve been wondering what makes the US such an attractive target for terrorists. We seem to be constantly hearing of countries and radical groups whose sole purpose is to terrorize the US and inflict pain and loss on it, and I’m curious to know at what point in history this all began, who exactly they hate (the government, civilians, both?), and why.
As an aside, what, in your opinion, can the US do as a nation (if anything at all) to ease this hatred and the innocent suffering and loss that results from it?
Looking forward to reading your thoughts on this!