Was the United States' foreign policy as bad before WWII as it has been since?
Since WWII, the United States has had a poor record in foreign relations. From imposing an unwanted ruler on Iran to arming bin Laden and financing the Tunisian military, it has made a number of serious mistakes.
However, prior to WWII the United States did not hold the position of nearly unrivalled power that it has held since, and I am not personally familiar with its foreign policy and its effects at the time. Did WWII lead to a change in US foreign policy? Has it always been so ill-judged?