U.S. Foreign Policy

Over time, the foreign policy of the United States has changed dramatically. When President George Washington left office, he advised against entering into permanent alliances with foreign nations and against harboring attachments or antipathies that would lead the country to treat one nation better than another. This approach remained the cornerstone of American foreign policy for much of the nation's history, but the conflicts of the 20th century would ultimately change everything.
Following World War II, it was clear that the United States needed to form firm alliances with other nations in order to prevent the rise of another authoritarian regime like that of Nazi Germany. A foreign policy once defined chiefly by trade relationships soon became one marked by increased policing of the world by American forces. Proxy wars were fought in Korea and Vietnam to stop the spread of communism; political change was engineered throughout Latin America and other regions of the world to ensure that leadership in these key areas remained friendly to the United States. Over the second half of the 20th century, American foreign policy grew from one driven by commercial interests into one driven by power and control.
Today, American foreign policy does much the same, though no longer under the banner of containing communism but under the nebulous concept of "national security." Military engagements around the world are designed to counter the spread of Islamic fundamentalism and the terrorist activity so closely linked to it. Critics charge, however, that this interventionist strategy only causes such fundamentalism to proliferate: more American involvement brings more backlash against America. As a result, many are calling for the United States to step back and reconsider its role on the world stage, for its own safety as well as that of its allies.