Unquestionably, the United States has spent a large portion of the 20th century and all of the 21st century at war or engaged in police actions. After reading up on the subject, I have found two highly biased views of the United States.
One view commonly held in Europe is that the United States is guilty of Imperialism.
The other view is that our wars are justified to ensure the liberty of the United States and our allies.
Is the reality of the situation one or the other or maybe a mix of both?
Please, if you're going to answer this question, I would like a well-reasoned, thought-out comment as opposed to some idiot hippie railing against the United States or a dirt-dumb hillbilly who defends a war just because they like fighting.