I've noticed around the internet that many people, both foreign and American, see the United States as a pathetic country. I don't understand why it's considered so bad compared to many African and Asian nations.
Most of the accounts I noticed involved stereotypes such as "All Americans are fat" and "All Americans are warmongers", neither of which is true.
Can anyone explain why America has been deemed illegitimate?
By the way, I would just like to hear your opinions. I don't need pointless flaming or anything like that.