CondensedFlesh
New member
Many leftist writers depict the U.S. as an imperial power. I disagree, because the U.S. differs fundamentally from classical empires such as the old British Empire and the old Japanese Empire. I would like to read books by authors who explain American influence (economic, social, political, or cultural, preferably all four) without resorting to what I consider a misuse of the term "empire."