romanamic (Guest)
I am doing research on the nation-building of the USA. At what point in its history would you say the USA stopped being a mere state and became a true nation-state? Would you argue this happened before the Civil War, or even at its founding in 1776? Thanks!