Many Americans still thought of themselves as Englishmen up until this time. How would you argue that many actually realized they were different? Or that they were not?
The British were feeding off the colonies as if they owned them, and they still demanded that the colonists pay taxes. Thomas Paine's "Common Sense" told people not to fear England, which was thousands of miles away and a small island compared to the American continent. Once people realized they could fight the British and win, minds started to change.