batraylover
New member
I'm wondering which movie of all time (any movie ever made) gives the best representation of American culture: what America stands for, the absolute essence of what America truly is. I'm not talking about the corruption and everything now; I'm talking about real America, true American values.
Wondering your thoughts on this, it's one of those weird questions that came into my mind out of nowhere.
A movie that covers all of American history, not just now!