American History: How has it shaped today's society?

shimmelman0916

New member
Hey there. From about 1607 (when the first permanent English settlement was founded at Jamestown) to about 1865 (when the American Civil War ended), what historical events are worth pointing out? And how might this history inform our understanding of society today? Someone please help!
 
Not to make light of your question, but in more ways than you can shake a stick at. History informs us of so many important experiences and discoveries. For instance, in the treatment of disease. For thousands of years, people had no idea what caused disease or how to treat it. Some remedies that were tried actually worked, but the vast majority had no effect at all or made things worse. We've understood how germs cause disease for only about a century and a half, since Louis Pasteur published his famous germ theory in the 1860s.
History tells us of the important discoveries regarding the weather. It tells us about the discoveries and inventions that led to the automobile, the television, the computer, the space shuttle. We don't have to go back and re-invent the wheel every time we want to build a car or the jet engine every time we want to fly from San Francisco to Tokyo.
History tells us where we've been, and that in turn tells us who and what we are, and why that knowledge is important.
History tells us of the humongous struggles against despotism and human slavery and for basic human rights. It tells me that if I take my hot iron frying pan straight off the fire and stick it in cold water to wash it, the shock of the cold on the iron is likely to ruin the pan and render it useless. It tells me not to stuff too much powder into my reloading equipment, so I don't chamber a shell that will cause the barrel of my rifle to burst when I fire it. And it reminds me to wait until the danger of frost has passed in the spring before I set my tomato plants out in the garden.
Need I go on?
 