It seems to me that adults blame every ill in the world on us. They say that everything we do is bad, that we are deteriorating, and so forth. In reality, people have been saying this for thousands of years. If the younger generation really keeps getting worse, why does...