current events more? I'm fairly young and have only gotten REALLY into politics in the last few years. From my perspective, it seems like things in this country have gotten bad, maybe worse than they've ever been. The public is divided, the government seems more corrupt, we're less trusting of anyone in power, and the future just looks very, very bleak. Has the US changed forever (for the worse), or does it just feel that way to me because I've started paying close attention?