I'm at school, and all they teach is bad things about Germany: Germany did this, Germany did that... Yes, people should be taught about the Holocaust, WWI and WWII, but why are they always pointing the finger at Germany? Yes, Germany has done some bad things, but so has every country! They should teach children the good stuff about Germany, not just the bad stuff. There is so much hatred towards Germany and Germans at my school; I'm always getting called names like "a Nazi", "Hitler", etc. I think it's downright racism and discrimination. The teachers are not helping either, like my science teacher talking about Thalidomide, again pointing the finger at Germany!