Most people think of Hitler when they think of Germany (even though he was actually Austrian). Hitler gives Germany a very negative image. But we are from Germany too, and we have done good things, such as helping to save the white tiger from extinction. So do we give Germany a positive image?