Why do the left and pop culture constantly try to make us think that being white is actually a bad thing? Why do they have such low self-esteem and self-loathing guilt as to try to bring down the very people who built this nation, which they are benefiting from like a tumor from its host?
Is it a simple matter of misery loving company?