...sports stars, or gangsters? I recently realized that white people (especially in the South) are raised to believe football is God. When they turn on the TV and see huge, muscular black guys, do they get scared or something? Also, when they see most black guys portrayed as gun-wielding criminals on MTV, do they take that seriously? I am white, but I went to private school, and I am not from the South.