I'm an American from Texas. I have friends who are from the U.K., Germany, Ireland, Spain, and Italy, and they all seem to have a much broader knowledge of Africa than Americans do. Is this because there isn't as much of a stigma about black/white relations over there and it's taught in elementary...