Why is it throughout history the white man has forced himself on foreign women?

I've noticed this strange pattern throughout the course of history. Whether it was native, African, middle eastern and even their own European women. In Europe it was seen as a custom after winning a war to capture the women and brutally rape them then murder them. It was a sick tradition carried out mostly by the Roman empire. Even as recent as the Iraq campaign, white soldiers would mercilessly rape local women and then kill them. I don't get it, if they are the standard of beauty why is it every race of women they have come into contact with they have caused mass rape and genocide? And then they call black men the rapists? Please!
You pasty beasts can deny facts all you want but it doesn't change a thing.