Western culture is a Christian culture.
They believe that the only way to have your soul saved is to be Christian.
That is what lies behind their belief that they will save the souls of the Indians, even if they kill all of them.
Probably because they just wanted to justify taking over their land and killing most of them. I don't think the Native Americans ultimately benefited from the white man's civilization. Their culture was destroyed, they ended up with bits of land and casinos, they lost their traditional diet, and they got diabetes from the bad habits the white man imposed on them. So no, overall I would say they didn't benefit at all!