In American public schools, why do they only teach History?

  • Thread starter: Fiʂkün Trois

Fiʂkün Trois

Guest
And not Herstory?
I have noticed everything in the history books at school is focused predominantly on male accomplishments and downfalls. So the title History suits it, but why don't they teach us Herstory as well?
Some of you may not be aware, but 'history' is derived from 'his story', and 'herstory' from 'her story'.
Ioannis, women played a role in the past as well. The historians and gov't (who decide what American kids are taught) don't want us to know that, clearly.
 
Because society was dominated by men, they were the ones making almost all of the accomplishments. Jesus Christ, it's not a conspiracy, it's just how the world used to be, get over it.
 
Um, a more pressing question is why they only teach American history. I mean, sexism is endemic in America, but jingoistic know-nothingism has to be induced.

Why do I want soldiers in Iraq who cannot even find Miami on a map?
 