So I was wondering if any of you could point me in the right direction. I'm looking for books that explain the COMPLETE history of Africa: from the first humans on earth (humanity originated in Africa) through colonization, decolonization, and so on. I should add that I would prefer the book to be written by an African or possibly an African American. No offense, but I'm not interested in hearing a white guy's take on a continent his ancestors have been exploiting for centuries.