Before I came to this beautiful continent, my mental picture of it was one of disease and hunger.
That is how it was portrayed to me by the UK media. So I went to Kenya in the summer of '07, and to my surprise it was quite different from what I had imagined. I had everything there that I did in the UK. There is quality education, housing, and healthcare in the cities. Of course there is hunger, disease, and a lack of electricity, but that's mostly in the villages. Needless to say, I fell in love with the country and now consider myself an African.
My question is: why do Western societies always paint the ugliest picture of Africa?