Hollywood Can’t Get Africa Right
"The image of the continent has changed little since the days of Tarzan and Out of Africa."
In his bitingly sarcastic 2006 Granta essay, "How to Write About Africa," Binyavanga Wainaina facetiously urges Western authors to focus on Africa's dead bodies, "especially rotting naked dead bodies." His observations could also pertain to the way we film Africa. Despite the rise of Hollywood activism, the image of the continent has changed little since the days of Tarzan and Out of Africa: On screen, Africa still must be subdued or saved. Although idealistic doctors, crusading public-health advocates, and U.N. translators harboring revolutionary pasts have replaced colonial protagonists, the Africa we see in theaters is still very much the Dark Continent, filled with beautiful wildlife, savage humans, and wrenching poverty. The heroes and heroines are still invariably white, and their struggles, martyrdom, and occasional interracial trysts still drive the plot.