Just a bit to get started: I'm too tired to gather all my thoughts on the subject, so I'll post more later.
Do you think that the United States, and other nations for that matter, should continue to support African nations financially, economically, etc.?
Keep in mind the fact that the world has been metaphorically raping the continent for well over 100 years.