Abstract

This essay synthesizes research findings in the fields of microbiology, archaeology, and archaeobotany to explore the significance of malaria for the peopling of early tropical Africa before the Common Era. It contends that the human genetic responses to malarial infections in early tropical Africa constitute the earliest known chapters in the human experience with infectious disease. It also advances a new interpretation of the colonization of much of tropical Africa during the demographic processes known as the "Bantu expansions" (fifth to first millennia B.C.E.). It argues against diffusionist theories and in favor of a more integrated theory of the peopling of the continent.

The first chapters in human prehistory are the story of early humanity in tropical Africa. Over long epochs of foraging, our ancestors gradually developed tools, adapted to new environments, and created more complex material culture. The internal dynamics of the early epochs of the human past in tropical Africa, however, have remained relatively obscure. By contrast, the rapid advances of humanity in regions outside of tropical Africa during the extended Neolithic agricultural revolution are generally agreed to mark a revolutionary break with the early epochs. For this reason, it is customary for world historians, after acknowledging a long history of human foraging, to begin the teaching of world history with the agrarian era, some ten or eleven thousand years ago.1

This essay synthesizes research findings in the fields of microbiology, archaeology, and archaeobotany to explore the significance of malaria for the peopling of early tropical Africa before the Common Era. Malaria emerged as a human disease in tropical Africa, and over the long run of humankind on earth, it has likely killed more Africans than any other disease. This essay advances evidence that human genetic responses to malarial infections in early tropical Africa constitute the earliest known chapters in the human experience with infectious disease, revising the idea that the first accommodations to infectious disease environments took place in the agricultural settlements in the fertile river basins of northern Africa and southern and eastern Eurasia. It also advances a new interpretation of the colonization of much of tropical Africa during the fifth to first millennia B.C.E. demographic processes known as the "Bantu expansions." It contends that human accommodations to endemic malarial infections—in conjunction with the practice of yam and plantain vegeculture—resulted in an "immunological gradient" between rainforest villagers and foragers that played a major role in the expansion of Bantu-speaking peoples. This interpretation revises older interpretations that have stressed the role of iron making and political violence and supplements newer interpretations that have focused on cultural and linguistic evidence. This synthesis argues against the significance of diffusionist influences and for a more integrated theory of the peopling of early tropical Africa.

Tropical Africa and Human Disease History

In 1976, William H. McNeill published Plagues and Peoples, a highly influential synthesis of the history of human disease. In it the author argued that the disease environment of tropical Africa held a special significance for understanding early human history. The tropical African disease environment was extremely difficult, owing to the sheer variety of parasites that preyed upon human beings and of vector-borne diseases to which human beings were subject. Indeed, the first great advance in human health was the successful exodus of some human pioneers from tropical Africa. Outside of the African tropics, human emigrants discovered that they had left the ecological zones in which some of the most virulent vector-borne diseases were endemic and that as a result their disease burden was lighter. This improvement in health or, put differently, this increase in reproductive fitness, in turn led to more rapid population growth and ultimately to the establishment of settled communities, marking the most important transition—the agricultural revolution—in the long prehistory of humankind.2

For McNeill, subsequent major early developments in the history of humankind's relation to infectious disease thus took place outside of humid tropical Africa, with the growth of agricultural settlements in the fertile river basins of northern Africa (the Nile) and elsewhere around the southern and eastern reaches of Eurasia (Tigris-Euphrates, Indus, Yangtze, and Yellow Rivers). In these river basins, human beings began to experiment with the domestication of herd animals, and the close proximity of densely settled human populations to herd animals allowed for some of the pathogens of herd animals to adapt to human beings. A series of major, infectious, nontropical zoonoses such as chickenpox, smallpox, influenza, and measles emerged as diseases of human beings in this manner. Other infectious diseases, both bacterial and viral, that were either waterborne or transmitted by fecal-oral contamination likewise emerged in the centers of human settlement, where populations were sufficiently dense to support "crowd diseases." The consequences of this onslaught of infectious disease were momentous. The river basin populations suffered greatly but eventually developed limited immunities, and when the disease-experienced populations made contact with non-disease-experienced populations, the result was demographic shock (and sometimes collapse), with high rates of mortality and morbidity among the non-disease-experienced populations.3

Historians of Africa have credited the difficult disease environment of tropical Africa with a vital role in shaping the history of African civilizations. In a broad sense, the disease environment has been understood as a core challenge to the establishment of African civilizations.4 The difficult disease environment of tropical Africa is also held to have isolated tropical Africa, to some degree, from broader contacts with the wider world. It limited contacts between North Africa and sub-Saharan Africa and influenced the interactions between pastoral nomadic peoples and agricultural peoples in the Sahel.5 It raised the health costs of both Atlantic and Indian Ocean maritime contacts and thereby constrained the nature and extent of those interactions. The tropical African disease barrier is generally accepted as the principal reason why Africa was not colonized by Europeans during the same era as the Americas and why the long centuries of the Atlantic slave trade along the western coast of Africa took place between European traders and independent African states.6 Until very recently, however, there was little evidence to suggest that there were older disease processes that shaped earlier periods of the African past.

The Genomes of the Malarial Parasites

Over the past several years, microbiologists have decoded the genomes for two of the principal malarial parasites that have played such an enormous role in world history: Plasmodium vivax and Plasmodium falciparum.7 In their recent article in Clinical Microbiology Reviews (2002), Richard Carter and Kamini N. Mendis have skillfully assayed the microbiological research literature and other natural science and historical literatures on malaria to produce a major synthesis on the "Evolutionary and Historical Aspects of the Burden of Malaria."8

Plasmodium vivax

The genetic molecular evidence for vivax indicates that it is very ancient and that it diverged from an earlier malarial parasite that infected both apes and protohuman beings approximately two to three million years ago. It seems likely that vivax originated in Africa because of the strikingly broad and concentrated distribution of the genetic mutation known as red blood cell Duffy antigen negativity (hereafter "Duffy negativity") in contemporary West and Central African populations.9 This genetic mutation affects an antigen receptor on the surface of the red blood cell that the vivax parasite normally uses to invade the cell. Duffy negativity renders the individual with this mutation unable to contract vivax malaria.10

When did this mutation arise? The molecular genetic evidence indicates that vivax diverged from a parasite of Old World monkeys, Plasmodium cynomolgi, approximately two to three million years ago. It thereafter developed into a scourge of the hominids. It is not possible to estimate the extent of the disease burden that vivax imposed in earliest Africa—or even during the career of Homo sapiens over the last few hundred thousand years—because of the multitude of ecological variables that are involved in malaria transmission and the fact that these vary over space and time. It is certain, however, that this disease burden increased over time. Powerful evidence of this increasing burden can be inferred from the fact that the mosquitoes that are the principal vectors for malaria on the African continent, Anopheles gambiae and Anopheles funestus, developed a marked preference for human blood—rather than animal or bird blood—and became the most efficient of the world's malaria-transmitting anopheline vectors.

It is not possible to estimate the morbidity or mortality costs imposed by the increasing malarial burden. It is possible, however, and perhaps even likely, that the comparatively low levels of virulence that are documented today in a variety of vivax malaria environments—with mortality usually in the range of 1 to 2 percent and exceptional mortality of up to 5 percent—are lower than they were in earlier epochs. This would be in keeping with a general pattern of the attenuation of virulence when diseases of the humid tropics are transferred to other environments that are either less humid or less warm (or both). In these conditions, infectious agents often make accommodations to facilitate their transmission under less favorable conditions.11

And there are other uncertainties that cloud even the big picture of Duffy negativity. It is not possible to determine where the genetic variation arose or if it arose independently in more than one population. And yet at some point after the migrations of modern human beings out of tropical Africa—a process that is generally held to have taken place during the period 100,000–50,000 B.P. and perhaps 70,000–50,000 B.P.12—our early ancestors apparently largely succeeded in throwing off the burden of vivax malaria that would later torment populations elsewhere around the planet.13 An extraordinary 97 percent of West and Central African populations today carry the mutation for Duffy negativity and thus are physically unable to contract vivax malaria. This genetic adaptation has come without any negative health consequences, unlike other genetic adaptations that human beings have undergone in response to malarial infection.14

The Spread of Duffy Negativity

What forces might have proved conducive to this genetic adaptation? And when, approximately, did this genetic adaptation become widespread? There is no direct evidence that bears on these questions, and all attempts to answer them proceed from logical inference. A common view among microbiologists is that Duffy negativity likely became widespread at the very end of the long period 97,200–6500 B.P. There are two strands of reasoning. The first stresses the importance of climate change as a precursor to the emergence of the "Neolithic agricultural revolution." In this view, climate change—the rapid warming of the planet at the end of the last glacial period—was responsible for more mosquitoes and thus an increase in the malarial burden.15 This increased burden was unstable, and it was amplified further as the warming continued. The second strand stresses the beginnings of agriculture in tropical Africa, which were coincident with this warming and stimulated by the arrival of seeds and new technologies from the early river basin civilizations. According to this interpretation, the arrival of the "Neolithic agricultural revolution" provoked a vigorous biological response—progress toward the near fixation in the West and Central African populations of Duffy negativity—that blunted the force of vivax malaria.16 According to this view of world history, human populations in humid tropical Africa did not begin to enjoy an increase in their rates of population growth and cultural development until the introduction of new seeds and technologies that emerged from the river basin civilizations and then diffused south.17

From the standpoint of recent Africanist archaeology, this interpretation of the history of Duffy negativity is somewhat problematic. If the selective fixation of Duffy negativity took place in the period beginning ca. 3000–2000 B.C.E., this adaptive genetic mutation would have required a high level of integration among disparate populations during a relatively brief period of several thousand years in order to be transmitted extensively throughout West and Central African populations to produce the near universal present-day distribution. This high level of integration is not suggested by cultural or linguistic studies.

The archaeological record does, however, suggest other possibilities for understanding the emergence and fixation of Duffy negativity that are consonant with a new appreciation of Africa's role in early human history. In recent years, Africanist archaeologists have done much to revise the chronologies of human evolution that were developed based on the European archaeological record, in light of the longer African archaeological sequences. In the broadest sense, it is now clear not only that Homo sapiens emerged in Africa but also that very early processes of human cultural evolution took place in tropical Africa.

The findings of Africanist archaeologists have overturned the "human revolution" model of development that held that "modern" behavior of human beings gained first expression in Europe in a burst of evolutionary change. Instead, the Africanist findings have located the origins of "modern" human behavior where one might have suspected—in Africa—and revised the basic chronology of early human history. The discovery of an early "middle Stone Age worked bone industry"—thought to represent large-scale seasonal fishing expeditions that would have necessitated seasonal settlement—located the early origins of complex thinking and behavior necessary for major cultural changes in east-central Africa rather than in Europe. This discovery also pushed back the temporal horizon of these cultural changes from ca. 35,000 B.P. to ca. 89,000 B.P.18 Indeed, it now appears that the transition from the middle Stone Age to the late Stone Age in tropical Africa was an extremely gradual process that took place over perhaps two hundred thousand years.

These new interpretations of the African archaeological evidence thus argue that new human cultural behaviors arose at varying times and places across the African continent and that these behaviors were exported with the African migrants who left to populate other regions of Eurasia. The archaeological records of these cultural achievements, although interrupted during glacial maxima in some parts of Africa such as the Sahara and the interior of Cape Province in South Africa, are sufficiently numerous in other African biomes, including the expanses of steppe, savanna, and woodland, to support the idea of the continuous presence of probably widely dispersed human populations across these diverse environments.19

Africanist archaeologists have also documented the intensification of African cultural practices after about 50,000 B.P.—including novel technologies such as new projectiles that increased the productivity of hunting and new fishing methods that allowed for the more efficient exploitation of a vast resource—technologies that could support larger populations in a given territory. Moreover, these technologies apparently allowed for human groups to expand into new habitats, such as the tropical forest.20 For the study of early malaria, the cumulative significance of these archaeological findings is that human communities living in tropical Africa did not wait until the arrival of a Neolithic "development package" ca. 3000 B.C.E. to expand their exploitation of new environments—including those that were ideal for the transmission of malarial infections.

The very early patterns of seasonal migration and riverbank settlement, tens of thousands of years before the last glacial maximum, would have provided ideal conditions for seasonal, and thus unstable, malarial infection. In one of the ironies of malarial infection, it is the lower levels of infection (as compared to stable endemic malaria among settled communities) that create very dangerous patterns of transmission—what is known as unstable malaria.21 Today, vivax malaria in regions outside of Africa is associated with low levels of mortality (under 5 percent, and often in the range of 1 to 2 percent); this is thought to result from the ancient spread of vivax malaria to non-tropical-African regions in which stable rates of transmission produced limited immunity through repeated infection. According to this line of argument, unstable conditions in early Africa—with higher attendant mortality—would have produced selection pressures that strongly favored the spread of Duffy negativity.22 The upshot is that it is no longer necessary to propose a dramatic "malaria revolution" in early tropical Africa that is a product of the introduction of a "Neolithic agricultural revolution," or to link the refractoriness to vivax malaria exclusively to a cycle of climate change. It appears far more likely that the refractoriness to vivax malaria emerged across the long process of human cultural evolution, in which human beings entered into new biomes and practiced seasonal settlement over a period of many tens of thousands of years.

This gradualist model of the spread of Duffy negativity is necessarily speculative. Its strengths are (1) that it is consonant with the microbiologists' broad time estimate for the emergence of Duffy negativity and (2) that it fits well with the archaeologists' interpretations of their data. The implications of the gradualist model are that, on the basis of available evidence, human refractoriness to vivax malaria as a result of Duffy negativity appears to be the very earliest known chapter in human beings' genetic adaptation to vector-borne infectious disease and, indeed, the very earliest known chapter in humanity's long struggle with parasitic disease. This genetic adaptation appears to have occurred well before Eurasian populations domesticated livestock and thereby gained limited immunities to the parasites that were attached to the domesticated animals of the Eurasian steppe—the horse, cow, goat, sheep, camel, and yak—or, for that matter, the donkey of the Nilotic steppe.23

The demographic consequence of the emergence of Duffy negativity was likely significant. Its fixation reduced the disease burden for the communities who inherited the mutation and contributed thereby to an increase in their population. Ongoing population growth was further facilitated by the new technologies and strategies that human communities developed in the post–50,000 B.P. period. Thus, on this basis alone, all other things being equal, it seems probable that an accelerated process of population growth among some populations in sub-Saharan Africa began to occur in an early period—well before the arrival of seeds and new technologies from the early river basin civilizations. It is likely that Duffy negativity became even more widely expressed during the processes of more intensive rainforest exploitation that occurred in more recent millennia and that are discussed later in this essay.

Plasmodium falciparum

The genetic molecular evidence concerning the emergence of modern Plasmodium falciparum is ambiguous, and microbiologists are not agreed on its interpretation. On the one hand, molecular analysis has shown that an ancestral parasite of simian falciparum malaria diverged from the even older parasite of bird malaria approximately 130 million years ago, and the molecular analysis strongly suggests that modern human falciparum malaria diverged from an ancestral form that was common to both apes and protohumans approximately 4 to 10 million years ago, very roughly coincident with the epoch during which the hominid line diverged from the line of the African great apes.24 But another part of the parasite genome suggests a more recent development. One line of investigation has suggested that the modern form of falciparum diverged from a protofalciparum parasite very recently, that is, only in the period 8000–3000 B.C.E.25 And, most recently, a comparison of mitochondrial genome sequences has found a major stepwise growth in the parasite population (a proxy for the increased incidence of human infection) in the period 13,000–8000 B.C.E.26 These findings indicate that human beings in Africa experienced a marked increase in P. falciparum infections many millennia before the earliest (ca. 3000 B.C.E.) adoption of the package of Neolithic tools and seed agriculture by tropical African communities.27

The molecular evidence suggests that large-scale falciparum infections may have occurred tens of thousands of years later than vivax infections. One likely reason can be found in differences in life cycles of the plasmodia: chains of vivax infection are easier to sustain among hunters and gatherers and in groups that practice seasonal settlement because vivax parasites have a period of incubation in their life cycle that takes place in the human liver and can last for six to nine months. After incubation in the liver, vivax malaria can relapse—for up to as long as three years after the initial infection. Thus, vivax parasites could travel with the seasonal settler back into the rainforest or woodland or savanna, and carriers would remain able to infect mosquitoes and continue the chain long after leaving the settlement site. By contrast, falciparum sufferers remained infectious for a far shorter period of time. For this reason, the transmission of falciparum malaria is far more dependent upon continuous high host density.28

The human genetic accommodations to falciparum infections were also fraught with difficulties. Members of human communities who suffered assaults from falciparum parasites in tropical Africa came to carry a genetic mutation of the hemoglobin molecule known as sickle cell hemoglobin, or hemoglobin S. In some tropical African communities today, up to 25 or 30 percent of the population have inherited the sickle cell gene from one of their parents and a normal hemoglobin gene from the other parent and are thus heterozygous for the mutation. Children who are heterozygous for the sickle cell gene have only one-tenth the risk of death from falciparum as do those who are homozygous for the normal hemoglobin gene.29 There is, however, a decided downside to the sickle cell. Those carriers who inherit the gene from both parents and thus are homozygous for sickle cell develop sickle cell anemia. They suffer an early death, before the age of reproduction. The sickle cell mutation thus conveys both costs and benefits to the human communities in which it becomes established. In genetic terms, the distribution of the sickle cell is thought to increase to the point where the aggregate survival advantages for heterozygous carriers balance the costs of sickle cell anemia to these communities. At this point it is considered to be in equilibrium, as a "balanced polymorphism."30
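The equilibrium logic of a balanced polymorphism can be made concrete with the standard population-genetics model of heterozygote advantage. The following worked example is an illustrative sketch, not a calculation drawn from the sources discussed here; the selection coefficients $s$ and $t$ are assumed values chosen only to display the arithmetic. Let the relative fitnesses of the three genotypes be

$$w_{AA} = 1 - s, \qquad w_{AS} = 1, \qquad w_{SS} = 1 - t,$$

where $s$ is the excess malaria mortality borne by normal homozygotes and $t$ is the fitness cost of sickle cell anemia. Selection drives the frequency $q$ of the sickle cell allele $S$ toward the equilibrium

$$q^{*} = \frac{s}{s + t}.$$

If sickle cell anemia is lethal before the age of reproduction ($t = 1$) and roughly a quarter of the population is heterozygous, as in the communities cited above ($2p^{*}q^{*} \approx 0.26$, so $q^{*} \approx 0.15$), the implied selection coefficient is

$$s = \frac{q^{*}}{1 - q^{*}} \approx 0.18,$$

that is, gene frequencies of the magnitude observed in tropical Africa are consistent with malaria-related mortality on the order of 18 percent among those who lack the protective allele.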

Early Processes of Rainforest Exploitation

Researchers in the fields of agronomy and archaeobotany have developed new understandings of early human exploitation of the African tropical rainforest environments. Their findings have complicated the simple and familiar model of the development of human societies that postulated that early (modern) human groups in tropical Africa were hunters, gatherers, and fishers; that after the introduction of seed agriculture, human groups settled and multiplied more rapidly; and that thereafter, human societies became complex. Recent research in archaeology and historical linguistics has upended a long-standing view that the introduction of agriculture from the early river basin societies triggered a revolutionary advance in tropical African societies. This older view can be thought of as the "diffusionist" model of development.

This diffusionist model has come under challenge from Africanist scholars who have investigated human development in humid tropical Africa in the late Stone Age. One dimension of their challenge is temporal: it involves a revision of our understanding of the era in which human groups began to exploit more fully the rainforest resources. The histories of the oil palm (Elaeis guineensis Jacq.) and the West African white and yellow Guinea yams (Dioscorea rotundata and D. cayenensis, respectively), all of which occurred naturally in the West African transitional biome between the woodlands and the forest, have been of particular research interest.31

The oil palm today is found in both West and West Central Africa. Specialists think that the occurrence of the oil palm is older in West Africa than in West Central Africa, and some hold that human beings probably played a role in its introduction and expansion into the rainforests of West Central Africa early in the Holocene period (7000–2500 B.C.E.). By the late Holocene it had become a common element in the subsistence economy in both West and West Central Africa. According to the archaeobotanist M. A. Sowunmi, it is likely that the human use of fire in rainforest clearings prevented the regrowth of forest trees in the openings and thus brought about an anthropogenically enhanced expansion of the natural oil palm stands. The oil palm, from which edible oils (both palm kernel oil and palm oil) can be extracted and which can be tapped for toddy to make wine, is one of the most economically useful plants in West and West Central Africa.32 This interpretation of the archaeobotanical evidence for the expansion of the oil palm holds that it is a proxy for the increased human presence in and exploitation of the rainforest biome.33

The question of the cultivation origins of the white and yellow Guinea yams, distinguished by their high caloric yields, has attracted the attention of researchers since the 1960s. In publications during the 1960s and 1970s, the preeminent scholar of West African yams, D. G. Coursey, advanced an evolutionary model of yam use that reached far back in time. He held that hunters and gatherers had exploited wild yams even before 60,000 B.P. and that in the period 45,000 to 15,000 B.P. the late Stone Age peoples had begun to develop ritual concepts and practices to protect the yam plants. According to Coursey, by 9000 B.C.E. these peoples had begun to develop a "protoculture" based on the replanting of selected wild plants.

In Coursey's model, a "diffusionist" impulse remained important. In his view, by 3000 to 2000 B.C.E. Neolithic grain cultivators, influenced by the agriculturalists of southwestern Eurasia, had moved south from the Middle Niger River valley, interacted with the "protoculturalists," and created yam cultivation. For Coursey, this yam culture began to spread deeper into the forest with the advent of iron working, around 500 B.C.E. The forest ecologies favored cultivation of the yam over grain crops, and yam growers could produce more calories and thus achieve numerical superiority over grain farmers and create complex culture systems.34

The concept of protocultivation has itself been revised since the time of Coursey's pioneering work. Recent scholarship has argued for early African yam practices that were highly productive but were not a stepping-stone on the path to cultivation. Edmond Dounias has termed this "paracultivation" to distinguish it from the evolutionarily loaded term "protocultivation." In essence, paracultivation consists of the voluntary reburial of the wild yam head after tuber harvesting. The plant is thereby maintained in its original environment.35 These practices apparently advanced the "ennoblement" of the Guinea yams—that is, the genetic selection of the better yams to harvest and propagate that resulted in harvestable tubers that were far superior for comestible purposes to those produced by wild plants.

Linguists have also developed historical evidence of early yam cultivation. In the 1980s, the linguist Christopher Ehret judged that the words for "cultivation" and "yam" in proto-Niger-Congo date back to at least 8000 B.C.E. Recently, Ehret has synthesized a wealth of archaeological evidence and located the invention of "West African planting agriculture" by at least 8000 B.C.E.—thousands of years before the arrival of Neolithic seeds and agricultural techniques from the Fertile Crescent or before the emergence of the complex societies that arose in the middle valley of the Niger.36 These new understandings of early Holocene exploitation of the rainforest edges and rainforest openings for yam paracultivation and then yam cultivation further delink the African historical experience from the diffusionist model that stressed the adoption of the "Neolithic agricultural revolution" by hunters and gatherers in tropical Africa.

An Immunological Gradient in Tropical Africa

Yam paracultivation and then yam cultivation took place in cleared woodlands near the rainforests or in rainforest openings. These microenvironments were created through the use of fire and the stone ax—and beginning in the first millennium B.C.E., the iron ax. There were direct epidemiological implications. The fire-cleared openings in woodland and rainforest were the ideal environments for the gambiae and funestus mosquitoes to breed in. (These mosquitoes do not breed in swamps or full rainforests.) The cultivation of yams entailed lengthier residence and work in sites conducive to anopheline mosquito breeding, and thus encounters with falciparum malaria would have intensified with the shift from paracultivation to cultivation. It seems likely that in this era the increasing burden of falciparum parasites—with many first attacks fatal and others debilitating—began to select for a genetic mutation to mitigate the damage.37

Within the early tropical woodland and rainforest settlements, new disease dynamics became established. When yam paracultivators became yam cultivators and remained near their plantings, this created an environment of continual malarial infection, with paradoxical consequences. Individuals who survive the initial bouts of falciparum malaria and live in environments of endemic infection with high parasite loads develop short-term partial immunities that greatly reduce suffering from the disease. In this respect, after the initial onslaught of falciparum malaria, the survivors are much safer if they continue to live in a settlement of continuous stable infection. When individuals leave, for periods of even less than one year, their immunities deteriorate.

The world of the early yam-growing tropical African village was thus, in epidemiological terms, a different disease world from the rainforest that surrounded it. When hunters and gatherers made even brief contact with villages of stable falciparum infection—yet where the villagers themselves appeared to be in good health—these contacts produced sharply elevated mortality and morbidity among hunters and gatherers, who were at ongoing risk for death and debilitation at successive contacts. Thus, the establishment of permanent settlements in tropical Africa created an "immunological gradient" that shifted steeply from villagers to hunters and gatherers in the rainforests and woodlands around them.

Tropical Vegecultural Frontiers

From the woodlands of West Africa, pioneers extended the practice of vegeculture, centered on yam cultivation, deep into the rainforests. There were two major frontiers of this multiregional, vegecultural expansion. One was within West Africa itself. The West African speakers of languages of the "West Atlantic" grouping of the Niger-Congo language family moved south from the savanna and woodlands into the rainforest belt above the Gulf of Guinea. These rainforests were not entirely unfamiliar to them, and indeed there were no autochthonous peoples who spoke languages from a different language grouping in the West African rainforests.38 From the point of view of historical linguistics, these movements have attracted relatively little scholarly attention.

A second frontier of expansion was initiated from the border of what is today Nigeria and Cameroon. From this region, Bantu speakers spread across tropical Africa to the south and east, into West Central and East Africa. They encountered hunters and gatherers who were Batwa speakers, and over a period of a few thousand years, in two great migrations (5000–4000 B.C.E. and 1500–500 B.C.E.), the Bantu speakers spread their languages over the vast expanse of equatorial Africa and much of eastern Africa. Because of the relatively recent periods in which these migrations took place and the phenomenal extension of the Bantu language zone, the Bantu speakers' phases of expansion have attracted the attention of scholars from a variety of disciplines.39

Recently, Kairn Klieman has brilliantly reconceptualized the relationship between Bantu-speaking villagers and Batwa-speaking peoples ("Pygmies"). She argues from cultural and linguistic evidence that the relationship between Bantu-speaking immigrants and Batwa-speaking hunting and gathering autochthons that began with the first Bantu migration (5000–4000 B.C.E.) was profoundly changed with the introduction of the banana/plantain complex and iron working during the second Bantu expansion, in the late Stone to Metal Age (1500–500 B.C.E.).40

Probably during the end of the second millennium B.C.E. or the first half of the first millennium B.C.E., the plantain/banana complex spread west from eastern Africa into the central equatorial rainforests.41 There it likely took over the role of the staple food.42 The plantain/banana complex, along with yam cultivation and, by this late date, some use of seed agriculture, provided a robust basis for village settlement. In the same way that the yam and oil palm played important roles in the opening of the rainforests during the late Holocene, in later millennia tropical Africans widely adopted the plantain/banana complex and incorporated it into the heart of their rainforest village economies.

Over the course of these "Bantu" migrations, and particularly following the adoption of the banana/plantain complex, rainforest villagers were able to establish larger permanent settlements in the rainforests that became centers of falciparum infection.43 This disease process must have been one of the major factors that led to the replacement of non-Bantu-speaking peoples with Bantu-speaking peoples over vast areas of the continent in a very slow process that unfolded over many centuries.44 Falciparum malaria would have dramatically reduced the numbers of all peoples who visited the village zones of stable malaria, much as it killed European visitors millennia later during the years of the Atlantic slave trade, the era of the "white man's grave."

Historians have hitherto sought explanations for the expansion of the Bantu speakers in other material processes, such as the adoption of yam cultivation (which yields large numbers of calories and would have contributed to population growth) and the adoption of new iron technologies. The pairing of yams and iron, however, has been criticized on the basis that iron tools are not necessary for yam cultivation. (Wooden digging sticks are admirably suited to the work.) Others have held that in the Bantu migrations, the principal use of iron was for weaponry, strengthening the ability of the Bantu speakers to dominate in war.45 This argument is greatly weakened by the lack of any archaeological or linguistic evidence of such warfare.

A more plausible interpretation emerges from the consideration of the processes of rainforest exploitation and the dynamics of falciparum infection. The expansion of the zone of Bantu language speakers appears to be based upon the demographic advantages of high-yielding yam and plantain/banana cultivation, in conjunction with a tropical falciparum malarial "immunological gradient." In this light, the processes of the Bantu expansions are direct analogues to the expansions of the disease-experienced peoples of early village Eurasia.46

Figure 1. Hemoglobin S (Sickle-Cell Gene S) distributions. Reproduced from Stuart J. Edelstein, The Sickled Cell: From Myths to Molecules (Cambridge, Mass.: Harvard University Press, 1986), p. 149, with permission from Harvard University Press. (Adapted from Yuet Wai Kan and Andree M. Dozy, "Evolution of the Hemoglobin S and C Genes in World Populations," Science, n.s. 209, no. 4454 [1980]: 388-391. Copyright 1980 by the AAAS.)

The contemporary distribution of sickle cell mutations in West and Central Africa bears eloquent witness to the end results of these early processes. Today, sickle cell mutations occur in two major independent groups, identified by the size of the DNA restriction fragment on which the hemoglobin mutation is carried: a 7.6-kb fragment and a 13-kb fragment. The first is found in Central Africa in the Zaire basin. The second is found in the Niger delta region. A third independent origin of sickle cell mutation, which is less severe in its consequences for homozygous individuals, is centered in Sierra Leone.47

The map in Figure 1 displays the spatial distribution of these three independent sickle cell mutations. As Stuart Edelstein, one of the leading authorities on sickle cell, has argued, both the 13-kb and 7.6-kb fragments offer the same protection against falciparum malaria for those heterozygous for the mutations and the same costs for those who are homozygous. The mutations could not have arisen prior to the Bantu expansions; otherwise, the 7.6-kb mutation, to establish itself so prominently, would have had to confer a decisive advantage over the 13-kb mutation, which it does not. Edelstein estimated that these mutations arose independently in recent millennia, probably during the first millennium B.C.E. and/or the first millennium C.E.48 This estimate would be roughly coincident with or follow the full adoption of the plantain/banana complex. Thus, sickle cell, even if now widely distributed, appears to have emerged only in the final centuries of, or even in the aftermath of, the second Bantu expansion.

Conclusion

This essay examines microbiologists' evidence for the genomes of vivax and falciparum malaria as a means to explore demographic processes in early tropical Africa that predate the food revolutions of the early river basin civilizations. It suggests that there were very early processes of growth and change in tropical Africa that unfolded independently from the historical processes that took place in the great river basin civilizations, including that of the Middle Niger. The molecular biological evidence suggests that malaria was a principal constraint on population growth in tropical Africa and that this demographic challenge existed well before the establishment of permanently settled communities. The challenge of vivax malaria apparently began to be met by the emergence of Duffy negativity long before the rise of the river basin civilizations. Refractoriness to vivax meant ipso facto an enhanced possibility of demographic growth. The development of new fishing and hunting technologies after 50,000 B.P. increased the ease with which human groups harvested food and thereby contributed to population growth. Indeed, it is possible that this population pressure was in part responsible for the increasing forays into the forests and the early transition from yam gathering to paracultivation and then to cultivation.49 In these contexts, Duffy negativity became more widely expressed in tropical African populations.

The demographic challenge created by falciparum malaria is more recent. With the transition of yam paracultivators to yam cultivators in the rainforests, human communities established lengthier periods of residency and provided the critical requirement of more continual host density. This created village zones of falciparum infection that exacted high mortality and morbidity costs from hunters and gatherers who could not acquire the limited immunities that came with village life. With the expansion of the zones of yam cultivation, this immunological gradient played a role in the expansion of Bantu-speaking peoples. After the adoption of the plantain/banana complex, rainforest village communities became larger and more stable epidemiologically, and it is in these contexts that the sickle cell mutation spread and that Duffy negativity came closer to its contemporary near universal distribution.

James L. A. Webb Jr.
Colby College

Footnotes

1. For a recent effort to bring the early epochs into world history, see David Christian's fine introductory chapters "Beginnings: The Era of Foragers" and "Acceleration: The Agrarian Era" in the five-volume Berkshire Encyclopedia of World History, ed. William H. McNeill (Great Barrington, Mass.: Berkshire Publishing Group, 2005), 1:1-35.

2. William H. McNeill, Plagues and Peoples (New York: Anchor Books, 1977). McNeill's broad schema has been widely accepted. In a recent authoritative article, David E. Stannard cites McNeill repeatedly in portraying the broad outlines of human disease history. See Stannard, "Disease, Human Migration, and History," in The Cambridge World History of Human Disease, ed. Kenneth F. Kiple (Cambridge: Cambridge University Press, 1993), pp. 35-44.

3. These themes have been reprised to great success by Jared Diamond in his book Guns, Germs, and Steel (New York: W. W. Norton & Company, 1997).

4. For the Cambridge historian John Iliffe, for example, the core theme in African history has been the struggle of Africans to overcome their environment to achieve positive rates of population growth. John Iliffe, Africans: The History of a Continent (Cambridge: Cambridge University Press, 1995). Some historians have found Iliffe's insistence on the primacy of this theme in the history of recent centuries to be excessive.

5. James L. A. Webb Jr., Desert Frontier: Ecological and Economic Change along the Western Sahel, 1600-1850 (Madison: University of Wisconsin Press, 1995).

6. Philip D. Curtin, "'The White Man's Grave': Image and Reality, 1780-1850," Journal of British Studies 1 (1961): 94-110, and Death by Migration (Cambridge: Cambridge University Press, 1989); Dennis G. Carlson, African Fever: A Study of British Science, Technology, and Politics in West Africa, 1787-1864 (Canton, Mass.: Watson Publishing International, 1984).

7. There are two other human malarial parasites: Plasmodium ovale and Plasmodium malariae. Both are far less lethal than falciparum, and they have received less attention from researchers. Other malaria-protective genetic mutations have emerged within African populations, such as thalassemias, glucose-6-phosphate dehydrogenase deficiency (also known as G6PD deficiency), and hemoglobin C (a genetic variation that is allelic with that for sickle cell but that does not impose costs as severe on individuals who are homozygous for this variation). These polymorphisms are not as protective as Duffy negativity and the sickle cell mutation discussed in this paper. The historical issues concerning the emergence of these polymorphisms have not received as much scientific attention from the molecular biological community.

8. R. Carter and K. N. Mendis, "Evolutionary and Historical Aspects of the Burden of Malaria," Clinical Microbiology Reviews 15 (2002): 564-594.

9. The deep geographical origin of vivax is contested. One view holds that vivax malaria may have been present in the New World prior to European contact in the late fifteenth and early sixteenth centuries. Although most of the evidence suggests that this was not the case, recently scholars have called for biomolecular analysis of existing skeletal remains in the Amazon to reach a definitive conclusion. See Marcia Caldas de Castro and Burton H. Singer, "Was Malaria Present in the Amazon Before European Conquest? Available Evidence and Future Research Agenda," Journal of Archaeological Science 32 (2005): 337-340. The two principal views are that vivax originated either in Africa or in Southeast Asia. Because of the large number of variables and the nature of the evidence, there is latitude for diverse interpretation, and it is unlikely that the issue will be definitively settled. For a recent argument that vivax originated in Southeast Asia, see Ananias A. Escalante, Omar E. Cornejo, Denise E. Freeland, Amanda C. Poe, Ester Durrego, William E. Collins, and Altaf A. Lal, "A Monkey's Tale: The Origin of Plasmodium vivax as a Human Malaria Parasite," Proceedings of the National Academy of Sciences 102, no. 6 (2005): 1980-1985. In this view, vivax never was present in Africa; Duffy antigen negativity is either the result of a genetic accommodation to another, unknown agent or a random genetic variation that became fixed. On the African origins of vivax, see Richard Carter, "Speculations on the Origins of Plasmodium vivax Malaria," Trends in Parasitology 19, no. 5 (2003): 214-219. For an overview of some of the complications in unraveling the history of vivax, see Stephen M. Rich, "The Unpredictable Past of Plasmodium vivax Revealed in Its Genome," Proceedings of the National Academy of Sciences 101, no. 44 (2004): 15,547-15,548.

10. Duffy antigen negativity (FY*0) is one of three forms of mutation on the Duffy antigen receptor. Duffy A (FY*A) and Duffy B (FY*B) confer limited immunity to malaria. Neither FY*A nor FY*B occurs in tropical Africa. For more on Duffy mutations, see Martha T. Hamblin and Anna Di Rienzo, "Detection of the Signature of Natural Selection in Humans: Evidence from the Duffy Blood Group Locus," American Journal of Human Genetics 66 (2000): 1669-1679; Martha T. Hamblin, Emma E. Thompson, and Anna Di Rienzo, "Complex Signatures of Natural Selection at the Duffy Blood Group Locus," American Journal of Human Genetics 70 (2002): 369-383. The biological process is termed "complement fixation." It involves the binding of a "complement" (a heat-sensitive, complex system in fresh human and other sera) to an antigen-antibody complex so that the complement is unavailable for subsequent reaction.

11. The issue of the evolution of virulence is complex and unresolved. For an introduction, see Alison P. Galvani, "Epidemiology Meets Evolutionary Ecology," Trends in Ecology and Evolution 18, no. 3 (2003): 132-139. See also Margaret J. Mackinnon and Andrew F. Read, "Virulence in Malaria: An Evolutionary Viewpoint," Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences 359, no. 1446 (2004): 965-986.

12. Human populations apparently were greatly reduced in number, perhaps to as few as five thousand females, approximately 70,000 B.P. This "population bottleneck" coincides with a cataclysmic eruption of the Toba volcano in Sumatra, which produced a series of volcanic winters. See Stanley H. Ambrose, "Late Pleistocene Human Population Bottlenecks, Volcanic Winter, and Differentiation of Modern Humans," Journal of Human Evolution 34 (1998): 623-651; Michael R. Rampino and Stanley H. Ambrose, "Volcanic Winter in the Garden of Eden: The Toba Supereruption and the Late Pleistocene Human Population Crash," in Volcanic Hazards and Disasters in Human Antiquity, ed. F. W. McCoy and G. Heiken, Geological Society of America special paper 345 (Boulder, Colo.: Geological Society of America, 2000). For critical interpretations of the Toba eruption and a rejoinder, see F. J. Gathorne-Hardy and W. E. H. Harcourt-Smith, "The Super-Eruption of Toba, Did It Cause a Human Bottleneck?" Journal of Human Evolution 45 (2003): 227-230, and Stanley H. Ambrose, "Did the Super-Eruption of Toba Cause a Human Population Bottleneck? Reply to Gathorne-Hardy and Harcourt-Smith," Journal of Human Evolution 45 (2003): 231-237.

13. M. T. Hamblin and A. Di Rienzo, the authors of a recent genetic analysis, have proposed that a "selective sweep" toward the fixation of Duffy negativity in sub-Saharan populations began between 97,200 and 6,500 years ago. See their "Detection of the Signature of Natural Selection in Humans," pp. 1669-1679.

14. For a survey of genetic responses to malarial pressure, including sickle cell (discussed later in the present paper), see Carter and Mendis, "Evolutionary and Historical Aspects," pp. 570-573.

15. Carter and Mendis, "Evolutionary and Historical Aspects," pp. 577-578. The last glacial maxima in tropical Africa have been estimated at between 22,000 and 12,000 B.P. See A. S. Brooks and P. T. Robertshaw, "The Glacial Maxima in Tropical Africa: 22,000-12,000 B.P.," in The World at 18,000 B.P., vol. 2, Low Latitudes, ed. O. Soffer and C. Gamble (London: Unwin Hyman, 1990), pp. 121-169.

16. The thesis that resistance to malaria is tied to the "agricultural revolution" in Africa is broadly accepted. See, for example, Nina L. Etkin, "The Co-Evolution of People, Plants, and Parasites: Biological and Cultural Adaptations to Malaria," Proceedings of the Nutrition Society 62 (2003): 311-317; J. C. C. Hume, J. Lyons, and K. P. Day, "Human Migration, Mosquitoes and the Evolution of Plasmodium falciparum," Trends in Parasitology 19, no. 3 (2003): 144-149.

17. This argument was first advanced by F. B. Livingstone in his classic article "Anthropological Implications of Sickle Cell Gene Distribution in West Africa," American Anthropologist 60 (1958): 533-562. Recently, M. Coluzzi has written in support of this thesis in "The Clay Feet of the Malaria Giant and Its African Roots: Hypotheses and Inferences About the Origin, Spread, and Control of Plasmodium falciparum," Parassitologia 41 (1999): 277-283.

18. John E. Yellen, Alison S. Brooks, Els Cornelissen, Michael J. Mehlman, and Kathlyn Stewart, "A Middle Stone Age Worked Bone Industry from Katanda, Upper Semliki Valley, Zaire," Science 268, no. 5210 (1995): 553-556. For the interpretive comparison with Europe, see B. Bower, "African Finds Revise Cultural Roots," Science News 147, no. 17 (1995): 260.

19. Sally McBrearty and Alison S. Brooks, "The Revolution That Wasn't: A New Interpretation of the Origin of Modern Human Behavior," Journal of Human Evolution 39 (2000): 458.

20. Ibid., p. 532.

21. Populations that have no immediate experience with malaria can be subject to epidemic malaria. Epidemic malaria is also extremely dangerous. By definition, epidemic outbreaks are episodic; there are fewer prospects for genetic accommodation.

22. Carter and Mendis, "Evolutionary and Historical Aspects," p. 573.

23. For an ambitious, brilliant interpretation of the period 9000-1000 B.C.E., see Christopher Ehret, The Civilizations of Africa: A History to 1800 (Charlottesville: University of Virginia Press, 2002), pp. 59-158.

24. Some scholars estimate the period at five to seven million years ago for the divergence of modern falciparum malaria. A. A. Escalante and F. J. Ayala, "Phylogeny of the Malarial Genus Plasmodium, Derived from rRNA Gene Sequences," Proceedings of the National Academy of Sciences USA 91 (1994): 11,373-11,377.

25. Carter and Mendis, "Evolutionary and Historical Aspects," especially pp. 572 and 578. See also Xin-zhuan Su, Jianbing Mu, and Deirdre Joy, "The 'Malaria's Eve' Hypothesis and the Debate Concerning the Origin of the Human Malaria Parasite Plasmodium falciparum," Microbes and Infection 5 (2003): 891-896; Daniel L. Hartl, "The Origin of Malaria: Mixed Messages from Genetic Diversity," Nature Reviews Microbiology 2 (2004): 15-22.

26. David J. Conway, "Tracing the Dawn of Plasmodium falciparum with Mitochondrial Genome Sequences," Trends in Genetics 19, no. 12 (2003): 671-674. See also David J. Conway, Caterina Fanello, Jennifer M. Lloyd, Ban M. A.-S. Al-Joubori, Aftab H. Baloch, Sushela D. Somanath, Cally Roper, Ayoade M. J. Oduola, Bert Mulder, Marinete M. Povoa, Balbir Singh, and Alan W. Thomas, "Origin of Plasmodium falciparum Malaria Is Traced by Mitochondrial DNA," Molecular and Biochemical Parasitology 111 (2000): 163-171.

27. David J. Conway and Jake Baum issue a cautionary note on the problems of dating the recent emergence of modern falciparum. See their essay "In the Blood—The Remarkable Ancestry of Plasmodium falciparum," Trends in Parasitology 18, no. 8 (2002): 351-355.

28. Hamblin and Di Rienzo, "Detection of the Signature of Natural Selection in Humans," p. 1677.

29. Carter and Mendis, "Evolutionary and Historical Aspects," p. 571.

30. Frank B. Livingstone, "Malaria and Human Polymorphisms," Annual Review of Genetics 5 (1971): 33-64.

31. For an overview of the ecological history of West Africa, see James L. A. Webb Jr., "Ecology and Culture in West Africa," in Themes in West Africa's History, ed. Emmanuel Akyeampong (Athens: Ohio University Press, 2005).

32. J. G. Vaughan and C. A. Geissler, The New Oxford Book of Food Plants (Oxford: Oxford University Press, 1999), pp. 24-25.

33. M. Adebisi Sowunmi, "The Significance of the Oil Palm (Elaeis guineensis Jacq.) in the Late Holocene Environments of West and West Central Africa: A Further Consideration," Vegetation History and Archaeobotany 8 (1999): 199-210. These conclusions are tentative, and Sowunmi calls for palynological studies of terrestrial cores and more extensive archaeological study to confirm or refute them. For an interpretation that the expansion of the oil palm was due to climate change, see Jean Maley, with the collaboration of Alex Chepstow-Lusty, "Elaeis guineensis Jacq. (Oil Palm) Fluctuations in Central Africa during the Late Holocene: Climate or Human Driving Forces for This Pioneering Species," Vegetation History and Archaeobotany 10 (2001): 117-120.

34. D. G. Coursey, Yams: An Account of the Nature, Origins, Cultivation, and Utilisation of the Useful Members of the Dioscoreaceae (London: Longmans, 1967); "The Origins and Domestication of Yams in Africa," in Origins of African Plant Domestication, ed. J. R. Harlan, J. M. J. de Wet, and A. B. L. Stemler (The Hague: Mouton, 1975), pp. 383-408.

35. Edmond Dounias, "The Management of Wild Yam Tubers by the Baka Pygmies in Southern Cameroon," African Study Monographs, suppl. 26 (March 2001): 135-156. Dounias defines the term "paracultivation" as "a combination of technical patterns and social rules which structure the exploitation of wild plants. This term characterizes a particular process of wild plant harvesting which aims at encouraging plant reproduction, so that the plant can be repeatedly exploited. Furthermore, the plant is voluntarily kept within its original environment, in order to better respond to the seasonal mobility of forest dwellers. The maintenance of plants in the forest is the key difference between paracultivation and protocultivation" (p. 137).

36. Christopher Ehret, "Historical/Linguistic Evidence for Early African Food Production," in From Hunters to Farmers, ed. J. D. Clark and S. A. Brandt (Berkeley: University of California Press, 1984), pp. 26-39; Civilizations of Africa, pp. 82-83. The Middle Niger Valley is an important agricultural region in which some dryland crops (notably Pennisetum) and the wet crop "red rice" (Oryza glaberrima) were domesticated. For an excellent overview, see Roderick McIntosh, The Peoples of the Middle Niger (Malden, Mass.: Blackwell Publishing, 1998).

37. Although it is not possible to specify exactly when the sickle cell mutation first developed or became common, specialists are generally agreed that the falciparum malarial parasites developed from within the tropical African woodland and rainforest environments and later spread into other biomes. The earliest human physical evidence of sickle cell anemia comes from the mummified remains of Egyptians from the fourth millennium B.C.E. (A. Marin, N. Cerutti, and E. Rabino Massa, "Use of the Amplification Refractory Mutation System [ARMS] in the Study of HBS in Predynastic Egyptian Remains," Bollettino della Società Italiana di Biologia Sperimentale 75, nos. 5-6 [1999]: 27-30.) This evidence suggests that falciparum may have spread from the upper Nile region to the lower Nile region. The issue of whether or not the sickle cell mutation arose separately in the lower Nile valley remains an open question. Today, the incidence of sickle cell along the entire length of the Nile is very low (0.0-2.5 percent). Stuart Edelstein, The Sickled Cell: From Myths to Molecules (Cambridge, Mass.: Harvard University Press, 1986), fig. 7.3, p. 148.

38. See, for example, M. E. Kropp Dakubu, "The Peopling of Southern Ghana: A Linguistic Viewpoint," in The Archaeological and Linguistic Reconstruction of African History, ed. Christopher Ehret and Merrick Posnansky (Berkeley: University of California Press, 1982), pp. 245-255; and M. E. Kropp Dakubu, "Linguistics and History in West Africa," in Akyeampong, Themes in West Africa's History.

39. On the expansion of yam farming in early West Africa, see Bassey W. Andah, "Identifying Early Farming Traditions of West Africa," in The Archaeology of Africa: Food, Metals and Towns, ed. Thurston Shaw, Paul Sinclair, Bassey Andah, and Alex Okpoko (New York: Routledge, 1995), pp. 240-254. The expansion of rice farming into the western regions of the West African rainforests is thought to have taken place in the first millennium C.E.

40. Kairn Klieman, "The Pygmies Were Our Compass": Bantu and Batwa in the History of West Central Africa, Early Times to c. 1900 C.E. (Portsmouth, N.H.: Heinemann, 2003).

41. E. De Langhe, R. Swennen, and D. Vuylsteke, "Plantain in the Early Bantu World," Azania 29-30 (1996): 318-323; Christophe Mindzie Mbida, "Evidence for Banana Cultivation and Animal Husbandry during the First Millennium BC in the Forest of Southern Cameroon," Journal of Archaeological Science 27 (2000): 151-162; Christophe Mindzie Mbida, "First Archaeological Evidence of Banana Cultivation in Central Africa During the Third Millennium Before Present," Vegetation History and Archaeobotany 10 (2001): 1-6.

42. De Langhe, Swennen, and Vuylsteke, "Plantain," p. 158, citing Jan Vansina, Paths in the Rainforests (Madison: University of Wisconsin Press, 1990).

43. As Klieman notes, "Bantu populations grew in number, settled into larger more sedentary villages, and began to produce larger quantities and more diverse styles of ceramics. Iron tools and banana cultivation also allowed Bantu villagers to move into forested regions away from the original riverine routes of settlement. This phenomenon resulted in the formation of numerous new speech communities, especially during the Late Stone to Metal Age (1500-500 BCE). As was the case in other parts of Africa, the introduction of iron engendered a greater centralization of local economies and an increase in economic specialization" ("Pygmies Were Our Compass," pp. 123-124).

44. Jan Vansina, "New Linguistic Evidence and the 'Bantu Expansion,'" Journal of African History 36, no. 2 (1995): 173-195; Christopher Ehret, "Bantu Expansions: Re-Envisioning a Central Problem of Early African History," International Journal of African Historical Studies 34, no. 1 (2001): 5-41; Roland Oliver, Thomas Spear, Kairn Klieman, Jan Vansina, Scott MacEachern, David Schoenbrun, James Denbow, Yvonne Bastin, H. M. Batibo, and Bernd Heine in "Comments on Christopher Ehret, 'Bantu-History: Re-Envisioning the Evidence of Language,'" International Journal of African Historical Studies 34, no. 1 (2001): 43-87.

45. Jared Diamond has recently repopularized this notion of a Bantu military-industrial complex (Guns, Germs, and Steel, pp. 394-396).

46. McNeill, Plagues and Peoples, pp. 69-131.

47. This third mutation was first identified in Senegal. Individuals with the Senegalese pattern of sickle cell anemia produce higher levels of hemoglobin F, and this anemia may therefore be less severe. See Edelstein, Sickled Cell, pp. 148-149. Experts are not agreed on whether or not sickle cell hemoglobin mutations have a common ancestor. For the argument for a Middle Eastern origin of hemoglobin S and the diffusion of a single mutant, see F. B. Livingstone, "Who Gave Whom Hemoglobin S: The Use of Restriction Site Haplotype Variation for the Interpretation of the Evolution of the βs-Globin Gene," American Journal of Human Biology 1 (1989): 289-302.

48. Edelstein, Sickled Cell, pp. 55-56, 147-148. In the 1980s, it was thought that the two Bantu expansions had taken place in the second and first millennia B.C.E., respectively.

49. For the argument that population pressure played a major role in prehistory, see Mark Nathan Cohen, The Food Crisis in Prehistory: Overpopulation and the Origins of Agriculture (New Haven, Conn.: Yale University Press, 1977).
