CrossFit South Rockland

Monday, May 16, 2011

Day 8... Refuting a Myth About Human Origins

Homo sapiens emerged once, not as modern-looking people first and as modern-behaving people later
For decades, archaeologists have believed that modern behaviors emerged among Homo sapiens tens of thousands of years after our species first evolved. Archaeologists disagreed over whether this process was gradual or swift, but they assumed that Homo sapiens once lived who were very different from us. These people were not “behaviorally modern,” meaning they did not routinely use art, symbols and rituals; they did not systematically collect small animals, fish, shellfish and other difficult-to-procure foods; and they did not use complex technologies: traps, nets, projectile weapons and watercraft were unknown to them.
Premodern humans—often described as “archaic Homo sapiens”—were thought to have lived in small, vulnerable groups of closely related individuals. They were believed to have been equipped only with simple tools and were likely heavily dependent on hunting large game. Individuals in such groups would have been much less insulated from environmental stresses than are modern humans. In Thomas Hobbes’s words, their lives were “solitary, poor, nasty, brutish and short.” If you need a mental image here, close your eyes and conjure a picture of a stereotypical caveman. But archaeological evidence now shows that some of the behaviors associated with modern humans, most importantly our capacity for wide behavioral variability, actually did occur among people who lived very long ago, particularly in Africa. And a conviction is growing among some archaeologists that there was no sweeping transformation to “behavioral modernity” in our species’ recent past.
As Misia Landau argued nearly a quarter of a century ago in the essay “Human Evolution as Narrative” (American Scientist, May–June 1984), prescientific traditions of narrative explanation long encouraged scientists to envision key changes in human evolution as holistic transformations. The idea of an archaic-to-modern human transition in Homo sapiens arises, in part, from this narrative tradition. All this makes for a satisfying story, but it is not a realistic framework for understanding the actual, complex and contingent course of human evolution. Most evolutionary changes are relatively minor things whose consequences play out incrementally over thousands of generations.
In order to better understand human prehistory, I recommend another approach, one that focuses on behavioral variability. This trait, easily observed among recent humans, is becoming more apparent in the archaeological record for early Homo sapiens. Prehistoric people lived in different ways in different places at different times. We must seek out and explain those differences, for, in evolution, only differences matter. Thinking about prehistoric human behavioral variability in terms of various adaptive strategies offers an attractive way to explain these differences. But first, we need to discard an incorrect and outdated idea about human evolution, the belief that prehistoric Homo sapiens can be divided into “archaic” and “modern” humans.

An Idea Is Born

Archaeology’s concept of archaic versus modern humans developed as prehistoric archaeological research spread from Europe to other regions. The study of prehistoric people began in Europe during the 19th century in scientific societies, museums and universities. By the 1920s, discoveries made at a number of European archaeological sites had prompted a consensus about the Paleolithic Period, which extends from nearly 2.6 million to about 12,000 years ago. Archaeologists divided this period into Lower (oldest), Middle and Upper (youngest) Paleolithic phases. Distinctive stone-tool assemblages—or “industries”—characterized each phase. Archaeologists identified these industries by the presence of diagnostic artifact types, such as Acheulian hand axes (Lower Paleolithic), Mousterian scrapers made on Levallois flakes (Middle Paleolithic), and Aurignacian prismatic blades and carved antler points (Upper Paleolithic). The fact that tools from more recent industries were lighter, smaller and more heavily modified suggested a trend toward greater technological and cultural sophistication across the Paleolithic sequence. European Upper Paleolithic industries were associated exclusively with Homo sapiens fossils, whereas Lower and Middle Paleolithic industries were associated with earlier hominins (Homo heidelbergensis and Homo neanderthalensis). This supported the idea that there were important evolutionary differences between modern Homo sapiens and earlier archaic hominins.
Early Upper Paleolithic contexts in Europe preserve evidence for prismatic blade production, carved bone tools, projectile weaponry, complex hearths, personal adornments, art, long-distance trade, mortuary rituals, architecture, food storage and specialized big-game hunting, as well as systematic exploitation of smaller prey and aquatic resources. Furthermore, the variability of these behaviors within the Upper Paleolithic is much greater than that seen in earlier periods. In much the same way that anthropologists have documented cultural variability among recent humans, archaeologists can easily tell whether a particular carved bone point or bone bead is from a site in Spain, France or Germany. Not surprisingly, most prehistorians accept that the archaeology of the Upper Paleolithic is, in effect, “the archaeology of us.”
Lower and Middle Paleolithic stone tools and other artifacts found in Europe and elsewhere vary within a narrow range of simple forms. Properly equipped and motivated modern-day flintknappers (people who make stone tools) can turn out replicas of any of these tools in minutes, if not seconds. Many of the differences among Lower and Middle Paleolithic artifacts simply reflect variation in rock types and the extent to which tools were resharpened. Geographic and chronological differences among Middle and Lower Paleolithic tools mostly involve differences in the relative frequencies of these simple tool types. Nearly the same range of Lower and Middle Paleolithic stone-tool types is found throughout much of Europe, Africa and Asia.
The differences between the Lower/Middle and Upper Paleolithic records in Europe are so pronounced that from the 1970s onward prehistorians have described the transition between them as “The Upper Paleolithic Revolution.” This regional phenomenon went global in the late 1980s after a conference at Cambridge University entitled “The Human Revolution.” This revolution was portrayed as a watershed event that set recent modern humans apart from their archaic predecessors and from other hominins, such as Homo neanderthalensis. The causes of this assumed transformation were hotly debated. Scientists such as Richard Klein attributed the changes to the FOXP2 polymorphism, the so-called language gene. But the polymorphism was eventually discovered in Neanderthal DNA too. Many researchers—such as Christopher Henshilwood of the University of the Witwatersrand, Curtis Marean of Arizona State University, Paul Mellars of the University of Cambridge, April Nowell of the University of Victoria and Phil Chase of the University of Pennsylvania—continue to see symbolic behavior as a crucial component of behavioral modernity. Yet as João Zilhão of the University of Bristol and Francesco d’Errico of the University of Bordeaux have argued, finds of mineral pigments, perforated beads, burials and artifact-style variation associated with Neanderthals challenge the hypothesis that symbol use, or anything else for that matter, was responsible for a quality of behavioral modernity unique to Homo sapiens.

The Missing Revolution

In fact, fossil evidence threatening the Upper Paleolithic revolution hypothesis emerged many decades ago. At about the same time the Paleolithic framework was developed during the 1920s and 1930s, European-trained archaeologists began searching for human fossils and artifacts in the Near East, Africa and Asia. Expatriate and colonial archaeologists such as Dorothy Garrod and Louis Leakey expected the European archaeological record to work as a global model for human evolution and used the European Paleolithic framework to organize their observations abroad. Very quickly, however, they discovered a mismatch between their expectations and reality when Homo sapiens remains outside Europe were found with Lower or Middle Paleolithic artifacts. At first, archaeologists assumed that these remains dated to periods just before the Upper Paleolithic revolution. But those discoveries, as well as more recent finds, challenge the notion that the revolution ever occurred.
In Europe, the oldest Homo sapiens fossils date to only 35,000 years ago. But studies of genetic variation among living humans suggest that our species emerged in Africa as long as 200,000 years ago. Scientists have recovered Homo sapiens fossils in contexts dating to 165,000 to 195,000 years ago in Ethiopia’s Lower Omo Valley and Middle Awash Valley. Similar modern-looking human fossils found in the Skhul and Qafzeh caves in Israel date to 80,000 to 120,000 years ago, and Homo sapiens fossils dating to 100,000 years ago have been recovered from Zhiren Cave in China. Evidence is clear that early humans dispersed out of Africa to southern Asia before 40,000 years ago, and a human presence in Australia dates to at least 42,000 years ago. Nothing like a human revolution precedes Homo sapiens’ first appearances in any of these regions. And all these Homo sapiens fossils were found with either Lower or Middle Paleolithic stone-tool industries.
There are differences between the skeletons of these early Homo sapiens and Upper Paleolithic Europeans. The best-documented differences involve variation in skull shape. Yet, as Daniel Lieberman of Harvard University writes in the recently published The Evolution of the Human Head, we are just beginning to understand the genetic and behavioral basis for variation in human skulls. It makes no sense whatsoever to draw major evolutionary distinctions among humans based on skull shape unless we understand the underlying sources of cranial variation. There is no simple morphological dividing line among these fossil skulls. Most fossils combine “primitive” (ancestral) characteristics with “derived” (recently evolved) ones. Even if physical anthropologists divided prehistoric humans into archaic and modern groups, it would be foolish for archaeologists to invoke this difference as an explanation for anything unless we knew how specific skeletal differences related to specific aspects of behavior preserved in the archaeological record.
Early Homo sapiens fossils in Africa and Asia are associated with “precocious,” or unexpectedly early, evidence for modern behaviors such as those seen in the European Upper Paleolithic. They include intensive fish and shellfish exploitation, the production of complex projectile weapons, the use of symbols in the form of mineral pigments and perforated shells, and even rare burials with grave goods. But as Erella Hovers and Anna Belfer-Cohen of The Hebrew University of Jerusalem argued in a chapter of Transitions Before the Transition, “Now You See It, Now You Don’t—Modern Human Behavior in the Middle Paleolithic,” much of this evidence is recursive: it is not a consistent feature of the archaeological record. Evidence for one or more of these modern behaviors appears at a few sites or for a few thousand years in one region or another, and then it vanishes. If behavioral modernity were both a derived condition and a landmark development in the course of human history, one would hardly expect it to disappear from our species’ record for prolonged periods.
For me, the most surprising aspect of the debate regarding when Homo sapiens became human is that archaeologists have not tested the core hypothesis that there were significant behavioral differences between the earliest and more recent members of our species. Because modernity is a typological category, it is not easy to test this hypothesis. One is either behaviorally modern or not. And not all groups classified as behaviorally modern have left clear and unambiguous evidence for that modernity at all times and in all contexts. For example, expedient and opportunistic flintknapping of river pebbles and cobbles by living humans often creates stone tools indistinguishable from the pebble tools knapped by Homo habilis or Homo erectus. This similarity reflects the nature of the tool-making strategies, techniques and raw materials, not the evolutionary equivalence of the toolmakers. Thus, the archaeological record abounds with opportunities for false-negative findings about prehistoric human behavioral modernity.
This issue caught my interest in 2002 while I was excavating 195,000-year-old archaeological sites associated with early Homo sapiens fossils in the Lower Omo River Valley Kibish Formation in Ethiopia. I am an archaeologist, but I am also a flintknapper. Nothing about the stone tools from Omo Kibish struck me as archaic or primitive. (When I teach flintknapping at my university, I have ample opportunity to see what happens when people with rudimentary skills try to knap stone and how those skills vary with experience and motivation.) The Omo Kibish tools showed that their makers had great versatility in effectively knapping a wide range of rock types. This set me to thinking: Have we been asking the wrong questions about early humans’ behavior?

A Better Way

Documenting and analyzing behavioral variability is a more theoretically sound approach to studying differences among prehistoric people than searching for the transition to behavioral modernity. Nearly everything humans do, we do in more than one identifiably different way. As Richard Potts of the Smithsonian Institution argued in Humanity’s Descent in 1996, our species’ capacity for wide behavioral variability appears to be uniquely derived. No other animal has as wide a behavioral repertoire as Homo sapiens does. And variability can be investigated empirically, quantitatively, and with fewer problems than occur in ranking prehistoric people in terms of their modernity.
One way to gauge early Homo sapiens’ behavioral variability is to compare their lithic technologies. Lithics, or stone tools, are nearly indestructible and are found everywhere hominins lived in Pleistocene times. Stone tools do not tell us everything we might wish to know about prehistoric human behavior, but they are no less subject to the selective pressures that create variation in other types of archaeological evidence. Lithic artifacts made by recent humans are more complex and variable than those associated with early hominins. Early Paleolithic stone tools are more complex and variable than those made by nonhuman primates. Thus, there is reason to expect that analysis of these tools will produce a valid signal about early Homo sapiens’ capacity for behavioral variability. Eastern Africa is an especially good place in which to compare early and later Homo sapiens’ stone technology because that region preserves our species’ longest continuous archaeological record. Restricting this comparison to eastern Africa minimizes the complicating effects of geographic constraints on stone-tool technology.
One of the most popular ways of describing variation among stone-tool industries is a framework that the British archaeologist Grahame Clark proposed in World Prehistory: A New Synthesis (1969). This framework describes lithic technological variability in terms of five modes of core technology. (In flintknapping, “cores” are the rocks from which flakes are struck, with the flakes later developed into various kinds of tools.) Variation in core technology is thought to reflect differences in ecological adaptations. Clark’s framework is a crude instrument, but it can be made into a reasonably sensitive register of technological variability if we simply note which of these modes are represented in each of a series of lithic assemblages. When it is applied to sites in eastern Africa dating to between 284,000 and 6,000 years ago, a more complex view of prehistoric life there emerges. One does not see a steady accumulation of novel core technologies since our species first appeared, or anything like a “revolution.” Instead one sees a persistent pattern of wide technological variability.
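To make this presence/absence bookkeeping concrete, the sketch below shows one way to tabulate Clark’s modes across a series of assemblages and compare their richness. It is a minimal illustration in Python; the site names, ages and mode assignments are invented placeholders, not data from the eastern African sample.

```python
# Record which of Clark's five core-technology modes (1 = pebble cores,
# 2 = bifacial core tools, 3 = Levallois prepared cores, 4 = prismatic
# blades, 5 = geometric microliths) appear in each lithic assemblage.
# All site names, ages and mode assignments are invented placeholders.

assemblages = {
    "Site A (~280 ka)": {1, 2, 3},
    "Site B (~100 ka)": {1, 3, 5},
    "Site C (~40 ka)": {1, 3, 4},
    "Site D (~8 ka)": {1, 4, 5},
}

MODES = range(1, 6)  # Clark's modes 1 through 5

for name, modes in assemblages.items():
    row = " ".join("X" if m in modes else "." for m in MODES)
    print(f"{name:<18} modes 1-5: {row}   richness = {len(modes)}")
```

Read this way, the question is whether mode richness climbs steadily through time, as a “revolution” narrative predicts, or simply fluctuates, which is the pattern the eastern African record shows.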
What does this variability mean? Archaeologists’ understanding of lithic technology continues to grow, from experiments, from studies of recent stone-tool-using groups and from contextual clues in the archaeological record. Our understanding is far from perfect, but we do know enough to make some plausible interpretations. Pebble-core reduction (mode 1 in Clark’s framework), in which toolmakers strike flakes opportunistically from rounded pebbles or cobbles, is the simplest way to obtain a cutting edge from stone. Stone tools are still made this way in rural parts of eastern Africa. Its ubiquity in the archaeological assemblages probably reflects a stable strategy of coping expediently with unpredictable needs for cutting edges.
Large bifacial core tools (mode 2) are thought to have been dual-purpose tools. Their heft and long cutting edges make them effective in heavy-duty tasks, such as woodworking or the butchering of large animal carcasses. Thinner flakes knapped from bifacial core tools can be used for lighter-duty cutting or retouched into more functionally specialized forms. In recent archaeological contexts, large bifacial cutting tools are often correlated with people who moved their residences frequently, whereas expedient pebble cores are correlated with lengthier occupations. High topographic relief and wide seasonal variation in rainfall make residential stability a difficult thing for even recent eastern African pastoralist groups to achieve. The persistence of this technology may reflect relatively high residential mobility among prehistoric eastern Africans.
The behavioral correlates of Levallois prepared-core technology (mode 3) are less clear, if only because the term encompasses so many different core-knapping strategies. Some archaeologists see Levallois prepared cores as reflecting knappers’ efforts to obtain desired tool shapes, or to produce relatively broad and thin flakes that efficiently recover cutting edge. These hypotheses are not mutually exclusive, and in the long run, each of them probably explains some part of why people made such cores in eastern Africa for a very long time.
Prismatic-blade core technology (mode 4) involves detaching long rectangular flakes one after another from a cone-shaped core. The most widely repeated hypothesis about the appeal of prismatic-blade production is that it produces greater amounts of cutting edge per unit mass of stone than other strategies. However, recent experiments by Metin Eren at Southern Methodist University and his colleagues have shown that this hypothesis is wrong. A far more likely appeal of this strategy is that the blades’ morphological consistency makes them easier to attach to a handle. Attaching a stone tool to a handle vastly increases leverage and mechanical efficiency, but it also restricts the range of tool movement and limits the portion of the tool that can be resharpened. The comings and goings of blade core technology in eastern Africa probably reflect a complex interplay of these strategic considerations.
Differing amounts of geometric microlithic technology (mode 5) are preserved in the most ancient and most recent assemblages in the eastern African sample. Geometric microliths are small tools made by segmenting blades or flakes and shaping them into triangles, rectangles, crescents and other geometric forms by blunting one or more of their edges. Too small to have been useful while hand-held, geometric microliths were almost certainly used as hafted tools. They are easy to attach to handles, making them suitable for use as projectile armatures, woodworking tools and aids to preparing plant foods. Archaeologists view microlithic stone-tool technology as a strategy for optimizing versatility and minimizing risk. Microlithic technologies first appear and proliferate among African and Eurasian human populations from about 50,000 years ago to around 10,000 years ago. This was a period of hypervariable climate, and it makes a certain amount of sense that humans at that time devised versatile and efficiently transportable stone tools. If, for example, climate change required people to frequently shift from hunting game to reaping grasses and back again, using microlith-barbed arrows and microlith-edged sickles would allow them to do this efficiently, without any major change to their technological strategies. Because microlithic tools are small, they preserve high ratios of cutting edge to mass. This means that if climate shifts required more seasonal migrations, individuals transporting microliths would be carrying the most cutting edge per unit mass of stone. Variability in the use of microlithic technology in eastern Africa probably reflects strategic responses to environmental unpredictability along with efforts to cope with increased subsistence risk by optimizing versatility in stone-tool technology.
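The edge-to-mass advantage of small tools follows from simple geometric scaling: usable cutting edge grows roughly linearly with a tool’s size, while mass grows with its volume. The back-of-the-envelope sketch below uses invented dimensions, treating each tool as a solid block of flint, purely to show that scaling.

```python
# Crude scaling model with invented numbers: edge length scales with L,
# mass scales with L**3, so edge per gram scales with 1/L**2. Halving a
# tool's linear size therefore quadruples its cutting edge per unit mass.

FLINT_DENSITY = 2.6  # g/cm^3, approximate

def edge_per_gram(length_cm):
    edge_cm = length_cm                      # usable edge ~ linear size
    mass_g = FLINT_DENSITY * length_cm ** 3  # mass ~ volume
    return edge_cm / mass_g

for L in (8.0, 4.0, 2.0):  # hand-axe, flake and microlith scales
    print(f"L = {L:3.1f} cm: {edge_per_gram(L):.3f} cm of edge per gram")
```

However the real geometry varies, this inverse-square relationship is why a pouch of microliths delivers far more cutting edge per unit of carried weight than a few large tools would.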
How do the differences between earlier and later eastern African core technologies compare to variation among recent stone-tool-using humans? The range of variability in recent human stone-tool technology is greater than that in the eastern African sample. All five of Clark’s modes are to be found among the lithic technology of recent humans. Yet some technologies are not represented in the African sample. For example, more than 30,000 years ago in Australia, and later elsewhere, people began grinding and polishing the edges of stone tools. Such grinding and polishing reduces friction during work, making cutting tools more efficient to use and resharpen. In the New World, ancestral Native American flintknappers deployed a wide range of bifacial-core technologies fundamentally different from those seen in eastern Africa. They used these tools in contexts ranging from hunter-gatherer campsites on the Great Plains to Mesoamerican city-states like Teotihuacan. Differences in recent stone-tool technology reflect variability in adaptive strategies. No anthropologists in their right minds would attribute this variability to evolutionary differences among recent humans. If this kind of explanation makes so little sense in the present, what possible value can it have for explaining past behavioral variability among Homo sapiens?
The lithic evidence reviewed here challenges the hypothesis that there were significant behavioral differences between the earliest and more recent members of our species in eastern Africa. Obviously, there is more to human behavioral variability than what is reflected in stone tools. Using Clark’s technological modes to capture that variability, as described here, is just a first step. But it is a step forward. This emphasis on variability will gain strength if and when it is supported by more detailed analyses of the stone tools and by other archaeological evidence.

Abandoning a Myth

One could view these findings as just another case of precocious modern behavior by early Homo sapiens in Africa, but I think they have a larger lesson to teach us. After all, something is only precocious if it is unexpected. The hypothesis that there were skeletally modern-looking humans whose behavioral capacities differed significantly from our own is not supported by uniformitarian principles (explanations of the past based on studies of the present), by evolutionary theory or by archaeological evidence. There are no known populations of Homo sapiens with biologically constrained capacities for behavioral variability. Generations of anthropologists have sought such primitive people in every corner of the world and have consistently failed to find them. The parsimonious interpretation of this failure is that such humans do not exist.
Nor is there any reason to believe that behaviorally archaic Homo sapiens ever did exist. If there ever were significant numbers of Homo sapiens individuals with cognitive limitations on their capacity for behavioral variability, natural selection by intraspecific competition and predation would have quickly and ruthlessly winnowed them out. In the unforgiving Pleistocene environments in which our species evolved, reproductive isolation was the penalty for stupidity, and lions and wolves were its cure. In other words: No villages, no village idiots. If any such cognitive “winner take all” wipeout event ever happened, it was probably among earlier hominins (Homo ergaster/erectus or Homo heidelbergensis) or during the evolutionary differentiation of our species from these hominin ancestors.
Dividing Homo sapiens into modern and archaic or premodern categories and invoking the evolution of behavioral modernity to explain the difference has never been a good idea. Like the now-discredited scientific concept of race, it reflects hierarchical and typological thinking about human variability that has no place in a truly scientific anthropology. Indeed, the concept of behavioral modernity can be said to be worse than wrong, because it is an obstacle to understanding. Time, energy and research funds that could have been spent investigating the sources of variability in particular behavioral strategies and testing hypotheses about them have been wasted arguing about behavioral modernity.
Anthropology has already faced this error. Writing in the early 20th century, the American ethnologist Franz Boas railed against evolutionary anthropologists who ranked living human societies along an evolutionary scale from primitive to advanced. His arguments found an enthusiastic reception among his colleagues, and they remain basic principles of anthropology to this day. A similar change is needed in the archaeology of human origins. We need to stop looking at artifacts as expressions of evolutionary states and start looking at them as byproducts of behavioral strategies.
The differences we discover among those strategies will lead us to new and very different kinds of questions than those we have asked thus far. For instance, do similar environmental circumstances elicit different ranges of behavioral variability? Are there differences in the stability of particular behavioral strategies? Are certain strategies uniquely associated with particular hominin species, and if so, why? By focusing on behavioral variability, archaeologists will move toward a more scientific approach to human-origins research. The concept of behavioral modernity, in contrast, gets us nowhere.
Even today, a caveman remains the popular image of what a prehistoric person looked like. This individual usually is shown with enlarged eyebrows, a projecting face, long hair and a beard. The stereotypical caveman is inarticulate and dim-witted, and possesses a limited capacity for innovation. In 2006, GEICO commercials put an ironic twist on this image. Their cavemen were more intelligent, articulate, creative and culturally sophisticated than many “modern” people. In a striking case of life imitating art, recent archaeological discoveries are overturning long-standing misconceptions about early human behavior.

Bibliography

  • Bar-Yosef, O. 2002. The Upper Paleolithic revolution. Annual Review of Anthropology 31:363–393.
  • Clark, G. 1969. World Prehistory: A New Synthesis. Cambridge: Cambridge University Press.
  • Klein, R. G. 2009. The Human Career, third edition. Chicago: University of Chicago Press.
  • Landau, M. L. 1984. Human evolution as narrative. American Scientist 72:262–268.
  • McBrearty, S., and A. S. Brooks. 2000. The revolution that wasn’t: A new interpretation of the origin of modern human behavior. Journal of Human Evolution 39:453–563.
  • Mellars, P., and C. Stringer (eds.). 1989. The Human Revolution: Behavioural and Biological Perspectives on the Origins of Modern Humans. Edinburgh: Edinburgh University Press.
  • Nowell, A. 2010. Defining behavioral modernity in the context of Neandertal and anatomically modern human populations. Annual Review of Anthropology 39:437–452.
  • Potts, R. 1998. Variability selection and hominid evolution. Evolutionary Anthropology 7(3):81–96.
  • Shea, J. J. 2008. The Middle Stone Age archaeology of the Lower Omo Valley Kibish Formation: Excavations, lithic assemblages, and inferred patterns of early Homo sapiens behavior. Journal of Human Evolution 55(3):448–485.
  • Shea, J. J. 2011. Homo sapiens is as Homo sapiens was: Behavioral variability versus “behavioral modernity” in Paleolithic archaeology. Current Anthropology 52(1):1–35.
