
Behavioral modernity

From Wikipedia, the free encyclopedia

Upper Paleolithic (16,000-year-old) cave painting from Lascaux cave in France

Behavioral modernity[1] is a suite of behavioral and cognitive traits that distinguishes current Homo sapiens from other anatomically modern humans, hominins, and primates. Although often debated, most scholars agree that modern human behavior can be characterized by abstract thinking, planning depth, symbolic behavior (e.g., art, ornamentation), music and dance, exploitation of large game, and blade technology, among others.[2][3] Underlying these behaviors and technological innovations are cognitive and cultural foundations that have been documented experimentally and ethnographically. Some of these human universal patterns are cumulative cultural adaptation, social norms, language, and extensive help and cooperation beyond close kin.[4][5] It has been argued that the development of these modern behavioral traits, in combination with the climatic conditions of the Last Glacial Maximum, was largely responsible for the human replacement of Neanderthals and the other species of humans of the rest of the world.[3][6]

Arising from differences in the archaeological record, a debate continues as to whether anatomically modern humans were behaviorally modern as well. There are many theories on the evolution of behavioral modernity. These generally fall into two camps: gradualist and cognitive approaches. The Later Upper Paleolithic Model refers to the theory that modern human behavior arose abruptly through cognitive and genetic changes around 40,000–50,000 years ago.[7] Other models focus on how modern human behavior may have arisen through gradual steps, with the archaeological signatures of such behavior appearing only through demographic or subsistence-based changes.[2][3][8][9][10]

YouTube Encyclopedic

  • ✪ Impact of Tool Use and Technology on Evolution of the Mind - Leah Krubitzer, John Shea, Paula Tallal
  • ✪ One Of The Earliest Representations Of A Human Face And Hairstyle (Aurignacian, 30,000 Years Ago)
  • ✪ Rationality, Irrationality and Human Nature
  • ✪ The Elephant in the Room: The Psychology of Innuendo and Euphemism
  • ✪ Mysteries of Unseen Physics


(classical music) - [Male] We are the paradoxical ape. Bipedal, naked, large brain, long the master of tools, fire and language but still trying to understand ourselves. Aware that death is inevitable yet filled with optimism. We grow up slowly. We hand down knowledge. We empathize and deceive. We shape the future from our shared understanding of the past. Carta brings together experts from diverse disciplines to exchange insights on who we are and how we got here. An exploration made possible by the generosity of humans like you. (upbeat music)

- Thank you for the invitation. It's been really exciting for me. This talk is gonna be about brains and I'm not gonna actually talk about primates or humans but I'm gonna come to some conclusions about humans and culture. So this is the brain of a bottlenose dolphin, the back of the brain of a bottlenose dolphin and this is the brain of a small marsupial and you can see the neocortex is really large and I'm gonna talk about the neocortex today, very specifically for a couple of reasons. One is that it's the part of the brain that's changed most dramatically in mammals over time and in humans in particular, they have an extremely large neocortex. It's the part of the brain that's involved in cognition, language, and things like tool use. The question that we're interested in addressing in my laboratory is how the brain gets more complex or how the neocortex gets more complex. So we know that early mammals had a neocortex that was very, very tiny and had a few cortical fields. And cortical fields are the functional units of the neocortex. And we know that as brains evolved, and particularly primate brains, the neocortex became a really, really enormous cortical sheet, and humans are now proposed to have hundreds of cortical fields. So the question is how do you get from a very, very simple form to a very, very complicated form.
But this is a serious problem to address because these types of changes, certainly from early mammals, are 200 million years of changes. And we can even see the sorts of changes that have occurred in the human line over a six million year period. So how do you address these changes? And there are two ways that you can get at this question of evolution. You can say what has evolution produced? And the way we can understand what evolution has produced is we look at brains and bodies, we do a comparative approach. So I can look at a lot of different brains using a variety of different techniques, electrophysiological, anatomical, and I can say, what sorts of changes have come about? The problem is evolution is a moving picture and the life of an individual is a moving picture. So any time we study an individual or a mammal, we're taking one snapshot of that moving picture of life and trying to figure it out. So what we do with a comparative analysis is we look at a lot of snapshots and try to put this moving picture back together. However, while these sort of comparative studies can tell us what evolution has produced, they don't tell us how phenotypic transformations occur, how do I get more cortical fields? How do I change connections of a cortex which in turn changes behavior? And this is where studies of development come into play because the evolution of any aspect of the body, the brain, is actually the evolution of developmental mechanisms that give rise to that aspect of the phenotype. So my talk is gonna focus a little bit on this. So comparative analysis, this is many, many years of work. This is a cladogram and these are neocortices. And you're gonna see these sort of cartoons throughout the talk. This is the neocortex. These red and blue and yellow areas indicate cortical fields. And the most important thing to take away from this is that you can have common ancestors, which we don't know about, humans, macaque monkeys, cats, squirrels. 
There is a constellation of cortical fields that all species possess, even in the absence of use, and this is due to inheritance from a common ancestry and it's also due to the way genes are deployed in development. But that's gonna be a different story. So there are similarities in brains. My brain is very similar to a mouse brain in some ways but it's also very different. And here are some of the sorts of differences you're gonna see. This is a flattened view of the neocortex. This is the front of the brain, this is the top of the brain. This is a macaque monkey and this is a mouse. And they're not drawn to scale. This would actually, this mouse neocortex would be a little tiny portion of the macaque neocortex. And what we can see is that there are changes in the size of the cortical sheet, there are changes in cortical field number, there are changes in the relative size of cortical fields. There are changes in the connection patterns of homologous cortical fields. And of course the question is to what extent are these differences due to genes that are intrinsic to the developing neocortex? Have they changed in species over time? And they probably have. Genes associated with the development of the body. Because the body changes, we use the body differently and you cannot think about the brain without thinking about the body. There's a middle man there, right? The brain is not embodied itself. It needs the body to provide all information about the world. Epigenetic influences. And by this I mean sensory-driven or environmental or context-driven changes to the brain and the body. Or is it some combination of these factors? Is any creature due to just one sort of thing or is it some sort of combination in different species over the course of evolution? So I'm gonna give you some examples of comparative work and changes in peripheral morphology that give us some clue about what the answer to that question is. 
So this is one of my very favorite examples on the planet. It's a duck-billed platypus. This is a real animal, this is the bill and what's really cool, yeah and the reason I'm also interested in the body is when I started working on different animals in Australia, I had to catch every single animal I worked on and it gives you a really, really great appreciation for the body and how important it is because they're hard to catch. Anyway, I digress. So what's really cool about the platypus is that it has mechanosensory receptors that are exquisitely sensitive to touch, interdigitated with electrosensory receptors on its bill, and when it does anything important, capture prey items, mate, it closes its eyes, its ears and its nose. So all it has is a bill. And if you look at its neocortex using electrophysiological recording techniques, this is the front of the brain, this is the top of the brain, this is the representation of the bill on the cortical sheet. There are a number of cortical fields. So within the somatosensory cortex, it takes up about 90% of the entire representation of the somatosensory cortex. And if you look at the cortex itself, it takes up about 75% of the entire cortical sheet. This animal is one big huge bill. And it's hard to imagine, this is called cortical magnification. And the question is is this due to genes intrinsic to the neocortex that have made this cortex one big huge bill, changes to the body? It has to be to some extent due to changes in the body. The environment and how the animal uses the body in the environment. And I gave you the example of the duck-billed platypus because it's an extraordinary example of cortical magnification. But we must appreciate that we see this across the board in a lot of species, including humans.
If you look at the evolution of the supralaryngeal tract and the specializations of this body morphology, we have an expanded representation, somatosensory cortex, motor cortex, premotor cortex of this specialized structure which we call Broca's Area. So they are following the same rules of evolution as other species. I'm gonna switch really rapidly to experimental manipulations of peripheral morphology. So you say, okay, to what extent is the ratio of incoming sensory inputs to the developing neocortex determining the functional organization of the neocortex? And for this, we used little short-tailed opossums. They are models of early blindness. We bilaterally enucleated, removed the eyes, when these animals are embryos. So it's basically a little layer of skin. We let them grow up and we look at their connections, we look at the functional organization of their brain and what we see is that here's visual cortex in this animal. This is the front of the brain and this is the top of the brain. And all of what would normally be visual cortex, which is in blue here, is now processing inputs from the somatosensory and auditory system. So we've totally changed the functional organization of this neocortex. And if we look at the connections, these are connections, these are visual areas here in blue, primary visual cortex and it's getting input from other visual structures. These are auditory structures in yellow and somatosensory structures in red. And what we see in our bilateral enucleates is that now the connections of the brain have changed. We've done nothing to the neocortex itself, not one thing. We've simply removed all visual input really early in development. And what we see is this functional takeover and connectional takeover of the developing brain. And if we look at what that region is representing, which would normally be visual, it's representing the head vibrissae. We've basically, I want to say we've made a platypus, right?
So something like 80% of the neocortex is now processing inputs from this stuff, from the vibrissae and the snout and the face. And of course, it's great to have a big brain and transform the neocortex but you have to ask yourself what does this mean for behavior? Which is the target of selection. And so we had trained discrimination tasks and natural behavior. And I wanted to show you the natural behavior 'cause it's really cool. These animals, they end up becoming super tactile animals. They do really, really well with tactile discrimination tasks. This is the ladder rung task where the animal starts at one end of this ladder and moves to the other end and you train him to walk on evenly spaced rungs. And then what you do is you then space the rungs unevenly and you have him do this task again. And I'm gonna show you some movies of this. This is a normal animal, this is a bilateral enucleate and this is really pretty beautiful. So here's what a normal animal looks like and you can score them based on when their legs fall through and how well they do. You'll see this and his legs falling through, oops, oops, oops. He was trained but this is now a novel task. Here's the bilateral enucleate, no eyes whatsoever. So it's never had any access to visual input ever. And so this is what the bilateral enucleate looks like on this novel task. Look at this, bam. Okay, this is good. Okay, so we propose that because the whiskers are so important and have this huge representation of the whiskers now, we trim the whiskers. I know, I know, I know. They grew back, they grew back, okay. But check this out. This is so cool. This is the same animal with his whiskers trimmed. And here's what happens. So this is really cool because you are showing this enormous brain change and behavior change and the two coincide. And just to end this little tiny portion of it.
So blindness is not just an absence of light, because this animal has functional respecification, changes in cortical and subcortical connections, a really big magnification of its vibrissae. So its entire nervous system has been kind of rewired and reorganized based on this lack of sensory input. But that's pretty, that's sort of like taking a sledgehammer to the system. Let me show you something a little more subtle. This is natural differences in rearing conditions, cultural transmission of rearing style and these are voles. Voles are biparental, both parents rear their young. And we can measure differences in total tactile contact with the young. We look at high contact parents and low contact parents and important to remember, is that high contact offspring show differences in behavior, high contact offspring become high contact parents and if you cross foster them on the day of birth, you take low contact offspring and put them with high contact parents, they become high contact parents. So this is a social transmission of a rearing style. And if we look at the connections of their somatosensory cortex, particularly in the regions of the body that are being touched, we see the connections are mostly the same. This is an injection site and these are connections to it. But we also see differences in connections. And remember this is pretty subtle. And differences in connections of frontal cortex as well. So what factors contribute to the phenotype? So I'm just showing you a much more subtle example of the role of sensory input in shaping the brain, in shaping connectivity. Well genes definitely contribute to cortical sheet size, cortical field size, cortical connections, peripheral morphology and cellular mechanisms involved in plasticity, which may be genetically specified, which allows the environment to impact some of these same things. I can change cortical field size, cortical connectivity, peripheral morphology, which I showed in these experiments.
And I've given you examples from the bill of the platypus and I suggested the same thing is occurring in humans but what about things like social learning, language and culture? I would suggest that the best way to think about these is as simply complex patterns of physical stimuli that are impinging on the developing nervous system. So a mother's love is temperature, touch, cadence of a voice, nothing more. And this can impact how the brain wires itself. It can impact size of cortical fields and connections of cortical fields. So I'm gonna end a little bit with this. This is human evolution, not quite a cladogram. And on the bottom is environment and social context and on the top, it's sort of truncated here, is meant to be morphology or genes. And the modern hand is proposed to have been around for about 700,000 years but what's really pretty fascinating is that until recently, we were using stone tools. And big brains happened way back here. So we had the big brain, we had the modern hand but we were not doing what we're doing with the modern hand. And if you believe that all behavior is generated by the brain, and I believe that all behavior is generated by the brain, then the brain must've changed. And if the brain didn't change by changes in DNA sequence, it must've changed by activity-dependent mechanisms and/or culture. And so, it's hard to believe that the industrial revolution was less than 300 years ago. And if you look, we've actually changed the landscape of this planet and now we have daily and prolonged interactions with computers and machines and tools, really sophisticated tools. And I'll end by saying we are combinatorial creatures constructed by genes, bodies, behaviors and environmental context and I think, you can look for the genes that are distinguishing humans, I think you're gonna find only a few, maybe some involved in expansion of the neocortex.
But humans have evolved an extraordinary capacity to construct our neocortex over the course of a prolonged infancy and childhood, allowing for rapid phenotypic change even within a single generation. I would say that if Leah Krubitzer were born 30 years ago or 100 years ago or 300,000 years ago, I wouldn't be Leah Krubitzer hitting a rock or using a spear, I would be a different brain. We have a remarkably fluid brain-body interface with the environment such that tools and machines can extend our embodiment and our peripersonal space and expand the loop between our brains, our bodies and the world. And I think this has made us unique biohybrid creatures whose brains adapt and bootstrap themselves with the technologies that we give rise to and for better or worse, with whom our futures are increasingly entwined. Okay, thank you. (audience applause)

- Paleoanthropologists once thought humans evolved in Europe and Asia around 40,000 to 50,000 years ago. We now know Homo sapiens evolved in Africa more than 100,000 to 300,000 years ago. The evidence showing these earlier humans behaved in similar ways to recent humans continues to surprise us. But should it? Here's an example. Several weeks ago, there was an article published in Nature describing an abstract drawing from the 73,000 year old levels at Blombos Cave in South Africa. Now this is an important discovery and it's published in one of the world's foremost and prestigious scientific journals but abstract drawings are human cultural universals. You're looking at some of them right now. Everyone makes them. Shouldn't we expect that humans who lived in South Africa 73,000 years ago made abstract drawings too? Were early humans not behaviorally modern humans? Were they these primitive humans that anthropologists have been looking for and not finding for more than two centuries? I think we have to ask what is behavioral modernity and take a closer look at it.
It's an inferred quality, based on unexpected archeological discoveries, art, symbols, carved bone tools and similar things rather than things predicted from prior evolutionary theory before the excavations started. Modernity claims come first; supporting evidence, arguments and theoretical justifications are developed after the fact. This is not a good way to do science and it is a demonstrably poor way to do justice because long ago, people in Salem, Massachusetts used this same approach to identify witches. Witchcraft accusations came first. Evidence was then gathered and theories developed after the fact. More than 200 people were accused, dozens jailed, 19 hanged and one tortured to death. And yet Massachusetts exonerated most accused and convicted witches 17 years later in 1703. Why should we expect our search for prehistoric behaviorally modern humans to be any more successful and enduring than Salem's search for witches? I don't think we should. So what's wrong with behavioral modernity? First off, it's anti-evolutionary. It assumes all humans evolved convergently towards the same end, regardless of divergent selective pressures. Secondly, it's a metaphor rather than a measurable property of the evidence. And a metaphor that archeologists choose selectively. It doesn't predict. All of its interpretations are made after the fact. I think it's better to focus, and I'll propose, it's better to focus on behavioral complexity instead. With behavioral modernity, different lightning bolts will all produce the same kinds of stone tools. Don't nap outdoors during a lightning storm. With behavioral complexity, different causes beget different results. So what is behavioral complexity? Well, key features of complex phenomena include multiple parts that interact with one another systematically. Different inputs create different outputs. And historical and geographic variation among inputs creates complexly-patterned variability among the outputs.
That whole system, too much systems theory. Here's a practical example. Folk art is a complex phenomenon in which artists, cultures, materials and markets interact dynamically with one another. If you think folk art is simple, do please join us in Santa Fe, New Mexico for the International Folk Art Festival that happens each July. Handcrafted artifacts vary historically and geographically in complexly patterned ways. So how complex was early human stone working? Today, stone working, or making stone tools, is a handicraft, a folk art, of which I'm a practitioner. It's also a reasonable source of hypotheses about prehistoric stoneworking because it's familiar, we can observe it. If early Homo sapiens had similar capacities for behavioral variability, excuse me, for behavioral complexity to ours and different ones from earlier hominins, then their first appearances in the fossil record should coincide with onsets of complexly patterned stone tool variability. Here's a prediction. Does it? Well indeed it does. After 100,000 to 300,000 years ago, Homo sapiens first appearances everywhere coincide with increasingly complex and patterned stone tool variability. So what is complexly patterned stone tool variability? I hope this slide captures it well. To make my point, I've stripped away all the hints you would have to the origins of these artifacts from their raw materials and substituted them with drawings. The objects in blue here date to more than 300,000 years ago and their artifact designs, their shapes, their morphology, their patterns of modification, only weakly indicate geographic origin and their age, if they do so at all. Now in the past, I've offered to have a contest and offered a $20 bill to anybody who can guess where the object, the teardrop shaped object comes from, but I'm told that that would be improper, hehe. (giggles) Okay, now less than 100,000 years ago, artifact designs indicate age and geographic origin with increasing precision.
The objects outlined in red are a bit older, they're closer to around 100,000 years ago. If you were to guess their ages, you'd probably come within 10,000 years, maybe 100,000 years. But the objects in black, the objects in black are more recent. They're younger than 40,000 years. And with some of them, we can narrow down their occurrence to 2,000 years. The object, let's see, one, two, three, four, fourth from the right there, the one with the hollow sort of base, that's an artifact found here in North America, from the Mississippi River to the Rocky Mountains and up to the Great Plains, over the course of about two centuries. Excuse me, 2,000 years. What do stone tools teach us? First off, stone tools are indestructible evidence about mind-technology relationships. This evidence shows complex mind-technology relationships were in place among early humans at least 100,000 to 300,000 years ago. Increasingly complex stone tool and other artifact variability since 100,000 years ago suggests mind-technology relationships changed during the course of human evolution. So if we're trying to discover the influence of the mind on technology and technology on the mind, we need to remember we're shooting at a moving target. So why? These are the fun questions. Why does stone tool variability become increasingly complex and patterned? Tools can be evolutionarily advantageous but they come with costs few other animals tolerate. Other primates occasionally use tools but only Homo sapiens are obligatory tool users. Obligatory tool use is irreversible, even the best survival experts require tools. How many of you guys will fess up to seeing this show, Naked and Afraid? Come on, you've seen this. I know at least one other of you in the audience has been like me, recruited to participate in it. I don't want to treat my students to pictures of me running around without clothes. They couldn't get insurance for this program unless they equipped the contestants with tools.
It wouldn't be called Naked and Afraid, it would be called Naked and Dead. Now as I say, it's irreversible, we can't go back. More tool use increases cultural differences' potential influence on tool-making strategies. I like to call this the no, you're doing it wrong effect. So what do we know, what do we not know and what do we want to know? We know obligatory tool use is intrinsically complex and variable and these are quantities we can measure using the stone tool evidence. We do not know whether obligatory tool use was a uniquely derived characteristic of Homo sapiens' behavior or one shared by other hominins now extinct, such as Denisovans and Neanderthals, perhaps. We find ourselves in a happy spot where we have new research questions. You always want to come up with new research questions. You don't want to just hang up your spears and go do something else. How did obligatory tool use change the human mind from earlier primate and hominin conditions? And how do we detect obligatory tool use? Now some of my colleagues might say, and I'm not referring to anyone in the room, some of my colleagues might say we have an arsenal of ways of measuring stone tools and quantifying their variability that we've been using for 200 years. But archeologists who formulated those measurements 200 years ago weren't thinking about obligatory tool use. So we're in a situation where we have a tool box full of hammers and chisels and we're setting forth to change the battery on an iPhone. So I don't think the existing tools archeologists used to characterize the stone tool record are necessarily unhelpful but we have to prove their value. Thank you. (audience applause)

- I'd like to thank Tim and Pat for inviting me to this very interesting symposium and of course the entire Carta organization. Language coevolved with the human brain throughout the evolution of Homo sapiens.
Writing, on the other hand, which is actually what my topic is, is a relatively new technology that was invented by humans to translate spoken language into a visual form for the purpose of transmitting verbal communication broadly to many people over large distances and time. As such, reading and writing can be considered the first social media. Reading is an example of what's called neuronal recycling, the recruitment of previously evolved neural circuits to accomplish cultural innovations. It's important to point out that while language develops naturally in most humans based on exposure rather than explicit instruction, reading and writing require painstaking instruction and years of practice to reach proficiency. This slide shows the typical reading brain network with its key components as established initially, in an older view, by studies of patients with acquired brain lesions. And more recently, using modern brain imaging technologies, we have a lot more information. This talk will trace the history of the invention of writing, how English writing developed and ended up with such a cockeyed spelling system which leads in many cases to dyslexia, and discuss how modern technologies are impacting reading and writing today. I'd like to begin with a brief overview of the history of the invention of writing. Writing is a physical manifestation of a spoken language. It's thought that human beings developed spoken language about 35,000 BC as evidenced by, wow yeah, by cave paintings from the period of Cro-Magnon man. Writing numbers for the purpose of record keeping began long before writing of language. Written language doesn't emerge until its invention in Sumer in southern Mesopotamia around 3,500 BC. This writing system is known as Cuneiform writing. It's generally agreed that true writing of language was independently conceived and developed by at least two ancient civilizations and possibly more: in Sumer, and also in Mesoamerica by the Maya in 500 to 300 BC.
Writing systems also arose in Egypt around 3100 BC and China around 1200 BC. But historians debate whether these writing systems were developed completely independently of Sumerian writing or whether either or both were inspired by Sumerian writing by a process of cultural diffusion. The Sumerian Cuneiform writing system was in use for more than three millennia through several stages of development, as can be seen on this slide by looking at the transformation of the sign for head, SAG, beginning with pictograms in 3000 BC. Ultimately it was completely replaced by alphabetic writing, in the general sense, in the course of the Roman era; there are no Cuneiform writing systems in use today. While the development of the phonetic alphabet is often attributed to the Phoenicians, they didn't actually invent it or create it. Rather, they imported it piecemeal from Egypt and Crete and took it to every city on the Mediterranean. By the time of Homer, the Greeks were taking over the Phoenician, or allied Aramaic, alphabetic writing and were calling it by the Semitic names of the first two letters, alpha and beta. The Latin alphabet was established by the seventh century BC. Before the invention of the printing press in Europe around 1455, all books were handwritten and usually highly decorated. Until about the 12th century, the most elaborate and beautiful illuminations were devoted to religious works and most manuscripts were produced in monasteries. During the middle ages, the Latin alphabet was used extensively for writing in Europe. With the age of colonialism and Christian evangelism, the Latin script spread widely well beyond Europe. I'd now like to change the focus to the history of English writing and explain how English writing ended up with such a cockeyed spelling system which leads in many cases to dyslexia.
The English language itself is a compilation of several different languages, mostly Anglo-Saxon, French and Latin, because England just kept getting invaded by people who spoke different languages. The only things we have in English up until about 1400 are called patois: clumsy peasant dialects without any standardized writing system. There wasn't even a vocabulary to carry on business in English. It was all done in French or Latin. However, when King Henry the Fifth took the throne of England in 1412, he had plans to invade France. So speaking French wasn't gonna work. He realized that if he wanted to get support from the common people, they would have to change the language of Parliament and business from French to English. At this time, the only people who could write at all in England were just a handful of men called chancery scribes who wrote by hand and almost exclusively in Latin. Latin has a transparent one to one correspondence between the number of discrete sounds, called phonemes, in spoken Latin and how they're mapped by the letters in the alphabet. As such, learning to read Latin's a piece of cake. All you have to do is learn the single letter that goes with each phoneme in Latin and then just pronounce them in order to produce words. King Henry the Fifth directed his chancery scribes to create a written version of English so that he could communicate more broadly with his subjects. However, there was little or no instruction as to how they were supposed to do this. The problem they faced was that the written Latin alphabet was designed to represent, one to one, each of the 23 phonemes in spoken Latin. It never fit well for English, which has 44 phonemes. For example, Latin didn't have a sh, th or ch sound. So with only 23 Latin letters, there was no letter to represent these and many of the other 44 English phonemes.
Faced with this problem, the chancery scribes just apparently made up multi-letter combinations to represent these single phonemes. So for example on this slide, you could see that the letter string t-h-e pronounced phonetically should be te-hu-e and s-h-e, sa-hu-e. So as many children are told just say those and blend the together quickly and what do you get? Sa-hu-e. (audience laughing) In case you ever wondered how we ended up with the non-transparent mapping from spoken to written English that leads so many children and people learning English as a second language to struggle to learn to read and spell. It all began here in the 15th century. Nearly 600 years before Gutenberg, Chinese monks were setting ink to paper using a method known as block printing. What really set Gutenberg's technology apart from the Chinese was the development of a press that could mechanize the transfer of ink from movable type to paper. For the first time in history, books could be mass produced. Gutenberg's first major work in 1454 was the 42 line Gutenberg bible printed in Latin. With the invention of the printing press, printing soon became the first means of mass communication, really the first social media. It put more knowledge in the hands of more people faster and more cheaply than ever before. As a result, reading and writing, which up until that time was really something done by a very, very small number of people spread widely and rapidly. Caxton brought the first printing press from Bruges to London in 1476 for the Canterbury Tales. With the advent of the printing press, the complex and often cockeyed spelling patterns for English that remain to this day were solidified giving rise to generation after generation of children and adults who struggled to learn to read and write, particularly in English. 
Let me acknowledge here that I'm indebted to a wonderful website which I'll highly recommend to you called The website's creator, David Bolton, gave much of the information by interviewing scientists and anthropologists for many years and I've synthesized a lot of this for this talk. I highly recommend that you explore this website for additional information on this topic of reading and writing, the history of reading and writing and especially dyslexia. I quote from David Bolton, "There weren't any cognitive scientists "or neuroscientists or psychologists "or even child development experts "involved in creating the English writing system. "There wasn't any concern "for hundreds of millions of children "who would struggle "since this code was created in the 15th century. "There were just a small number of chancery scribes "doing their very best "to shove a technology created for Latin "into a language it was never designed for. "Those who eventually do manage to overcome the confusion "and learn how to break the code, "to become phonologically aware "are able to use the areas of the brain "that reading has recycled from speech and language." These areas here. "Those who cannot break this code become dyslexic." According to the United States Department of Education in 2017, more than 60% of US K through 12 school children are reading below proficiency and more than 70% writing below the level of proficiency for that grade level. Let these stunning numbers sink in. Reading is the skill that matters most to success in school and children who fall behind in reading are in great academic danger. But it's not just the lack of reading skills that most endanger these children. It's the collateral damage. It's their mind shame that fates their future. As seen on this video clip being shown with permission of Children of the Code. 
- My teacher asked me to come into the front of the class and read a book and everybody was just staring at me and I got real nervous 'cause I didn't want to mess up or anything. And then when I started reading, I started messing up and I just couldn't help it and everybody started laughing at me and stuff. - Like the teacher would ask me to read something and I would read it and I'd get a wrong word or I'd go too slow and they'll make fun of me. - I kept messing up on the words and people kept laughing at me. - They said that you don't know how to read, I bet you won't be able to how to read when you grow up. - They always laugh at me if I get twist up with words. It makes my heart drop 'cause it seem like they not my friend no more. - The collateral damage caused by literacy problems in the United States alone is immense. 75% to 80% of students identified as learning disabled have their basic deficits in language and reading. Academic success, as defined by high school graduation, can be predicted by knowing someone's reading skill at the end of third grade. 56% of students with learning disabilities will drop out of school and be arrested. 60% of adolescents in treatment for substance abuse have learning disabilities. 50% of females with learning disabilities will be mothers, many of them single mothers, within three to five years of leaving high school. Learning disabilities and substance abuse are the number one reason for keeping welfare recipients from becoming and remaining employed. Now I'm gonna turn to discussing how modern technologies are impacting reading and writing today. Lest we forget, the human brain is an exquisitely adaptable machine. Even though written language did not evolve like spoken language did over tens of millions of years ago, since the invention of writing and reading, the remarkable neuroplasticity of the human brain has supported the rapid formation of a highly elaborate literacy brain circuit. 
Literacy is a unique epigenetic achievement that changes what we perceive, how we think and how we feel, that is who we are. I think it's fair to say that reading and writing, perhaps more than any other technology have had the greatest impact on the advancement of Homo sapien brain and mind. As we gain proficiency in reading and writing, the continual use of our literacy brain circuit feeds back, elaborates and strengthens itself. But just as this brain circuit developed very rapidly in evolutionary terms or perhaps because it was so rapid, this circuit is also more vulnerable to change, if not continuously reinforced by experience and use. Only if we continuously work to develop and use the elaborated, analytical, inferential and empathetic skills that have been developed by literacy will the neural networks underlying these skills continue to sustain our capacity to be attentive, thoughtful, critical thinkers rather than passive consumers of facts, real or fake. Despite alphabetic writing not changing much since the invention of the printing press in the 15th century, modern technologies, particularly social media technologies are having a very rapid and profound effect on writing practices in the 21st century. Interestingly, many of these new technologies are making less use of alphabetic writing systems and more use once again of pictographs, as can be seen here comparing the Egyptian hieroglyphics of 3100 BC to a thank you note I recently received from my niece that says, "Dinner was awesome. "It was a gift. "Thanks." And another example, a Chinese pictograph for medical marijuana originally from 10,000 years ago and a clip art symbol for medical marijuana from 2017. 
Since the 15th century, despite many changes in spoken English from old English to middle English to early modern and late modern English, like here it says, then she went to speak the whatever English tongue, written English has used the same 26 letters to represent the 44 sounds of English until now. The bottom line reads, "Then she went to speak this digital age English tongue." Neuroscientists like me who specialize in reading are frequently asked these days how might these rapid technological changes in reading and writing affect the human mind? My colleague, Dr. Maryanne Wolf, focuses on exactly this question in her new book Reader, Come Home. I'd like to close with some of the profound insights from this timely book. "As we move from a literary and word-based culture "into a far faster paced digital and screen-based one, "we face an existential dilemma in this new millennium. "There are many things that would be lost "if we slowly lose the cognitive patience "to immerse ourselves in the worlds "that are created by books "and the lives, feelings, thoughts and insights "of the characters who inhabit them." In a culture that increasingly rewards immediacy, ease and efficiency, the demanding time and effort involved in what Dr. Wolfe refers to as, deep reading make it an increasingly embattled entity. In closing, I ask each of you to consider this question, will the very plasticity of the literate brain as it begins to reflect the characteristics of digital media that we and our children are increasingly immersing ourselves in precipitate the atrophy of our most essential thought processes, sustained attention, critical analysis, empathy and reflection and indeed wisdom, to the detriment of democratic societies that critically depend on these, the most essential characteristics of the enlightened, literate individual? Thank you. (audience applause) (upbeat music)



To classify what traits should be included in modern human behavior, it is necessary to define behaviors that are universal among living human groups. Some examples of these human universals are abstract thought, planning, trade, cooperative labor, body decoration, and the control and use of fire. Along with these traits, humans rely heavily on social learning.[11][12] This cumulative cultural change, or cultural "ratchet", separates human culture from social learning in animals. A reliance on social learning may also be responsible in part for humans' rapid adaptation to many environments outside of Africa. Since cultural universals are found in all cultures, including some of the most isolated indigenous groups, these traits must have evolved or been invented in Africa prior to the exodus.[13][14][15][16]

Archaeologically, a number of empirical traits have been used as indicators of modern human behavior. While these are often debated,[17] a few are generally agreed upon. Archaeological evidence of behavioral modernity includes:[3][7]


Several critiques have been leveled against the traditional concept of behavioral modernity, both methodological and philosophical.[3][17] Shea (2011) outlines a variety of problems with the concept, arguing instead for "behavioral variability", which, according to the author, better describes the archaeological record. The use of trait lists, according to Shea (2011), runs the risk of taphonomic bias, where some sites may yield more artifacts than others despite similar populations; trait lists can also be ambiguous in how behaviors may be empirically recognized in the archaeological record.[17] Shea (2011) in particular cautions that population pressure, cultural change, or optimality models, like those in human behavioral ecology, might better predict changes in tool types or subsistence strategies than a change from "archaic" to "modern" behavior.[17] Some researchers argue that a greater emphasis should be placed on identifying only those artifacts which are unquestionably, or purely, symbolic as a metric for modern human behavior.[3]

Theories and models

Late Upper Paleolithic Model or "Revolution"

The Late Upper Paleolithic Model, or Upper Paleolithic Revolution, refers to the idea that, though anatomically modern humans first appeared around 150,000 years ago, they were not cognitively or behaviorally "modern" until around 50,000 years ago, leading to their expansion into Europe and Asia.[7][18][19] These authors note that the traits used as a metric for behavioral modernity do not appear as a package until around 40,000–50,000 years ago. Klein (1995) specifically notes that evidence of fishing, bone worked into tools, hearths, significant artifact diversity, and elaborate graves is absent before this point.[7] Although assemblages before 50,000 years ago show some diversity, the only distinctly modern tool assemblages appear in Europe around 48,000 years ago.[18] According to these authors, art only becomes common beyond this switching point, signifying a change from archaic to modern humans.[7] Most researchers argue that a neurological or genetic change, perhaps one enabling complex language, such as a mutation in FOXP2, caused this revolutionary change in our species.[7][19]

Alternative models

Contrasted with this view of a spontaneous leap in cognition among ancient humans, some authors like Alison S. Brooks, primarily working in African archaeology, point to the gradual accumulation of "modern" behaviors, starting well before the 50,000 year benchmark of the Upper Paleolithic Revolution models.[2][3][20] Howiesons Poort, Blombos, and other South African archaeological sites, for example, show evidence of marine resource acquisition, trade, the making of bone tools, blade and microlith technology, and abstract ornamentation at least by 80,000 years ago.[2][8] Given evidence from Africa and the Middle East, a variety of hypotheses have been put forth to describe an earlier, gradual transition from simple to more complex human behavior. Some authors have pushed back the appearance of fully modern behavior to around 80,000 years ago in order to incorporate the South African data.[20]

Others focus on the slow accumulation of different technologies and behaviors across time. These researchers[2][3] describe how anatomically modern humans could have been cognitively the same as present-day humans, with what we define as behavioral modernity being the result of thousands of years of cultural adaptation and learning. D'Errico and others have looked at Neanderthal culture, rather than early human behavior exclusively, for clues into behavioral modernity.[6] Noting that Neanderthal assemblages often portray traits similar to those listed for modern human behavior, researchers have stressed that the foundations for behavioral modernity may in fact lie deeper in our hominin ancestors.[21] If both modern humans and Neanderthals produced abstract art and complex tools, then "modern human behavior" cannot be a derived trait for our species. These researchers argue that the original "human revolution" theory reflects a profound Eurocentric bias. Recent archaeological evidence, they argue, shows that humans evolving in Africa some 300,000 or even 400,000 years ago were already becoming cognitively and behaviorally "modern". These features include blade and microlithic technology, bone tools, increased geographic range, specialized hunting, the use of aquatic resources, long-distance trade, systematic processing and use of pigment, and art and decoration. These items do not occur suddenly together as predicted by the "human revolution" model, but at sites that are widely separated in space and time, suggesting a gradual assembling of the package of modern human behaviors in Africa and its later export to other regions of the Old World.

Between these extremes is the view – currently supported by archaeologists Chris Henshilwood,[22] Curtis Marean,[23] Ian Watts[24] and others – that there was indeed some kind of 'human revolution' but that it occurred in Africa and spanned tens of thousands of years. The term "revolution" in this context would mean not a sudden mutation but a historical development along the lines of "the industrial revolution" or "the Neolithic revolution".[25] In other words, it was a relatively accelerated process, too rapid for ordinary Darwinian "descent with modification" yet too gradual to be attributed to a single genetic or other sudden event. These archaeologists point in particular to the relatively explosive emergence of ochre crayons and shell necklaces apparently used for cosmetic purposes. These archaeologists see symbolic organisation of human social life as the key transition in modern human evolution. Recently discovered at sites such as Blombos Cave and Pinnacle Point, South Africa, pierced shells, pigments and other striking signs of personal ornamentation have been dated within a time-window of 70,000–160,000 years ago in the African Middle Stone Age, suggesting that the emergence of Homo sapiens coincided, after all, with the transition to modern cognition and behaviour.[26] While viewing the emergence of language as a 'revolutionary' development, this school of thought generally attributes it to cumulative social, cognitive and cultural evolutionary processes as opposed to a single genetic mutation.[27]

A further view, taken by archaeologists such as Francesco D'Errico[28] and João Zilhão,[29] is a multi-species perspective arguing that evidence for symbolic culture, in the form of utilized pigments and pierced shells, is also found at Neanderthal sites, independently of any "modern" human influence.

Cultural evolutionary models may also shed light on why, although evidence of behavioral modernity exists before 50,000 years ago, it is not expressed consistently until that point. With small population sizes, human groups would have been affected by demographic and cultural evolutionary forces that may not have allowed complex cultural traits to persist.[9][10][11][12] According to some authors,[9] complex traits could not have been maintained effectively until population density became significantly high. Notably, some genetic evidence supports a dramatic increase in population size before the human migration out of Africa.[19] High local extinction rates within a population can also significantly decrease the amount of diversity in neutral cultural traits, regardless of cognitive ability.[10]
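The demographic argument can be illustrated with a toy simulation. The following sketch is only in the spirit of the cultural evolutionary models cited above, not a reproduction of any published model, and all of its parameters are illustrative: each generation, every learner imitates the most skilled individual, but copying is lossy on average, with occasional lucky improvements. Larger populations get more "draws", so someone is more likely to match or exceed the model, letting complex skills persist and ratchet upward; small populations steadily lose them.

```python
import random

def simulate(pop_size, generations=200, seed=0):
    """Toy cultural 'treadmill': learners copy the best practitioner
    with noisy, on-average-lossy transmission (illustrative parameters:
    mean copying error -3, standard deviation 2)."""
    random.seed(seed)
    best = 10.0  # skill level of the best current practitioner
    for _ in range(generations):
        # Each learner produces an imperfect copy of the best model.
        copies = [best + random.gauss(-3.0, 2.0) for _ in range(pop_size)]
        # The next generation learns from the most skilled copy.
        best = max(copies)
    return best

# A small group loses the skill; a large group maintains or improves it.
print(simulate(5))    # ends well below the starting level of 10
print(simulate(500))  # ends far above it
```

The qualitative outcome, not the particular numbers, is the point: with identical individual cognition, population size alone determines whether the trait is maintained, echoing the argument that "modern" behavior could appear and disappear with demography.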

Highly speculatively, bicameral mind theory argues for an additional, and cultural rather than genetic, shift from selfless to self-perceiving forms of human cognition and behavior very late in human history, in the Bronze Age. This is based on a literary analysis of Bronze Age texts which claims to show the first appearances of the concept of self around this time, replacing the voices of gods as the primary form of recorded human cognition. This non-mainstream theory is not widely accepted but does receive serious academic interest from time to time.

Archaeological evidence


Before the Out of Africa theory was generally accepted, there was no consensus on where the human species evolved and, consequently, where modern human behavior arose. Now, however, African archaeology has become extremely important in discovering the origins of humanity. Since the first Cro-Magnon expansion into Europe around 48,000 years ago is generally accepted as already "modern",[18] the question becomes whether behavioral modernity appeared in Africa well before 50,000 years ago, as a late Upper Paleolithic "revolution" which prompted migration out of Africa, or arose outside Africa and diffused back.

A variety of evidence of abstract imagery, widened subsistence strategies, and other "modern" behaviors has been discovered in Africa, especially South and North Africa. The Blombos Cave site in South Africa, for example, is famous for rectangular slabs of ochre engraved with geometric designs. Using multiple dating techniques, the site was confirmed to be around 77,000 years old.[30] Beads and other personal ornamentation have been found in Morocco which might be as much as 130,000 years old; the Cave of Hearths in South Africa has also yielded a number of beads dating significantly before 50,000 years ago.[2] Specialized projectile weapons have likewise been found at various sites in Middle Stone Age Africa, including bone and stone arrowheads at South African sites such as Sibudu Cave (along with an early bone needle, also found at Sibudu) dating approximately 60,000–70,000 years ago,[31][32][33] and bone harpoons at the Central African site of Katanda dating to ca. 90,000 years ago.[34] Evidence also exists for the systematic heat treating of silcrete stone to increase its flakeability for the purpose of toolmaking, beginning approximately 164,000 years ago at the South African site of Pinnacle Point and becoming common there for the creation of microlithic tools ca. 72,000 years ago.[35] In 2008, an ochre-processing workshop, likely for the production of paints, was uncovered at Blombos Cave, South Africa, dating to ca. 100,000 years ago. Analysis shows that a liquefied pigment-rich mixture was produced and stored in two abalone shells, and that ochre, bone, charcoal, grindstones, and hammerstones also formed a composite part of the toolkits.
Evidence for the complexity of the task includes procuring and combining raw materials from various sources (implying the makers had a mental template of the process they would follow), possibly using pyrotechnology to facilitate fat extraction from bone, using a probable recipe to produce the compound, and the use of shell containers for mixing and storage for later use.[36] Modern behaviors, such as the making of shell beads, bone tools and arrows, and the use of ochre pigment, are evident at a Kenyan site by 78,000–67,000 years ago.[37] The oldest known stone-tipped projectile weapons (a characteristic tool of Homo sapiens), the stone tips of javelins or throwing spears, come from the Ethiopian site of Gademotta and date to ca. 279,000 years ago.[38]

The expansion of subsistence strategies beyond big-game hunting, and the consequent diversity in tool types, has been noted as a sign of behavioral modernity. A number of South African sites have shown an early reliance on aquatic resources, from fish to shellfish. Pinnacle Point, in particular, shows exploitation of marine resources as early as 120,000 years ago, perhaps in response to more arid conditions inland.[8] Establishing a reliance on predictable shellfish deposits, for example, could reduce mobility and facilitate complex social systems and symbolic behavior. Blombos Cave and Site 440 in Sudan both show evidence of fishing as well. Taphonomic changes in fish skeletons from Blombos Cave have been interpreted as evidence of the capture of live fish, clearly an intentional human behavior.[2]

Humans in North Africa (Nazlet Sabaha, Egypt) are known to have dabbled in chert mining, as early as ≈100,000 years ago, for the construction of stone tools.[39][40]


While traditionally described as evidence for the Late Upper Paleolithic Model,[7] European archaeology has shown that the issue is more complex. A variety of stone tool technologies are present at the time of human expansion into Europe and show evidence of modern behavior. Despite the problems of conflating specific tools with cultural groups, the Aurignacian tool complex, for example, is generally taken as a purely modern human signature.[41][42] The discovery of "transitional" complexes, like the "proto-Aurignacian", has been taken as evidence of human groups progressing through "steps of innovation".[41] If, as this might suggest, human groups were already migrating into eastern Europe around 40,000 years ago and only afterward showed evidence of behavioral modernity, then either the cognitive change must have diffused back into Africa or it was already present before migration.

In light of a growing body of evidence of Neanderthal culture and tool complexes, some researchers have put forth a "multiple species model" for behavioral modernity.[6][21][43] Neanderthals were often cited as an evolutionary dead end, apish cousins less advanced than their human contemporaries, and their personal ornaments were dismissed as trinkets or poor imitations compared to the cave art produced by H. sapiens. Despite this, European evidence has shown a variety of personal ornaments and artistic artifacts produced by Neanderthals; for example, the Neanderthal site of Grotte du Renne has produced grooved bear, wolf, and fox incisors, ochre, and other symbolic artifacts.[43] Although burials are few and controversial, there has been circumstantial evidence of Neanderthal ritual burials.[21] There are two ways to explain this symbolic behavior among Neanderthals: they copied cultural traits from arriving modern humans, or they had their own cultural traditions comparable with behavioral modernity. Even if they merely copied cultural traditions, which is debated by several authors,[6][21] they still possessed the capacity for the complex culture described by behavioral modernity. As discussed above, if Neanderthals were also "behaviorally modern" then it cannot be a species-specific derived trait.


Most debates surrounding behavioral modernity have been focused on Africa or Europe but an increasing amount of focus has been placed on East Asia. This region offers a unique opportunity to test hypotheses of multi-regionalism, replacement, and demographic effects.[44] Unlike Europe, where initial migration occurred around 50,000 years ago, human remains have been dated in China to around 100,000 years ago.[45] This early evidence of human expansion calls into question behavioral modernity as an impetus for migration.

Stone tool technology is of particular interest in East Asia. Following Homo erectus migrations out of Africa, Acheulean technology never seems to appear beyond present-day India and into China. Analogously, Mode 3, or Levallois, technology does not appear in China following later hominin dispersals.[46] This lack of more advanced technology has been explained by serial founder effects and low population densities outside of Africa.[47] Although tool complexes comparable to those of Europe are missing or fragmentary, other archaeological evidence shows behavioral modernity. For example, the peopling of the Japanese archipelago offers an opportunity to investigate the early use of watercraft. Although one site, Kanedori in Honshu, does suggest the use of watercraft as early as 84,000 years ago, there is no other evidence of hominins in Japan until 50,000 years ago.[44]

The Zhoukoudian cave system near Beijing, excavated since the 1930s, has yielded precious data on early human behavior in East Asia. Although disputed, there is evidence of possible human burials and interred remains in the cave dated to around 34,000–20,000 years ago.[44] These remains have associated personal ornaments in the form of beads and worked shell, suggesting symbolic behavior. Along with the possible burials, numerous other symbolic objects, such as punctured animal teeth and beads, some dyed in red ochre, have been found at Zhoukoudian.[44] Although fragmentary, the archaeological record of eastern Asia shows evidence of behavioral modernity before 50,000 years ago but, like the African record, it does not become fully apparent until that time.

See also


  1. ^
  2. ^ a b c d e f g McBrearty, Sally; Brooks, Allison (2000). "The revolution that wasn't: a new interpretation of the origin of modern human behavior". Journal of Human Evolution. 39 (5): 453–563. doi:10.1006/jhev.2000.0435. PMID 11102266.
  3. ^ a b c d e f g h Henshilwood, Christopher; Marean, Curtis (2003). "The Origin of Modern Human Behavior: Critique of the Models and Their Test Implications". Current Anthropology. 44 (5): 627–651. doi:10.1086/377665.
  4. ^ Hill, Kim; et al. (2009). "The Emergence of Human Uniqueness: Characters Underlying Behavioral Modernity". Evolutionary Anthropology. 18 (5): 187–200. doi:10.1002/evan.20224.
  5. ^ Klein, R. G. 1999. The human career: human biological and cultural origins. Chicago: University of Chicago Press.
  6. ^ a b c d D'Errico, F; et al. (1998). "Neanderthal Acculturation in Western Europe? A Critical Review of the Evidence and Its Interpretation". Current Anthropology. 39 (S1): S1–S44. doi:10.1086/204689.
  7. ^ a b c d e f g Klein, Richard (1995). "Anatomy, behavior, and modern human origins". Journal of World Prehistory. 9 (2): 167–198. doi:10.1007/bf02221838.
  8. ^ a b c Marean, Curtis; et al. (2007). "Early human use of marine resources and pigment in South Africa during the Middle Pleistocene". Nature. 449 (7164): 905–908. Bibcode:2007Natur.449..905M. doi:10.1038/nature06204. PMID 17943129.
  9. ^ a b c Powell, Adam; et al. (2009). "Late Pleistocene Demography and the Appearance of Modern Human Behavior". Science. 324 (5932): 1298–1301. Bibcode:2009Sci...324.1298P. doi:10.1126/science.1170165. PMID 19498164.
  10. ^ a b c Premo, Luke; Kuhn, Steve (2010). "Modeling Effects of Local Extinctions on Culture Change and Diversity in the Paleolithic". PLOS ONE. 5 (12): e15582. Bibcode:2010PLoSO...515582P. doi:10.1371/journal.pone.0015582. PMC 3003693. PMID 21179418.
  11. ^ a b Boyd, Robert; Richerson, Peter (1988). Culture and the Evolutionary Process (2 ed.). University of Chicago Press. ISBN 9780226069333.
  12. ^ a b Nakahashi, Wataru (2013). "Evolution of improvement and cumulative culture". Theoretical Population Biology. 83: 30–38. doi:10.1016/j.tpb.2012.11.001. PMID 23153511.
  13. ^ Wade, Nicholas (2003-07-15). "leap to language". New York Times. Retrieved 2009-09-10.
  14. ^ Buller, David (2005). Adapting Minds: Evolutionary Psychology and the Persistent Quest for Human Nature. MIT Press. p. 468. ISBN 978-0-262-02579-9.
  15. ^ "80,000-year-old Beads Shed Light on Early Culture". 2007-06-18. Retrieved 2009-09-10.
  16. ^ "three distinct human populations". Retrieved 2009-09-10.
  17. ^ a b c d Shea, John (2011). "Homo sapiens Is as Homo sapiens Was". Current Anthropology. 52 (1): 1–35. doi:10.1086/658067.
  18. ^ a b c Hoffecker, John (2009). "The spread of modern humans in Europe". PNAS. 106 (38): 16040–16045. Bibcode:2009PNAS..10616040H. doi:10.1073/pnas.0903446106. PMC 2752585. PMID 19571003.
  19. ^ a b c Tattersall, Ian (2009). "Human origins: Out of Africa". PNAS. 106 (38): 16018–16021. Bibcode:2009PNAS..10616018T. doi:10.1073/pnas.0903207106. PMC 2752574. PMID 19805256.
  20. ^ a b Foley, Robert; Lahr, Marta (1997). "Mode 3 Technologies and the Evolution of Modern Humans". Cambridge Archaeological Journal. 7 (1): 3–36. doi:10.1017/s0959774300001451.
  21. ^ a b c d D'Errico, Francesco (2003). "The Invisible Frontier A Multiple Species Model for the Origin of Behavioral Modernity". Evolutionary Anthropology. 12 (4): 188–202. doi:10.1002/evan.10113.
  22. ^ Henshilwood, C. S.; d'Errico, F.; Yates, R.; Jacobs, Z.; Tribolo, C.; Duller, G. A. T.; Mercier, N.; Sealy, J. C.; Valladas, H.; Watts, I.; Wintle, A. G. (2002). "Emergence of modern human behavior: Middle Stone Age engravings from South Africa". Science. 295 (5558): 1278–1280. Bibcode:2002Sci...295.1278H. doi:10.1126/science.1067575. PMID 11786608.
  23. ^ Henshilwood, C.; Marean, C. W. (2003). "The origin of modern human behavior". Current Anthropology. 44 (5): 627–651. doi:10.1086/377665.
  24. ^ Watts, I. 2009. Red ochre, body painting, and language: interpreting the Blombos ochre. In R. Botha and C. Knight (eds), The Cradle of Language. Oxford: Oxford University Press, pp. 62-92.
  25. ^ Mellars, P. A., K. Boyle, O. Bar-Yosef and C. Stringer (eds), 2007. Rethinking the Human Revolution: new behavioural and biological perspectives on the origin and dispersal of modern humans. Cambridge: McDonald Institute for Archaeological Research.
  26. ^ Henshilwood, C. S. and B. Dubreuil 2009. Reading the artifacts: gleaning language skills from the Middle Stone Age in southern Africa. In R. Botha and C. Knight (eds), The Cradle of Language. Oxford: Oxford University Press, pp. 41-61.
  27. ^ Botha, R. and C. Knight (eds), The Cradle of Language. Oxford: Oxford University Press.
  28. ^ D'Errico, F (2003). "The invisible frontier: a multiple species model for the origin of behavioral modernity". Evolutionary Anthropology. 12 (4): 188–202. doi:10.1002/evan.10113.
  29. ^ Zilhão, J (2006). "Neandertals and moderns mixed, and it matters". Evolutionary Anthropology. 15 (5): 183–195. doi:10.1002/evan.20110.
  30. ^ Henshilwood, Christopher; et al. (2002). "Emergence of Modern Human Behavior: Middle Stone Age Engravings from South Africa". Science. 295 (5558): 1278–1280. Bibcode:2002Sci...295.1278H. doi:10.1126/science.1067575. PMID 11786608.
  31. ^ Backwell, Lucinda (2008). "Middle Stone Age bone tools from the Howiesons Poort layers, Sibudu Cave, South Africa". Journal of Archaeological Science. 35: 1566–1580.
  32. ^ Wadley, Lyn (2008). "The Howieson's Poort industry of Sibudu Cave". South African Archaeological Society Goodwin Series. 10.
  33. ^ Backwell, L.; d'Errico, F.; Wadley, L. (2008). "Middle Stone Age bone tools from the Howiesons Poort layers, Sibudu Cave, South Africa". Journal of Archaeological Science. 35: 1566–1580. doi:10.1016/j.jas.2007.11.006.
  34. ^ Yellen, JE; AS Brooks; E Cornelissen; MJ Mehlman; K Stewart (28 April 1995). "A middle stone age worked bone industry from Katanda, Upper Semliki Valley, Zaire". Science. 268 (5210): 553–556. doi:10.1126/science.7725100. PMID 7725100.
  35. ^ Brown, Kyle S.; Marean, Curtis W.; Herries, Andy I.R.; Jacobs, Zenobia; Tribolo, Chantal; Braun, David; Roberts, David L.; Meyer, Michael C.; Bernatchez, J. (14 August 2009), "Fire as an Engineering Tool of Early Modern Humans", Science, 325 (5942): 859–862, doi:10.1126/science.1175028, PMID 19679810
  36. ^ Henshilwood, Christopher S.; et al. (2011). "A 100,000-Year-Old Ochre-Processing Workshop at Blombos Cave, South Africa". Science. 334: 219–222.
  37. ^ Shipton, C.; d'Errico, F.; Petraglia, M.; et al. (2018). "78,000-year-old record of Middle and Later Stone Age innovation in an East African tropical forest". Nature Communications. 9.
  38. ^ Sahle, Y.; Hutchings, W. K.; Braun, D. R.; Sealy, J. C.; Morgan, L. E.; Negash, A.; Atnafu, B. (2013). Petraglia, Michael D (ed.). "Earliest Stone-Tipped Projectiles from the Ethiopian Rift Date to >279,000 Years Ago". PLoS ONE. 8 (11): e78092. doi:10.1371/journal.pone.0078092. PMC 3827237. PMID 24236011.
  40. ^ Guinness World Records (10 September 2015). Guinness World Records 2016. Guinness World Records. p. 27. ISBN 978-1-910561-03-4.
  41. ^ a b Jöris, Olaf; Street, Martin (2008). "At the end of the 14C time scale – the Middle to Upper Paleolithic record of western Eurasia". Journal of Human Evolution. 55 (5): 782–802. doi:10.1016/j.jhevol.2008.04.002. PMID 18930513.
  42. ^ Anikovich, M.; et al. (2007). "Early Upper Paleolithic in Eastern Europe and Implications for the Dispersal of Modern Humans". Science. 315 (5809): 223–226. Bibcode:2007Sci...315..223A. doi:10.1126/science.1133376. PMID 17218523.
  43. ^ a b Abadía, Oscar Moro; González Morales, Manuel R. (2010). "Redefining Neanderthals and Art: An Alternative Interpretation of the Multiple Species Model for the Origin of Behavioural Modernity". Oxford Journal of Archaeology. 29 (3): 229–243. doi:10.1111/j.1468-0092.2010.00346.x.
  44. ^ a b c d Norton, Christopher; Jin, Jennie (2009). "The Evolution of Modern Human Behavior in East Asia: Current Perspectives". Evolutionary Anthropology. 18 (6): 247–260. doi:10.1002/evan.20235.
  45. ^ Liu, Wu; et al. (2010). "Human remains from Zhirendong, South China, and modern human emergence in East Asia". PNAS. 107 (45): 19201–19206. Bibcode:2010PNAS..10719201L. doi:10.1073/pnas.1014386107. PMC 2984215. PMID 20974952.
  46. ^ Norton, Christopher; Bae, K. (2008). "The Movius Line sensu lato (Norton et al. 2006) further assessed and defined". Journal of Human Evolution. 55 (6): 1148–1150. doi:10.1016/j.jhevol.2008.08.003. PMID 18809202.
  47. ^ Lycett, Stephen; Norton, Christopher (2010). "A demographic model for Palaeolithic technological evolution: The case of East Asia and the Movius Line". Quaternary International. 211 (1–2): 55–65. Bibcode:2010QuInt.211...55L. doi:10.1016/j.quaint.2008.12.001.
This page was last edited on 27 April 2019, at 13:17