Evolutionary Hypotheses for Human Childhood / BARRY BOGIN



This is an intriguing and very readable paper that delineates clear differences in thought regarding the evolution of Homo sapiens. Bogin rejects non-reproduction-based explanations (a welcome surprise!) for the “appearance” of extended infancy, childhood and juvenile stages of development. He also rejects neoteny and heterochrony as sole mechanisms to account for unique aspects of human development; however, some of his claims about human uniqueness are simply wrong, and nothing within his hypothesis rules out heterochrony and neoteny as critical mechanisms behind the addition or extension of the human infant-child-juvenile sequence of development.

My emphasis in this blog is on sexual selection for tameness / domestication driven by the shift from hunting, foraging, scavenging and gathering (nomadism) to agriculture. The inevitable sedentary-urban lifestyle, with its new dependence on less nutritious food and the increased labor required for food production, necessitated the adaptations we see in the neotenic modern social type that dominates today. Many steps involving the differential evolution of brain function, reproduction, behavior, culture and physiology have likely taken place to produce Homo sapiens sapiens (neurotypical humans).

I’d like to comment point by point, but this is a long essay…  


Evolutionary Hypotheses for Human Childhood (1997)

BARRY BOGIN Department of Behavioral Sciences, University of Michigan-Dearborn, Dearborn, Michigan


The origins of human childhood have fascinated scholars from many disciplines. Some researchers argue that childhood, and many other human characteristics, evolved by heterochrony, an evolutionary process that alters the timing of growth stages from ancestors to their descendants. Other scholars argue against heterochrony, but so far have not offered a well-developed alternative hypothesis. This essay presents such an alternative.

Childhood is defined as a unique developmental stage of humans. Childhood is the period following infancy, when the youngster is weaned from nursing but still depends on older people for feeding and protection. The biological constraints of childhood, which include an immature dentition, a small digestive system, and a calorie-demanding brain that is both relatively large and growing rapidly, necessitate the care and feeding that older individuals must provide.

Evidence is presented that childhood evolved as a new stage of hominid life history, first appearing, perhaps, during the time of Homo habilis. The value of childhood is often ascribed to learning many aspects of human culture. It is certainly true that childhood provides ‘‘extra’’ time for brain development and learning. However, the initial selective value of childhood may be more closely related to parental strategies to increase reproductive success. Childhood allows a woman to give birth to new offspring and provide care for existing dependent young. Understanding the nature of childhood helps to explain why humans have lengthy development and low fertility, but greater reproductive success than any other species. Yrbk Phys Anthropol 40:63–89, 1997. © 1997 Wiley-Liss, Inc.

Childhood fascinates scholars and practitioners from many disciplines. Virtually all human cultures recognize a time of life that may be called ‘‘childhood.’’ Many historical sources from Egyptian times to the 19th century, including Wordsworth in the poem above, mention that ‘‘childhood’’ occupies the first 6 to 7 years of life (Boyd, 1980). Some explanations for the origins and functions of childhood have been proposed, but none of these is accepted universally. Perhaps the lack of agreement is due to the nature of human evolutionary biology.

Much, much more…..


Orangutans hanging out, from James Tan.

There are a lot of conflicting claims about “childhood” and other aspects of comparative primate development… The orangutan has the longest childhood dependence on the mother of any animal in the world, because there is so much for a young orangutan to learn in order to survive. The babies nurse until they are about six years of age… Orangutan females only give birth about once every 8 years – the longest time between births of any mammal on earth. (This results in only 4 to 5 babies in her lifetime.) This is why orangutan populations are very slow to recover from disturbance.

from http://primatefacts.tumblr.com
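The quoted interbirth figures make Bogin’s reproductive-success argument easy to check with back-of-envelope arithmetic: weaning offspring early into a provisioned “childhood” shortens the interbirth interval (IBI), so more births fit into the same fertile span. A minimal sketch in Python; the 35-year fertile span and the chimpanzee and human IBIs are my own rough illustrative assumptions, only the orangutan IBI comes from the quoted text:

```python
# Illustrative sketch only: all figures are rough assumptions except the
# orangutan interbirth interval, which is from the quoted primatefacts text.

REPRODUCTIVE_SPAN_YEARS = 35  # assumed fertile span, for comparison only

interbirth_years = {
    "orangutan": 8.0,                  # from the quoted text above
    "chimpanzee": 5.5,                 # approximate wild figure (assumption)
    "human (natural fertility)": 3.0,  # approximate forager figure (assumption)
}

# Shorter interbirth interval -> more births in the same fertile span.
births_per_species = {
    species: REPRODUCTIVE_SPAN_YEARS / ibi
    for species, ibi in interbirth_years.items()
}

for species, births in births_per_species.items():
    print(f"{species}: ~{births:.1f} births in {REPRODUCTIVE_SPAN_YEARS} years")
```

The orangutan figure (~4.4 births) lines up with the “4 to 5 babies in her lifetime” in the quote; the point is simply that halving the interbirth interval more than doubles potential lifetime births, which is Bogin’s claimed payoff for childhood.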


Self-mythologizing / Homo sapiens NT strikes again

Every once in a while, I like to check in with neurotypical “pop science” versions of WHO WE ARE – narcissism knows no limits.

From SLATE.com

Science / The state of the universe. (Not too pompous!)
Jan. 29 2013

Why Are We the Last Apes Standing?

There’s a misconception among a lot of us Homo sapiens that we and our direct ancestors are the only humans ever to have walked the planet. It turns out that the emergence of our kind isn’t nearly that simple. The whole story of human evolution is messy, and the more we look into the matter, the messier it becomes.


Before we go into this “messy” NT mythology, a word about the author. His website is www.chipwalter.com


At last you have made your way to the website of Chip Walter. (Try to control your excitement.) If you’re a curious person – and your discovery of this site attests that you are – then you’ve arrived at the right place. Go ahead, browse…

Chip is a journalist, author, filmmaker and former CNN Bureau Chief. He has written four books, all of them, one way or another, explorations of human creativity, human nature and human curiosity. (That should be a warning: shameless BS to follow)


Paleoanthropologists have discovered as many as 27 different human species (the experts tend to debate where to draw the line between groups). These hominids diverged after our lineage split from a common ancestor we shared with chimpanzees 7 million years ago, give or take a few hundred millennia.

Many of these species crossed paths, competed, and mated. Populations ebbed and flowed in tight little tribes, at first on the expanding savannahs of Africa, later throughout Europe, Asia, and all the way to Indonesia. Just 100,000 years ago, there were several human species sharing the planet, possibly more: Neanderthals in Europe and West Asia, the mysterious Denisovan people of Siberia, the recently discovered Red Deer Cave people living in southern China, Homo floresiensis (the Hobbits of Indonesia), and other yet unknown descendants of Homo erectus who left indications that they were around (the DNA of specialized body lice, to be specific). And, of course, there was our kind, Homo sapiens sapiens (the wise, wise ones), still living in Africa, not yet having departed the mother continent. At most, each species consisted of a few tens of thousands of people hanging on by their battered fingernails. Somehow, out of all of these struggles, our particular brand of human emerged as the sole survivor and then went on, rather rapidly, to materially rearrange the world.

If there once were so many other human species wandering the planet, why are we alone still standing? After all, couldn’t another version or two have survived and coexisted with us on a world as large as ours? Lions and tigers coexist; so do jaguars and cheetahs. Gorillas, orangutans, bonobos, and chimpanzees do as well (though barely). Two kinds of elephants and multiple versions of dolphins, sharks, bears, birds, and beetles—countless beetles—inhabit the planet. Yet only one kind of human? Why?

More than once, one variety may have done in another either by murdering its rivals outright or outcompeting them for limited resources. But the answer isn’t as simple or dramatic as a war of extermination with one species turning on the other in some prehistoric version of Planet of the Apes. The reason we are still here to ruminate on why we are still here is because, of all those other human species, only we evolved a long childhood.

Over the course of the past 1.5 million years, the forces of evolution inserted an extra six years between infancy and pre-adolescence—a childhood—into the life of our species. And that changed everything.

Why should adding a childhood help us escape extinction’s pitiless scythe? Looked at logically, it shouldn’t. All it would seem to do is lengthen the time between birth and mating, which would slow down the clamoring business of the species’ own continuance. But there was one game-changing side effect of a long childhood. Those six years of life between ages 1 and 7 are the time when we lay the groundwork for the people we grow up to become. Without childhood you and I would never have the opportunity to step away from the dictates of our genes and develop the talents, quirks, and foibles that make us all the devastatingly charming, adaptable, and distinctive individuals we are.

Childhood came into existence as the result of a peculiar evolutionary phenomenon known generally as neoteny. (More about this sweeping misinterpretation later) The term comes from two Greek words, neos meaning “new” (in the sense of “juvenile”) and teinein meaning to “extend,” and it means the retention of youthful traits. In the case of humans, it meant that our ancestors passed along to us a way to stretch youth farther into life.

More than a million years ago, our direct ancestors found themselves in a real evolutionary pickle. On the one hand, their brains were growing larger than those of their rain forest cousins, and on the other, they had taken to walking upright because they spent most of their time in Africa’s expanding savannas. Both features would seem to have substantially increased the likelihood of their survival, and they did, except for one problem: Standing upright favors the evolution of narrow hips and therefore narrows the birth canal. And that made bringing larger-headed infants to full term before birth increasingly difficult.

If we were born as physically mature as, say, an infant gorilla, our mothers would be forced to carry us for 20 months! But if they did carry us that long, our larger heads wouldn’t make it through the birth canal. We would be, literally, unbearable. The solution: Our forerunners, as their brains expanded, began to arrive in the world sooner, essentially as fetuses, far less developed than other newborn primates, and considerably more helpless.

Bolk enumerated 25 specific fetal or juvenile features that disappear entirely in apes as they grow to adulthood but persist in humans. Flatter faces and high foreheads, for example, and a lack of body hair. The shape of our ears, the absence of large brow ridges over our eyes, a skull that sits facing forward on our necks, a straight rather than thumblike big toe, and the large size of our heads compared with the rest of our bodies. You can find every one of these traits in fetal, infant, or toddling apes, and in modern human adults.

In the nasty and brutish prehistoric world our ancestors inhabited, arriving prematurely could have been a very bad thing. But to see the advantages of being born helpless and fetal, all you have to do is watch a 2-year-old. Human children are the most voracious learners planet Earth has ever seen, and they are that way because their brains are still rapidly developing after birth. Neoteny, and the childhood it spawned, not only extended the time during which we grow up but ensured that we spent it developing not inside the safety of the womb but outside in the wide, convoluted, and unpredictable world.

The same neuronal networks that in other animals are largely set before or shortly after birth remain open and flexible in us. Other primates also exhibit “sensitive periods” for learning as their brains develop, but they pass quickly, and their brain circuitry is mostly established by their first birthday, leaving them far less touched by the experiences of their youth.

The major problem with all this NT self-congratulatory aggrandizement is the equally possible scenario that this “open, externalized brain development” leaves human fetuses, infants and children highly vulnerable to disastrous consequences: death in infancy by neglect, disease and predation; maternal death; brain and nervous system damage due to not-so-healthy human environments; insufficient care and nutrition during critical post-birth growth; plus the usual demands and perils of nature. And in “modern” societies, the necessity of a tremendous amount of medical-technological intervention in problem pregnancies: extreme premature birth, caesarean section delivery, long periods of ICU support, and a growing incidence of life-long impairment.

“Inattentional Blindness” to any negative consequences of human evolution is a true failure in NT perception of the human condition.

Based on the current fossil evidence, this was true to a lesser extent of the 26 other savanna apes and humans. Homo habilis, H. ergaster, H. erectus, even H. heidelbergensis (which is likely the common ancestor of Neanderthals, Denisovans, and us), all had prolonged childhoods compared with chimpanzees and gorillas, but none as long as ours. In fact, Harvard paleoanthropologist Tanya Smith and her colleagues have found that Neanderthals reversed the trend. By the time they met their end around 30,000 years ago, they were reaching childbearing age at about the age of 11 or 12, which is three to five years earlier than their Homo sapiens cousins. Was this in response to evolutionary pressure to accelerate childbearing to replenish the dwindling species? Maybe. But in the bargain, they traded away the flexibility that childhood delivers, and that may have ultimately led to their demise.

Aye, yai, yai! This string of NT echolalia, copied and pieced together from pop-science interpretations of “science projects” is worthy of Biblical mythology… a montage, a disordered mosaic; a collage of key words, that condenses millions of years of evolutionary change into a “slightly longer” (call it 6 million years instead of 6 thousand – sounds more scientific) – history of Creation… this is for neurotypical consumption: It’s okay… Evolution is really just magic, after all! 

We are different. During those six critical years, our brains furiously wire and rewire themselves, capturing experience, encoding and applying it to the needs of our particular life. Our extended childhood essentially enables our brains to better match our experience and environment. (Whatever that is supposed to mean – like wearing Bermuda shorts to the beach?) It is the foundation of the thing we call our personalities, the attributes that make you you and me me. Without it, you would be far more similar to everyone else, far less quirky and creative and less, well … you. Our childhood also helps explain how chimpanzees, remarkable as they are, can have 99 percent of our DNA but nothing like the same level of diversity, complexity, or inventiveness.

You are creative and quirky (dull and conformist) – and even if that’s a shameless lie (it is), AT LEAST you’re smarter than a chimpanzee!  

Our long childhood has allowed us to collectively engage in ever broadening conversations as we keep finding new ways to communicate; we jabber and bristle with invention and pool together waves of fresh ideas, good and bad, into that elaborate, rambling edifice we call human civilization. Without all of this variety, all of these interlocked notions and accomplishments, the world, for better or worse, would not be as it is, brimming with this species of self-aware conflicted apes, ingenious enough to rocket rovers off to Mars and construct the Internet, wage wars on international scales, invent both WMDs and symphonies. If not for our long childhoods, we would not be here at all, the last apes standing. Can we remain standing? Possibly. I’m counting on the child in us, the part that loves to meander and play, go down blind alleys, wonder why and fancy the impossible.

How shockingly stupid (and awful) writing. 


Homo erectus in Middle East / Emergence of “Fat Hunters”

New ideas on Homo erectus and an evolutionary shift to “a new hominin lineage” in the Middle East. 

Go to original paper for details and much more…

See also an interesting commentary on H. erectus by John Hawks




Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant

  • Miki Ben-Dor, Avi Gopher, Israel Hershkovitz, Ran Barkai
  • Published: December 9, 2011


It is our contention that two distinct elements combined in the Levant to propel the evolutionary process of replacing H. erectus by a new hominin lineage ([1]; as the classification of varieties of the genus Homo is problematic, we refrain in this paper from any taxonomic designations that would indicate species or subspecies affiliation for the hominins of Qesem Cave. (Thank-you!)

The Qesem Cave hominin, based on the analysis of teeth, shares dental characteristics with the Skhul/Qafzeh Middle Paleolithic populations and to some extent also with Neandertals). One was the disappearance of the elephant (Elephas antiquus) – an ideal food-package in terms of fat and protein content throughout the year – which was until then a main calorie contributor to the diet of the H. erectus in the Levant. The second was the continuous necessity of H. erectus to consume animal fat as part of their diet, especially when taking into account their large brains [2]. The need to consume animal fat is the result of the physiological ceiling on the consumption of protein and plant foods. The obligatory nature of animal fat consumption turned the alleged large prey preference [3], [4] of H. erectus into a large prey dependence. Daily energy expenditure (DEE) of the hominins would have increased when very large animals such as the elephant had diminished and a larger number of smaller, faster animals had to be captured to provide the same amount of calories and required fat. This fitness pressure would have been considerably more acute during the dry seasons that prevail in the Levant. Such an eventuality, we suggest, led to the evolution of a better equipped species, in comparison with H. erectus, that also had a lighter body [5], a greater lower limb to weight ratio ([6]:194), and improved levels of knowledge, skill, and coordination ([7]:63) allowing it to better handle the hunting of an increased number of smaller animals and most probably also develop a new supporting social organization. (Chicken or egg? Did the environmental change “promote” a newer, leaner, more coordinated version of Homo erectus, or did a “new hominin” move in from elsewhere?)

We also suggest that this evolutionary process was related to the appearance of a new and innovative local cultural complex – the Levantine Acheulo-Yabrudian [8], [9]. Moreover, a recent study of dental remains from the Acheulo-Yabrudian site of Qesem Cave in Israel dated to 400-200 kyr ago [10], [11] has indicated that the hominins inhabiting the cave were not H. erectus but were rather most similar to later populations (e.g., Skhul/Qafzeh) of this region ([1] and references therein).

The Broader Context

Our direct ancestor, H. erectus, was equipped with a thick and large skull, a large brain (900 cc on average), impressive brow ridges and a strong and heavy body, heavier than that of its H. sapiens successor (e.g., [12], [13], [14]). Inhabiting the old world for some 1.5 million years, H. erectus is commonly associated with the Acheulian cultural complex, which is characterized by the production of large flakes and handaxes – large tools shaped by bifacial flaking. Handaxes are interpreted as tools associated with the butchering of large game (e.g., [15], [16]). H. erectus was also suggested in recent years to have used fire [17], [18]; however the supporting evidence is inconclusive. Albeit the positive archaeological evidence from the site of Gesher Benot Ya’aqov (henceforth GBY) dated to around 780 kyr [19], [20], [21], the habitual use of fire became widely spread only after 400 kyr [22], [23], [24], [25].

Archaeological evidence seems to associate H. erectus with large and medium-sized game {Namely, Body Size Group A (BSGA Elephant, >1000 kg), BSGB (Hippopotamus, rhinoceros approx. 1000 kg), and BSGC (Giant deer, red deer, boar, bovine, 80–250 kg); (after [26])}, most conspicuously elephants, whose remains are commonly found at Acheulian sites throughout Africa, Asia, and Europe (e.g., [26], [27], [28], [29], [30]). In some instances elephant bones and tusks were also transformed into shaped tools, specifically artifacts reminiscent of the characteristic Acheulian stone handaxes [31].

In Africa, H. sapiens appears around 200 kyr ago, most probably replacing H. erectus and/or H. heidelbergensis [32], [33], [34]. Early African H. sapiens used both handaxes and the sophisticated tool-manufacturing technologies known as the Levallois technique (e.g., [35], [36]) while its sites are devoid of elephants [32], [35]. The presence of elephants in many Acheulian African sites and their absence from later Middle Stone Age sites [29], [37], evoked an overkill hypothesis ([38]:382), which was never convincingly demonstrated. Thus no link was proposed, in the case of Africa, between human evolution and the exclusion of elephants from the human diet, and no evolutionary reasoning was offered for the emergence of H. sapiens in Africa [39].

In Europe, H. erectus was replaced by H. heidelbergensis [40] and later by hominins associated with the Neanderthal evolutionary lineage [41]. In spite of significant cultural changes, such as the adoption of the Levallois technique and the common use of fire, the manufacture and use of handaxes and the association with large game persisted in post-erectus Europe until the demise of the Neandertals, around 30 kyr BP (e.g., [42]). H. sapiens did not evolve in Europe but migrated to it no earlier than 40 kyr BP (e.g., [43]).

In the Levant, dental remains from the Acheulo-Yabrudian site of Qesem Cave, Israel [10], [11] demonstrate resemblance to dental records of later, Middle Paleolithic populations in the region [1] indicating that H. erectus was replaced some 400 kyr ago by a new hominin ancestral to later populations in the Levant. A rich and well-dated (400-200 kyr) archaeological dataset known from the Levant offers a glimpse into this significant process and a better understanding of the circumstances leading to the later emergence of modern humans thus suggesting a possible link between the cultural and biological processes. This dataset pertains to the unique local cultural complex known as the Acheulo-Yabrudian, a diversified and innovative cultural complex (e.g., [8], [44], [45]), which appeared some 400 kyr ago, immediately following the Acheulian cultural complex [10], [11], and which lasted some 200 kyr. Acheulo-Yabrudian sites as well as sites associated with subsequent cultures in the Levant show no elephant remains in their faunal assemblages.


For more than two decades a view dominated anthropological discussions that all modern human variation derived from Africa within a relatively recent chronological framework. Recent years challenged this paradigm with new discoveries from Europe, China, and other localities, as well as by new advances in theory and methodology. These developments are now setting the stage for a new understanding of the human story in general and the emergence of modern humans in particular (e.g., [1], [39], [132], [133], [134], [135], [136], [137], [138], [139], [140], [141], [142], [143], [144], [145], [146]). In this respect, the Qesem hominins may play an important role. Analysis of their dental remains [1] suggests a much deeper time frame between at least some of the ancestral populations and modern humans than that which is assumed by the “Out of Africa” model. This, combined with previous genetic studies (e.g., [147], [148], [149], [150]), lends support to the notion of assimilation (e.g., [144]) between populations migrating “out of Africa” and populations already established in these parts of Eurasia.

It is still premature to indicate whether the Qesem hominin ancestors evolved in Africa prior to 400 kyr [136], developed blade technologies [151], [152], and then migrated to the Levant to establish the new and unique Acheulo-Yabrudian cultural complex; or whether (as may be derived from our model) we face a local, Levantine emergence of a new hominin lineage. (If it’s local, from which species did the “new hominin” evolve? Is this the putative location where H. erectus “became” H. sapiens?) The plethora of hominins in the Levantine Middle Paleolithic fossil record (Qafzeh, Skhul, Zuttiyeh, Tabun) and the fact that the Acheulo-Yabrudian cultural complex has no counterparts in Africa may hint in favor of local cultural and biological developments. This notion gains indirect support from the Denisova finds that raise the possibility that several different hominin groups spread out across Europe and Asia for hundreds of thousands of years, probably contributing to the emergence of modern human populations [153], [154], [155].

It should not come as a surprise that H. erectus and its successors managed, and in fact evolved, to obtain a substantial amount of the densest form of nutritional energy available in nature – fat – to the point that it became an obligatory food source. Animal fat was an essential food source necessary in order to meet the daily energy expenditure of these Pleistocene hominins, especially taking into account their large energy-demanding brains. It should also not come as a surprise that for a predator, the disappearance of a major prey animal may be a significant reason for evolutionary change. The elephant was a uniquely large and fat-rich food-package and therefore a most attractive target during the Levantine Lower Palaeolithic Acheulian. Our calculations show that the elephant’s disappearance from the Levant just before 400 kyr was significant enough an event to have triggered the evolution of a species that was more adept, both physically and mentally, at obtaining dense energy (such as fat) from a higher number of smaller, more evasive animals. The concomitant emergence of a new and innovative cultural complex – the Acheulo-Yabrudian – heralds a new set of behavioral habits including changes in hunting and sharing practices [9], [23], [45] that are relevant to our model.
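The bioenergetic logic the authors invoke can be illustrated with a hedged back-of-envelope sketch. All figures below (edible yields, energy density) are my own illustrative assumptions, not numbers from the paper; the point is only the order of magnitude of extra hunting effort once elephants vanish:

```python
# Hypothetical sketch of the "fat hunter" argument. Edible yields and energy
# density are illustrative assumptions, NOT figures taken from the paper.

ELEPHANT_EDIBLE_KG = 2500   # assumed edible yield of one Elephas antiquus
DEER_EDIBLE_KG = 100        # assumed edible yield of one red deer
KCAL_PER_KG = 1500          # assumed mean energy density of meat plus fat

elephant_kcal = ELEPHANT_EDIBLE_KG * KCAL_PER_KG
deer_kcal = DEER_EDIBLE_KG * KCAL_PER_KG

# How many smaller, faster animals replace one elephant in caloric terms?
deer_per_elephant = elephant_kcal / deer_kcal
print(f"One elephant is roughly {deer_per_elephant:.0f} red deer, calorically")

# Each additional hunt carries its own search-and-pursuit cost, so daily
# energy expenditure (DEE) rises even if total captured calories stay equal.
# Smaller prey are also leaner, so the obligatory fat is even harder to
# replace than the raw calories: the fitness pressure the authors describe.
```

Under these assumptions, dozens of successful small-game hunts stand in for one elephant kill, which is the pressure the paper argues selected for a lighter, more skilled, better-organized hominin.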

Thus, the particular dietary developments and cultural innovations joined together at the end of the Lower Paleolithic period in the Levant, reflecting a link between human biological and cultural/behavioral evolution. If indeed, as we tried to show, the dependence of humans on fat was so fundamental to their existence, it becomes possible, perhaps after some refinement, to apply this proposed bioenergetic model to the understanding of other important developments in human evolutionary history.


John Hawks on NEW Homo naledi research “So Human”

When does Naledi date to? Ask a geologist! A nice demonstration of how one’s point of view (academic discipline) “changes” what you get from the fossils… and of other problems of location, spatial relationships, genetics, and time.


Cranial Deformation / Is it Benign?

This is one of those topics that is ignored, except by “ancient alien” believers. Why? The general opinion is that skull deformation has no effect on brain function. Where did this idea originate? There is abundant information on “medical” conditions such as microcephaly, but very little on intentional post-birth deformation. Could it be due to an early “eugenic” focus on the implications of skull shape? If so, this is unfortunate: perhaps research on ACD (artificial cranial deformation) has been neglected as a result.

It is very odd that the people who expend so much time and energy on describing human skulls for the purpose of explaining human evolution, and those who purport to be the experts on human behavior, apparently aren’t interested in this subject.


A typical dismissal of any consequences to brain function: 

“However, numerous studies have indicated that head binding has only negligible effects on the skull itself and that the inevitable modification of brain shape has no unfortunate side-effects. As long as the volume of the brain is unchanged, its functioning seemingly remains unimpaired.” Jun 13, 2016

A Paracas skull, Peru

Note the implications for “frontal lobe” disasters: If high status “leaders” were impaired by this process, what effect would that have on executive functions? On decision-making abilities; on rational thought; on violent behavior?  

Artificial Cranial Deformation: Potential Implications for Affected Brain Function

Tyler G O’Brien1*, Lauren R Peters1 and Marc E Hines2 1 Department of Sociology, Anthropology and Criminology, University of Northern Iowa, Cedar Falls, USA 2 Department of Neurology, Covenant Hospital, Waterloo, USA

Full PDF: (provides good info on the developing brain)


Abstract The anthropological study of the ancient cross-cultural practice of artificial cranial deformation (ACD), or intentional head modification, allows for the opportunity to assess the effects of functional interactions of the dynamic altered growth and development processes. Intentionally altering the infant skull is produced through mechanical means by attaching a device to the child’s head. Through the application of a deforming apparatus directly to the infant’s head, soon after birth and up to as long as four years, the child’s head becomes permanently altered. The amount of cranial modification and subsequent deformation is dependent upon the extent of time the molding apparatus is applied to the infant’s head. The longer the amount of time applied the greater the resulting stress and subsequent deformation. This paper explores the potential of inhibited cranial development or spatial disorientation and the subsequent effects it may have on adjacent functionally and morphologically related structures, especially as it pertains to brain function. A theoretical analysis is presented because of the practically non-existent data for this ancient practice. However, based on bioarchaeological and neurological analyses of the cranium and brain, it is highly suspected that ACD, in general, would have produced negative results to the lobes and abilities of the individual; such as: influencing vision, object recognition, hearing ability, impairing memory, promoting inattentiveness, inability to concentrate and motor aphasia, contributing to behavior disorders and difficulty in learning new information.

Discussion and Conclusions Since ACD is no longer popularly practiced, only a theory can be generated about what implications these various ACD practices had on the individuals’ mental abilities and functions. Based on similar deforming conditions to the brain, like plagiocephaly, the results of these studies can help in understanding the circumstances of the past. Whether the pressures applied to these areas had harmful, beneficial, or insignificant influences can only be theoretically determined. A closer look at where pressure from the bands was applied on the various lobes of the brain, along with a fair, impartial consideration of the possible implications that occurred, sheds light on some possible effects that ACD practices could have had on these individuals. The practice of annular ACD would affect the frontal and occipital areas of the skull. Pressure to these areas would potentially affect the functions correlated with these areas in the lobes of the brain. Using information and experiences from the modern world that simulate similar states helps to form a hypothesis of whether the implications were harmful or beneficial. Damage to the frontal lobe that has been documented by doctors shows that impairment to memory, inattentiveness, inability to concentrate, behavior disorders, difficulty in learning new information, and motor aphasia take place. Documentation also shows that the pressure from the bands, from both tabular and/or annular ACD, in the occipital region may have influenced vision and ability to recognize objects. Pressure to temporal areas may induce damage that results in changes of worsening hearing ability, agitation, and irritability [29]. In the cultures that practiced ACD, distant in both time and numerous other factors, it is possible that such deformations were of little significance to the society as a whole.
However, it can reasonably be argued that most postnatal deformational techniques do not improve the functional outcomes of the tissues themselves, though in a particular cultural or environmental setting one could argue that some advantage was perceived or obtained. The extent of these effects would have varied among individuals with the duration and intensity of the pressure. A full understanding of the implications of ACD may never be reached, but perhaps future research in the areas of craniosynostosis and plagiocephaly will help to resolve this uncertainty.

much more…

Predator Behavior / H. sapiens compared to H. neanderthalensis

I’ve been thinking about H. sapiens and H. neanderthalensis as predator types who both had to contend with the local competition, whether in Africa, Eurasia, or the far north. This video on Yellowstone demonstrates competition between grizzlies, wolves, coyotes, and carnivorous birds. For each type, it’s not “either-or” scavenger, hunter, or opportunist: all are opportunistic, whether hunting to kill, hunting to drive a competitor off a kill, or grabbing whatever part or leftover can be had. Prey animals such as bison and elk are not passive victims; predators and prey alike make instantaneous, ongoing assessments of risk and reward, with flexible strategies that exploit environmental topography, water, snow, plant type and distribution, forested areas, and seasonal change.

It’s tempting to place H. sapiens and H. neanderthalensis into similar environments. Bear versus wolf is an interesting comparison: bears possess individual size, strength, and speed and compete with each other, while wolves hunt singly, in pairs, and in packs, with the size of prey a major factor. Picking off the young, injured, and old is a major theme for both wolves and bears, as is stealing carcasses and eating rotten ones; bears hibernate in winter, leaving excellent opportunity for wolves to thrive. Otherwise, bears and wolves compete directly for food.

As much time and energy is spent harassing and exhausting opponents and prey as in the kill itself – our concept of swift kills is usually all wrong. Endurance, persistence, luck, and knowing when to quit are vital knowledge, learned from adults and by daily experience. Successful hunting occurs in stages, more like boxing rounds than the swift kills seen in predatory birds.

While thinking about all this, I watched the following video, because the idea of Neanderthal as an apex predator is currently favored. If so, would its behavior be more like that of big cats, in which sheer mass, size, ambush, stealth, and short bursts of speed overwhelm prey, or like that of wolves – persistent wearing down of an isolated target, tactical wounding, stress from the chase and mobbing, with death by exhaustion the ultimate goal? Of course, the advent of projectiles radically changed all this.

I have no overall opinion on the accuracy of the video’s presentation, but I appreciate many points that xxx brings up, especially the massive muscular body of the Neanderthal. I object to the inky black skin, head position, red eyes, and chimp-gorilla face as unnecessarily extreme; body hair – very possible. I do believe that our “obsession with monsters” is a phantom of collective memory in Homo sapiens, as are fears of snakes and spiders and the irrational hatred of wolves. I have suggested that some types of “giants” in myths could be a memory of Neanderthal overlap with Homo sapiens – perhaps even the models for early “gods”.

I think the point here is that we must “re-orient” our analysis to predatory behavior in both H. neanderthalensis and H. sapiens, in competition with other top predators, in various environments.

Two myths need to be challenged: that injuries in Neanderthals were like those of contemporary rodeo athletes (they are not, and in fact are no different from those found in AMH, who were their contemporaries) and that ARCHAIC AMH were “just like us” both physically and mentally. ARCHAIC AMH were more similar to Neanderthals than to Modern Social Humans (neurotypicals). It is nearly impossible to identify many archaic fossil skulls as definitely H. sapiens or H. neanderthalensis.

The Them and Us Theory is no more fanciful than most “official narratives” that downplay the wildness of Archaic AMH, in both appearance and behavior – for me, what the Them and Us Theory points to, yet misses entirely, is that

Modern Social Humans are the “oddity”.

That modern social humans are JUVENILIZED AND DOMESTICATED in relation to both H. neanderthalensis and Archaic H. sapiens.

The present-day “modern human skull” is an “infantile” skull. 

Upper Paleo “Cave Artists” more accurate than Da Vinci

Cavemen Were Better at Depicting Quadruped Walking than Modern Artists: Erroneous Walking Illustrations in the Fine Arts from Prehistory to Today

December 5, 2012 / https://doi.org/10.1371/journal.pone.0049786


Experts in animal locomotion have known the characteristics of quadruped walking since the pioneering work of Eadweard Muybridge in the 1880s. Most quadrupeds advance their legs in the same lateral sequence when walking; only the timing of their supporting feet differs more or less. How did this scientific knowledge influence the correctness of quadruped walking depictions in the fine arts? Did the proportion of erroneous quadruped walking illustrations relative to their total number (i.e., the error rate) decrease after Muybridge? How correctly did cavemen (upper palaeolithic Homo sapiens) illustrate the walking of their quadruped prey in prehistoric times? The aim of this work is to answer these questions. We analyzed 1000 prehistoric and modern artistic depictions of quadruped walking and determined whether each is correct with respect to the limb attitudes presented, assuming that the other aspects of the depiction used to determine the animal’s gait are illustrated correctly. The error rate of modern pre-Muybridgean quadruped walking illustrations was 83.5%, considerably higher than the 73.3% error rate expected by mere chance. It decreased to 57.9% after 1887, that is, in the post-Muybridgean period. Most surprisingly, the prehistoric quadruped walking depictions had the lowest error rate, 46.2%. All these differences were statistically significant. Thus, cavemen were more keenly aware of the slower motion of their prey animals and illustrated quadruped walking more precisely than later artists.
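For readers who want a feel for what “statistically significant” means here, a minimal sketch follows: a standard two-proportion z-test comparing the modern pre-Muybridgean error rate (83.5%) with the prehistoric one (46.2%). The group sizes are my own assumptions purely for illustration – the abstract reports only that 1000 depictions were analyzed in total, not how they split across periods – and this is not necessarily the test the authors used.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference of two proportions, pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Modern pre-Muybridgean (83.5% error) vs prehistoric (46.2% error).
# Group sizes of 500 and 40 are hypothetical, chosen only to show that
# even a small prehistoric sample yields a clearly significant difference.
z = two_proportion_z(0.835, 500, 0.462, 40)
print(z > 1.96)  # significant at the 5% level for any split this lopsided
```

Even with only a few dozen cave depictions assumed, the gap between 83.5% and 46.2% is far too large to be chance, which is consistent with the paper’s claim of significance.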

Figure 4. An erroneous modern, pre-Muybridgean horse drawing of Leonardo da Vinci (http://www.davincisketches.com).
(A, B) The erroneous horse drawing fits into the cell Eh of the walking matrix. (A) Picture of the graphic art. (B) Schematic drawing of the horse. (C, D) Two possible corrections of the horse: C keeps the postures of the hind legs and corrects the attitudes of the fore legs, thus falls into the cell Gh of the walking matrix. D, keeping the postures of the fore legs and correcting the attitudes of the hind legs, belongs to the cell Ee of the walking matrix.
Full article:  

Biology of Language / What’s up?




Biological origins of language remain a mystery

Posted June 3, 2014

Archaeological evidence suggests that spoken or signed language is a relatively recent phenomenon that emerged soon after our divergence from Neanderthals. The gathered data show that the life of early Homo sapiens was replete with external symbolic representations. However, such findings provide little to no insight into when the relevant semantic and syntactic features evolved, or what pressures were responsible for their emergence.

The biological evolution of language is peculiar in that humans are the only species capable of such a developed form of communication. Most species can communicate in one way or another, and such communication can be compared to that of related species and thereby explained in evolutionary terms.

An example of successful evolutionary analysis of intraspecies communication is the case of túngara frogs. Most male frogs whine during sexual advertisement; túngara frogs are peculiar in that their males add “chucks” to their whining. How did they come to do this?

To answer this question, one must compare sister species of the túngara frog and determine which differences could have given rise to chucks in their communication. It turns out that females of the sister species lack the auditory mechanisms needed to recognize the chucks made by male túngaras. This, in turn, helps explain why túngara males evolved a larynx that allows them to produce chucks.

But evolutionary analysis of this kind is mostly irrelevant when investigating the origins of language. Homo sapiens has no sister species capable of anything even remotely similar to human language. This may sound like a bold or even arrogant statement, but the empirical data suggest that no other species, including nonhuman primates, is able to perform the high-level computations required for language processing.

A language consists of abstract information units that are organized and combined according to specific computational procedures. Such computation makes it possible to resolve the many ambiguities that are prevalent in any natural language. As an example, the word “unlockable” is ambiguous: it can be understood both as something that cannot be locked ([un-[lock-able]]) and as something that can be unlocked ([[un-lock]-able]). The order in which the word is processed, which is grammatically determined, lets us distinguish between phonetically identical words according to context.
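The “unlockable” point can be made concrete with a toy sketch of my own (not from the article): the two bracketings represented as nested structures, with a tiny recursive gloss function showing that the same three morphemes yield two different meanings depending on structure alone.

```python
# Hypothetical mini-lexicon for the three morphemes of "unlockable".
GLOSSES = {"un": "not/reverse", "lock": "lock", "able": "able-to-be"}

def gloss(node):
    """Recursively render a morphological parse tree as a bracketed gloss."""
    if isinstance(node, str):
        return GLOSSES[node]
    left, right = node
    return f"[{gloss(left)} {gloss(right)}]"

parse_a = ("un", ("lock", "able"))   # [un-[lock-able]]: cannot be locked
parse_b = (("un", "lock"), "able")   # [[un-lock]-able]: can be unlocked

print(gloss(parse_a))  # [not/reverse [lock able-to-be]]
print(gloss(parse_b))  # [[not/reverse lock] able-to-be]
```

The phonetic string is identical in both cases; only the hierarchical combination, not the linear order of sounds, distinguishes the two meanings – which is exactly the kind of computation the article says other species do not perform.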

Another example of the high-level computational capacity needed for natural language processing comes from research on language acquisition. It has been observed that a child first makes use of incorrect but biologically possible grammars, later narrowing them down to the target grammar of the mother tongue. For example, many kids omit the subject and say “tickles me” instead of “he tickles me”.

The former is ungrammatical in English, though fully consistent with grammars such as Mandarin’s. As the child acquires more grammatical rules, forms like this are gradually eliminated in favor of forms that are grammatical in English. This suggests that children are endowed with a capacity to acquire a wide range of possible grammars, which are then selected among by the linguistic data in their specific environment.

In comparative studies, humans are often compared to songbirds or nonhuman primates. Songbird communication is a highly specialized and intricate system, but it is also finite and tied to a single (acoustic) sensory channel. Human language, on the other hand, has recursively definable syntax with an infinite number of possible combinations, as well as gestural, symbolic, and other forms of visual representation. What is more, when song syllables are combined into longer structures, the new combinations have little or no impact on the meaning of the song, whereas the semantics of human language depends heavily on the way sentences are formed.

(See post on Piraha language) 

Comparison of human and nonhuman primate communication does not provide much insight either. Even though our vocal apparatus is nearly identical to that of a chimpanzee, there are no parallels between human language and chimpanzee communication. For example, young primates do not acquire their communicative capabilities the way we do: they do not babble, they do not learn vocally, and they do not induce new syntactic structures from the ones they already know.

Given the data above, researchers conclude that there is no empirical basis to suppose that a nonhuman animal form of communication served as a precursor to the modern human form. The evidence from comparative animal behavior provides little insight into how our language phenotype evolved.

Another method prevalent in evolutionary analysis is modelling. In language evolution research it is common to model a population of individuals communicating by means of their particular languages; some kind of fitness measure is introduced, which differentially affects the following generations of the modelled individuals. However, the vast majority of modelling efforts assume an already existing language phenotype, thus shedding no light on how it emerged in the first place.

Since our pre-linguistic past is not available for observation, the only reliable method is to observe current changes in languages and test assumptions according to what is called the uniformitarian principle of historical science: if something played a role in evolution, the force of that phenomenon should be observable in present-day processes as well.

Among the assumptions to be tested by means of evolutionary modelling is the fitness metric: what makes one form of language better than another in terms of the survival of a species. A leading proposal is to identify language fitness with communicative success: if individuals can communicate better and learn languages more efficiently, then they have greater reproductive success. However, even here the evidence is perplexing. For example, the final consonant in the word “walked” is usually omitted, making it difficult to distinguish from the present tense “walk”.

However, it is easier to recognize “walked” as a past participle because of its syntactic context (it is usually accompanied by the words “have” or “by”), whereas the simple past “walked” is phonetically indistinguishable from “walk”. If the communication hypothesis were true, language evolution should favor easily discernible forms over ambiguous ones, yet the deletion rates for the past participle and for the past tense do not differ in the relevant contexts.

Therefore, modelling, just like comparative analysis, shows how limited our methods for analyzing language evolution are, and how poor our knowledge is of our most distinguishing feature.

Original research article: Hauser MD, Yang C, Berwick RC, Tattersall I, Ryan MJ, Watumull J, Chomsky N and Lewontin RC (2014) The mystery of language evolution. Front. Psychol. 5:401. doi: 10.3389/fpsyg.2014.00401

Let’s get REAL:

Warning: kill shot included.

As usual, I think that what is missing from the modern social-typical discussion of language is specific comprehension and assessment of early human survival challenges – an inability to “put oneself” into natural environments.

Hunting was a drastic change in lifestyle from being a “prey” animal. Prey behave and communicate both among their own kind and within “predator-prey” match-ups. Early humans would have “copied” the language of both solitary predators and pack hunters, along with their successful strategies. There would have been a “lifestyle” basis for language: silence, gestures, eye movement, and body language are all evident in this video. Especially important is SILENCE – knowing when to shut up. It also explains male behavior vis-à-vis language and emotion.

The incessant vocalizing of present-day neurotypicals would have been disastrous; language is need- and application-based – that is, pragmatic and oriented to survival in specific environments.

Modern social communication regimes reflect the modern social hierarchy: INTRASPECIES predation by “top males” on the mass of humanity, or “human prey”.

Females and language? That’s another set of lifestyle stories…