Evolutionary Hypotheses for Human Childhood / BARRY BOGIN


Evolutionary Hypotheses for Human Childhood
BARRY BOGIN Department of Behavioral Sciences, University of Michigan-Dearborn, Dearborn, Michigan 48128

This is an intriguing and very readable paper that delineates clear differences in thought regarding the evolution of Homo sapiens. Bogin rejects non-reproduction-based explanations (welcome surprise!) for the “appearance” of extended infancy, childhood and juvenile stages of development. He rejects neoteny or heterochrony as sole mechanisms to account for unique aspects of human development; however, some of his claims are simply wrong regarding human uniqueness – and, within his hypothesis, there is nothing to rule out heterochrony and neoteny as critical mechanisms behind the addition or extension of the human infant-child-juvenile sequence of development.

My emphasis in this blog is on sexual selection for tameness / domestication due to the shift to agriculture from hunting, foraging, scavenging and gathering (nomadism). The inevitable sedentary-urban lifestyle, with its new dependence on less nutritious food and the increased labor required for food production, necessitated adaptations that we see in the neotenic modern social type that dominates today. Many steps involving differential evolution of brain function, reproduction, behavior, culture and physiology have likely taken place to produce Homo sapiens sapiens (neurotypical humans).

I’d like to comment point by point, but this is a long essay…  


Evolutionary Hypotheses for Human Childhood (1997)

BARRY BOGIN Department of Behavioral Sciences, University of Michigan-Dearborn, Dearborn, Michigan


The origins of human childhood have fascinated scholars from many disciplines. Some researchers argue that childhood, and many other human characteristics, evolved by heterochrony, an evolutionary process that alters the timing of growth stages from ancestors to their descendants. Other scholars argue against heterochrony, but so far have not offered a well-developed alternative hypothesis. This essay presents such an alternative.

Childhood is defined as a unique developmental stage of humans. Childhood is the period following infancy, when the youngster is weaned from nursing but still depends on older people for feeding and protection. The biological constraints of childhood, which include an immature dentition, a small digestive system, and a calorie-demanding brain that is both relatively large and growing rapidly, necessitate the care and feeding that older individuals must provide.

Evidence is presented that childhood evolved as a new stage in hominid life history, first appearing, perhaps, during the time of Homo habilis. The value of childhood is often ascribed to learning many aspects of human culture. It is certainly true that childhood provides ‘‘extra’’ time for brain development and learning. However, the initial selective value of childhood may be more closely related to parental strategies to increase reproductive success. Childhood allows a woman to give birth to new offspring and provide care for existing dependent young. Understanding the nature of childhood helps to explain why humans have lengthy development and low fertility, but greater reproductive success than any other species. Yrbk Phys Anthropol 40:63–89, 1997. © 1997 Wiley-Liss, Inc.

Childhood fascinates scholars and practitioners from many disciplines. Virtually all human cultures recognize a time of life that may be called ‘‘childhood.’’ Many historical sources from Egyptian times to the 19th century, including Wordsworth in the poem above, mention that ‘‘childhood’’ occupies the first 6 to 7 years of life (Boyd, 1980). Some explanations for the origins and functions of childhood have been proposed, but none of these is accepted universally. Perhaps the lack of agreement is due to the nature of human evolutionary biology.

Much, much more…..


Orangutans hanging out, from James Tan.

There are a lot of conflicting claims about “childhood” and other aspects of comparative primate development…… The orangutan has the longest childhood dependence on the mother of any animal in the world, because there is so much for a young orangutan to learn in order to survive. The babies nurse until they are about six years of age….Orangutan females only give birth about once every 8 years – the longest time between births of any mammal on earth. (This results in only 4 to 5 babies in her lifetime.) This is why orangutan populations are very slow to recover from disturbance.

from http://primatefacts.tumblr.com


Self-mythologizing / Homo sapiens NT strikes again

Every once in a while, I like to check in with neurotypical “pop science” versions of WHO WE ARE – narcissism knows no limits.

From SLATE.com

Science / The state of the universe. (Not too pompous!)
Jan. 29 2013

Why Are We the Last Apes Standing?

There’s a misconception among a lot of us Homo sapiens that we and our direct ancestors are the only humans ever to have walked the planet. It turns out that the emergence of our kind isn’t nearly that simple. The whole story of human evolution is messy, and the more we look into the matter, the messier it becomes.


Before we go into this “messy” NT mythology, a word about the author. His website is www.chipwalter.com


At last you have made your way to the website of Chip Walter. (Try to control your excitement.) If you’re a curious person – and your discovery of this site attests that you are – then you’ve arrived at the right place. Go ahead, browse…

Chip is a journalist, author, filmmaker and former CNN Bureau Chief. He has written four books, all of them, one way or another, explorations of human creativity, human nature and human curiosity. (That should be a warning: shameless BS to follow)


Paleoanthropologists have discovered as many as 27 different human species (the experts tend to debate where to draw the line between groups). These hominids diverged after our lineage split from a common ancestor we shared with chimpanzees 7 million years ago, give or take a few hundred millennia.

Many of these species crossed paths, competed, and mated. Populations ebbed and flowed in tight little tribes, at first on the expanding savannahs of Africa, later throughout Europe, Asia, and all the way to Indonesia. Just 100,000 years ago, there were several human species sharing the planet, possibly more: Neanderthals in Europe and West Asia, the mysterious Denisovan people of Siberia, the recently discovered Red Deer Cave people living in southern China, Homo floresiensis (the Hobbits of Indonesia), and other yet unknown descendants of Homo erectus who left indications that they were around (the DNA of specialized body lice, to be specific). And, of course, there was our kind, Homo sapiens sapiens (the wise, wise ones), still living in Africa, not yet having departed the mother continent. At most, each species consisted of a few tens of thousands of people hanging on by their battered fingernails. Somehow, out of all of these struggles, our particular brand of human emerged as the sole survivor and then went on, rather rapidly, to materially rearrange the world.

If there once were so many other human species wandering the planet, why are we alone still standing? After all, couldn’t another version or two have survived and coexisted with us on a world as large as ours? Lions and tigers coexist; so do jaguars and cheetahs. Gorillas, orangutans, bonobos, and chimpanzees do as well (though barely). Two kinds of elephants and multiple versions of dolphins, sharks, bears, birds, and beetles—countless beetles—inhabit the planet. Yet only one kind of human? Why?

More than once, one variety may have done in another either by murdering its rivals outright or outcompeting them for limited resources. But the answer isn’t as simple or dramatic as a war of extermination with one species turning on the other in some prehistoric version of Planet of the Apes. The reason we are still here to ruminate on why we are still here is because, of all those other human species, only we evolved a long childhood.

Over the course of the past 1.5 million years, the forces of evolution inserted an extra six years between infancy and pre-adolescence—a childhood—into the life of our species. And that changed everything.

Why should adding a childhood help us escape extinction’s pitiless scythe? Looked at logically, it shouldn’t. All it would seem to do is lengthen the time between birth and mating, which would slow down the clamoring business of the species’ own continuance. But there was one game-changing side effect of a long childhood. Those six years of life between ages 1 and 7 are the time when we lay the groundwork for the people we grow up to become. Without childhood you and I would never have the opportunity to step away from the dictates of our genes and develop the talents, quirks, and foibles that make us all the devastatingly charming, adaptable, and distinctive individuals we are.

Childhood came into existence as the result of a peculiar evolutionary phenomenon known generally as neoteny. (More about this sweeping misinterpretation later) The term comes from two Greek words, neos meaning “new” (in the sense of “juvenile”) and teinein meaning to “extend,” and it means the retention of youthful traits. In the case of humans, it meant that our ancestors passed along to us a way to stretch youth farther into life.

More than a million years ago, our direct ancestors found themselves in a real evolutionary pickle. On the one hand, their brains were growing larger than those of their rain forest cousins, and on the other, they had taken to walking upright because they spent most of their time in Africa’s expanding savannas. Both features would seem to have substantially increased the likelihood of their survival, and they did, except for one problem: Standing upright favors the evolution of narrow hips and therefore narrows the birth canal. And that made bringing larger-headed infants to full term before birth increasingly difficult.

If we were born as physically mature as, say, an infant gorilla, our mothers would be forced to carry us for 20 months! But if they did carry us that long, our larger heads wouldn’t make it through the birth canal. We would be, literally, unbearable. The solution: Our forerunners, as their brains expanded, began to arrive in the world sooner, essentially as fetuses, far less developed than other newborn primates, and considerably more helpless.

The Dutch anatomist Louis Bolk enumerated 25 specific fetal or juvenile features that disappear entirely in apes as they grow to adulthood but persist in humans. Flatter faces and high foreheads, for example, and a lack of body hair. The shape of our ears, the absence of large brow ridges over our eyes, a skull that sits facing forward on our necks, a straight rather than thumblike big toe, and the large size of our heads compared with the rest of our bodies. You can find every one of these traits in fetal, infant, or toddling apes, and in modern human adults.

In the nasty and brutish prehistoric world our ancestors inhabited, arriving prematurely could have been a very bad thing. But to see the advantages of being born helpless and fetal, all you have to do is watch a 2-year-old. Human children are the most voracious learners planet Earth has ever seen, and they are that way because their brains are still rapidly developing after birth. Neoteny, and the childhood it spawned, not only extended the time during which we grow up but ensured that we spent it developing not inside the safety of the womb but outside in the wide, convoluted, and unpredictable world.

The same neuronal networks that in other animals are largely set before or shortly after birth remain open and flexible in us. Other primates also exhibit “sensitive periods” for learning as their brains develop, but they pass quickly, and their brain circuitry is mostly established by their first birthday, leaving them far less touched by the experiences of their youth.

The major problem with all this NT self-congratulatory aggrandizement is this: the equally possible scenario that this “open, externalized brain development” leaves human fetuses, infants and children highly vulnerable to disastrous consequences: death in infancy from neglect, disease and predation; maternal death; brain and nervous system damage due to not-so-healthy human environments; insufficient care and nutrition during critical post-birth growth; plus the usual demands and perils of nature. And in “modern” societies, it necessitates a tremendous amount of medical-technological intervention in problem pregnancies: extreme premature birth, caesarean section delivery, long periods of ICU support, and a growing incidence of life-long impairment.

“Inattentional Blindness” to any negative consequences of human evolution is a true failure in NT perception of the human condition.

Based on the current fossil evidence, this was true to a lesser extent of the 26 other savanna apes and humans. Homo habilis, H. ergaster, H. erectus, even H. heidelbergensis (which is likely the common ancestor of Neanderthals, Denisovans, and us), all had prolonged childhoods compared with chimpanzees and gorillas, but none as long as ours. In fact, Harvard paleoanthropologist Tanya Smith and her colleagues have found that Neanderthals reversed the trend. By the time they met their end around 30,000 years ago, they were reaching childbearing age at about the age of 11 or 12, which is three to five years earlier than their Homo sapiens cousins. Was this in response to evolutionary pressure to accelerate childbearing to replenish the dwindling species? Maybe. But in the bargain, they traded away the flexibility that childhood delivers, and that may have ultimately led to their demise.

Aye, yai, yai! This string of NT echolalia, copied and pieced together from pop-science interpretations of “science projects” is worthy of Biblical mythology… a montage, a disordered mosaic; a collage of key words, that condenses millions of years of evolutionary change into a “slightly longer” (call it 6 million years instead of 6 thousand – sounds more scientific) – history of Creation… this is for neurotypical consumption: It’s okay… Evolution is really just magic, after all! 

We are different. During those six critical years, our brains furiously wire and rewire themselves, capturing experience, encoding and applying it to the needs of our particular life. Our extended childhood essentially enables our brains to better match our experience and environment. (Whatever that is supposed to mean – like wearing Bermuda shorts to the beach?) It is the foundation of the thing we call our personalities, the attributes that make you you and me me. Without it, you would be far more similar to everyone else, far less quirky and creative and less, well … you. Our childhood also helps explain how chimpanzees, remarkable as they are, can have 99 percent of our DNA but nothing like the same level of diversity, complexity, or inventiveness.

You are creative and quirky (dull and conformist) – and even if that’s a shameless lie (it is), AT LEAST you’re smarter than a chimpanzee!  

Our long childhood has allowed us to collectively engage in ever broadening conversations as we keep finding new ways to communicate; we jabber and bristle with invention and pool together waves of fresh ideas, good and bad, into that elaborate, rambling edifice we call human civilization. Without all of this variety, all of these interlocked notions and accomplishments, the world, for better or worse, would not be as it is, brimming with this species of self-aware conflicted apes, ingenious enough to rocket rovers off to Mars and construct the Internet, wage wars on international scales, invent both WMDs and symphonies. If not for our long childhoods, we would not be here at all, the last apes standing. Can we remain standing? Possibly. I’m counting on the child in us, the part that loves to meander and play, go down blind alleys, wonder why and fancy the impossible.

How shockingly stupid (and awful) writing. 


Thoughts on Ancient Males / Life in the flesh

In the ancient world a common greeting among travelers was, “Which gods do you worship?” Deities were compared, traded, and adopted in recognition that strangers had something of value to offer. Along with the accretion of ancestor gods into extensive pantheons, an exchange of earthly ideas and useful articles took place. Pantheons were insurance providers who covered women, children, tradesmen, sailors and warriors – no matter how dangerous or risky their occupations; no matter how lowly. Multiple gods meant that everyone had a sympathetic listener, one that might increase a person’s chances for a favorable outcome to life’s ventures, large and small.



A curious female: The goddess Athena is incomprehensible to modern humans; and yet for the ancient Greeks, she was the cornerstone of civilization. Here she models the Trojan horse for the “clever” takedown of Troy.




 In The Iliad

…the gods are manifestations of physical states; the rush of adrenalin, sexual arousal, and rage. For the Homeric male, these are the gods that must be obeyed. There is no power by which a man can override the impulse-to-action of these god forces. The gifts of the notorious killer Achilles originate in the divine sphere, but he is human like his comrades; consumed by self-pity and emotionally erratic.

In Ancient Greek culture, consequences accompanied individual gifts. Achilles must choose an average life (adulthood) and obscurity, or death at Troy and an immortal name. Achilles sulks like a boy, but we know that he will submit to his fate, because fate is the body, and no matter how extraordinary that body is, the body must die. Immortality for Homeric Greeks did not mean supernatural avoidance of death. To live forever meant that one’s name and deeds were preserved by the attention and skill of the poet. In Ancient Greek culture it was the artist who had the power to confer immortality.

There was no apology for violence in Homeric time. The work of men was grim adventure. Raids on neighbors and distant places for slave women, for horses and gold, for anything of value, were a man’s occupation. The Iliad is packed with unrelenting gore, and yet we continue to this day to be mesmerized by men who hack each other to death. Mundane questions arise: were these Bronze Age individuals afflicted with post-traumatic stress disorder? How could women and children, as well as warriors, not be traumatized by a life of episodic brutality? If they were severely damaged mentally and emotionally, how did they create a legacy of poetry, art, science and philosophy? Did these human beings inhabit a mind space that deflected trauma as if it were a rain shower? Was their literal perception of reality a type of protection?


Women will forever be drawn to the essential physicality of Homeric man. He is the original sexual male; the man whose qualities can be witnessed in the flesh. His body was a true product of nature and habit. Disfiguring scars proved his value in battle. Robust genes may have been his only participation in fatherhood.

Time and culture have produced another type of man, a supernatural creature with no marked talent, one who can offer general, but not specific, loyalty. Domestic man, propertied man, unbearably dull man, emotionally-retarded man. In his company a woman shrivels to her aptitude for patience and endurance, for heating dinner in the microwave and folding laundry. Her fate is a life of starvation.


Noble Penelope reduced to a neurotypical nag.

Homo erectus in Middle East / Emergence of “Fat Hunters”

New ideas on Homo erectus and an evolutionary shift to “a new hominin lineage” in the Middle East. 

Go to original paper for details and much more…

See also an interesting commentary on H. erectus by John Hawks




Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant

  • Miki Ben-Dor, Avi Gopher, Israel Hershkovitz, Ran Barkai
  • Published: December 9, 2011


It is our contention that two distinct elements combined in the Levant to propel the evolutionary process of replacing H. erectus by a new hominin lineage ([1], As the classification of varieties of the genus Homo is problematic, we refrain in this paper from any taxonomic designations that would indicate species or subspecies affiliation for the hominins of Qesem Cave. (Thank-you!) 

The Qesem Cave hominin, based on the analysis of teeth shares dental characteristics with the Skhul/Qafzeh Middle Paleolithic populations and to some extent also with Neandertals). One was the disappearance of the elephant (Elephas antiquus) – an ideal food-package in terms of fat and protein content throughout the year – which was until then a main calorie contributor to the diet of the H. erectus in the Levant. The second was the continuous necessity of H. erectus to consume animal fat as part of their diet, especially when taking into account their large brains [2]. The need to consume animal fat is the result of the physiological ceiling on the consumption of protein and plant foods. The obligatory nature of animal fat consumption turned the alleged large prey preference [3], [4] of H. erectus into a large prey dependence. Daily energy expenditure (DEE) of the hominins would have increased when very large animals such as the elephant had diminished and a larger number of smaller, faster animals had to be captured to provide the same amount of calories and required fat. This fitness pressure would have been considerably more acute during the dry seasons that prevail in the Levant. Such an eventuality, we suggest, led to the evolution of a better equipped species, in comparison with H. erectus, that also had a lighter body [5], a greater lower limb to weight ratio ([6]:194), and improved levels of knowledge, skill, and coordination ([7]:63) allowing it to better handle the hunting of an increased number of smaller animals and most probably also develop a new supporting social organization. (Chicken or egg? Did the environmental change “promote” a newer, leaner, more coordinated version of Homo erectus, or did a “new hominin” move in from elsewhere?)

We also suggest that this evolutionary process was related to the appearance of a new and innovative local cultural complex – the Levantine Acheulo-Yabrudian [8], [9]. Moreover, a recent study of dental remains from the Acheulo-Yabrudian site of Qesem Cave in Israel dated to 400-200 kyr ago [10], [11] has indicated that the hominins inhabiting the cave were not H. erectus but were rather most similar to later populations (e.g., Skhul/Qafzeh) of this region ([1] and references therein).

The Broader Context

Our direct ancestor, H. erectus, was equipped with a thick and large skull, a large brain (900 cc on average), impressive brow ridges and a strong and heavy body, heavier than that of its H. sapiens successor (e.g., [12], [13], [14]). Inhabiting the old world for some 1.5 million years, H. erectus is commonly associated with the Acheulian cultural complex, which is characterized by the production of large flakes and handaxes – large tools shaped by bifacial flaking. Handaxes are interpreted as tools associated with the butchering of large game (e.g., [15], [16]). H. erectus was also suggested in recent years to have used fire [17], [18]; however the supporting evidence is inconclusive. Albeit the positive archaeological evidence from the site of Gesher Benot Ya’aqov (henceforth GBY) dated to around 780 kyr [19], [20], [21], the habitual use of fire became widely spread only after 400 kyr [22], [23], [24], [25].

Archaeological evidence seems to associate H. erectus with large and medium-sized game {Namely, Body Size Group A (BSGA Elephant, >1000 kg), BSGB (Hippopotamus, rhinoceros approx. 1000 kg), and BSGC (Giant deer, red deer, boar, bovine, 80–250 kg); (after [26])}, most conspicuously elephants, whose remains are commonly found at Acheulian sites throughout Africa, Asia, and Europe (e.g., [26], [27], [28], [29], [30]). In some instances elephant bones and tusks were also transformed into shaped tools, specifically artifacts reminiscent of the characteristic Acheulian stone handaxes [31].

In Africa, H. sapiens appears around 200 kyr ago, most probably replacing H. erectus and/or H. heidelbergensis [32], [33], [34]. Early African H. sapiens used both handaxes and the sophisticated tool-manufacturing technologies known as the Levallois technique (e.g., [35], [36]) while its sites are devoid of elephants [32], [35]. The presence of elephants in many Acheulian African sites and their absence from later Middle Stone Age sites [29], [37], evoked an overkill hypothesis ([38]:382), which was never convincingly demonstrated. Thus no link was proposed, in the case of Africa, between human evolution and the exclusion of elephants from the human diet, and no evolutionary reasoning was offered for the emergence of H. sapiens in Africa [39].

In Europe, H. erectus was replaced by H. heidelbergensis [40] and later by hominins associated with the Neanderthal evolutionary lineage [41]. In spite of significant cultural changes, such as the adoption of the Levallois technique and the common use of fire, the manufacture and use of handaxes and the association with large game persisted in post-erectus Europe until the demise of the Neandertals, around 30 kyr BP (e.g., [42]). H. sapiens did not evolve in Europe but migrated to it no earlier than 40 kyr BP (e.g., [43]).

In the Levant, dental remains from the Acheulo-Yabrudian site of Qesem Cave, Israel [10], [11] demonstrate resemblance to dental records of later, Middle Paleolithic populations in the region [1] indicating that H. erectus was replaced some 400 kyr ago by a new hominin ancestral to later populations in the Levant. A rich and well-dated (400-200 kyr) archaeological dataset known from the Levant offers a glimpse into this significant process and a better understanding of the circumstances leading to the later emergence of modern humans thus suggesting a possible link between the cultural and biological processes. This dataset pertains to the unique local cultural complex known as the Acheulo-Yabrudian, a diversified and innovative cultural complex (e.g., [8], [44], [45]), which appeared some 400 kyr ago, immediately following the Acheulian cultural complex [10], [11], and which lasted some 200 kyr. Acheulo-Yabrudian sites as well as sites associated with subsequent cultures in the Levant show no elephant remains in their faunal assemblages.


For more than two decades a view dominated anthropological discussions that all modern human variation derived from Africa within a relatively recent chronological framework. Recent years challenged this paradigm with new discoveries from Europe, China, and other localities, as well as by new advances in theory and methodology. These developments are now setting the stage for a new understanding of the human story in general and the emergence of modern humans in particular (e.g., [1], [39], [132], [133], [134], [135], [136], [137], [138], [139], [140], [141], [142], [143], [144], [145], [146]). In this respect, the Qesem hominins may play an important role. Analysis of their dental remains [1] suggests a much deeper time frame between at least some of the ancestral populations and modern humans than that which is assumed by the “Out of Africa” model. This, combined with previous genetic studies (e.g., [147], [148], [149], [150]), lends support to the notion of assimilation (e.g., [144]) between populations migrating “out of Africa” and populations already established in these parts of Eurasia.

It is still premature to indicate whether the Qesem hominin ancestors evolved in Africa prior to 400 kyr [136], developed blade technologies [151], [152], and then migrated to the Levant to establish the new and unique Acheulo-Yabrudian cultural complex; or whether (as may be derived from our model) we face a local, Levantine emergence of a new hominin lineage. (If it’s local, from which species did the “new hominin” evolve? Is this the putative location where H. erectus “became” H. sapiens?) The plethora of hominins in the Levantine Middle Paleolithic fossil record (Qafzeh, Skhul, Zuttiyeh, Tabun) and the fact that the Acheulo-Yabrudian cultural complex has no counterparts in Africa may hint in favor of local cultural and biological developments. This notion gains indirect support from the Denisova finds that raise the possibility that several different hominin groups spread out across Europe and Asia for hundreds of thousands of years, probably contributing to the emergence of modern human populations [153], [154], [155].

It should not come as a surprise that H. erectus, and its successors managed, and in fact evolved, to obtain a substantial amount of the densest form of nutritional energy available in nature – fat – to the point that it became an obligatory food source. Animal fat was an essential food source necessary in order to meet the daily energy expenditure of these Pleistocene hominins, especially taking into account their large energy-demanding brains. It should also not come as a surprise that for a predator, the disappearance of a major prey animal may be a significant reason for evolutionary change. The elephant was a uniquely large and fat-rich food-package and therefore a most attractive target during the Levantine Lower Palaeolithic Acheulian. Our calculations show that the elephant’s disappearance from the Levant just before 400 kyr was significant enough an event to have triggered the evolution of a species that was more adept, both physically and mentally, to obtain dense energy (such as fat) from a higher number of smaller, more evasive animals. The concomitant emergence of a new and innovative cultural complex – the Acheulo-Yabrudian, heralds a new set of behavioral habits including changes in hunting and sharing practices [9], [23], [45] that are relevant to our model.

Thus, the particular dietary developments and cultural innovations joined together at the end of the Lower Paleolithic period in the Levant, reflecting a link between human biological and cultural/behavioral evolution. If indeed, as we tried to show, the dependence of humans on fat was so fundamental to their existence, the application is made possible, perhaps after some refinement, of this proposed bioenergetic model to the understanding of other important developments in human evolutionary history.


Hibernation / Science and Personal Fantasy

For all but approximately 3 years of my life I’ve lived in “bad winter” locations, and I like winter; it’s the length of the season that gets to me. So I fantasize about the ability to hibernate for a month or two. February is daunting, especially in this part of Wyoming, where the year’s snowfall tends to be delayed until March, April and into May. In other wintry places, one may think of February as the last step toward spring; for us, it’s a dead zone that means 3 more months of winter to go. I’ve also considered the notion that archaic humans, Neanderthals especially, may have entered states of “altered metabolism” during the worst climate / weather periods. Perhaps we call it depression today; perhaps mania, too, was a “normal” state of intense activity in warm periods. Bipolar disorder has been investigated as a disruption to circadian rhythm cycles.

And, getting enough quality sleep is a “modern social problem” that seriously affects health and performance. 


Why Does Hibernating Make Animals Tired?

Hibernation tires animals out because it may be more like wakefulness than previously thought.


Matteo Cerri is a hibernation researcher at the University of Bologna, Italy. He is currently consulting for the European Space Agency about ways to make humans hibernate during long space missions.

Hibernating mammals are able to actively suppress their metabolism, meaning they can tell their body to use less energy. Hibernation is a marvelous physiological and molecular event, and it’s still a mystery how the behavior is activated and regulated. One of the most curious mysteries about hibernation that I and my fellow hibernation researchers are trying to answer is why hibernating animals are so tired when they wake up.

There are several types of hibernation, which can last an entire season or just a part of a day (this is called “torpor”), and can even happen when the ambient temperature is high (which is called “aestivation”).

For sure, the brain plays a key role in starting the entire chain of events, but how and which part is still unknown. Among the many unexpected facets of hibernation, one is incredibly surprising.

Hibernation is traditionally seen as a “big sleep,” a way for animals to stave off winter when no food is around. But it’s actually not. Hibernation is a state characterized by the active inhibition of metabolism, and in this state the activity of the brain differs substantially from sleep and may in fact be closer to wakefulness than many people realize. Hibernators are known to wake up periodically from their “cold sleep,” and most people would think “it’s to eat, of course!”

But that is not the case. Hibernators don’t eat during hibernation season (and, for what it’s worth, they also don’t drink or produce any urine). So, why are they waking up? To check out the weather?

Electroencephalographic recordings of the brain of hibernators give a surprising answer: They wake up to sleep. And it’s not like they shift from hibernating to a nap. These animals wake up and pass out like they’re exhausted. Delta brainwave readings, which can be used to measure the deepness or intensity of sleep, show that animals that have just woken up from hibernation are indeed sleeping intensely.

This observation has been confirmed both in seasonal hibernators, such as golden-mantled ground squirrels and European ground squirrels, and in animals that perform torpor, such as the Djungarian hamster. Why this is the case is the subject of great debate among hibernation researchers, and it matters because my team and others around the world are working on research that could lead to the possibility of human hibernation. We’d like to know as much about the process as possible.

There are two main hypotheses. The first suggests that sleep serves such a deep and vital role for the brain that the brain itself has to command the body out of hibernation to recover the sleep it lost during hibernation. In fact, the idea that hibernation is more similar to wakefulness than it is to sleep is the subject of a recent study conducted by me and some of my colleagues at the University of Bologna in Italy.

This hypothesis has been tested with an interesting experiment. If a scientist deprives a hibernator of this “recovery sleep” for a few hours after it wakes up, then, if that sleep really is so important, it should be recovered at the end of the deprivation period: the animal should make up the lost time when it finally does fall asleep, sleeping for the same total length of time as hibernating animals that weren’t deprived of sleep immediately after waking. In other words, if the animal had a sleep debt, that debt would have to be paid, sooner or later.

The second hypothesis takes a different view of the whole process. Brain activity is strongly affected by hibernation, and the brain itself goes through some intense changes during hibernation. For instance, during hibernation, there is a process of disconnection of neurons. Many synapses are in fact reabsorbed by the brain in what is very similar to a transitory state of Alzheimer’s disease. This disconnection is quickly reversed after an animal wakes up, rewiring the brain in the same way it was before, which brings back all the information that was stored in the neurons.

During the reconnection process, which happens in the first few hours after an animal comes out of hibernation, the brain is in a highly plastic state. Therefore, it’s thought that the EEG activity that we see during these stages is not real “sleep,” but just a nonspecific pattern of neuronal reactivation. If this is the case, in the experiment described before, we should not see any recovery sleep after the sleep deprivation, which would suggest there’s no sleep debt in the first place. In other words, hibernation wouldn’t actually be making the animals tired; they would simply sleep to reform these neural connections.

The experiment I’ve suggested has actually been performed, more than once. But the results are conflicting. A team at the University of Zürich, Switzerland, found evidence of sleep debt in hibernating animals. They even went as far as testing different durations of sleep deprivation, and showed that, during torpor, sleep debt accumulates 2.75 times slower than during wakefulness.
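To make that ratio concrete, here is a minimal sketch in Python (my own illustration, not from the study; the function name and example durations are invented assumptions, and only the 2.75 ratio comes from the Zurich result):

```python
# Illustration of the reported finding that, during torpor, sleep debt
# accumulates 2.75 times slower than during wakefulness.
# The ratio is from the study; the durations below are made up.

TORPOR_SLOWDOWN = 2.75  # torpor debt accrues 1/2.75 as fast as waking debt

def sleep_debt_hours(awake_hours: float, torpor_hours: float,
                     slowdown: float = TORPOR_SLOWDOWN) -> float:
    """Total sleep debt, in wakefulness-equivalent hours."""
    return awake_hours + torpor_hours / slowdown

# 11 hours of torpor builds only as much debt as 4 hours spent awake:
print(sleep_debt_hours(awake_hours=0, torpor_hours=11))  # 4.0
```

On this reading, an animal can remain torpid far longer than it could stay awake before the accumulated debt forces a bout of recovery sleep.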

A separate experiment by a team at Berkeley and Stanford reported that no rebound was observed after sleep deprivation, and so did teams from the University of Alaska and the University of Groningen in the Netherlands.

How can we explain the conflicting results? The teams looked at different animals. The Zurich group looked at the Djungarian hamster, while the Berkeley group looked at the ground squirrel. The main difference between the two species is that hamsters undergo daily torpor (hibernation that lasts less than 24 hours), while ground squirrels are seasonal hibernators.

So, the answer to our initial question is that we still don’t know why animals fall asleep immediately after they wake up from hibernating. No more experiments on the topic have been conducted since the ones I described. But my own work at the University of Bologna in Italy has supported the idea that brain activity during torpor or hibernation is more similar to wakefulness than it is to sleep.

In this experiment, a torpor-­like state was induced for the first time in a rat, a non-­hibernator (our goal is to eventually open the way to human hibernation in the not-so-near future). Brain activity recordings in this kind of suspended animation state did indeed resemble activity during wakefulness, but the activity became slower and slower as body temperature decreased, as if the frames of a movie were being projected slower and slower as the movie progressed.

No one knows what it’s like to be in a state of hibernation, and we don’t know if hibernators are still somewhat conscious. Perhaps if we can teach ourselves to hibernate, we’ll learn the answer. In the meantime, I’m hoping that research on this topic will flourish again.

You’ll Sleep When You’re Dead is Motherboard’s exploration of the future of sleep.

Everything sounds better in Italian, including English… Note the importance (and revelation) that synaptic connectivity is plastic…

Fate / Human maladaptation to a future world

Each of us is born into a world that is not of his or her own making. The trouble is that it’s no longer a world that nature has prepared us for. DNA is like a suitcase full of physiological plans, functions and designs; of physics, chemistry, thermodynamics and electromagnetic energy arranged by billions of years of testing for operational brilliance in an environment that no longer exists.

Human babies are like time travelers, adapted to a strenuous existence in forest and desert; along rivers, lakes and seashores; ready to learn, survive and excel, and to be a wild animal, among wild animals.

We arrive in a place many futures ahead of where we belong. In a hospital. Among machines, without which more and more babies would die on arrival. Not a living thing in sight. To parents whose bodies have adapted rather badly to an artificial world, not of their own making. Trapped in a world not of their own making. The dysfunction of being born into a toxic future, for which our DNA suitcase does not prepare us, accelerates, not by a few years, but by thousands of years in mere generations.

The DNA suitcase is becoming useless. We don’t function; we cannot adapt; we can only maladapt.

So what do we do? A frantic response: Attack our DNA. Cut it apart, rearrange it, combine it, mix it like a salad. Hope that we can keep ahead of the future, a future in which dysfunction is normal. Are we there yet?  



Confrontational scavenging of large vertebrate carcasses / Early Homo

Freshly scavenged elk carcass


Humans and Scavengers: The Evolution of Interactions and Ecosystem Services


BioScience, Volume 64, Issue 5, 1 May 2014, Pages 394–403, https://doi.org/10.1093/biosci/biu034
Published: 22 March 2014

Excerpt: Diet of early humans: Food provisioning and the onset of cultural services

Around the time of the Pliocene–Pleistocene transition, increasing seasonality in precipitation occurred in African savannas (Cerling et al. 2011a). This forced the australopithecine ancestors of humans to diversify their diet in order to cope with the developing seasonal bottleneck in fruits and other soft plant foods. While hominins of the genus Paranthropus became adapted to exploit durable seeds, roots, and sedges (Cerling et al. 2011b, Klein 2013, Sponheimer et al. 2013), the lineage leading to Homo turned to the meat provided by large vertebrate carcasses to overcome the effects of the increasingly seasonal production of fruits and new plant growth (Foley and Lee 1989, Bunn and Ezzo 1993, Ungar et al. 2006, Klein 2013). Although the relative role of hunting and scavenging by early humans remains controversial (Domínguez-Rodrigo 2002, Ungar et al. 2006, Pickering 2013), many anthropologists contend that the earliest humans obtained animal food largely through confrontational scavenging (also called power scavenging and aggressive scavenging) by driving large carnivores from their kills (figure 1; O’Connell et al. 1988, Bunn and Ezzo 1993, Brantingham 1998, Ragir 2000, Domínguez-Rodrigo and Pickering 2003, Klein 2009, Bickerton and Szathmáry 2011). Indeed, it has been proposed that the emergence of endurance running could have helped early humans to secure sufficient access to the scattered and ephemeral resource that is carrion, although this might have been a later feature facilitating the hunting of live ungulate prey (Bramble and Lieberman 2011).

Figure 1.

Major meat acquisition strategies of humans (Homo spp.) in relation to key events during the Quaternary Period. (a) A logarithmic time scale (in thousands of years ago) showing the main human-related events that occurred during the Quaternary Period that shaped the interactions between humans and scavenging vertebrates. (b) The major means of meat acquisition by humans during the Quaternary Period. See the text for further details.


Interference and resource competition probably accounted for most of the interactions among the earliest humans, vultures, bone-cracking hyenids, and other vertebrate scavengers (Bunn and Ezzo 1993, Owen-Smith 1999, Bickerton and Szathmáry 2011, Bramble and Lieberman 2011). In addition, confrontational scavenging would have exposed early humans to increased risks of injury or death while they were driving away the large carnivores that had killed the carcasses or driving away other fearsome scavengers present at them (Bunn and Ezzo 1993, Bickerton and Szathmáry 2011). But facilitatory interactions could also have been a feature, as it happens in current vertebrate scavenger guilds (Cortés-Avizanda et al. 2012, Pereira et al. 2014). For instance, observations of contemporary hunter–gatherers who actively exploit scavenging opportunities suggest that watching the behavior of vultures and large mammalian carnivores could have helped early humans locate carcasses (O’Connell et al. 1988). Such food provisioning probably represents the first ecosystem service that humans gained from scavenging vertebrates.

Moreover, a major function of the earliest stone tools crafted by early hominins was the processing of large carcasses to yield meat and marrow, a pattern of butchery that extended well into the Pleistocene (de Heinzelin et al. 1999). Competition with other scavengers probably contributed to the refinement of these tools and their use and, therefore, to cultural diversity. In addition, selective pressures associated with confrontational scavenging—specifically, the spatiotemporal unpredictability of carcasses and exposure to predation—probably contributed to the most distinctive features of humans: collaborative cooperation and language development (both of which were used to express where the resource was imagined to be awaiting; Bickerton and Szathmáry 2011). In turn, the improved diet quality due to increasing meat consumption has been related, along with other factors, to the extraordinary brain enlargement within the human lineage (Bramble and Lieberman 2011, Navarrete et al. 2011). Therefore, (confrontational) scavenging helped shape our modern cognitive identity.


Amensalism: any interaction between two individuals or groups of the same or different species in which one organism or group is harmed but the other is unaffected.

Carrion: any type of dead animal tissue.

Coevolution: reciprocal selective pressure that makes the evolution of one taxon partially dependent on the evolution of another (Brantingham 1998).

Commensalism: any interaction between two individuals or groups of the same or different species in which one organism or group benefits without affecting the other.

Competition: any interaction between two individuals or groups of the same or different species that reduces access to a shared resource or set of resources. Competition is direct (interference) if one organism or group affects the ability of another to consume a given limiting resource or indirect (exploitation) if the consumption of a given limiting resource by one organism or group makes the resource unavailable for another.

Ecosystem services: benefits people obtain from ecosystems (MA 2005) or the set of ecosystem functions that are useful to humans (Kremen 2005). These include provisioning (products obtained from ecosystems), regulating (related to the regulation of ecosystem processes), and cultural (nonmaterial benefits) services that directly affect people, as well as the supporting services needed to maintain other services. Provisioning, regulating, and cultural services typically have relatively direct and short-term impacts on people, whereas the impact of supporting services is often indirect or occurs over a very long time period (MA 2005).

Facilitative processes: those processes whose effects on a given organism are beneficial and increase its development or fitness.

Facultative scavenger: an animal that scavenges at variable rates but that can subsist on other food resources in the absence of carrion. All mammalian predators (e.g., jackals, hyenas, and lions in Africa and southern Asia; foxes, raccoons, wolves, and bears in temperate ecosystems), numerous birds of prey (e.g., kites, most large eagles), and corvids (e.g., ravens, crows), as well as other vertebrates (e.g., crocodiles), can be considered, to a greater or lesser extent, facultative scavengers (DeVault et al. 2003, Pereira et al. 2014).

Mutualism: any beneficial and reciprocal interaction between two individuals or groups of different species. This relationship of mutual dependence can be obligate (when a given organism or group cannot survive or reproduce without its mutualistic partner).

Obligate scavenger: a scavenger that relies entirely or almost entirely on carrion as a food resource. Among Quaternary terrestrial vertebrates, only vultures (both Old and New World species—families Accipitridae and Cathartidae, respectively) are considered obligate scavengers.

Predation: an interaction in which one animal kills and eats all or part of another. Predation can affect prey through the two fundamental mechanisms of direct consumption and capture risk.

Scavenging: an interaction in which one animal eats all or part of a dead animal. Scavenging is active (also called confrontational, aggressive, or power scavenging) when the predator that was responsible for the kill is chased away and most of the meat on the carcass is procured, or it is passive when the bones, which may contain fragments of meat, marrow, and skull contents, are collected.

Much, much, more…

History as Literature / Lewis Mumford, The City…


Lewis Mumford / Harcourt Brace Jovanovich, 1961

“Mid the wanderings of Paleolithic man, the dead were the first to have a permanent dwelling: a cavern, a mound marked by a cairn, a collective barrow.”

“The city of the dead antedates the city of the living. In one sense indeed, the city of the dead is the forerunner, almost the core, of every city. Urban life spans the historic space between the earliest burial ground for dawn man and the final cemetery, the necropolis, in which one civilization after another, has met its end.”


I’ve been sorting piles of books to find those that I can dispense with, while reacquainting myself with those to which I return for inspiration and reference – and, vitally, with those responsible for a handful of ideas that set me off on a journey many years ago toward understanding human behavior, which for this Asperger is/was a critical topic. It is my hypothesis that Asperger types have a hyposocial, visually-based brain organization that “resembles” that of pre-agricultural Wild Homo sapiens.

The City in History, by Lewis Mumford, is one of those books. I have never read all 576 pages of its exhaustive details; the quote at top occurs near the beginning, and it struck me immediately with its importance to modern human destiny; not predestined destiny, but the path of human civilization as it has played out over the previous 10-15,000 years of humans becoming domesticated humans, a distinction that has become more obvious to me as I have explored this “thing” called Asperger’s.

Modern social destiny, and the “type” Homo sapiens sapiens who created it, continues to be further defined by adaptation to hypersocial modern environments. This social destiny was not a collective direction decided upon by “mankind” but the result of individuals pursuing survival. Climatic change and other natural geologic processes forced the dependence on agriculture and a sedentary life; the “idea” of controlling nature must have seemed a great and victorious reality at the time, one which could only be “good”. This quest for dominance over nature and its contents remains the central self-important and disastrous goal for modern techno-social humans, but from this one step into domestication 10-15,000 years ago, a global environmental tragedy has followed.

Mumford’s book is filled with the grandiose “narrative” that archaeologists and anthropologists envy – frustrated novelists that they are. Historians are free to do this; history has always been a scheme of cultural focus, of mythology with few facts, or a deluge of facts added to “support” the myth. Our mistake is in thinking that mythology is “false” and has no value, and that history must be “scientific” – which it is not. It is literature that serves to remind us of the hundreds of millions of lives that have been lived, and great writers like Mumford puncture the delusional belief that we are a supreme and intelligent species that has fulfilled a supernatural evolutionary destiny; instead, our behavior shows us to be one more repetition of the necropolis stage of civilization.

John Hawks on NEW Homo naledi research “So Human”

Naledi dated to? Ask a geologist! Nice demonstration of how one’s point of view (academic discipline) “changes” what you get from the fossils… and other problems of location, spatial relationships, genetics, time.