Go watch this on NETFLIX. “Odd” human behavior against the backdrop of spectacular volcanic forces. Don’t miss the segment on North Korea…
Most people don’t choose their beliefs; their beliefs are culturally inherited.
SEE ALSO: “Religious States of America, in 22 maps”
New look at archaic DNA rewrites human evolution story
Contradicts convention on Denisovans, Neanderthals, modern humans
- Date: August 7, 2017
Excerpt: Previous estimates of the Neanderthal population size are very small — around 1,000 individuals. However, a 2015 study showed that these estimates underrepresent the number of individuals if the Neanderthal population was subdivided into isolated, regional groups. The Utah team suggests that this explains the discrepancy between previous estimates and their own much larger estimate of Neanderthal population size.
“Looking at the data that shows how related everything was, the model was not predicting the gene patterns that we were seeing,” said Ryan Bohlender, post-doctoral fellow at the M. D. Anderson Cancer Center at the University of Texas, and co-author of the study. “We needed a different model and, therefore, a different evolutionary story.”
The team developed an improved statistical method, called legofit, that accounts for multiple populations in the gene pool. They estimated the percentage of Neanderthal genes flowing into modern Eurasian populations, the date at which archaic populations diverged from each other, and their population sizes.
These population trees with embedded gene trees show how mutations can generate nucleotide site patterns. The four branch tips of each gene tree represent genetic samples from four populations: modern Africans, modern Eurasians, Neanderthals, and Denisovans. In the left tree, the mutation (shown in blue) is shared by the Eurasian, Neanderthal and Denisovan genomes. In the right tree, the mutation (shown in red) is shared by the Eurasian and Neanderthal genomes.
A family history in DNA
The human genome has about 3 billion nucleotide sites. Over time, the DNA at individual sites can mutate. If a parent passes that mutation down to their children, who pass it to their children, and so on, the mutation acts as a family seal stamped onto the DNA. Scientists use these mutations to piece together evolutionary history hundreds of thousands of years in the past. By searching for gene mutations shared across the nucleotide sites of various human populations, scientists can estimate when groups diverged and the sizes of the populations contributing to the gene pool.
“You’re trying to find a fingerprint of these ancient humans in other populations. It’s a small percentage of the genome, but it’s there,” said Rogers.
They compared the genomes of four human populations: modern Eurasians (living today), modern Africans, Neanderthals and Denisovans. The modern samples came from Phase I of the 1000 Genomes Project, and the archaic samples came from the Max Planck Institute for Evolutionary Anthropology. The Utah team analyzed a few million nucleotide sites at which a mutation was shared by two or three of the human groups, and established 10 distinct nucleotide site patterns.
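Where the figure of 10 site patterns comes from can be shown with a few lines of Python. This is only an illustrative sketch using population labels, not the team's legofit software or real genomic data: a site pattern simply names which two or three of the four sampled groups share the derived mutation at a site.

```python
from itertools import combinations

# An illustrative sketch (not the legofit software): a "nucleotide
# site pattern" names the subset of sampled populations that share
# the derived mutation at a site. Counting patterns involving two
# or three of the four groups gives C(4,2) + C(4,3) = 6 + 4 = 10.
populations = ["African", "Eurasian", "Neanderthal", "Denisovan"]

patterns = [subset
            for size in (2, 3)
            for subset in combinations(populations, size)]

for pattern in patterns:
    print(":".join(pattern))   # e.g. "Eurasian:Neanderthal"

print(len(patterns))           # 10
```

The pattern shared by Eurasians and Neanderthals (but not Africans or Denisovans), for example, is the one that tracks the roughly 2 percent Neanderthal contribution discussed below.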
Against conventional wisdom
The new method confirmed previous estimates that modern Eurasians share about 2 percent of their DNA with Neanderthals. However, other findings questioned established theories. The analysis revealed that 20 percent of nucleotide sites exhibited a mutation shared only by Neanderthals and Denisovans, a genetic timestamp marking the time before the two archaic groups diverged. The team calculated that Neanderthals and Denisovans separated about 744,000 years ago, much earlier than any previous estimate of the split. (Was the last common ancestor Homo erectus?)
“If Neanderthals and Denisovans had separated later, then there ought to be more sites at which the mutation is present in the two archaic samples but is absent from modern samples,” said Rogers. The analysis also questioned whether the Neanderthal population numbered only 1,000 individuals. There is some evidence for this; Neanderthal DNA contains mutations that usually occur in small populations with little genetic diversity. However, Neanderthal remains found in various locations are genetically different from each other. This supports the study’s finding that regional Neanderthal populations were likely small bands of individuals, which explains the harmful mutations, while the global population was quite large.
“The idea is that there are these small, geographically isolated populations, like islands, that sometimes interact, but it’s a pain to move from island to island. So, they tend to stay with their own populations,” said Bohlender.
Their analysis revealed that the Neanderthals grew to tens of thousands of individuals living in fragmented, isolated populations.
“There’s a rich Neanderthal fossil record. There are lots of Neanderthal sites,” said Rogers. “It’s hard to imagine that there would be so many of them if there were only 1,000 individuals in the whole world.”
Rogers is excited to apply the new method in other contexts.
“To some degree, this is a proof of concept that the method can work. That’s exciting,” said Rogers. “We have remarkable ability to estimate things with high precision, much farther back in the past than anyone has realized.”
Early history of Neanderthals and Denisovans
Neanderthal and Homo erectus (Turkana Boy – 1.5-1.6 mya) reconstructions by Elizabeth Daynes / Field Museum, Chicago.
HUNGER: The prime motivator of human behavior and technology. Primitive tools compensate for the “puny human” lack of claws, reduced olfactory sense, and other assets possessed by the competition. Other hungry animals, including many much smaller than humans, had superior strength, speed, meat- or tough-vegetation-tearing teeth (cooking required for us), protective fur, athletic ability, specialized body parts and instinctive tactics. Early humans HAD TO develop tools!
Our type of brain most likely developed as a “tool” that compensated for (and competed with) the “equipment” of other animals in particular environments. The brain as technology – think about it! LOL
I’m working up to the problem of visual and sensory thinking being all but ignored (or even dismissed) by the “cognition and behavior sciences” as a primary mode of perception and cognition in evolutionary history. This ignorance or arrogance on the part of “researchers” is especially negligent in those whose declared interest is ASD / Asperger’s and other non-typical diagnoses. The irony is that these diagnoses of “abnormality” may simply demonstrate the bias or outright prejudice that only the “social” language of scripted word concepts / formal academic constructs is “important” to human thought and behavior. That is, rigid restrictions have been placed on human thought, behavior and personal expression that may reflect the inability of the “social engineering class” to think in any other mode. Has this group become so isolated from “natural” human behavior that only individuals who are similarly limited to social constructs and rigid narratives are “accepted, selected for” inclusion in the class of those who dictate social behavior, thus increasingly diminishing the diversity of ideas about “what it is to be human” to their own impoverished experiences? The peasant classes are urged to function only on emotional reactivity and scripted social behavior, thus remaining powerless.
WIKI on Cognition:
“Cognition is “the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses”. It encompasses processes such as attention, the formation of knowledge, memory, and working memory, judgement and evaluation, reasoning and “computation,” problem-solving and decision making, comprehension and production of language. Cognitive processes use existing knowledge and generate new knowledge.”
Note that “producing language” is only one of many thinking processes; the “expressive – action based” fields of art, music, dance and kinesthetic “thinking” must be assumed to be included under experience and the senses; otherwise these thought processes are missing from the list. Why? The stress is on “conscious” cognition; “unconscious” cognition is considered “low-level” and has been segregated from “high-level” cognition – an error that has had severe consequences for our understanding of “how the brain works” in relation to the “whole” human organism and how it interacts with the environment. This “social conception” of human biology, physiology and behavior serves the western socio-religious narcissism of “man” as a special creation isolated from the reality of evolution.
“The processes are analyzed from different perspectives within different contexts, notably in the fields of linguistics, anesthesia, neuroscience, psychiatry, psychology, education, philosophy, anthropology, biology, systemics, logic, and computer science. These and other different approaches to the analysis of cognition are synthesized in the developing field of cognitive science, a progressively autonomous academic discipline.”
Again, we must assume that “the arts” are included somewhere in this disconnected “chopped salad” of academic reserves, which often are “at war” with each other over “domains of expertise” (territories) without much flow of information or “honest” discussion between academics. Genuine scientific competition and progress requires constant questioning of assumptions (hypothesis, theories); this necessity is hampered by most of these disciplines being based on theories, rather than truly investigative “reality-based” research that is open to challenges by other researchers.
A severe problem with current concepts of cognition and intelligence: the 300,000-year-old Jebel Irhoud Homo sapiens, considered the “earliest so far” true Homo sapiens. If we judge by the decision / conceit that only “conscious social cognition and behavior” count toward being classified as Homo sapiens, how do we explain the survival of any hominid? The current explanation is that these early Homo sapiens were “cognitively and socially identical to modern social humans.” A reality-based conclusion would be that, given the variety and range of difficult environments and conditions in which they survived and successfully reproduced, these humans would have had to be more intelligent than modern domesticated humans, who have the advantage of 300,000 years of collective human experience and culture HANDED TO THEM by default.
The “human brain and behavior” community would have us believe that this fellow survived by relying on modern social word-concepts and social theories of behavior.
Au contraire! Survival would have demanded the “action” intelligences of sensory processing: art and technology production, acute and immediate visual-sensory analysis of threats and opportunities presented by a wild ‘natural’ environment, memorization / mapping of geographical, geological and faunal-flora details of food availability; cooperation, sharing and mutual respect for individual skills and talents, and a precise (not vague or generalized) use of verbal language, gestures, imitative animal communication and graphic symbols.
Pyura chilensis concentrates vanadium at up to 10 million times the level in the surrounding seawater. Just add a saucy slurry of tar polluting the beach. Or is that black stuff produced by the animal? Yum!
Real Paleo Diet: early hominids ate just about everything
Postdoctoral Fellow in Primate and Human Evolution, Georgia State University
Reconstructions of human evolution are prone to simple, overly-tidy scenarios. Our ancestors, for example, stood on two legs to look over tall grass, or began to speak because, well, they finally had something to say. Like much of our understanding of early hominid behavior, the imagined diet of our ancestors has also been over-simplified.
Take the trendy Paleo Diet, which draws inspiration from how people lived during the Paleolithic or Stone Age, roughly 2.6 million to 10,000 years ago. It encourages practitioners to give up the fruits of modern culinary progress – such as dairy, agricultural products and processed foods – and start living a pseudo-hunter-gatherer lifestyle. Adherents recommend a very specific “ancestral” menu, replete with certain percentages of energy from carbohydrates, proteins and fats, and suggested levels of physical activity. These prescriptions are drawn mainly from observations of modern humans who live at least a partial hunter-gatherer existence.
But from a scientific standpoint, these kinds of simple characterizations of our ancestors’ behavior generally don’t add up. Recently, fellow anthropologist C. Owen Lovejoy and I took a close look at this crucial question in human behavioral evolution: the origins of hominid diet. We focused on the earliest phase of hominid evolution from roughly 6 to 1.6 million years ago, both before and after the first use of modified stone tools. This time frame includes, in order of appearance, the hominids Ardipithecus and Australopithecus, and the earliest members of our own genus, the comparatively brainy Homo. None of these were modern humans, which appeared much later, but rather our distant forerunners.
We examined the fossil, chemical and archaeological evidence, and also closely considered the foraging behavior of living animals. Why is this crucial? Observing animals in nature for even an hour will provide a ready answer: almost all of what an organism does on a daily basis is simply related to staying alive; that includes activities such as feeding, avoiding predators and setting itself up to reproduce. That’s the evolutionary way.
What did our ancestors actually eat? In some cases, researchers can enlist modern technology to examine the question. Researchers study the chemical makeup of fossil dental enamel to figure out relative amounts of foods the hominid ate derived from woody plants (or the animals that ate them) versus open country plants. Other scientists look in ancient tooth tartar for bits of silica from plants that can be identified to type – for example, fruit from a particular plant family. Others examine the small butchering marks made on animal bones by stone tools. Researchers have found, for example, that hominids even 2.6 million years ago were eating the meat and bone marrow of antelopes; whether they were hunted or scavenged is hotly debated.
Such techniques are informative, but ultimately give only a hazy picture of diet. They provide good evidence that plants’ underground storage organs (such as tubers), sedges, fruits, invertebrate and vertebrate animals, leaves and bark were all on the menu for at least some early hominids. But they don’t give us information about the relative importance of various foods. And since these foods are all eaten at least occasionally by living monkeys and apes, these techniques don’t explain what sets hominids apart from other primates.
So how should we proceed? As my colleague Lovejoy says, to reconstruct hominid evolution, you need to take the rules that apply to beavers and use them to make a human. In other words, you must look at the “rules” for foraging. We aren’t the first researchers to have dabbled in this. As long ago as 1953, anthropologists George Bartholomew and Joseph Birdsell attempted to characterize the ecology of early hominids by applying general biological principles.
Does American fast food qualify as “profitable foods” in optimal foraging theory?
Happily, ecologists have long been compiling these rules in an area of research dubbed optimal foraging theory (OFT). OFT uses simple mathematical models to predict how certain animals would forage in a given circumstance. For instance, given a set of potential foods of estimated energetic value, abundance and handling time (how long it takes to acquire and consume), one classic OFT model calculates which resources should be eaten and which ones should be passed over. One prediction — sort of a “golden rule” of foraging — is that when profitable foods (those high in energy and low in handling time) are abundant, an animal should specialize on them, but when they are scarce, an animal should broaden its diet. (DUH!)
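The “golden rule” above falls out of the classic diet-breadth (prey choice) model of OFT, and it takes only a few lines to state. Here is a minimal Python sketch of that model; the foods, energy values, handling times and encounter rates are invented for illustration, not data from any study:

```python
# A minimal sketch of the classic "diet breadth" (prey choice) model
# from optimal foraging theory. All numbers are invented.
# Each food is (name, e, h, lam):
#   e   = energy gained per item (kcal)
#   h   = handling time per item (hours)
#   lam = encounter rate while searching (items/hour)

def optimal_diet(foods):
    """Return the names of the foods an optimal forager should take."""
    # Rank foods by profitability e/h, best first.
    ranked = sorted(foods, key=lambda f: f[1] / f[2], reverse=True)
    diet = []
    for name, e, h, lam in ranked:
        # Long-term return rate of the diet built so far:
        #   sum(lam_i * e_i) / (1 + sum(lam_i * h_i))
        gain = sum(l * ev for _, ev, _, l in diet)
        time = 1 + sum(l * hv for _, _, hv, l in diet)
        if e / h > gain / time:   # include the food if it beats the rate
            diet.append((name, e, h, lam))
        else:
            break                 # lower-ranked foods would only dilute it
    return [name for name, *_ in diet]

# When the profitable food is abundant, the model says: specialize.
abundant = [("fruit", 300, 0.1, 2.0), ("tubers", 200, 0.5, 1.0),
            ("bark", 50, 1.0, 5.0)]
print(optimal_diet(abundant))   # ['fruit']

# When it is scarce (low encounter rate), the predicted diet broadens.
scarce = [("fruit", 300, 0.1, 0.1), ("tubers", 200, 0.5, 1.0),
          ("bark", 50, 1.0, 5.0)]
print(optimal_diet(scarce))     # ['fruit', 'tubers']
```

Note that whether a food enters the diet depends only on the abundance of the foods ranked above it, never on its own abundance – which is exactly the specialize-when-rich, broaden-when-poor pattern the langur data illustrate.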
Data from living organisms as disparate as insects and modern humans generally fall in line with such predictions. In the Nepal Himalaya, for example, high-altitude gray langur monkeys eschew leathery mature evergreen leaves and certain types of roots and bark — all calorie-deficient and high in fibers and handling time — during most of the year. But in the barren winter, when better foodstuffs are rare or unavailable, they’ll greedily devour them.
In another, more controlled study, when almonds in or out of the shell were buried in differing quantities in view of chimpanzees, the chimps later recovered the larger caches (more energy), those physically closer (less pursuit time), and those without shells (less processing time) before smaller, more distant, or “with-shell” nuts. This suggests that at least some animals can remember optimal foraging variables and utilize them even when foods are distant and outside the range of immediate perception. Both of these studies support key predictions from OFT.
If one could estimate the variables important to foraging, one could potentially predict the diet of particular hominids that lived in the distant past. It’s a daunting proposition, but this human evolution business was never meant to be easy. The OFT approach forces researchers to learn how and why animals exploit particular resources, which leads to more thoughtful considerations of early hominid ecology. A smattering of scientists have utilized OFT with success, most notably in archaeological treatments of comparatively recent hominids, such as Neandertals and anatomically modern humans.
But a few brave souls have delved into more remote human dietary history. One team, for example, utilized OFT, modern analogue habitats, and evidence from the fossil record to estimate the predicted optimal diet of Australopithecus boisei. That’s the famed “Nutcracker Man” that lived in East Africa close to 2 million years ago. The research suggests a wide range of potential foods, greatly varying movement patterns – based on characteristics such as habitat or use of digging sticks – and the seasonal importance of certain resources, such as roots and tubers, for meeting estimated caloric requirements.
Researchers Tom Hatley and John Kappelman noted in 1980 that hominids have bunodont – low, with rounded cusps – back teeth that show much in common with bears and pigs. If you’ve watched these animals forage, you know they’ll eat just about anything: tubers, fruits, leafy materials and twigs, invertebrates, honey and vertebrate animals, whether scavenged or hunted. The percentage contribution of each food type to the diet will depend (you guessed it) on the energetic value of specific foods in specific habitats, at specific times of year. Evidence from the entirety of human evolution suggests that our ancestors, and even we as modern humans, are just as omnivorous.
And the idea that our more ancient ancestors were great hunters is likely off the mark, as bipedality — at least before the advance of sophisticated cognition and technology — is a mighty poor way to chase game. Our mobility is even more limited than that of bears and pigs. The anthropologist Bruce Latimer has pointed out that the fastest human being on the planet can’t catch up to your average rabbit. Another reason to be opportunistic about food.
Simple characterizations of hominid ecology are divorced from the actual, and wonderful, complexity of our shared history. The recent addition of pastoral and agricultural products to many modern human diets — for which we have rapidly evolved physiological adaptations — is but one extension of an ancient imperative. Hominids didn’t spread first across Africa, and then the entire globe, by utilizing just one foraging strategy or sticking to a precise mix of carbohydrates, proteins and fats. We did it by being ever so flexible, both socially and ecologically, and always searching for the greener grass (metaphorically), or riper fruit (literally).
Why is it that in any anthropological scenario, one group must win and “the other” group must become extinct? There is a difference between one community of people (let’s say the Roanoke Colony) failing to thrive, and that “failure” being proof that all English people became extinct. We project the “winner versus loser” plot onto an evolutionary history that, as yet, we do not understand.
Video from the scientific article “U-Th dating of carbonate crusts reveals Neanderthal origin of Iberian cave art” (www.sciencemag.org)
One comment: It continues to baffle the logical Asperger why neurotypicals insist that any intentional mark on a rock, or any other object, is automatically “symbolic” expression and “proves” abstract thought in the brain of the “mark maker,” when a drawing can be (and usually is) concrete and literal: the drawing of a cave lion is a lion. The arrangement of lines in a drawing into which animals are being driven is a corral; the animals are specific animals. “Bad” prehistoric drawings (an inept person attempting to draw an object) are not 20th C. abstract art!
John Hawks on evidence of Neanderthal / H. sapiens occupation and cultural sharing in the Carmel area of northern Israel.
The always sane and rational John Hawks…
And for two other narratives, go to:
Because we are primates! Find the original BBC series; it’s on Netflix.
The bad news: Dear fellow Aspergers – we may not be Homo sapiens; we may not even be primates.