One of THOSE Discussions / God, Free Will and Absurdities

This post has gained momentum from having one of those “late night” discussions with a friend – the type that is popular when one is in college, a bit drunk (or otherwise deranged) and which, as one gets older and wiser, one vows to never again participate in. The gist of the argument was:

Determinism (God) is totally compatible with Free Will (The Declaration of Independence), so we have both.

I could stop right here, because this “set up” is thoroughly American “wacky” thinking. It demonstrates the absolute belief that “America” is a special case = exemption from reality, that was/is made possible by American Democracy (in case you weren’t aware, democracy is not a political creation of human origin) which came about by an Act of God. “Freedom” is a basic American goal: Free Will is therefore a mandatory human endowment (by virtue of the word Free appearing in both “concepts”). God created everything, so he must have created Free Will. Jesus is a kind of “sponge” that suffices to “soak up” all those bad choices Free Will allows, that is, if you turn over all your choices, decisions and Free Will to Jesus.

The irony is that this absurd, pointless discussion “cleared the air” over previously unspoken conflict with a dear friend – like blowing up the Berlin Wall – getting it out of the way and establishing that friendship is not “rational” at all, but an agreement about what really matters: good intentions carried into actions, loyalty and a simple “rightness” – agreement on what constitutes “good behavior” on the part of human beings and a pledge of one’s best effort to stick to that behavior.

This entire HUGE neurotypical debate is nonsense.

God has nothing to do with Free Will, the Laws of physics, or any scientific pursuit of explanations for “the universe”. The whole reason for God’s existence is that He, or She, or They are totally outside the restrictions of “physical reality”. That’s what SUPERNATURAL means. So all the “word concept” machinations over “God” and “science” – from both ends of the false dichotomy – are absurd. Free Will is also a non-starter “concept” in science: reality proceeds from a complex system of “facts” and mathematical relationships that cannot be “free-willed” away.

Total nonsense.

If one believes in the “supernatural” origin of the universe as a creation of supernatural “beings, forces and miraculous acts” then one does not believe in physical reality at all: “Physics” is a nonexistent explanation for existence. One can only try to coerce, manipulate, plead with, and influence the “beings” that DETERMINE human fate. Free Will is de facto an absurdity, conceived of as something like the Amendments to the U.S. Constitution, (inspired by God, after all – not really by the intelligence of the people who wrote it). In American thought, (political) rights grant permission to “do whatever I want”. The concept of responsibility connected to rights has been conveniently forgotten. Free Will in this context, is nothing more than intellectual, moral and ethical “cheating”.

So, the immense, complicated, false dichotomy of Determinism vs. Free Will, and the absurd 2,000+ year-old philosophical waste of time that has followed, and continues, resolves very simply, at least in the U.S.:

Whatever I do, is God’s Will: Whatever you do, isn’t. 


Light Skin and Lactose / Recent Adaptations to Cereal Diet

IFL Science

Why Do Europeans Have White Skin?

April 6, 2015 | by Stephen Luntz (shortened to get to the point)

The 1000 Genomes Project is comparing the genomes of modern individuals from specific regions in Europe with 83 samples taken from seven ancient European cultures. Harvard University’s Dr. Iain Mathieson has identified five features which spread through Europe, indicating a strong selection advantage.

At the annual conference of the American Association of Physical Anthropologists, Mathieson said his team distinguished, “between traits that have changed consistently with population turnovers, traits that have changed apparently neutrally, and traits that have changed dramatically due to recent natural selection.”

… most people of European descent are lactose tolerant, to the extent that milk products not only form a major source of nutrition but are a defining feature of European cultures…that the capacity to digest lactose as an adult appeared in the population after the development of farming. Two waves of farmers settled Europe 7,800 and 4,800 years ago, but it was only 500 years later that the gene for lactose tolerance became widespread.

…hunter-gatherers in what is now Spain, Luxembourg and Hungary had dark-skinned versions of the two genes more strongly associated with skin color. The oldest pale versions of the SLC24A5 and SLC45A2 genes that Mathieson found were at Motala in southern Sweden 7,700 years ago. The gene associated with blue eyes and blond hair was found in bodies from the same site. H/T ScienceMag.

(Image: world solar energy map)

____________________________________________________________________________________________

From: Civilization Fanatics Forum

Debunking the theory that lighter skin gradually arose in Europeans nearly 40,000 years ago, new research has revealed that it evolved recently – only 7,000 years ago

People in tropical to subtropical parts of the world manufacture vitamin D in their skin as a result of UV exposure. At northern latitudes, dark skin would have reduced the production of vitamin D. If people weren’t getting much vitamin D in their diet, then selection for pre-existing mutations for lighter skin (less pigment) would “sweep” the farming population.  
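
How fast can such a “sweep” actually run? Here is a minimal sketch using the textbook single-locus selection model; every number below is a purely illustrative assumption, not a value from the studies quoted here:

```python
# Textbook haploid selection: allele odds grow by a factor (1+s) per
# generation. All parameter values are illustrative assumptions.

def sweep_time(p0=0.01, s=0.05, target=0.90, years_per_gen=25):
    """Generations (and years) for an allele to rise from p0 to target."""
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (1 + p * s)   # new frequency after selection
        gens += 1
    return gens, gens * years_per_gen

gens, years = sweep_time()
print(f"~{gens} generations, ~{years:,} years from 1% to 90%")
```

With a 5% advantage, the allele goes from 1% to 90% in roughly 140 generations – a few thousand years – which is why the rapid post-farming spread of pale skin and lactose tolerance is plausible.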

New scientific findings show that prehistoric European hunter-gatherers were dark-skinned, but ate vitamin D-rich meat, fish, mushrooms and fruits. With the switch to agriculture, the amount of vitamin D in the diet decreased – and resulted in selection for pale skin among European farmers.

Findings detailed today (Jan. 26, 2014) in the journal Nature, “also hint that light skin evolved not to adjust to the lower-light conditions in Europe compared with Africa, but instead to the new diet that emerged after the agricultural revolution”, said study co-author Carles Lalueza-Fox, a paleogenomics researcher at Pompeu Fabra University in Spain.

The finding implies that for most of their evolutionary history, Europeans were not what people today would call ‘Caucasian’, said Guido Barbujani, president of the Associazione Genetica Italiana in Ferrara, Italy, who was not involved in the study.

(Image: Kostenki 14)
Neanderthal mtDNA Came from an Early Modern Human before 220,000 Years Ago

Fact or Baloney…read on…

Neandertals and modern humans started mating early

For almost a century, Neandertals were considered the ancestors of modern humans. But in a new plot twist in the unfolding mystery of how Neandertals were related to modern humans, it now seems that members of our lineage were among the ancestors of Neandertals. Researchers sequenced ancient DNA from the mitochondria—tiny energy factories inside cells—from a Neandertal who lived about 100,000 years ago in southwest Germany. They found that this DNA, which is inherited only from the mother, resembled that of early modern humans.

After comparing the mitochondrial DNA (mtDNA) with that of other archaic and modern humans, the researchers reached a startling conclusion: A female member of the lineage that gave rise to Homo sapiens in Africa mated with a Neandertal male more than 220,000 years ago—much earlier than other known encounters between the two groups. Her children spread her genetic legacy through the Neandertal lineage, and in time her African mtDNA completely replaced the ancestral Neandertal mtDNA.

Other researchers are enthusiastic about the hypothesis, described in Nature Communications this week, but caution that it will take more than one genome to prove. “It’s a nice story that solves a cool mystery—how did Neandertals end up with mtDNA more like that of modern humans,” says population geneticist Ilan Gronau of the Interdisciplinary Center Herzliya in Israel. But “they have not nailed it yet.”

The study adds to a catalog of ancient genomes, including mtDNA as well as the much larger nuclear genomes, from more than a dozen Neandertals. Most of these lived at the end of the species’ time on Earth, about 40,000 to 50,000 years ago. Researchers also have analyzed the complete nuclear and mtDNA genomes of another archaic group from Siberia, called the Denisovans. The nuclear DNA suggested that Neandertals and Denisovans were each other’s closest kin and that their lineage split from ours more than 600,000 years ago. But the Neandertal mtDNA from these samples posed a mystery: It was not like Denisovans’ and was closely related to that of modern humans—a pattern at odds with the ancient, 600,000-year divergence date. Last year Svante Pääbo’s team at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, offered a startling solution: Perhaps the “Neandertal” mtDNA actually came from modern humans.

______________________________

Strange! Everything I’ve read previously has said the Neanderthal mtDNA was not at all similar to any H. sapiens mtDNA haplogroups.

______________________________

In the new study, paleogeneticists Johannes Krause and Cosimo Posth of the Max Planck Institute for the Science of Human History in Jena, Germany, test this wild idea with ancient mtDNA from a Neandertal thighbone found in 1937 in the Hohlenstein-Stadel cave (HST) in Germany. Isotopes in animal bones found with the Neandertal suggest that it lived in a woodland known to have vanished at least 100,000 years ago.

Researchers compared the coding region of the HST Neandertal’s mtDNA with that of 17 other Neandertals, three Denisovans, and 54 modern humans. The HST Neandertal’s mtDNA was significantly different even from that of proto-Neandertals that date to 430,000 years ago at Sima de los Huesos in Spain, suggesting that their mtDNA had been completely replaced. But the HST sample was also surprisingly distinct from that of other Neandertals, allowing researchers to build a phylogenetic tree and study how Neandertal mtDNA evolved over time.

Using modern humans’ mtDNA mutation rate to calculate the timing, the researchers conclude that the HST mtDNA split from that of all other Neandertals at least 220,000 years ago. The ancient H. sapiens’ mtDNA must have entered the Neandertal lineage before this time, but after 470,000 years ago, the earliest date for when modern human and Neandertal mtDNA diverged. That’s early enough for the new form of mtDNA to have spread among Neandertals and replaced all their mtDNA.
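
The timing logic here is ordinary molecular-clock arithmetic: count the differences between two sequences and divide by twice the mutation rate, since both lineages accumulate changes after a split. A minimal sketch with made-up numbers of roughly the right order of magnitude – not the paper’s actual data:

```python
# Molecular-clock arithmetic of the kind behind the 220,000-year figure.
# All numbers are illustrative assumptions, not the paper's data.

MU = 2.5e-8       # assumed mtDNA mutation rate, per site per year
SITES = 15_000    # approximate length of the mtDNA coding region
DIFFS = 165       # hypothetical pairwise differences, HST vs. other Neandertals

# After a split, BOTH lineages accumulate mutations, hence the factor of 2.
t_split_years = DIFFS / (2 * MU * SITES)
print(f"estimated split: ~{t_split_years:,.0f} years ago")   # ~220,000
```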

“The mtDNA of Neandertals is not actually from Neandertals, but from an early modern human from Africa,” Krause says. The researchers speculate that this key mating may have happened in the Middle East, where early H. sapiens may have ventured. Other researchers find the scenario remarkable but plausible. “It seems magical but this type of thing happens all the time … especially if the populations are very small,” Gronau says. For example, the mtDNA in some grizzly bears has been completely replaced by that of polar bears, Krause says.

But some experts say DNA from other Neandertals is needed to prove that their mtDNA was inherited entirely from an early H. sapiens rather than from an ancient ancestor the two groups shared. “Is there other evidence of another [early] mtDNA introgression event?” asks Chris Stringer of the Natural History Museum in London.

Not yet, Posth says. Pääbo is seeking evidence of early gene swapping by trying to get nuclear DNA from the HST Neandertal and others. “We will learn a lot about the population history of Neandertals over the next few years,” he says.


doi:10.1126/science.aan70

The Whoa! Whoa! Whoa! Reaction / Neanderthal Myths

The “Whoa! Whoa! Whoa!” reaction is what happens when I read articles written for public consumption that “boil down” science for the “educated public” – those who are genuinely interested in the physical universe, but may or may not have a science background. One of my favorite examples is how Neanderthals are “created” out of the modern social typical penchant (and temperamental obligation) to write stories (myths) from scant, contradictory or preliminary information.


The claim that Neanderthals were “dumb” is dumb. Are these skulls to scale?


Science Shows Why You’re Smarter Than a Neanderthal

Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition

By Joseph Stromberg Smithsonian.com March 12, 2013

https://www.smithsonianmag.com/science-nature/science-shows-why-youre-smarter-than-a-neanderthal-1885827/ Full article

COMMENTS: This article hits the Whoa! Stop! barrier before getting past the subhead. “Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition.”

  1. This view of the brain as having a “capacity” related to volume – like a closet that can be packed with X amount of clothing and Y amount of shoes, so that if you want to add more shoes or ski equipment, you have to remove the clothes to make room – defies what we know (and brag about endlessly) about the brain: it’s built of networks that connect across regions and functions, and these are PLASTIC – “able to rewire themselves in reaction to the environment.” This blows apart much of what the article has to say.
  2. Visual thinking is judged to be INFERIOR, low-level cognition. Tell that to a raptor, such as a hawk, raven or eagle; to giant squid or octopi and the myriad species which utilize various segments of the electromagnetic spectrum to perceive the environment. This opinion is based in ignorance and the noises made by the perpetual cheerleaders for Homo sapiens, who believe humans are the pinnacle of evolution, and therefore, whatever “we” do is de facto superior.
  3. Which brings us to the question, if human abilities are superior, why must we compensate for our lack of sensory, cognitive and physical abilities by inventing technology? The average “know-it-all” American CONSUMES the products invented and developed by a handful of creative people in each generation. Knowledge is purchased in the form of “gadgets” that for the most part, do not educate, but distract the average individual from pursuing direct experience and interaction with the environment.
  4. Which means, “we” cognitive masterminds are taking a whole lot of credit for adaptations that are INHERITED from our “inferior, stupid” ancestors, who over the previous 200,000 years not only survived, but built the culture that made us modern humans –
  5. Which comes to the egregious error of ignoring context: Compare an imaginary modern social human who exists in a context that is utterly dependent on manmade systems that supply food, water, shelter, medical care, economic opportunity, government control, cultural benefits and instant communication with a Neanderthal (or archaic Homo sapiens) whose environment is a largely uninhabited wilderness. One of the favorite clichés of American entertainment is “Male Monsters of Survival” cast into the wilderness (with a film crew and helicopter on call) recreating the Myth of Homo sapiens, Conqueror of Nature. These overconfident males are often lucky to last a week; injuries are common, starvation the norm.
  6. If visual thinking is so inferior, why do hunters rely on airplane and helicopter “flyovers” to locate game, and now drones, and add scopes, binoculars, game cameras, and a multitude of “sensory substitutes” to their repertoire? Ever been to a sporting goods store? They’re packed with every possible gadget that will improve the DIMINISHED senses and cognitive ability of modern social humans to function outside of manmade environments and to be successful hunters and fishermen.
  7. As for forcing Neanderthals into extinction, modern social humans could accomplish this: we have a horrific history of wiping out indigenous peoples and continue to destroy not only human groups, but hundreds of species and the environments they are adapted to. Modern social humans could bomb Neanderthals “back to the Stone Age”. Kill them off with chemical weapons, shred them with cluster bombs, the overkill of targeted assassination and nuclear weapons.
  8. BUT there is no proof that Archaic Homo sapiens “extincted” Homo Neanderthal. We know that in some areas they lived cheek by jowl, had sex and produced offspring, but modern social humans maintain that Neanderthals were so “socially stupid” that the entire species fell to the magnificence of party-hearty Homo sapiens. Actually, a modern social human would have difficulty distinguishing the two fearsome types: the challenge may have been like distinguishing a polar bear from a grizzly bear – closely related bears adapted to different environments, and rather irrelevant if you’re facing down either one with a sharp stick.
  9. The myth that Homo sapiens individuals outside of Africa “contain” a variable 1-4% of Neanderthal DNA, with specific “snips” (SNPs) related to various functions in modern humans, is incomplete. Rarely included in articles about how Homo sapiens and Neanderthal are connected are whole-genome sequencing results, which show that overall, the Homo sapiens genome, even now, is all but identical to the Neanderthal genome. This is logical: the divergence between chimpanzees and the African great ape lineage that produced us occurred 5-6 m.y.a., and yet the human and chimp genomes still share roughly 99% of their DNA. How similar, then, are the Neanderthal and Denisovan genomes to ours? This is a simple math question (a back-of-envelope version follows this list).
  10. What we need to compare is the Neanderthal genome and the ARCHAIC Homo sapiens genome – two groups of humans who were contemporaries.
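
Here is the back-of-envelope calculation item 9 asks for, assuming divergence accumulates roughly linearly with time (a crude but standard first approximation; the chimp figures are round illustrative numbers):

```python
# Back-of-envelope genome similarity, assuming divergence accumulates
# roughly linearly with time. Round illustrative figures throughout.

chimp_split_myr = 6.0        # human-chimp split, ~5-6 million years ago
chimp_divergence = 0.012     # ~1.2% single-nucleotide divergence today

rate_per_myr = chimp_divergence / chimp_split_myr     # ~0.2% per Myr

neandertal_split_myr = 0.6   # ~600,000-year Neandertal/modern split
divergence = rate_per_myr * neandertal_split_myr

print(f"expected divergence: {divergence:.2%}")       # ~0.12%
print(f"expected identity:  {1 - divergence:.2%}")    # ~99.88%
```

On that crude scaling, a ~600,000-year split predicts genomes well over 99.8% identical – which is the point of item 9: the 1-4% “Neanderthal DNA” figure refers to introgressed segments, not to overall genome similarity.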

Infant Synesthesia / A Developmental Stage

No, synesthesia is not a symptom of disorder, but it is a developmental phenomenon. In fact, several researchers have shown that synesthetes can perform better on certain tests of memory and intelligence. Synesthetes as a group are not mentally ill. They test negative on scales that check for schizophrenia, psychosis, delusions, and other disorders.

Synesthesia Project | FAQ – Boston University

________________________________________________________________

What if some symptoms “assigned” by psychologists to Asperger’s Disorder and autism are merely manifestations of synesthesia?

“A friend of mine recently wrote, ‘My daughter just explained to me that she is a picky eater because foods (and other things) taste like colors and sometimes she doesn’t want to eat that color. Is this a form of synesthesia?’ Yes, it is.” – Karen Wang

We see in this graphic how synesthesia is labeled a “defect” that is “eradicated” by normal development (literally “pruned out”). People who retain types of integrated sensory experience are often artists, musicians, and other sensory innovators (chefs, interior designers, architects, writers). So, those who characterize “synesthesia” as a developmental defect are labeling the very individuals who greatly enrich millions of human lives as “defectives”. – Psychology pathologizes the most admired and treasured creative human behavior.

No touching allowed! Once “sensory” categories have been labeled and isolated to locations in the brain, they are no longer allowed to “talk to” each other. The fact that this is a totally “unreal” scheme is ignored. Without smell, there IS NO taste…

________________________________________________________________

Infants Possess Intermingled Senses

Babies are born with their senses linked in synesthesia

originally published as “Infant Kandinskys”

What if every visit to the museum was the equivalent of spending time at the philharmonic? For painter Wassily Kandinsky, that was the experience of painting: colors triggered sounds. Now a study from the University of California, San Diego, suggests that we are all born synesthetes like Kandinsky, with senses so joined that stimulating one reliably stimulates another.

The work, published in the August issue of Psychological Science, has become the first experimental confirmation of the infant-synesthesia hypothesis—which has existed, unproved, for almost 20 years.

Researchers presented infants and adults with images of repeating shapes (either circles or triangles) on a split-color background: one side was red or blue, and the other side was yellow or green. If the infants had shape-color associations, the scientists hypothesized, the shapes would affect their color preferences. For instance, some infants might look significantly longer at a green background with circles than at the same green background with triangles. Absent synesthesia, no such difference would be visible.

The study confirmed this hunch. Infants who were two and three months old showed significant shape-color associations. By eight months the preference was no longer pronounced, and in adults it was gone altogether.
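
For readers who want the measurement logic spelled out: the test is a within-infant comparison of looking times for the same background color paired with different shapes. A minimal sketch of that comparison (the data below are invented; the study’s actual design and statistics are more involved):

```python
# Toy version of a looking-time preference test: for each infant,
# compare looking at green+circles vs. green+triangles (paired data).
# The numbers below are invented for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 24                                         # hypothetical infants
circles   = rng.normal(7.0, 1.5, n)            # seconds on green+circles
triangles = circles - rng.normal(0.8, 1.0, n)  # simulate a built-in preference

t, p = stats.ttest_rel(circles, triangles)
print(f"paired t = {t:.2f}, p = {p:.4f}")
# A reliable difference indicates a shape-color association; the study
# found one at 2-3 months, a weaker one at 8 months, none in adults.
```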

The more important implications of this work may lie beyond synesthesia, says lead author Katie Wagner, a psychologist at U.C.S.D. The finding provides insight into how babies learn about the world more generally. “Infants may perceive the world in a way that’s fundamentally different from adults,” Wagner says. As we age, she adds, we narrow our focus, perhaps gaining an edge in cognitive speed as the sensory symphony quiets down. (Sensory “thinking” is replaced by social-verbal thinking)

(Note: The switch to word-concept language dominance means that modern social humans LOSE the appreciation of “connectedness” in the environment – connectedness becomes limited to human-human social “reality”. The practice of chopping up reality into isolated categories (word concepts) diminishes detail and erases the connections that link detail into patterns. Hyper-social thinking is a “diminished” state of perception characteristic of neurotypicals)

This article was originally published with the title “Infant Kandinskys”
________________________________________________________

GREAT WEBSITE!!!

The Brain from Top to Bottom

thebrain.mcgill.ca/

McGill University
Explore topics such as emotion, language, and the senses at five levels of organization (from molecular to social) and three levels of explanation (from beginner … advanced)

Empirical Planet Blog / Critique of All Things Neurotypical

http://empiricalplanet.blogspot.com

About Empirical Planet:  (Jason) I’m a neuroscience PhD student and hopeless news junkie. I’m passionate about bringing empirical thinking to the masses. @jipkin

Blog written by a fellow cranky complainer and hopeless believer in converting the masses to a love of reality, which is a pointless endeavor:

(Posted 07/2003) This idea of there being a “primitive” brain crops up all over the place, and from reputable sources: “They suggest that new learning isn’t simply the smarter bits of our brain such as the cortex ‘figuring things out.’ Instead, we should think of learning as interaction between our primitive brain structures and our more advanced cortex. In other words, primitive brain structures might be the engine driving even our most advanced high-level, intelligent learning abilities” (Picower Professor of Neuroscience Earl Miller, MIT, said that).

“It’s like adding scoops to an ice cream cone.  So if you imagine the lizard brain as a single-scoop ice cream cone, the way you make a mouse brain out of a lizard brain isn’t to throw the cone and the first scoop away and start over and make a banana split — rather, it’s to put a second scoop on top of the first scoop.” (Professor David Linden, Johns Hopkins, said that).

Now let me explain why this is all complete BS.

First, semantics.  What is “primitive”?  How do you measure a tissue’s “primitivity”?  In the common usage of the word, primitive means simple, especially in the context of a task, idea, structure, way of life, etc that was employed a long time ago.  Cavemen were more primitive than us, for example.  Unfortunately, this means that “primitive” is a word that refers to things both “ancient” AND “simple”.  Which, as we’ll see, is a big problem when you start applying it to mean only one of those things as occurs with the “primitive brain” meme.

Second, what are people actually talking about when they say “primitive brain”?  This is confused as well, but in general the thought is structured like this: Most primitive – brain stem, pons, cerebellum.  (The hindbrain, basically). Also primitive – the limbic system where “limbic” means “border, edge”.  This includes the hippocampus, amygdala, parts of the thalamus (who knows why), some of the cortex, and some other bits.  It’s supposed to do “emotion” and this is what Daniel Goleman is referring to when he talks about the “primitive brain”.

Really though it’s just a lumping together of all the structures right near the inner border of cortex, because why not lump it all together? – the mighty cortex, you know, is the glorious wrinkly part on the outside.

Third, why do people say that these particular brain structures are “primitive”?  The idea is that evolutionarily, one came after the other.  As in, the first vertebrate just had a brain stem.  Then it evolved more stuff after that.  Then as things kept evolving, more and more stuff got added on.  This is the “ice cream cone” model that David Linden espouses.  It’s also incredibly stupid (or at least misleading).  Let’s break it down.

Evolution did not happen like this: (image: linear chain diagram)

Evolution did happen like this: (image: branching tree diagram)

I hope everyone pretty much understands the concept of a phylogeny (the history of the evolution of a species or group, especially in reference to lines of descent and relationships among broad groups of organisms) and the fact that every vertebrate came from one common ancestor. Yes, the common ancestor was a kind of fish.  No, today’s fish aren’t the same as the common ancestor. They evolved from it just like everyone else, although “rates” of evolutionary phenomena like mutation and drift can vary and that’s beyond the scope of this post.

The point is that the “primitive” brain meme is born in the idea that the brain components shared by fish, lizards, mice, and humans must be evolutionarily ancient and were likely shared in common by the common ancestor.  So, homologous structures across the phylogeny indicate “ancientness”.  And “ancientness” = “primitive”.  (Except it doesn’t, but more on that in a second). And since we all share structures that resemble the brain stem, voilà!  That’s the most primitive part of the brain.  Here’s where things go astray.

First, we don’t just share the brain stem with all animals.

(Image: embryonic vertebrate brain, showing forebrain, midbrain, and hindbrain)

Here’s the real “ice cream cone” of the brain. And when I say “the” brain, I should say “all vertebrate brains”. Every fish, bird (including reptiles), amphibian, and mammal has a brain that starts out looking like the pictures to the right. Each colored bump (or “scoop of ice cream”) represents a region of the brain very early on in development, when the whole nervous system is basically a simple tube.  Each bump goes on to expand to varying sizes into varying kinds of structures and yadda yadda depending on the species.  The point, though, is that all vertebrates have a forebrain, a midbrain, and a hindbrain.  And the hindbrain, by the way, is the “primitive” brain stem.

But clearly, humans, fish, lizards, and mice all evolved from a common ancestor that had all brain regions, not just the hindbrain.

This is why David Linden’s ice cream analogy is so dumb.  He’s implying that first you start with one scoop, the hindbrain, then add on another (the midbrain), and finally one more (the forebrain).

“When mammals like mice came along, the lizard brain didn’t go away. It simply became the brain stem, which is perched on top of the spine,” Linden says. “Then evolution slapped more brain on top of the brain stem.” But that’s not what happened at all.  All the scoops were there to begin with.  Then as adaptation took its course, different scoops got bigger or smaller or just different as you began comparing across the entire phylogeny.  Yes, humans have an enormous cortex and lizards don’t.  And yes, lizards simply evolved a different-looking kind of forebrain.  That’s all.

Second, homology (“likeness”) does NOT imply “ancientness”.  Even if the hindbrain looks pretty similar across the vertebrate phylogeny as it exists today, that doesn’t make it “ancient”.  The hindbrain has been evolving just like the midbrain and the forebrain.  Maybe it’s not been changing as much, but it’s still been changing.

This leads me to why the “primitive” notion is so misleading, and should be avoided:

(1) Calling a part of the brain “primitive” suggests what David Linden articulated: that brain evolution happened like stacking scoops of ice cream.  It implies that our brain stem is no different than that of a lizard, or of a mouse, or of a fish.  Yet despite their vast similarities, they are clearly not the same.  You can’t plug a human forebrain into a lizard hindbrain and expect the thing to work.  The hindbrain of humans HAD to adapt to having an enormous forebrain.  There’s something seductive in the idea that inside all of us is a primal being, a “reptilian brain”.  There isn’t.  It’s a human brain, top to bottom.

(2) Calling brain parts “primitive” because they are shared across phylogeny is often used to justify how amazing our human cortex is.  Look at what makes us, us!  We’re so great!  Well, I guess.  But we are just one little excursion among many that evolution has produced.  The lizard brain is adapted for what lizards need to do.  The fish brain is adapted for what fish need to do.  They don’t have “primitive” brains.  They have highly adapted brains, just like any other vertebrate.

(3) Simply using the word “primitive” makes the casual reader think of cavemen.  It just does.  And that’s even more ridiculous, because ancient Homo sapiens were still Homo sapiens.  Read what this poor misinformed blogger has written:

“So, let me explain the Primitive brain in simple terms. We have an Intellectual (rational) part of the brain and a Primitive (emotional) part of the brain. In the picture above, the Primitive brain is around the Hippocampus and Hypothalamus areas. In some texts, it has also been called the Limbic System. The subconscious Primitive part has been there ever since we were cavemen and cavewomen, and houses our fight/flight/freeze response (in the amygdala in between the Hippocampus and the Hypothalamus). Its function is to ensure our survival.”

AHHHHHHHHHHHHHHH.  You see?  You see??????

(4) There is not just a “primitive, emotional brain” and a “complex, intellectual brain”.  That is so…. wrong.  Factually wrong.  Yet people like Daniel Goleman sell books about emotional intelligence claiming that people need to develop their “emotional brain”.

_______________________________

Asperger individuals are belittled as developmentally disordered because we don’t have the imaginary-mythical social / emotional “normal” human brain. 

_______________________________________________

…and then bloggers like Carrie (above) start internalizing and spreading the misinformation.  Folks.  Let me be clear.  You have ONE brain.  One.  It has many parts, which is to say that humans looking at brains have found ways to subdivide them into various areas and regions and structures and whatnot.  Regardless of all that, the whole damn thing works together.  It’s one big circuit that does emotion, rationality, sensation, movement, the whole shebang.  There isn’t a simplistic “emotion” part and an “intellectual” part.  The cortex is involved in emotions and feelings.  The basal ganglia are involved in cognition.  In fact, the whole idea of splitting emotion and reason into two separate camps is fraught, as emotion turns out to be critical in reasoning tasks like decision-making.

__________________________________________________________________________

Asperger individuals aren’t “able to – allowed to” claim that emotion influences our thinking (nor are we granted any feelings toward other humans) because we’re “missing” the non-existent “social brain” and every idiot knows that not having a “social brain” makes a person “subhuman” or even psychopathic – we are irretrievably “broken”. The real story is that Asperger “emotions”, which technically, and for every animal, are reactions to the environment, are different because our sensory acquisition and perceptual processing are different: we focus on PHYSICAL REALITY. Hypersocial humans focus on each other.

_____________________________________________________________________________ 

(5) The “primitiveness” of lizard brains is vastly overstated.  Things like this get written about the “primitive brain”: “A lizard brain is about survival — it controls heart rate and breathing, and processes information from the eyes and ears and mouth.”

This implies, to the casual reader, that lizards are just sitting around breathing.  Maybe there’s some “survival instinct” in there: look out for that big hawk!  Yeah, okay.  But guess what?  Lizards gotta do other stuff too.  They have to reproduce, find food, navigate their environment, learn, remember, make choices, etc.  They aren’t just breathing and instinct machines.  And because they aren’t, that means their brains aren’t just doing that either.  And why is it always lizards and reptiles?  You’d think fish would get picked on more.

(6) “Primitive” in the context of a brain part primarily means “ancient”.  But the word “primitive”, as we already saw, connotes simplicity.  This leaves laypeople with many misconceptions.  First, that the brain stem, or the “emotional brain”, or whatever, is simple.  Or even that they’re simpler.  Nope.  Not really.  Pretty much every part of the brain is complex.  Second, it reinforces, in the case of the “emotional brain”, that emotions are beneath intellect. (In the U.S. “emotional responses” have been elevated OVER intellect, because no one wants an analytical consumer or voter.)  They came first, they are older, they are simpler, they are the stupid part of your brain.  Again, just no.  You need emotions to survive just as you need your intellect to survive.  Fish need emotions (an emotion, after all, is just a representation of bodily state imbued with a certain positive/negative valence) just like they need their reasoning abilities as well.

(7) People use the word “primitive” (copying scientists) because it can sound cool and surprising.  Look at how Earl Miller framed it, from above:

“They suggest that new learning isn’t simply the smarter bits of our brain such as the cortex ‘figuring things out.’ Instead, we should think of learning as interaction between our primitive brain structures and our more advanced cortex. In other words, primitive brain structures might be the engine driving even our most advanced high-level, intelligent learning abilities”

Look at that result!  A primitive thing did something advanced! 

The forgotten thing is important!  Or maybe – this is going to sound crazy – the whole system evolved together in order to support such essential tasks as learning.  There never was a primitive part or an advanced part, despite two areas or regions being labeled as such.  Every part of the human brain has been evolving for the same amount of time as every other part, and has been evolving to work as best as possible with each of those other parts.

(8) Finally, let’s return to Daniel Goleman, who argues that “emotional intelligence” arises from the “primitive emotional brain”.  Then he waxes on and on about the value of emotional intelligence, particularly as it relates to social abilities.  Ability to understand your own emotions.  Ability to perceive those of others.  Ability to interact well with others on the basis of understanding their emotions.  Et cetera. 

That’s all fine, but saying this comes from an ancient, primitive, emotional brain might make people think that (neurotypicals are primitive and stupid) and that ancient vertebrates really had to know themselves, be able to read others, and interact socially (i.e., that ancient vertebrates were as intelligent as modern humans). But there’s a whole lot of solitary, nonsocial vertebrate species out there, and they have brainstems and limbic systems too.

Hopefully never again will you refer to a part of the brain as “primitive.”  Some structures probably more closely resemble their homologous counterparts in the last common ancestor of vertebrates, but all the basic parts were there from the beginning.  And remember, evolution didn’t happen (only) to make humans. (And specifically, EuroAmerican white males.)  We aren’t more advanced in an evolutionary sense than fish, lizards, or mice.  Each species is just adapted to the roles it finds itself in, and continues to adapt.  Our sense of being “advanced” comes purely from our own self-regard and anthropocentric tendencies. The human brain is not the best brain, nor is it the most advanced brain, because there’s no scale on which to measure how good a brain is.

Actually, the process of evolution appears to settle for “good enough” as the standard for successful adaptation!

Mental Development / Genetics of Visual Attention

Twin study finds genetics affects where children look, shaping mental development

https://www.sciencedaily.com/releases/2017/11/171109131152.htm

November 9, 2017 / Indiana University

A study that tracked the eye movement of twins has found that genetics plays a strong role in how people attend to their environment.

Conducted in collaboration with researchers from the Karolinska Institute in Sweden, the study offers a new angle on the emergence of differences between individuals and the integration of genetic and environmental factors in social, emotional and cognitive development. This is significant because visual exploration is also one of the first ways infants interact with the environment, before they can reach or crawl.

“The majority of work on eye movement has asked ‘What are the common features that drive our attention?'” said Daniel P. Kennedy, an assistant professor in the IU Bloomington College of Arts and Sciences’ Department of Psychological and Brain Sciences. “This study is different. We wanted to understand differences among individuals and whether they are influenced by genetics.”

Kennedy and co-author Brian M. D’Onofrio, a professor in the department, study neurodevelopmental problems from different perspectives. This work brings together their contrasting experimental methods: Kennedy’s use of eye tracking for individual behavioral assessment and D’Onofrio’s use of genetically informed designs, which draw on data from large population samples to trace the genetic and environmental contributions to various traits. As such, it is one of the largest-ever eye-tracking studies.

In this particular experiment, the researchers compared the eye movements of 466 children — 233 pairs of twins (119 identical and 114 fraternal) — between ages 9 and 14 as each child looked at 80 snapshots of scenes people might encounter in daily life, half of which included people. Using an eye tracker, the researchers then measured the sequence of eye movements in both space and time as each child looked at the scene. They also examined general “tendencies of exploration”; for example, if a child looked at only one or two features of a scene or at many different ones.

Published Nov. 9 in the journal Current Biology, the study found a strong similarity in gaze patterns within sets of identical twins, who tended to look at the same features of a scene in the same order. It found a weaker but still pronounced similarity between fraternal twins.

This suggests a strong genetic component to the way individuals visually explore their environments: Insofar as both identical and fraternal twins each share a common environment with their twin, the researchers can infer that the more robust similarity in the eye movements of identical twins is likely due to their shared genetic makeup. The researchers also found that they could reliably identify a twin with their sibling from among a pool of unrelated individuals based on their shared gaze patterns — a novel method they termed “gaze fingerprinting.”
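
Two pieces of this study can be made concrete with toy numbers. The classic twin-study move is Falconer’s estimate, h² = 2(r_MZ − r_DZ), and “gaze fingerprinting” is, at heart, nearest-neighbor matching of gaze-feature vectors. A sketch with invented values (the paper’s actual statistics are more sophisticated):

```python
# Toy numbers for two ideas in the study. All values are invented.
import numpy as np

# (1) Falconer's estimate: heritability from twin correlations.
r_mz, r_dz = 0.60, 0.35          # hypothetical gaze-similarity correlations
h2 = 2 * (r_mz - r_dz)           # h^2 = 2(r_MZ - r_DZ)
print(f"heritability estimate: h^2 = {h2:.2f}")

# (2) "Gaze fingerprinting" as nearest-neighbor matching.
rng = np.random.default_rng(1)
pool = rng.normal(size=(50, 16))            # 50 people x 16 gaze features
probe = pool[7] + rng.normal(0, 0.3, 16)    # noisy sample of person 7's gaze

best = np.argmin(np.linalg.norm(pool - probe, axis=1))
print(f"best match in pool: person {best}")  # expect person 7
```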

“People recognize that gaze is important,” Kennedy said. “Our eyes are moving constantly, roughly three times per second. We are always seeking out information and actively engaged with our environment, and ultimately where you look affects your development.”

After early childhood, the study suggests that genes influence at the micro-level — through the immediate, moment-to-moment selection of visual information — the environments individuals create for themselves.

“This is not a subtle statistical finding,” Kennedy said. “How people look at images is diagnostic of their genetics. Eye movements allow individuals to obtain specific information from a space that is vast and largely unconstrained. It’s through this selection process that we end up shaping our visual experiences.

“Less known are the biological underpinnings of this process,” he added. “From this work, we now know that our biology affects how we seek out visual information from complex scenes. It gives us a new instance of how biology and environment are integrated in our development.”

“This finding is quite novel in the field,” D’Onofrio said. “It is going to surprise people in a number of fields, who do not typically think about the role of genetic factors in regulating such processes as where people look.”

_____________________________________________________

Comment: 

(Note: Many individuals can learn the “scientific method” – techniques, procedures and the use of math – without having an “understanding” of “physical reality”. This is a problem in American “science” today.)

Why is the Asperger “attentional preference” for “physical reality” labeled a developmental defect? Because modern social humans BELIEVE that only the social environment EXISTS!

This “narrow” field of attention in modern social humans is the result of domestication / neoteny. The “magical thinking” stage of childhood development is carried into adulthood. This “arrested development” retains the narcissistic infantile perception of reality.  

A genetic basis for this “perceptual” knowledge of reality would support the Asperger “Wrong Planet” sense of alienation from neurotypical social environments. Our “real world” orientation is not a “defect” – our perception is that of an adult Homo sapiens. The hypersocial “magical” perception of the environment is that of the self-centered infant, whose very survival depends on the manipulation of “big mysterious beings” (parents – puppeteers) who make up the infant’s ENTIRE UNIVERSE.  

The Neurotypical Universe

Journal Reference:

  1. Daniel P. Kennedy, Brian M. D’Onofrio, Patrick D. Quinn, Sven Bölte, Paul Lichtenstein, Terje Falck-Ytter. Genetic Influence on Eye Movements to Complex Scenes at Short Timescales. Current Biology, 2017 DOI: 10.1016/j.cub.2017.10.007

Geologists discover 5.7 myo “human-like” footprints / CRETE

ORIGINAL PAPER: http://www.sciencedirect.com/science/article/pii/S001678781730113X

Fossil footprints challenge established theories of human evolution

August 31, 2017 / Uppsala University

Summary: Newly discovered human-like footprints from Crete may put the established narrative of early human evolution to the test. The footprints are approximately 5.7 million years old and were made at a time when previous research puts our ancestors in Africa — with ape-like feet.

Ever since the discovery of fossils of Australopithecus in South and East Africa during the middle years of the 20th century, the origin of the human lineage has been thought to lie in Africa. More recent fossil discoveries in the same region, including the iconic 3.7 million year old Laetoli footprints from Tanzania which show human-like feet and upright locomotion, have cemented the idea that hominins (early members of the human lineage) not only originated in Africa but remained isolated there for several million years before dispersing to Europe and Asia. The discovery of approximately 5.7 million year old human-like footprints from Crete, published online this week by an international team of researchers, overthrows this simple picture and suggests a more complex reality.

Human feet have a very distinctive shape, different from all other land animals. The combination of a long sole, five short forward-pointing toes without claws, and a hallux (“big toe”) that is larger than the other toes, is unique. The feet of our closest relatives, the great apes, look more like a human hand with a thumb-like hallux that sticks out to the side. The Laetoli footprints, thought to have been made by Australopithecus, are quite similar to those of modern humans except that the heel is narrower and the sole lacks a proper arch. By contrast, the 4.4 million year old Ardipithecus ramidus from Ethiopia, the oldest hominin known from reasonably complete fossils, has an ape-like foot. The researchers who described Ardipithecus argued that it is a direct ancestor of later hominins, implying that a human-like foot had not yet evolved at that time.

The new footprints, from Trachilos in western Crete, have an unmistakably human-like form. This is especially true of the toes. The big toe is similar to our own in shape, size and position; it is also associated with a distinct ‘ball’ on the sole, which is never present in apes. The sole of the foot is proportionately shorter than in the Laetoli prints, but it has the same general form. In short, the shape of the Trachilos prints indicates unambiguously that they belong to an early hominin, somewhat more primitive than the Laetoli trackmaker. They were made on a sandy seashore, possibly a small river delta, whereas the Laetoli tracks were made in volcanic ash.

‘What makes this controversial is the age and location of the prints,’ says Professor Per Ahlberg at Uppsala University, last author of the study.

At approximately 5.7 million years, they are younger than the oldest known fossil hominin, Sahelanthropus from Chad, and contemporary with Orrorin from Kenya, but more than a million years older than Ardipithecus ramidus with its ape-like feet. This conflicts with the hypothesis that Ardipithecus is a direct ancestor of later hominins. Furthermore, until this year, all fossil hominins older than 1.8 million years (the age of early Homo fossils from Georgia) came from Africa, leading most researchers to conclude that this was where the group evolved. However, the Trachilos footprints are securely dated using a combination of foraminifera (marine microfossils) from over- and underlying beds, plus the fact that they lie just below a very distinctive sedimentary rock formed when the Mediterranean sea briefly dried out, 5.6 million years ago. By curious coincidence, earlier this year, another group of researchers reinterpreted the fragmentary 7.2-million-year-old primate Graecopithecus from Greece and Bulgaria as a hominin. Graecopithecus is only known from teeth and jaws.

During the time when the Trachilos footprints were made, a period known as the late Miocene, the Sahara Desert did not exist; savannah-like environments extended from North Africa up around the eastern Mediterranean. Furthermore, Crete had not yet detached from the Greek mainland. It is thus not difficult to see how early hominins could have ranged across south-east Europe as well as Africa, and left their footprints on a Mediterranean shore that would one day form part of the island of Crete.

‘This discovery challenges the established narrative of early human evolution head-on and is likely to generate a lot of debate. Whether the human origins research community will accept fossil footprints as conclusive evidence of the presence of hominins in the Miocene of Crete remains to be seen,’ says Per Ahlberg.

Energy use by Eem Neanderthals / Bent Sørensen, Roskilde University

doi:10.1016/j.jas.2009.06.003 Journal of Archaeological Science

Energy use by Eem Neanderthals

http://energy.ruc.dk/Energy%20use%20by%20Eem%20Neanderthals.pdf

Bent Sørensen, Department of Environmental, Social and Spatial Change, Roskilde University, DK 4000 Roskilde, Denmark.

_______________________________________________________________________________

Abstract

An analysis of energy use by Neanderthals in Northern Europe during the mild Eem interglacial period is carried out with consideration of the metabolic energy production required for compensating energy losses during sleep, at daily settlement activities and during hunting expeditions, including transport of food from slain animals back to the settlement. Additional energy sources for heat, security and cooking are derived from fireplaces in the open or within shelters such as caves or huts. The analysis leads to insights not available from archaeological findings that are mostly limited to durable items such as those made of stone: Even during the benign Eem period, Neanderthals faced a considerable heat loss problem. Wearing tailored clothes or some similar measure was necessary for survival. An animal skin across the shoulder would not have sufficed to survive even average cold winter temperatures and body cooling by convection caused by wind. Clothes and particularly footwear had to be sewn together tightly in order to prevent intrusion of water or snow. The analysis of hunting activity evolvement in real time further shows that during summer warmth, transport of meat back to the base settlement would not be possible without some technique to keep the meat from rotting. The only likely technique is meat drying at the killing site, which indicates further skills in Neanderthal societies that have not been identified by other routes of investigation.

_______________________________________________________________________________

1. Introduction and background

The Neanderthals had an average body mass above that of modern humans, a more sturdy bone structure and a lower height. Food was primarily obtained by hunting big game. The aim of the present paper is to explore the energy requirements of Neanderthals during the warm interglacial Eem period (around 125 ky BP), and based on such analysis to discuss the need for clothes and footwear, as well as methods for food conservation and preparation. The climatic environment is taken as that of Northern Europe, using Eem temperature data from Bispingen close to the Neanderthal site Lehringen near Hamburg (Kühl and Litt, 2007). The climatic conditions would be similar in most of Northern Germany, Belgium and Denmark, while Eastern Europe would have slightly colder winters, as would Finland. Some 30 European Neanderthal sites dating from the Eem, with roughly equal shares in Southern, Middle and Northern Europe, are listed by Wenzel (2007). Traces of seasonal presence have been found at Hollerup, Denmark (Møhl-Hansen, 1954) and at Susiluola Cave, Finland (Schulz, 2001, 2006), indicating that Neanderthals had a high mobility. Assuming a group
size of 25 (Hassan, 1981), the total European Neanderthal population at any given time within the Eem period could have been around 1000, depending on how many sites were simultaneously populated and what fraction of the true settlement count the surviving and discovered sites represent. Patou-Mathis (2000) lists 73 Eem settlement levels in Middle and Northern Europe, indicating that some sites were re-occupied at different times within the Eem period. Throughout their presence in Europe, the diet of Neanderthals consisted mainly of meat. Large herbivores seem to have been actively hunted rather than scavenged (Bocherens et al., 2005). The Neanderthal hunting strategy was to specialise and concentrate on a few large herbivore mammal species, among which horse (equus sp.), red deer (cervus elaphus), woolly rhinoceros (coelodonta antiquitatis), woolly mammoth (mammuthus primigenius) and bison (bison priscus) were present both during the Eem interglacial and the adjacent colder periods. During the Eem, additional forest-based species appeared, and the volume of species preferring open space, such as mammoth, was lower than in adjacent periods: mammoth is found in 22.5% of the Eem-occupied site levels considered by Patou-Mathis (2000), but in 50-60% of the levels belonging to adjacent time periods. Mammoth is in this study selected as an example for investigating the energy use involved in Neanderthal hunting, slaying and readying meat for eating, because its size demands a maximum of logistic skills by the hunters. However, proof of mammoth presence in North-Western Europe during the Eem period is nearly absent. Mammoth remains have been found in Grotte Scladina, Belgium (in level 4, dated by thermo-luminescence to between 106 and 133 ky BP, and in level 5, dated to 110-150 ky BP), but according to magnetic susceptibility relative dating in the lowest part of the uncertainty intervals and thus possibly younger than the marine isotope stage 5e usually associated with the Eem period (Döppes et al. 2008; Ellwood et al., 2004). The scarcity of mammoth finds at the settlement sites may be explained by only meat, not bones, being carried back to camp after a kill (Patou-Mathis, 2000; Balter and Simon, 2006). A straight-tusked elephant, a species preferring the warmer Eem environment, has been found at Lehringen with a Neanderthal spear through its rib, so it could also have been used as an example of extreme-weight prey. In the assessment made in the present study, the species exemplified is represented by its meat-mass alone, and results (such as energy-use by carrying) scale directly with the mass of the parts transported back to the group. Large herbivores were hunted and killed by sinking spears into their body by a party of several Neanderthals (Wenzel, 2007). Spears were rarely thrown, as deduced from the absence of shoulder and upper arm bone asymmetries characteristic of more recent hunters using spear-throwing techniques (Rhodes and Churchill, 2009). The Neanderthal population density was low enough to make big-game hunting sustainable, and famines due to insufficient food would not normally occur (Harpending, 1998).
One similar analysis of needs for clothes and footwear has been made for the marine isotope stage 3 around 40 ky BP (Aiello and Wheeler, 2004), however with some unrealistic features: It uses subjective wind-chill temperatures (Lee, 2001) in the heat-loss expression valid for real temperatures (to which heat loss by the action of wind could have been added in the way it is done below), and it uses an arbitrary increase of the average metabolic rate to three times the basic one. This suggests an ability of the Neanderthal body to regulate its metabolism according to ambient temperature (Steegmann et al., 2002), in contrast to the body of modern humans, where this can be done only in a very minor way in infants, through adrenaline stimulation of a special fat deposit in brown adipose tissues (BAT) near the shoulder (Farmer, 2009). Steegmann et al. (2002) suggest that recent Tierra del Fuego Ona (Selk’nam) aborigines had a particular cold adaptation and that the Neanderthals might also have had it. However, no search for BAT in Ona remains has to my knowledge been made, and the photo reproduced in Steegmann et al. (2002) shows people posing for a 1901 picture, dressed in heavy fur and similar hats, but with bare feet or moccasins. To make inferences, one should have time distributions of clothing used and corresponding temperatures; and to compare with Neanderthals, the differences between the Ona, deriving their food from the camel-like guanaco, from gathering and from fishing, e.g. of seal, and the Neanderthals, providing food by big-game hunts with walking or running over extended distances and durations, should be kept in mind. The Neanderthals would during cold periods be better compared with present or recent Inuit populations, which use several layers of clothing and heavy, furred footwear. Should the Neanderthals really have had a genetic advantage in cold adaptation that modern humans do not have, it becomes more difficult to understand why modern humans and not Neanderthals survived the cooling period up to the last glacial maximum, 40-20 ky BP. Without ad hoc assumptions on the genetic make-up of Neanderthals, one must assume that in order to increase metabolism, muscle work is required, so that a sleeping and freezing Neanderthal person must get up and swing the arms, jump up and down or otherwise perform the muscle work that will bring the level of metabolism up and create the associated heat that can keep the body warm. Generating a total of 300 W of heat (including the basic metabolic heat production during sleep of 80-90 W) requires about 100 W of muscle work (Sørensen, 2004, p. 17). The analysis of energy production and use presented below is divided into two parts: first the energy balance is evaluated during sleep, and then the various components of energy production and use during activities taking up the wake time of Eem Neanderthals.
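
The core of the paper’s method is a steady-state energy balance: compare metabolic heat production against losses to the environment. A minimal sketch of that kind of calculation, with purely illustrative parameter values (not Sørensen’s):

```python
# Steady-state heat balance of the kind the paper performs.
# All parameter values are illustrative, not Sørensen's.

def convective_loss(t_skin=33.0, t_air=0.0, area_m2=1.9,
                    h=10.0, insulation=0.0):
    """Heat loss in watts. h is the convection coefficient (W/m^2/K,
    rising strongly with wind); insulation is the fraction of loss
    blocked by clothing (0 = naked)."""
    return h * area_m2 * (t_skin - t_air) * (1.0 - insulation)

BASAL_SLEEP_W = 85.0   # basal metabolic heat during sleep (80-90 W)

for insulation in (0.0, 0.5, 0.8):
    loss = convective_loss(insulation=insulation)
    gap = loss - BASAL_SLEEP_W
    word = "deficit" if gap > 0 else "surplus"
    print(f"insulation {insulation:.0%}: loss {loss:4.0f} W -> {word} {abs(gap):.0f} W")
```

Even at a mild 0 °C, an unclothed sleeper on these toy numbers faces a deficit of several hundred watts, which is the quantitative root of the paper’s conclusion that tailored clothing was necessary for survival.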

much more…

________________________________________

Even with the “fat” gene, heavy clothing is essential.

And from genetic studies:

Arctic Inuit, Native American cold adaptations may originate from extinct hominids

https://www.sciencedaily.com/releases/2016/12/161220175552.htm Full Article

December 20, 2016, Molecular Biology and Evolution (Oxford University Press)

Summary:
In the Arctic, the Inuits have adapted to severe cold and a predominantly seafood diet. Now, a team of scientists has followed up on the first natural selection study in Inuits to trace back the origins of these adaptations. The results provide convincing evidence that the Inuit variant of the TBX15/WARS2 region first came into modern humans from an archaic hominid population, likely related to the Denisovans.