One of THOSE Discussions / God, Free Will and Absurdities

This post has gained momentum from having one of those “late night” discussions with a friend – the type that is popular when one is in college, a bit drunk (or otherwise deranged) and which, as one gets older and wiser, one vows to never again participate in. The gist of the argument was:

Determinism (God) is totally compatible with Free Will (The Declaration of Independence), so we have both.

I could stop right here, because this “set up” is thoroughly American “wacky” thinking. It demonstrates the absolute belief that “America” is a special case = exemption from reality, that was/is made possible by American Democracy (in case you weren’t aware, democracy is not a political creation of human origin) which came about by an Act of God. “Freedom” is a basic American goal: Free Will is therefore a mandatory human endowment (by virtue of the word Free appearing in both “concepts”). God created everything, so he must have created Free Will. Jesus is a kind of “sponge” that suffices to “soak up” all those bad choices Free Will allows, that is, if you turn over all your choices, decisions and Free Will to Jesus.

The irony is that this absurd, pointless discussion “cleared the air” over previously unspoken conflict with a dear friend, like blowing up the Berlin Wall: it got the conflict out of the way and established that friendship is not “rational” at all, but an agreement about what really matters: good intentions carried into actions, loyalty, and a simple “rightness” – agreement on what constitutes “good behavior” on the part of human beings and a pledge of one’s best effort to stick to that behavior.

This entire HUGE neurotypical debate is nonsense.

God has nothing to do with Free Will, the Laws of physics, or any scientific pursuit of explanations for “the universe”. The whole reason for God’s existence is that He, or She, or They are totally outside the restrictions of “physical reality”. That’s what SUPERNATURAL means. So all the “word concept” machinations over “God” and “science” – from both ends of the false dichotomy – are absurd. Free Will is also a non-starter “concept” in science: reality proceeds from a complex system of “facts” and mathematical relationships that cannot be “free-willed” away.

Total nonsense.

If one believes in the “supernatural” origin of the universe as a creation of supernatural “beings, forces and miraculous acts” then one does not believe in physical reality at all: “Physics” is a nonexistent explanation for existence. One can only try to coerce, manipulate, plead with, and influence the “beings” that DETERMINE human fate. Free Will is de facto an absurdity, conceived of as something like the Amendments to the U.S. Constitution (inspired by God, after all – not really by the intelligence of the people who wrote it). In American thought, (political) rights grant permission to “do whatever I want”. The concept of responsibility connected to rights has been conveniently forgotten. Free Will, in this context, is nothing more than intellectual, moral and ethical “cheating”.

So, the immense, complicated, false dichotomy of Determinism vs. Free Will, and the absurd 2,000+ year old philosophical waste of time that has followed, and continues, is very simple (at least) in the U.S. 

Whatever I do, is God’s Will: Whatever you do, isn’t. 



Overlap in Prey / Neanderthal, Hyena

Comparison of Neanderthal and Hyena as “top predators”.

Isotopic evidence for diet and subsistence pattern of the Saint-Césaire I Neanderthal: review and use of a multi-source mixing model.

Author information

  • 1Institut des Sciences de l’Evolution, UMR 5554, Université Montpellier 2, Place E. Bataillon, F-34095 Montpellier cedex 05, France. bocheren@isem.univ-montp2.fr

Abstract

The carbon and nitrogen isotopic abundances of the collagen extracted from the Saint-Césaire I Neanderthal have been used to infer the dietary behaviour of this specimen. A review of previously published Neanderthal collagen isotopic signatures with the addition of 3 new collagen isotopic signatures from specimens from Les Pradelles allows us to compare the dietary habits of 5 Neanderthal specimens from OIS 3 and one specimen from OIS 5c.

This comparison points to a trophic position as top predator in an open environment, with little variation through time and space. In addition, a comparison of the Saint-Césaire I Neanderthal with contemporaneous hyaenas has been performed using a multi-source mixing model, modified from Phillips and Gregg (2003, Oecologia 127, 171). It appears that the isotopic differences between the Neanderthal specimen and hyaenas can be accounted for by much lower amounts of reindeer and much higher amounts of woolly rhinoceros and woolly mammoth in the dietary input of the Neanderthal specimen than in that of hyaenas, with relatively similar contributions of bovinae, large deer and horse for both predators, a conclusion consistent with the zooarchaeological data. The high proportion of very large herbivores, such as woolly rhinoceros and woolly mammoth, in the Neanderthal’s diet compared with that of the scavenging hyaenas suggests that Neanderthals could not acquire these prey through scavenging. They probably had to hunt for proboscideans and rhinoceros. Such a prey selection could result from a long-lasting dietary tradition in Europe.

PMID: 15869783
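The “multi-source mixing model” named in the abstract can be illustrated with a toy calculation. This is only a sketch of the general idea, with invented isotope values, not the study’s actual data or the Phillips and Gregg implementation: each prey source contributes its isotopic signature in proportion to its share of the diet, and the proportions must sum to 1.

```python
import numpy as np

# Hypothetical collagen signatures (δ13C, δ15N, in ‰) for three prey sources.
# These numbers are invented for illustration; real studies use measured
# values corrected for trophic fractionation.
sources = {
    "reindeer": (-19.0, 3.0),
    "mammoth":  (-21.5, 8.0),
    "horse":    (-20.5, 5.0),
}

predator = (-20.7, 6.1)  # hypothetical Neanderthal collagen signature

# Build the linear system: each source contributes proportionally to the
# predator's signature, and the proportions must sum to 1.
names = list(sources)
A = np.array([[sources[n][0] for n in names],   # δ13C row
              [sources[n][1] for n in names],   # δ15N row
              [1.0] * len(names)])              # proportions sum to 1
b = np.array([predator[0], predator[1], 1.0])

proportions, *_ = np.linalg.lstsq(A, b, rcond=None)
for name, p in zip(names, proportions):
    print(f"{name}: {p:.2f}")
```

With three sources, two isotope ratios, and the sum-to-one constraint, the system is exactly determined; when there are more sources than equations, the Phillips and Gregg approach instead enumerates the range of feasible source combinations.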

_________________________________________________________________________________________

(Below: Not the Saint-Césaire I specimen) “Mystery” Neanderthal species allows artists to speculate on the “reality” of multiple human types. There is no satisfactory evidence of blue eyes in Neanderthals.


Neanderthal female reconstruction, Viktor Deak

Reconstruction of the La Chapelle-aux-Saints Neanderthal, by Fabio Fogliazza

Emergence of “humans” / Berkeley.edu + Comments


Simplified socio-cultural guide to identifying male / female.

 

The evolution of Primates – Gender dimorphism /

Top: Orangutan male and female. Middle: Modern social human; all “cases” of allowable bathroom use. Bottom: Idiot’s guide to gender ID; U.S.

 

Low sexual dimorphism in modern social humans? Really? Sexual dimorphism is created culturally in humans, and wow! Gender assignment is all mixed up! In fact, one might observe that body alteration, decoration, behavior and costume are how Homo sapiens compensates for being a strange hairless ape, born without the elaborate fur, plumage, texture, color and behavioral displays of other species. We “copy” other animals and utilize materials in the environment to socially broadcast our sex and gender – from the violent hyper-male to the “big boob” sex object that is the “ideal” American woman. Some cultures disguise or blur a person’s sex / gender. Neoteny promotes childlike appearance in males and females – the current trend is toward androgyny.

Any questions about this guy’s gender? 


Old school “gun”


Below: Modern neotenic “feminized” male – androgyny is the popular goal.


__________________________________________________________________________________________

How bizarre can the “story” of human evolution get?

The following chapter “The Emergence of Humans” is from Berkeley.edu, a site about evolution for students. I confess that to my Asperger type of thinking, this review of evolutionary studies is excruciating. One (dumb) point of view is especially mind-boggling: that chimpanzees are a legitimate focus of “study and research” into ancestral humans and modern human behavior, merely because “they are alive” and eligible for torture in labs; they don’t have “souls” or “suffer.” And they appeal to neotenic social humans by scoring high on the “cute” scale.

The apparent inability of researchers to get past this 19th C. world view is stunning; instead of a thorough examination of assumptions across disciplines, we again see “warfare” between disciplines, and the ongoing attempt to assemble a human “dinosaur” from bits and pieces of fossilized thinking. In fact, paleontology has exploded with new ideas since “old” dinosaur reconstructions were discovered to be highly inaccurate. Hint, hint.

FOUND! The last common ancestor of Humans and Chimps.


Berkeley.edu / The emergence of humans

The narratives of human evolution are oft-told and highly contentious. There are major disagreements in the field about whether human evolution is more like a branching tree or a crooked stick, depending partly on how many species one recognizes. Interpretations of almost every new find will be sure to find opposition among other experts. Disputes often center on diet and habitat, and whether a given animal could occasionally walk bipedally or was fully upright. What can we really tell about human evolution from our current understanding of the phylogenetic relations of hominids and the sequence of evolution of their traits?

Hominid evogram

(consistency problem)

To begin with, let’s take a step back. Although the evolution of hominid features is sometimes put in the framework of “apes vs. humans,” the fact is that humans are apes, just as they are primates and mammals. A glance at the evogram shows why. The other apes — chimp, bonobo, gorilla, orangutan, gibbon — would not form a natural, monophyletic group (i.e., a group that includes all the descendants of a common ancestor) if humans were excluded. Humans share many traits with other apes, and those other “apes” (i.e., non-human apes) don’t have unique features that set them apart from humans. Humans have some features that are uniquely our own, but so do gorillas, chimps, and the rest. Hominid evolution should not be read as a march to human-ness (even if it often appears that way from narratives of human evolution). Students should be aware that there is not a dichotomy between humans and apes. Humans are a kind of ape.

Virtually all systematists and taxonomists agree that we should only give names to monophyletic groups. However, this evogram shows that this guideline is not always followed. For an example, consider Australopithecus. On the evogram you can see a series of forms, from just after Ardipithecus to just before Homo in the branching order, that are all called Australopithecus. (Even Paranthropus is often considered an australopithecine.) But as these taxa appear on the evogram, “Australopithecus” is not a natural group, because it is not monophyletic: some forms, such as A. africanus, are found to be closer to humans than A. afarensis and others. Beyond afarensis, for example, all other Australopithecus and Homo share “enlarged cheek teeth and jaws,” because they have a more recent common ancestor. Eventually, several of these forms will have to have new genus names if we want to name only monophyletic groups. Students should avoid thinking of “australopithecines” as a natural group with uniquely evolved traits that link its members together and set it apart from Homo. Instead they should focus on the pattern of shared traits among these species and the Homo clade, recognizing that each species in this lineage gains more and more features that are shared by Homo.

In popular fiction and movies, the concept of the wild “ape-man” is often that of a tree-living, vine-swinging throwback like Tarzan. However, the pantheon of hominids is much richer than this, as the evogram shows, with forms as different as Paranthropus and Ardipithecus. For example, imagine going back in time to the common ancestor of humans and chimps (including bonobos). What did that common ancestor look like? In the Origin of Species Darwin noted that the extinct common ancestor of two living forms should not be expected to look like a perfect intermediate between them. Rather, it could look more like one branch or the other branch, or something else entirely.

Found! The last common ancestor of humans and chimps.

Did the common ancestor of humans and chimps conform to the ape-man myth and live in the trees, swinging from vines? To answer this, we have to focus not only on anatomy but on behavior, and we have to do it in a phylogenetic context. Apes such as the gibbon and orangutan, which are more distantly related to humans, are largely arboreal (i.e., tree-living). The more closely related apes such as the gorilla and chimps are relatively terrestrial, although they can still climb trees. The feet of the first hominids have a considerable opposition of the big toe to the others but relatively flat feet, as arboreal apes generally do. But other features of their skeleton, such as the position of the foramen magnum underneath the skull, the vertically shortened and laterally flaring hips, and the larger head of the femur, suggest that they were not just mainly terrestrial but habitually bipedal, unlike their knuckle-walking relatives. Most evidence suggests that the hominid lineage retained some of the anatomical features related to arboreal life and quadrupedal gait even after it had evolved a more terrestrial lifestyle and a bipedal gait. There is no fossil record of these behaviors, but the balance of the available evidence supports the hypothesis that the hominid ancestor was terrestrial and bipedal.

Much discussion in human paleontology surrounds the evolution of a bipedal, upright stance. When and why did this occur? One thing to keep in mind is that “bipedal” and “upright” are not equivalent terms. An animal can be bipedal without having a vertical backbone (think T. rex). It seems clear from the fossil record of hominids that habitual bipedality preceded the evolution of a recurved spine and upright stance. Other changes in the gait, such as how the relatively “splayed” gait of chimps evolved into the gait of humans, who put one foot directly in front of the other, involve studying the hip joint, the femur, and the foot. The famous Laetoli footprints attributed to Australopithecus afarensis are bipedal, but they are still relatively splayed compared to the tracks of living humans. (WOW! they are doing it again despite their own caution: humans did not evolve from chimpanzees!)

Another extremely interesting feature in hominid evolution is the degree of sexual dimorphism (i.e., physical differences between the sexes) in different species. Sexual dimorphism is linked to features of sociality and mate competition in many sorts of animals. To understand the evolution of this feature in humans, which have relatively low sexual dimorphism, we need to consider the other apes, in which sexual dimorphism tends to be moderate to high (with exceptions). 

(Again, culture is utterly ignored: the fact is, women and men “self-morph” according to socio-cultural “genders” into very dimorphic animals)

We don’t have sufficient evidence about Sahelanthropus, Orrorin, and Ardipithecus to understand much about sex differences in these species, but we do know that A. afarensis had relatively high sexual dimorphism: the males were considerably larger than the females. The difference seems to have been less in A. africanus, Paranthropus, and most of the Homo lineage. The evolutionary explanation for A. afarensis’ dimorphism is not entirely clear. The larger males may have used their size to attract females and/or repel rivals, which would fit with an explanation based on sexual selection. Or the males and females may have been differently sized because they played different roles in their groups, the males hunting and gathering and the females caring for the young. Darwin thought that this differentiation of the sexes may have played a critical role in human evolution, but we simply do not know much about the role of this feature in A. afarensis. Some, all, or none of these functions may have been in play. (Novel-writing again! If we don’t have facts about a subject, why not say so? Speculation becomes dogma in the “magic word syndrome” social mind and people argue over imaginary histories and qualities. Also – I suspect that once again the writers have “EuroAmerican” humans in mind regarding sexual dimorphism: why?)

We do know that by the time the animals known as Homo evolved, they could make tools, and their hands were well suited for complex manipulations. These features were eventually accompanied by the reduction of the lower face, particularly the jaws and teeth, the recession of the brow, the enlargement of the brain, the evolution of a more erect posture, and the evolution of a limb more adapted for extended walking and running (along with the loss of arboreally oriented features). The evogram shows the hypothesized order of acquisition of these traits. Yet each of the Homo species was unique in its own way, so human evolution should not be seen as a simple linear progression of improvement toward our own present-day form. (But, we show it that way, anyway!)

More…. Should you need a mind-boggling experience:

https://en.wikibooks.org/wiki/Survey_of_Communication_Study/Chapter_13_-_Gender_Communication

And to clarify all this: 

Beard Guys / Two Best

Shut up and fight / Thank the gods; Vikings is back 11/29

Shut up and bake / A guy who looks scrumptious in a beard and can make a perfect pie crust? Sign me up!

Neanderthal mtDNA from before 220,000 y.o. Early Modern Human

Fact or Baloney…read on…

Neandertals and modern humans started mating early

For almost a century, Neandertals were considered the ancestors of modern humans. But in a new plot twist in the unfolding mystery of how Neandertals were related to modern humans, it now seems that members of our lineage were among the ancestors of Neandertals. Researchers sequenced ancient DNA from the mitochondria—tiny energy factories inside cells—from a Neandertal who lived about 100,000 years ago in southwest Germany. They found that this DNA, which is inherited only from the mother, resembled that of early modern humans.

After comparing the mitochondrial DNA (mtDNA) with that of other archaic and modern humans, the researchers reached a startling conclusion: A female member of the lineage that gave rise to Homo sapiens in Africa mated with a Neandertal male more than 220,000 years ago—much earlier than other known encounters between the two groups. Her children spread her genetic legacy through the Neandertal lineage, and in time her African mtDNA completely replaced the ancestral Neandertal mtDNA.

Other researchers are enthusiastic about the hypothesis, described in Nature Communications this week, but caution that it will take more than one genome to prove. “It’s a nice story that solves a cool mystery—how did Neandertals end up with mtDNA more like that of modern humans,” says population geneticist Ilan Gronau of the Interdisciplinary Center Herzliya in Israel. But “they have not nailed it yet.”

The study adds to a catalog of ancient genomes, including mtDNA as well as the much larger nuclear genomes, from more than a dozen Neandertals. Most of these lived at the end of the species’ time on Earth, about 40,000 to 50,000 years ago. Researchers also have analyzed the complete nuclear and mtDNA genomes of another archaic group from Siberia, called the Denisovans. The nuclear DNA suggested that Neandertals and Denisovans were each other’s closest kin and that their lineage split from ours more than 600,000 years ago. But the Neandertal mtDNA from these samples posed a mystery: It was not like the Denisovans’ and was closely related to that of modern humans – a pattern at odds with the ancient, 600,000-year divergence date. Last year Svante Pääbo’s team at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, offered a startling solution: Perhaps the “Neandertal” mtDNA actually came from modern humans.

______________________________

Strange! Everything I’ve read previously has said the Neanderthal mtDNA was not at all similar to any H. sapiens mtDNA haplogroups. 

______________________________

In the new study, paleogeneticists Johannes Krause and Cosimo Posth of the Max Planck Institute for the Science of Human History in Jena, Germany, test this wild idea with ancient mtDNA from a Neandertal thighbone found in 1937 in the Hohlenstein-Stadel cave (HST) in Germany. Isotopes in animal bones found with the Neandertal suggest that it lived in a woodland known to have vanished at least 100,000 years ago.

Researchers compared the coding region of the HST Neandertal’s mtDNA with that of 17 other Neandertals, three Denisovans, and 54 modern humans. The HST Neandertal’s mtDNA was significantly different even from that of proto-Neandertals that date to 430,000 years ago at Sima de los Huesos in Spain, suggesting that their mtDNA had been completely replaced. But the HST sample was also surprisingly distinct from that of other Neandertals, allowing researchers to build a phylogenetic tree and study how Neandertal mtDNA evolved over time.

Using modern humans’ mtDNA mutation rate to calculate the timing, the researchers conclude that the HST mtDNA split from that of all other Neandertals at least 220,000 years ago. The ancient H. sapiens’ mtDNA must have entered the Neandertal lineage before this time, but after 470,000 years ago, the earliest date for when modern human and Neandertal mtDNA diverged. That’s early enough for the new form of mtDNA to have spread among Neandertals and replaced all their mtDNA.
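The dating logic in the paragraph above is, at its core, simple arithmetic: both lineages accumulate mutations independently after a split, so the per-site divergence between two sequences, divided by twice the substitution rate, estimates the time since their common ancestor. A minimal sketch with invented numbers (not the study’s calibrated rates or observed counts):

```python
# Divergence time ≈ differences / (2 × substitution rate), because both
# lineages accumulate mutations independently after the split.
# All numbers below are invented for illustration.

sites = 15_000                      # approximate mtDNA coding-region length
rate_per_site_per_year = 1.5e-8     # assumed substitution rate
observed_differences = 99           # hypothetical count between two sequences

per_site_divergence = observed_differences / sites
split_time_years = per_site_divergence / (2 * rate_per_site_per_year)
print(f"Estimated split: {split_time_years:,.0f} years ago")
```

With these made-up inputs the estimate lands at about 220,000 years, the same order as the figure quoted in the article; the real uncertainty lies in the mutation-rate calibration, which is why such dates always come with wide error bars.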

“The mtDNA of Neandertals is not actually from Neandertals, but from an early modern human from Africa,” Krause says. The researchers speculate that this key mating may have happened in the Middle East, where early H. sapiens may have ventured. Other researchers find the scenario remarkable but plausible. “It seems magical but this type of thing happens all the time … especially if the populations are very small,” Gronau says. For example, the mtDNA in some grizzly bears has been completely replaced by that of polar bears, Krause says.

But some experts say DNA from other Neandertals is needed to prove that their mtDNA was inherited entirely from an early H. sapiens rather than from an ancient ancestor the two groups shared. “Is there other evidence of another [early] mtDNA introgression event?” asks Chris Stringer of the Natural History Museum in London.

Not yet, Posth says. Pääbo is seeking evidence of early gene swapping by trying to get nuclear DNA from the HST Neandertal and others. “We will learn a lot about the population history of Neandertals over the next few years,” he says.

Posted in: Evolution

doi:10.1126/science.aan70

 

The Whoa! Whoa! Whoa! Reaction / Neanderthal Myths

The “Whoa! Whoa! Whoa!” reaction is what happens when I read articles written for public consumption that “boil down” science for the “educated public” – those who are genuinely interested in the physical universe, but may or may not have a science background. One of my favorite examples is how Neanderthals are “created” out of the modern social typical penchant (and temperamental obligation) to write stories (myths) from scant, contradictory or preliminary information.

The claim that Neanderthals were “dumb” is dumb. Are these skulls to scale?


Science Shows Why You’re Smarter Than a Neanderthal

Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition

By Joseph Stromberg Smithsonian.com March 12, 2013

https://www.smithsonianmag.com/science-nature/science-shows-why-youre-smarter-than-a-neanderthal-1885827/ Full article

COMMENTS: This article hits the Whoa! Stop! barrier before getting past the subhead. “Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition.”

  1. This view of the brain as having a “capacity” related to volume, like a closet that can be packed with X amount of clothing and Y amount of shoes, and if you want to add more shoes or ski equipment, you have to remove the clothes to make room, defies what we know (and brag about endlessly) about the brain: it’s built of networks that connect across regions and functions, and these are PLASTIC – what is referred to as “able to rewire itself in reaction to the environment.” This blows apart much of what the article has to say.
  2. Visual thinking is judged to be INFERIOR, low level cognition. Tell that to a raptor, such as a hawk, raven or eagle; to giant squid or octopi and the myriad species which utilize various segments of the electromagnetic spectrum to perceive the environment. This opinion is based in ignorance and the noises made by the perpetual cheerleaders for Homo sapiens, who believe humans are the pinnacle of evolution, and therefore, whatever “we” do is de facto superior.
  3. Which brings us to the question, if human abilities are superior, why must we compensate for our lack of sensory, cognitive and physical abilities by inventing technology? The average “know-it-all” American CONSUMES the products invented and developed by a handful of creative people in each generation. Knowledge is purchased in the form of “gadgets” that for the most part, do not educate, but distract the average individual from pursuing direct experience and interaction with the environment.
  4. Which means, “we” cognitive masterminds are taking a whole lot of credit for adaptations that are INHERITED from our “inferior, stupid, ancestors” who over the previous 200,000 years, not only survived, but built the culture that made us modern humans –
  5. Which comes to the egregious error of ignoring context: Compare an imaginary modern social human who exists in a context that is utterly dependent on manmade systems that supply food, water, shelter, medical care, economic opportunity, government control, cultural benefits and instant communication with a Neanderthal (or archaic Homo sapiens) whose environment is a largely uninhabited wilderness. One of the favorite clichés of American entertainment is “Male Monsters of Survival” cast into the wilderness (with a film crew and helicopter on call) recreating the Myth of Homo sapiens, Conqueror of Nature. These overconfident males are often lucky to last a week; injuries are common, starvation the norm.
  6. If visual thinking is so inferior, why do hunters rely on airplane and helicopter “flyovers” to locate game, and now drones, and add scopes, binoculars, game cameras, and a multitude of “sensory substitutes” to their repertoire? Ever been to a sporting goods store? They’re packed with every possible gadget that will improve the DIMINISHED senses and cognitive ability of modern social humans to function outside of manmade environments and to be successful hunters and fishermen.
  7. As for forcing Neanderthals into extinction, modern social humans could accomplish this: we have a horrific history of wiping out indigenous peoples and continue to destroy not only human groups, but hundreds of species and the environments they are adapted to. Modern social humans could bomb Neanderthals “back to the Stone Age”. Kill them off with chemical weapons, shred them with cluster bombs, the overkill of targeted assassination and nuclear weapons.
  8. BUT there is no proof that Archaic Homo sapiens “extincted” Homo neanderthalensis. We know that in some areas they lived cheek by jowl, had sex and produced offspring, but modern social humans maintain that Neanderthals were so “socially stupid” that the entire species fell to the magnificence of party-hearty Homo sapiens. Actually, a modern social human would have difficulty distinguishing the two fearsome types: the challenge may have been like distinguishing a polar bear from a grizzly bear, which are closely related bears adapted to different environments; rather irrelevant if you’re facing down either one with a sharp stick.
  9. The myth that Homo sapiens individuals outside of Africa “contain” a variable 1-4% of Neanderthal DNA, with specific “snips” (SNPs) related to various functions in modern humans, is incomplete. Rarely included in articles about how Homo sapiens and Neanderthal are connected are whole-genome sequencing results, which show that overall, the Homo sapiens genome, even now, is all but identical to the Neanderthal genome. This is logical: chimpanzees and the African great apes that include us diverged from a common ancestor 5-6 m.y.a., and yet the human and chimp genomes still share about 99% of their DNA. How similar, then, are the Neanderthal and Denisovan genomes to ours? This is a simple math question.
  10. What we need to compare is the Neanderthal genome and the ARCHAIC Homo sapiens genome – two groups of humans who were contemporaries.
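The “simple math question” in points 9 and 10 can be roughed out. If sequence divergence accumulates roughly linearly with time, then a human-Neanderthal split of ~600,000 years (the nuclear estimate quoted earlier) should leave the two genomes far more than 99% identical, given ~99% human-chimp identity after 5-6 million years. A back-of-the-envelope sketch, assuming linear scaling and the round numbers from the text:

```python
# Rough proportional estimate: divergence accumulates roughly linearly with
# time since a lineage split. Numbers are the round figures quoted in the text.
chimp_split_years = 6_000_000
chimp_identity = 0.99            # human-chimp genome identity (~99%)

neanderthal_split_years = 600_000
chimp_divergence = 1 - chimp_identity
neanderthal_divergence = chimp_divergence * (neanderthal_split_years / chimp_split_years)
neanderthal_identity = 1 - neanderthal_divergence
print(f"Expected human-Neanderthal identity: {neanderthal_identity:.3%}")
```

Under these assumptions the expected identity comes out around 99.9%, which is the point being made: the famous “1-4% Neanderthal DNA” refers to introgressed segments, not to overall genome similarity.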


Baboons, Social Typicals, Aspergers / STRESS

The usual human approach: stress is a killer; modern social environments are high-stress; let’s “engineer” humans to be able to tolerate high stress. What about changing environments so that human beings experience less stress? Of course not: that would benefit the average human. This is about what the top of the hierarchy wants – change the peasants so that they can live with extreme stress –

This article has dire implications for those of us who are born “Asperger” or with other neurodiverse brain types. 

_____________________________________________________

http://www.wired.com/2010/07/ff_stress_cure/

by: Jonah Lehrer

Under Pressure: The Search for a Stress Vaccine

Excerpts: 

Baboons are nasty, brutish, and short. They have a long muzzle and sharp fangs designed to inflict deadly injury. Their bodies are covered in thick, olive-colored fur, except on their buttocks, which are hairless. The species is defined by its social habits: The primates live in groups of several dozen individuals. These troops have a strict hierarchy, and each animal is assigned a specific rank. While female rank is hereditary — a daughter inherits her mother’s status — males compete for dominance. These fights can be bloody, but the stakes are immense: A higher rank means more sex. The losers, in contrast, face a bleak array of options — submission, exile, or death.

In 1978, Robert Sapolsky was a recent college graduate with a degree in biological anthropology and a job in Kenya. He had set off for a year of fieldwork by himself among baboons… here he was in Nairobi, speaking the wrong kind of Swahili and getting ripped off by everyone he met. Eventually he made his way to the bush, a sprawling savanna filled with zebras and wildebeests and elephants…

Sapolsky slowly introduced himself to a troop of baboons, letting them adjust to his presence. After a few weeks, he began recognizing individual animals, giving them nicknames from the Old Testament. It was a way of rebelling against his childhood Hebrew-school teachers, who rejected the blasphemy of Darwinian evolution…

Before long, Sapolsky’s romantic vision of fieldwork collided with the dismal reality of living in the African bush. (The baboons) seemed to devote all of their leisure time — and baboon life is mostly leisure time — to mischief and malevolence. “One of the first things I discovered was that I didn’t like baboons very much,” he says. “They’re quite awful to one another, constantly scheming and backstabbing. They’re like chimps but without the self-control.”

____________________________________________________________

Baboon behavior compared with modern humans: One advantage of the “bipedal stance” – showing off “the junk”. Could the female be “twerking”?

Olive baboon male standing on his hind legs watching a female presenting her rear (Papio cynocephalus anubis). Maasai Mara National Reserve, Kenya. Feb 2009.

____________________________________________________________

While Sapolsky was disturbed by the behavior of the baboons — this was nature, red in tooth and claw — he realized that their cruelty presented an opportunity to investigate the biological effects of social upheaval. He noticed, for instance, that the males at the bottom of the hierarchy were thinner and more skittish. “They just didn’t look very healthy,” Sapolsky says. “That’s when I began thinking about how damn stressful it must be to have no status. You never know when you’re going to get beat up. You never get laid. You have to work a lot harder for food.”

(Asperger types – is this us?)

So Sapolsky set out to test the hypothesis that the stress involved in being at the bottom of the baboon hierarchy led to health problems…“It struck most doctors as extremely unlikely that your feelings could affect your health. Viruses, sure. Carcinogens, absolutely. But stress? No way.” Sapolsky, however, was determined to get some data… Instead, he was busy learning how to shoot baboons with anesthetic darts and then, while they were plunged into sleep, quickly measure their immune system function and the levels of stress hormones and cholesterol in their blood….

A similarly destructive process is at work in humans. While doctors speculated for years that increasing rates of cardiovascular disease in women might be linked to the increasing number of females employed outside the home, that correlation turned out to be nonexistent. Working women didn’t have more heart attacks. There were, however, two glaring statistical exceptions to the rule: Women developed significantly more heart disease if they performed menial clerical work or when they had an unsupportive boss. The work, in other words, wasn’t the problem. It was the subordination.

(Female gender = subordinate in modern social hierarchy.)

One of the most tragic aspects of the stress response is the way it gets hardwired at a young age — an early setback can permanently alter the way we deal with future stressors. The biological logic of this system is impeccable: If the world is a rough and scary place, then the brain assumes it should invest more in our stress machinery, which will make us extremely wary and alert. There’s also a positive feedback loop at work, so that chronic stress actually makes us more sensitive to the effects of stress.

The physiology underlying this response has been elegantly revealed in the laboratory. When lab rats are stressed repeatedly, the amygdala — an almond-shaped nub in the center of the brain — enlarges dramatically. (See post  on amygdala, hippocampus) (This swelling comes at the expense of the hippocampus, which is crucial for learning and memory and shrinks under severe stress.) The main job of the amygdala is to perceive danger and help generate the stress response; it’s the brain area turned on by dark alleys and Hitchcock movies. Unfortunately, a swollen amygdala means that we’re more likely to notice potential threats in the first place, which means we spend more time in a state of anxiety. (This helps explain why a more active amygdala is closely correlated with atherosclerosis.) The end result is that we become more vulnerable to the very thing that’s killing us.

__________________________________________________________________________________________

Imagine you are a newborn: everything about you “looks normal” and your parents show you off; send photos to friends and relatives. They coo and gurgle over your parents’ splendid achievement. A Perfect Baby.
Then reality sets in: Human parents are obsessed with the fear of giving birth to a less-than-perfect baby. Can we deny the social pressure endured by an infant and parent, when parents “freak out” over a growing suspicion that their child is “abnormal”? They rush the child to the “witch doctor” – the expert, the authority, the interpreter of all human behavior; the priest or priestess who has the power to decide the fate of a child as a member of its society. What power over individual destinies these “judges” have!  
In American culture, it is the medical / behavioral industry which decides whether or not a child is conforming to a rigid schedule of physical, social, emotional and mental development. This used to be “the job” of religious authorities (and still is in many communities), but the “Helping Caring Fixing” industry has become a “co-religion” for many believers.  
An imaginary epidemic of “defective children” has grown into a reign of terror in contemporary American culture: children are labeled, isolated, shamed, bullied and virtually discarded; drugged into submission, simply for being children. 

Baboons: 

Baboons are African and Arabian Old World monkeys belonging to the genus Papio, part of the subfamily Cercopithecinae. The five species are some of the largest non-hominoid members of the primate order; only the mandrill and the drill are larger. Baboons use at least 10 different vocalizations to communicate with other members of the troop. Wikipedia

Scientific name: Papio / Lifespan: Guinea baboon: 35 – 45 years
Height: Olive baboon: 2.3 ft.
Weight: Hamadryas baboon: 44 – 66 lbs; Olive baboon: 22 – 82 lbs; Guinea baboon: 29 – 57 lbs

Infant Synesthesia / A Developmental Stage

No, synesthesia is not a symptom of disorder, but it is a developmental phenomenon. In fact, several researchers have shown that synesthetes can perform better on certain tests of memory and intelligence. Synesthetes as a group are not mentally ill. They test negative on scales that check for schizophrenia, psychosis, delusions, and other disorders.

Synesthesia Project | FAQ – Boston University

________________________________________________________________

What if some symptoms “assigned” by psychologists to Asperger’s Disorder and autism are merely manifestations of synesthesia?

“A friend of mine recently wrote, ‘My daughter just explained to me that she is a picky eater because foods (and other things) taste like colors and sometimes she doesn’t want to eat that color. Is this a form of synesthesia?’ Yes, it is.” – Karen Wang

We see in this graphic how synesthesia is labeled a “defect” that is “eradicated” by normal development (literally “pruned out”). People who retain types of integrated sensory experience are often artists, musicians, and other sensory innovators (chefs, interior designers, architects, writers). So those who characterize “synesthesia” as a developmental defect are labeling the very individuals who greatly enrich millions of human lives as “defectives”. – Psychology pathologizes the most admired and treasured creative human behavior.

No touching allowed! Once “sensory” categories have been labeled and isolated to locations in the brain, no “talking to” each other is allowed. The fact that this is a totally “unreal” scheme is ignored. Without smell, there IS NO taste…

________________________________________________________________

Infants Possess Intermingled Senses

Babies are born with their senses linked in synesthesia

originally published as “Infant Kandinskys”

What if every visit to the museum was the equivalent of spending time at the philharmonic? For painter Wassily Kandinsky, that was the experience of painting: colors triggered sounds. Now a study from the University of California, San Diego, suggests that we are all born synesthetes like Kandinsky, with senses so joined that stimulating one reliably stimulates another.

The work, published in the August issue of Psychological Science, has become the first experimental confirmation of the infant-synesthesia hypothesis—which has existed, unproved, for almost 20 years.

Researchers presented infants and adults with images of repeating shapes (either circles or triangles) on a split-color background: one side was red or blue, and the other side was yellow or green. If the infants had shape-color associations, the scientists hypothesized, the shapes would affect their color preferences. For instance, some infants might look significantly longer at a green background with circles than at the same green background with triangles. Absent synesthesia, no such difference would be visible.

The study confirmed this hunch. Infants who were two and three months old showed significant shape-color associations. By eight months the preference was no longer pronounced, and in adults it was gone altogether.

The more important implications of this work may lie beyond synesthesia, says lead author Katie Wagner, a psychologist at U.C.S.D. The finding provides insight into how babies learn about the world more generally. “Infants may perceive the world in a way that’s fundamentally different from adults,” Wagner says. As we age, she adds, we narrow our focus, perhaps gaining an edge in cognitive speed as the sensory symphony quiets down. (Sensory “thinking” is replaced by social-verbal thinking.)

(Note: The switch to word-concept language dominance means that modern social humans LOSE the appreciation of “connectedness” in the environment – connectedness becomes limited to human-human social “reality”. The practice of chopping reality up into isolated categories (word concepts) diminishes detail and erases the connections that link details into patterns. Hyper-social thinking is a “diminished” state of perception characteristic of neurotypicals.)

This article was originally published with the title “Infant Kandinskys”
________________________________________________________

GREAT WEBSITE!!!

The Brain from Top to Bottom

thebrain.mcgill.ca/

McGill University
Explore topics such as emotion, language, and the senses at five levels of organization (from molecular to social) and three levels of explanation (from beginner … advanced)

The most important “developmental” fact of life

is death.

It just happens: We grow old. It’s a natural progression, without doubt. But not in the U.S., of course, where openly denying death is a frenzied passion. Getting old is a crime in a society terrified of “growing up” and becoming adult.

Old people are proof of the most basic facts of life, so much so, that being old has become taboo. And if one lives to the “new” expectation of 80 or so, that means 30 years of life beyond the new “old age” of 50. That’s a long time to “fake” being “young, beautiful, athletic and sexy”. 

Growing old is tough enough without a “new” set of instructions; don’t look old, act old, get sick, become feeble or need help (unless that help is covered by insurance.) Don’t remind younger people, by your very presence, that there is an end; it is believed now that one can “look good” until the end – which will entail a short, or long, period of degeneration. This period of “old age” is rarely seen as a “good” time of life as valid as one’s childhood, young adulthood, or middle age, unless one has the funds to at least pretend to be “youngish”.

Contrary to popular American belief, old age remains a fruitful time of personal development. As long as our bodies continue to function, learning and thinking continue to be what humans do.

If life has been one long illusion that only “social” rewards count, and life has been a display of materials owned, status achieved, people “bested”, then one will likely keep up the illusion, with whatever “solutions” the anti-aging industry has to offer.

I live in a town in which most people are “getting old” – not much opportunity for the young to work, to develop a career, to join the circus of material wealth and ambition. Traditionally, young people have returned to the area after college, and a stint in corporate America, time in the military, or success in finding a spouse. Having “grown up” in this unique place, it was where they chose to establish families and to be close to loved ones. The Wyoming landscape and lifestyle have always been a fundamental fact in this choice to return, and it pulls relentlessly on those who leave.

Disastrous policies, and frankly criminal wars, prosecuted from Washington D.C. in league with corporate-Wall Street crooks, and funded by abused taxpayers, demonstrate the general belief on both coasts that the people who inhabit the “rest of the U.S.” just don’t matter. We are indeed treated as worthless and disposable inferiors, expected to enrich a ruling class that despises us, and to literally die for “blood” profits in its service.

Our town needs new people to survive as a community; we need children and young families, but opportunity is lacking. Small businesses are closing and not reopening: the owners have retired and are dying off. Competition from online retailers has siphoned off local spending and old people have very little to spend anyway. Every dime goes to necessities and the obscene cost of healthcare.

The American dream left our town long ago. Wyoming’s existence has been plagued by Federal and corporate control from the beginning, when the railroad opened the West to outright looting of its resources by faraway “global” entities. Pillage of the land and its resources funded the American coastal empires; exploitation of immigrants provided cheap labor. “Colonization” by U.S. and European nations was not limited to the invasion of “foreign lands”; it happened here also – and continues to this day.

Native Americans (not being suited to corporate life and labor) were killed off with conscious purpose – a policy of mass murder; the remnants were confined to “reservations” where their descendants are expected to remain “invisible” – to wither away and eventually die off, a slow suicide of formerly unique human beings. Diversity? A smoke screen.

These thoughts occupy my meditations as I pass through a human being’s last opportunity for personal development. It’s a time of recognizing that the universe goes on without us; that our deepest questions will not be answered. It’s a time to understand that the individual cannot correct or improve much that goes on in an increasingly cluttered and entangled social world – which doesn’t mean that we ought not try to improve ourselves and our small areas of influence. Our lives are eventually “finished” for us by nature, in disregard of our insistence that our life is essential to the universe and therefore ought to go on forever.

____________________________________________

It is shocking to confront the fact that so much human effort, inventiveness, hard labor, suffering, and resource depletion was, and still is, devoted to the imaginary “immortality” of a few (not so admirable) individuals; Pharaohs, emperors, kings, dictators, war lords, ideologues, criminals, Popes and priests; not the best of humanity, but often the worst.

The big lie is an old lie: Immortality can be purchased. 

Yes, there is a pyramid for immortality-mortality also: The Pharaohs of our time will not be mummified. (A crude process of desiccation which has, however, been wildly successful socially! The mummies continue to be A-list celebrities that attract fans of the “rich and famous”.)

Today’s 1% equivalents will not be made immortal by being dried out like fish, cheese or jerky – no, they will be made “immortal” by means of “sophisticated” technology. What an advancement in human civilization! 

These immortality technologies, and the lesser life-extension schemes of replacing organs and skeletal architecture, failing part by failing part, are being promoted as “mankind’s future” – what a lie! As if today’s Pharaohs really intend to share their immortality with 15 billion humans!


2045: The year Man becomes Immortal. Right: All estimate 15 billion of us.

A few elite at the top may manage to purchase immortality of a limited sort: machines designed in their own image.

The mortal King Tut, a product of incest who died at age 19. How much human talent and potential has been wasted on fulfilling the fantasy of immortality for a predatory class of individuals?

It’s not King Tut, the Insignificant, who is immortal, but the lure of his “real estate” holdings, elite addresses, golden household furniture and knickknacks, layers of stone coffins, granite “countertops”, Jacuzzi bath tubs, fabulous jewelry, and rooms with a view of eternity, that keeps the envious modern social tourist coming back. 


This is not King Tut. This is a fabulous work of propaganda made by artisans who rarely get credit (they remain nameless) for their “creation of brands and products” – works that supply the magical connections necessary for supernatural belief in the pyramid of social hierarchy as the “definitive and absolute model” of the cosmos. (Pharaohs had to impress the Gods in order to become a god – you wouldn’t show up for “judgement day” in anything less than the most impressive selections from your wardrobe.)

Magic consists of the “transfer of power” between the “immortal mask” and the unimpressive person; the “mask” has become King Tut in the belief system of the socially-obsessed viewer.  
