One of THOSE Discussions / God, Free Will and Absurdities

This post has gained momentum from having one of those “late night” discussions with a friend – the type that is popular when one is in college, a bit drunk (or otherwise deranged) and which, as one gets older and wiser, one vows to never again participate in. The gist of the argument was:

Determinism (God) is totally compatible with Free Will (The Declaration of Independence), so we have both.

I could stop right here, because this “set up” is thoroughly American “wacky” thinking. It demonstrates the absolute belief that “America” is a special case = exemption from reality, that was/is made possible by American Democracy (in case you weren’t aware, democracy is not a political creation of human origin) which came about by an Act of God. “Freedom” is a basic American goal: Free Will is therefore a mandatory human endowment (by virtue of the word Free appearing in both “concepts”). God created everything, so he must have created Free Will. Jesus is a kind of “sponge” that suffices to “soak up” all those bad choices Free Will allows, that is, if you turn over all your choices, decisions and Free Will to Jesus.

The irony is that this absurd, pointless discussion “cleared the air” over previously unspoken conflict with a dear friend, like blowing up the Berlin Wall: it got the obstacle out of the way and established that friendship is not “rational” at all, but an agreement about what really matters: good intentions carried into actions, loyalty, and a simple “rightness” – agreement on what constitutes “good behavior” on the part of human beings and a pledge of one’s best effort to stick to that behavior.

This entire HUGE neurotypical debate is nonsense.

God has nothing to do with Free Will, the Laws of physics, or any scientific pursuit of explanations for “the universe”. The whole reason for God’s existence is that He, or She, or They are totally outside the restrictions of “physical reality”. That’s what SUPERNATURAL means. So all the “word concept” machinations over “God” and “science” – from both ends of the false dichotomy – are absurd. Free Will is also a non-starter “concept” in science: reality proceeds from a complex system of “facts” and mathematical relationships that cannot be “free-willed” away.

Total nonsense.

If one believes in the “supernatural” origin of the universe as a creation of supernatural “beings, forces and miraculous acts” then one does not believe in physical reality at all: “Physics” is a nonexistent explanation for existence. One can only try to coerce, manipulate, plead with, and influence the “beings” that DETERMINE human fate. Free Will is de facto an absurdity, conceived of as something like the Amendments to the U.S. Constitution (inspired by God, after all – not really by the intelligence of the people who wrote it). In American thought, (political) rights grant permission to “do whatever I want”. The concept of responsibility connected to rights has been conveniently forgotten. Free Will, in this context, is nothing more than intellectual, moral and ethical “cheating”.

So, the immense, complicated, false dichotomy of Determinism vs. Free Will, and the absurd 2,000+ year-old philosophical waste of time that has followed, and continues, is actually very simple, at least in the U.S.

Whatever I do, is God’s Will: Whatever you do, isn’t. 


Light Skin and Lactose / Recent Adaptations to Cereal Diet

IFL Science

Why Do Europeans Have White Skin?

April 6, 2015 | by Stephen Luntz (shortened to get to the point)

The 1000 Genomes Project is comparing the genomes of modern individuals from specific regions in Europe with 83 samples taken from seven ancient European cultures. Harvard University’s Dr. Iain Mathieson has identified five features which spread through Europe, indicating a strong selection advantage.

At the annual conference of the American Association of Physical Anthropologists, Mathieson said his team distinguished, “between traits that have changed consistently with population turnovers, traits that have changed apparently neutrally, and traits that have changed dramatically due to recent natural selection.”

… most people of European descent are lactose tolerant, to the extent that milk products not only form a major source of nutrition but are a defining feature of European cultures…that the capacity to digest lactose as an adult appeared in the population after the development of farming. Two waves of farmers settled Europe 7,800 and 4,800 years ago, but it was only 500 years later that the gene for lactose tolerance became widespread.

…hunter-gatherers in what is now Spain, Luxembourg and Hungary had dark-skinned versions of the two genes more strongly associated with skin color. The oldest pale versions of the SLC24A5 and SLC45A2 genes that Mathieson found were at Motala in southern Sweden 7,700 years ago. The gene associated with blue eyes and blond hair was found in bodies from the same site. H/T ScienceMag.


____________________________________________________________________________________________

From: Civilization Fanatics Forum

New research debunks the theory that lighter skin arose gradually in Europeans nearly 40,000 years ago: it evolved recently – only about 7,000 years ago.

People in tropical to subtropical parts of the world manufacture vitamin D in their skin as a result of UV exposure. At northern latitudes, dark skin would have reduced the production of vitamin D. If people weren’t getting much vitamin D in their diet, then selection for pre-existing mutations for lighter skin (less pigment) would “sweep” the farming population.  
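The “sweep” described above can be illustrated with a standard textbook allele-frequency recursion (a generic population-genetics model, not anything from the article; the starting frequency, selection coefficient, and generation count below are purely illustrative assumptions):

```python
# Deterministic sweep of a beneficial allele under genic selection:
# p' = p(1+s) / (1 + p*s), a standard population-genetics recursion.
# Starting frequency and selection coefficient are illustrative assumptions.

def sweep(p0: float, s: float, generations: int) -> list[float]:
    """Track allele frequency p over discrete generations."""
    freqs = [p0]
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
        freqs.append(p)
    return freqs

# A rare light-skin variant (0.1%) with an assumed 5% advantage;
# 500 generations at ~25 years each is roughly 12,500 years.
traj = sweep(p0=0.001, s=0.05, generations=500)
print(f"Frequency after 500 generations: {traj[-1]:.3f}")  # near fixation
```

Even a modest advantage drives a rare variant to near fixation in a few hundred generations, which is why a dietary vitamin D shortfall could plausibly make pale skin nearly universal in farming populations within a few thousand years.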

New scientific findings show that prehistoric European hunter-gatherers were dark-skinned, but ate vitamin D-rich meat, fish, mushrooms and fruits. With the switch to agriculture, the amount of vitamin D in the diet decreased – and resulted in selection for pale skin among European farmers.

Findings detailed today (Jan. 26, 2014) in the journal Nature, “also hint that light skin evolved not to adjust to the lower-light conditions in Europe compared with Africa, but instead to the new diet that emerged after the agricultural revolution”, said study co-author Carles Lalueza-Fox, a paleogenomics researcher at Pompeu Fabra University in Spain.

The finding implies that for most of their evolutionary history, Europeans were not what people today call ‘Caucasian’, said Guido Barbujani, president of the Associazione Genetica Italiana in Ferrara, Italy, who was not involved in the study.


Beard Guys / Two Best

Shut up and fight / Thank the gods; Vikings is back 11/29

Shut up and bake / A guy who looks scrumptious in a beard and can make a perfect pie crust? Sign me up!

Neanderthal mtDNA from before 220,000 y.o. Early Modern Human

Fact or Baloney…read on…

Neandertals and modern humans started mating early

For almost a century, Neandertals were considered the ancestors of modern humans. But in a new plot twist in the unfolding mystery of how Neandertals were related to modern humans, it now seems that members of our lineage were among the ancestors of Neandertals. Researchers sequenced ancient DNA from the mitochondria—tiny energy factories inside cells—from a Neandertal who lived about 100,000 years ago in southwest Germany. They found that this DNA, which is inherited only from the mother, resembled that of early modern humans.

After comparing the mitochondrial DNA (mtDNA) with that of other archaic and modern humans, the researchers reached a startling conclusion: A female member of the lineage that gave rise to Homo sapiens in Africa mated with a Neandertal male more than 220,000 years ago—much earlier than other known encounters between the two groups. Her children spread her genetic legacy through the Neandertal lineage, and in time her African mtDNA completely replaced the ancestral Neandertal mtDNA.

Other researchers are enthusiastic about the hypothesis, described in Nature Communications this week, but caution that it will take more than one genome to prove. “It’s a nice story that solves a cool mystery—how did Neandertals end up with mtDNA more like that of modern humans,” says population geneticist Ilan Gronau of the Interdisciplinary Center Herzliya in Israel. But “they have not nailed it yet.”

The study adds to a catalog of ancient genomes, including mtDNA as well as the much larger nuclear genomes, from more than a dozen Neandertals. Most of these lived at the end of the species’ time on Earth, about 40,000 to 50,000 years ago. Researchers also have analyzed the complete nuclear and mtDNA genomes of another archaic group from Siberia, called the Denisovans. The nuclear DNA suggested that Neandertals and Denisovans were each other’s closest kin and that their lineage split from ours more than 600,000 years ago. But the Neandertal mtDNA from these samples posed a mystery: It was not like Denisovans’ and was closely related to that of modern humans—a pattern at odds with the ancient, 600,000-year divergence date. Last year Svante Pääbo’s team at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, offered a startling solution: Perhaps the “Neandertal” mtDNA actually came from modern humans.

______________________________

Strange! Everything I’ve read previously has said the Neanderthal mtDNA was not at all similar to any H. sapiens mtDNA haplogroups.

______________________________

In the new study, paleogeneticists Johannes Krause and Cosimo Posth of the Max Planck Institute for the Science of Human History in Jena, Germany, test this wild idea with ancient mtDNA from a Neandertal thighbone found in 1937 in the Hohlenstein-Stadel cave (HST) in Germany. Isotopes in animal bones found with the Neandertal suggest that it lived in a woodland known to have vanished at least 100,000 years ago.

Researchers compared the coding region of the HST Neandertal’s mtDNA with that of 17 other Neandertals, three Denisovans, and 54 modern humans. The HST Neandertal’s mtDNA was significantly different even from that of proto-Neandertals that date to 430,000 years ago at Sima de los Huesos in Spain, suggesting that their mtDNA had been completely replaced. But the HST sample was also surprisingly distinct from that of other Neandertals, allowing researchers to build a phylogenetic tree and study how Neandertal mtDNA evolved over time.

Using modern humans’ mtDNA mutation rate to calculate the timing, the researchers conclude that the HST mtDNA split from that of all other Neandertals at least 220,000 years ago. The ancient H. sapiens’ mtDNA must have entered the Neandertal lineage before this time, but after 470,000 years ago, the earliest date for when modern human and Neandertal mtDNA diverged. That’s early enough for the new form of mtDNA to have spread among Neandertals and replaced all their mtDNA.
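The dating logic in this paragraph (count the differences between two mtDNA sequences, then divide by the mutation rate, remembering that both lineages accumulate mutations independently) can be sketched as a simple molecular-clock calculation. The rate, sequence length, and difference count below are illustrative assumptions chosen to land near the article’s 220,000-year figure, not the study’s actual data:

```python
# Sketch of the molecular-clock logic described above: the age of an mtDNA
# split is estimated from accumulated differences divided by the mutation
# rate. All numbers here are illustrative assumptions, not the study's data.

mutation_rate = 2.5e-8   # substitutions per site per year (assumed)
sites = 15_000           # approximate mtDNA coding-region length (assumed)
differences = 165        # differences between the two lineages (assumed)

# Both lineages mutate independently after the split, hence the factor of 2.
per_site_divergence = differences / sites
split_years = per_site_divergence / (2 * mutation_rate)

print(f"Estimated split: {split_years:,.0f} years ago")  # ~220,000
```

The same arithmetic run backward shows why the 470,000-year upper bound matters: more time than that would have produced more differences between the HST lineage and modern human mtDNA than are actually observed.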

“The mtDNA of Neandertals is not actually from Neandertals, but from an early modern human from Africa,” Krause says. The researchers speculate that this key mating may have happened in the Middle East, where early H. sapiens may have ventured. Other researchers find the scenario remarkable but plausible. “It seems magical but this type of thing happens all the time … especially if the populations are very small,” Gronau says. For example, the mtDNA in some grizzly bears has been completely replaced by that of polar bears, Krause says.

But some experts say DNA from other Neandertals is needed to prove that their mtDNA was inherited entirely from an early H. sapiens rather than from an ancient ancestor the two groups shared. “Is there other evidence of another [early] mtDNA introgression event?” asks Chris Stringer of the Natural History Museum in London.

Not yet, Posth says. Pääbo is seeking evidence of early gene swapping by trying to get nuclear DNA from the HST Neandertal and others. “We will learn a lot about the population history of Neandertals over the next few years,” he says.


doi:10.1126/science.aan70

 

The Whoa! Whoa! Whoa! Reaction / Neanderthal Myths

The “Whoa! Whoa! Whoa!” reaction is what happens when I read articles written for public consumption that “boil down” science for the “educated public” – those who are genuinely interested in the physical universe, but may or may not have a science background. One of my favorite examples is how Neanderthals are “created” out of the modern social typical penchant (and temperamental obligation) to write stories (myths) from scant, contradictory or preliminary information.

The claim that Neanderthals were “dumb” is dumb. Are these skulls to scale?


Science Shows Why You’re Smarter Than a Neanderthal

Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition

By Joseph Stromberg Smithsonian.com March 12, 2013

https://www.smithsonianmag.com/science-nature/science-shows-why-youre-smarter-than-a-neanderthal-1885827/ Full article

COMMENTS: This article hits the Whoa! Stop! barrier before getting past the subhead. “Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition.”

  1. This view of the brain as having a “capacity” related to volume (like a closet that holds X amount of clothing and Y amount of shoes, so that adding more shoes or ski equipment means removing clothes to make room) defies what we know (and brag about endlessly) about the brain: it is built of networks that connect across regions and functions, and these networks are PLASTIC, that is, “able to rewire themselves in reaction to the environment.” This blows apart much of what the article has to say.
  2. Visual thinking is judged to be INFERIOR, low-level cognition. Tell that to a raptor, such as a hawk, raven or eagle; to giant squid or octopuses; and to the myriad species which utilize various segments of the electromagnetic spectrum to perceive the environment. This opinion is based in ignorance and the noises made by the perpetual cheerleaders for Homo sapiens, who believe humans are the pinnacle of evolution, and therefore, whatever “we” do is de facto superior.
  3. Which brings us to the question: if human abilities are superior, why must we compensate for our lack of sensory, cognitive and physical abilities by inventing technology? The average “know-it-all” American CONSUMES the products invented and developed by a handful of creative people in each generation. Knowledge is purchased in the form of “gadgets” that for the most part, do not educate, but distract the average individual from pursuing direct experience and interaction with the environment.
  4. Which means, “we” cognitive masterminds are taking a whole lot of credit for adaptations that are INHERITED from our “inferior, stupid, ancestors” who over the previous 200,000 years, not only survived, but built the culture that made us modern humans –
  5. Which comes to the egregious error of ignoring context: Compare an imaginary modern social human who exists in a context that is utterly dependent on manmade systems that supply food, water, shelter, medical care, economic opportunity, government control, cultural benefits and instant communication with a Neanderthal (or archaic Homo sapiens) whose environment is a largely uninhabited wilderness. One of the favorite clichés of American entertainment is “Male Monsters of Survival” cast into the wilderness (with a film crew and helicopter on call) recreating the Myth of Homo sapiens, Conqueror of Nature. These overconfident males are often lucky to last a week; injuries are common, starvation the norm.
  6. If visual thinking is so inferior, why do hunters rely on airplane and helicopter “flyovers” to locate game, and now drones, and add scopes, binoculars, game cameras, and a multitude of “sensory substitutes” to their repertoire? Ever been to a sporting goods store? They’re packed with every possible gadget that will improve the DIMINISHED senses and cognitive ability of modern social humans to function outside of manmade environments and to be successful hunters and fishermen.
  7. As for forcing Neanderthals into extinction, modern social humans could accomplish this: we have a horrific history of wiping out indigenous peoples and continue to destroy not only human groups, but hundreds of species and the environments they are adapted to. Modern social humans could bomb Neanderthals “back to the Stone Age”. Kill them off with chemical weapons, shred them with cluster bombs, the overkill of targeted assassination and nuclear weapons.
  8. BUT there is no proof that Archaic Homo sapiens “extincted” Homo neanderthalensis. We know that in some areas they lived cheek by jowl, had sex and produced offspring, but modern social humans maintain that Neanderthals were so “socially stupid” that the entire species fell to the magnificence of party-hearty Homo sapiens. Actually, a modern social human would have difficulty distinguishing the two fearsome types: the challenge may have been like distinguishing a polar bear from a grizzly bear – closely related bears adapted to different environments, and a rather irrelevant distinction if you’re facing down either one with a sharp stick.
  9. The myth that Homo sapiens individuals outside of Africa “contain” a variable 1-4% of Neanderthal DNA, with specific SNPs (“snips”) related to various functions in modern humans, is incomplete. Rarely included in articles about how Homo sapiens and Neanderthals are connected are whole-genome sequencing results which show that, overall, the Homo sapiens genome, even now, is all but identical to the Neanderthal genome. This is logical: the divergence of humans and chimpanzees from a common ancestor occurred 5-6 m.y.a., and yet the human and chimp genomes are about 99% identical. How similar, then, must the Neanderthal and Denisovan genomes be to ours? This is a simple math question.
  10. What we need to compare is the Neanderthal genome and the ARCHAIC Homo sapiens genome – two groups of humans who were contemporaries.
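The “simple math question” in item 9 can be sketched as a back-of-the-envelope calculation. Assuming, purely for illustration, that sequence divergence accumulates roughly linearly with time since a split, the chimp comparison (~1% divergence over ~6 million years) scales down to the much more recent Neanderthal split:

```python
# Back-of-the-envelope: scale genome divergence linearly with split time.
# All numbers are rough illustrative assumptions, not measured values.

chimp_split_mya = 6.0        # human-chimp split, millions of years ago
chimp_divergence = 0.01      # ~1% sequence difference (99% identical)

neanderthal_split_mya = 0.6  # human-Neanderthal split (~600,000 years)

# Linear scaling: divergence proportional to time since the split.
rate_per_mya = chimp_divergence / chimp_split_mya
neanderthal_divergence = rate_per_mya * neanderthal_split_mya

print(f"Expected Neanderthal-human divergence: {neanderthal_divergence:.3%}")
print(f"Expected identity: {1 - neanderthal_divergence:.3%}")
```

Under these crude assumptions the expected Neanderthal-human divergence is on the order of a tenth of a percent, i.e. roughly 99.9% identity, which is the point the paragraph is making: the famous “1-4% Neanderthal DNA” sits on top of a genome that is nearly identical anyway.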


Baboons, Social Typicals, Aspergers / STRESS

The usual human approach: stress is a killer; modern social environments are high stress; let’s “engineer” humans to be able to tolerate high stress. What about changing environments so that human beings experience less stress? Of course not: that would benefit the average human. This is about what the top of the hierarchy wants – change the peasants so that they can live with extreme stress –

This article has dire implications for those of us who are born “Asperger” or with other neurodiverse brain types. 

_____________________________________________________

http://www.wired.com/2010/07/ff_stress_cure/

by: Jonah Lehrer

Under Pressure: The Search for a Stress Vaccine

Excerpts: 

Baboons are nasty, brutish, and short. They have a long muzzle and sharp fangs designed to inflict deadly injury. Their bodies are covered in thick, olive-colored fur, except on their buttocks, which are hairless. The species is defined by its social habits: The primates live in groups of several dozen individuals. These troops have a strict hierarchy, and each animal is assigned a specific rank. While female rank is hereditary — a daughter inherits her mother’s status — males compete for dominance. These fights can be bloody, but the stakes are immense: A higher rank means more sex. The losers, in contrast, face a bleak array of options — submission, exile, or death.

In 1978, Robert Sapolsky was a recent college graduate with a degree in biological anthropology and a job in Kenya. He had set off for a year of fieldwork by himself among baboons… here he was in Nairobi, speaking the wrong kind of Swahili and getting ripped off by everyone he met. Eventually he made his way to the bush, a sprawling savanna filled with zebras and wildebeests and elephants…

Sapolsky slowly introduced himself to a troop of baboons, letting them adjust to his presence. After a few weeks, he began recognizing individual animals, giving them nicknames from the Old Testament. It was a way of rebelling against his childhood Hebrew-school teachers, who rejected the blasphemy of Darwinian evolution…

Before long, Sapolsky’s romantic vision of fieldwork collided with the dismal reality of living in the African bush. (The baboons) seemed to devote all of their leisure time — and baboon life is mostly leisure time — to mischief and malevolence. “One of the first things I discovered was that I didn’t like baboons very much,” he says. “They’re quite awful to one another, constantly scheming and backstabbing. They’re like chimps but without the self-control.”

____________________________________________________________

Baboon behavior compared with modern humans: One advantage of the “bipedal stance” – showing off “the junk”. Could the female be “twerking”?

Olive baboon male standing on his hind legs watching a female presenting her rear (Papio cynocephalus anubis). Maasai Mara National Reserve, Kenya. Feb 2009.

____________________________________________________________

While Sapolsky was disturbed by the behavior of the baboons — this was nature, red in tooth and claw — he realized that their cruelty presented an opportunity to investigate the biological effects of social upheaval. He noticed, for instance, that the males at the bottom of the hierarchy were thinner and more skittish. “They just didn’t look very healthy,” Sapolsky says. “That’s when I began thinking about how damn stressful it must be to have no status. You never know when you’re going to get beat up. You never get laid. You have to work a lot harder for food.”

(Asperger types – is this us?)

So Sapolsky set out to test the hypothesis that the stress involved in being at the bottom of the baboon hierarchy led to health problems…“It struck most doctors as extremely unlikely that your feelings could affect your health. Viruses, sure. Carcinogens, absolutely. But stress? No way.” Sapolsky, however, was determined to get some data… Instead, he was busy learning how to shoot baboons with anesthetic darts and then, while they were plunged into sleep, quickly measure their immune system function and the levels of stress hormones and cholesterol in their blood….

A similarly destructive process is at work in humans. While doctors speculated for years that increasing rates of cardiovascular disease in women might be linked to the increasing number of females employed outside the home, that correlation turned out to be nonexistent. Working women didn’t have more heart attacks. There were, however, two glaring statistical exceptions to the rule: Women developed significantly more heart disease if they performed menial clerical work or when they had an unsupportive boss. The work, in other words, wasn’t the problem. It was the subordination.

(Female gender = subordinate in modern social hierarchy.)

One of the most tragic aspects of the stress response is the way it gets hardwired at a young age — an early setback can permanently alter the way we deal with future stressors. The biological logic of this system is impeccable: If the world is a rough and scary place, then the brain assumes it should invest more in our stress machinery, which will make us extremely wary and alert. There’s also a positive feedback loop at work, so that chronic stress actually makes us more sensitive to the effects of stress.

The physiology underlying this response has been elegantly revealed in the laboratory. When lab rats are stressed repeatedly, the amygdala — an almond-shaped nub in the center of the brain — enlarges dramatically. (See post on amygdala, hippocampus) (This swelling comes at the expense of the hippocampus, which is crucial for learning and memory and shrinks under severe stress.) The main job of the amygdala is to perceive danger and help generate the stress response; it’s the brain area turned on by dark alleys and Hitchcock movies. Unfortunately, a swollen amygdala means that we’re more likely to notice potential threats in the first place, which means we spend more time in a state of anxiety. (This helps explain why a more active amygdala is closely correlated with atherosclerosis.) The end result is that we become more vulnerable to the very thing that’s killing us.

__________________________________________________________________________________________

Imagine you are a newborn: everything about you “looks normal” and your parents show you off; send photos to friends and relatives. They coo and gurgle over your parents’ splendid achievement. A Perfect Baby.
Then reality sets in: Human parents are obsessed with the fear of giving birth to a less-than-perfect baby. Can we deny the social pressure endured by an infant and parent, when parents “freak out” over a growing suspicion that their child is “abnormal”? They rush the child to the “witch doctor” – the expert, the authority, the interpreter of all human behavior; the priest or priestess who has the power to decide the fate of a child as a member of its society. What power over individual destinies these “judges” have!  
In American culture, it is the medical / behavioral industry which decides whether or not a child is conforming to a rigid schedule of physical, social, emotional and mental development. This used to be “the job” of religious authorities (and still is in many communities), but the “Helping Caring Fixing” industry has become a “co-religion” for many believers.  
An imaginary epidemic of “defective children” has grown into a reign of terror in contemporary American culture: children are labeled, isolated, shamed, bullied and virtually discarded; drugged into submission, simply for being children. 

Baboons: 

Baboons are African and Arabian Old World monkeys belonging to the genus Papio, part of the subfamily Cercopithecinae. The five species are some of the largest non-hominoid members of the primate order; only the mandrill and the drill are larger. Baboons use at least 10 different vocalizations to communicate with other members of the troop. Wikipedia

Scientific name: Papio
Lifespan: Guinea baboon, 35–45 years
Height: Olive baboon, 2.3 ft
Weight: Hamadryas baboon, 44–66 lbs; Olive baboon, 22–82 lbs; Guinea baboon, 29–57 lbs

Infant Synesthesia / A Developmental Stage

No, synesthesia is not a symptom of disorder, but it is a developmental phenomenon. In fact, several researchers have shown that synesthetes can perform better on certain tests of memory and intelligence. Synesthetes as a group are not mentally ill. They test negative on scales that check for schizophrenia, psychosis, delusions, and other disorders.

Synesthesia Project | FAQ – Boston University

________________________________________________________________

What if some symptoms “assigned” by psychologists to Asperger’s Disorder and autism are merely manifestations of synesthesia?

“A friend of mine recently wrote, ‘My daughter just explained to me that she is a picky eater because foods (and other things) taste like colors and sometimes she doesn’t want to eat that color. Is this a form of synesthesia?’ Yes, it is.” – Karen Wang

We see in this graphic how synesthesia is labeled a “defect” that is “eradicated” by normal development (literally “pruned out”). People who retain types of integrated sensory experience are often artists, musicians, and other sensory innovators (chefs, interior designers, architects, writers). So, those who characterize “synesthesia” as a developmental defect are labeling the individuals who greatly enrich millions of human lives as “defectives”. – Psychology pathologizes the most admired and treasured creative human behavior.

No touching allowed! Once “sensory” categories have been labeled and isolated to locations in the brain, no “talking to” each other is allowed. The fact that this is a totally “unreal” scheme is ignored. Without smell, there IS NO taste…

________________________________________________________________

Infants Possess Intermingled Senses

Babies are born with their senses linked in synesthesia

originally published as “Infant Kandinskys”

What if every visit to the museum was the equivalent of spending time at the philharmonic? For painter Wassily Kandinsky, that was the experience of painting: colors triggered sounds. Now a study from the University of California, San Diego, suggests that we are all born synesthetes like Kandinsky, with senses so joined that stimulating one reliably stimulates another.

The work, published in the August issue of Psychological Science, has become the first experimental confirmation of the infant-synesthesia hypothesis—which has existed, unproved, for almost 20 years.

Researchers presented infants and adults with images of repeating shapes (either circles or triangles) on a split-color background: one side was red or blue, and the other side was yellow or green. If the infants had shape-color associations, the scientists hypothesized, the shapes would affect their color preferences. For instance, some infants might look significantly longer at a green background with circles than at the same green background with triangles. Absent synesthesia, no such difference would be visible.

The study confirmed this hunch. Infants who were two and three months old showed significant shape-color associations. By eight months the preference was no longer pronounced, and in adults it was gone altogether.

The more important implications of this work may lie beyond synesthesia, says lead author Katie Wagner, a psychologist at U.C.S.D. The finding provides insight into how babies learn about the world more generally. “Infants may perceive the world in a way that’s fundamentally different from adults,” Wagner says. As we age, she adds, we narrow our focus, perhaps gaining an edge in cognitive speed as the sensory symphony quiets down. (Sensory “thinking” is replaced by social-verbal thinking)

(Note: The switch to word-concept language dominance means that modern social humans LOSE the appreciation of “connectedness” in the environment; connectedness becomes limited to human-human social “reality”. The practice of chopping reality into isolated categories (word concepts) diminishes detail and erases the connections that link details into patterns. Hyper-social thinking is a “diminished” state of perception characteristic of neurotypicals.)

________________________________________________________

GREAT WEBSITE!!!

The Brain from Top to Bottom

thebrain.mcgill.ca/

McGill University
Explore topics such as emotion, language, and the senses at five levels of organization (from molecular to social) and three levels of explanation (from beginner … advanced)

The most important “developmental” fact of life

is death.

It just happens: We grow old. It’s a natural progression, without doubt. But not in the U.S., of course, where openly denying death is a frenzied passion. Getting old is a crime in a society terrified of “growing up” and becoming adult.

Old people are proof of the most basic facts of life, so much so, that being old has become taboo. And if one lives to the “new” expectation of 80 or so, that means 30 years of life beyond the new “old age” of 50. That’s a long time to “fake” being “young, beautiful, athletic and sexy”. 

Growing old is tough enough without a “new” set of instructions: don’t look old, don’t act old, don’t get sick, become feeble or need help (unless that help is covered by insurance). Don’t remind younger people, by your very presence, that there is an end; it is believed now that one can “look good” until the end – which will entail a short, or long, period of degeneration. This period of “old age” is rarely seen as a “good” time of life as valid as one’s childhood, young adulthood, or middle age, unless one has the funds to at least pretend to be “youngish”.

Contrary to popular American belief, it remains a fruitful time of personal development. As long as our bodies continue to function, learning and thinking continue to be what humans do.

If life has been one long illusion that only “social” rewards count, and life has been a display of materials owned, status achieved, people “bested”, then one will likely keep up the illusion, with whatever “solutions” the anti-aging industry has to offer.

I live in a town in which most people are “getting old” – not much opportunity for the young to work, to develop a career, to join the circus of material wealth and ambition. Traditionally, young people have returned to the area after college, and a stint in corporate America, time in the military, or success in finding a spouse. Having “grown up” in this unique place, it was where they chose to establish families and to be close to loved ones. The Wyoming landscape and lifestyle have always been a fundamental fact in this choice to return, and it pulls relentlessly on those who leave.

Disastrous policies, and frankly criminal wars, prosecuted from Washington D.C. in league with corporate-Wall Street crooks, and funded by abused taxpayers, demonstrate the general belief on both coasts that the people who inhabit the “rest of the U.S.” just don’t matter. We are treated as worthless and disposable inferiors, expected to enrich a ruling class that despises us, and to literally die for “blood” profits in its service.

Our town needs new people to survive as a community; we need children and young families, but opportunity is lacking. Small businesses are closing and not reopening: the owners have retired and are dying off. Competition from online retailers has siphoned off local spending and old people have very little to spend anyway. Every dime goes to necessities and the obscene cost of healthcare.

The American dream left our town long ago. Wyoming’s existence has been plagued by Federal and corporate control from the beginning, when the railroad opened the West to outright looting of its resources by faraway “global” entities. Pillage of the land and its resources funded the American coastal empires; exploitation of immigrants provided cheap labor. “Colonization” by U.S. and European nations was not limited to the invasion of “foreign lands” but happened here also – and continues to this day.

Native Americans (not being suited to corporate life and labor) were killed off with conscious purpose – a policy of mass murder; the remnants were confined to “reservations” where their descendants are expected to remain “invisible” – to wither away and eventually die off, a slow suicide of formerly unique human beings. Diversity? A smoke screen.

These thoughts occupy my meditations as I pass through a human being’s last opportunity for personal development. It’s a time of recognizing that the universe goes on without us; that our deepest questions will not be answered. It’s a time to understand that the individual cannot correct or improve much that goes on in an increasingly cluttered and entangled social world, which doesn’t mean that we ought not try to improve ourselves and our small areas of influence. Our lives are eventually “finished” for us by nature, in disregard for our insistence that our life is essential to the universe and therefore ought to go on forever.

____________________________________________

It is shocking to confront the fact that so much human effort, inventiveness, hard labor, suffering, and resource depletion was, and still is, devoted to the imaginary “immortality” of a few (not so admirable) individuals; Pharaohs, emperors, kings, dictators, war lords, ideologues, criminals, Popes and priests; not the best of humanity, but often the worst.

The big lie is an old lie: Immortality can be purchased. 

Yes, there is a pyramid for immortality also: the Pharaohs of our time will not be mummified. (Mummification is a crude process of desiccation, which has nevertheless been wildly successful socially! The Pharaohs continue to be A-list celebrities who attract fans of the “rich and famous”.)

Today’s 1% equivalents will not be made immortal by being dried out like fish, cheese or jerky – no, they will be made “immortal” by means of “sophisticated” technology. What an advancement in human civilization! 

These immortality technologies, and the lesser life-extension schemes that replace organs and skeletal architecture, part by failing part, are being promoted as “mankind’s future” – what a lie! As if today’s Pharaohs really intend to share their immortality with 15 billion humans!


Time magazine cover: “2045: The Year Man Becomes Immortal.” Right: all estimated 15 billion of us.

A few elite at the top may manage to purchase immortality of a limited sort: machines designed in their own image.

The mortal King Tut, a product of incest who died at age 19. How much human talent and potential has been wasted on fulfilling the fantasy of immortality for a predatory class of individuals?

It’s not King Tut, the Insignificant, who is immortal, but the lure of his “real estate” holdings, elite addresses, golden household furniture and knickknacks, layers of stone coffins, granite “countertops”, Jacuzzi bath tubs, fabulous jewelry, and rooms with a view of eternity, that keeps the envious modern social tourist coming back. 


This is not King Tut. This is a fabulous work of propaganda made by artisans (Pharaohs had to impress the Gods in order to become a god; you wouldn’t show up for “judgement day” in anything less than the most impressive selections from your wardrobe) who rarely get credit (they remain nameless) for their “creation of brands and products” that supplies the magical connections necessary for supernatural belief in the pyramid of social hierarchy as the “definitive and absolute model” of the cosmos.

Magic consists of the “transfer of power” between the “immortal mask” and the unimpressive person; the “mask” has become King Tut in the belief system of the socially-obsessed viewer.

Notorious Abuser of Autistic Children / Bruno Bettelheim

How many psychologists, teachers, education and treatment center employees are “child abusers” masquerading as “child saviors”? The recent and long overdue exposure of sexual predators ought to generate investigations into “just who” are the predators in the autism industry.

How many are employed despite phony or inadequate credentials? 

_____________________________

Bruno Bettelheim (August 28, 1903 – March 13, 1990) was an infamous child psychologist. He earned a degree in philosophy, writing a dissertation relating to the history of art. He was interested in psychology for much of his life but never studied it formally.

After buying his release from a concentration camp, he traveled to the United States, where, by fraudulent means, he presented himself as a professor of psychology. He claimed that the Nazis had destroyed proof of his credentials. Shockingly, he was hired as director of the Sonia Shankman Orthogenic School at the University of Chicago, a home for emotionally disturbed children.

He suffered from depression and committed suicide in 1990; after his suicide, evidence of Bettelheim’s dark side began to surface. Although many of his counsellors at the Orthogenic School considered him brilliant and admirable, others called him a cruel tyrant.

Although untrained in analysis, Bettelheim was a Freudian fundamentalist. Bettelheim was convinced, in spite of overwhelming evidence to the contrary, that autism had no organic basis but was caused entirely by cold mothers, whom he dubbed “refrigerator mothers,” and absent fathers. “All my life,” he wrote, “I have been working with children whose lives have been destroyed because their mothers hated them.” Other Freudian analysts, as well as scientists who were not psychiatrists, followed Bettelheim in blaming mothers for their child’s autism. Bettelheim’s work has been discredited.

___________________________________________________________________________

A personal aside: I grew up in a suburb of Chicago. Family friends had a son who was diagnosed autistic. After many attempts to find help, and not finding any, the parents were referred to Bruno Bettelheim at the Orthogenic School, where their son became a resident. After a few weeks, the mother was devastated by the verbal abuse that she endured. Bettelheim blamed her for her son’s difficulties. She was attacked for being highly educated and accomplished: a type of profiling that Bettelheim used to discredit and shame mothers of autistic children. In addition, when visiting her son she discovered evidence of beatings; such was the unassailable reputation of Bruno Bettelheim that this poor woman FELT GUILTY for even questioning his authority. The family was devastated and eventually torn apart, and the son remained at the “school” much too long, given the evidence of abuse.

This rampant “denial” of abuse is NORMAL in the U.S. due to social protection of “high class” psychopaths.

_________________________________________________________________________________

Notice the reaction of alarm on Dick Cavett’s face while listening to this dangerous man and his seriously twisted assertion that ONLY HE cares about autistic children, whom “the world” and “parents” want “dead.”

____________________________________________________________________________________

Bruno Bettelheim’s abuse of autistic children in his care points to an all-too-familiar pattern of protecting child abusers in the United States.

Bruno Bettelheim arrived in the USA without any credentials in psychiatry or psychotherapy, but was appointed Director of the University of Chicago’s Sonia Shankman Orthogenic School for disturbed (autistic) children; in 1956 the school received a Ford Foundation research grant of nearly half a million dollars. Bettelheim’s meteoric success, without qualification for the position, points to the classic “social skills” that manipulate others into “trusting” the psychopath, who then gains entry into a high-status class in the social hierarchy. If he could “fool” personnel at the University of Chicago, what else was possible? This false presentation of knowledge, skills and trustworthiness is the classic way that abusers gain access to children.

Psychopaths use tales of personal experiences to build cults of personality. Bettelheim was held in Dachau and Buchenwald for ten and a half months in 1938-9 and believed that he saw a valid parallel between the behavior of autistic children and prisoners who had given up hope: they avoided eye contact, refused to eat, and became completely passive and zombie-like. If children with autism acted like this, it could only be because their mothers were like Nazi guards. This wildly twisted conclusion found acceptance and propagation in the American media, and became the notorious refrigerator mother theory of autism.

According to Bettelheim’s employees and colleagues, physical and emotional abuse was a part of everyday life at the Orthogenic School, with children living as trapped and terrified prisoners. Bettelheim’s most serious defect was his lack of interest in genetic, medical, and constitutional factors in autism. Punishment, specifically slapping and hitting children in the presence of other “students” and teachers, and verbally shaming children, was an ongoing form of “therapy.”

High-functioning psychopaths “get away with” abuse of children by using social and political skill. To this day, Bettelheim’s abuse of autistic children is excused by many in the field of child psychology on the basis of his high position in the hierarchy: “The Great Man” delusion permits criminal behavior as the “privilege” of those with high academic and social status.

Mental Development / Genetics of Visual Attention

Twin study finds genetics affects where children look, shaping mental development

https://www.sciencedaily.com/releases/2017/11/171109131152.htm

November 9, 2017 / Indiana University

A study that tracked the eye movement of twins has found that genetics plays a strong role in how people attend to their environment.

Conducted in collaboration with researchers from the Karolinska Institute in Sweden, the study offers a new angle on the emergence of differences between individuals and the integration of genetic and environmental factors in social, emotional and cognitive development. This is significant because visual exploration is also one of the first ways infants interact with the environment, before they can reach or crawl.

“The majority of work on eye movement has asked ‘What are the common features that drive our attention?'” said Daniel P. Kennedy, an assistant professor in the IU Bloomington College of Arts and Sciences’ Department of Psychological and Brain Sciences. “This study is different. We wanted to understand differences among individuals and whether they are influenced by genetics.”

Kennedy and co-author Brian M. D’Onofrio, a professor in the department, study neurodevelopmental problems from different perspectives. This work brings together their contrasting experimental methods: Kennedy’s use of eye tracking for individual behavioral assessment and D’Onofrio’s use of genetically informed designs, which draw on data from large population samples to trace the genetic and environmental contributions to various traits. As such, it is one of the largest-ever eye-tracking studies.

In this particular experiment, the researchers compared the eye movements of 466 children — 233 pairs of twins (119 identical and 114 fraternal) — between ages 9 and 14 as each child looked at 80 snapshots of scenes people might encounter in daily life, half of which included people. Using an eye tracker, the researchers then measured the sequence of eye movements in both space and time as each child looked at the scene. They also examined general “tendencies of exploration”; for example, if a child looked at only one or two features of a scene or at many different ones.

Published Nov. 9 in the journal Current Biology, the study found a strong similarity in gaze patterns within sets of identical twins, who tended to look at the same features of a scene in the same order. It found a weaker but still pronounced similarity between fraternal twins.

This suggests a strong genetic component to the way individuals visually explore their environments: insofar as both identical and fraternal twins each share a common environment with their twin, the researchers can infer that the more robust similarity in the eye movements of identical twins is likely due to their shared genetic makeup. The researchers also found that they could reliably match a twin with his or her sibling from among a pool of unrelated individuals based on their shared gaze patterns – a novel method they termed “gaze fingerprinting.”
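The inference in the paragraph above – that a stronger gaze similarity among identical (MZ) twins than among fraternal (DZ) twins implies a genetic contribution – is often quantified in twin research with Falconer’s classic formula, h² = 2(r_MZ − r_DZ). A minimal sketch of that arithmetic; the correlation values below are invented for illustration and are NOT the study’s published figures:

```python
# Falconer's classic estimate of broad-sense heritability from twin data:
# MZ twins share ~100% of their genes, DZ twins ~50%, so the excess
# MZ similarity is attributed to genetics: h^2 = 2 * (r_MZ - r_DZ).

def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Heritability estimate from MZ and DZ within-pair correlations."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical within-pair similarity values (not the study's numbers):
r_mz = 0.80  # gaze-pattern similarity within identical pairs
r_dz = 0.50  # gaze-pattern similarity within fraternal pairs

h2 = falconer_h2(r_mz, r_dz)
print(f"estimated heritability of gaze patterns: {h2:.2f}")  # prints 0.60
```

If MZ and DZ pairs were equally similar, the estimate would be zero, i.e. no genetic component; the larger the MZ excess, the larger the inferred genetic influence.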

“People recognize that gaze is important,” Kennedy said. “Our eyes are moving constantly, roughly three times per second. We are always seeking out information and actively engaged with our environment, and ultimately where you look affects your development.”

After early childhood, the study suggests, genes influence the environments individuals create for themselves at the micro-level, through the immediate, moment-to-moment selection of visual information.

“This is not a subtle statistical finding,” Kennedy said. “How people look at images is diagnostic of their genetics. Eye movements allow individuals to obtain specific information from a space that is vast and largely unconstrained. It’s through this selection process that we end up shaping our visual experiences.

“Less known are the biological underpinnings of this process,” he added. “From this work, we now know that our biology affects how we seek out visual information from complex scenes. It gives us a new instance of how biology and environment are integrated in our development.”

“This finding is quite novel in the field,” D’Onofrio said. “It is going to surprise people in a number of fields, who do not typically think about the role of genetic factors in regulating such processes as where people look.”

_____________________________________________________

Comment: 

(Note: Many individuals can learn the “scientific method” – techniques, procedures and the use of math – without having an “understanding” of “physical reality”. This is a problem in American “science” today.)

Why is the Asperger “attentional preference” for “physical reality” labeled a developmental defect? Because modern social humans BELIEVE that only the social environment EXISTS!

This “narrow” field of attention in modern social humans is the result of domestication / neoteny. The “magical thinking” stage of childhood development is carried into adulthood. This “arrested development” retains the narcissistic infantile perception of reality.  

A genetic basis for this “perceptual” knowledge of reality would support the Asperger “Wrong Planet” sense of alienation from neurotypical social environments. Our “real world” orientation is not a “defect” – our perception is that of an adult Homo sapiens. The hypersocial “magical” perception of the environment is that of the self-centered infant, whose very survival depends on the manipulation of “big mysterious beings” (parents – puppeteers) who make up the infant’s ENTIRE UNIVERSE.  

The Neurotypical Universe

Journal Reference:

  1. Daniel P. Kennedy, Brian M. D’Onofrio, Patrick D. Quinn, Sven Bölte, Paul Lichtenstein, Terje Falck-Ytter. Genetic Influence on Eye Movements to Complex Scenes at Short Timescales. Current Biology, 2017 DOI: 10.1016/j.cub.2017.10.007