One of THOSE Discussions / God, Free Will and Absurdities

This post has gained momentum from having one of those “late night” discussions with a friend – the type that is popular when one is in college, a bit drunk (or otherwise deranged) and which, as one gets older and wiser, one vows to never again participate in. The gist of the argument was:

Determinism (God) is totally compatible with Free Will (The Declaration of Independence), so we have both.

I could stop right here, because this “set up” is thoroughly American “wacky” thinking. It demonstrates the absolute belief that “America” is a special case = exemption from reality, that was/is made possible by American Democracy (in case you weren’t aware, democracy is not a political creation of human origin) which came about by an Act of God. “Freedom” is a basic American goal: Free Will is therefore a mandatory human endowment (by virtue of the word Free appearing in both “concepts”). God created everything, so he must have created Free Will. Jesus is a kind of “sponge” that suffices to “soak up” all those bad choices Free Will allows, that is, if you turn over all your choices, decisions and Free Will to Jesus.

The irony is that this absurd, pointless discussion “cleared the air” over previously unspoken conflict with a dear friend, like blowing up the Berlin Wall; getting it out of the way, and establishing that friendship is not “rational” at all, but an agreement about what really matters; good intentions carried into actions, loyalty and a simple “rightness” – agreement on what constitutes “good behavior” on the part of human beings and a pledge of one’s best effort to stick to that behavior.

This entire HUGE neurotypical debate is nonsense.

God has nothing to do with Free Will, the Laws of physics, or any scientific pursuit of explanations for “the universe”. The whole reason for God’s existence is that He, or She, or They are totally outside the restrictions of “physical reality”. That’s what SUPERNATURAL means. So all the “word concept” machinations over “God” and “science” – from both ends of the false dichotomy – are absurd. Free Will is also a non-starter “concept” in science: reality proceeds from a complex system of “facts” and mathematical relationships that cannot be “free-willed” away.

Total nonsense.

If one believes in the “supernatural” origin of the universe as a creation of supernatural “beings, forces and miraculous acts”, then one does not believe in physical reality at all: “Physics” is a nonexistent explanation for existence. One can only try to coerce, manipulate, plead with, and influence the “beings” that DETERMINE human fate. Free Will is de facto an absurdity, conceived of as something like the Amendments to the U.S. Constitution (inspired by God, after all – not really by the intelligence of the people who wrote them). In American thought, (political) rights grant permission to “do whatever I want”; the responsibility connected to rights has been conveniently forgotten. Free Will, in this context, is nothing more than intellectual, moral and ethical “cheating”.

So, the immense, complicated, false dichotomy of Determinism vs. Free Will – and the absurd 2,000+ year-old philosophical waste of time that has followed, and continues – is, at least in the U.S., very simple:

Whatever I do, is God’s Will: Whatever you do, isn’t. 


Male Beards / Covering up a Weak Chin?

The contemporary “love affair” that men are having with their ability to grow facial hair may be a reaction to the feminization (neoteny) of the male face that has been a trend for decades. Ironically, soldiers sent to Iraq and Afghanistan, who grew beards in order to “fit in” with ideals of manhood in those cultures, have encouraged the new “manly man” tradition.


The most creepy facial hair of all: The long and scraggly Patriarchal Old Testament, ‘we hate women’ beard. This style says, “I don’t know what a woman is, and I don’t want to know.”

______________________________________

I intended to write a post concerning “facial expression & mind reading.” Psychologists have made quite a big deal out of their contention that Asperger people are devoid of the ability to “read” the messages sent human to human via facial expressions and body language, and that this phantom “ability” must be displayed by an individual in order to be classified as “normal” or fully human. Beyond the arrogance of this declaration – which, to begin with, ignores cultural traditions and differences – one simply cannot get past the physical questions that must be answered before the psychologists’ definition of “human” can stand.

If facial expressions are necessary to human to human communication, doesn’t extensive facial hair negatively impact this “social ability”?


If you go hairy, you had better have the face and body to back it up. A beard does not “hide” a neotenic face. 

How does reading faces apply to earlier generations of males, and to the many cultures around the world that favor or demand varying amounts of facial hair? Shaving is a product of complex cultures, beginning notably with the Egyptians, who removed facial and body hair because it harbored dirt and lice. Other ancient cultures, including the Greeks, treated beard growth as the transition to adult obligations and benefits. Ancient Germanic males grew both long hair and full beards. The Romans made a ritual of a young male’s first shave, and then favored a clean face. Of course, growing a beard also depends on having hairy ancestors – or does it?


Top: Roman Hercules Bottom: Greek Hercules (Lysippus)


Reconstructions of Early Homo sapiens and his Neanderthal contemporary

Right: Do we actually know how hairy early Homo species were? It would seem that without evidence, artists settle on a 5-day growth or scruffy short beard. Does a beard cover a “weak” Neanderthal chin?

The image of archaic humans, notably Neanderthals, as hairy and unkempt Cave Men has influenced how we interpret hairiness or hairlessness in Homo sapiens. Hair is extremely important in both favorable and unfavorable ways: hair can be a haven for disease and parasites; we need only look to the large amount of time that apes and monkeys spend grooming each other for lice, time that could be spent looking for food, learning about the environment, and practicing skills.

Growing hair requires energy. Our large human brain consumes about 20% of the energy our body generates. It could be that the growth of the modern brain (beginning with Homo erectus) was intricately tied up in a slow feedback cycle: the brain produces energy-saving inventions (fire, tools, clothing, travel to more abundant environments), which frees more energy for the brain, which can increase brain connections, which makes further technical innovation possible, which frees still more energy for the brain. So technology could be seen as part of streamlining the human animal into an energy-conserving species, which in turn improves brain function. In other words, the brain benefits from its own thinking when that thinking becomes a set of “apps” that manipulate the environment and the human body.

Meanwhile, what about facial hair? Personally, I’m thankful that I live in a time when men have the choice to grow, or not to grow.

 

____________________________________________________________________________


Beard Guys / Two Best

Shut up and fight / Thank the gods; Vikings is back 11/29

Shut up and bake / A guy who looks scrumptious in a beard and can make a perfect pie crust? Sign me up!

Neanderthal mtDNA / From an Early Modern Human, 220,000+ Years Ago

Fact or Baloney…read on…

Neandertals and modern humans started mating early

For almost a century, Neandertals were considered the ancestors of modern humans. But in a new plot twist in the unfolding mystery of how Neandertals were related to modern humans, it now seems that members of our lineage were among the ancestors of Neandertals. Researchers sequenced ancient DNA from the mitochondria—tiny energy factories inside cells—from a Neandertal who lived about 100,000 years ago in southwest Germany. They found that this DNA, which is inherited only from the mother, resembled that of early modern humans.

After comparing the mitochondrial DNA (mtDNA) with that of other archaic and modern humans, the researchers reached a startling conclusion: A female member of the lineage that gave rise to Homo sapiens in Africa mated with a Neandertal male more than 220,000 years ago—much earlier than other known encounters between the two groups. Her children spread her genetic legacy through the Neandertal lineage, and in time her African mtDNA completely replaced the ancestral Neandertal mtDNA.

Other researchers are enthusiastic about the hypothesis, described in Nature Communications this week, but caution that it will take more than one genome to prove. “It’s a nice story that solves a cool mystery—how did Neandertals end up with mtDNA more like that of modern humans,” says population geneticist Ilan Gronau of the Interdisciplinary Center Herzliya in Israel. But “they have not nailed it yet.”

 The study adds to a catalog of ancient genomes, including mtDNA as well as the much larger nuclear genomes, from more than a dozen Neandertals. Most of these lived at the end of the species’ time on Earth, about 40,000 to 50,000 years ago. Researchers also have analyzed the complete nuclear and mtDNA genomes of another archaic group from Siberia, called the Denisovans. The nuclear DNA suggested that Neandertals and Denisovans were each other’s closest kin and that their lineage split from ours more than 600,000 years ago. But the Neandertal mtDNA from these samples posed a mystery: It was not like Denisovans’ and was closely related to that of modern humans—a pattern at odds with the ancient, 600,000 year divergence date. Last year Svante Pääbo’s team at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, offered a startling solution: Perhaps the “Neandertal” mtDNA actually came from modern humans.

______________________________

Strange! Everything I’ve read previously has said that Neanderthal mtDNA was not at all similar to any H. sapiens mtDNA haplogroups.

______________________________

In the new study, paleogeneticists Johannes Krause and Cosimo Posth of the Max Planck Institute for the Science of Human History in Jena, Germany, test this wild idea with ancient mtDNA from a Neandertal thighbone found in 1937 in the Hohlenstein-Stadel cave (HST) in Germany. Isotopes in animal bones found with the Neandertal suggest that it lived in a woodland known to have vanished at least 100,000 years ago.

Researchers compared the coding region of the HST Neandertal’s mtDNA with that of 17 other Neandertals, three Denisovans, and 54 modern humans. The HST Neandertal’s mtDNA was significantly different even from that of proto-Neandertals that date to 430,000 years ago at Sima de los Huesos in Spain, suggesting that their mtDNA had been completely replaced. But the HST sample was also surprisingly distinct from that of other Neandertals, allowing researchers to build a phylogenetic tree and study how Neandertal mtDNA evolved over time.

Using modern humans’ mtDNA mutation rate to calculate the timing, the researchers conclude that the HST mtDNA split from that of all other Neandertals at least 220,000 years ago. The ancient H. sapiens’ mtDNA must have entered the Neandertal lineage before this time, but after 470,000 years ago, the earliest date for when modern human and Neandertal mtDNA diverged. That’s early enough for the new form of mtDNA to have spread among Neandertals and replaced all their mtDNA.
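The arithmetic behind a date like “at least 220,000 years ago” is standard molecular-clock math: after two lineages split, both keep accumulating mutations, so their pairwise difference grows at twice the per-lineage rate. A minimal sketch, with illustrative inputs chosen only to land near the quoted figure (the actual study calibrates its mutation rate from dated samples):

```python
# Molecular-clock arithmetic: pairwise differences accumulate along BOTH
# lineages after a split, hence the factor of 2 in the denominator.
# The inputs below are illustrative, not the paper's measured values.

def divergence_time_years(diff_per_site, rate_per_site_per_year):
    """Estimate time since two lineages split from their pairwise distance."""
    return diff_per_site / (2 * rate_per_site_per_year)

t = divergence_time_years(diff_per_site=0.011, rate_per_site_per_year=2.5e-8)
print(f"Estimated split: ~{t:,.0f} years ago")  # ~220,000 years ago
```

The same formula, run in reverse, is how a known calibration date yields the mutation rate in the first place.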

“The mtDNA of Neandertals is not actually from Neandertals, but from an early modern human from Africa,” Krause says. The researchers speculate that this key mating may have happened in the Middle East, where early H. sapiens may have ventured. Other researchers find the scenario remarkable but plausible. “It seems magical but this type of thing happens all the time … especially if the populations are very small,” Gronau says. For example, the mtDNA in some grizzly bears has been completely replaced by that of polar bears, Krause says.

But some experts say DNA from other Neandertals is needed to prove that their mtDNA was inherited entirely from an early H. sapiens rather than from an ancient ancestor the two groups shared. “Is there other evidence of another [early] mtDNA introgression event?” asks Chris Stringer of the Natural History Museum in London.

Not yet, Posth says. Pääbo is seeking evidence of early gene swapping by trying to get nuclear DNA from the HST Neandertal and others. “We will learn a lot about the population history of Neandertals over the next few years,” he says.


doi:10.1126/science.aan70

 

The Whoa! Whoa! Whoa! Reaction / Neanderthal Myths

The “Whoa! Whoa! Whoa!” reaction is what happens when I read articles written for public consumption that “boil down” science for the “educated public” – those who are genuinely interested in the physical universe, but may or may not  have a science background. One of my favorite examples is how Neanderthals are “created” out of the modern social typical penchant (and temperamental obligation) to write stories (myths) from scant, contradictory or preliminary information.


The claim that Neanderthals were “dumb” is dumb. Are these skulls to scale?


Science Shows Why You’re Smarter Than a Neanderthal

Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition

By Joseph Stromberg Smithsonian.com March 12, 2013

https://www.smithsonianmag.com/science-nature/science-shows-why-youre-smarter-than-a-neanderthal-1885827/ Full article

COMMENTS: This article hits the Whoa! Stop! barrier before getting past the subhead. “Neanderthal brains had more capacity devoted to vision and body control, with less left over for social interactions and complex cognition.”

  1. This view of the brain as having a “capacity” related to volume – like a closet that can be packed with X amount of clothing and Y amount of shoes, so that if you want to add more shoes or ski equipment, you have to remove the clothes to make room – defies what we know (and brag about endlessly) about the brain: it’s built of networks that connect across regions and functions, and these are PLASTIC – that is, “able to rewire in reaction to the environment.” This blows apart much of what the article has to say.
  2. Visual thinking is judged to be INFERIOR, low-level cognition. Tell that to a hawk, raven or eagle; to giant squid and octopuses; and to the myriad species that perceive the environment through various segments of the electromagnetic spectrum. This opinion is based in ignorance and in the noises made by the perpetual cheerleaders for Homo sapiens, who believe humans are the pinnacle of evolution and that whatever “we” do is therefore de facto superior.
  3. Which brings us to the question, if human abilities are superior, why must we compensate for our lack of sensory, cognitive and physical abilities by inventing technology? The average “know-it-all” American CONSUMES the products invented and developed by a handful of creative people in each generation. Knowledge is purchased in the form of “gadgets” that for the most part, do not educate, but distract the average individual from pursuing direct experience and interaction with the environment.
  4. Which means, “we” cognitive masterminds are taking a whole lot of credit for adaptations that are INHERITED from our “inferior, stupid, ancestors” who over the previous 200,000 years, not only survived, but built the culture that made us modern humans –
  5. Which comes to the egregious error of ignoring context: Compare an imaginary modern social human who exists in a context that is utterly dependent on manmade systems that supply food, water, shelter, medical care, economic opportunity, government control, cultural benefits and instant communication with a Neanderthal (or archaic Homo sapiens) whose environment is a largely uninhabited wilderness. One of the favorite clichés of American entertainment is “Male Monsters of Survival” cast into the wilderness (with a film crew and helicopter on call) recreating the Myth of Homo sapiens, Conqueror of Nature. These overconfident males are often lucky to last a week; injuries are common, starvation the norm.
  6. If visual thinking is so inferior, why do hunters rely on airplane and helicopter “flyovers” to locate game, and now drones, and add scopes, binoculars, game cameras,  and a multitude of “sensory substitutes” to their repertoire? Ever been to a sporting goods store? They’re packed with every possible gadget that will improve the DIMINISHED senses and cognitive ability of modern social humans to function outside of manmade environments and to be successful hunters and fishermen.
  7. As for forcing Neanderthals into extinction, modern social humans could accomplish this: we have a horrific history of wiping out indigenous peoples and continue to destroy not only human groups, but hundreds of species and the environments they are adapted to. Modern social humans could bomb Neanderthals “back to the Stone Age”. Kill them off with chemical weapons, shred them with cluster bombs, the overkill of targeted assassination and nuclear weapons.
  8. BUT there is no proof that archaic Homo sapiens “extincted” Homo neanderthalensis. We know that in some areas they lived cheek by jowl, had sex and produced offspring, yet modern social humans maintain that Neanderthals were so “socially stupid” that the entire species fell to the magnificence of party-hearty Homo sapiens. Actually, a modern social human would have had difficulty distinguishing the two fearsome types: the challenge may have been like telling a polar bear from a grizzly – two close relatives adapted to different environments – a distinction that is rather irrelevant if you’re facing down either one with a sharp stick.
  9. The myth that Homo sapiens individuals outside of Africa “contain” a variable 1-4% of Neanderthal DNA, with specific “snips” (SNPs) related to various functions in modern humans, is incomplete. Rarely included in articles about how Homo sapiens and Neanderthals are connected are whole-genome sequencing results, which show that, overall, the Homo sapiens genome is even now all but identical to the Neanderthal genome. This is logical: the divergence between chimpanzees and the African great apes (us) occurred 5-6 m.y.a., and yet the human and chimp genomes still share about 99% of their DNA. How similar, then, must the Neanderthal and Denisovan genomes be to ours? This is a simple math question.
  10. What we need to compare is the Neanderthal genome and the ARCHAIC Homo sapiens genome – two groups of humans who were contemporaries.
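The “simple math question” in point 9 can be made explicit. A back-of-envelope sketch, assuming a crude linear clock calibrated on the round numbers quoted above (roughly 1% human-chimp difference over roughly 6 million years); real substitution rates are not perfectly linear, so treat the output as order-of-magnitude only:

```python
# Crude linear scaling: expected genome-wide difference grows in proportion
# to divergence time. Calibration uses the text's round numbers (~1% over
# ~6 m.y.); the 600,000-year split date is the figure quoted for our split
# from the Neanderthal/Denisovan lineage.

def expected_difference(split_mya, calib_split_mya=6.0, calib_diff=0.01):
    """Scale expected genome-wide difference linearly with split time."""
    return calib_diff * (split_mya / calib_split_mya)

d = expected_difference(0.6)  # split ~600,000 years ago
print(f"Expected difference: {d:.2%}, expected identity: {1 - d:.2%}")
# -> roughly 0.10% different, i.e. ~99.9% identical
```

Which is the point: on a whole-genome scale, Neanderthals and Denisovans are expected to be about an order of magnitude closer to us than chimpanzees are.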


Baboons, Social Typicals, Aspergers / STRESS

The usual human approach: stress is a killer; modern social environments are high-stress; so let’s “engineer” humans to be able to tolerate high stress. What about changing environments so that human beings experience less stress? Of course not: that would benefit the average human. This is about what the top of the hierarchy wants – change the peasants so that they can live with extreme stress –

This article has dire implications for those of us who are born “Asperger” or with other neurodiverse brain types. 

_____________________________________________________

http://www.wired.com/2010/07/ff_stress_cure/

by: Jonah Lehrer

Under Pressure: The Search for a Stress Vaccine

Excerpts: 

Baboons are nasty, brutish, and short. They have a long muzzle and sharp fangs designed to inflict deadly injury. Their bodies are covered in thick, olive-colored fur, except on their buttocks, which are hairless. The species is defined by its social habits: The primates live in groups of several dozen individuals. These troops have a strict hierarchy, and each animal is assigned a specific rank. While female rank is hereditary — a daughter inherits her mother’s status — males compete for dominance. These fights can be bloody, but the stakes are immense: A higher rank means more sex. The losers, in contrast, face a bleak array of options — submission, exile, or death.

In 1978, Robert Sapolsky was a recent college graduate with a degree in biological anthropology and a job in Kenya. He had set off for a year of fieldwork by himself among baboons… here he was in Nairobi, speaking the wrong kind of Swahili and getting ripped off by everyone he met. Eventually he made his way to the bush, a sprawling savanna filled with zebras and wildebeests and elephants…

Sapolsky slowly introduced himself to a troop of baboons, letting them adjust to his presence. After a few weeks, he began recognizing individual animals, giving them nicknames from the Old Testament. It was a way of rebelling against his childhood Hebrew-school teachers, who rejected the blasphemy of Darwinian evolution…

Before long, Sapolsky’s romantic vision of fieldwork collided with the dismal reality of living in the African bush. (The baboons) seemed to devote all of their leisure time — and baboon life is mostly leisure time — to mischief and malevolence. “One of the first things I discovered was that I didn’t like baboons very much,” he says. “They’re quite awful to one another, constantly scheming and backstabbing. They’re like chimps but without the self-control.”

____________________________________________________________

Baboon behavior compared with modern humans: One advantage of the “bipedal stance” – showing off “the junk”. Could the female be “twerking”?

Olive baboon male standing on his hind legs watching a female presenting her rear (Papio cynocephalus anubis). Maasai Mara National Reserve, Kenya. Feb 2009.

____________________________________________________________

While Sapolsky was disturbed by the behavior of the baboons — this was nature, red in tooth and claw — he realized that their cruelty presented an opportunity to investigate the biological effects of social upheaval. He noticed, for instance, that the males at the bottom of the hierarchy were thinner and more skittish. “They just didn’t look very healthy,” Sapolsky says. “That’s when I began thinking about how damn stressful it must be to have no status. You never know when you’re going to get beat up. You never get laid. You have to work a lot harder for food.”

(Asperger types – is this us?)

So Sapolsky set out to test the hypothesis that the stress involved in being at the bottom of the baboon hierarchy led to health problems…“It struck most doctors as extremely unlikely that your feelings could affect your health. Viruses, sure. Carcinogens, absolutely. But stress? No way.” Sapolsky, however, was determined to get some data… Instead, he was busy learning how to shoot baboons with anesthetic darts and then, while they were plunged into sleep, quickly measure their immune system function and the levels of stress hormones and cholesterol in their blood….

A similarly destructive process is at work in humans. While doctors speculated for years that increasing rates of cardiovascular disease in women might be linked to the increasing number of females employed outside the home, that correlation turned out to be nonexistent. Working women didn’t have more heart attacks. There were, however, two glaring statistical exceptions to the rule: Women developed significantly more heart disease if they performed menial clerical work or when they had an unsupportive boss. The work, in other words, wasn’t the problem. It was the subordination.

(Female gender = subordinate in modern social hierarchy.)

One of the most tragic aspects of the stress response is the way it gets hardwired at a young age — an early setback can permanently alter the way we deal with future stressors. The biological logic of this system is impeccable: If the world is a rough and scary place, then the brain assumes it should invest more in our stress machinery, which will make us extremely wary and alert. There’s also a positive feedback loop at work, so that chronic stress actually makes us more sensitive to the effects of stress.

The physiology underlying this response has been elegantly revealed in the laboratory. When lab rats are stressed repeatedly, the amygdala — an almond-shaped nub in the center of the brain — enlarges dramatically. (See post  on amygdala, hippocampus) (This swelling comes at the expense of the hippocampus, which is crucial for learning and memory and shrinks under severe stress.) The main job of the amygdala is to perceive danger and help generate the stress response; it’s the brain area turned on by dark alleys and Hitchcock movies. Unfortunately, a swollen amygdala means that we’re more likely to notice potential threats in the first place, which means we spend more time in a state of anxiety. (This helps explain why a more active amygdala is closely correlated with atherosclerosis.) The end result is that we become more vulnerable to the very thing that’s killing us.

__________________________________________________________________________________________

Imagine you are a newborn: everything about you “looks normal” and your parents show you off; send photos to friends and relatives. They coo and gurgle over your parents’ splendid achievement. A Perfect Baby.
Then reality sets in: Human parents are obsessed with the fear of giving birth to a less-than-perfect baby. Can we deny the social pressure endured by an infant and parent, when parents “freak out” over a growing suspicion that their child is “abnormal”? They rush the child to the “witch doctor” – the expert, the authority, the interpreter of all human behavior; the priest or priestess who has the power to decide the fate of a child as a member of its society. What power over individual destinies these “judges” have!  
In American culture, it is the medical / behavioral industry which decides whether or not a child is conforming to a rigid schedule of physical, social, emotional and mental development. This used to be “the job” of religious authorities (and still is in many communities), but the “Helping Caring Fixing” industry has become a “co-religion” for many believers.  
An imaginary epidemic of “defective children” has grown into a reign of terror in contemporary American culture: children are labeled, isolated, shamed, bullied and virtually discarded; drugged into submission, simply for being children. 

Baboons: 

Baboons are African and Arabian Old World monkeys belonging to the genus Papio, part of the subfamily Cercopithecinae. The five species are some of the largest non-hominoid members of the primate order; only the mandrill and the drill are larger. Baboons use at least 10 different vocalizations to communicate with other members of the troop. Wikipedia

Scientific name: Papio / Lifespan: Guinea baboon: 35 – 45 years
Height: Olive baboon: 2.3 ft.
Mass: Hamadryas baboon: 44 – 66 lbs; Olive baboon: 22 – 82 lbs; Guinea baboon: 29 – 57 lbs

 


Who’s Safe With a Gun? Don’t Ask a Shrink

The Daily Beast, May 2013 Background Checks

guns-mattel-swscan08492

Forget any guidance from psychiatry’s bible, the DSM-5, when it comes to background checks for gun buyers, writes the psychotherapist author of The Book of Woe. (Gary Greenberg)

Many years ago, a man I was seeing in therapy decided he wanted to take up a new hobby: high explosives. The state he lived in licensed purchasers of dynamite and other incendiaries only after a background check. He wanted to know: Would I write a letter declaring him fit to blow up stuff in his backyard for fun?

Aside from the fact that this was how he wanted to pass the weekend, I didn’t have any reason to think otherwise, so I gave him the note. He got the license. A few years after he stopped seeing me, I had occasion to visit him at his office. He had all his digits and limbs and, to my knowledge, had committed no antisocial acts with his legally obtained explosives. My note attesting to his mental health was framed on his wall.

I’ve been thinking about this guy recently, ever since our politicians’ imaginations have fastened upon background checks as the solution to our gun problems. I’ve also been thinking about a couple of other patients. One of them, a middle-aged professional, a ramrod-straight retired Marine, father of a little girl, faithful husband, the kind of man who buys a special lockbox just for transporting his weapon between home and gun club. The other: a 27-year-old hothead, an absentee father who never met a drug or a woman he didn’t like. His idea of fun was riding his motorcycle between lanes on the interstate at 100 mph, and he was the proud owner of (by his count) 37 guns. In the three years prior to arriving at my office, he’d been fired from four jobs, arrested for six or seven driving offenses and a few drug charges, and helped to bury three of his friends who met untimely and violent ends.

No one asked me which of these two men I’d rather was a gun owner, let alone which one ought to have a firearms license. But I know what my answer would have been. Or I would have known until about a year ago, when the ex-Marine, inexplicably and without warning (although he’d just been put on an antidepressant as part of a treatment for chronic pain), sat at the base of the tree holding his favorite deer perch and shot himself in the mouth. Meantime, the hothead has cooled down. He’s been with the same woman for two years and the same job for one. He sees his son faithfully twice a week. He’s sold his motorcycle and more than half of his guns, and become obsessed with bodybuilding and responsibility. The transformation is not complete—he’s still dead certain the government wants to come to his house and confiscate what’s left of his arsenal, for instance—and I can’t take too much credit for it. He’s pursuing the pleasures of self-control with the same manic intensity as he once chased adrenaline. But I’m not all that worried about his guns anymore, and I’m really glad no one asked me if he should have them.

Because one thing they don’t teach you in therapy school: how to tell the future. Clinicians can assemble a story out of the ashes of a person’s life; we might even be able to spot what we think are the seeds of catastrophe, but we generally do that best in retrospect. And that’s why, if one of us insists he or she knows for sure what’s coming next, you should find another therapist. It’s also why, to the extent that background checks involve people like me, it wouldn’t do much more than reassure politicians that they are doing something about gun violence without simultaneously threatening their National Rifle Association ratings.

But wait a minute, you may be saying. Don’t mental-health workers have a whole huge book of diagnoses to turn to that can help you assess a person’s fitness to own a gun? No, we don’t. We have the book, of course, the Diagnostic and Statistical Manual of Mental Disorders, which is about to come out in its fifth edition. But while some of those disorders seem incompatible with responsible gun ownership, even a diagnosis of a severe mental illness like schizophrenia or bipolar disorder isn’t a good predictor of who is going to become violent. Indeed, only about 4 percent of violent crimes are committed by mentally ill people. We are not going to diagnose our way to safety.

There’s a reason for this. A diagnosis of a mental disorder is only a description of a person’s troubles. A neurologist presented with a patient suffering loss of coordination and muscle weakness can run tests and diagnose amyotrophic lateral sclerosis or a brain tumor. They can explain the symptoms and predict with some accuracy what will happen as the disease takes its expected course. The 200 or so diagnoses in the DSM, on the other hand, explain little and predict less. Until the book contains a diagnosis called Mass Slaughter Disorder, whose criteria would include having committed mass slaughter, it’s not going to offer much guidance on the subject, and, obviously, what guidance it provides is going to come too late.

With the mentally disordered, as with all of us (and let’s remember that in any given year, something like 30 percent of us will meet criteria for a mental disorder, and 11 percent of us are on antidepressants right now), there is no telling what will happen next. No matter how many diagnoses are in the DSM, and no matter how astutely they are used, they will not tell us in whose hands guns are safe. The psyche is more unfathomable, and evil more wily, than any doctor or any book.
Infant Synesthesia / A Developmental Stage

No, synesthesia is not a symptom of disorder, but it is a developmental phenomenon. In fact, several researchers have shown that synesthetes can perform better on certain tests of memory and intelligence. Synesthetes as a group are not mentally ill. They test negative on scales that check for schizophrenia, psychosis, delusions, and other disorders.

Synesthesia Project | FAQ – Boston University

________________________________________________________________

What if some symptoms “assigned” by psychologists to Asperger’s Disorder and autism are merely manifestations of synesthesia?

“A friend of mine recently wrote, ‘My daughter just explained to me that she is a picky eater because foods (and other things) taste like colors and sometimes she doesn’t want to eat that color. Is this a form of synesthesia?’ Yes, it is.” – Karen Wang

We see in this graphic how synesthesia is labeled a “defect” that is “eradicated” by normal development (literally “pruned out”). People who retain types of integrated sensory experience are often artists, musicians, and other sensory innovators (chefs, interior designers, architects, writers). So, those who characterize “synesthesia” as a developmental defect are labeling individuals who greatly enrich millions of human lives as “defectives”. – Psychology pathologizes the most admired and treasured creative human behavior.

No touching allowed! Once “sensory” categories have been labeled and isolated to locations in the brain, no “talking to” each other is allowed. The fact that this is a totally “unreal” scheme is ignored. Without smell, there IS NO taste…

________________________________________________________________

Infants Possess Intermingled Senses

Babies are born with their senses linked in synesthesia

originally published as “Infant Kandinskys”

What if every visit to the museum was the equivalent of spending time at the philharmonic? For painter Wassily Kandinsky, that was the experience of painting: colors triggered sounds. Now a study from the University of California, San Diego, suggests that we are all born synesthetes like Kandinsky, with senses so joined that stimulating one reliably stimulates another.

The work, published in the August issue of Psychological Science, is the first experimental confirmation of the infant-synesthesia hypothesis – which has existed, unproved, for almost 20 years.

Researchers presented infants and adults with images of repeating shapes (either circles or triangles) on a split-color background: one side was red or blue, and the other side was yellow or green. If the infants had shape-color associations, the scientists hypothesized, the shapes would affect their color preferences. For instance, some infants might look significantly longer at a green background with circles than at the same green background with triangles. Absent synesthesia, no such difference would be visible.

The study confirmed this hunch. Infants who were two and three months old showed significant shape-color associations. By eight months the preference was no longer pronounced, and in adults it was gone altogether.

The more important implications of this work may lie beyond synesthesia, says lead author Katie Wagner, a psychologist at U.C.S.D. The finding provides insight into how babies learn about the world more generally. “Infants may perceive the world in a way that’s fundamentally different from adults,” Wagner says. As we age, she adds, we narrow our focus, perhaps gaining an edge in cognitive speed as the sensory symphony quiets down. (Sensory “thinking” is replaced by social-verbal thinking)

(Note: The switch to word-concept language dominance means that modern social humans LOSE the appreciation of “connectedness” in the environment – connectedness becomes limited to human-human social “reality”. The practice of chopping up reality into isolated categories (word concepts) diminishes detail and erases the connections that link details into patterns. Hyper-social thinking is a “diminished” state of perception characteristic of neurotypicals.)

________________________________________________________

GREAT WEBSITE!!!

The Brain from Top to Bottom

thebrain.mcgill.ca/

McGill University
Explore topics such as emotion, language, and the senses at five levels of organization (from molecular to social) and three levels of explanation (from beginner … advanced)

The most important “developmental” fact of life

is death.

It just happens: We grow old. It’s a natural progression, without doubt. But not in the U.S., of course, where openly denying death is a frenzied passion. Getting old is a crime in a society terrified of “growing up” and becoming adult.

Old people are proof of the most basic facts of life – so much so that being old has become taboo. And if one lives to the “new” expectation of 80 or so, that means 30 years of life beyond the new “old age” of 50. That’s a long time to “fake” being “young, beautiful, athletic and sexy”.

Growing old is tough enough without a “new” set of instructions: don’t look old, don’t act old, don’t get sick, don’t become feeble or need help (unless that help is covered by insurance). Don’t remind younger people, by your very presence, that there is an end; it is believed now that one can “look good” until the end – which will entail a short, or long, period of degeneration. This period of “old age” is rarely seen as a “good” time of life, as valid as one’s childhood, young adulthood, or middle age, unless one has the funds to at least pretend to be “youngish”.

Contrary to popular American belief, it remains a fruitful time of personal development. As long as our bodies continue to function, learning and thinking continue to be what humans do.

If life has been one long illusion that only “social” rewards count, and life has been a display of materials owned, status achieved, people “bested”, then one will likely keep up the illusion, with whatever “solutions” the anti-aging industry has to offer.

I live in a town in which most people are “getting old” – not much opportunity for the young to work, to develop a career, to join the circus of material wealth and ambition. Traditionally, young people have returned to the area after college, and a stint in corporate America, time in the military, or success in finding a spouse. Having “grown up” in this unique place, it was where they chose to establish families and to be close to loved ones. The Wyoming landscape and lifestyle have always been a fundamental fact in this choice to return, and it pulls relentlessly on those who leave.

Disastrous policies, and frankly criminal wars, prosecuted from Washington D.C. in league with corporate-Wall Street crooks, and funded by abused taxpayers, demonstrate the general belief on both coasts that the people who inhabit the “rest of the U.S.” just don’t matter. We are treated as worthless and disposable inferiors, expected to enrich a ruling class that despises us, and to literally die for “blood” profits in its service.

Our town needs new people to survive as a community; we need children and young families, but opportunity is lacking. Small businesses are closing and not reopening: the owners have retired and are dying off. Competition from online retailers has siphoned off local spending and old people have very little to spend anyway. Every dime goes to necessities and the obscene cost of healthcare.

The American dream left our town long ago. Wyoming’s existence has been plagued by Federal and corporate control from the beginning, when the railroad opened the West to outright looting of its resources by faraway “global” entities. Pillage of the land and its resources funded the American coastal empires; exploitation of immigrants provided cheap labor. “Colonization” by U.S. and European nations was not limited to the invasion of “foreign lands” but happened here also – and continues to this day.

Native Americans (not being suited to corporate life and labor) were killed off with conscious purpose – a policy of mass murder; the remnants were confined to “reservations” where their descendants are expected to remain “invisible” – to wither away and eventually die off, a slow suicide of formerly unique human beings. Diversity? A smoke screen.

These thoughts occupy my meditations as I pass through a human being’s last opportunity for personal development. It’s a time of recognizing that the universe goes on without us; that our deepest questions will not be answered. It’s a time to understand that the individual cannot correct or improve much that goes on in an increasingly cluttered and entangled social world – which doesn’t mean that we ought not try to improve ourselves and our small areas of influence. Our lives are eventually “finished” for us by nature, in disregard for our insistence that our life is essential to the universe and therefore ought to go on forever.

____________________________________________

It is shocking to confront the fact that so much human effort, inventiveness, hard labor, suffering, and resource depletion was, and still is, devoted to the imaginary “immortality” of a few (not so admirable) individuals: Pharaohs, emperors, kings, dictators, warlords, ideologues, criminals, Popes and priests – not the best of humanity, but often the worst.

The big lie is an old lie: Immortality can be purchased. 

Yes, there is a pyramid for immortality-mortality also: the Pharaohs of our time will not be mummified. (A crude process of desiccation, which, however, has been wildly successful socially! They continue to be A-list celebrities who attract fans of the “rich and famous”.)

Today’s 1% equivalents will not be made immortal by being dried out like fish, cheese or jerky – no, they will be made “immortal” by means of “sophisticated” technology. What an advancement in human civilization! 

These immortality technologies, and lesser life-extension schemes – the replacement of organs and skeletal architecture, part by failing part – are being promoted as “mankind’s future”. What a lie! As if today’s Pharaohs really intend to share their immortality with 15 billion humans!

(Time magazine cover: “2045: The Year Man Becomes Immortal.” Right: by all estimates, 15 billion of us.)

A few elite at the top may manage to purchase immortality of a limited sort: machines designed in their own image.

The mortal King Tut, a product of incest who died at age 19. How much human talent and potential has been wasted on fulfilling the fantasy of immortality for a predatory class of individuals?

It’s not King Tut, the Insignificant, who is immortal, but the lure of his “real estate” holdings, elite addresses, golden household furniture and knickknacks, layers of stone coffins, granite “countertops”, Jacuzzi bath tubs, fabulous jewelry, and rooms with a view of eternity, that keeps the envious modern social tourist coming back. 


This is not King Tut. This is a fabulous work of propaganda made by artisans who rarely get credit (nameless) for their “creation of brands and products”. (Pharaohs had to impress the Gods in order to become a god – you wouldn’t show up for “judgement day” in anything less than the most impressive selections from your wardrobe.) Their creations supply the magical connections necessary for supernatural belief in the pyramid of social hierarchy as the “definitive and absolute model” of the cosmos.

Magic consists of the “transfer of power” between the “immortal mask” and the unimpressive person; the “mask” has become King Tut in the belief system of the socially-obsessed viewer.