One of THOSE Discussions / God, Free Will and Absurdities

This post grew out of one of those “late night” discussions with a friend – the type that is popular when one is in college and a bit drunk (or otherwise deranged), and which, as one gets older and wiser, one vows never to participate in again. The gist of the argument was:

Determinism (God) is totally compatible with Free Will (The Declaration of Independence), so we have both.

I could stop right here, because this “set up” is thoroughly American “wacky” thinking. It demonstrates the absolute belief that “America” is a special case = exemption from reality, that was/is made possible by American Democracy (in case you weren’t aware, democracy is not a political creation of human origin) which came about by an Act of God. “Freedom” is a basic American goal: Free Will is therefore a mandatory human endowment (by virtue of the word Free appearing in both “concepts”). God created everything, so he must have created Free Will. Jesus is a kind of “sponge” that suffices to “soak up” all those bad choices Free Will allows, that is, if you turn over all your choices, decisions and Free Will to Jesus.

The irony is that this absurd, pointless discussion “cleared the air” over previously unspoken conflict with a dear friend, like blowing up the Berlin Wall; getting it out of the way, and establishing that friendship is not “rational” at all, but an agreement about what really matters; good intentions carried into actions, loyalty and a simple “rightness” – agreement on what constitutes “good behavior” on the part of human beings and a pledge of one’s best effort to stick to that behavior.

This entire HUGE neurotypical debate is nonsense.

God has nothing to do with Free Will, the Laws of physics, or any scientific pursuit of explanations for “the universe”. The whole reason for God’s existence is that He, or She, or They are totally outside the restrictions of “physical reality”. That’s what SUPERNATURAL means. So all the “word concept” machinations over “God” and “science” – from both ends of the false dichotomy – are absurd. Free Will is also a non-starter “concept” in science: reality proceeds from a complex system of “facts” and mathematical relationships that cannot be “free-willed” away.

Total nonsense.

If one believes in the “supernatural” origin of the universe as a creation of supernatural “beings, forces and miraculous acts” then one does not believe in physical reality at all: “Physics” is a nonexistent explanation for existence. One can only try to coerce, manipulate, plead with, and influence the “beings” that DETERMINE human fate. Free Will is de facto an absurdity, conceived of as something like the Amendments to the U.S. Constitution (inspired by God, after all – not really by the intelligence of the people who wrote it). In American thought, (political) rights grant permission to “do whatever I want”. The concept of responsibility connected to rights has been conveniently forgotten. Free Will, in this context, is nothing more than intellectual, moral and ethical “cheating”.

So, the immense, complicated, false dichotomy of Determinism vs. Free Will, and the absurd 2,000+ year old philosophical waste of time that has followed, and continues, is very simple (at least in the U.S.):

Whatever I do, is God’s Will: Whatever you do, isn’t. 


Emergence of “humans” / Berkeley.edu + Comments


Simplified socio-cultural guide to identifying male / female.

 

The evolution of Primates – Gender dimorphism /

Top: Orangutan male and female. Middle: Modern social human; all “cases” of allowable bathroom use. Bottom: Idiot’s guide to gender ID; U.S.

 

Low sexual dimorphism in modern social humans? Really? Sexual dimorphism is created culturally in humans, and wow! Gender assignment is all mixed up! In fact, one might observe that body alteration, decoration, behavior and costume are how Homo sapiens compensates for being a strange hairless ape, born without the elaborate fur, plumage, texture, color and behavioral displays of other species. We “copy” other animals and utilize materials in the environment to socially broadcast our sex and gender – from the violent hyper male to the “big boob” sex object that is the “ideal” American woman. Some cultures disguise or blur a person’s sex / gender. Neoteny promotes childlike appearance in males and females – the current trend is toward androgyny.

Any questions about this guy’s gender? 


Old school “gun”


Below: Modern neotenic “feminized” male – androgyny is the popular goal.


__________________________________________________________________________________________

How bizarre can the “story” of human evolution get?

The following chapter, “The Emergence of Humans,” is from Berkeley.edu, a site about evolution for students. I confess that to my Asperger type of thinking, this review of evolutionary studies is excruciating. One (dumb) point of view is especially mind-boggling: that chimpanzees are a legitimate focus of “study and research” into ancestral humans and modern human behavior, merely because “they are alive” and eligible for torture in labs; they don’t have “souls” or “suffer.” And they appeal to neotenic social humans by scoring high on the “cute” scale.

The apparent inability of researchers to get past this 19th C. world view is stunning; instead of a thorough examination of assumptions across disciplines, we again see “warfare” between disciplines, and the ongoing attempt to assemble a human “dinosaur” from bits and pieces of fossilized thinking. In fact, paleontology has exploded with new ideas since “old” dinosaur reconstructions were discovered to be highly inaccurate. Hint, hint.

FOUND! The last common ancestor of Humans and Chimps.


Berkeley.edu / The emergence of humans

The narratives of human evolution are oft-told and highly contentious. There are major disagreements in the field about whether human evolution is more like a branching tree or a crooked stick, depending partly on how many species one recognizes. Interpretations of almost every new find will be sure to find opposition among other experts. Disputes often center on diet and habitat, and whether a given animal could occasionally walk bipedally or was fully upright. What can we really tell about human evolution from our current understanding of the phylogenetic relations of hominids and the sequence of evolution of their traits?

Hominid evogram

(consistency problem)

To begin with, let’s take a step back. Although the evolution of hominid features is sometimes put in the framework of “apes vs. humans,” the fact is that humans are apes, just as they are primates and mammals. A glance at the evogram shows why. The other apes — chimp, bonobo, gorilla, orangutan, gibbon — would not form a natural, monophyletic group (i.e., a group that includes all the descendants of a common ancestor) — if humans were excluded. Humans share many traits with other apes, and those other “apes” (i.e., non-human apes) don’t have unique features that set them apart from humans. Humans have some features that are uniquely our own, but so do gorillas, chimps, and the rest. Hominid evolution should not be read as a march to human-ness (even if it often appears that way from narratives of human evolution). Students should be aware that there is not a dichotomy between humans and apes. Humans are a kind of ape.

Virtually all systematists and taxonomists agree that we should only give names to monophyletic groups. However, this evogram shows that this guideline is not always followed. For an example, consider Australopithecus. On the evogram you can see a series of forms, from just after Ardipithecus to just before Homo in the branching order, that are all called Australopithecus. (Even Paranthropus is often considered an australopithecine.) But as these taxa appear on the evogram, “Australopithecus” is not a natural group, because it is not monophyletic: some forms, such as A. africanus, are found to be closer to humans than A. afarensis and others. Beyond afarensis, for example, all other Australopithecus and Homo share “enlarged cheek teeth and jaws,” because they have a more recent common ancestor. Eventually, several of these forms will have to have new genus names if we want to name only monophyletic groups. Students should avoid thinking of “australopithecines” as a natural group with uniquely evolved traits that link its members together and set it apart from Homo. Instead they should focus on the pattern of shared traits among these species and the Homo clade, recognizing that each species in this lineage gains more and more features that are shared by Homo.

In popular fiction and movies, the concept of the wild “ape-man” is often that of a tree-living, vine-swinging throwback like Tarzan. However, the pantheon of hominids is much richer than this, as the evogram shows, with forms as different as Paranthropus and Ardipithecus. For example, imagine going back in time to the common ancestor of humans and chimps (including bonobos). What did that common ancestor look like? In the Origin of Species Darwin noted that the extinct common ancestor of two living forms should not be expected to look like a perfect intermediate between them. Rather, it could look more like one branch or the other branch, or something else entirely.

Found! The last common ancestor of humans and chimps.

Did the common ancestor of humans and chimps conform to the ape-man myth and live in the trees, swinging from vines? To answer this, we have to focus not only on anatomy but on behavior, and we have to do it in a phylogenetic context. Apes such as the gibbon and orangutan, which are more distantly related to humans, are largely arboreal (i.e., tree-living). The more closely related apes such as the gorilla and chimps are relatively terrestrial, although they can still climb trees. The feet of the first hominids have a considerable opposition of the big toe to the others but relatively flat feet, as arboreal apes generally do. But other features of their skeleton, such as the position of the foramen magnum underneath the skull, the vertically shortened and laterally flaring hips, and the larger head of the femur, suggest that they were not just mainly terrestrial but habitually bipedal, unlike their knuckle-walking relatives. Most evidence suggests that the hominid lineage retained some of the anatomical features related to arboreal life and quadrupedal gait even after it had evolved a more terrestrial lifestyle and a bipedal gait. There is no fossil record of these behaviors, but the balance of the available evidence supports the hypothesis that the hominid ancestor was terrestrial and bipedal.

Much discussion in human paleontology surrounds the evolution of a bipedal, upright stance. When and why did this occur? One thing to keep in mind is that “bipedal” and “upright” are not equivalent terms. An animal can be bipedal without having a vertical backbone (think T. rex). It seems clear from the fossil record of hominids that habitual bipedality preceded the evolution of a recurved spine and upright stance. Other changes in the gait, such as how the relatively “splayed” gait of chimps evolved into the gait of humans, who put one foot directly in front of the other, involve studying the hip joint, the femur, and the foot. The famous Laetoli footprints attributed to Australopithecus afarensis are bipedal, but they are still relatively splayed compared to the tracks of living humans. (WOW! they are doing it again despite their own caution: humans did not evolve from chimpanzees!)

Another extremely interesting feature in hominid evolution is the degree of sexual dimorphism (i.e., physical differences between the sexes) in different species. Sexual dimorphism is linked to features of sociality and mate competition in many sorts of animals. To understand the evolution of this feature in humans, which have relatively low sexual dimorphism, we need to consider the other apes, in which sexual dimorphism tends to be moderate to high (with exceptions). 

(Again, culture is utterly ignored: the fact is, women and men “self-morph” according to socio-cultural “genders” into very dimorphic animals.)

We don’t have sufficient evidence about Sahelanthropus, Orrorin, and Ardipithecus to understand much about sex differences in these species, but we do know that A. afarensis had relatively high sexual dimorphism: the males were considerably larger than the females. The difference seems to have been less in A. africanus, Paranthropus, and most of the Homo lineage. The evolutionary explanation for A. afarensis’ dimorphism is not entirely clear. The larger males may have used their size to attract females and/or repel rivals, which would fit with an explanation based on sexual selection. Or the males and females may have been differently sized because they played different roles in their groups, the males hunting and gathering and the females caring for the young. Darwin thought that this differentiation of the sexes may have played a critical role in human evolution, but we simply do not know much about the role of this feature in A. afarensis. Some, all, or none of these functions may have been in play. (Novel-writing again! If we don’t have facts about a subject, why not say so? Speculation becomes dogma in the “magic word syndrome” social mind, and people argue over imaginary histories and qualities. Also – I suspect that once again the writers have “EuroAmerican” humans in mind regarding sexual dimorphism: why?)

We do know that by the time the animals known as Homo evolved, they could make tools, and their hands were well suited for complex manipulations. These features were eventually accompanied by the reduction of the lower face, particularly the jaws and teeth, the recession of the brow, the enlargement of the brain, the evolution of a more erect posture, and the evolution of a limb more adapted for extended walking and running (along with the loss of arboreally oriented features). The evogram shows the hypothesized order of acquisition of these traits. Yet each of the Homo species was unique in its own way, so human evolution should not be seen as a simple linear progression of improvement toward our own present-day form. (But, we show it that way, anyway!)

More…. Should you need a mind-boggling experience:

https://en.wikibooks.org/wiki/Survey_of_Communication_Study/Chapter_13_-_Gender_Communication

And to clarify all this: 

Male Beards / Covering up a Weak Chin?

The contemporary “love affair” that men are having with their ability to grow facial hair may be a reaction to the feminization (neoteny) of the male face that has been a trend for decades. Ironically, soldiers sent to Iraq and Afghanistan, who grew beards in order to “fit in” with ideals of manhood in those cultures, have encouraged the new “manly man” tradition.

No. Possibly the most unattractive type of beard: the Old Testament, patriarchal, “we hate women” facial hair.

The most creepy facial hair of all: The long and scraggly Patriarchal Old Testament, ‘we hate women’ beard. This style says, “I don’t know what a woman is, and I don’t want to know.”

______________________________________

I intended to write a post concerning “facial expression & mind reading.” Psychologists have made quite a big deal out of their contention that Asperger people are devoid of the ability to “read” the messages sent human to human via facial expressions and body language, and that this phantom “ability” must be displayed by an individual in order to be classified as “normal” or fully human. Setting aside the arrogance of this declaration – which, to begin with, ignores cultural traditions and differences – one simply cannot get past asking questions about the physical aspects that must be addressed in order to support the definition of “human” that psychologists have derived.

If facial expressions are necessary to human to human communication, doesn’t extensive facial hair negatively impact this “social ability”?


If you go hairy, you had better have the face and body to back it up. A beard does not “hide” a neotenic face. 

How does reading faces apply to earlier generations of males, and the many cultures around the world, that favor or demand that men grow varying amounts of facial hair? Shaving is a product of modern cultures, beginning notably with the Egyptians, who removed facial and body hair because it harbored dirt and lice. Other ancient cultures, including the Greeks, used beard growth to mark the transition to adult obligations and benefits. Ancient Germanic males grew both long hair and full beards. The Romans made a ritual of a young male’s first shave, and then favored a clean face. Of course, growing a beard also depends on having hairy ancestors – or does it?


Top: Roman Hercules Bottom: Greek Hercules (Lysippus)

Reconstructions of Early Homo sapiens and a Neanderthal contemporary


Right: Do we actually know how hairy early Homo species were? It would seem that without evidence, artists settle on a 5-day growth or scruffy short beard. Does a beard cover a “weak” Neanderthal chin?

The image of archaic humans, notably Neanderthals, as hairy and unkempt Cave Men has influenced how we interpret hairiness or hairlessness in Homo sapiens. Hair is extremely important in both favorable and unfavorable ways: hair can be a haven for disease and parasites; we need only look to the large amount of time that apes and monkeys spend grooming each other for lice, time that could be spent looking for food, learning about the environment, and practicing skills.

Growing hair requires energy. Our large human brain consumes about 20% of the energy that our body generates. It could be that the growth of the modern brain (beginning with Homo erectus) was intricately tied up in a slow feedback cycle: the brain produces energy-saving inventions (fire, tools, clothing, travel to more abundant environments), which means more energy to devote to the brain, which can increase brain connections, which makes increased technical innovation possible, which frees more energy for the brain. So, technology could be seen as part of streamlining the human animal into an energy-conserving species, which in turn improves brain function. In other words, the brain benefits from its own thinking when that thinking becomes a set of “apps” that manipulate the environment and the human body.

Meanwhile, what about facial hair? Personally, I’m thankful that I live in a time when men have the choice to grow, or not to grow.

 

____________________________________________________________________________


The most important “developmental” fact of life

is death.

It just happens: We grow old. It’s a natural progression, without doubt. But not in the U.S., of course, where openly denying death is a frenzied passion. Getting old is a crime in a society terrified of “growing up” and becoming adult.

Old people are proof of the most basic facts of life, so much so, that being old has become taboo. And if one lives to the “new” expectation of 80 or so, that means 30 years of life beyond the new “old age” of 50. That’s a long time to “fake” being “young, beautiful, athletic and sexy”. 

Growing old is tough enough without a “new” set of instructions; don’t look old, act old, get sick, become feeble or need help (unless that help is covered by insurance.) Don’t remind younger people, by your very presence, that there is an end; it is believed now that one can “look good” until the end – which will entail a short, or long, period of degeneration. This period of “old age” is rarely seen as a “good” time of life as valid as one’s childhood, young adulthood, or middle age, unless one has the funds to at least pretend to be “youngish”.

Contrary to popular American belief, it remains a fruitful time of personal development. As long as our bodies continue to function, learning and thinking continue to be what humans do.

If life has been one long illusion that only “social” rewards count, and life has been a display of materials owned, status achieved, people “bested”, then one will likely keep up the illusion, with whatever “solutions” the anti-aging industry has to offer.

I live in a town in which most people are “getting old” – not much opportunity for the young to work, to develop a career, to join the circus of material wealth and ambition. Traditionally, young people have returned to the area after college, and a stint in corporate America, time in the military, or success in finding a spouse. Having “grown up” in this unique place, it was where they chose to establish families and to be close to loved ones. The Wyoming landscape and lifestyle have always been a fundamental fact in this choice to return, and it pulls relentlessly on those who leave.

Disastrous policies, and frankly criminal wars, prosecuted from Washington D.C. in league with corporate-Wall Street crooks, and funded by abused taxpayers, demonstrate the general belief on both coasts that the people who inhabit the “rest of the U.S.” just don’t matter. We are indeed worthless and disposable inferiors, willing to enrich a ruling class that despises us, and to literally die for “blood” profits in its service.

Our town needs new people to survive as a community; we need children and young families, but opportunity is lacking. Small businesses are closing and not reopening: the owners have retired and are dying off. Competition from online retailers has siphoned off local spending and old people have very little to spend anyway. Every dime goes to necessities and the obscene cost of healthcare.

The American dream left our town long ago. Wyoming’s existence has been plagued by Federal and corporate control from the beginning, when the railroad opened the West to outright looting of its resources by faraway “global” entities. Pillage of the land and its resources funded the American coastal empires; exploitation of immigrants provided cheap labor. “Colonization” by U.S. and European nations was not limited to the invasion of “foreign lands” but happened here also – and continues to this day.

Native Americans (not being suited to corporate life and labor) were killed off with conscious purpose – a policy of mass murder; the remnants were confined to “reservations” where their descendants are expected to remain “invisible” – to wither away and eventually die off, a slow suicide of formerly unique human beings. Diversity? A smoke screen.

These thoughts occupy my meditations as I pass through a human being’s last opportunity for personal development. It’s a time of recognizing that the universe goes on without us; that our deepest questions will not be answered. It’s a time to understand that the individual cannot correct or improve much that goes on in an increasingly cluttered and entangled social world, which doesn’t mean that we ought not try to improve ourselves and our small areas of influence. Our lives are eventually “finished” for us by nature, in disregard for our insistence that our life is essential to the universe and therefore ought to go on forever.

____________________________________________

It is shocking to confront the fact that so much human effort, inventiveness, hard labor, suffering, and resource depletion was, and still is, devoted to the imaginary “immortality” of a few (not so admirable) individuals; Pharaohs, emperors, kings, dictators, war lords, ideologues, criminals, Popes and priests; not the best of humanity, but often the worst.

The big lie is an old lie: Immortality can be purchased. 

Yes, there is a pyramid for immortality-mortality also: The Pharaohs of our time will not be mummified. (A crude process of desiccation, which however has been wildly socially successful! They continue to be A-list celebrities that attract fans of the “rich and famous”.)

Today’s 1% equivalents will not be made immortal by being dried out like fish, cheese or jerky – no, they will be made “immortal” by means of “sophisticated” technology. What an advancement in human civilization! 

These immortality technologies, and the lesser life-extension technologies that replace organs and skeletal architecture, part by failing part, are being promoted as “mankind’s future” – What a lie! As if today’s Pharaohs really intend to share their immortality with 15 billion humans!


2045: The year Man becomes Immortal. Right – all of an estimated 15 billion of us?

A few elite at the top may manage to purchase immortality of a limited sort: machines designed in their own image.

The mortal King Tut, a product of incest who died at age 19. How much human talent and potential has been wasted on fulfilling the fantasy of immortality for a predatory class of individuals?

It’s not King Tut, the Insignificant, who is immortal, but the lure of his “real estate” holdings, elite addresses, golden household furniture and knickknacks, layers of stone coffins, granite “countertops”, Jacuzzi bath tubs, fabulous jewelry, and rooms with a view of eternity, that keeps the envious modern social tourist coming back. 


This is not King Tut. This is a fabulous work of propaganda made by artisans (Pharaohs had to impress the Gods in order to become a god – you wouldn’t show up for “judgement day” in anything less than the most impressive selections from your wardrobe), artisans who rarely get credit (they remain nameless) for their “creation of brands and products” that supply the magical connections necessary for supernatural belief in the pyramid of social hierarchy as the “definitive and absolute model” of the cosmos.

Magic consists of the “transfer of power” between the “immortal mask” and the unimpressive person; the “mask” has become King Tut in the belief system of the socially-obsessed viewer.  

Mystified Asperger Female / Sexual Assault and the Media

I shouldn’t have to say this, but I will: Any assault on another person is an assault. The “measure of severity” and consequence-punishment is a socio-cultural determination. Sexual assault has traditionally been considered a separate and “special” case, with various cultures having very different attitudes, customs and laws surrounding “who owns” a person’s body. It is a subject basic also to slavery; slavery is “ownership of body, soul and mind” of another human being. Traditionally, females have been subject to “ownership”, from outright slavery, to marriage customs, to “simply being inferior” by virtue of being biologically female – and by supposedly being little more than a child in “intelligence” and self-actuation. This has been the social condition of females for all of recorded history.

Much of how modern social humans “view” sex – and the myriad complications heaped on what is a biological necessity by hundreds of thousands of discussions, negotiations, codes, laws, practices, controls, moral-ethical stances, criminal statutes, marriage contracts and the consequent “control” of children – is rooted in this concept of “ownership”.

The qualitative and functional hierarchy goes like this:

Men own women.

Men own children, because men own women.

Men choose when and where to have sex with women and children.

UH-OH! That’s a recipe for male-on-male conflict, which is of immense threat to society.

The hierarchy forms:

Top Males choose for lesser males. (The history of male access to females is clear about this being extremely important). There’s a distinct “Top Predator” hierarchy of “sexual privilege”. That hierarchy of restricted access to sex is one very big reason why males want to be “Top Males”. (See Genghis Khan and his Y haplogroup.)

This “set up” hasn’t changed just because a bunch of American women have decided, in the last century or so, that this is a fundamentally “bad system” for females. (Me included). The campaign for equality with men – in fact, for opportunity and aspiration – has largely been the purview of women who have had the opportunities for education, work and personal expression due to family circumstance and expectations. Class distinctions.

The current “eruption” of female anger toward an inarguably predatory “sexual” culture comes from women who have managed to “gain some measure of power” – in media, politics, entertainment – essentially $$$$. It’s politics, pure and simple. Why wouldn’t women who have gained a foothold in the status, power, and wealth hierarchy “turn on” males, who are now their “equals” in politics, business and media-entertainment; that is, “competitors”? And, the traditional male hierarchy “permits” and even requires that younger males “knock off” Top Males who are “declining in potency”.

Meanwhile. What about the other 99% of men and women?

Most cannot afford to do anything but “slog on” trying to find the ways and means to have a decent life. A revolution is well underway that affects all of us. Men benefit from having strong female partners; they must learn not to abuse women who are adding much “good” to their lives. At the risk of being “optimistic”, which goes against my practical Asperger instincts, I would say that most men understand this, but they are up against the male “way of being” as dictated by thousands of years of cultural tradition, in a way that is fundamentally different than the experience of females. Males are “somebody” by virtue of being male. No matter how low on the pyramid they fall, there is always 50% of the population they “outrank”. Embracing equality requires a profound individual rejection of male tyranny.

Blood-Sucking Parasites in D.C. / Know Your Predators

Elf You!

Social Security, Treasury target taxpayers for their parents’ decades-old debts

April 10, 2014

A few weeks ago, with no notice, the U.S. government intercepted Mary Grice’s tax refunds from both the IRS and the state of Maryland. Grice had no idea that Uncle Sam had seized her money until some days later, when she got a letter saying that her refund had gone to satisfy an old debt to the government — a very old debt.

When Grice was 4, back in 1960, her father died, leaving her mother with five children to raise. Until the kids turned 18, Sadie Grice got survivor benefits from Social Security to help feed and clothe them. Now, Social Security claims it overpaid someone in the Grice family — it’s not sure who — in 1977. After 37 years of silence, four years after Sadie Grice died, the government is coming after her daughter. Why the feds chose to take Mary’s money, rather than her surviving siblings’, is a mystery.

Across the nation, hundreds of thousands of taxpayers who are expecting refunds this month are instead getting letters like the one Grice got, informing them that because of a debt they never knew about — often a debt incurred by their parents — the government has confiscated their check. The Treasury Department has intercepted $1.9 billion in tax refunds already this year — $75 million of that on debts delinquent for more than 10 years, said Jeffrey Schramek, assistant commissioner of the department’s debt management service. The aggressive effort to collect old debts started three years ago — the result of a single sentence tucked into the farm bill lifting the 10-year statute of limitations on old debts to Uncle Sam. No one seems eager to take credit for reopening all these long-closed cases.

A Social Security spokeswoman says the agency didn’t seek the change; ask Treasury. Treasury says it wasn’t us; try Congress. Congressional staffers say the request probably came from the bureaucracy.

The only explanation the government provides for suddenly going after decades-old debts comes from Social Security spokeswoman Dorothy Clark: “We have an obligation to current and future Social Security beneficiaries to attempt to recoup money that people received when it was not due.”

Since the drive to collect on very old debts began in 2011, the Treasury Department has collected $424 million in debts that were more than 10 years old. Those debts were owed to many federal agencies, but the one that has many Americans howling this tax season is the Social Security Administration, which has found 400,000 taxpayers who collectively owe $714 million on debts more than 10 years old. The agency expects to have begun proceedings against all of those people by this summer.

“It was a shock,” said Grice, 58. “What incenses me is the way they went about this. They gave me no notice, they can’t prove that I received any overpayment, and they use intimidation tactics, threatening to report this to the credit bureaus.”
Grice filed suit against the Social Security Administration in federal court in Greenbelt this week, alleging that the government violated her right to due process by holding her responsible for a $2,996 debt supposedly incurred under her father’s Social Security number.
Social Security officials told Grice that six people — Grice, her four siblings and her father’s first wife, whom she never knew — had received benefits under her father’s account. The government doesn’t look into exactly who got the overpayment; the policy is to seek compensation from the oldest sibling and work down through the family until the debt is paid. 
 

The Federal Trade Commission, on its Web site, advises Americans that “family members typically are not obligated to pay the debts of a deceased relative from their own assets.” But Social Security officials say that if children indirectly received assistance from public dollars paid to a parent, the children’s money can be taken, no matter how long ago any overpayment occurred.

“While we are responsible for collecting delinquent debts owed to taxpayers, we understand the importance of ensuring that debtors are treated fairly,” Treasury’s Schramek said in a statement responding to questions from The Washington Post. He said Treasury requires that debtors be given due process.

Social Security spokeswoman Clark, who declined to discuss Grice’s or any other case, even with the taxpayer’s permission, said the agency is “sensitive to concerns about our attempts to arrange repayment of overpayments.” She said that before taking any money, Social Security makes “multiple attempts to contact debtors via the U.S. Mail and by phone.”

Empirical Planet Blog / Critique of All Things Neurotypical

http://empiricalplanet.blogspot.com

About Empirical Planet:  (Jason) I’m a neuroscience PhD student and hopeless news junkie. I’m passionate about bringing empirical thinking to the masses. @jipkin

Blog written by a fellow cranky complainer and hopeless believer in converting the masses to a love of reality, which is a pointless endeavor:

(Posted 07/2003) This idea of there being a “primitive” brain crops up all over the place, and from reputable sources: “They suggest that new learning isn’t simply the smarter bits of our brain such as the cortex ‘figuring things out.’ Instead, we should think of learning as interaction between our primitive brain structures and our more advanced cortex. In other words, primitive brain structures might be the engine driving even our most advanced high-level, intelligent learning abilities.” (Picower Professor of Neuroscience Earl Miller, MIT, said that.)

“It’s like adding scoops to an ice cream cone. So if you imagine the lizard brain as a single-scoop ice cream cone, the way you make a mouse brain out of a lizard brain isn’t to throw the cone and the first scoop away and start over and make a banana split — rather, it’s to put a second scoop on top of the first scoop.” (Professor David Linden, Johns Hopkins, said that.)

Now let me explain why this is all complete BS.

First, semantics.  What is “primitive”?  How do you measure a tissue’s “primitivity”?  In the common usage of the word, primitive means simple, especially in the context of a task, idea, structure, way of life, etc that was employed a long time ago.  Cavemen were more primitive than us, for example.  Unfortunately, this means that “primitive” is a word that refers to things both “ancient” AND “simple”.  Which, as we’ll see, is a big problem when you start applying it to mean only one of those things as occurs with the “primitive brain” meme.

Second, what are people actually talking about when they say “primitive brain”?  This is confused as well, but in general the thought is structured like this: Most primitive – brain stem, pons, cerebellum.  (The hindbrain, basically). Also primitive – the limbic system where “limbic” means “border, edge”.  This includes the hippocampus, amygdala, parts of the thalamus (who knows why), some of the cortex, and some other bits.  It’s supposed to do “emotion” and this is what Daniel Goleman is referring to when he talks about the “primitive brain”.

Really, though, it’s just a lumping together of all the structures near the inner border of the cortex – because why not lump it all together? The mighty cortex, you know, is the glorious wrinkly part on the outside.

Third, why do people say that these particular brain structures are “primitive”?  The idea is that evolutionarily, one came after the other.  As in, the first vertebrate just had a brain stem.  Then it evolved more stuff after that.  Then as things kept evolving, more and more stuff got added on.  This is the “ice cream cone” model that David Linden espouses.  It’s also incredibly stupid (or at least misleading).  Let’s break it down.

Evolution did not happen like this: (linear “ladder” diagram)

Evolution did happen like this: (branching evolutionary tree diagram)

I hope everyone pretty much understands the concept of a phylogeny (the history of the evolution of a species or group, especially in reference to lines of descent and relationships among broad groups of organisms) and the fact that every vertebrate came from one common ancestor. Yes, the common ancestor was a kind of fish. No, today’s fish aren’t the same as the common ancestor. They evolved from it just like everyone else, although “rates” of evolutionary phenomena like mutation and drift can vary, and that’s beyond the scope of this post.

The point is that the “primitive” brain meme is born in the idea that the brain components shared by fish, lizards, mice, and humans must be evolutionarily ancient and were likely shared in common by the common ancestor. So, homologous structures across the phylogeny indicate “ancientness”. And “ancientness” = “primitive”. (Except it doesn’t, but more on that in a second). And since we all share structures that resemble the brain stem, voilà! That’s the most primitive part of the brain. Here’s where things go astray.

First, we don’t just share the brain stem with all animals.

Here’s the real “ice cream cone” of the brain: And when I say “the” brain, I should say “all vertebrate brains”. Every fish, reptile (including birds), amphibian, and mammal has a brain that starts out looking like the pictures to the right. Each colored bump (or “scoop of ice cream”) represents a region of the brain very early on in development, when the whole nervous system is basically a simple tube. Each bump goes on to expand to varying sizes into varying kinds of structures and yadda yadda depending on the species. The point, though, is that all vertebrates have a forebrain, a midbrain, and a hindbrain. And the hindbrain, by the way, is the “primitive” brain stem.

But clearly, humans, fish, lizards, and mice all evolved from a common ancestor that had all brain regions, not just the hindbrain.

This is why David Linden’s ice cream analogy is so dumb.  He’s implying that first you start with one scoop, the hindbrain, then add on another (the midbrain), and finally one more (the forebrain).

When mammals like mice came along, the lizard brain didn’t go away. It simply became the brain stem, which is perched on top of the spine, Linden says.  Then evolution slapped more brain on top of the brain stem. But that’s not what happened at all.  All the scoops were there to begin with.  Then as adaptation took its course, different scoops got bigger or smaller or just different as you began comparing across the entire phylogeny.  Yes, humans have an enormous cortex and lizards don’t.  And yes, lizards simply evolved a different-looking kind of forebrain.  That’s all.

Second, homology (“likeness”) does NOT imply “ancientness”.  Even if the hindbrain looks pretty similar across the vertebrate phylogeny as it exists today, that doesn’t make it “ancient”.  The hindbrain has been evolving just like the midbrain and the forebrain.  Maybe it’s not been changing as much, but it’s still been changing.

This leads me to why the “primitive” notion is so misleading, and should be avoided:

(1) Calling a part of the brain “primitive” suggests what David Linden articulated: that brain evolution happened like stacking scoops of ice cream.  It implies that our brain stem is no different than that of a lizard, or of a mouse, or of a fish.  Yet despite their vast similarities, they are clearly not the same.  You can’t plug a human forebrain into a lizard hindbrain and expect the thing to work.  The hindbrain of humans HAD to adapt to having an enormous forebrain.  There’s something seductive in the idea that inside all of us is a primal being, a “reptilian brain”.  There isn’t.  It’s a human brain, top to bottom.

(2) Calling brain parts “primitive” because they are shared across phylogeny is often used to justify how amazing our human cortex is.  Look at what makes us, us!  We’re so great!  Well, I guess.  But we are just one little excursion among many that evolution has produced.  The lizard brain is adapted for what lizards need to do.  The fish brain is adapted for what fish need to do.  They don’t have “primitive” brains.  They have highly adapted brains, just like any other vertebrate.

(3) Simply using the word “primitive” makes the casual reader think of cavemen.  It just does.  And that’s even more ridiculous, because ancient Homo sapiens were still Homo sapiens.  Read what this poor misinformed blogger has written:

“So, let me explain the Primitive brain in simple terms. We have an Intellectual (rational) part of the brain and a Primitive (emotional) part of the brain. In the picture above, the Primitive brain is around the Hippocampus and Hypothalamus areas. In some texts, it has also been called the Limbic System. The subconscious Primitive part has been there ever since we were cavemen and cavewomen, and houses our fight/flight/freeze response (in the amygdala in between the Hippocampus and the Hypothalamus). Its function is to ensure our survival.”

AHHHHHHHHHHHHHHH.  You see?  You see??????

(4) There is not just a “primitive, emotional brain” and a “complex, intellectual brain”.  That is so…. wrong.  Factually wrong.  Yet people like Daniel Goleman sell books about emotional intelligence claiming that people need to develop their “emotional brain”

_______________________________

Asperger individuals are belittled as developmentally disordered because we don’t have the imaginary-mythical social / emotional “normal” human brain. 

_______________________________________________

…and then bloggers like Carrie (above) start internalizing and spreading the misinformation.  Folks.  Let me be clear.  You have ONE brain.  One.  It has many parts, which is to say that humans looking at brains have found ways to subdivide them into various areas and regions and structures and whatnot.  Regardless of all that, the whole damn thing works together.  It’s one big circuit that does emotion, rationality, sensation, movement, the whole shebang.  There isn’t a simplistic “emotion” part and an “intellectual” part.  The cortex is involved in emotions and feelings.  The basal ganglia are involved in cognition.  In fact, the whole idea of splitting emotion and reason into two separate camps is fraught, as emotion turns out to be critical in reasoning tasks like decision-making.

__________________________________________________________________________

Asperger individuals aren’t “able to – allowed to” claim that emotion influences our thinking (nor are we granted any feelings toward other humans) because we’re “missing” the non-existent “social brain”, and every idiot knows that not having a “social brain” makes a person “subhuman” or even psychopathic – we are irretrievably “broken”. The real story is that Asperger “emotions”, which technically, and for every animal, are reactions to the environment, are different because our sensory acquisition and perceptual processing are different: we focus on PHYSICAL REALITY. Hypersocial humans focus on each other.

_____________________________________________________________________________ 

(5) The “primitiveness” of lizard brains is vastly overstated.  Things like this get written about the “primitive brain”: A lizard brain is about survival — it controls heart rate and breathing, and processes information from the eyes and ears and mouth.

This implies, to the casual reader, that lizards are just sitting around breathing.  Maybe there’s some “survival instinct” in there: look out for that big hawk!  Yeah, okay.  But guess what?  Lizards gotta do other stuff too.  They have to reproduce, find food, navigate their environment, learn, remember, make choices, etc.  They aren’t just breathing and instinct machines.  And because they aren’t, that means their brains aren’t just doing that either.  And why is it always lizards and reptiles?  You’d think fish would get picked on more.

(6) “Primitive” in the context of a brain part primarily means “ancient”. But the word “primitive”, as we already saw, connotes simplicity. This leaves laypeople with many misconceptions. First, that the brain stem, or the “emotional brain”, or whatever, is simple. Or even that they’re simpler. Nope. Not really. Pretty much every part of the brain is complex. Second, it reinforces, in the case of the “emotional brain”, that emotions are beneath intellect. (In the U.S. “emotional responses” have been elevated OVER intellect, because no one wants an analytical consumer or voter.) They came first, they are older, they are simpler, they are the stupid part of your brain. Again, just no. You need emotions to survive just as you need your intellect to survive. Fish need emotions (an emotion, after all, is just a representation of bodily state imbued with a certain positive/negative valence) just like they need their reasoning abilities as well.

(7) People use the word “primitive” (copying scientists) because it can sound cool and surprising. Look at how Earl Miller framed it, from above:

“They suggest that new learning isn’t simply the smarter bits of our brain such as the cortex ‘figuring things out.’ Instead, we should think of learning as interaction between our primitive brain structures and our more advanced cortex. In other words, primitive brain structures might be the engine driving even our most advanced high-level, intelligent learning abilities”

Look at that result!  A primitive thing did something advanced! 

The forgotten thing is important!  Or maybe – this is going to sound crazy – the whole system evolved together in order to support such essential tasks like learning.  There never was a primitive part or an advanced part, despite two areas or regions being labeled as such.  Every part of the human brain has been evolving for the same amount of time as every other part, and has been evolving to work as best as possible with each of those other parts.

(8) Finally, let’s return to Daniel Goleman, who argues that “emotional intelligence” arises from the “primitive emotional brain”.  Then he waxes on and on about the value of emotional intelligence, particularly as it relates to social abilities.  Ability to understand your own emotions.  Ability to perceive those of others.  Ability to interact well with others on the basis of understanding their emotions.  Et cetera. 

That’s all fine, but saying this comes from an ancient, primitive, emotional brain might make people think (that neurotypicals are primitive and stupid) and that ancient vertebrates really had to know themselves, be able to read others, and interact socially (i.e., that ancient vertebrates were as intelligent as modern humans). But there’s a whole lot of solitary, nonsocial vertebrate species out there, and they have brainstems and limbic systems too.

Hopefully never again will you refer to a part of the brain as “primitive.”  Some structures probably more closely resemble their homologous counterparts in the last common ancestor of vertebrates, but all the basic parts were there from the beginning.  And remember, evolution didn’t happen (only) to make humans. (And specifically, EuroAmerican white males.)  We aren’t more advanced in an evolutionary sense than fish, lizards, or mice.  Each species is just adapted to the roles it finds itself in, and continues to adapt.  Our sense of being “advanced” comes purely from our own self-regard and anthropocentric tendencies. The human brain is not the best brain, nor is it the most advanced brain, because there’s no scale on which to measure how good a brain is.

Actually, the process of evolution appears to settle for “good enough” as the standard for successful adaptation!

J.E. Robison / Where has all the Autism funding gone?

I don’t follow John Elder: I do understand that he’s tried to work within the “official Autism community” to produce change. It seems he’s finally waking up to the exploitation-for-profit program that is the Autism Industry.

Sex, Lies, and Autism Research—Getting Value for Our Money

How can we get tangible benefit from the millions we spend on autism science? (No, it’s not science; it’s a business.)

The U.S. government is the world’s biggest funder of autism research.  For the past decade I have had the honor of advising various agencies and committees on how that money should be spent. As an adult with autism, sometimes I’ve been pleased at our government’s choices. Other times I’ve been disappointed. Every now and then I turn to reflect: What have we gotten for our investment?

Autistic people and their parents agree on this: The hundreds of millions we’ve spent on autism research every year have provided precious little benefit to families and individuals living with autism today. Over the past decade the expenditures have run into the billions, yet our quality of life has hardly changed at all.

It would be one thing if massive help was just around the corner, but it’s not. There are no breakthrough medicines or treatments in the pipeline. Autistic people still suffer from GI pain, epilepsy, anxiety, depression, and a host of other issues at the same rates we did before any of this research was funded.

I don’t mean to suggest that nothing has been accomplished.  Scientists have learned a lot. They know more about the biological underpinnings of autism. Researchers have found hundreds of genetic variations that are implicated in autism. We’ve quantified how autistic people are different with thousands of studies of eye gaze, body movement, and more. Scientists are rightly proud of many of their discoveries, which do advance medical and scientific knowledge. What they don’t do is make our lives better today. (Sorry John, that you feel you still need to “support” a corrupt system by buying into false claims of scientific progress or the value of bogus research.)

Why is that?

In the past I’ve written about the idea that taxpayer-funded research should be refocused on delivering benefit to autistic people. What I have not written about, is why that hasn’t happened, at the most fundamental level.

The answer is simple: Until quite recently, autistic people were not asked what we needed.

There are many reasons for that. Autism was first observed in children, and no one expects children to have adult insight and self-reflection. When autism was recognized in adults, they were assumed to be too cognitively impaired to participate in conversations about their condition. Finally, in the spirit of the times, doctors often assumed that they knew best. They were the trained professionals, and we were the patients (or the inmates). (Are we confusing “medical” doctors with non-medical psychologists?)

So doctors studied every question they could imagine, and then some, seldom seeking our opinions except in answer to their research questions. They assumed they knew what “normal” was, and we weren’t it. Countless million$ went down the rabbit hole of causation studies, whether in genetics, vaccines, or other environmental factors. Don’t get me wrong—the knowledge we’ve gotten is valuable for science. (Not really! It’s been valuable for the funding of universities, academics and research institutions) It just did not help me, or any autistic person I know. (It wasn’t INTENDED to help “autistic” people or their families).

Millions more have been spent observing us and detailing exactly the ways in which we are abnormal. Only recently have some scientists begun to consider a different idea: Perhaps “normal” is different for autistic people, and we are it. Again the studies enhanced the scientists’ knowledge (of how to profit from human suffering) but didn’t do much to help us autistics.

Then there are the educators and psychologists. They observed our “deviations” and then considered therapy to normalize us. That led to ABA and a host of other therapies. Some of those have indeed been beneficial, but the money spent on beneficial therapy is just a drop in the bucket when considering what we taxpayers have funded overall.

Want a different and better outcome? Ask actual autistic people.

We can tell you what our problems are, in many cases very eloquently. I’m not going to re-state all our needs here. I’ll tell you this: Whenever this topic comes up at IACC (the Federal committee that produces the strategic plan for autism for the U.S. government), the priorities of autistic people seem rather different from those of the researchers our government has been funding for so long. (It’s a corrupt system; part of the general policy to redistribute wealth “up to” the 1%).

Autistic people have many disparate needs, but they all boil down to one thing: We have major challenges living in American society. Medical problems, communication challenges, learning difficulties, relationship issues, and chronic unemployment are all big deals for us. The issues are well laid out and many.

Before autistic people began speaking out in great numbers, all we had was parent advocacy. We should not dismiss that, and parents still have a role today, particularly in advocacy for small children and children who are older but unable to effectively advocate for themselves.

Even as we thank parents for their service, it’s time to recognize autistic voices (some of which belong to parents too) should be taking the lead.

As much as parents did for us, they also unwittingly contributed to harm. Parents misinterpreted harmless stimming, and encouraged therapists to suppress it, leaving us scarred in adulthood. Many autistics of my generation remember being placed into programs for troubled children with parental encouragement in hopes we’d become “more normal.” We didn’t. Parents have given us bleach enemas, and some of us have died from misguided chelation and other treatments to “cure” our autism.

I don’t blame parents for any of that. They did their best, given the knowledge of the day. But it’s a different day now. The children who grew up being “normalized” can talk about how it affected them, and parents and clinicians of today would be wise to listen.

Autistic voices are finally speaking in large numbers and it’s time to pay attention. No one else knows life with autism. Parents and non-autistic researchers are sometimes listening. Hard as this may be for them to hear, they are always guessing. With autistics speaking out all over the world, that’s no longer good enough.

For the first time, IACC has recognized this in the 2017 Strategic Plan Update. They say it’s time for a paradigm shift in how we do research. We need to focus on the needs of people living with autism today. That’s a realization that I appreciate, and it’s long overdue. (OMG! Please don’t fall for this universal neurotypical ploy: We wrote it down: SEE? End of story.)

So what’s the answer to why we’ve gotten so little return on our autism research investment: No one asked the autistic people what we wanted. It’s that simple. Had we been able to articulate our challenges, with the framework of knowledge we have today, and had we been listened to, we’d be in a very different place today.

Today is gone, but tomorrow isn’t here yet, and it can be different.

(c) John Elder Robison (Thank you, John, for “stepping up” to the truth.)

John Elder Robison is an autistic adult and advocate for people with neurological differences. He’s the author of Look Me in the Eye, Be Different, Raising Cubby, and Switched On. He serves on the Interagency Autism Coordinating Committee of the U.S. Dept. of Health and Human Services and many other autism-related boards. He’s co-founder of the TCS Auto Program (a school for teens with developmental challenges), and he’s the Neurodiversity Scholar in Residence at the College of William and Mary in Williamsburg, Virginia, and a visiting professor of practice at Bay Path University in Longmeadow, Massachusetts.
The opinions expressed here are his own.

_________________________________________________

What more does the Autism Industry need?

The Director of the National Institute of Mental Health declares that Autism is a “real” epidemic and not due to changes in labels, diagnostic criteria and fear-mongering. No objective evidence needed when you have the Federal Government working FOR YOU.

TACA is an “interesting” NON-PROFIT – check out their website and the financial statements they provide. It’s hard to find out how much $$$ actually filters down to real people outside the “charity”. Here’s their “agenda”. Note the cliché about someday finding a “cure”, which is not going to happen: it creates a classic “American Non-Profit” demand for “donations” and funding in perpetuity. Think of all those “charities” that have collected billions for “research”, etc., without a “cure” in sight.

Mental Development / Genetics of Visual Attention

Twin study finds genetics affects where children look, shaping mental development

https://www.sciencedaily.com/releases/2017/11/171109131152.htm

November 9, 2017 / Indiana University

A study that tracked the eye movement of twins has found that genetics plays a strong role in how people attend to their environment.

Conducted in collaboration with researchers from the Karolinska Institute in Sweden, the study offers a new angle on the emergence of differences between individuals and the integration of genetic and environmental factors in social, emotional and cognitive development. This is significant because visual exploration is also one of the first ways infants interact with the environment, before they can reach or crawl.

“The majority of work on eye movement has asked ‘What are the common features that drive our attention?'” said Daniel P. Kennedy, an assistant professor in the IU Bloomington College of Arts and Sciences’ Department of Psychological and Brain Sciences. “This study is different. We wanted to understand differences among individuals and whether they are influenced by genetics.”

Kennedy and co-author Brian M. D’Onofrio, a professor in the department, study neurodevelopmental problems from different perspectives. This work brings together their contrasting experimental methods: Kennedy’s use of eye tracking for individual behavioral assessment and D’Onofrio’s use of genetically informed designs, which draw on data from large population samples to trace the genetic and environmental contributions to various traits. As such, it is one of the largest-ever eye-tracking studies.

In this particular experiment, the researchers compared the eye movements of 466 children — 233 pairs of twins (119 identical and 114 fraternal) — between ages 9 and 14 as each child looked at 80 snapshots of scenes people might encounter in daily life, half of which included people. Using an eye tracker, the researchers then measured the sequence of eye movements in both space and time as each child looked at the scene. They also examined general “tendencies of exploration”; for example, if a child looked at only one or two features of a scene or at many different ones.

Published Nov. 9 in the journal Current Biology, the study found a strong similarity in gaze patterns within sets of identical twins, who tended to look at the same features of a scene in the same order. It found a weaker but still pronounced similarity between fraternal twins.

This suggests a strong genetic component to the way individuals visually explore their environments: Insofar as both identical and fraternal twins each share a common environment with their twin, the researchers can infer that the more robust similarity in the eye movements of identical twins is likely due to their shared genetic makeup. The researchers also found that they could reliably identify a twin with their sibling from among a pool of unrelated individuals based on their shared gaze patterns — a novel method they termed “gaze fingerprinting.”
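For readers who want the twin-design logic in concrete form, here is a minimal sketch of the two ideas above: estimating a genetic contribution from the gap between identical-twin and fraternal-twin similarity, and matching a child to their twin by gaze similarity. It uses Falconer’s classic approximation, h² = 2 × (r_MZ − r_DZ), which is a textbook shortcut and not necessarily the statistical model Kennedy and colleagues used; the “gaze vector” representation and cosine-similarity measure are hypothetical illustrations, not details from the paper.

```python
# A hedged, illustrative sketch -- not the authors' analysis code.
# Assumes each child's viewing of the scenes has been reduced to a numeric
# "gaze vector" (e.g., fixation time per image region); that representation
# is hypothetical.
import numpy as np

def gaze_similarity(a, b):
    """Cosine similarity between two gaze vectors (a hypothetical measure)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def falconer_heritability(mz_pairs, dz_pairs):
    """Falconer's approximation h^2 = 2 * (r_MZ - r_DZ), where r is the
    within-pair correlation of a per-child gaze summary score. Shown only to
    illustrate why greater identical-twin similarity implies a genetic
    contribution; the published study used its own statistical models."""
    r_mz = np.corrcoef(*zip(*mz_pairs))[0, 1]
    r_dz = np.corrcoef(*zip(*dz_pairs))[0, 1]
    return 2.0 * (r_mz - r_dz)

def match_twin(probe, candidates):
    """Toy 'gaze fingerprinting': return the index of the candidate whose
    gaze vector is most similar to the probe twin's."""
    return int(np.argmax([gaze_similarity(probe, c) for c in candidates]))
```

In this toy version, identical-twin scores that track each other much more closely than fraternal-twin scores push the heritability estimate toward 1, mirroring the pattern the study reports, and match_twin picks a sibling out of a pool of unrelated children the way the “gaze fingerprinting” result describes.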

“People recognize that gaze is important,” Kennedy said. “Our eyes are moving constantly, roughly three times per second. We are always seeking out information and actively engaged with our environment, and ultimately where you look affects your development.”

The study suggests that, even after early childhood, genes influence the environments individuals create for themselves at the micro-level, through the immediate, moment-to-moment selection of visual information.

“This is not a subtle statistical finding,” Kennedy said. “How people look at images is diagnostic of their genetics. Eye movements allow individuals to obtain specific information from a space that is vast and largely unconstrained. It’s through this selection process that we end up shaping our visual experiences.

“Less known are the biological underpinnings of this process,” he added. “From this work, we now know that our biology affects how we seek out visual information from complex scenes. It gives us a new instance of how biology and environment are integrated in our development.”

“This finding is quite novel in the field,” D’Onofrio said. “It is going to surprise people in a number of fields, who do not typically think about the role of genetic factors in regulating such processes as where people look.”

_____________________________________________________

Comment: 

(Note: Many individuals can learn the “scientific method” – its techniques, procedures and use of math – without having an “understanding” of “physical reality”. This is a problem in American “science” today.)

Why is the Asperger “attentional preference” for “physical reality” labeled a developmental defect? Because modern social humans BELIEVE that only the social environment EXISTS!

This “narrow” field of attention in modern social humans is the result of domestication / neoteny. The “magical thinking” stage of childhood development is carried into adulthood. This “arrested development” retains the narcissistic infantile perception of reality.  

A genetic basis for this “perceptual” knowledge of reality would support the Asperger “Wrong Planet” sense of alienation from neurotypical social environments. Our “real world” orientation is not a “defect” – our perception is that of an adult Homo sapiens. The hypersocial “magical” perception of the environment is that of the self-centered infant, whose very survival depends on the manipulation of “big mysterious beings” (parents – puppeteers) who make up the infant’s ENTIRE UNIVERSE.  

The Neurotypical Universe

 


Journal Reference:

  1. Daniel P. Kennedy, Brian M. D’Onofrio, Patrick D. Quinn, Sven Bölte, Paul Lichtenstein, Terje Falck-Ytter. Genetic Influence on Eye Movements to Complex Scenes at Short Timescales. Current Biology, 2017 DOI: 10.1016/j.cub.2017.10.007

SHY? / Be prepared for predatory rage…

To be “shy” in the U.S.A. is a social crime.

Shy people are relentlessly attacked. Note the implications:

You have a genetic defect; you’ve experienced child abuse; you have a social anxiety or a social phobia; you’re a narcissist; you have “negative thoughts”; you have low self-esteem; you’ll never have a boyfriend or girlfriend; you’re a bad person; you stutter; you’re ugly; you can’t win: (either you don’t talk enough or you talk too much). And on, and on.

Shyness is a deficit that one must overcome, otherwise life is not worth living: 

You probably hate people and must be anti-social:

Shyness carries a life sentence of social exile and failure:

Some weak links have been found: shy 3-year-olds tend to become cautious teens, difficult 3-year-olds remain difficult, and well-adjusted 3-year-olds stay well-adjusted. Current research relates temperament to the Big 5 and suggests these traits may carry over into adulthood.

And if that isn’t enough, let’s detail the social horrors:

How much more depressing can it get?

Propaganda: Shyness is pathologic. Your life is a mess; buy this crap.

Dear World: Be afraid; be very afraid. American psychology is coming for you…

___________________________________________________

A culture in flux

http://www.apa.org/monitor/2014/11/culture-flux.aspx

— Kirsten Weir, 2014, Vol. 45, No. 10

When Heather Henderson, PhD, a psychologist at the University of Waterloo, lectures to students about her shyness research, she often shows videos of young kids playing. The response is predictable. “People laugh and smile at outgoing kids, and they become uncomfortable watching shy kids,” she says.

Were she to show that same video in rural China, she might get a very different response. In any culture, there’s a range of temperaments from very reserved to more outgoing. But culture strongly affects how those temperamental differences are judged.

In the 1990s, Xinyin Chen, PhD, a psychologist at the University of Pennsylvania, showed that while shy behaviors were linked to problems such as anxiety in North America, they were associated with positive school adjustment outcomes in China. Behaviorally inhibited students in China were held up as leaders in the classroom and rated as more likable by peers, says Robert J. Coplan, PhD, a psychologist at Carleton University in Ottawa who has collaborated on cross-cultural studies with Chen and colleagues in China.

But China has changed dramatically since the 1990s, with rapid modernization and strong influences from the West. Correspondingly, in large urban areas, shyness is starting to be seen as a detriment. “The same behavior, in a very short period of time, seems to have done an about-face in terms of its perceived adaptiveness in Chinese culture,” Coplan says.

While social inhibition is still praised in many rural areas, he says, “assertiveness and independence have now become more positively valued in the big urban centers.” The rapid turnabout could have major implications for Chinese society. Whereas an older teacher might admonish an outgoing child, the younger teacher down the hall might offer praise. Children born in cities versus rural villages may receive very different messages about how to behave.

For psychologists interested in the influence of culture on behavior, the change is astounding. Little more than a decade ago, Chinese teachers wished more children would act more reserved, Coplan says. And now? “On my latest visit, they were talking about setting up intervention programs to help young shy children.”