One of THOSE Discussions / God, Free Will and Absurdities

This post grew out of one of those “late night” discussions with a friend – the type that is popular when one is in college and a bit drunk (or otherwise deranged), and which, as one gets older and wiser, one vows never to participate in again. The gist of the argument was:

Determinism (God) is totally compatible with Free Will (The Declaration of Independence), so we have both.

I could stop right here, because this “set up” is thoroughly American “wacky” thinking. It demonstrates the absolute belief that “America” is a special case – an exemption from reality – that was/is made possible by American Democracy (in case you weren’t aware, democracy is not a political creation of human origin), which came about by an Act of God. “Freedom” is a basic American goal: Free Will is therefore a mandatory human endowment (by virtue of the word Free appearing in both “concepts”). God created everything, so he must have created Free Will. Jesus is a kind of “sponge” that suffices to “soak up” all those bad choices Free Will allows – that is, if you turn over all your choices, decisions and Free Will to Jesus.

The irony is that this absurd, pointless discussion “cleared the air” over previously unspoken conflict with a dear friend – like blowing up the Berlin Wall – getting it out of the way and establishing that friendship is not “rational” at all, but an agreement about what really matters: good intentions carried into actions, loyalty, and a simple “rightness” – agreement on what constitutes “good behavior” on the part of human beings and a pledge of one’s best effort to stick to that behavior.

This entire HUGE neurotypical debate is nonsense.

God has nothing to do with Free Will, the Laws of physics, or any scientific pursuit of explanations for “the universe”. The whole reason for God’s existence is that He, or She, or They are totally outside the restrictions of “physical reality”. That’s what SUPERNATURAL means. So all the “word concept” machinations over “God” and “science” – from both ends of the false dichotomy – are absurd. Free Will is also a non-starter “concept” in science: reality proceeds from a complex system of “facts” and mathematical relationships that cannot be “free-willed” away.

Total nonsense.

If one believes in the “supernatural” origin of the universe as a creation of supernatural “beings, forces and miraculous acts” then one does not believe in physical reality at all: “Physics” is a nonexistent explanation for existence. One can only try to coerce, manipulate, plead with, and influence the “beings” that DETERMINE human fate. Free Will is de facto an absurdity, conceived of as something like the Amendments to the U.S. Constitution (inspired by God, after all – not really by the intelligence of the people who wrote them). In American thought, (political) rights grant permission to “do whatever I want”. The concept of responsibility connected to rights has been conveniently forgotten. Free Will, in this context, is nothing more than intellectual, moral and ethical “cheating”.

So, the immense, complicated, false dichotomy of Determinism vs. Free Will, and the absurd 2,000+ year-old philosophical waste of time that has followed and continues, is actually very simple, at least in the U.S.:

Whatever I do is God’s Will; whatever you do isn’t.


Overlap in Prey / Neanderthal, Hyena

Comparison of Neanderthal and Hyena as “top predators”.

Isotopic evidence for diet and subsistence pattern of the Saint-Césaire I Neanderthal: review and use of a multi-source mixing model.

Author information: Institut des Sciences de l’Evolution, UMR 5554, Université Montpellier 2, Place E. Bataillon, F-34095 Montpellier cedex 05, France. bocheren@isem.univ-montp2.fr

Abstract

The carbon and nitrogen isotopic abundances of the collagen extracted from the Saint-Césaire I Neanderthal have been used to infer the dietary behaviour of this specimen. A review of previously published Neanderthal collagen isotopic signatures with the addition of 3 new collagen isotopic signatures from specimens from Les Pradelles allows us to compare the dietary habits of 5 Neanderthal specimens from OIS 3 and one specimen from OIS 5c.

This comparison points to a trophic position as top predator in an open environment, with little variation through time and space. In addition, a comparison of the Saint-Césaire I Neanderthal with contemporaneous hyaenas has been performed using a multi-source mixing model, modified from Phillips and Gregg (2003, Oecologia 127, 171). It appears that the isotopic differences between the Neanderthal specimen and hyaenas can be accounted for by much lower amounts of reindeer and much higher amounts of woolly rhinoceros and woolly mammoth in the dietary input of the Neanderthal specimen than in that of hyaenas, with relatively similar contributions of bovinae, large deer and horse for both predators, a conclusion consistent with the zooarchaeological data. The high proportion of very large herbivores, such as woolly rhinoceros and woolly mammoth, in the Neanderthal’s diet compared with that of the scavenging hyaenas suggests that Neanderthals could not acquire these prey through scavenging. They probably had to hunt for proboscideans and rhinoceros. Such a prey selection could result from a long-lasting dietary tradition in Europe.

PMID:
15869783
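The “multi-source mixing model” mentioned in the abstract is, at its core, simple mass balance: a consumer’s collagen signature is treated as a weighted average of its prey signatures, and the model searches for the prey proportions consistent with the measured values. Below is a minimal Python sketch of that brute-force logic, in the spirit of the Phillips and Gregg (2003) approach the paper cites; all isotope values, the grid step and the tolerance are hypothetical placeholders, not data from the paper.

```python
import itertools
import numpy as np

# Hypothetical prey signatures (delta-13C, delta-15N, in per mil).
# Placeholder values for illustration only -- not data from the paper.
sources = {
    "reindeer":          (-19.0, 2.5),
    "horse":             (-20.5, 5.0),
    "bovinae":           (-20.0, 6.0),
    "large deer":        (-19.5, 4.5),
    "woolly mammoth":    (-21.5, 8.0),
    "woolly rhinoceros": (-21.0, 7.0),
}

# Hypothetical consumer (collagen) signature, assumed already corrected
# for diet-to-collagen trophic enrichment.
consumer = np.array([-20.6, 6.8])

STEP = 0.10   # proportion grid increment (10%)
TOL = 0.4     # max distance (per mil) between predicted and observed

names = list(sources)
sigs = np.array([sources[n] for n in names])

def feasible_mixes():
    """Enumerate every combination of prey proportions summing to 1 whose
    mass-balance mixture falls within TOL of the consumer signature."""
    steps = round(1 / STEP)
    for combo in itertools.product(range(steps + 1), repeat=len(names) - 1):
        remainder = steps - sum(combo)
        if remainder < 0:
            continue
        props = np.array(list(combo) + [remainder]) * STEP
        predicted = props @ sigs  # weighted-average mixture signature
        if np.linalg.norm(predicted - consumer) <= TOL:
            yield dict(zip(names, np.round(props, 2)))

mixes = list(feasible_mixes())
print(f"{len(mixes)} feasible diet combinations")
if mixes:
    print("example:", mixes[0])
```

Real applications typically report, for each prey source, the range of proportions across all feasible solutions rather than a single answer, which is how a signal like “less reindeer, more mammoth and rhinoceros than the hyaenas” emerges.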

_________________________________________________________________________________________

(Below: Not the Saint-Césaire 1 specimen.) A “mystery” Neanderthal species allows artists to speculate on the “reality” of multiple human types. There is no satisfactory evidence of blue eyes in Neanderthals.

Neanderthal female reconstruction, Viktor Deak

Reconstruction of the La Chapelle-aux-Saints Neanderthal, by Fabio Fogliazza

Emergence of “humans” / Berkeley.edu + Comments


Simplified socio-cultural guide to identifying male / female.

 

The evolution of Primates – Gender dimorphism

Top: Orangutan male and female. Middle: Modern social human; all “cases” of allowable bathroom use. Bottom: Idiot’s guide to gender ID; U.S.

 

Low sexual dimorphism in modern social humans? Really? Sexual dimorphism is created culturally in humans, and wow! Gender assignment is all mixed up! In fact, one might observe that body alteration, decoration, behavior and costume are how Homo sapiens compensates for being a strange hairless ape, born without the elaborate fur, plumage, texture, color and behavioral displays of other species. We “copy” other animals and utilize materials in the environment to socially broadcast our sex and gender – from the violent hyper-male to the “big boob” sex object that is the “ideal” American woman. Some cultures disguise or blur a person’s sex / gender. Neoteny promotes childlike appearance in males and females – the current trend is toward androgyny.

Any questions about this guy’s gender? 


Old school “gun”


Below: Modern neotenic “feminized” male – androgyny is the popular goal.


__________________________________________________________________________________________

How bizarre can the “story” of human evolution get?

The following chapter, “The Emergence of Humans”, is from Berkeley.edu, a site about evolution for students. I confess that to my Asperger type of thinking, this review of evolutionary studies is excruciating. One (dumb) point of view is especially mind-boggling: that chimpanzees are a legitimate focus of “study and research” into ancestral humans and modern human behavior, merely because “they are alive” and eligible for torture in labs; they don’t have “souls” or “suffer.” And they appeal to neotenic social humans, by scoring high on the “cute” scale.

The apparent inability of researchers to get past this 19th C. world view is stunning; instead of a thorough examination of assumptions across disciplines, we again see “warfare” between disciplines, and the ongoing attempt to assemble a human “dinosaur” from bits and pieces of fossilized thinking. In fact, paleontology has exploded with new ideas since “old” dinosaur reconstructions were discovered to be highly inaccurate. Hint, hint.

FOUND! The last common ancestor of Humans and Chimps.


Berkeley.edu / The emergence of humans

The narratives of human evolution are oft-told and highly contentious. There are major disagreements in the field about whether human evolution is more like a branching tree or a crooked stick, depending partly on how many species one recognizes. Interpretations of almost every new find will be sure to find opposition among other experts. Disputes often center on diet and habitat, and whether a given animal could occasionally walk bipedally or was fully upright. What can we really tell about human evolution from our current understanding of the phylogenetic relations of hominids and the sequence of evolution of their traits?

Hominid evogram

(consistency problem)

To begin with, let’s take a step back. Although the evolution of hominid features is sometimes put in the framework of “apes vs. humans,” the fact is that humans are apes, just as they are primates and mammals. A glance at the evogram shows why. The other apes — chimp, bonobo, gorilla, orangutan, gibbon — would not form a natural, monophyletic group (i.e., a group that includes all the descendants of a common ancestor) if humans were excluded. Humans share many traits with other apes, and those other “apes” (i.e., non-human apes) don’t have unique features that set them apart from humans. Humans have some features that are uniquely our own, but so do gorillas, chimps, and the rest. Hominid evolution should not be read as a march to human-ness (even if it often appears that way from narratives of human evolution). Students should be aware that there is not a dichotomy between humans and apes. Humans are a kind of ape.

Virtually all systematists and taxonomists agree that we should only give names to monophyletic groups. However, this evogram shows that this guideline is not always followed. For example, consider Australopithecus. On the evogram you can see a series of forms, from just after Ardipithecus to just before Homo in the branching order, that are all called Australopithecus. (Even Paranthropus is often considered an australopithecine.) But as these taxa appear on the evogram, “Australopithecus” is not a natural group, because it is not monophyletic: some forms, such as A. africanus, are found to be closer to humans than A. afarensis and others. Beyond afarensis, for example, all other Australopithecus and Homo share “enlarged cheek teeth and jaws,” because they have a more recent common ancestor. Eventually, several of these forms will have to have new genus names if we want to name only monophyletic groups. Students should avoid thinking of “australopithecines” as a natural group with uniquely evolved traits that link its members together and set it apart from Homo. Instead they should focus on the pattern of shared traits among these species and the Homo clade, recognizing that each species in this lineage gains more and more features that are shared by Homo.
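(A concrete way to see the monophyly point: a named group is monophyletic exactly when it contains every species descended from its members’ most recent common ancestor. Below is a minimal Python sketch of that test on a toy parent-link tree loosely based on the evogram’s branching order; the internal node names are invented for illustration.)

```python
# Parent links for a simplified tree: Ardipithecus branches first, then
# A. afarensis, then A. africanus and Homo as closest relatives.
parent = {
    "Ardipithecus": "node1",
    "A. afarensis": "node2",
    "A. africanus": "node3",
    "Homo": "node3",
    "node3": "node2",
    "node2": "node1",
    "node1": None,
}

def ancestors(taxon):
    """Yield the taxon and every ancestor up to the root."""
    while taxon is not None:
        yield taxon
        taxon = parent[taxon]

def mrca(taxa):
    """Most recent common ancestor of a set of taxa."""
    common = set.intersection(*(set(ancestors(t)) for t in taxa))
    # The MRCA is the shared ancestor farthest from the root.
    return max(common, key=lambda n: sum(1 for _ in ancestors(n)))

def descendants(node):
    """All leaf taxa descending from `node`."""
    leaves = [t for t in parent if t not in set(parent.values())]
    return {leaf for leaf in leaves if node in set(ancestors(leaf))}

def is_monophyletic(taxa):
    """True when `taxa` contains every leaf under its own MRCA."""
    return descendants(mrca(taxa)) == set(taxa)

# "Australopithecus" as traditionally named excludes Homo:
print(is_monophyletic({"A. afarensis", "A. africanus"}))          # False
print(is_monophyletic({"A. afarensis", "A. africanus", "Homo"}))  # True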

In popular fiction and movies, the concept of the wild “ape-man” is often that of a tree-living, vine-swinging throwback like Tarzan. However, the pantheon of hominids is much richer than this, as the evogram shows, with forms as different as Paranthropus and Ardipithecus. For example, imagine going back in time to the common ancestor of humans and chimps (including bonobos). What did that common ancestor look like? In the Origin of Species, Darwin noted that the extinct common ancestor of two living forms should not be expected to look like a perfect intermediate between them. Rather, it could look more like one branch or the other branch, or something else entirely.

Found! The last common ancestor of humans and chimps.

Did the common ancestor of humans and chimps conform to the ape-man myth and live in the trees, swinging from vines? To answer this, we have to focus not only on anatomy but on behavior, and we have to do it in a phylogenetic context. Apes such as the gibbon and orangutan, which are more distantly related to humans, are largely arboreal (i.e., tree-living). The more closely related apes such as the gorilla and chimps are relatively terrestrial, although they can still climb trees. The feet of the first hominids have a considerable opposition of the big toe to the others but relatively flat feet, as arboreal apes generally do. But other features of their skeleton, such as the position of the foramen magnum underneath the skull, the vertically shortened and laterally flaring hips, and the larger head of the femur, suggest that they were not just mainly terrestrial but habitually bipedal, unlike their knuckle-walking relatives. Most evidence suggests that the hominid lineage retained some of the anatomical features related to arboreal life and quadrupedal gait even after it had evolved a more terrestrial lifestyle and a bipedal gait. There is no fossil record of these behaviors, but the balance of the available evidence supports the hypothesis that the hominid ancestor was terrestrial and bipedal.

Much discussion in human paleontology surrounds the evolution of a bipedal, upright stance. When and why did this occur? One thing to keep in mind is that “bipedal” and “upright” are not equivalent terms. An animal can be bipedal without having a vertical backbone (think T. rex). It seems clear from the fossil record of hominids that habitual bipedality preceded the evolution of a recurved spine and upright stance. Other changes in the gait, such as how the relatively “splayed” gait of chimps evolved into the gait of humans, who put one foot directly in front of the other, involve studying the hip joint, the femur, and the foot. The famous Laetoli footprints attributed to Australopithecus afarensis are bipedal, but they are still relatively splayed compared to the tracks of living humans. (WOW! they are doing it again despite their own caution: humans did not evolve from chimpanzees!)

Another extremely interesting feature in hominid evolution is the degree of sexual dimorphism (i.e., physical differences between the sexes) in different species. Sexual dimorphism is linked to features of sociality and mate competition in many sorts of animals. To understand the evolution of this feature in humans, which have relatively low sexual dimorphism, we need to consider the other apes, in which sexual dimorphism tends to be moderate to high (with exceptions). 

(Again, culture is utterly ignored: the fact is, women and men “self-morph” according to socio-cultural “genders” into very dimorphic animals.)

We don’t have sufficient evidence about Sahelanthropus, Orrorin, and Ardipithecus to understand much about sex differences in these species, but we do know that A. afarensis had relatively high sexual dimorphism: the males were considerably larger than the females. The difference seems to have been less in A. africanus, Paranthropus, and most of the Homo lineage. The evolutionary explanation for A. afarensis’ dimorphism is not entirely clear. The larger males may have used their size to attract females and/or repel rivals, which would fit with an explanation based on sexual selection. Or the males and females may have been differently sized because they played different roles in their groups, the males hunting and gathering and the females caring for the young. Darwin thought that this differentiation of the sexes may have played a critical role in human evolution, but we simply do not know much about the role of this feature in A. afarensis. Some, all, or none of these functions may have been in play. (Novel-writing again! If we don’t have facts about a subject, why not say so? Speculation becomes dogma in the “magic word syndrome” social mind, and people argue over imaginary histories and qualities. Also – I suspect that once again the writers have “EuroAmerican” humans in mind regarding sexual dimorphism: why?)

We do know that by the time the animals known as Homo evolved, they could make tools, and their hands were well suited for complex manipulations. These features were eventually accompanied by the reduction of the lower face, particularly the jaws and teeth, the recession of the brow, the enlargement of the brain, the evolution of a more erect posture, and the evolution of a limb more adapted for extended walking and running (along with the loss of arboreally oriented features). The evogram shows the hypothesized order of acquisition of these traits. Yet each of the Homo species was unique in its own way, so human evolution should not be seen as a simple linear progression of improvement toward our own present-day form. (But, we show it that way, anyway!)

More…. Should you need a mind-boggling experience:

https://en.wikibooks.org/wiki/Survey_of_Communication_Study/Chapter_13_-_Gender_Communication

And to clarify all this: 

Male Beards / Covering up a Weak Chin?

The contemporary “love affair” that men are having with their ability to grow facial hair may be a reaction to the feminization (neoteny) of the male face that has been a trend for decades. Ironically, soldiers sent to Iraq and Afghanistan, who grew beards in order to “fit in” with ideals of manhood in those cultures, have encouraged the new “manly man” tradition.


The most creepy facial hair of all: The long and scraggly Patriarchal Old Testament, ‘we hate women’ beard. This style says, “I don’t know what a woman is, and I don’t want to know.”

______________________________________

I intended to write a post concerning “facial expression & mind reading.” Psychologists have made quite a big deal out of their contention that Asperger people are devoid of the ability to “read” the messages sent human to human via facial expressions and body language, and that this phantom “ability” must be displayed by an individual in order to be classified as “normal” or fully human. Beyond the arrogance of this declaration – which, to begin with, ignores cultural traditions and differences – one simply cannot get past asking questions about the physical aspects that must be addressed in order to support the definition of “human” that psychologists have derived.

If facial expressions are necessary to human to human communication, doesn’t extensive facial hair negatively impact this “social ability”?


If you go hairy, you had better have the face and body to back it up. A beard does not “hide” a neotenic face. 

How does reading faces apply to earlier generations of males, and to the many cultures around the world that favor or demand that men grow varying amounts of facial hair? Shaving is a product of modern cultures, beginning notably with the Egyptians, who removed facial and body hair because it harbored dirt and lice. Other ancient cultures, including the Greeks, used beard growth as the transition to adult obligations and benefits. Ancient Germanic males grew both long hair and full beards. The Romans made a ritual of a young male’s first shave, and then favored a clean face. Of course, growing a beard also depends on having hairy ancestors – or does it?


Top: Roman Hercules Bottom: Greek Hercules (Lysippus)


Reconstructions of Early Homo sapiens and his Neanderthal contemporary

Right: Do we actually know how hairy early Homo species were? It would seem that without evidence, artists settle on a 5-day growth or scruffy short beard. Does a beard cover a “weak” Neanderthal chin?

The image of archaic humans, notably Neanderthals, as hairy and unkempt Cave Men has influenced how we interpret hairiness or hairlessness in Homo sapiens. Hair is extremely important in both favorable and unfavorable ways: hair can be a haven for disease and parasites; we need only look to the large amount of time that apes and monkeys spend grooming each other for lice, time that could be spent looking for food, learning about the environment, and practicing skills.

Growing hair requires energy. Our large human brain consumes about 20% of the energy our body generates. It could be that the growth of the modern brain (beginning with Homo erectus) was intricately tied up in a slow feedback cycle: the brain produces energy-saving inventions (fire, tools, clothing, travel to more abundant environments), which means more energy to devote to the brain, which can increase brain connections, which makes increased technical innovation possible, which frees more energy for the brain. So technology could be seen as part of streamlining the human animal into an energy-conserving species, which in turn improves brain function. In other words, the brain benefits from its own thinking when that thinking becomes a set of “apps” that manipulate the environment and the human body.

Meanwhile, what about facial hair? Personally, I’m thankful that I live in a time when men have the choice to grow, or not to grow.

 

____________________________________________________________________________


Who’s Safe With a Gun? Don’t Ask a Shrink

The Daily Beast, May 2013 / Background Checks


Forget any guidance from psychiatry’s bible, the DSM-5, when it comes to background checks for gun buyers, writes the psychotherapist author of The Book of Woe. (Gary Greenberg)

Many years ago, a man I was seeing in therapy decided he wanted to take up a new hobby: high explosives. The state he lived in licensed purchasers of dynamite and other incendiaries only after a background check. He wanted to know: Would I write a letter declaring him fit to blow up stuff in his backyard for fun?

Aside from the fact that this was how he wanted to pass the weekend, I didn’t have any reason to think otherwise, so I gave him the note. He got the license. A few years after he stopped seeing me, I had occasion to visit him at his office. He had all his digits and limbs and, to my knowledge, had committed no antisocial acts with his legally obtained explosives. My note attesting to his mental health was framed on his wall.

I’ve been thinking about this guy recently, ever since our politicians’ imaginations have fastened upon background checks as the solution to our gun problems. I’ve also been thinking about a couple of other patients. One of them: a middle-aged professional, a ramrod-straight retired Marine, father of a little girl, faithful husband, the kind of man who buys a special lockbox just for transporting his weapon between home and gun club. The other: a 27-year-old hothead, an absentee father who never met a drug or a woman he didn’t like. His idea of fun was riding his motorcycle between lanes on the interstate at 100 mph, and he was the proud owner of (by his count) 37 guns. In the three years prior to arriving at my office, he’d been fired from four jobs, arrested for six or seven driving offenses and a few drug charges, and helped to bury three of his friends who met untimely and violent ends.

No one asked me which of these two men I’d rather was a gun owner, let alone which one ought to have a firearms license. But I know what my answer would have been. Or I would have known until about a year ago, when the ex-Marine, inexplicably and without warning (although he’d just been put on an antidepressant as part of a treatment for chronic pain), sat at the base of the tree holding his favorite deer perch and shot himself in the mouth. Meantime, the hothead has cooled down. He’s been with the same woman for two years and the same job for one. He sees his son faithfully twice a week. He’s sold his motorcycle and more than half of his guns, and become obsessed with bodybuilding and responsibility. The transformation is not complete—he’s still dead certain the government wants to come to his house and confiscate what’s left of his arsenal, for instance—and I can’t take too much credit for it. He’s pursuing the pleasures of self-control with the same manic intensity as he once chased adrenaline. But I’m not all that worried about his guns anymore, and I’m really glad no one asked me if he should have them.

Because one thing they don’t teach you in therapy school: how to tell the future. Clinicians can assemble a story out of the ashes of a person’s life; we might even be able to spot what we think are the seeds of catastrophe, but we generally do that best in retrospect. And that’s why, if one of us insists he or she knows for sure what’s coming next, you should find another therapist. It’s also why, to the extent that background checks involve people like me, it wouldn’t do much more than reassure politicians that they are doing something about gun violence without simultaneously threatening their National Rifle Association ratings.

But wait a minute, you may be saying. Don’t mental-health workers have a whole huge book of diagnoses to turn to that can help you assess a person’s fitness to own a gun? No, we don’t. We have the book, of course, the Diagnostic and Statistical Manual of Mental Disorders, which is about to come out in its fifth edition. But while some of those disorders seem incompatible with responsible gun ownership, even a diagnosis of a severe mental illness like schizophrenia or bipolar disorder isn’t a good predictor of who is going to become violent. Indeed, only about 4 percent of violent crimes are committed by mentally ill people. We are not going to diagnose our way to safety.

There’s a reason for this. A diagnosis of a mental disorder is only a description of a person’s troubles. A neurologist presented with a patient suffering loss of coordination and muscle weakness can run tests and diagnose amyotrophic lateral sclerosis or a brain tumor. They can explain the symptoms and predict with some accuracy what will happen as the disease takes its expected course. The 200 or so diagnoses in the DSM, on the other hand, explain little and predict less. Until the book contains a diagnosis called Mass Slaughter Disorder, whose criteria would include having committed mass slaughter, it’s not going to offer much guidance on the subject, and, obviously, what guidance it provides is going to come too late.

With the mentally disordered, as with all of us (and let’s remember that in any given year, something like 30 percent of us will meet criteria for a mental disorder, and 11 percent of us are on antidepressants right now), there is no telling what will happen next. No matter how many diagnoses are in the DSM, and no matter how astutely they are used, they will not tell us in whose hands guns are safe. The psyche is more unfathomable, and evil more wily, than any doctor or any book.


The most important “developmental” fact of life

is death.

It just happens: We grow old. It’s a natural progression, without doubt. But not in the U.S., of course, where openly denying death is a frenzied passion. Getting old is a crime in a society terrified of “growing up” and becoming adult.

Old people are proof of the most basic facts of life, so much so, that being old has become taboo. And if one lives to the “new” expectation of 80 or so, that means 30 years of life beyond the new “old age” of 50. That’s a long time to “fake” being “young, beautiful, athletic and sexy”. 

Growing old is tough enough without a “new” set of instructions; don’t look old, act old, get sick, become feeble or need help (unless that help is covered by insurance.) Don’t remind younger people, by your very presence, that there is an end; it is believed now that one can “look good” until the end – which will entail a short, or long, period of degeneration. This period of “old age” is rarely seen as a “good” time of life as valid as one’s childhood, young adulthood, or middle age, unless one has the funds to at least pretend to be “youngish”.

Contrary to popular American belief, it remains a fruitful time of personal development. As long as our bodies continue to function, learning and thinking continue to be what humans do.

If life has been one long illusion that only “social” rewards count, and life has been a display of materials owned, status achieved, people “bested”, then one will likely keep up the illusion, with whatever “solutions” the anti-aging industry has to offer.

I live in a town in which most people are “getting old” – not much opportunity for the young to work, to develop a career, to join the circus of material wealth and ambition. Traditionally, young people have returned to the area after college, and a stint in corporate America, time in the military, or success in finding a spouse. Having “grown up” in this unique place, it was where they chose to establish families and to be close to loved ones. The Wyoming landscape and lifestyle have always been a fundamental fact in this choice to return, and it pulls relentlessly on those who leave.

Disastrous policies, and frankly criminal wars, prosecuted from Washington D.C. in league with corporate-Wall Street crooks, and funded by abused taxpayers, demonstrate the general belief on both coasts that the people who inhabit the “rest of the U.S.” just don’t matter. We are indeed worthless and disposable inferiors, willing to enrich a ruling class that despises us, and to literally die for “blood” profits in its service.

Our town needs new people to survive as a community; we need children and young families, but opportunity is lacking. Small businesses are closing and not reopening: the owners have retired and are dying off. Competition from online retailers has siphoned off local spending and old people have very little to spend anyway. Every dime goes to necessities and the obscene cost of healthcare.

The American dream left our town long ago. Wyoming’s existence has been plagued by Federal and corporate control from the beginning, when the railroad opened the West to outright looting of its resources by faraway “global” entities. Pillage of the land and its resources funded the American coastal empires; exploitation of immigrants provided cheap labor. “Colonialization” by U.S. and European nations was not limited to the invasion of “foreign lands” but happened here also – and continues to this day.

Native Americans (not being suited to corporate life and labor) were killed off with conscious purpose – a policy of mass murder; the remnants confined to “reservations” where their descendants are expected to remain “invisible” – to wither away and eventually die off, by a slow suicide of formerly unique human beings. Diversity? A smoke screen.

These thoughts occupy my meditations as I pass through a human being’s last opportunity for personal development. It’s a time of recognizing that the universe goes on without us; that our deepest questions will not be answered. It’s a time to understand that the individual cannot correct or improve much that goes on in an increasingly cluttered and entangled social world, which doesn’t mean that we ought not try to improve ourselves and our small areas of influence. Our lives are eventually “finished” for us by nature, in disregard for our insistence that our life is essential to the universe and therefore ought to go on forever.

____________________________________________

It is shocking to confront the fact that so much human effort, inventiveness, hard labor, suffering, and resource depletion was, and still is, devoted to the imaginary “immortality” of a few (not so admirable) individuals: Pharaohs, emperors, kings, dictators, war lords, ideologues, criminals, Popes and priests; not the best of humanity, but often the worst.

The big lie is an old lie: Immortality can be purchased. 

Yes, there is a pyramid for immortality-mortality also: the Pharaohs of our time will not be mummified. (A crude process of desiccation, which however has been wildly socially successful! They continue to be A-list celebrities who attract fans of the “rich and famous”.)

Today’s 1% equivalents will not be made immortal by being dried out like fish, cheese or jerky – no, they will be made “immortal” by means of “sophisticated” technology. What an advancement in human civilization! 

These immortality technologies, and lesser life extensions – replacement of organs and skeletal architecture, part by failing part – are being promoted as “mankind’s future”. What a lie! As if today’s Pharaohs really intend to share their immortality with 15 billion humans!

2045: The year Man becomes Immortal. (Estimates run to 15 billion of us by then.)

A few elite at the top may manage to purchase immortality of a limited sort: machines designed in their own image.

The mortal King Tut, a product of incest who died at age 19. How much human talent and potential has been wasted on fulfilling the fantasy of immortality for a predatory class of individuals?

It’s not King Tut, the Insignificant, who is immortal, but the lure of his “real estate” holdings, elite addresses, golden household furniture and knickknacks, layers of stone coffins, granite “countertops”, Jacuzzi bath tubs, fabulous jewelry, and rooms with a view of eternity, that keeps the envious modern social tourist coming back. 


This is not King Tut. This is a fabulous work of propaganda made by artisans (Pharaohs had to impress the Gods in order to become a god – you wouldn’t show up for “judgement day” in anything less than the most impressive selections from your wardrobe), who rarely get credit (nameless) for their “creation of brands and products” that supply the magical connections necessary for supernatural belief in the pyramid of social hierarchy as the “definitive and absolute model” of the cosmos.

Magic consists of the “transfer of power” between the “immortal mask” and the unimpressive person; the “mask” has become King Tut in the belief system of the socially-obsessed viewer.  


Mystified Asperger Female / Sexual Assault and the Media

I shouldn’t have to say this, but I will: Any assault on another person is an assault. The “measure of severity” and consequence-punishment is a socio-cultural determination. Sexual assault has traditionally been considered a separate and “special” case, with various cultures having very different attitudes, customs and laws surrounding “who owns” a person’s body. It is a subject basic also to slavery; slavery is “ownership of body, soul and mind” of another human being. Traditionally, females have been subject to “ownership” – from outright slavery, to marriage customs, to “simply being inferior” by virtue of being biologically female, and by supposedly being little more than a child in “intelligence” and self-actuation. This has been the social condition of females for all of recorded history.

Much of how modern social humans “view” sex – and the myriad complications heaped, by hundreds of thousands of discussions, negotiations, codes, laws, practices, controls, moral-ethical stances, criminal statutes, marriage contracts and the consequent “control” of children, on what is a biological necessity – is rooted in this concept of “ownership”.

The qualitative and functional hierarchy goes like this:

Men own women.

Men own children, because men own women.

Men choose when and where to have sex with women and children.

UH-OH! That’s a recipe for male-on-male conflict, which poses an immense threat to society.

The hierarchy forms:

Top Males choose for lesser males. (The history of male access to females is clear about this being extremely important.) There’s a distinct “Top Predator” hierarchy of “sexual privilege”. That hierarchy of restricted access to sex is one very big reason why males want to be “Top Males”. (See Genghis Khan and his Y haplogroup.)

This “set up” hasn’t changed just because a bunch of American women (me included) have decided, in the last century or so, that this is a fundamentally “bad system” for females. The campaign for equality with men – in fact, for opportunity and aspiration – has largely been the purview of women who have had opportunities for education, work and personal expression due to family circumstance and expectations. Class distinctions.

The current “eruption” of female anger toward an inarguably predatory “sexual” culture comes from women who have managed to “gain some measure of power” – in media, politics, entertainment – essentially $$$$. It’s politics, pure and simple. Why wouldn’t women who have gained a foothold in the status, power, and wealth hierarchy “turn on” males who are now their “equals” – that is, “competitors” – in politics, business and media-entertainment? And the traditional male hierarchy “permits” and even requires that younger males “knock off” Top Males who are “declining in potency”.

Meanwhile. What about the other 99% of men and women?

Most cannot afford to do anything but “slog on” trying to find the ways and means to have a decent life. A revolution is well underway that affects all of us. Men benefit from having strong female partners; they must learn not to abuse women who are adding much “good” to their lives. At the risk of being “optimistic”, which goes against my practical Asperger instincts, I would say that most men understand this, but they are up against the male “way of being” as dictated by thousands of years of cultural tradition, in a way that is fundamentally different than the experience of females. Males are “somebody” by virtue of being male. No matter how low on the pyramid they fall, there is always 50% of the population they “outrank”. Embracing equality requires a profound individual rejection of male tyranny.


Blood-Sucking Parasites in D.C. / Know Your Predators

Elf You!

Social Security, Treasury target taxpayers for their parents’ decades-old debts

April 10, 2014

A few weeks ago, with no notice, the U.S. government intercepted Mary Grice’s tax refunds from both the IRS and the state of Maryland. Grice had no idea that Uncle Sam had seized her money until some days later, when she got a letter saying that her refund had gone to satisfy an old debt to the government — a very old debt.

When Grice was 4, back in 1960, her father died, leaving her mother with five children to raise. Until the kids turned 18, Sadie Grice got survivor benefits from Social Security to help feed and clothe them. Now, Social Security claims it overpaid someone in the Grice family — it’s not sure who — in 1977. After 37 years of silence, four years after Sadie Grice died, the government is coming after her daughter. Why the feds chose to take Mary’s money, rather than her surviving siblings’, is a mystery.

Across the nation, hundreds of thousands of taxpayers who are expecting refunds this month are instead getting letters like the one Grice got, informing them that because of a debt they never knew about — often a debt incurred by their parents — the government has confiscated their check. The Treasury Department has intercepted $1.9 billion in tax refunds already this year — $75 million of that on debts delinquent for more than 10 years, said Jeffrey Schramek, assistant commissioner of the department’s debt management service. The aggressive effort to collect old debts started three years ago — the result of a single sentence tucked into the farm bill lifting the 10-year statute of limitations on old debts to Uncle Sam. No one seems eager to take credit for reopening all these long-closed cases.

A Social Security spokeswoman says the agency didn’t seek the change; ask Treasury. Treasury says it wasn’t us; try Congress. Congressional staffers say the request probably came from the bureaucracy.

The only explanation the government provides for suddenly going after decades-old debts comes from Social Security spokeswoman Dorothy Clark: “We have an obligation to current and future Social Security beneficiaries to attempt to recoup money that people received when it was not due.”

Since the drive to collect on very old debts began in 2011, the Treasury Department has collected $424 million in debts that were more than 10 years old. Those debts were owed to many federal agencies, but the one that has many Americans howling this tax season is the Social Security Administration, which has found 400,000 taxpayers who collectively owe $714 million on debts more than 10 years old. The agency expects to have begun proceedings against all of those people by this summer.

“It was a shock,” said Grice, 58. “What incenses me is the way they went about this. They gave me no notice, they can’t prove that I received any overpayment, and they use intimidation tactics, threatening to report this to the credit bureaus.”

Grice filed suit against the Social Security Administration in federal court in Greenbelt this week, alleging that the government violated her right to due process by holding her responsible for a $2,996 debt supposedly incurred under her father’s Social Security number.

Social Security officials told Grice that six people — Grice, her four siblings and her father’s first wife, whom she never knew — had received benefits under her father’s account. The government doesn’t look into exactly who got the overpayment; the policy is to seek compensation from the oldest sibling and work down through the family until the debt is paid.
 

The Federal Trade Commission, on its Web site, advises Americans that “family members typically are not obligated to pay the debts of a deceased relative from their own assets.” But Social Security officials say that if children indirectly received assistance from public dollars paid to a parent, the children’s money can be taken, no matter how long ago any overpayment occurred.

“While we are responsible for collecting delinquent debts owed to taxpayers, we understand the importance of ensuring that debtors are treated fairly,” Treasury’s Schramek said in a statement responding to questions from The Washington Post. He said Treasury requires that debtors be given due process.

Social Security spokeswoman Clark, who declined to discuss Grice’s or any other case, even with the taxpayer’s permission, said the agency is “sensitive to concerns about our attempts to arrange repayment of overpayments.” She said that before taking any money, Social Security makes “multiple attempts to contact debtors via the U.S. Mail and by phone.”


American Pop Chart Toppers / 1940-2016 WEIRD!

What a strange trip! Pretty damn “kitschy” 

I think Americans are the weirdest people on the planet, but in our own estimation, we set the standard for NORMAL. Aye, yai, yai!

 

J.E. Robison / Where has all the Autism funding gone?

I don’t follow John Elder; I do understand that he’s tried to work within the “official Autism community” to produce change. It seems he’s finally waking up to the exploitation-for-profit program that is the Autism Industry.

Sex, Lies, and Autism Research—Getting Value for Our Money

How can we get tangible benefit from the millions we spend on autism science? (No, it’s not science; it’s a business.)

The U.S. government is the world’s biggest funder of autism research.  For the past decade I have had the honor of advising various agencies and committees on how that money should be spent. As an adult with autism, sometimes I’ve been pleased at our government’s choices. Other times I’ve been disappointed. Every now and then I turn to reflect: What have we gotten for our investment?

Autistic people and their parents agree on this: The hundreds of millions we’ve spent on autism research every year have provided precious little benefit to families and individuals living with autism today. Over the past decade the expenditures have run into the billions, yet our quality of life has hardly changed at all.

It would be one thing if massive help was just around the corner, but it’s not. There are no breakthrough medicines or treatments in the pipeline. Autistic people still suffer from GI pain, epilepsy, anxiety, depression, and a host of other issues at the same rates we did before any of this research was funded.

I don’t mean to suggest that nothing has been accomplished.  Scientists have learned a lot. They know more about the biological underpinnings of autism. Researchers have found hundreds of genetic variations that are implicated in autism. We’ve quantified how autistic people are different with thousands of studies of eye gaze, body movement, and more. Scientists are rightly proud of many of their discoveries, which do advance medical and scientific knowledge. What they don’t do is make our lives better today. (Sorry John, that you feel you still need to “support” a corrupt system by buying into false claims of scientific progress or the value of bogus research.)

Why is that?

In the past I’ve written about the idea that taxpayer-funded research should be refocused on delivering benefit to autistic people. What I have not written about is why that hasn’t happened, at the most fundamental level.

The answer is simple: Until quite recently, autistic people were not asked what we needed.

There are many reasons for that. Autism was first observed in children, and no one expects children to have adult insight and self-reflection. When autism was recognized in adults, they were assumed to be too cognitively impaired to participate in conversations about their condition. Finally, in the spirit of the times, doctors often assumed that they knew best. They were the trained professionals, and we were the patients (or the inmates). (Are we confusing “medical” doctors with non-medical psychologists?)

So doctors studied every question they could imagine, and then some, seldom seeking our opinions except in answer to their research questions. They assumed they knew what “normal” was, and we weren’t it. Countless million$ went down the rabbit hole of causation studies, whether in genetics, vaccines, or other environmental factors. Don’t get me wrong—the knowledge we’ve gotten is valuable for science. (Not really! It’s been valuable for the funding of universities, academics and research institutions) It just did not help me, or any autistic person I know. (It wasn’t INTENDED to help “autistic” people or their families).

Millions more have been spent observing us and detailing exactly the ways in which we are abnormal. Only recently have some scientists begun to consider a different idea: Perhaps “normal” is different for autistic people, and we are it. Again the studies enhanced the scientists’ knowledge (of how to profit from human suffering) but didn’t do much to help us autistics.

Then there are the educators and psychologists. They observed our “deviations” and then considered therapy to normalize us. That led to ABA and a host of other therapies. Some of those have indeed been beneficial, but the money spent on beneficial therapy is just a drop in the bucket when considering what we taxpayers have funded overall.

Want a different and better outcome? Ask actual autistic people.

We can tell you what our problems are, in many cases very eloquently. I’m not going to re-state all our needs here. I’ll tell you this: Whenever this topic comes up at IACC (the Federal committee that produces the strategic plan for autism for the U.S. government), the priorities of autistic people seem rather different from those of the researchers our government has been funding for so long. (It’s a corrupt system; part of the general policy to redistribute wealth “up to” the 1%).

Autistic people have many disparate needs, but they all boil down to one thing: We have major challenges living in American society. Medical problems, communication challenges, learning difficulties, relationship issues, and chronic unemployment are all big deals for us. The issues are well laid out and many.

Before autistic people began speaking out in great numbers, all we had was parent advocacy. We should not dismiss that, and parents still have a role today, particularly in advocacy for small children and children who are older but unable to effectively advocate for themselves.

Even as we thank parents for their service, it’s time to recognize that autistic voices (some of which belong to parents too) should be taking the lead.

As much as parents did for us, they also unwittingly contributed to harm. Parents misinterpreted harmless stimming, and encouraged therapists to suppress it, leaving us scarred in adulthood. Many autistics of my generation remember being placed into programs for troubled children with parental encouragement in hopes we’d become “more normal.” We didn’t. Parents have given us bleach enemas, and some of us have died from misguided chelation and other treatments to “cure” our autism.

I don’t blame parents for any of that. They did their best, given the knowledge of the day. But it’s a different day now. The children who grew up being “normalized” can talk about how it affected them, and parents and clinicians of today would be wise to listen.

Autistic voices are finally speaking in large numbers and it’s time to pay attention. No one else knows life with autism. Parents and non-autistic researchers are sometimes listening. Hard as this may be for them to hear, they are always guessing. With autistics speaking out all over the world, that’s no longer good enough.

For the first time, IACC has recognized this in the 2017 Strategic Plan Update. They say it’s time for a paradigm shift in how we do research. We need to focus on the needs of people living with autism today. That’s a realization that I appreciate, and it’s long overdue. (OMG! Please don’t fall for this universal neurotypical ploy: We wrote it down: SEE? End of story.)

So what’s the answer to why we’ve gotten so little return on our autism research investment? No one asked the autistic people what we wanted. It’s that simple. Had we been able to articulate our challenges, with the framework of knowledge we have today, and had we been listened to, we’d be in a very different place today.

Today is gone, but tomorrow isn’t here yet, and it can be different.

(c) John Elder Robison (Thank you, John, for “stepping up” to the truth.)

John Elder Robison is an autistic adult and advocate for people with neurological differences. He’s the author of Look Me in the Eye, Be Different, Raising Cubby, and Switched On. He serves on the Interagency Autism Coordinating Committee of the U.S. Dept. of Health and Human Services and many other autism-related boards. He’s co-founder of the TCS Auto Program (a school for teens with developmental challenges), and he’s the Neurodiversity Scholar in Residence at the College of William and Mary in Williamsburg, Virginia, and a visiting professor of practice at Bay Path University in Longmeadow, Massachusetts.
The opinions expressed here are his own.

_________________________________________________

What more does the Autism Industry need?

The Director of the National Institute of Mental Health declares that Autism is a “real” epidemic and not due to changes in labels, diagnostic criteria and fear-mongering. No objective evidence needed when you have the Federal Government working FOR YOU.

TACA is an “interesting” NON-PROFIT – check out their website and the financial statements they provide. It’s hard to find out how much $$$ actually filters down to real people outside the “charity”. Here’s their “agenda”. Note the cliché about someday finding a “cure”, which is not going to happen: it creates a classic “American Non-Profit” demand for “donations” and funding in perpetuity. Think of all those “charities” that have collected billions for “research” etc., without a “cure” in sight.