Child Abuse Around the World / Video (Warning)

Child abuse as socially-sanctioned religious practice… social humans! Sick behavior.

Defining Autism / The Unicorn Problem

Fantastically Wrong: The Weird, Kinda Perverted History of the Unicorn (substitute Autism for the unicorn myth in the following:)

https://www.wired.com/2015/02/fantastically-wrong-unicorn/

If you’re looking to figure out how an ancient myth started to get out of hand, a good place to start is with the great Roman naturalist Pliny the Elder, whose epic encyclopedia Natural History stood largely as fact for some 1,600 years. Problem was, Pliny wasn’t the most incredulous of writers, and crammed his encyclopedia with pretty much any account he could get his hands on.

Autism is born – Autism as we know it…

1980 – Autism was added to the Diagnostic and Statistical Manual of Mental Disorders, Third Edition (DSM-III) as “infantile autism”. This addition made it possible for doctors to accurately diagnose Autism and to easily differentiate it from schizophrenia.

1987 – “Autistic Disorder” replaced “Infantile Autism” in the manual and gave a more expansive explanation of the diagnosis.

1991 – Schools began to identify and serve students with Autism following the federal government’s decision to make Autism a special education category. ($$$$$)

“The unicorn,” Pliny wrote, “is the fiercest animal, and it is said that it is impossible to capture one alive. It has the body of a horse, the head of a stag, the feet of an elephant, the tail of a boar, and a single black horn three feet long in the middle of its forehead. Its cry is a deep bellow.”

Note the ever-increasing list of Autism “symptoms” (behaviors) without any logical coherence, just like the growing fantasy around the unicorn. The list grows ever more fabulous with each recounting of the mythical beast called Autism.

‘Unicorns huh? You look more like a couple of party animals to me.’

The unicorn then shows up in various places in the Bible, at least according to some translations (it’s sometimes instead referred to as the oryx, a kind of antelope whose antlers were indeed sold as unicorn horns in medieval times, or as the aurochs, a massive type of cattle that went extinct in the 17th century). Here, its fierceness is affirmed. In Numbers 24:8, for instance: “God brought him forth out of Egypt; he hath as it were the strength of an unicorn: he shall eat up the nations his enemies, and shall break their bones, and pierce them through with his arrows.”

In the 7th century, the scholar Isidore of Seville chimed in, noting that the unicorn “is very strong and pierces anything it attacks. It fights with elephants and kills them by wounding them in the belly.” He also helped popularize the myth that would serve as a hallmark in European folklore for centuries to come: Catching a unicorn is impossible…unless you have access to a virgin woman. “The unicorn is too strong to be caught by hunters,” he writes, “except by a trick: If a virgin girl is placed in front of a unicorn and she bares her breast to it, all of its fierceness will cease and it will lay its head on her bosom, and thus quieted is easily caught.” It’ll suckle until it’s lulled to sleep. So…yeah.

Note the “behavior-based” description of the unicorn – subjective, imaginary, wildly illogical, supernatural – not a “real” animal at all.  

…Not only was the natural history of the animal given, but each was then compared to a biblical figure. And the unicorn stood for Christ, since he was captured and put to death like the unicorn is done in by the virgin (though pretty much every other animal was also compared to Christ, even the pelican, which was said to peck at its own breast to revive its young with blood, like Jesus shed his own blood for us).

Thus the unicorn became firmly implanted in European lore. And Autism in American Psychology lore. What followed was a full-blown mania for their horns, which were said to detect poison if you stirred them around in your food or drink. They went for tens of thousands of dollars in today’s money, and were particularly popular among paranoid royalty. More industrious users who didn’t want to wait around to have their food poisoned would grind up the horns—usually those of the oryx or narwhal (whose horn is actually a giant tooth)—to gain immunity from toxins.

Over in the East, royalty had a rather more complicated relationship with their version of the unicorn, the aforementioned kirin, or qilin. Its appearance was said to foretell the birth of a royal baby, which is nice of it, but can also predict an imminent death, which is not so nice. In the 15th century, a giraffe was brought to China for the first time and presented to the emperor as a kirin, which was a gutsy move considering its proclivities for letting royalty know they’re going to die soon. The emperor, though, dismissed it as a fraud and went on to live another 10 years. Does a giraffe look anything like the mythical unicorn? And yet … maybe it does! Who knows? But unicorns must surely exist? 

No one seems to notice, throughout this sad charade, that just like the unicorn, the mythical beast called Autism does not require proof of its existence. No such animal as the unicorn was ever shown to be real; nothing more than a barrage of anecdotal reports, subjective opinions and imaginary conclusions has ever been presented by psychologists.

A Myth Is Born / “Autism, 1994”

The myth of the unicorn may have come from sightings of antelope and such ungulates with only one horn, having either been born with the defect or lost the horn when scrapping with a predator or one of its own kind. Less likely still is seeing a normal antelope from afar in profile, since that would only last as long as the animal didn’t move.

Reality is of no importance; social typical inattentional blindness conveniently “denies” any physical evidence that is contrary to social dogma.  

A far more likely culprit is the Indian rhinoceros, and clues for this are sprinkled throughout the early accounts—indeed, the unicorn is sometimes referred to as the Indian ass. Pliny, for instance, mentions that the unicorn has “the feet of an elephant,” a rhino’s feet in fact being not hooved like a horse’s, but fleshy like an elephant’s. He also notes that it has “the tail of a boar,” much like a rhino’s, “and a single black horn three feet long in the middle of its forehead.” Writers would only later describe the horn as white. Evidence? What evidence? Neurotypicals can look a rhinoceros in the face and call it a moose. If authorities say it is a duck, they will then call it a duck – or an Autistic duck.   

The ancient Greeks and Romans, you see, had been making forays into India and bringing back tales of the strange beasts there, and the facts tended to get a bit…lost. Cotton, for instance, was said to grow in India as an actual lamb that sprouted from the ground, just hanging there patiently producing cotton. And while Pliny actually did a pretty good job of describing the rhino, his popularization of the “unicorn” picked up more and more improbabilities as the centuries wore on. We also know that the ancient Chinese had contact with rhinos from art made out of their horns, so the animal could well have also inspired the kirin.

 

 

The tragedy of this story is that people who ought to know better – medical doctors, geneticists, and neuroscientists – have “bought” the myth of Autism as a “thing” in itself, when it is merely a collection of symptoms due to real, specific, and different causes.

Symptoms can be as “supernatural” as a brain that “lights up wrong” – or social difficulties judged to “be annoying” by self-centered adults: parents, teachers and Puritanical psychologists who demand obedience, and who enforce an unsuccessful social behavior regime specific to “certain Americans” but alien to many “diverse” groups, cultures and child-raising traditions. Kids are being declared “defective” by opinionated psychologists and miscellaneous imperious adults on the basis of utterly subjective criteria.

Autism is not a “disease,” a mental illness, or even a MEDICAL DIAGNOSIS. It is a grab bag of social behavior that is rejected by psychologists and proclaimed to be “defective”. It has become fear-induced hysteria in the U.S.

Autism Spectrum Disorder / DSM-5 299.00 (F84.0)

Diagnostic Criteria (What a joke!)

A.      Persistent deficits in social communication and social interaction across multiple contexts, as manifested by the following, currently or by history (examples are illustrative, not exhaustive; see text): This is utterly subjective; it depends on the “opinion” of the person doing the “reporting” – hearsay evidence, not admissible in a court of law, but “good enough” for labeling a child as defective. No standards for comparison are provided: “behaviors” are not quantifiable; those listed vary wildly from family to family and culture to culture.

1.       Deficits in social-emotional reciprocity, ranging, for example, from abnormal social approach and failure of normal back-and-forth conversation; to reduced sharing of interests, emotions, or affect; to failure to initiate or respond to social interactions. Abnormal and reduced compared to what “standard”? There are no objective criteria in these judgements. None of these blah, blah, blah criteria are even testable!

2.       Deficits in nonverbal communicative behaviors used for social interaction, ranging, for example, from poorly integrated verbal and nonverbal communication; to abnormalities in eye contact and body language or deficits in understanding and use of gestures; to a total lack of facial expressions and nonverbal communication. So vague, arbitrary and subjective as to be ridiculous. Where are the objective standards and “proof” that the vast majority of human children conform to these “undefined but absolutist” subjective interpretations of behavior? Where is the proof that any two people observing a child will even agree with each other that these observations are factual? No “facts” are allowed!

3.       Deficits in developing, maintaining, and understanding relationships, ranging, for example, from difficulties adjusting behavior to suit various social contexts; to difficulties in sharing imaginative play or in making friends; to absence of interest in peers. WOW! This paragraph is so general that it could apply to any generic human being alive on planet earth.

Specify current severity:

Severity is based on social communication impairments and restricted repetitive patterns of behavior (see Table 2).

B.      Restricted, repetitive patterns of behavior, interests, or activities, as manifested by at least two of the following, currently or by history (examples are illustrative, not exhaustive; see text):

Any child could be diagnosed as autistic using this “potpourri” of “socially objectionable” activities!

1.       Stereotyped or repetitive motor movements, use of objects, or speech (e.g., simple motor stereotypies, lining up toys or flipping objects, echolalia, idiosyncratic phrases). If you throw enough unrelated behaviors at a wall (and this is quite a mix of motor, language, organizational, speech and cognitive behaviors!), then one will likely “stick”.

2.       Insistence on sameness, inflexible adherence to routines, or ritualized patterns of verbal or nonverbal behavior (e.g., extreme distress at small changes, difficulties with transitions, rigid thinking patterns, greeting rituals, need to take same route or eat same food every day). Describes half the people in any office environment every day; and 100% of humans at some point, depending on “what kind of day” they are experiencing.

3.       Highly restricted, fixated interests that are abnormal in intensity or focus (e.g., strong attachment to or preoccupation with unusual objects, excessively circumscribed or perseverative interests). Such as obsessions that “normal kids” display: playing video games, constant social media monitoring, texting, uploading selfies, extremely limited self-image conformity, incessant talking, cruelty to other children, foul language, and mutual verbal abuse – but these are “not unusual” and therefore acceptable. But that child who likes to arrange toys by size and color? A threat to the social order…

4.       Hyper- or hyporeactivity to sensory input or unusual interests in sensory aspects of the environment (e.g., apparent indifference to pain/temperature, adverse response to specific sounds or textures, excessive smelling or touching of objects, visual fascination with lights or movement). Any reactions to the environment other than those displayed by a “perfect child” (a being as mythical as the Unicorn), are pathological. The perfect child notices nothing in the environment, even if it is dangerous, toxic or extraordinarily beautiful. No personal preferences are allowed. 

Specify current severity:

Severity is based on social communication impairments and restricted, repetitive patterns of behavior (see Table 2).

C.      Symptoms must be present in the early developmental period (but may not become fully manifest until social demands exceed limited capacities, or may be masked by learned strategies in later life). How non-specific! Anywhere, anytime, any behavior, observable or not – pathological. 

D.      Symptoms cause clinically significant impairment in social, occupational, or other important areas of current functioning. What constitutes clinically significant? Your clinic, or mine? The gym teacher’s or school counselor’s office? The BA in psychology who “does” therapy? Anyone and everyone is “diagnosing” Autism today. 

E.       These disturbances are not better explained by intellectual disability (intellectual developmental disorder) or global developmental delay. Intellectual disability and autism spectrum disorder frequently co-occur; to make comorbid diagnoses of autism spectrum disorder and intellectual disability, social communication should be below that expected for general developmental level. WOW! How lame: Autism “symptoms” are the RESULT of specific disabilities due to birth defects, premature birth, emotional and physical trauma, injury or disease, rare genetic conditions and myriad unknown “causes” – including the fact that human beings are not identical “clones” of imaginary supernatural “templates” but individuals with a range of personalities and temperaments AND BRAIN ORGANIZATION. Who sets the standards and parameters for “judging” and “rating” childhood behavior? Just who are these people? No one asks; neurotypicals accept social authority as being reliable in the same way that the Laws of Physics are reliable.

Note: Individuals with a well-established DSM-IV diagnosis of autistic disorder, Asperger’s disorder, or pervasive developmental disorder not otherwise specified should be given the diagnosis of autism spectrum disorder. Individuals who have marked deficits in social communication, but whose symptoms do not otherwise meet criteria for autism spectrum disorder, should be evaluated for social (pragmatic) communication disorder. Wow! Nothing like throwing your own previous diagnostic criteria under the bus! Why should anyone trust the existence of Autism to start with?

Specify if:
With or without accompanying intellectual impairment
With or without accompanying language impairment
Associated with a known medical or genetic condition (which is the ACTUAL PROBLEM) or environmental factor
(Coding note: Use additional code to identify the associated medical or genetic condition.)
Associated with another neurodevelopmental, mental, or behavioral disorder
(Coding note: Use additional code[s] to identify the associated neurodevelopmental, mental, or behavioral disorder[s].)
With catatonia (refer to the criteria for catatonia associated with another mental disorder, pp. 119-120, for definition) (Coding note: Use additional code 293.89 [F06.1] catatonia associated with autism spectrum disorder to indicate the presence of the comorbid catatonia.)

Codes – this Autism Diagnosis charade is driven by insurance reimbursement: it is the CODE that matters, not the accuracy of the diagnosis.

“Autism” is a selective symptom-based system for socially pathologizing and isolating children who suffer from a range of effects previously known as “mental retardation” – conditions caused by real physical impairments.
Today, Autism has grown into a highly profitable industry through the sweeping inclusion of non-medical, non-scientific labeling of children who are not “retarded” but “socially unacceptable,” using vague and subjective opinions dictated by overblown psychological dogma, which is “prescriptive” in origin and structure. It is a “super-fraud” that destroys families and dooms children to a lifetime of social “subhuman” status, and it prevents effective provision of real treatment for real problems experienced by real people.

Table 2: Severity levels for autism spectrum disorder – “The Table of Social Doom”

Level 3 – “Requiring very substantial support”
Social communication: Severe deficits in verbal and nonverbal social communication skills cause severe impairments in functioning, very limited initiation of social interactions, and minimal response to social overtures from others. For example, a person with few words of intelligible speech who rarely initiates interaction and, when he or she does, makes unusual approaches to meet needs only and responds to only very direct social approaches.
Restricted, repetitive behaviors: Inflexibility of behavior, extreme difficulty coping with change, or other restricted/repetitive behaviors markedly interfere with functioning in all spheres. Great distress/difficulty changing focus or action.

Level 2 – “Requiring substantial support”
Social communication: Marked deficits in verbal and nonverbal social communication skills; social impairments apparent even with supports in place; limited initiation of social interactions; and reduced or abnormal responses to social overtures from others. For example, a person who speaks simple sentences, whose interaction is limited to narrow special interests, and who has markedly odd nonverbal communication.
Restricted, repetitive behaviors: Inflexibility of behavior, difficulty coping with change, or other restricted/repetitive behaviors appear frequently enough to be obvious to the casual observer and interfere with functioning in a variety of contexts. Distress and/or difficulty changing focus or action.

Level 1 – “Requiring support”
Social communication: Without supports in place, deficits in social communication cause noticeable impairments. Difficulty initiating social interactions, and clear examples of atypical or unsuccessful responses to social overtures of others. May appear to have decreased interest in social interactions. For example, a person who is able to speak in full sentences and engages in communication but whose to-and-fro conversation with others fails, and whose attempts to make friends are odd and typically unsuccessful.
Restricted, repetitive behaviors: Inflexibility of behavior causes significant interference with functioning in one or more contexts. Difficulty switching between activities. Problems of organization and planning hamper independence.

Self-Organizing Systems / Beginner Video

Note the archaic belief in an External Agent as the source of order: this is where today’s “social systems” and explanations for human behavior are STUCK. 

 

 

Psychotropic Drug Prescriptions / Link to Suicide, Violence

Let’s face it: The “helping, caring, fixing” industry has a policy of “carpet-bombing” American children and adults with dangerous and lethal drugs, with absolutely no regard for human life – WHY?

PDF: https://www.veterans.senate.gov

A REVIEW OF HOW PRESCRIBED PSYCHIATRIC MEDICATIONS COULD BE DRIVING MEMBERS OF THE ARMED FORCES AND VETS TO ACTS OF VIOLENCE & SUICIDE

A Report by Citizens Commission on Human Rights International, April 2014

INTRODUCTION

The recent tragedies at Fort Hood and the Washington, D.C. Navy Yard are deeply concerning because of the increasing reports of military and veteran violence and suicide in our Armed Forces. Though there can be many reasons for killing oneself or others, the possible role of psychiatric drugs in these tragedies has not been effectively explored. It would be a serious mistake to ignore this factor.

  • Researchers have identified 25 psychiatric medications disproportionately associated with violence, including physical assault and homicide.
  • There are 22 international drug-regulatory agency warnings about these medications causing violent behavior, mania, psychosis and homicidal ideation.
  • There are almost 50 international drug-regulatory agency warnings about psychiatric drugs causing suicidal ideation.
  • One in six American service members were taking at least one psychiatric medication in 2010. More than 110,000 Army personnel were given antidepressants, narcotics, sedatives, antipsychotics and anti-anxiety drugs while on duty in 2011.

2008-2010

  • Between 2005 and 2011 the military increased its prescriptions of psychoactive drugs (antipsychotics, sedatives, stimulants and mood stabilizers) by almost 700 percent, according to The New York Times.
  • Prescriptions written for antipsychotic drugs for active-duty troops increased 1,083 percent from 2005 to 2011, while the number of antipsychotic drug prescriptions in the civilian population increased just 22 percent.
  • The Department of Defense Suicide Event Reports (DoDSERs) for 2012 reported that the Armed Forces Medical Examiner System (AFMES) found that as of 31 March 2013, there were 319 suicides among Active component Service members and 203 among Reserve component Service members. 92.8 percent of the Service Members were male, with 39.6 percent aged between 17 and 24.
  • DoDSERs were only included in this report if they were submitted by April 1, 2013, and thus there are discrepancies between the figures reported by the AFMES and the number of DoDSERs included in the DoDSER 2012 report. In addition, there were some DoDSERs that were submitted for events that were still pending a final determination as a suicide.
  • A total of 841 Service members had one or more attempted suicides reported in the DoDSER program for CY 2012.
  • Some 134 suicide DoDSERs (42.1 percent) and 452 suicide attempt DoDSERs (52 percent) indicated a history of a behavioral disorder.
  • The reports also indicated that “93 decedents (29.2 percent) were reported to have ever taken psychotropic medications. A total of 63 decedents (19.8 percent) were known to have used psychotropic medications within 90 days prior to suicide.” However, this is likely to be much higher as almost 21 percent of both the “Ever Taken Psychotropic Medication” and the “Use of Psychotropic Medication last 90 days” questions were answered with “Data Unavailable.” Potentially up to 50 percent of those committing suicide had at some point taken psychiatric drugs and up to nearly 46 percent had taken them within 90 days.

Psychotropic: A term coined in the late 1940s by Ralph Waldo Gerard, an American behavioral scientist and physiologist, to medically describe medication capable of affecting the mind, emotions, and behavior—from the Greek, “mind-turning.”

  • The majority (55 percent) of service members who died by suicide during 2008-2010 had never deployed and 84 percent had no documented combat experiences. In the 2012 DoD Suicide Event report on suicide, 52.2 percent of completed suicides had not been deployed in the recent wars and 56.5 percent of suicide attempts had no reported history of deployment.
  • The suicide rate increased by more than 150 percent in the Army and more than 50 percent in the Marine Corps between 2001 and 2009. From 2008 to 2010, military suicides were nearly double the number of suicides for the general U.S. population, with the military averaging 20.49 suicides per 100,000 people, compared to a general rate of 12.07 suicides per 100,000 people.
  • There are hundreds of “sudden deaths” among veterans who have been prescribed massive cocktails of psychotropic drugs, which a leading neurologist says are “probable sudden cardiac deaths.” Yet the practice of prescribing seven or more drugs documented to cause cardiac problems, stroke, violent behavior and suicide (to name but a few of the adverse effects) is still prevalent.

PSYCHOTROPIC MEDICATIONS: ACTS OF VIOLENCE

  • FORT HOOD GUNMAN IVAN LOPEZ, 34, was taking Ambien, a sleep agent, and other psychiatric drugs for depression and anxiety when he shot dead three colleagues and injured 16 others before killing himself on April 2, 2014.
  • WASHINGTON NAVY YARD SHOOTER AARON ALEXIS, 34, had been prescribed Trazodone. He killed 12 people and wounded 8, before being killed by police on Sept. 16, 2013.
  • SOLDIER PFC. DAVID LAWRENCE, 20, and MARINE LANCE CPL. DELANO HOLMES were both taking Trazodone and other psychiatric medications when they killed a Taliban commander in his prison cell and an Iraqi soldier respectively.

PSYCHOTROPIC MEDICATIONS: VIOLENCE RISKS

  • It is important to understand that the mental health system for our Armed Forces and veterans often involves the use of psychotropic and neuroleptic drugs. Between 2001 and 2009, orders for psychiatric drugs for the military increased seven-fold. In 2010, the Army Times reported that one in six service members were taking some form of psychiatric drug.
  • A National Institutes of Health website warns consumers to report if while taking Trazodone—one of the drugs prescribed to the Navy Yard shooter—they are “thinking about harming or killing yourself,” experience “extreme worry; agitation; panic attacks…aggressive behavior; irritability; acting without thinking; severe restlessness; and frenzied abnormal excitement….”
  • Psychologists have blamed the surge in random acts of violence among the U.S. military on the heavy use of prescribed drugs. “We have never medicated our troops to the extent we are doing now… And I don’t believe the current increase in suicides and homicides in the military is a coincidence,” states Bart Billings, a former military psychologist and combat stress expert.
  • The Food and Drug Administration (FDA) MedWatch system that collects adverse drug reports revealed that between 2004 and 2012, there were 14,773 reports of psychiatric drugs causing violent side effects including: 1,531 (10.4 percent) reports of homicidal ideation/homicide, 3,287 (22.3 percent) reports of mania and 8,219 (55.6 percent) reports of aggression.
  • Dr. David Healy, a psychiatrist and a former secretary of the British Association for Psychopharmacology, estimates that 90 percent of school shooters were users of antidepressants. These same medications are prescribed to at least 6 percent of our servicemen and women.

Supporting Information

“We have never medicated our troops to the extent we are doing now… The current increase in suicides and homicides is no coincidence.”

-Dr. Bart Billings, Fmr. Col. & Army Psychologist

This PDF has 34 pages of horrifying information, charts and statistics KNOWN to the VA, Congress and the “empathy experts” who are drugging our soldiers and destroying families.

 

 

Today’s Message from the Unconscious / Yes, it happens

“There is no way that as a human being, you won’t disturb the Earth.”

I have related in a few previous posts how my “mind works” (and everyone’s does, actually; but you have to listen for the products of the unconscious in order to make them conscious). I enjoy sleep; it’s an active state of rest, refreshment and dreams. Powerful thinking goes on; a type of thinking much older than conscious verbal thought. A direct link to collective memory – evolutionary memory. A vast reservoir that is encoded along with all the myriad instructions that begin to build a human body within a woman’s body – and after birth must be utilized to grow the infant into an adult form.

Traditional so-called primitive cultures keep the unconscious conduit open: sometimes through initiation rituals and physical breakdown of the conscious / unconscious barrier, by use of psychoactive concoctions or physical stress; sometimes through dream imagery interpretation and the activities of shamans, who act as both guides and “librarians” – who, thanks to their personality-brain type, can search the collective memory banks to “correct” whatever ails you or the community – paths, patterns and perceptions already worked out by natural processes.

If I’m lucky, a phrase or idea may linger from the night’s brain activity: it becomes an active “topic” for my word-based thinking, as if a basin of water with the drain plug in place had been left to fill and mix overnight, and that on waking a particular phrase serves to pull the plug, allowing the stored up potential of unconscious activity to be free to “do work” in the waking world. Geologic processes and events often supply the images for this dynamic relationship between what modern social people believe to be a dangerous divide between the “good” realm of conscious control and the “evil” realm of unconscious “trash and sewerage” – a tragic religious condemnation that has been imposed on a healthy system of human sensory experience, processing and creativity directed toward a goal of survival and reproduction of our specific “version” of animal life.  The unconscious is a powerful legacy of animal life that we have relegated to a sewer system, a septic tank, a dark region of monsters, dreadful impulses and dangers.

Myths from many cultures include Hell, the underworld, limbo or an after life in their scheme of things; some describe “that place” as a source of knowledge that is perilous to enter, but worth it for what can be found there. (The unconscious is “outside time” and therefore believed to be a place of reliable prophecy; an attractive lure to modern humans who desire to manipulate, dominate and control man and nature – hence the relentless and blinding quest for “magic” as the means to “cheat” the laws of nature.)

We can see that during the long course of the “evolution” of Hominids, what we call “unconscious processes”, mainly visual thinking, sensory thinking, acquisition of energy and interaction with the environment, and the task of growing and maintaining an animal body, were simply taken care of by the brain – and still are. Our pejorative use of the words “instinct and instinctual” knowledge and processing as something inferior, which “we” have left behind, is a nonsensical conclusion; only one of thousands of mistakes produced 24/7 by the supposedly “superior” (and demonstrably less intelligent) “conscious verbal function” that has been cultivated by modern humans.

Why would I state that the “unconscious” animal brain is more intelligent than the “modern verbal function” as a guidance system for human advancement and survival?

As a modern human “Asperger” who relies on the unconscious as the “go to” source for patterns, systems, connections, networks and explanations for “how the universe actually works”, it is OBVIOUS that nature itself provides the “master templates” for creating and implementing technological invention and innovation. Homo sapiens has “discovered” these templates (Laws of Physics) by means of mathematics, and the nature of these “languages of physical reality” remains a bit mysterious.

The “problem” arises with the assumption that the manifestation of technical ideas and products – as solutions to the inefficient and painful drudgery of manual labor as the only “power to do work” – also confers intelligence of a truly different type: Wisdom – an ability to “forecast” future consequences, both “positive” and “negative”, that potentially result from one’s actions, and the ability to modify present action accordingly. This is an almost impossible task for the human brain, and “we” know it. It’s why we invent or seek out “the gods” to supply rules and structures to compensate for our utter lack of critical foresight and judgement.

Several notions help clarify this predicament.

1. “Nature” has done the work of “foresight” for us: we have access to knowledge stored in “instinct / unconscious content” and in the conscious apprehension of “how the environment works” through trial and error manipulation of real objects and materials and more recently by means of the “abstract codes” underlying natural behavior that mathematics can decipher and which we can apply with great utility.

That is, foresight is not “located” in seeing the “future” but by understanding the “eternal present”. These patterns are not mystical, magical nor supernatural.

2. The deceptive mirage of “word thinking” goes unrecognized. The lure of being freed from the Laws of Nature is great! Word thinking is not “tied to” actual reality – its usefulness and value is in making propositions that owe no allegiance to the limits and boundaries of the “real world”. Word language CAN lead to rapid communication of information and dissemination of useful concepts, but! There is no guarantee that this “information” is accurate – most ideas are created to provide for the motivation and justification of time and energy being expended in the pursuit of inflicting injury and suffering on other humans, and the control and exploitation of resources, plants, animals and other life forms.

In fact, word thinking leads to the “illusion” of the reality and primacy of a supernatural domain, in which “magic” is the operating system; predatory humans “give themselves permission” via verbal constructs whose origin is assigned to this “supernatural realm” which justify the “dominance for personal gain and pleasure” – which does not correspond to the “role” of dominance in nature, which comes with heavy consequences.

3. Oh boy! Screw nature: I’m in control! “Bring on the spells, rituals, magic symbols, secret handshakes, rattles and drums, the abject obedience of “lesser beings” to my dictates.” This is where “we” humans are today: technically powerful, abysmally ignorant of the consequences of our actions – because we have cut ourselves off from access to the “the user’s manual” that is included “free” with every brain.

4. Instead, we have created a delusional and self-destructive “hatred and fear” of a vital evolutionary legacy; the unconscious has been selected and slandered by certain humans as the “cause” of pathologic behavior: mental illness, violence, depravity, abuse, “disobedience to social control” and to the “supernatural regime” of human social reality.

Much of human “bad behavior” can be traced directly to the steeply hierarchical structures that dominate modern humans. From the top down (from tyrants, Pharaohs and other psycho-sociopaths, to the ranks of those who are their “prey”) it is the distortion of manmade supernatural “order” as the “original and absolute truth” of human existence that damages the healthy growth and sanity of actual human beings. Much behavior that is destructive, abusive, cruel and irrational on the part of Homo sapiens is inevitable, given the abnormal stresses inherent in human environments.

Next: The real dangers that occur due to alienation of the “unconscious” and our attempts to “reunify” functions, processes and perceptions produced by the “whole” human brain.

From the Archives / Superstition, Mass Murder, Psychosis

Why am I “exposing” my thinking from many years ago? Because the frustration of “dealing with” social humans was so debilitating that I turned to a “new” asset – writing – in order to make my unconscious internal conflict something that I could “analyze” in terms of the social structure that mystified me.

That is, I discovered that nature had equipped me with thinking skills that could unlock the prison of human self-created misery. It’s ironic, I suppose, that finally “finding” that Asperger people, by whatever “name” one calls them, do exist, and that I am one of them, has actually “softened” my opinion of social typicals; modern humans are products of their brain type and obsessive social orientation, due to “evolutionary” trends and directions that they cannot control. The same can be said for neurodiverse and neurocomplex Homo sapiens: adaptation is guided by the environment; adaptations can be temporarily positive, but fundamentally self-destructive. “Being” Asperger, and exploring what that entails, has gradually allowed me to “be myself” – and to gain insight into the advantages of cognitive detachment in understanding “humanity” – which, contrary to psychologists, REQUIRES empathy – empathy that is learned and discovered by experience, and not by “magic”.

___________________________________________________________________________________________

From the archives:

Nature exists with or without us.

The Supernatural Domain is delusional projection; therefore, it is prudent to assume that any and all human ideas and assumptions are incorrect until proven otherwise! 

The supernatural realm is a product of the human mind – and most of its contents have no correlation with physical reality. As for the content that does correspond, mathematics supplies the descriptive language that makes it possible for us to predict events and create technology that actually works. Whatever jump-started human brain power, the results have been spectacular – from hand axes to planetary probes, from clay pots to cluster bombs. Designing simple tools is fairly easy; a thrown spear either travels true or it doesn’t. Improvements can be made and easily tested until “it works.”

Human beings not only learn from each other, but we observe and copy the behavior of other animals. Useful knowledge can be extracted from nonliving sources, such as the ability of water to do work.

Responses to the environment that belong to the category of conscious thought, and which are expressed by means of language (words and symbols), I would identify as The Supernatural Realm – a kind of warehouse or holding area for ideas waiting to be tested in the physical environment. Problems arise when we fail to test ideas! 

The ability to imagine objects that simply cannot exist, such as human bodies with functional wings attached, is remarkable as a source of useful imagination and dangerous mistakes. Ideas that produce aqueducts, sanitation, medical treatments, or aircraft correlate to conditions of physical reality, and therefore move out of fantasy and into a body of real knowledge. This system of observation, along with trial and error, and the building of a catalogue of useful environmental skills is what has made human adaptation to nearly all environments on earth possible. Each generation has capitalized on the real world techniques of the ancestors, but what about the content of the supernatural that has no value as a description of reality and which if tested, fails miserably?

Ironically this lack of correlation to reality may be what makes some ideas impossible to pry loose from the majority of human minds. Some supernatural ideas can easily piggyback onto acts of force: the religion of the conqueror needs no explanation nor justification. It is imposed and brutally enforced. The fact that the human brain can accommodate mutually impossible universes leads to fantastic possibilities and enormous problems. Without self-awareness and discipline, the result is a continual battle over ideas that are utterly insubstantial, but which are pursued with the furor of blind emotion.

There is widespread belief in the supernatural as an actual place in the sky, under the earth, or all around us, existing in a dimension in which none of the familiar parameters of reality exist, and that it is inhabited by powerful beings that magically take on the physical form of people, ghosts, animals, space aliens, meddlers, mind readers, winged messengers, law givers, deliverers of punishment – who stage car wrecks (then pick and choose who will be injured or die in them), killer tornados, and volcanic eruptions. These spirits prefer to communicate via secret signs and codes which have become the obsession of many. These disembodied beings monitor and punish bad thoughts, hand out winning lottery tickets to those who pray for them, but alternately refuse “wins” to those who are equally needy and prayerful. They demand offerings of flowers, food, blood, and money and millions of lives sacrificed in wars.  

More people believe in a universe where nothing works, or can possibly work, except through the temperamental will of unseen inflated humans, than understand the simple principle of cause and effect. This failure, in a time of space probes that successfully navigate the solar system, indicates that something is functionally delusional in the human brain. The ability of our big brain to investigate the world, to imagine possible action, and to test ideas for working results is remarkable, but our inability to discard concepts that do not reflect how the world works, is bizarre and dangerous. Powerful technologies are applied without understanding how they work. The dire consequences are real. Superstition is the mistaken assignment of cause and effect. The election of leaders who are automated by supernatural ideas, and our frustration when they cannot produce results, is a disaster. The physical processes that drive reality trump all human belief. The destructive power of the richest nation on earth is handed over to a leader without a technical or science-based education, on the claim that his intentions are good and those of the enemy are evil. Does this not seem inadequate?

In the supernatural state of mind, intent guarantees results: Cause, effect, and consequences are nowhere to be seen.

“Just where does sanity exist?” is a question that still awaits a functional answer. As ideas are vetted and removed to a rational catalogue, which in the U.S. has become the domain of science and engineering, the supernatural realm becomes enriched in fantasy.

Unless children are taught to distinguish between the two, they merely add to a population that is increasingly unable to function. Countries that we arrogantly label as backward embrace science and engineering education. Why is that?

 

Recent History of Socio-Political Anthropology Battles / Important

From Natural History Magazine:

Remembering Stephen Jay Gould

http://www.naturalhistory.com/perspectives/3024131/remembering-stephen-jay-gould

Human evolution was not a special case of anything.

By Ian Tattersall

For long-time readers of Natural History, Stephen Jay Gould needs no introduction. His column, “This View of Life,” was a mainstay of the magazine, starting in January 1974 with “Size and Shape” and concluding with the 300th installment, “I Have Landed,” in the December 2000/January 2001 issue. What made his columns so popular was not just Gould’s range of chosen topics, but also the way he regularly allowed himself to be carried away on any tangent that he found interesting.

Gould died on May 20, 2002. Last spring, on the tenth anniversary of his death, I was invited to join other scholars at a commemorative meeting in Venice organized by the Istituto Veneto di Scienze, Lettere ed Arti in collaboration with the Università Ca’ Foscari. It fell to me, as an anthropologist, to talk about Gould’s intellectual legacy to anthropology. Gould was, of course, anything but a primate specialist. But as it happens, in 1974, the year Gould started writing “This View of Life,” he and I were both invited to attend a specialized meeting on “Phylogeny of the Primates: An Interdisciplinary Approach.” Even at that early stage in his career, I learned, the reach of his writings had broadened well beyond his realms of invertebrate paleontology (he was a fossil-snail expert) and evolutionary theory. He came to address the roles of ontogeny (development of the individual) and neoteny (the evolutionary retention of juvenile traits in adults) in human evolution. What I personally found most interesting, however, was his preprint for the conference, which contained, among much else, a virtuoso canter through the history of human evolutionary studies. He effortlessly displayed mastery of a huge literature on a scale that many professional paleoanthropologists fail to achieve in entire academic lifetimes.

Despite a paucity of strictly technical contributions, there can be no doubt that Gould’s influence on anthropology, and on paleoanthropology in particular, was truly seminal. Foremost among such influences was his 1972 collaboration with Niles Eldredge in developing and publicizing the notion of “punctuated equilibria,” the view that species typically remain little changed during most of their geological history, except for rapid events when they may split to give rise to new, distinct species. This breakthrough enabled paleoanthropologists, like other paleontologists, to treat the famous “gaps” in the fossil record as information, a reflection of how evolution actually proceeded.

Similarly, it was Gould who, in collaboration with Yale paleontologist Elisabeth S. Vrba (then at the Transvaal Museum in Pretoria, South Africa), emphasized that an anatomical or behavioral trait that evolved to serve one function could prove a handy adaptation for an entirely unanticipated one—and that the term exaptation was a better name for this phenomenon than preadaptation, which implied some kind of inherent tendency for a species to follow a certain evolutionary path. Anthropologists were forced to recognize exaptation as an essential theme in the history of innovation in the human family tree.

Speaking of trees, I am convinced that Gould’s most significant contribution to paleoanthropology was his insistence, from very early on, that the genealogy of human evolution took the form of a bush with many branches, rather than a ladder, or simple sequence of ancestors and descendants. As he wrote in his April 1976 column, “Ladders, Bushes, and Human Evolution”:

“I want to argue that the ‘sudden’ appearance of species in the fossil record and our failure to note subsequent evolutionary change within them is the proper prediction of evolutionary theory as we understand it. Evolution usually proceeds by “speciation”—the splitting of one lineage from a parental stock—not by the slow and steady transformation of these large parental stocks. Repeated episodes of speciation produce a bush.”

Before World War II, paleoanthropologists had overwhelmingly been human anatomists by background, with little interest in patterns of diversity in the wider living world. And having been trained largely in a theoretical vacuum, the postwar generation of paleoanthropologists was already exapted to capitulate when, at exact midcentury, the biologist Ernst Mayr told them to throw away nearly all the many names they had been using for fossil hominids. Mayr replaced this plethora, and the diversity it had suggested, with the idea that all fossil hominids known could be placed in a single sequence, from Homo transvaalensis to Homo erectus and culminating in Homo sapiens.

There was admittedly a certain elegance in this new linear formulation; but the problem was that, even in 1950, it was not actually supported by the material evidence. And new discoveries soon made not only most paleoanthropologists but even Mayr himself—grudgingly, in a footnote—concede that at least one small side branch, the so-called “robust” australopithecines, had indeed existed over the course of human evolution. But right up into the 1970s and beyond, the minimalist mindset lingered. Gould’s was among the first—and certainly the most widely influential—voices raised to make paleoanthropologists aware that there was an alternative.

In his “Ladders, Bushes, and Human Evolution” column, Gould declared that he wanted “to argue that Australopithecus, as we know it, is not the ancestor of Homo; and that, in any case, ladders do not represent the path of evolution.” At the time, both statements flatly contradicted received wisdom in paleoanthropology. And while in making the first of them I suspect that Gould was rejecting Australopithecus as ancestral to Homo as a matter of principle, his immediate rationale was based on the recent discovery, in eastern Africa, of specimens attributed to Homo habilis that were just as old as the South African australopithecines.

Later discoveries showed that Gould had been hugely prescient. To provide some perspective here: In 1950, Mayr had recognized a mere three hominid species. By 1993, I was able to publish a hominid genealogy containing twelve. And the latest iteration of that tree embraces twenty-five species, in numerous coexisting lineages. This was exactly what Gould had predicted. In his 1976 article he had written: “We [now] know about three coexisting branches of the human bush. I will be surprised if twice as many more are not discovered before the end of the century.”

Indeed, his impact on the paleoanthropological mindset went beyond even this, largely via his ceaseless insistence that human beings have not been an exception to general evolutionary rules. Before Gould’s remonstrations began, one frequently heard the term “hominization” bandied about, as if becoming human had involved some kind of special process that was unique to our kind. Gould hammered home the message that human evolutionary history was just like that of other mammals, and that we should not be looking at human evolution as a special case of anything.

Of course, Gould had ideas on particular issues in human paleontology as well, and he never shrank from using his Natural History bully pulpit to voice his opinions. Over the years he issued a succession of shrewd and often influential judgments on subjects as diverse as the importance of bipedality as the founding hominid adaptation; the newly advanced African “mitochondrial Eve”; hominid diversity and the ethical dilemmas that might be posed by discovering an Australopithecus alive today; sociobiology and evolutionary psychology (he didn’t like them); the relations between brain size and intelligence; neoteny and the retention of juvenile growth rates into later development as an explanation of the unusual human cranial form; and why human infants are so unusually helpless.

(Removed here: a narrative about the search for who had perpetrated the Piltdown Man hoax)

Gould’s devotion to the historically odd and curious, as well as his concern with the mainstream development of scientific ideas, is also well illustrated by his detailed account of the bizarre nineteenth-century story of Sarah “Saartjie” Baartman. Dubbed the “Hottentot Venus,” Baartman was a Khoisan woman from South Africa’s Western Cape region who was brought to Europe in 1810 and widely exhibited to the public before her death in 1815. Gould’s publicizing of the extraordinary events surrounding and following Baartman’s exhibition may or may not have contributed to the repatriation in 2002 of her remains from Paris to South Africa, where they now rest on a hilltop overlooking the valley in which she was born. But what is certain is that Gould’s interest in this sad case also reflected another of his long-term concerns, with what he called “scientific racism.”

Principally in the 1970s—when memories of the struggle for civil rights in the United States during the previous decade were still extremely raw—Gould devoted a long series of his columns to the subject of racism, as it presented itself in a whole host of different guises. In his very first year of writing for Natural History, he ruminated on the “race problem” both as a taxonomic issue, and in its more political expression in relation to intelligence. He even made the matter personal, with a lucid and deeply thoughtful demolition in Natural History of the purportedly scientific bases for discrimination against Jewish immigrants to America furnished by such savants as H. H. Goddard and Karl Pearson.

Gould also began his long-lasting and more specific campaign against genetic determinism, via a broadside against the conclusions of Arthur Jensen, the psychologist who had argued that education could not do much to level the allegedly different performances of various ethnic groups on IQ tests. And he began a vigorous and still somewhat controversial exploration of the historical roots of “scientific racism” in the work of nineteenth-century embryologists such as Ernst Haeckel and Louis Bolk.

But Gould’s most widely noticed contribution to the race issue began in 1978, with his attack in Science on the conclusions of the early-nineteenth century physician and craniologist Samuel George Morton, whom he characterized rather snarkily as a “self-styled objective empiricist.” In three voluminous works published in Philadelphia between 1839 and 1849—on Native American and ancient Egyptian skulls, and on his own collection of more than 600 skulls of all races—the widely admired Morton had presented the results of the most extensive study ever undertaken of human skulls. The main thrust of (Morton’s) study had been to investigate the then intensely debated question of whether the various races of humankind had a single origin or had been separately created. Morton opted for polygeny, or multiple origins, a conclusion hardly guaranteed to endear him to Gould. Along the way, Morton presented measurements that showed, in keeping with prevailing European and Euro-American beliefs on racial superiority, that Caucasians had larger brains than American “Indians,” who in turn had bigger brains than “Negroes” did. (Cranial-brain size DOES NOT correlate to intelligence)

After closely examining Morton’s data, Gould characterized the Philadelphia savant’s conclusions as “a patchwork of assumption and finagling, controlled, probably unconsciously, by his conventional a priori ranking (his folks on top, slaves on the bottom).” He excoriated Morton for a catalog of sins that included inconsistencies of criteria, omissions of both procedural and convenient kinds, slips and errors, and miscalculations. And although in the end he found “no indication of fraud or conscious manipulation,” he did see “Morton’s saga” as an “egregious example of a common problem in scientific work.” As scientists we are all, Gould asserted, unconscious victims of our preconceptions, and the “only palliations I know are vigilance and scrutiny.”

That blanket condemnation of past and current scientific practice was a theme Gould shortly returned to, with a vengeance, in his 1981 volume The Mismeasure of Man. Probably no book Gould ever wrote commanded wider attention than did this energetic critique of the statistical methods that had been used to substantiate one of his great bêtes noires, biological determinism. This was (is) the belief, as Gould put it, that “the social and economic differences between human groups—primarily races, classes, and sexes—arise from inherited, inborn distinctions and that society, in this sense, is an accurate reflection of biology.”

We are still plagued by this pseudo-scientific “justification” of poverty and inequality; of misogyny and abuse of “lesser humans” by the Human Behavior Industries. Remember, this is very recent history, and the forces of social “control and abuse” are very much still with us.  

It is alarming that the revolution in DNA / genetic research has shifted the “means” of this abuse of human beings into a radical effort to “prove” that socially-created and defined “human behavior pathologies” are due to genetic determinism. The race is on to “prove” that genetic defects, rather than hidden social engineering goals, underlie “defective behavior and thinking” as dictated by closet eugenicists. Racism and eugenics are being pursued in the guise of “caring, treating and fixing” socially “defective” peoples. Genetic engineering of embryos is already in progress.

SEE POST August 11, 2017: First Human Embryos ‘Edited’ in U.S. / 7 billion humans not consulted

In Mismeasure, Gould restated his case against Morton at length, adding to the mix a robust rebuttal of methods of psychological testing that aimed at quantifying “intelligence” as a unitary attribute. One of his prime targets was inevitably Arthur Jensen, the psychologist he had already excoriated in the pages of Natural History for Jensen’s famous conclusion that the Head Start program, designed to improve low-income children’s school performance by providing them with pre-school educational, social, and nutritional enrichment, was doomed to fail because the hereditary component of their performance—notably that of African American children—was hugely dominant over the environmental one. A predictable furor followed the publication of Mismeasure, paving the way for continuing controversy during the 1980s and 1990s on the question of the roles of nature versus nurture in the determination of intelligence.

This issue of nature versus nurture, a choice between polar opposites, was of course designed for polemic, and attempts to find a more nuanced middle ground have usually been drowned out by the extremes. So it was in Gould’s case. An unrepentant political liberal, he was firmly on the side of nurture. As a result of his uncompromising characterizations of his opponents’ viewpoints, Gould found himself frequently accused by Jensen and others of misrepresenting their positions and of erecting straw men to attack.

Yet even after Mismeasure first appeared, the climax of the debate was yet to come. In 1994, Richard Herrnstein and Charles Murray published their notorious volume, The Bell Curve: Intelligence and Class Structure in American Life. At positively Gouldian length, Herrnstein and Murray gave a new boost to the argument that intelligence is largely inherited, proclaiming that innate intelligence was a better predictor of such things as income, job performance, chances of unwanted pregnancy, and involvement in crime than are factors such as education level or parental socioeconomic status. They also asserted that, in America, a highly intelligent, “cognitive elite” was becoming separated from the less intelligent underperforming classes, and in consequence they recommended policies such as the elimination of what they saw as welfare incentives for poor women to have children.

Eugenics has never died in American Science; it remains an underestimated force in the shaping of “what to do about unacceptable humans”. It is neither a liberal nor a conservative impulse: it is a drive within elites to control human destiny.

To Gould such claims were like the proverbial red rag to a bull. He rapidly published a long review essay in The New Yorker attacking the four assertions on which he claimed Herrnstein and Murray’s argument depended. In order to be true, Gould said, Herrnstein and Murray’s claims required that what they were measuring as intelligence must be (1) representable as a single number; (2) capable of ranking people in a linear order; (3) primarily heritable; and (4) essentially immutable. None of those assumptions, he declared, was tenable. And soon afterward he returned to the attack with a revised and expanded edition of Mismeasure that took direct aim at Herrnstein and Murray’s long book.

There can be little doubt that, as articulated in both editions of Mismeasure, Gould’s conclusions found wide acceptance not only among anthropologists but in the broader social arena as well. But doubts have lingered about Gould’s broad-brush approach to the issues involved, and particularly about a penchant he had to neglect any nuance there might have been in his opponents’ positions. Indeed, he was capable of committing in his own writings exactly the kinds of error of which he had accused Samuel Morton—ironically, even in the very case of Morton himself.

In June 2011, a group of physical anthropologists led by Jason Lewis published a critical analysis of Gould’s attacks on Morton’s craniology. By remeasuring the cranial capacities of about half of Morton’s extensive sample of human skulls, Lewis and colleagues discovered that the data reported by Morton had on the whole been pretty accurate. They could find no basis in the actual specimens themselves for Gould’s suggestion that Morton had (albeit unconsciously) overmeasured European crania, and under-measured African or Native American ones. What’s more, they could find no evidence that, as alleged by Gould, Morton had selectively skewed the results in various other ways.

The anthropologists did concede that Morton had attributed certain psychological characteristics to particular racial groups. But they pointed out that, while Morton was inevitably a creature of his own times, he (Morton) had done nothing to disguise his racial prejudices or his polygenist sympathies. And they concluded that, certainly by prevailing standards, Morton’s presentation of his basic data had been pretty unbiased. (WOW! What an indictment of current Anthropology) What is more, while they were able to substantiate Gould’s claim that Morton’s final summary table of his results contained a long list of errors, Lewis and colleagues also found that correcting those errors would actually have served to reinforce Morton’s own declared biases. And they even discovered that Gould had reported erroneous figures of his own.

These multiple “errors” DO NOT cancel each other out: this is a favorite socially typical strategy and magical belief – present the contradictions from “each side” and reach a “socially acceptable” deadlock. No discussion is possible past this point. The American intellectual-cultural-political environment is trapped in this devastating “black and white, either/or” false concept of “problem-solving”. Nothing can be examined; facts are removed to the “supernatural, word-concept domain” and become “politicized” – weapons of distortion in a socio-cultural landscape of perpetual warfare. In the meantime, the population is pushed to either extreme. This is where we are TODAY, and this “warfare” will destroy us from within, because the hard work of running a nation is not being done.

It is hard to refute the authors’ conclusion that Gould’s own unconscious preconceptions colored his judgment. Morton, naturally enough, carried all of the cultural baggage of his time, ethnicity, and class. But so, it seems, did Gould. And in a paradoxical way, Gould had proved his own point. Scientists are human beings, and when analyzing evidence they always have to be on guard against the effects of their own personal predilections.

And against the domination and control of their professions by the “elite and powerful” who promote a racist-eugenic social order and control how their work is “messaged” and used to achieve socioeconomic and biological engineering goals – worldwide.


 

First Human Embryos ‘Edited’ in U.S. / 7 billion humans not consulted

The work, which removed a gene mutation linked to a heart condition, is fueling debate over the controversial tool known as CRISPR. (Photo caption: Two days after being injected with a gene-editing enzyme, these developing human embryos were free of a disease-causing mutation.)

Two Words: “Unintended Consequences”

By Erin Blakemore

PUBLISHED by National Geographic, August 2, 2017

What if you could remove a potentially fatal gene mutation from your child’s DNA before the baby is even born? In an advance that’s as likely to raise eyebrows as it is to save lives, scientists just took a big step toward making that possible.

For the first time, researchers in the United States have used gene editing in human embryos. As they describe today in the journal Nature, the team used “genetic scissors” called CRISPR-Cas9 to target and remove a mutation associated with hypertrophic cardiomyopathy, a common inherited heart disease, in 42 embryos.

Scientists who want to explore the technique hail it as a biomedical advance that could one day give people the option not to pass down heritable diseases. The tool could also reduce the number of embryos that are discarded during fertility treatments because of worrisome genetic mutations.

“The scientists are out of control,” says George Annas, director of the Center for Health Law, Ethics & Human Rights at the Boston University School of Public Health, who thinks that scientists should not edit the genomes of human embryos for any reason. “They want to control nature, but they can’t control themselves.”

Healing Hearts

According to the Centers for Disease Control and Prevention, hypertrophic cardiomyopathy occurs in about one in 500 people. The condition causes the heart muscle to thicken and can lead to sudden cardiac arrest. It takes only one gene mutation to cause the condition, and you can get the disease even if only one of your parents has the mutated gene. If you inherit it, there’s a 50 percent chance you will pass it on to your children.
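
To make the inheritance arithmetic concrete, here is a minimal simulation sketch (mine, not the article’s; the function and variable names are hypothetical) of why a single dominant mutation carried by one parent is passed on to roughly half of that parent’s children:

```python
import random

# One affected parent carries one mutant copy (M) and one normal copy (+)
# of the gene; the unaffected parent contributes only normal copies.
# Because a single mutant copy is enough to cause the condition, a child's
# risk equals the chance of drawing the M allele from the affected parent.

def child_is_affected(affected_parent=("M", "+")):
    """Return True if the child inherits the mutant allele (M)."""
    return random.choice(affected_parent) == "M"

trials = 100_000
affected = sum(child_is_affected() for _ in range(trials))
print(f"Simulated risk to each child: {affected / trials:.1%}")  # ~50%
```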

For their work, Shoukhrat Mitalipov, principal investigator at the Oregon Health and Science University’s Center for Embryonic Cell and Gene Therapy, and his colleagues targeted the genetic mutations that cause the majority of hypertrophic cardiomyopathy cases.

First, they created 58 human embryos from the sperm of a male donor with the mutation and the egg of a female without the mutation. Then, they used CRISPR to cut the mutation out of the gene. When things go right, the DNA repairs itself and the mutation disappears.

The technique isn’t always successful. In previous studies, some CRISPR-edited embryos developed mosaicism, a condition in which some cells have the unwanted mutations and others don’t. All in all, the team was able to repair the gene mutation in about 70 percent of the embryos, and the study showed no unwanted changes at other sites in the edited DNA.
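
As a rough arithmetic check (my reading, not stated explicitly in the article), the “about 70 percent” figure is consistent with the 42 embryos mentioned at the top of the piece and the 58 embryos the team created:

```python
# Back-of-the-envelope check of the figures quoted in the article,
# assuming the 42 embryos mentioned earlier are the successfully
# repaired subset of the 58 embryos that were created.
embryos_created = 58
embryos_repaired = 42

repair_rate = embryos_repaired / embryos_created
print(f"Repair rate: {repair_rate:.1%}")  # ~72.4%, i.e. "about 70 percent"
```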

The team allowed the fertilized cells to develop into blastocysts—the stage at which embryos are usually implanted into the mother during fertility treatments. They showed normal development, the team reports. Then, the embryos were destroyed.

Science in Motion

“Of course further research and ethical discussions are necessary before proceeding to clinical trials,” study coauthor Paula Amato, adjunct associate professor of obstetrics and gynecology at OHSU, said during a press briefing on August 1.

Earlier this year, the National Academy of Sciences and National Academy of Medicine asked an international committee of scientists and ethicists to weigh in on the benefits and risks of genome editing in humans.

The panel recommended that in the case of the human germline—genes passed down from generation to generation—scientists refrain from editing genes for any purpose other than treating or preventing a disease or disability. (No more neurodiverse people will be allowed to be born? Where does this “elimination” of genetic diversity end?) The report also insisted on a more robust public debate before such experiments begin. (Oh sure, as if anyone in power will listen to the “peasants” at the bottom of the pyramid)

In the United States, there’s currently a ban on using taxpayer funds for any research that destroys human embryos. In this case, the team used institutional and private funds. If the team can’t move ahead as quickly as desired in the U.S. (blackmail?), they’ll consider pursuing their research in other countries.

(Of course, poor and developing countries need the money, and will gladly sacrifice their embryos – and the “outcomes” for the population, good or bad…)

Debating the Future

It’s already possible to screen for genetic defects within embryos during in vitro fertilization using a process called preimplantation genetic diagnosis. The team thinks their CRISPR technique could eventually be applied to gene mutations associated with other diseases, like cystic fibrosis.

In their paper, the team writes that their method may one day “rescue mutant embryos, increase the number of embryos available for transfer and ultimately improve pregnancy rates.” (For the Elites who can pay $$$$ for it.)

“That’s just absurd,” says Annas. “They admit right up front that if you want to avoid having a baby with [the mutation], you can just not implant the embryos that are affected.”

Mitalipov disagrees: “Discarding half the embryos is morally wrong,” he tells National Geographic. “We need to be more proactive.”

The embryos in this experiment were destroyed!

This is unbelievable: taking over the future evolution of our species is “morally” right? Nature provides the genetic variation that is necessary for organisms to adapt to a changing environment – we are not smart enough to interfere with this process.

Has anyone asked 7 billion humans if they think it’s a slam-dunk “moral good” for a handful of scientists, who are obviously not in the least concerned about morality or ethics, to just “go ahead” and terminate 3.5 billion years of evolution?

Either way, Annas says, it’s time to revisit the conversation about how to regulate CRISPR in the United States. “My guess is that the regulators will be horrified.” (Until $$$$ decides the issue)

But for Mitalipov, the debate is a chance to inform the world about the technique’s potential. And as a scientist who has also cloned monkey embryos and even cloned human embryos to make stem cells, he knows plenty about how to ignite public debate.

“We’ll push the boundaries,” he says.

Of course, “the rest of” Homo sapiens just don’t count; we have no choice but to submit to a future dictated by individuals who regard themselves as “GODS”. The track record of “ethical nightmares” in recent human history (eugenics, genocide, the Holocaust, chemical and biological warfare, and other mass murder of “defectives” – who are simply people of another race, religion, or ethnicity) has proven over and over that the power-insane “Male Psychopaths” at the top of the social pyramid are destroyers of Nature.

Top Psych Experiments / Psychologists cleverly embarrass themselves


OMG! The website is: Online Psychology Degree Guide

http://www.onlinepsychologydegree.info/influential-psychological-experiments/

Wow! Visit the site for the other 22 most influential psychology “experiments” PLUS many other informative lists offering “5 most” to “50 most” lists in this popular pop-social media format.

The 25 Most Influential Psychological Experiments in History

By Kristen Fescoe Published January 2016

“A Class Divided”

Study Conducted By: Jane Elliott

Study Conducted in 1968 in an Iowa classroom

Experiment Details: Jane Elliott’s famous experiment was inspired by the assassination of Dr. Martin Luther King Jr. and the inspirational life that he led. The third grade teacher developed an exercise to help her Caucasian students understand the effects of racism and prejudice.

Elliott divided her class into two separate groups: blue-eyed students and brown-eyed students. On the first day, she labeled the blue-eyed group as the superior group and from that point forward they had extra privileges, leaving the brown-eyed children to represent the minority group. She discouraged the groups from interacting and singled out individual students to stress the negative characteristics of the children in the minority group.

What this exercise showed was that the children’s behavior changed almost instantaneously. The group of blue-eyed students performed better academically and even began bullying their brown-eyed classmates. The brown-eyed group experienced lower self-confidence and worse academic performance. The next day, she reversed the roles of the two groups and the blue-eyed students became the minority group.

At the end of the experiment, the children were so relieved that they were reported to have embraced one another and agreed that people should not be judged based on outward appearances. This exercise has since been repeated many times with similar outcomes.

OMG! It’s ironic that the very studies on which psychologists base their claims are so obviously “super-flawed” that their claim to “be scientists” is easily disproven:

  1. Psychologists claim that use of human subjects as “lab rats” is an ethical “No-No”, but here we see uninformed, non-consenting “captive” children being manipulated (I would call it abuse…) by a teacher! The children suffered distress over the tactics used, including becoming bullies and objects to be bullied. How is this conceptually any different from “punishment” as pedagogy?
  2. The students were “relieved” to be “freed from” this awful manipulation – which is automatically interpreted as instant “moral enlightenment” over the question of physical appearances. This reveals the “social engineering” goals of psychology and the reckless “social puppeteer” attitude that prevails.
  3. This “experiment” (abuse of a word that has a specific meaning in science) is “predatory” abuse of power: it may have been “repeated” in various forms (like a “fun prank”), but repetition means that many more children were subjected to manipulation, for no legitimate “reason”.

Car Crash Experiment

Study Conducted by: Elizabeth Loftus and John Palmer

Study Conducted in 1974 at The University of California in Irvine

Experiment Details: Loftus and Palmer set out to prove just how deceiving memories can be. The 1974 Car Crash Experiment was designed to evaluate whether wording questions a certain way could influence a participant’s recall by twisting their memories of a specific event.

  1. And yet, “psychological diagnoses” ARE BASED ON JUST THIS: “self-reporting” or the “subjective” opinion of parents, teachers, school counselors, gym teachers, coaches, bystanders and the family dog! A “Psych Wizard” spends three minutes asking “loaded, leading” questions or worse – the “client” is required to fill out a “questionnaire” that is so biased that the answers will “reveal” pathology – there are dozens to choose from.
  2. The “researchers” set out to prove what they already know ABOUT THEMSELVES: that manipulation can distort “memories” – it’s their prime directive.

The participants watched slides of a car accident and were asked to describe what had happened as if they were eyewitnesses to the scene. The participants were put into two groups and each group was questioned using different wording such as “how fast was the car driving at the time of impact?” versus “how fast was the car going when it smashed into the other car?” The experimenters found that the use of different verbs affected the participants’ memories of the accident, showing that memory can be easily distorted. 

This research suggests that memory can be easily manipulated by questioning technique, meaning that information gathered after the event can merge with the original memory, causing incorrect recall or reconstructive memory. The addition of false details to a memory of an event is now referred to as confabulation. This concept has very important implications for the questions used in police interviews of eyewitnesses (and in psychology).

As for the validity of “psychology” having a scientific “fact-finding” interest in assessing human behavior, we can see that the “goal” is to “test” manipulation techniques on human lab rats. It’s utterly non-objective, non-scientific and unethical. Psychologists refuse to be accountable for “proof or results” in theory or practice. 

Cognitive Dissonance Experiment

Study Conducted by: Leon Festinger and James Carlsmith

Study Conducted in 1957 at Stanford University

Experiment Details: The concept of cognitive dissonance refers to a situation involving conflicting attitudes, beliefs or behaviors. This conflict produces an inherent feeling of discomfort leading to a change in one of the attitudes, beliefs or behaviors to minimize or eliminate the discomfort and restore balance.

Again, the “basis” is putting humans in situations which manipulate personal morality, group ethics, social obedience, and “pain” in order to find out how these may be “applied” in contexts such as the classroom, workplace, consumer markets, media and advertising – and in government. The conclusion is simple: Lie, and use “bribes” and punishment – the Social Pyramid as we experience it every day. Psychology “intends” to legitimize lies, deception and manipulation as “scientifically valid” in human relationships. This is sick.

Cognitive dissonance was first investigated by Leon Festinger, after an observational study of a cult that believed that the earth was going to be destroyed by a flood. (Christians, perhaps?) Out of this study was born an intriguing experiment conducted by Festinger and Carlsmith where participants were asked to perform a series of dull tasks (such as turning pegs in a peg board for an hour). Participants’ initial attitudes toward this task were highly negative. (Anecdotal, hearsay, subjective opinion, not an “experiment” at all)

They were then paid either $1 or $20 to tell a participant waiting in the lobby (lie to them) that the tasks were really interesting. Almost all of the participants agreed to walk into the waiting room and persuade the next participant that the boring experiment would be fun. (The human lab rats were paid to lie and most agreed – where is the motivation in this? Were they “students” who always need cash, or individuals who would lie because “an authority figure” asked them to? Who are these human beings?)

When the participants were later asked to evaluate the experiment (no, they were asked to evaluate their own experience), the participants who were paid only $1 rated the tedious task as more fun and enjoyable than the participants who were paid $20 to lie. Being paid only $1 is not sufficient incentive for lying, and so those who were paid $1 experienced dissonance. They could only overcome that dissonance by coming to believe (being lied to) that the tasks really were interesting and enjoyable. Being paid $20 provides a reason for turning pegs, and there is therefore no dissonance.

OMG! Where do I begin with dissecting this monstrosity of “social logic” and magical thinking?

(I need “fuel” – time for breakfast…LOL)

War is a MALE Social Activity / Nukes

Who will be “King” of a dead planet?

My childhood story wrote itself, directed by an impulse to challenge The Official Story, which never did make sense to me. First, there was the story my parents told about their marriage. I would listen to their private histories, both sad and tragic, and wonder why these obvious strangers insisted that finding each other and committing to an unworkable lifelong union was the best of all possible outcomes. Each parent had chosen to add to each other’s suffering by making a brief courtship legal, when apart, each could have pursued happiness. Why would any person do this?

It’s a simple question, but thousands of years of myth, religion, rules and laws, social convention, government institutions, and even reform and innovation in these areas, promote suffering, which has been elevated to the unshakeable position of human destiny. It wasn’t that I imagined a perfect world; I could not imagine why, when suffering exists as an inescapable consequence of being physical creatures, one would choose to voluntarily increase that suffering, and yet, it seemed to me that human beings put great effort into that outcome.

The consequences of choice preoccupied my mind. It took a long time for the reality to sink in: many people don’t recognize that they can make independent choices; their “choices” have been predestined by a belief system that is so powerful that everything they do is shadowed by the question, “What am I supposed to do?” It was shocking to me that people suffered unnecessarily by sticking to roles that had been proven over and over again to result in physical and mental harm to both individuals and groups, and which brought humankind to a state of nearly universal and chronic suffering.

Technology and science appeared as bright spots in the dead gray fog of human behavior that plagued mankind. Radio, television, household appliances, bicycles, automobiles, photography, hot running water, antibiotics, aspirin, eyeglasses – all were advances in comfort, health and pleasure. But! On the new and mysterious TV in our living room, movies were shown that dramatized war and the “wonderful machines of war” that man had created. Soldiers were happy to be able to help out, as if they were at a communal barn-raising. They looked forward to killing strangers, whether men, women, children or animals, known as The Bad Guys, using guns, knives, grenades and flamethrowers to mangle, maim, and roast people alive. They did this, and then smoked cigarettes. War was fun: a joyful guy thing. The actual horror was ignored, except for an occasional hospital scene where doctors and nurses fixed wounded men so that they could go back and kill more people, or inevitably for some, to be killed. The reward for death and suffering was a cigarette if you lived and a flag and a speech about patriotism if you died.

I couldn’t imagine participating in a war, inflicting pain and death in horrific ways, and also risking my own life – for what? My life was given to me and was sacred. It didn’t belong to anyone else, especially to Big Men who were so careless as to throw lives away so easily.

The usual answer given to children was that there are The Bad Guys, and you have to kill The Bad Guys.

This wasn’t an answer simplified for a child; this was The Answer. It still is.

Many soldiers realize, once they are at war, that they are being used by the Big Men (human predators) to do their killing.

The Korean War began in 1950: we rushed in to “save” Korea from the communists: the country ended up being divided, and 28,000 U.S. troops are still deployed in S. Korea 64 years later.

Few American young people have any idea that the U.S. invaded Viet Nam, lost the war with 58,000 dead American soldiers, and lost the country to the communist Viet Cong.

Better not ask the question, “How can God be on our side and theirs, too? Everyone says God is always on our side, therefore we are The Good Guys, but The Bad Guys say the same thing. It’s this loopy thinking that keeps people stuck. Why can’t people exit the loop?”

If one pressed the question of war, supplementary answers appeared: the technology developed in wartime benefits civilians later. Improved emergency medical techniques, antibiotics, more accurate clocks, fast computers, and many other gadgets were developed to better prosecute war. I found it absurd and shocking that we must have wars in which millions suffer and die so that Mom can cook in a microwave oven and I can take penicillin for a strep throat. Isn’t the suffering brought by disease or accident sufficient motivation to develop medical treatments? The Bull Shit kept getting deeper.

I lived with a distinct, biting anxiety over my obvious lack of sympathy for traditional ideas, which were presented as demands by those who had secured a rung of authority on The Pyramid. Lies were everywhere: in school, at church, at home, on television and in newspapers. I devoured history books, and biographies of artists, scientists and adventurers – many of whom were people who defied The Official Story, not as bad guys or crusaders or reformers, but because alternative explanations made more sense. They often had to hide their work and lived precarious lives, only to have their ideas rediscovered much later, when people found profit in them. A happy few gained protection from a powerful patron, and saw their ideas exploited to perpetuate The Official Story that war is necessary, and isn’t it great to have bigger and better weapons, so that our side can kill more and more of The Bad Guys, and whole swathes of innocent bystanders who somehow get in the way.

I listened to educated people make abundant excuses as to why any improvement is impossible, or must be carried out in the way it has always been done, despite acknowledged failures, as if they were driving forward, but with the parking brake set. “Let’s just throw some platitudes and money at the problem. Maybe it will stick,” is proof that humans are not very smart. Social humans claim to possess all sorts of intelligence and problem-solving skills, and then fall flat on their faces in the same old ruts.

After a lifetime of wondering why humans make life intolerable, I was informed that I am Asperger, which means that I’m not a Social human, but I still have to wait for the nukes to fall, just like everyone else…