Depicting Madness / Social View of Mental Illness

Franz Joseph Gall examining the head of a pretty young girl, while three gentlemen wait in line. Coloured lithograph by E.H., 1825. Copyrighted work available under Creative Commons

The PARIS REVIEW

Full article: https://www.theparisreview.org/blog/2015/04/22/madness-and-meaning

Madness and Meaning

By Andrew Scull, April 22, 2015

Depictions of insanity through history.

Modern psychiatry seems determined to rob madness of its meanings, insisting that its depredations can be reduced to biology and nothing but biology. One must doubt it. The social and cultural dimensions of mental disorders, so indispensable a part of the story of madness and civilization over the centuries, are unlikely to melt away, or to prove no more than an epiphenomenal aspect of so universal a feature of human existence. Madness indeed has its meanings, elusive and evanescent as our attempts to capture them have been.

Western culture throughout its long and tangled history provides us with a rich array of images, a remarkable set of windows into both popular and latterly professional beliefs about insanity. The sacred books of the Judeo-Christian tradition are shot through with stories of madness caused by possession by devils or divine displeasure. From Saul, the first king of the Israelites (made mad by Yahweh for failing to carry out to the letter the Lord’s command to slay every man, woman, and child of the Amalekite tribe, and all their animals, too), to the man in the country of the Gadarenes “with an unclean spirit” (maddened, naked, and violent, whose demons Christ casts out and causes to enter a herd of swine, who forthwith rush over a cliff into the sea to drown), here are stories recited for centuries by believers, and often transformed into pictorial form. None proved more fascinating than the story of Nebuchadnezzar, the mighty king of Babylon, the man who captured Jerusalem and destroyed its Temple, carrying the Jews off into captivity, all apparently without incurring divine wrath. Swollen with pride, however, he impiously boasts of “the might of my power,” and a savage and jealous God has had enough: driven mad, he “did eat grass as oxen, and his body was wet with the dew of heaven, till his hairs were grown like eagles’ feathers, and his nails like birds’ claws.” The description has proved irresistible to many an artist: above, an unknown German artist working in early fifteenth-century Regensburg provides a portrait of the changes madness wrought upon the sane.

Much more…

Critique of DSM 5 / No Medical Basis for Diagnoses

Pacific Standard Magazine

The Problem With Psychiatry, the ‘DSM,’ and the Way We Study Mental Illness

by Ethan Watters

Imagine for a moment that the American Psychiatric Association was about to compile a new edition of its Diagnostic and Statistical Manual of Mental Disorders. But instead of 2013, imagine, just for fun, that the year is 1880.

Transported to the world of the late 19th century, the psychiatric body would have virtually no choice but to include hysteria in the pages of its new volume. Women by the tens of thousands, after all, displayed the distinctive signs: convulsive fits, facial tics, spinal irritation, sensitivity to touch, and leg paralysis. Not a doctor in the Western world at the time would have failed to recognize the presentation. “The illness of our age is hysteria,” a French journalist wrote. “Everywhere one rubs elbows with it.”

Hysteria would have had to be included in our hypothetical 1880 DSM for the exact same reasons that attention deficit hyperactivity disorder is included in the just-released DSM-5. The disorder clearly existed in a population and could be reliably distinguished, by experts and clinicians, from other constellations of symptoms.

There were no reliable medical tests to distinguish hysteria from other illnesses then; the same is true of the disorders listed in the DSM-5 today.

“Practically speaking, the criteria by which something is declared a mental illness are virtually the same now as they were over a hundred years ago.”

The DSM determines which mental disorders are worthy of insurance reimbursement, legal standing, and serious discussion in American life.

That its diagnoses are not more scientific is, according to several prominent critics, a scandal.

In a major blow to the APA’s dominance over mental-health diagnoses, Thomas R. Insel, director of the National Institute of Mental Health, recently declared that his organization would no longer rely on the DSM as a guide to funding research. “The weakness is its lack of validity,” he wrote. “Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever.” As an alternative, Insel called for the creation of a new, rival classification system based on genetics, brain imaging, and cognitive science.

This idea — that we might be able to strip away all subjectivity from the diagnosis of mental illness and render psychiatry truly scientific — is intuitively appealing. But there are a couple of problems with it. The first is that the science simply isn’t there yet. A functional neuroscientific understanding of mental suffering is years, perhaps generations, away from our grasp. What are clinicians and patients to do until then? But the second, more telling problem with Insel’s approach lies in its assumption that it is even possible to strip culture from the study of mental illness. Indeed, from where I sit, the trouble with the DSM — both this one and previous editions — is not so much that it is insufficiently grounded in biology, but that it ignores the inescapable relationship between social cues and the shifting manifestations of mental illness.

PSYCHIATRY tends not to learn from its past. With each new generation, psychiatric healers dismiss the enthusiasms of their predecessors by pointing out the unscientific biases and cultural trends on which their theories were based. Looking back at hysteria, we can see now that 19th-century doctors were operating amidst fanciful beliefs about female anatomy, an assumption of feminine weakness, and the Victorian-era weirdness surrounding female sexuality. And good riddance to bad old ideas. But the more important point to take away is this: There is little doubt that the symptoms expressed by those thousands of women were real.

The resounding lesson of the history of mental illness is that psychiatric theories and diagnostic categories shape the symptoms of patients. “As doctors’ own ideas about what constitutes ‘real’ disease change from time to time,” writes the medical historian Edward Shorter, “the symptoms that patients present will change as well.”

This is not to say that psychiatry wantonly creates sick people where there are none, as many critics fear the new DSM-5 will do. Allen Frances — a psychiatrist who, as it happens, was in charge of compiling the previous DSM, the DSM-IV — predicts in his new book, Saving Normal, that the DSM-5 will “mislabel normal people, promote diagnostic inflation, and encourage inappropriate medication use.” Big Pharma, he says, is intent on ironing out all psychological diversity to create a “human monoculture,” and the DSM-5 will facilitate that mission. In Frances’ dystopian post-DSM-5 future, there will be a psychoactive pill for every occasion, a diagnosis for every inconvenient feeling: “Disruptive mood dysregulation disorder” will turn temper tantrums into a mental illness and encourage a broadened use of antipsychotic drugs; new language describing attention deficit disorder that expands the diagnostic focus to adults will prompt a dramatic rise in the prescription of stimulants like Adderall and Ritalin; the removal of the bereavement exclusion from the diagnosis of major depressive disorder will stigmatize the human process of grieving. The list goes on.

In 2005, a large study suggested that 46 percent of Americans will receive a mental-health diagnosis at some point in their lifetimes. Critics like Frances suggest that, with the new categories and loosened criteria in the DSM-5, the percentage of Americans thinking of themselves as mentally ill will rise far above that mark.

But recent history doesn’t support these fears. In 1994 the DSM-IV — the edition Frances oversaw — launched several new diagnostic categories that became hugely popular among clinicians and the public (bipolar II, attention deficit hyperactivity disorder, and social phobia, to name a few), but the number of people receiving a mental-health diagnosis did not go up between 1994 and 2005. In fact, as psychologist Gary Greenberg, author of The Book of Woe, recently pointed out to me, the prevalence of mental health diagnoses actually went down slightly. This suggests that the declarations of the APA don’t have the power to create legions of mentally ill people by fiat, but rather that the number of people who struggle with their own minds stays somewhat constant.

What changes, it seems, is that they get categorized differently depending on the cultural landscape of the moment. Those walking worried who would have accepted the ubiquitous label of “anxiety” in the 1970s would accept the label of depression that rose to prominence in the late 1980s and the 1990s, and many in the same group might today think of themselves as having social anxiety disorder or ADHD.

Viewed over history, mental health symptoms begin to look less like immutable biological facts and more like a kind of language. Someone in need of communicating his or her inchoate psychological pain has a limited vocabulary of symptoms to choose from. From a distance, we can see how the flawed certainties of Victorian-era healers created a sense of inevitability around the symptoms of hysteria. There is no reason to believe that the same isn’t happening today. Healers have theories about how the mind functions and then discover the symptoms that conform to those theories. Because patients usually seek help when they are in need of guidance about the workings of their minds, they are uniquely susceptible to being influenced by the psychiatric certainties of the moment. There is really no getting around this dynamic. Even Insel’s supposedly objective laboratory scientists would, no doubt, inadvertently define which symptoms our troubled minds gravitate toward. The human unconscious is adept at speaking the language of distress that will be understood.

WHY DO PSYCHIATRIC DIAGNOSES fade away only to be replaced by something new? The demise of hysteria may hold a clue. In the early part of the 20th century, the distinctive presentation of the disorder began to blur and then disappear. The symptoms began to lose their punch. In France this was called la petite hystérie. One doctor described patients who would “content themselves with a few gesticulatory movements, with a few spasms.” Hysteria had begun to suffer from a kind of diagnostic overload. By the 1930s or so, the dramatic and unmistakable symptoms of hysteria were vanishing from the cultural landscape because they were no longer recognized as a clear communication of psychological suffering by a new generation of women and their healers.

It is true that the DSM has a great deal of influence in modern America, but it may be more of a scapegoat than a villain. It is certainly not the only force at play in determining which symptoms become culturally salient. As Frances suggests, the marketing efforts of Big Pharma on TV and elsewhere have a huge influence over which diagnoses become fashionable. Some commentators have noted that shifts in diagnostic trends seem uncannily timed to coincide with the term lengths of the patents that pharmaceutical companies hold on drugs. Is it a coincidence that the diagnosis of anxiety diminished as the patents on tranquilizers ran out? Or that the diagnosis of depression rose as drug companies landed new exclusive rights to sell various antidepressants? Consider for a moment that the diagnosis of depression didn’t become popular in Japan until GlaxoSmithKline got approval to market Paxil in the country.

Journalists play a role as well: We love to broadcast new mental-health epidemics. The dramatic rise of bulimia in the United Kingdom neatly coincided with the media frenzy surrounding the rumors and subsequent revelation that Princess Di suffered from the condition. Similarly, an American form of anorexia hit Hong Kong in the mid-1990s just after a wave of local media coverage brought attention to the disorder.

The trick is not to scrub culture from the study of mental illness but to understand how the unconscious takes cues from its social settings. This knowledge won’t make mental illnesses vanish (Americans, for some reason, find it particularly difficult to grasp that mental illnesses are absolutely real and culturally shaped at the same time). But it might discourage healers from leaping from one trendy diagnosis to the next. As things stand, we have little defense against such enthusiasms. “We are always just one blockbuster movie and some weekend therapist’s workshops away from a new fad,” Frances writes. “Look for another epidemic beginning in a decade or two as a new generation of therapists forgets the lessons of the past.” Given all the players stirring these cultural currents, I’d make a sizable bet that we won’t have to wait nearly that long.

Looking Back / Life as an Avalanche


Looking back at my life, the first half was like this compendium of avalanches. I just didn’t know any better. The second half has been ‘living the aftermath’, in a state of shock and awe that I survived. Too true!

Empathy Nonsense / Crazy Psychology…again

Actually, this is one terrific test for finding out if you are a “Neurotypical”!

If this description of “how the social brain works” (hint: there is no “social brain”) is an acceptable “scientific explanation” as to how a real, living human brain works, then sadly, you are a Neurotypical and scientifically illiterate.

H. Erectus Potpourri / Videos + paper

Homo erectus: Why can’t I get no respect!

Brain size? Some weaseling here! Late erectus up to 1250cc. Modern human range is 950cc – 1500cc.

It’s also “cheating” to compare H. erectus skulls to contemporary Homo sapiens skulls, as if archaic Homo sapiens didn’t have a whopping big brow ridge and robust skull!

The fossilized remains of Homo sapiens idaltu were discovered in 1997 by Tim White at Herto Bouri near the Middle Awash site of Ethiopia’s Afar Triangle. The fossils were dated by radioisotopic analysis of the volcanic layers containing the three cranial fossils [White 2003].

The skulls display archaic features not found in later Homo sapiens, but they are still regarded as direct ancestors of modern Homo sapiens sapiens. The remains discovered at Herto Bouri have been named ‘Herto Man’. Experts claim the finds are complete enough to be identified as early modern humans, since they show the characteristic globular shape of the braincase and the facial features of our species. However, both adult skulls are huge and robust, and also show resemblances to more primitive African skulls.
Homo sapiens by Kennis & Kennis / Based on Jebel Irhoud 1, one of the oldest remains of anatomically modern humans, an adult male at 160,000 years old, from Jebel Irhoud cave in Morocco:

____________________________________________________________

PMID: 18191986 DOI: 10.1016/j.jhevol.2007.11.003

Taxonomic implications of cranial shape variation in Homo erectus.

Baab KL, Department of Anatomical Sciences, Stony Brook University

Abstract

The taxonomic status of Homo erectus sensu lato has been a source of debate since the early 1980s, when a series of publications suggested that the early African fossils may represent a separate species, H. ergaster. To gain further resolution regarding this debate, 3D geometric morphometric data were used to quantify overall shape variation in the cranial vault within H. erectus using a new metric, the sum of squared pairwise Procrustes distances (SSD). Bootstrapping methods were used to compare the H. erectus SSD to a broad range of human and nonhuman primate samples in order to ascertain whether variation in H. erectus most clearly resembles that seen in one or more species. The reference taxa included relevant phylogenetic, ecological, and temporal analogs including humans, apes, and both extant and extinct papionin monkeys. The mean cranial shapes of different temporogeographic subsets of H. erectus fossils were then tested for significance using exact randomization tests and compared to the distances between regional groups of modern humans and subspecies/species of the ape and papionin monkey taxa. To gauge the influence of sexual dimorphism on levels of variation, comparisons were also made between the mean cranial shapes of single-sex samples for the reference taxa. Results indicate that variation in H. erectus is most comparable to single species of papionin monkeys and the genus Pan, which included two species. However, H. erectus encompasses a limited range of variation given its extensive geographic and temporal range, leading to the conclusion that only one species should be recognized. In addition, there are significant differences between the African/Georgian and Asian H. erectus samples, but not between H. ergaster (Georgia+Africa, excluding OH 9 and Daka) and H. erectus sensu stricto. This finding is in line with expectations for intraspecific variation in a long-lived species with a wide, but probably discontinuous, geographic distribution.
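
The SSD metric and bootstrap comparison described in the abstract can be illustrated with a short sketch. This is not the author's code: it assumes the pairwise Procrustes distances have already been computed from landmark data, and the function names, sample sizes, and values below are invented purely for demonstration.

```python
# Minimal sketch of the approach described in the abstract: quantify shape
# variation as the sum of squared pairwise Procrustes distances (SSD), then
# compare a fossil sample's SSD to a bootstrap distribution drawn from a
# reference taxon of known taxonomic status. All data here are fake.
import numpy as np

def ssd(dist_matrix: np.ndarray) -> float:
    """Sum of squared pairwise Procrustes distances within one sample."""
    iu = np.triu_indices_from(dist_matrix, k=1)   # count each pair once
    return float(np.sum(dist_matrix[iu] ** 2))

def bootstrap_ssd(dist_matrix: np.ndarray, n_draw: int,
                  n_boot: int = 1000, seed=None) -> np.ndarray:
    """Resample n_draw individuals (with replacement) from a reference taxon
    and return the SSD distribution for samples of that size."""
    rng = np.random.default_rng(seed)
    n = dist_matrix.shape[0]
    out = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.choice(n, size=n_draw, replace=True)
        out[b] = ssd(dist_matrix[np.ix_(idx, idx)])   # submatrix of the draw
    return out

def fake_distance_matrix(n, mean, sd, rng):
    """Symmetric, zero-diagonal matrix standing in for Procrustes distances."""
    d = np.abs(rng.normal(mean, sd, (n, n)))
    d = (d + d.T) / 2
    np.fill_diagonal(d, 0.0)
    return d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    erectus = fake_distance_matrix(12, 0.06, 0.01, rng)   # hypothetical fossils
    humans = fake_distance_matrix(60, 0.05, 0.01, rng)    # reference species

    observed = ssd(erectus)
    boot = bootstrap_ssd(humans, n_draw=12, n_boot=2000, seed=1)
    frac = float(np.mean(boot >= observed))
    print(f"H. erectus SSD = {observed:.4f}; "
          f"fraction of reference bootstrap samples at least as variable = {frac:.3f}")
```

The logic of the test: if the fossil sample’s SSD routinely exceeds what single reference species produce at the same sample size, that is evidence for more than one species in the sample; the abstract reports the opposite pattern for H. erectus.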

Misshapen Heads / Position or pathology?

My purpose / interest in the origin of “skull shapes” in humans is the question of “malformation vs. deformation”. Why? Because two ideas afloat in the “pop-knowledge-verse” conflict: the attribution of a “dolichocephalic” (long-headed) human group or subspecies on the basis of this skull morphology, and the medical attribution of the same morphology to the “uterine environment” and birth difficulties (both deformation and malformation). The distinction between malformation, deformation, and intentional deformation raises problems in the analysis of individual fossil skulls and more recent human skulls as “types” (even species), when the researcher assumes that “misshapen skulls” are “normal” skulls representative of a population.

Ochsner J. 2001 Oct; 3(4): 191–199. PMCID: PMC3116745

Misshapen Heads in Babies: Position or Pathology?

Daniel R. Bronfin, MD, Medical Director, Craniofacial Team, Department of Pediatrics, Ochsner Clinic and Alton Ochsner Medical Foundation, New Orleans, LA

Abstract

A newborn’s skull is highly malleable and rapidly expanding. As a result, any restrictive or constrictive forces applied to a baby’s head can result in dramatic distortions. These changes can be mild and reversible deformations, or severe, irreversible malformations that can result in brain injury. This paper reviews the anatomy and physiology of normal and abnormal brain and skull growth, the etiology of cranial deformation, the types of craniosynostosis most commonly seen in infants, and the importance of early diagnosis and treatment.

At birth, the shape of a newborn’s skull is highly variable due to its inherent plasticity, intrauterine constraint, and the tortuous journey through the birth canal. Variations from the typical oval shape that usually result from the vaginal delivery process will generally return to normal in a relatively short period of time. If this does not occur, the possibility of a rapidly progressive, irreversible, and, in rare circumstances, life-threatening cranial malformation needs to be considered.

Historical Perspective: “Intentional Cranial Deformation”

Man’s fascination with misshapen heads dates back to prehistoric times. Archaeologists have found artistic renderings of the imposing heads of Neanderthals who lived 45,000 years ago. (Really?) Hippocrates described in detail a people referred to as the “Macrocephales.” Even in modern times, the television series “Saturday Night Live” entertained viewers with its comic series “The Coneheads.”

Of particular historical interest is the practice of intentional cranial deformation, “the process of dynamic distortion of the normal vectors of infantile neurocranial growth through the agency of externally applied forces”(1). Taking advantage of the rapid head growth and malleable skull unique to the newborn period, individuals have applied constrictive devices (wooden boards, stones placed in a crib, ties, manual molding) over the past centuries to intentionally and permanently deform a child’s skull. (Examples and figures – see original text)

Skip to:

Abnormal Development: Malformation vs. Deformation

There are two very distinct processes of morphogenesis through which the human skull is misshapen: malformation and deformation. Malformation refers to an intrinsically altered developmental process that interferes with cell migration and differentiation through genetically programmed biochemical processes or through extrinsic chemical interference (teratogens). In essence, this process represents an error in the normal development of a part. (Examples: See original text)

Cranial Deformations

In general, cranial deformations are common, mild, and typically reversible, while cranial malformations are relatively rare, progressive, and often irreversible anomalies which, if not aggressively treated in a timely manner, can result in severe cosmetic and functional impairment.

Molding

One out of three infants will have some degree of deformational molding. Fetal head constraint is more common in primigravidas (preterm), large for gestational age babies, and when there is cephalopelvic disproportion, oligohydramnios, multiple births, or prolonged courses of labor.

Caput succedaneum (Figure 5) is due to edema of the skin and subcutaneous tissues of the scalp resulting in a “conehead” appearance, which normally resolves in less than 6 days. A cephalohematoma is a traumatic subperiosteal hemorrhage that does not cross a suture line. This deformity is initially soft and, with time, becomes firm as it calcifies; it generally requires up to 4 months to resolve entirely.

Babies born breech (Figure 5 left; Figure 6 right) typically have craniofacial and limb deformations resulting from their in utero position. These babies characteristically have a long, narrow head (“dolichocephaly” or “type 1”), with a prominent occipital shelf, redundant skin over the neck, overlapping lambdoidal sutures, and an indentation below their ears (from shoulder compression). These babies are also more likely to have a head-tilt, or torticollis, after birth due to fetal constraint. Developmental dysplasia of the hips and calcaneovalgus (club) foot deformity are also more commonly seen in this population and may or may not be reversible without intervention.

Postnatal deformation can also occur in the neonatal intensive care nursery when high-risk babies are kept paralyzed and intubated on their side for extended periods. These babies have long and narrow heads due to their relatively large heads and poor neck muscle tone; the skull bones are soft and thin and the skull is flattened by gravity alone. It is also important to note that neurologically impaired infants with hypo- or hypertonia may have a greater degree of positional deformity of their heads due to limited mobility when prone.

Cranial Malformations

The other, and far more ominous, type of abnormal cranial development is craniosynostosis, or premature fusion of one or more cranial sutures. This malformation occurs in 1 in 2500 neonates as opposed to the 1 in 3 babies with a deformational anomaly. Craniosynostosis is classified as simple (1 suture) versus compound (2 or more sutures), and isolated (no other major malformations) versus syndromic (one of multiple associated anomalies).

The mechanism of skull malformation caused by a fused suture(s) in a developing skull was initially described by Virchow in 1851 (4). He pointed out that cranial growth will be restricted in the plane parallel to a prematurely fused suture and enhanced in the perpendicular planes (Figure 9). Thus, if the sagittal suture were fused early, one would expect the skull to be restricted in the transverse dimension and to overcompensate in the anterior-posterior dimension in response to the growing brain, resulting in dolichocephaly (type 1).

____________________________________________________

The original paper contains more information on what can go wrong in human fetal development, birth, and early infancy. A reminder of the consequences of obligatory bipedalism and a big brain. Examples of malformation and deformation in the fossil record may tell us a great deal about the ease or difficulty of human birth in a species or group.

Unsatisfying Human Relationships / Proxy “Friends”

Social relationships do not give people everything they need or expect: the gap is filled in by God, or Jesus, or Mary or any imaginary friend who will give a believer everything he or she wants, but cannot get, from the people in their lives.

Religion and social media sites are, in fact, a statement of disappointment in what other human beings can, or will, give each other. Could it be that expectations for what one can obtain from other people are simply way too high?

God – He’s just a click away…

OMG No! / Not the “Aspie” Stare!

It’s the Aspie Blank Stare! Who knew that this was such a disturbing facet of Aspie behavior? Not me!

My observations attempt to “rectify” the communication that is (not) going well.

From a ‘Mum’

In my experience, I would get a blank stare when I asked (my Asperger son) a question.  It could be, for example, what he would like for dinner? What happened at school? You know – normal sorts of ‘Mum’ questions!

Social typical questions tend to be vague, over-general and non-specific. A specific question would be: “Would you like pizza or hot dogs for dinner?” Or try, “We’re having hamburgers for dinner. I bought the kind of buns you like and you can add tomatoes or pickles or cheese, or whatever else you like.” Or, “What stories did you read in reading class today?”

How did I interpret the blank stare that I got?

At the time, I believed that ‘the blank stare’ was used by (SON) to avoid answering the questions I asked – questions I thought were easy to answer! I realize now that, in my frustration over not getting an answer, I would pile on the questions one after another, and (SON) didn’t have time to process even the first one!

I would get cross with him, frustrated that he seemed to refuse to respond to my requests for information, and I would give up.

One of the big mistakes that social typicals make is to attribute INTENT to Asperger behavior. This is because social typicals are “self-oriented” – everything is about THEM; any behavior on the part of a human, dog, cat, plant or lifeform in a distant galaxy, must be directed at THEM. Example: God, or Jesus, or whomever, is paying attention 24 / 7 to the most excruciatingly trivial moments in the lives of social typicals. We’re not as patient as God or Jesus.

The Asperger default mental state is a type of reverie, day-dreaming, trance or other “reflective” brain process; that is, we do “intuitive” thinking. The “blank face” is because we do not use our face to do this type of thinking. 

Sorry – we’re just busy elsewhere! When you ask a question, it can take a few moments to “come out of” our “reverie” and reorient our attention. If you are asking a “general question” that is meant to elicit a “feeling” (social) response, it will land like a dead fish in front of us. Hence the continued “blankness”.  

What is the real cause of the blank stare?

I believe that SON uses the blank stare while he is processing a question. If I give him enough time, he will think deeply and consider his response, which is often unexpected.

The “blank stare” is due to our type of brain activity: processing questions adds to response time. Some questions are so vague that we simply cannot answer them. Some questions aren’t questions at all, but are an attempt to get our attention and to get a “social” something from us. This is truly confusing. 

An Aspie will be taking in as much information as they can from the world around them at any given moment. They notice details that ‘normals’ ignore. These details can easily  result in sensory overload. The blank stare is used by Aspies as a way to ‘zone out’, or ‘go into themselves’ as a coping mechanism for when their senses are overloaded.

I don’t think this is correct. Sensory overload is another matter entirely; sensory overload results in the desire to flee, and if we can’t “get away” we experience “meltdown”. (Other Aspies may have a different take on this.)

Aspie chat concerning “The Stare”

I watched “Rain Man” again recently. There was a scene where Dusty was sitting on a park bench and just looking at the ground, and Tom Cruise started YELLING at him. I felt like, “Hey ! sometimes I just sit and think about things, and maybe I’m staring at the ground, so cool it Tom.” We tend to look off into the horizon while we’re talking, and really, it’s not a big deal …

At work I’ll be at my desk just working away and people will tell me to cheer up when I don’t feel at all down. Also, if I’m standing around somewhere, and not focusing on anything in particular – and feeling fine, someone will ask me if I’m OK or if I’m pissed off about something. Something about my neutral (not happy or sad, just contented) expression makes people think I’m depressed or angry.

I must do “The Stare” because people are always doing one of the following: Ask me if I’m okay because I’m staring off into the distance; look behind their back to see what I’m staring at; or tell me to “SMILE!” because I don’t have any facial expression.

Yes, social typicals are self-centered and demanding. They don’t want to “put up with” a blank face; it damages their perfect narcissistic universe, in which it is everyone’s job to make them feel important.

And then, there is the OTHER “Aspie Stare”

I don’t get it… my teacher tells me to look at her when she talks, and when I look at other people they tell me to stop staring at them. What the…?

Apparently staring and looking are two different things, not that I know how to tell the difference.

Teacher demands eye-contact? It’s the OBEDIENCE – submission  thing. Authoritarian adults demand instant obedience from children. Stare at a ‘non-authoritarian’ person? Predators stare down prey; you, dear Aspie, are unwittingly behaving like a predator.

I stare because I get easily distracted by details and I want to see more; it’s just attention to detail. I’m doing better at straight eye contact, but open my eyes too wide because I’m trying hard to focus and pay attention.

If I am interested in what a person is saying – it’s new to me or important information, I will stare like a laser. Also if I am trying to recognize someone that looks vaguely familiar, or there is something interesting about how they look and I want to examine it. If I’m not interested, I won’t look at them. However, that does not mean I am not listening just because I am not looking at them.

It seems to me that Aspergers use our senses selectively: many, if not most, of us are visual thinkers, so we use our eyes to “see”; and if there is not something to “see”, but rather something to “hear”, we listen. How bizarre!

____________________________________________________________________________________________

Uh-oh! It’s that darn ASPIE / INTJ overlap again!

Asperger Emotion Problems / German Language to the Rescue

It’s not that Asperger individuals don’t have feelings; it’s a lack of words adequate to the expression of our deep and complex cognitive emotions. German to the rescue!

Touching, true and very Asperger…

“…asphyxiating friendliness…”