Infant Synesthesia / A Developmental Stage

No, synesthesia is not a symptom of a disorder; it is a developmental phenomenon. In fact, several researchers have shown that synesthetes can perform better on certain tests of memory and intelligence. Synesthetes as a group are not mentally ill. They test negative on scales that check for schizophrenia, psychosis, delusions, and other disorders.

Synesthesia Project | FAQ – Boston University

________________________________________________________________

What if some symptoms “assigned” by psychologists to Asperger’s Disorder and autism are merely manifestations of synesthesia?

“A friend of mine recently wrote, ‘My daughter just explained to me that she is a picky eater because foods (and other things) taste like colors and sometimes she doesn’t want to eat that color. Is this a form of synesthesia?’ Yes, it is.” – Karen Wang

We see in this graphic how synesthesia is labeled a “defect” that is “eradicated” by normal development (literally “pruned out”). People who retain types of integrated sensory experience are often artists, musicians, and other sensory innovators (chefs, interior designers, architects, writers). So, those who characterize “synesthesia” as a developmental defect are labeling the very individuals who greatly enrich millions of human lives as “defectives”. Psychology pathologizes the most admired and treasured creative human behavior.

No touching allowed! Once “sensory” categories have been labeled and isolated to locations in the brain, no “talking” to each other is allowed. The fact that this is a totally “unreal” scheme is ignored. Without smell, there IS NO taste…

________________________________________________________________

Infants Possess Intermingled Senses

Babies are born with their senses linked in synesthesia

originally published as “Infant Kandinskys”

What if every visit to the museum was the equivalent of spending time at the philharmonic? For painter Wassily Kandinsky, that was the experience of painting: colors triggered sounds. Now a study from the University of California, San Diego, suggests that we are all born synesthetes like Kandinsky, with senses so joined that stimulating one reliably stimulates another.

The work, published in the August issue of Psychological Science, is the first experimental confirmation of the infant-synesthesia hypothesis, which has existed, unproved, for almost 20 years.

Researchers presented infants and adults with images of repeating shapes (either circles or triangles) on a split-color background: one side was red or blue, and the other side was yellow or green. If the infants had shape-color associations, the scientists hypothesized, the shapes would affect their color preferences. For instance, some infants might look significantly longer at a green background with circles than at the same green background with triangles. Absent synesthesia, no such difference would be visible.

The study confirmed this hunch. Infants who were two and three months old showed significant shape-color associations. By eight months the preference was no longer pronounced, and in adults it was gone altogether.
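A minimal sketch, in Python, of how a preferential-looking comparison like this one might be analyzed. The looking times, sample size, and choice of test below are hypothetical illustrations of the logic only, not the study’s actual data or analysis.

```python
# Hypothetical preferential-looking analysis (NOT the study's data):
# compare how long each infant looks at the same green background
# when it carries circles versus triangles. A reliable difference
# would indicate a shape-color association.
import numpy as np
from scipy import stats

# Fabricated looking times in seconds, one pair per infant
circles_on_green = np.array([7.1, 6.4, 8.0, 5.9, 7.5, 6.8, 7.9, 6.2])
triangles_on_green = np.array([5.2, 5.8, 6.1, 5.5, 6.0, 5.1, 6.3, 5.7])

# Paired t-test: the same infants contribute to both conditions
t, p = stats.ttest_rel(circles_on_green, triangles_on_green)
print(f"t = {t:.2f}, p = {p:.4f}")

# A significant difference would mirror the finding in 2-3-month-olds;
# no difference would mirror the adult result.
```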

The more important implications of this work may lie beyond synesthesia, says lead author Katie Wagner, a psychologist at U.C.S.D. The finding provides insight into how babies learn about the world more generally. “Infants may perceive the world in a way that’s fundamentally different from adults,” Wagner says. As we age, she adds, we narrow our focus, perhaps gaining an edge in cognitive speed as the sensory symphony quiets down. (Sensory “thinking” is replaced by social-verbal thinking.)

(Note: The switch to word-concept language dominance means that modern social humans LOSE the appreciation of “connectedness” in the environment – connectedness becomes limited to human-human social “reality.” The practice of chopping up reality into isolated categories (word concepts) diminishes detail and erases the connections that link details into patterns. Hyper-social thinking is a “diminished” state of perception characteristic of neurotypicals.)

________________________________________________________

GREAT WEBSITE!!!

The Brain from Top to Bottom

thebrain.mcgill.ca/

McGill University
Explore topics such as emotion, language, and the senses at five levels of organization (from molecular to social) and three levels of explanation (from beginner to advanced).

Archaic H. sapiens – H. sapiens sapiens / Testosterone

A composite image shows the facial differences between an ancient modern human (archaic Homo sapiens), with heavy brows and a large upper face, and the more recent modern human (Homo sapiens sapiens), who has rounder features and a much less prominent brow. The prominence of these features can be traced directly to the influence (reduction) of the hormone testosterone. Photo Credit: Robert Cieri, University of Utah

Archaic vs Modern

Read more: http://www.zmescience.com/science/archaeology/civilization-testosterone-skull-04082014/

“It is important to note that lower testosterone is associated with tolerance and cooperation in bonobos and chimpanzees, and with less aggression in humans. It seems very plausible that as humans started to group up in larger and more interconnected settlements, they needed to find less violent ways to sort out their problems – and in the long run, the non-violent path won.”


Adult Rationality and Logic are LEARNED

Babble.com

“Psychologists have suggested that for little children, the boundary between reality and fantasy is blurry. The imaginary life of kids is powerful and sways their perceptions of the real world until they master adult rationality and logic.” (And when does this supposedly occur? In most Americans, it never comes to pass…)

The famous pioneer of developmental psychology, Jean Piaget, said that kids in the preoperational stage of cognitive growth (ages two to seven) use magical thinking until they learn the properties of physics and reality, a trial-and-error process that takes years. (In most Americans, this means NEVER…)

Or we could educate our children in math and science as a means to teach “how the world works” but then we would create adults who can think independently, question social dogma, insist on facts instead of accepting lies, abandon supernatural explanations for phenomena, and who use common sense, logic and critical thinking skills in making life decisions, instead of remaining dependent on infantile emotions, magical solutions, and “big daddies.”


The one absolutely necessary requirement for becoming an adult:

DOUBT

Unraveling Asperger’s and Pain / “Normalizing” Chronic Misery

I don’t like to rely solely on my experiences to unravel what might be going on with Asperger types, but sometimes that’s all you have to go on. One reads that Asperger individuals either overreact to pain and discomfort or fail to notice pain at all. It’s another of those “gotcha” symptoms in which we are either “over” or “under” the “normal” human behavior or experience – but in the case of pain, which is a subjective experience, what is “normal”?

Now is a good time to think about this, since I have a toothache (not another root canal!) and severe allergies. I’m a mess. I hate being sick, mostly because I’m very active and have trouble staying in bed or on the couch, resting as one should.

Questions arise. What was I like as a kid when some illness like the flu was going around? The plot thickens: how did my parents behave toward us (I had an older brother) when the inevitable childhood sick days came round?

Not good! My brother was six years older and from my observations was babied. He always had something “mysterious or nebulous” going on that meant staying home from school or being spared from regular tasks and chores that he didn’t want to do. This was very bad; by the time I arrived, a triad of dysfunctional relationships was already in place.

You say you’re sick? PROVE IT

The short story is that my brother received gifts, toys and attention if ill, but I was punished. If I said I didn’t feel well, I had to prove it: have a measurable fever, be vomiting, or be possessed of some obvious bug going around school, one for which parents had been asked to keep symptomatic kids at home. I wasn’t allowed out of bed, or to have books or toys. Although my mother was merely peeved or angry with me, when my Asperger father came home, he would state how he never became ill (it was true) and that illness was a sign of weakness and failure; why wasn’t I like him? This message came through loud and clear and has been a negative influence – absolutely. When unwell, I have to fight feelings of inadequacy and failure, and a residue of abandonment. It’s ridiculous.

Here’s the question: Is this cruel message served up by my father a product of Asperger’s, or is it something else? Although his attitude was obviously hurtful, I also knew my father’s story: he had been a premature twin and his brother died at birth. He was not expected to live, but he pulled through. My father’s childhood had been a living Hell of beatings and hard work on the farm, dished out as tough love by his father in order to make him strong. In one of those “tragic” outcomes, my father ended up being a highly fit and muscular adult; tragic, because he believed his father’s cruelty was responsible for his good health.

I attribute my father’s survival to having good care as an infant, and good genes, not magic or cruelty. If a premature baby survived in the 1910s, long before the elaborate interventions of today’s medical devices and drugs, he or she had to have had a package of healthy provisions on board just to survive the first year. I was stuck with a mystery: was my father a product of nature, or of “severe” nurturing?

It just wasn’t my father’s nature to be cruel; his weak–strong theory of life descended like a dark curtain when issues of vulnerability appeared. Otherwise he was generous with his time and attention, and I remember that father also. Unfortunately, he had no insight into the brutal treatment he endured as a child, and let’s face it, American males are subject to the irrational fear of being labeled “weak” or soft – a “girl” – a fear intensively cultivated by American culture, then and now. And the outcome was that “spoiling” my brother left him entirely dependent on my parents, while some “ill-treatment” did prepare me for an independent life; the challenges were great, but they made me an adult – slowly, but at least I made it.

It is my view, after two years of reading and thinking about Asperger’s – especially the bizarre dogma of psychology, medical information, and anecdotal references – that Asperger’s is a personality or temperament type, characterized by an intellectual “state” that is simply not socially oriented but attuned to the physical world: grounded in sensory attention and logic, not in words and the supernatural.

Conformity institutions (like psychology, corporations, religions and schools) simply cannot tolerate people who think for themselves. It’s the old story of domestication: dogs are useful to humans because they “work for food” and “adjust to” cruelty from humans, because – mostly – they have no choice, and have been bred to various “addictions” – behaviors like extreme herding, or the tracking of drugs, criminals, or lost people: the exploitation of their more accurate and extensive sensory abilities. A lucky few (?) become family and are classified as “pets” – literally, we stroke and hug them, overfeed them bad food, and lock them in tiny apartments, basements or porches, abandoning them for long hours. Many breeds have been literally deformed to physically fulfill the awful constraints of being substitute infants for infantile people: purposely deformed, as if it would be acceptable to “create” humans with severe physical distortions and disabilities because it satisfies some warped idea of “cuteness”.

The desire for lifeforms to be either enslaved by work or to be enslaved as “cute social objects – status symbols” is domestication.

Wolves are despised and exterminated because they can’t be tamed; they remain free to be competing predators. Myths, fairytales and fabrications place wolves close to the devil. It’s not true; like any competing predator species, they have been hunted by human predators to near extinction.

There is no doubt that humans have domesticated humans: slavery is “forced” domestication followed by sexual selection among the “survivors”. The designation of an individual, group, or class of humans as having the “potential” to be tamed – that is, to be forced to work without resistance, as dogs and horses and other animal laborers do – has always been paramount. Humans were selected just as animals were, to be reshaped into “useful” tame forms. Over thousands of years of this “civilizing” process, the “owners” of grand cities, and of the agricultural and manufacturing systems necessary to their existence, simply exterminated all things wild and increasingly cultivated submissive behavior, just as we continue to do today.

The relentless selection of human forms and abilities “useful” to the predatory hierarchy changed humans into “specialized” organisms – varieties of people that have become “natural” to us in the class system as it exists today, in which domestic types – peasants, wage slaves, the middle and low classes, immigrants, and others who do the “shit work” for the upper and ruling classes – are fed scraps from the “dinner table” because, like dogs, they have been bred to this condition, which no “wild human” could, or would, tolerate.

Pain and its subjective experience by individuals is a tricky subject when you look into it. We are amazed and frightened by the “dangers” that wild animals live with 24/7 – but we forget that “pain” in nature is usually swift and brief: within a few seconds to a few hours, the animal has either recovered, been “finished off”, or died of stress, lack of water, blood loss, or shock.

The human “domestic” condition may be seen as far worse. Someone said, “The problem with humans is that they will put up with anything.” One of the most obvious “changes” to the human animal has been the development of tolerance of very bad treatment by other humans, not unlike the dog that is chained to a post or fence, day after day, with little or no food, a dirty bucket of water (if that) and is expected to demonstrate “wild affection” at the appearance of its tormentor.

Human empathy, compassion or kindness? The system provides relief, but not freedom, and an “easy” new form of slavery – to religion, to drugs, to alcohol, to violent punishment and sadistic entertainment; to hopelessness and lies. Pain in humans is not swift; it is chronic and lifelong. Pain is stretched out over decades, and declared to be “progress” when medical intervention patches people up, so that they can return to fulfilling their role in the social machinery. Pain does not go away; it is a protracted state of dependency cultivated by the hierarchy. “Modern pain” is a result of domestication, which has become panhuman, and is “considered” to be normal – pain and slavery have been socialized.

The “idea” of pain, despite the knowledge that this is a highly subjective and variable physical experience, is so controlled, that Asperger types are classified as defective, because “supposedly” our experience of pain is “abnormal” – that is, we do not “behave” like domesticated animals; we do not respond with compliance to pain applied as punishment and control: our “reactivity” falls outside the imposed parameters of “being suitable for use as a slave.” We “leave” – physically if possible, and we suffer greatly if we can’t. Withdrawal into a “better world that exists in nature and in satisfying our curiosity and need to acquire knowledge” (labeled as “obsessions”) is a healthy reaction – too healthy for society to tolerate.


Normal people aren’t nearly as nice as they think they are.

Appearances feed inattentional blindness.

How could I be in pain? How dare I ask for help? No one saw me as a human being; they took one look and concluded that I had everything an American girl could want. I wasn’t human: I wasn’t allowed to be human. I suffered alone. I had to fight for a diagnosis, alone. Shrinks and doctors weren’t much help. They said things like, “I have patients who are really sick.”

I worked and studied to find answers for myself. And finally, after many years, and after much damage had already been done, came a diagnosis – but no cure for the mean-spirited attacks on the “mentally ill” or “developmentally disabled” by social normals, especially those in the Caring Industry. The first shock of being diagnosed is that you fall off the social pyramid – you belong to an outcast class no matter where you formerly existed. People act as if you’re dangerous and talk down to you as if you’re an imbecile. Your presence becomes annoying, as if you’re just another disposable street person. And then you become invisible. So it’s grit your teeth, use your intelligence, figure out how to live, and forget about empathy, compassion or even shallow sympathy from the pillars of society. Normal people aren’t nearly as nice as they think they are.

If someone tells you they need help, believe them.

The Americanization of Mental Illness / Cultural Aggression – Globalization

This article exposes one of the “unnoticed costs” of globalization and American cultural aggression: a modern Trojan Horse, in the guise of “scientific progress,” built into the concept of mental illness and the very definition of “what it means to be a human being”.

What I have been maintaining throughout my blog is that this same “cultural” extermination of “humanistic” ideas about human behavior has been perpetrated on the American public – with the same disastrous results! More “so-called” pathologies / mental illnesses, disorders, defective children, addictions and trauma; more “so-called” need for “intervention and treatment”; and unprecedented growth in the industries that profit from what is a “crime against humanity” – invented pathologies that destroy societies, communities, families, individuals, and education, through a “takeover” of existing, diverse American (and now world) cultures under one perverse ideology.

The Americanization of Mental Illness

By ETHAN WATTERS, JAN. 8, 2010

a few excerpts – see original article: http://www.nytimes.com/2010/01/10/magazine/10psyche-t.html

AMERICANS, particularly if they are of a certain leftward-leaning, college-educated type, worry about our country’s blunders into other cultures. In some circles, it is easy to make friends with a rousing rant about the McDonald’s near Tiananmen Square, the Nike factory in Malaysia or the latest blowback from our political or military interventions abroad. For all our self-recrimination, however, we may have yet to face one of the most remarkable effects of American-led globalization. We have for many years been busily engaged in a grand project of Americanizing the world’s understanding of mental health and illness. We may indeed be far along in homogenizing the way the world goes mad.

In any given era, those who minister to the mentally ill — doctors or shamans or priests — inadvertently help to select which symptoms will be recognized as legitimate. Because the troubled mind has been influenced by healers of diverse religious and scientific persuasions, the forms of madness from one place and time often look remarkably different from the forms of madness in another.

That is, until recently.

For more than a generation now, we in the West have aggressively spread our modern knowledge of mental illness around the world. We have done this in the name of science, believing that our approaches reveal the biological basis of psychic suffering and dispel prescientific myths and harmful stigma. There is now good evidence to suggest that in the process of teaching the rest of the world to think like us, we’ve been exporting our Western “symptom repertoire” as well. That is, we’ve been changing not only the treatments but also the expression of mental illness in other cultures. Indeed, a handful of mental-health disorders — depression, post-traumatic stress disorder and anorexia among them — now appear to be spreading across cultures with the speed of contagious diseases. These symptom clusters are becoming the lingua franca of human suffering, replacing indigenous forms of mental illness.

What is being missed, Lee and others have suggested, is a deep understanding of how the expectations and beliefs of the sufferer shape their suffering. “Culture shapes the way general psychopathology is going to be translated partially or completely into specific psychopathology,” Lee says. “When there is a cultural atmosphere in which professionals, the media, schools, doctors, psychologists all recognize and endorse and talk about and publicize eating disorders, then people can be triggered to consciously or unconsciously pick eating-disorder pathology as a way to express that conflict.”

The problem becomes especially worrisome in a time of globalization, when symptom repertoires can cross borders with ease. Having been trained in England and the United States, Lee knows better than most the locomotive force behind Western ideas about mental health and illness. Mental-health professionals in the West, and in the United States in particular, create official categories of mental diseases and promote them in a diagnostic manual that has become the worldwide standard. American researchers and institutions run most of the premier scholarly journals and host top conferences in the fields of psychology and psychiatry. Western drug companies dole out large sums for research and spend billions marketing medications for mental illnesses. In addition, Western-trained traumatologists often rush in where war or natural disasters strike to deliver “psychological first aid,” bringing with them their assumptions about how the mind becomes broken by horrible events and how it is best healed. Taken together this is a juggernaut that Lee sees little chance of stopping.

“As Western categories for diseases have gained dominance, micro-cultures that shape the illness experiences of individual patients are being discarded,” Lee says. “The current has become too strong.”

THE IDEA THAT our Western conception of mental health and illness might be shaping the expression of illnesses in other cultures is rarely discussed in the professional literature. Many modern mental-health practitioners and researchers believe that the scientific standing of our drugs, our illness categories and our theories of the mind have put the field beyond the influence of endlessly shifting cultural trends and beliefs. After all, we now have machines that can literally watch the mind at work. We can change the chemistry of the brain in a variety of interesting ways and we can examine DNA sequences for abnormalities. The assumption is that these remarkable scientific advances have allowed modern-day practitioners to avoid the blind spots and cultural biases of their predecessors.

EVEN WHEN THE underlying science is sound and the intentions altruistic, the export of Western biomedical ideas can have frustrating and unexpected consequences. For the last 50-odd years, Western mental-health professionals have been pushing what they call “mental-health literacy” on the rest of the world. Cultures became more “literate” as they adopted Western biomedical conceptions of diseases like depression and schizophrenia. One study published in The International Journal of Mental Health, for instance, portrayed those who endorsed the statement that “mental illness is an illness like any other” as having a “knowledgeable, benevolent, supportive orientation toward the mentally ill.”

CROSS-CULTURAL psychiatrists have pointed out that the mental-health ideas we export to the world are rarely unadulterated scientific facts and never culturally neutral. “Western mental-health discourse introduces core components of Western culture, including a theory of human nature, a definition of personhood, a sense of time and memory and a source of moral authority. None of this is universal,” Derek Summerfield of the Institute of Psychiatry in London observes. He has also written: “The problem is the overall thrust that comes from being at the heart of the one globalizing culture. It is as if one version of human nature is being presented as definitive, and one set of ideas about pain and suffering. . . . There is no one definitive psychology.”

Critique of DSM 5 / No Medical Basis for Diagnoses

Pacific Standard Magazine

The Problem With Psychiatry, the ‘DSM,’ and the Way We Study Mental Illness

by Ethan Watters

Imagine for a moment that the American Psychiatric Association was about to compile a new edition of its Diagnostic and Statistical Manual of Mental Disorders. But instead of 2013, imagine, just for fun, that the year is 1880.

Transported to the world of the late 19th century, the psychiatric body would have virtually no choice but to include hysteria in the pages of its new volume. Women by the tens of thousands, after all, displayed the distinctive signs: convulsive fits, facial tics, spinal irritation, sensitivity to touch, and leg paralysis. Not a doctor in the Western world at the time would have failed to recognize the presentation. “The illness of our age is hysteria,” a French journalist wrote. “Everywhere one rubs elbows with it.”

Hysteria would have had to be included in our hypothetical 1880 DSM for the exact same reasons that attention deficit hyperactivity disorder is included in the just-released DSM-5. The disorder clearly existed in a population and could be reliably distinguished, by experts and clinicians, from other constellations of symptoms.

There were no reliable medical tests to distinguish hysteria from other illnesses then; the same is true of the disorders listed in the DSM-5 today.

“Practically speaking, the criteria by which something is declared a mental illness are virtually the same now as they were over a hundred years ago.”

The DSM determines which mental disorders are worthy of insurance reimbursement, legal standing, and serious discussion in American life.

That its diagnoses are not more scientific is, according to several prominent critics, a scandal.

In a major blow to the APA’s dominance over mental-health diagnoses, Thomas R. Insel, director of the National Institute of Mental Health, recently declared that his organization would no longer rely on the DSM as a guide to funding research. “The weakness is its lack of validity,” he wrote. “Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever.” As an alternative, Insel called for the creation of a new, rival classification system based on genetics, brain imaging, and cognitive science.

This idea — that we might be able to strip away all subjectivity from the diagnosis of mental illness and render psychiatry truly scientific — is intuitively appealing. But there are a couple of problems with it. The first is that the science simply isn’t there yet. A functional neuroscientific understanding of mental suffering is years, perhaps generations, away from our grasp. What are clinicians and patients to do until then? But the second, more telling problem with Insel’s approach lies in its assumption that it is even possible to strip culture from the study of mental illness. Indeed, from where I sit, the trouble with the DSM — both this one and previous editions — is not so much that it is insufficiently grounded in biology, but that it ignores the inescapable relationship between social cues and the shifting manifestations of mental illness.

PSYCHIATRY tends not to learn from its past. With each new generation, psychiatric healers dismiss the enthusiasms of their predecessors by pointing out the unscientific biases and cultural trends on which their theories were based. Looking back at hysteria, we can see now that 19th-century doctors were operating amidst fanciful beliefs about female anatomy, an assumption of feminine weakness, and the Victorian-era weirdness surrounding female sexuality. And good riddance to bad old ideas. But the more important point to take away is this: There is little doubt that the symptoms expressed by those thousands of women were real.

The resounding lesson of the history of mental illness is that psychiatric theories and diagnostic categories shape the symptoms of patients. “As doctors’ own ideas about what constitutes ‘real’ disease change from time to time,” writes the medical historian Edward Shorter, “the symptoms that patients present will change as well.”

This is not to say that psychiatry wantonly creates sick people where there are none, as many critics fear the new DSM-5 will do. Allen Frances — a psychiatrist who, as it happens, was in charge of compiling the previous DSM, the DSM-IV — predicts in his new book, Saving Normal, that the DSM-5 will “mislabel normal people, promote diagnostic inflation, and encourage inappropriate medication use.” Big Pharma, he says, is intent on ironing out all psychological diversity to create a “human monoculture,” and the DSM-5 will facilitate that mission. In Frances’ dystopian post-DSM-5 future, there will be a psychoactive pill for every occasion, a diagnosis for every inconvenient feeling: “Disruptive mood dysregulation disorder” will turn temper tantrums into a mental illness and encourage a broadened use of antipsychotic drugs; new language describing attention deficit disorder that expands the diagnostic focus to adults will prompt a dramatic rise in the prescription of stimulants like Adderall and Ritalin; the removal of the bereavement exclusion from the diagnosis of major depressive disorder will stigmatize the human process of grieving. The list goes on.

In 2005, a large study suggested that 46 percent of Americans will receive a mental-health diagnosis at some point in their lifetimes. Critics like Frances suggest that, with the new categories and loosened criteria in the DSM-5, the percentage of Americans thinking of themselves as mentally ill will rise far above that mark.

But recent history doesn’t support these fears. In 1994 the DSM-IV — the edition Frances oversaw — launched several new diagnostic categories that became hugely popular among clinicians and the public (bipolar II, attention deficit hyperactivity disorder, and social phobia, to name a few), but the number of people receiving a mental-health diagnosis did not go up between 1994 and 2005. In fact, as psychologist Gary Greenberg, author of The Book of Woe, recently pointed out to me, the prevalence of mental health diagnoses actually went down slightly. This suggests that the declarations of the APA don’t have the power to create legions of mentally ill people by fiat, but rather that the number of people who struggle with their own minds stays somewhat constant.

What changes, it seems, is that they get categorized differently depending on the cultural landscape of the moment. Those walking worried who would have accepted the ubiquitous label of “anxiety” in the 1970s would accept the label of depression that rose to prominence in the late 1980s and the 1990s, and many in the same group might today think of themselves as having social anxiety disorder or ADHD.

Viewed over history, mental health symptoms begin to look less like immutable biological facts and more like a kind of language. Someone in need of communicating his or her inchoate psychological pain has a limited vocabulary of symptoms to choose from. From a distance, we can see how the flawed certainties of Victorian-era healers created a sense of inevitability around the symptoms of hysteria. There is no reason to believe that the same isn’t happening today. Healers have theories about how the mind functions and then discover the symptoms that conform to those theories. Because patients usually seek help when they are in need of guidance about the workings of their minds, they are uniquely susceptible to being influenced by the psychiatric certainties of the moment. There is really no getting around this dynamic. Even Insel’s supposedly objective laboratory scientists would, no doubt, inadvertently define which symptoms our troubled minds gravitate toward. The human unconscious is adept at speaking the language of distress that will be understood.

WHY DO PSYCHIATRIC DIAGNOSES fade away only to be replaced by something new? The demise of hysteria may hold a clue. In the early part of the 20th century, the distinctive presentation of the disorder began to blur and then disappear. The symptoms began to lose their punch. In France this was called la petite hysterie. One doctor described patients who would “content themselves with a few gesticulatory movements, with a few spasms.” Hysteria had begun to suffer from a kind of diagnostic overload. By the 1930s or so, the dramatic and unmistakable symptoms of hysteria were vanishing from the cultural landscape because they were no longer recognized as a clear communication of psychological suffering by a new generation of women and their healers.

It is true that the DSM has a great deal of influence in modern America, but it may be more of a scapegoat than a villain. It is certainly not the only force at play in determining which symptoms become culturally salient. As Frances suggests, the marketing efforts of Big Pharma on TV and elsewhere have a huge influence over which diagnoses become fashionable. Some commentators have noted that shifts in diagnostic trends seem uncannily timed to coincide with the term lengths of the patents that pharmaceutical companies hold on drugs. Is it a coincidence that the diagnosis of anxiety diminished as the patents on tranquilizers ran out? Or that the diagnosis of depression rose as drug companies landed new exclusive rights to sell various antidepressants? Consider for a moment that the diagnosis of depression didn’t become popular in Japan until GlaxoSmithKline got approval to market Paxil in the country.

Journalists play a role as well: We love to broadcast new mental-health epidemics. The dramatic rise of bulimia in the United Kingdom neatly coincided with the media frenzy surrounding the rumors and subsequent revelation that Princess Di suffered from the condition. Similarly, an American form of anorexia hit Hong Kong in the mid-1990s just after a wave of local media coverage brought attention to the disorder.

The trick is not to scrub culture from the study of mental illness but to understand how the unconscious takes cues from its social settings. This knowledge won’t make mental illnesses vanish (Americans, for some reason, find it particularly difficult to grasp that mental illnesses are absolutely real and culturally shaped at the same time). But it might discourage healers from leaping from one trendy diagnosis to the next. As things stand, we have little defense against such enthusiasms. “We are always just one blockbuster movie and some weekend therapist’s workshops away from a new fad,” Frances writes. “Look for another epidemic beginning in a decade or two as a new generation of therapists forgets the lessons of the past.” Given all the players stirring these cultural currents, I’d make a sizable bet that we won’t have to wait nearly that long.

Looking Back / Life as an Avalanche


Looking back at my life, the first half was like a compendium of avalanches. I just didn’t know any better. The second half has been ‘living the aftermath’, in a state of shock and awe that I survived. Too true!

Steven Pinker on Male-Female Brain Differences / Important

This is an important presentation of the “problem” of differences between the male and female brain as characterized in Western Civilization. Yes, I have much to say about the specific claims made: Pinker makes the case – a description, actually – for the “status quo” as a cultural phenomenon which is “rooted in” biology. But the biology can be interpreted and “applied” in many ways. Men have traditionally done the interpreting – and mislabeled their opinions as “truth,” which is the wrong word to begin with in a science context.

The problem is that Pinker, as a speaker for the status quo, does not grasp the essential questions. He does not venture outside the Western psychological paradigm, in which “everything human” can be accounted for by the SYSTEM of psychology that has created the Western status quo regarding male and female status. The “division” of all things human into male and female “camps” IGNORES what males and females SHARE as characteristics of Homo sapiens, the species. This intra-species competition is ridiculous! Why would an “intelligent species” divide its wealth of abilities and capabilities into two parts, value one set of those traits and talents (male) as important, but denigrate “the other” set (female) as unimportant? This in itself is idiotic –

A grand accumulation of “studies” does not sum to anything but that – a body of studies which DO NOT QUESTION the assumption that such studies “are interested in truth” to begin with, or represent any serious investigation of male and female contributions to the species as a whole. Our evolution, which has been the product of a “male-female co-operative team,” is cast as an adversarial proposition, in stark contrast to our admiration for the male-female co-operative systems for survival that are evident in many species. Male-female “contact” beyond mere reproduction in Homo sapiens is ignored, in favor of a “male brain” obsession – that of dominance. This too is idiotic –

“Our” view (the male is always assumed to be the species exemplar) of “human truth” is highly unbalanced! And it’s not “our truth as a species” if women, and our female brains, are not included.

More later…

Genes and Autism / A Look Back to 2004

Pediatrics. 2004 May

The genetics of autism.

Muhle R, Trentacoste SV, Rapin I.

Abstract

Autism is a complex, behaviorally defined, static disorder of the immature brain that is of great concern to the practicing pediatrician because of an astonishing 556% reported increase in pediatric prevalence between 1991 and 1997, to a prevalence higher than that of spina bifida, cancer, or Down syndrome. This jump is probably attributable to heightened awareness and changing diagnostic criteria rather than to new environmental influences. Autism is not a disease but a syndrome with multiple non-genetic and genetic causes. By autism (the autistic spectrum disorders [ASDs]), we mean the wide spectrum of developmental disorders characterized by impairments in 3 behavioral domains: 1) social interaction; 2) language, communication, and imaginative play; and 3) range of interests and activities. Autism corresponds in this article to pervasive developmental disorder (PDD) of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition and International Classification of Diseases, Tenth Revision. Except for Rett syndrome–attributable in most affected individuals to mutations of the methyl-CpG-binding protein 2 (MeCP2) gene–the other PDD subtypes (autistic disorder, Asperger disorder, disintegrative disorder, and PDD Not Otherwise Specified [PDD-NOS]) are not linked to any particular genetic or non-genetic cause.

Review of 2 major textbooks on autism and of papers published between 1961 and 2003 yields convincing evidence for multiple interacting genetic factors as the main causative determinants of autism. Epidemiologic studies indicate that environmental factors such as toxic exposures, teratogens, perinatal insults, and prenatal infections such as rubella and cytomegalovirus account for few cases. These studies fail to confirm that immunizations with the measles-mumps-rubella vaccine are responsible for the surge in autism. Epilepsy, the medical condition most highly associated with autism, has equally complex genetic/non-genetic (but mostly unknown) causes. Autism is frequent in tuberous sclerosis complex and fragile X syndrome, but these 2 disorders account for but a small minority of cases. Currently, diagnosable medical conditions, cytogenetic abnormalities, and single-gene defects (eg, tuberous sclerosis complex, fragile X syndrome, and other rare diseases) together account for <10% of cases. There is convincing evidence that “idiopathic” (relating to or denoting any disease or condition that arises spontaneously or for which the cause is unknown) autism is a heritable disorder. Epidemiologic studies report an ASD prevalence of approximately 3 to 6/1000, with a male to female ratio of 3:1. This skewed ratio remains unexplained: despite the contribution of a few well-characterized X-linked disorders, male-to-male transmission in a number of families rules out X-linkage as the prevailing mode of inheritance. The recurrence rate in siblings of affected children is approximately 2% to 8%, much higher than the prevalence rate in the general population but much lower than in single-gene diseases. Twin studies reported 60% concordance for classic autism in monozygotic (MZ) twins versus 0 in dizygotic (DZ) twins, the higher MZ concordance attesting to genetic inheritance as the predominant causative agent. Reevaluation for a broader autistic phenotype that included communication and social disorders increased concordance remarkably, from 60% to 92% in MZ twins and from 0% to 10% in DZ pairs. (Real, or artifact?)
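To make the quoted twin and prevalence figures concrete, here is a back-of-the-envelope sketch in Python (my own illustration, not part of the abstract). It uses Falconer’s classic approximation – broad-sense heritability as roughly twice the difference between MZ and DZ concordance – plus the sibling recurrence risk ratio implied by the cited prevalence and recurrence ranges. Both measures are crude, but they show why these numbers are read as strong evidence of heritability.

```python
# Back-of-the-envelope illustration (not from the paper itself):
# Falconer's approximation estimates heritability as roughly
# 2 * (MZ concordance - DZ concordance). It is a crude proxy --
# concordance is not a correlation -- and it saturates here,
# but it shows why the quoted twin figures point to heritability.

def falconer_h2(mz: float, dz: float) -> float:
    """Rough broad-sense heritability from twin concordances, capped at 1."""
    return min(2 * (mz - dz), 1.0)

print(falconer_h2(0.60, 0.0))   # classic autism: formula caps at 1.0
print(falconer_h2(0.92, 0.10))  # broader phenotype: also caps at 1.0

# Sibling recurrence risk ratio (lambda_s): recurrence in siblings
# divided by population prevalence, using the abstract's ranges.
prevalence = (0.003, 0.006)     # 3 to 6 per 1,000
recurrence = (0.02, 0.08)       # 2% to 8% in siblings
lo = recurrence[0] / prevalence[1]   # ~3x
hi = recurrence[1] / prevalence[0]   # ~27x
print(f"lambda_s roughly {lo:.0f}x to {hi:.0f}x the population rate")
```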

This suggests that interactions between multiple genes cause “idiopathic” autism but that epigenetic factors and exposure to environmental modifiers may contribute to variable expression of autism-related traits. The identity and number of genes involved remain unknown. The wide phenotypic variability of the ASDs likely reflects the interaction of multiple genes within an individual’s genome and the existence of distinct genes and gene combinations among those affected.

There are 3 main approaches to identifying genetic loci, chromosomal regions likely to contain relevant genes: 1) whole genome screens, searching for linkage of autism to shared genetic markers in populations of multiplex families (families with >1 affected family member); 2) cytogenetic studies that may guide molecular studies by pointing to relevant inherited or de novo chromosomal abnormalities in affected individuals and their families; and 3) evaluation of candidate genes known to affect brain development in these significantly linked regions or, alternatively, linkage of candidate genes selected a priori because of their presumptive contribution to the pathogenesis of autism. Data from whole-genome screens in multiplex families suggest interactions of at least 10 genes in the causation of autism. Thus far, a putative speech and language region at 7q31-q33 seems most strongly linked to autism, with linkages to multiple other loci under investigation. Cytogenetic abnormalities at the 15q11-q13 locus are fairly frequent in people with autism, and a “chromosome 15 phenotype” was described in individuals with chromosome 15 duplications. Among other candidate genes are the FOXP2, RAY1/ST7, IMMP2L, and RELN genes at 7q22-q33 and the GABA(A) receptor subunit and UBE3A genes on chromosome 15q11-q13. Variant alleles of the serotonin transporter gene (5-HTT) on 17q11-q12 are more frequent in individuals with autism than in non-autistic populations. In addition, animal models and linkage data from genome screens implicate the oxytocin receptor at 3p25-p26. Most pediatricians will have 1 or more children with this disorder in their practices. They must diagnose ASD expeditiously because early intervention increases its effectiveness. Children with dysmorphic features, congenital anomalies, mental retardation, or family members with developmental disorders are those most likely to benefit from extensive medical testing and genetic consultation.

The yield of testing is much less in high-functioning children with a normal appearance and IQ and moderate social and language impairments. Genetic counseling justifies testing, but until autism genes are identified and their functions are understood, prenatal diagnosis will exist only for the rare cases ascribable to single-gene defects or overt chromosomal abnormalities. Parents who wish to have more children must be told of their increased statistical risk. It is crucial for pediatricians to try to involve families with multiple affected members in formal research projects, as family studies are key to unraveling the causes and pathogenesis of autism. Parents need to understand that they and their affected children are the only available sources for identifying and studying the elusive genes responsible for autism. Future clinically useful insights and potential medications depend on identifying these genes and elucidating the influences of their products on brain development and physiology.

PMID: 15121991

Again, we have the assumption that Asperger’s is a defect in development – a set of vague symptoms that constitute pathology. People who study pathologies will look for pathologies, and may identify a genome as belonging to a “defective” person; therefore, there must be defects in their genome.

I wish for once that someone would set all that aside and compare Asperger individual genomes to other Asperger individuals, without prejudice.

And also look for genetic neoteny in modern social humans, instead of classifying all neurotypicals as “normal.”
