Asperger Perception / Poetry, Emotion, Sensation

My particular perception of reality is concrete, which means I use my senses a lot, with little to no “distance” separating “who I am” from my sensory information. That is, I don’t “supernaturalize” information. I don’t use words to create social scaffolds of “meaning” that lead farther and farther away from understanding the environment. Meaning for me is the experience of being alive. Sometimes that experience is painful, precarious and uncertain, but “certain” is exactly what existence is not. Social humans have all manner of delusions that help them feel safe.

I wrote poems (not exactly my thing) when I first moved to Wyoming. Looking back, I can see that it was a way to become familiar with a magnificent new landscape that I already seemed to know, because the sensations it produced were similar to sensations that began when I was a child and found familiarity, contentment, and beauty in particular images – real or artistic. In a way, it was as if I were completing a picture of my reality, as a child who did not belong where she was supposed to belong, and found it necessary to “jump ship” and swim until she either drowned or survived.

 

A Cold Evening

The screen door is open to the full moon

and to telling-of-snow clouds that push across its white face

and to a rip saw wind that breaks our few trees,

and the empty sound of a can that rattles along the street. 

Comfort

Barren town, barren land, sand in my boots, a cobble in my hand:

Fifty million years is a short trip into earth’s history,

But it’s a nice distance from humanity.


Dry Land

Wheel tracks cut the flanks of yellow hills,

as if the lowly sagebrush and bitter creosote end somewhere,

and a person could drive that far.

Glass

Years vanish as if I had been sliding across glass instead of living:

my fingernails have left skid marks on time.

It was a happier life that I knew before this year, and yet,

I managed to digest my father’s death

and to end a cold war with my brother, who pampers an ancient grudge

like a Russian who aches to launch one last missile

for old time’s sake.

Limits

Beyond the simple life my capacities are wanting:

I know my house, the rocks and flowers in the garden,

and listen for the dogs’ overblown defense of our mundane perimeter.

I was made for this place, and insist on the right to set my limits low

but my horizons high.

Refresh Icon

The arrant light of daybreak restores our desert province to precious clarity,

but as the world turns, the gray chaparral and yellow cliffs bloom white hot

and the banded hills become a picture that is overexposed and uninviting.

The reward for our endurance comes at twilight, when nature’s products,

and man’s efforts as well, are suffused with the crimson wavelengths

of the sun’s farewell. Until tomorrow then; the earth’s rotation is our refresh icon.

October

Sagebrush stains the air a chemical color,

with a smell that is dark red and orange and concentrated, like piss.

A lone receptor in my nose approves of the bitter stink and passes it to my brain,

where it connects to all Octobers.


Pretty

Small tasks become precious ways to gather time:

a thread, when pulled, makes emptiness into a pretty ruffle.

On sunny days, when isolated clouds throw shadows

across the land, it may be said to be pretty, as a plain woman

may be pretty if she takes pride in her plain face.

Unjust Summer

Summer to the third power –

no clouds, no shade, no escape

from the blue spaces, from the yard

burning hot and red and pink,

as if the chimes and lawn chairs

and a whirl-a-gig were featured

in Home & Garden: MARS. 


Zinc, Lithium, Manganese, Magnesium, Iron and your Brain / SciAm

SA Mind

Metals and Mental Health

Deficiencies in zinc can play a role in depression and a new way to enhance lithium may hold promise for bipolar disorder

  • By Tori Rodriguez on September 1, 2015

An Elemental Effect on Mental Health
Zinc, copper, iron—these and many other elements play a crucial role in health and sickness. Beyond the well-known toxic effects of lead, it can be difficult to determine the precise impacts of these metals because they interact with one another and with many types of molecules found in our bodies. Recent research has yielded some key insights, however, which may lead to new treatments for mental illnesses.


Linking Zinc to Depression
Depression is tricky to treat because many patients do not respond to antidepressant medications. A growing body of evidence suggests that zinc deficiency may be a factor underlying depression in some cases—and zinc supplements can be an effective treatment for people whose levels are low.

A meta-analysis published in December 2013 in Biological Psychiatry analyzed 17 studies and found that depressed people tended to have about 14 percent less zinc in their blood than most people do on average, and the deficiency was greater among those with more severe depression. In the brain, zinc is concentrated in glutamatergic neurons, which increase brain activity and play a role in neuroplasticity, explains one of the paper’s co-authors, Krista L. Lanctôt, a professor of psychiatry and pharmacology at the University of Toronto. “Those neurons feed into the mood and cognition circuitry,” she says.

Newer results increasingly point to a causal relation. Last September researchers at the University of Newcastle in Australia reported findings of two longitudinal studies that demonstrated an inverse relation between depression risk and dietary zinc intake. After adjusting for all known potential confounders, they found that the odds of developing depression among men and women with the highest zinc intake were about 30 to 50 percent lower than among those with the lowest intake. Although previous studies have shown that zinc supplementation can augment the effects of antidepressant medications, research published in May in Nutritional Neuroscience is the first to investigate the effects of zinc alone on depressive symptoms. In the double-blind, randomized, placebo-controlled trial, researchers assigned participants to one of two groups: every day for 12 weeks, one group received 30 milligrams of zinc; the other group received a placebo. At the end of the study period, the zinc group showed a steeper decline in its scores on a rigorous inventory of depression symptoms.

“The future treatment of depression is zinc sulfate,” says Atish Prakash, a postdoctoral fellow in the department of pharmacy at the MARA University of Technology in Malaysia, who co-authored a thorough review of studies on the role of zinc in brain disorders, published in April in Fundamental and Clinical Pharmacology. Researchers strongly caution against people trying zinc supplements on their own, however—when levels are too high, zinc can cause other complications. Working with a doctor is essential, and in most cases, eating a healthier diet is probably a better way to ensure optimal zinc levels than supplementation. Yet for those with depression who are also at high risk for zinc deficiency, including vegetarians, people with alcoholism, gastrointestinal issues or diabetes, and pregnant or lactating women, zinc may be just what the doctor ordered.


Improving Lithium Treatment

Lithium has been providing relief to patients with bipolar disorder for decades. Although it is considered the standard treatment for the illness, how it works—and why it does not work for at least half of patients who try it—remains largely a mystery. Recent study findings suggest that a hormonal mechanism may be a factor.

In research published in July in the Journal of Molecular Neuroscience, scientists from several universities expanded on earlier work investigating the role of insulinlike growth factor 1 (IGF1) in lithium sensitivity. A 2013 paper by some of the authors of the newer study had found higher levels of the hormone in blood cells of bipolar patients who were responsive to lithium treatment, as compared with nonresponders. In the current study, researchers tested the effects of administering IGF1 to the blood cells of those same patients.

Adding the hormone increased lithium sensitivity only in cells of nonresponders, which “proves that indeed IGF1 is strongly implicated in determining clinical response or resistance to lithium,” says study co-author Elena Milanesi, a postdoctoral fellow at the Sackler Faculty of Medicine at Tel Aviv University in Israel. Further research will be needed to discern treatment possibilities, including supplemental use of the hormone or a similarly acting drug in lithium-resistant patients. Synthetic human IGF1 is already FDA-approved for human use in other kinds of disorders, Milanesi says, so she hopes clinical trials can get under way quickly.


Other Metals and the Mind
IRON. Iron deficiency impedes neurotransmission and cell metabolism, and research findings have linked it with cognitive deficits in children and adults.

MAGNESIUM. Low magnesium intake has been implicated in anxiety and depression in studies of humans and rodents, and new research published in Acta Neuropsychiatrica suggests the relation is mediated by altered gut microbes, which have previously been linked with depression. In the study, mice fed a magnesium-deficient diet displayed an increase in depressive behavior and alterations in gut microbiota that were positively associated with neuroinflammation in the hippocampus.


MANGANESE. In research reported in the Journal of Alzheimer’s Disease, scientists from China and Japan investigated the role of manganese—a known neurotoxin at high levels—in the progression of cognitive decline. In 40 older adults, they found that manganese levels were significantly correlated with scores on assessments of cognitive function and dementia and that levels of the characteristic protein tangles of Alzheimer’s disease increased as manganese levels did. Excessive manganese is usually caused by airborne pollutants or pesticides, but eating too little iron can increase manganese absorption—so a healthy diet is key here, too.


Beware of Supplements
That headline may sound alarmist—if your doctor advises you to take a supplement, by all means, you should take it. Yet we cannot emphasize enough the importance of consulting a health care provider before starting any kind of supplement regimen, especially one that includes the trace elements discussed in this overview. Many of these elements can cause serious complications at high levels as well as low levels, and it is easy to accidentally go overboard. In addition, it can be hard to tell whether a person truly needs supplements—zinc, for example, cannot be reliably measured in blood or urine. Researchers use a complex variety of measurements and indicators to determine patients’ zinc levels—something the average doctor’s office cannot replicate. (Supplements are NOT regulated and may or may not contain the ingredients on the label or may have contaminants.)

In addition, most researchers and physicians believe that improving a person’s diet is a far better way to reach healthy levels of these elements. Eating whole foods such as fresh meats, vegetables, fruits, nuts and seeds will give most people the nutrients they need. Avoiding highly processed foods with added sugars and fats is key, too, because those types of foods can impede your body’s absorption of nutrients. In other words, that spinach salad is actually rendered less healthy if you chase it with a candy bar.


Mental Illness / Media and Mass Shootings

This is a long article: Go to: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC43182861

U.S. National Library of Medicine, National Institutes of Health

Mental Illness, Mass Shootings, and the Politics of American Firearms

Jonathan M. Metzl, MD, PhD and Kenneth T. MacLeish, PhD

Jonathan M. Metzl is with the Center for Medicine, Health, and Society and the Departments of Sociology and Psychiatry, Vanderbilt University, Nashville, TN. Kenneth T. MacLeish is with the Center for Medicine, Health, and Society and the Department of Anthropology, Vanderbilt University.

Only in the 1960s and 1970s did US society begin to link schizophrenia with violence and guns. Psychiatric journals suddenly described patients whose illness was marked by criminality and aggression. Federal Bureau of Investigation (FBI) most-wanted lists in leading newspapers described gun-toting “schizophrenic killers” on the loose,76 and Hollywood films similarly showed angry schizophrenics who rioted and attacked.77

Historical analysis14,78 suggests that this transformation resulted, not from increasingly violent actions perpetrated by “the mentally ill,” but from diagnostic frame shifts that incorporated violent behavior into official psychiatric definitions of mental illness. Before the 1960s, official psychiatric discourse defined schizophrenia as a psychological “reaction” to a splitting of the basic functions of personality. Descriptors emphasized the generally calm nature of such persons in ways that encouraged associations with poets or middle-class housewives.79 But in 1968, the second edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM)80 recast paranoid schizophrenia as a condition of “hostility,” “aggression,” and projected anger, and included text explaining that, “the patient’s attitude is frequently hostile and aggressive, and his behavior tends to be consistent with his delusions.”80(p34-36)

A somewhat similar story can be told about posttraumatic stress disorder (PTSD), another illness frequently associated with gun violence.15 From the mid-19th century through World War II, military leaders and doctors assumed that combat-related stress afflicted neurotic or cowardly soldiers. In the wake of the Vietnam War, the DSM-III recast PTSD as a normal mind’s response to exceptional events. Yet even as the image of the traumatized soldier evolved from sick and cowardly to sympathetic victim, PTSD increasingly became associated with violent behavior in the public imagination, and the stereotype of the “crazy vet” emerged as a result. In the present day, even news coverage drawing attention to veterans’ suffering frequently makes its point by linking posttraumatic stress with violent crime, despite the paucity of data linking PTSD diagnosis with violence and criminality.38,81

Evolutions such as these not only imbued the mentally ill with an imagined potential for violence, but also encouraged psychiatrists and the general public to define violent acts as symptomatic of mental illness. As the following section suggests, the diagnostic evolution of schizophrenia additionally positioned psychiatric discourse as authoritative, not just on clinical “conditions” linking guns with mental illness, but on political, social, and racial ones as well.

WOW! A dangerous granting of authority to psychiatrists (and indeed to psychology and the social sciences), and further evidence that the “caring, fixing, helping” industry has taken on vast power to define individual “destinies” since the 1960s. Most Americans have no awareness that this shift in dominant authority has occurred, or of how negatively this philosophy of pan-human dysfunction has eroded the American quality of life.

_______________________________________________________

Whatever behavior is disapproved becomes a “symptom” of mental illness. That symptom can be attached to “acting black” or any other chosen origin of behavior.

Psychiatric racism: in the 1960s-70s, Black activism was promoted as a mental illness that could be controlled by the application of Haldol.

In a 1969 essay titled “The Protest Psychosis,” psychiatrists postulated that the growing racial disharmony in the US at the height of the Civil Rights Movement was a manifestation of psychotic behaviors and delusions afflicting America’s black lower class. “Paranoid delusions that one is being constantly victimized” resulted in black male anger and misplaced desire to overthrow the establishment.

 

Memoir / Impossible for an Asperger?

A photo that tells me of a time when my inner and outer experiences were undivided.

 

Rethinking a Life

A few thoughts on writing memoir:

How much talent is lost because society doesn’t like the package it comes in? The individual is a rare creature: she is self-made and not designated by a political system. A political statement of rights does not make one an individual: those rights are defined and held in check by the society that grants them. As a young woman who wanted to achieve financial equality and independence, I was told that the ticket to “getting in” to the system was to adopt the very structure that denied opportunity to women. I was also horrified to learn that I was expected to drop my gender at the door, as well as my personality, values, individual potential, and most surprisingly, talents that might benefit an employer. Another shock: I learned that this defacement is what men have been required to accept for centuries.

Individuality is a function of personal qualities that are cast against the vast historical canvas of culture, and in many ways the individual exists in opposition to that picture. Identity is a package prepared by generations of ancestors, as well as the living family, long before a child is born. Father and mother shape a child’s beliefs, behavior, and future. The larger society sets the rules of membership, which can be extremely harsh. The individual is born and dies when each of us is assigned a role dictated by ideas, prescriptions, and absolutes that the individual has no part in creating. The result is that when one looks into a mirror, a shadow feeling haunts the body: that is not any face; it is my face, unique in all the world. Why then, do I not know myself?

To write a memoir is to tear oneself loose from social conformity and to declare that one’s life is not the same as any other life, regardless of how similar human lives are. A modern trend in autobiography has led people to think that the writer has amazing secrets to reveal, and that he or she will do just that; why else would one write a book about a life that has yet to be concluded? A memoir is expected to erase the public person, to replace the mask with a livelier, racier, and more interesting person. Family members and close friends are expected to be shocked by revelations, and will claim that they did not know of secret individual choices on the part of someone they thought they knew. The public loves it when a willful individual goes bad. Confession and repentance, in the form of a serial memoir (the trek from talk show to talk show) or a best-selling book, return the stray reprobate to the group. In this sense, a memoir is a religious document.

What then, is an individual? The test is simple: Only an individual can care about the welfare of other individuals. The group, by definition, cannot. The group survives by enforcing conformity and does not recognize that each person has a valid interior life, only that inner lives are suspect. The group pressures and grooms the individual to vanish into a pre-assigned role: the American idea of an individual is someone who utterly conforms to social norms, but does something like skydive with a pet dog, or paint the bedroom orange, or pick a cartoon graphic for a credit card that claims, “I’m whacky! I’m crazy! I’m creative.”

From the point of view of a person with a mental illness, this is ludicrous: normal people have no idea what crazy is. Any deviation from the suffocating religious-patriotic complex of American belief, including what have been called criminal acts, is increasingly regarded as mental deviation and not diversity. To be diagnosed as having a brain disorder automatically puts one outside of society, forcing one to embrace a strange, dangerous, and unsought individuality. The memoir of such a person cannot be separated from this predicament, which can be described as the terror and the mystery of the conventional.

A social view of life as a program to be fulfilled, and only completed at death.
“No one ought to be said to be happy, until death and the last funeral rites.” Ovid

 

Looking Back on Bipolar / Seasonal Transitions and Disruptions

See also related Circadian Rhythm posts –

I used to be bipolar; diagnosed long ago, before Asperger’s was a recognized “thing” (1994), and when it was not considered applicable to females. Looking back, I think this was a mistake – the “bipolar” symptoms I experienced can logically be seen as evidence for “Asperger-ness” as a brain type that processes the environment in a distinct and even radically different way than the overwhelming majority of Modern Social Humans – neurotypicals. One notable problem for me was, and is, a response to seasonal change; lack of sunlight and outdoor activity in winter produces a direct physical effect: extreme restlessness (anxiety) and a longing for the world to “come back” – to revive, to be washed in sunlight and present a landscape wide open to movement. This is not an uncommon condition for many people! The experience can be grossly represented as claustrophobia. Winter is a time that, once adjusted to, can be very productive; a time of internal focus, mental activity and concentration.

The transition into summer, while eagerly embraced, can be disruptive, unsettling, and “mind-blowing.” Where I live, it’s a long process: an interleaving of days of increasing sunlight that fool fragile plant life into attempts to emerge, but which are discouraged by snow storms and overnight freezes. The energy gained by extended sunlight at high altitude (6,000-7,000 feet) hits a certain point – and suddenly, our tan and brown, heavily dissected desert is GREEN. It’s “shocking” to the eye; strange and brief. The sagebrush steppe is covered in prickly shrubs and myriad bunch grasses, which must reproduce in the short window of mid-May through June, and then pack away their chlorophyll for another year, leaving only yellow leaves and seeds to be dispersed by the famous Wyoming wind. A palette of rich yellows, pale earth, and dusty gray-green returns; a much more interesting landscape for sunlight to change in appearance, from moment to moment, throughout the day and evening. A “light show” transforms our two-part landscape of land and sky – a daily cycle of color and shadow that passes into cool night.

I don’t know if this experience of reality is common to Asperger individuals – that is, the direct influence of the environment on mood, emotion and energy. This responsiveness to the land is not exclusive to Asperger’s types.

This desert has no “social” uses; agriculture is futile. Few people can live here, and without resource extraction for “dollars” and importation of food, even fewer could, or would stay. There is something extremely luxurious about a landscape that can’t be “socialized” – unitized, divided, owned and exploited by human agriculture, trade, commerce – made useful or productive. There’s something extremely luxurious about a life that grows to fit this type of land. I was made for this place: finding it meant “letting go of things not meant for me” – the Buddha.

_________________________________________________________________

Original post about transitioning from summer into winter:

This is my 65th transition from summer into fall. Of course I don’t remember most of these changes. Fall is a bit of a drive-through season; the way we get to winter. It says so on the calendar: First Day of Fall, but for me it’s a long drawn out state of confusion, instability, moodiness: doom. What has disrupted my normal, careful, mostly peaceful days? Normal for me: my “writer’s routine” of coffee, computer and coming awake. Sometimes writing is easier while I’m still a bit stupefied by sleep.

Anticipation: that’s my experience of Fall, as if something momentous is about to happen, but it never does. One morning the garden plants have frozen, cells bursting; really physically dead; mush with frost rimming the remains. Light snow that melts quickly, the rocks damp and shiny, their colors deep and revealing.

It’s not that I don’t like winter, but some innocent intuitive organ believes that the earth is dying, and me with it. These experiences are so strong and consistent year after year that I’m sure that being bipolar has something to do with ancient humans – tropical creatures who pushed too far north for their mental health. People whose brains and bodies were extensions of the seasons: work like mad in spring and summer and semi-hibernate in winter. Expend the least energy possible obtaining food and water; curl up like most of nature and sleep and dream an alternate existence filled with giants, heroes and mortal powers.

 

Growing out of Autism / Personal Experience

Please read previous post: https://aspergerhuman.wordpress.com/growing-out-of-autism-fact-or-myth/

Most of my experience with “Asperger, the label” is quite recent; before that I was diagnosed as bipolar. Even that diagnosis took decades to “find” – I was 36 years old, and had lived all that time with “mysterious” symptoms that today would most likely be recognized as “a problem,” but which would still most likely be misdiagnosed or under-diagnosed. It wasn’t that “mental illnesses” weren’t recognized; they had been for hundreds of years. The problem came down to a social prejudice: no one who looked like me, or “functioned well” in terms of having a career and supporting myself, COULD HAVE PROBLEMS! No one wanted a pretty, talented, hard-working young woman to be anything but perfect; an object, not a living, breathing and vulnerable human being – this was, and is, shocking. The attitude among many people, including medical-psychiatric people, was that I was a “bad person” for even claiming that I had problems. The summation was pretty much:

“How can you be so selfish? I have patients who are really sick!”

In a way, this was nothing new: the same attitude prevailed when I was growing up, and had severe chronic anxiety and related “school problems” – not that I wasn’t “good at schoolwork”; rather, I was “too good” at schoolwork and the target of social bullying. Imagine the complexity for a young girl: being praised for good grades and precocious talents, but then “slammed” for displaying those talents. I was repeatedly told to “hide” my intelligence because it upset other children. (Boys never received this message of self-mutilation.) Those little teacher checkmarks next to “socialization markers” (Does not work well with others) erased every praiseworthy quality I had. It was assumed that because I was “smart,” EVERYTHING was easy for me: I must be intentionally “socially stupid” – a troublemaker with “character flaws.”

There was a slight nod to “the problem of being a gifted child,” in that it was possible that I might develop socially, if forced to by being ignored; the hope being that somehow “being an acceptable female” would magically happen at puberty.

What no one recognized, and it’s still a “fact” which normal people ignore, is that all those years of social bullying and attacks on a child’s positive attributes as “bad” – if only you’d been born a boy, your intelligence and talent would be acceptable and “good”; just suck it up – being female is a life sentence of servitude to others; it’s all on you to solve YOUR problem, and on and on – further damaged any hope that I would see “being social” as anything but emotional and intellectual suicide. Angry? You bet!

Heaping responsibility for the “breakdown of the social order” on a 7 year-old kid is really outrageous abuse!

Things got better in high school – a very large school with AP classes, which I was placed in. Lo and behold! There were girls like me – and for the first time I had peers who were girls – some very social, others not, and I became close friends with several – one or two who were far “brighter” than me, and I could relax – not stand out as a female freak. I gained much confidence in myself and “blossomed” as a female, but in my own style. The pressure was gone to “stuff myself” into prevailing definitions of “girly.”

It is fortunate that my path to developing as “me” took its own course, with the environment supplying the motivation and feedback for my reactions and choices. It was a “rich” experience that was never all bad or all good; really tough and painful at times, even life-threatening, but essential to seeing life as a complex challenge, the ultimate challenge being: How to be an individual in this monstrous mess that is human society?

What I had to grow out of was a monumental socially-imposed definition of people like me as freaks of nature; as mistakes of birth; as an insult to the “word of God” and a social order that declares “being female” a pathology.

More later….

Biology of Emotional Behavior / Neuroscience Article

Published in Dialogues in Clinical Neuroscience 2002 Sep; 4(3): 231–249.

The biology of fear and anxiety-related behaviors

Thierry Steimer, PhD (55 papers)
From the Abstract:

In a book published in 1878 (Physiologie des passions), Charles Letourneau, a contemporary of the French neuroanatomist Paul Broca, defined emotions as “passions of a short duration” and described a number of physiological signs and behavioral responses associated with strong emotions.1 Emotions are “intimately linked with organic life,” he said, and either result in an “abnormal excitation of the nervous network,” which induces changes in heart rate and secretions, or interrupt “the normal relationship between the peripheral nervous system and the brain.” Cerebral activity is focused on the source of the emotion; voluntary muscles may become paralyzed and sensory perceptions may be altered, including the feeling of physical pain. (Note that this is a description of a physiological event.) This first phase of the emotional response is followed by a reactive phase, where muscles come back into action, but the attention still remains highly focused on the emotional situation.

With the knowledge of brain physiology and anatomy that was available at the end of the 19th century, hypotheses on the mechanisms possibly involved in emotions were of course limited. However, Letourneau assumed that “the strong cerebral excitation” that accompanies emotions probably only concerned “certain groups of conscious cells” in the brain and “must necessitate a considerable increase of blood flow in the cell regions involved.” (Curious – can a cell be “conscious”?)

He also mentioned that the intensity, the expression, and the pathological consequences of emotions were directly linked to “temperaments” (which he defined within the four classic Hippocratic categories). Note that hypotheses and speculation by early investigators are often grandfathered in as theories, by default – and become the guiding “concepts” of contemporary science, often without question. The reverse is also common: “good science” from the past may be dismissed, merely on the basis that “new” is better: the myth of inevitable linear progress!

The fact that emotions are “intimately linked with organic life,” his precise description of the sequence of the physiological and behavioral reactions that accompany a strong emotion, such as fear, the idea that emotions involve specific areas of the brain, and the theory (hypothesis, guess) that activation of these areas is associated with an increased blood flow have all been largely confirmed (waffling) by modern neuroscience. The suggestion (mandatory waffling since the following statement isn’t provable by scientific standards) –  that temperament or personality traits influence the “affective style” and vulnerability to psychopathology is also an important aspect of our modern approach to anxiety and mood disorders. Is this a description of a physiological phenomenon or an opinion advanced by Hippocrates?

_____________________________

See post: https://aspergerhuman.wordpress.com/brain-scans-dead-salmon  

Also search my blog: “neuroscience” “brain scans” for multiple related posts

___________________________________________________ 

For a long time, emotions were considered to be unique to human beings, and were studied mainly from a philosophical perspective.3 Evolutionary theories and progress in brain and behavioral research, physiology, and psychology have progressively introduced the study of emotions into the field of biology, and understanding the mechanisms, functions, and evolutionary significance of emotional processes is becoming a major goal of modern neuroscience. But! The takeover of human behavior, its definition as “pathology” by psychology (not a science), is already defeating this revolutionary science-based inquiry.

Three fundamental aspects of emotions

The modern era of emotion research probably started when it became obvious that emotions are not just “feelings” or mental states, but are accompanied by physiological and behavioral changes that are an integral part of them. Technically this is backwards: the physiology of organisms’ reactions to the environment, as produced by evolutionary processes, preceded by billions of years the manmade practice of “naming” those reactions as “emotions” – and claiming that emotion is exclusive to humans. The “exclusivity” idea that “emotion” is a phenomenon that occurs only in humans is utterly preposterous. Animals (and all organisms) could not exist without reacting to and interacting with the environment; it’s logically and physically impossible.

The socio-religious belief that our species is a special creation, and that the universe is merely a stage-set for man’s magnificence, is obnoxious – and despite claims that this narcissistic focus on MAN has been magically removed from the “human sciences,” it obviously has not been.

The “levels” scheme below is not a reformation of prior mistakes, but functions to retain socio-religious “metaphysical” control of human behavior, disguised as the pseudoscience of modern psychology. By piggy-backing onto neuroscience, “priestly” power to define and enforce social stratification (behavioral privilege at the top of the hierarchy, and rampant inequality) is retained by pathologizing group after group of “lesser” humans. Nice trick!!!

This has progressively led to today’s view of emotions being experienced or expressed at three different, but closely interrelated levels: Here we go: everything must be split into levels, regardless of how nature – our brain – actually works. The mental or psychological level (dominated by “approved” socio-religious prescriptions), the (neuro)physiological level (what the brain-body does), and the behavioral level (socio-religious enforcement – social control). These three complementary aspects are present in even the most basic emotions, such as fear.

more at PubMed

Diagnosis Mental Illness / Ancient Greece and Rome


From http://www.cchr.org.uk, a site dedicated to mental health abuses

The theory behind ECT (electroshock therapy) hasn’t advanced beyond that of the ancient Greeks, who tried to cure mental problems using convulsive shock created by a drug called hellebore. A revival of ECT occurred in the early part of the 20th century.

In 1938, Italian psychiatrist Ugo Cerletti developed electroshock treatment for humans, after being inspired by a visit to a slaughterhouse in Rome to observe butchers incapacitating pigs with electric shocks before killing them.

_______________________________________________

Diagnosing Mental Illness in Ancient Greece and Rome

Gods-given hallucinations and suppressing anger for the greater good: How what’s considered “abnormal” has changed.

THE ATLANTIC / Julie Beck Jan 23, 2014

William V. Harris, a professor of history and director of the Center for the Ancient Mediterranean at Columbia University, studies mental illness in the classical world—ancient Rome and Greece. Though the body of knowledge we have at our disposal is still not totally sufficient to understand mental illness today, there’s an added level of difficulty involved in trying to apply today’s knowledge to earlier civilizations. Or in understanding those civilizations’ concepts of mental illness in a time when the gods were thought to be involved in everyday life and hallucinations weren’t something to worry about.

Orestes chased by the Furies: in Ancient Greece, vengeance was often delivered by the Furies as a descent into madness

Many people in antiquity thought that mental disorders came from the gods. The Greek gods are a touchy lot, quick to take offense. For instance, they took a hard line with Orestes after his matricide. [Ed. Note: After killing his mother, Orestes was tormented by the Furies.] And in a world where many important phenomena such as mental illness were not readily explicable, the whims of the gods were the fallback explanation.

Physicians and others fought against this idea from an early date (the 5th century B.C.), giving physiological explanations instead. Many people sought magical/religious remedies—such as going to spend the night in a temple of the healing god Asclepius, in the hope that he would work a cure or tell you how to get cured—[while physicians sought] mainly medical ones. No one thought that it was the duty of the state to care for the insane. Either their families looked after them, or they ended up on the street—a nightmare situation.

In the introduction you wrote to Mental Disorders in the Classical World, you talk about “medicalizing mental illness.” When and why did people start to be seen as sick instead of crazy?

Some time in the late 5th century B.C., some member of the school of Hippocrates wrote a treatise “On the Sacred Disease,” in which he argued that the “sacred disease,” i.e. epilepsy, was a physiological syndrome, and very soon all doctors and scientists (in so far as such a category existed) came to think that crazy people were sick (but not that they were not crazy). Greek doctors did not distinguish sharply between physical and mental disorders, and they did not have concepts that correspond simply with “depression” or “schizophrenia.” Roberto Lo Presti, in the book we are talking about, examines at length the development of Greek thinking about epilepsy. Greek doctors always tended to think that what we call psychoses were physiological in nature.

How did doctors diagnose the mentally ill back then? What were the criteria they used? And how did they go about treating them?

They were mostly (not entirely) concerned with psychoses (externalizing disorders such as antisocial personality disorder and drug and alcohol use disorders) rather than neuroses (internalizing disorders such as depression and anxiety), and they took into account a full range of hard-to-define symptoms including inappropriate behavior in public, delusions, delirium, and hallucinations. Treatments also covered a whole range from physical restraint to counseling; they did not make much use of pharmaceuticals.

In the essay you contributed about hallucinations, you mention that in the classical world, people often saw gods and otherworldly things. Was there an evolution of hallucinations from being seen as a supernatural experience to being seen as a symptom of something medically wrong?

There was no simple evolution: the Hippocratic doctors already recognized hallucinations as a purely human phenomenon, but many ordinary people went on supposing that the gods were involved.
Does this mean that hallucinations were more commonplace and less stigmatized than today?

No more commonplace, I think. Less stigmatized, yes, somewhat. One would not have sought treatment.

Socrates had hallucinations, right? Did that affect how he was perceived?

Socrates seems to have had recurrent hallucinations of one particular type: A voice spoke to him, usually advising him not to do things. His disciples were in awe of this phenomenon, but some of his later admirers thought they needed to explain it away—they thought it suggested that he was slightly cracked.

One of your older books is about rage—why was anger seen as an illness, or something to be controlled?

It took me about 400 pages to answer this question! Partly because it was seen as dangerous in the state, partly because it was seen as a danger in the family (especially because of slavery), partly later because excessive anger came to be seen as a personal moral failure.

Anger was dangerous to the state above all because it led to political violence, including tyrannical behavior by absolute rulers; dangerous to the family because of its potential to cause feuding and violence (as for slavery, the angry slave-owner could generally treat the slaves as he wished—but they might and did react). The moral idea arises out of these concrete political and social imperatives I think, but it also forms part of the widespread ancient idea that the essence of good behavior is self-control.

Are there difficulties applying today’s conceptions of what is “abnormal” to historical figures? Or vice versa?

There sure are, both ways. The conceptual and moral differences are huge. People have argued that, for example, Herod the Great and Caligula were schizophrenics, but tracing the way they actually behaved is rendered difficult by the inadequate sources [available]. And in the Roman world, a great deal of violence was normal, as was much of what we consider pedophilia. But this makes the work of scholars such as me more interesting as well as more difficult.

Are there any ideas the ancient Greeks or Romans held that would be helpful for us to think about in the discussion surrounding mental illness today?

Yes, as far as neuroses are concerned, see in particular Chris Gill’s contribution to the book I edited, with his emphasis on character. He looks at the idea that we should train our characters so that we are ready for life’s disasters and can face them robustly.

http://www.theatlantic.com/health/archive/2014/01/diagnosing-mental-illness-in-ancient-greece-and-rome/282856/

 

Denial of Social Failure / Mental Illness U.S.A.

The failure to face facts is a nationwide symptom of infantile inability to grasp the importance of the forces that create both our physical and social environments. An utterly fantastic state of denial exists in the minds of Americans as to our mental, physical and social health.


NAMI

Numbers of Americans Affected by Mental Illness

One in four adults—approximately 61.5 million Americans—experiences mental illness in a given year. One in 17—about 13.6 million—lives with a serious mental illness such as schizophrenia, major depression or bipolar disorder.

Approximately 20 percent of youth ages 13 to 18 experience severe mental disorders in a given year. For ages 8 to 15, the estimate is 13 percent.

Approximately 1.1 percent of American adults—about 2.4 million people—live with schizophrenia.

Approximately 2.6 percent of American adults—6.1 million people—live with bipolar disorder.

Approximately 6.7 percent of American adults—about 14.8 million people—live with major depression.

Approximately 18.1 percent of American adults—about 42 million people—live with anxiety disorders, such as panic disorder, obsessive-compulsive disorder (OCD), posttraumatic stress disorder (PTSD), generalized anxiety disorder and phobias.

About 9.2 million adults have co-occurring mental health and addiction disorders.

Approximately 26 percent of homeless adults staying in shelters live with serious mental illness and an estimated 46 percent live with severe mental illness and/or substance use disorders.

Approximately 20 percent of state prisoners and 21 percent of local jail prisoners have “a recent history” of a mental health condition.

Seventy percent of youth in juvenile justice systems have at least one mental health condition and at least 20 percent live with a severe mental illness.

Getting Mental Health Treatment in America

Approximately 60 percent of adults and almost one-half of youth ages 8 to 15 with a mental illness received no mental health services in the previous year.

African American and Hispanic Americans used mental health services at about one-half the rate of whites in the past year and Asian Americans at about one-third the rate.

One-half of all chronic mental illness begins by the age of 14; three-quarters by age 24. Despite (the availability of supposedly) effective treatment, there are long delays−sometimes decades−between the first appearance of symptoms and when people get help.

The Impact of Mental Illness in America

Serious mental illness costs America $193.2 billion in lost earnings per year.

Mood disorders such as depression are the third most common cause of hospitalization in the U.S. for both youth and adults ages 18 to 44.

Individuals living with serious mental illness face an increased risk of having chronic medical conditions. Adults living with serious mental illness die on average 25 years earlier than other Americans, largely due to treatable medical conditions.

Over 50 percent of students with a mental health condition age 14 and older who are served by special education drop out—the highest dropout rate of any disability group.

Suicide is the tenth leading cause of death in the U.S. (more common than homicide) and the third leading cause of death for ages 15 to 24 years. More than 90 percent of those who die by suicide had one or more mental disorders.

Although military members comprise less than 1 percent of the U.S. population, veterans represent 20 percent of suicides nationally.

Each day, about 22 veterans die from suicide.

Maybe they didn’t see this sign:


 

Aspergers listen to the WRONG kind of music


Please! Spare us any more “complex” music.

WOW! Just another reason why Asperger’s “annoy” neurotypicals: we actually enjoy complex music.

• Young kids with Asperger’s may become obsessed with complex topics, such as intricate patterns or music. Toddlers will become enraptured by a stylized pattern on a fabric or in a book. Babies may also listen to music that would typically be ignored by a normal youngster. This obsession becomes more apparent as the youngster ages. These children may be unable to focus on any other aspect of the environment once they notice the object of their obsession.

Behavioral conditioning will be necessary to help alleviate this symptom.

Just think: if psychologists had their way, “genius” could be prevented.

__________________________________________________________________________________

Evening Standard Article by James Rhodes, Classical Pianist (an annoying person who is obsessed with complex music)

It’s one a.m. A nine-year-old boy is fast asleep when his drunken father comes barging into his room. The boy is beaten awake and dragged downstairs to a piano where he is forced to play for his father and drunken friends for hours. A wrong note results in slaps, punches and ridicule. It happens regularly, and even when his father is sober the boy is mocked, beaten and forced to practise until he can barely see straight. Amid this madness, aged 11, rather than starve, he starts to earn a living as an organist. The beatings get so bad that twice, before the age of 13, he almost dies. As a teenager his mother dies, leaving him and his siblings in such dire straits that the boy is forced to go to court and wrest control of his father’s salary so the family can eat.

The adult that emerged from that hell was angry, sullen and suspicious. He was scarred physically and mentally, often suicidal, clumsy, badly coordinated, obtuse, prone to obsessive-compulsive behaviours and lacking both personal hygiene and social graces.

His name was Ludwig van Beethoven.

Ascribing specific dates and composers to different musical movements or eras works for everyone except Beethoven. There is Bach, the master of the Baroque; Haydn and Mozart, the Classical superstars. There are Brahms, Chopin, Berlioz and Liszt, the Romantics. Then there are Bruckner, Mahler and Wagner ushering music into the 20th century, and the Stravinskys and Schoenbergs with their “tyranny of the barline” and “emancipation of the dissonance” causing riots in Paris.

Before Beethoven, composers worked for the glory of God. Or else they wrote on bended knee for wealthy courts and egotistical patrons. Beethoven kicked down the doors of the aristocratic world and made himself at home. He wrote for himself alone.

He was a superstar in Vienna — universally conceded to be the greatest composer in the world, something almost unheard of in the pre-digital age. And, most importantly, he knew it — “there will always be many princes and emperors but there will only ever be one Beethoven”, he wrote. It’s important here not to dismiss this as cockiness. What some may mistake as arrogance has stood the test of time as an unquenchable truth.

Beethoven is the most performed, revered composer there is. He eclipses every other composer and his shadow falls over every music manuscript in the world.  And if there were even a hint of injustice or hyperbole in that fact I would take issue with it. But the truth is, healthy or not (and I myself don’t hesitate to say healthy), Beethoven somehow achieved musical enlightenment and it is quite simply a fact of life that he is and always will be the benchmark, the prophet and the absolute peak of compositional genius for everyone else to aspire to.

Beethoven humbly transcended ego because he knew beyond doubt that he was writing for eternity. His confidence in his abilities was the only great truth in his life and he held on to it with such tenacity because it kept him alive. “To my art I owe the fact that I did not end my life with suicide,” he wrote. He was totally different to Mozart and Bach — his letters are full of words like artist, art, artistry. His music is the very definition of “interiority” — with Beethoven, music became about feelings, about looking within and expressing things hitherto unsayable.

Bach, Beethoven and Mozart are without question the holy trinity of music. But there is one reason alone that makes Beethoven The One, and it is his humanity. Bach and Mozart had gifts that came straight from God. I’m an unbeliever, but there is simply no other possible explanation for the depth of genius they displayed. What Bach and Mozart did with music is quite literally beyond any human comprehension.

Beethoven, on the other hand, was on his own. Every note was sweated over, every theme worked on tirelessly and chiselled into immortality. The manuscripts of Bach and Mozart look spotless next to the messy, crossed-out, almost indecipherable madness of Beethoven’s. While Mozart hurled symphonies on to paper as fast as he could write, almost without correction, Beethoven stewed and fought and wrestled and argued and raged until he forced what he was looking for out and onto the page.

In 1805 he changed the course of musical history, composing the Eroica symphony: a symphony twice as long as anything that had come before it, written for an orchestra of the future, and the first truly “heroic” piece of music. With one compulsive wrench, music entered the 19th century. His invention and resource never flagged — his Fifth symphony, described by Forster as “the most sublime noise that has ever penetrated into the ear of man,” has an entire structure erected on only four hammer-blow notes. His music, especially that composed during his last 10 years, is unique — nothing like it has been composed, and nothing ever will be.

And he was deaf.

_________________________________