Looking Back on Bipolar / Seasonal Transitions and Disruptions

See also related Circadian Rhythm posts –

I used to be bipolar; diagnosed long ago, before Asperger’s was a recognized “thing” (1994) and before it was considered applicable to females. Looking back, I think this was a mistake – the “bipolar” symptoms I experienced can logically be seen as evidence of “Asperger-ness” as a brain type that processes the environment in a distinct and even radically different way than the overwhelming majority of Modern Social Humans – neurotypicals. One notable problem for me, was and is, a response to seasonal change: lack of sunlight and outdoor activity in winter produces a direct physical effect – extreme restlessness (anxiety) and a longing for the world to “come back” – to revive, to be washed in sunlight and present a landscape wide open to movement. This is not an uncommon condition! The experience can be grossly represented as claustrophobia. Winter is a time that, once adjusted to, can be very productive; a time of internal focus, mental activity and concentration.

The transition into summer, while eagerly embraced, can be disruptive, unsettling, and “mind-blowing.” Where I live, it’s a long process: an interleaving of days of increasing sunlight that fool fragile plant life into attempts to emerge, only to be discouraged by snowstorms and overnight freezes. The energy gained from extended sunlight at high altitude (6,000–7,000 feet) hits a certain point – and suddenly, our tan and brown, heavily dissected desert is GREEN. It’s “shocking” to the eye; strange and brief. The sagebrush steppe is covered in prickly shrubs and myriad bunch grasses, which must reproduce in the short window of mid-May through June, and then pack away their chlorophyll for another year, leaving only yellow leaves and seeds to be dispersed by the famous Wyoming wind. A palette of rich yellows, pale earth, and dusty gray-green returns; a much more interesting landscape for sunlight to change in appearance, from moment to moment, throughout the day and evening. A “light show” transforms our two-part landscape of land and sky – a daily cycle of color and shadow that passes into cool night.

I don’t know if this experience of reality is common to Asperger individuals – that is, the direct influence of the environment on mood, emotion and energy. This responsiveness to the land is not exclusive to Asperger’s types.

This desert has no “social” uses; agriculture is futile. Few people can live here, and without resource extraction for “dollars” and importation of food, even fewer could, or would, stay. There is something extremely luxurious about a landscape that can’t be “socialized” – unitized, divided, owned and exploited by human agriculture, trade, commerce – made useful or productive. There’s something extremely luxurious about a life that grows to fit this type of land. I was made for this place: finding it meant “letting go of things not meant for me.” The Buddha.


Original post about transitioning from summer into winter:

This is my 65th transition from summer into fall. Of course I don’t remember most of these changes. Fall is a bit of a drive-through season; the way we get to winter. It says so on the calendar: First Day of Fall. But for me it’s a long, drawn-out state of confusion, instability, moodiness: doom. What has disrupted my normal, careful, mostly peaceful days? Normal for me: my “writer’s routine” of coffee, computer and coming awake. Sometimes writing is easier while I’m still a bit stupefied by sleep.

Anticipation: that’s my experience of Fall, as if something momentous is about to happen, but it never does. One morning the garden plants have frozen, cells bursting; really physically dead; mush with frost rimming the remains. Light snow that melts quickly, the rocks damp and shiny, their colors deep and revealing.

It’s not that I don’t like winter, but some innocent intuitive organ believes that the earth is dying, and me with it. These experiences are so strong and consistent, year after year, that I’m sure being bipolar has something to do with ancient humans – tropical creatures who pushed too far north for their mental health. People whose brains and bodies were extensions of the seasons: work like mad in Spring and Summer and semi-hibernate in winter. Expend the least energy possible obtaining food and water; curl up like most of nature and sleep, and dream an alternate existence filled with giants, heroes and mortal powers.




Growing out of Autism / Personal Experience

Please read previous post: https://aspergerhuman.wordpress.com/growing-out-of-autism-fact-or-myth/

Most of my experience with “Asperger, the label” is quite recent; before that I was diagnosed as bipolar. Even that diagnosis took decades to “find” – I was 36 years old, and had lived all that time with “mysterious” symptoms that today would most likely be recognized as “a problem,” but which would still most likely be misdiagnosed or under-diagnosed. It wasn’t that “mental illnesses” weren’t recognized; they had been for hundreds of years. The problem came down to a social prejudice: no one who looked like me, or “functioned well” in terms of having a career and supporting myself, COULD HAVE PROBLEMS! No one wanted a pretty, talented, hard-working young woman to be anything but perfect; an object, not a living, breathing and vulnerable human being – this was, and is, shocking. The attitude among many people, including medical-psychiatric people, was that I was a “bad person” for even claiming that I had problems. The summation was pretty much:

“How can you be so selfish? I have patients who are really sick!”

In a way, this was nothing new: the same attitude prevailed when I was growing up and had severe chronic anxiety and related “school problems” – not that I wasn’t “good at schoolwork”; rather, I was “too good” at schoolwork and the target of social bullying. Imagine the complexity for a young girl: being praised for good grades and precocious talents, but then “slammed” for displaying those talents. I was repeatedly told to “hide” my intelligence because it upset other children. (Boys never received this message of self-mutilation.) Those little teacher checkmarks next to “socialization markers” (Does not work well with others) erased every praiseworthy quality I had. It was assumed that because I was “smart,” EVERYTHING was easy for me: I must be intentionally “socially stupid” – a troublemaker with “character flaws.”

There was a slight nod to “the problem of being a gifted child,” in that it was possible I might develop socially if forced to, by being ignored – the hope being that somehow “being an acceptable female” would magically happen at puberty.

What no one recognized – and it’s still a “fact” which normal people ignore – is that all those years of social bullying and attacks that branded a child’s positive attributes as “bad” (if only you’d been born a boy, your intelligence and talent would be acceptable and “good”; just suck it up – being female is a life sentence of servitude to others; it’s all on you to solve YOUR problem, and on and on) further damaged any hope that I would see “being social” as anything but emotional and intellectual suicide. Angry? You bet!

Heaping responsibility for the “breakdown of the social order” on a 7 year-old kid is really outrageous abuse!

Things got better in high school; a very large school with AP classes, which I was placed in. Lo and behold! There were girls like me – and for the first time I had peers who were girls – some very social, others not, and I became close friends with several – one or two who were far “brighter” than me, and I could relax – not stand out as a female freak. I gained much confidence in myself and “blossomed” as a female, but in my own style. The pressure was gone to “stuff myself” into prevailing definitions of “girly”.

It is fortunate that my path to developing as “me” took its own course, with the environment supplying the motivation and feedback for my reactions and choices. It was a “rich” experience that was never all bad or all good; really tough and painful at times, even life-threatening, but essential to seeing life as a complex challenge, the ultimate challenge being: How to be an individual in this monstrous mess that is human society?

What I had to grow out of was a monumental socially-imposed definition of people like me as freaks of nature; as mistakes of birth; as an insult to the “word of God” and a social order that declares “being female” a pathology.

More later….

Biology of Emotional Behavior / Neuroscience Article

Published in Dialogues in Clinical Neuroscience 2002 Sep; 4(3): 231–249.

The biology of fear and anxiety-related behaviors

Thierry Steimer, PhD (55 papers)
From the Abstract:

In a book published in 1878 (Physiologie des passions), Charles Letourneau, who was contemporary with the French neuroanatomist Paul Broca, defined emotions as “passions of a short duration” and described a number of physiological signs and behavioral responses associated with strong emotions.1 Emotions are “intimately linked with organic life,” he said, and either result in an “abnormal excitation of the nervous network,” which induces changes in heart rate and secretions, or interrupt “the normal relationship between the peripheral nervous system and the brain.” Cerebral activity is focused on the source of the emotion; voluntary muscles may become paralyzed and sensory perceptions may be altered, including the feeling of physical pain. (Note that this is a description of a physiological event) This first phase of the emotional response is followed by a reactive phase, where muscles come back into action, but the attention still remains highly focused on the emotional situation.

With the knowledge of brain physiology and anatomy that was available at the end of the 19th century, hypotheses on the mechanisms possibly involved in emotions were of course limited. However, Letourneau assumed that “the strong cerebral excitation” that accompanies emotions probably only concerned “certain groups of conscious cells” in the brain and “must necessitate a considerable increase of blood flow in the cell regions involved.” (Curious – can a cell be “conscious”?)

He also mentioned that the intensity, the expression, and the pathological consequences of emotions were directly linked to “temperaments” (which he defined within the four classic Hippocratic categories). Note that hypotheses and speculation by early investigators are often grandfathered in as theories, by default – and become the guiding “concepts” of contemporary science, often without question. The reverse is also common: “good science” from the past may be dismissed merely on the basis that “new” is better: the myth of inevitable linear progress!

The fact that emotions are “intimately linked with organic life,” his precise description of the sequence of the physiological and behavioral reactions that accompany a strong emotion, such as fear, the idea that emotions involve specific areas of the brain, and the theory (hypothesis, guess) that activation of these areas is associated with an increased blood flow have all been largely confirmed (waffling) by modern neuroscience. The suggestion (mandatory waffling since the following statement isn’t provable by scientific standards) –  that temperament or personality traits influence the “affective style” and vulnerability to psychopathology is also an important aspect of our modern approach to anxiety and mood disorders. Is this a description of a physiological phenomenon or an opinion advanced by Hippocrates?


See post: https://aspergerhuman.wordpress.com/brain-scans-dead-salmon  

Also search my blog: “neuroscience” “brain scans” for multiple related posts


For a long time, emotions were considered to be unique to human beings, and were studied mainly from a philosophical perspective.3 Evolutionary theories and progress in brain and behavioral research, physiology, and psychology have progressively introduced the study of emotions into the field of biology, and understanding the mechanisms, functions, and evolutionary significance of emotional processes is becoming a major goal of modern neuroscience. But! The takeover of human behavior – its definition as “pathology” – by psychology (not a science) is already defeating this revolutionary science-based inquiry.

Three fundamental aspects of emotions

The modern era of emotion research probably started when it became obvious that emotions are not just “feelings” or mental states, but are accompanied by physiological and behavioral changes that are an integral part of them. Technically this is backwards: the physiology of organisms’ reactions to the environment, as produced by evolutionary processes, preceded by billions of years the man-made practice of “naming” those reactions as “emotions” – and claiming that emotion is exclusive to humans. The “exclusivity” idea, that “emotion” is a phenomenon that occurs only in humans, is utterly preposterous. Animals (and all organisms) could not exist without reacting to and interacting with the environment; it’s logically and physically impossible.

The socio-religious belief that our species is a special creation, and that the universe is merely a stage-set for his magnificence, is obnoxious – and despite claims that this narcissistic focus on MAN has been magically removed from the “human sciences,” it obviously has not been.

The “levels” scheme below is not a reformation of prior mistakes, but functions to retain socio-religious “metaphysical” control of human behavior, disguised as the pseudoscience of modern psychology. By piggybacking onto neuroscience, “priestly” power to define and enforce the social stratification of behavioral privilege (at the top of the hierarchy) and rampant inequality is retained by pathologizing group after group of “lesser” humans. Nice trick!!!

This has progressively led to today’s view of emotions being experienced or expressed at three different, but closely interrelated, levels. (Here we go: everything must be split into levels, regardless of how nature – our brain – actually works.) The mental or psychological level (dominated by “approved” socio-religious prescriptions), the (neuro)physiological level (what the brain-body does), and the behavioral level (socio-religious enforcement – social control). These three complementary aspects are present in even the most basic emotions, such as fear.

more at PubMed

Diagnosis Mental Illness / Ancient Greece and Rome


From http://www.cchr.org.uk site dedicated to mental health abuses

The theory behind ECT (electroconvulsive therapy, “electroshock”) hasn’t advanced beyond that of the ancient Greeks, who tried to cure mental problems using convulsive shock created by a drug called hellebore. A revival of convulsive treatment occurred in the early part of the 20th century.

In 1938, Italian psychiatrist Ugo Cerletti developed electroshock treatment for humans, after being inspired by a visit to a slaughterhouse in Rome, where he observed butchers incapacitating pigs with electric shocks before killing them.


Diagnosing Mental Illness in Ancient Greece and Rome

Gods-given hallucinations and suppressing anger for the greater good: How what’s considered “abnormal” has changed.

THE ATLANTIC / Julie Beck Jan 23, 2014

William V. Harris, a professor of history and director of the Center for the Ancient Mediterranean at Columbia University, studies mental illness in the classical world—ancient Rome and Greece. Though the body of knowledge we have at our disposal is still not totally sufficient to understand mental illness today, there’s an added level of difficulty involved in trying to apply today’s knowledge to earlier civilizations. Or in understanding those civilizations’ concepts of mental illness in a time when the gods were thought to be involved in everyday life and hallucinations weren’t something to worry about.

Orestes chased by the Furies: in Ancient Greece, vengeance was often delivered by the Furies as a descent into madness

Many people in antiquity thought that mental disorders came from the gods. The Greek gods are a touchy lot, quick to take offense. For instance, they took a hard line with Orestes after his matricide. [Ed. Note: After killing his mother, Orestes was tormented by the Furies.] And in a world where many important phenomena such as mental illness were not readily explicable, the whims of the gods were the fallback explanation.

Physicians and others fought against this idea from an early date (the 5th century B.C.), giving physiological explanations instead. Many people sought magical/religious remedies—such as going to spend the night in a temple of the healing god Asclepius, in the hope that he would work a cure or tell you how to get cured—[while physicians sought] mainly medical ones. No one thought that it was the duty of the state to care for the insane. Either their families looked after them, or they ended up on the street—a nightmare situation.

In the introduction you wrote to Mental Disorders in the Classical World, you talk about “medicalizing mental illness.” When and why did people start to be seen as sick instead of crazy?

 Some time in the late 5th century B.C., some member of the school of Hippocrates wrote a treatise “On the Sacred Disease,” in which he argued that the “sacred disease,” i.e. epilepsy, was a physiological syndrome, and very soon all doctors and scientists (in so far as such a category existed) came to think that crazy people were sick (but not that they were not crazy). Greek doctors did not distinguish sharply between physical and mental disorders, and they did not have concepts that correspond simply with “depression” or “schizophrenia.” Roberto Lo Presti, in the book we are talking about, examines at length the development of Greek thinking about epilepsy. Greek doctors always tended to think that what we call psychoses were physiological in nature.
How did doctors diagnose the mentally ill back then? What were the criteria they used? And how did they go about treating them?

They were mostly (not entirely) concerned with psychoses (externalizing disorders such as antisocial personality disorder and drug and alcohol use disorders) rather than neuroses (internalizing disorders such as depression and anxiety), and they took into account a full range of hard-to-define symptoms including inappropriate behavior in public, delusions, delirium, and hallucinations. Treatments also covered a whole range, from physical restraint to counseling; they did not make much use of pharmaceuticals.

In the essay you contributed about hallucinations, you mention that in the classical world, people often saw gods and otherworldly things. Was there an evolution of hallucinations from being seen as a supernatural experience to a symptom of something medically wrong?

There was no simple evolution: the Hippocratic doctors already recognized hallucinations as a purely human phenomenon, but many ordinary people went on supposing that the gods were involved.
Does this mean that hallucinations were more commonplace and less stigmatized than today?

No more commonplace, I think. Less stigmatized, yes, somewhat. One would not have sought treatment.

Socrates had hallucinations, right? Did that affect how he was perceived?

Socrates seems to have had recurrent hallucinations of one particular type: a voice spoke to him, usually advising him not to do things. His disciples were in awe of this phenomenon, but some of his later admirers thought they needed to explain it away – they thought it suggested that he was slightly cracked.

One of your older books is about rage—why was anger seen as an illness, or something to be controlled?

It took me about 400 pages to answer this question! Partly because it was seen as dangerous in the state, partly because it was seen as a danger in the family (especially because of slavery), partly later because excessive anger came to be seen as a personal moral failure.

Anger was dangerous to the state above all because it led to political violence, including tyrannical behavior by absolute rulers; dangerous to the family because of its potential to cause feuding and violence (as for slavery, the angry slave-owner could generally treat the slaves as he wished—but they might and did react). The moral idea arises out of these concrete political and social imperatives I think, but it also forms part of the widespread ancient idea that the essence of good behavior is self-control.

Are there difficulties applying today’s conceptions of what is “abnormal” to historical figures? Or vice versa?

There sure are, both ways. The conceptual and moral differences are huge. People have argued that, for example, Herod the Great and Caligula were schizophrenics, but tracing the way they actually behaved is rendered difficult by the inadequate sources [available]. And in the Roman world, a great deal of violence was normal, as was much of what we consider pedophilia. But this makes the work of scholars such as me more interesting as well as more difficult.

Are there any ideas the ancient Greeks or Romans held that would be helpful for us to think about in the discussion surrounding mental illness today?

Yes, as far as neuroses are concerned, see in particular Chris Gill’s contribution to the book I edited, with his emphasis on character. He looks at the idea that we should train our characters so that we are ready for life’s disasters and can face them robustly.



Denial of Social Failure / Mental Illness U.S.A.

The failure to face facts is a nationwide symptom of infantile inability to grasp the importance of the forces that create both our physical and social environments. An utterly fantastic state of denial exists in the minds of Americans as to our mental, physical and social health.



Numbers of Americans Affected by Mental Illness

One in four adults – approximately 61.5 million Americans – experiences mental illness in a given year. One in 17 – about 13.6 million – lives with a serious mental illness such as schizophrenia, major depression or bipolar disorder.

Approximately 20 percent of youth ages 13 to 18 experience severe mental disorders in a given year. For ages 8 to 15, the estimate is 13 percent.

Approximately 1.1 percent of American adults – about 2.4 million people – live with schizophrenia.

Approximately 2.6 percent of American adults – 6.1 million people – live with bipolar disorder.

Approximately 6.7 percent of American adults – about 14.8 million people – live with major depression.

Approximately 18.1 percent of American adults – about 42 million people – live with anxiety disorders, such as panic disorder, obsessive-compulsive disorder (OCD), posttraumatic stress disorder (PTSD), generalized anxiety disorder and phobias.

About 9.2 million adults have co-occurring mental health and addiction disorders.

Approximately 26 percent of homeless adults staying in shelters live with serious mental illness and an estimated 46 percent live with severe mental illness and/or substance use disorders.

Approximately 20 percent of state prisoners and 21 percent of local jail prisoners have “a recent history” of a mental health condition.

Seventy percent of youth in juvenile justice systems have at least one mental health condition and at least 20 percent live with a severe mental illness.

Getting Mental Health Treatment in America

Approximately 60 percent of adults and almost one-half of youth ages 8 to 15 with a mental illness received no mental health services in the previous year.

African American and Hispanic Americans used mental health services at about one-half the rate of whites in the past year and Asian Americans at about one-third the rate.

One-half of all chronic mental illness begins by the age of 14; three-quarters by age 24. Despite (the availability of supposedly) effective treatment, there are long delays – sometimes decades – between the first appearance of symptoms and when people get help.

The Impact of Mental Illness in America

Serious mental illness costs America $193.2 billion in lost earnings per year.

Mood disorders such as depression are the third most common cause of hospitalization in the U.S. for both youth and adults ages 18 to 44.

Individuals living with serious mental illness face an increased risk of having chronic medical conditions. Adults living with serious mental illness die on average 25 years earlier than other Americans, largely due to treatable medical conditions.

Over 50 percent of students with a mental health condition age 14 and older who are served by special education drop out – the highest dropout rate of any disability group.

Suicide is the tenth leading cause of death in the U.S. (more common than homicide) and the third leading cause of death for ages 15 to 24 years. More than 90 percent of those who die by suicide had one or more mental disorders.

Although military members comprise less than 1 percent of the U.S. population, veterans represent 20 percent of suicides nationally.

Each day, about 22 veterans die from suicide.

Maybe they didn’t see this sign:



Aspergers listen to the WRONG kind of music


Please! Spare us any more “complex” music.

WOW! Just another reason why Aspergers “annoy” neurotypicals: we actually enjoy complex music.

• Young kids with Asperger’s may become obsessed with complex topics, such as intricate patterns or music. Toddlers will become enraptured by a stylized pattern on a fabric or in a book. Babies may also listen to music that would typically be ignored by a normal youngster. This obsession becomes more apparent as the youngster ages. These children may be unable to focus on any other aspect of the environment once they notice the object of their obsession.

Behavioral conditioning will be necessary to help alleviate this symptom.

Just think: if psychologists had their way, “genius” could be prevented.


Evening Standard Article by James Rhodes, Classical Pianist (an annoying person who is obsessed with complex music)

It’s one a.m. A nine-year-old boy is fast asleep when his drunken father comes barging into his room. The boy is beaten awake and dragged downstairs to a piano where he is forced to play for his father and drunken friends for hours. A wrong note results in slaps, punches and ridicule. It happens regularly, and even when his father is sober the boy is mocked, beaten and forced to practise until he can barely see straight. Amid this madness, aged 11, rather than starve, he starts to earn a living as an organist. The beatings get so bad that twice, before the age of 13, he almost dies. As a teenager his mother dies, leaving him and his siblings in such dire straits that the boy is forced to go to court and wrest control of his father’s salary so the family can eat.

The adult that emerged from that hell was angry, sullen and suspicious. He was scarred physically and mentally, often suicidal, clumsy, badly coordinated, obtuse, prone to obsessive-compulsive behaviours and lacking both personal hygiene and social graces.

His name was Ludwig van Beethoven.

Ascribing specific dates and composers to different musical movements or eras works for everyone except Beethoven. There is Bach, the master of the Baroque; Haydn and Mozart, the Classical superstars. There are Brahms, Chopin, Berlioz and Liszt, the Romantics. Then there are Bruckner, Mahler and Wagner ushering music into the 20th century, and the Stravinskys and Schoenbergs with their “tyranny of the barline” and “emancipation of the dissonance” causing riots in Paris.

Before Beethoven, composers worked for the glory of God.  Or else they wrote on bended knee for wealthy courts and egotistical patrons. Beethoven kicked down the doors of the aristocratic world and made himself at home. He wrote for himself alone.

He was a superstar in Vienna — universally conceded to be the greatest composer in the world, something almost unheard of in the pre-digital age. And, most importantly, he knew it — “there will always be many princes and emperors but there will only ever be one Beethoven”, he wrote. It’s important here not to dismiss this as cockiness. What some may mistake as arrogance has stood the test of time as an unquenchable truth.

Beethoven is the most performed, revered composer there is. He eclipses every other composer and his shadow falls over every music manuscript in the world.  And if there were even a hint of injustice or hyperbole in that fact I would take issue with it. But the truth is, healthy or not (and I myself don’t hesitate to say healthy), Beethoven somehow achieved musical enlightenment and it is quite simply a fact of life that he is and always will be the benchmark, the prophet and the absolute peak of compositional genius for everyone else to aspire to.

Beethoven humbly transcended ego because he knew beyond doubt that he was writing for eternity. His confidence in his abilities was the only great truth in his life and he held on to it with such tenacity because it kept him alive. “To my art I owe the fact that I did not end my life with suicide,” he wrote. He was totally different to Mozart and Bach — his letters are full of words like artist, art, artistry. His music is the very definition of “interiority” — with Beethoven, music became about feelings, about looking within and expressing things hitherto unsayable.

Bach, Beethoven and Mozart are without question the holy trinity of music. But there is one reason alone that makes Beethoven The One, and it is his humanity. Bach and Mozart had gifts that came straight from God. I’m an unbeliever, but there is simply no other possible explanation for the depth of genius they displayed. What Bach and Mozart did with music is quite literally beyond any human comprehension.

Beethoven, on the other hand, was on his own. Every note was sweated over, every theme worked on tirelessly and chiselled into immortality. The manuscripts of Bach and Mozart look spotless next to the messy, crossed-out, almost indecipherable madness of Beethoven’s. While Mozart hurled symphonies onto paper as fast as he could write, almost without correction, Beethoven stewed and fought and wrestled and argued and raged until he forced what he was looking for out and onto the page.

In 1805 he changed the course of musical history, composing the Eroica symphony: a symphony twice as long as anything that had come before it, written for an orchestra of the future, and the first truly “heroic” piece of music. With one compulsive wrench, music entered the 19th century. His invention and resource never flagged: his Fifth symphony, described by Forster as “the most sublime noise that has ever penetrated into the ear of man,” has an entire structure erected on only four hammer-blow notes. His music, especially that composed during his last 10 years, is unique – nothing like it has been composed, and nothing like it ever will be.

And he was deaf.




Perfection / A Social Trap

Yes; begin where you can.

Begin where you can.

Perfectionism is just a word until one begins thinking about the role it has played in one’s life. As usual, it is an activity which, when fused with social expectations, becomes an object of practical, moral and economic opinion. Perfectionism is not a “thing” but a tool with which to assess standards and compare outcomes, especially in art, literature and other creative endeavors.

Intelligent-creative people, minorities, and the disabled are held to much higher standards of “achievement” than typically-abled humans.

Google “perfectionism” and a highly negative picture appears. Once again, psychology has made a judgment about the PEOPLE who are perfectionists: they are bad, unhappy, trapped in a corner, wasting their lives. We see the “pyramid scheme” poking through: common everyday perfectionists are self-abusers, unhappy, and, paradoxically, creators of failure, but upper-echelon “money-makers” are praised as perfectionists. A start-up company, or an artistic catalogue, once it becomes “trendy” and profitable, is contrarily judged a positive result of perfectionism. Long hours, dedication to a goal, the march of progress and final economic success are added to the unending search for human perfection.

Athletes and immigrants are particularly subject to having their lives rewritten as journeys that fulfill the cultural need for success; rags to riches, the American Dream, unlimited opportunity; the story of those whose early deprivation presented signs of future fame and influence. Perfect performance is always a component of the myth, but the expectation of perfection can be destructive. How many “celebrity” children are crushed by such demands? And the distance between failure and perfection grows and grows in American culture. It is no longer enough to be a “millionaire.” One must be a “billionaire.” One cannot simply post a funny video; it must generate millions of views globally. One cannot have a handful of close friends; one must garner the notice of thousands of strangers. And so, the perfect life is money and attention; not for any good reason, just because notoriety is the new “unreachable” scale of perfection.

We lie to children and torment them with one treacherous statement:

“You can be anything you dream of being,” is a bald-faced lie.

This pompous assertion cuts off actual potential by means of a “mental device” that has become typical in the U.S.: a socially reverse-engineered pop-culture myth, delivered by individuals who have already achieved great success. The accompanying myth of their (supposedly) meteoric rise always includes magical signs that predict greatness – a “lucky” legitimacy and foreshadowing of destiny by a chance meeting with a superstar; an injury that turned out to be a blessing; a lost parent who directs a child’s fate from the afterlife; a sudden supernatural voice, at the right moment, that said, “never give up.” These motivational events happen to almost all humans, but do not produce fame and fortune for the majority. The seed is planted: anything less than extraordinary destiny becomes failure.

Dream big! Achieve little.

The goal of becoming an adult who can find satisfying work, a worthwhile partner and the means to raise a family has fallen to the bottom of the pyramid – even though this “outcome” is the common denominator by which “average people” express the greatest source of happiness. But even this achievement is not permitted: everyone must keep up the appearance of becoming more, and more, and more.

I do think that Asperger individuals have a tricky relationship with “perfection”. Perfection as the act of seeking and creating meaningful work poses no problem, but when our “passion” becomes a “fate” by which we are judged, it becomes a noose that tightens against our “defects”. Expectations placed on “the gifted child” create a problem: our lives have been laid out before us as a burden and an obligation; “gifts” are dangerous in a mediocre society. This is an ancient human theme, in which those who arrive with something “extra” are expected to save everyone’s ass by acts of sacrifice, but are also expected to “disappear” once they are no longer useful.

We see this again and again in young men who are asked to die by old men; soldiers have difficulty in not identifying the two as one and the same: young males must die for the old. Isn’t this upside down? Why isn’t it the old and useless males, who have had their chance at life, who are expected to “volunteer” to “save” young fathers and sons from unwarranted tragedy?

We encounter perfection and want to merge with it, which for me at least, is my subjective experience of “bliss”. Mythologies the world over warn of such improper boundary crossings by humans into the realm of the gods. Countless myths offer up Heroes who are granted “fire stolen from the gods” that costs them everything, but in the long run restores balance to society, which is the real goal of their existence. So, in this philosophy, talents and abilities are not the end in themselves, but means to ends; ends that are available to humans in general when an individual applies his or her abilities toward a realizable goal.

American culture is blind to this deeper and wider actualization of success. In the U.S., only those at the apex of the Pyramid count. The promise is to elevate “the peasants” to the upper levels of the pyramid, but this is logically impossible. The top 1% needs the 99% of humanity at the bottom to fail – and defines failure as “not being” at the top of the hierarchy.  

Aspergers are susceptible to being judged on the basis of success as something elevated beyond “normal”. In the neurotypical scheme of life, a child obsessed with knowledge dares to pass into factual reality, which contains the secrets of the universe; a domain where few socially typical individuals dare to go. This is taboo, because neurotypical predators crave domination: any “successful” neurotypical would use such intelligence to exploit other people. The idea that “Aspergers” have little to no social ambition is simply not credible; in fact it is a source of derision and fear – and an opportunity for social predators.

As a young child I was terribly confused. My intelligence was superficially praised, but harshly received. Intelligence was tested and tracked and presented as important, but forbidden to girls – actualization of “power” was a crime against nature, religion and males of any kind: against all of “defenseless” neurotypical humanity. Ironically, extra abilities and the good fortune of “beauty” could be exploited for family status (marry a rich man, become a “beauty queen”, an actress or celebrity) or to manipulate others behind the scenes to benefit a husband. Selfish ends were quite okay, but a desire to improve a greater sphere of human need was forbidden. To expand knowledge, opinion, laws or the frontiers of human stupidity was, and is, forbidden.

It has taken a lifetime to construct a workable “fix” for myself: Perfection happens. Nature is the domain of perfection and it is informative that nature never rests, but is the continual unfolding of possibilities within a set of laws (boundaries) – a balance of change and continuity that is perfect only in the moment. It’s okay to strive for perfection in creative work, but it’s good to understand that perfection is ephemeral.

Life may be a tool by which the universe acknowledges its own perfection.

However, no human is required to be perfect: What a relief. Nor is any child or adult required to fulfill any expectation that the label “Asperger’s Disorder” attempts to place on them. 



Warning / “Reasonable Person” Social Definitions

Source: The Free Dictionary by Farlex

The “reasonable person standard” makes no allowance for the mentally ill.[24] This archaic (170-year-old) guiding legal principle de facto identifies the “mentally ill” as persons who are “not reasonable” and who therefore belong to a criminal class. Mentally ill persons and “addicts” are jailed rather than treated medically, even when the “crime” is nonviolent, or not a crime at all.

“Reasonable Person”: Legal Definition and Application

A phrase frequently used in tort and Criminal Law to denote a hypothetical person in society who exercises average care, skill, and judgment in conduct and who serves as a comparative standard for determining liability.
The decision as to whether or not an accused person is guilty of a given offense might involve the application of an objective test (Are you kidding?) in which the conduct of the accused is compared to that of a reasonable person under similar circumstances. In most cases, persons with greater than average skills, or with special duties to society, are held to a higher standard of care. For example, a physician who aids a person in distress is held to a higher standard of care than is an ordinary person.

If this doesn’t terrify an Asperger, nothing will!

That is: a hypothetical neurotypical is the model for a reasonable person. An Asperger, as well as a vast array of select “nonconforming” human beings, will be judged in the legal system against neurotypical behavior. So be forewarned…

This archaic “reasonable person” principle de facto identifies people who are not “reasonable” – as defined by cultural, social and psychological prejudices – as belonging to a criminal class. This judgment is not about “taking responsibility for one’s actions”; it is a refusal on the part of the “ruling classes” to recognize their own insincere rationalizations for doing nothing productive about the reality of poverty, addiction and mental illness, and their own cruel belief that they ought to be spared the annoying presence of “defective” individuals.

The social pyramid, again – but this time, the “superior person” is supposedly held to higher standards of behavior! What a joke! Corporations, wealthy individuals, financial criminals and government employees are generally exempt from criminal prosecution and punishment via loopholes in laws (laws which they often “help” to write). Teams of lawyers, a revolving door between public and private sectors (influence) and the “old boy” system of privileged behavior and money decide “the fate” of “superior” people.

Meanwhile, the “rest of the population” is expected to “straighten up” when subjected to “Christian compassion, mercy and charity” in the federal and state prison systems – a horrific American cesspool that is run on brutality and inhumane conduct, and is overseen by psychologists who spout rhetoric while aiding and abetting torture, and who in fact tacitly approve the INCREASE in mental and physical distress of human beings. The common complaint by the “experts” in “human behavior” is that drugs are the answer, but “they” stop taking their medications, “so we have to lock them up.”

Where have we seen this “program model” before? If it works on rats, it will work on humans.

So… just stop being “poor, mentally ill, addicted to drugs, eccentric” and social typicals will be happy!

A video that allows the blatantly befuddled and hypocritical “experts” to expose their lack of comprehension as to the consequences of their own stupidity.

The Home Environment / Critical for ASD kids

A few snips from post after post by the truly enraged mother of a 21-year-old autistic son whom she wants out of her house. There is no doubt that parenting an autistic child is an enormous challenge, but as with any child, the environment is critical to the outcome. Rants about a child who “chooses to be an asshole” are disturbing and shocking. One of the mean things that she says is painfully familiar: my mother accused me of being happy around my friends, but not when with the family. By that time (high school) I was able to say, “Yes. They like me, and they don’t treat me like a stranger.”


Makes one think that so-called institutionalization, if it were done correctly, would be better than a barrage of hate at home.


“There are too many people in this house that you fight with. You fight with everyone! Me. Your brother, the dog and cat. Your father. If you haven’t gotten along with us in almost 20 years, chances are, you won’t start now. You do however get along with friends and peers. You will get along with strangers better than you will EVER get along with your family!”

“You are not happy here. You are happiest when you are away from your family.  You are happy at work. You are happy at church and when you are with your church family. You are happy when you are around your friends, and people your age.  You are happy when you travel and spend time away from us. Why not have more of that?  Truthfully, freedom is happiness! Trust me. I’d give anything to be free and on my own! The time I lived on my own was the best time of my life!” (Wow! Wish you’d never been born.)

“Life struggles make you stronger. Life can’t always be easy, or you will never learn anything. Challenges are opportunities to learn. An easy life teaches you absolutely nothing. Living on your own figuring out your bills and your life will be challenging, but it will definitely lead to you being a strong man and less of a baby!”

“I say this with all the love in my heart. Love that you will see a lot more of, once you move out of my house!”

No More Free Drug Samples? / Dangerous Practice

The dangerous and widespread practice of “experimental” diagnosis by sample drug handout: 1. Try this drug: if it helps, you must be depressed. 2. If it doesn’t help: “Try this other drug; maybe it will help.” 3. Keep adding “new” drugs until the patient is a zombie, suicidal, or in jail; then blame the patient for “getting worse, not better”.

This “Guinea Pig” form of “diagnosis and treatment” is OUTRAGEOUSLY reckless and incompetent. It leads to “diagnosis by drug reaction” and “additive” prescribing – handing out more and more “samples” without discontinuing ineffective or harmful medications. I have experienced this type of pressure to be a “Guinea Pig” on too many occasions, with very bad consequences. In one instance the “psychiatrist” using this lazy “technique” thought it was funny, saying to my face, “I hear you had a bad trip” – laughing as he said it, as though I had taken a psychedelic drug “for fun”. He never acknowledged his careless behavior; he did not apologize or show any “empathy” regarding the traumatic reaction I had suffered.

I called the pharmaceutical company for information (Yes! you can do this) and was told by one of their staff pharmacists, that the drug came with clear instructions that it was NOT to be prescribed or taken by patients with – Guess what? My diagnosis – due to the history of life-threatening side effects. There were ZERO consequences for the psychiatrist who willfully ignored the warning that the pharmaceutical company provided.


PLoS Med. 2009 May; 6(5): e1000074.
Published online 2009 May 12. doi: 10.1371/journal.pmed.1000074 (link to full article)

No More Free Drug Samples?


Everybody likes something free, and free prescription drug samples are no exception. Patients love to receive them, and doctors feel good about handing them out. The practice of providing free drug samples is based on the tacit assumption that “sampling” does much more good than harm. In two separate news releases within the past year by the Pharmaceutical Research and Manufacturers of America (PhRMA), the trade organization that represents the country’s largest and leading drug companies, a senior vice president claimed that free samples improve patient care, foster appropriate medication use, and help millions of financially struggling patients. He averred further that samples benefit physicians by exposing them to new treatment options [1],[2]. In this essay, we question the assumption that good trumps harm when prescription drugs are provided free to practicing doctors. We argue that “sampling” is not effective in improving drug access for the indigent, does not promote rational drug use, and raises the cost of care.

Health Care Costs

Samples are not effective in lowering patients’ costs. Indeed, evidence shows that patients who received free samples had higher out-of-pocket costs than their counterparts who were not given free samples [21]. Samples raise the cost of health care, as companies recoup marketing costs through higher prices and increased sales volume. Samples constitute an enormous promotional outlay of pharmaceutical companies. Between 1996 and 2000, they accounted for slightly more than half of the total promotional dollars spent by industry [22]. Although there is controversy about how best to tally the amount of money the pharmaceutical industry spends on free samples, a recent analysis of 2004 figures sets the retail value of samples at approximately 16 billion US dollars [23]. The retail value of free samples has risen steadily, doubling between 1999 and 2003 [24] (Figure 1). Sample distribution often intensifies during new drug launches, or when a product is withdrawn from the market and competitors scramble to fill the vacuum [25].


Figure 1. Retail value of US samples, in billions of dollars.


Samples raise health care costs by promoting the use of expensive products. In the US, prescription costs grew 5-fold from 1990 to 2006 [26] and are said to be approaching US$200 billion annually [27]. A substantial fraction of the increase is attributed to a growing reliance on expensive, brand-name medications [28] (Figure 2). One analysis several years ago showed that in a single year, the 50 most heavily marketed drugs accounted for nearly half of the increase in retail spending on prescription drugs (the other 9,850 drugs made up the remaining sum) [29]. These are the very products patients are most likely to receive as samples.


Figure 2. Factors driving growth in drug spending, 1993–2003.