Light Skin and Lactose / Recent Adaptations to Cereal Diet

IFL Science

Why Do Europeans Have White Skin?

April 6, 2015 | by Stephen Luntz (shortened to get to the point)

The 1000 Genomes Project is comparing the genomes of modern individuals from specific regions in Europe with 83 samples taken from seven ancient European cultures. Harvard University’s Dr. Iain Mathieson has identified five features which spread through Europe, indicating a strong selection advantage.

At the annual conference of the American Association of Physical Anthropologists, Mathieson said his team distinguished, “between traits that have changed consistently with population turnovers, traits that have changed apparently neutrally, and traits that have changed dramatically due to recent natural selection.”

… most people of European descent are lactose tolerant, to the extent that milk products not only form a major source of nutrition but are a defining feature of European cultures…that the capacity to digest lactose as an adult appeared in the population after the development of farming. Two waves of farmers settled Europe 7,800 and 4,800 years ago, but it was only 500 years later that the gene for lactose tolerance became widespread.

…hunter-gatherers in what is now Spain, Luxembourg and Hungary had dark-skinned versions of the two genes most strongly associated with skin color. The oldest pale versions of the SLC24A5 and SLC45A2 genes that Mathieson found were at Motala in southern Sweden 7,700 years ago. The gene associated with blue eyes and blond hair was found in bodies from the same site. H/T ScienceMag.



From: Civilization Fanatics Forum

Debunking the theory that lighter skin gradually arose in Europeans nearly 40,000 years ago, new research has revealed that it evolved recently – only 7,000 years ago.

People in tropical to subtropical parts of the world manufacture vitamin D in their skin as a result of UV exposure. At northern latitudes, dark skin would have reduced the production of vitamin D. If people weren’t getting much vitamin D in their diet, then selection for pre-existing mutations for lighter skin (less pigment) would “sweep” through the farming population.

New scientific findings show that prehistoric European hunter-gatherers were dark-skinned, but ate vitamin D-rich meat, fish, mushrooms and fruits. With the switch to agriculture, the amount of vitamin D in the diet decreased – and resulted in selection for pale skin among European farmers.
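The “sweep” described above can be illustrated with a toy population-genetics model. The numbers here – a 5 percent fitness advantage and 25-year generations – are my own illustrative guesses, not figures from the study:

```python
# Toy model of a selective sweep: carriers of the favored (pale-skin) allele
# have relative fitness 1 + s; its frequency p then changes each generation as
#   p' = p(1 + s) / (p(1 + s) + (1 - p))

def next_freq(p, s):
    """Allele frequency after one generation of directional selection."""
    return p * (1 + s) / (p * (1 + s) + (1 - p))

def generations_to_reach(p0, target, s):
    """Generations for the allele to rise from frequency p0 to target."""
    p, gens = p0, 0
    while p < target:
        p = next_freq(p, s)
        gens += 1
    return gens

# A rare variant (1 percent) with a hypothetical 5 percent advantage:
gens = generations_to_reach(0.01, 0.9, 0.05)
print(gens, "generations, or about", gens * 25, "years")  # 140 generations, ~3500 years
```

Even with a sizable 5 percent advantage, a rare allele needs on the order of 140 generations – roughly 3,500 years – to become common, which is why a rapid spread implies unusually strong selection.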

Findings detailed today (Jan. 26, 2014) in the journal Nature, “also hint that light skin evolved not to adjust to the lower-light conditions in Europe compared with Africa, but instead to the new diet that emerged after the agricultural revolution”, said study co-author Carles Lalueza-Fox, a paleogenomics researcher at Pompeu Fabra University in Spain.

The finding implies that for most of their evolutionary history, Europeans were not what people today would call ‘Caucasian’, said Guido Barbujani, president of the Associazione Genetica Italiana in Ferrara, Italy, who was not involved in the study.





Who’s Safe With a Gun? Don’t Ask a Shrink

The Daily Beast, May 2013 / Background Checks


Forget any guidance from psychiatry’s bible, the DSM-5, when it comes to background checks for gun buyers, writes the psychotherapist author of The Book of Woe. (Gary Greenberg)

Many years ago, a man I was seeing in therapy decided he wanted to take up a new hobby: high explosives. The state he lived in licensed purchasers of dynamite and other incendiaries only after a background check. He wanted to know: Would I write a letter declaring him fit to blow up stuff in his backyard for fun?

Aside from the fact that this was how he wanted to pass the weekend, I didn’t have any reason to think otherwise, so I gave him the note. He got the license. A few years after he stopped seeing me, I had occasion to visit him at his office. He had all his digits and limbs and, to my knowledge, had committed no antisocial acts with his legally obtained explosives. My note attesting to his mental health was framed on his wall.

I’ve been thinking about this guy recently, ever since our politicians’ imaginations have fastened upon background checks as the solution to our gun problems. I’ve also been thinking about a couple of other patients. One of them, a middle-aged professional, a ramrod-straight retired Marine, father of a little girl, faithful husband, the kind of man who buys a special lockbox just for transporting his weapon between home and gun club. The other: a 27-year-old hothead, an absentee father who never met a drug or a woman he didn’t like. His idea of fun was riding his motorcycle between lanes on the interstate at 100 mph, and he was the proud owner of (by his count) 37 guns. In the three years prior to arriving at my office, he’d been fired from four jobs, arrested for six or seven driving offenses and a few drug charges, and helped to bury three of his friends who met untimely and violent ends.

No one asked me which of these two men I’d rather was a gun owner, let alone which one ought to have a firearms license. But I know what my answer would have been. Or I would have known until about a year ago, when the ex-Marine, inexplicably and without warning (although he’d just been put on an antidepressant as part of a treatment for chronic pain), sat at the base of the tree holding his favorite deer perch and shot himself in the mouth. Meantime, the hothead has cooled down. He’s been with the same woman for two years and the same job for one. He sees his son faithfully twice a week. He’s sold his motorcycle and more than half of his guns, and become obsessed with bodybuilding and responsibility. The transformation is not complete—he’s still dead certain the government wants to come to his house and confiscate what’s left of his arsenal, for instance—and I can’t take too much credit for it. He’s pursuing the pleasures of self-control with the same manic intensity as he once chased adrenaline. But I’m not all that worried about his guns anymore, and I’m really glad no one asked me if he should have them.

Because one thing they don’t teach you in therapy school: how to tell the future. Clinicians can assemble a story out of the ashes of a person’s life; we might even be able to spot what we think are the seeds of catastrophe, but we generally do that best in retrospect. And that’s why, if one of us insists he or she knows for sure what’s coming next, you should find another therapist. It’s also why, to the extent that background checks involve people like me, they wouldn’t do much more than reassure politicians that they are doing something about gun violence without simultaneously threatening their National Rifle Association ratings.

But wait a minute, you may be saying. Don’t mental-health workers have a whole huge book of diagnoses to turn to that can help you assess a person’s fitness to own a gun? No, we don’t. We have the book, of course, the Diagnostic and Statistical Manual of Mental Disorders, which is about to come out in its fifth edition. But while some of those disorders seem incompatible with responsible gun ownership, even a diagnosis of a severe mental illness like schizophrenia or bipolar disorder isn’t a good predictor of who is going to become violent. Indeed, only about 4 percent of violent crimes are committed by mentally ill people. We are not going to diagnose our way to safety.

There’s a reason for this. A diagnosis of a mental disorder is only a description of a person’s troubles. A neurologist presented with a patient suffering loss of coordination and muscle weakness can run tests and diagnose amyotrophic lateral sclerosis or a brain tumor. They can explain the symptoms and predict with some accuracy what will happen as the disease takes its expected course. The 200 or so diagnoses in the DSM, on the other hand, explain little and predict less. Until the book contains a diagnosis called Mass Slaughter Disorder, whose criteria would include having committed mass slaughter, it’s not going to offer much guidance on the subject, and, obviously, what guidance it provides is going to come too late.

With the mentally disordered, as with all of us (and let’s remember that in any given year, something like 30 percent of us will meet criteria for a mental disorder, and 11 percent of us are on antidepressants right now), there is no telling what will happen next. No matter how many diagnoses are in the DSM, and no matter how astutely they are used, they will not tell us in whose hands guns are safe. The psyche is more unfathomable, and evil more wily, than any doctor or any book.




Guess Who’s Taking Over Mental Health Care? / Your smartphone

From NIH, National Institute of Mental Health

Director’s Blog: Look who is getting into mental health research

By Thomas Insel on August 31, 2015

In the U.S., biomedical research funding has been estimated at $117 billion, following a rough law of thirds: about one-third government (mostly NIH), slightly more than one-third pharmaceutical companies, and the remainder a mix of biotech, foundations, and philanthropy.1 Support for research on mental disorders looks a little different. As pharmaceutical companies invest less in this area, government (especially NIMH) has become a larger fraction of the funding pool. Now there is a surprising new player in mental health research that is just beginning to emerge from the private sector. This summer I was invited to visit Apple, Google (now Alphabet), IBM, and Intel.
Why are technology companies inviting the NIMH director to visit? At first, I assumed their interest was purely financial. It’s hardly surprising that companies with large cash reserves have discovered a trillion dollar market (health care is now approaching 20 percent of the U.S. gross domestic product and mental health care is a significant part of those costs).2 But I quickly discovered two other factors that are driving tech companies into biomedical and mental health research.
One is big data. As genomics, imaging, and large health care studies generate terabytes of data daily, companies that know how to extract knowledge from data have become essential partners for progress towards new diagnostics and therapeutics. The data analytics from tech companies are becoming part of the engine of biomedical research. The other is the promise of technology to change health care, shifting it from episodic to continuous, from reactive to proactive, from physician-centered to patient-centered. (No longer in the system only when ill, the patient delivers profit 24/7, from conception to death.)

Even beyond wearable devices and online cognitive training, technology can offer information and interventions where and when someone needs it. Tech companies are realizing that mental health is, in their parlance, an excellent “use case.” Just as important, online health care (especially mental health care) creates data that can serve to improve quality, including monitoring the fidelity of psychotherapy.3 In the future, when we think of the private sector and health research, we may be thinking of Apple and IBM more than Lilly and Pfizer.

Coming to a future near you: Individuals will be branded as potentially “socially aberrant, mentally ill, psychotic etc.” or having something diagnosable, and will be tracked 24/7 under the “lie” of preventative services.
Here are two fascinating previews of this new world I noted during my travels last week.

One was the publication of results from a collaboration between Columbia University and IBM.4 The team, led by Gillinder Bedi and Cheryl Corcoran, was looking for a biomarker to predict which clinically high-risk youth would convert to psychosis over a two- to three-year follow-up period after an initial interview. Rather than depend on a protein in blood or a brain scan, they used an innovative big data approach to analyze the speech from the initial interview. The approach, developed by Guillermo Cecchi at IBM, maps semantic coherence and speech complexity as a window into the earliest stages of disorganized thought. While analyses of clinical features have yielded, at best, 80 percent predictive accuracy, this automated analysis of unstructured speech was reported to be 100 percent accurate in identifying who would convert to psychosis during the follow-up period. This is a small study (34 participants, 5 of whom developed psychosis), but it serves as a preview of what we might see as the power of technology is applied to provide objective measures of behavior and cognition.

If this doesn’t scare you about the abuse of technology, what will?
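The idea of “semantic coherence” is simple to sketch: measure how similar each sentence is to the one that follows it. The published method used vector representations of word meaning derived from large text corpora; this bag-of-words toy (my own illustration, not IBM’s code) shows only the principle:

```python
# Toy "semantic coherence" score: average similarity of consecutive sentences.
# Disorganized, tangential speech shares little vocabulary from one sentence
# to the next, so it scores lower than organized speech.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def coherence(sentences):
    """Mean similarity between each sentence and the next one."""
    vecs = [Counter(s.lower().split()) for s in sentences]
    sims = [cosine(vecs[i], vecs[i + 1]) for i in range(len(vecs) - 1)]
    return sum(sims) / len(sims)

organized = ["the dog chased the ball", "the dog caught the ball", "then the dog slept"]
tangential = ["the dog chased the ball", "my uncle sells cars", "rain is made of glass"]
print(coherence(organized) > coherence(tangential))  # True: organized speech scores higher
```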

Also last week I visited an internet start-up housed on the nineteenth floor of an office building in San Francisco. The founders, Anmol Madan and Karan Singh, took me on a quick tour of their smartphone app, which tracks mood and anxiety to “deliver support to the right people at the right time.” Already used by thousands of people and adopted by several research groups, the app looks at everything from sleep and activity to social interaction and self-report to quantify mood. Their approach has enormous potential not only for research on mood and anxiety but for the development of interventions that can be deployed globally. While the app has mostly been used to link patients to providers, imagine a future when the app will empower patients with tools to become their own providers.

My summer tour of tech companies, large and small, left me with one unexpected conclusion. While the focus of wearable technology and online apps has thus far mostly been on managing heart disease and diabetes, the tech approach may be best suited for mental health. The biomarkers for depression and psychosis and post-traumatic stress disorder are likely to be objective measures of cognition and behavior, which can be collected by smartphones. Some of our most effective interventions are psychosocial treatments that can be delivered or extended by smartphones and tablets. Most important, the sensors and the interventions can be integrated into a closed loop so that care is continuous and iterative. Increasing symptoms, suicidal impulses, and paranoid thoughts lead immediately to an intervention. (BY WHOM? – The police? That’s who “provides” much of America’s mental health services, including throwing MH individuals in jail and depriving them of medication or care.) Population-based studies have shown that less than half of people with mental illness seek care. And workforce studies have shown that 55 percent of counties have no mental health care provider. Technology is not the answer to all problems, but it may help those with mental illness even more than those with other chronic, serious medical conditions.

Depression? / A Social Epidemic

The topic of Depression has been showing up quite a bit on sites that I frequent, and I realized that I don’t actually know much about Clinical Depression. I don’t find the description below to be very specific or medical. It’s self-diagnosis, isn’t it? You’re depressed if you think you are – the symptoms and criteria are offered to the patient to choose from, not an objective process. Five of these symptoms (why 5?) have to last for two weeks (why two weeks?). What if it’s not two weeks, but 13 days? Are you then not depressed? This seems a very short duration, given what people with depression say – that it’s chronic.

There is an admonishment used to restrain this type of “bogus” quantification: only count things that can be counted. “Making up” numbers (like 5 symptoms, 2 weeks) does not change the arbitrary social basis of diagnosis; useless quantification does not make a process “science”.

Call me a picky Asperger, but what is the cause? Clinical Depression, if it’s real, must have cause(s).

Why bother with a charade of diagnosis? Just have people show up, say, “I’m depressed,” and dish out the prescriptions.  


From Mayo Clinic Online:

What does the term “clinical depression” mean?

Answers from Daniel K. Hall-Flavin, M.D.

Depression ranges in seriousness from mild, temporary episodes of sadness to severe, persistent depression. Clinical depression is the more severe form of depression, also known as major depression or major depressive disorder. It isn’t the same as depression caused by a loss, such as the death of a loved one, or a medical condition, such as a thyroid disorder.

It’s easy to see that although CD is defined as having no environmental / medical cause, most of the symptoms CAN originate in economic and social facts – the social stress that the individual encounters in everyday existence. This stress is out of control, because it’s BUILT INTO the system.

To be diagnosed with clinical depression, you must meet the symptom criteria for major depressive disorder in the Diagnostic and Statistical Manual of Mental Disorders (DSM), published by the American Psychiatric Association. This manual is used by mental health providers to diagnose mental conditions and by insurance companies to reimburse for treatment. Are insurance companies co-writing the DSM – that is, practicing medicine without a license? How do pharmaceutical companies influence who gets a diagnosis and which medication is prescribed? The fact is, insurance industry “representatives” do “contribute” to what appears in the DSM. Do we have a case of the diagnosis creating the condition, the treatment and the profit?

For clinical depression, you must have five or more of the following symptoms over a two-week period, most of the day, nearly every day. At least one of the symptoms must be either a depressed mood or a loss of interest or pleasure. Signs and symptoms may include:

Depressed mood, such as feeling sad, empty or tearful (in children and teens, depressed mood can appear as constant irritability) (What teenager isn’t sad, empty, tearful or irritable at times?)

Significantly reduced interest or feeling no pleasure in all or most activities (Isn’t that the reality of people at the bottom of the American Social Pyramid in the 21st C.? Drug addiction, violence, poverty and crime would likely both indicate and arise from depression.)

Significant weight loss when not dieting, weight gain, or decrease or increase in appetite (in children, failure to gain weight as expected) (Wow! That covers just about anyone!)

Insomnia or increased desire to sleep (Here we go again – any behavior on either side of an imaginary “normal.” In the U.S. we are bombarded daily by the message that even “normal” people have a sleep disorder. I’m not downplaying the absolute need for quality sleep. But how does one get adequate sleep while working more than one job, just to survive?)

Either restlessness or slowed behavior that can be observed by others (hearsay evidence; subjective.)

Fatigue or loss of energy (subjective; millions of Americans are exhausted by the stress and insecurity of chaotic social demands)

Feelings of worthlessness, or excessive or inappropriate guilt (socially induced symptoms)

Trouble making decisions, or trouble thinking or concentrating (Wow! I keep hoping for objective, provable symptoms, but I guess they don’t exist!)

Recurrent thoughts of death or suicide, or a suicide attempt (Look no farther than people who have been discarded by society: ex-military, the homeless, Native American young people, and the elderly.)

Your symptoms must be severe enough to cause noticeable problems in relationships with others or in day-to-day activities, such as work, school or social activities. Symptoms may be based on your own feelings or on the observations of someone else. (Wow! How scientific is that? It’s clear that clinical depression is a SOCIAL DIAGNOSIS, created by stressful conditions built into the social environment. Unhealthy social conditions of poverty, violence, financial distress, broken families, tyrannical bosses and demeaning workplace conditions do create physical changes and disease in the human animal, but “pills” simply mask the pain; they offer no cure for a toxic society that values profit over people. It is “medico-pharma” greed that has created massive opioid addiction in the U.S.)

Clinical depression can affect people of any age, including children. However, clinical depression symptoms, even if severe, usually improve with psychological counseling, antidepressant medications or a combination of the two. (Vague, relative, subjective – no money-back guarantee!) It’s all about $$$$$$$$.
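The Mayo criteria above really are a countable checklist – which is the point being made here. A toy encoding (symptom labels are my paraphrases of the excerpt, purely illustrative) makes the mechanical, arbitrary nature of the thresholds visible:

```python
# The quoted criteria as a literal checklist: five or more symptoms, at least
# one of them "core", for 14 or more days. Labels are paraphrases, not
# official DSM wording.

CORE = {"depressed_mood", "loss_of_interest"}
OTHER = {"weight_change", "sleep_change", "psychomotor_change",
         "fatigue", "worthlessness", "poor_concentration", "thoughts_of_death"}

def meets_criteria(symptoms, days):
    """Five or more listed symptoms, at least one core, for 14+ days."""
    s = set(symptoms) & (CORE | OTHER)
    return len(s) >= 5 and bool(s & CORE) and days >= 14

case = {"depressed_mood", "fatigue", "sleep_change", "weight_change", "poor_concentration"}
print(meets_criteria(case, 14))  # True
print(meets_criteria(case, 13))  # False: the same person, one day short
```

Change 14 to 13 and the same person is, by definition, no longer clinically depressed.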


It is important to understand that not only do we horribly abuse animals, we are animals, and our modern social environments are the equivalent of zoos, circuses and research labs. And we wonder why human beings are depressed and ill? An outrageous number of American males (notably black) are being confined for years, and many for life, for nonviolent offenses, in conditions considered cruel in zoos and circuses, where animals also become seriously disturbed / depressed.



Under pressure from a state court, California is building a psychiatric care unit at San Quentin prison in order to provide long-term mental health care for death row inmates. If you think about it, it's slightly ironic.

Under pressure from a state court, California is building a psychiatric care unit at San Quentin prison in order to provide long-term mental health care for death row inmates. If you think about it,  it’s cruel, insane and socially typical thinking.




Asperger Smarts / How not to deal with a predator

We’ve witnessed this scenario too many times :


Gray, 25, was taken into custody April 12 after police “made eye contact” with him and another man in an area known for drug activity, police said. Gray was handcuffed and put in a transport van. At some point during his roughly 30-minute ride, the van was stopped and Gray’s legs were shackled when an officer felt he was becoming “irate,” police said.

Gray died Sunday — a week after his arrest — of what police described as “a significant spinal injury.”

Do not make eye contact with a predator; this allows it to pick you out of the herd. Even worse is to RUN AWAY. This triggers the chase and kill instinct. Do not run.

The Chase: an instinct that is not going extinct.


I have thought quite a bit about socially dictated “rules” of eye contact that help define Aspergers as “developmentally disordered.” But these rules are culturally subjective and diverse. I think it is the instinctive response between animals that is causing huge problems. “Policing authorities” are designated predators-by-law and empowered to execute predatory behavior. Policing is not all about predation; police do “protect and serve.” It’s no surprise that individuals who are comfortable with being a predator gravitate toward specific jobs, but weaker individuals, who may be very uncomfortable with the task of wielding power, may also end up in such jobs – and that is a problem too.

There are situations that trigger the predatory response, especially if the person stopped by the police unwittingly switches into prey behavior – notably, running away. A perfectly natural response (especially for young black males, who ARE prey) but one which also jump-starts predatory behavior in a police officer.

The racial aspect of this is huge. The history of White and African relationships is built on predation of blacks by whites. There is something else that is overlooked: the stereotyping of black males by entertainment, the media, sports, etc. is utterly skewed. Big, black, aggressive, out-of-control and criminally active are the adjectives that dominate the white view of black males: dangerous on all counts.

Stop and take one minute to examine this stereotype: black males are always the aggressor and always belong in jail? Ridiculous! These are human beings who way more often than not are frightened, intimidated, harassed, and tragically, are trying to live up to the “Bad Black” stereotype. Are black men allowed to feel fear, pain, and sorrow? Are they allowed to be gentle, caring and intelligent?


To the public observer, the act of running away is “non-threatening” and the police reaction is baffling and “over the top.” We are viewing the event rationally – a given situation has escalated way beyond what we would imagine ourselves doing. A person who is running away is obviously not a threat. It’s not about threat; it’s about acting like prey and pulling the predatory trigger.

As an Asperger, I can describe my reactions regarding eye contact. (I do not react well to  aggressive people.)

1. The other person is invading my comfort zone. It’s a breach of “boundary etiquette.” If the person stands too close (and stares) I will gain space in any number of benign ways, including calculated removal of myself from the area. I will avoid eye contact naturally, instinctively, because eye contact with predators rings their bell.

2. If the person is not aggressive, and a conversation takes place, I will switch to listening mode, because his or her appearance is no longer vital; my visual system has sized up the person and decided that he or she is “friendly.” This is intuitive. I will likely keep adjusting the “space” between us, which probably looks weird to other people.


Eye Contact / Submit or Die

Odd bits from: Psychologist

“Scotland’s University of Stirling found that, in a question-and-answer study among children, those who maintained eye contact were less likely to come up with the correct answer to a question than those who looked away to consider their response. Eye contact, as a socialising device, can take a surprising amount of effort to maintain when this energy could be spent on calculating, as opposed to perceptive, tasks.”


A garment that conceals all but the eyes…what does that say?


It seems that females are all about control of the male animal. Testing, testing…


“I have nothing else to offer: will trade boobs for any attention that makes me feel less worthless.”


Nothing is more depressing than turning humans into robotic lab rats!

Wow! Could it be that Asperger children who don’t maintain eye contact are not “psychopaths” as determined by Simon Baron-Cohen, but are simply using their brains to process information – AND to conserve energy?

Both Animal Behaviorists and Human Behaviorists study animal behavior, but there is little evidence that they share their work. Perplexing.


When a parent forces a child to experience a direct and angry stare, he or she is not “bonding.” The parent is intimidating the child by letting them know, on an animal level, that the parent is the predator and the child is prey: the challenge to the child is “submit or die.” The child experiences fear. Is that so hard to understand? This act of domination instigates competitive predatory behavior in some children (the parent–child relationship becomes a war), and the child will likely turn their aggression toward other children, but most children who experience “domineering parenting” become prey animals.

Human behavior is animal behavior; a culture “utilizes” and simultaneously condemns animal behavior, which is reserved for predators who dominate every level of the social pyramid. To forbid animal behavior in a human society is to maintain a prey population: obedient, tame, neotenic.




New Blog Subject? / Great Ideas that Stink

Asperger Human has grown to be so large that it’s become “unfriendly” to search engines; actually it passed that threshold long ago. The real problem is me; I’ve lost track of what individual posts cover, and my Asperger way of organizing information is not linear, categorical, branching, or file-based. Within-blog search works well, but I don’t know how many people use that function. Being an intuitive visual thinker serves ME well – it’s all in my brain, but in no way is information organized in the “typical” hierarchical framework.

Actually, at this point I doubt that the “typical social human” organizes information hierarchically either; it’s a convention taught in schools – what used to be called “making an outline” – which forced children to read and think about the relative importance of ideas and facts in written documents and other types of communication. Decisions at all levels of social organization are now made based on “how people feel” moment to moment: an infantile state which decades ago was considered temporary. In fact, much of education was designed to get children beyond this infantile self-centeredness and help them become responsible adults – an archaic concept, though the need for it has not gone away.

New blog? Great Ideas that Stink


I’ve been noodling over a topic for a new blog, which would be an extension of ideas in Asperger Human, but which will apply to all people: the most critical defect that I see in contemporary (western) cultures is the inability to form new ideas that will be effective in the 21st Century. A backlog of “good ideas that are no longer sufficient” drives “modern” choices; technology (as usual) is taking Homo sapiens into new arenas.

This is not new: exploration seems to be a given in human behavior. The cheerleaders for technical domination are presently in control, but (as usual) the business-technical cohort overestimates “progress” in human health and happiness. What good is that fabulous genetic medical technique when millions of humans don’t have clean water and sufficient nutrition? And countering the “new whizz-bang” medical devices are armed drones that shred individual “bystanders” in villages bereft of decent living conditions and of the energy to overcome decades of trauma.

Ideas are not things; ideas drive behavior, and are “supernatural.” Ideas are formed by the human brain and exist nowhere else. This is a good thing, because unlike the Laws of Nature, ideas can be changed. There is one little nut that needs to be “cracked”: the illusion that the ideas we learn, mostly by absorption from the human environment, are absolutely fundamental to the universe, and that every human being must agree on which ideas constitute this “absolutist” body – a paper-thin, ephemeral, supernatural regime of ideas. Terrible behavior follows on acceptance of “words” as the power that created the universe.

For now, let’s start with a simple definition of “idea” from Merriam Webster:

: a thought, plan, or suggestion about what to do (an idea is active); an opinion or belief (not all ideas are equal in fact or utility); something that you imagine or picture in your mind (supernatural)

Clip from an intro to The Great Ideas / Mortimer Adler:

What does it mean to be Good? How do we decide the Right thing to do? What is Love? The same question may appear to have different answers; the journey through the conflicting answers to a resolution is called philosophy. The Great Ideas are Art, Beauty, Change, Democracy, Emotion, Freedom, God, Good and Evil, Government, Justice, Labour, Language, Law, Learning, Love, Man, Opinion, Philosophy, Progress, Punishment, Truth, and War and Peace. (These aren’t “ideas” so much as Topics – so we’re in trouble already!) 
Although everyone has a basic grasp of these Great Ideas, not everyone understands them as well as he or she could or should. In “How to Think About the Great Ideas”, renowned philosopher Mortimer J. Adler guides readers to an understanding of these fundamental ideas and their practical applications to our daily lives.

There are thousands of books and articles written by historians, philosophers, and related “experts” about the stupendous course of western history, launched from POVs rooted in tradition, revision, religion, technology, Great Men, political and social ideologies, art, anthropology, and genetics / DNA, mostly upholding the supremacy or “march” of EuroAmerican males from the Ancient Greeks onwards. It is not my intention to join the crowd! My approach will be much like Asperger Human: identifying the origin of an idea, the environment that encouraged it, its misplaced application today, and why it is preventing individual “happiness.”

OMG! This new blog will be as easy as shooting fish in a barrel.




Idiotic slideshow for GP’s to “diagnose” Aspergers

Aye, yai, yai!

Slideshare presentation by Debra Moore PhD, directed at “general practitioners,” explaining Asperger’s Syndrome and a closet full of “disorders” that boggle the mind. The slideshow comprises 83 slides.

ONE BIG COMMENT: This is so bizarre that words fail me, except HOW INSULTING this is to all children, their parents, and general practitioners (if there are any left in the U.S.). Asperger people will not be surprised by how neurotypically idiotic the claims are – another “imaginary” description of Asperger traits and behavior. Just stop!!!

Why would this be suitable information for a medical doctor, when 1. Asperger’s is not a medical condition, 2. none of the information is medical, and 3. the information is ignorant slander?

How did this person qualify for a PhD? (Must be psychology) C’mon! If you’re going to use “multisyllabic psychology words” you have to first learn how to write a proper sentence.






PhD Dissertation / Asperger Syndrome Social Narratives

Dissertation for Doctor of Philosophy, Bowling Green State University, 2010, by Neil Shepard


Scroll down to FILES / View or Download


From Introduction: This dissertation explores representations of Asperger’s syndrome, an autism spectrum disorder. Specifically, it textually analyzes cultural representations with the goal of identifying specific narratives that have become dominant in the public sphere. Beginning in 2001, with Wired magazine’s article by Steve Silberman entitled “The Geek Syndrome” as the starting point, this dissertation demonstrates how certain values have been linked to Asperger’s syndrome: namely the association between this disorder and hyper-intelligent, socially awkward personas.

Narratives about Asperger’s have taken to medicalizing not only genius (as figures such as Newton and Einstein receive speculative posthumous diagnoses) but also to medicalizing a particular brand of new economy, information-age genius. The types of individuals often suggested as representative Asperger’s subjects can be stereotyped as the casual term “geek syndrome” suggests: technologically savvy, successful “nerds.” On the surface, increased public awareness of Asperger’s syndrome combined with the representation has created positive momentum for acceptance of high functioning autism. In a cultural moment that suggests “geek chic,” Asperger’s syndrome has undergone a critical shift in value that seems unimaginable even 10 years ago.

This shift has worked to undo some of the stigma attached to this specific form of autism. The proto-typical Aspergian persona represented dominantly in the media is often both intelligent and successful. At the same time, these personas are also so often masculine, middle/upper class and white. These representations are problematic in the way that they uphold traditional normativity in terms of gender, race and class, as well as reifying stigma toward other points on the autistic spectrum.


Having grown up with a family connection to Asperger’s syndrome, I can say that from my experience the truly challenging difficulties that emerge do so from encounters with the social world. I have never met a person with autism who is, in and of themselves, a “problem.” Problems come in the form of ignorance; the forms of this ignorance vary in range from inadequate educational resources to bullies. The sentiment that the problem is social rather than individual is something that I have seen echoed repeatedly throughout my research, whenever I have read of or spoken with people with autism, their parents, guardians, children, siblings and friends. Whatever Asperger’s or autism may be has, in my experience, been less important than the beliefs and practices that comprise it. The work of cultural studies, as I see it, is to interrogate those beliefs and practices. To talk about a condition such as autism as being socially constructed isn’t to deny the reality of the condition, but rather to call attention to those beliefs and practices that shape the consequences of that reality. Understanding Asperger’s syndrome as a social construction is not to deny the clear realities of a condition that is manifested in the body, but to recognize the accountability of culture’s role in that reality. A social model approach to autism means an acute awareness of those impairments and those disabling features that are a result of the surrounding culture.

Citation: Shepard, Neil, “Rewiring Difference and Disability: Narratives of Asperger’s Syndrome in the Twenty-First Century” (2010). American Culture Studies Ph.D. Dissertations. Paper 40.