Guess Who’s Taking Over Mental Health Care? / Your smartphone

From NIH, National Institute of Mental Health

Director’s Blog: Look who is getting into mental health research

By Thomas Insel on August 31, 2015

In the U.S., biomedical research funding has been estimated at $117 billion, following a rough law of thirds: about one-third government (mostly NIH), slightly more than one-third pharmaceutical companies, and the remainder a mix of biotech, foundations, and philanthropy.1 Support for research on mental disorders looks a little different. As pharmaceutical companies invest less in this area, government (especially NIMH) has become a larger fraction of the funding pool. Now there is a surprising new player in mental health research that is just beginning to emerge from the private sector. This summer I was invited to visit Apple, Google (now Alphabet), IBM, and Intel.
Why are technology companies inviting the NIMH director to visit? At first, I assumed their interest was purely financial. It’s hardly surprising that companies with large cash reserves have discovered a trillion dollar market (health care is now approaching 20 percent of the U.S. gross domestic product and mental health care is a significant part of those costs).2 But I quickly discovered two other factors that are driving tech companies into biomedical and mental health research.
One is big data. As genomics, imaging, and large health care studies generate terabytes of data daily, companies that know how to extract knowledge from data have become essential partners for progress towards new diagnostics and therapeutics. The data analytics from tech companies are becoming part of the engine of biomedical research. The other is the promise of technology to change health care, shifting it from episodic to continuous, from reactive to proactive, from physician-centered to patient-centered. (No longer in the system only when ill, the patient delivers profit 24/7, from conception to death.)

Even beyond wearable devices and online cognitive training, technology can offer information and interventions where and when someone needs it. Tech companies are realizing that mental health is, in their parlance, an excellent “use case.” Just as important, online health care (especially mental health care) creates data that can serve to improve quality, including monitoring the fidelity of psychotherapy.3 In the future, when we think of the private sector and health research, we may be thinking of Apple and IBM more than Lilly and Pfizer.

Coming to a future near you: Individuals will be branded as potentially “socially aberrant, mentally ill, psychotic etc.” or having something diagnosable, and will be tracked 24/7 under the “lie” of preventative services.
Here are two fascinating previews of this new world I noted during my travels last week.

One was the publication of results from a collaboration between Columbia University and IBM.4 The team, led by Gillinder Bedi and Cheryl Corcoran, was looking for a biomarker to predict which clinically high-risk youth would convert to psychosis over a two- to three-year follow-up period from an initial interview. Rather than depend on a protein in blood or a brain scan, they used an innovative big data approach to analyze the speech from the initial interview. The approach, developed by Guillermo Cecchi at IBM, maps semantic coherence and speech complexity as a window into the earliest stages of disorganized thought. While analyses of clinical features have yielded, at best, 80 percent prediction, this automated analysis of unstructured speech was reported to be 100 percent accurate for identifying who would convert to psychosis during the follow-up period. This is a small study (34 participants, with 5 developing psychosis), but it serves as a preview of what we might see as the power of technology is applied to provide objective measures of behavior and cognition.
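For readers wondering what “mapping semantic coherence” even means in practice: the published study used latent semantic analysis over interview transcripts. The sketch below is my own toy illustration of the bare idea only – scoring word-overlap similarity between consecutive sentences, where a sudden drop suggests a topic jump – and is not the IBM pipeline.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def coherence_profile(sentences):
    """Similarity between each pair of consecutive sentences; low values
    flag abrupt topic jumps -- a crude stand-in for 'semantic coherence'
    (the actual study used latent semantic analysis, not raw word overlap)."""
    bags = [Counter(s.lower().split()) for s in sentences]
    return [cosine(bags[i], bags[i + 1]) for i in range(len(bags) - 1)]

profile = coherence_profile([
    "I walked to the store this morning",
    "the store was out of bread this morning",
    "purple engines whisper beneath blue oceans",
])
# first transition shares several words; the second shares none
```

Even this crude version makes the idea concrete: disorganized speech shows up as low similarity between one sentence and the next.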

If this doesn’t scare you about the abuse of technology, what will?

Also last week I visited, an internet start-up housed on the nineteenth floor of an office building in San Francisco. The founders, Anmol Madan and Karan Singh, took me on a quick tour of their smartphone app, which tracks mood and anxiety to “deliver support to the right people at the right time.” Already used by thousands of people and adopted by several research groups, looks at everything from sleep and activity to social interaction and self-report to quantify mood. Their approach has enormous potential not only for research on mood and anxiety but for the development of interventions that can be deployed globally. While the app has mostly been used to link patients to providers, imagine a future when the app will empower patients with tools to become their own providers.

My summer tour of tech companies, large and small, left me with one unexpected conclusion. While the focus of wearable technology and online apps has thus far mostly been on managing heart disease and diabetes, the tech approach may be best suited for mental health. The biomarkers for depression, psychosis, and post-traumatic stress disorder are likely to be objective measures of cognition and behavior, which can be collected by smartphones. Some of our most effective interventions are psychosocial treatments that can be delivered or extended by smartphones and tablets. Most important, the sensors and the interventions can be integrated into a closed loop so that care is continuous and iterative. Increasing symptoms, suicidal impulses, and paranoid thoughts lead immediately to an intervention. (BY WHOM? – The police? That’s who “provides” much of America’s mental health services, including throwing MH individuals in jail and depriving them of medication or care.) Population-based studies have shown that less than half of people with mental illness seek care. And workforce studies have shown that 55 percent of counties have no mental health care provider. Technology is not the answer to all problems, but it may help those with mental illness even more than those with other chronic, serious medical conditions.


Depression? / A Social Epidemic

The topic of Depression has been showing up quite a bit on sites that I frequent, and I realized that I don’t actually know much about Clinical Depression. I don’t find the description below to be very specific or medical. It’s self-diagnosis, isn’t it? You’re depressed if you think you are – the symptoms and criteria are offered to the patient to choose from; not an objective process. Five of these symptoms (why five?) have to last for two weeks (why two weeks?). What if it’s not two weeks, but 13 days? Are you then not depressed? This seems a very short duration compared to what people with depression say – that it’s chronic.

There is an admonishment used to restrain this type of “bogus” quantification: only count things that can be counted. “Making up” numbers (like 5 symptoms, 2 weeks) does not change the arbitrary social basis of diagnosis; useless quantification does not make a process “science”.

Call me a picky Asperger, but what is the cause? Clinical Depression, if it’s real, must have cause(s).

Why bother with a charade of diagnosis? Just have people show up, say, “I’m depressed,” and dish out the prescriptions.  


From Mayo Clinic Online:

What does the term “clinical depression” mean?

Answers from Daniel K. Hall-Flavin, M.D.

Depression ranges in seriousness from mild, temporary episodes of sadness to severe, persistent depression. Clinical depression is the more severe form of depression, also known as major depression or major depressive disorder. It isn’t the same as depression caused by a loss, such as the death of a loved one, or a medical condition, such as a thyroid disorder.

It’s easy to see that despite the claim that CD has no environmental / medical cause, most of the symptoms CAN originate in economic and social facts – the social stress that the individual encounters in everyday existence. This stress is out of control, because it’s BUILT INTO the system.

To be diagnosed with clinical depression, you must meet the symptom criteria for major depressive disorder in the Diagnostic and Statistical Manual of Mental Disorders (DSM), published by the American Psychiatric Association. This manual is used by mental health providers to diagnose mental conditions and by insurance companies to reimburse for treatment. Are insurance companies co-writing the DSM – that is, practicing medicine without a license? How do pharmaceutical companies influence who gets a diagnosis and which medication is prescribed? The fact is, insurance industry “representatives” do “contribute” to what appears in the DSM. Do we have a case of the diagnosis creating the condition, the treatment and the profit?

For clinical depression, you must have five or more of the following symptoms over a two-week period, most of the day, nearly every day. At least one of the symptoms must be either a depressed mood or a loss of interest or pleasure. Signs and symptoms may include:

Depressed mood, such as feeling sad, empty or tearful (in children and teens, depressed mood can appear as constant irritability) (What teenager isn’t sad, empty, tearful or irritable at times?)

Significantly reduced interest or feeling no pleasure in all or most activities (Isn’t that the reality of people at the bottom of the American Social Pyramid in the 21st C.? Drug addiction, violence, poverty and crime would likely both indicate and arise from depression.)

Significant weight loss when not dieting, weight gain, or decrease or increase in appetite (in children, failure to gain weight as expected) (Wow! that covers just about anyone!)

Insomnia or increased desire to sleep (Here we go again – any behavior on either side of an imaginary “normal.” In the U.S. we are bombarded daily by the message that even “normal” people have a sleep disorder. I’m not downplaying the absolute need for quality sleep. And how does one get adequate sleep while working more than one job, just to survive?)

Either restlessness or slowed behavior that can be observed by others (hearsay evidence; subjective.)

Fatigue or loss of energy (subjective; millions of Americans are exhausted by the stress and insecurity of chaotic social demands)

Feelings of worthlessness, or excessive or inappropriate guilt (socially induced symptoms)

Trouble making decisions, or trouble thinking or concentrating (Wow! I keep hoping for objective, provable symptoms, but I guess they don’t exist!)

Recurrent thoughts of death or suicide, or a suicide attempt (Look no farther than people who have been discarded by society: ex-military, the homeless, Native American young people, and the elderly.)

Your symptoms must be severe enough to cause noticeable problems in relationships with others or in day-to-day activities, such as work, school or social activities. Symptoms may be based on your own feelings or on the observations of someone else. (Wow! How scientific is that? It’s clear that clinical depression is a SOCIAL DIAGNOSIS, created by stressful conditions built into the social environment. Unhealthy social conditions of poverty, violence, financial distress, broken families, tyrannical bosses and demeaning workplace conditions do create physical changes and disease in the human animal, but “pills” simply mask the pain; they offer no cure for a toxic society that values profit over people; it is “medico-pharma” greed that has created massive opioid addiction in the U.S.)
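The counting rule quoted above is mechanical enough to write down as code, which is rather the point of the complaint. A minimal sketch of just the arithmetic (symptom labels abbreviated by me; a real diagnosis also requires the severity / impairment criterion and a clinician’s judgment, which no function can encode):

```python
# The two "gatekeeper" symptoms: at least one must be present.
CORE = {"depressed mood", "loss of interest or pleasure"}

# The remaining seven symptoms from the list above, abbreviated.
OTHER = {
    "significant weight or appetite change",
    "insomnia or hypersomnia",
    "restlessness or slowed behavior",
    "fatigue or loss of energy",
    "worthlessness or excessive guilt",
    "trouble concentrating or deciding",
    "recurrent thoughts of death or suicide",
}

def meets_symptom_count(symptoms: set, days_present: int) -> bool:
    """True if the quoted counting rule is met: five or more of the nine
    symptoms over a two-week (14-day) period, at least one of them core.
    Encodes only the arithmetic, not clinical judgment or severity."""
    recognized = symptoms & (CORE | OTHER)
    return (
        days_present >= 14
        and len(recognized) >= 5
        and bool(symptoms & CORE)
    )
```

Note how the 13-days objection falls straight out of the code: the same five symptoms return True at day 14 and False at day 13.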

Clinical depression can affect people of any age, including children. However, clinical depression symptoms, even if severe, usually improve with psychological counseling, antidepressant medications or a combination of the two. (Vague, relative, subjective – no money-back guarantee!) It’s all about $$$$$$$$.


It is important to understand that not only do we horribly abuse animals, we are animals, and our modern social environments are the equivalent of zoos, circuses and research labs. And we wonder why human beings are depressed and ill? An outrageous number of American males (notably black) are being confined for years, and many for life, for nonviolent offenses, in conditions considered cruel in zoos and circuses, where animals also become seriously disturbed / depressed.



Under pressure from a state court, California is building a psychiatric care unit at San Quentin prison in order to provide long-term mental health care for death row inmates. If you think about it, it's slightly ironic.

Under pressure from a state court, California is building a psychiatric care unit at San Quentin prison in order to provide long-term mental health care for death row inmates. If you think about it, it’s cruel, insane and socially typical thinking.




Asperger Smarts / How not to deal with a predator

We’ve witnessed this scenario too many times:


Gray, 25, was taken into custody April 12 after police “made eye contact” with him and another man in an area known for drug activity, police said. Gray was handcuffed and put in a transport van. At some point during his roughly 30-minute ride, the van was stopped and Gray’s legs were shackled when an officer felt he was becoming “irate,” police said.

Gray died Sunday — a week after his arrest — of what police described as “a significant spinal injury.”

Do not make eye contact with a predator; this allows it to pick you out of the herd. Even worse is to RUN AWAY. This triggers the chase and kill instinct. Do not run.

The Chase: an instinct that is not going extinct.

I have thought quite a bit about socially dictated “rules” of eye contact that help define Aspergers as “developmentally disordered.” But these rules are culturally subjective and diverse. I think it’s the instinctive response between animals that is causing huge problems. “Policing authorities” are designated predators-by-law and empowered to execute predatory behavior. Policing is not all about predation; police do “protect and serve.” It’s no surprise that individuals who are comfortable with being a predator gravitate toward such jobs, but weaker individuals, who may be very uncomfortable with the task of wielding power, may also end up in them, and that’s a problem too.

There are situations that trigger the predatory response, especially if the person stopped by the police unwittingly switches into prey behavior – notably, running away. A perfectly natural response (especially for young black males, who ARE prey) but one which also jump-starts predatory behavior in a police officer.

The racial aspect of this is huge. The history of White and African relationships is built on predation of blacks by whites. There is something else that is overlooked: the stereotyping of black males by entertainment, the media, sports, etc. is utterly skewed. Big, black, aggressive, out-of-control and criminally active are the adjectives that dominate the white view of black males: dangerous on all counts.

Stop and take one minute to examine this stereotype: black males are always the aggressor and always belong in jail? Ridiculous! These are human beings who way more often than not are frightened, intimidated, harassed, and tragically, are trying to live up to the “Bad Black” stereotype. Are black men allowed to feel fear, pain, and sorrow? Are they allowed to be gentle, caring and intelligent?


To the public observer, the act of running away is “non-threatening” and the police reaction is baffling and “over the top.” We are viewing the event rationally – a given situation has escalated way beyond what we would imagine ourselves doing. A person who is running away is obviously not a threat. It’s not about threat; it’s about acting like prey and pulling the predatory trigger.

As an Asperger, I can describe my reactions regarding eye contact. (I do not react well to aggressive people.)

1. The other person is invading my comfort zone. It’s a breach of “boundary etiquette.” If the person stands too close (and stares) I will gain space in any number of benign ways, including calculated removal of myself from the area. I will avoid eye contact naturally, instinctively, because eye contact with predators rings their bell.

2. If the person is not aggressive, and a conversation takes place, I will switch to listening mode, because his or her appearance is no longer vital; my visual system has sized up the person and decided that he or she is “friendly.” This is intuitive. I will likely keep adjusting the “space” between us, which probably looks weird to other people.


Eye Contact / Submit or Die

Odd bits from: Psychologist

“Scotland’s University of Stirling found that, in a question-and-answer study among children, those who maintained eye contact were less likely to come up with the correct answer to a question than those who looked away to consider their response. Eye contact, as a socialising device, can take a surprising amount of effort to maintain when this energy could be spent on calculating, as opposed to perceptive, tasks.”


A garment that conceals all but the eyes…what does that say?


It seems that females are all about control of the male animal. Testing, testing…


“I have nothing else to offer: will trade boobs for any attention that makes me feel less worthless.”


Nothing is more depressing than turning humans into robotic lab rats!

Wow! Could it be that Asperger children who don’t maintain eye contact are not “psychopaths,” as determined by Simon Baron-Cohen, but are simply using their brains to process information – AND to conserve energy?

Both Animal Behaviorists and Human Behaviorists study animal behavior, but there is little evidence that they share their work. Perplexing.

When a parent forces a child to experience a direct and angry stare, he or she is not “bonding.” They are intimidating the child by letting them know, on an animal level, that the parent is the predator and the child is prey: the challenge to the child is “submit or die.” The child experiences fear. Is that so hard to understand? This act of domination instigates competitive predatory behavior in some children (the parent–child relationship becomes a war), and the child will likely turn their aggression toward other children, but most children who experience “domineering parenting” become prey animals.

Human behavior is animal behavior; a culture “utilizes” and simultaneously condemns animal behavior, which is reserved for predators who dominate every level of the social pyramid. To forbid animal behavior in a human society is to maintain a prey population; obedient, tame, neotenic.




New Blog Subject? / Great Ideas that Stink

Asperger Human has grown to be so large that it’s become “unfriendly” to search engines; actually it passed that threshold long ago. The real problem is me; I’ve lost track of what individual posts cover, and my Asperger way of organizing information is not linear, categorical, branching, or file-based. Within-blog search works well, but I don’t know how many people use that function. Being an intuitive visual thinker serves ME well – it’s all in my brain, but in no way is information organized in the “typical” hierarchical framework.

Actually, at this point I doubt that the “typical social human” organizes information hierarchically either; it’s a convention taught in schools – what used to be called “making an outline” – which forced children to read and think about the relative importance of ideas and facts in written documents and other types of communication. Decisions at all levels of social organization are now made based on “how people feel” moment to moment: an infantile state which decades ago was considered temporary. In fact, much of education was designed to get children beyond this infantile self-centeredness in order to help them become responsible adults – an archaic concept, though the need for it has not gone away.

New blog? Great Ideas that Stink


I’ve been noodling over a topic for a new blog, which would be an extension of ideas in Asperger Human, but which will apply to all people: the most critical defect that I see in contemporary (western) cultures is the inability to form new ideas that will be effective in the 21st Century. A backlog of “good ideas that are no longer sufficient” drives “modern” choices; technology (as usual) is taking Homo sapiens into new arenas.

This is not new: exploration seems to be a given in human behavior. The cheerleaders for technical domination are presently in control, but (as usual) the business-technical cohort overestimates “progress” in human health and happiness. What good is that fabulous genetic medical technique when millions of humans don’t have clean water and sufficient nutrition? And countering the “new whizz-bang” medical devices are armed drones that shred individual “bystanders” in villages bereft of decent living conditions and the energy to overcome decades of trauma.

Ideas are not things; ideas drive behavior, and are “supernatural.” Ideas are formed by the human brain and exist nowhere else. This is a good thing, because unlike the Laws of Nature, ideas can be changed. There is one little nut that needs to be “cracked”, and it’s the illusion that the ideas we learn, mostly by absorption from the human environment, are essential and absolutely fundamental to the universe, and that every human being must agree on which ideas constitute this “absolutist” body of paper-thin, ephemeral, supernatural ideas. Terrible behavior follows on acceptance of “words” as the power that created the universe.

For now, let’s start with a simple definition of “idea” from Merriam Webster:

: a thought, plan, or suggestion about what to do (an idea is active)
: an opinion or belief (not all ideas are equal in fact or utility)
: something that you imagine or picture in your mind (supernatural)

Clip from an intro to The Great Ideas / Mortimer Adler:

What does it mean to be Good? How do we decide the Right thing to do? What is Love? The same question may appear to have different answers; the journey through the conflicting answers to a resolution is called philosophy. The Great Ideas are Art, Beauty, Change, Democracy, Emotion, Freedom, God, Good and Evil, Government, Justice, Labour, Language, Law, Learning, Love, Man, Opinion, Philosophy, Progress, Punishment, Truth, and War and Peace. (These aren’t “ideas” so much as Topics – so we’re in trouble already!) 
Although everyone has a basic grasp of these Great Ideas, not everyone understands them as well as he or she could or should. In “How to Think About the Great Ideas”, renowned philosopher Mortimer J. Adler guides readers to an understanding of these fundamental ideas and their practical applications to our daily lives.

There are thousands of books and articles written by historians, philosophers, and related “experts” about the stupendous course of western history, launched by POVs rooted in tradition, revision, religion, technology, Great Men, political and social ideologies, art, anthropology, and genetics / DNA, mostly upholding the supremacy or “march” of EuroAmerican males from the Ancient Greeks onwards. Not my intention to join the crowd! My approach will be much like Asperger Human: identifying the origin of an idea, the environment that encouraged it, its misplaced application today, and why it is preventing individual “happiness” –

OMG! This new blog will be as easy as shooting fish in a barrel.




PhD Dissertation / Asperger Syndrome Social Narratives

Dissertation for the Doctor of Philosophy degree, Bowling Green State University, 2010, by Neil Shepard


Scroll down to FILES / View or Download

From Introduction: This dissertation explores representations of Asperger’s syndrome, an autism spectrum disorder. Specifically, it textually analyzes cultural representations with the goal of identifying specific narratives that have become dominant in the public sphere. Beginning in 2001, with Wired magazine’s article by Steve Silberman entitled “The Geek Syndrome” as the starting point, this dissertation demonstrates how certain values have been linked to Asperger’s syndrome: namely the association between this disorder and hyper-intelligent, socially awkward personas.

Narratives about Asperger’s have taken to medicalizing not only genius (as figures such as Newton and Einstein receive speculative posthumous diagnoses) but also to medicalizing a particular brand of new economy, information-age genius. The types of individuals often suggested as representative Asperger’s subjects can be stereotyped as the casual term “geek syndrome” suggests: technologically savvy, successful “nerds.” On the surface, increased public awareness of Asperger’s syndrome combined with the representation has created positive momentum for acceptance of high functioning autism. In a cultural moment that suggests “geek chic,” Asperger’s syndrome has undergone a critical shift in value that seems unimaginable even 10 years ago.

This shift has worked to undo some of the stigma attached to this specific form of autism. The proto-typical Aspergian persona represented dominantly in the media is often both intelligent and successful. At the same time, these personas are also so often masculine, middle/upper class and white. These representations are problematic in the way that they uphold traditional normativity in terms of gender, race and class, as well as reifying stigma toward other points on the autistic spectrum.


Having grown up with a family connection to Asperger’s syndrome, I can say that from my experience the truly challenging difficulties that emerge do so from encounters with the social world. I have never met a person with autism who is, in and of themselves, a “problem.” Problems come in the form of ignorance; the forms of this ignorance vary in range from inadequate educational resources to bullies. The sentiment that the problem is social rather than individual is something that I have seen echoed repeatedly throughout my research, whenever I have read of or spoken with people with autism, their parents, guardians, children, siblings and friends. Whatever Asperger’s or autism may be has, in my experience, been less important than the beliefs and practices that comprise it. The work of cultural studies, as I see it, is to interrogate those beliefs and practices. To talk about a condition such as autism as being socially constructed isn’t to deny the reality of the condition, but rather to call attention to those beliefs and practices that shape the consequences of that reality. Understanding Asperger’s syndrome as a social construction is not to deny the clear realities of a condition that is manifested in the body, but to recognize the accountability of culture’s role in that reality. A social model approach to autism means an acute awareness of those impairments and those disabling features that are a result of the surrounding culture.

Citation: Shepard, Neil, “Rewiring Difference and Disability: Narratives of Asperger’s Syndrome in the Twenty-First Century” (2010). American Culture Studies Ph.D. Dissertations. Paper 40.

Born Free Foundation Zoochosis / Symptoms of ASD

It may seem implausible to social typical humans, but many ASD symptoms are indicative of “zoochosis” that occurs when we are forced to inhabit social environments.

Go to the Born Free Foundation website for videos of zoo animals and specific abnormal behaviours; videos are also available on YouTube.

We don’t care about animals or children; how many American kids live in neighborhoods that are no more healthy than zoos?


Zoochosis: Abnormal and Stereotypic behaviour in Captive Animals

Ensuring reasonable animal welfare in captivity is extremely challenging. Animal species have evolved over millennia and their physical, physiological and behavioural traits have developed in order to optimise their chances of survival in their natural environment.

In captivity, animals may face a number of challenges for which evolution has not prepared them. The climate, diet and the size and characteristics of the enclosure may be completely alien to the species as it exists in the wild.

Captive animals may no longer be able to have control over their environment, nor carry out evolved behaviours aimed at enhancing their welfare or survival prospects. Instead they must rely on humans to provide for many of their physical, social, biological and other needs.

If the captive environment does not cater for the species-specific needs of the animal, there can be a deterioration in both physical and mental health such as the development of abnormal behaviour, disease and even early mortality.

Similarly, invasive interventions such as the restriction of movement, training using negative reinforcement techniques, being trained to perform unnatural behaviours or making modifications to the normal physiology of animals to reduce risks when handling, can cause severe and lasting distress.

Abnormal behaviour can include stereotypic behaviours – repetitive behaviours which appear to have no obvious goal or function – such as repetitive pacing, swaying, head-bobbing or circling and bar-biting ‘demonstrably caused by the frustration of natural behaviour patterns, impaired brain function, or repeated attempts to deal with some problem’ (Mason, 2005); over-grooming, excessive licking and vocalisation are recognised as displacement behaviours, ‘arising out of conflict when an animal is driven to perform two behaviours at the same time’ e.g. conflict between the fear of the keeper and the desire to get food (Bacon 2011); apathy and redirected aggression.

Other forms of stereotypic behaviour seen in captive wild animals include:

Apathy, where an animal is abnormally passive and does not react to stimuli. This occurs particularly when social animals are separated from companions.

Abnormal mother-infant relationship, where mothers attack, abandon or kill their offspring, or where mothers wean offspring too soon or too late.

Prolonged infantile behaviour, where animals do not mature properly or acquire aberrant social behaviours, e.g. excessive crying or vocalisation, lack of social confidence, lack of secondary sexual characteristics.

Abnormal aggressive behaviour, where aggression is uncontrolled, in terms of intensity and frequency, or directed to the wrong individuals or objects. This can be the result of overcrowding, threats by social dominants, isolation from companions or pressure from zoo visitors.


In 1992, Bill Travers, co-founder of the Born Free Foundation, first coined the term ‘zoochosis’ to describe this obsessive, repetitive behaviour, and described zoo animals behaving abnormally as ‘zoochotic’.

In 1993, the Zoo Check Charitable Trust (now the Born Free Foundation) produced The Zoochotic Report, a record of video observations by the late Bill Travers, taken over three years in more than 100 zoos in Europe, North America and the Far East. The Zoochotic Report raised serious concern about the effects of captivity on wild animals. The Report helped form the philosophies of our organisation and its animal welfare objectives.

“In every zoo I visited when compiling the Zoochotic Report, I witnessed some sort of abnormal behaviour.” – Bill Travers, Co-Founder of the Born Free Foundation

Further explanation on abnormal behaviour and examples:


Clever Sillies / High IQ people lack common sense

Aye, yai, yai! I think that there are more prejudicial and unscientific proclamations, and scientifically unprovable assertions, in this “intellectual demonstration” than I’ve encountered in reading hundreds of papers, studies, or reviews.

This is a disappointment in light of Bruce Charlton’s profile, which sounds quite promising. (Not really.) He denies that western society is hierarchical – and by pronouncing it “modular” (magic word syndrome again) he “proves” it’s modular. Abracadabra!

Could Bruce Charlton be one of the “clever sillies” conjured in his editorial?

The Modernization Imperative by Bruce Charlton and Peter Andras

This book argues that contemporary society in Western democracies is generally misunderstood. Commentators typically assume that we still live in a ‘pyramidal’ hierarchical state that is dominated either directly by the government, or indirectly by capitalist economics. It is assumed that social cohesion is imposed on the population by a combination of force and propaganda. Such widespread views contribute to a pessimistic attitude to the present and a fearful attitude to the future, yet neither view is correct. The reality is that we live in a fundamentally pluralistic society divided into numerous ‘modular’ social systems each performing different functions; these include politics, public administration, the armed forces, law, economics, religion, education, health and the mass media. Because each is specialized, none of these systems are dominant and there is no overall hierarchy of power between them. Modernizing societies are therefore structured more like a mosaic than a pyramid. The Modernization Imperative explains the importance of modernisation to all societies and analyses anti-modernisation in the UK.

“Clever Sillies” – wasn’t that a recurring skit in Monty Python? Oh! I’m sorry, that was the “Social Module” of Silly Walks.



Med Hypotheses. 2009 Dec;73(6):867-70. doi: 10.1016/j.mehy.2009.08.016. Epub 2009 Sep 4.

Clever sillies: why high IQ people tend to be deficient in common sense.


In previous editorials I have written about the absent-minded and socially-inept ‘nutty professor’ stereotype in science, and the phenomenon of ‘psychological neoteny’ whereby intelligent modern people (including scientists) decline to grow-up and instead remain in a state of perpetual novelty-seeking adolescence. These can be seen as specific examples of the general phenomenon of ‘clever sillies’ whereby intelligent people with high levels of technical ability are seen (by the majority of the rest of the population) as having foolish ideas and behaviours outside the realm of their professional expertise. In short, it has often been observed that high IQ types are lacking in ‘common sense’–and especially when it comes to dealing with other human beings. (Could he be unaware that Asperger people exist?) General intelligence is not just a cognitive ability; it is also a cognitive disposition. So, the greater cognitive abilities of higher IQ tend also to be accompanied by a distinctive high IQ personality type including the trait of ‘Openness to experience’, ‘enlightened’ or progressive left-wing political values, and atheism. (WOW!)

Drawing on the ideas of Kanazawa, my suggested explanation for this association between intelligence and personality is that an increasing relative level of IQ brings with it a tendency differentially to over-use general intelligence in problem-solving, and to over-ride those instinctive and spontaneous forms of evolved behaviour which could be termed common sense. (WOW!) Preferential use of abstract analysis is often useful when dealing with the many evolutionary novelties to be found in modernizing societies; but is not usually useful for dealing with social and psychological problems for which humans have evolved ‘domain-specific’ adaptive behaviours. And since evolved common sense usually produces the right answers in the social domain; (WOW! – common sense, whatever that is, is an evolutionary product that coincides with the demands of socialization. If you aren’t “social” you can’t practice common sense. And what happened to differences in culture and environment?)

This implies that, when it comes to solving social problems, the most intelligent people are more likely than those of average intelligence to have novel but silly ideas, and therefore to believe and behave maladaptively. I further suggest that this random silliness of the most intelligent people may be amplified to generate systematic wrongness when intellectuals are in addition ‘advertising’ their own high intelligence in the evolutionarily novel context of a modern IQ meritocracy. The cognitively-stratified context of communicating almost-exclusively with others of similar intelligence, generates opinions and behaviours among the highest IQ people which are not just lacking in common sense but perversely wrong. Hence the phenomenon of ‘political correctness’ (PC); whereby false and foolish ideas have come to dominate, and moralistically be enforced upon, the ruling elites of whole nations. (WOW!)