Territorial Asperger / Common?

Home on the Range

Typical day – out and about after finishing chores and “work” when I realized how territorial I am, like a jaguar or wolf patrolling its boundaries. Taking photos is like marking that territory, not for others, but for myself. A rather large territory for one human: about 30–40 miles north to south and 20 miles east to west. I wonder whether other Asperger types can identify a territory in which they feel comfortable and at home?

 

No Surprise! / Gender Bias in Psych Research


From Psych Central: Male Gender Bias in Psychology Research Continues     

By Rick Nauert, PhD Senior News Editor
Reviewed by John M. Grohol, Psy.D.

Despite decades of striving toward equality, gender biases appear prevalent among researchers in psychology. In a recently published study, the investigators found that psychology researchers most often compare females against an implicit male norm, rather than studying each gender on its own terms or comparing males against a female norm.

When research finds that men and women differ psychologically, which group seems to be responsible for the difference? Where gender differences were observed in the research examined, they were more often described as being about females than as being about males. Males are seen as the standard for the typical human subject. The research was published in the Review of General Psychology by University of Surrey (UK) psychologists Peter Hegarty and Carmen Buechel. The authors systematically surveyed forty years of gender-difference research in four journals published by the American Psychological Association.

“Even the graphs and tables show evidence of the male-norm effect,” said Hegarty. “About three quarters of these positioned men’s data first, making women the second sex. But this effect was reversed when psychologists depicted data about parents.” The conclusion? Men may be the prototype for modern psychology’s picture of the typical person, but mothers remain the most typical kind of parent.

The data are all the more striking as between 1965 and 2004 the journals studied ceased to be male-dominated. Roughly equal numbers of the study authors and roughly equal numbers of the participants in the studies now published in these journals are male and female. Hegarty doesn’t find it surprising that this shift in the body politic of psychology didn’t undo the male-norm effect. “In laboratory experiments, both women and men tend to spontaneously explain gender differences using a male norm and to attribute differences to females to the same degree. In our study, male and female authors of psychology articles focused their explanations on women to the same degree. Psychologists are not always aware of their implicit decisions about who to explain.”

Is the focus on women and girls a problem? Probably. (Probably???)

Hegarty has shown in other research on sexual orientation differences that stereotype-relevant results are explained in ways that perpetuate stereotypes about the group that is not taken as the norm; lesbians and gay men, in that case. Hegarty and Buechel also found that psychologists vastly preferred the phrase ‘more than’ over ‘less than’ when explaining gender differences. Put this together with the male-norm effect and you could reach the absurd conclusion that women and girls have more psychology than men and boys do.

Hegarty, a social psychologist himself, is optimistic about what the findings imply for the status of psychology. “They clearly show an area where more critical thinking is needed about gender, but on the other hand psychological methods allowed us to bring this issue to light and to describe it. Our conclusion is not that psychologists should not study group differences, but that we serve the public better when we think deeply about the ways that we implicitly frame questions about whose behavior is the default standard norm and whose is made the subject of psychological scrutiny.”

Magical Thinking – Supernatural Domain / Neoteny

https://www.scholastic.com/teachers/articles/teaching-content/ages-stages-how-children-use-magical-thinking/

Ages & Stages: How Children Use Magical Thinking

By Susan A. Miller Ed.D., Ellen Booth Church, and Carla Poole

DEVELOPMENT

0 to 2 “NO! IT GET ME!”

A young baby’s world revolves around her own experiences. Those experiences are dominated by physical sensations, such as a gas bubble or a soft blanket, with blurred distinctions between herself and the rest of the world. She lives in the moment. For example, 4-month-old Jessica is fascinated by a toy her teacher is holding. She stares at it intently. Yet, when the toy is dropped out of view, Jessica doesn’t look down to find it. She simply looks at another object that is in her direct line of sight. Her behavior implies, “I see the toy, therefore it exists. I don’t see the toy, therefore it doesn’t.” Her worldview is a series of images based on her own experiences rather than a sequence of logical events. (We may perhaps see this persist as “inattention” and novelty-seeking in older children and neotenic adults.)

Moments of Magical Thinking

By 12 months, an infant’s thinking becomes more rooted in the reality that objects and people remain the same even when out of sight. This concept of object permanence, along with an expanding memory, makes the baby’s life a bit more predictable. But, she still often misinterprets reality. For instance, 1-year-old Jemima voices displeasure and is frightened when a toy unexpectedly rolls just a few inches toward her. The world is a mystical place, and babies have a fragile understanding of the difference between animate and inanimate objects. (American culture promotes this confusion: “entertainment” aimed at children and adults is saturated with just this infantile perception and presentation of reality.)

Seeing is Believing

When working with toddlers, it’s important to remember that they will make connections that are illogical and frustrating. (Neotenic social typical adults continue to produce “magical” connections as explanations for any and all phenomena. This is literally what drives Asperger types “crazy” when interacting socially.)

No amount of reassurance (or factual information) is going to immediately convince 16-month-old Ashley (or neotenic adults) that she can’t slip down the bathtub drain like the sliver of soap just did. In cases such as this, you can recommend to parents that they temporarily let the toddler bathe standing up, supporting her while she stands on a safety mat fastened to the tub’s surface. They can reassure her that she is too big to go down the drain and that they will keep her safe. It’s important to respect toddlers’ fears and to understand that, for them, it is often the case that seeing is believing. “The soap slipped down the drain, so I can, too.”

Moving Toward Abstract Thinking

At around 18 months, emerging language and long-term memory pull toddlers out of the purely sensory world into more complex, abstract thinking. They begin to grasp concepts such as cause and effect. (Cause and effect is almost impossible for many adults to comprehend; their “development” is stuck at this stage – “culture” and social pressure either “affirm” faulty designation of cause as “magical-supernatural” and/or fail to “teach” and develop reasoning skills.) Difficulties begin because their reasoning, which seems quite logical to them, has little connection with reality. For example, 20-month-old Jason spills a small amount of juice on the table just before a baby in the room lets out a piercing cry. Jason’s expression becomes very sad and serious. We can’t know for sure – that’s the challenge of caring for preverbal children – but Jason may think his accidental action caused the baby to cry.

Developing Theories

A thriving 2-year-old is a busy scientist actively exploring and creating his own theories about how things work. Julian loves to turn lights on and off. Does he think it is his fingertip that magically creates light and dark? Or is it the blinking of his eyes each time he flicks the switch? Two-year-olds do not have enough information about the world yet to draw reasonable conclusions. (American education fails to supply information about reality – math, science, nature – it confirms the infantile belief that reality is created by “emotional demand” and by spells, chants, rituals – consumerism, “brand” shopping, “free” money-credit – that will “magically” fulfill narcissistic focus. Narcissism is necessary to infants; in adults it is destructive.)

Remember that magical thinking is the very young child’s way of trying to figure out how things work. (Sadly, social typical adults don’t usually “try to figure out” how things work: they blindly consume whatever explanations are supplied by religious dogma, social indoctrination and cultural propaganda.)

Stage by Stage 0 – 2

  • Babies need to be the center of a loving, predictable world – the essential core experience for all kinds of thinking, both magical and rational.
  • Toddlers base their thinking on what they see, hear, and feel – often resulting in inaccurate but creative conclusions. (Improper word – these are “false” conclusions.)
  • Two-year-olds work hard, through much exploration, at developing their unique theories about the world. (This is not allowed to happen: socialization supplies “absolute supernatural theories” which children are coerced into accepting.)

Stage by Stage 3 – 4

  • Threes and fours often use magical thinking to explain causes of events.
  • Preschoolers sometimes assign their own thinking as a reason for occurrences that are actually out of their control.
  • Three- and 4-year-olds believe, with their powers of magical thinking, that they can change reality into anything they wish. (This delusion is as American as apple pie!)

Stage by Stage 5 – 6

  • Fives and sixes move in and out of magical thinking as explanations for what they see. (But “modern social humans” do not; this development is not “automatic” – that is a fantasy of deterministic behaviorist psychology.) Kindergartners use dramatic play as a way to sort through what is fantasy and what is reality. (American culture presents fantasy as “true reality” and reality – adulthood – as abnormal and “bad.”)
  • Five- and 6-year-olds are still in an animistic stage, thinking inanimate objects can come alive. (Observe advertising and marketing, which depend on this developmental stage; “useless and deceptive” products are guaranteed to magically “turn fantasy into reality” by contagious magic.)


“The Secret History of Emotions” / L.F. Barrett – Constructed Emotion

Comment: Emotions ARE SOCIALLY CONSTRUCTED when learning verbal language. See: https://aspergerhuman.wordpress.com/2015/12/03/empathy-and-emotion-are-words-that-describe-pain/

and: https://aspergerhuman.wordpress.com/2016/03/03/learning-emotion-words-socialization/

The Chronicle Review of Higher Education

(Bold highlights are mine)

http://www.chronicle.com/article/The-Secret-History-of-Emotions/239357

By Lisa Feldman Barrett March 05, 2017

On a brisk fall day in 2006, I was sitting on the floor of my former office in the Boston College psychology department, weeding through boxes of old journal articles on the science of emotion. As I perched in the center of a pile, I came across a tattered paper by a psychologist named Elizabeth Duffy, dated 1957, titled “The Psychological Significance of the Concept of Arousal or Activation.” I vaguely remembered reading it in graduate school, but the details were foggy. Probably worth rereading, I thought, and spared it from the recycling bin.

I had no idea that this action would lead me to unearth two major errors in psychology and a half-century of lost research.

Before I can tell you that story, you’ll need to understand how the science of emotion came to be. Most scientists who study it would relate a history roughly like this:

Once upon a time, people believed that the human mind was bestowed by gods or God. Emotions, in contrast, were said to live within the body, like an inner beast that needed to be controlled by divine, rational thought. In the 19th century, Charles Darwin replaced God with natural selection, and shortly thereafter, psychology was born. A golden age of emotion research began, as neurologists and physiologists searched for the physical basis of emotions. They discovered that emotions live in ancient parts of the brain that control the body: the mythical “inner beast” made real. These scientists’ triumph was short-lived, however, as the science of emotion soon plunged into a “dark ages.” Psychology fell prey to a scourge known as behaviorism, the study of pure behavior, in which intangibles like thoughts and feelings were deemed unmeasurable and therefore irrelevant to science. Nothing worthwhile was published on emotions for half a century. Then the cognitive revolution arrived, in the 1960s, rescuing psychology from the darkness, and the science of emotion experienced a renaissance. Emotions were discovered once and for all to have distinct and universal facial expressions, bodily patterns, and brain circuitry, and we all lived happily ever after.

Pick up any psychology textbook or read Wikipedia, and you’ll see some variation of that story: that emotions are inherited through natural selection and located in specific parts of the brain that trigger distinct reactions — the “fingerprints” of emotion — in the face and body. See a snake slither across your path, for example, and a “fear circuit” is said to cause your heart to race, your eyes to widen, your voice to shriek. If you’ve ever heard that emotions live in a “limbic system” in the brain, that you have a “lizard brain” that triggers your emotions, or that fear lives in a region called the amygdala, those ideas are rooted in the same story. So is the movie Inside Out, a children’s fantasy about emotions as individual characters in the brain, which was described by National Public Radio as “remarkably true to what scientists have learned about the mind, emotion, and memory.”

The story of how we came to the classical view of emotion has influenced generations of scientists, educated millions of students, and set the course of psychological research for decades. But it’s a fiction. The details about Darwin, the dark ages of behaviorism, and the subsequent rescue and renaissance bear only a passing resemblance to the facts. That’s what Elizabeth Duffy’s paper was about to teach me.

An extensive body of research points to a wholly different view of what emotions are. They are not caused by dedicated brain circuits that, in certain circumstances, flip on and make you feel and move a particular way. Rather, emotions are whole-brain affairs. Happiness, surprise, anger, and the rest are constructed in the moment by general-purpose systems throughout the brain, the same systems that create thoughts, memories, sights, sounds, smells, and other mental phenomena. The name for this alternative view is “construction,” and my particular approach is called the theory of constructed emotion.

Construction eschews “fingerprints” and points out the variety of emotion in real life. In anger, your heart rate might go up, go down, or stay the same. Your eyes might widen, narrow, or close. The so-called fingerprints of emotion, like a grimace and elevated blood pressure for anger, are merely cultural stereotypes. They are reinforced by popular TV shows like Daredevil and Lie to Me, in which people’s innermost thoughts and feelings are revealed by facial movements and heartbeats. My lab has copious data showing that emotions have no consistent patterns in the face, body, and brain, however, including a meta-analysis of 22,000 test subjects across more than 220 studies of peripheral physiological changes during emotion, and another meta-analysis of every published neuroimaging study of emotion.

I had begun graduate school believing in the classical view of emotion and its dignified history. By the time I encountered Elizabeth Duffy’s paper, I’d been publishing about construction for several years. However, I still believed the part about behaviorism, when nothing much happened in emotion research from about 1910 to 1960. Behaviorism redefined emotions as observable behaviors: Fear was defined as freezing in place; happiness as a tasty treat at the end of a maze. Many psychologists today consider the period of behaviorism to be scientifically bankrupt, producing little knowledge of any value about the human mind.

Reading Duffy’s paper, what caught my eye was the list of references at the end. Two of them, from the 1930s and ’40s, also written by Duffy, were unknown to me, which was odd because their titles sounded remarkably relevant to my research. When I tracked them down, I was dumbfounded. Duffy was making exactly the same points that I had made in a recent paper, questioning whether the scientific evidence on emotion really supports the classical view. But she’d done it 70 years earlier, when supposedly nobody was studying such things.

Her two papers were clearly crucial to the field. Why hadn’t I heard of them? Back in my office, I searched and located a few authors who had cited Duffy here and there over the past 60 years, but for the most part, the field had overlooked her.

I had stumbled onto a mystery. But I didn’t know how big it was going to get.

Duffy’s references led me to several other unfamiliar papers that tried in vain to locate emotion fingerprints. Unlike behaviorists, these researchers weren’t saying that emotions don’t exist. They were running experiments to find physical markers of distinct emotions, failing to do so, concluding that the classical view was unjustified, and speculating about what would later be called construction.

The list of references kept growing, and soon I had more than a dozen of these mystery papers, enough to make me wonder what the hell was going on. Together with one of my sharpest graduate students, I hunted for more papers in earnest and started buying rare, used psychology texts online. My husband was bemused by the steady stream of small packages from Amazon and the timeworn books inside them. We bought another bookcase. Then another.

Little by little, I headed backward in time. From Duffy and her peers in the 1930s and ’40s, to a trove of obscure work dating back to the turn of the century, and then to textbooks on emotion written in the mid to late 1800s. My new bookshelves creaked. I was looking at a mountain of research that was critical of the classical view: more than 100 little-known works spanning at least five decades.

Once I’d reached back into the 1800s, I turned to the work of luminaries in the field of emotion, including Charles Darwin and William James, that I’d last encountered in graduate school. This time around, rather than read bits and pieces or interpretations by other scholars, I pored over the original books in their entirety. They were eye-opening in ways I had not expected.

First up was Darwin’s The Expression of the Emotions in Man and Animals, which has been lauded for more than a century for demonstrating that facial expressions are useful and functional products of natural selection. I was stunned to discover that the book says nothing of the sort. Natural selection is barely mentioned, and Darwin never claims that facial expressions are functional. Quite the opposite: He repeatedly calls them vestigial and “purposeless”! Virtually everyone in my field, for reasons unknown, was citing Darwin’s ideas on emotional expressions inaccurately.

After Darwin, I reread William James, considered a father of modern psychology. James is widely known for saying that every type of emotion has a distinct fingerprint in the body. You can find this claim about James in undergraduate textbooks, in scholarly papers, and in best sellers. And yet, the more James I read in the original, the less plausible the claim became. A whole section in his classic Principles of Psychology, Volume 2, is titled “No Special Brain-Centres for Emotion.” And I kept encountering criticisms of the idea of emotion fingerprints, such as “ ‘Fear’ of getting wet is not the same fear as fear of a bear” (in “The Physical Basis of Emotion“). Ultimately, I discovered that James had been wildly misinterpreted. He never said that every type of emotion has a distinct bodily state. He said every instance of emotion may have a distinct bodily state — in other words, variety is the norm. That is the opposite of a fingerprint.

After some research, I uncovered how Darwin’s and James’s words had become twisted into these alternative meanings. In both cases, other scientists had reinterpreted the original text, and their modifications were wrongly attributed back to Darwin and James. Each mistake has endured for a century, becoming a firm yet false basis of the classical view of emotion, misleading generations of students, and wasting billions of dollars of research money in search of emotion fingerprints.

My findings implied an entirely different history of emotion research, one that is not kind to the classical view. Darwin and James could no longer be seen as the foundation of this view, and the so-called dark ages had actually been a period of tremendous innovation and evidence against the view.

So, how did these errors and oversights happen? Were 50 years’ worth of research papers accidentally overlooked, actively ignored, or intentionally suppressed? As with most historical events, there’s probably more than one cause.

A first possibility is that the “dark ages” of emotion never existed. What people call “history” is just a representation of the past that helps make sense of the present. People are creative historians who craft a story somewhere between fact and fiction. (Therapists know this, as does anyone who has tried online dating.) The history of scientific ideas is no exception.

One example is the “flat earth” myth. Students today learn that people of the Middle Ages thought the world was flat, and that Columbus set sail to prove it round. But that history is not true. The myth was propagated in the early 19th century to embellish a story about how the Age of Reason (science) triumphed over the ignorance of faith (religion).

Scientific progress sounds more impressive when it’s portrayed as a beacon of light suddenly appearing after decades or centuries of darkness, when in actuality those ideas have been around for ages. It’s possible that in a similar manner, the so-called dark ages of emotion research were manufactured to make the “renaissance” of the classical view viable.

A more mundane possibility is that the ideas of Duffy and her colleagues never took root because they did not offer a fully formed alternative model to compete with the classical view. They had a critique of the dominant scientific view, but dissent alone was not enough to remain relevant. As the philosopher Thomas Kuhn wrote about the structure of scientific revolutions: “Because there is no such thing as research in the absence of a paradigm, to reject one paradigm without simultaneously substituting another is to reject science itself.”

But the most likely reason that the classical view persisted, I believe, is that it’s not just a view of emotion. It also represents a compelling story of what it means to be a human being. It says that you are an animal at the core, at the mercy of automatic emotions that you regulate by that most human of abilities, rational thought. This view of human nature is deeply embedded in society. It’s in the legal system, which distinguishes between calculated crimes, such as first-degree murder, and crimes of passion, in which your emotions “take you over” and you are partially absolved of responsibility. It’s in economics, forming the foundation of theories about rational and irrational investors. It’s in health care, as autistic children are taught stereotypical facial poses ostensibly to help them recognize emotions in others. It’s in stereotypes of men versus women, in which women are believed to be innately more emotional than men.

Construction theories of emotion are an ambassador for an entirely different view of human nature. Your mind cannot be a battleground between animalistic emotions and rational thoughts, because the brain has no separate systems for emotion and cognition. Instances of both are constructed by the same set of brainwide networks working collaboratively. Scientists didn’t know this in Elizabeth Duffy’s time, but modern neuroscience has confirmed it.

These observations force us to reconsider some of the most fundamental tenets of law, economics, psychology, health care, and other areas of life. Yes, Yes, Yes.

In addition, the classical view of human nature, with its tale of ancient emotion circuits robed in rationality, depicts humankind as the pinnacle of evolution. Construction uncomfortably dislodges us from this honored position. Yes, we’re the only animal that can design nuclear reactors, but other creatures eat our lunch when it comes to other abilities, like remembering fine details (a strength of the chimpanzee brain) or even adapting to new situations (where bacteria reign supreme). Natural selection did not aim itself toward us — we’re just an interesting sort of animal with particular adaptations that helped us survive and reproduce. Construction teaches us that our brain is not more highly evolved, just differently evolved. That’s a humbling message to swallow in Duffy’s time and in ours.

We might never know why 50 years of research fell off the map. What is most important is to rediscover what was lost. Today we can peer harmlessly into a living human brain, and we have computers to gather and process data. It’s pretty clear that emotions are constructed, not lurking in dedicated brain circuits. At long last, we are on a scientific path marked by the data, rather than ideology, to understand emotion and ourselves.

Lisa Feldman Barrett is a professor of psychology at Northeastern University and the author of How Emotions Are Made: The Secret Life of the Brain (Houghton Mifflin Harcourt), published this month.

 

Memoir / Impossible for an Asperger?

A photo that tells me of a time when my inner and outer experiences were undivided.

 

Rethinking a Life

A few thoughts on writing memoir:

How much talent is lost because society doesn’t like the package it comes in? The individual is a rare creature: she is self-made, not designated by a political system. A political statement of rights does not make one an individual: those rights are defined and held in check by the society that grants them. As a young woman who wanted to achieve financial equality and independence, I was told that the ticket to “getting in” to the system was to adopt the very structure that denied opportunity to women. I was also horrified to learn that I was expected to drop my gender at the door, along with my personality, values, individual potential, and, most surprisingly, talents that might benefit an employer. Another shock: I learned that this defacement is what men have been required to accept for centuries.

Individuality is a function of personal qualities that are cast against the vast historical canvas of culture, and in many ways the individual exists in opposition to that picture. Identity is a package prepared by generations of ancestors, as well as the living family, long before a child is born. Father and mother shape a child’s beliefs, behavior, and future. The larger society sets the rules of membership, which can be extremely harsh. The individual is born and dies when each of us is assigned a role dictated by ideas, prescriptions, and absolutes that the individual has no part in creating. The result is that when one looks into a mirror, a shadow feeling haunts the body: that is not any face; it is my face, unique in all the world. Why then, do I not know myself?

To write a memoir is to tear oneself loose from social conformity and to declare that one’s life is not the same as any other life, regardless of how similar human lives are. A modern trend in autobiography has led people to think that the writer has amazing secrets to reveal, and that he or she will do just that; why else would one write a book about a life that has yet to be concluded? A memoir is expected to erase the public person, to replace the mask with a livelier, racier, and more interesting person. Family members and close friends are expected to be shocked by revelations, and will claim that they did not know of secret individual choices on the part of someone they thought they knew. The public loves it when a willful individual goes bad. Confession and repentance, in the form of a serial memoir (the trek from talk show to talk show) or a best-selling book, return the stray reprobate to the group. In this sense, a memoir is a religious document.

What then, is an individual? The test is simple: Only an individual can care about the welfare of other individuals. The group, by definition, cannot. The group survives by enforcing conformity and does not recognize that each person has a valid interior life, only that inner lives are suspect. The group pressures and grooms the individual to vanish into a pre-assigned role: the American idea of an individual is someone who utterly conforms to social norms, but does something like skydive with a pet dog, or paint the bedroom orange, or pick a cartoon graphic for a credit card that claims, “I’m whacky! I’m crazy! I’m creative.”

From the point of view of a person with a mental illness, this is ludicrous: normal people have no idea what crazy is. Any deviation from the suffocating religious-patriotic complex of American belief, including what have been called criminal acts, is increasingly regarded as mental deviation and not diversity. To be diagnosed as having a brain disorder automatically puts one outside of society, forcing one to embrace a strange, dangerous, and unsought individuality. The memoir of such a person cannot be separated from this predicament, which can be described as the terror and the mystery of the conventional.

A social view of life as a program to be fulfilled, and only completed at death.
“No one ought to be said to be happy, until death and the last funeral rites.” Ovid

 

 

Research Paper: Neanderthal – Early AMH Baby Brains / Erectus

Published online before print September 8, 2008, doi: 10.1073/pnas.0803917105
PNAS September 16, 2008, vol. 105, no. 37, 13764–13768

Neanderthal brain size at birth provides insights into the evolution of human life history

Marcia S. Ponce de León, Lubov Golovanova, Vladimir Doronichev, Galina Romanova, Takeru Akazawa, Osamu Kondo, Hajime Ishida, and Christoph P. E. Zollikofer

From Article / Discussion – Much more at PNAS

These findings permit several inferences regarding the evolution of brain growth patterns and of human life history. A neonate brain size of ≈400 ccm is most likely a feature of the last common ancestor of Neanderthals and AMHS, and it might represent the upper physiological and obstetrical limit that can be attained in hominins, irrespective of the course of postnatal brain expansion.

Various studies have proposed that a large neonatal brain size (≈300 ccm) and secondary altriciality were features already present in H. erectus (4, 9, 35) (estimates are summarized in SI Text, Estimates of Homo erectus Neonatal Brain Size). Because fetal brain growth requires substantial maternal energy investment (36), a large neonatal brain size must have represented a significant selective advantage in H. erectus, possibly by providing the primary substrate for complex learning tasks during childhood (4). Likewise, the high early postnatal brain growth rates of Neanderthals and AMHS compared with chimpanzees, which imply a more than twofold increase of ECV during the 1st year of life, might be a feature of their last common ancestor, and there is evidence that high postnatal brain growth rates might already have evolved in H. erectus (4, 9). Overall, therefore, our data support the hypothesis (4, 9) that the origins of “modern” human-like patterns of brain growth and life history must be sought relatively early during the evolution of the genus Homo.

High postnatal brain growth rates have been interpreted as an evolutionary extension of fetal modes of growth into early infancy (36), and this is thought to be the main ontogenetic mechanism to attain an exceptionally high degree of encephalization during adulthood (2). What are the implications of this pattern of brain growth for life history evolution? Recent analyses suggest that the correlation between brain growth patterns, adult brain size, and life history is indirect and results from maternal energetic constraints (2, 5, 7): The additional energetic costs of the fast-growing infant brain are mainly sustained by the mother, such that species investing in large infant brains that grow at high rates to reach large adult sizes require large, late-maturing mothers (2, 5).

In this context, the higher early brain growth rates and larger adult brains of the Neanderthals compared with rAMHS have interesting implications. The pattern of Neanderthal brain growth fits into the general pattern of rate hypermorphosis in this species: Compared with rAMHS, Neanderthals have been shown to attain larger adult cranial sizes and more advanced (peramorphic) shapes within a given period of ontogenetic time (31). Rate hypermorphosis might be a correlate of greater average body size in Neanderthals compared with rAMHS (21, 22). However, it does not imply earlier cessation of brain growth (Fig. 4B), nor does it imply a faster pace of life history (as was suggested in refs. 18 and 19): In light of the maternal energetic constraints hypothesis (2, 5), our results suggest that Neanderthal life history had a similarly slow pace as that of rAMHS, and probably was even somewhat slower.

What are the potential developmental, cognitive, and phyletic implications of these subtle developmental differences between the brains of Neanderthals and rAMHS? In recent humans, the temporal course of endocranial volume expansion is only loosely correlated with the temporal course of brain maturation (37), such that hypotheses regarding differences in cognitive development cannot be substantiated with fossil evidence. Nevertheless, several hypotheses can be proposed to explain how differences in brain growth rates between Neanderthals and AMHS evolved. High brain growth rates in the Neanderthals could represent a derived feature. This hypothesis would be in concert with the notion that Neanderthal morphology is derived in many respects. As an alternative hypothesis, high rates in the Neanderthals might represent an ancestral condition, probably shared with eAMHS as opposed to rAMHS, whose lower brain growth rates would represent a derived condition. Support for this hypothesis comes from the observation that adult brain size of eAMHS was similar in range to that of the Neanderthals (38, 39), such that it is likely that brain growth rates were similar as well.

According to this second hypothesis, the high ancestral rates of brain growth were probably reduced only relatively recently during AMHS evolution. Brain size reduction in AMHS during the late Pleistocene is well documented, and it went in parallel with body size reduction (39). We can only speculate on potential selective constraints driving this evolutionary trend toward rate hypomorphosis. Evidence from recent human populations indicates that size reduction is correlated with faster life histories and higher mortality risks (40). Alternatively, brain size reduction during the Late Pleistocene could be the result of an evolutionary performance optimization. Evidence for substantial cerebral reorganization comes from Late Pleistocene AMHS (Cro-Magnon 1) and Neanderthals (La Chapelle-aux-Saints 1, La Ferrassie 1, and Forbes’ Quarry 1), which had larger cerebral hemispheres relative to cerebellum volume than modern humans (41). It could be argued that growing smaller—but similarly efficient—brains required less energy investment and might ultimately have led to higher net reproduction rates. Such an evolutionary shift might have contributed to the rapid expansion of Upper Paleolithic AMHS populations into Eurasia.

The notion that genes down-regulating rates of early brain growth might have contributed to the fitness of our own species is an intriguing, but testable, hypothesis. Genes involved in the regulation of brain growth that show evidence of recent selective sweeps are of special interest (42, 43), but their known normal variants do not account for variation in brain size (44). Further research is thus necessary to clarify the genetic basis of brain and body size variation in modern humans and its relationship to life history variation.

Overall, integrating neurocranial, dentognathic, and postcranial data on Neanderthal and AMHS development reveals a complex pattern of between-taxon and within-taxon variability of life history-related variables, and indicates that hominin life history evolution was a modular (5), mosaic-like, rather than a linear, process. Inferences on the evolution of hominin life history and cognitive development must be drawn with caution, especially when drawn from isolated aspects of fossil morphology.

                

Using Numbers to Express Emotion / Chat Room Chat

(My comments in olive green.) A common difficulty in communication between Asperger’s and social typicals is that specific words and word concepts (such as emotions, empathy) do not have common or shared “meanings.”
This is not superficial. It reflects differences in the act of communication itself; in the (expected) intent, utility, and outcomes of communication. Social use of language often seems “nebulous,” “self-referencing,” “vague,” and “pointless” to a concrete visual thinker, whose brain is set to problem-solving mode against a background of logical “natural” structures.
Social typical language is about human relationships that define status on a social hierarchy – a system driven by rigid rules, yet perpetually “under negotiation” at personal, group, and class boundaries.

In essence, Asperger types and social types are not talking to each other at all, but about distinct mental “universes” that arise from very different perceptions of the environment – “reality.”

Trying to establish “contact” with Asperger’s individuals by forcing them to “reveal social-emotional states” is counterproductive; in fact it is outside our experience of reality. Trying to establish contact with social typicals by “sharing” the fascinating facts of physics, steam locomotion, forest growth cycles or geologic processes is equally hopeless.

No, I don’t have a solution. (Perhaps a sense of humor of the “absurd” kind aids tolerance, at least LOL)

Numbers / Empathy / Emotion

An Asperger chat line exchange: 

MOMBOY: My mom mentioned I don’t have much empathy. I told her it was a useless emotion. (Empathy: Is it an emotion, a behavior, a brain function, a concept, or science fiction?) I said that I do help people sometimes to make up for it, minus the emotional baggage. I told her that I usually have an empathy of 2/100. I said it peaks at 20/100. Then she laughed that I use numbers to describe it. She implied that I pull these numbers out of thin air. I feel like these numbers are ways to express approximations. Does anyone else use numbers to describe feelings? Is this funny to you?

MR FORMAL: What works for you, works for you, but when communicating with others you should consider trying to stick with known contexts (words). Otherwise you get weird looks. The point is, there is an amount of conformism that should be observed in order to cleanly operate within society. Humans need a consensus in many different areas in order to get along with each other. Communication is one of the most basic consensus items we use to attain this. If you are communicating in another language (foreign, numerical, or invented) then you will be misunderstood by the majority around you. Predict your future if you stick with describing yourself in unconventional ways while in the company of others. The case for social conformity, at least in public.

MOMBOY: You make a good point. However, this is my mother so I can get away with it.  And saying “I don’t possess empathy for other humans and you’re not smart enough to understand why” would not score me any more social points than my fractions would. The use of numbers to disguise true feelings and thoughts?

MS LOGIC: It depends on how you think. Being left-brained, I always use numbers to describe things, but most people don’t. Being a person of math, most things are approximations; only theorems are absolute, after all. Approximations = shades of emotion.

MOMBOY: When I think of something that has a quantitative value (amount of empathy or something), I see a glass of liquid and I gauge how full it is. Numeric is then the most logical way. Visual conversion of quantity.

MS LOGIC: I don’t see a glass of liquid, but I agree that numbers tend to be the most logical representation. That way you can organize things based on situation. Just to clarify, my give-a-shit meter works on a percentage scale. Your fraction based system is very similar though. (Fractions are percentages!) Visualizing quantities – numbers.

MR FORMAL: A perfect example of communication building based on associating language with context. Keep this up and you guys will be speaking half in numbers and half in words.

MOMBOY: I don’t use the numbers in speech unless someone asks me how much empathy I feel for a certain situation. That has yet to happen, so for now my mind just uses the numbers as markers. The numbers gauge things comparatively against what you think about other events, not comparing what you think to what others think.

MR FORMAL: You misunderstand me, however; I was thinking along the lines of you both developing a new way of expressing something through a hybridization of an old language and a new element. And I was running a social simulation in my head where human language used numbers instead of words to describe feelings. Creative compromise.

NEWBOY: The numbers are useless without context. 1/100 could be referencing happiness, grief, or relief. I don’t think a language based entirely off of numbers is practical. Unless of course you change certain numbers to be words. As in when I say “5” it holds the same meaning as the word empathy. That would be rather clumsy though. There are languages based on numbers – they’re called CODES.

MOMBOY: There is little difference between how we use numbers and how other people use terms such as a lot, somewhat, strongly, very etc…

NEWGIRL: We just use the numbers because they are more concise and pleasing to us. In reality there is almost no difference between these two: “I strongly empathize with XXX” and “I empathize 90% with XXX,” even though it may sound foreign to some people.

MOMBOY: Exactly. Numbers allow thoughts to be clarified in ways that are often notable. By supporting someone’s actions 85 percent (instead of saying support them greatly), the 15 percent left speaks volumes. It leaves room for a lecture, (fudging between saying what you think and limiting the social blowback?) yet still conveys that you are not in terrible opposition to the person’s actions; thus they needn’t be too troubled by your critique. Numbers (in the right context of course) can say a lot to people.
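
MOMBOY’s point – that numbers and vague verbal quantifiers carry the same kind of information – can be sketched in a few lines. This is purely illustrative; the function name and the cutoff values are invented for the example, not taken from the chat:

```python
# Illustrative only: translate a numeric intensity (e.g. empathy 85/100)
# into the vague verbal qualifier social language would use instead.
# The cutoff values below are arbitrary assumptions made for this sketch.

def to_qualifier(score, scale=100):
    """Map an intensity score onto a verbal quantifier."""
    pct = 100 * score / scale
    if pct < 5:
        return "not at all"
    elif pct < 25:
        return "slightly"
    elif pct < 50:
        return "somewhat"
    elif pct < 75:
        return "strongly"
    return "very strongly"

print(to_qualifier(85))   # "very strongly"
print(to_qualifier(2))    # "not at all" - MOMBOY's usual 2/100
```

Notice that going from 85/100 to “very strongly” throws away exactly the information MOMBOY values: the 15 percent left over.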

MS. DAISY: Makes perfect sense to me. I usually either feel an emotion or I don’t, so I don’t think that quantifying them would help me very much. With that in mind, though, I think it’s a very useful concept, at least for introspection. I doubt most people would appreciate the numeric representation of emotions, as it probably comes off as being a bit cold.

MOMBOY: The touchy feely types may not appreciate such a mathematical approach to detecting emotion.

JOINER: To me, emotion is like a smoke or a fog that moves a bit like liquid. Emotion is very ethereal. Feeling emotion is like a mystical treasure. But deciding how important it is? That must be done with the most precise logic. Very visual experience – of a physical state.

MR MATH: I describe feelings with numbers most of the time as well because it’s easier for me to explain feelings this way. I suppose it just depends on whom you’re talking to whether they’ll appreciate it or not. Most of the people in my life have gotten used to it. I can explain my interest in someone in terms of “It’s 10% friendship, 20% …” Having to translate physical feeling (emotion) into numbers in order to describe it.

MR FORMAL: I have a couple of published papers on quantification of soft cognition: beliefs, hunches, biases, assumptions, uncertainty, emotional mood, etc. This is for research related to applications in artificial intelligence. Some of my recent research is based upon a “calculus” I have developed – Bias-Based Reasoning, which mathematizes mental percepts.

MS. DAISY: Sometimes I do quantify a feeling in terms of how much I would spend on something. (Me) This I can relate to: if I want to limit calories to 1200/day, the calories automatically convert to dollars and I “spend them” on food. It’s very different to think of a 400-calorie chocolate bar costing $4.00 out of a $12.00 budget, which shows that it’s not a “bargain.”
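
The calories-as-dollars conversion is simple arithmetic: a 1200-calorie daily limit treated as a $12.00 daily budget prices each calorie at one cent. A minimal sketch (the names are invented for illustration):

```python
# Convert calories into "dollars" against a daily budget, as described
# above: 1200 calories/day is treated as a $12.00/day budget, so each
# calorie "costs" one cent.

DAILY_CALORIES = 1200
DAILY_BUDGET = 12.00  # dollars

def calorie_cost(calories):
    """Return the dollar 'price' of a food against the daily budget."""
    return round(calories * DAILY_BUDGET / DAILY_CALORIES, 2)

print(calorie_cost(400))   # 4.0 - a 400-calorie bar "costs" $4.00 of $12.00
```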

MR MATH: My natural tendency is to use numbers. I think in terms of a horizontal line with 100% at one side and 0% at the other. I have learned to edit out the % with most people as I feel I come across very nerdy and I know most people don’t relate to it. Visual conversion of numbers. This is familiar to me as a visual thinker.

NEW GIRL: I would say my empathy can get to 8/10 at extreme conditions, usually falls around a steady 0.37/10.00 and coasts up to 3/10 sometimes. A bit wacky, but perhaps charming. LOL!

AAP Pediatrics / Language – Emotion Co-development

American Academy of Pediatrics November 1998, VOLUME 102 / ISSUE Supplement E1 (Click here for extensive list of articles on child development)

Language Development and Emotional Expression

(this article) http://pediatrics.aappublications.org/content/102/supplement_E1/1272/

Lois Bloom

Excerpt: The relation of language and emotion in development is most often thought about in terms of how language describes emotional experiences with words that name different feelings. Not surprisingly, therefore, developmental studies of emotion and language typically have described how children acquire emotion labels, such as “mad,” “happy,” “scared.”1–3 However, children typically do not begin to use these words until language development is well underway, at approximately 2 years of age. Other studies have described how caregivers use emotion words when talking to their infants in the first year. Caregivers are very good, almost from the beginning, at attributing particular emotions to a young infant’s cries, whines, whimpers, smiles, and laughs, for example, “what a happy baby,” “don’t be so sad,” “are you angry?”4,5 However, once infants begin to learn language, mothers are far less likely to name a child’s emotion than to talk about the situations and reasons for the child’s feelings and what might be done about them.6,7

This research emphasis on the words that name emotions has at least these two limitations. First, the number of emotion words in the dictionary is small —at most, a few dozen terms for emotions and feeling states—compared with the enormous number of names in a dictionary for objects and actions. Second, the emotional expressions of infants and young children generally are transparent in their emotional meaning. Thus, the label for an emotion is very often redundant with its expression and adds no new information. Given the relatively small number of words for naming feelings and emotions, and the redundancy between emotion words and the expressions they name, understanding how emotion and language are related in early development requires looking beyond just acquisition of specific emotion words.

STUDYING LANGUAGE ACQUISITION IN ITS DEVELOPMENTAL CONTEXT

The core of development that brings an infant to the threshold of language in the second year of life is the convergence of emotion, cognition, and social connectedness to other persons.8,,9 Children learn language initially because they strive to connect with other persons to share what they are feeling and thinking. When language begins toward the end of the first year, infants have had a year of learning about the world. The results of their cognitive developments have given children contents of mind—beliefs, desires, and feelings—that have to be expressed because they are increasingly elaborated and discrepant from what other persons can see and hear in the context. Language expresses and articulates the elements, roles, and relationships in mental meanings in a way that a child’s smiles, cries, frowns, and whines cannot. Language, then, emerges in the second year out of a nexus of developments in emotion, social connectedness, and cognition.

For the past 10 years, I have been studying how language comes together with the cognitive, emotional, and social developments of the first 3 years of life,8 with the basic assumption that language acquisition is tied to other developments in a child’s life. The knowledge we set out to explain was language: how children learn words in the second year and then learn to combine words for phrases and simple sentences in the beginning of the third year. Early words are fragile, imprecise, and emerge tentatively at the same time that emotional expressions are robust, frequent, and fully functional. We asked, therefore, how these two systems of expression—emotion and language—come together in the second year of a child’s development. We looked at both the content of developments in emotional expression and language as well as at the process of their interaction.

The model of development that guided our research (Fig 1) built on the link between two well-known concepts in psychology: engagement and effort. Knowledge of language is represented here by the tripartite model of language that Peg Lahey and I introduced 20 years ago. Linguistic form—sounds, words, and syntax—is only part of language, albeit the part that attracts the most attention. Form necessarily interacts with content, or meaning, because language is always about something. And form and content interact with the pragmatics of language use: language is used in different situations, for different purposes and functions. Only one or the other of these components, notably form alone, cannot by itself be a language. Rather, language is, necessarily, the convergence of content, form, and use.10

See original for much more…

CONCLUSIONS

Many questions about the complex developmental relationship between language and emotion remain for additional research, but our findings provide some insight into the effort and engagement required by both language learning and emotional expression. We propose that the heart of language acquisition is in the dialectic tension between the two psychological components of effort and engagement (Fig 1).

To begin with, a language will never be acquired without engagement in a world of persons, objects, and events—the world that language is about and in which language is used. The concept of engagement embraces the social, affective, and emotional factors that figure into language learning. Other persons and the social context are required, because the motivation for learning a language is to express and interpret contents of mind so that child and others can share what each is thinking and feeling (the principle of discrepancy).

Affect and emotional expression are required for establishing inter-subjectivity and sharing between child and caregiver before language and also for motivating a child’s attention and involvement with people, objects, and events for learning language. The relevance of adult behavior is ensured when adults tune into what a child is feeling and thinking.

Asperger comment: If the caregiver is ONLY INTERESTED in his or her own expectations of what “ought to be” going on in the child’s mind, and rejects or ignores what the child is feeling and thinking, then this “motivation” for learning and using language may be blunted or severely damaged.

Language is learned when the words a child hears are about the objects of engagement, interest, and feelings—about what the child has in mind (the principle of relevance). In turn, children use the language they are learning for talking about the things they care about—the objects of their engagement.

Asperger comment: Ridiculing an ASD or Asperger child’s interests, which is what happens consistently (the train schedule cliché); cutting the child off in conversation, and angry responses to “stupid topics that no one wants to hear about” guarantees feelings of shame, rejection and withdrawal from social interaction.

Acquiring language requires effort, first, for setting up the meanings in consciousness that language expresses or that results from interpreting the expressions of others. Second, additional effort is required for learning the increasingly complex language needed to express and articulate the increasingly elaborated mental meanings that are made possible by developments in cognition (the principle of elaboration). And third, effort also is required for coordinating different kinds of behaviors—such as talking, expressing emotion, and playing with objects (as described by Bloom and associates11)—that make up the ordinary activities of a young child’s life. Neither speech nor emotional expression occurs in isolation; they are always and necessarily embedded in complex events.

In summary, language and emotion are related in complex ways in the process of development. Language is created by a child in the dynamic contexts and circumstances that make up the child’s world, and acquiring a language requires both engagement and effort. A child’s feelings and emotions are central to engagement with the personal and physical world and determine the relevance of language for learning. And the effect of the effort needed to coordinate cognitive, emotional, and linguistic resources for learning language is to recruit states of neutral affect for attention and processing. Children who began to learn words early spent more time in neutral affect (the Asperger “Little Professor” label?), whereas children who learned words somewhat later expressed more emotion instead. Effort also was apparent in the timing relation of speech and emotional expression at the transition to sentences, especially for the later language learners.

By the time language begins, toward the end of the first year, emotional expression already is well-established and children do not need to learn the names of the emotions to tell other people what they are feeling. But they do need to learn the language to tell other people what their feelings are.

Asperger comment: However, the constant “indoctrination” as to which feelings are socially approved, and which are  socially forbidden, denies the child expression of “negative” emotion – expression that is necessary if children are to learn how to “deal with” inevitable feelings of anger, frustration and discord between people. This is especially true for male children and developmentally diverse children, who are literally “shut down” by adult disapproval of their interests and feelings.

Language does not replace emotional expression. Rather, children learn language for expressing and articulating the objects and circumstances of their emotional experiences while they continue to express emotion with displays of positive and negative affective tone.

_______________________________

American “emotional intelligence” never gets past this judgmental social view of emotion! Americans are so consumed by the “power” of words, that we honestly believe that banning the use of “bad words” magically turns anger into love and  racism into equality. This is the root of politically correct policing of language. All it does is prevent any serious discussion about conditions that very much need serious discussion.

If only we taught children that emotions are fleeting physical reactions – and that rather than banning socially proscribed emotions, the “stop and think” step below, which can be learned, is a path to emotional maturity. Children need to understand our ability to choose how to handle all types of emotion. But! Americans are addicted to anger, rage and violence…

 

 

Significant progress toward real ASD science / Neurobiology paper

Open Access
Comment:
Nice to see these recognitions: 1. There are many “autisms.” 2. Limitations of “behavior-based” diagnosis. 3. Candor regarding the “conclusions” that can legitimately be drawn using specific diagnostic / research methods.

Not quite there yet: 1. Recognition of some of the “autisms” as representing variations on a human brain continuum, and not as “automatic pathologies.” 2. Abandonment of gross “behavior-only” diagnoses, which are highly subjective, culturally contaminated, non-discriminating, and limited by inconsistent diagnostician training. 3. Reorganization of a “human brain spectrum” on a scientifically valid basis that includes “typical” development, “legitimate” variations, and truly harmful, identifiable pathologies.

___________________________________________

Describing the Brain in Autism in Five Dimensions—Magnetic Resonance Imaging-Assisted Diagnosis of Autism Spectrum Disorder Using a Multiparameter Classification Approach

Christine Ecker, Andre Marquand, Janaina Mourão-Miranda, Patrick Johnston, Eileen M. Daly, Michael J. Brammer, Stefanos Maltezos, Clodagh M. Murphy, Dene Robertson, Steven C. Williams and Declan G. M. Murphy

Discussion

Autism affects multiple aspects of the cerebral anatomy, which makes its neuroanatomical correlates inherently difficult to describe. Here, we used a multiparameter classification approach to characterize the complex and subtle gray matter differences in adults with ASD. SVM achieved good separation between groups, and revealed spatially distributed and largely non-overlapping patterns of regions with highest classification weights for each of five morphological features. Our results confirm that the neuroanatomy of ASD is truly multidimensional affecting multiple neural systems. The discriminating patterns detected using SVM may help further exploration of the genetic and neuropathological underpinnings of ASD.

There is good evidence to suggest that several aspects of cerebral morphology are implicated in ASD—including both volumetric and geometric features (Levitt et al., 2003; Nordahl et al., 2007). However, these are normally explored in isolation. Here, we aimed to establish a framework for multiparameter image classification to describe differences in gray matter neuroanatomy in autism in multiple dimensions, and to explore the predictive power of individual parameters for group membership. This was achieved using a multiparameter classifier incorporating volumetric and geometric features at each cerebral vertex. In the left hemisphere, SVM correctly classified 85.0% of all cases overall at a sensitivity and specificity as high as 90.0% and 80.0%, respectively, using all five morphological features. This level of sensitivity compares well with behaviorally guided diagnostic tools, whose accuracies are on average ∼80%. Naturally, one would expect lower sensitivity values than the test used for defining the “autistic prototype” itself (i.e., ADI-R). Thus, if a classifier is trained on the basis of true positives identified by diagnostic tools, the maximal classification accuracy that could be reached is only as good as the measurements used to identify true positives.
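
The reported figures fit together arithmetically. Assuming 20 subjects per group (the group sizes here are an assumption for illustration, not stated in this excerpt), 18/20 ASD cases and 16/20 controls correctly classified give the quoted sensitivity, specificity, and overall accuracy:

```python
# Sensitivity, specificity, and accuracy from a 2x2 confusion matrix.
# tp/fn count ASD subjects; tn/fp count controls.

def sens_spec_accuracy(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)                # ASD cases correctly classified
    specificity = tn / (tn + fp)                # controls correctly classified
    accuracy = (tp + tn) / (tp + fn + tn + fp)  # all correct / all subjects
    return sensitivity, specificity, accuracy

# Hypothetical counts consistent with the reported 90% / 80% / 85%:
sens, spec, acc = sens_spec_accuracy(tp=18, fn=2, tn=16, fp=4)
print(sens, spec, acc)   # 0.9 0.8 0.85
```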

The significant predictive value of pattern classification approaches may have potential clinical applications. Currently, ASD is diagnosed solely on the basis of behavioral criteria. The behavioral diagnosis is however often time consuming and can be problematic, particularly in adults. Also, different biological etiologies might result in the same behavioral phenotype [the “autisms” (Geschwind, 2007)], which is undetectable using behavioral measures alone. Thus, the existence of an ASD biomarker such as brain anatomy might be useful to facilitate and guide the behavioral diagnosis. This would, however, require further extensive exploration in the clinical setting, particularly with regards to classifier specificity to ASD rather than neurodevelopmental conditions in general.

To address the issue of clinical specificity, the established ASD classifier was used to classify individuals with ADHD—a neurodevelopmental control group. Bilaterally, the ASD classifier did not allocate the majority of ADHD subjects to the ASD category. This indicates that it does not perform equally well for other neurodevelopmental conditions, and is more specific to ASD. To further demonstrate that the classification is driven by autistic symptoms, the test margins of individuals with ASD were correlated with measures of symptom severity (Ecker et al., 2010). We found that larger margins were associated with more severe impairments in the social and communication domain of the ADI-R. The classifier therefore seems to use neuroanatomical information specifically related to ASD rather than simply reflecting nonspecific effects introduced by any kind of pathology. However, due to a recent scanner upgrade, ADHD scans were acquired with different acquisition parameters, while manufacturer, field strength, and pulse sequence remained the same. FreeSurfer has been demonstrated to show good test–retest reliability particularly within scanner-manufacturer and field strength (Han et al., 2006), but we cannot exclude the possibility that systematic differences in regional contrast may have affected the ADHD classification. Future research is thus needed to validate the ADHD findings on an independent sample.
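
The margin–severity check described above is, in miniature, a correlation between each subject’s distance from the SVM decision boundary and a symptom-severity score. A plain-Python Pearson correlation sketch; all numbers below are invented for illustration, none are data from the study:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

margins = [0.4, 0.9, 1.3, 1.8, 2.2]    # hypothetical distances from boundary
adi_r_social = [10, 14, 17, 21, 25]    # hypothetical ADI-R social scores
print(round(pearson(margins, adi_r_social), 2))
```

A value near +1 would correspond to the finding that larger margins go with more severe social-communication impairment.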

The overall classification accuracy varied across hemispheres (79.0% left vs 65.0% right) in the absence of interhemispheric differences in parameter variability. Hemisphere laterality is an area that remains relatively unexplored in autism. While our data suggest that the left hemisphere is better at discriminating between groups (i.e., is more “abnormal”), it is unclear whether this discrepancy is due to quantitative differences in parameters or to qualitative aspects of the discriminating patterns (i.e., additional regions). Furthermore, it is also not possible to identify whether individuals with ASD display a higher (lower) degree of cortical asymmetry relative to controls. There is some evidence to suggest that individuals with ASD show a lower degree of “leftward” (i.e., left > right) cortical symmetry than controls (Herbert et al., 2005), which may explain differences in classification accuracy. There is also evidence to suggest that the left hemisphere is under tighter genetic control than the right hemisphere (Thompson et al., 2001), which may be of relevance to a highly heritable condition such as ASD. However, a direct numerical comparison between hemispheres is needed to address this issue directly.

The classification accuracy not only varied across hemispheres but also across morphometric parameters. Bilaterally, cortical thickness provided the best classification accuracy and highest regional weights. Differences in cortical thickness have been reported previously in ASD for both increases (Chung et al., 2005; Hardan et al., 2006) as well as decreases (Chung et al., 2005; Hadjikhani et al., 2006), and in similar regions as reported here (i.e., parietal, temporal, and frontal areas). The overlap with previous studies indicates that these regions display high classification weights due to a quantitative (i.e., “true”) difference rather than high intercorrelations with thickness measures in other brain regions.

Certain geometric features such as average convexity and metric distortion also provided above-chance classifications, particularly in parietal, temporal, and frontal regions, and in areas of the cingulum. Average convexity and metric distortion measure different aspects of cortical geometry (see Materials and Methods) and have previously been linked to ASD, as has sulcal depth (Nordahl et al., 2007). Such geometric features have been suggested to reflect abnormal patterns of cortical connectivity. There have also been reports of abnormal patterns of gyrification (Piven et al., 1990; Hardan et al., 2004) and of large-scale displacements of the major sulci (Levitt et al., 2003). Thus, our study provides further evidence for the hypothesis that the “autistic brain” is not just bigger or smaller but is also abnormally shaped.

While we demonstrated that the neuroanatomy of ASD is multidimensional, the etiology of such multivariate differences remains unclear. Here, little or no spatial overlap was observed between the discriminating patterns for individual parameters. Such region dependency was also observed in the regional morphometric profiles displaying the distribution of weights across multiple cortical features in a region of interest. If one assumes that different cortical features reflect different neuropathological processes, such region- and parameter-dependent variations may reflect the multifactorial etiology of ASD. For example, evidence suggests that cortical thickness and surface area reflect different neurobiological processes and are associated with different genetic mechanisms (Panizzon et al., 2009). Cortical thickness is likely to reflect dendritic arborization (Huttenlocher, 1990) or changing myelination at the gray/white matter interface (Sowell et al., 2004). In contrast, surface area is influenced by the division of progenitor cells in the embryological periventricular area, and is associated with the number of minicolumns (Rakic, 1988). Geometric differences, in turn, are predominantly linked with the development of neuronal connections and cortical patterns of connectivity, and are thus a marker for cerebral development (Armstrong et al., 1995; Van Essen, 1997). It is therefore likely that the reported maps reflect multiple genetic and/or neurobiological etiologies, which need further investigation. Finally, our findings should be interpreted in the context of a number of methodological limitations.

First, the classification algorithm is highly specific to the particular sample used for “training” the classifier, namely high-functioning adults with ASD. The advantage of this approach is that the classifier offers high specificity with regard to this particular subject group, but it is less specific to other cohorts on the spectrum. Due to the small sample size, it was also not possible to reliably investigate differences between high-functioning autism and Asperger’s syndrome. Evidence (Howlin, 2003) suggests that by adulthood these groups are largely indistinguishable at the phenotypic level. However, the extent to which they differ at the level of brain anatomy is unknown, and may be investigated using SVM in the future. Second, 85% of ASD participants in our sample were diagnosed using the ADI-R, and 15% were diagnosed using the ADOS. As the two diagnostic tools measure autistic symptoms at different developmental stages, the classifier may be biased toward individuals with an early diagnosis of ASD. Although classifier performance on the basis of the ADOS and the ADI-R is not expected to differ drastically, diagnostic heterogeneity is a potential limitation. Last, SVM is a multivariate technique and hence offers a limited degree of interpretability of specific network components. Additional analyses such as “searchlight” or “virtual lesion” approaches (Averbeck et al., 2006; Kriegeskorte et al., 2006; Pessoa and Padmala, 2007) may therefore be combined with SVM in the future to establish the relative contribution of individual regions/parameters to the overall classification performance.
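The "virtual lesion" idea mentioned above can be sketched simply: remove one region's features at a time and measure the drop in cross-validated accuracy. This is an illustrative toy, not the cited method as implemented in those papers; region sizes, data, and the informative region are all invented.

```python
# Hypothetical "virtual lesion" sketch: estimate each feature block's
# contribution by deleting it and measuring the accuracy drop.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n, n_regions, feats_per_region = 80, 5, 4
X = rng.normal(size=(n, n_regions * feats_per_region))
y = rng.integers(0, 2, n)
X[y == 1, :feats_per_region] += 1.0   # only "region 0" is informative

def accuracy(data):
    """Mean 5-fold cross-validated accuracy of a linear SVM."""
    return cross_val_score(SVC(kernel="linear"), data, y, cv=5).mean()

baseline = accuracy(X)
for r in range(n_regions):
    # Keep every column except those belonging to region r.
    cols = np.r_[0:r * feats_per_region, (r + 1) * feats_per_region:X.shape[1]]
    drop = baseline - accuracy(X[:, cols])
    print(f"region {r}: accuracy drop = {drop:+.3f}")
```

The region whose removal costs the most accuracy is the one the classifier relies on; lesioning uninformative regions should leave accuracy roughly unchanged.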

Nevertheless, while the classification values and specific patterns we report must be considered preliminary, our study offers a “proof of concept” for describing the complex multidimensional gray matter differences in ASD.

Justice is the product of Reason, not Emotion

Brain scans link concern for justice with reason, not emotion

by Jann Ingmire, MedicalXpress.com

Neuroscience research demonstrates that the brain regions underpinning moral judgment share resources with circuits controlling other capacities such as emotional saliency, mental state understanding and decision-making. Credit: Jean Decety

People who care about justice are swayed more by reason than emotion, according to new brain-scan research from the University of Chicago’s Department of Psychology and Center for Cognitive and Social Neuroscience.

Psychologists have found that some individuals react more strongly than others to situations that invoke a sense of justice—for example, seeing a person being treated unfairly or mercifully. The new study used brain scans to analyze the thought processes of people with high “justice sensitivity.” (One of the “negative symptoms” of Asperger’s is an innate concern for justice, fair play and honesty.) 

Asperger individuals ought never to apologize for being pro-justice: The problem is that there are too many unjust, outdated and arbitrary applications of laws.

“We were interested to examine how individual differences about justice and fairness are represented in the brain to better understand the contribution of emotion and cognition in moral judgment,” explained lead author Jean Decety, the Irving B. Harris Professor of Psychology and Psychiatry.

Using a functional magnetic resonance imaging (fMRI) brain-scanning device, the team studied what happened in the participants’ brains as they judged videos depicting behavior that was morally good or bad. For example, they saw a person put money in a beggar’s cup or kick the beggar’s cup away. The participants were asked to rate on a scale how much they would blame or praise the actor seen in the video. People in the study also completed questionnaires that assessed cognitive and emotional empathy, as well as their justice sensitivity.

As expected, study participants who scored high on the justice sensitivity questionnaire assigned significantly more blame when they were evaluating scenes of harm, Decety said. They also registered more praise for scenes showing a person helping another individual.

 

_____________________________________

Abstract:

The Good, the Bad, and the Just: Justice Sensitivity Predicts Neural Response during Moral Evaluation of Actions Performed by Others

Keith J. Yoder and Jean Decety

In the past decade, a flurry of empirical and theoretical research on morality and empathy has taken place, and interest and usage in the media and the public arena have increased. At times, in both popular culture and academia, morality and empathy are used interchangeably, and quite often the latter is considered to play a foundational role for the former. In this article, we argue that although there is a relationship between morality and empathy, it is not as straightforward as apparent at first glance. Moreover, it is critical to distinguish among the different facets of empathy (emotional sharing, empathic concern, and perspective taking), as each uniquely influences moral cognition and predicts differential outcomes in moral behavior. Empirical evidence and theories from evolutionary biology as well as developmental, behavioral, and affective and social neuroscience are comprehensively integrated in support of this argument. The wealth of findings illustrates a complex and equivocal relationship between morality and empathy. The key to understanding such relations is to be more precise on the concepts being used and, perhaps, abandoning the muddy concept of empathy.

Jean Decety, Professor University of Chicago: Jean Decety is an American and French neuroscientist specializing in developmental neuroscience, affective neuroscience, and social neuroscience. Dr. Decety is the Director of the Brain Research Imaging Center at the University of Chicago Medicine, and the Director of the Child NeuroSuite.