New Study of Studies / Male-to-Female Ratio in ASD

JAACAP / Journal of the American Academy of Child and Adolescent Psychiatry

What Is the Male-to-Female Ratio in Autism Spectrum Disorder? A Systematic Review and Meta-Analysis

University College London, UK

DOI: http://dx.doi.org/10.1016/j.jaac.2017.03.013

Objective

To derive the first systematically calculated estimate of the relative proportion of boys and girls with autism spectrum disorder (ASD) through a meta-analysis of prevalence studies conducted since the introduction of the DSM-IV and the International Classification of Diseases, Tenth Revision. (Thus, conclusions can only be as “accurate” as the data in the original studies.) (Since DSM-5 did away with Asperger’s as a diagnosis, how does this “deletion” affect this study?)

Method

Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. The Medline, Embase, and PsycINFO databases were searched, and study quality was rated using a risk-of-bias tool. Random-effects meta-analysis was used. The pooled outcome measurement was the male-to-female odds ratio (MFOR), namely the odds of being male in the group with ASD compared with the non-ASD group. In effect, this is the ASD male-to-female ratio, controlling for the male-to-female ratio among participants without ASD.
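For readers who want the arithmetic behind the MFOR: for a single study it is just an ordinary odds ratio computed from a two-by-two table of sex by ASD status. Below is a minimal sketch with entirely hypothetical counts (none of these numbers come from the paper); the standard error of the log odds ratio is the per-study uncertainty that a random-effects meta-analysis would take as input.

```python
import math

def male_female_odds_ratio(asd_male, asd_female, non_male, non_female):
    """Odds of being male in the ASD group divided by the odds of being
    male in the non-ASD group (the MFOR for a single study)."""
    odds_asd = asd_male / asd_female      # odds of being male among ASD cases
    odds_non = non_male / non_female      # odds of being male among non-ASD participants
    return odds_asd / odds_non

def se_log_odds_ratio(asd_male, asd_female, non_male, non_female):
    """Approximate standard error of log(OR), the usual meta-analysis input."""
    return math.sqrt(1 / asd_male + 1 / asd_female + 1 / non_male + 1 / non_female)

# Entirely hypothetical counts for one study
mfor = male_female_odds_ratio(asd_male=400, asd_female=100,
                              non_male=50_000, non_female=49_600)
se = se_log_odds_ratio(400, 100, 50_000, 49_600)
print(f"study MFOR = {mfor:.2f}, SE of log(MFOR) = {se:.3f}")
```

Each study’s log(MFOR) is then pooled using random-effects weights; a sketch of that pooling step follows the Results below.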

Results

Fifty-four studies were analyzed, with 13,784,284 participants, of whom 53,712 had ASD (43,972 boys and 9,740 girls). The overall pooled MFOR was 4.20 (95% CI 3.84–4.60), but there was very substantial between-study variability (I² = 90.9%). High-quality studies had a lower MFOR (3.32; 95% CI 2.88–3.84). Studies that screened the general population to identify participants regardless of whether they already had an ASD diagnosis showed a lower MFOR (3.25; 95% CI 2.93–3.62) than studies that only ascertained participants with a pre-existing ASD diagnosis (MFOR 4.56; 95% CI 4.10–5.07).
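The pooled MFOR, its confidence interval, and the I² heterogeneity statistic reported above are standard outputs of a random-effects meta-analysis. Here is a compact sketch of the usual DerSimonian-Laird procedure; the per-study log odds ratios and variances below are hypothetical placeholders, not values from the 54 studies.

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooling of per-study log odds ratios (DerSimonian-Laird)."""
    k = len(log_ors)
    w = [1 / v for v in variances]                            # fixed-effect weights
    fixed_mean = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed_mean) ** 2 for wi, y in zip(w, log_ors))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                        # between-study variance
    w_star = [1 / (v + tau2) for v in variances]              # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0  # heterogeneity, %
    return pooled, se, i2

# Hypothetical per-study log(MFOR) values and variances
log_ors = [math.log(4.5), math.log(3.1), math.log(5.0), math.log(3.4)]
variances = [0.02, 0.015, 0.03, 0.01]
pooled, se, i2 = dersimonian_laird(log_ors, variances)
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled MFOR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")
```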

Conclusion

Of children meeting criteria for ASD, the true male-to-female ratio is not 4:1, as is often assumed; rather, it is closer to 3:1. There appears to be a diagnostic gender bias, meaning that girls who meet criteria for ASD are at disproportionate risk of not receiving a clinical diagnosis.

Hate to be snippy – but with all the manipulation going on, is the “new” male-to-female ASD ratio any more informative than a “good guess”? At least this group acknowledges the bias against clinical diagnosis of ASD in girls.

_______________________________________________

…And we are still expected to join the other inmates in the pink and purple social prison that confines female H. sapiens to low status.

 

Brain development and Neoteny / Neuroscience

J Psychiatry Neurosci. 2011 Nov;36(6):412-21. doi: 10.1503/jpn.100138.

Can Asperger syndrome be distinguished from autism? An anatomic likelihood meta-analysis of MRI studies.

Yu KK, Cheung C, Chua SE, McAlonan GM

Whereas grey matter differences in people with Asperger syndrome compared with controls are sparser than those reported in studies of people with autism, the distribution and direction of differences in each category are distinctive.

Abstract

In development, timing is of the utmost importance, and the timing of developmental processes often changes as organisms evolve. In human evolution, developmental retardation, or neoteny, has been proposed as a possible mechanism that contributed to the rise of many human-specific features, including an increase in brain size and the emergence of human-specific cognitive traits. We analyzed mRNA expression in the prefrontal cortex of humans, chimpanzees, and rhesus macaques to determine whether human-specific neotenic changes are present at the gene expression level. We show that the brain transcriptome (the transcriptome includes all mRNA transcripts in the cell; it reflects the genes that are being actively expressed at any given time, with the exception of mRNA degradation phenomena such as transcriptional attenuation) is dramatically remodeled during postnatal development and that developmental changes in the human brain are indeed delayed relative to other primates. This delay is not uniform across the human transcriptome but affects a specific subset of genes that play a potential role in neural development.
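To make the idea of a “delayed” transcriptome concrete, here is a toy sketch (not the authors’ actual pipeline) of how one could estimate, for a single gene, the age shift that best aligns a human expression-versus-age trajectory with a chimpanzee trajectory; a positive shift means the human curve lags, i.e., a neotenic expression pattern. The curves and the roughly four-year delay are invented for illustration.

```python
import numpy as np

def best_delay(ages, human_expr, chimp_expr, candidate_shifts):
    """Return the shift s (years) minimizing squared error between the
    chimpanzee curve and the human curve advanced by s years; positive s
    means the human trajectory lags (is delayed) by roughly s years."""
    errors = []
    for s in candidate_shifts:
        advanced_human = np.interp(ages + s, ages, human_expr)  # human curve shifted earlier
        errors.append(np.mean((advanced_human - chimp_expr) ** 2))
    return candidate_shifts[int(np.argmin(errors))]

ages = np.linspace(0, 20, 41)                 # postnatal age in years
chimp = 1 / (1 + np.exp(-(ages - 5)))         # hypothetical maturation curve
human = 1 / (1 + np.exp(-(ages - 9)))         # same curve, delayed by about 4 years

shift = best_delay(ages, human, chimp, candidate_shifts=np.linspace(-8, 8, 81))
print(f"estimated human delay for this hypothetical gene: {shift:.1f} years")
```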

Conclusion

By comparing the gene expression profiles in human, chimpanzee, and rhesus macaque prefrontal cortices throughout postnatal development, we have found that there is no uniform shift in the developmental timing between humans and other primates. We find instead a significant excess of genes showing neotenic expression in humans. This result is in line with the neoteny hypothesis of human evolution (6) and provides insight into the possible functional role of neoteny in human brain development. Specifically, we show that in at least one of the two cortical regions studied, the neotenic shift is most pronounced at the time when humans approach sexual maturity (body matures; brain does not), a process known to be delayed in humans relative to chimpanzees or other primates (6, 24). Furthermore, the neotenic shift particularly affects a group of genes preferentially expressed in gray matter. Intriguingly, the timing of the shift also corresponds to a period of substantial cortical reorganization characterized by a decrease in gray-matter volume, which is thought to be related to synaptic elimination (21, 25, 26). The developmental pace of changes in gray-matter volume has been associated with the development of cognitive skills among humans (e.g., linguistic skills) (27) as well as with the development of disorders (e.g., attention-deficit/hyperactivity disorder) (28).

Although the precise causes and consequences of the human neotenic shift remain unknown, together these observations suggest that ontogenetic timing differences between the human and the chimpanzee prefrontal cortex transcriptomes may reflect differences in sexual and cognitive maturation between the 2 species. According to this logic, delayed gray-matter maturation in the human prefrontal cortex may extend the period of neuronal plasticity associated with active learning, thus providing humans with additional time to acquire knowledge and skills.

_________________________________________________________________________

 

Things to think about:

Maturation of gray matter in the human prefrontal cortex is delayed by neotenic shifts. Developmental delay is neotenic. There is a “fuzzy boundary” (?) between too much and too little grey matter volume for the brain to function well; there also seem to be two types of brain organization: social navigation vs. factual and problem-solving. (See today’s Temple Grandin post.) These are probably not ‘separate’ paths, but developmental stages. (Social is juvenile; factual is adult.)

The gray matter volume in any specific human brain may vary between mature and neotenic states. Humans vary in degrees of neoteny. It might be more accurate to dump “Autism Spectrum” for an inclusive classification, “The Neoteny Spectrum,” which includes all contemporary Homo sapiens.

How do we know which human brains (volume of grey matter) are mature and which are neotenic? What volume of grey matter is the reference for maturity vs. neoteny? We can begin with behavior.

Some changes in grey matter volume occur during puberty; therefore it would be useful to compare the pre-puberty and post-puberty states of grey matter in individuals, especially those diagnosed with a “brain disorder” in early childhood. Mixing data from before and after stages of brain reorganization may be completely misleading due to variations in timing.

Grey matter differences in individuals reflect maturity vs. neoteny (a spectrum of rates of development) and not fundamental developmental disability; if one is ONLY interested in “The Social Brain,” and designates this juvenile stage of development as the ONLY legitimate human brain, then the “problem” of Asperger’s is the product of inattentional blindness and ignorance. Bad science!

CASE IN POINT: It is said that Asperger’s have “less than normal” volumes of gray matter, but if this conclusion has been derived from pre-puberty testing, then it is entirely possible that Asperger children, in terms of specific brain development (intellect, language, concrete-visual thinking, facts and problem-solving), simply MATURE FASTER – and gray matter volume is reduced to a more effective and efficient level well before that of their peers.

Social brains are neotenic; the neurotypical neotenic brain never fully matures, which means that the social orientation (obsession) of modern social humans is the product of extreme neoteny. Neoteny is evident in the absence of logic, rationality, analytical thinking and effective problem-solving in everyday life and, most seriously, in our political leaders.

“Juvenile” neotenic behavior is evident in the inability to recognize that facts and physical reality exist. Instead, emotions are paramount; self-absorption is rampant, magical thinking prevails, action is missing, narcissistic orientation is “normal”, and “worship of” childlike celebrities is a substitute for adult models and personal development. Adult children never leave home but remain dependent on parental support. Violence is characteristic of juvenile males; violent behavior usually decreases as males age, but today “frivolous” violence is the perpetual activity of neotenic males and is encouraged by popular culture.

 

Cerebral Asymmetry / Asperger Brain Differences

Philosophical Transactions of the Royal Society / Biological sciences

The evolution and genetics of cerebral asymmetry

http://rstb.royalsocietypublishing.org/content/364/1519/867/

Michael C Corballis

Circadian Rhythm Disturbances / Bipolar – Asperger Syndrome

Bipolar Disorder is thought to be comorbid with Asperger Syndrome. I want to introduce material from Circadian Rhythm studies.

Circadian Rhythm

Social cues help set sleep patterns


© 2002 Psychiatric Times. All rights reserved. “Circadian Rhythms Factor in Rapid-Cycling Bipolar Disorder,” by Ellen Leibenluft, M.D. Psychiatric Times, May 1996, Vol. XIII, Issue 5.

The nervous systems of people with bipolar disorder frequently make specific types of regulatory errors. Many of those errors involve the body’s internal clock, which controls the phenomena known as circadian rhythms. These are the regular rhythmic changes in waking and sleeping, activity levels, and sensations of hunger and thirst.

Phototherapy and melatonin, two interventions that manipulate the circadian system, are being used widely for seasonal affective disorder, jet lag and some forms of insomnia. Scientists continue to make tremendous strides in understanding the regulation of the circadian system in humans and animals. We now know that the body’s clock is located in the suprachiasmatic nucleus (SCN) of the hypothalamus, and that the SCN regulates the pineal gland’s secretion of the hormone melatonin (Klein and colleagues). In humans as well as animals, light suppresses melatonin secretion; recent evidence shows that even ordinary room light can have this effect. Because light suppresses melatonin secretion, the hormone is typically secreted at night. Furthermore, the SCN can “remember” the day-length to which it has been recently exposed, so the timing of nocturnal melatonin secretion is determined by the “lights on” and “lights off” times of the preceding days. In 1994, scientists cloned genes regulating circadian rhythms in mice, making the circadian system the first complex behavioral system whose genetic underpinnings could begin to be unraveled in mammals (Vitaterna and coworkers).
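As a purely illustrative toy (not a physiological model), the “memory” idea above, that the timing of nocturnal melatonin onset tracks the lights-off times of the preceding days, can be caricatured as an exponentially weighted average of recent lights-off times; all constants and clock times here are invented.

```python
def predicted_melatonin_onset(lights_off_history, memory=0.7):
    """Toy estimate of tonight's melatonin onset (hours, 24 h clock) as an
    exponentially weighted average of the preceding days' lights-off times.
    Larger `memory` means earlier days are remembered more strongly."""
    estimate = lights_off_history[0]
    for lights_off in lights_off_history[1:]:
        estimate = memory * estimate + (1 - memory) * lights_off
    return estimate

# A week of progressively later lights-off times (22:00 drifting toward 24:00)
history = [22.0, 22.0, 22.5, 23.0, 23.5, 24.0, 24.0]
print(f"toy predicted onset: about {predicted_melatonin_onset(history):.1f} h")
# Captures the point in the text that onset timing lags behind, and only
# gradually adjusts to, recent changes in the light-dark schedule.
```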

In mood disorder research, interest in circadian rhythms is not new. For at least 50 years, investigators have questioned whether abnormalities in circadian rhythm regulation might be involved in the pathogenesis of mood disorders, including rapid-cycling bipolar disorder. These questions were motivated by three clinical observations. The first of these was that the sleep duration of patients often changes dramatically as they cycle between mania and depression; bipolar depression is typically associated with hypersomnia, while mania is characterized by extreme and sometimes total insomnia.

The second observation was that approximately 60 percent of depressed patients experience remission after a night of total or partial sleep deprivation (SD). In bipolar patients, sleep deprivation may actually cause a switch into hypomania or mania. However, this “upward” switch usually lasts only until the patient undergoes recovery sleep, leading to the formulation that SD, or extended wakefulness, is antidepressant (or manicogenic), while sleep is depressogenic (Wehr). The antidepressant effects of SD are conceptually important because they show that changes in sleep duration are more than just symptoms of the illness; they also play a pathogenic role.

Using a longitudinal analysis of mood and sleep in a sample of patients with rapid-cycling bipolar disorder, we recently demonstrated that decreased sleep duration precedes, rather than simply follows, a switch into hypomania or mania (Leibenluft and coworkers, in press). Furthermore, in this sample, decreased sleep duration was more consistently associated with a shift to an earlier wake-up time than it was with a shift to a later bedtime. Pharmacologically, it is easier to manipulate the time a patient goes to sleep than to change the time he or she wakes up; perhaps for this reason, clinicians have generally attended less to wake-up time than to sleep-onset time. However, these data indicate that interventions designed to shift patients’ wake-up time may deserve further study.

The third observation implicating abnormal circadian rhythms in the pathogenesis of mood disorders concerns diurnal variation. In its classic, typical form, diurnal variation is defined as a gradual improvement in the patient’s depressed mood as the day wears on. Like sleep deprivation, typical diurnal variation demonstrates that extended wakefulness is associated with an antidepressant response. We recently extended the concept of diurnal variation to bipolar patients with data demonstrating that rapid-cycling patients are more likely to switch “up” (i.e., from depression or euthymia into hypomania) during the day, and to switch “down” (from hypomania or euthymia into depression) overnight, while they sleep (Susana Feldman-Naim, M.D., and coworkers, unpublished data). Thus, once again, extended wakefulness is associated with an antidepressant response, while sleep appears to be depressogenic.

Specific theories have been advanced as to how circadian rhythm dysfunction might lead to rapid-cycling bipolar disorder. In 1968, Halberg suggested that some, but not all, circadian rhythms in such patients were not synchronized with the 24-hour day-night cycle (Halberg). According to Halberg’s hypothesis, the interaction between the unsynchronized, “free-running” rhythms and the normally synchronized (entrained) rhythms causes switches back and forth between mania and depression.

Kripke and colleagues then presented data demonstrating what appeared to be a free-running temperature rhythm in five of seven rapid-cycling patients. In these patients, the period (time taken to complete one cycle) was abnormally short, in essence showing that patients with rapid mood cycles had rapid physiological cycles. However, subsequent investigators have not generally found either free-running or unusually fast circadian rhythms in patients with rapid-cycling bipolar disorder.

In the 1970s and ’80s Wehr and collaborators, working at the National Institute of Mental Health, continued to study biological rhythms in this patient population. Using both cross-sectional and longitudinal designs, they showed that the phase (timing) of patients’ sleep, temperature and motor activity rhythms varied systematically as they cycled between hypomania or mania and depression. Specifically, the timing of these rhythms appeared to be earlier in manic than in depressed patients, and earlier in depressed patients than in controls (Wehr and colleagues 1980). We now have preliminary data indicating a similar pattern in the time of onset of nocturnal melatonin secretion. These new data show that, in rapid-cycling bipolar patients, the time of nocturnal melatonin onset may be approximately 90 minutes earlier when they are hypomanic, compared to when they are depressed (Leibenluft and colleagues 1993). It is as if rapid-cycling patients might have an endogenous form of jet lag, internally traveling back and forth over one or two time zones as they cycle between hypomania and depression. Indeed, several studies show that bipolar patients are at risk to develop an affective episode when they travel across time zones (Young).

What might cause these shifts in the phase (time) of onset of nocturnal melatonin secretion? It is possible that phase shifts in nocturnal melatonin secretion precede the patients’ mood switches and play a pathogenic role in mood cycling. However, it is also possible that the phase shifts are epiphenomena caused by the patient’s symptoms. Specifically, the phase shifts may be secondary to the changes in the sleep-wake cycle that occur with mood cycling. The phase of circadian rhythms is determined by zeitgebers (“time-givers”).

While light is the most potent zeitgeber, physical activity, eating and social routines can probably also affect the timing of circadian rhythms. The timing of all these zeitgebers is often different when a patient is hypomanic, compared to when he or she is depressed, and shifts in the timing of zeitgebers would cause phase shifts in circadian rhythms.

However, a third possibility also exists. We suggest that phase shifts in melatonin secretion and other circadian rhythms are not the primary cause of mood cycling, but they are also not irrelevant epiphenomena. We hypothesize that phase shifts in melatonin secretion are secondary to the patient’s symptoms or to more fundamental causes of bipolar illness, but they nonetheless have pathogenic significance and contribute to the development of a full-blown affective episode.

This formulation is analogous to that of Wehr and coworkers (1987) in describing the contribution of sleep deprivation to the development of manic episodes. These authors suggested that insomnia, which is itself a symptom of mania, contributes to the development of a manic episode because it causes sleep deprivation. In other words, insomnia is both a symptom and a cause of mania. If one treats the insomnia early and aggressively, one can truncate an episode, or prevent mild or moderate symptoms from snowballing into a severe and destructive episode. Similarly, it is possible that the shifts in circadian rhythms, while not the initial cause of a mood switch, contribute to the severity and duration of an episode, and thus play a role in determining the course of illness.

We are currently testing this hypothesis by determining whether interventions designed to prevent phase shifts in nocturnal melatonin secretion have therapeutic effects in rapid-cycling bipolar patients. One such experimental treatment involves the use of phototherapy. Data indicate that midday bright light may increase the amplitude of nocturnal melatonin secretion. Since increasing the amplitude of a rhythm makes it more resistant to phase shifts, midday light might be expected to stabilize the time of nocturnal melatonin secretion. In other words, midday phototherapy administered to these patients might prevent the shifts in timing of nocturnal melatonin secretion that we believe have pathogenic significance. After encouraging results with a small number of patients, we are now conducting a formal, controlled trial of this intervention. Interestingly, morning bright light, which shifts patients’ circadian rhythms, may have caused several of our rapid-cycling bipolar patients to cycle more dramatically.
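The claim that a higher-amplitude rhythm is more resistant to phase shifts can be illustrated with simple geometry (this is an abstraction, not melatonin physiology): picture the oscillation as a point circling at a radius equal to the amplitude; a nudge of fixed size displaces the phase by an angle that shrinks as the radius grows.

```python
import math

def phase_shift_deg(amplitude, perturbation):
    """Phase change (degrees) when a state sitting at (amplitude, 0) on its
    cycle is pushed sideways by a perturbation of fixed size."""
    return math.degrees(math.atan2(perturbation, amplitude))

for amplitude in (1.0, 2.0, 4.0):
    print(f"amplitude {amplitude}: a nudge of 0.5 shifts phase by "
          f"{phase_shift_deg(amplitude, 0.5):.1f} degrees")
```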

Thus, even if circadian abnormalities are neither the sole nor the primary cause of bipolar illness, it is possible that circadian interventions can have therapeutic utility. Compared to psychotropic medications, circadian interventions are relatively flexible therapeutic modalities; they have a rapid onset and offset of action, and their clinical effects may be altered by changing the time that they are administered. This flexibility may be particularly useful in rapid-cycling bipolar patients, whose frequent mood cycles may require rapid alterations in their therapeutic regimen. Further research will indicate what, if any, role circadian dysfunction plays in the pathogenesis of rapid-cycling bipolar disorder, and whether circadian interventions can be helpful to these often treatment-resistant patients.

Ellen Leibenluft, M.D., is chief, Unit on Rapid-Cycling Bipolar Disorder, Clinical Psychobiology Branch, National Institute of Mental Health.

The Lunch Theory of Human Social Cognition

You are a “student” of human evolutionary development; you are curious about how “humans” came to be who we are today – Masters of the Universe. You come to this quest with certain assumptions in mind, which you probably are not “consciously” using, but which form a “filter system” that will not only prejudice the “evidence” you “discover” but also pre-classify what “evidence” is. This is the way it is for Homo sapiens.

It is obvious to you that “you” (your thoughts, habits, lifestyle, beliefs) are the culmination of “human” excellence; “you” are where evolution was headed all along. “You” includes all the wonders of Western Civilization, which “you” take to be evidence of “your” intelligence, inventiveness, superior intellect, diligence, eagle-eyed observation and analysis, literacy, artistic accomplishment, etc. because “you” studied these things in school and look a lot like all the “big-brained males” (this includes female students today) who have “built” Western Civilization (U.S. version) and proven that, well frankly, It’s the best damn civilization EVER!

1. Homo sapiens (males) have big brains; size is everything. Therefore, any and all “evidence” for “evolutionary progress” depends on signs and omens of “cognitive abilities” in those “fossil species” that LED to US. “Dumb” species, de facto, cannot be our ancestors.

fig. 1 The Supreme Species. If wealthy, individuals from subspecies of various skin color may be included.

Fig. 1 The “supreme” species and attendant females and offspring. The prime evolutionary question is, “How did this glorious species come to be?” It used to be simple enough: A Supreme Male created everything in the universe, including Man, to whom he handed over all of Creation, with “Man” as the master of all Nature. But the invention of Science threw a monkey-wrench into the plot: it seems “man” was not really the center of the Universe, after all… unless… of course, all 3.5 billion years of life on earth could be directed into producing “The Last Ape Standing.” What a coup – even better than “special creation” by a supreme male God! We’ll prove that all the forces and processes at work in the universe were necessary to produce “US”.

2. Let’s get to work establishing our superiority to all other species. We can automatically dismiss any life form previous to “humanoids” as unimportant, and fossil humanoids “that count” as ancestors can be identified by “signs” which point in our direction. We can easily define “humanoid” by categorizing “things we eat”. (Cannibalism is a no-no. You can’t eat the Supreme Species as if it were just another food source.)

Fig 2. Lunch. This organized, processed and prepared “nutrition” demonstrates “cognitive abilities” found in no other life form. The sheer amount of brain-power needed to exploit natural resources (fossil fuels) to design, manufacture and transport “plastic” containers, with uniform subdivisions, that create the illusion that $1.50 worth of “food items” is worth $39.95 is genius! And what about decorating two sacrificial crab claws elevated on a slice of bread with wildflowers sourced from a meadow in Mongolia? Just $49.95. Brilliant.

Let’s see what “the rest of” American humanoids are having for lunch.

Another sign of superior intelligence: The American food industry is paid billions $$ to transform “surplus storage” food into meals for children; and it’s free. How much more compassionate can a supreme species become? Why waste “good brain food” (and good education) on “lesser beings” who will never be intelligent anyway? In fact, we can guarantee impaired cognitive function and stunted development using this brilliant strategy. Note the wonderful array of fossil fuel containers provided, which can be used once and disposed of immediately into landfills. More profit! More socially savvy behavior that creates environmental destruction and millions of defective, low status humanoids. No Neanderthal could accomplish that. Thank God and evolution that we exterminated them just in time!

3. Evidence that Neanderthal didn’t stand a chance of being intelligent enough to compete with us. When the supreme species “goes wild” they do it with superior social cognition. Who needs survival skills when amazing fossil fuel-based non-recyclable immortal plastic products can be purchased at a “wilderness” adventure store? Stupid Neanderthals!

fig. 3a AMH “outsmarting” Neanderthals using superior social skills: “You bring the hot dogs and Gatorade, we’ll bring the tent” social strategy.

Fig. 3a, 3b Lunch ca. 40,000 y.a. Anatomically Modern Humans (just like Modern Social Humans in every way) take over Eurasia from the Neanderthals with superior social networking.

fig. 3b No evidence has been found for AMH having developed smart phones, but the “conceptual” ability to communicate effectively was just like that of modern humans.

Neanderthals: too dumb to make Margaritas for brunch guests. Lack of social cognition led to extinction. (BTW – Thanks for the casual sex!)

4. Social behavior explains why AMH became Masters of the Universe: Neanderthals had open “caves” while highly social AMH had open “concept” floor plans with kitchen islands and granite countertops that facilitated the acquisition of social status. How could Neanderthals have competed with such advanced innovation, which clearly depends on social networking, direct eye contact, empathy, “mind-reading” and high-end finishes?

fig. 4a Neanderthal “open” cave. Primitive housing = primitive brain.

fig. 4b “Primitive” Neanderthal cave under renovation with advanced AMH upgrades: open concept floor plan for “flow” when entertaining, and high-end finishes: granite countertops and slate flooring.

WOW! “Scientific” proof that not even GOD could create a species that’s as intelligent as modern social humans.

File this video under, “What Asperger’s mean when we say that neurotypicals are stooopid.”

 

What “The World” Sounds like to (Many) Asperger People

The woman who made this audio track is correct! I could not bear to listen longer than a few seconds. If you can listen to this COMFORTABLY, you will likely not be able to understand what an Asperger person goes through daily, when trapped in social typical environments.

One particular point: It’s nearly impossible to pay attention to and to understand what a person is saying when “background noise” is not in the background! It’s competing with the person speaking; the impulse is to get away from the discordant “sounds” – the effect is like being tortured. Truly!

 

Brow Ridge unrelated to cognitive development

Comparing Frontal Cranial Profiles in Archaic and Modern Homo by Morphometric Analysis

FRED BOOKSTEIN et al

There is an 8-page pdf for this paper, but once again I can’t get the URL to connect.

INTRO: Archaic and modern human frontal bones are known to be quite distinct externally, by both conventional visual and metric evaluation. Internally this area of the skull has been considerably less well-studied. Here we present results from a comparison of interior, as well as exterior, frontal bone profiles from CT scans of five mid-Pleistocene and Neanderthal crania and 16 modern humans. Analysis was by a new morphometric method, Procrustes analysis of semi-landmarks, which permits the statistical comparison of curves between landmarks. As expected, we found substantial external differences between archaic and modern samples, differences that are mainly confined to the region around the brow ridge. However, in the inner median-sagittal profile, the shape remained remarkably stable over all 21 specimens. This implies that no significant alteration in this region has taken place over a period of a half-million years or more of evolution, even as considerable external change occurred within the hominid clade spanning several species. This confirms that the forms of the inner and outer aspects of the human frontal bone are determined by entirely independent factors, and further indicates unexpected stability in anterior brain morphology over the period during which modern human cognitive capacities emerged.

(New Anat) 257:217–224, 1999. © 1999 Wiley-Liss, Inc.

External changes (such as the brow ridge) ARE NOT LINKED morphologically to the frontal brain, which has been stable for 500,000 years of cognitive development.
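For readers unfamiliar with the method named in the abstract, here is a minimal sketch of ordinary Procrustes superimposition using SciPy (the paper uses the more elaborate semilandmark variant, which additionally slides points along the curves). The two “profiles” are hypothetical point sets standing in for sampled frontal-bone outlines.

```python
import numpy as np
from scipy.spatial import procrustes

# Hypothetical midsagittal profile of specimen A: 20 points along a curve
t = np.linspace(0, np.pi, 20)
profile_a = np.column_stack([np.cos(t), 0.6 * np.sin(t)])

# Specimen B: the same shape, but rotated, rescaled and translated
theta = np.radians(15)
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
profile_b = 1.8 * profile_a @ rotation.T + np.array([3.0, -1.0])

# Procrustes superimposition removes position, scale and rotation, leaving
# only shape difference, summarized by the disparity (sum of squared distances).
aligned_a, aligned_b, disparity = procrustes(profile_a, profile_b)
print(f"shape disparity: {disparity:.6f}   (near zero: the shapes match)")
```

A specimen whose inner profile genuinely differed in shape would yield a clearly larger disparity; the paper’s finding is that, for the inner median-sagittal profile, all 21 specimens cluster tightly in exactly this sense.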


 

Neurotypical Perception Defects / Inferred Images, Social Filters

Humans rely more on ‘inferred’ visual objects than ‘real’ ones

May 16, 2017
Summary:
Humans treat ‘inferred’ visual objects generated by the brain as more reliable than external images from the real world, according to new research.
“In such situations with the blind spot, the brain ‘fills in’ the missing information from its surroundings, resulting in no apparent difference in what we see,” says senior author Professor Peter König, from the University of Osnabrück’s Institute of Cognitive Science. “While this fill-in is normally accurate enough, it is mostly unreliable because no actual information from the real world ever reaches the brain. We wanted to find out if we typically handle this filled-in information differently to real, direct sensory information, or whether we treat it as equal.”

Visual thinkers are all too aware of this reality deficit in the “typical” perception of reality; I can’t say that the mechanism described here is the “cause” of discrepancies between “typical” perception and the greatly enhanced perception of visually-oriented brains, but it does point out that the typical human brain has evolved “short cuts” that result in varying accuracy in the perception of the environment. This deficit, combined with de facto “magical-social” thinking, has dire consequences for survival.

Links:

Article in Science Daily: https://sciencedaily.com/releases/2017/05/170516080752.htm

Original Paper with figures, charts: 10.7554/eLife.21761

Posts on inattentional blindness: https://aspergerhuman.wordpress.com/2015/12/04/visual-thinking-inattentional-blindness/

https://aspergerhuman.wordpress.com/2015/12/04/inattentional-blindness-why-the-u-s-gets-sucker-punched-by-terrorists/

Immune System Introgressions / Neandertal, Denisovan HLA alleles


Neanderthal, State Museum, Halle, Germany

Denisovan admixture today: low = black / high = red

 

The Shaping of Modern Human Immune Systems by Multiregional Admixture with Archaic Humans

Laurent Abi-Rached et al. (see original paper for full list of authors)

Abstract

Whole genome comparisons identified introgression from archaic to modern humans. Our analysis of highly polymorphic HLA class I, vital immune system components subject to strong balancing selection, shows how modern humans acquired the HLA-B*73 allele in west Asia through admixture with archaic humans called Denisovans, a likely sister group to the Neandertals. Virtual genotyping of Denisovan and Neandertal genomes identified archaic HLA haplotypes carrying functionally distinctive alleles that have introgressed into modern Eurasian and Oceanian populations. These alleles, of which several encode unique or strong ligands for natural killer cell receptors, now represent more than half the HLA alleles of modern Eurasians and also appear to have been later introduced into Africans. Thus, adaptive introgression of archaic alleles has significantly shaped modern human immune systems.

Example: Includes similar graphics for Neanderthal and Denisovan HLA alleles

Fig. 3 Effect of adaptive introgression of Neandertal HLA class I alleles on modern human populations. (A) All six Neandertal HLA-A, -B and -C alleles are identical to modern HLA class I alleles…