BBC History of “The Celts” / Video 6 Episodes

Totally annoying “Easy Listening version of Bronze Age Hits” soundtrack – much archaeological “novel writing” but great art objects – I want the bronze couch. Also, recent genetic research may have changed assumptions presented here.

Well! Someone else left a comment on EP 3: “I don’t know why – but the music on this series is fucking annoying – well, the vocals at least . . . the rest of the production is great – too bad I can’t turn that annoying music off . . .”

And from EP 4: “I CANNOT STAND THIS MUSIC ANY LONGER” and “The music is awful. But the rest is great.”

Cool-sounding Celtic “battle horns” + head-hunting and snug thatched houses; farming and domesticated local sheep = Iron Age. More archaeological novel-writing, and the lame “ritual religious function” as the explanation for anything “too hard” to figure out.

…but, fabulous jewelry composed of “meaningful geometry”

and the tired old “Bad Romans” narrative… gee whiz! Roman disciplinary superiority was basically “cheating” – the Celts (Irish) lost because they were drunkards! Daft Brit historians…

Let’s face it: without Ancient Rome, Europeans would still be living in dirt hovels…LOL

Go to youtube for remaining 4 episodes… / paleoanthropology wars

The Peking Man Delusion

Monday, September 23, 2013

While waves were sent thrashing through the intellectual world in 1987, as DNA evidence hailed a woman living 200,000 years ago in Africa as the most recent common matrilineal ancestor of all modern humans, China’s leading scientists fervently waved the red flag and rejected the findings in Nature as little more than preposterous. They held to the theory that an archaic form of human, the “Peking man” (Homo erectus) (北京猿人 Běijīng yuánrén), was the true ancestor of the modern Chinese, with radicals going so far as to claim that China was the cradle of humanity.

Over the next 30 years, as the model of a common African origin gained momentum with growing genetic and archaeological evidence, the theory evolved into a well-trodden narrative of human evolution, depicting an early form of modern humans emerging out of Africa around 60,000 years ago. The scientific community rejected the competing theory of independent evolution in different parts of the world. But in China the revelation barely rippled; a defiant nation continued to teach in its history textbooks that the Peking man was the real deal, and the prevailing theory of evolution failed to permeate public consciousness. (As if the Out of Africa theory were not culturally biased and has not had to be heavily modified.)

The Peking man in question – a Homo erectus, or upright man, that lived some 750,000 years ago – was discovered in 1929 at Zhoukoudian 周口店, now a UNESCO World Heritage site southwest of Beijing. At the time, these were the oldest fossils of man known, fanning flames for theories that all humans originated in China.

These believers were at least right on one point: a common origin. “Based on evolutionary theory, all modern humans have to share a common ancestor at some point in the past,” says Dr. Mark Stoneking of the Max Planck Institute for Evolutionary Anthropology. “The evidence for this is extremely strong from all fields of anthropology and genetics. The controversy is over how far back in the past this common ancestor is.”

Indeed, apart from the occasional loon, what most leading scientists, including the Chinese, can now agree on is that around two million years ago, during the Pleistocene epoch (2.6 million–11,700 years ago), primates in Africa evolved into Homo erectus. This species was able to cover greater distances due to its upright form, and, with climate conditions permitting, a sub-population successfully migrated out of Africa in an event known as Out of Africa I, spreading to Asia and other parts of the world and giving rise to populations such as the Peking man.

So what happened to these archaic forms that occupied Asia? Those who believe they survived and evolved into modern Asians back the Multiregional Model of human evolution, a theory that has largely been rejected by the international scientific community. Continuous descent from Peking man, at least, “is very unlikely from fossil evidence, and perhaps even more unlikely from the totality of the genetic evidence,” according to Professor Chris Stringer of the Natural History Museum in London. “The classic multiregionalist model from late last century proposed an evolutionary continuity lasting at least a million years in each region, which would result in a gradual transformation in each area to the modern people of today—from the Neanderthals to modern Europeans, from Peking Man to modern Chinese.” For Professor Stringer, the picture is clear: “In my view there is only one region where we can observe such a transformation, and that is in Africa.”

That is to say, a second migration out of Africa, by a more evolved species, occurred around 60,000 years ago – a scenario known as the Recent Out of Africa Hypothesis (ROAH). This new species, an early form of Homo sapiens, dispersed around the world, replacing archaic forms like Homo erectus and giving rise to the modern human population of today.

There is nearly no doubt that modern humans came out of Africa. “The evidence in favor of an African origin about 200,000 years ago for modern humans is very robust,” says Dr. Rosalind Harding, Oxford University Lecturer in Biological Anthropology. Some of this comes from studies of mitochondrial DNA (mtDNA), which is passed along only the maternal line and can therefore be used to trace genetic lineages. The work begun in the 1970s by Dr. Stoneking and colleagues has demonstrated that all humans are traceable to “Mitochondrial Eve”, the woman living 200,000 years ago in Africa. (MtDNA is not definitive; it’s more like a travelogue than anything else. Whole-genome studies must be done.)

The Homo erectus populations in Asia are believed to have gone extinct; they either died out or were driven to extinction by incoming species, although no one knows when or where this happened. This is one of those cases in which, if a statement is repeated often enough, people believe it.

On the other hand, Multiregionalists reject the idea that this lineage was cut short, believing that it persists to this day. Champions of this school of thought, notably the paleontologist Professor Wu Xinzhi of the Chinese Academy of Sciences, have recently advocated a view whereby populations of Homo erectus in Africa, Asia, and Europe continued to exchange DNA through interbreeding while undergoing strong natural selection in their own regions. In this way, archaic populations developed “regional” as well as “common” features, creating what Professor Wu refers to as a “mosaic of morphological features”.

According to Wu, the distinct shovel-shaped upper incisors and flat upper faces seen in Peking man and other archaic Chinese fossils are still identifiable in modern Chinese. For Professor Stringer, however, this line of argument is unconvincing: “I would argue that the distinctive features of Africans, Europeans, Asians, and Australasians evolved only after the development of our shared modern features, i.e. within the last 100,000 years”. (European features such as light skin and hair and blue eyes are indeed very recent!)

And while academics in China rail against the consensus, the Chinese public is less than perturbed by stirrings in the scientific realm. Ask the average Chinese person who they evolved from, and the likely answer is “Peking man”. Suggest an African ancestor in response, and a whole gamut of reactions to this bizarre notion is likely to surface. Ms. Zhang Yanfang, a 67-year-old Beijinger and retired office manager, on learning of the Africa theory, replied that she was not even aware such a hypothesis existed. “I’ve never heard of such a thing,” Zhang said, explaining that she grew up learning, like every other Chinese person, that she descended from the Peking man. “I think it’s quite unlikely that we are related to Africans; the difference is too great in every respect—including that of race.”

Ask an average American who admits to belief in evolution from whom he or she descended, and they will say Adam & Eve or Ancient Aliens. At least Peking Man actually existed.

While general ignorance might fuel popular skepticism towards ROAH, for many the disbelief in an African origin lies in the refusal to accept that the Chinese, with their unique and illustrious history, the longest continuous civilization in the world, and some of the greatest inventions ever, could be fundamentally “non-Chinese” in any way.


Peking Man, E. Daynes

Professor Frank Dikötter, Chair Professor of Humanities at the University of Hong Kong and author of The Discourse of Race in Modern China, believes that intellectual life can be strongly influenced by its cultural context, noting a similar resistance in the West to the multiple-origins theory in the 19th century: “Monogenesis until recently was a view with its own cultural background. Despite the fact that Charles Darwin and others were talking in the name of science, it was very clear that Christianity and the idea of Adam and Eve provided a strong setting.” In China, opposing views began emerging at the end of the 19th century, when thinkers, likely influenced by strong Confucianism, says Professor Dikötter, “insisted that Africans and Europeans came from one single origin, but that the Chinese were completely different.” The discovery at Zhoukoudian several decades later reinforced this view: “Chinese scientists took it to be proof that China is one long line of uninterrupted descent.”

China has a long history of ancestor worship, with the belief that the legendary Yellow Emperor (黄帝) is the ancestor of all Chinese people. This conception of a distinct racial lineage took particular form during the Qing dynasty, when tracing patrilineal lines of descent to compile long genealogies became a popular practice. “Clans would compile ancestral lines to show that an uninterrupted line of descent existed,” Professor Dikötter says. “This certainly combined with newer evolutionary ideas about race at the end of the 19th century.” Professor Dikötter believes that this strong culture is still present today: “Science never really operates outside of social, cultural, and political contexts; we might wish to believe it is completely objective and blind, but it is not.”

Stripped of cultural and historical connotations, the issue of race may be more than mere reverence for one’s own. There exists a widely held unfavorable perception of Africa and the African people. Both Professor Dikötter and Professor Sam Crane, Chair and Professor of Political Science at Williams College, Massachusetts, agree that these views play a part in the debate. Such sentiments were brought to the fore in 2009, when Lou Jing, a 20-year-old mixed-race Shanghai woman whose father is African-American, appeared on the TV talent contest Go! Oriental Angel. After speaking of what it meant to be Chinese, she almost immediately became a target of online abuse and slurs centered on the color of her skin. Many netizens were insulted that she considered herself Chinese at all, exposing the realities of identity and racial prejudice in modern-day China.

The U.S. ought not point fingers at other cultures concerning persistent racism; Western paleoanthropology is all about EuroAmerican superiority.

While notions of race may be a more recent phenomenon, ideas about national identity are deeply rooted in history. Dr. Lukas Nickel, Reader in Chinese Art History and Archaeology at the School of Oriental and African Studies, University of London, believes that ideas of national identity are important in this debate. “China is one state with a long and independent heritage, and the first emperor of China was enormously important in constructing a nation and a unified people.” The first emperor, Qin Shi Huang (259 B.C.-210 B.C.), a formidable military strategist and policy maker, conquered the individual kingdoms and unified them into one state, germinating more than 2,000 years of imperial rule.
“Before Qin unified China, each country was different and each state had a distinct dialect with only a limited sense of commonality,” says Dr. Nickel. He adopted what Dr. Nickel calls “psychological unity measures,” to ensure that China could function as a single unit, where its people felt “crucially different from everyone else.” Qin even gave his people a new name, “The Black-Headed Ones”, standardizing their language, script, laws and customs. While regional dialects can still be very different today, there were enormous differences during that period.

Even the Great Wall of China, commonly believed to be a strategic barrier built by Qin to ward off invaders, was built as a powerful symbol of separation between the Chinese people and others. Dr. Nickel is convinced that Qin helped shape the psyche of a unique population, “and what he has created has persisted and resonated through all of Chinese history.”

While the glowing legacy of the Qin Dynasty (221 B.C.-206 B.C.) may have contributed to a nationalistic psyche, of equal relevance is “The Century of Humiliation” (百年国耻 bǎinián guóchǐ) that China suffered at the hands of foreign forces in the late 19th to early 20th century. From the Opium War to the Japanese invasion, these events and the political turmoil that ensued brought the country to its knees. Patriotism was introduced through primary school lessons, alongside activities such as raising the national flag. “Patriotic education in China may well contribute to the general skepticism regarding the ‘Out of Africa’ approach to human evolution,” says Professor Sam Crane of Williams College. “Patriotic education focuses critical thinking in a nationalistic direction, which in turn could increase the preference for uniquely ‘Chinese’ answers to all sorts of questions.” Professor Crane also points to the phrase Zhongguo tese (中国特色), which refers to the cultural “specialties,” an array of the good, the bad, and the absurd, that is uniquely made-in-China.

The desire of a nation to see itself as “special” may be relevant, but Professor Crane thinks there’s more to it than cultural ego when it comes to intellectual life: “Perhaps more important in the area of evolution is a general anxiety among academics regarding the apparent intellectual dominance of ‘Western’ theories and interpretations.” In part, says Professor Crane, “it reflects a broader intellectual aspiration to transform the Chinese experience into more universal theories.”

It is ironic, then, that the biggest blow to multiregionalism in recent years has come from a contemporary of Professor Wu’s at Fudan University who apparently set out to do just the opposite. Professor Jin Li, one of China’s leading geneticists, at Fudan’s Research Centre for Contemporary Anthropology, claims his intention was to prove once and for all that the Chinese evolved independently in China. Instead, his results in 2001 turned his own theory on its head and lent substantial support to ROAH. Looking at the Y-chromosome, found only in men, Professor Jin singled out a genetic mutation that arose in Africa 31,000–79,000 years ago and collected 12,000 DNA samples from 163 different ethnic groups around East Asia. The results were shattering: every single sample contained the marker that originated in Africa.

While consensus holds sway on the basic model of human evolution, the picture is far from complete. Even the “Out of Africa” narrative is proving to be murkier than first thought. In contrast to the static model of evolution from earlier decades, “human evolution has been shaped as much by dispersal out of Africa many times, and across Eurasia many times, as by isolation within geographic regions,” says Dr. Harding. She attributes these movements to Pleistocene climate changes that permitted dispersal as well as imposed isolation, and thinks that, while some species became extinct at the hands of incoming species, there could also have been reproductively successful mating, allowing for gene flow between species.

So, perhaps there is still a glimmer of hope that Peking man DNA will be picked up in the gene pool somewhere. While none has yet been detected, archaeological findings in 2010 yielded a new species in Siberia, the Denisovans (41,000 years ago), whose DNA is a mixture of different species, a significant amount of which has passed on to modern Australasians. Furthermore, a small fraction of this seems to belong to an archaic human. Could this be an Asian Homo erectus, or even Peking man? More complete Denisovan DNA is needed before anything can be inferred, but if that were the case, Australasians, rather than the Chinese, would be its descendants.

Until definitive evidence shows that Asian Homo erectus simply died out, the door of possibility will remain ajar. While it is “extremely unlikely that ‘Peking man’ evolved into modern Chinese,” says Dr. Stoneking, “it is still possible that when modern humans entered China, around 40,000 years ago, they interbred with Homo erectus in China, and so it is possible that modern Chinese carry some Homo erectus ancestry.”

Not so fast with that curtain call: there may be a Peking man amongst us.

Drug Trial Reform Reduces Positive Results

Increasing the transparency of clinical drug trials reduces the number of positive results.


Pre-registering big drug trials resulted in fewer drugs being found to have a positive effect.

Lack of Sensory Stimulation / Winter Peril

Just when I think that there’s nothing left to “think about” –

Winter is a perilous time for my “moods” or “affect”, to be official. The weather has changed from sunny and warm to sudden cold; it’s the season of futzing with the heating system in my old house. One gas heater sans blower; convection only.  It puts out plenty of heat in one room – the kitchen. It’s up to me to devise ways to move that hot air around the house with fans and a small auxiliary heater that I carry with me to whichever room I will be spending time in. That’s the computer room most mornings. I don’t linger in bed, but wake up ready to go. One cup of coffee and I’m typing away on whatever topic “popped” into my conscious awareness when I woke up.

This morning it was a question of “dealing with” the confines of winter. Each year our winter season arrives on an unknown trajectory: it’s true that in this high-altitude desert, in southwestern Wyoming, a “season” in the ordinary sense doesn’t mean much, except that the ratio of light to dark hours keeps shrinking toward that magic date, the winter solstice, when “winter” officially begins. I don’t understand that at all: the solstice for me marks “mid-winter” – the turning point when “daylight” begins its return to bearable length. A minute or two per day, but that accumulates fast: by January, the expanding “light” is obvious. This process echoes myths worldwide in cold (non-equatorial) climates – the “death and resurrection” of the sun god – myths that persist despite heated houses, 4WD vehicles, food availability, and holidays for distraction.

It usually takes me about a week to adjust to the “death of the sun”. During that interval I “freeze” in a state that feels like a prelude to hibernation; if only I could sleep away the winter like a bear. Of course, this “attitude” must end, and I begin piling up “projects” to do: review all my photos (there must be some I can delete); restock the pantry (there is but one grocery store, far across town, which may be inaccessible in truly bad weather); review and update or delete blog posts; go back to research topics I set aside; clean house – an uncivilized mess after ignoring the consequences of a windy-dusty-sandy-muddy world that doesn’t stop at the threshold. Prepare for any weather conditions that will arise: boots, hats, gloves, a parka and lesser jackets – layering is the way to go. My neighbor actually replaced the fuel pump in my Dodge Ram truck: Wow! My hero… a huge weight off my anxious state of preparations.

ALL THIS IS DEPRESSING! But why, specifically? The “winter dilemma” has plagued me my entire life. Did my distress begin simply because, growing up in Chicagoland, winter meant enduring months and months of bitter confinement? I began plotting my escape to “The West” after a winter trip to Colorado revealed a sunny, blue-skied land with most of the snowfall where it belonged (on the mountain peaks and slopes), and relief from the prison of trees back home in the Midwest that negated any open horizon, except along the shore of Lake Michigan.

In meaningful ways, weather, landscape and climate have (dictated, guided, informed?) my choices and decisions above and beyond other considerations. Life at its best has occurred in those intervals when optimal physical parameters have coincided with opportunities for “self-expression” – whether or not self-discovery was provided for in the context of a “job-job” or by a nomadic episode that defied concerns about “being normal”. Normal only ever concerned me in financial terms; if I could come up with funding for my “eccentric” ways, social normality was quickly extinguished and I could indulge in personal freedom, which means living in the “timeless” time frame of “now”. If one lives “now”, the stupidity of most human ideas and actions becomes obvious, and it becomes impossible to reconnect with social illusions and delusions.

So – here I am! Another winter, another adjustment; but the question remains: why do I react this way? The “could be” answers are still apparent: simple physical “screw-ups” in circadian rhythm due to the clockwork of decreasing light; Asperger sensory eccentricity; bipolar “moodiness” triggered by environmental change; or the speculation that my mood is one consequence of non-adaptation by ancient ancestors who migrated to northern Europe but lacked the memory or sense to head back south.

I think that I’ve finally pinpointed “the source” of my physical discontent: sensory deprivation! It’s a curious dilemma: too much “human” stimulation (noise, chatter, social annoyance, discordant fundamental views of “being human”, etc.) leads to “pain” and retreat – a safe little house in a quiet town; control of human exposure. But winter flips the problem over: there is too little of the stimulation that I crave, though visual-spatial opportunity is still there, in winter’s cold contrast between snow, frost, and ice. Deep shadows and shimmering blue sky beckon – an entirely new landscape arises from the familiar one.

Actually, I take more photographs in “bad weather” episodes and during environmental change created by geologic processes, than in our protracted and bleak summer of overexposed and colorless dead plants; of uniform yellow grasses and bleached rock.

It’s just more difficult to “get out there” when sunlight disappears between 5 and 6 p.m., truncating the long slow interplay of land and sky that is summer’s gift to the human eye and brain. Concerns over travel into the countryside are very real, even with 4WD. The consequences of “getting stuck” can range from hours of hard labor to get unstuck, to a hefty bill for tow-truck service (if you’re within live cell phone range), to hypothermia, frostbite, or death. That is, one mistake in judgement and / or luck can make the difference between having and not having a really bad day. This is of increasing concern as one ages: the reality of predicaments that require an adequate mental and physical response has been drilled into memory by those very challenges, which occurred frequently back when I was young and stupidly optimistic. Diminished capabilities are real and must be accounted for in daily calculations concerning physical activity.

There is also more to this: the “spatial” element. Perpetual restlessness has two solutions; either “move around a lot” or find an intellectual focus that essentially makes it possible to “ignore” the lack of sensory stimulation for hours or sometimes days. However awkward, this “tension” between mental focus and physical movement is necessary to thinking – perhaps for a visual person this is merely fact: reality as “received and processed and ultimately realized” requires physical (sensory) sensitivity and fulfillment: how could a “concrete” visual thinker function otherwise?


This “type” of human experience WAS TYPICAL in pre-modern wild humans who did not rely on word-thinking for their survival. Sound communication was not “language” but a component of sensory reaction to activity: what “sound” does a particular bird or animal make that reveals its behavior? What sound can “we” assign to information that is valuable when conveying it is impossible due to separation by distance or obstacles to sight? Which sounds are attuned to the body’s natural functioning, eliciting a powerful positive or negative response in other humans? None of these communications requires words; in fact, in the context of hunting, scavenging, and gathering, “human noise” is a detriment to success. Understanding animal behavior – especially “sound behavior” – while squelching the verbal compulsion is quite a task for many hunters today.

How many hunting trips are ruined because Uncle Ralph couldn’t keep his mouth shut, talked incessantly and crashed through the underbrush like a deranged modern social human without “a lick of common sense”?

How many “hyperactive” kids are merely moving because it’s how visual thinkers acquire data; experience spatial relationships that “put that data” into real contexts; and therefore motion teaches that child, not only how to “operate the body” but also what he or she needs to know to form a “sensory understanding” of its environment?

Today’s visual stimulation: the unwrapping, cleaning and placement of antique-vintage ornaments on two tiny Xmas trees; there will be more trees, since I’ve been collecting ornaments for many years.






Dmanisi Round Up / Small-Brained Tough Hominins


It’s the species problem again! Declare a bunch of fossils to compose a “species”, and then try to “cram” new fossils into the old “artificial” scheme: there are no species in nature (but reproductive boundaries do evolve). No matter how “clever” our scientific schemes, “stair-step” evolution is misguided: a sad copy of Biblical “begat, begat, begat” genealogies. And obviously, primitive life forms persist (and are incredibly successful) despite our self-aggrandizing appointment to “the pinnacle of creation”!
Narcissism plagues these awkward products of “circular thinking”: how many of the millions of animals and plants that spread and adapt to every environment on earth manage to do this without our “precious” BIG BRAIN? DUH?

The Wanderer

Fossils of the first human ancestors to trek out of Africa reveal primitive features and a brutal way of life.

by Ann Gibbons
Science  25 Nov 2016: Vol. 354, Issue 6315, pp. 958-961
DOI: 10.1126/science.354.6315.958
See original for photos and illustrations

On a promontory high above the sweeping grasslands of the Georgian steppe, a medieval church marks the spot where humans have come and gone along Silk Road trade routes for thousands of years. But 1.77 million years ago, this place was a crossroads for a different set of migrants. Among them were saber-toothed cats, Etruscan wolves, hyenas the size of lions—and early members of the human family.

Here, primitive hominins poked their tiny heads into animal dens to scavenge abandoned kills, fileting meat from the bones of mammoths and wolves with crude stone tools and eating it raw. They stalked deer as the animals drank from an ancient lake and gathered hackberries and nuts from chestnut and walnut trees lining nearby rivers. Sometimes the hominins themselves became the prey, as gnaw marks from big cats or hyenas on their fossilized limb bones now testify.

“Someone rang the dinner bell in gully one,” says geologist Reid Ferring of the University of North Texas in Denton, part of an international team analyzing the site. “Humans and carnivores were eating each other.”

This is the famous site of Dmanisi, Georgia, which offers an unparalleled glimpse into a harsh early chapter in human evolution, when primitive members of our genus Homo struggled to survive in a new land far north of their ancestors’ African home, braving winters without clothes or fire and competing with fierce carnivores for meat. The 4-hectare site has yielded closely packed, beautifully preserved fossils that are the oldest hominins known outside of Africa, including five skulls, about 50 skeletal bones, and an as-yet-unpublished pelvis unearthed 2 years ago. “There’s no other place like it,” says archaeologist Nick Toth of Indiana University in Bloomington. “It’s just this mother lode for one moment in time.”

Until the discovery of the first jawbone at Dmanisi 25 years ago, researchers thought that the first hominins to leave Africa were classic H. erectus (also known as H. ergaster in Africa). These tall, relatively large-brained ancestors of modern humans arose about 1.9 million years ago and soon afterward invented a sophisticated new tool, the hand ax. They were thought to be the first people to migrate out of Africa, making it all the way to Java, at the far end of Asia, as early as 1.6 million years ago. But as the bones and tools from Dmanisi accumulate, a different picture of the earliest migrants is emerging.

By now, the fossils have made it clear that these pioneers were startlingly primitive, with small bodies about 1.5 meters tall, simple tools, and brains one-third to one-half the size of modern humans’. Some paleontologists believe they provide a better glimpse of the early, primitive forms of H. erectus than fragmentary African fossils. “I think for the first time, by virtue of the Dmanisi hominins, we have a solid hypothesis for the origin of H. erectus,” says Rick Potts, a paleoanthropologist at the Smithsonian Institution’s National Museum of Natural History in Washington, D.C.

This fall, paleontologists converged in Georgia for “Dmanisi and beyond,” a conference held in Tbilisi and at the site itself from 20–24 September. There researchers celebrated 25 years of discoveries, inspected a half-dozen pits riddled with unexcavated fossils, and debated a geographic puzzle: How did these primitive hominins—or their ancestors—manage to trek at least 6000 kilometers from sub-Saharan Africa to the Caucasus Mountains (see map, below)? “What was it that allowed them to move out of Africa without fire, without very large brains? How did they survive?” asks paleoanthropologist Donald Johanson of Arizona State University in Tempe.

They did not have it easy. To look at the teeth and jaws of the hominins at Dmanisi is to see a mouthful of pain, says Ann Margvelashvili, a postdoc in the lab of paleoanthropologist Marcia Ponce de León at the University of Zurich in Switzerland and the Georgian National Museum in Tbilisi. Margvelashvili found that compared with modern hunter-gatherers from Greenland and Australia, a teenager at Dmanisi had dental problems at a much younger age—a sign of generally poor health. The teen had cavities, dental crowding, and hypoplasia, a line indicating that enamel growth was halted at some point in childhood, probably because of malnutrition or disease. Another individual suffered from a serious dental infection that damaged the jawbone and could have been the cause of death. Chipping and wear in several others suggested that they used their teeth as tools and to crack bones for marrow. And all the hominins’ teeth were coated with plaque, the product of bacteria thriving in their mouths because of inflammation of the gums or the pH of their food or water. The dental mayhem put every one of them on “a road to toothlessness,” Ponce de León says.

They did, however, have tools to supplement their frail bodies. Crude ones—but lots of them. Researchers have found more than 15,000 stone flakes and cores, as well as more than 900 artifacts, in layers of sediments dating from 1.76 million to 1.85 million years ago. Even though H. erectus in East Africa had invented hand axes, part of the so-called Acheulean toolkit, by 1.76 million years ago, none have been found here at Dmanisi. Instead, the tools belong to the “Oldowan” or “Mode 1” toolkit—the first tools made by hominins, which include simple flakes for scraping and cutting and spherical choppers for pounding. The Oldowan tools at Dmanisi are crafted out of 50 different raw materials, which suggests the toolmakers weren’t particularly selective. “They were not choosing their raw material—they were using everything,” says archaeologist David Zhvania of the Georgian National Museum.

That simple toolkit somehow enabled them to go global. “They were able to adjust their behavior to a wide variety of ecological situations,” Potts says. Perhaps the key was the ability to butcher meat with these simple tools—if hominins could eat meat, they could survive in new habitats where they didn’t know which plants were toxic. “Meat eating was a big, significant change,” says paleoanthropologist Robert Foley of the University of Cambridge in the United Kingdom.

Even with their puny stone flakes, “these guys were badass,” competing for meat directly with large carnivores, archaeologist Nicholas Toth says. At the meeting, he pointed to piles of cobblestones near the entrance of an ancient gully, which suggest the hominins tried to fend off (or hunt) predators by stoning them.

They set their own course as they left Africa. Researchers had long thought that H. erectus swept out of their native continent in the wake of African mammals they hunted and scavenged. But all of the roughly 17,000 animal bones analyzed so far at Dmanisi belong to Eurasian species, not African ones, according to biological anthropologist Martha Tappen of the University of Minnesota in Minneapolis. The only mammals not of Eurasian origin are the hominins—“striking” evidence the hominins were “behaving differently from other animals,” Foley says.

Perhaps venturing into new territory allowed the hominins to hunt prey that would not have known to fear and flee humans, suggests paleoanthropologist Robin Dennell of the University of Exeter in the United Kingdom. Tappen calls that an “intriguing new idea” but thinks it should be tested. Checking the types of animal bones at other early Homo fossil sites out of Africa could show whether the mix of prey species changed when hominins colonized a new site, supporting a “naïve prey” effect.

Whatever impelled them, the migrants left behind a trail of tools that have enabled researchers to trace their steps out of Africa. There, the oldest stone tools, likely fashioned by the first members of early Homo, such as small-brained H. habilis, date reliably to 2.6 million years ago in Ethiopia (and, possibly, 3.3 million years in Kenya). New dates for stone tools and bones with cutmarks at Ain Boucherit, in the high plateau of northeastern Algeria, suggest that hominins had crossed the Sahara by 2.2 million years ago when it was wetter and green, according to archaeologist Mohamed Sahnouni of the National Centre for Research on Human Evolution in Burgos, Spain. His unpublished results, presented at the Dmanisi meeting, are the earliest evidence of a human presence in northern Africa.

The next oldest tools are those from Dmanisi, at 1.85 million years old. The trail of stone tools then hopscotches to Asia, where Mode 1 toolkits show up by nearly 1.7 million years ago in China and 1.6 million in Java, with H. erectus fossils. “We pick up little fractions of a current” of ancient hominin movements, Foley says.

The identity of the people who dropped these stone breadcrumbs is a mystery that has only deepened with study of the Dmanisi fossils. The excavation team has classified all the hominins at the Georgia site as H. erectus, but they are so primitive and variable that researchers debate whether they belong in H. erectus, H. habilis, a separate species, H. georgicus—or a mix of all three, who may have inhabited the site at slightly different dates.

A new reanalysis of the Dmanisi skulls presented at the meeting added fuel to this debate by underscoring just how primitive most of the skulls were. Using a statistics-based technique to compare their shape and size with the skulls of many other hominins, Harvard University paleoanthropologist Philip Rightmire found that only one of the Dmanisi skulls—at 730 cubic centimeters—fits “comfortably within the confines of H. erectus.” The others—particularly the smallest at 546 cc—cluster more closely with H. habilis in size.

Nor did the Dmanisi hominins walk just like modern humans. A new analysis of cross sections of three toe bones found that the cortical bone—the dense outer layer—wasn’t buttressed in the same way as it is in the toes of modern humans. When these hominins “toed off,” the forces on their toes must have been distributed differently. They may have walked a bit more like chimps, perhaps pushing off the outside edge of their foot more, says Tea Jashashvili of the University of Southern California in Los Angeles and the Georgian National Museum.

“If there are so many primitive traits, why are they calling it H. erectus?” asks Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York City. “People are avoiding the question of what H. erectus is. Every time new stuff comes up, they’re enlarging the taxon to fit new stuff in.” Foley ventures: “I haven’t the slightest idea of what H. erectus means.”

Indeed, H. erectus now includes the 1-million-year-old type specimen from Trinil on the island of Java as well as fossils from South Africa, East Africa, Georgia, Europe, and China that span roughly 300,000 to 1.9 million years. “They’re putting everything into H. erectus over huge geographical distances, essentially spread throughout the whole world, and over a vast number of years,” Johanson says.

Yet no other species matches the Dmanisi specimens better, Rightmire says. For example, the shapes of their dental palate and skulls match those of H. erectus, not H. habilis. And the variation in skull size and facial shape is no greater than in other species, including both modern humans and chimps, says Ponce de León—especially when the growth of the jaw and face over a lifetime is considered.

Though the fossils’ small stature and brains might fit best with H. habilis, their relatively long legs and modern body proportions place them in H. erectus, says David Lordkipanidze, general director of the Georgian National Museum and head of the Dmanisi team. “We can’t forget that these are not just heads rolling around, dispersing around the globe,” Potts adds. Like Rightmire, he thinks the fossils represent an early, primitive form of H. erectus, which had evolved from a H. habilis–like ancestor and still bore some primitive features shared with H. habilis.

Regardless of the Dmanisi people’s precise identity, researchers studying them agree that the wealth of fossils and artifacts coming from the site offer rare evidence for a critical moment in the human saga. They show that it didn’t take a technological revolution or a particularly big brain to cross continents. And they suggest an origin story for first migrants all across Asia: Perhaps some members of the group of primitive H. erectus that gave rise to the Dmanisi people also pushed farther east, where their offspring evolved into later, bigger-brained H. erectus on Java (at the same time as H. erectus in Africa was independently evolving bigger brains and bodies). “For me, Dmanisi could be the ancestor for H. erectus in Java,” says paleoanthropologist Yousuke Kaifu of the National Museum of Nature and Science in Tokyo.

In spite of the remaining mysteries about the ancient people who died on this windy promontory, they have already taught researchers lessons that extend far beyond Georgia. And for that, Lordkipanidze is grateful. At the end of a barbecue in the camp house here, he raised a glass of wine and offered a toast: “I want to thank the people who died here,” he said.

* in Dmanisi, Georgia


Athletes who don’t make eye contact / A Social Tip for Aspergers

If you are a sports fan, you may have noticed that many athletes don’t make eye contact during interviews; not even Peyton Manning, who ought to be one of the most confident humans on the planet. They look up at the sky, over the reporter’s shoulder, at the ground, sneaking in one or two glances at the camera or reporter. What’s up?

Maybe Aspergers need to take a page from the playbook – always wear sunglasses; deliver a string of clichés, and then make your exit.


“Bleach” Cure / Abuse of Autistic Children

Parents Give Autistic Children Bleach Enemas (Chlorine Dioxide)

A product sold on the internet claims to cure autism: called Miracle Mineral Solution (MMS), it sounds like any other quack remedy, but MMS can harm living things in serious ways. That’s because it’s a solution of 28 percent sodium chlorite, which, when mixed with citric acid as instructed, forms chlorine dioxide (ClO2), a potent form of bleach used in the paper and fabric industries. It is a dangerous chemical concoction, and yet some parents are giving it to their children, both orally and through enemas, in the belief that it will cure their child of autism.

What is the FDA doing to protect children from toxic chemicals and criminal abuse?

The FDA is aware of this abuse: in 2010 it issued a warning that the product turns into “a potent bleach” that “can cause nausea, vomiting, diarrhea, and symptoms of severe dehydration” if ingested. There has been one possible death, and children were taken from a home in Arkansas because the parents were suspected of using the solution on them, but a number of people remain convinced that MMS will provide cures. The underlying belief is that it will clear the body of parasites known as “rope worms” and other pathogens that they believe cause autism: this is dangerous quackery.

Funny how religious people are behind so much child abuse.

If this belief sounds religious in nature, it is. MMS was “discovered” by Jim Humble, a former Scientologist who started Genesis II (a new religion), for which he serves as Archbishop. The church is a marketing tool for “the cure,” but the church site doesn’t sell MMS directly. It offers supplementary materials like a $199 “MMS HOME VIDEO COURSE” and information on expensive MMS seminars.

A woman named Kerri Rivera is a bishop in Humble’s church; she has authored a book titled Healing the Symptoms Known as Autism, which recommends “hourly doses” of chlorine dioxide and enemas “to kill pathogens in the brain.”

If you fed your child rat poison or antifreeze you’d be prosecuted for attempted murder. Hell, if you tried this “cure” on your dog, you’d be arrested for animal cruelty.

Why aren’t the “bleach cure” propagandists and parents charged with crimes against children? Why? Because Americans still hold to the religious notion that children are the property of their parents and have no right to protection from violent and “insane” adults. 



Consumer affairs posts articles on dangerous and fraudulent products, lists recall notices, and divulges scams!

“Miracle Mineral Solution” promoter convicted of selling bleach as a miracle cure

Seven-day federal trial ends in conviction for Louis Daniel Smith

05/29/2015 | ConsumerAffairs |  Scams


Power, Hierarchy, and Ritual Inversion / from “Primates” blog

From super wordpress blog on “being primates” – BONES OF CULTURE


The human world is full of hierarchy. As I touched on last week, hierarchy is anthropologically necessary for groups over the size of a band (approximately 150, not four or five).

Culturally, some people have more status than others, and this leads to different levels of access to resources, from food and shelter, to knowledge and education, to political decision-making. In less complex societies, such status is usually achieved: people who have status earned it themselves. This contrasts with ascribed status, which is inherited. Usually, ascribed status is the hallmark of social classes, as it allows people’s children to inherit their wealth and power.

Hierarchy and Hegemony

Hierarchy does not always work as advertised. Sometimes we are called on to believe things that are not true or are not in our immediate best interest. Sometimes we are called on to “sacrifice for the greater good” when that “good” is simply support of the status quo. Hegemony occurs when people of higher social status and power manipulate the beliefs of those beneath them so that their power is protected. That doesn’t mean that those on the bottom are without recourse.

Cultures have a way of allowing people to blow off some of that steam without resorting to overwhelming violence (riots and civil war) and massive disruption. Everything from employees griping about bosses behind their backs (a venerable tradition everywhere) to the Occupy Wall Street movement count as ways to express resistance to these power imbalances.

Power and Inversion

When the people on the downside of a power imbalance take part in resistance to it in a formal way that is sanctioned by a culture, this can take a form called “ritual inversion.” In ritual inversion, the rules of society are reversed or ignored. This can be anything from late-night comedians commenting on politicians as the voice of the “common man,” to a day at your job where the bosses serve the employees lunch with their own hands, to a protest march, where those without the social power to make political decisions express themselves and judge their country’s leaders.

Ritual inversions are different from open rebellion. These expressions of social power happen within specific contexts and follow their own rules: even while they invert certain social power structures, they remain bounded by conventions of their own.

At a political march in America, for example, marchers may say and do things that rebel against cultural hegemony, but at the same time they are likely to avoid open violence, and lawbreaking happens in a ritual context. When protesters sit down in a road, blocking it, knowing that they will be arrested for their actions, it is an example of ritualized rebellion. People of less social status are standing up to those in power, showing their lack of fear and using their own social power to publicly call that power structure into question. At the same time, they are not rioters, using indiscriminate violence to try to tear down the larger system. Protesters are not going to war; they are engaging in ritualized action.

What is Ritual?


A recent addition to American “rites of passage”

Ritual doesn’t only refer to what happens in a church, but to any set of actions that are prescribed. Rituals allow for the limited rewriting of the rules of culture within a certain context. Sure, ritual can mean those formalized, stiff ways of talking and acting in certain social situations. It can be the activity of high mass at Christmas, but it can also be a high school graduation, or even something as simple as the “ritual” of meeting someone new, introducing yourself, and shaking hands.

Rituals are activities that change the social world. The religious rituals that we are most familiar with are only a subset of these, often changing the social world by incorporating deities and such into “social” relationships.

Rituals are specific contexts where the rules of culture are changed for a limited time…and they can also, through their completion, reinforce cultural rules, either old ones (status quo) or new ones. A high school graduation changes the social status of the graduates, and is also one of the ways that students can enter adulthood. It’s no coincidence that the end of high school roughly coincides with the transition of children to adults at age eighteen. High school graduation is a rite of passage in Western culture.

Ritual Inversion

When we take the rules of culture and turn them on their heads, but only in specific circumstances, we are usually working with “ritual inversions.” A protest movement, like Occupy Wall Street, that follows social rules even while breaking the written laws, is a perfect example of this process. So, while we can talk about the necessity of hierarchy among humans living in groups larger than about 150, it can be important to recognize that human cultures have methods for both changing and critiquing power structures in ways that prevent total collapse, widespread violence, and civil war.

“Optimism is Cowardice”

– Oswald Spengler

Actually, total collapse, widespread violence and civil war are everyday occurrences in the modern world; our methods of “cultural adjustment” are quite ineffective.

Modern Social Typicals fear change: the just-completed U.S. Presidential election has produced social hysteria – half the voters voted for change (but only after more than half a century of complaints about government) and scared the SHIT out of the other half!


Joseph LeDoux Neuroscience / Amygdala-Fear

Joseph LeDoux is a professor and a member of the Center for Neural Science and Department of Psychology at NYU. His work is focused on the brain mechanisms of emotion and memory. In addition to articles in scholarly journals, he is author of “The Emotional Brain: The Mysterious Underpinnings of Emotional Life” and “Synaptic Self: How Our Brains Become Who We Are.” He is a fellow of the American Association for the Advancement of Science, a fellow of the New York Academy of Sciences, a fellow of the American Academy of Arts and Sciences, and the recipient of the 2005 Fyssen International Prize in Cognitive Science.

“You don’t learn how to be afraid.”

“Emotions are vivid in the personal sense, but are not accurate memories.”

The “Sensory Overload” Phenomenon / Misinterpretations

High on the laundry list of “complaints” about Asperger behavior by “neurotypicals” is that we’re “overly sensitive” to certain environmental conditions – despite the fact that sense organs, the brain, and the nervous system ARE PHYSICAL OBJECTS. The modern medical and psychological communities persist in “muddling” the topic with “supernatural” ideas about how the human body works.

Sensory sensitivity and reactions to sensory overload precipitate the need for Asperger types to flee complex artificial environments. It’s physical:

Sensory sensitivities are physical and produce physical reactions. Asperger individuals react in similar ways to artificial environments, but each person reacts to varying degrees. These reactions are “automatic” – not “created by the person in order to cause neurotypicals discomfort.”

When our specific “senses” are assaulted by sounds, smells, chemicals, toxins or light (usually artificial – manmade) we instinctively retreat if possible to a dark and / or quiet place; if trapped we “shutdown” (block sensory input as an attempt to reduce the stimulation). Many “therapists” try total immersion, that is, exposing the Asperger child or adult to intolerable types and levels of stimulation for increasing periods of time, in order to “desensitize” the person to sensory overstimulation. Often this is forced upon Aspergers and autistics as “treatment.” Exposure reportedly helps some people, but we’re talking about increasing tolerance by minutes, and not a “cure.”

A poke in the eye is not supposed to feel good.

Anecdotal emphasis by those diagnosed Asperger reveals that sensory assault is the trigger for other Asperger “symptoms” – that is, social and work problems are the result of sensory sensitivity to noise, lighting conditions, chaotic human activity, and lack of consistency in workplace behavior and communication among workers, bosses, and hierarchical social agendas.

Social difficulties are the result of the special perception and processing that is inherent in the Asperger brain, compounded by toxic modern environments – which are populated by “domesticated” modern social humans.   

While “too bright and flickering” (fluorescent) lights are a big complaint, along with polluted indoor and outdoor air, artificial fabrics and other materials, and chemical odors that cause headaches and sinus and respiratory problems, a very common and direct trigger of sensory pain is sound.

This sound problem can be broken down into parts: The inability to block out background sound, like someone whistling or coughing, the humming or buzzing of a machine, a low-volume radio, “constant babbling or gossiping,” “terrible music,” or ever-present street noise. Many Asperger people say that the problem created by this type of noise is that it interferes with their ability to focus on tasks. Work environments are particularly generative of distractions that can’t be removed. At home, irritating sounds, smells, lights and other conditions can be limited, but rarely totally eliminated. Running away to one’s room, shutting the door and turning down lights, is a classic means of dealing with sensory overload. For me, it’s also fleeing to the wilderness that surrounds my town and releasing stress to the vast unpopulated desert.

The Asperger problem with persistent and information-less background sound raises questions. The “fuzzy” and subjective current methods of diagnosis label these physical phenomena as “developmental disabilities” in a “sloppy” analysis of the traits said to indicate Asperger Disorder. The inability to block out the cacophony of sounds that swamp the neurotypical environment, and to “enjoy human social noise,” would rather indicate a “type” of human that is more attuned physically to the environment; not “lacking” but “extra-sensory” – not in the hooky-spooky supernatural sense, but adapted to natural environments in which being acutely sensory-aware is a survival necessity.

As an Asperger who does experience severe negative reactions (pain) to some types of sound, my question is: In which environments would this awareness of a wider range of sound be adaptive? Distant sound, even at a very low level, would seem to be important to Asperger knowledge of the environment.


If I were to use a picture to characterize this relationship to sound, it would be a deer swiveling its radar dish ears to identify the direction, distance, and origin of a sound. Many animals, especially prey, can identify the “mood or intent” of a predator, distinguishing its hunting mode from a benign state. If benign, prey will ignore the predator nearby or “stand by” until they detect a change. If prey were always to run when aware of danger, they would soon be exhausted. In natural environments, instances of alarm and flight END with the cessation of the threat. I would suggest that the more “wild” and significantly less-domesticated Asperger brain expects, like the deer, that alarm-inducing sounds (including alarms from other species) will stop; but in modern human environments the sounds don’t abate, the sensory signs of danger never end, and the Asperger nervous system is almost constantly on alert.

The modern social brain has adapted (been down-selected) to human-created chaos; the Asperger brain has not. Modern humans conform not only to conventional social behavior, but to long periods of subjection to unhealthy working and living conditions. Not only Asperger types but Neurotypicals suffer from severe stress that has become the “normal” condition of Homo sapiens ONLY RECENTLY: sound pollution is a product of the Industrial Revolution.


“Living with Extreme Sound Sensitivity”

“If you feel disgusted to the point of rage when you hear the sound of chewing, swallowing, breathing, throat-clearing and other common “people” noises, you’re not alone. You’re also not crazy. Misophonia is a sound sensitivity disorder, which makes certain noises intolerable to the sufferer.” (It has a name, and also affects non-ASD humans)

“Although this condition is primarily neurological, the experience of these sounds can cause psychological (social) distress. The term misophonia was developed by Pawel and Margaret Jastreboff, American neuroscientists. Literally translated, it means ‘hatred of sounds.’” (Even the “name” conveys pejorative insinuations – we don’t hate sounds; we AUTOMATICALLY experience PAIN when exposed to specific types of sound.)

“This condition usually develops when a child is just entering into his or her tween years, although it can develop earlier in life. The affected child will often feel a frightening and uncontrollable urge either to strike the person making the noises or run away with hands over ears. Alternatively, some will mimic the sounds of the chewer in an attempt to cover up the noise or to communicate in a nonverbal way how horrible the sound is to them. This reaction is called ‘echolalia’ and is also quite common among those on the autistic spectrum.” (ASD “weird” behavior is often a reaction to neurotypical “weird behavior”.)

“One of the primary difficulties of living with this disorder is others’ reactions. Those who do not have any hypersensitivity to sound simply cannot imagine how their chewing and swallowing noises (and other incredible rudeness, including out-of-control children) can be so disgusting to another person. Often, protests from the sufferer are misinterpreted as passive-aggressive personal attacks or simply not believed at all.” (You’ve got this correct!)

“Although misophonia is thought of as a relatively rare disorder, those with other neurological and sensory processing disorders often struggle with this condition. Conditions such as autism, Asperger’s syndrome, and ADHD can cause the patient’s brain to misinterpret information taken in by their senses. These disorders often cause a misinterpretation of social cues, smell, visual cues, touch, balance, hearing, sense of time, space, and movement.” Here we go! This is all about social conformity, not about a physical condition. Domesticated (juvenalized-narcissistic) modern humans are annoyed by any person whose behavior or situation differs in the least from their expectations. Ask any individual utilizing a wheelchair or other device how bizarrely neurotypicals behave in the presence of a disabled person.

The insistent drumbeat that our sensory systems are “broken” is the default social judgement, when the evidence points to an individual sensory experience that may simply fall outside a “typical” range – and instead of being atypical, our experiences are defined as pathologies. “Misinterpret” implies that there is one “correct” interpretation of sensory input that every human being must duplicate, when EACH PERSON experiences his or her own sensory “universe.” There is no “perfectly normal” sensory system outside of the word-illusions created by doctrinaire humans (esp. psychologists).

“This sensory information can cause either a hypersensitive or hyposensitive response to various stimuli. In other words, the patient may hear or feel things much more or much less intensely than those with a neurotypical brain.” (His usage!)

It is entirely possible that the reverse is true; modern social humans have diminished sensory abilities due to domestication / neoteny. And it is well-known that human hearing is damaged by the incredible noise pollution in modern environments.

“Although there is no cure for sound sensitivity, there are various techniques as well as some dietary and lifestyle modifications that can help dial back the symptoms of misophonia so it does not interfere so severely with everyday life.

Tinnitus retraining therapy. …tinnitus retraining therapy was developed for those who live with tinnitus, misophonia, and hyperacusis. A combination of counseling and desensitization therapy with low-level broad-band noise aims to reclassify intolerable sounds as more neutral signals. This training helps to weaken the neuronal activity associated with the fight-or-flight response these noises often produce.” (That is, impair one’s sensory abilities in order to tolerate harmful environments.)

Cognitive-behavioral therapy. Cognitive-behavioral therapy is a technique designed to rewire the brain through the use of intense psychotherapy aimed at treating one specific problem. The specialist helps the patient go deep within to understand the specific emotions certain sounds produce, so that they can gain control over the automatic response. Over time, this helps desensitize the patient to formerly rage-inducing sounds. (Note that for AS children, this requires “damaging” the sensory system that is native to us: we are not adapted to the extreme artificial sounds and sound levels common in modern environments; sirens, “beep-beeping” machinery, repetitive and shrill noise, and near-infrasound “boom box” speakers. Nor the “social shrieks” of hyper-emotional people and of people constantly fighting, arguing, and screaming at each other.)

Occupational therapy. Those with sensory processing disorders often find occupational therapy beneficial. This approach helps a person’s neurological system integrate his or her senses so he or she can more appropriately process information. (Again, the opinion is that our processing isn’t “different,” it’s wrong.) For example, an occupational therapist might have a person who is hypersensitive to certain noises gradually experience a wide variety of noises, including the offensive ones, to help their brains get used to and eventually dismiss them. (Does anyone see this “treatment” as similar to torture tactics used on political and military prisoners?) These sounds are altered as needed to ensure the experiences are positive and within the patient’s comfort zone. (How reassuring!)

Psychotherapeutic hypnotherapy. Hypnotherapy with a certified hypnotherapist can help ease the symptoms of misophonia through the proven power of suggestion. Many individuals have been able to successfully overcome phobias and addictions through this method. (So sensory sensitivity is classed with phobias and addictions?)

Misophonia, while rare, is a real neurological condition. You haven’t lost your mind. If you hate the sound of chewing and other common noises to the point of frenzy, there is real help and validation out there. Talk with a trusted medical professional about the therapeutic techniques mentioned above. They may help you better integrate your senses and help you enjoy the world around you.

We see again the “blindness” of neurotypicals to perfectly valid variation in human physiology and psychology: the pressure is always for human beings to “warp” their “psychology” to satisfy neurotypical demands; to “tolerate” living and working conditions that are terribly unhealthy for ALL humans.
