BBC History of “The Celts” / Video 6 Episodes

Totally annoying “Easy Listening version of Bronze Age Hits” soundtrack – much archaeological “novel writing” but great art objects – I want the bronze couch. Also, recent genetic research may have changed assumptions presented here.

Well! Someone else left a comment on EP 3: “I don’t know why – but the music on this series is fucking annoying – well, the vocals at least . . . the rest of the production is great – too bad I can’t turn that annoying music off . . .”

And from EP 4: “I CANNOT STAND THIS MUSIC ANY LONGER” and “The music is awful. But the rest is great.”

Cool-sounding Celtic “battle horns” + head hunting and snug thatched houses; farming and domesticated local sheep = Iron Age. More archaeological novel writing, and the lame “ritual religious function” as explanation for anything “too hard” to figure out.

…but, fabulous jewelry composed of “meaningful geometry”

and the tired old “Bad Romans” narrative… gee whiz! Roman disciplinary superiority was basically “cheating” – the Celts (Irish) lost because they were drunkards! Daft Brit historians…

Let’s face it: without Ancient Rome, Europeans would still be living in dirt hovels…LOL

Go to youtube for remaining 4 episodes…


Empirical Planet Blog / Critique of All Things Neurotypical

http://empiricalplanet.blogspot.com

About Empirical Planet:  (Jason) I’m a neuroscience PhD student and hopeless news junkie. I’m passionate about bringing empirical thinking to the masses. @jipkin

Blog written by a fellow cranky complainer and hopeless believer in converting the masses to a love of reality, which is a pointless endeavor:

(Posted 07/2003) This idea of there being a “primitive” brain crops up all over the place, and from reputable sources: “They suggest that new learning isn’t simply the smarter bits of our brain such as the cortex ‘figuring things out.’ Instead, we should think of learning as interaction between our primitive brain structures and our more advanced cortex. In other words, primitive brain structures might be the engine driving even our most advanced high-level, intelligent learning abilities.” (Picower Professor of Neuroscience Earl Miller, MIT, said that.)

“It’s like adding scoops to an ice cream cone. So if you imagine the lizard brain as a single-scoop ice cream cone, the way you make a mouse brain out of a lizard brain isn’t to throw the cone and the first scoop away and start over and make a banana split — rather, it’s to put a second scoop on top of the first scoop.” (Professor David Linden, Johns Hopkins, said that.)

Now let me explain why this is all complete BS. First, semantics. What is “primitive”? How do you measure a tissue’s “primitivity”? In the common usage of the word, primitive means simple, especially in the context of a task, idea, structure, or way of life that was employed a long time ago. Cavemen were more primitive than us, for example. Unfortunately, this means that “primitive” is a word that refers to things both “ancient” AND “simple”. Which, as we’ll see, is a big problem when you start applying it to mean only one of those things, as occurs with the “primitive brain” meme.

Second, what are people actually talking about when they say “primitive brain”?  This is confused as well, but in general the thought is structured like this: Most primitive – brain stem, pons, cerebellum.  (The hindbrain, basically). Also primitive – the limbic system where “limbic” means “border, edge”.  This includes the hippocampus, amygdala, parts of the thalamus (who knows why), some of the cortex, and some other bits.  It’s supposed to do “emotion” and this is what Daniel Goleman is referring to when he talks about the “primitive brain”.

Really though it’s just a lumping together of all the structures right near the inner border of cortex, because why not lump it all together? – the mighty cortex, you know, is the glorious wrinkly part on the outside.

Third, why do people say that these particular brain structures are “primitive”?  The idea is that evolutionarily, one came after the other.  As in, the first vertebrate just had a brain stem.  Then it evolved more stuff after that.  Then as things kept evolving, more and more stuff got added on.  This is the “ice cream cone” model that David Linden espouses.  It’s also incredibly stupid (or at least misleading).  Let’s break it down.

Evolution did not happen like this: [image: linear model of evolution]

Evolution did happen like this: [image: branching tree model of evolution]

I hope everyone pretty much understands the concept of a phylogeny (the history of the evolution of a species or group, especially in reference to lines of descent and relationships among broad groups of organisms) and the fact that every vertebrate came from one common ancestor. Yes, the common ancestor was a kind of fish. No, today’s fish aren’t the same as the common ancestor. They evolved from it just like everyone else, although “rates” of evolutionary phenomena like mutation and drift can vary – but that’s beyond the scope of this post.

The point is that the “primitive” brain meme is born in the idea that the brain components shared by fish, lizards, mice, and humans must be evolutionarily ancient and were likely present in the common ancestor. So, homologous structures across the phylogeny indicate “ancientness”. And “ancientness” = “primitive”. (Except it doesn’t, but more on that in a second.) And since we all share structures that resemble the brain stem, voilà! That’s the most primitive part of the brain. Here’s where things go astray. First, we don’t just share the brain stem with all animals.

Here’s the real “ice cream cone” of the brain: [image: diagram of the embryonic vertebrate brain]. And when I say “the” brain, I should say “all vertebrate brains”. Every fish, amphibian, reptile (including birds), and mammal has a brain that starts out looking like the pictures to the right. Each colored bump (or “scoop of ice cream”) represents a region of the brain very early on in development, when the whole nervous system is basically a simple tube. Each bump goes on to expand to varying sizes and varying kinds of structures, yadda yadda, depending on the species. The point, though, is that all vertebrates have a forebrain, a midbrain, and a hindbrain. And the hindbrain, by the way, is the “primitive” brain stem. But clearly, humans, fish, lizards, and mice all evolved from a common ancestor that had all of these brain regions, not just the hindbrain.

This is why David Linden’s ice cream analogy is so dumb.  He’s implying that first you start with one scoop, the hindbrain, then add on another (the midbrain), and finally one more (the forebrain).

“When mammals like mice came along, the lizard brain didn’t go away. It simply became the brain stem, which is perched on top of the spine,” Linden says. “Then evolution slapped more brain on top of the brain stem.” But that’s not what happened at all. All the scoops were there to begin with. Then, as adaptation took its course, different scoops got bigger or smaller or just different as you compare across the entire phylogeny. Yes, humans have an enormous cortex and lizards don’t. And yes, lizards simply evolved a different-looking kind of forebrain. That’s all.

Second, homology (“likeness”) does NOT imply “ancientness”.  Even if the hindbrain looks pretty similar across the vertebrate phylogeny as it exists today, that doesn’t make it “ancient”.  The hindbrain has been evolving just like the midbrain and the forebrain.  Maybe it’s not been changing as much, but it’s still been changing.

This leads me to why the “primitive” notion is so misleading, and should be avoided:

(1) Calling a part of the brain “primitive” suggests what David Linden articulated: that brain evolution happened like stacking scoops of ice cream.  It implies that our brain stem is no different than that of a lizard, or of a mouse, or of a fish.  Yet despite their vast similarities, they are clearly not the same.  You can’t plug a human forebrain into a lizard hindbrain and expect the thing to work.  The hindbrain of humans HAD to adapt to having an enormous forebrain.  There’s something seductive in the idea that inside all of us is a primal being, a “reptilian brain”.  There isn’t.  It’s a human brain, top to bottom.

(2) Calling brain parts “primitive” because they are shared across phylogeny is often used to justify how amazing our human cortex is.  Look at what makes us, us!  We’re so great!  Well, I guess.  But we are just one little excursion among many that evolution has produced.  The lizard brain is adapted for what lizards need to do.  The fish brain is adapted for what fish need to do.  They don’t have “primitive” brains.  They have highly adapted brains, just like any other vertebrate.

(3) Simply using the word “primitive” makes the casual reader think of cavemen.  It just does.  And that’s even more ridiculous, because ancient Homo sapiens were still Homo sapiens.  Read what this poor misinformed blogger has written:

“So, let me explain the Primitive brain in simple terms. We have an Intellectual (rational) part of the brain and a Primitive (emotional) part of the brain. In the picture above, the Primitive brain is around the Hippocampus and Hypothalamus areas. In some texts, it has also been called the Limbic System. The subconscious Primitive part has been there ever since we were cavemen and cavewomen, and houses our fight/flight/freeze response (in the amygdala in between the Hippocampus and the Hypothalamus). Its function is to ensure our survival.”

AHHHHHHHHHHHHHHH.  You see?  You see??????

(4) There is not just a “primitive, emotional brain” and a “complex, intellectual brain”.  That is so…. wrong.  Factually wrong.  Yet people like Daniel Goleman sell books about emotional intelligence claiming that people need to develop their “emotional brain” (Asperger individuals are belittled as developmentally impaired  for not “having” a (mythical) SOCIAL or EMOTIONAL BRAIN.) and then bloggers like Carrie (above) start internalizing and spreading the misinformation.  Folks.  Let me be clear.  You have ONE brain.  One.  It has many parts, which is to say that humans looking at brains have found ways to subdivide them into various areas and regions and structures and whatnot.  Regardless of all that, the whole damn thing works together.  It’s one big circuit that does emotion, rationality, sensation, movement, the whole shebang.  There isn’t a simplistic “emotion” part and an “intellectual” part.  The cortex is involved in emotions and feelings.  The basal ganglia are involved in cognition.  In fact, the whole idea of splitting emotion and reason into two separate camps is fraught, as emotion turns out to be critical in reasoning tasks like decision-making.

Asperger individuals aren’t “able to – allowed to” claim that emotion influences our thinking (nor are we granted any feelings toward other humans) because we’re “missing” the non-existent “emotional brain” – and every idiot knows that not having an “emotional brain” makes a person “subhuman” or even psychopathic – we are irretrievably “broken”. The real story is that Asperger “emotions”, which technically are reactions to the environment, are different because we receive and process the environment differently than social-emotional magical thinkers do: neoteny favors irrational reactions.

(5) The “primitiveness” of lizard brains is vastly overstated. Things like this get written about the “primitive brain”: “A lizard brain is about survival — it controls heart rate and breathing, and processes information from the eyes and ears and mouth.”

This implies, to the casual reader, that lizards are just sitting around breathing.  Maybe there’s some “survival instinct” in there: look out for that big hawk!  Yeah, okay.  But guess what?  Lizards gotta do other stuff too.  They have to reproduce, find food, navigate their environment, learn, remember, make choices, etc.  They aren’t just breathing and instinct machines.  And because they aren’t, that means their brains aren’t just doing that either.  And why is it always lizards and reptiles?  You’d think fish would get picked on more.

(6) “Primitive” in the context of a brain part primarily means “ancient”. But the word “primitive”, as we already saw, connotes simplicity. This leaves laypeople with many misconceptions. First, that the brain stem, or the “emotional brain”, or whatever, is simple. Or even that they’re simpler. Nope. Not really. Pretty much every part of the brain is complex. Second, it reinforces, in the case of the “emotional brain”, that emotions are beneath intellect. (In the U.S. “emotional responses” have been elevated OVER intellect, because no one wants an analytical consumer or voter.) They came first, they are older, they are simpler, they are the stupid part of your brain. Again, just no. You need emotions to survive just as you need your intellect to survive. Fish need emotions (an emotion, after all, is just a representation of bodily state imbued with a certain positive/negative valence) just as they need their reasoning abilities.

(7) People use the word “primitive” (copying scientists) because it can sound cool and surprising. Look at how Earl Miller framed it, above:

“They suggest that new learning isn’t simply the smarter bits of our brain such as the cortex ‘figuring things out.’ Instead, we should think of learning as interaction between our primitive brain structures and our more advanced cortex. In other words, primitive brain structures might be the engine driving even our most advanced high-level, intelligent learning abilities”

Look at that result! A primitive thing did something advanced! The forgotten thing is important! Or maybe – this is going to sound crazy – the whole system evolved together to support essential tasks like learning. There never was a primitive part or an advanced part, despite two areas or regions being labeled as such. Every part of the human brain has been evolving for the same amount of time as every other part, and has been evolving to work as well as possible with each of those other parts.

(8) Finally, let’s return to Daniel Goleman, who argues that “emotional intelligence” arises from the “primitive emotional brain”.  Then he waxes on and on about the value of emotional intelligence, particularly as it relates to social abilities.  Ability to understand your own emotions.  Ability to perceive those of others.  Ability to interact well with others on the basis of understanding their emotions.  Et cetera. 

That’s all fine, but saying this comes from an ancient, primitive, emotional brain might make people think that (neurotypicals are primitive and stupid) and that ancient vertebrates really had to know themselves, be able to read others, and interact socially (i.e., that ancient vertebrates were as intelligent as modern humans). But there’s a whole lot of solitary, nonsocial vertebrate species out there, and they have brainstems and limbic systems too.

Hopefully never again will you refer to a part of the brain as “primitive.”  Some structures probably more closely resemble their homologous counterparts in the last common ancestor of vertebrates, but all the basic parts were there from the beginning.  And remember, evolution didn’t happen (only) to make humans. (And specifically, EuroAmerican white males.)  We aren’t more advanced in an evolutionary sense than fish, lizards, or mice.  Each species is just adapted to the roles it finds itself in, and continues to adapt.  Our sense of being “advanced” comes purely from our own self-regard and anthropocentric tendencies. The human brain is not the best brain, nor is it the most advanced brain, because there’s no scale on which to measure how good a brain is.

Actually, the processes of evolution appear to settle for “good enough” as the standard for successful adaptation!

worldofchinese.com / paleoanthropology wars

The Peking Man Delusion

Monday, September 23, 2013

While waves were sent thrashing through the intellectual world in 1987, as DNA evidence hailed a woman living 200,000 years ago in Africa as the most recent common maternal ancestor of all modern humans, China’s leading scientists fervently waved the red flag and rejected the findings in Nature as little more than preposterous. They held up the theory that an archaic form of human, the “Peking man” (Homo erectus, 北京猿人 Běijīng yuánrén), was the true ancestor of the modern Chinese, with radicals going so far as to claim that China was the cradle of humanity.

Over the next 30 years, as the model of a common African origin gained momentum with growing genetic and archaeological evidence, the theory evolved into a well-worn narrative of human evolution, depicting an early form of modern humans emerging out of Africa around 60,000 years ago. The scientific community rejected the competing theory that supported independent evolution in different parts of the world. But in China the revelation barely rippled; a defiant nation continued to teach in its history textbooks that the Peking man was the real deal, and the prevailing theory of evolution failed to permeate public consciousness. (As if the Out of Africa Theory were not culturally biased and had not had to be highly modified.)

The Peking man in question – a Homo erectus, or upright man, that lived 750,000 years ago – was discovered in 1929 at Zhoukoudian (周口店), now a UNESCO World Heritage site southwest of Beijing. At the time these were the oldest fossils of man known, fanning the flames of theories that all humans originated in China.

These believers were at least right about one point: a common origin. “Based on evolutionary theory, all modern humans have to share a common ancestor at some point in the past”, says Dr. Mark Stoneking of the Max Planck Institute for Evolutionary Anthropology. “The evidence for this is extremely strong from all fields of anthropology and genetics. The controversy is over how far back in the past this common ancestor is”.

Indeed, apart from the occasional loon, what most leading scientists, including the Chinese, can now agree on is that around two million years ago, during the Pleistocene epoch (2.6 million–11,700 years ago), primates in Africa evolved into Homo erectus. This species was able to cover greater distances due to its upright form, and, with climate conditions permitting, a sub-population successfully migrated out of Africa in an event known as Out of Africa I, spreading to Asia and other parts of the world and giving rise to populations such as the Peking man.

So what happened to these archaic forms that occupied Asia? Those who believe they survived as ancestors of modern populations back the Multiregional Model of human evolution, a theory that has largely been rejected by the international scientific community. Continuous descent from Peking man, at least, “is very unlikely from fossil evidence, and perhaps even more unlikely from the totality of the genetic evidence,” according to Professor Chris Stringer of the Natural History Museum in London. “The classic multiregional model from late last century proposed an evolutionary continuity that lasts at least a million years in each region, which would result in a gradual transformation in each area to the modern people of today—from the Neanderthals to modern Europeans, from Peking Man to modern Chinese.” For Professor Stringer the picture is clear: “In my view there is only one region where we can observe such a transformation, and that is in Africa.”

That is to say, a second migration out of Africa, by a more evolved species, occurred around 60,000 years ago – a scenario known as the Recent Out of Africa Hypothesis (ROAH). This new species, an early form of Homo sapiens, dispersed around the world, replacing archaic forms like Homo erectus and giving rise to the modern human population of today.

There is nearly no doubt that modern humans came out of Africa. “The evidence in favor of an African origin about 200,000 years ago for modern humans is very robust,” says Dr. Rosalind Harding, Oxford University Lecturer in Biological Anthropology. Some of this comes from studies of mitochondrial DNA (mtDNA), which is passed along only the maternal line and can therefore be used to trace genetic lineages. The work begun in the 1970s by Dr. Stoneking and colleagues has demonstrated that all humans are traceable to “Mitochondrial Eve”, the woman living 200,000 years ago in Africa. (MtDNA is not definitive; it’s more like a travelogue than anything else. Whole genome studies must be done.)
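Why must such an “Eve” exist at all? A toy simulation makes the logic visible: in any finite population some women leave no daughters, so maternal lineages randomly die out each generation until all surviving mtDNA traces back to a single woman. The sketch below is a hypothetical illustration only – it is not the method Stoneking and colleagues used, and the function name and population size are invented for the example:

import random

def generations_to_mito_eve(pop_size=500, seed=1):
    # Toy Wright-Fisher model of strictly maternal inheritance.
    # Each founder starts her own lineage; every generation, each
    # daughter picks a mother at random from the previous generation.
    # Returns the number of generations until one lineage remains.
    random.seed(seed)
    lineages = list(range(pop_size))
    generation = 0
    while len(set(lineages)) > 1:
        lineages = [random.choice(lineages) for _ in range(pop_size)]
        generation += 1
    return generation

print(generations_to_mito_eve())  # typically on the order of 2 x pop_size generations

Run it with a larger pop_size and the coalescence time grows roughly in proportion – which is the basic reason geneticists can turn observed mtDNA diversity into a date estimate.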

The Homo erectus in Asia is believed to have gone extinct—they either died out or were driven to extinction by incoming species—although no one knows when or where this happened. This is one of those cases where, if a statement is repeated often enough, people believe it.

On the other hand, Multiregionalists reject the idea that this lineage was cut short, believing that it persists to this day. Champions of this school of thought, notably the paleontologist Professor Wu Xinzhi of the Chinese Academy of Sciences, have recently advocated a view whereby populations of Homo erectus in Africa, Asia, and Europe continued to exchange DNA through interbreeding, while undergoing strong natural selection in their own regions. In this way, archaic populations developed “regional” as well as “common” features, creating what Professor Wu refers to as a “mosaic of morphological features”.

According to Wu, the distinct shovel-shaped upper incisors and flat upper faces seen in Peking man and other archaic Chinese fossils are only identifiable as a “complex” in modern Chinese. For Professor Stringer, however, this line of argument is unconvincing: “I would argue that the distinctive features of Africans, Europeans, Asians, and Australasians evolved only after the development of our shared modern features, i.e. within the last 100,000 years”. (European features such as light skin and hair and blue eyes are indeed very recent!)

And while academics in China rage against the grain, the Chinese public is less than perturbed by stirrings in the scientific realm. Ask the average Chinese person who they evolved from, and the likely answer is “Peking man”. Suggest an African ancestor in response, and a whole gamut of reactions to this bizarre notion is likely to surface. Ms. Zhang Yanfang, a 67-year-old Beijinger and retired office manager, on learning of the Africa theory, replied that she was not even aware such a hypothesis existed. “I’ve never heard of such a thing,” Zhang said, adding that she grew up learning, like every other Chinese person, that she descended from the Peking man. “I think it’s quite unlikely that we are related to Africans; the difference is too great in every respect—including that of race.”

Ask an average American who admits to belief in evolution from whom he or she descended, and they will say Adam & Eve or Ancient Aliens. At least Peking Man actually existed.

While general ignorance might fuel popular skepticism towards ROAH, for many the disbelief in an African origin lies in the notion that the Chinese – with their unique and illustrious history, the longest continuing civilization in the world, and some of the greatest inventions ever – could fundamentally be “non-Chinese” in any way.


Peking Man, E. Daynes

Professor Frank Dikötter, Chair Professor of Humanities at the University of Hong Kong and author of The Discourse of Race in Modern China, believes that intellectualism can be strongly influenced by a cultural context, noting a similar resistance in the West to the multiple origins theory in the 19th century, “Monogenesis until recently was a view with its own cultural background. Despite the fact that Charles Darwin and others were talking in the name of science, it was very clear that Christianity and the idea of Adam and Eve provided a strong setting.” In China, views opposing this began emerging at the end of the 19th Century, when thinkers, likely influenced by strong Confucianism, says Professor Dikötter, “insisted that Africans and Europeans came from one single origin, but that the Chinese were completely different.” The discovery at Zhoukoudian several decades later reinforced this view: “Chinese scientists took it to be proof that China is one long line of uninterrupted descent.”

China has a long history of ancestor worship, with the belief that the legendary Yellow Emperor (黄帝) is the ancestor of all Chinese people. This conception of a distinct racial lineage took particular form during the Qing dynasty, when tracing patrilineal lines of descent to form long genealogies became a popular practice. “Clans would compile ancestral lines to show that an uninterrupted line of descent existed,” Professor Dikötter says. “This certainly combined with newer evolutionary ideas about race at the end of the 19th century.” Professor Dikötter believes that this strong culture is one that is still present today: “Science never really operates outside of social, cultural, and political contexts; we might wish to believe it is completely objective and blind, but it is not.”

Stripped of cultural and historical connotations, the issue of race may be more than mere reverence for one’s own. There exists a widely held unfavorable perception of Africa and the African people. Both Professor Dikötter and Professor Sam Crane, Chair and Professor of Political Science at Williams College, Massachusetts, agree that these views have a part to play in the debate. Such sentiments were demonstrably brought to the fore in 2009 when Lou Jing, a 20-year-old mixed-race Shanghai woman whose father is African-American, appeared on the TV talent contest Go! Oriental Angel. After speaking of what it meant to be Chinese, she almost immediately became a target of online abuse and slurs centered on the color of her skin. Many netizens were insulted that she considered herself Chinese at all, exposing the realities of identity and racial prejudice in modern-day China.

The U.S. ought not point fingers at other cultures concerning persistent racism; Western paleoanthropology is all about EuroAmerican superiority.

While notions of race may be a more recent phenomenon, ideas about national identity are deeply rooted in history. Dr. Lukas Nickel, Reader in Chinese Art History and Archaeology at the School of Oriental and African Studies, University of London, believes that ideas of national identity are important in this debate. “China is one state with a long and independent heritage, and the first emperor of China was enormously important in constructing a nation and a unified people.” The first emperor, Qin Shi Huang (259 B.C.–210 B.C.), a formidable military strategist and policy maker, conquered individual kingdoms and unified them into one state, germinating more than 2,000 years of imperial rule.
“Before Qin unified China, each country was different and each state had a distinct dialect with only a limited sense of commonality,” says Dr. Nickel. He adopted what Dr. Nickel calls “psychological unity measures,” to ensure that China could function as a single unit, where its people felt “crucially different from everyone else.” Qin even gave his people a new name, “The Black-Headed Ones”, standardizing their language, script, laws and customs. While regional dialects can still be very different today, there were enormous differences during that period.

Even the Great Wall of China, commonly believed to be a strategic barrier built by Qin to ward off invaders, was built as a powerful symbol of separation between the Chinese people and others. Dr. Nickel is convinced that Qin helped shape the psyche of a unique population, “and what he has created has persisted and resonated through all of Chinese history.”

While the glowing legacy of the Qin Dynasty (221 B.C.-206 B.C.) may have contributed to a nationalistic psyche, of equal relevance is “The Century of Humiliation” (百年国耻 bǎinián guóchǐ) that China suffered at the hands of foreign forces in the late 19th to early 20th Century. From the Opium War to the Japanese invasion, these events and the political turmoil that ensued brought the country to its knees. Patriotism was introduced through primary school lessons, alongside activities such as raising the national flag. “Patriotic education in China may well contribute to the general skepticism regarding the ‘Out of Africa’ approach to human evolution,” says Professor Sam Crane of Williams College. “Patriotic education focuses critical thinking in a nationalistic direction, which in turn could increase the preference for uniquely ‘Chinese’ answers to all sorts of questions.” Professor Crane also points to the phrase Zhongguo tese (中国特色), which refers to the cultural “specialties” – an array of the good, bad, and absurd – that are uniquely made-in-China.

The desire of a nation to see itself as “special” may be relevant, but Professor Crane thinks there’s more to it than cultural ego when it comes to intellectual life: “Perhaps more important in the area of evolution is a general anxiety among academics regarding the apparent intellectual dominance of ‘Western’ theories and interpretations.” In part, says Professor Crane, “it reflects a broader intellectual aspiration to transform the Chinese experience into more universal theories.”

Ironically, then, perhaps the biggest blow to multi-regionalism in recent years has come from a contemporary of Professor Wu’s at Fudan University, who apparently set out to do just the opposite. Professor Jin Li, one of China’s leading geneticists at Fudan’s Research Centre for Contemporary Anthropology, claims his intention was to prove once and for all that the Chinese evolved independently in China. Instead, his results in 2001 turned his own theory on its head and lent substantial support to ROAH. Looking at the Y-chromosome, found only in men, Professor Jin singled out a genetic mutation that arose in Africa 31,000–79,000 years ago and collected 12,000 DNA samples from 163 different ethnic groups around East Asia. The results were shattering – every single sample contained the marker that originated in Africa.

While consensus holds sway on the basic model of human evolution, the picture is far from assembled. Even the “Out of Africa” narrative is proving to be murkier than first thought. In contrast to a static model of evolution from earlier decades, “human evolution has been shaped as much by dispersal out of Africa many times, and across Eurasia many times, as by isolation within geographic regions,” says Dr. Harding. She attributes these movements to Pleistocene climate changes that permitted dispersal as well as imposed isolation, and thinks that, while species became extinct at the hands of other incoming species, there could also have been reproductively successful mating, allowing for gene flow between species.

So, perhaps there is still a glimmer of hope that Peking man DNA will be picked up in the gene pool somewhere. While none has yet been detected, archaeological findings in 2010 yielded a new species in Siberia, the Denisovan (41,000 years ago), whose DNA is a mixture of different species, a significant amount of which has passed on to modern Australasians. Furthermore, a small fraction of this seems to belong to an archaic human – could this be an Asian Homo erectus, or even Peking man? More complete Denisovan DNA is needed before anything can be inferred, but if that were the case, Australasians, rather than the Chinese, would be its descendants.

Until definitive evidence shows that Asian Homo erectus simply died out, the door of possibility will remain open. While it is “extremely unlikely that ‘Peking man’ evolved into modern Chinese,” says Dr. Stoneking, “it is still possible that when modern humans entered China, around 40,000 years ago, they interbred with Homo erectus in China, and so it is possible that modern Chinese carry some Homo erectus ancestry.”

Not so fast with that curtain call: there may be a Peking man amongst us.

Drug Trial Reform Reduces Positive Results

Increasing the transparency of clinical drug trials reduces the number of positive results.


Pre-registering big drug trials resulted in fewer trials reporting a positive drug effect.

Lack of Sensory Stimulation / Winter Peril

Just when I think that there’s nothing left to “think about” –

Winter is a perilous time for my “moods” or “affect”, to be official. The weather has changed from sunny and warm to suddenly cold; it’s the season of futzing with the heating system in my old house: one gas heater sans blower, convection only. It puts out plenty of heat in one room – the kitchen. It’s up to me to devise ways to move that hot air around the house with fans and a small auxiliary heater that I carry with me to whichever room I will be spending time in – the computer room, most mornings. I don’t linger in bed, but wake up ready to go. One cup of coffee and I’m typing away on whatever topic “popped” into my conscious awareness when I woke up.

This morning it was a question of “dealing with” the confines of winter. Each year our winter season arrives on an unknown trajectory: it’s true that in this high-altitude desert in southwestern Wyoming, a “season” in the ordinary sense doesn’t mean much, except that the ratio of light to dark hours keeps shrinking toward that magic date, the winter solstice, when “winter” officially begins. I don’t understand that at all: the solstice for me marks “mid-winter” – the turning point when “daylight” begins its return to bearable length. A minute or two per day, but that accumulates fast – at two minutes a day, roughly an hour of added daylight by the end of January – so the expanding “light” is obvious. This process might be compared to myths worldwide in cold (non-equatorial) climates: the “death and resurrection” of the sun god, which persists despite heated houses, 4WD vehicles, food availability, and holidays for distraction.

It usually takes me about a week to adjust to the “death of the sun”. During that interval I “freeze” in a state that feels like a prelude to hibernation; if I could only sleep away the winter like a bear. Of course, this “attitude” must end, and I begin piling up “projects” to do: review all my photos (there must be some I can delete); restock the pantry (there is but one grocery store, far across town, which may be inaccessible in truly bad weather); review and update or delete blog posts; go back to research topics I set aside; clean house – an uncivilized mess after ignoring the consequences of a windy-dusty-sandy-muddy world that doesn’t stop at the threshold. Prepare for any weather conditions that will arise: boots, hats, gloves, a parka and lesser jackets – layering is the way to go. My neighbor actually replaced the fuel pump in my Dodge Ram truck: Wow! My hero… a huge weight off my anxious state of preparations.

ALL THIS IS DEPRESSING! But why, specifically? The “winter dilemma” has plagued me my entire life. Did my distress simply begin because, growing up in Chicagoland, winter meant enduring months and months of bitter confinement? I began plotting my escape to “The West” after a winter trip to Colorado revealed a sunny, blue-skied land, with most of the snowfall where it belonged (on the mountain peaks and slopes), and relief from the prison of trees at home in the Midwest that negated an open horizon, except along the shore of Lake Michigan.

In meaningful ways, weather, landscape and climate have (dictated, guided, informed?) my choices and decisions above and beyond other considerations. Life at its best has occurred in those intervals when optimal physical parameters have coincided with opportunities for “self-expression” – whether self-discovery was provided for in the context of a “job-job” or by a nomadic episode that defied concerns about “being normal”. Normal only ever concerned me in financial terms; if I could come up with funding for my “eccentric” ways, social normality was quickly extinguished, and I could indulge in personal freedom, which means living in the “timeless” time frame of “now”. If one lives “now”, the stupidity of most human ideas and actions becomes obvious, and it becomes impossible to reconnect with social illusions and delusions.

So – here I am! Another winter, another adjustment; but the question remains, why do I react this way? The “could be” answers are still apparent: simple physical “screw-ups” in circadian rhythm due to the clockwork of decreasing light; Asperger sensory eccentricity; bipolar “moodiness” triggered by environmental change; or speculation that my mood is one consequence of non-adaptation by ancient ancestors who migrated to northern Europe, but who lacked the memory or sense to head back south.

I think that I’ve finally pinpointed “the source” of my physical discontent: sensory deprivation! It’s a curious dilemma: too much “human” stimulation (noise, chatter, social annoyance, discordant fundamental views of “being human”, etc.) leads to “pain” and retreat; a safe little house in a quiet town; control of human exposure. But winter flips the problem over: there is too little of the stimulation that I crave. Yet visual-spatial opportunity is still there, in winter’s cold contrast between snow, frost and ice. Deep shadows and shimmering blue sky beckon – an entirely new landscape arises from the familiar one.

Actually, I take more photographs in “bad weather” episodes and during environmental change created by geologic processes, than in our protracted and bleak summer of overexposed and colorless dead plants; of uniform yellow grasses and bleached rock.

It’s just more difficult to “get out there” when sunlight disappears between 5 p.m. and 6 p.m., truncating the long slow interplay of land and sky that is summer’s gift to the human eye and brain. Concerns over travel into the countryside are very real, even with 4WD. The consequences of “getting stuck” can range from hours of hard labor to get unstuck, to a hefty bill for tow truck service (if you’re within live cell phone range), to hypothermia, frostbite or death. That is, one mistake in judgement and/or luck can be a huge factor in having or not having a really bad day. This is of increasing concern as one ages: the reality of predicaments that require an adequate mental and physical response has been drilled into memory by those very challenges, which occurred frequently when I was young and stupidly optimistic. Diminished capabilities are real and must be accounted for in daily calculations concerning physical activity.

There is also more to this: the “spatial” element. Perpetual restlessness has two solutions: either “move around a lot” or find an intellectual focus that essentially makes it possible to “ignore” the lack of sensory stimulation for hours or sometimes days. However awkward, this “tension” between mental focus and physical movement is necessary to thinking – perhaps for a visual person this is merely fact: reality as “received and processed and ultimately realized” requires physical (sensory) sensitivity and fulfillment. How could a “concrete” visual thinker function otherwise?


This “type” of human experience WAS TYPICAL in pre-modern wild humans who did not rely on word thinking for their survival. Sound communication was not “language” but a component of sensory reaction to activity: what “sound” does a particular bird or animal make that reveals its behavior? What sound can “we” assign to information that is valuable when conveying such information is impossible due to separation by distance or obstacles to sight? Which sounds are attuned to the body’s natural functioning, eliciting a powerful positive or negative response in other humans? None of these communications require words; in fact, in the context of hunting, scavenging, and gathering, “human noise” is a detriment to success. Understanding animal behavior, especially “sound behavior”, and squelching the verbal compulsion is quite a task for many hunters today.

How many hunting trips are ruined because Uncle Ralph couldn’t keep his mouth shut, talked incessantly and crashed through the underbrush like a deranged modern social human without “a lick of common sense”?

How many “hyperactive” kids are merely moving because that’s how visual thinkers acquire data and experience the spatial relationships that “put that data” into real contexts? Motion teaches the child not only how to “operate the body” but also what he or she needs to know to form a “sensory understanding” of the environment.

Today’s visual stimulation: the unwrapping, cleaning and placement of antique-vintage ornaments on two tiny Xmas trees; there will be more trees, since I’ve been collecting ornaments for many years.



Dmanisi Roundup / Small-Brained Tough Hominins

From SCIENCE / AAAS

It’s the species problem again! Declare a bunch of fossils to compose a “species”, and then try to “cram” new fossils into the old “artificial” scheme: there are no species in nature (but reproductive boundaries do Evolve). No matter how “clever” our scientific schemes, “stair step” evolution is misguided: a sad copy of Biblical “begat, begat, begat” genealogies. And obviously, primitive Life forms persist (and are incredibly successful) despite our self-aggrandizing appointment to “the pinnacle of creation”!
Narcissism plagues these awkward products of “circular thinking”: how many of the millions of animals and plants that spread and adapt to every environment on earth manage to do this without our “precious” BIG BRAIN? DUH?

The Wanderer

Fossils of the first human ancestors to trek out of Africa reveal primitive features and a brutal way of life.

by Ann Gibbons*
Science  25 Nov 2016: Vol. 354, Issue 6315, pp. 958-961
DOI: 10.1126/science.354.6315.958
See original for photos and illustrations

On a promontory high above the sweeping grasslands of the Georgian steppe, a medieval church marks the spot where humans have come and gone along Silk Road trade routes for thousands of years. But 1.77 million years ago, this place was a crossroads for a different set of migrants. Among them were saber-toothed cats, Etruscan wolves, hyenas the size of lions—and early members of the human family.

Here, primitive hominins poked their tiny heads into animal dens to scavenge abandoned kills, filleting meat from the bones of mammoths and wolves with crude stone tools and eating it raw. They stalked deer as the animals drank from an ancient lake and gathered hackberries and nuts from chestnut and walnut trees lining nearby rivers. Sometimes the hominins themselves became the prey, as gnaw marks from big cats or hyenas on their fossilized limb bones now testify.

“Someone rang the dinner bell in gully one,” says geologist Reid Ferring of the University of North Texas in Denton, part of an international team analyzing the site. “Humans and carnivores were eating each other.”

This is the famous site of Dmanisi, Georgia, which offers an unparalleled glimpse into a harsh early chapter in human evolution, when primitive members of our genus Homo struggled to survive in a new land far north of their ancestors’ African home, braving winters without clothes or fire and competing with fierce carnivores for meat. The 4-hectare site has yielded closely packed, beautifully preserved fossils that are the oldest hominins known outside of Africa, including five skulls, about 50 skeletal bones, and an as-yet-unpublished pelvis unearthed 2 years ago. “There’s no other place like it,” says archaeologist Nick Toth of Indiana University in Bloomington. “It’s just this mother lode for one moment in time.”

Until the discovery of the first jawbone at Dmanisi 25 years ago, researchers thought that the first hominins to leave Africa were classic H. erectus (also known as H. ergaster in Africa). These tall, relatively large-brained ancestors of modern humans arose about 1.9 million years ago and soon afterward invented a sophisticated new tool, the hand ax. They were thought to be the first people to migrate out of Africa, making it all the way to Java, at the far end of Asia, as early as 1.6 million years ago. But as the bones and tools from Dmanisi accumulate, a different picture of the earliest migrants is emerging.

By now, the fossils have made it clear that these pioneers were startlingly primitive, with small bodies about 1.5 meters tall, simple tools, and brains one-third to one-half the size of modern humans’. Some paleontologists believe they provide a better glimpse of the early, primitive forms of H. erectus than fragmentary African fossils. “I think for the first time, by virtue of the Dmanisi hominins, we have a solid hypothesis for the origin of H. erectus,” says Rick Potts, a paleoanthropologist at the Smithsonian Institution’s National Museum of Natural History in Washington, D.C.

This fall, paleontologists converged in Georgia for “Dmanisi and beyond,” a conference held in Tbilisi and at the site itself from 20–24 September. There researchers celebrated 25 years of discoveries, inspected a half-dozen pits riddled with unexcavated fossils, and debated a geographic puzzle: How did these primitive hominins—or their ancestors—manage to trek at least 6000 kilometers from sub-Saharan Africa to the Caucasus Mountains? “What was it that allowed them to move out of Africa without fire, without very large brains? How did they survive?” asks paleoanthropologist Donald Johanson of Arizona State University in Tempe.

They did not have it easy. To look at the teeth and jaws of the hominins at Dmanisi is to see a mouthful of pain, says Ann Margvelashvili, a postdoc in the lab of paleoanthropologist Marcia Ponce de León at the University of Zurich in Switzerland and the Georgian National Museum in Tbilisi. Margvelashvili found that compared with modern hunter-gatherers from Greenland and Australia, a teenager at Dmanisi had dental problems at a much younger age—a sign of generally poor health. The teen had cavities, dental crowding, and hypoplasia, a line indicating that enamel growth was halted at some point in childhood, probably because of malnutrition or disease. Another individual suffered from a serious dental infection that damaged the jawbone and could have been the cause of death. Chipping and wear in several others suggested that they used their teeth as tools and to crack bones for marrow. And all the hominins’ teeth were coated with plaque, the product of bacteria thriving in their mouths because of inflammation of the gums or the pH of their food or water. The dental mayhem put every one of them on “a road to toothlessness,” Ponce de León says.

They did, however, have tools to supplement their frail bodies. Crude ones—but lots of them. Researchers have found more than 15,000 stone flakes and cores, as well as more than 900 artifacts, in layers of sediments dating from 1.76 million to 1.85 million years ago. Even though H. erectus in East Africa had invented hand axes, part of the so-called Acheulean toolkit, by 1.76 million years ago, none have been found here at Dmanisi. Instead, the tools belong to the “Oldowan” or “Mode 1” toolkit—the first tools made by hominins, which include simple flakes for scraping and cutting and spherical choppers for pounding. The Oldowan tools at Dmanisi are crafted out of 50 different raw materials, which suggests the toolmakers weren’t particularly selective. “They were not choosing their raw material—they were using everything,” says archaeologist David Zhvania of the Georgian National Museum.

That simple toolkit somehow enabled them to go global. “They were able to adjust their behavior to a wide variety of ecological situations,” Potts says. Perhaps the key was the ability to butcher meat with these simple tools—if hominins could eat meat, they could survive in new habitats where they didn’t know which plants were toxic. “Meat eating was a big, significant change,” says paleoanthropologist Robert Foley of the University of Cambridge in the United Kingdom.

Even with their puny stone flakes, “these guys were badass,” competing for meat directly with large carnivores, Toth says. At the meeting, he pointed to piles of cobblestones near the entrance of an ancient gully, which suggest the hominins tried to fend off (or hunt) predators by stoning them.

They set their own course as they left Africa. Researchers had long thought that H. erectus swept out of their native continent in the wake of African mammals they hunted and scavenged. But all of the roughly 17,000 animal bones analyzed so far at Dmanisi belong to Eurasian species, not African ones, according to biological anthropologist Martha Tappen of the University of Minnesota in Minneapolis. The only mammals not of Eurasian origin are the hominins—“striking” evidence the hominins were “behaving differently from other animals,” Foley says.

Perhaps venturing into new territory allowed the hominins to hunt prey that would not have known to fear and flee humans, suggests paleoanthropologist Robin Dennell of the University of Exeter in the United Kingdom. Tappen calls that an “intriguing new idea” but thinks it should be tested. Checking the types of animal bones at other early Homo fossil sites out of Africa could show whether the mix of prey species changed when hominins colonized a new site, supporting a “naïve prey” effect.

Whatever impelled them, the migrants left behind a trail of tools that have enabled researchers to trace their steps out of Africa. There, the oldest stone tools, likely fashioned by the first members of early Homo, such as small-brained H. habilis, date reliably to 2.6 million years ago in Ethiopia (and, possibly, 3.3 million years in Kenya). New dates for stone tools and bones with cutmarks at Ain Boucherit, in the high plateau of northeastern Algeria, suggest that hominins had crossed the Sahara by 2.2 million years ago when it was wetter and green, according to archaeologist Mohamed Sahnouni of the National Centre for Research on Human Evolution in Burgos, Spain. His unpublished results, presented at the Dmanisi meeting, are the earliest evidence of a human presence in northern Africa.

The next oldest tools are those from Dmanisi, at 1.85 million years old. The trail of stone tools then hopscotches to Asia, where Mode 1 toolkits show up by nearly 1.7 million years ago in China and 1.6 million in Java, with H. erectus fossils. “We pick up little fractions of a current” of ancient hominin movements, Foley says.

The identity of the people who dropped these stone breadcrumbs is a mystery that has only deepened with study of the Dmanisi fossils. The excavation team has classified all the hominins at the Georgia site as H. erectus, but they are so primitive and variable that researchers debate whether they belong in H. erectus, H. habilis, a separate species, H. georgicus—or a mix of all three, who may have inhabited the site at slightly different dates.

A new reanalysis of the Dmanisi skulls presented at the meeting added fuel to this debate by underscoring just how primitive most of the skulls were. Using a statistics-based technique to compare their shape and size with the skulls of many other hominins, Harvard University paleoanthropologist Philip Rightmire found that only one of the Dmanisi skulls—at 730 cubic centimeters—fits “comfortably within the confines of H. erectus.” The others—particularly the smallest at 546 cc—cluster more closely with H. habilis in size.

Nor did the Dmanisi hominins walk just like modern humans. A new analysis of cross sections of three toe bones found that the cortical bone—the dense outer layer—wasn’t buttressed in the same way as it is in the toes of modern humans. When these hominins “toed off,” the forces on their toes must have been distributed differently. They may have walked a bit more like chimps, perhaps pushing off the outside edge of their foot more, says Tea Jashashvili of the University of Southern California in Los Angeles and the Georgian National Museum.

“If there are so many primitive traits, why are they calling it H. erectus?” asks Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York City. “People are avoiding the question of what H. erectus is. Every time new stuff comes up, they’re enlarging the taxon to fit new stuff in.” Foley ventures: “I haven’t the slightest idea of what H. erectus means.”

Indeed, H. erectus now includes the 1-million-year-old type specimen from Trinil on the island of Java as well as fossils from South Africa, East Africa, Georgia, Europe, and China that span roughly 300,000 to 1.9 million years. “They’re putting everything into H. erectus over huge geographical distances, essentially spread throughout the whole world, and over a vast number of years,” Johanson says.

Yet no other species matches the Dmanisi specimens better, Rightmire says. For example, the shapes of their dental palate and skulls match those of H. erectus, not H. habilis. And the variation in skull size and facial shape is no greater than in other species, including both modern humans and chimps, says Ponce de León—especially when the growth of the jaw and face over a lifetime are considered.

Though the fossils’ small stature and brains might fit best with H. habilis, their relatively long legs and modern body proportions place them in H. erectus, says David Lordkipanidze, general director of the Georgian National Museum and head of the Dmanisi team. “We can’t forget that these are not just heads rolling around, dispersing around the globe,” Potts adds. Like Rightmire, he thinks the fossils represent an early, primitive form of H. erectus, which had evolved from a H. habilis–like ancestor and still bore some primitive features shared with H. habilis.

Regardless of the Dmanisi people’s precise identity, researchers studying them agree that the wealth of fossils and artifacts coming from the site offer rare evidence for a critical moment in the human saga. They show that it didn’t take a technological revolution or a particularly big brain to cross continents. And they suggest an origin story for first migrants all across Asia: Perhaps some members of the group of primitive H. erectus that gave rise to the Dmanisi people also pushed farther east, where their offspring evolved into later, bigger-brained H. erectus on Java (at the same time as H. erectus in Africa was independently evolving bigger brains and bodies). “For me, Dmanisi could be the ancestor for H. erectus in Java,” says paleoanthropologist Yousuke Kaifu of the National Museum of Nature and Science in Tokyo.

In spite of the remaining mysteries about the ancient people who died on this windy promontory, they have already taught researchers lessons that extend far beyond Georgia. And for that, Lordkipanidze is grateful. At the end of a barbecue in the camp house here, he raised a glass of wine and offered a toast: “I want to thank the people who died here,” he said.

* In Dmanisi, Georgia


Athletes who don’t make eye contact / A Social Tip for Aspergers

If you are a sports fan, you may have noticed that many athletes don’t make eye contact during interviews; not even Peyton Manning, who ought to be one of the most confident humans on the planet. They look up at the sky, over the reporter’s shoulder, at the ground, sneaking in one or two glances at the camera or reporter. What’s up?

Maybe Aspergers need to take a page from the playbook – always wear sunglasses, deliver a string of clichés, and then make your exit.


“Bleach” Cure / Abuse of Autistic Children

Parents Give Autistic Children Bleach Enemas (Chlorine Dioxide)

A product sold on the internet claims to cure autism: called Miracle Mineral Solution (MMS), it sounds like any other quack remedy, but MMS can harm living things in serious ways. That’s because it’s a solution of 28 percent sodium chlorite which, when mixed with citric acid as instructed, forms chlorine dioxide (ClO2), a potent form of bleach used in the paper and fabric industries. It is a dangerous chemical concoction, and yet some parents are giving it to their children, both orally and through enemas, in the belief that it will cure their child of autism.
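For readers wondering how a “mineral solution” plus fruit acid yields bleach: acidifying sodium chlorite causes the chlorite ion to disproportionate into chlorine dioxide. The commonly cited net reaction (citric acid merely supplies the hydrogen ions) is:

5 ClO2⁻ + 4 H⁺ → 4 ClO2 + Cl⁻ + 2 H2O

In other words, the “activation” step that sellers describe is precisely the step that generates the industrial bleaching agent.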

What is the FDA doing to protect children from toxic chemicals and criminal abuse?

The FDA is aware of this abuse: in 2010 it issued a warning that the product turns into “a potent bleach” that “can cause nausea, vomiting, diarrhea, and symptoms of severe dehydration” if ingested. There has been one possible death, and children were taken from a home in Arkansas because the parents were suspected of using the solution on them, yet a number of people remain convinced that MMS provides cures. The underlying belief is that it will clear the body of parasites known as “rope worms” and other pathogens that they believe cause autism: this is dangerous quackery.

Funny how religious people are behind so much child abuse.

If this belief sounds religious in nature, it is. MMS was “discovered” by Jim Humble, a former Scientologist who started Genesis II (a new religion), for which he is the Archbishop. The church is a marketing tool for “the cure,” but the church site doesn’t sell MMS directly. It offers supplementary materials like a $199 “MMS HOME VIDEO COURSE” and information on expensive MMS seminars.

A woman named Kerri Rivera is a bishop in Humble’s church; she has authored a book titled Healing the Symptoms Known as Autism, which recommends “hourly doses” of chlorine dioxide and enemas to “kill pathogens in the brain.”

If you fed your child rat poison or antifreeze you’d be prosecuted for attempted murder. Hell, if you tried this “cure” on your dog, you’d be arrested for animal cruelty.

Why aren’t the “bleach cure” propagandists and parents charged with crimes against children? Why? Because Americans still hold to the religious notion that children are the property of their parents and have no right to protection from violent and “insane” adults. 

 

UPDATE: https://www.consumeraffairs.com

ConsumerAffairs posts articles on dangerous and fraudulent products, lists recall notices, and exposes scams!

“Miracle Mineral Solution” promoter convicted of selling bleach as a miracle cure

Seven-day federal trial ends in conviction for Louis Daniel Smith

05/29/2015 | ConsumerAffairs | Scams

 

Asperger Frustration / Social Typical Lack of Imagination


Comments on Western Civilization and U.S. culture in particular.

Imagination:

We actually hear this word used less and less, and its meaning is increasingly narrow: something little kids do with crayons, limited to copying animated movie, video game, and comic book characters. Adults are creative if they buy a scrapbook kit to document their daily routine in old-fashioned style, try recipes that contain exotic ingredients (which are standard fare in other areas of the globe), or amass a wardrobe of designer fakes. That covers women; I don’t know what men do to express themselves. Buy another gun, perhaps?

Adults today are enticed by the word innovation: “Be an innovator,” commercials say. But the “story” used to demonstrate innovation too often centers on selling cupcakes with blue icing and sprinkles, via the internet, to people all over the globe – with one-day delivery, which adds to the continued burning of fossil fuels, to global warming, and to the amount of trash that must be disposed of – and fulfills the wish of one or two people to become rich. (Sorry – to pursue their passion for serving humanity through innovative marketing of cupcakes.)

Innovation (in the U.S.) is directed toward new ways to part people from their money. Often this has been done by offering useful products, but contemporary innovation is more likely to focus on rebranding the same old manufactured food product as “Green,” “Organic,” or “Natural” on revised packaging advertising a New Look! What an asinine label, as if new packaging actually improved the product; it is most commonly employed to disguise a downgrade in product quality. Or on adding “profit gimmicks” (apps) to what were once gadgets (cell phones) meant to improve communication. American capitalism runs on one concept: you can always fool the American consumer.

Okay – this is all status quo, and Americans have been cheating people (Native Americans in particular) since Jamestown and Plymouth Colony. (Actually, those were English people.) Prior to that, the Age of Exploration was fueled by greed, with scientific curiosity only incidental and in the service of that greed. The imagination I’m talking about is not concerned with “dreams,” “fantasies,” or Vaudeville entertainment rebranded as “deep human insight.”

The imagination that I long to see is the ability for a person or group to “see” reality. In this case, archaic humans. The human being inside the skull; not the brain size, face shape, dentition or scattered stone flakes, but how they perceived their world. Not our world; theirs. Not as we would see their world if they were an imitation of us, which is what we certainly search for among the bones. Narcissism prevails: Where are we, Evolutionary Messiahs, in this jumble of refuse on ancient cave floors? Surely there are signs of our coming? After all, we are better, smarter, healthier and more important to the universe than any of these misshapen troglodytes and mumblers, none of whom are really fit to be our ancestors. Surely accounting systems, bureaucracies, military expenditures and weapons are more important than whatever came before.


What is required to imagine reality? Intellectual empathy – not for our in-group, people just like us in birth and class and wealth and appearance and beliefs, but the ability to shed our self-centered orientation; to imagine living contexts and the human beings shaped by those contexts. It’s empathy without a suitcase, instructions, or travel plans. Intellectual imagination = empathy that is deeper and more sustainable than “Empathy Bear feels your Pain.”

Imagination: It’s empathy without a suitcase, instructions or travel plans.

Power, Hierarchy, and Ritual Inversion / from “Primates” blog

From a super WordPress blog on “being primates” – BONES OF CULTURE

Primates

The human world is full of hierarchy. As I touched on last week, hierarchy is anthropologically necessary for groups over the size of a band (approximately 150, not four or five).

Culturally, some people have more status than others, and this leads to different levels of access to resources, from food and shelter, to knowledge and education, to political decision-making. In less complex societies, such status is usually achieved: people who have status earned it themselves. This contrasts with ascribed status, which is inherited. Usually, ascribed status is the hallmark of social classes, as it allows people’s children to inherit their wealth and power.

Hierarchy and Hegemony

Hierarchy does not always work as advertised. Sometimes we are called on to believe things that are not true or are not in our immediate best interest. Sometimes we are called on to “sacrifice for the greater good” when that “good” is simply support of the status quo. Hegemony occurs when people of higher social status and power manipulate the beliefs of those beneath them so that their power is protected. That doesn’t mean that those on the bottom are without recourse.

Cultures have ways of allowing people to blow off some of that steam without resorting to overwhelming violence (riots and civil war) and massive disruption. Everything from employees griping about bosses behind their backs (a venerable tradition everywhere) to the Occupy Wall Street movement counts as a way to express resistance to these power imbalances.

Power and Inversion

When the people on the downside of a power imbalance take part in resistance to it in a formal way that is sanctioned by a culture, this can take the form of “ritual inversion.” In ritual inversion, the rules of society are reversed or ignored. This can be anything from late-night comedians commenting on politicians as the voice of the “common man,” to a day at your job where the bosses serve the employees lunch with their own hands, to a protest march, where those without the social power to make political decisions express themselves and judge their country’s leaders.

Ritual inversions are different from open rebellion. These expressions of social power happen within specific contexts and follow their own rules: they may invert certain social power structures, but they do so within constraints of their own.

On a political march in America, for example, members may say and do things that rebel against cultural hegemony, but at the same time they are likely to avoid open violence, and lawbreaking happens in a ritual context. When protesters sit down in a road, blocking it, knowing that they will be arrested for their actions, it is an example of ritualized rebellion. People of less social status are standing up to those in power, showing their lack of fear and using their own social power to publicly call that power structure into question. At the same time, they are not rioters, using indiscriminate violence to try to tear down the larger system. Protesters are not going to war; they are engaging in ritualized action.

What is Ritual?


A recent addition to American “rites of passage”

Ritual doesn’t refer only to what happens in a church; it is any prescribed set of actions. Rituals allow for the limited rewriting of the rules of culture within a certain context. Sure, ritual can mean those formalized, stiff ways of talking and acting in certain social situations. It can be the activity of high mass at Christmas, but it can also be a high school graduation, or even something as simple as the “ritual” of meeting someone new, introducing yourself, and shaking hands.

Rituals are activities that change the social world. The religious rituals that we are most familiar with are only a subset of these, often changing the social world by incorporating deities and such into “social” relationships.

Rituals are specific contexts where the rules of culture are changed for a limited time…and they can also, through their completion, reinforce cultural rules, either old ones (status quo) or new ones. A high school graduation changes the social status of the graduates, and is also one of the ways that students can enter adulthood. It’s no coincidence that the end of high school roughly coincides with the transition of children to adults at age eighteen. High school graduation is a rite of passage in Western culture.

Ritual Inversion

When we take the rules of culture and turn them on their heads, but only in specific circumstances, we are usually working with “ritual inversions.” A protest movement like Occupy Wall Street, which follows social rules even while breaking written laws, is a perfect example of this process. So, while we can talk about the necessity of hierarchy among humans living in groups larger than about 150, it is important to recognize that human cultures have methods for both changing and critiquing power structures in ways that prevent total collapse, widespread violence, and civil war.

“Optimism is Cowardice”

– Oswald Spengler

Actually, total collapse, widespread violence and civil war are everyday occurrences in the modern world; our methods of “cultural adjustment” are quite ineffective.

Modern Social Typicals fear change: the just-completed U.S. Presidential election has produced social hysteria – half the voters voted for change (but only after more than half a century of complaints about government) and scared the SHIT out of the other half!