Judeo-Christian Morality / Torture children, just as the Bible commands

If any man has a stubborn and rebellious son who will not obey his father or his mother, and when they chastise him, he will not even listen to them, then his father and mother shall seize him, and bring him out to the elders of his city at the gateway of his home town. And they shall say to the elders of his city, “This son of ours is stubborn and rebellious, he will not obey us, he is a glutton and a drunkard.” Then all the men of his city shall stone him to death; so you shall remove the evil from your midst, and all Israel shall hear of it and fear. (Deuteronomy 21:18–21)

The Bible is a pathological human-hating document that advocates unbelievable levels of brutality and injustice. Where do people think child abuse comes from?

The people of Samaria must bear their guilt, because they have rebelled against their God. They will fall by the sword; their little ones will be dashed to the ground, their pregnant women ripped open. (Hosea 13:16)

Now go and attack Amalek, and utterly destroy all that they have, and do not spare them. But kill both man and woman, infant and nursing child, ox and sheep, camel and donkey. (1 Samuel 15:3)


Part 2 Human self-domestication / Martin Brüne

Part 2

Human self-domestication – the development of an idea

Charles Darwin was the first to systematically examine biological changes in species under artificial breeding conditions. Even though he did not refer to the question of human self-domestication in his two volumes of The Variation of Animals and Plants under Domestication [2], Darwin proposed clear definitional criteria for the process of domestication. He emphasized (1) that the domestication of animals is more than taming, (2) that it represents a goal-oriented process for human purposes, (3) that the variability of physical and ‘mental’ characteristics is greater in domesticated species than in their wild ancestors, including the occurrence of dwarfism and gigantism, (4) that the behavioural plasticity and educability of domesticated species is greater, and (5) that the brain size of domesticated animals is smaller than that of their wild ancestors.

In spite of these unequivocal definitional criteria, Darwin was remarkably vague regarding the possibility that humans could have undergone domestication. In The Descent of Man [11], he wrote the following (the most critical phrases are highlighted in italics by the author): “It is, nevertheless, an error to speak of man, even if we look only to the conditions to which he has been exposed, as ‘far more domesticated’ (Blumenbach 1865) than any other animal. … In another and much more important respect, man differs widely from any strictly domesticated animal; for his breeding has never long been controlled, (this is not true! The social hierarchy is a reproductive selection machine!) either by methodical or unconscious selection. No race or body of men has been so completely subjugated by other men, as that certain individuals should be preserved, and thus unconsciously selected, from somehow excelling in utility to their masters. Nor have certain male and female individuals been intentionally picked out and matched, except in the well known case of the Prussian grenadiers;” (p. 29) … By contrast, in another paragraph Darwin stated: “We might, therefore, expect that civilized men, who in one sense are highly domesticated, would be more prolific than wild men. It is also probable that the increased fertility of civilised nations would become, as with our domestic animals, an inherited character …” (p. 45–46). (Darwin was a man of his time and class, likely oblivious to de facto social selection. People married and reproduced within their “proper place” on the pyramid.)

With respect to brain size, however, Darwin argued that, in contrast to domesticated animals, the human brain and skull have increased over time. Nevertheless, in the chapter on human races, Darwin reiterates that “man in many respects may be compared with those animals which have long been domesticated, …” (p. 178); and later: “With man no such question can arise, for he cannot be said to have been domesticated at any particular period” (p. 183). And finally: “With our domestic animals a new race can readily be formed by carefully matching the varying offspring from a single pair, or even from a single individual possessing some new character; but most of our races have been formed, not intentionally from a selected pair, but unconsciously by the preservation of many individuals which have varied, however slightly, in some useful or desired manner” (p. 188). In summary, although Darwin did not hold a clear position concerning the possibility that domestication could have taken place in Homo sapiens, he pointed to the fact that no scientific proof in favour of such a hypothesis existed, particularly due to a lack of goal-directedness or conscious selection of traits. However, he also made clear that humans might share some characteristics typical of domesticated animals, such as increased fertility.

In the biological literature following Darwin, the term “domestication” became increasingly poorly defined. The criterion of intentional and goal-directed selection, which according to Darwin’s definition was critical for domestication, was largely replaced, at least with respect to humans, by the equation of culture and civilisation with domestication. (One example of intentional goal-directedness: the harem – females selected for social position, connection to allies or subjugated nations, tameness, and beauty, and continually replenished with youthful baby-producers. A broad “blood” base (genetic pool) was available: a veritable farm for producing “top males” for the continuation of a dynasty.)

An extensive evaluation of the topic was put forward by Eugen Fischer in his essay Die Rassenmerkmale des Menschen als Domestikationserscheinungen (“The racial characteristics of man as a result of domestication”, 1914) [12]. Several years later, Fischer became known for his publication of Grundriß der menschlichen Erblichkeitslehre und Rassenhygiene (“Outline of human genetics and racial hygiene”), which he edited together with Erwin Baur and Fritz Lenz in 1921 [13]; all three authors later became leading authorities in Nazi eugenics and supported the legalisation of sterilisation and the dismantling of welfare institutions in order to reinstitute the laws of natural selection [10].

(A prime human conceit that has ravaged the planet: we are so intelligent that our blundering efforts at reshaping natural processes and entire ecologies are de facto improvements on nature. WE ARE NOT THAT SMART!)

In his essay on the domestication of man, Fischer suggested that domestication should be defined as a condition in which “the nutrition and reproduction has been influenced over a number of generations by humans” (author’s translation). In line with these greatly relaxed definitional criteria of domestication, Fischer reasoned that humankind should be considered domesticated from the beginning of its existence. (We were never wild animals?) Fischer considered racial differences to be the result of domestication, because “almost all characteristics of human races could be found in domesticated animals, except for the low variability of the external ear and the lack of dappling of the skin or hair.” Interestingly, Fischer regarded the blond hair, blue eyes, and bright skin colour of Europeans as signs of domestication-induced partial albinism, and likewise counted as domestication effects the dwarfism and gigantism of some populations, as well as racial differences in the disposition for obesity, temperament, character, and intelligence. Even “the permanent female breast indicates domestication much like the udder of domesticated cattle” (author’s translation) [12]. However, the point that “Aryans” should accordingly be carriers of outstanding signs of domestication was apparently overlooked, a point to which I will return in the discussion. Remarkably, however, the very same attitude towards domestication and racial hygiene, including support of sterilisation, was also found in leading Jewish scientists such as Richard Goldschmidt, who was Professor at the Kaiser Wilhelm Institute for Biology in Berlin-Dahlem [14]. Goldschmidt argued that the abandonment of natural selection and the “radical extermination of the unfit” (Goldschmidt, 1933, p. 214; author’s translation) ought to be replaced by positive and negative eugenic measures (apparently, Goldschmidt later realised that the Nazi regime held an even more radical position regarding eugenics; he was expatriated by the Nazis in 1935 and appointed Professor of Genetics and Cytology at Berkeley, CA). Even the anthropologist Franz Boas, who was not a racist and strongly opposed the Nazi regime, described curly hair, variation in stature, and increasing or decreasing pigmentation of the skin as signs of human domestication, but was inconclusive about how much environmental and genetic factors contributed to these variations [15]. Thus, although Fischer and colleagues may, to a certain degree, have had an opportunistic interest in mixing scientific ideas with political claims, the association of the self-domestication hypothesis with eugenic consequences during the 1930s was not only an issue for racist scientists. (The mixing of non-scientific social, political, and religious beliefs into science has not disappeared in psychology. Biological sources are still sought to justify discrimination. These prejudices do not negate the possibility of domestication, but they have unfortunately made it a “shady” subject for study. The same problem taints psychology and its support of, and contributions to, the American eugenics movement.)

In the 1920s, another, entirely independent biological concept was adopted from embryology to explain human self-domestication. The Dutch anatomist Louis Bolk (1926) [16] postulated that adult humans would resemble juvenile apes, and that the retention of juvenile characteristics of the ancestral species into adulthood of the descendant, referred to as “foetalisation” or “neoteny”, could be associated with the process of domestication. For example, the zoologist Max Hilzheimer (1926/1927) argued that “the recent European should be considered the most progressively domesticated form whereas Neanderthals were much less juvenilised” (author’s translation) due to the more pronounced retention of juvenile traits in anatomically modern humans compared to Neanderthals (at that time, it was not known that Neanderthals were not ancestral to anatomically modern humans) [17]. The parallel drawn between domestication and neoteny is interesting in light of the currently resurrected debate about human self-domestication (see below).

In the 1940s, Nobel laureate Konrad Lorenz published some speculations on the relation of human psychological capacities to the process of domestication. In his article Durch Domestikation verursachte Störungen arteigenen Verhaltens (“Domestication-induced disorders of species-typical behaviour”, published in 1940), Lorenz drew parallels between the living conditions of civilised inhabitants of metropolitan areas and those of domesticated animals, which he thought indicated signs of degeneration [18]. (The assumption of “degeneration” damaged scientific research.)

Lorenz proposed that the intensity and frequency of instinctual patterns of behaviour were altered under these conditions, leading to a hypertrophy of some instincts due to a lowered releasing threshold, and to a functional disruption of species-typical behaviours. Besides the alleged domestication-associated morphological features in human beings, such as shortening of the extremities and of the base of the skull, atony of the muscles, and obesity, which he later subsumed under the term ‘Verhausschweinung’ (a term hard to translate that roughly compares the physical appearance of human beings with that of domesticated pigs), Lorenz described a domestication-associated diminished social sensitivity and a functional disruption of love, marriage, and the “copulation drive”. Apart from his appallingly coarse language, which conformed to the writing style of that time, Lorenz did not refrain from discussing racial-hygienic consequences such as the “extermination of ethically inferior people.” Moreover, and from our perspective today virtually ridiculous, Lorenz proposed a positive selection for Anständigkeit (decency) and for the physical ideal of the ancient Greeks. (As modern western “civilized” and Christian people, we applaud ourselves for having high ethical and moral standards, but what is the underlying goal of military, economic, and cultural invasion by any nation? It is murder, rape, and pillage – the virtual extinction of peoples and cultures – on a massive industrial scale. “Democratization = Domestication.” How many so-called primitive tribal peoples, religious minorities, and “outgroups” labeled as enemies – any enemy at all – are “cleansed” of their heritage, values, beliefs, and practices by military, social, and corporate actions? Civilian casualties and millions of displaced refugees are hypocritically disguised as the inevitable consequence of the mysterious “fog of war.”)

By contrast, in his chapter Psychologie und Stammesgeschichte (“Psychology and Phylogeny”, first published in 1943) [19], Lorenz took up Arnold Gehlen’s idea that human beings are specialised in being non-specialised. Gehlen had acknowledged Bolk’s and Hilzheimer’s hypotheses as scientific proof for his thesis of man as “Mängelwesen” (“deficient being”). Following Gehlen, Lorenz highlighted man’s lack of physiological specialisation while rejecting the hypothesis of deficiency. In contrast to his earlier, exclusively negative appraisal, Lorenz now accepted the hypothesis of domestication-associated neoteny, which accounted for the positively asserted human “Weltoffenheit” (“world-openness”) and persisting explorative behaviour. This was new, since he now ascribed to neoteny a variety of human behavioural and psychological features in addition to physical characteristics. Even in his later writings, however, Lorenz stuck to his culturally pessimistic attitude, while partially backing away from his writings during the Nazi regime.

Since the 1960s, both the foetalisation and the domestication hypotheses concerning humans have been refuted by various scientists. Starck (1962), for example, objected that Bolk’s hypothesis had been so broadly accepted simply because the many problems of explaining human evolution could apparently be resolved with such ease. According to Starck, hairlessness and the reduction of pigmentation of the skin (a geographic phenomenon due to varying solar radiation) were more reliably explained by chance mutations than by foetalisation. Moreover, the retention of juvenile characters (i.e. neoteny) did not sufficiently explain the increased variation of traits under domestication [20]. In addition, Herre and Roehrs (1971) rejected the human self-domestication hypothesis for its lack of goal-directedness and artificial selection of traits; nor was there evidence for a “wild” ancestral human species from which a domesticated Homo sapiens could have derived. They further argued that a reduction of instinctual patterns of behaviour in human beings could be better explained by more sophisticated cortical control than by domestication [21]. (Objections based on the lack of scientific evidence at the time, and on resistance to Homo sapiens the animal.)

As with many scientific ideas, the hypothesis of human self-domestication has recently been revived as a possible explanation of changes in human physical traits since the late Pleistocene. These changes include the reduction of body size and decrease in skeletal robusticity; modifications in cranial and dental features, including reduction in cranial capacity, shortening of the facial region of the skull, and maleruption of teeth; and reduction in sexual dimorphism. In contrast to earlier biological writings, other domestication-associated features observed in animals, such as increased variation in skin colour, increased fat storage, earlier sexual maturation and activity, and reduced motor activity, are not discussed with respect to human self-domestication in recent accounts [1]. It is indeed plausible to assume that these changes could have taken place due to the creation of an artificially protective environment after humans adopted a more sedentary lifestyle in the Neolithic period, thereby relaxing natural selection pressures. (But selection pressures also changed and intensified: a new urban and dietary environment required behavioral and reproductive adaptation. Reproduction became controlled by social customs, class barriers to the choice of reproductive partners, and selection of females for tameness.)

Similarly, the idea that foetalisation and domestication could be related has recently been highlighted in a seminal paper comparing anatomical features and behaviour of apes and humans [3]. The authors argue that changes in the social structures of early humans, compared to our closest living relative, the chimpanzee, could have favoured selection against aggression, and that such selection was accompanied by a reduction of sexual dimorphism in humans and the retention of juvenile characteristics in body shape and behaviour. Interestingly, a parallel development has been proposed in the bonobo, which displays more neotenic physical features and is much less aggressive than the common chimpanzee [3].

From a biological perspective, the greatest dispute with regard to physical changes in anatomically modern humans akin to domestication pertains to a slight but measurable decline of brain volume from around 1,400 cm³ to roughly 1,300 cm³, which could be interpreted as further support for the human self-domestication hypothesis. However, this decline in brain volume was accompanied by a reduction in body size, such that the allometric brain-body relation remains unchanged [22]. In contrast to humans, domesticated animals show a disproportionately large decline in brain size of up to 30%, especially in the sensory perceptual centres, compared with their wild ancestral species; no such pronounced decline has convincingly been demonstrated in any human population.
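To make the allometric claim concrete, here is a back-of-the-envelope illustration using Jerison’s encephalization quotient (EQ), one standard way of expressing the brain-body relation; the 2/3 exponent is Jerison’s classic mammalian value, assumed here purely for illustration, since the source cited above [22] may fit a different exponent:

$$\mathrm{EQ} = \frac{E}{0.12\,P^{2/3}}$$

where $E$ is brain mass and $P$ is body mass, both in grams (brain volume in cm³ is a close stand-in for brain mass). EQ stays constant when brain and body shrink along the same allometric curve, i.e. when $E_1/E_0 = (P_1/P_0)^{2/3}$. With $E_1/E_0 = 1300/1400 \approx 0.93$, the matching body-mass ratio is $P_1/P_0 \approx 0.93^{3/2} \approx 0.90$: a roughly 10% reduction in body mass would leave the brain-body relation unchanged even as absolute brain volume fell.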

We have a huge stumbling block in the investigation of self-domestication in humans: Which “human” is our wild ancestor?

Part 3 next…

How Animals Think / Review of Book by Frans de Waal

How Animals Think

A new look at what humans can learn from nonhuman minds

Alison Gopnik, The Atlantic 

Review of: Are We Smart Enough to Know How Smart Animals Are?

By Frans de Waal / Norton

For 2,000 years, there was an intuitive, elegant, compelling picture of how the world worked. It was called “the ladder of nature.” In the canonical version, God was at the top, followed by angels, who were followed by humans. Then came the animals, starting with noble wild beasts and descending to domestic animals and insects. Human animals followed the scheme, too. Women ranked lower than men, and children were beneath them. The ladder of nature was a scientific picture, but it was also a moral and political one. It was only natural that creatures higher up would have dominion over those lower down. (This view remains dominant in American thinking: “The Great Chain of Being” is still with us and underlies social reality)

Darwin’s theory of evolution by natural selection delivered a serious blow to this conception. (Unless one denies evolution)  Natural selection is a blind historical process, stripped of moral hierarchy. A cockroach is just as well adapted to its environment as I am to mine. In fact, the bug may be better adapted—cockroaches have been around a lot longer than humans have, and may well survive after we are gone. But the very word evolution can imply a progression—New Agers talk about becoming “more evolved”—and in the 19th century, it was still common to translate evolutionary ideas into ladder-of-nature terms.


Modern biological science has in principle rejected the ladder of nature. But the intuitive picture is still powerful. In particular, the idea that children and nonhuman animals are lesser beings has been surprisingly persistent. Even scientists often act as if children and animals are defective adult humans, defined by the abilities we have and they don’t. Neuroscientists, for example, sometimes compare brain-damaged adults to children and animals.

We always should have been suspicious of this picture, but now we have no excuse for continuing with it. In the past 30 years, research has explored the distinctive ways in which children as well as animals think, and the discoveries deal the coup de grâce to the ladder of nature. (Not in psychology!) The primatologist Frans de Waal has been at the forefront of the animal research, and its most important public voice.

In Are We Smart Enough to Know How Smart Animals Are?, he makes a passionate and convincing case for the sophistication of nonhuman minds.

De Waal outlines both the exciting new results and the troubled history of the field. The study of animal minds was long divided between what are sometimes called “scoffers” and “boosters.” Scoffers refused to acknowledge that animals could think at all: behaviorism—the idea that scientists shouldn’t talk about minds, only about stimuli and responses—stuck around in animal research long after it had been discredited in the rest of psychology. (Are you kidding? “Black Box” psychology is alive and well, especially in American education!) Boosters often relied on anecdotes and anthropomorphism instead of experiments. De Waal notes that there isn’t even a good general name for the new field of research. “Animal cognition” ignores the fact that humans are animals too. De Waal argues for “evolutionary cognition” instead.

Psychologists often assume that there is a special cognitive ability—a psychological secret sauce—that makes humans different from other animals. The list of candidates is long: tool use, cultural transmission, the ability to imagine the future or to understand other minds, and so on. But every one of these abilities shows up in at least some other species in at least some form. De Waal points out various examples, and there are many more. New Caledonian crows make elaborate tools, shaping branches into pointed, barbed termite-extraction devices. A few Japanese macaques learned to wash sweet potatoes and even to dip them in the sea to make them more salty, and passed that technique on to subsequent generations. Western scrub jays “cache”—they hide food for later use—and studies have shown that they anticipate what they will need in the future, rather than acting on what they need now.

From an evolutionary perspective, it makes sense that these human abilities also appear in other species. After all, the whole point of natural selection is that small variations among existing organisms can eventually give rise to new species. Our hands and hips and those of our primate relatives gradually diverged from the hands and hips of common ancestors. It’s not that we miraculously grew hands and hips and other animals didn’t. So why would we alone possess some distinctive cognitive skill that no other species has in any form?

De Waal explicitly rejects the idea that there is some hierarchy of cognitive abilities. (Thank you!) Nevertheless, an implicit tension in his book shows just how seductive the ladder-of-nature view remains. Simply saying that the “lower” creatures share abilities with creatures once considered more advanced still suggests something like a ladder—it’s just that chimps or crows or children are higher up than we thought. So the summary of the research ends up being: We used to think that only adult humans could use tools/participate in culture/imagine the future/understand other minds, but actually chimpanzees/crows/toddlers can too. Much of de Waal’s book has this flavor, though I can’t really blame him, since developmental psychologists like me have been guilty of the same rhetoric.

As de Waal recognizes, a better way to think about other creatures would be to ask ourselves how different species have developed different kinds of minds to solve different adaptive problems. (And – How “different humans” have done, and continue to do, the same!) Surely the important question is not whether an octopus or a crow can do the same things a human can, but how those animals solve the cognitive problems they face, like how to imitate the sea floor or make a tool with their beak. Children and chimps and crows and octopuses are ultimately so interesting not because they are mini-mes, but because they are aliens—not because they are smart like us, but because they are smart in ways we haven’t even considered. All children, for example, pretend with a zeal that seems positively crazy; if we saw a grown-up act like every 3-year-old does, we would get him to check his meds. (WOW! Nasty comment!)

Sometimes studying those alien ways of knowing can illuminate adult-human cognition. Children’s pretend play may help us understand our adult taste for fiction. De Waal’s research provides another compelling example. We human beings tend to think that our social relationships are rooted in our perceptions, beliefs, and desires, and our understanding of the perceptions, beliefs, and desires of others—what psychologists call our “theory of mind.” (And yet horrible behavior toward other humans and animals demonstrates that AT BEST, this “mind-reading” simply makes humans better social manipulators and predators.) In the ’80s and ’90s, developmental psychologists, including me, showed that preschoolers and even infants understand minds apart from their own. But it was hard to show that other animals did the same. “Theory of mind” became a candidate for the special, uniquely human trick. (A social conceit)

Yet de Waal’s studies show that chimps possess a remarkably developed political intelligence—they are profoundly interested in figuring out social relationships such as status and alliances. (A primatologist friend told me that even before they could stand, the baby chimps he studied would use dominance displays to try to intimidate one another.) It turns out, as de Waal describes, that chimps do infer something about what other chimps see. But experimental studies also suggest that this happens only in a competitive political context. The evolutionary anthropologist Brian Hare and his colleagues gave a subordinate chimp a choice between pieces of food that a dominant chimp had seen hidden and other pieces it had not seen hidden. The subordinate chimp, who watched all the hiding, stayed away from the food the dominant chimp had seen, but took the food it hadn’t seen. (A typical anecdotal factoid that proves nothing)

Anyone who has gone to an academic conference will recognize that we, too, are profoundly political creatures. We may say that we sign up because we’re eager to find out what our fellow Homo sapiens think, but we’re just as interested in who’s on top and where the alliances lie. Many of the political judgments we make there don’t have much to do with our theory of mind. We may defer to a celebrity-academic silverback even if we have no respect for his ideas. In Jane Austen, Elizabeth Bennet cares how people think, while Lady Catherine cares only about how powerful they are, but both characters are equally smart and equally human.


Of course, we know that humans are political, but we still often assume that our political actions come from thinking about beliefs and desires. Even in election season we assume that voters figure out who will enact the policies they want, and we’re surprised when it turns out that they care more about who belongs to their group or who is the top dog. The chimps may give us an insight into a kind of sophisticated and abstract social cognition that is very different from theory of mind—an intuitive sociology rather than an intuitive psychology.

Until recently, however, there wasn’t much research into how humans develop and deploy this kind of political knowledge—a domain where other animals may be more cognitively attuned than we are. It may be that we understand the social world in terms of dominance and alliance, like chimps, but we’re just not usually as politically motivated as they are. (Obsession with social status is so pervasive that it DISRUPTS neurotypical ability to function!) Instead of asking whether we have a better everyday theory of mind, we might wonder whether they have a better everyday theory of politics.

Thinking seriously about evolutionary cognition may also help us stop looking for a single magic ingredient that explains how human intelligence emerged. De Waal’s book inevitably raises a puzzling question. After all, I’m a modern adult human being, writing this essay surrounded by furniture, books, computers, art, and music—I really do live in a world that is profoundly different from the world of the most brilliant of bonobos. If primates have the same cognitive capacities we do, where do those differences come from?

The old evolutionary-psychology movement argued that we had very specific “modules,” special mental devices, that other primates didn’t have. But it’s far likelier that humans and other primates started out with relatively minor variations in more-general endowments and that those variations have been amplified over the millennia by feedback processes. For example, small initial differences in what biologists call “life history” can have big cumulative effects. Humans have a much longer childhood than other primates do. Young chimps gather as much food as they consume by the time they’re 5. Even in forager societies, human kids don’t do that until they’re 15. This makes being a human parent especially demanding. But it also gives human children much more time to learn—in particular, to learn from the previous generation. (If that generation is “messed up” to the point of incompetence, the advantage disappears and disaster results – which is what we see in the U.S. today). Other animals can absorb culture from their forebears too, like those macaques with their proto-Pringle salty potatoes. But they may have less opportunity and motivation to exercise these abilities than we do.

Even if the differences between us and our nearest animal relatives are quantitative rather than qualitative—a matter of dialing up some cognitive capacities and downplaying others—they can have a dramatic impact overall. A small variation in how much you rely on theory of mind to understand others as opposed to relying on a theory of status and alliances can exert a large influence in the long run of biological and cultural evolution.

Finally, de Waal’s book prompts some interesting questions about how emotion and reason mix in the scientific enterprise. The quest to understand the minds of animals and children has been a remarkable scientific success story. It inevitably has a moral, and even political, dimension as well. The challenge of studying creatures that are so different from us is to get into their heads, to imagine what it is like to be a bat or a bonobo or a baby. A tremendous amount of sheer scientific ingenuity is required to figure out how to ask animals or children what they think in their language instead of in ours.

At the same time, it also helps to have a sympathy for the creatures you study, a feeling that is not far removed from love. And this sympathy is bound to lead to indignation when those creatures are dismissed or diminished. That response certainly seems justified when you consider the havoc that the ladder-of-nature picture has wrought on the “lower” creatures. (Just ask ASD and Asperger children how devastating this lack of “empathy” on the part of the “helping, caring, fixing” industry is.)

But does love lead us to the most-profound insights about another being, or the most-profound illusions? Elizabeth Bennet and Lady Catherine would have differed on that too, and despite all our theory-of-mind brilliance, (sorry – that’s ridiculous optimism) we humans have yet to figure out when love enlightens and when it leads us astray. So we keep these emotions under wraps in our scientific papers, for good reason. Still, popular books are different, and both sympathy and indignation are in abundant supply in de Waal’s.

Perhaps the combination of scientific research and moral sentiment can point us to a different metaphor for our place in nature. Instead of a ladder, we could invoke the 19th-century naturalist Alexander von Humboldt’s web of life. We humans aren’t precariously balanced on the top rung looking down at the rest. (Tell that to all those Euro-American males who dictate the socio-economic-scientific terms of “humans who count”!) It’s more scientifically accurate, and more morally appealing, to say that we are just one strand in an intricate network of living things.

About the Author

Alison Gopnik is a professor of psychology and an affiliate professor of philosophy at UC Berkeley.

Once upon a time, I wrote prose / Walking to Sanity

I congratulate myself on becoming mature and gently old, on surmounting difficulty; understanding my fate, and letting up, letting go, but truth is, I’m a liar who has pushed the past away, across the border of my small world. Protected by miles of badland emptiness, a curtain of silence has dropped around me; the outside world doesn’t exist except at set frequencies along the electromagnetic spectrum; television, the radio, the internet, and down deep, that’s the way I want it. I crawled to this place, breathing, and no more. I walked and walked the hills, each step forcing a breath, like a respirator powered by my feet hitting the ground. If I had quit walking I would have died.

A wildlife rescue takes injured raccoons, snakes, and birds and once fixed or repaired, returns them to the wild, whatever that means. But some birds will not be birds again, living with wings broken, bent to sickening angles, improper geometry, hopping, not flying: broken into submission. Dogs travel to new homes, to live skittish, nerve-wracked, terrified, and distrustful lives; barking, scratching, insane human lives. Some animals go crazy, like a chimpanzee wrecked by cruelty, by its forced employment in labs or zoos or circuses, tortured by people whose job it is to twist and maim other beings without conscience or regret; psychologists, cosmetics-makers. Children are disobedient rats. Women redden their lips with monkey blood.

What suffering creatures know, when subjected to human perversion, every minute of their existence, is that even if they were to be set free – they will never be free.

An old soul of a chimpanzee discovers grass, a tree, air and sky, for the first time: old, too old – just a breath of what might have been, too late, and we congratulate our compassion.

I have created my own rescue, a shelter; it is a very pretty, very quiet location somewhere outside of time, outside of America, my house old, pre-me, built long before I was born. Other children played in the dirt, grown by Wyoming, shaped by wind, yellow dust in their lungs, cool air sinking from summer storms, building character. There is a character that I play; the old lady on the block who gardens, tends beauty, at arm’s reach, under my feet, a profusion of living things tangled, overgrown, so unlike the powdery banded desert. People like my yard and my face, but they don’t know that I’m an injured animal, wings broken and limping toward the wild. Salvation is instinctual, but sanity is earned by walking, walking the world away.

 

 

Using Numbers to Express Emotion / Chat Room Chat

(My comments in olive green.) A common difficulty in communication between Asperger’s individuals and social typicals is that specific words and word concepts (such as emotions, empathy) do not have common or shared “meanings.”
This is not superficial. It reflects differences in the act of communication itself; in the (expected) intent, utility, and outcomes of communication. Social use of language often seems “nebulous,” “self-referencing,” “vague,” and “pointless” to a concrete visual thinker, whose brain is set to problem-solving mode against a background of logical “natural” structures.
Social typical language is about human relationships that define status on a social hierarchy – a system driven by rigid rules and yet perpetually “under negotiation” at personal, group, and class boundaries.

In essence, Asperger types and social types are not talking to each other at all, but about distinct mental “universes” that arise from very different perceptions of the environment – “reality.”

Trying to establish “contact” with Asperger’s individuals by forcing them to “reveal social-emotional states” is counterproductive; in fact it is outside our experience of reality. Trying to establish contact with social typicals by “sharing” the fascinating facts of physics, steam locomotion, forest growth cycles or geologic processes is equally hopeless.

No, I don’t have a solution. (Perhaps a sense of humor of the “absurd” kind aids tolerance, at least LOL)

Numbers / Empathy / Emotion

An Asperger chat line exchange: 

MOMBOY: My mom mentioned I don’t have much empathy. I told her it was a useless emotion. (Empathy: Is it an emotion, a behavior, a brain function, a concept, or science fiction?) I said that I do help people sometimes to make up for it, minus the emotional baggage. I told her that I usually have an empathy of 2/100. I said it peaks at 20/100. Then she laughed that I use numbers to describe it. She implied that I pull these numbers out of thin air. I feel like these numbers are ways to express approximations. Does anyone else use numbers to describe feelings? Is this funny to you?

MR FORMAL: What works for you, works for you, but when communicating with others you should consider trying to stick with known contexts (words). Otherwise you get weird looks. The point is, there is an amount of conformism that should be observed in order to cleanly operate within society. Humans need a consensus in many different areas in order to get along with each other. Communication is one of the most basic consensus items we use to attain this. If you are communicating in another language (foreign, numerical, or invented) then you will be misunderstood by the majority around you. Predict your future if you stick with describing yourself in unconventional ways while in the company of others. The case for social conformity, at least in public.

MOMBOY: You make a good point. However, this is my mother so I can get away with it.  And saying “I don’t possess empathy for other humans and you’re not smart enough to understand why” would not score me any more social points than my fractions would. The use of numbers to disguise true feelings and thoughts?

MS LOGIC: It depends on how you think. Being left brained I always use numbers to describe things, but most people don’t. Being a person of math most things are approximations, only theorems are absolute after all. Approximations = shades of emotion.

MOMBOY: When I think of something that has a quantitative value (amount of empathy or something), I see a glass of liquid and I gauge how full it is. Numeric is then the most logical way. Visual conversion of quantity.

MS LOGIC: I don’t see a glass of liquid, but I agree that numbers tend to be the most logical representation. That way you can organize things based on situation. Just to clarify, my give-a-shit meter works on a percentage scale. Your fraction based system is very similar though. (Fractions are percentages!) Visualizing quantities – numbers.

MR FORMAL: A perfect example of communication building based on associating language with context. Keep this up and you guys will be speaking half in numbers and half in words.

MOMBOY: I don’t use the numbers in speech unless someone asks me how much empathy I feel for a certain situation. That has yet to happen, so for now my mind just uses the numbers as markers. The numbers gauge things comparatively against what you think about other events, not comparing what you think to what others think.

MR FORMAL: You misunderstand me, however, I was thinking along the lines of you both developing a new way of expressing something through a hybridization of an old language and a new element. And I was running a social simulation in my head where human language used numbers instead of words to describe feelings. Creative compromise.

NEWBOY: The numbers are useless without context. 1/100 could be referencing happiness, grief, or relief. I don’t think a language based entirely off of numbers is practical. Unless of course you change certain numbers to be words. As in when I say “5” it holds the same meaning as the word empathy. That would be rather clumsy though. There are languages based on numbers – they’re called CODES.

MOMBOY: There is little difference between how we use numbers and how other people use terms such as a lot, somewhat, strongly, very etc…

NEWGIRL: We just use the numbers because they are more concise and pleasing to us. In reality there is almost no difference between these two: “I strongly empathize with XXX” and “I empathize 90% with XXX,” even though it may sound foreign to some people.

MOMBOY: Exactly. Numbers allow thoughts to be clarified in ways that are often notable. By supporting someone’s actions 85 percent (instead of saying you support them greatly), the 15 percent left speaks volumes. It leaves room for a lecture, (fudging between saying what you think and limiting the social blowback?) yet still conveys that you are not in terrible opposition to the person’s actions; thus they needn’t be too troubled by your critique. Numbers (in the right context of course) can say a lot to people.
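(A minimal sketch, in Python, of the numbers-for-words idea this exchange keeps circling. Only the “strongly” ≈ 90% pairing comes from the messages above; the other cutoffs and wording are invented for illustration.)

```python
# Hypothetical mapping from a 0-100 intensity scale to verbal quantifiers.
# Only the "strongly" ~ 90 pairing is taken from the chat; the remaining
# cutoffs are invented for illustration.
VERBAL_SCALE = [
    (90, "strongly"),
    (60, "very much"),
    (40, "somewhat"),
    (10, "a little"),
    (0, "not at all"),
]

def verbalize(percent: float) -> str:
    """Translate a numeric intensity into the nearest verbal quantifier."""
    for cutoff, word in VERBAL_SCALE:
        if percent >= cutoff:
            return word
    return "not at all"

print(verbalize(90))  # -> "strongly"
print(verbalize(85))  # -> "very much"; the withheld 15 percent "speaks volumes"
```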

MS. DAISY: Makes perfect sense to me. I usually either feel an emotion or I don’t, so I don’t think that quantifying them would help me very much. With that in mind, though, I think it’s a very useful concept, at least for introspection. I doubt most people would appreciate the numeric representation of emotions, as it probably comes off as being a bit cold.

MOMBOY: The touchy feely types may not appreciate such a mathematical approach to detecting emotion.

JOINER: To me, emotion is like a smoke or a fog that moves a bit like liquid. Emotion is very ethereal. Feeling emotion is like a mystical treasure. But deciding how important it is? That must be done with the most precise logic. Very visual experience – of a physical state.

MR MATH: I describe feelings with numbers most of the time as well because it’s easier for me to explain feelings this way. I suppose it just depends on whom you’re talking to whether they’ll appreciate it or not. Most of the people in my life have gotten used to it. I can explain my interest in someone in terms of “It’s 10% friendship, 20% …” Having to translate physical feeling (emotion) into numbers in order to describe it.

MR FORMAL: I have a couple of published papers on quantification of soft cognition: beliefs, hunches, biases, assumptions, uncertainty, emotional mood, etc. This is for research related to applications in artificial intelligence. Some of my recent research is based upon a “calculus” I have developed – Bias-Based Reasoning, which mathematizes mental percepts.

MS. DAISY: Sometimes I do quantify a feeling in terms of how much I would spend on something. (Me) This I can relate to: if I want to limit calories to 1,200/day, the calories automatically convert to dollars and I “spend them” on food. It’s very different to think of a 400-calorie chocolate bar costing $4.00 out of a $12.00 budget, which shows that it’s not a “bargain.”
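(A minimal sketch of that calorie-to-dollar conversion in Python. The $12.00-per-1,200-calorie rate is the one given in my comment above; the function name is invented for illustration.)

```python
# Calorie-to-dollar conversion: a 1,200-calorie daily budget is mapped
# onto $12.00, i.e. one cent per calorie.
DAILY_CALORIES = 1200
DAILY_BUDGET_DOLLARS = 12.00
DOLLARS_PER_CALORIE = DAILY_BUDGET_DOLLARS / DAILY_CALORIES

def food_cost(calories: float) -> float:
    """Return a food item's 'price' in dollars against the daily budget."""
    return calories * DOLLARS_PER_CALORIE

# A 400-calorie chocolate bar 'costs' $4.00 of the $12.00 day --
# a third of the whole budget, visibly not a "bargain".
print(f"${food_cost(400):.2f}")  # -> $4.00
```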

MR MATH: My natural tendency is to use numbers. I think in terms of a horizontal line with 100% at one side and 0% at the other. I have learned to edit out the % with most people as I feel I come across very nerdy and I know most people don’t relate to it. Visual conversion of numbers. This is familiar to me as a visual thinker.

NEWGIRL: I would say my empathy can get to 8/10 at extreme conditions, usually falls around a steady 0.37/10.00 and coasts up to 3/10 sometimes. A bit wacky, but perhaps charming. LOL!

Recent History of Socio-Political Anthropology Battles / Important

From Natural History Magazine:

Remembering Stephen Jay Gould

http://www.naturalhistory.com/perspectives/3024131/remembering-stephen-jay-gould

Human evolution was not a special case of anything.

By Ian Tattersall

For long-time readers of Natural History, Stephen Jay Gould needs no introduction. His column, “This View of Life,” was a mainstay of the magazine, starting in January 1974 with “Size and Shape” and concluding with the 300th installment, “I Have Landed,” in the December 2000/January 2001 issue. What made his columns so popular was not just Gould’s range of chosen topics, but also the way he regularly allowed himself to be carried away on any tangent that he found interesting.

Gould died on May 20, 2002. Last spring, on the tenth anniversary of his death, I was invited to join other scholars at a commemorative meeting in Venice organized by the Istituto Veneto di Scienze, Lettere ed Arti in collaboration with the Università Ca’ Foscari. It fell to me, as an anthropologist, to talk about Gould’s intellectual legacy to anthropology. Gould was, of course, anything but a primate specialist. But as it happens, in 1974, the year Gould started writing “This View of Life,” he and I were both invited to attend a specialized meeting on “Phylogeny of the Primates: An Interdisciplinary Approach.” Even at that early stage in his career, I learned, the reach of his writings had broadened well beyond his realms of invertebrate paleontology (he was a fossil-snail expert) and evolutionary theory. He came to address the roles of ontogeny (development of the individual) and neoteny (the evolutionary retention of juvenile traits in adults) in human evolution. What I personally found most interesting, however, was his preprint for the conference, which contained, among much else, a virtuoso canter through the history of human evolutionary studies. He effortlessly displayed mastery of a huge literature on a scale that many professional paleoanthropologists fail to achieve in entire academic lifetimes.

Despite a paucity of strictly technical contributions, there can be no doubt that Gould’s influence on anthropology, and on paleoanthropology in particular, was truly seminal. Foremost among such influences was his 1972 collaboration with Niles Eldredge in developing and publicizing the notion of “punctuated equilibria,” the view that species typically remain little changed during most of their geological history, except for rapid events when they may split to give rise to new, distinct species. This breakthrough enabled paleoanthropologists, like other paleontologists, to treat the famous “gaps” in the fossil record as information, a reflection of how evolution actually proceeded.

Similarly, it was Gould who, in collaboration with Yale paleontologist Elisabeth S. Vrba (then at the Transvaal Museum in Pretoria, South Africa), emphasized that an anatomical or behavioral trait that evolved to serve one function could prove a handy adaptation for an entirely unanticipated one—and that the term exaptation was a better name for this phenomenon than preadaptation, which implied some kind of inherent tendency for a species to follow a certain evolutionary path. Anthropologists were forced to recognize exaptation as an essential theme in the history of innovation in the human family tree.

Speaking of trees, I am convinced that Gould’s most significant contribution to paleoanthropology was his insistence, from very early on, that the genealogy of human evolution took the form of a bush with many branches, rather than a ladder, or simple sequence of ancestors and descendants. As he wrote in his April 1976 column, “Ladders, Bushes, and Human Evolution”:

“I want to argue that the ‘sudden’ appearance of species in the fossil record and our failure to note subsequent evolutionary change within them is the proper prediction of evolutionary theory as we understand it. Evolution usually proceeds by “speciation”—the splitting of one lineage from a parental stock—not by the slow and steady transformation of these large parental stocks. Repeated episodes of speciation produce a bush.”

Before World War II, paleoanthropologists had overwhelmingly been human anatomists by background, with little interest in patterns of diversity in the wider living world. And having been trained largely in a theoretical vacuum, the postwar generation of paleoanthropologists was already exapted to capitulate when, at exact midcentury, the biologist Ernst Mayr told them to throw away nearly all the many names they had been using for fossil hominids. Mayr replaced this plethora, and the diversity it had suggested, with the idea that all fossil hominids known could be placed in a single sequence, from Homo transvaalensis to Homo erectus and culminating in Homo sapiens.

There was admittedly a certain elegance in this new linear formulation; but the problem was that, even in 1950, it was not actually supported by the material evidence. And new discoveries soon made not only most paleoanthropologists but even Mayr himself—grudgingly, in a footnote—concede that at least one small side branch, the so-called “robust” australopithecines, had indeed existed over the course of human evolution. But right up into the 1970s and beyond, the minimalist mindset lingered. Gould’s was among the first—and certainly the most widely influential —voices raised to make paleoanthropologists aware that there was an alternative.

In his “Ladders, Bushes, and Human Evolution” column, Gould declared that he wanted “to argue that Australopithecus, as we know it, is not the ancestor of Homo; and that, in any case, ladders do not represent the path of evolution.” At the time, both statements flatly contradicted received wisdom in paleoanthropology. And while in making the first of them I suspect that Gould was rejecting Australopithecus as ancestral to Homo as a matter of principle, his immediate rationale was based on the recent discovery, in eastern Africa, of specimens attributed to Homo habilis that were just as old as the South African australopithecines.

Later discoveries showed that Gould had been hugely prescient. To provide some perspective here: In 1950, Mayr had recognized a mere three hominid species. By 1993, I was able to publish a hominid genealogy containing twelve. And the latest iteration of that tree embraces twenty-five species, in numerous coexisting lineages. This was exactly what Gould had predicted. In his 1976 article he had written: “We [now] know about three coexisting branches of the human bush. I will be surprised if twice as many more are not discovered before the end of the century.”

Indeed, his impact on the paleoanthropological mindset went beyond even this, largely via his ceaseless insistence that human beings have not been an exception to general evolutionary rules. Before Gould’s remonstrations began, one frequently heard the term “hominization” bandied about, as if becoming human had involved some kind of special process that was unique to our kind. Gould hammered home the message that human evolutionary history was just like that of other mammals, and that we should not be looking at human evolution as a special case of anything.

Of course, Gould had ideas on particular issues in human paleontology as well, and he never shrank from using his Natural History bully pulpit to voice his opinions. Over the years he issued a succession of shrewd and often influential judgments on subjects as diverse as the importance of bipedality as the founding hominid adaptation; the newly advanced African “mitochondrial Eve”; hominid diversity and the ethical dilemmas that might be posed by discovering an Australopithecus alive today; sociobiology and evolutionary psychology (he didn’t like them); the relations between brain size and intelligence; neoteny and the retention of juvenile growth rates into later development as an explanation of the unusual human cranial form; and why human infants are so unusually helpless.

(Removed here: a narrative about the search for who had perpetrated the Piltdown Man hoax)

Gould’s devotion to the historically odd and curious, as well as his concern with the mainstream development of scientific ideas, is also well illustrated by his detailed account of the bizarre nineteenth-century story of Sarah “Saartjie” Baartman. Dubbed the “Hottentot Venus,” Baartman was a Khoisan woman from South Africa’s Western Cape region who was brought to Europe in 1810 and widely exhibited to the public before her death in 1815. Gould’s publicizing of the extraordinary events surrounding and following Baartman’s exhibition may or may not have contributed to the repatriation in 2002 of her remains from Paris to South Africa, where they now rest on a hilltop overlooking the valley in which she was born. But what is certain is that Gould’s interest in this sad case also reflected another of his long-term concerns, with what he called “scientific racism.”

Principally in the 1970s—when memories of the struggle for civil rights in the United States during the previous decade were still extremely raw—Gould devoted a long series of his columns to the subject of racism, as it presented itself in a whole host of different guises. In his very first year of writing for Natural History, he ruminated on the “race problem” both as a taxonomic issue, and in its more political expression in relation to intelligence. He even made the matter personal, with a lucid and deeply thoughtful demolition in Natural History of the purportedly scientific bases for discrimination against Jewish immigrants to America furnished by such savants as H. H. Goddard and Karl Pearson.

Gould also began his long-lasting and more specific campaign against genetic determinism, via a broadside against the conclusions of Arthur Jensen, the psychologist who had argued that education could not do much to level the allegedly different performances of various ethnic groups on IQ tests. And he began a vigorous and still somewhat controversial exploration of the historical roots of “scientific racism” in the work of nineteenth-century embryologists such as Ernst Haeckel and Louis Bolk.

But Gould’s most widely noticed contribution to the race issue began in 1978, with his attack in Science on the conclusions of the early-nineteenth century physician and craniologist Samuel George Morton, whom he characterized rather snarkily as a “self-styled objective empiricist.” In three voluminous works published in Philadelphia between 1839 and 1849—on Native American and ancient Egyptian skulls, and on his own collection of more than 600 skulls of all races—the widely admired Morton had presented the results of the most extensive study ever undertaken of human skulls. The main thrust of (Morton’s) study had been to investigate the then intensely debated question of whether the various races of humankind had a single origin or had been separately created. Morton opted for polygeny, or multiple origins, a conclusion hardly guaranteed to endear him to Gould. Along the way, Morton presented measurements that showed, in keeping with prevailing European and Euro-American beliefs on racial superiority, that Caucasians had larger brains than American “Indians,” who in turn had bigger brains than “Negroes” did. (Cranial-brain size DOES NOT correlate to intelligence)

After closely examining Morton’s data, Gould characterized the Philadelphia savant’s conclusions as “a patchwork of assumption and finagling, controlled, probably unconsciously, by his conventional a priori ranking (his folks on top, slaves on the bottom).” He excoriated Morton for a catalog of sins that included inconsistencies of criteria, omissions of both procedural and convenient kinds, slips and errors, and miscalculations. And although in the end he found “no indication of fraud or conscious manipulation,” he did see “Morton’s saga” as an “egregious example of a common problem in scientific work.” As scientists we are all, Gould asserted, unconscious victims of our preconceptions, and the “only palliations I know are vigilance and scrutiny.”

That blanket condemnation of past and current scientific practice was a theme Gould shortly returned to, with a vengeance, in his 1981 volume The Mismeasure of Man. Probably no book Gould ever wrote commanded wider attention than did this energetic critique of the statistical methods that had been used to substantiate one of his great bêtes noires, biological determinism. This was (is) the belief, as Gould put it, that “the social and economic differences between human groups—primarily races, classes, and sexes—arise from inherited, inborn distinctions and that society, in this sense, is an accurate reflection of biology.”

We are still plagued by this pseudo-scientific “justification” of poverty and inequality, of misogyny, and of the abuse of “lesser humans” by the Human Behavior Industries. Remember, this is very recent history, and the forces of social “control and abuse” are very much still with us.

It is alarming that the revolution in DNA / genetic research has shifted the “means” of this abuse of human beings into a radical effort to “prove” that socially-created and defined “human behavior pathologies” are due to genetic determinism. The race is on to “prove” that genetic defects, rather than hidden social engineering goals, underlie “defective behavior and thinking” as dictated by closet eugenicists. Racism and eugenics are being pursued in the guise of “caring for, treating and fixing” socially “defective” peoples. Genetic engineering of embryos is already in progress.

SEE POST August 11, 2017: First Human Embryos ‘Edited’ in U.S. / 7 billion humans not consulted

In Mismeasure, Gould restated his case against Morton at length, adding to the mix a robust rebuttal of methods of psychological testing that aimed at quantifying “intelligence” as a unitary attribute. One of his prime targets was inevitably Arthur Jensen, the psychologist he had already excoriated in the pages of Natural History for Jensen’s famous conclusion that the Head Start program, designed to improve low-income children’s school performance by providing them with pre-school educational, social, and nutritional enrichment, was doomed to fail because the hereditary component of their performance—notably that of African American children—was hugely dominant over the environmental one. A predictable furor followed the publication of Mismeasure, paving the way for continuing controversy during the 1980s and 1990s on the question of the roles of nature versus nurture in the determination of intelligence.

This issue of nature versus nurture, a choice between polar opposites, was of course designed for polemic, and attempts to find a more nuanced middle ground have usually been drowned out by the extremes. So it was in Gould’s case. An unrepentant political liberal, he was firmly on the side of nurture. As a result of his uncompromising characterizations of his opponents’ viewpoints, Gould found himself frequently accused by Jensen and others of misrepresenting their positions and of erecting straw men to attack.

Yet even after Mismeasure first appeared, the climax of the debate was yet to come. In 1994, Richard Herrnstein and Charles Murray published their notorious volume, The Bell Curve: Intelligence and Class Structure in American Life. At positively Gouldian length, Herrnstein and Murray gave a new boost to the argument that intelligence is largely inherited, proclaiming that innate intelligence was a better predictor of such things as income, job performance, chances of unwanted pregnancy, and involvement in crime than are factors such as education level or parental socioeconomic status. They also asserted that, in America, a highly intelligent, “cognitive elite” was becoming separated from the less intelligent underperforming classes, and in consequence they recommended policies such as the elimination of what they saw as welfare incentives for poor women to have children.

Eugenics has never died in American Science; it remains an underestimated force in the shaping of “what to do about unacceptable humans”. It is neither a liberal nor a conservative impulse: it is a drive within elites to control human destiny.

To Gould such claims were like the proverbial red rag to a bull. He rapidly published a long review essay in The New Yorker attacking the four assertions on which he claimed Herrnstein and Murray’s argument depended. In order to be true, Gould said, Herrnstein and Murray’s claims required that what they were measuring as intelligence must: (1) be representable as a single number; (2) allow a linear rank ordering of people; (3) be primarily heritable; and (4) be essentially immutable. None of those assumptions, he declared, was tenable. And soon afterward he returned to the attack with a revised and expanded edition of Mismeasure that took direct aim at Herrnstein and Murray’s long book.

There can be little doubt that, as articulated in both editions of Mismeasure, Gould’s conclusions found wide acceptance not only among anthropologists but in the broader social arena as well. But doubts have lingered about Gould’s broad-brush approach to the issues involved, and particularly about his penchant for neglecting any nuance there might have been in his opponents’ positions. Indeed, he was capable of committing in his own writings exactly the kinds of error of which he had accused Samuel Morton—ironically, even in the very case of Morton himself.

In June 2011, a group of physical anthropologists led by Jason Lewis published a critical analysis of Gould’s attacks on Morton’s craniology. By remeasuring the cranial capacities of about half of Morton’s extensive sample of human skulls, Lewis and colleagues discovered that the data reported by Morton had on the whole been pretty accurate. They could find no basis in the actual specimens themselves for Gould’s suggestion that Morton had (albeit unconsciously) overmeasured European crania, and under-measured African or Native American ones. What’s more, they could find no evidence that, as alleged by Gould, Morton had selectively skewed the results in various other ways.

The anthropologists did concede that Morton had attributed certain psychological characteristics to particular racial groups. But they pointed out that, while Morton was inevitably a creature of his own times, he (Morton) had done nothing to disguise his racial prejudices or his polygenist sympathies. And they concluded that, certainly by prevailing standards, Morton’s presentation of his basic data had been pretty unbiased. (WOW! What an indictment of current Anthropology) What is more, while they were able to substantiate Gould’s claim that Morton’s final summary table of his results contained a long list of errors, Lewis and colleagues also found that correcting those errors would actually have served to reinforce Morton’s own declared biases. And they even discovered that Gould had reported erroneous figures of his own.

These multiple “errors” DO NOT cancel each other out: this is a favorite social-typical strategy and magical belief – present the contradictions from “each side” and reach a “socially acceptable” deadlock, past which no discussion is possible. The American intellectual-cultural-political environment is trapped in this devastating “black and white, either-or” false concept of “problem-solving”. Nothing can be examined; facts are removed to the “supernatural, word-concept domain” and become “politicized” – weapons of distortion in a socio-cultural landscape of perpetual warfare. In the meantime, the population is pushed to one extreme or the other. This is where we are TODAY, and this “warfare” will destroy us from within, because the hard work of running a nation is not being done.

It is hard to refute the authors’ conclusion that Gould’s own unconscious preconceptions colored his judgment. Morton, naturally enough, carried all of the cultural baggage of his time, ethnicity, and class. But so, it seems, did Gould. And in a paradoxical way, Gould had proved his own point. Scientists are human beings, and when analyzing evidence they always have to be on guard against the effects of their own personal predilections.

And on guard against the domination and control of their professions by the “elite and powerful,” who promote a racist-eugenic social order and control how scientists’ work is “messaged” and used to achieve socioeconomic and biological engineering goals – worldwide.

First Human Embryos ‘Edited’ in U.S. / 7 billion humans not consulted

The work, which removed a gene mutation linked to a heart condition, is fueling debate over the controversial tool known as CRISPR.

[Photo caption] Two days after being injected with a gene-editing enzyme, these developing human embryos were free of a disease-causing mutation.

Two Words: “Unintended Consequences”

By Erin Blakemore

PUBLISHED by National Geographic, August 2, 2017

What if you could remove a potentially fatal gene mutation from your child’s DNA before the baby is even born? In an advance that’s as likely to raise eyebrows as it is to save lives, scientists just took a big step toward making that possible.

For the first time, researchers in the United States have used gene editing in human embryos. As they describe today in the journal Nature, the team used “genetic scissors” called CRISPR-Cas9 to target and remove a mutation associated with hypertrophic cardiomyopathy, a common inherited heart disease, in 42 embryos.

Scientists who want to explore the technique hail it as a biomedical advance that could one day give people the option not to pass down heritable diseases. The tool could also reduce the number of embryos that are discarded during fertility treatments because of worrisome genetic mutations.

“The scientists are out of control,” says George Annas, director of the Center for Health Law, Ethics & Human Rights at the Boston University School of Public Health, who thinks that scientists should not edit the genomes of human embryos for any reason. “They want to control nature, but they can’t control themselves.”

Healing Hearts

According to the Centers for Disease Control and Prevention, hypertrophic cardiomyopathy occurs in about one in 500 people. The condition causes the heart muscle to thicken and can lead to sudden cardiac arrest. It takes only one gene mutation to cause the condition, and you can get the disease even if only one of your parents has the mutated gene. If you inherit it, there’s a 50 percent chance you will pass it on to your children.
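To make that 50-percent figure concrete, here is a minimal sketch (my own illustration, not from the article) of simple Mendelian segregation for an autosomal dominant mutation: one heterozygous carrier parent, one unaffected parent, and the four equally likely combinations of their gene copies.

```python
# Minimal illustration (not from the article): why a dominant mutation
# carried by one heterozygous parent reaches ~50% of children.
from itertools import product

carrier_parent = ["M", "n"]   # heterozygous: one Mutated copy, one normal copy
healthy_parent = ["n", "n"]   # two normal copies

# Each child inherits one copy from each parent; all four pairings
# are equally likely under Mendelian segregation.
offspring = [a + b for a, b in product(carrier_parent, healthy_parent)]

# For a dominant condition, a single mutated copy is enough.
affected = sum("M" in child for child in offspring)
print(f"{affected}/{len(offspring)} combinations inherit the mutation")
# -> 2/4, i.e. the 50 percent chance quoted above
```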

For their work, Shoukhrat Mitalipov, principal investigator at the Oregon Health and Science University’s Center for Embryonic Cell and Gene Therapy, and his colleagues targeted the genetic mutations that cause the majority of hypertrophic cardiomyopathy cases.

First, they created 58 human embryos from the sperm of a male donor with the mutation and the egg of a female without the mutation. Then, they used CRISPR to cut the mutation out of the gene. When things go right, the DNA repairs itself and the mutation disappears.

The technique isn’t always successful. In previous studies, some CRISPR-edited embryos developed mosaicism, a condition in which some cells have the unwanted mutations and others don’t. All in all, the team was able to repair the gene mutation in about 70 percent of the embryos, and the study showed no unwanted changes at other sites in the edited DNA.

The team allowed the fertilized cells to develop into blastocysts—the stage at which embryos are usually implanted into the mother during fertility treatments. They showed normal development, the team reports. Then, the embryos were destroyed.

Science in Motion

“Of course further research and ethical discussions are necessary before proceeding to clinical trials,” study coauthor Paula Amato, adjunct associate professor of obstetrics and gynecology at OHSU, said during a press briefing on August 1.

Earlier this year, the National Academy of Sciences and National Academy of Medicine asked an international committee of scientists and ethicists to weigh in on the benefits and risks of genome editing in humans. (Find out why scientists think gene editing is both terrifying and terrific.)

The panel recommended that in the case of the human germline—genes passed down from generation to generation—scientists refrain from editing genes for any purpose other than treating or preventing a disease or disability. (No more neurodiverse people will be allowed to be born? Where does this “elimination” of genetic diversity end?) The report also insisted on a more robust public debate before such experiments begin. (Oh sure, as if anyone in power will listen to the “peasants” at the bottom of the pyramid)

In the United States, there’s currently a ban on using taxpayer funds for any research that destroys human embryos; in this case the team used institutional and private funds. If they can’t move ahead as quickly as desired in the U.S. (blackmail?), they’ll consider pursuing their research in other countries.

(Of course, poor and developing countries need the money and will gladly sacrifice their embryos – and accept the “outcomes” for their populations, good or bad…)

Debating the Future

It’s already possible to screen for genetic defects within embryos during in vitro fertilization using a process called preimplantation genetic diagnosis. The team thinks their CRISPR technique could eventually be applied to gene mutations associated with other diseases, like cystic fibrosis.

In their paper, the team writes that their method may one day “rescue mutant embryos, increase the number of embryos available for transfer and ultimately improve pregnancy rates.” (For the Elites who can pay $$$$ for it)

“That’s just absurd,” says Annas. “They admit right up front that if you want to avoid having a baby with [the mutation], you can just not implant the embryos that are affected.”

Mitalipov disagrees: “Discarding half the embryos is morally wrong,” he tells National Geographic. “We need to be more proactive.”

The embryos in this experiment were destroyed!

This is unbelievable: taking over the future evolution of our species is “morally” right? Nature provides the genetic variation which is necessary for organisms to adapt to a changing environment – we are not smart enough to interfere with this process.

Has anyone asked 7 billion humans if they think it’s a slam-dunk “moral good” for a handful of scientists, who obviously are not the least concerned about morality or ethics, to just “go ahead” and terminate 3.5 billion years of evolution?

Either way, Annas says, it’s time to revisit the conversation about how to regulate CRISPR in the United States. “My guess is that the regulators will be horrified.” (Until $$$$ decides the issue)

But for Mitalipov, the debate is a chance to inform the world about the technique’s potential. And for a scientist who has also cloned monkey embryos and even cloned human embryos to make stem cells, he knows plenty about how to ignite public debate.

“We’ll push the boundaries,” he says.

Of course, “the rest of” Homo sapiens just don’t count; we have no choice but to submit to a future dictated by individuals who regard themselves as “GODS.” The track record of “ethical nightmares” in recent human history (eugenics, genocide, the Holocaust, chemical and biological warfare, and other mass murder of “defectives” – who are simply people of another race, religion, or ethnicity) has proven over and over that the power-insane “Male Psychopaths” at the top of the social pyramid are destroyers of Nature.


Emotional Communication Dog – Human / V. KONOK

Like most humans, I tried using words to communicate with my dogs. Then I tried the “growls” and other vocalizations that wolves use to communicate with pups and with each other. What a difference!

Emotional communication between dogs and humans

(PDF: Can’t get a direct link to work: google “emotional communication between dogs and people konok”)

Veronika Konok / Department of Ethology, Eötvös Loránd University, Budapest, Hungary 2014

Although emotions are commonly studied in psychology, there is still confusion even about the definition of emotion, and there are many competing theories regarding e.g. the components, the function or the emergence of emotion. In 1981 Kleinginna and Kleinginna collected 92 definitions from the literature, and even when they classified them on the basis of the emotional phenomena or the theoretical issue they emphasized, they still ended up with 11 different categories. This confusion may partly be attributed to the fact that researchers have focused on different components of the emotional reaction, such as expression, behavior or physiology. Perhaps it is the complexity of the phenomenon, which the foregoing facts illustrate, that makes the definition and the modeling of emotion so difficult.

If we look at the word’s etymology we see that the term “emotion” dates back to 1579, when it was adapted from the French word émouvoir (“to stir up”). However, synonyms of the word likely date back to the very origins of language (Merriam-Webster, 2004).

Emotions were the subject of reflection by no lesser philosophers than Aristotle, Plato, Descartes, Spinoza and Hume. From the perspective of natural science, and of ethology, the most important early impact has to be assigned to Darwin. Thanks to him, emotions were no longer seen as dysfunctional – as something to reject or control (as some philosophers thought, e.g. Plato or Hume) – but instead as something functional and essential for survival (Kappas, 2002). He stated that signals in humans and animals are reflections of their internal state, and he also put an emphasis on the social and communicative function of emotions (Darwin, 1872).

In psychology, after famous debates over whether emotion or physiological arousal comes first (James-Lange vs. Cannon-Bard theories; James, 1890; Lange, 1885; Cannon, 1929, 1931), and over whether emotions are the result of a cognitive evaluation of general physiological arousal (e.g. Schacter and Singer, 1962) or have distinctive autonomic patterns (e.g. Alexander, 1950), a more or less general consensus has formed today that emotion is a complex phenomenon (an umbrella concept) consisting of several components: expressive behavior, cognition, autonomic nervous system activity and subjective experience (e.g. Scherer, 1984; Panksepp, 2005; Plutchik, 2001).

1.2 A suitable definition

We chose Izard’s definition of emotion because it emphasizes many aspects of emotions that we will discuss in the following sections. Firstly, it mirrors the functional-evolutionary approach to emotion; secondly, it can be applied to animals as well as humans; and finally, it emphasizes the complexity of the phenomenon (its different components): “emotions are specific neuropsychological phenomena, shaped by natural selection [and I would add, by socio-cultural selection], that organize and motivate physiological, cognitive, and action patterns that facilitate adaptive responses to the vast array of demands and opportunities in the environment” (Izard, 1992, p. 561).

___________________________

In a previous post I discussed the Asperger reluctance or inability to VERBALLY describe our reactions to the environment to other people, who essentially demand that we “copy, adopt or conform to” a word-based neurotypical social scheme. This “defect” is, to many Aspergers, simply a blank phenomenon, or a “none of your business” privacy issue. My opinion is that “emotion words” EXIST as a social tool that teaches children to suppress and transform raw physiological reactions to their environment into “socially identifiable and approved” forms.

The fundamental fight-or-flight response is defused and redirected by using hundreds of “social emotion words” to transform and control the child’s natural animal behavior. This “social version” of “how humans are supposed to be” carries over into adulthood – people absolutely BELIEVE that words like cheerful, eager, and “self-esteem” are names of discrete physiology; that if the brain is “investigated” there are “clusters of neurons” labeled “happy,” “sad,” etc. for every emotion imaginable. If this grossly inefficient scheme were the case, the brain would have neither the energy nor the capacity to perform its necessary vital functions. These “emotions” are socially-prescribed “feeling words” that are meant to promote social behavior by controlling instinctual reactions to the environment.

One obvious use of this “word control” is to convert healthy self-preservation (fight or flight) into “emotions” called shame, self-hatred, and self-loathing (a tactic common to many cultures), and by these to threaten the child with rejection and abandonment – which is a terrible thing to do to children. How often do we hear, “Nice little boys don’t fight back; boys don’t express emotion.” “No one wants to be friends with a boy who gets angry and hits other children,” and “Nice little girls don’t use ‘bad words’ and never get angry with anyone; they always look cheerful and love everyone and try to please other people.” The translation is: if you don’t control yourself and obey me, I won’t love or care for you (= death).

The reason that Asperger types have trouble describing our reactions (emotions) in words is not that we don’t react to the environment, but that we are literally shocked and astounded by “socially-constructed states of belief” about behavior; we are concerned with concrete external reality. The obsession with outlandish emotional displays over trivial “social status” control and competition leaves us feeling BLANK and/or astonished. (And offended)

Our “emotions” are not tied to the chaotic, unstable and manipulative social arena, but to a set of values and principles that guide human interaction with others and nature. This is fundamental to our way of being in the world.

__________________________________________________________________________________________

SEE ALSO:  http://dx.doi.org/10.1016/j.applanim.2014.11.003


Debunking Left Brain, Right Brain Myth / PLOS Paper – Corballis

Left Brain, Right Brain: Facts and Fantasies

Michael C. Corballis, School of Psychology, University of Auckland, Auckland, New Zealand

Published: January 21, 2014

https://doi.org/10.1371/journal.pbio.1001767 (open access; see original for more)

Summary

Handedness and brain asymmetry are widely regarded as unique to humans, and associated with complementary functions such as a left-brain specialization for language and logic and a right-brain specialization for creativity and intuition. In fact, asymmetries are widespread among animals, and support the gradual evolution of asymmetrical functions such as language and tool use. Handedness and brain asymmetry are inborn and under partial genetic control, although the gene or genes responsible are not well established. Cognitive and emotional difficulties are sometimes associated with departures from the “norm” of right-handedness and left-brain language dominance, more often with the absence of these asymmetries than their reversal.

Evolution of Brain Asymmetries, with Implications for Language

One myth that persists even in some scientific circles is that asymmetry is uniquely human [3]. Left–right asymmetries of brain and behavior are now known to be widespread among both vertebrates and invertebrates [11], and can arise through a number of genetic, epigenetic, or neural mechanisms [12]. Many of these asymmetries parallel those in humans, or can be seen as evolutionary precursors. A strong left-hemispheric bias for action dynamics in marine mammals and in some primates and the left-hemisphere action biases in humans, perhaps including gesture, speech, and tool use, may derive from a common precursor [13]. A right-hemisphere dominance for emotion seems to be present in all primates so far investigated, suggesting an evolutionary continuity going back at least 30 to 40 million years [14]. A left-hemisphere dominance for vocalization has been shown in mice [15] and frogs [16], and may well relate to the leftward dominance for speech—although language itself is unique to humans and is not necessarily vocal, as sign languages remind us. Around two-thirds of chimpanzees are right-handed, especially in gesturing [17] and throwing [18], and also show left-sided enlargement in two cortical areas homologous to the main language areas in humans—namely, Broca’s area [19] and Wernicke’s area [20] (see Figure 1). These observations have been taken as evidence that language did not appear de novo in humans, as argued by Chomsky [21] and others, but evolved gradually through our primate lineage [22]. They have also been interpreted as evidence that language evolved not from primate calls, but from manual gestures [23]–[25].

Some accounts of language evolution (e.g., [25]) have focused on mirror neurons, first identified in the monkey brain in area F5 [26], a region homologous to Broca’s area in humans, but now considered part of an extensive network more widely homologous to the language network [27]. Mirror neurons are so called because they respond when the monkey performs an action, and also when they see another individual performing the same action. This “mirroring” of what the monkey sees onto what it does seems to provide a natural platform for the evolution of language, which likewise can be seen to involve a mapping of perception onto production. The motor theory of speech perception, for example, holds that we perceive speech sounds according to how we produce them, rather than through acoustic analysis [28]. Mirror neurons in monkeys also respond to the sounds of such physical actions as ripping paper or dropping a stick onto the floor, but they remain silent to animal calls [29]. This suggests an evolutionary trajectory in which mirror neurons emerged as a system for producing and understanding manual actions, but in the course of evolution became increasingly lateralized to the left brain, incorporating vocalization and gaining grammar-like complexity [30]. The left hemisphere is dominant for sign language as for spoken language [31].

Mirror neurons themselves have been victims of hyperbole and myth [32], with the neuroscientist Vilayanur Ramachandran once predicting that “mirror neurons will do for psychology what DNA did for biology” [33]. As the very name suggests, mirror neurons are often taken to be the basis of imitation, yet nonhuman primates are poor imitators. Further, the motor theory of speech perception does not account for the fact that speech can be understood by those deprived of the ability to speak, such as those with damage to Broca’s area. Even chimpanzees [34] and dogs [35] can learn to respond to simple spoken instructions, but cannot produce anything resembling human speech. An alternative is that mirror neurons are part of a system for calibrating movements to conform to perception, as a process of learning rather than direct imitation. A monkey repeatedly observes its hand movements to learn to reach accurately, and the babbling infant calibrates the production of sounds to match what she hears. Babies raised in households where sign language is used “babble” by making repetitive movements of the hands [36]. Moreover, it is this productive aspect of language, rather than the mechanisms of understanding, that shows the more pronounced bias to the left hemisphere [37].

Inborn Asymmetries

Handedness and cerebral asymmetries are detectable in the fetus. Ultrasound recording has shown that by the tenth week of gestation, the majority of fetuses move the right arm more than the left [38], and from the 15th week most suck the right thumb rather than the left [39]—an asymmetry strongly predictive of later handedness [40] (see Figure 2). In the first trimester, a majority of fetuses show a leftward enlargement of the choroid plexus [41], a structure within the ventricles known to synthesize peptides, growth factors, and cytokines that play a role in neurocortical development [42]. This asymmetry may be related to the leftward enlargement of the temporal planum (part of Wernicke’s area), evident at 31 weeks [43].

 In these prenatal brain asymmetries, around two-thirds of cases show the leftward bias. The same ratio applies to the asymmetry of the temporal planum in both infants and adults [44]. The incidence of right-handedness in the chimpanzee is also around 65–70 percent, as is a clockwise torque, in which the right hemisphere protrudes forwards and the left hemisphere rearwards, in both humans and great apes [45]. These and other asymmetries have led to the suggestion that a “default” asymmetry of around 65–70 percent, in great apes as well as humans, is inborn, with the asymmetry of human handedness and cerebral asymmetry for language increased to around 90 percent by “cultural literacy” [46].

Variations in Asymmetry

Whatever their “true” incidence, variations in handedness and cerebral asymmetry raise doubts as to the significance of the “standard” condition of right-handedness and left-cerebral specialization for language, along with other qualities associated with the left and right brains that so often feature in popular discourse. Handedness and cerebral asymmetry are not only variable, they are also imperfectly related. Some 95–99 percent of right-handed individuals are left-brained for language, but so are about 70 percent of left-handed individuals. Brain asymmetry for language may actually correlate more highly with brain asymmetry for skilled manual action, such as using tools [47],[48], which again supports the idea that language itself grew out of manual skill—perhaps initially through pantomime.
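Those percentages also let us check, with nothing more than the law of total probability, roughly how common left-brain language dominance is overall. The short calculation below is my own back-of-envelope sketch, not the paper’s; the ~90 percent right-handedness figure comes from the preceding section, and 0.97 is a midpoint of the quoted 95–99 percent range.

```python
# Back-of-envelope check using figures quoted in the text (illustrative only)
p_right_handed = 0.90           # ~90% of people are right-handed
p_left_lang_given_right = 0.97  # midpoint of the quoted 95-99% range
p_left_lang_given_left = 0.70   # ~70% of left-handers

# Law of total probability over the two handedness groups
p_left_lang = (p_right_handed * p_left_lang_given_right
               + (1 - p_right_handed) * p_left_lang_given_left)

print(f"overall left-brain language dominance: ~{p_left_lang:.0%}")
# -> ~94%; handedness and language laterality clearly don't line up 1:1
```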

Even when the brain is at rest, brain imaging shows that there are asymmetries of activity in a number of regions. A factor analysis of these asymmetries revealed four different dimensions, each mutually uncorrelated. Only one of these dimensions corresponded to the language regions of the brain; the other three had to do with vision, internal thought, and attention [49]—vision and attention were biased toward the right hemisphere, language and internal thought to the left. This multidimensional aspect throws further doubt on the idea that cerebral asymmetry has some unitary and universal import.

Handedness, at least, is partly influenced by parental handedness, suggesting a genetic component [50], but genes can’t tell the whole story. For instance some 23 percent of monozygotic twins, who share the same genes, are of opposite handedness [51]. These so-called “mirror twins” have themselves fallen prey to a Through the Looking Glass myth; according to Martin Gardner [52], Lewis Carroll intended the twins Tweedledum and Tweedledee in that book to be enantiomers, or perfect three-dimensional mirror images in bodily form as well as in hand and brain function. Although some have argued that mirroring arises in the process of twinning itself [53],[54], large-scale studies suggest that handedness [55],[56] and cerebral asymmetry [57] in mirror twins are not subject to special mirroring effects. In the majority of twins of opposite handedness the left hemisphere is dominant for language in both twins, consistent with the finding that the majority of single-born left-handed individuals are also left-hemisphere dominant for language. In twins, as in the singly born, it is estimated that only about a quarter of the variation in handedness is due to genetic influences [56].

The manner in which handedness is inherited has been most successfully modeled by supposing that a gene or genes influence not whether the individual is right- or left-handed, but whether a bias to right-handedness will be expressed or not. In those lacking the “right shift” bias, the direction of handedness is a matter of chance; that is, left-handedness arises from the lack of a bias toward the right hand, and not from a “left-hand gene.” Such models can account reasonably well for the parental influence [58]–[60], and even for the relation between handedness and cerebral asymmetry if it is supposed that the same gene or genes bias the brain toward a left-sided dominance for speech [60],[61]. It now seems likely that a number of such genes are involved, but the basic insight that genes influence whether or not a given directional bias is expressed, rather than whether or not it can be reversed, remains plausible (see Box 1).
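The logic of that “right shift” idea is easy to make concrete. The simulation below is a minimal sketch of the idea as described above, not the published model itself; the 0.8 bias frequency is a made-up parameter chosen so the simulated population lands near the familiar ~90 percent right-handed figure.

```python
import random

def handedness(bias_expressed: bool) -> str:
    """Right-shift idea: an expressed bias forces right-handedness;
    without it, the direction of handedness is a matter of chance."""
    if bias_expressed:
        return "right"
    return random.choice(["right", "left"])  # no "left-hand gene" needed

P_BIAS = 0.8  # hypothetical share of people expressing the right-shift bias

people = [handedness(random.random() < P_BIAS) for _ in range(100_000)]
right_share = people.count("right") / len(people)

# Expected: P_BIAS + (1 - P_BIAS) / 2 = 0.9
print(f"right-handed: {right_share:.1%}")
```

On these assumptions, all left-handers come from the no-bias group, which is why the model needs no gene for left-handedness, only a gene whose absence leaves the outcome to chance.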

Genetic considerations aside, departures from right-handedness or left-cerebral dominance have sometimes been linked to disabilities. In the 1920s and 1930s, the American physician Samuel Torrey Orton attributed both reading disability and stuttering to a failure to establish cerebral dominance [62]. Orton’s views declined in influence, perhaps in part because he held eccentric ideas about interhemispheric reversals giving rise to left–right confusions [63], and in part because learning-theory explanations came to be preferred to neurological ones. In a recent article, Dorothy Bishop reverses Orton’s argument, suggesting that weak cerebral lateralization may itself result from impaired language learning [64]. Either way, the idea of an association between disability and failure of cerebral dominance may be due for revival, as recent studies have suggested that ambidexterity, or a lack of clear handedness or cerebral asymmetry, is indeed associated with stuttering [65] and deficits in academic skills [66], as well as mental health difficulties [67] and schizophrenia (see Box 1).

Although it may be the absence of asymmetry rather than its reversal that can be linked to problems of social or educational adjustment, left-handed individuals have often been regarded as deficient or contrarian, but this may be based more on prejudice than on the facts. Left-handers have excelled in all walks of life. They include five of the past seven US presidents, sports stars such as Rafael Nadal in tennis and Babe Ruth in baseball, and Renaissance man Leonardo da Vinci, perhaps the greatest genius of all time.