Question / Is Common Sense even better than Empathy?

My posting has slowed to almost nothing since last Saturday:

Summer at last; warm winds, blue skies, puffy clouds. The dog and I are both delirious over the ability to “get out of” quasi-imprisonment indoors.

Into the truck; a short drive to the south, up and over the canyon edge into the wide open space of the plateau. Out into “the world again” striding easily along a two-rut track that goes nowhere; the type that is established by the driver of a first vehicle, turning off the road, through the brush, and headed nowhere. Humans cannot resist such a “lure” – Who drove off the road and why? Maybe the track does go somewhere. And so, the tracks grow, simply by repetition of the “nowhere” pattern. Years pass; ruts widen, deepen, grow and are bypassed, smoothed out, and grow again, becoming as permanent and indestructible as the Appian Way.

This particular set of ruts is a habitual dog-walking path for me: the view, the wind, the light, the sky whipped into a frenzy of lovely clouds… and then, agony. Gravel underfoot has turned my foot, twisting my ankle and plunging me into a deep rut and onto the rough ground. Pain; not “Whoops, I tripped” pain, but “OMG! I’m screwed” pain. I make a habit of glancing a few feet ahead to check where my feet are going, but my head was in the clouds.

This isn’t the first time in 23 years that I’ve taken a fall out in the boonies: a banged up shin or knee, a quick trip to the gravel; scraped hands, even a bonk on the head, but now… can I walk back to the truck, or even stand up? One, two, three… up.

Wow! Real pain; there’s no choice. Get to the truck, which appears to be very, very far away at this point. Hobble, hobble, hobble; stop. Don’t stop! Keep going. Glance up periodically to see if the truck is “growing bigger” – reachable. I always tell myself the same (true) mantra in circumstances like this: shut out time, let it pass, and suddenly, there you will be, pulling open the truck door and pulling yourself inside.

There is always some dumb luck in these matters: it’s my left ankle. I don’t need my left foot to drive home. Then the impossible journey from the truck to the house, the steps, the keys, wrangling the dog and her leash, trying not to get tangled and fall again – falling through the doorway, grabbing something and landing on the couch. Now what?

That was five days ago. Five days of rolling around with my knee planted in the seat of a wheeled office chair, pushing with the right foot as far as I can go, then hopping like a one-legged kangaroo the rest of the way. Dwindling food supplies; unable to stand to cook; zapping anything eligible in the microwave. No milk in my coffee. Restless nights. Any bump to my bandaged foot wakes me up. This is ridiculous! My life utterly disrupted by a (badly) sprained ankle. I think I’m descending into depression.

Bipedalism, of course, begins to take over my thoughts. But first, I try to locate hope on the internet, googling “treatment for sprained ankle.” You’re screwed, the pages of entries say. One begins to doubt “evolution” as the master process that produces elegant and sturdy design. Ankles are a nightmare of tiny bones and connecting ligaments, with little blood supply to heal the damage; once damaged, a human can expect a long recovery, intermittent swelling and inevitable reinjury for as long as one lives.

It seems that for our “wild ancestors” a simple sprain could trigger the expiration date for any individual unlucky enough to be injured: the hyenas, big cats, bears and other local predators circle in, and then the vultures. Just like any other animal grazing the savannah or born into the forest, vulnerability = death. It’s as true today as it ever was. Unless someone is there with you when you are injured, you can be royally screwed: people die in their own homes due to accidents. People die in solo car wrecks. People go for a day hike in a state park and within an hour or two, require rescue, hospitalization and difficult recovery, from one slip in awareness and focus. And being in the company of one or more humans hardly guarantees survival. Success may depend on their common sense.

So: the question arises around this whole business of Homo sapiens, The Social Species. There are many social species, and it is claimed that some “non-human” social species “survive and reproduce successfully” because they “travel together” in the dozens, thousands or millions and “empathize” with others of their kind. Really? How many of these individual organisms even notice that another is in peril, other than to sound the alarm and get the hell out of the danger zone or predator’s path? How one human mind gets from reproduction in massive numbers – that is, playing the “numbers game” (1/100, 1/1,000, 1/100,000 new creatures survive in a generation), and the congregation of vast numbers in schools and flocks, which improves the odds of “not being one of the few that gets caught and eaten” – how one gets from there to “pan-social wonderfulness” is one of the mysteries of the social human mind.

There are occasions when a herd may challenge a predator, or a predatory group; parents (usually the female) will defend offspring in varying manner and degree. But what one notices in encounters (fortuitously caught on camera, posted on the internet or included in documentaries) is that solitary instances are declared to represent “universal behavior” and proof of the existence of (the current fad of) empathy in “lesser animals”. What is ignored (inattentional blindness), and not posted, is the usual behavior: some type of distraction or defensive behavior is invested in, but the attempt is abandoned at some “common sense point” in the interaction; the parents give up, or the offspring or herd member is abandoned.

What one notices is that the eggs and the young of all species supply an immense amount of food for other species.

Skittles evolved solely as a food source for Homo sapiens children. It has no future as a species. LOL

I’ve been watching a lot of “nature documentaries” to pass the time. This is, in its way, an extraordinary “fact of nature”. Our orientation to extreme Darwinian evolution (reductionist survival of the fittest) is stunningly myopic. We create narratives from “wildlife video clips” edited and narrated to confirm our imaginary interpretation of natural processes; the baby “whatever” – bird, seal, monkey, or cute cub; scrambling, helpless, clueless, “magically” escapes death (dramatic soundtrack, breathless narration) due to Mom’s miraculous, just-in-the-nick-of-time return. The scoundrel predator is foiled once again; little penguin hero “Achilles” (they must have names) has triumphantly upheld our notion that “survival is no accident” – which in great measure is exactly what it is.

One thing about how evolution “works” (at least as presented) has always bothered me no end: that insistence that the individual creatures which survive to reproduce are “the fittest”. How can we know that? What if, among the hundreds, thousands, millions of “young” produced but almost immediately destroyed or consumed by chance, by random events, by the natural changes and disasters that occur again and again, the genetic potential “to be most fit” had been eliminated, depriving the species of potentially even “better” adaptations than those we see? We have to ask, which individuals are “fittest” for UNKNOWN challenges that have not yet occurred? Where is the variation that may be acted upon by the changing environment?

This is a problem of human perception; of anthropomorphic projection, of the unfailing insistence of belief in an intentional universe. Whatever “happens” is the fulfillment of a plan; evolution is distorted to “fit” the human conceit that, by one’s own superior DNA, survival and reproduction necessarily become fact.

Human ankles (and many other details of human physiology) are not “great feats of evolutionary engineering.”

Like those two-rut roads that are ubiquitous where I live, chance predicts that most of evolution’s organisms “go nowhere” but do constitute quick and easy energy sources for a multitude of other organisms.

 

What is self? / an anthropological concept

A. I. Hallowell on ‘Orientations for the Self’

The following summary of Hallowell’s analysis, as set out in his paper “The self and its behavioral environment” (most easily accessible as Chapter 4 of his book Culture and Experience, 1955; 2nd edition, 1971, University of Pennsylvania Press), has been taken from A. Lock (1981) “Universals in human conception,” in P.L.F. Heelas and A.J. Lock (eds.) Indigenous Psychologies: The Anthropology of the Self. London: Academic Press, pp. 19-36, with minor revisions.

__________________________________

Alfred Irving “Pete” Hallowell (1892–1974) was an American anthropologist, archaeologist and businessman. He was born in Philadelphia, Pennsylvania, and attended the Wharton School of the University of Pennsylvania, receiving his B.S. degree in 1914, his A.M. in 1920, and his Ph.D. in anthropology in 1924. He was a student of the anthropologist Frank Speck. From 1927 through 1963 he was a professor of anthropology at the University of Pennsylvania, excepting 1944 through 1947, when he taught the subject at Northwestern University. Hallowell’s main field of study was Native Americans.

_________________________________

NOTE: I’m “looking into” concepts of “self” and “self-awareness” after confronting, over and over again, the claim that “some high number” of Asperger types lack “self-esteem” – another of those sweeping generalities that likely is a ‘social judgement’ from the “outsider” – parent, teacher, psychologist, counselor, therapist, hairdresser, clerk at convenience store, neighbor or any bystander caring to comment on child behavior. This “lack of self-esteem” has become a “fad, cliché, causal certainty” for almost any perceived “human behavioral problem” in American psychology, education, child-rearing, pop-science, media and common gossip. 

My observation of this presentation of “self” (in a socio-cultural context) is that it’s BAD NEWS for Asperger types, or any individual whose inclination is to “develop” his or her own particular expression of self. Here is the problem: self, self-awareness, self-control, self-determination and the myriad applications of the concept of “self” are commonly held to be “real things”; they are not. As pointed out in the selection below, in “normdom” the self is “fictitious” – a creation of culture; culture is a creation of selves.

If an individual is for some reason, “out of sync” with the concept of self that is a co-creation of “homogeneous individuals” who subscribe to the same “cultural code” of belief, behavior, and perception of “reality” – well, it’s obvious that one is “in trouble” from the start: How does one “grow, create, construct” a familiar, comfortable, interesting, exploratory concept of self in a hostile socio-cultural environment? Even more difficult is the “biological, evolutionary” possibility, that one’s brain organization, and indeed, one’s experience of the environment, and perceptual fundamentals, are truly “alien” to those of the majority.  

As for “self-esteem” – is this not a concept of social conformity? 

In contemporary culture, the selfie = the self. Posting selfies on social media advertises one’s conformity to a culturally “approved” definition of “self” – which for girls and women, is an “image only” competition for social status. The desperation of “adult” women to conform to “imaginary standards” results in some very regrettable behavior. 

If one’s internalized “picture” of self matches that of what is expected and demanded by the dominant culture, then one is judged to “have self-esteem”. Any person who doesn’t measure up to the cultural “image” (imaginary standard) lacks self-esteem. The most obvious example today, is the crisis of “self-hatred” in young women due to highly distorted “ideals” of body type, promoted by misogynistic American cultural standards. External form is declared to be the self.    

___________________________________________________________________________________________

Excerpt. Full article: http://www.massey.ac.nz/~alock/virtual/hallowel.htm

This info is from the anthropological POV. 

Three things may be said about self-awareness:

(i) Self-awareness is a socio-cultural product. To be self-aware is, by definition, to be able to conceive of one’s individual existence in an objective, as opposed to subjective, manner. In G. H. Mead’s (1934) terms, one must view oneself from ‘the perspective of the other‘. Such a level of psychological functioning is only made possible by the attainment of a symbolic mode of representing the world. Again, this mode of mental life is generally agreed to be dependent upon the existence of a cultural level of social organization. We thus come to a fundamental, though apparently tautologous point: that the existence of culture is predicated upon that of self-awareness; and that the existence of self-awareness is predicated upon that of culture. In the same way as in the course of evolution the structure of the brain is seen as being in a positive-feedback relationship with the nature of the individual’s environment, so it is with culture and self-awareness: the self is constituted by culture which itself constitutes the self.

(ii) Culture defines and constitutes the boundaries of the self: the subjective-objective distinction. It is an evident consequence of being self-aware that if one has some conception of one’s own nature, then one must also have some conception of the nature of things other than oneself, i.e. of the world. Further, this distinction must be encapsulated explicitly in the symbols one uses to mark this polarity. Consequently, a symbolic representation of this divide will have become ‘an intrinsic part of the cultural heritage of all human societies‘ (Hallowell, 1971: 75). Thus, the very existence of a moral order, self-awareness, and therefore human being, depends on the making of some distinction between ‘objective’ (things which are not an intrinsic part of the self) and ‘subjective’ (things which are an intrinsic part of the self).

This categorical distinction, and the polarity it implies, becomes one of the fundamental axes along which the psychological field of the human individual is structured for action in every culture. … Since the self is also partly a cultural product, the field of behaviour that is appropriate for the activities of particular selves in their world of culturally defined objects is not by any means precisely coordinate with any absolute polarity of subjectivity-objectivity that is definable. (Hallowell, 1971: 84)

Similarly, Cassirer (1953: 262) in the context of kinship terminology, writes:

language does not look upon objective reality as a single homogeneous mass, simply juxtaposed to the world of the I, but sees different strata of this reality: the relationship between object and subject is not universal and abstract; on the contrary, we can distinguish different degrees of objectivity, varying according to relative distance from the I.

In other words, there are many facets of reality which are not distinctly classifiable in terms of a polarity between self and non-self, subjective or objective: for example, what exactly is the status of this page – is it an objective entity or part of its author’s selves; an objective entity that would exist as a page, rather than marks on a screen, without a self to read it? Again, am I responsible for all the passions I experience, or am I as much a spectator of some of them as my audience is? While a polarity necessarily exists between the two – subjective and objective/self and non-self – the line between the two is not precise, and may be constituted at different places in different contexts by different cultures. The boundaries of the self and the concomitant boundaries of the world, while drawn of necessity, are both constituted by cultural symbolism, and may be constituted upon differing assumptions.

(iii) The behavioural environment of individual selves is constituted by, and encompasses, different objects. Humans, in contrast to other animals, (that need for human exception again) can be afraid of, for example, the dark because they are able to populate it with symbolically constituted objects: ghosts, bogey men, and various other spiritual beings. (Supernatural, magical entities grew out of “real” danger in the environment: just as did “other” animals, we evolved in natural environments, in which “being afraid of the dark” is a really good reaction to the “the dark” because it’s populated by highly dangerous predators – it’s still a good “attitude” to have when in a human city today.)

As MacLeod (1947) points out,

purely fictitious objects, events and relationships can be just as truly determinants of our behaviour as are those which are anchored in physical reality.

Yes, this is a serious problem in humans; the inability to distinguish natural from supernatural cause-explanation relationships leaves us vulnerable to bad decision-making and poor problem-solving.

In Hallowell’s view (1971: 87):

such objects, (supernatural) in some way experienced, conceptualised and reified, may occupy a high rank in the behavioural environment although from a sophisticated Western point of view they are sharply distinguishable from the natural objects of the physical environment.* However, the nature of such objects is no more fictitious, in a psychological sense, than the concept of the self.

*This sweeping claim to “sophistication” is typical over-generalization and arrogance on the part of Western academics, who mistake their (supposedly) superior beliefs as common to all humans, at least in their cultural “fiefdoms”. The overwhelming American POV is highly religious, superstitious, magical and unsophisticated; the supernatural domain (although imaginary) is considered to be the source of power that creates “everything”. 

This self-deception is common: religion exempts itself from scrutiny as to its claims for “absolute truth” above and beyond any rational or scientific description of reality. It’s a case of, “You don’t question my crazy beliefs, and I won’t question yours.” 

 

The irony of social living / reproductive isolation

Futurists are always talking about the human race voyaging to distant star systems. Really? All 7 billion of us? They are liars: a handful of elites will “escape” human-caused disasters and run away to screw up other planets.

We tend to think of isolation as a geographical phenomenon. The wilds of Alaska, unpopulated except by old geezers who have a “problem” functioning in a city or town, or belonging to a family. Or edgy patched-together families with haphazard living arrangements, for whom life on-the-fly means chronic failure. People who by actual movement, and society’s encouragement, drift farther and farther away from “golden cities” that are jam-packed with successful, educated, well-off people; “official, professional” humans who shop, attend the arts and eat peculiar expensive food, the cost of which could support entire families for months. Isolated people who “belong to” certain geographic islands in the sky, which protect them from contamination by the world’s lower classes.

The “winners” of society are by definition the owners and occupiers of the tiny top of the global pyramid; individuals who circulate the globe like the Albatross, doomed to soar the empty skies between red carpets and charity events. Wealth and power guarantee social isolation for the wealthy and powerful and that’s the way they want it. The reward for “making it” is isolation from those one has left behind.

I’ve written before about “the species definition problem” as it applies to hominids, and specifically Homo sapiens. One of the vehicles toward speciation is reproductive separation and isolation. A species migrates and encounters a geographic barrier and divides. One group seeks a path around the mountain range, body of water, or climate boundary and the other decides to stay put. The separation can result in reproductive isolation, or eventually, speciation, should the two groups remain disconnected from each other for an extended period of time.

Society erects similar barriers for modern humans, but based on wealth and class, not on geography. Picture a slice of New York City: one that includes both the isolated, heavily guarded towers of the rich and famous, and adjacent neighborhoods with streets and buildings straight out of post-apocalyptic novels; a social and cultural divide exists that effectively ensures that the two groups will (hardly) ever interact, and therefore reproduce. Is this not reproductive isolation?

We have seen again and again in human history that isolation of the “elite” has terrible consequences; too few options for non-incestuous reproduction exist. If reproductive contribution is not diversified, an inferior, inbred and shrinking supply of “talent” occurs. The standard scenario is that “fresh genetic stock” is supplied by a harem arrangement; by “trading” females between top families; and the occasional adoption of healthy outsiders, both male and female, to fill vacancies in the ruling elite. This may have serious results: if the dynasty is made up of weak and isolated individuals, new members, chosen for intelligence and aggression, can easily dispose of the ruling family. Once this is done, the peasants may assume that overthrowing the elite class is possible and even easy.

It may seem unlikely that this violent type of change can happen in modern nations, but reproductive speciation is a likely outcome. The rich and powerful won’t need to reproduce: cyber existence, extreme medical intervention, and replacement of inferior body and brain parts by “perfect” long-lasting artificial components, will isolate those at the top of the pyramid from organic humans even further. And geographic isolation will increase due to expansion to new exotic locations: a residence in earth orbit, or on the moon, will simply confirm the incredible social distance between the elite and humans left behind in decaying cities.

Great! Mars will look like suburban Salt Lake City!

A handy set of clones will allow the rich and powerful to outlive themselves several times over.

 

Phrenology and Brain Scans / Ancient Tools of Psychology

We’re sure lucky that brain scans came along to put an end to this nonsense!

From Frontiers in Psychology, “Fifty psychological and psychiatric terms to avoid”:

(4) Brain region X lights up. Many authors in the popular and academic literatures use such phrases as “brain area X lit up following manipulation Y” (e.g., Morin, 2011). This phrase is unfortunate for several reasons. First, the bright red and orange colors seen on functional brain imaging scans are superimposed by researchers to reflect regions of higher brain activation. Nevertheless, they may engender a perception of “illumination” in viewers. Second, the activations represented by these colors do not reflect neural activity per se; they reflect oxygen uptake by neurons and are at best indirect proxies of brain activity. Even then, this linkage may sometimes be unclear or perhaps absent (Ekstrom, 2010). Third, in almost all cases, the activations observed on brain scans are the products of subtraction of one experimental condition from another. Hence, they typically do not reflect the raw levels of neural activation in response to an experimental manipulation. For this reason, referring to a brain region that displays little or no activation in response to an experimental manipulation as a “dead zone” (e.g., Lamont, 2008) is similarly misleading. Fourth, depending on the neurotransmitters released and the brain areas in which they are released, the regions that are “activated” in a brain scan may actually be being inhibited rather than excited (Satel and Lilienfeld, 2013). Hence, from a functional perspective, these areas may be being “lit down” rather than “lit up.”

Phrenology: Examining The Bumps of Your Brain

PSYCHCENTRAL Website, By Associate Editor 

The next time you say, “so and so should have her head examined,” remember that this was literally done in the 19th century.

Phrenology, as it became known, is the study of brain function. Specifically, phrenologists believed that different parts of the brain were responsible for different emotional and intellectual functions. Furthermore, they felt that these functions could be ascertained by measuring the bumps and indentations in your skull. That is, the shape of your skull revealed your character and talents.

Viennese doctor and anatomist Franz Josef Gall originated phrenology, though he called it cranioscopy. He was correct in saying that brain function was localized (this was a novel idea at the time), but unfortunately, he got everything else wrong.

When Gall was young, he noticed a relationship between people’s attributes and behaviors and the shape of their heads. For instance, he observed that his classmates who had better memories had protruding eyes. This inspired him to start forming his theories and collecting anecdotal evidence. It’s this type of evidence that is the foundation of phrenology.

The problem? Phrenologists would simply dismiss cases that didn’t support their principles, or just revise their explanation to fit any example.

It was also thought that criminals could be identified by the shape of their brains.

Phrenology’s Principles

Johann Spurzheim collaborated with Gall on his brain research, and he is the one who actually coined the term phrenology. He eventually went out on his own. He believed that there were 21 emotional faculties (the term for abilities or attributes) and 14 intellectual faculties.

Phrenology had five main principles, which Spurzheim laid out in Outlines of Phrenology (Goodwin, 1999):

  1. “The brain is the organ of the mind.”
  2. The mind consists of about three dozen faculties, which are either intellectual or emotional.
  3. Each faculty has its own brain location.
  4. People have different amounts of these faculties. A person who has more of a certain faculty will have more brain tissue at that location.
  5. Because the shape of the skull is similar to the shape of your brain, it’s possible to measure the skull to assess these faculties (known as the “doctrine of the skull”).

In this text, Spurzheim featured highly detailed descriptions of the faculties and their locations. Spurzheim popularized phrenology in the U.S. While he was on a lecture tour in America, he passed away. Former attorney turned phrenologist George Combe took over Spurzheim’s work and kept his categories.

Phrenology’s Popularity

Phrenology was particularly popular in the U.S. because it fit so well with the idea of the American dream – the notion that we can accomplish our goals despite a humble heritage. Spurzheim believed that the brain was like a muscle that could be exercised. Like weights for your biceps, a good education could strengthen your intellectual faculties. Plus, phrenology promised to improve the public’s everyday lives with simple solutions.

Soon, phrenology became big business and spread to various areas of life. Phrenologists would test couples for compatibility, potential suitors for marriage, and job applicants for different positions.

Brothers Lorenzo and Orson Fowler (who, as an Amherst College student, actually charged students two cents a head) became phrenology marketing gurus. They opened up phrenology clinics, sold supplies to other phrenologists and even started the American Phrenological Journal in 1838. (Its last issue was published in 1911.) Sound familiar?

The Fowler brothers sold pamphlets on a variety of subjects. A few of the titles: The Indications of Character, Wedlock and Choice of Pursuits. They also gave lectures and offered classes to phrenologists and the public.

They even created a faculties manual that a person would take home after being examined by a phrenologist. The phrenologist would indicate the strength of a faculty from two to seven and then check either the box that said “cultivate” or “restrain.” Then, the person would refer to the necessary sections of the 175-page book.

While much of the public was fascinated by phrenology, the scientific community wasn’t impressed. By the 1830s, it was already considered pseudoscience. Pierre Flourens, a French physiologist and surgeon, questioned the movement and discredited it by performing experimental studies. He experimented on a variety of animals by observing what happened when he’d remove specific sections of their brains.

But science didn’t cause phrenology to fall out of favor. Psychology professionals offering new methods did.

Phrenology’s Influence on Psychology

If you’ve ever read an introductory psychology book, you might remember that phrenology was depicted as basically a fraud. It was viewed “as a bizarre scientific dead end in which charlatans read character by looking at the bumps on someone’s head,” wrote C. James Goodwin in A History of Modern Psychology.

But as Goodwin said in his book, that’s a simplistic explanation. In fact, phrenology helped move American psychology forward in various ways. (And while there were charlatans, there were phrenologists who truly wanted to help.)

For instance, the basis of phrenology was individual faculties, and thereby individual differences. Phrenologists were interested in analyzing and measuring individual differences, like psychologists do today.

As mentioned above, phrenology also proposed that one’s DNA didn’t predetermine one’s life. The environment, including education, played a big role, too. You could improve upon your skills and talents. You — not your genes — had control over your future, and that was a hopeful and exciting notion. It still is!

 

 

New Topic / Intuitive Empathy

The by-now totally clichéd estimate of Asperger empathy is that we have ZERO ability to “sense, feel, react to the emotional state of other humans” – which ranks us below a GPS system, whose human-sounding digital “voice” lends a friendly or concerned tone to its responses.

For (some) Aspergers this is a strange notion about empathy, or the lack of it; the idea of empathy itself is confusing, thought-provoking and mysterious. I think what we lack are the stock social responses that are required to “prove” that we care about people. The troubling belief on the part of social typicals is that these “canned” commiserations are the sole content of “empathy.” The mistake is in overlooking or ignoring intuition as a means of understanding “the other” being, whether it is an animal or “the human animal”. Intuition functions without conscious awareness, lacks words (until the intuitive process is grasped by intellect), and may not be demonstrated outwardly. But neurotypicals demand an immediate and scripted social response – which is not compatible with letting intuition do its work.

There is much going on inside a person, and what they are doing or saying may not be the “truth.” Whatever situation they describe may not be what they are actually concerned with; their emotions may tell a different story. I know that my reaction to hearing someone’s hurt or disappointment is to automatically engage my “intuitive” way of learning: can I help the person to see their feelings more clearly or objectively, and therefore alleviate emotional distress? This does not mean that I don’t “feel” for their suffering; I merely distance myself in order to be helpful.

I believe that this is something I learned to do; as a child I was supersensitive to other people’s “emanations” of pain. Looking back, I can recognize that instances of “meltdown” were triggered by seeing people who were injured, diseased, maimed or disabled – and I simply fell apart. I was seriously chastised as “unfeeling” for having such a severe reaction; one was supposed to “not notice” such things because it upset the poor unfortunate person in a wheelchair or with other obvious damage. Is being invisible better? The social etiquette was to ignore suffering. Only outside of an afflicted person’s presence did one gossip about misfortune.

As I grew older these reactions subsided, and I believe it was because I learned to dampen emotion and switch to “solution” mode. This seems natural given my personality as a problem solver. It should be obvious that I don’t write a blog about Aspergers – about the intersection between “us” and the neurotypically dominated world of society – for no reason. Yes, it’s personal, problem-solving, and satisfying to my curiosity, but I also empathize with the huge number of modern social typicals who suffer from the same rigid, boring, shallow and life-stifling environment of American culture that drives Aspergers to despair.

It hurts me to see people be miserable, confined and bored and not know why – and to blame themselves for not living up to impossible social demands; to be trapped in the lie that moment-to-moment emotional chaos is LIFE. Social attention is held to be the sole value in living.

Only contrived gestures count – along with answers to questions on surveys, tests and magic brain scans.

This is an actual “Empathy” brand card that one can purchase and send to a woman who just had a miscarriage.


This could be the strangest website I’ve ever encountered.

A slideshow of party / event acts posted by Dizanne Productions in Hawaii… a strange social subculture… the alien world of neurotypicals… kinda scary, yes?

Live baby chicks to “fondle” at your Easter party…

Balloon Hats “Inflation can be fun”

https://www.yelp.com/biz_photos/dizanne-productions-honolulu?select=uqPxUwandSdS-ZOXcaDwTg&utm_campaign=www_photo_share_popup&utm_medium=copy_link&utm_source=(direct)

Bonus: 


Why does GOD let people starve to death? / Insane Neurotypical Christian Response

FROM “Not Ashamed of the Gospel” website. (You ought to be ashamed…)  https://notashamedofthegospel.com/apologetics/why-god-doesnt-feed-all-starving-children/

3 Strange But True Reasons Why God Doesn’t Feed All the Starving Children in The World

Peter Guirguis / Apologetics

OMG! I will never apologize for being Asperger or Atheist. This is how “normal neurotypicals” see the world; the universe is a supernatural monstrosity.

Evil exists, but not in Nature; it is the consequence of the beliefs and behavior of Modern Social Homo sapiens. Why isn’t this dangerous “mental derangement” featured in the DSM, when Autism is?

God, Can You Please Make it Rain Turkey and Gravy?

If God is all-powerful, then can’t He make it rain turkey and gravy from heaven to feed all the starving kids in the world? The answer is that of course God can do that if that’s what He wanted to do. But since God doesn’t make it rain turkey and gravy upon the starving kids around the world, then we have to ask, “Why doesn’t He?”

If you’re not able to answer this question, then one of two things is going to happen to you. You’re going to struggle with your faith because you’re going to have doubts that God is a good God. Or you’re never going to find out the truth about God, and you’ll make the mistake of thinking that God doesn’t exist.

This article is for you if:

1. You’ve ever wondered why God doesn’t feed starving kids around the world, and you struggle with the answer.

2. You’re skeptical of the Christian God or other gods.

3. You want to be able to answer this question when it’s asked of you in an accurate and positive way.

Why The “Strange But True” Title? The reason I call these reasons that I’m about to share with you “strange” is because if I were God, I would do things differently. But thank goodness, I’m not God. (OMG!)

What may be strange to one person may not be considered strange to another. So depending on how familiar you are with this subject, (NT insanity?) you may agree with me that these reasons are “strange but true”, or you may not. Either way, I hope this will spark a good dialog about this topic. (Totalitarian demand for obedience to supernatural hallucinations is a really good jumping off point for “good dialog”!)

I’ve thought of three different reasons why God doesn’t feed the starving children of the world.

Reason #1 – It Isn’t God’s Responsibility to Feed the Starving Children of the World

Every year, I have the privilege of going through the one-year Bible plan. That means that I will read the entire Bible in one year. I don’t share this to impress you. But I do share it to establish that I’m quite familiar with the Bible. Of all the times that I have read the Bible from cover to cover, I can’t think of a single Bible verse in which God makes a promise to feed all the starving children in the world. (But there are threats that “God” will make people eat their own children!) So when somebody accuses God of being unjust because He has the capability to feed starving children, and He doesn’t, then it’s that person that has a misunderstanding of God. (No misunderstanding here: your imaginary master is a true psycho-sociopath)

GOD: “Hey, it’s not MY JOB to control the vicious uncaring assholes I made in my image. LOL!” 

If God Isn’t Responsible For Feeding Starving Children, Then Who Is?

The answer is you and me. I can think of numerous Bible verses in which God instructs His children to feed the poor people of the world.

And Christians are doing such a great job of it! Bomb entire nations into a state that can only be called “Hell on Earth”, and then send “missionaries of democracy” with bags of leftover “dog food”. Take photos: lie, brag about how “empathetic” and compassionate you and your “god” are. And of course, “profit” from the crimes. 

Proverbs 28:27 says, “He who gives to the poor will not lack, But he who hides his eyes will have many curses.” James 2:15-16 says, “If a brother or sister is naked and destitute of daily food, and one of you says to them, ‘Depart in peace, be warmed and filled,’ but you do not give them the things which are needed for the body, what does it profit?” So if you’re one of those people that thinks God should feed the starving kids around the world, then you are shifting the responsibility.

God isn’t responsible for feeding starving children; you and I are. Then why not demonstrate ethical behavior by refraining from creating mass suffering through predatory wars, practicing profitable poverty as “economics” and enforcing starvation?

Reason #2 – God Isn’t Like Humans

Atheists make a mistake when they say things like, “If I saw a starving child and had the power to feed him and I don’t, then I am evil. (Uh-yeah! That logically is cruel uncaring behavior) That’s the same thing with God, He is evil because He has the power to feed starving children and He doesn’t.” (You said it! Why not believe your own “instincts” about all this Christian “we’re the good guys” social evil?)

The mistake that atheists make here is that they compare themselves to God, or they compare God to themselves. They put themselves in God’s shoes. (This is utterly BONKERS. God does not exist, and he certainly wouldn’t wear shoes if he did)

God’s goals are different than our goals. His purposes are different than our purposes. His way of justice is different than the human way of justice. But here’s the lesson that’s to be learned: any time you blame God for not doing something that you would do, you’re making an idol in your own image. (Christianity IS a religion of “idols”)

What does that mean? It means that you’re making up your own concept of how God is supposed to act, which is something the Bible warns us about. (My, my – mustn’t use what little intelligence humans have to realize that religion is a con game)

Reason #3 – God’s Justice is Coming Soon For All

You and I want to see justice have its way immediately. Think about all the hate crimes in the world, the rapes, and the murders. You and I want to see those people (Christians commit hate crimes, rape, murder and a long list of heinous behaviors, as a matter of religious and political policy) get what they deserve.

But while we judge others for their heinous crimes, we overlook the sins that we commit in God’s eyes. While God does see hate crimes, rapes, and murders as sins, He also sees lying, cheating, and hating people as sins too. (Your god hates human beings and other living things)

So since God is a just God, then He’s going to have to give justice to all if He were to judge the world today. That means that there would be a lot of people who would receive punishment for eternity for breaking God’s standards. (And how LOW these are!) So instead, God is saving His judgment for Judgment Day. That’s when everyone is going to get judged for what they did on earth.

Those who broke God’s standards and did not receive His son Jesus for salvation will end up going to hell.

This is deranged thinking by any standard; it expresses rage and hatred for all human beings; it’s sick, sadistic and “loves” torture. Why is “religious psychopathy” not in the DSM? 


But those who do put their faith and trust in Christ will end up going to heaven. So when you don’t see justice taking place immediately, it’s because God is giving everyone a chance to repent, and put their faith in Jesus Christ as Lord and Savior.

How About Other Reasons?

I have to admit, I’m not a know-it-all. That’s where you come in. Can you think of any other reasons why God doesn’t feed the starving kids around the world? (“He” is a hallucination: “He” doesn’t exist. Thank God!)

Share them in the comments below.

I leave you to read the comments: I need to spend some time in Nature, where evil does not exist…

But millions of Americans believe it’s true…


Body language: The crotch displays of men (primates)

Right: This is one area where I’m relieved that “social conventions” restrict males from walking around naked. If they did, we’d have to put up with this type of behavior. LOL

Bipedalism was not a result of the crotch display, but it gave male bipeds a great opportunity to enhance traditional primate displays, and to threaten and intimidate other males. 


The original “blue balls”

Body language: The crotch displays of men

(nipped for brevity)

https://www.psychmechanics.com/2015/05/body-language-crotch-displays-of-men.html

One way in which males display dominance is by displaying their crotch…this behavior is something that we’ve inherited from our ancestors. The most common way in which men display their crotch is by taking up the thumbs-in-belt gesture.

Thumbs in belt or pockets

This gesture is used by men to display a dominant, sexually aggressive attitude. It’s perhaps the most direct sexual display a man can make towards a woman. (You’ve got to be joking!) Men also use this gesture to stake their territory or to show other men that they’re not afraid. This gesture communicates the non-verbal message, “I am virile, powerful and dominant”. 

The Obama White House criticized Putin’s posture. I guess the seated crotch display, when done properly, does intimidate the Hell out of other males. LOL

In a seated position, it becomes kind of difficult for men to assume this gesture but they don’t shy away from displaying their crotch if they want to communicate the message of dominance. They’ll spread their legs and lean slightly backward so that their crotch comes forward and in full display.

Watch any group of young men who’re engaged in an activity that requires them to display a macho attitude and you’ll notice that they often stand with their legs apart and their hands somehow highlight their crotch.

For instance, when sports teams are ready for ‘action’ you may notice the players continually adjusting and re-adjusting their crotch as they unconsciously try to assert their masculinity. Interestingly, this crotch display gesture is also seen in apes and some other primates. Even though the apes don’t wear any belt or trousers, still they highlight their crotch with their hands when they have to stake their territory and show other apes that they’re unafraid.

Some primates such as baboons are a bit more direct. They display dominance by spreading their legs and displaying their penis, giving it continual adjustment or even waving it at their enemies.

What’s even more mind-boggling is that the same penis-waving tactic is employed even today by some New Guinea tribes who are essentially cut off from modern civilization.

This clearly indicates that such behavior is an evolved tendency in Homo sapiens.

Dropping the pants

I must have been around 11 or 12 years old. It was a bright Sunday morning and we had arranged a cricket match with some schoolmates. Everything was normal as the game progressed and as usual, both the teams rejoiced at the high points and wore disappointed expressions at the low points of the game.

A rather strange thing happened when the game was over. It was a narrow contest right to the end but our team lost. Needless to say, the other team was elated. They jumped with joy, yelled and screamed. But one particular boy was over-excited. He felt so powerful and dominant due to the win that he dropped his pants and showed his penis to our team. (Why not to the other team?)

My team-mates laughed it off but I was taken aback.

I never forgot that incident. I wanted to know why he did that. What possible motive or desire could force a person to resort to such an extreme behavior? (Was the writer really so naive?)

It remained an unanswered question, an unresolved problem in my psyche for a long time until years later, when I read about human evolution and body language, the whole picture became clear to me.

Another similar and common incident that men experience at least once in their lives is when they jokingly question the size of their friend’s penis, the latter usually gets defensive and retorts with something like, “If I show it to you guys, you’ll become afraid and run away”. (Really? Guys say this?)

He may not realize it but unconsciously he knows that the penis display is an effective way to display dominance, and so do his friends.

I’m sure you’re intelligent enough to understand, by now, why people display their middle fingers when they want to offend someone and/or to feel dominant.

It’s not an acceptable behavior anymore in a civilized society for adults to drop their pants and show their penises so they use their middle fingers to symbolically convey the same feelings.

Some of you might ask, “Why do women who wear jeans assume the ‘thumbs-in-belt’ gesture?” or “Why do women show their middle fingers, when they have no actual penises to display?”

Well, it’s most probably a behavior that they’ve learned from men. (Ya think?) Penis display, symbolical or not, has come to be strongly associated with offending someone or showing dominance in the human psyche, thanks to its effectiveness.

So, women are just using a tool from men’s psychological repertoire because they know how effective it can be.

Subtle forms of crotch display

No, no, no, never…

Yes. 

Belt and crotch grabbing while dancing is a subtle (?) form of crotch display, and men across different cultures do it – from Michael Jackson to Salman Khan. Other subtle forms include wearing tight-fitting pants, small-size Speedo swimming trunks, or even dangling a large bunch of keys/chains on the front or side of the crotch.

Baseball players are particularly “crotch grab prone”. The NHL crotch grab: a puck to the nuts. 

Rather ambiguous message, don’t you think?

The wallets that have chains dangling at the side of the crotch became popular among men because they helped draw attention to the crotch.

To conclude, consider what George Carlin, the late American comedian, had to say about wars:

“War is nothing but a whole lot of prick-waving. War is just a lot of men standing around in a field waving their pricks at one another. Of course, the bombs, the rockets, and the bullets are all shaped like dicks. It’s a subconscious need to project the penis into other people’s affairs.”
______________________________________________________________

Are Japanese women tired of neotenic males, perhaps?

Caption: REALISTIC male mannequins. How pitiful…