Emotions: What a Mess! / Physiology, Supernatural Mental State, Words

This drives me “nuts” – emotions ARE physiological responses to the environment; and yet, psychologists (and other sinners) continue to conceive of emotions as “mental or psychological states” and “word objects” that exist somewhere “inside” humans, like colored jelly beans in a jar, waiting to be “called on” by their “names”. Worse, other “scientists hah-hah” also continue to misconstrue “physiology” as arising from some abstract construct or supernatural domain (NT thingie) called emotion.

Physiological Changes Associated with Emotion

https://www.ncbi.nlm.nih.gov/books/NBK10829/

The most obvious signs of emotional arousal involve changes in the activity of the visceral motor (autonomic) system (see Chapter 21). Thus, increases or decreases in heart rate, cutaneous blood flow (blushing or turning pale), piloerection, sweating, and gastrointestinal motility can all accompany various emotions. These responses are brought about by changes in activity in the sympathetic, parasympathetic, and enteric components of the visceral motor system, which govern smooth muscle, cardiac muscle, and glands throughout the body. (This is obviously real physical activity of the body, and not a magical, psychological or mental “state.”) As discussed in Chapter 21, Walter Cannon argued that intense activity of the sympathetic division of the visceral motor system prepares the animal to fully utilize metabolic and other resources in challenging or threatening situations.

Honestly? I think in the above we have a working description of the ASD / Asperger “emotional” system: NO WORDS. So-called “emotions” are a SOCIALLY GENERATED SYSTEM that utilizes language to EXTERNALLY REGULATE human “reactivity” – that is, the child learns to IDENTIFY its physiological response with the vocabulary supplied to it by parents, teachers, other adults and by overhearing human conversation, in which it is immersed from birth.

Conversely, activity of the parasympathetic division (and the enteric division) promotes a building up of metabolic reserves. Cannon further suggested that the natural opposition of the expenditure and storage of resources is reflected in a parallel opposition of the emotions associated with these different physiological states. As Cannon pointed out, “The desire for food and drink, the relish of taking them, all the pleasures of the table are naught in the presence of anger or great anxiety.” (This is the physiological state that ASD / Asperger children “exist in” when having to negotiate the “world of social typicals.” The social environment is confusing, frustrating, and alien. Asking us “how we feel” in such a circumstance will produce a “pure” physiological response: anxiety, fear, and the overwhelming desire to escape.)

Activation of the visceral motor system, particularly the sympathetic division, was long considered an all-or-nothing process. Once effective stimuli engaged the system, it was argued, a widespread discharge of all of its components ensued. More recent studies have shown that the responses of the autonomic nervous system are actually quite specific, with different patterns of activation characterizing different situations and their associated emotional states. (What is an emotional state? Emotion words are not emotions: they are language used to parse, identify and “name” the physiologic arousal AS SOCIETY DICTATES TO BE ACCEPTABLE.) Indeed, emotion-specific expressions produced voluntarily can elicit distinct patterns of autonomic activity. For example, if subjects are given muscle-by-muscle instructions that result in facial expressions recognizable as anger, disgust, fear, happiness, sadness, or surprise without being told which emotion they are simulating, each pattern of facial muscle activity is accompanied by specific and reproducible differences in visceral motor activity (as measured by indices such as heart rate, skin conductance, and skin temperature). Moreover, autonomic responses are strongest when the facial expressions are judged to most closely resemble actual emotional expression and are often accompanied by the subjective experience of that emotion! One interpretation of these findings is that when voluntary facial expressions are produced, signals in the brain engage not only the motor cortex but also some of the circuits that produce emotional states. Perhaps this relationship helps explain how good actors can be so convincing. Nevertheless, we are quite adept at recognizing the difference between a contrived facial expression and the spontaneous smile that accompanies a pleasant emotional state. (Since modern humans are notoriously “gullible” to the false words, body language and manipulations of “con men” of all types, how can this claim be extended outside a controlled “experiment” in THE LAB? Having worked in advertising for 15 years, I can assure the reader that finding models and actors who could act, speak and use body language that was “fake but natural” was a constant challenge. In other words, what was needed was a person who could “fake” natural behavior. Fooling the consumer was the GOAL!)

This evidence, along with many other observations, indicates that one source of emotion is sensory drive from muscles and internal organs. This input forms the sensory limb of reflex circuitry that allows rapid physiological changes in response to altered conditions. However, physiological responses can also be elicited by complex and idiosyncratic stimuli mediated by the forebrain. For example, an anticipated tryst with a lover, a suspenseful episode in a novel or film, stirring patriotic or religious music, or dishonest accusations can all lead to autonomic activation and strongly felt emotions. (Are these “events, anticipated or actualized” not social constructs that are learned? Would any child grow up to “behave patriotically” if he or she had not been taught to do this by immersion in the total social environment, which “indoctrinates” children in the “proper emotions” of the culture?) The neural activity evoked by such complex stimuli is relayed from the forebrain to autonomic and somatic motor nuclei via the hypothalamus and brainstem reticular formation, the major structures that coordinate the expression of emotional behavior (see next section). (Is exploitation of this “neural activity” not the “pathway” to training social humans to “obey” the social rules?)

In summary, emotion and motor behavior are inextricably linked. (Why would anyone think that they are not? Emotion is merely the language used to manipulate, interpret and communicate the physiology.) As William James put it more than a century ago:

What kind of an emotion of fear would be left if the feeling neither of quickened heart-beats nor of shallow breathing, neither of trembling lips nor of weakened limbs, neither of goose-flesh nor of visceral stirrings, were present, it is quite impossible for me to think … I say that for us emotion dissociated from all bodily feeling is inconceivable.

William James, 1893 (Psychology, p. 379)

NEXT: The representation of “emotions” as “thingies” that can be experienced and eaten! Are we to believe that 34,000 distinct “emotion objects” exist “in nature / in humans” or are these “inventions” of social language? 

Plutchik’s Wheel of Emotions: What is it and How to Use it in Counseling?

Can you guess how many emotions a human can experience?

The answer might shock you – it’s around 34,000.

With so many, how can one navigate the turbulent waters of emotions, their different intensities and compositions, without getting lost?

The answer – an emotion wheel.

Through years of studying emotions, Dr. Robert Plutchik, an American psychologist, proposed that there are eight primary emotions that serve as the foundation for all others: joy, sadness, acceptance, disgust, fear, anger, surprise and anticipation. (Pollack, 2016)

This means that, while it’s impossible to fully understand all 34,000 distinguishable emotions (what is referred to is merely “vocabulary” that humans have come up with, and not emotion thingies that exist “somewhere”), learning how to accurately identify how each of the primary emotions is expressed within you can be empowering. It’s especially useful for moments of intense feelings when the mind is unable to remain objective as it operates from its older compartments that deal with the fight or flight response. (Watkins, 2014) (This refers to the “pop-science” theory of the additive brain (lizard brain, etc.), which is utter fantasy)

NEXT: Some Definitions of Emotions / Rather confusing, conflicting, unsatisfying, nonspecific descriptions – an indication that we’ve entered the supernatural realm of word concepts. Aye, yai, yai!

From introductory psychology texts

Sternberg, R. In Search of the Human Mind, 2nd Ed. Harcourt Brace, 1998, p. 542: “An emotion is a feeling comprising physiological and behavioral (and possibly cognitive) reactions to internal and external events.”

Nairne, J. S. Psychology: The Adaptive Mind, 2nd Ed. Wadsworth, 2000, p. 444: “. . . an emotion is a complex psychological event that involves a mixture of reactions: (1) a physiological response (usually arousal), (2) an expressive reaction (distinctive facial expression, body posture, or vocalization), and (3) some kind of subjective experience (internal thoughts and feelings).”

From a book in which many researchers in the field of emotion discuss their views of some basic issues in the study of emotion. (Ekman, P., & Davidson, R. J. The Nature of Emotion: Fundamental Questions. Oxford, 1994)

Panksepp, Jaak, p. 86. Compared to moods, “emotions reflect the intense arousal of brain systems that strongly encourage the organism to act impulsively.”

Clore, Gerald L., p. 184. “. . . emotion terms refer to internal mental states that are primarily focused on affect (where ‘affect’ simply refers to the perceived goodness or badness of something).” [see Clore & Ortony (1988) in V. Hamilton et al. Cognitive Science Perspectives on Emotion and Motivation, 367–398]

Clore, Gerald L., pp. 285–6. “If there are necessary features of emotions, feeling is a good candidate. Of all the features that emotions have in common, feeling seems the least dispensable. It is perfectly reasonable to say about one’s anger, for example, ‘I was angry, but I didn’t do anything,’ but it would be odd to say ‘I was angry, but I didn’t feel anything.’”

Ellsworth, Phoebe, p. 192. “. . . the process of emotion . . . is initiated when one’s attention is captured by some discrepancy or change. When this happens, one’s state is different, physiologically and psychologically, from what it was before. This might be called a ‘state of preparedness’ for an emotion . . . The process almost always begins before the name [of the emotion is known] and almost always continues after it.”

Averill, James R., pp. 265–6. “The concept of emotion . . . refer[s] to (1) emotional syndromes, (2) emotional states, and (3) emotional reactions. An emotional syndrome is what we mean when we speak of anger, grief, fear, love and so on in the abstract. . . . For example, the syndrome of anger both describes and prescribes what a person may (or should) do when angry. An emotional state is a relatively short term, reversible (episodic) disposition to respond in a manner representative of the corresponding emotional syndrome. . . . Finally, an emotional reaction is the actual (and highly variable) set of responses manifested by an individual when in an emotional state: . . . facial expressions, physiological changes, overt behavior and subjective experience.”

LeDoux, Joseph E., p. 291. “In my view, ‘emotions’ are affectively charged, subjectively experienced states of awareness.”

 

Question / Is Common Sense even better than Empathy?

My posting has slowed to almost nothing since last Saturday:

Summer at last; warm winds, blue skies, puffy clouds. The dog and I are both delirious over the ability to “get out of” quasi imprisonment indoors.

Into the truck; a short drive to the south, up and over the canyon edge into the wide open space of the plateau. Out into “the world again” striding easily along a two-rut track that goes nowhere; the type that is established by the driver of a first vehicle, turning off the road, through the brush, and headed nowhere. Humans cannot resist such a “lure” – Who drove off the road and why? Maybe the track does go somewhere. And so, the tracks grow, simply by repetition of the “nowhere” pattern. Years pass; ruts widen, deepen, grow and are bypassed, smoothed out, and grow again, becoming as permanent and indestructible as the Appian Way.

This particular set of ruts is a habitual dog-walking path for me: the view, the wind, the light, the sky whipped into a frenzy of lovely clouds… and then, agony. Gravel underfoot has turned my foot, twisting my ankle and plunging me into a deep rut and onto the rough ground. Pain; not Whoops, I tripped pain, but OMG! I’m screwed pain. I make a habit of glancing a few feet ahead to check where my feet are going, but my head was in the clouds.

This isn’t the first time in 23 years that I’ve taken a fall out in the boonies: a banged up shin or knee, a quick trip to the gravel; scraped hands, even a bonk on the head, but now… can I walk back to the truck, or even stand up? One, two, three… up.

Wow! Real pain; there’s no choice. Get to the truck, which appears to be very, very far away at this point. Hobble, hobble, hobble; stop. Don’t stop! Keep going. Glance up periodically to check whether the truck is “growing bigger” – reachable. I always tell myself the same (true) mantra in circumstances like this: shut out time, let it pass, and suddenly, there you will be, pulling open the truck door and pulling yourself inside.

There is always some dumb luck in these matters: it’s my left ankle. I don’t need my left foot to drive home. Then the impossible journey from the truck to the house, the steps, the keys, wrangling the dog and her leash, trying not to get tangled and fall again – falling through the doorway, grabbing something and landing on the couch. Now what?

That was five days ago. Five days of rolling around with my knee planted in the seat of a wheeled office chair, pushing with the right foot as far as I can go, then hopping like a one-legged kangaroo the rest of the way. Dwindling food supplies; unable to stand to cook; zapping anything eligible in the microwave. No milk in my coffee. Restless nights. Any bump to my bandaged foot wakes me up. This is ridiculous! My life utterly disrupted by a (badly) sprained ankle. I think I’m descending into depression.

Bipedalism, of course, begins to take over my thoughts. But first, I try to locate hope on the internet, googling “treatment for sprained ankle.” You’re screwed, the pages of entries say. One begins to doubt “evolution” as the master process that produces elegant and sturdy design. Ankles are a nightmare of tiny bones and connecting ligaments, with little blood supply to heal the damage, and once damaged, a human can expect a long recovery, intermittent swelling and inevitable reinjury for as long as they live.

It seems that for our “wild ancestors” a simple sprain could trigger the expiration date for any individual unlucky enough to be injured: the hyenas, big cats, bears and other local predators circle in, and then the vultures. Just like any other animal grazing the savannah or born into the forest, vulnerability = death. It’s as true today as it ever was. Unless someone is there with you when you are injured, you can be royally screwed: people die in their own homes due to accidents. People die in solo car wrecks. People go for a day hike in a state park and, within an hour or two, require rescue, hospitalization and a difficult recovery, from one slip in awareness and focus. And being in the company of one or more humans hardly guarantees survival. Success may depend on their common sense.

So: the question arises around this whole business of Homo sapiens, The Social Species. There are many social species, and it is claimed that some “non-human” social species “survive and reproduce successfully” because they “travel together” in the dozens, thousands or millions and “empathize” with others of their kind. Really? How many of these individual organisms even notice that another is in peril, other than to sound the alarm and get the hell out of the danger zone or predator’s path? How one human mind gets from reproduction in massive numbers – that is, playing the “numbers game” (1/100, 1/1,000, 1/100,000 new creatures survive in a generation) – and from the congregation of vast numbers in schools and flocks, which merely improves the odds of “not being one of the few that gets caught and eaten,” to “pan-social wonderfulness” is one of the mysteries of the social human mind.

There are occasions when a herd may challenge a predator or a predatory group, and parents (usually the female) will defend offspring in varying manner and degree; but what one notices in encounters (fortuitously caught on camera, posted on the internet or included in documentaries) is that solitary instances are declared to represent “universal behavior” and proof of the existence of (the current fad of) empathy in “lesser animals”. What is ignored (inattentional blindness) and not posted is the usual behavior: some type of distraction or defensive behavior is invested in, but the attempt is abandoned at some “common sense point” in the interaction; the parents give up, or the offspring or herd member is abandoned.

What one notices is that the eggs and the young of all species supply an immense amount of food for other species.

Skittles evolved solely as a food source for Homo sapiens children. It has no future as a species. LOL

I’ve been watching a lot of “nature documentaries” to pass the time. This is, in its way, an extraordinary “fact of nature”. Our orientation to extreme Darwinian evolution (reductionist survival of the fittest) is stunningly myopic. We create narratives from “wildlife video clips” edited and narrated to confirm our imaginary interpretation of natural processes; the baby “whatever” – bird, seal, monkey, or cute cub; scrambling, helpless, clueless, “magically” escapes death (dramatic soundtrack, breathless narration) due to Mom’s miraculous, just-in-the-nick-of-time return. The scoundrel predator is foiled once again; little penguin hero “Achilles” (they must have names) has triumphantly upheld our notion that “survival is no accident” – which in great measure is exactly what it is.

One thing about how evolution “works” (at least as presented) has always bothered me no end: the insistence that the individual creatures which survive to reproduce are “the fittest”. How can we know that? What if, among the hundreds, thousands, millions of “young” produced but almost immediately destroyed or consumed by chance, by random events, by the natural changes and disasters that occur again and again, the genetic potential “to be most fit” had been eliminated, depriving the species of potentially even “better” adaptations than those we see? We have to ask, which individuals are “fittest” for UNKNOWN challenges that have not yet occurred? Where is the variation that may be acted upon by the changing environment?

This is a problem of human perception; of anthropomorphic projection, of the unfailing insistence on belief in an intentional universe. Whatever “happens” is the fulfillment of a plan; evolution is distorted to “fit” the human conceit that, by one’s own superior DNA, survival and reproduction necessarily become fact.

Human ankles (and many other details of human physiology) are not “great feats of evolutionary engineering.”

Like those two-rut roads that are ubiquitous where I live, chance predicts that most of evolution’s organisms “go nowhere” but do constitute quick and easy energy sources for a multitude of other organisms.

 

Overview of personality “theories” + History of “personality concept” / Yikes!

https://www.simplypsychology.org/personality-theories.html

Without any attempt at addressing this enormously complex problem as a whole, it may be worthwhile to recall one recent example of interdisciplinary discussion occurring at the intersection of empirical psychology and normative ethics: a discussion of virtuous character. The latter, a paradigmatic subject matter of virtue ethics at least since Socrates, has recently been reconsidered in the light of experimental results obtained by academic psychology. More specifically, it has been related to the criticism of the concept of personality voiced mostly by social psychologists.

The conceptual and theoretical core of personality psychology, both in its scientific and folk versions (Gilbert & Malone, 1995; Ross, 1977), has usually been constructed around the notion of temporally stable and cross-situationally consistent features: so-called global or robust traits. A recent empirical tradition of situationism, however, seems to provide ample evidence not only for the fact that we are all indeed “prone to dispositionism” of this kind, but also that such “dispositionism is false” (Goldie, 2004, p. 63). The researchers from this tradition deny that there are stable and consistent traits or, alternatively, insist that most actual people don’t exhibit traits of this kind. Rather, the large body of empirical evidence provided (among the research most commonly discussed by virtue ethicists is that by Darley & Batson, 1973; Isen & Levin, 1972; Milgram, 1963; for a more complete review see Doris, 2002) shows that it is the situation in which an agent finds him/herself acting, rather than an allegedly context-independent and stable personality, that accounts for a large amount of human behavior.

The experiments conducted by social psychologists were soon generalized into doubts concerning the usefulness of trait concepts for the purposes of scientific explanation and prediction. Understood in such a context, in turn, they attracted the attention of many philosophers. The empirical results mentioned above could, indeed, have been disquieting, especially if one realized that the very center of traditional philosophical moral psychology, especially within so-called virtue ethics, had been founded on the notion of moral character, with virtues and vices aspiring to exactly the same stability and cross-situational consistency that was undermined in the case of personality. Among the philosophers it was especially Gilbert Harman (1999, 2000) and John Doris (1998, 2002) who stimulated a fierce debate by stating that the situationist literature posed a grave threat to “globalist moral psychologies” (Doris & Stich, 2014), undermining the very basis of both ancient and contemporary virtue ethics.

Such a far-reaching claim, obviously, provoked a strong response (for useful reviews see Alfano, 2013; Appiah, 2008; Goldie, 2004; Miller, 2013a). What seems to have been assumed by at least many disputants from both sides of the controversy, however, was a relatively direct applicability of psychological theses concerning personality to philosophical issues referring to character. In brief, it was the interchangeability of the notions of personality and character that had been presumed. Despite the fact that such an implicit assumption has often been made, these two notions are not identical. True, they are often used interchangeably and the difference is vague, if not obscure. Still, the notions in question can be distinguished from each other, and the effort to draw the distinction is arguably worthwhile because of its bearing on many particular issues, including the above discussion of situationism.

One possible way of exploring the difference between these two concepts is to compare the typical, or paradigmatic, ways of their application as revealed in their respective original domains. Common language is obviously not very helpful here, as it exhibits the very same confusion that is intended to be clarified. Rather, the context of classical virtue ethics (for character), as well as that of academic personality psychology (for personality), is promising. Such a general clue will be used in the following sections. At first, the concepts of character and personality will be investigated both historically and systematically. Then, in turn, a parallel will be drawn between the pair in question and the so-called fact–value distinction, and an analysis conducted of the functions played by both concepts. Finally, the outcomes achieved will be placed in the context of some differences between the fact–value distinction and the Humean is–ought dichotomy.

Historical vicissitudes of the notions

In antiquity the notion of character was inseparably connected with the normative aspect of human conduct and in most contexts amounted to moral qualities of a respective person: to virtues and vices. Such a connection was emphasized in a term alternative to “character”: the Greek word “êthos” (cf. Gill, 1983, p. 472). An evaluative discourse of character can be found in common language and folk psychology (cf. Irwin, 1996), but it is its professional version proper to virtue ethics that is crucial in the present context. The latter philosophical tradition took on its classical form in Socrates and culminated with Aristotle’s (trans. 2000) Nicomachean Ethics, which to this day is a paradigmatic example of a virtue ethical account.

Ancient conceptions of character were descriptive and normative, with both these features closely intertwined. They involved theories of moral and developmental psychology and, at the same time, prescriptions and detailed instructions for character education and character self-cultivation. And it was, importantly, a ‘life-long learning’ account that was provided: it was a rational adult, rather than a child deprived of genuine rationality, who was regarded by Cicero, Seneca, or Plutarch as able to accomplish “character formation through reasoned reflection and decision” (Gill, 1983, p. 470). The standards for the success of such a process were usually considered objective. In the Aristotelian context, for instance, it was the ability to properly perform human natural functions that provided the ultimate criterion.

The ancient Greek and Roman concept of character turned out to be profoundly influential in the following ages – at least, as has been mentioned, until the beginnings of the previous century (for part of the story, see MacIntyre, 2013). Some of the variations on this ancient notion can be found in the Kantian ideal of the ethical personality, the German tradition of Bildung, the 19th-century American model of the balanced character and, last but not least, the Victorian vision of the virtuous character very vivid in the novels from this cultural milieu (Woolfolk, 2002). What is remarkable is that the notion of character, as influential as it used to be, is considerably less important today. Nowadays, in fact, it seems to be mostly substituted by the concept of personality. And it is the history of the process that led to this state of affairs, of the shift “from a language of ‘character’ to a language of ‘personality’” (Nicholson, 1998, p. 52), that can be very revealing in the present context. Two particularly helpful accounts have been provided by Danziger (1990, 1997) and Brinkmann (2010).

Danziger begins his account with an important remark that initially the notion of personality carried meanings which were not psychological, but theological, legal, or ethical. It was only as a result of considerable evolution that it “ended up as a psychological category.” The first important dimension of the process of its coming “down to earth” (1997, p. 124) was medicalization. Danziger places the latter in 19th-century France, where medical professionals were as skeptical about the earlier theologically or philosophically laden versions of the notion as they were enthusiastic about the promises of its naturalization. It was as a result of their reconceptualization that “personality” began to refer to “a quasi-medical entity subject to disease, disorder and symptomatology” (1997, p. 131). The term understood as such won its place within medical discourse and soon, in 1885, it became possible for Théodule Ribot to publish The Diseases of the Personality without a risk of conceptual confusion. An evolution began which would later lead to the inclusion of the personality disorders into the DSM (cf. Brinkmann, 2010, p. 73).

Among the descendants of medicalization it is arguably the mental hygiene movement, “an ideological component” (Danziger, 1990, p. 163) of the rise of contemporary research on personality, that was most important at that time. On the basis of the belief that it is individual maladjustment rooted in early development that is responsible for all kinds of social and interpersonal problems, “a powerful and well-founded social movement” (p. 164) directed at the therapy of the maladjusted, as well as at preventive efforts addressed to the potentially maladjusted (which could include everybody), was initiated. The notion of personality, as noted by Danziger, “played a central role in the ideology” (p. 164) of this movement. More particularly, it was the “personality” of the individuals addressed by the latter which was recognized as “the site where the seeds of future individual and social problems were sown and germinated” (Danziger, 1997, p. 127) and, accordingly, established as an object of intervention.

Personality understood as such needed to be scientifically measured on the dimension of its adaptation/maladaptation, and it was at this point that the psychologists from the Galtonian tradition of individual differences and mental testing arrived on the scene. In fact, it could easily seem that no one was better equipped than those researchers to perform the task set by the mental hygiene movement and to provide the latter’s ideology with a technical background. At roughly the same time, i.e., after World War I, mental testing confined to cognitive abilities or intelligence turned out to be insufficient not only as a means of occupational selection but also for its originally intended application, i.e., as a predictor of school success. In effect, there was an increasing recognition of the need for measurement techniques for non-intellectual mental qualities.

And such techniques were indeed soon developed, using the very same methodological assumptions that had previously been applied to cognitive abilities. Paper-and-pencil questionnaires measuring non-cognitive individual differences “began to proliferate” (Danziger, 1990, p. 159). Simultaneously, a new field of psychological investigation, “something that could sail under the flag of science” (p. 163), began to emerge. Only one more thing was lacking: a label, a name for the new sub-discipline and its subject matter.

The “shortlisted” candidates included the notions of temperament, character, and personality. The first was rejected due to its then-current associations with physiological reductionism. Why not “character,” then? Well, that notion in turn was considered inappropriate due to its association with the concept of will, an “anathema to scientifically minded American psychologists” (Danziger, 1997, p. 126), and its generally normative connotations. The third candidate, “personality,” as a result, came to the fore.

Not only was it devoid of an unwelcome moralistic background and already popularized by the mental hygiene movement, it also offered a realistic prospect of quantitative empirical research. Already adopted by scientific medicine and understood along the lines of Ribot as an “associated whole” (un tout de coalition) of a variety of forces, personality, rather than holistic character, was a much more promising object for the post-Galtonian methodology (Danziger, 1997, p. 127; cf. Brinkmann, 2010, p. 74). Soon, the newly emerging field “developed loftier ambitions” (Danziger, 1997, p. 128) and became a well-established part of academic psychology, with its flagship project of discovering basic, independent, and universal personality-related qualities: the traits. And it is actually this tradition that is more or less continued today, with the Big Five model being a default perspective.

Note: I would add that the moralistic social “tradition” did not disappear from “personality theory” – psychology remains a socio-religious, “prescriptive and rigid” conception of human behavior, despite the effort to construct “something that could sail under the flag of science.”

For the establishment of personality rather than character as the subject matter of the new psychological science, Gordon W. Allport’s importance can hardly be overestimated (Allport, 1921, 1927; cf. Nicholson, 1998). Following an earlier proposal by John B. Watson, Allport drew an explicit distinction between normatively neutral personality, “personality devaluated,” and character as “personality evaluated” (Allport, 1927, p. 285). Personality and character, crucially, were regarded by him as conceptually independent. The former, in particular, could be intelligibly grasped without reference to the latter: “There are no ‘moral traits’ until trends in personality are evaluated” (p. 285). Accordingly, an evaluation was considered as additional and only accidental. As such it was regarded as both relative and connected with inevitable uncertainty (for the cultural background and metaethical position of emotivism lying behind such an approach see MacIntyre, 2013).

The point which is crucial here is that the recognition of the normative element of the character concept led to its virtual banishment. While listing “basic requirements in procedures for investigating personality,” Allport (1927) was quite explicit in enumerating “the exclusion of objective evaluation (character judgments) from purely psychological method” (p. 292). Those psychologists who accept his perspective “have no right, strictly speaking, to include character study in the province of psychology” (Allport, 1921, p. 443).

The transition from the notion of character to that of personality was a very complex process which reflected some substantial changes in the cultural and social milieu. Some insightful remarks about the nature of the latter have been provided in Brinkmann’s (2010) account of the shift between the premodern “culture of character” and the essentially modern “culture of personality.” This shift, importantly, was not only a “linguistic trifle.” Rather, it was strictly associated with “the development of a new kind of American self” (Nicholson, 1998, p. 52).

A culture of character, to begin with, was essentially connected with moral and religious perspectives, which provided the specification of human télos. And it was in relation to the latter that the pursuit of moral character was placed. In the paradigmatic Aristotelian account, for instance, the notion of the virtuous character was essentially functional in the same way in which the concept of a good watch is (MacIntyre, 2013). The criteria of success and failure, accordingly, were defined in terms of one’s ability to perform the natural functions of the human creature. And the latter were not “something for individuals to subjectively decide” (Brinkmann, 2010, p. 70). Rather, they were predetermined by a broader cosmic order of naturalistic or theological bent.

The goal of adjusting one’s character to suit the requirements of human nature was institutionalized in social practices of moral education and character formation. According to Brinkmann, it was especially moral treatment or moral therapy that embodied the default approach “to the formation and correction of human subjects” (2010, p. 71). This endeavor was subsequently carried on in the very same spirit, though in an essentially different cultural milieu, by William Tuke and Philippe Pinel, and it was no earlier than with Sigmund Freud that a new form of therapy, properly modern and deprived of an explicit normative background, emerged.

Note: And yet, in American psychology, it is precisely this “imaginary normal” that continues to be the default assumption against which pathology and defect are assigned.

The ancient virtue ethical approach embodied in a culture of character was taken over by the Middle Ages, with an emphasis shifted considerably towards theological accounts of human goals. A thoroughly new perspective proper to a culture of personality appeared much later with the emergence of the scientific revolution, which seriously undermined the belief in objective normative order. The earlier cosmic frameworks started to be supplanted by psychological perspectives with romanticism and modernism being, according to Brinkmann (2010, p. 72), two forces behind them.

One of the main running threads of romanticism is the idea that “each human being has a unique personality that must be expressed as fully as possible” (Brinkmann, 2010, p. 73). Before romanticism, the final purpose had been specified in terms external to a particular individual. It was related to generic norms of humanity as such or to those determined by God. (Today, “generic norms” are determined by a “new” God: the psych industry.) Now the goal to be pursued started to be understood as properly individual and unique.

Note: I don’t think that Americans understand how pervasive “the shift” has been – away from the individual as “a unique personality that must be expressed as fully as possible” and toward a totalitarian demand for conformity, dictated by a “new religious” tide of psycho-social tyranny – nor how quickly it was accomplished: in a few decades. It is not surprising that Liberalism is every bit as religious as the Christian Right in its goal to “restore” the extreme religious aims (and hatred of humanity) of Colonial America; a continuation of the religious wars that raged in Europe for centuries.

This difference is evident when one compares Augustine’s and Rousseau’s confessional writings. The former “tells the story of a man’s journey towards God,” whereas the latter “is about a man’s journey towards himself, towards an expression of his own personality” (Brinkmann, 2010, p. 73). (Not allowed anymore!)

The demand for the “journey towards himself” can be connected with a disenchantment of the world, which had left an individual in a universe devoid of meaning and value. If not discovered in the world, the latter needed to be invented by humans. One had to “turn inwards” in order to find the purpose of life and this entailed, significantly, the rejection of external and social forces as potentially corrupting the genuine inborn self. The idea of “an individual in relative isolation from larger social and cosmological contexts” began to prosper and it “paved the way for the modern preoccupation with personality” (Brinkmann, 2010, pp. 67, 73) defined in fully atomistic or non-relational terms.

The second major force behind a culture of personality was modernism, which, in alliance with the modern idea of science, entailed an “ambition of knowing, measuring [emphasis added], and possibly improving [emphasis added] the properties of individuals” (Brinkmann, 2010, p. 73), an ambition which proved to have a considerable bearing on the newly emerging notion of personality. The latter concept had been deeply influenced by the logic of standardization and quantification characteristic of the whole of modernity; not only of its industry, but also of its education, bureaucracy, and prevailing ways of thinking. This logic found its direct counterpart in trait-based thinking about personality, with the idea that the latter can “be measured with reference to fixed parameters” and that individuals “vary within the parameters, but the units of measurement are universal” (Brinkmann, 2010, p. 75). (This assumption that “opinions that arise from a social agenda” can be quantified is disastrous.)

The romantic and modernist branches of a culture of personality, for all their differences of emphasis, were connected by a common atomistic account of the self and a plea for the development of the unique qualities of the individual. And it is this “core element” of their influence which is still in place today, even though some authors, Brinkmann included, have announced the appearance of a new cultural formation, a culture of identity.

The character–personality distinction

The relationship between the two notions in question can be elucidated by, first, indicating their common features (genus proximum) and, then, by specifying the ways in which they differ from each other (differentia specifica). As far as the former is concerned, both “character” and “personality” can be regarded as constructs belonging to the discourse of individual differences. Both notions are analyzable, even if not reductively analyzable, in terms of some lower-level terms such as virtues and vices or, respectively, traits. These lower-level concepts are usually understood as dispositional. A personality trait, for instance, can be defined as a “disposition to form beliefs and/or desires of a certain sort and (in many cases) to act in a certain way, when in conditions relevant to that disposition” (Miller, 2013a, p. 6). The higher-level notions of character and personality, accordingly, are also dispositional.

The formal features indicated above are common to the notions of character and personality. And it is on the basis of this “common denominator” that one can attempt to clarify the difference between them. A good place to begin is a brief remark made by Goldie (2004), who claimed that “character traits are, in some sense, deeper than personality traits, and … are concerned with a person’s moral worth” (p. 27). It is a dimension of depth and morality, then, which can provide one with a useful clue. (Note that both “traits” and moral rules are subjective, culturally defined and NOT quantifiable objects: that is, this remains a religious discussion.)

As far as the depth of the notion of character is concerned, the concept of personality is often associated with a considerable superficiality and the shallowness of mere appearances (Goldie, 2004, pp. 4–5; Kristjansson, 2010, p. 27). The fact that people care about character, accordingly, is often connected with their attempt to go beyond the “surface,” beyond “the mask or veneer of mere personality” (Goldie, 2004, p. 50; cf. Gaita, 1998, pp. 101–102). Even the very etymology of the term “personality” suggests superficiality by its relation to the Latin concept of persona: “a mask of the kind that used to be worn by actors.” Character as deeper “emerges when the mask is removed” (Goldie, 2004, p. 13; cf. the Jungian meaning of persona).

The reference to the depth of character, as helpful as it may be, is certainly insufficient due to its purely formal nature. What still remains to be determined is the substantive issue of the dimension on which character is deeper than personality. As far as Goldie’s distinction is concerned, such a specification is provided in what follows: “someone’s personality traits are only good [emphasis added] conditionally upon that person also having good character traits … On the other hand, the converse isn’t true: the goodness [emphasis added] of someone’s character trait is not good [emphasis added] conditionally on his having good personality traits” (2004, p. 32). It is depth referring to the ethical dimension, then, which distinguishes between character and personality. One’s virtue of honesty, for instance, can still be valued even if the person in question is extremely shy (an introvert, as the psychologist would say). (Both introversion and “honesty” are labeled symptoms of “developmental disorder” in the ASD / Asperger diagnosis)

It does not work the other way around, though. An outgoing and charming personality, when connected with considerably bad character, is in a sense polluted. A criminal who is charming can be even more dangerous, because he/she can use the charm for wicked purposes. Such a difference, importantly, should not be taken as implying that personality cannot be evaluated at all. It can, with the reservation that such an evaluation will be made in terms of non-moral criteria or preferences. An extraverted person, for instance, can still be considered a “better” or more preferable candidate for the position of talk show host (cf. Goldie, 2004, p. 47; McKinnon, 1999, pp. 61–62).

The above-given specification of the distinction can be enriched by some remarks by Gill (1983, p. 470), who notices that “character” and “personality” are not only distinguishable as two concepts but also as “two perspectives on human psychology” for which they are, respectively, central. The character-viewpoint, to begin with, “presents the world as one of … performers of deliberate actions” (Gill, 1986, p. 271). Human individuals, in particular, are considered as more or less rational and internally consistent moral agents possessing stable dispositions (virtues and vices) and performing actions which are susceptible to moral evaluation and responsibility ascription. The evaluation of their acts, importantly, is believed to be objective: to be made along the lines of some definite “human or divine standards” (p. 271). No “special account,” accordingly, is taken “of the particular point of view or perspective of the individuals concerned” (Gill, 1990, p. 4).

The personality-viewpoint, on the other hand, is not associated with any explicitly normative framework. Rather, it is colored by “the sense that we see things ‘as they really are’ … and people, as they really are” (Gill, 1986, p. 271). The purposes are psychological, rather than evaluative: to understand, empathize with, or explain. The default view of the individuals in question is also considerably shifted. Their personality is recognized as being “of interest in its own right” (Gill, 1983, p. 472) and their agency as considerably weakened: “The person is not typically regarded as a self-determining agent,” but rather as a “relatively passive” (p. 471) individual often at the mercy of forces acting beyond conscious choice and intention. The unpredictability and irrationality entailed by such a view is substantial.

To sum up the points made above, it may be said that while both “character” and “personality” belong to the discourse of individual differences, only the former is involved in the normative discourse of person’s moral worth and responsibility. The thesis that the notion of character, but not that of personality, belongs to the discourse of responsibility should be taken here as conceptual. What is claimed, in particular, is that linguistic schemes involving the former notion usually involve the notion of responsibility as well and allow us to meaningfully hold somebody responsible for his/her character. Language games involving both concepts, in other words, make it a permissible, and actually quite a common, “move” to be made. Whether and, if yes, under what circumstances such a “move” is metaphysically and ethically justified is a logically separate issue, which won’t be addressed here.

In those accounts in which the connection between character and responsibility is considered stronger, i.e., as making responsibility claims not only conceptually possible but also justified, a separate account of responsibility is needed (e.g., Miller, 2013a, p. 13). One possible ground on which such an account can be developed is the relationship between character and reasons (as opposed to mere causes). Goldie (2004), for instance, emphasizes the reason-responsiveness of character traits: the fact that they are dispositions “to respond to certain kind of reasons” (p. 43). Actually, he even defines a virtue as “a trait that is reliably responsive to good reasons, to reasons that reveal values” (p. 43, emphasis removed; cf. the definition by Miller, 2013b, p. 24). A vice, accordingly, would be a disposition responsive to bad reasons.

Whether all personality traits are devoid of reason-responsiveness is not altogether clear (cf. Goldie, 2004, p. 13). For the notion of personality proper to academic psychology the answer would probably depend on a particular theoretical model employed. There would be a substantial difference, for instance, between personality understood, along the behavioristic lines, as a disposition to behavior and more full-fledged accounts allowing emotional and, especially, cognitive dispositions. What seems to be clear is the importance of reason-responsiveness for character traits.

The fact–value distinction is usually derived from some remarks in David Hume’s (1738/2014, p. 302) Treatise of Human Nature, in which the idea of the logical distinctiveness of the language of description (is) and that of evaluation (ought) was expressed. A relatively concise passage by Hume soon became very influential and gave birth not only to a distinction, but actually to a strict dichotomy between facts and values (cf. Putnam, 2002). A methodological prescription “that no valid argument can move from entirely factual premises to any moral or evaluative conclusion” (MacIntyre, 2013, p. 67) was its direct consequence.

In order to refer the above dichotomy to the notions of character and personality, it may be helpful to remember Allport’s (1921) idea of character being “the personality evaluated according to prevailing standards of conduct” (p. 443). A crucial point to be made here is that the act of evaluation is considered as an addition of a new element to an earlier phenomenon of personality, which can be comprehended without any reference to normativeness. The latter notion, in other words, is itself morally neutral: “There are no ‘moral traits’ until trends in personality are evaluated” (Allport, 1927, p. 285).

The thesis that personality can be specified independently of character or, more generally, without any application of normative terms, is of considerable importance because it illustrates the fact that the character–personality distinction logically implies the fact–value one. The validity and the strictness of the former, in consequence, rely on the same features of the latter. Character and personality, in brief, can be separated only as long as it is possible to isolate personality-related facts from character-related values.

Such dependence necessarily refers us to contemporary criticism of the fact–value distinction (e.g., MacIntyre, 2013; Putnam, 2002; cf. Brinkmann, 2005, 2009; Davydova & Sharrock, 2003). This criticism has been voiced from different perspectives and involves at least several logically distinct claims. For the present purposes, however, it is an argument appealing to so-called thick ethical concepts and the fact–value entanglement that is of most direct significance.

The distinction between thick and thin ethical concepts was first introduced (in writing) by Bernard Williams (1985/2006) and subsequently subjected to intense discussion (for useful introductions see Kirchin, 2013; Roberts, 2013; applications for moral psychology can be found in Fitzgerald & Goldie, 2012). What is common to both kinds of concepts is that they are evaluative: they “indicate some pro or con evaluation” (Kirchin, 2013, p. 5). Thick concepts, furthermore, are supposed to provide some information about the object to which they refer (information which thin concepts do not provide). They have, in other words, “both evaluative conceptual content … and descriptive conceptual content … are both evaluative and descriptive” (Kirchin, 2013, pp. 1–2). If I inform somebody, for instance, that person A is good and person B is courageous, it is obvious that my evaluation of both A and B is positive. At the same time, however, the person informed doesn’t seem to know much about a good (thin concept) person A, whereas he/she knows quite a bit about a courageous (thick concept) person B.

The significance of thick concepts for philosophical discussion is usually connected with the “various distinctive powers” they supposedly possess. More specifically, when they are interpreted along the lines of the so-called non-reductive view, they seem to have “the power to undermine the distinction between fact and value” (Roberts, 2013, p. 677). The non-reductive position is usually introduced as a criticism of the reductive idea that thick concepts “can be split into separable and independently intelligible elements” (Kirchin, 2013, p. 8; cf. the idea of dividing character into two parts mentioned above) or, more specifically, explained away as a combination of (supposedly pure) description and thin evaluation. If such a reduction were successful, thick concepts would turn out to be derivative and lacking philosophical importance.

Many authors, however, including notably Williams (1985/2006), McDowell (1981), and Putnam (2002), claim that no such reductive analysis can be conducted due to the fact–value entanglement characteristic of thick concepts. The latter, as is argued, are not only simultaneously descriptive and evaluative, but also “seem to express a union of fact and value” (Williams, 1985/2006, p. 129). The fact–value entanglement proper to thick concepts becomes apparent if one realizes that any attempt to provide a set of purely descriptive rules governing their application seems to be a hopeless endeavor. One cannot, for instance, develop a list of necessary and jointly sufficient factual criteria of cruelty. It is obviously possible “to describe the pure physical movements of a torturer without including the moral qualities” (Brinkmann, 2005, p. 759), but it would yield a specification which comes dangerously close to the description of some, especially unsuccessful, surgical operations. In order to convey the meaning of the word “cruelty” (and to differentiate it from the phrase “pain-inflicting”) one needs to refer to values and reasons (rather than facts and causes only). An evaluative perspective from which particular actions are recognized as cruel, accordingly, must be at least imaginatively taken in order to grasp the rationale for applying the term in some cases, but not in others. Communication using thick concepts, as a result, turns out to be value-laden through and through.

The above-given features assigned to thick concepts by the non-reductionists are crucial due to the fact that they cannot be accounted for within the framework of the fact–value distinction. As such they are often believed to “wreak havoc” (Roberts, 2013, p. 678) with the latter or, more precisely, to undermine “the whole idea of an omnipresent and all-important gulf between value judgments and so-called statements of fact” (Putnam, 2002, p. 8).

The undermining of the sharp and universal dichotomy between facts and values has a very direct bearing on the character–personality distinction being, as emphasized above, dependent on the former. A crucial point has been made by Brinkmann who noticed that almost “all our words used to describe human action are thick ethical concepts” (2005, p. 759; cf. Fitzgerald & Goldie, 2012, p. 220). And the same applies to the language of character which, contrary to Allport’s expectations, cannot be neatly separated into the factual core of personality and the normative addition. The distinction between the notions of character and personality, in consequence, even though often applicable and helpful, cannot be inflated into a sharp dichotomy.

Having analyzed the reliance of the character–personality distinction on the dichotomy between, respectively, value and fact, we can now carry out a second detailed investigation, devoted to the functions played by the two concepts scrutinized. A good starting point for this exploration may be a remark made by Goldie (2004) who, while discussing the omnipresence of the discourse of personality and character, noticed that it is “everywhere largely because it serves a purpose: or rather, because it serves several purposes [emphasis added]” (p. 3). These functions merit some closer attention because they can help to further specify the difference between the concepts investigated.

The purposes served by the discourse of individual differences have been briefly summarized by the abovementioned author when he said that we use it “to describe people, to judge them, to enable us to predict what they will think, feel and do, and to enable us to explain their thoughts, feelings and actions” (and to control, manipulate and abuse them) (Goldie, 2004, pp. 3–4; cf. an analogous list provided by Miller, 2013b, pp. 12–13). Some of these functions are common to the notions of character and personality. Some others, however, are proper to the concept of character only.

The first of the common functions is description. The language of character and personality can serve as a kind of shorthand for the longer accounts of the actions taken. When asked about the performance of a new employee, for instance, a shift manager can say that he/she is certainly efficient and hard-working (rather than mention all particular tasks that have been handled). Similarly, if we say that A is neurotic, B is extraverted, C is just, and D is cruel, we do convey some non-trivial information about A, B, C, and D, respectively (even though our utterances may include something more than a mere description).

The second of the purposes that can be served by both concepts is prediction. We may anticipate, for example, that neurotic A will experience anxiety in new social situations. Despite the fact that such a prediction will be inevitably imprecise and fallible, it does enable us to narrow down “the range of possible choices and actions” (Goldie, 2004, p. 67) we can expect from a particular agent.

(In fact, predictions regarding human behavior are notoriously inaccurate “guesses” – note the inability of the Psych Industry to identify mass shooters before they act.)

The notions of character and personality, furthermore, can be employed as a means of judgment. At this point, however, an important qualification needs to be made. If this function is to be assigned to both concepts it can be understood only in a weak sense of judging as providing an instrumental assessment. The ascription of personality traits of neuroticism and extraversion to A and B, respectively, can be used to say that A would not make a good candidate for an assertiveness coach, whereas B may deserve a try in team work. It falls short, however, of any moral judgment, which can be made only by means of character-related notions.

The concepts of personality and character, finally, can both be used to provide explanation. We can, for instance, say that C was chosen as a team leader because he/she is just and expected to deal fairly with potential conflicts. Having assigned an explanatory role to “character” and “personality,” however, one should necessarily remain fully aware of the experimental results reported in the first section. An appeal to “character” and “personality” as explanatory constructs does not have to mean that they provide the whole explanation. Situational factors still count and, as a matter of fact, one may need to acknowledge that in a “great many cases … [they] will be nearly the entire explanation” (Kupperman, 1991, p. 59).

One other reservation concerns the kind of explanation conveyed by the personality- or character-trait ascription. Human behavior, in particular, can be explained in at least two distinct ways (e.g., Gill, 1990, p. 4). Explanation, to begin with, can be made along the lines of the deductive-nomological model and refer to causes and natural laws. In such cases it is not substantially different from explanations of natural facts (like an earthquake) offered by the sciences. And it is this kind of explanation that is provided when non-reason-responsive features of personality are appealed to (cf. Goldie, 2004, p. 66).

Human action, however, can also be made comprehensible by reference to the reasons behind it. If we know what the person in question “values or cares for,” in particular, we can “make sense [emphasis added] of the action, or make the action intelligible, understandable or rational [emphasis added]” (Goldie, 2004, p. 65). Such an “explanation” can be given by indicating only those traits which are reason-responsive and, strictly speaking, is much closer to Dilthey’s (1894/2010) understanding (Verstehen) than to naturalistically understood explanation (Erklären).

The functions of description, prediction, instrumental assessment, and explanation (at least as far as the latter is understood in terms of causes) are common to both concepts of “personality” and “character.” The latter notion, however, can serve some additional purposes, which give it a kind of functional autonomy. Among the character-specific functions, to begin with, there is moral judgment. When we say that C is just and D is cruel we don’t make an instrumental and task-relative assessment. Rather, we simply evaluate that C is a morally better person than D (other things being equal). With this, the function of imposing moral responsibility is often connected. The issue of the validity of such an imposition is very complex and controversial. Still, it does remain a discursive fact that the claim that D is cruel is usually associated with holding D, at least to some extent, responsible for his/her cruelty.

Note that “pathological,” “disordered,” “mentally ill,” and “socially defective” are labels every bit as “moral / judgmental” in the “real social environment” as “sinful,” “perverted,” “possessed by demons,” “Godless atheist,” or “agent of Satan.”

The functions of moral judgment and moral responsibility ascription are not typically served by the scientific notion of personality. They may, however, become formally similar to description, explanation, and prediction if they are, as is often the case, applied within mostly third-personal language (as judging others and imposing responsibility on others). Apart from these functions, however, the notion of character can fulfill some essentially first-personal kind of purposes. And it is the latter that seems to be its most specific feature.

Among the first-personal functions of “character,” identification is fundamental, both psychologically and conceptually. When a person identifies with a character trait or, more holistically, with a complete character ideal, she begins to consider such a trait or character as a part of her identity (cf. Goldie, 2004, pp. 69–70): as something she “decides to be or, at least, to see herself as being” (Kupperman, 1991, p. 50). Such an identification, if serious, is very rich in consequences: it establishes “the experienced structure of the world of direct experience as a field of reasons, demands, invitations, threats, promises, opportunities, and so on” (Webber, 2013, p. 240) and helps one to achieve a narrative unity of one’s life (cf. Goldie, 2004; Kupperman, 1991; McKinnon, 1999).

First-personal functions of the character notion, additionally, enable the agent to undertake more specific self-formative acts such as evaluating oneself against the idealized self, structuring moral progress, or providing motivation needed to cope with the difficulties of moral development. The notion of character employed in such a way becomes a kind of an internalized regulative ideal with a considerable emotional, imaginative, and narrative dimension. Its specific purposes are self-evaluative, self-prescriptive, and self-creative (rather than descriptive, predictive, and explanatory). The criteria of its assessment, accordingly, should be at least partially independent from those proper to strictly scientific constructs.

The latter fact, as is worth mentioning, has a direct bearing on the challenge of situationism mentioned at the beginning of these analyses. The arguments in favor of this disquieting position have typically referred to experiments indicating that situational variables possess much greater explanatory and predictive value than those related to personality and concluded that the usefulness of the personality concept needs to be seriously questioned. The doubts concerning the notion of character usually followed without further ado. No special attention, in particular, was paid to the assumption that the concepts of character and personality fulfill the same functions of description, explanation, and prediction. Accordingly, it was usually taken for granted that the failure of the latter concept automatically entails the uselessness of the former.18 Insofar as such an approach is at least partially erroneous, it may be worthwhile to refocus the debate towards the specific, first-personal, and normative functions of the notion of character. Do we need the latter to perform them and, if so, does this notion really serve us well, even though it is scientifically weak?

Some final remarks

An important clarification that needs to be made here, however, is that any skepticism concerning the fact–value dichotomy suggested by some features of thick concepts should not be conceived by psychologists as a call to develop a prescriptive and moralistic science of character and, thus, to become “like priests” (too late: this is where American Psychology stands today) (Charland, 2008, p. 16). A false impression that this is the case might result from conflating the full-fledged version of the fact–value distinction with the original, and relatively modest, Humean dictum that “no valid argument can move from entirely factual premises to any moral or evaluative conclusion” (MacIntyre, 2013, p. 67).19

That it is the latter that most psychologists care about can be clearly seen in two recent papers by Kendler (1999, 2002), who issues a stern warning that any psychological project developed along the lines of what he calls “the enchanted science”20 and motivated by the belief that psychology itself can discover moral truths can lead not only to Gestalt psychology’s holism or humanistic psychology, but also to the quasi-scientific justification of “Nazi and Communist ideology” (1999, p. 828). And it is in order to prevent these kinds of abuses that Kendler (1999) refers to what he calls “the fact/value dichotomy” or “an unbridgeable chasm between fact and values” (p. 829). By this, however, he does not seem to mean anything more than that “empirical evidence can validate factual truth but not moral truth” (p. 829). An example he provides considers the possibility of obtaining reliable empirical data supporting the thesis that bilingual education is advantageous for ethnic identification, but disadvantageous for academic development. Such data, as he rightly insists, would still leave it to the society to decide which value, ethnic identification or academic progress, should be given priority.

All of this, however, need not lead one to accept the fact–value dichotomy in the strong version that has been criticized by Putnam, McDowell, and others. Rather, it is the is–ought dichotomy which seems to be sufficient. The subtle differences between these two distinctions have been clarified by Dodd and Stern-Gillet (1995) who argue that the Humean dictum can be best understood as a general logical principle without any substantive metaphysical dimension of the kind usually connected with the fact–value dichotomy. That the is–ought gap is narrower and weaker is also illustrated by the fact that it is confined to “ought” statements with a considerable amount of other evaluative statements left aside. The examples provided by the authors are the aesthetic language of art and, importantly, the virtue ethical discourse of character. And just as the ascription of beauty to a painting does not automatically entail any particular prescription,21 neither does the assignment of courage or foolishness to a person. Even though such a feature of the characterological language has often been conceived as a weakness within metaethical contexts, it can arguably be beneficial to all those psychologists who want to study the complexities of character without giving the impression that any particular normative position can be derived from purely scientific study. A substantial amount of normativity, as shown by the example of thick concepts, will obviously remain inevitable, but it is certainly worthwhile to emphasize that it is mostly placed before empirical research as an evaluative framework taken from elsewhere and, thus, subjected to criteria and authorities of a non-empirical nature.

This paper has been written during a visit to Oxford University’s Faculty of Philosophy. I am greatly indebted to Edward Harcourt for all his help and support.

Evolution of Personality Variation in Humans / A challenge to the status quo

I’m including more of the paper in the next post: a section on the BIG 5 Personality Traits model. 

The prejudice against there being an Asperger-type “personality” within the range of “acceptable” human variation is incredibly strong – and socially enforced. What I have encountered among “psych-psych” professionals is outright dismissal, and even “scorn,” at the suggestion that an Asperger individual might not be “developmentally defective” but simply a “version” of Homo sapiens with different sensory and perceptual systems, well-adapted to specific natural environments.

Evolutionary Psychology is the branch of psychology that studies the mental adaptations of humans to a changing environment, especially differences in behavior, cognition, and brain structure.

_____________________________________

https://www.researchgate.net (Note: I have a direct link to the PDF, but the URL of the PDF does not link to the PDF!)

By Daniel Nettle, Evolution and Behaviour Research Group, Division of Psychology, Henry Wellcome Building, University of Newcastle, Newcastle NE2 4HH, United Kingdom. Published in American Psychologist, 2006.

In recent years, there has been an extraordinary growth of interest in giving ultimate, evolutionary explanations for psychological phenomena alongside the proximate, mechanistic explanations that are psychology’s traditional fare (Barkow, Cosmides, & Tooby, 1992; Buss, 1995). The logic of ultimate explanations is that for psychological mechanisms and behavioral tendencies to have become and remain prevalent, they must serve or have served some fitness-enhancing function. (We AS types are still here, despite being a 1% minority!)

The explanatory program of evolutionary psychology has concentrated strongly on human universals, such as jealousy, sexual attraction, and reasoning about social exchange (Buss, 1989; Buss, Larsen, Westen, & Semmelroth, 1992; Cosmides, 1989). The focus has been on the central tendency of the psychology of these domains, rather than the observed variation, and explanation has been in terms of adaptations shared by all individuals. Indeed, some evolutionary psychologists have implied that one should not expect there to be any important variation in traits that have a history of selection. For example, Tooby and Cosmides (1992) argued that “human genetic variation . . . is overwhelmingly sequestered into functionally superficial biochemical differences, leaving our complex functional design universal and species typical” (p. 25). The reason invoked for this assertion is that natural selection, which is a winnowing procedure, should, if there are no counteracting forces, eventually remove all but the highest-fitness variant at a particular locus (Fisher, 1930; Tooby & Cosmides, 1990), especially because complex adaptations are built by suites of genes whose overall functioning tends to be disrupted by variation.

Because of the winnowing nature of selection, the existence of heritable variation in a trait is argued to be evidence for a trait’s not having been under natural selection: “Heritable variation in a trait generally signals a lack of adaptive significance” (Tooby & Cosmides, 1990, p. 38, italics in original). Tooby and Cosmides (1992) thus suggested that most of the genetic variation between human individuals is neutral or functionally superficial. They did, however, concede a possible role for “some thin films” (?) of functionally relevant heritable interindividual differences (Tooby & Cosmides, 1992, p. 80). The possible sources of these thin films—frequency-dependent selection and selective responses to local ecological conditions—are discussed in greater detail below. What is relevant for the present purposes is the low priority given to understanding these thin films relative to the task of describing and explaining the universal psychological mechanisms that humans undoubtedly all share.
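The “winnowing” logic is easy to see in miniature. Here is a minimal sketch (my own illustration, with invented variant labels and fitness values – not a model from Tooby and Cosmides or Nettle): a variant with even a 5% fitness edge all but replaces its alternative within a few hundred generations, which is why persistent heritable variation gets read as a sign that selection has not been acting on a trait.

# Toy one-locus selection model (Python; illustrative numbers only).
# A variant with a small, fixed fitness advantage is steadily "winnowed"
# toward fixation, eliminating heritable variation at the locus.
def winnow(p, w_a=1.05, w_b=1.00, generations=200):
    """p: starting frequency of variant A; w_a, w_b: relative fitnesses."""
    for _ in range(generations):
        mean_w = p * w_a + (1 - p) * w_b   # population mean fitness
        p = p * w_a / mean_w               # standard selection update
    return p

print(round(winnow(0.01), 3))  # ~0.99: a rare variant with a 5% edge nearly fixes

The flip side, discussed below, is that frequency-dependent selection or locally varying environments can keep this winnowing from ever running to completion.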

Personality Variation and Evolutionary Psychology 

There has, however, been a response from researchers seeking to marry differential and evolutionary psychology in a way that gives greater weight to the study of individual differences (see, e.g., Buss, 1991; Buss & Greiling, 1999; Figueredo et al., 2005; Gangestad & Simpson, 1990; MacDonald, 1995; Nettle, 2005). David Buss made an early contribution to this literature by enumerating possible sources of functionally important interindividual variation (Buss, 1991; see also Buss & Greiling, 1999). Most of these are mechanisms that do not rely on heritable variation in psychological mechanisms, for example, enduring situational evocation, or calibration by early life events, or calibration of behavior by the size or state of the individual. However, Buss also discussed the possibility that there are equally adaptive alternative behavioral strategies underlain by genetic polymorphisms, or continua of reactivity of psychological mechanisms, in which there is no universal optimum, and so genetic variation is maintained. The idea of continua of reactivity was taken up by Kevin MacDonald (1995). MacDonald proposed that the normal range of observed variation on personality dimensions represents a continuum of viable alternative strategies for maximizing fitness.

In this view, average fitness would be about equal across the normal range of any given personality dimension, but individuals of different personality levels might differ in the way that they achieved their fitness—for example, by investing in reproductive rather than parental effort. Implicit in MacDonald’s formulation, but perhaps not examined in enough detail, is the concept of trade-offs. The idea of trade-offs is reviewed in detail below, but the key point is that if two levels of a trait have roughly equal fitness overall and if increasing the trait increases some component of fitness, then it must also decrease other components. Every benefit produced by increasing a trait must also produce a cost. If this is not the case, there is no trade-off, and natural selection is directional toward the higher value of the trait.
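MacDonald’s “roughly equal fitness across the range” claim and the trade-off requirement can be put in equally miniature form. In the sketch below (my own hypothetical linear fitness components, not MacDonald’s model), when the benefit of raising a trait is exactly offset by its cost, every trait level is an equally viable strategy and variation can persist; drop the cost, and fitness rises monotonically with the trait, i.e., selection becomes directional.

# Toy trade-off sketch (hypothetical linear fitness components).
# t in [0, 1] is a trait level, e.g. reactivity of a psychological mechanism.
def fitness(t, benefit=2.0, cost=2.0):
    """Overall fitness = baseline + component gained - component lost."""
    return 1.0 + benefit * t - cost * t

# Benefit exactly offset by cost: fitness is flat across the whole range,
# so all trait levels are equally viable strategies (variation persists).
print([round(fitness(t / 5), 2) for t in range(6)])            # all 1.0

# Benefit not offset by cost: fitness rises with t, so selection is
# directional toward the trait's maximum (no trade-off, no persistence).
print([round(fitness(t / 5, cost=1.0), 2) for t in range(6)])  # increasing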

The purposes of the present article are several. First, no reasonable biologist or psychologist should disagree with Tooby and Cosmides (1990, 1992) that humans’ psychological mechanisms show evidence of complex design and are largely species-specific. Nor need differential psychologists deny the importance of the branches of psychology that are devoted to the study of species-typical mechanisms. However, I argue that a more up-to-date reading of the very biology from which Tooby and Cosmides draw their inspiration leads to a rather different view of the extent and significance of variation. The films of functionally significant interindividual variation need not be particularly thin. The first purpose of this article, then, is to review interindividual variation in nonhuman species, with particular attention to the way that selection can allow variation to persist even when it is relevant to fitness. Second, although Buss’s (1991) and MacDonald’s (1995) reviews have been influential in enumerating possible evolutionary mechanisms that lie behind the persistence of personality differences, there has as yet been relatively little work in evolutionary personality psychology that actually tests the predictions of these models empirically (for some exceptions, see Figueredo et al., 2005; Gangestad & Simpson, 1990; Nettle, 2005).

The bulk of the work in personality psychology goes on uninspired by considerations of ultimate evolutionary origins. The second purpose of this article, therefore, is to build from MacDonald’s ideas of personality dimensions as alternative viable strategies, outlining a more explicit framework of costs and benefits, and to apply this framework to each of the dimensions of the five-factor model of personality. This approach allows existing studies that were done from a largely inductive, atheoretical perspective to be interpreted more coherently through the long lens of adaptive costs and benefits. In addition, the approach allows the generation of novel predictions and ideas for future research.

much, much more… 11 pages total 

Death by Medical Error / Not Reported or Tracked

This is a story without end. Dozens of articles and studies argue over the number of deaths, which are “guessed at,” “arrived at statistically,” “reworked from archival material,” “fudged,” or “denied” – in other words, the numbers have no reality. Why? Because data on medical-error deaths are not required on Death Certificates. There is no tracking of such deaths because they are not reported.

I wonder why? Could the Medical Industry be protecting itself by “not coming clean”?

Here’s where the public gets “shafted.” Who would have guessed that the insurance industry is now dictating the content of Death Certificates, which are legal documents that affect each and every one of us, and which have widespread consequences for families tasked with the complex mysteries of navigating the post mortem experience, including cheap shots from insurance providers who refuse to live up to promised coverage.

Families have the right to know how their loved one died. 

______________________________________________________________________________________________


Medical errors may be third leading cause of death in the U.S.

By Jen Christensen and Elizabeth Cohen, CNN, Tue May 3, 2016

You’ve heard those hospital horror stories where the surgeon removes the wrong body part or operates on the wrong patient or accidentally leaves medical equipment in the person they were operating on. Even scarier, perhaps, is a new study in the latest edition of BMJ suggesting most medical errors go unobserved, at least in the official record.

In fact, the study, from doctors at Johns Hopkins, suggests medical errors may kill more people than lower respiratory diseases like emphysema and bronchitis do. That would make these medical mistakes the third leading cause of death in the United States, right behind heart disease and cancer.

Through their analysis of four other studies examining death rate information, the doctors estimate there are at least 251,454 deaths due to medical errors annually in the United States. The authors believe the number is actually much higher, as home and nursing home deaths are not counted in that total.


This is a much greater number than the estimate in a highly cited 1999 study from the Institute of Medicine, which put the number in the 44,000 to 98,000 range. Other studies have put estimates closer to 195,000 deaths a year. The U.S. Department of Health and Human Services Office of the Inspector General in 2008 reported 180,000 deaths by medical error among Medicare patients alone.

Dr. Martin Makary and Dr. Michael Daniel, who did the study, hope their analysis will lead to real reform in a health care system they argue is letting patients down. 

“We have to make an improvement in patient safety a real priority,” said Makary, a professor of surgery and health policy and management at Johns Hopkins.


One reason there’s such a wide range of numbers is that accurate data on these kinds of deaths is surprisingly sparse. That’s in part because death certificates don’t ask for enough data, Makary said.

Currently the cause of death listed on the certificate has to line up with an insurance billing code. Those codes do not adequately capture human error or system factors.

“Billing codes are designed to maximize billing rather than capture medical errors,” Makary said.

_________________________________________________________________________________________


PhD Dissertation / Asperger Syndrome Social Narratives

Dissertation for the degree of Doctor of Philosophy, Bowling Green State University, 2010. Neil Shepard.

FULL TEXT: http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1276714818


From Introduction: This dissertation explores representations of Asperger’s syndrome, an autism spectrum disorder. Specifically, it textually analyzes cultural representations with the goal of identifying specific narratives that have become dominant in the public sphere. Beginning in 2001, with Wired magazine’s article by Steve Silberman entitled “The Geek Syndrome” as the starting point, this dissertation demonstrates how certain values have been linked to Asperger’s syndrome: namely the association between this disorder and hyper-intelligent, socially awkward personas.

Narratives about Asperger’s have taken to medicalizing not only genius (as figures such as Newton and Einstein receive speculative posthumous diagnoses) but also a particular brand of new-economy, information-age genius. The types of individuals often suggested as representative Asperger’s subjects can be stereotyped as the casual term “geek syndrome” suggests: technologically savvy, successful “nerds.” On the surface, increased public awareness of Asperger’s syndrome combined with this representation has created positive momentum for acceptance of high-functioning autism. In a cultural moment that suggests “geek chic,” Asperger’s syndrome has undergone a critical shift in value that would have seemed unimaginable even 10 years ago.

This shift has worked to undo some of the stigma attached to this specific form of autism. The prototypical Aspergian persona dominantly represented in the media is often both intelligent and successful. At the same time, these personas are also so often masculine, middle/upper class and white. These representations are problematic in the way that they uphold traditional normativity in terms of gender, race and class, as well as reifying stigma toward other points on the autistic spectrum.

Having grown up with a family connection to Asperger’s syndrome, I can say that from my experience the truly challenging difficulties that emerge do so from encounters with the social world. I have never met a person with autism who is, in and of themselves, a “problem.” Problems come in the form of ignorance; the forms of this ignorance vary in range from inadequate educational resources to bullies. The sentiment that the problem is social rather than individual is something that I have seen echoed repeatedly throughout my research, whenever I have read of or spoken with people with autism, their parents, guardians, children, siblings and friends. Whatever Asperger’s or autism may be has, in my experience, been less important than the beliefs and practices that comprise it. The work of cultural studies, as I see it, is to interrogate those beliefs and practices.

To talk about a condition such as autism as being socially constructed isn’t to deny the reality of the condition, but rather to call attention to those beliefs and practices that shape the consequences of that reality. Understanding Asperger’s syndrome as a social construction is not to deny the clear realities of a condition that is manifested in the body, but to recognize the accountability of culture’s role in that reality. A social model approach to autism means an acute awareness of those impairments and those disabling features that are a result of the surrounding culture.

Citation: Shepard, Neil, “Rewiring Difference and Disability: Narratives of Asperger’s Syndrome in the Twenty-First Century” (2010). American Culture Studies Ph.D. Dissertations. Paper 40.

Individualism is an atheist lie / from a “Progressive Christian”

http://www.patheos.com/blogs/mercynotsacrifice/2011/10/19/individualism-is-an-atheist-lie/

October 19, 2011 by Morgan Guyton

We meditated on this quotation from Jesus yesterday at our Virginia Methodist provisional clergy mentor covenant group retreat. On the side, I have been reading Eastern Orthodox theologian John Zizioulas’ Being and Communion, which has caused me to see the implications of Jesus’ statement in a completely new light. Zizioulas writes that God is the only authentic person in the universe because God is the source of His own being. As creatures, we are completely contingent upon God for our being.

If we really believe that God is the source of every instant of our consciousness, then Jesus’ statement is a lot more all-encompassing than we might have previously thought. He is not simply talking about the relationship that followers have to their leader or students have to their teacher. He is not just talking about any kind of lifestyle or community we choose to enter into. He is talking about the relationship He has as Creator to all of His creatures who are branches on His vine whether we accept this reality or not. Nothing in the universe exists independent from Christ, who is not solely the man Jesus who walked the Earth 2000 years ago but also the very Word of God, the creative agency which articulates and implements the Father’s will as John 1:3 describes: “Through him all things were made; without him nothing was made that has been made.”

On the vine of our creator Christ, those whose hearts are opened to communion and intimacy with their Creator “bear fruit.” Those who pretend to “be like gods” themselves (Gen 3:5) and cling to the delusion of their own self-sufficiency are “like a branch that is thrown away and withers… [before it is] picked up, thrown into the fire and burned” (John 15:6). Individualism describes the atheist delusion that we are the source of our own being, which is having the naivete of a branch that thinks it does not need God’s vine to be fed and survive. You can be an individualist and talk about God all day, but God is not truly God to you if you think you’re a self-made person. Unfortunately, individualism is the default perspective with which people in our age view life, including many who never stop blabbering about Jesus.

Cogito ergo sum. I think, therefore I am. Written by Rene Descartes in 1637, this is perhaps the most definitive declaration of independence from God in the course of Western history. (How about Nietzsche’s “God is dead”?) It is the origin of secular thinking, because it sets as a foundational premise that our minds in effect “create” our existence, i.e. we are the source of our own identity (rather than God). Descartes’ premise is a choice to view the world with the assumption that the boundaries of reality are determined by our perception of it. “I think; therefore I am” applied to the world outside my brain becomes “I see it; therefore it is,” which is the foundational premise of modern science.

Truth becomes that which has been observed and measured by multiple persons coming to the same conclusions instead of what our ancestors tell us that God told their ancestors to pass down to us. Rather than being a tribe in which our identity is given to us by our family, humanity is redefined by the Western secular tradition of Descartes and Enlightenment thinkers as a race of individuals who are the source of their own identity and subsequently form families and societies through social contracts with other individuals.

To view the world in this “I-centered” way, which is ubiquitous in Western culture, means living as if God doesn’t exist, at least not the God who Christians for centuries considered to be the One in whom “we live and move and have our being” (Acts 17:28). Rather than being understood as the source of our being, God becomes just another infinitely bigger and more powerful being who’s a constant threat to our freedom. God is the one who started the world, who intervenes occasionally in certain spectacular supernatural moments, and who will ultimately end the world, instead of being the One from whom creation is constantly emanating. God is seen as Someone outside of everything to whom we call to intervene rather than Someone inside of everything to whom we seek a purer connection. (That persistent NT insistence on inside / outside human isolation from Nature!) Paul’s declaration that “in him all things hold together” (Col 1:17) sounds like pious poetry to us, but we don’t take this at all seriously as an ontological claim, because what we really believe in modernity is that “in science nature holds together” and, most problematically, “in our theological system God holds together.”

I understand that there are many positives to the legacy of Descartes and the Enlightenment. I just think it’s completely wrong to say Cogito ergo sum when we should be saying Cogitat Deus ergo sum (God thinks; therefore I am). Cogito ergo sum isn’t just Descartes’ delusion; it’s the delusion of all in our society who are taught to see themselves as self-made individuals. People don’t make themselves. Individualism is an atheist lie. Christ is our Creator. In Him all things hold together. All things are created through Him and for Him. He is the vine and we are the branches.

____________________________________________________________________________________________

Okay, this may seem an odd piece to post, but it does contribute to the topic of recent posts on the concept of SELF. It demonstrates the ongoing conflict between so-called ‘secular thinking’ and ‘religious thinking’ and also the failure to recognize that philosophical points of view, and definitions of specific terms, pass into popular culture as strange and distorted “thingies”. We can also detect the influence of psychology and the social sciences, which, with traditional Biblical sources, create a fine mish-mash of assertions. Science, the method, is completely misunderstood.

The “point” of the piece seems to be the instructive metaphor, “He is the vine, and we are the branches”. This seems a sufficient illustration of belief. Why all the unnecessary flailing around over misrepresentations of historical contributions to “Western Thought”? This, to me, weakens the “message.” “Stand by your man…”

_________________________________________________________________

INDIVIDUALISM /
1. The habit or principle of being independent and self-reliant: ‘a culture that celebrates individualism and wealth’
1.1 Self-centered feeling or conduct; egoism.
2. A social theory favouring freedom of action for individuals over collective or state control: ‘encouragement has been given to individualism, free enterprise, and the pursuit of profit’

Hmmmmm… If Individualism is an atheist lie, then the United States was founded by atheists, and no “true” Christian can participate in the U.S. capitalist economy; in fact, a “true” Christian believes in Communism / Socialism, and not in Democracy, as a form of governance.

The fundamental “bottom line” of science:

No “true” Christian should purchase or use any product of “computer science” (including the Internet) unless Jesus Christ can be proven to have invented it.  


Musings on THE SELF /


Excerpts from various posts:

https://aspergerhuman.wordpress.com/2017/07/25/co-consciousness-social-typical-dependence-on-word-thinking/

A child is told who it is, where it belongs, and how to behave, day in and day out, from birth throughout childhood (and indeed, throughout life.) In this way culturally-approved patterns of thought and behavior are imparted, organized and strengthened in the child’s brain. Setting tasks that require following directions (obedience) and asking children to ‘correctly’ answer questions along the way, helps parents and society to discover if the preferred responses are in place.

I don’t remember blurting out “Cogito ergo sum!” in school one day. Achieving awareness of my existence was a misty process, a journey taken before I “knew” of the existence of a “self”. Identity (which is not the same as personality) does not pre-exist; it is constructed. Long before a baby is conceived and born, family and society have composed an identity and a comprehensive world picture for it. The majority of those who belong to a religion or a social class are members by accident of birth, not by choice. We are born into cultures and belief systems; into versions of reality invented by humans long departed.

https://aspergerhuman.wordpress.com/2018/05/20/self-awareness-omg-what-a-hornets-nest/

Self awareness comes as we live our lives: true self-esteem is connected to that process, not as a “before” thing, but an “after” thing: a result of meeting life as it really is, not as a social fantasy. Self awareness is built from the expression of talents and strengths that we didn’t know we possessed. It also arises as we watch the “world’s” pretensions crumble before us. Being able to see one’s existence cast against the immensity of reality, and yet to feel secure, is the measure of finally giving birth to a “self”.

https://aspergerhuman.wordpress.com/2016/10/30/express-yourself-or-express-oneself-social-vs-hyposocial/

As a “hyposocial” individual, tattooing is somewhat of a mystery: tattoos are a social “sign of commitment” to a group or belief system, whether or not that group is large or consists of one other person. My reaction is: But what if you change your mind? What if your “self” changes? The notion of a “static” self is difficult to grasp.

Me, me, me, me, me! The social typical orientation. This is how NTs “look” to me. 

https://aspergerhuman.wordpress.com/2018/05/07/what-is-the-asperger-blank-stare-all-about/

One of the big mistakes that social typicals make is to attribute intent to Asperger behavior. This is because social typicals are “self-oriented” – everything is about THEM; any behavior on the part of a human, dog, cat, plant or lifeform in a distant galaxy, must be directed at THEM. Example: God, or Jesus, or whomever, is believed to be paying attention 24/7 to the most excruciatingly trivial moments in the lives of social typicals. We’re not as patient as God or Jesus.

The Asperger default mental state is a type of reverie, day-dreaming, trance or other “reflective” brain process; that is, we do “intuitive” thinking. The “blank face” is because we do not use our faces to do this type of thinking. 

Sorry – we’re just busy elsewhere! When you ask a question, it can take a few moments to “come out of” our “reverie” and reorient our attention. If you are asking a “general question” that is meant to elicit a “feeling” (social) response, it will land like a dead fish in front of us. Hence the continued “blankness”. 

https://aspergerhuman.wordpress.com/2017/04/11/to-see-with-the-minds-eye-what-does-it-mean/

The self is “imported” from a socio-cultural menu.

It is a very common assumption that all people “think and act” exactly alike. (Thus the insistence that “underneath it all, everyone is the same” – often said by white people to end discussions of racism) When I was a child I also thought that everyone had “the same brain” as if they roll off an assembly line into our skulls, and it created no end of problems! How could people “come up with” bizarre conclusions and irrational explanations for perfectly logical occurrences? And then one day, I realized that my brain “worked” differently than just about everyone I had ever met. This was a giant leap toward self awareness of the good news / bad news type.   

It is exactly this human self-centeredness that makes the “Theory of Mind” and “mind-reading” so laughable.

Neurotypicals assume that the other person thinks and feels as they do: this is a good “guess” when social people account for 99% of the population and the self is “imported” from an extremely limited socio-cultural menu. And, social people are taught to automatically agree with what others say, in order to be considered a “nice person”. 

Who am I?

The answer for me turned out to be simple: I am everything I have ever seen. Meep! Meep!


What is self? / an anthropological concept

A. I. Hallowell on ‘Orientations for the Self’

The following summary of Hallowell’s analysis, as set out in his paper ‘The self and its behavioral environment’ (most easily accessible as Chapter 4 of his book Culture and Experience, 1955; 2nd Edition, 1971, University of Pennsylvania Press), has been taken from A. Lock (1981), ‘Universals in human conception’, in P.L.F. Heelas and A.J. Lock (eds.) Indigenous Psychologies: The Anthropology of the Self. London: Academic Press, pp. 19–36, with minor revisions.

__________________________________

Alfred Irving “Pete” Hallowell (1892–1974) was an American anthropologist, archaeologist and businessman. He was born in Philadelphia, Pennsylvania, and attended the Wharton School of the University of Pennsylvania, receiving his B.S. degree in 1914, his A.M. in 1920, and his Ph.D. in anthropology in 1924. He was a student of the anthropologist Frank Speck. From 1927 through 1963 he was a professor of anthropology at the University of Pennsylvania, except from 1944 through 1947, when he taught the subject at Northwestern University. Hallowell’s main field of study was Native Americans.

_________________________________

NOTE: I’m “looking into” concepts of “self” and “self-awareness” after confronting, over and over again, the claim that “some high number” of Asperger types lack “self-esteem” – another of those sweeping generalities that likely is a ‘social judgement’ from the “outsider” – parent, teacher, psychologist, counselor, therapist, hairdresser, clerk at convenience store, neighbor or any bystander caring to comment on child behavior. This “lack of self-esteem” has become a “fad, cliché, causal certainty” for almost any perceived “human behavioral problem” in American psychology, education, child-rearing, pop-science, media and common gossip. 

My observation of this presentation of “self” (in a socio-cultural context) is that it’s BAD NEWS for Asperger types, or any individual whose inclination is to “develop” his or her own particular expression of self. Here is the problem: self, self awareness, self-control, self-determination and the myriad applications of the concept of “self” are commonly held to be “real things”; they are not. As pointed out in the selection below, in “normdom” the self is “fictitious” – a creation of culture; culture is a creation of selves. 

If an individual is, for some reason, “out of sync” with the concept of self that is a co-creation of “homogeneous individuals” who subscribe to the same “cultural code” of belief, behavior, and perception of “reality” – well, it’s obvious that one is “in trouble” from the start: How does one “grow, create, construct” a familiar, comfortable, interesting, exploratory concept of self in a hostile socio-cultural environment? Even more difficult is the “biological, evolutionary” possibility that one’s brain organization, and indeed, one’s experience of the environment, and perceptual fundamentals, are truly “alien” to those of the majority.

As for “self-esteem” – is this not a concept of social conformity? 

In contemporary culture, the selfie = the self. Posting selfies on social media advertises one’s conformity to a culturally “approved” definition of “self” – which, for girls and women, is an “image only” competition for social status. The desperation of “adult” women to conform to “imaginary standards” results in some very regrettable behavior.

If one’s internalized “picture” of self matches that of what is expected and demanded by the dominant culture, then one is judged to “have self-esteem”. Any person who doesn’t measure up to the cultural “image” (imaginary standard) lacks self-esteem. The most obvious example today, is the crisis of “self-hatred” in young women due to highly distorted “ideals” of body type, promoted by misogynistic American cultural standards. External form is declared to be the self.    

___________________________________________________________________________________________

Excerpt. Full article: http://www.massey.ac.nz/~alock/virtual/hallowel.htm

This info is from the anthropological POV. 

Three things may be said about self-awareness:

(i) Self-awareness is a socio-cultural product. To be self-aware is, by definition, to be able to conceive of one’s individual existence in an objective, as opposed to subjective, manner. In G. H. Mead’s (1934) terms, one must view oneself from ‘the perspective of the other‘. Such a level of psychological functioning is only made possible by the attainment of a symbolic mode of representing the world. Again, this mode of mental life is generally agreed to be dependent upon the existence of a cultural level of social organization. We thus come to a fundamental, though apparently tautologous point: that the existence of culture is predicated upon that of self-awareness; and that the existence of self-awareness is predicated upon that of culture. In the same way as in the course of evolution the structure of the brain is seen as being in a positive-feedback relationship with the nature of the individual’s environment, so it is with culture and self-awareness: the self is constituted by culture which itself constitutes the self.

(ii) Culture defines and constitutes the boundaries of the self: the subjective-objective distinction. It is an evident consequence of being self-aware that if one has some conception of one’s own nature, then one must also have some conception of the nature of things other than oneself, i.e. of the world. Further, this distinction must be encapsulated explicitly in the symbols one uses to mark this polarity. Consequently, a symbolic representation of this divide will have become ‘an intrinsic part of the cultural heritage of all human societies‘ (Hallowell, 1971: 75). Thus, the very existence of a moral order, self-awareness, and therefore human being, depends on the making of some distinction between ‘objective’ (things which are not an intrinsic part of the self) and ‘subjective’ (things which are an intrinsic part of the self).

This categorical distinction, and the polarity it implies, becomes one of the fundamental axes along which the psychological field of the human individual is structured for action in every culture. … Since the self is also partly a cultural product, the field of behaviour that is appropriate for the activities of particular selves in their world of culturally defined objects is not by any means precisely coordinate with any absolute polarity of subjectivity-objectivity that is definable. (Hallowell, 1971: 84)

Similarly, Cassirer (1953: 262), in the context of kinship terminology, writes:

language does not look upon objective reality as a single homogeneous mass, simply juxtaposed to the world of the I, but sees different strata of this reality: the relationship between object and subject is not universal and abstract; on the contrary, we can distinguish different degrees of objectivity, varying according to relative distance from the I.

In other words, there are many facets of reality which are not distinctly classifiable in terms of a polarity between self and non-self, subjective or objective: for example, what exactly is the status of this page – is it an objective entity or part of its author’s selves; an objective entity that would exist as a page, rather than marks on a screen, without a self to read it? Again, am I responsible for all the passions I experience, or am I as much a spectator of some of them as my audience is? While a polarity necessarily exists between the two – subjective and objective/self and non-self – the line between the two is not precise, and may be constituted at different places in different contexts by different cultures. The boundaries of the self and the concomitant boundaries of the world, while drawn of necessity, are both constituted by cultural symbolism, and may be constituted upon differing assumptions.

(iii) The behavioural environment of individual selves is constituted by, and encompasses, different objects. Humans, in contrast to other animals, (that need for human exception again) can be afraid of, for example, the dark because they are able to populate it with symbolically constituted objects: ghosts, bogey men, and various other spiritual beings. (Supernatural, magical entities grew out of “real” danger in the environment: just as “other” animals did, we evolved in natural environments, in which “being afraid of the dark” is a really good reaction to the dark, because it’s populated by highly dangerous predators – it’s still a good “attitude” to have in a human city today.)

As MacLeod (1947) points out,

purely fictitious objects, events and relationships can be just as truly determinants of our behaviour as are those which are anchored in physical reality. (Yes, this is a serious problem in humans; the inability to distinguish natural from supernatural causal explanations leaves us vulnerable to bad decision-making and poor problem-solving.)

In Hallowell’s view (1971: 87):

such objects (supernatural), in some way experienced, conceptualised and reified, may occupy a high rank in the behavioural environment although from a sophisticated Western point of view they are sharply distinguishable from the natural objects of the physical environment.* However, the nature of such objects is no more fictitious, in a psychological sense, than the concept of the self.

*This sweeping claim to “sophistication” is typical over-generalization and arrogance on the part of Western academics, who mistake their (supposedly) superior beliefs for beliefs common to all humans, at least in their cultural “fiefdoms”. The overwhelming American POV is highly religious, superstitious, magical and unsophisticated; the supernatural domain (although imaginary) is considered to be the source of power that creates “everything”.

This self-deception is common: religion exempts itself from scrutiny as to its claims for “absolute truth” above and beyond any rational or scientific description of reality. It’s a case of, “You don’t question my crazy beliefs, and I won’t question yours.” 


Psychology of Pro Hockey Players / Useful Tips for Asperger Types

From Sports Psychology Today 

It’s that time of year: I’m a huge hockey fan, having grown up with the Chicago Blackhawks on black and white TV. No helmets. Scarred faces, missing teeth. Much less padding. But were there more fights than today? I honestly can’t say.

Today it’s (my friend’s) vast-screen TV and obnoxious pregame “shows” (what else can be expected from Las Vegas, that great faux-gold turd in the desert). But I wouldn’t want to deprive anyone of experiencing the “joy of” the Stanley Cup Final. My friend has never been a fan of hockey (or any sport), but after enforced viewing of some of the playoff games, she’s “hooked” already – unfortunately, by the Vegas Golden Knights. My guess concerning my friend’s rapid “seduction” by hockey? Well, it is a fantastic sport, after all! And a chance for female Homo sapiens to observe and enjoy “men being men” without any collateral damage, like in war. They can punch each other all they want; no one else gets hurt.

I have also observed that the “mental qualities” manifested by players may be a guide to help nervous Asperger types approach our confrontations with “hostile” neurotypicals. No kidding!  

Performance Anxiety and Pregame Jitters

http://www.sportpsychologytoday.com/youth-sports-psychology/performance-anxiety-and-pregame-jitters/


Many athletes feel performance anxiety in the opening minutes of the game. You may feel butterflies in your stomach or your heart pounding. Some athletes like to feel pregame jitters before competition. These athletes think of pregame jitters as a sign of readiness and energy. Other athletes think of pregame jitters as a sign of nervousness.

I would say that “pregame jitters” are a fact of life for Asperger types: every social interaction, every day – along with a strong tendency to “rehearse” upcoming events – but aren’t daily practice and visualization vital to athletic excellence? Can we change our “attitude” toward this anticipatory physical phenomenon, and perhaps take a “neutral” view? I know, it seems a difficult task! LOL

Pre-game jitters are a natural part of competing and a sign you are ready to embrace competition. Even the best athletes in the world get the jitters. Michael Leighton, goaltender for the Philadelphia Flyers, admitted to feeling nervous before his first NHL playoff game. “My legs were shaking a little bit, I was nervous,” Leighton said. “Once I made a few saves, you kind of forget about that and just get focused. It kind of goes away.”

This seems applicable to Asperger types; often, once I get to the “performance” part of social interaction, something automatic takes over and I jabber away – 

The mistake many athletes make is interpreting pre-game jitters as a sign that something is wrong. Pregame jitters can be harmful when they don’t go away in the opening minutes of the game. They can cause you to lose confidence and focus. When you’re focused on how nervous you feel, you lose focus on the present task.

Athletes need to embrace the pregame jitters as a sign they are ready to play. Your mental game tip is to stay calm when you experience pregame jitters in the opening minutes. (Yes, but how???) Stay focused on your strategy and what’s important to execute. Pregame jitters are important to help you prepare for the game and they will help you focus your best if you embrace them! Think of it this way: the best athletes get worried if they don’t experience pregame jitters!

Maybe our tendency to rehearse is an asset, if we use that energy to devise a strategy! Is rehearsal another “asset” that NT psychologists misinterpret as a defect?

Listed below are some mental game tips to help you perform your best under pressure and in the big game.

What seem like minor everyday social interactions for NTs can be extremely “big game” status for us! 

1. Develop a consistent pregame routine. (Yes, psychologists judge the preference for establishing routines in ASD / Asperger individuals to be a “developmental defect”. Screw them! We need to use our traits as assets…) A pregame routine can help transition you into the right mindset before competition. While you’re warming up your body, you also want to mentally prepare for the upcoming game. A pregame routine will help you focus your mind, prepare to feel confident, and to trust in your practice. During your pregame routine, remind yourself to trust in the practice you have done leading up to the game.

2. Focus on your game not your competitors. Many athletes tend to make comparisons to their competitors and thus psych themselves out. When you do this, you typically make negative comparisons, which can cause you to lose confidence in your game. Instead of gawking at your competitors, you want to focus on your pregame preparation. You should focus on your strengths and abilities, for example.

3. Focus on the process, not results. Focusing on results causes you to think too far ahead and sets too many expectations for competition. When you focus on the results, you lose focus on the current play, point or shot and you can’t perform your best. Remind yourself that focusing on results doesn’t help you execute. Then, refocus quickly on what’s important, such as the target or your strategy for the current play.

4. Have trust in yourself. Some athletes lose trust and tighten up in the big game. This can cause you to over-control your performance and not play freely. You want your performance to just happen, without thinking too much about “how to” execute your skills. For example, a batter needs to react to the ball instead of thinking about how to make a good swing. Simplify your thoughts to a single thought or image that helps you execute (feeling balanced, for example). Avoid thinking too much about “how to” or technique.

Overall, you want to treat the big game as any other game. You don’t want to place too much importance on one game, which can lead to added pressure, a lack of focus, and trust in your game. Focus on what you do best. Follow your usual pregame routine and mentally prepare for the big game just like you would any other game.

What does all this point to for Asperger types? We must free ourselves from the poisonous messages we have received all our lives from neurotypicals who “judge” our traits and behaviors as “defective and subhuman” – and USE our cognitive skills and superior sensory abilities to our advantage! This is very different from submitting to being “trained monkeys,” as social humans demand of us.