Helen Mirren on Vasily Kandinsky / MoMA Video


Much has been written – and is available online – about Kandinsky the “petrified icon”. I will only comment that, to me, these two works are the same painting.

Composition VIII, 1923

Composition IV, 1911

 

Aspie Motor Skills / Do you do this?

I have a problem with clockwise, counter-clockwise motions. This shows up most often with screw caps and lids, from jelly jars to toothpaste tubes to juice bottles. I get so frustrated that I either just screw the lid on crooked, or jam it on, if possible. This leads to much “spilled milk” if I happen to drop or knock over the container later (which I do frequently). Case in point: Recently my dog knocked over a water bottle into my waterproof camera bag, turning it into an aquarium. The cap popped off because it was only partly screwed on – and crooked. End of camera…

I have a solution, but it’s a bit odd. Instead of rotating the cap or lid, I hold the cap stationary in my left hand – and rotate the jar or tube with my right hand, “screwing” the container into the lid. This is fine except for large or heavy containers, like milk jugs.

I’m still using my right hand to do the rotating, but to turn the container, not the lid.

I have no trouble “getting” a VISUAL DIAGRAM of the principle / action of torque, so why does my brain not “send” the correct signals to my body?

Balanced torques – do the math! The block of unknown weight tends to rotate the system of blocks and stick counterclockwise, and the 20-N block tends to rotate the system clockwise. The system is in balance when the two torques are equal: counterclockwise torque = clockwise torque.
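The caption’s arithmetic is a one-liner. A minimal sketch (the lever-arm distances below are hypothetical, since the figure’s actual distances aren’t reproduced here):

```python
# Balanced torques: W * d_ccw = 20 N * d_cw, so W = 20 N * d_cw / d_ccw.
# The distances are made up for illustration; only the balance rule is from the caption.
def balancing_weight(known_weight_n: float, known_arm_m: float, unknown_arm_m: float) -> float:
    """Weight (in newtons) that balances the stick about its pivot."""
    return known_weight_n * known_arm_m / unknown_arm_m

# e.g. a 20-N block 0.4 m to the right of the pivot, unknown block 0.1 m to the left:
w = balancing_weight(20.0, 0.4, 0.1)   # → 80.0 N
```

Doubling the unknown block’s lever arm would halve the weight needed – which is the whole point of the “visual diagram” of torque being easy to grasp.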

Note: If I were a “hunter-gatherer” I would NOT be the person doing manual detail work: weaving baskets and fish nets, making tools or jewelry, or constructing clothing. I’d be out “purposefully wandering,” looking for resources – “treasure hunting” game locations, useful materials, and water sources, making maps and sketches and returning with samples. A “human camera” of sorts.

Debunking Left Brain, Right Brain Myth / Paper – U. Utah Neuroscience

An Evaluation of the Left-Brain vs. Right-Brain Hypothesis with Resting State Functional Connectivity Magnetic Resonance Imaging

Jared A. Nielsen et al., Interdepartmental Program in Neuroscience, University of Utah, Salt Lake City, Utah, United States of America (see original for full author list and affiliations)

Published: August 14, 2013

https://doi.org/10.1371/journal.pone.0071275 (Extensive, heavy-going technical paper with loads of supporting graphics, etc.)

Abstract

Lateralized brain regions subserve functions such as language and visuospatial processing. It has been conjectured that individuals may be left-brain dominant or right-brain dominant based on personality and cognitive style, but neuroimaging data has not provided clear evidence whether such phenotypic differences in the strength of left-dominant or right-dominant networks exist. We evaluated whether strongly lateralized connections covaried within the same individuals. Data were analyzed from publicly available resting state scans for 1011 individuals between the ages of 7 and 29. For each subject, functional lateralization was measured for each pair of 7266 regions covering the gray matter at 5-mm resolution as a difference in correlation before and after inverting images across the midsagittal plane. The difference in gray matter density between homotopic coordinates was used as a regressor to reduce the effect of structural asymmetries on functional lateralization. Nine left- and 11 right-lateralized hubs were identified as peaks in the degree map from the graph of significantly lateralized connections. The left-lateralized hubs included regions from the default mode network (medial prefrontal cortex, posterior cingulate cortex, and temporoparietal junction) and language regions (e.g., Broca Area and Wernicke Area), whereas the right-lateralized hubs included regions from the attention control network (e.g., lateral intraparietal sulcus, anterior insula, area MT, and frontal eye fields). Left- and right-lateralized hubs formed two separable networks of mutually lateralized regions. Connections involving only left- or only right-lateralized hubs showed positive correlation across subjects, but only for connections sharing a node. 
Lateralization of brain connections appears to be a local rather than global property of brain networks, and our data are not consistent with a whole-brain phenotype of greater “left-brained” or greater “right-brained” network strength across individuals. Small increases in lateralization with age were seen, but no differences in gender were observed.
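The flip-based lateralization measure described in the abstract can be sketched in a few lines. This is a toy illustration only – the region count, signals, and `mirror` pairing are invented, and the actual pipeline also regresses out gray-matter asymmetries:

```python
import numpy as np

# Toy model of the paper's metric: functional lateralization of a connection is
# the difference in correlation before and after mirroring the brain across the
# midsagittal plane. Mirroring is modeled here by swapping each region's time
# series with its homotopic (opposite-hemisphere) twin.
rng = np.random.default_rng(0)
n_regions, n_timepoints = 6, 200               # 3 invented left/right homotopic pairs
signals = rng.standard_normal((n_regions, n_timepoints))
mirror = np.array([1, 0, 3, 2, 5, 4])          # region i's homotopic partner

corr = np.corrcoef(signals)                    # connectivity before mirroring
corr_flipped = corr[np.ix_(mirror, mirror)]    # connectivity after mirroring
lateralization = corr - corr_flipped           # signed laterality per connection
```

One sanity check falls out of the construction: a connection and its mirrored counterpart always carry equal and opposite lateralization values, since flipping twice returns the original image.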

From Discussion

In popular reports, “left-brained” and “right-brained” have become terms associated with both personality traits and cognitive strategies, with a “left-brained” individual or cognitive style typically associated with a logical, methodical approach and “right-brained” with a more creative, fluid, and intuitive approach. Based on the brain regions we identified as hubs in the broader left-dominant and right-dominant connectivity networks, a more consistent schema might include left-dominant connections associated with language and perception of internal stimuli, and right-dominant connections associated with attention to external stimuli.

Yet our analyses suggest that an individual brain is not “left-brained” or “right-brained” as a global property, but that asymmetric lateralization is a property of individual nodes or local subnetworks, and that different aspects of the left-dominant network and right-dominant network may show relatively greater or lesser lateralization within an individual. If a connection involving one of the left hubs is strongly left-lateralized in an individual, then other connections in the left-dominant network also involving this hub may also be more strongly left lateralized, but this did not translate to a significantly generalized lateralization of the left-dominant network or right-dominant network. Similarly, if a left-dominant network connection was strongly left lateralized, this had no significant effect on the degree of lateralization within connections in the right-dominant network, except for those connections where a left-lateralized connection included a hub that was overlapping or close to a homotopic right-lateralized hub.

It is also possible that the relationship between structural lateralization and functional lateralization is more than an artifact. Brain regions with more gray matter in one hemisphere may develop lateralization of brain functions ascribed to those regions. Alternately, if a functional asymmetry develops in a brain region, it is possible that there may be hypertrophy of gray matter in that region. The extent to which structural and functional asymmetries co-evolve in development will require further study, including imaging at earlier points in development and with longitudinal imaging metrics, and whether asymmetric white matter projections [52], [53] contribute to lateralization of functional connectivity.

We observed a weak generalized trend toward greater lateralization of connectivity with age between the 20 hubs included in the analysis, but most individual connections did not show significant age-related changes in lateralization. The weak changes in lateralization with age should be interpreted with caution because the correlations included >1000 data points, so very subtle differences may be observed that are not associated with behavioral or cognitive differences. Prior reports with smaller sample sizes have reported differences in lateralization during adolescence in prefrontal cortex [54] as well as decreased structural asymmetry with age over a similar age range [55].

Similarly, we saw no differences in functional lateralization with gender. These results differ from prior studies in which significant gender differences in functional connectivity lateralization were reported [16], [17]. This may be due to differing methods between the two studies, including the use of short-range connectivity in one of the former reports and correction for structural asymmetries in this report. A prior study performing graph-theoretical analysis of resting state functional connectivity data using a predefined parcellation of the brain also found no significant effects of hemispheric asymmetry with gender, but reported that males tended to be more locally efficient in their right hemispheres and females tended to be more locally efficient in their left hemispheres [56].

It is intriguing that two hubs of both the left-lateralized and right-lateralized network are nearly homotopic. Maximal left-lateralization in Broca Area corresponds to a similar right-lateralized homotopic cluster extending to include the anterior insula in the salience network. Although both networks have bilateral homologues in the inferior frontal gyrus/anterior insular region, it is possible that the relative boundaries of Broca Homologue on the right and the frontoinsular salience region may “compete” for adjacent brain cortical function. Future studies in populations characterized for personality traits [57] or language function may be informative as to whether local connectivity differences in these regions are reflected in behavioral traits or abilities. The study is limited by the lack of behavioral data and subject ascertainment available in the subject sample. In particular, source data regarding handedness is lacking. However, none of the hubs in our left- and right- lateralized networks involve primary motor or sensory cortices and none of the lateralized connections showed significant correlation with metrics of handedness in subjects for whom data was available.

Despite the need for further study of the relationship between behavior and lateralized connectivity, we demonstrate that left- and right-lateralized networks are homogeneously stronger among a constellation of hubs in the left and right hemispheres, but that such connections do not result in a subject-specific global brain lateralization difference that favors one network over the other (i.e. left-brained or right-brained). Rather, lateralized brain networks appear to show local correlation across subjects with only weak changes from childhood into early adulthood and very small if any differences with gender.


Debunking Left Brain, Right Brain Myth / PLOS Paper – Corballis

Left Brain, Right Brain: Facts and Fantasies

Michael C. Corballis, Affiliation School of Psychology, University of Auckland, Auckland, New Zealand

Published: January 21, 2014

https://doi.org/10.1371/journal.pbio.1001767 (open access; see original for more)

Summary

Handedness and brain asymmetry are widely regarded as unique to humans, and associated with complementary functions such as a left-brain specialization for language and logic and a right-brain specialization for creativity and intuition. In fact, asymmetries are widespread among animals, and support the gradual evolution of asymmetrical functions such as language and tool use. Handedness and brain asymmetry are inborn and under partial genetic control, although the gene or genes responsible are not well established. Cognitive and emotional difficulties are sometimes associated with departures from the “norm” of right-handedness and left-brain language dominance, more often with the absence of these asymmetries than their reversal.

Evolution of Brain Asymmetries, with Implications for Language

One myth that persists even in some scientific circles is that asymmetry is uniquely human [3]. Left–right asymmetries of brain and behavior are now known to be widespread among both vertebrates and invertebrates [11], and can arise through a number of genetic, epigenetic, or neural mechanisms [12]. Many of these asymmetries parallel those in humans, or can be seen as evolutionary precursors. A strong left-hemispheric bias for action dynamics in marine mammals and in some primates and the left-hemisphere action biases in humans, perhaps including gesture, speech, and tool use, may derive from a common precursor [13]. A right-hemisphere dominance for emotion seems to be present in all primates so far investigated, suggesting an evolutionary continuity going back at least 30 to 40 million years [14]. A left-hemisphere dominance for vocalization has been shown in mice [15] and frogs [16], and may well relate to the leftward dominance for speech—although language itself is unique to humans and is not necessarily vocal, as sign languages remind us. Around two-thirds of chimpanzees are right-handed, especially in gesturing [17] and throwing [18], and also show left-sided enlargement in two cortical areas homologous to the main language areas in humans—namely, Broca’s area [19] and Wernicke’s area [20] (see Figure 1). These observations have been taken as evidence that language did not appear de novo in humans, as argued by Chomsky [21] and others, but evolved gradually through our primate lineage [22]. They have also been interpreted as evidence that language evolved not from primate calls, but from manual gestures [23]–[25].

Some accounts of language evolution (e.g., [25]) have focused on mirror neurons, first identified in the monkey brain in area F5 [26], a region homologous to Broca’s area in humans, but now considered part of an extensive network more widely homologous to the language network [27]. Mirror neurons are so called because they respond when the monkey performs an action, and also when they see another individual performing the same action. This “mirroring” of what the monkey sees onto what it does seems to provide a natural platform for the evolution of language, which likewise can be seen to involve a mapping of perception onto production. The motor theory of speech perception, for example, holds that we perceive speech sounds according to how we produce them, rather than through acoustic analysis [28]. Mirror neurons in monkeys also respond to the sounds of such physical actions as ripping paper or dropping a stick onto the floor, but they remain silent to animal calls [29]. This suggests an evolutionary trajectory in which mirror neurons emerged as a system for producing and understanding manual actions, but in the course of evolution became increasingly lateralized to the left brain, incorporating vocalization and gaining grammar-like complexity [30]. The left hemisphere is dominant for sign language as for spoken language [31].

Mirror neurons themselves have been victims of hyperbole and myth [32], with the neuroscientist Vilayanur Ramachandran once predicting that “mirror neurons will do for psychology what DNA did for biology” [33]. As the very name suggests, mirror neurons are often taken to be the basis of imitation, yet nonhuman primates are poor imitators. Further, the motor theory of speech perception does not account for the fact that speech can be understood by those deprived of the ability to speak, such as those with damage to Broca’s area. Even chimpanzees [34] and dogs [35] can learn to respond to simple spoken instructions, but cannot produce anything resembling human speech. An alternative is that mirror neurons are part of a system for calibrating movements to conform to perception, as a process of learning rather than direct imitation. A monkey repeatedly observes its hand movements to learn to reach accurately, and the babbling infant calibrates the production of sounds to match what she hears. Babies raised in households where sign language is used “babble” by making repetitive movements of the hands [36]. Moreover, it is this productive aspect of language, rather than the mechanisms of understanding, that shows the more pronounced bias to the left hemisphere [37].

Inborn Asymmetries

Handedness and cerebral asymmetries are detectable in the fetus. Ultrasound recording has shown that by the tenth week of gestation, the majority of fetuses move the right arm more than the left [38], and from the 15th week most suck the right thumb rather than the left [39]—an asymmetry strongly predictive of later handedness [40] (see Figure 2). In the first trimester, a majority of fetuses show a leftward enlargement of the choroid plexus [41], a structure within the ventricles known to synthesize peptides, growth factors, and cytokines that play a role in neurocortical development [42]. This asymmetry may be related to the leftward enlargement of the temporal planum (part of Wernicke’s area), evident at 31 weeks [43].

 In these prenatal brain asymmetries, around two-thirds of cases show the leftward bias. The same ratio applies to the asymmetry of the temporal planum in both infants and adults [44]. The incidence of right-handedness in the chimpanzee is also around 65–70 percent, as is a clockwise torque, in which the right hemisphere protrudes forwards and the left hemisphere rearwards, in both humans and great apes [45]. These and other asymmetries have led to the suggestion that a “default” asymmetry of around 65–70 percent, in great apes as well as humans, is inborn, with the asymmetry of human handedness and cerebral asymmetry for language increased to around 90 percent by “cultural literacy” [46].

Variations in Asymmetry

Whatever their “true” incidence, variations in handedness and cerebral asymmetry raise doubts as to the significance of the “standard” condition of right-handedness and left-cerebral specialization for language, along with other qualities associated with the left and right brains that so often feature in popular discourse. Handedness and cerebral asymmetry are not only variable, they are also imperfectly related. Some 95–99 percent of right-handed individuals are left-brained for language, but so are about 70 percent of left-handed individuals. Brain asymmetry for language may actually correlate more highly with brain asymmetry for skilled manual action, such as using tools [47],[48], which again supports the idea that language itself grew out of manual skill—perhaps initially through pantomime.

Even when the brain is at rest, brain imaging shows that there are asymmetries of activity in a number of regions. A factor analysis of these asymmetries revealed four different dimensions, each mutually uncorrelated. Only one of these dimensions corresponded to the language regions of the brain; the other three had to do with vision, internal thought, and attention [49]—vision and attention were biased toward the right hemisphere, language and internal thought to the left. This multidimensional aspect throws further doubt on the idea that cerebral asymmetry has some unitary and universal import.

Handedness, at least, is partly influenced by parental handedness, suggesting a genetic component [50], but genes can’t tell the whole story. For instance some 23 percent of monozygotic twins, who share the same genes, are of opposite handedness [51]. These so-called “mirror twins” have themselves fallen prey to a Through the Looking Glass myth; according to Martin Gardner [52], Lewis Carroll intended the twins Tweedledum and Tweedledee in that book to be enantiomers, or perfect three-dimensional mirror images in bodily form as well as in hand and brain function. Although some have argued that mirroring arises in the process of twinning itself [53],[54], large-scale studies suggest that handedness [55],[56] and cerebral asymmetry [57] in mirror twins are not subject to special mirroring effects. In the majority of twins of opposite handedness the left hemisphere is dominant for language in both twins, consistent with the finding that the majority of single-born left-handed individuals are also left-hemisphere dominant for language. In twins, as in the singly born, it is estimated that only about a quarter of the variation in handedness is due to genetic influences [56].

The manner in which handedness is inherited has been most successfully modeled by supposing that a gene or genes influence not whether the individual is right- or left-handed, but whether a bias to right-handedness will be expressed or not. In those lacking the “right shift” bias, the direction of handedness is a matter of chance; that is, left-handedness arises from the lack of a bias toward the right hand, and not from a “left-hand gene.” Such models can account reasonably well for the parental influence [58]–[60], and even for the relation between handedness and cerebral asymmetry if it is supposed that the same gene or genes bias the brain toward a left-sided dominance for speech [60],[61]. It now seems likely that a number of such genes are involved, but the basic insight that genes influence whether or not a given directional bias is expressed, rather than whether or not it can be reversed, remains plausible (see Box 1).

Genetic considerations aside, departures from right-handedness or left-cerebral dominance have sometimes been linked to disabilities. In the 1920s and 1930s, the American physician Samuel Torrey Orton attributed both reading disability and stuttering to a failure to establish cerebral dominance [62]. Orton’s views declined in influence, perhaps in part because he held eccentric ideas about interhemispheric reversals giving rise to left–right confusions [63], and in part because learning-theory explanations came to be preferred to neurological ones. In a recent article, Dorothy Bishop reverses Orton’s argument, suggesting that weak cerebral lateralization may itself result from impaired language learning [64]. Either way, the idea of an association between disability and failure of cerebral dominance may be due for revival, as recent studies have suggested that ambidexterity, or a lack of clear handedness or cerebral asymmetry, is indeed associated with stuttering [65] and deficits in academic skills [66], as well as mental health difficulties [67] and schizophrenia (see Box 1).

Although it may be the absence of asymmetry rather than its reversal that can be linked to problems of social or educational adjustment, left-handed individuals have often been regarded as deficient or contrarian, but this may be based more on prejudice than on the facts. Left-handers have excelled in all walks of life. They include five of the past seven US presidents, sports stars such as Rafael Nadal in tennis and Babe Ruth in baseball, and Renaissance man Leonardo da Vinci, perhaps the greatest genius of all time.



Psychological Nuttiness Strikes Again / Theories of Emotion

from verywell.com

What Are the 6 Major Theories of Emotion?

Some of the Major Theories to Explain Human Emotions

By Kendra Cherry, Updated May 10, 2017

What Is Emotion?

In psychology, emotion is often defined as a complex state of feeling that results in physical and psychological changes that influence thought and behavior. (We’re knee deep in magical thinking already – inverted and circular “reasoning” at the same time!)

Emotionality is associated with a range of psychological phenomena, including temperament, personality, mood, and motivation. According to author David G. Meyers, human emotion involves “…physiological arousal, expressive behaviors, and conscious experience.” (Just what do “psychological” and “conscious” mean here? Psychology is rife with “opportunities” for misinformation and crazy interpretation because it lacks self-regulation for standards of “scientific behavior” on the part of its researchers and practitioners. It is a “secular religion”)

Theories of Emotion

The major theories of motivation (?) can be grouped into three main categories: physiological, neurological, and cognitive. (This implies that neurological activity and cognitive activity are not physical phenomena.) Physiological theories suggest that responses within the body are responsible for emotions.

Neurological theories propose that activity within the brain leads to emotional responses. Finally, cognitive theories argue that thoughts and other mental activity play an essential role in forming emotions. (That chopping up into categorical objects again – thoughts, and whatever other “mental activity” refers to, are held to be objects that act on other objects. Psychology is hopelessly stuck in a pre-20th C. conception of “physics.”)

Where have psychologists been for the past 100+ years of scientific revolution?

Evolutionary Theory of Emotion

It was naturalist Charles Darwin (also a geologist) who proposed that emotions evolved because they were adaptive and allowed humans and animals to survive and reproduce. Feelings of love and affection lead people to seek mates and reproduce. Feelings of fear compel people to either fight or flee the source of danger. (Oh dear, the social narrative intrudes, as usual)

According to the evolutionary theory of emotion, our emotions exist because they serve an adaptive role. Emotions motivate people to respond quickly to stimuli in the environment, which helps improve the chances of success and survival. (Standard social blah, blah, blah)

Understanding the emotions of other people and animals also plays a crucial role in safety and survival. If you encounter a hissing, spitting, and clawing animal, chances are you will quickly realize that the animal is frightened or defensive and leave it alone. By being able to interpret correctly the emotional displays of other people and animals, you can respond correctly and avoid danger. (That’s it? That’s not a theory. That’s a script for a PBS kid’s show.)

The James-Lange Theory of Emotion

The James-Lange theory is one of the best-known examples of a physiological theory of emotion. Independently proposed by psychologist William James and physiologist Carl Lange, the James-Lange theory of emotion suggests that emotions occur as a result of physiological reactions to events. (A scientific theory does not “suggest” – it produces one or more testable hypotheses; generates valid experiments and must be independently confirmed or disproven. Neurotypicals reject this method, because they only believe in “social” authority. Independent “reality” does not exist for them.)

This theory suggests that when you see (or sense – we have multiple senses) an external stimulus, it leads to a physiological reaction. (This is so.) Your emotional reaction is dependent upon how you interpret those physical reactions.

For example, suppose you are walking in the woods and you see a grizzly bear. You begin to tremble, and your heart begins to race. The James-Lange theory proposes that you will interpret your physical reactions and conclude that you are frightened (“I am trembling. Therefore, I am afraid”). According to this theory of emotion, you are not trembling because you are frightened. Instead, you feel frightened because you are trembling.

(Amazing how the standard “fear response” – common to primates, mammals and other animals – can supposedly be “negated” by “pausing” to think about what’s going on and coming up with a “cognitive interpretation” of one’s physiologic response to an ACTUAL threat – the presence of a grizzly bear. Fear is an instinctual response – WHATEVER WORD(S) YOU CHOOSE TO DESCRIBE IT. This scenario is plausible and applicable only if there is no danger present. If you are sitting quietly in your living room and experience the rush of adrenaline, etc., that is the FFF response, you might stop to think, “Gee, there’s no danger present, but I feel afraid – this must be a ‘false alarm’” – and this realization may result in a cessation of the physiological response. But anyone who makes this “interpretation” when confronted by an actual threat will be in serious trouble.)

The Cannon-Bard Theory of Emotion

Another well-known physiological theory is the Cannon-Bard theory of emotion. Walter Cannon disagreed with the James-Lange theory of emotion on several different grounds. First, he suggested, people can experience physiological reactions linked to emotions (?) without actually feeling those emotions. For example, your heart might race because you have been exercising and not because you are afraid. (Mind-boggling)

Cannon also suggested that emotional responses occur much too quickly for them to be simply products of physical states. (Beyond mind-boggling)

When you encounter a danger in the environment, you will often feel afraid before you start to experience the physical symptoms associated with fear such as shaking hands, rapid breathing, and a racing heart. (Okay, this is simply stupid! We are confronted again by “supernatural” fear that precedes the actual physical response that IS FEAR. And this “supernatural” power travels faster than the speed of light. LOL!)

Cannon first proposed his theory in the 1920s and his work was later expanded on by physiologist Philip Bard during the 1930s. According to the Cannon-Bard theory of emotion, we feel emotions and experience physiological reactions such as sweating, trembling, and muscle tension simultaneously.

(Gee, could it be that these two “categorical objects” are one and the same phenomenon – that emotions ARE physiological responses? This is an example of the archaic conception of “mind and body” as separate “things” – and the attribution to a supernatural dimension of the “magical patterns and templates” that are believed to “create” reality.)

More specifically, it is suggested that emotions result when the thalamus sends a message to the brain in response to a stimulus, resulting in a physiological reaction. At the same time, the brain also receives signals (via amorphous goo from the supernatural dimension?) triggering the emotional experience. Cannon and Bard’s theory suggests that the physical and psychological experience of emotion happen at the same time and that one does not cause the other. (Separate but equal? That’s justice!)

(The neurotypical brain simply cannot let go of the “magical thinking” stage common in childhood, which attributes all phenomena to MAGICAL POWERS that defy physical reality. ‘Psychological’ refers to the imaginary explanations and narratives that are necessary to the neotenic brain, which is frozen in infantile conceptions. These narratives are created by social indoctrination into a subjective and isolated cultural context)

Schachter-Singer Theory

Also known as the two-factor theory of emotion, the Schachter-Singer Theory is an example of a cognitive theory of emotion. This theory suggests that the physiological arousal occurs first, and then the individual must identify the reason for this arousal to experience and label it as an emotion. (At last – someone recognizes “emotion words” as LABELS) A stimulus leads to a physiological response that is then cognitively interpreted and labeled which results in an emotion. (AYE, yai, yai! The “emotion” IS the physiological response. The “labels” are the myriad words that children are taught to use to “parse” the physical experience into socially-approved verbal expressions. Only social humans could invent this awkward imposition of “cognition as verbal manipulation” as existing prior to instinct in evolution.)

Schachter and Singer’s theory draws on both the James-Lange theory and the Cannon-Bard theory of emotion. Like the James-Lange theory, the Schachter-Singer theory proposes that people do infer emotions based on physiological responses. The critical factor is the situation and the cognitive interpretation that people use to label that emotion. (My head hurts, my stomach hurts, I’m out of exclamations of shock and disbelief. Children “learn” to label physiological response as “verbal” expressions, which are specific to their particular social and cultural context. Many societies also demand that “physical emotion responses” be quashed, hidden or forbidden expression.)

Like the Cannon-Bard theory, the Schachter-Singer theory also suggests that similar physiological responses can produce varying emotions. For example, if you experience a racing heart and sweating palms during an important math exam, you will probably identify the emotion as anxiety. If you experience the same physical responses on a date with your significant other, you might interpret those responses as love, affection, or arousal.

(This demolishes the idea that “emotions” are distinct categories of experience or “objects” in the brain, body or supernatural dimension. The ever-expanding array of “parts” that constitute brain and body in Western culture is astounding – and imaginary. The incredible number of “emotion words” in languages, do not each correspond to “an emotion”. They are invented labels.)

Cognitive Appraisal Theory

According to appraisal theories of emotion, thinking must occur first before experiencing emotion. Richard Lazarus was a pioneer in this area of emotion, and this theory is often referred to as the Lazarus theory of emotion.

According to this theory, the sequence of events first involves a stimulus, followed by thought which then leads to the simultaneous experience of a physiological response and the emotion. For example, if you encounter a bear in the woods, you might immediately begin to think that you are in great danger. This then leads to the emotional experience of fear and the physical reactions associated with the fight-or-flight response. (Nonsense again – this conceit that “conscious thinking via verbal language” is SUPERIOR to instinct screws up analysis of “how things work”. The effectiveness of instinct is that you don’t have to THINK ABOUT IT! Instinctual behavior is automatic and has been aiding survival of myriad species for hundreds of millions of years!)

Facial-Feedback Theory of Emotion

The facial-feedback theory of emotions suggests that facial expressions are connected to experiencing emotions. (That does not a theory make) Charles Darwin and William James both noted early on that sometimes physiological responses often had a direct impact on emotion (for the love of sanity: the physiological response IS EMOTION), rather than simply being a consequence of the emotion. Supporters of this theory suggest that emotions are directly tied to changes in facial muscles. For example, people who are forced to smile pleasantly at a social function will have a better time at the event than they would if they had frowned or carried a more neutral facial expression.

(The “jump” from “reverse smiling” – mimicry – which may stimulate a pleasant “feeling” to the socially-mandated “having a better time at an event” demonstrates belief in contagious magic.)

What is the experiential phenomenon that is called EMOTION?

Emotion in animals is pretty simple: a subjective physiological reaction to “something” in the environment. What we call “emotion” is activation of the familiar “fight, flight or freeze response” that results from sensory stimulation, and is usually attuned to “danger”.

Emotion is a word: a noun, which designates an object that can be “named” – but the physical phenomenon is not an object: the naming of “emotions” is a socio-cultural activity. Nature never created an “emotion thing” that resides somewhere inside a human or animal; like other animals, we have a brain and nervous system which interacts with the environment, ostensibly for our benefit – to promote survival. Humans created the social “idea monstrosity” that claims to be “the truth” about how Homo sapiens works. Emotions are presented as parts “inside of you” – their location has been argued over forever! The heart, brain, gut, mysterious fluids, etc. have been given the attribution as the “seat” of emotion. Most “social” views of emotion are negative: weird and destructive animal inheritances that must be controlled, not surprisingly, by society!

Peculiar dogma plagues our concepts and application of “emotion rules” –  notions which are purely cultural and do not “transfer” from Western psychology to “all humans”. Psychology demands the conceit that ALL HUMANS are mere replicas of “normal humans” who happen to be white males; underneath all the obvious  “human diversity” of size, form, skin color, hair types, skull dimensions, manners, behaviors and individual preferences is a “white male” prototype. “Evolution” is deemed to be a “mistake” – all humans were meant to be white males in thought, behavior and belief; inferior mistakes ought to at least “mimic” their superiors.

This promotion of a bizarre “evolutionary” fantasy sounds ridiculous when plainly stated; a farce, a narrative born of childish arrogance, a sociopathic “plan” for world domination, and yet this Western psychological addiction to imaginary superiority is supported, promoted and fed by American Psychology – in theory, policy and practice.

As usual, we must go back to basics to untangle the mess surrounding “emotions” and the “off-topic” arguments over good and evil, positive and negative, male and female, race and class, biology and religion, authority and expertise and supernatural origins, which are indulged as serious consequences of human beliefs (not facts) of what we call “emotions” – fact, myth and propaganda.

Example 1.

From Gerrig, Richard J. & Philip G. Zimbardo (a self-diagnosed psychopath, BTW). Psychology and Life, 16th ed. Published by Allyn and Bacon, Boston, MA. Copyright (c) 2002 by Pearson Education.

Emotion:  A complex pattern of changes, including physiological arousal, feelings, cognitive processes, and behavioral reactions, made in response to a situation perceived to be personally significant. (Wow! Considerable “mumbo-jumbo” ahead)

Emotional intelligence: Type of intelligence defined as the abilities to perceive, appraise, and express emotions accurately and appropriately, to use emotions to facilitate thinking, to understand and analyze emotions, to use emotional knowledge effectively, and to regulate one’s emotions to promote both emotional and intellectual growth. (See? Mumbo-jumbo of the ‘throw in every Psych-concept cliché you can think of’ type)

Example 2.

Paul Thagard Ph.D./ What Are Emotions? / April 15, 2010

Happiness is a brain process 

Philosophers and psychologists have long debated the nature of emotions such as happiness. Are they states of supernatural souls, cognitive judgments about goal satisfaction, or perceptions of physiological changes? Advances in neuroscience suggest how brains generate emotions through a combination of cognitive appraisal and bodily perception.

Suppose that something really good happens to you today: you win the lottery, your child gets admitted to Harvard, or someone you’ve been interested in asks you out. Naturally, you feel happy, but what does this happiness amount to? On the traditional dualist view of a person, you consist of both a body and a soul, and it is the soul that experiences mental states such as happiness. This view has the appealing implication that you can even feel happiness after your body is gone, if your soul continues to exist in a pleasant location such as heaven. Unfortunately, there is no good evidence for the existence of the soul and immortality, so the dualist view of emotions and the mind in general can be dismissed as wishful thinking or motivated inference. (Not so fast: this “duality” remains the hard-core belief of the “majority” of people in the U.S. And, as we shall see, in American Psychology.)

There are currently two main scientific ways of explaining the nature of emotions. According to the cognitive appraisal theory emotions are judgments about the extent that the current situation meets your goals. Happiness is the evaluation that your goals are being satisfied, as when winning the lottery solves your financial problems and being asked out holds the promise of satisfying your romantic needs. Similarly, sadness is the evaluation that your goals are not being satisfied, and anger is the judgment aimed at whatever is blocking the accomplishment of your goals. (BTW, this is not a scientific theory – it is a social narrative)

Alternatively, William James and others have argued that emotions are perceptions of changes in your body such as heart rate, breathing rate, perspiration, and hormone levels. (A reasonable proposition based in physiology) On this view, happiness is a kind of physiological perception, not a judgment, and other emotions such as sadness and anger are mental reactions (why is “mental” used here? That “ghostly” duality again!) to different kinds of physiological states. The problem with this account is that bodily states do not seem to be nearly as finely tuned as the many different kinds of emotional states. Yet there is undoubtedly some connection between emotions and physiological changes. (OMG! This is a rambling misconception of a “supernatural origin of emotions” and refutation of physical reality as the foundation for valid hypotheses about thought and behavior in humans. This brilliantly demonstrates the serious mistake of believing that words are “actual objects” that precede and supersede physical reality. This is word magic – the belief that words have the power to create reality – Abracadabra!)

Understanding how the brain works shows that these theories of emotion – cognitive appraisal and physiological perception – can be combined into a unified account of emotions. (are you ready for some fabulous psych nonsense?) The brain is a parallel processor, doing many things at once. Visual and other kinds of perception are the result of both inputs from the senses and top-down interpretations based on past knowledge. Similarly, the brain can perform emotions by interactively combining both high-level judgments about goal satisfactions and low-level perceptions of bodily changes. The judgments are performed by the prefrontal cortex which interacts with the amygdala and insula that process information about physiological states. Hence happiness can be a brain process that simultaneously makes appraisals and perceives the body. For details about how this might work, see the EMOCON model of emotional consciousness.

Before we proceed to, Major Theories of Emotion,

(I desperately need a break)

let’s peruse a few “general” definitions of emotion.

Word origin of ’emotion’: from Old French esmovoir to excite, from Latin ēmovēre to disturb, from movēre to move (this is the same, regardless of the specific definition)

Note how many “non-physical” reference words are included

Thanks to FARLEX ONLINE, which collects stuff for you, in one place.

emotion

a state of arousal characterized by alteration of feeling tone and by physiologic behavioral changes. The external manifestation of emotion is called affect; a pervasive and sustained emotional state, mood. The physical form of emotion may be outward and evident to others, as in crying, laughing, blushing, or a variety of facial expressions. However, emotion is not always reflected in one’s appearance and actions even though psychic changes (duality again) are taking place. Joy, grief, fear, and anger are examples of emotions.

Miller-Keane Encyclopedia and Dictionary of Medicine, Nursing, and Allied Health, Seventh Edition. © 2003 by Saunders, an imprint of Elsevier, Inc. All rights reserved.

emotion

A strong feeling, aroused mental state, or intense state of drive or unrest, which may be directed toward a definite object and is evidenced in both behavior and in psychological changes, with accompanying autonomic nervous system manifestations.

Farlex Partner Medical Dictionary © Farlex 2012

emotion

a strong feeling state, arising subjectively and directed toward a specific object, with physiological, somatic, and behavioral components.

Dorland’s Medical Dictionary for Health Consumers. © 2007 by Saunders, an imprint of Elsevier, Inc. All rights reserved.

emotion

1. A mental state that arises spontaneously rather than through conscious effort and is often accompanied by physiological changes; a feeling: the emotions of joy, sorrow, and anger.

2. Such mental states or the qualities that are associated with them, especially in contrast to reason: a decision based on emotion rather than logic. (That duality again, when “reason” and emotion are not opposed in human behavior, but work together!)

The American Heritage® Medical Dictionary Copyright © 2007, 2004 by Houghton Mifflin Company. Published by Houghton Mifflin Company. All rights reserved.

emotion

1 the outward expression or display of mood or feeling states.

2 the affective aspect of consciousness as compared with volition and cognition. Physiological alterations often occur with a marked change of emotion regardless of whether the feelings are conscious or unconscious, expressed or unexpressed. See also emotional need, emotional response. (“Conceptual clichés” again)

Mosby’s Medical Dictionary, 9th edition. © 2009, Elsevier.

emotion

Psychology A mood, affect or feeling of any kind–eg, anger, excitement, fear, grief, joy, hatred, love. See Negative emotion, Positive emotion, Toxic emotion. (Yeah, a list of emotion words is not a definition; neither is a social “judgement” about “good and evil”) 

Concise Dictionary of Modern Medicine. © 2002 by The McGraw-Hill Companies, Inc.

emotion

Any state of arousal in response to external events or memories of such events that affect, or threaten to affect, personal advantage. Emotion is never purely mental (emotion is physical, actually) but is always associated with bodily changes such as the secretion of ADRENALINE and cortisol and their effects. The limbic system and the hypothalamus of the brain are the mediators of emotional expression and feeling. The external expression of emotional content is known as ‘affect’. Repressed emotions are associated with psychosomatic disease. The most important, in this context, are anger, a sense of dependency, and fear. (Oh dear, the unscientific social narratives never end – emotions  are the “bringers” of pestilence and punishment.)

Collins Dictionary of Medicine © Robert M. Youngson 2004, 2005

emotion

a short-term positive or negative affective state. Typically differentiated from mood in that an emotion is of shorter duration and evoked in response to a specific event, such as anger. (So odd! Anger is the ’emotion’ – the reaction; there seems to be a universal neurotypical inability to discern cause and effect!)

Dictionary of Sport and Exercise Science and Medicine by Churchill Livingstone © 2008 Elsevier Limited. All rights reserved.

emotion

a complex feeling or state (affect) accompanied by characteristic motor and glandular activities; feelings; mood.

Mosby’s Dental Dictionary, 2nd edition. © 2008 Elsevier, Inc. All rights reserved.

emotion

aroused state involving intense feeling, autonomic activation and related behavior. Animals have emotions insofar as they are motivated to behave by what they perceive and much of the reaction is learned rather than intuitive (instinctive). (Hmm … the categorical division animal / human is maintained, but with animal emotion being “lower in status” – a mere reaction – which is true in humans also.) The reactions are based on rewarding and aversive properties of stimuli from the external environment. The center for the control of emotional behavior is the limbic system of the brain.

Saunders Comprehensive Veterinary Dictionary, 3 ed. © 2007 Elsevier, Inc. All rights reserved.

Is there any question as to why social humans cannot communicate with each other? Without a foundation in physical fact and common meaning, language is gibberish – an extension of confused personal opinion, narcissism and nonsense; a toy, a sham, a hindrance to understanding.

Critique of DSM 5 / No Medical Basis for Diagnoses

Pacific Standard Magazine

The Problem With Psychiatry, the ‘DSM,’ and the Way We Study Mental Illness

by Ethan Watters

Imagine for a moment that the American Psychiatric Association was about to compile a new edition of its Diagnostic and Statistical Manual of Mental Disorders. But instead of 2013, imagine, just for fun, that the year is 1880.

Transported to the world of the late 19th century, the psychiatric body would have virtually no choice but to include hysteria in the pages of its new volume. Women by the tens of thousands, after all, displayed the distinctive signs: convulsive fits, facial tics, spinal irritation, sensitivity to touch, and leg paralysis. Not a doctor in the Western world at the time would have failed to recognize the presentation. “The illness of our age is hysteria,” a French journalist wrote. “Everywhere one rubs elbows with it.”

Hysteria would have had to be included in our hypothetical 1880 DSM for the exact same reasons that attention deficit hyperactivity disorder is included in the just-released DSM-5. The disorder clearly existed in a population and could be reliably distinguished, by experts and clinicians, from other constellations of symptoms.

There were no reliable medical tests to distinguish hysteria from other illnesses then; the same is true of the disorders listed in the DSM-5 today.

“Practically speaking, the criteria by which something is declared a mental illness are virtually the same now as they were over a hundred years ago.”

The DSM determines which mental disorders are worthy of insurance reimbursement, legal standing, and serious discussion in American life.

That its diagnoses are not more scientific is, according to several prominent critics, a scandal.

In a major blow to the APA’s dominance over mental-health diagnoses, Thomas R. Insel, director of the National Institute of Mental Health, recently declared that his organization would no longer rely on the DSM as a guide to funding research. “The weakness is its lack of validity,” he wrote. “Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever.” As an alternative, Insel called for the creation of a new, rival classification system based on genetics, brain imaging, and cognitive science.

This idea — that we might be able to strip away all subjectivity from the diagnosis of mental illness and render psychiatry truly scientific — is intuitively appealing. But there are a couple of problems with it. The first is that the science simply isn’t there yet. A functional neuroscientific understanding of mental suffering is years, perhaps generations, away from our grasp. What are clinicians and patients to do until then? But the second, more telling problem with Insel’s approach lies in its assumption that it is even possible to strip culture from the study of mental illness. Indeed, from where I sit, the trouble with the DSM — both this one and previous editions — is not so much that it is insufficiently grounded in biology, but that it ignores the inescapable relationship between social cues and the shifting manifestations of mental illness.

PSYCHIATRY tends not to learn from its past. With each new generation, psychiatric healers dismiss the enthusiasms of their predecessors by pointing out the unscientific biases and cultural trends on which their theories were based. Looking back at hysteria, we can see now that 19th-century doctors were operating amidst fanciful beliefs about female anatomy, an assumption of feminine weakness, and the Victorian-era weirdness surrounding female sexuality. And good riddance to bad old ideas. But the more important point to take away is this: There is little doubt that the symptoms expressed by those thousands of women were real.

The resounding lesson of the history of mental illness is that psychiatric theories and diagnostic categories shape the symptoms of patients. “As doctors’ own ideas about what constitutes ‘real’ dis-ease change from time to time,” writes the medical historian Edward Shorter, “the symptoms that patients present will change as well.”

This is not to say that psychiatry wantonly creates sick people where there are none, as many critics fear the new DSM-5 will do. Allen Frances — a psychiatrist who, as it happens, was in charge of compiling the previous DSM, the DSM-IV — predicts in his new book, Saving Normal, that the DSM-5 will “mislabel normal people, promote diagnostic inflation, and encourage inappropriate medication use.” Big Pharma, he says, is intent on ironing out all psychological diversity to create a “human monoculture,” and the DSM-5 will facilitate that mission. In Frances’ dystopian post-DSM-5 future, there will be a psychoactive pill for every occasion, a diagnosis for every inconvenient feeling: “Disruptive mood dysregulation disorder” will turn temper tantrums into a mental illness and encourage a broadened use of antipsychotic drugs; new language describing attention deficit disorder that expands the diagnostic focus to adults will prompt a dramatic rise in the prescription of stimulants like Adderall and Ritalin; the removal of the bereavement exclusion from the diagnosis of major depressive disorder will stigmatize the human process of grieving. The list goes on.

In 2005, a large study suggested that 46 percent of Americans will receive a mental-health diagnosis at some point in their lifetimes. Critics like Frances suggest that, with the new categories and loosened criteria in the DSM-5, the percentage of Americans thinking of themselves as mentally ill will rise far above that mark.

But recent history doesn’t support these fears. In 1994 the DSM-IV — the edition Frances oversaw — launched several new diagnostic categories that became hugely popular among clinicians and the public (bipolar II, attention deficit hyperactivity disorder, and social phobia, to name a few), but the number of people receiving a mental-health diagnosis did not go up between 1994 and 2005. In fact, as psychologist Gary Greenberg, author of The Book of Woe, recently pointed out to me, the prevalence of mental health diagnoses actually went down slightly. This suggests that the declarations of the APA don’t have the power to create legions of mentally ill people by fiat, but rather that the number of people who struggle with their own minds stays somewhat constant.

What changes, it seems, is that they get categorized differently depending on the cultural landscape of the moment. Those walking worried who would have accepted the ubiquitous label of “anxiety” in the 1970s would accept the label of depression that rose to prominence in the late 1980s and the 1990s, and many in the same group might today think of themselves as having social anxiety disorder or ADHD.

Viewed over history, mental health symptoms begin to look less like immutable biological facts and more like a kind of language. Someone in need of communicating his or her inchoate psychological pain has a limited vocabulary of symptoms to choose from. From a distance, we can see how the flawed certainties of Victorian-era healers created a sense of inevitability around the symptoms of hysteria. There is no reason to believe that the same isn’t happening today. Healers have theories about how the mind functions and then discover the symptoms that conform to those theories. Because patients usually seek help when they are in need of guidance about the workings of their minds, they are uniquely susceptible to being influenced by the psychiatric certainties of the moment. There is really no getting around this dynamic. Even Insel’s supposedly objective laboratory scientists would, no doubt, inadvertently define which symptoms our troubled minds gravitate toward. The human unconscious is adept at speaking the language of distress that will be understood.

WHY DO PSYCHIATRIC DIAGNOSES fade away only to be replaced by something new? The demise of hysteria may hold a clue. In the early part of the 20th century, the distinctive presentation of the disorder began to blur and then disappear. The symptoms began to lose their punch. In France this was called la petite hysterie. One doctor described patients who would “content themselves with a few gesticulatory movements, with a few spasms.” Hysteria had begun to suffer from a kind of diagnostic overload. By the 1930s or so, the dramatic and unmistakable symptoms of hysteria were vanishing from the cultural landscape because they were no longer recognized as a clear communication of psychological suffering by a new generation of women and their healers.

It is true that the DSM has a great deal of influence in modern America, but it may be more of a scapegoat than a villain. It is certainly not the only force at play in determining which symptoms become culturally salient. As Frances suggests, the marketing efforts of Big Pharma on TV and elsewhere have a huge influence over which diagnoses become fashionable. Some commentators have noted that shifts in diagnostic trends seem uncannily timed to coincide with the term lengths of the patents that pharmaceutical companies hold on drugs. Is it a coincidence that the diagnosis of anxiety diminished as the patents on tranquilizers ran out? Or that the diagnosis of depression rose as drug companies landed new exclusive rights to sell various antidepressants? Consider for a moment that the diagnosis of depression didn’t become popular in Japan until GlaxoSmithKline got approval to market Paxil in the country.

Journalists play a role as well: We love to broadcast new mental-health epidemics. The dramatic rise of bulimia in the United Kingdom neatly coincided with the media frenzy surrounding the rumors and subsequent revelation that Princess Di suffered from the condition. Similarly, an American form of anorexia hit Hong Kong in the mid-1990s just after a wave of local media coverage brought attention to the disorder.

The trick is not to scrub culture from the study of mental illness but to understand how the unconscious takes cues from its social settings. This knowledge won’t make mental illnesses vanish (Americans, for some reason, find it particularly difficult to grasp that mental illnesses are absolutely real and culturally shaped at the same time). But it might discourage healers from leaping from one trendy diagnosis to the next. As things stand, we have little defense against such enthusiasms. “We are always just one blockbuster movie and some weekend therapist’s workshops away from a new fad,” Frances writes. “Look for another epidemic beginning in a decade or two as a new generation of therapists forgets the lessons of the past.” Given all the players stirring these cultural currents, I’d make a sizable bet that we won’t have to wait nearly that long.

Visual Thinking / Speyer Cathedral – Space Shuttle

A visual thinker files away information in the form of images that may be “triggered” by encounters, many years later, that recall a stored image. Often these mean nothing – simple coincidences, mere curiosities – and will be returned to visual memory, but “updated” by the comparison.

In this case, a chance “appearance” of a photo of Speyer Cathedral, found while searching for something else on the Internet, immediately produced in my mind an image of the Space Shuttle. The striking similarity of forms passed from a coincidence to a curiosity – and then to an idea expressed by Oswald Spengler in Decline of the West: that Western Culture is driven by the desire to overcome the visible; to expand into time and space; to replace organic nature with machines.

A thousand years in time separate these two iconic products of Western Civilization: Is the space shuttle not the fulfillment of the cathedral? Note that the (abstract) concept of Western desire for domination and “spatial conquest” is represented in my visual brain by SPECIFIC concrete objects, which only then, can be “connected” to word concepts.


Speyer is dominated by its Romanesque cathedral (dedicated 1061). Speyer is one of Germany’s oldest cities and the resting place of eight medieval emperors and kings of the Salian, Staufer and Habsburg dynasties. History: Speyer was the seat of the Imperial Chamber Court between 1527 and 1689, and also held 50 sessions of the Imperial Diet. The First Diet of Speyer (1526) decreed toleration of Lutheran teaching, soon revoked by the Second Diet of Speyer (1529). The latter diet led to the Protestation at Speyer the same year, during which 6 princes and 14 Imperial Free Cities protested against the anti-Reformation resolutions. It is from this event that the term ‘Protestantism’ was coined.

The History of the Space Shuttle, by Alan Taylor, Jul 1, 2011 (Fabulous photos): From its first launch 30 years ago (1981) to its final mission scheduled for next Friday, NASA’s Space Shuttle program has seen moments of dizzying inspiration and of crushing disappointment. When next week’s launch is complete, the program will have sent up 135 missions, ferrying more than 350 humans and thousands of tons of material and equipment into low Earth orbit. The missions have been risky, the engineering complex, the hazards extreme. Indeed, over the years 14 shuttle astronauts lost their lives.

 

Visual-Spatial Thinking / Watch Crows “Do it”

The crows do not think verbally. They are “visual thinkers” as are Asperger types.  Humans once primarily “thought like crows” but in modern social humans, word-language dominates communication, and visual thinking ability has atrophied. In fact, the power of visual-spatial processing has been forgotten.