Individualism is an atheist lie / from a “Progressive Christian”

http://www.patheos.com/blogs/mercynotsacrifice/2011/10/19/individualism-is-an-atheist-lie/

October 19, 2011 by Morgan Guyton

We meditated on this quotation from Jesus yesterday at our Virginia Methodist provisional clergy mentor covenant group retreat. On the side, I have been reading Eastern Orthodox theologian John Zizioulas’ Being and Communion, which has caused me to see the implications of Jesus’ statement in a completely new light. Zizioulas writes that God is the only authentic person in the universe because God is the source of His own being. As creatures, we are completely contingent upon God for our being.

If we really believe that God is the source of every instant of our consciousness, then Jesus’ statement is a lot more all-encompassing than we might have previously thought. He is not simply talking about the relationship that followers have to their leader or students have to their teacher. He is not just talking about any kind of lifestyle or community we choose to enter into. He is talking about the relationship He has as Creator to all of His creatures who are branches on His vine whether we accept this reality or not. Nothing in the universe exists independent from Christ, who is not solely the man Jesus who walked the Earth 2000 years ago but also the very Word of God, the creative agency which articulates and implements the Father’s will as John 1:3 describes: “Through him all things were made; without him nothing was made that has been made.”

On the vine of our creator Christ, those whose hearts are opened to communion and intimacy with their Creator “bear fruit.” Those who pretend to “be like gods” themselves (Gen 3:5) and cling to the delusion of their own self-sufficiency are “like a branch that is thrown away and withers… [before it is] picked up, thrown into the fire and burned” (John 15:6). Individualism names the atheist delusion that we are the source of our own being: the naivete of a branch that thinks it does not need God’s vine to feed it and keep it alive. You can be an individualist and talk about God all day, but God is not truly God to you if you think you’re a self-made person. Unfortunately, individualism is the default perspective with which people in our age view life, including many who never stop blabbering about Jesus.

Cogito ergo sum. I think, therefore I am. Written by René Descartes in 1637, this is perhaps the most definitive declaration of independence from God in the course of Western history. (How about Nietzsche / “God is dead”?) It is the origin of secular thinking, because it sets as a foundational premise that our minds in effect “create” our existence, i.e., that we are the source of our own identity (rather than God). Descartes’ premise is a choice to view the world with the assumption that the boundaries of reality are determined by our perception of it. “I think; therefore I am” applied to the world outside my brain becomes “I see it; therefore it is,” which is the foundational premise of modern science.

Truth becomes that which has been observed and measured by multiple persons coming to the same conclusions instead of what our ancestors tell us that God told their ancestors to pass down to us. Rather than being a tribe in which our identity is given to us by our family, humanity is redefined by the Western secular tradition of Descartes and Enlightenment thinkers as a race of individuals who are the source of their own identity and subsequently form families and societies through social contracts with other individuals.

To view the world in this “I-centered” way which is ubiquitous to Western culture means living as if God doesn’t exist, at least not the God who Christians for centuries considered to be the One in whom “we live and move and have our being” (Acts 17:28). Rather than being understood as the source of our being, God becomes just another infinitely bigger and more powerful being who’s a constant threat to our freedom. God is the one who started the world, who intervenes occasionally in certain spectacular supernatural moments, and who will ultimately end the world, instead of being the One from whom creation is constantly emanating. God is seen as Someone outside of everything to whom we call to intervene rather than Someone inside of everything to whom we seek a purer connection. (That persistent NT insistence of inside / outside human isolation from Nature!) Paul’s declaration that “in him all things hold together” (Col 1:17) sounds like pious poetry to us, but we don’t take this at all seriously as an ontological claim, because what we really believe in modernity is that “in science nature holds together” and, most problematically, “in our theological system God holds together.”

I understand that there are many positives to the legacy of Descartes and the Enlightenment. I just think it’s completely wrong to say Cogito ergo sum when we should be saying Cogitat Deus ergo sum (God thinks; therefore I am). Cogito ergo sum isn’t just Descartes’ delusion; it’s the delusion of all in our society who are taught to see themselves as self-made individuals. People don’t make themselves. Individualism is an atheist lie. Christ is our Creator. In Him all things hold together. All things are created through Him and for Him. He is the vine and we are the branches.

____________________________________________________________________________________________

Okay, this may seem an odd piece to post, but it does contribute to the topic of recent posts on the concept of SELF. It demonstrates the ongoing conflict between so-called ‘secular thinking’ and ‘religious thinking’, and also the failure to recognize that philosophical points of view, and definitions of specific terms, pass into popular culture as strange and distorted “thingies”. We can also detect the influence of psychology and the social sciences, which, together with traditional Biblical sources, create a fine mish-mash of assertions. Science, the method, is completely misunderstood.

The “point” of the piece seems to be the instructive metaphor, “He is the vine, and we are the branches”. This seems a sufficient illustration of belief. Why all the unnecessary flailing around over misrepresentations of historical contributions to “Western Thought”? This, to me, weakens the “message.” “Stand by your man…”

_________________________________________________________________

INDIVIDUALISM
1. The habit or principle of being independent and self-reliant. ‘a culture that celebrates individualism and wealth’
1.1 Self-centered feeling or conduct; egoism.
2. A social theory favouring freedom of action for individuals over collective or state control. ‘encouragement has been given to individualism, free enterprise, and the pursuit of profit’

Hmmm… If Individualism is an atheist lie, then the United States was founded by atheists, no “true” Christian can participate in the U.S. capitalist economy, and in fact a “true” Christian believes in Communism / Socialism, not Democracy, as a form of governance.

The fundamental “bottom line” of science. 

No “true” Christian should purchase or use any product of “computer science” (including the Internet) unless Jesus Christ can be proven to have invented it.  

 

 


What is self? / an anthropological concept

A. I. Hallowell on ‘Orientations for the Self’

The following summary of Hallowell’s analysis, as set out in his paper “The self and its behavioral environment” (most easily accessible as Chapter 4 of his book Culture and Experience, University of Pennsylvania Press, 1955; 2nd edition, 1971), has been taken from A. Lock (1981), “Universals in human conception,” in P. L. F. Heelas and A. J. Lock (eds.), Indigenous Psychologies: The Anthropology of the Self. London: Academic Press, pp. 19–36, with minor revisions.

__________________________________

Alfred Irving “Pete” Hallowell (1892–1974) was an American anthropologist, archaeologist and businessman. He was born in Philadelphia, Pennsylvania, and attended the Wharton School of the University of Pennsylvania, receiving his B.S. degree in 1914, his A.M. in 1920, and his Ph.D. in anthropology in 1924. He was a student of the anthropologist Frank Speck. From 1927 through 1963 he was a professor of anthropology at the University of Pennsylvania, excepting 1944 through 1947, when he taught the subject at Northwestern University. Hallowell’s main field of study was Native Americans.

_________________________________

NOTE: I’m “looking into” concepts of “self” and “self-awareness” after confronting, over and over again, the claim that “some high number” of Asperger types lack “self-esteem” – another of those sweeping generalities that likely is a ‘social judgement’ from the “outsider” – parent, teacher, psychologist, counselor, therapist, hairdresser, clerk at convenience store, neighbor or any bystander caring to comment on child behavior. This “lack of self-esteem” has become a “fad, cliché, causal certainty” for almost any perceived “human behavioral problem” in American psychology, education, child-rearing, pop-science, media and common gossip. 

My observation of this presentation of “self” (in a socio-cultural context) is that it’s BAD NEWS for Asperger types, or any individual whose inclination is to “develop” his or her own particular expression of self. Here is the problem: self, self awareness, self-control, self-determination and the myriad applications of the concept of “self” are commonly held to be “real things”; they are not. As pointed out in the selection below, in “normdom” the self is “fictitious” – a creation of culture; culture is a creation of selves. 

If an individual is for some reason, “out of sync” with the concept of self that is a co-creation of “homogeneous individuals” who subscribe to the same “cultural code” of belief, behavior, and perception of “reality” – well, it’s obvious that one is “in trouble” from the start: How does one “grow, create, construct” a familiar, comfortable, interesting, exploratory concept of self in a hostile socio-cultural environment? Even more difficult is the “biological, evolutionary” possibility, that one’s brain organization, and indeed, one’s experience of the environment, and perceptual fundamentals, are truly “alien” to those of the majority.  

As for “self-esteem” – is this not a concept of social conformity? 

In contemporary culture, the selfie = the self. Posting selfies on social media advertises one’s conformity to a culturally “approved” definition of “self” – which for girls and women, is an “image only” competition for social status. The desperation of “adult” women to conform to “imaginary standards” results in some very regrettable behavior. 

If one’s internalized “picture” of self matches that of what is expected and demanded by the dominant culture, then one is judged to “have self-esteem”. Any person who doesn’t measure up to the cultural “image” (imaginary standard) lacks self-esteem. The most obvious example today, is the crisis of “self-hatred” in young women due to highly distorted “ideals” of body type, promoted by misogynistic American cultural standards. External form is declared to be the self.    

___________________________________________________________________________________________

Excerpt. Full article: http://www.massey.ac.nz/~alock/virtual/hallowel.htm

This info is from the anthropological POV. 

Three things may be said about self-awareness:

(i) Self-awareness is a socio-cultural product. To be self-aware is, by definition, to be able to conceive of one’s individual existence in an objective, as opposed to subjective, manner. In G. H. Mead’s (1934) terms, one must view oneself from ‘the perspective of the other‘. Such a level of psychological functioning is only made possible by the attainment of a symbolic mode of representing the world. Again, this mode of mental life is generally agreed to be dependent upon the existence of a cultural level of social organization. We thus come to a fundamental, though apparently tautologous point: that the existence of culture is predicated upon that of self-awareness; and that the existence of self-awareness is predicated upon that of culture. In the same way as in the course of evolution the structure of the brain is seen as being in a positive-feedback relationship with the nature of the individual’s environment, so it is with culture and self-awareness: the self is constituted by culture which itself constitutes the self.

(ii) Culture defines and constitutes the boundaries of the self: the subjective-objective distinction. It is an evident consequence of being self-aware that if one has some conception of one’s own nature, then one must also have some conception of the nature of things other than oneself, i.e. of the world. Further, this distinction must be encapsulated explicitly in the symbols one uses to mark this polarity. Consequently, a symbolic representation of this divide will have become ‘an intrinsic part of the cultural heritage of all human societies‘ (Hallowell, 1971: 75). Thus, the very existence of a moral order, self-awareness, and therefore human being, depends on the making of some distinction between ‘objective’ (things which are not an intrinsic part of the self) and ‘subjective’ (things which are an intrinsic part of the self).

This categorical distinction, and the polarity it implies, becomes one of the fundamental axes along which the psychological field of the human individual is structured for action in every culture. … Since the self is also partly a cultural product, the field of behaviour that is appropriate for the activities of particular selves in their world of culturally defined objects is not by any means precisely coordinate with any absolute polarity of subjectivity-objectivity that is definable. (Hallowell, 1971: 84)

Similarly, Cassirer (1953: 262) in the context of kinship terminology, writes:

language does not look upon objective reality as a single homogeneous mass, simply juxtaposed to the world of the I, but sees different strata of this reality: the relationship between object and subject is not universal and abstract; on the contrary, we can distinguish different degrees of objectivity, varying according to relative distance from the I.

In other words, there are many facets of reality which are not distinctly classifiable in terms of a polarity between self and non-self, subjective or objective: for example, what exactly is the status of this page – is it an objective entity or part of its author’s selves; an objective entity that would exist as a page, rather than marks on a screen, without a self to read it? Again, am I responsible for all the passions I experience, or am I as much a spectator of some of them as my audience is? While a polarity necessarily exists between the two – subjective and objective/self and non-self – the line between the two is not precise, and may be constituted at different places in different contexts by different cultures. The boundaries of the self and the concomitant boundaries of the world, while drawn of necessity, are both constituted by cultural symbolism, and may be constituted upon differing assumptions.

(iii) The behavioural environment of individual selves is constituted by, and encompasses, different objects. Humans, in contrast to other animals, (that need for human exception again) can be afraid of, for example, the dark because they are able to populate it with symbolically constituted objects: ghosts, bogey men, and various other spiritual beings. (Supernatural, magical entities grew out of “real” danger in the environment: just as did “other” animals, we evolved in natural environments, in which “being afraid of the dark” is a really good reaction to the “the dark” because it’s populated by highly dangerous predators – it’s still a good “attitude” to have when in a human city today.)

As MacLeod (1947) points out,

purely fictitious objects, events and relationships can be just as truly determinants of our behaviour as are those which are anchored in physical reality.

Yes, this is a serious problem in humans: the inability to distinguish natural from supernatural cause-and-effect explanations leaves us vulnerable to bad decision-making and poor problem-solving.

In Hallowell’s view (1971: 87):

such objects, (supernatural) in some way experienced, conceptualised and reified, may occupy a high rank in the behavioural environment although from a sophisticated Western point of view they are sharply distinguishable from the natural objects of the physical environment.* However, the nature of such objects is no more fictitious, in a psychological sense, than the concept of the self.

*This sweeping claim to “sophistication” is typical over-generalization and arrogance on the part of Western academics, who mistake their (supposedly) superior beliefs as common to all humans, at least in their cultural “fiefdoms”. The overwhelming American POV is highly religious, superstitious, magical and unsophisticated; the supernatural domain (although imaginary) is considered to be the source of power that creates “everything”. 

This self-deception is common: religion exempts itself from scrutiny as to its claims for “absolute truth” above and beyond any rational or scientific description of reality. It’s a case of, “You don’t question my crazy beliefs, and I won’t question yours.” 

 

Every Asperger Needs to Read this Paper! / Symptoms of entrapment and captivity

Research that supports my challenge to contemporary (American) psychology that Asperger symptoms are the result of “captivity” and not “defective brains” 

From: Depression Research and Treatment

Depress Res Treat. 2010; 2010: 501782. Published online 2010 Nov 4. doi: 10.1155/2010/501782. PMCID: PMC2989705

Full Article: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2989705/

Testing a German Adaption of the Entrapment Scale and Assessing the Relation to Depression

Manuel Trachsel, Tobias Krieger, Paul Gilbert, and Martin Grosse Holtforth

Abstract

The construct of entrapment is used in evolutionary theory to explain the etiology of depression. The perception of entrapment can emerge when defeated individuals want to escape but are incapable. Studies have shown relationships of entrapment to depression, and suicidal tendencies. The aim of this study was a psychometric evaluation and validation of the Entrapment Scale in German (ES-D). 540 normal subjects completed the ES-D along with other measures of depressive symptoms, hopelessness, and distress. Good reliability and validity of the ES-D was demonstrated. Further, whereas entrapment originally has been regarded as a two-dimensional construct, our analyses supported a single-factor model. Entrapment explained variance in depressive symptoms beyond that explained by stress and hopelessness supporting the relevance of the construct for depression research. These findings are discussed with regard to their theoretical implications as well as to the future use of the entrapment scale in clinical research and practice.

Being outnumbered by social humans, 99% to 1%, is de facto defeat and captivity

1. Introduction

Assuming a certain degree of adaptivity of behavior and emotion, evolutionary theorists have suggested various functions of moodiness and depression. Whereas adaptive mechanisms may become functionally maladaptive [1, 2], there have been many attempts to explain potentially adaptive functions of depression. For example, Price [3] suggested that depression evolved from the strategic importance of having a de-escalating or losing strategy. Social rank theory [4, 5] built on this and suggests that some aspects of depression, such as mood and drive variations, may have evolved as mechanisms for regulating behavior in contexts of conflicts and competition for resources and mates. Hence, subordinates are sensitive to down rank threats and are less confident than dominants, while those who are defeated will seek to avoid those who defeated them. Depression may also serve the function to help individuals disengage from unattainable goals and deal with losses [6]. 

Social rank theory (e.g., [4]) links defeat states to depression. Drawing on Dixon’s arrested defences model of mood variation [7, 8], this theory suggests that especially when stresses associated with social defeats and social threats arise, individuals are automatically orientated to fight, flight or both. Usually, either of those defensive behaviors will work. So, flight and escape remove the individual from the conditions in which stress is arising (e.g., threats from a dominant), or anger/aggression curtails the threat. These defensive behaviors typically work for nonhuman animals. However, for humans, such basic fight and flight strategies may be less effective facing the relatively novel problems of living in modern societies, perhaps explaining the prevalence of disorders such as depression [8]. Dixon suggested that in depression, defensive behaviors can be highly aroused but also blocked and arrested and in this situation depression ensues. Dixon et al. [8] called this arrested flight. For example, in lizards, being defeated but able to escape has proven to be less problematic than being defeated and being trapped. Those who are in caged conditions, where escape is impossible, are at risk of depression and even death [9]. Gilbert [4, 10] and Gilbert and Allan [5] noted that depressed individuals commonly verbalize strong escape wishes and that feelings of entrapment and desires to escape have also been strongly linked to suicide, according to O’Connor [11]. In addition they may also have strong feelings of anger or resentment that they find difficult to express or become frightening to them. (Or are NOT ALLOWED to express, without being punished) 

Gilbert [4] and Gilbert and Allan [5] proposed that a variety of situations (not just interpersonal conflicts) that produce feeling of defeat, or uncontrollable stress, which stimulate strong escape desires but also makes it impossible for an individual to escape, lead the individual to a perception of entrapment. They defined entrapment as a desire to escape from the current situation in combination with the perception that all possibilities to overcome a given situation are blocked. Thus, theoretically entrapment follows defeat if the individual is not able to escape. This inability may be due to a dominant subject who does not offer propitiatory gestures following antagonistic competition, or if the individual keeps being attacked. (Relentless social bullying) 

In contrast to individuals who feel helpless (cf. the concept of learned helplessness [12]), which focus on perceptions of control, the entrapped model focuses on the outputs of the threat system emanating from areas such as the amygdala [13]. In addition, depressed people are still highly motivated and would like to change their situation or mood state. It was also argued that, unlike helplessness, entrapment takes into account the social forces that lead to depressive symptoms, which is important for group-living species with dominance hierarchies such as human beings [14]. Empirical findings by Holden and Fekken [15] support this assumption. Gilbert [4] argued that the construct of entrapment may explain the etiology of depression better than learned helplessness, because according to the theory of learned helplessness, helpless individuals have already lost their flight motivation whereas entrapped individuals have not.

According to Gilbert [4], the perception of entrapment can be triggered, increased, and maintained by external factors but also internal processes such as intrusive, unwanted thoughts and ruminations can play an important role (e.g., [16, 17]). For example, ruminating on the sense of defeat or inferiority may act as an internal signal of down-rank attack that makes an individual feel increasingly inferior and defeated. Such rumination may occur despite the fact that an individual successfully escaped from an entrapping external situation because of feelings of failure, which may cause a feeling of internal entrapment. For example, Sturman and Mongrain [18] found that internal entrapment increased following an athletic defeat. Moreover, thoughts and feelings like “internal dominants” in self-critics may exist that can also activate defensive behaviors.

For the empirical assessment of entrapment, Gilbert and Allan [5] developed the self-report Entrapment Scale (ES) and demonstrated its reliability. Using the ES, several studies have shown that the perception of entrapment is strongly related to low mood, anhedonia, and depression [5, 19–21]. Sturman and Mongrain [22] found that entrapment was a significant predictor of recurrence of major depression. Further, Allan and Gilbert [23] found that entrapment relates to increased feelings of anger and to a lower expression of these feelings. In a study by Martin et al. [24], the perception of entrapment was associated with feelings of shame, but not with feelings of guilt. Investigating the temporal connection between depression and entrapment, Goldstein and Willner [25, 26] concluded that the relation between depression and entrapment is equivocal and might be bilateral; that is, entrapment may lead to depression and vice versa.

Entrapment was further used as a construct explaining suicidal tendency. In their cry-of-pain model, Williams and Pollock [27, 28] argued that suicidal behavior should be seen as a cry of pain rather than as a cry for help. Consistent with the concept of arrested flight, they proposed that suicidal behavior is reactive. In their model, the response (the cry) to a situation is supposed to have the following three components: defeat, no escape potential, and no rescue. O’Connor [11] provided empirical support in a case control study by comparing suicidal patients and matched hospital controls on measures of affect, stress, and posttraumatic stress. The authors hypothesized that the copresence of all three cry-of-pain variables primes an individual for suicidal behavior. The suicidal patients, with respect to a recent stressful event, reported significantly higher levels of defeat, lower levels of escape potential, and lower levels of rescue than the controls. Furthermore, Rasmussen et al. [21] showed that entrapment strongly mediated the relationship between defeat and suicidal ideation in a sample of first-time and repeated self-harming patients. Nevertheless, there has also been some criticism of the concept of entrapment as it is derived from animal literature [29].

To our knowledge so far, there is no data on the retest reliability or the temporal stability of the Entrapment Scale. Because entrapment is seen as a state-like rather than a trait-like construct, its stability is likely dependent on the stability of its causes. (Remove the social terrorism, or remove yourself) Therefore, if the causes of entrapment are stable (e.g., a long-lasting abusive relationship), then also entrapment will remain stable over time. In contrast, for the Beck Hopelessness Scale (BHS), there are studies assessing temporal stability that have yielded stable trait-like components of hopelessness [30]. Young and coworkers [30] stated that the high stability of hopelessness is a crucial predictor of depressive relapses and suicidal attempts. For the Perceived Stress Questionnaire (PSQ), there are studies examining retest reliability. The PSQ has shown high retest reliability over 13 days (r = .80) in a Spanish sample [31]. It is to be expected that with longer retest intervals as in the present study (3 months), the stability of perceived stress will be substantially lower. We, therefore, expect the stability of entrapment to be higher than that of perceived stress as a state-like construct, but lower than that of hopelessness, which has been shown to be more trait-like [32].

Previous research is equivocal regarding the dimensionality of the entrapment construct. Internal and external entrapment were originally conceived as two separate constructs (cf. [5]) and were widely assessed using two subscales measuring entrapment caused by situations and other people (e.g., “I feel trapped by other people”) or by one’s own limitations (e.g., “I want to get away from myself”). The scores of the two subscales were averaged to result in a total entrapment score in many studies. However as Taylor et al. [33] have shown, entrapment may be best conceptualized as a unidimensional construct. This reasoning is supported by the observation that some of the items of the ES cannot easily be classified either as internal or external entrapment and because the corresponding subscales lack face validity (e.g., “I am in a situation I feel trapped in” or “I can see no way out of my current situation”).
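The two scoring schemes described above amount to simple arithmetic over item responses. As a hypothetical sketch (the item values and the internal/external split below are invented for illustration, not the actual ES items or data):

```python
# Hypothetical sketch of the two ES scoring schemes discussed above.
# Item responses and the internal/external grouping are invented examples.
internal_items = [3, 2, 4, 1, 0, 2]   # e.g. items like "I want to get away from myself"
external_items = [4, 3, 2, 2, 1, 3]   # e.g. items like "I feel trapped by other people"

# Two-subscale scoring: one mean per subscale.
internal = sum(internal_items) / len(internal_items)
external = sum(external_items) / len(external_items)

# Many studies then averaged the two subscale scores into a single total,
# which is consistent with treating entrapment as unidimensional.
total = (internal + external) / 2
```

Under the unidimensional model supported by Taylor et al., only the total would carry meaning; the subscale split is retained here just to show what earlier studies computed.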

5. Discussion

The entrapment construct embeds depressiveness theoretically into an evolutionary context. The situation of arrested flight or blocked escape, in which a defeated individual is incapable of escaping despite a maintained motivation to escape, may lead to the perception of entrapment in affected individuals [8]. In this study, the Entrapment Scale (ES) was translated to German (ES-D), tested psychometrically, and validated by associations with other measures. This study provides evidence that the ES-D is a reliable self-report measure of entrapment demonstrating high internal consistency. The study also shows that the ES-D is a valid measure that relates to other similar constructs like hopelessness, depressive symptoms or perceived stress. Levels of entrapment as measured with the ES-D were associated with depressiveness, perceived stress, and hopelessness, showing moderate to high correlations. Results were consistent with those obtained by Gilbert and Allan [5]. Entrapment explained additional variance in depressiveness beyond that explained by stress and hopelessness. Taken together, the present data support the conception of entrapment as a relevant and distinct construct in the explanation of depression. (And much of Asperger behavior)

The results of our study confirm the findings of Taylor et al. [33], thereby showing that entrapment is only theoretically, but not empirically, separable into internal and external sources of entrapment. The authors even went further by showing that entrapment and defeat could represent a single construct. Although in this study the defeat scale [5] was not included, the results are in line with the assumption of Taylor et al. [33] and support other studies using entrapment a priori as a single construct. However, although this study supports the general idea that escape motivation affects both internal and external events and depression, clinically it can be very important to distinguish between them. For example, in studies of psychosis entrapment can be very focused on internal stimuli, particularly voices [47].

The state conceptualization of entrapment implies that the perception of entrapment may change over time. Therefore, we did not expect retest correlations as high as retest correlations for more trait-like constructs like hopelessness [32]. Since the correlation over time is generally a function of both the reliability of the measure and the stability of the construct, high reliability is a necessary condition for high stability [48]. In this study, we showed that the ES-D is a reliable scale, and we considered retest correlations as an indicator for stability. The intraclass correlation of .67 suggests that entrapment is more sensitive to change than hopelessness (r = .82). Furthermore, the state of entrapment seems to be more stable than perceived stress, which may be influenced to a greater extent by external factors. Given the confirmed reliability and validity of the ES-D in this study, we therefore cautiously conclude that entrapment lies between hopelessness and perceived stress regarding stability.

Whereas the high correlation between entrapment and depressive symptoms in this study may be interpreted as evidence of conceptual equivalence, an examination of the item wordings of the two scales clearly suggests that these questionnaires assess distinct constructs. However, the causal direction of this bivariate relation is not clear. Theoretically, both directions are plausible. Entrapment may be a cause or a consequence of depressive symptoms, or even both. Unfortunately, studies examining the temporal precedence have so far yielded equivocal results and have too many methodological shortcomings (e.g., no clinical samples, only mild and transitory depression and entrapment scores with musical mood induction) to answer this question conclusively [25, 26]. It also remains unclear whether entrapment is specific to depression. Entrapment might not only be associated with depression, but also with other psychological symptoms, or even psychopathology in general. This interpretation is supported by research showing a relation between distress arising from voices and entrapment in psychotic patients [49, 50]. Furthermore, other studies show relations between entrapment and depressive symptoms [51–53] and social anxiety and shame [54] in psychosis. The usefulness of entrapment as a construct for explaining psychopathologies in humans has been questioned [29]. With the present study, it is now possible to investigate entrapment in psychopathology in the German-speaking area.

Modern social humans and the social hierarchy: Driving Asperger types crazy for thousands of years!

 

Lies about Brain Scans / Dead Salmon Re-Post

I have posted often about the false claims of scientific reliability and experimental rigor on the part of Big Pharma and its co-conspirators, the Psychology and Psychiatry Industries. I’m not alone.

Right: The primary textbook for brain scan tech and interpretation. 

“The low statistical power and the imperative to publish incentivizes researchers to mine their data to try to find something meaningful,” says Chris Chambers, a professor of cognitive neuroscience at the University of Cardiff. “That’s a huge problem for the credibility and integrity of the field.”

What credibility?


BOLD Assumptions: Why Brain Scans Are Not Always What They Seem

Moheb Costandi, on DECODER

In 2009, researchers at the University of California, Santa Barbara performed a curious experiment. In many ways, it was routine — they placed a subject in the brain scanner, displayed some images, and monitored how the subject’s brain responded. The measured brain activity showed up on the scans as red hot spots, like many other neuroimaging studies.

Except that this time, the subject was an Atlantic salmon, and it was dead.

Dead fish do not normally exhibit any kind of brain activity, of course. The study was a tongue-in-cheek reminder of the problems with brain scanning studies. Those colorful images of the human brain found in virtually all news media may have captivated the imagination of the public, but they have also been the subject of controversy among scientists over the past decade or so. In fact, neuro-imagers are now debating how reliable brain scanning studies actually are, and are still mostly in the dark about exactly what it means when they see some part of the brain “light up.”

Glitches in reasoning

Functional magnetic resonance imaging (fMRI) measures brain activity indirectly by detecting changes in the flow of oxygen-rich blood, or the blood oxygen-level dependent (BOLD) signal, with its powerful magnets. The assumption is that areas receiving an extra supply of blood during a task have become more active. Typically, researchers would home in on one or a few “regions of interest,” using ‘voxels,’ tiny cube-shaped chunks of brain tissue containing several million neurons, as their units of measurement.

Early fMRI studies involved scanning participants’ brains while they performed some mental task, in order to identify the brain regions activated during the task. Hundreds of such studies were published in the first half of the last decade, many of them garnering attention from the mass media.

Eventually, critics pointed out a logical fallacy in how some of these studies were interpreted. For example, researchers may find that an area of the brain is activated when people perform a certain task. To explain this, they may look up previous studies on that brain area, and conclude that whatever function it is reported to have also underlies the current task.

Among many examples of such studies were those that concluded people get satisfaction from punishing rule-breaking individuals, and that for mice, pup suckling is more rewarding than cocaine. In perhaps one of the most famous examples, a researcher diagnosed himself as a psychopath by looking at his own brain scan.

These conclusions could well be true, but they could also be completely wrong, because the area observed to be active most likely has other functions, and could serve a different role than that observed in previous studies.

The brain is not composed of discrete specialized regions. Rather, it’s a complex network of interconnected nodes, which cooperate to generate behavior. Thus, critics dismissed fMRI as “neo-phrenology” – after the discredited nineteenth century pseudoscience that purported to determine a person’s character and mental abilities from the shape of their skull – and disparagingly referred to it as ‘blobology.’

When results magically appear out of thin air

In 2009, a damning critique of fMRI appeared in the journal Perspectives on Psychological Science. Initially titled “Voodoo Correlations in Social Neuroscience” and later retitled “Puzzlingly high correlations in fMRI studies of emotion, personality, and social cognition,” the article questioned the statistical methods used by neuro-imagers. The authors, Ed Vul of the University of California, San Diego, and his colleagues, examined a handful of social cognitive neuroscience studies, and pointed out that their statistical analyses gave impossibly high correlations between brain activity and behavior.

“It certainly created controversy,” says Tal Yarkoni, an assistant professor in the Department of Psychology at the University of Texas, Austin. “The people who felt themselves to be the target ignored the criticism and focused on the tone, but I think a large subset of the neuroimaging community paid it some lip service.”

Russ Poldrack of the Department of Psychology at Stanford University says that although the problem was more widespread than the paper suggested, many neuro-imagers were already aware of it. “They happened to pick on one part of the literature, but almost everybody was doing it,” he says.

The problem arises from the “circular” nature of the data analysis, Poldrack says. “We usually analyze a couple of hundred thousand voxels in a study,” he says. “When you do that many statistical tests, you look for the ones that are significant, and then choose those to analyze further, but they’ll have high correlations by virtue of the fact that you selected them in the first place.”

We see this again and again in crappy psych “research”.
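Poldrack’s point about circular selection is easy to demonstrate. Below is a toy simulation (my own sketch with made-up numbers, not any lab’s actual pipeline): every simulated “voxel” is pure noise, yet cherry-picking the best-correlated voxel after the fact still produces an impressively large “brain–behavior correlation.”

```python
import random
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

random.seed(42)
n_subjects = 20      # a typical sample size in early fMRI studies
n_voxels = 5000      # scaled down from ~200,000 real voxels, for speed

behavior = [random.gauss(0, 1) for _ in range(n_subjects)]

# Every simulated voxel is independent noise: the true correlation is zero.
voxel_rs = [
    pearson([random.gauss(0, 1) for _ in range(n_subjects)], behavior)
    for _ in range(n_voxels)
]

# The circular step: select the voxel that happened to correlate best,
# then report its correlation as the finding.
best_r = max(voxel_rs, key=abs)
print(f"best |r| among {n_voxels} pure-noise voxels: {abs(best_r):.2f}")
```

With no real signal anywhere, the selected voxel still shows |r| well above 0.5, which is exactly the kind of “puzzlingly high correlation” Vul and colleagues flagged.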

Not long after Vul’s paper was published, Craig Bennett and his colleagues published their dead salmon study to demonstrate how robust statistical analyses are key to interpreting fMRI data. When stats are not done well enough, researchers can easily get false positive results – or see an effect that isn’t actually there, such as activity in the brain of a dead fish.
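The dead-salmon result is, at bottom, the arithmetic of multiple comparisons. A minimal sketch (illustrative numbers, not Bennett’s actual analysis): test a whole brain’s worth of noise-only voxels at the conventional p < .05 threshold and thousands of them come up “significant.”

```python
import random

random.seed(0)
n_voxels = 100_000   # on the order of voxels in a whole-brain scan
z_cutoff = 1.96      # |z| threshold for p < .05, two-tailed

# Under the null hypothesis, each voxel's test statistic is just noise.
false_positives = sum(
    abs(random.gauss(0, 1)) > z_cutoff for _ in range(n_voxels)
)
print(f"{false_positives} of {n_voxels} noise voxels pass p < .05")
```

Roughly 5% of the voxels, about 5,000 of them, “light up” in a brain containing no signal at all, which is why uncorrected thresholds can paint activity onto a dead fish.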

The rise of virtual superlabs

The criticisms drove researchers to do better work— to think more deeply about their data, avoid logical fallacies in interpreting their results, and develop new analytical methods.

At the heart of the matter is the concept of statistical power, which reflects how likely the results are to be meaningful instead of being obtained by pure chance. Smaller studies typically have lower power. An analysis published in 2013 showed that underpowered studies are common in almost every area of brain research. This is especially the case in neuroimaging studies, because most of them involve small numbers of participants.

“Ten years ago I was willing to publish papers showing correlations between brain activity and behavior in just 20 people,” says Poldrack. “Now I wouldn’t publish a study that doesn’t involve at least 50 subjects, or maybe 100, depending on the effect. A lot of other labs have come around to this idea.”
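Poldrack’s 20-versus-100 rule of thumb can be checked with a quick Monte Carlo power sketch. This is my own illustration with a hypothetical moderate effect size (r = 0.3), and the significance cutoff uses a rough normal approximation rather than an exact t-test:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation of two equal-length lists."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def power(n_subjects, true_r=0.3, trials=2000, seed=1):
    """Fraction of simulated studies that detect a true correlation of
    `true_r` at roughly p < .05, two-tailed (normal approximation)."""
    rng = random.Random(seed)
    r_crit = 1.96 / math.sqrt(n_subjects)  # approximate significance cutoff
    hits = 0
    for _ in range(trials):
        x = [rng.gauss(0, 1) for _ in range(n_subjects)]
        # y is built to correlate with x at exactly true_r on average
        y = [true_r * a + math.sqrt(1 - true_r ** 2) * rng.gauss(0, 1)
             for a in x]
        hits += abs(pearson(x, y)) > r_crit
    return hits / trials

print(f"power with  20 subjects: {power(20):.0%}")
print(f"power with 100 subjects: {power(100):.0%}")
```

With 20 subjects a real, moderate effect is missed most of the time; with 100 it is detected far more reliably. That is the sense in which small studies are “underpowered.”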

Cost is one of the big barriers preventing researchers from increasing the size of their studies. “Neuroimaging is very expensive. Every lab has a budget and a researcher isn’t going to throw away his entire year’s budget on a single study. Most of the time, there’s no real incentive to do the right thing,” Yarkoni says.

Replication – or repeating experiments to see if the same results are obtained – also gives researchers more confidence in their results. But most journals are unwilling to publish replication experiments, preferring novel findings instead, and the act of repeating someone else’s experiments is seen as aggressive, as if implying they were not done properly in the first place. Confirmation by repeat experiments is vital to the scientific method!

This “unwillingness” is a SOCIAL IMPOSITION on the validity of scientific inquiry. We wouldn’t want to hurt the feelings of the researchers, would we? But no one cares about the consequences to the public!

One way around these problems is for research teams to collaborate with each other and pool their results to create larger data sets. One such initiative is the IMAGEN Consortium, which brings together neuro-imaging experts from 18 European research centers, to share their results, integrate them with genetic and behavioral data, and create a publicly available database. (Assuming all this “data” is not junk data…)

Five years ago, Poldrack started the OpenfMRI project, which has similar aims. “The goal was to bring together data to answer questions that couldn’t be answered with individual data sets,” he says. “We’re interested in studying the psychological functions underlying multiple cognitive tasks, and the only way of doing that is to amass lots of data from lots of different tasks. It’s way too much for just one lab.”

An innovative way of publishing scientific studies, called pre-registration, could also increase the statistical power of fMRI studies. Traditionally, studies are published in scientific journals after they have been completed and peer-reviewed. Pre-registration requires that researchers submit their proposed experimental methods and analyses early on. If these meet the reviewers’ satisfaction, they are published; the researchers can then conduct the experiment and submit the results, which are eventually published alongside the methods.

“The low statistical power and the imperative to publish incentivizes researchers to mine their data to try to find something meaningful,” says Chris Chambers, a professor of cognitive neuroscience at the University of Cardiff. “That’s a huge problem for the credibility and integrity of the field.”

Chambers is an associate editor at Cortex, one of the first scientific journals to offer pre-registration. As well as demanding larger sample sizes, the format also encourages researchers to be more transparent about their methods.

Many fMRI studies would, however, not be accepted for pre-registration – their design would not stand up to the scrutiny of the first-stage reviewers.

“Neuro-imagers say pre-registration consigns their field to a ghetto,” says Chambers. “I tell them they can collaborate with others to share data and get bigger samples.”

Pushing the field forward

Even robust and apparently straightforward fMRI findings can still be difficult to interpret, because unanswered questions remain about the nature of the BOLD signal. How exactly does the blood rush to a brain region? What factors affect it? What if greater activation in a brain area actually means the region is working less efficiently?

“What does it mean to say neurons are firing more in one condition than in another? We don’t really have a good handle on what to make of that,” says Yarkoni.

“You end up in this uncomfortable situation where you can tell a plausible story no matter what you see.”

To some extent, the problems neuro-imagers face are part of the scientific process, which involves continuously improving one’s methods and refining ideas in light of new evidence. (Unless you block the possibility of there being any new evidence.) When done properly, the method can be extremely powerful, as the ever-growing number of so-called “mind-reading” and “decoding” studies clearly show. (Obligatory last-sentence contradiction to the theme of the article. Can’t hurt the feelings of the incompetent researchers!)

_____________________________my comment:

That’s just great! In the meantime, hundreds of thousands of children and adults have been “diagnosed” as having abnormal brains and developmental disorders, as well as numerous “mental illnesses” by charlatans, in the “caring, helping, fixing” industry – people who continue to acquire obscene profits at the expense of parents and children who are the targets of borderline “eugenic” activity.

_______________________________________________________________________________

It’s likely that with incremental improvements in the technology, fMRI results will become more accurate and reliable. In addition, there are a number of newer projects that aim to find other ways to capture brain activity. For example, one group at Massachusetts General Hospital is working on using paramagnetic nanoparticles to detect changes in blood volume in the brain’s capillaries.

Such a method would radically enhance the quality of signals and make it possible to detect brain activity in one individual, as opposed to fMRI that requires pooling data from a number of people, according to the researchers.

Betcha didn’t know that it’s not YOUR BRAIN lighting up those colorful composite brain images! 

Other scientists are diving even deeper, using paramagnetic chemicals to reveal brain activity at the cell level. If such methods come to fruition, we could find the subtlest activities in the brain, maybe just not in a dead fish.

 

We have to start somewhere / What is cognition?

I’m working up to the problem of visual and sensory thinking being all but ignored (or even dismissed) by the “cognition and behavior sciences” as a primary mode of perception and cognition in evolutionary history. This ignorance or arrogance on the part of “researchers” is especially negligent in those whose declared interest is ASD / Asperger’s and other non-typical diagnoses. The irony is that these diagnoses of “abnormality” may simply demonstrate the bias or outright prejudice that only the “social” language of scripted word concepts / formal academic constructs is “important” to human thought and behavior. That is, rigid restrictions have been placed on human thought, behavior and personal expression that may reflect the inability of the “social engineering class” to think in any other mode. Has this group become so isolated from “natural” human behavior that only individuals who are similarly limited to social constructs and rigid narratives are “accepted, selected for” inclusion in the class of those who dictate social behavior, thus increasingly diminishing the diversity of ideas about “what it is to be human” to their own impoverished experiences? The peasant classes are urged to function only on emotional reactivity and scripted social behavior, thus remaining powerless.

WIKI on Cognition: 

“Cognition is “the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses”.[1] It encompasses processes such as attention, the formation of knowledge, memory, and working memory, judgement and evaluation, reasoning and “computation,” problem-solving and decision making, comprehension and production of language. Cognitive processes use existing knowledge and generate new knowledge.” 

Note that “producing language” is only one of many thinking processes; the “expressive – action based” fields of art and music, dance and kinesthetic “thinking” must be assumed to be included under experience and the senses; otherwise these thought processes are missing from the list. Why? The stress is on “conscious” cognition; “unconscious” cognition is considered to be “low-level” cognition and has been segregated from “high-level cognition” – an error that has had severe consequences to the understanding of “how the brain works” in relation to the “whole” human organism and how it interacts with the environment. This “social conception” of human biology, physiology and behavior serves the western socio-religious narcissism of “man” as a special creation isolated from the reality of evolution.  

“The processes are analyzed from different perspectives within different contexts, notably in the fields of linguistics, anesthesia, neuroscience, psychiatry, psychology, education, philosophy, anthropology, biology, systemics, logic, and computer science. These and other different approaches to the analysis of cognition are synthesized in the developing field of cognitive science, a progressively autonomous academic discipline.”  

Again, we must assume that “the arts” are included somewhere in this disconnected “chopped salad” of academic reserves, which often are “at war” with each other over “domains of expertise” (territories) without much flow of information or “honest” discussion between academics. Genuine scientific competition and progress requires constant questioning of assumptions (hypothesis, theories); this necessity is hampered by most of these disciplines being based on theories, rather than truly investigative “reality-based” research that is open to challenges by other researchers.

A severe problem with current concepts of cognition and intelligence: the 300,000 y.o. Jebel Irhoud Homo sapiens, considered to be the “earliest so far” true Homo sapiens. If judged on the decision / conceit that only “conscious social cognition and behavior” count toward being classified as Homo sapiens, how do we explain the survival of any hominid? The current explanation is that these early Homo sapiens were “cognitively and socially identical to modern social humans.” A reality-based conclusion would be that, given the variety and range of difficult environments and conditions in which they survived and successfully reproduced, these humans would have had to be more intelligent than modern domesticated humans, who have the advantage of 300,000 years of collective human experience and culture HANDED TO THEM by default.

The “human brain and behavior” community would have us believe that this fellow survived by relying on modern social word-concepts and social theories of behavior.

Au contraire! Survival would have demanded the “action” intelligences of sensory processing: art and technology production, acute and immediate visual-sensory analysis of threats and opportunities presented by a wild ‘natural’ environment, memorization / mapping of geographical, geological and faunal-flora details of food availability; cooperation, sharing and mutual respect for individual skills and talents, and a precise (not vague or generalized) use of verbal language, gestures, imitative animal communication and graphic symbols.


What is the Asperger “Blank Stare” all about? / Re-Post

What is the Aspie blank stare and why is it a disturbing facet of Aspie behavior?

Complaint from an Aspie ‘Mum’ about her son, decoded:

MUM: In my experience, I would get a blank stare when I asked (my Asperger son) a question.  It could be, for example, what he would like for dinner? What happened at school? You know – normal sorts of ‘Mum’ questions!

Answer: Social typical questions tend to be vague and non-specific. A specific question would be: “Would you like pizza or hot dogs for dinner?” Or try, “We’re having hamburgers for dinner. I bought the kind of buns you like and you can add tomatoes or pickles or cheese, or whatever else you like.”  “What stories did you read in reading class today?”

MUM: How did I interpret the blank stare that I got?

At the time, I believed that ‘the blank stare’ was used by (SON) to avoid answering the questions I asked – questions I thought were easy to answer! I realize now that, in my frustration over not getting an answer, I would pile on the questions one after another, and (SON) didn’t have time to process even the first one! I would get cross with him, frustrated that he seemed to refuse to respond to my requests for information, and I would give up.

Answer: One of the big mistakes that social typicals make is to attribute INTENT to Asperger behavior. This is because social typicals are “self-oriented” – everything is about THEM; any behavior on the part of a human, dog, cat, plant or lifeform in a distant galaxy, must be directed at THEM. Example: God, or Jesus, or whomever, is paying attention 24/7 to the most excruciatingly trivial moments in the lives of social typicals. We’re not as patient as God or Jesus.

The Asperger default mental state is a type of reverie, day-dreaming, trance or other “reflective” brain process; that is, we do “intuitive” thinking. The “blank face” is because we do not use our faces to do this type of thinking. 

Sorry – we’re just busy elsewhere! When you ask a question, it can take a few moments to “come out of” our “reverie” and reorient our attention. If you are asking a “general question” that is meant to elicit a “feeling” (social) response, it will land like a dead fish in front of us. Hence the continued “blankness”.  

MUM: What is the real cause of the blank stare?

I believe that SON uses the blank stare while he is processing a question. If given enough time, he will think deeply and consider his response, which is often unexpected.

Answer: The “blank stare” is due to our type of brain activity. We process questions; processing questions adds to response time. Some questions are so vague that we simply cannot answer them. Some questions aren’t questions at all, but are an attempt to get our attention and to get a “social” something from us. This is truly confusing. 

MUM: (I’m told that) at any given moment an Aspie is taking in lots of information from the world around them. They notice details that normal people ignore. These details can easily result in sensory overload. The blank stare is used by Aspies as a way to ‘zone out’, or ‘go into themselves’ as a coping mechanism for when their senses are overloaded.

Answer: Not correct (in my experience). Sensory overload is another matter entirely; sensory overload results in the desire to flee, and if we can’t “get away” we experience meltdown. Other Aspies may have a different take on this.

Aspie chat concerning “The Stare”

“I watched “Rain Man” again recently. There was a scene where Dusty was sitting on a park bench and just looking at the ground, and Tom Cruise started YELLING at him. I felt like, “Hey ! sometimes I just sit and think about things, and maybe I’m staring at the ground, so cool it Tom.” We tend to look off into the horizon while we’re talking, and really, it’s not a big deal …”

“At work I’ll be at my desk just working away and people will tell me to cheer up when I don’t feel at all down. Also, if I’m standing around somewhere, and not focusing on anything in particular – and feeling fine, someone will ask me if I’m OK or if I’m pissed off about something. Something about my neutral (not happy or sad, just contented) expression makes people think I’m depressed or angry.”

“People are always doing one of the following: Ask me if I’m okay because I’m staring off into the distance; look behind their back to see what I’m staring at; or tell me to “SMILE!” because I don’t have any facial expression.”

Yes, social typicals are self-centered and demanding. They don’t want to “put up with” a blank face; it damages their perfect narcissistic universe, in which it is everyone’s job to make them feel important.

And then, there is the other “eye” problem:

“I dont get it…..my teacher tells me to look at her when she talks and when I look at other people they tell me to stop staring at them. What the…?”

“Apparently staring and looking are two different things, not that I know how to tell the difference.”

The teacher demands eye-contact because it indicates OBEDIENCE – SUBMISSION. Authoritarian adults demand instant obedience from children. But if you stare at a  “regular” person, that causes another problem. You are claiming higher status; predators stare down prey; you, dear Aspie, are unwittingly behaving like a predator.

“I stare because I get easily distracted by details and I want to see more; it’s just attention to detail. I’m doing better at straight eye contact, but open my eyes too wide because I’m trying hard to focus and pay attention.”

“If I am interested in what a person is saying – it’s new to me or important information, I will stare like a laser. Also if I am trying to recognize someone that looks vaguely familiar, or there is something interesting about how they look and I want to examine it. If I’m not interested, I won’t look at them. However, that does not mean I am not listening just because I am not looking at them.”

It seems to me, that Aspies use our senses as nature intended: We use our eyes to see and we use our ears to listen. 


Chauvet Cave Paintings / Thoughts on Deep History

Yesterday I watched a Werner Herzog short film on the Chauvet Cave and its 32,000 y.o. paintings, mainly drawings or sketches of rhinoceros, lions and horses. The sequences included poetic and reflective notions and feelings of the European scientists and film crew; if a person is to have deep and moving experiences about our ancestors, wouldn’t it be in a place such as Chauvet? If the drawings don’t affect one as a significant point of contact with the history of “being human” what would or could ever touch that person’s awareness? For people whose experience of human history is no longer than the disappearance of yesterday’s fake news, 32,000 years is at least an imaginable period of time. It’s not an incomprehensible billion or even million years: it’s 3 x 10,000 years.

_____________________

Reproductions of photos of the art are quite inferior to the filmed version in which the limestone – calcite covered walls are pale and glistening and dimensional. We will never see the paintings as they were while fresh due to changes within the cave itself; post-art stalactites and flow stone cover the walls and floors. The original entrance, into which sunlight would have penetrated, is lost due to rock falls. Nor can we leave behind 10,000 years of human agriculture, technology, mass religion, and the “hoard” of crazy ideas about ourselves, nature and the universe that resulted from recent human “mental” activity. Wiping clean our cluttered perceptions of the nature of reality for these ancestral people is impossible; we will inevitably impose our personal, social and cultural hysteria onto their lives.

But, we can and do possess the effect that the cave and drawings have on us as individuals, if and when and in what form we see them. As for myself, the bulk of my reaction is unconscious – visual; no explanation is needed. Art is what humans do; these people were human. As to their appearance, cave-cleaning habits, disposal of trash, logistics for acquiring material objects, language / no language, social skills, love lives, supposed religion, or beliefs, I’ll leave that bundle of speculation to those that fixate on “creating and controlling narratives” that try very hard to bring remote ancestors into the socio-conceptual fold of contemporary narcissism, and which ultimately fail. If we want to get picky about the competence and creativity of these people, how many present day humans could manage a fair copy of these originals?

Both “primitive” artists, and artists through the centuries, often say that while working, it is as if their hands are guided by a “spirit” of creativity; their own identity and awareness all but disappear. A trancelike state, if you will. “The image or figure came through me” and onto the paper, canvas, wood or stone. Visual perception: it’s the oldest form of human thought and communication. No verbal description could ever replace the drawings at Chauvet (although people try to do just that.)

There is much I could say about my reactions as an artist: the absolute sense of aliveness and activity of the animals, not easy to do; the lack of “sacred ritual reverence” so depended on by anthropologists and archaeologists as an “explanation” for the existence of every human artefact. Drawings deface other drawings – overlap them, cut them off. Animals superimpose animals – an illusion of a mural or gallery is created by drawings being added at intervals, some thousands of years apart. Overall, the impression is of a sketchbook; an individual is practicing, improving, trying to “capture” the essence of a particular animal as he or she “sees it”. Others “copy” – not quite as elegantly. The attention of the “drawer” to his or her own abilities is inseparable from the drawings. These are dangerous wild animals, and yet fear seems absent; admiration, excitement, curiosity and familiarity are conveyed by the unhesitating swiftness of lines and careful shading. The “sense” conveyed to me, is an expression of self-confidence – a timeless attribute of “natural man” to this day.

 

Psychotropic drug “poisoning” of the U.S. population / Criminal Psychiatry

__________________________________________

def. psychotropic medication

Psychotropic medication: Any medication capable of affecting the mind, emotions, and behavior. (This “generic” description covers any and all “brain functions” that control, in essence, “who we are” as individual organisms; how we think, behave and feel, as well as the actions these states produce.)

___________________________________________

Types of Psychotropic Medications

From: GoodTherapy.org – Helping people find therapists. Advocating for ethical therapy. Click here for extended text.

Several different types of medications are used to treat mental health conditions. The following is a list of the major categories of psychotropic medications: LIVE LINKS BELOW – read about each drug…

Most Frequently Prescribed Psychotropic Drugs

Based on 2013 data, here is a list of the 10 most prescribed psychotropic drugs in the United States (with the number of prescriptions written during one year):

  1. Xanax (alprazolam), 48.5 million
  2. Zoloft (sertraline), 41.4 million
  3. Celexa (citalopram), 39.4 million
  4. Prozac (fluoxetine), 28.3 million
  5. Ativan (lorazepam), 27.9 million
  6. Desyrel (trazodone HCL), 26.2 million
  7. Lexapro (escitalopram), 24.9 million
  8. Cymbalta (duloxetine), 18.6 million
  9. Wellbutrin XL (bupropion HCL XL), 16.1 million
  10. Effexor XR (venlafaxine HCL ER), 15.8 million

Medication that works well for one person may not work well for another. It is important to have an in-depth conversation about your medical history, symptoms, diagnosis, and goals with your medical provider before beginning a psychotropic medication. You cannot legally purchase psychotropic medication without a prescription. (Which drives individuals who have become “hooked on” psychotropics, but can no longer get prescriptions – or afford the fees imposed by psychiatrists for appointments – to move on to illegal black market drugs to “treat” their psychotropic addiction.) 

___________________________________________________

And then, there are the psychotropic drugs used for “other” conditions.

Pregabalin, marketed under the brand name Lyrica among others, is a medication used to treat epilepsy, neuropathic pain, fibromyalgia, and generalized anxiety disorder.[9][10][11] Its use for epilepsy is as an add-on therapy for partial seizures with or without secondary generalization in adults.[12] Some off-label uses of pregabalin include restless leg syndrome,[13] prevention of migraines,[14] social anxiety disorder,[14] and alcohol withdrawal.[15] When used before surgery it does not appear to affect pain after surgery but may decrease the use of opioids.[16]

Common side effects include: sleepiness, confusion, trouble with memory, poor motor coordination, dry mouth, problems with vision, and weight gain.[10] Potentially serious side effects include angioedema, drug misuse, and an increased suicide risk.[10] When pregabalin is taken at high doses over a long period of time, addiction may occur, but if taken at usual doses the risk of addiction is low.[1] Pregabalin is a gabapentinoid and acts by inhibiting certain calcium channels.[17][18]

Parke-Davis developed pregabalin as a successor to gabapentin, and it was brought to market by Pfizer after the company acquired Warner-Lambert.[19][20] No generic version is expected to be available in the United States until 2018.[21] A generic version is available in Canada, the United Kingdom, and Australia.[22][23][24] In the US it costs about 300–400 USD per month.[10] Pregabalin is a Schedule V controlled substance under the Controlled Substances Act of 1970 (CSA).

REVENUE Lyrica

In 2016, Lyrica generated revenue of some 4.4 billion U.S. dollars. Lyrica is an anticonvulsant drug marketed by Pfizer. In the United States, it is most commonly used for neuropathic pain.

Note the use of this “psych drug” for nerve pain and many, many “off-label” conditions. No psychiatric diagnosis needed.

Side effects: This is just ONE LIST from pages and pages of side effects for this drug!

Psychiatric side effects of Lyrica: for complete info go to: https://www.drugs.com/sfx/lyrica-side-effects.html

Common (1% to 10%): Confusion, euphoria, amnesia, nervousness, irritability, disorientation, insomnia, libido decreased, disturbance in attention, anxiety, depersonalization, stupor, abnormal thinking

Uncommon (0.1% to 1%): Cognitive disorder, mental impairment, abnormal dreams, agitation, apathy, aphasia, hallucinations, hostility

Rare (less than 0.1%): Delirium, delusions, manic reaction, paranoid reaction, personality disorder, psychotic depression, schizophrenic reaction, sleep disorder, disinhibition