Eurocentric Anthropology / Will the real H. sapiens please stand up?

Right: A composite computer reconstruction of fossils from Jebel Irhoud shows a modern, flattened face paired with an archaic, elongated braincase (Meaning this is an “idealized model” that may or may not accurately represent actual persons who lived in this location at the time referenced.)

Homo sapiens as “designed by” Anthropologists

Article Comment: Very problematic (as usual) with the “definition” of Homo sapiens “changing” with each fossil discovery or re-analysis. This current “archaic skull shape” and “modern flat-face” discovery seems a flimsy basis for a vital species description and is indeed Eurocentric! No surprise there. But where does that leave contemporary humans who do not have flat faces? Non-flat (under- or over-prognathic) faces are generally labeled as “deformities” … which leaves a considerable percentage of modern humans as either “archaic proto-Homo sapiens” or as “deformed” modern Homo sapiens. And what about “dolichocephalic” (long, narrow head) modern humans? What are they? Are aboriginal Australian people “Homo sapiens” or not? African Americans?

From: SCIENCE / AAAS

World’s oldest Homo sapiens fossils found in Morocco

(edited for length)

By Ann Gibbons Jun. 7, 2017

For decades, researchers seeking the origin of our species have scoured the Great Rift Valley of East Africa. Now, their quest has taken an unexpected detour west to Morocco: Researchers have redated a long-overlooked skull from a cave called Jebel Irhoud to a startling 300,000 years ago, and unearthed new fossils and stone tools. The result is the oldest well-dated evidence of Homo sapiens, pushing back the appearance of our kind by 100,000 years.

The discoveries, reported in Nature, suggest that our species came into the world face-first, evolving modern facial traits while the back of the skull remained elongated like those of archaic humans. (Neanderthals, H. erectus, Denisovans)

Back in 1961, miners searching for the mineral barite stumbled on a stunningly complete fossil skull at Jebel Irhoud, 75 kilometers from Morocco’s west coast. With its big brain but primitive skull shape, the skull was initially assumed to be an African Neandertal. In 2007, researchers published a date of 160,000 years based on radiometric dating of a human tooth.

At Herto, in Ethiopia’s Great Rift Valley, researchers dated H. sapiens skulls to about 160,000 years ago; farther south at Omo Kibish, two skullcaps are dated to about 195,000 years ago, making them the oldest widely accepted members of our species, until now.

Some researchers thought the trail of our species might have begun earlier. After all, geneticists date the split of humans and our closest cousins, the Neandertals, to at least 500,000 years ago, notes paleoanthropologist John Hawks of the University of Wisconsin in Madison. So you might expect to find hints of our species somewhere in Africa well before 200,000 years ago, he says.

One of the few people who continued to ponder the Jebel Irhoud skull was French paleoanthropologist Jean-Jacques Hublin…

The team now has new partial skulls, jaws, teeth, and leg and arm bones from at least five individuals, including a child and an adolescent, mostly from a single layer that also contained stone tools. In their detailed statistical analysis of the fossils, Hublin and paleoanthropologist Philipp Gunz, also of the Max Planck Institute in Leipzig, find that a new partial skull has thin brow ridges. (Is this an actual skull or a reconstruction from statistical data?) And its face tucks under the skull rather than projecting forward, similar to the complete Irhoud skull as well as to people today. But the Jebel Irhoud fossils also had an elongated braincase and “very large” teeth (as do many H. sapiens today), like more archaic species of Homo, the authors write.

New dates and fossils from Jebel Irhoud in Morocco suggest that our species emerged across Africa. The new findings may help researchers sort out how these selected fossils from the past 600,000 years are related to modern humans and to one another. (Or will result in more of the same “socio-academic” arguments over which “fossil humans” are WORTHY of being the ancestors of modern European H. sapiens)

The fossils suggest that faces evolved modern features before the skull and brain took on the globular shape (how magical!) seen in the Herto fossils and in living people. “It’s a long story—it wasn’t that one day, suddenly these people were modern,” Hublin says.

Neandertals show the same pattern: Putative Neandertal ancestors such as 400,000-year-old fossils in Spain have elongated, archaic skulls with specialized Neandertal traits in their faces. “It’s a plausible argument that the face evolves first,” (Really?) says paleoanthropologist Richard Klein of Stanford University in Palo Alto, California, although researchers don’t know what selection pressures might drive this. (Actually, many scenarios have been put forth as to changes in face shape – from changes in diet to a trend toward “gracile neoteny”)

The traditional anthropological arguments follow:

This scenario hinges on the revised date for the skull, which was obtained from burnt flint tools. (The tools also confirm that the Jebel Irhoud people controlled fire.) Archaeologist Daniel Richter of the Max Planck Institute in Leipzig used a thermoluminescence technique to measure how much time had elapsed since crystalline minerals in the flint were heated by fire. He got 14 dates that yielded an average age of 314,000 years, with a margin of error from 280,000 to 350,000 years. This fits with another new date of 286,000 years (with a range of 254,000 to 318,000 years), from improved radiometric dating of a tooth. These findings suggest that the previous date was wrong, and fit with the known age of certain species of zebra, leopard, and antelope in the same layer of sediment. “From a dating standpoint, I think they’ve done a really good job,” says geochronologist Bert Roberts of the University of Wollongong in Australia.
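
For readers wondering how 14 separate flint measurements become one number with a range, here is a minimal Python sketch using invented values – not the actual Jebel Irhoud measurements; Richter’s team used weighted means and fuller error modeling:

```python
# Purely illustrative: pooling several hypothetical thermoluminescence (TL)
# dates (in thousands of years, ka) into a mean age with a rough range.
# These values are invented and are NOT the 14 dates measured at Jebel Irhoud.
dates_ka = [302, 355, 283, 330, 295, 341, 310, 326, 300, 318, 290, 336, 308, 312]

n = len(dates_ka)
mean_ka = sum(dates_ka) / n

# Sample standard deviation of the individual dates, then a simple 2-sigma
# spread around the mean as a crude "margin of error."
std_ka = (sum((d - mean_ka) ** 2 for d in dates_ka) / (n - 1)) ** 0.5
low, high = mean_ka - 2 * std_ka, mean_ka + 2 * std_ka

print(f"mean age: ~{mean_ka:.0f} ka")            # ~315 ka with these invented values
print(f"rough range: ~{low:.0f}-{high:.0f} ka")  # roughly 270-360 ka here
```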

Once Hublin saw the date, “we realized we had grabbed the very root of the whole species lineage,” he says. (!!!) The skulls are so transitional that naming them becomes a problem: The team calls them early H. sapiens rather than the “early anatomically modern humans” described at Omo and Herto. 

(Words again – word labels and categories – but where is the functional “reality”?)

Some people might still consider these robust humans “highly evolved H. heidelbergensis,” (arguments continue as to whether or not this is a made-up or an actual species!) says paleoanthropologist Alison Brooks of The George Washington University in Washington, D.C. She and others, though, think they do look like our kind. “The main skull looks like something that could be near the root (now there’s a fine example of specific, concrete scientific language, LOL. Note that “root” refers to the popular “tree diagrams” of evolutionary relationships, which are “archaic” at this point and not up to date with current understanding of actual evolution…) of the H. sapiens lineage,” says Klein, who says he would call them “protomodern, not modern.” (Aye, yai, yai! Habitual “word construct academic thought structures” are simply inadequate for understanding evolution)

The team doesn’t propose that the Jebel Irhoud people were directly ancestral to all the rest of us. (LOL Will any fossil humans ever be “good enough” to be the direct ancestors of European anthropologists?) Rather, they suggest that these ancient humans were part of a large, interbreeding population that spread across Africa when the Sahara was green about 300,000 to 330,000 years ago; they later evolved as a group toward modern humans. (Misty magic is invoked again) “H. sapiens evolution happened on a continental scale,” Gunz says.

Support for that picture comes from the tools that Hublin’s team discovered. They include hundreds of stone flakes that had been hammered repeatedly to sharpen them and two cores—the lumps of stone from which the blades were flaked off—characteristic of the Middle Stone Age (MSA). Some researchers thought that archaic humans such as H. heidelbergensis invented these tools. But the new dates suggest that this kind of toolkit, found at sites across Africa, may be a hallmark of H. sapiens. (Murky mental picture)

The finds will help scientists make sense of a handful of tantalizing and poorly dated skulls from across Africa, each with its own combination of modern and primitive traits. For example, the new date may strengthen a claim that a somewhat archaic partial skull at Florisbad in South Africa, roughly dated to 260,000 years ago, may be early H. sapiens. But the date may also widen the distance between H. sapiens and another species, H. naledi, that lived at this time in South Africa.

The connections among these skulls and the appearance of MSA tools across Africa at this time and possibly earlier show “a lot of communication across the continent,” Brooks says. (Communication is a popular and contagious theme in “communication-crazy” smartphone-era culture) “This shows a pan-African phenomenon, with people expanding and contracting across the continent for a long time.” (Novel writing is the true calling of archaeologists and anthropologists and so-called science writers)

_______________________________________________

The original paper: Nature

New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens

Jean-Jacques Hublin, (see original for all authors)

Abstract

Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure because the fossil record is scarce and the chronological age of many key specimens remains uncertain. In particular, it is unclear whether the present-day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens [1] or evolved gradually over the last 400 thousand years [2]. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating) [3], this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established. Furthermore, it shows that the evolutionary processes behind the emergence of H. sapiens involved the whole African continent.

Comment: Whatever happened to reproductive success as the distinctive basis for a species division? 

Homo sapiens did interbreed with “archaic humans” after all; is Homo sapiens a “real” species, or a “breed” of an earlier foundational Homo species? It seems that in Eurocentric anthropology, at least, Homo sapiens is a morphologic type and not a species at all.

Where is reproduction of viable offspring in this “tangled obsession” with “body parts” as signs and portents in Eurocentric narcissism? Homo sapiens simply “looks like us”! How does “flat face” versus “prognathic face” impact “having offspring” – unless, of course, you are racist?

Face “shape” may be determined by factors having nothing at all to do with what is vital to being a modern Homo sapiens!

 

 

Genetic Literacy Project / No increase in “Autism” – Mystery Solved

See post: “The $9,000,000,000 U.S. Autism Industry” That’s $9 BILLION

No, it’s not vaccines, GMOs, glyphosate – or organic foods

Clip from Genetic Literacy Project:

So what does the latest evidence show? There is now intriguing evidence that there has in fact been no dramatic rise in autism after all. According to a just-released study, scientists at Aarhus University in Denmark assessed more than 670,000 children born between 1980 and 1991 in Denmark, following them from birth until they were diagnosed with Autism Spectrum Disorder, died, emigrated, or reached the end of the study period in December 2011. Among other things, Denmark is renowned for its excellent national medical records system, which allowed the researchers to conduct a study of this magnitude over such an extended time span. Among the population studied, 4,000 children were diagnosed as being along the autism spectrum, and many of these diagnoses were made after 1995. Look at what happened just before that detected increase. Tara Haelle reports in Forbes:

In Denmark in particular, the diagnostic criteria for autism expanded in 1994 to include a spectrum of disorders with a broader list of symptoms, thereby widening the definition of autism. Then in 1995, national data tracking began to include diagnoses made from outpatient visits rather than just diagnoses of those admitted to a healthcare facility.

The exact same thing has happened in every country that has seen soaring autism rates – the definition of what constitutes autism was dramatically expanded in the early 1990s to embrace the catch-all term Autism Spectrum Disorder – correlating with when GMO usage, chemtrail rates, pesticide exposure and organic food sales began a sharp increase.

The researchers discovered that the change in diagnostic criteria, taken together with the diagnoses made outside of a healthcare facility, accounted for as much as 60 percent of the increase in prevalence of autism spectrum disorders. The authors of the study conclude thus:

Changes in reporting practices can account for most (60 percent) of the increase in the observed prevalence of ASDs in children born from 1980 through 1991 in Denmark. Hence, the study supports the argument that the apparent increase in ASDs in recent years is in large part attributable to changes in reporting practices.

Though this in itself isn’t evidence of a lack of increase in the prevalence of autism, it does say very emphatically that the huge uptick in the number of autistic children diagnosed has more to do with how we diagnose the condition than with an actual increase.
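
To make the “60 percent” figure concrete, here is a tiny arithmetic sketch in Python with invented prevalence numbers – the Danish study reports the attribution fraction, not these values:

```python
# Hypothetical illustration only: the prevalence figures below are invented.
prevalence_before = 0.2   # percent of children diagnosed, earlier birth cohorts
prevalence_after = 1.0    # percent diagnosed, later birth cohorts

observed_rise = prevalence_after - prevalence_before   # 0.80 percentage points
from_reporting = 0.60 * observed_rise                  # 0.48 points explained by reporting changes
unexplained = observed_rise - from_reporting           # 0.32 points left for other causes

print(f"observed rise: {observed_rise:.2f} percentage points")
print(f"explained by reporting changes (60%): {from_reporting:.2f}")
print(f"remaining, unexplained: {unexplained:.2f}")
```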

The idea that increased diagnosis contributes to higher prevalence of a disease is not new at all. In fact it is quite common, especially as new diagnostic techniques come into play and early screening programs are put in place by governments. This often leads to debates in the medical literature about whether increases in prevalence of disease are real or due to an increase in diagnosis. Prostate cancer is a common example – incidence of prostate cancer jumped over 100 percent from 1986 to 1992, which coincided with an aggressive expansion of the prostate cancer screening program based on the prostate-specific antigen (PSA) test, approved by the FDA in 1986.

The result of the autism study, even if somewhat expected, is still very important. The quality of Denmark’s health records and the size of the study make it unique – the data are extremely robust and reliable. So how do these results translate to the United States? We do have similarities in how the diagnosis has changed, writes Tara Haelle in Forbes:

The way autism is defined in the U.S. has changed dramatically since 1980, when it first appeared in the DSM-III as “Infantile Autism” and could only be diagnosed in children whose symptoms began before they were three years old. Autism spectrum disorders have expanded to include diagnosis without a specific age requirement beyond the “early developmental period” and without requiring significant language impairment in the recently revised DSM-5.
The vast majority of people diagnosed with autism spectrum disorders today would never have qualified under the 1980 classification, and no formal classification separate from schizophrenia existed before then. So it’s not surprising that numbers have increased in the U.S.
Note how the deletion of “language impairment” whimsically redefined Asperger’s as “Autism” – a socially motivated decision that is not based on scientific evidence.

There are many possible reasons why the prevalence of a disease may increase. Apart from increased screening and changes in diagnostic criteria, factors like increased awareness also come into play. As credible scientific efforts around the world continue to identify genetic and/or environmental causes behind autism, it is prudent not to be taken in by wild claims or give in to the fears spread by those who accept and promote pseudoscience. (Such as psychologists and psychiatrists and the Autism Industry) And no, the Wi-Fi in your house or the genetically modified foods you eat will not lead to your child becoming autistic.

Arvind Suresh is a science communicator and a former laboratory biologist. Follow him @suresh_arvind

New Research / Female Hips and Running, Birth

 

john hawks weblog

If the basic assumptions of the obstetric dilemma are right, says Lewton, participants with wider hips should run and walk less efficiently than those with narrow ones. But that wasn’t what Lewton and her team found. Instead, they found no connection at all between hip width and efficiency: wide-hipped runners moved just as well as their narrow-hipped peers. Lewton and her colleagues published their results in March 2015 in the online journal PLOS ONE. The work was supported by grants from the National Science Foundation and The Leakey Foundation.

“This ‘trade-off’ between hips wide enough for a big baby and small enough for efficient locomotion does not seem to occur,” says Lewton. “That means that we have to rewrite all of the anthropology textbooks! Even outside of textbooks, the general public thinks that if your hips are wide, you’re a bad biped, and that does not seem to be the case.”

The research paper discussed in the article is the recent one by Anna Warrener and colleagues, “A Wider Pelvis Does Not Increase Locomotor Cost in Humans, with Implications for the Evolution of Childbirth”.

Reference

Warrener AG, Lewton KL, Pontzer H, Lieberman DE (2015) A Wider Pelvis Does Not Increase Locomotor Cost in Humans, with Implications for the Evolution of Childbirth. PLoS ONE 10(3): e0118903. doi:10.1371/journal.pone.0118903

Left: "Lucy" Australopithecus afarensis 3.2 mya; H. erectus female "Gona" 1.2 mya; modern H. sapins female.

Left: “Lucy” Australopithecus afarensis 3.2 mya Center: H. erectus female 1.2 mya Right: modern H. sapiens female.

Bipolar Disorder / Medication Warnings Paper

Bipolar Disord. 2013 Aug;15(5):594-621. doi: 10.1111/bdi.12098. Epub 2013 Jul 19.

Impact of psychotropic drugs on suicide and suicidal behaviors.

Yerevanian BI, Choi YM. Department of Psychiatry, Greater Los Angeles VA Healthcare System, Sepulveda Ambulatory Care Center, North Hills, CA 91343, USA.

OBJECTIVE: To examine the impact of psychotropic drugs on suicide and suicidal behaviors in bipolar disorders.

METHODS: A Medline search of articles published from January 1960 to January 2013 was performed using relevant keywords to identify studies examining the relationship of psychotropic drugs to suicidal behaviors. The publications were further reviewed for relevant references and information. Additionally, the US Food and Drug Administration Center for Drug Evaluation Research website was searched.

RESULTS: The available studies used differing methodologies, making interpretation of the findings difficult. Studies suggest that antidepressants may increase suicidal risk in bipolar disorder, this possibly being related to the induction of broadly defined mixed states. There is no evidence that antiepileptic drugs as a class increase suicidal risk in patients with bipolar disorder. Only lithium provides convincing data that it reduces the risk of suicide over the long term. There is little known regarding the effects of antipsychotics, as well as anti-anxiety and hypnotic drugs, on suicidal behavior.

CONCLUSIONS: The available evidence for the impact of psychotropics on suicidal risk in patients with bipolar disorder is largely methodologically flawed and, except for a few instances, clinically not useful at this point. Adequately powered, prospective randomized controlled studies are needed to assess the impact of each class of psychotropic and each psychotropic as well as common combination therapies. Until such studies have been carried out, clinicians are urged to exercise caution in using these drugs and rely on the traditional means of carefully assessing and monitoring patients with bipolar disorder who are at high risk for suicide.

Kind of amazing that so little is actually known about drugs that are routinely prescribed for bipolar disorder.

I personally had traumatic experiences with Wellbutrin. Doctor ignored warnings –

Personal comments on being Asperger and female


Originally posted Dec. 2015. Didn’t need to do much updating…

What do we really understand about being human?

Officially the diagnosis of Asperger has been removed from the DSM, but the “state” of being Asperger persists: Asperger-type people continue to see themselves as a group who are different to other people. Not just a little different, but having a specific set of dramatic differences. I don’t like to call these differences symptoms, nor do I think Asperger’s can be placed on an “invented” autism spectrum, a construct convenient for those who diagnose, treat, and get paid for being “autism experts.”

In fact, the more I read about autism, the less I can logically apply autism characteristics to the Asperger Way of Being: our brain is focused on the literal physical universe and strives to understand “how things work”. Our perception / processing is especially attuned to patterns and connections “invisible” to the social brain. Modern social typicals or “neurotypicals” are confined to socially constrained “value pyramids” that actively suppress factual information and require conformity to manmade “word-concept reality.”

Female Aspergers are a recently recognized phenomenon; we do manifest our Aspergerness differently from males. We tend to learn and use language more easily and effectively than males, but females are judged by a very narrow and strict definition of femininity, and at the first sign of noncompliance, girls are harshly criticized and relentlessly pressured to become “normal” socially-obsessed girls, often with deeply damaging consequences for developing our true identity: scars made early in childhood impair self-fulfillment as females over a lifetime.

The Asperger brain is radically different: asking a child to change their innate personality is cruel. Imagine telling a female child that she is not a “real girl” merely due to her interest in logic, science, mathematics – or visual arts, and preference for “real knowledge”. The Asperger girl is punished for choices as mundane as comfortable clothing and “active” toys. “Act like a girl” is a mystifying and nonsensical directive pounded into us from birth. We are girls – no one has a right to deny that fact.

How many times have I had to insist, “I am a girl and this is how I behave; therefore this is how a girl behaves.”

Logic has no effect on irrational people, who merely repeat whatever they have been told to believe and say by those higher up on the “human status” pyramid. The very act of questioning supernatural (word-concept) assumptions, (which Asperger children do frequently) brings forth angry rejection and social exile. Anyone who cannot see the trauma inflicted on a child by the outright rejection of that child as “not really human – subhuman in fact,” needs to examine their personal beliefs about empathy, compassion and what human beings mean to each other. Even though I was a strong-willed and confident child, I grew up with a nagging anxiety that pursuing my intelligence and curiosity – my identity – put me on the wrong side of a fence that divided unacceptable female humans from everyone else.

The question I repeatedly asked: How could anyone believe that sex and gender have anything to do with natural curiosity, learning, thinking and self-expression?

Why is it okay for boys to pursue a career of their choosing, but girls are exceptional and abnormal for wanting the same fulfillment? Why are boys allowed to say things that girls are forbidden to say? Why are boys encouraged to test their limits, but girls must settle for a highly scripted (and unhealthy) inferior female life? Why is it okay to be smart in school, but once a girl walks out the door she must hide her intelligence, and indeed pretend (physically and mentally) to be merely a servile sexual being, because that’s what the social order demands? In my case, being pretty only compounded the insults: pretty girls were meant to be trophy wives and I was “defacing” that role by insisting on being someone in my own right.


In many ways, the “disorder” of being a female Asperger comes down to the sin of being intelligent, logical, curious, ambitious, and not amenable to being social when social means abandoning the best of yourself to a hierarchy of unequal opportunity and a straitjacket of prescribed behavior. Beauty isn’t skin deep, but society wants women to be surfaces, without any inconvenient depth.

I was given my physical and mental attributes at birth (that’s opportunity) and in my value system, it is my obligation to develop those attributes thoughtfully, carefully and sanely because that’s what life offers every living creature. To be human is to unfold our destiny, personally and collectively, just as every component of the universe is in a state of “becoming what it becomes”.  A wolf becomes a wolf, a sunflower a sunflower, a star a star, a woman a woman. That is equality; that is fulfillment.


 

What is a proper human being?

There is a proper way to be a rattlesnake, an armadillo, or a Polar bear, but is there a proper way to be human? A child must learn to be a member of a culture within the confines set by family and society, even if those terms are unhealthy or grotesque.

It takes one or more adults to introduce social requirements. When people live in small groups, it is likely that each child will receive clear instruction on becoming an adult, even if his or her parents don’t survive. We know this to be true: anthropology books are stuffed with myths, rituals, laws and practices that describe, for a particular group, exactly how to be a human being. Rituals throughout the year, and those that mark life’s transitions, are considered to be sacred and unchanging.

No single way of being human has emerged as the winner, but the battle for supremacy rages on. After thousands of years of existence, one perfect Homo sapiens culture has yet to be defined. Thousands of human groups have been absorbed, reduced, or annihilated by other groups, a process that is ongoing as “globalization” obliterates cultural boundaries.

It is obvious that diversity is not a modern human value.

As Homo sapiens spread across the planet, grew in number, and “ran into” other branches of our species, challenges occurred over what it means to be human. The condition of incessant bickering and fighting throughout known history, between political, religious, economic and cultural groups, indicates that disharmony and aggression are here to stay. The original “human” template that directs our behavior would seem to have been lost.

The religious notion that instinct is a dangerous threat to moral behavior is a tragic flaw in human thinking.

Our remote ancestors operated on instinct. If we aren’t instinctual beings today, where did instinct go, and why did we lose such a successful system? Not sort of successful, but overwhelmingly successful throughout the animal kingdom.

The expense of cutting off mankind from nature forces us to examine ourselves from the strange perspective of a creature utterly alone in the universe. Contemporary investigations into when we “became human” often begin with this isolated, top-down view, which is rooted in the invalid assumption that our species is the goal of evolution or God’s design: those two concepts are the same.

Scientists look to isolated tribal peoples for clues as to what our hunter-gatherer ancestors may have been like, but today’s Homo sapiens is the descendant of the humans who eliminated, by murder or reproductive annihilation, anyone who was “in the way.” Ruthless Homo sapiens not only survived, but went on to conquer the planet: brutal, self-centered and driven to remove all obstacles, including other groups of humans, especially hunter-gatherers. Contemporary social humans did not descend from people who, for whatever reason, stagnated in marginal environments. Our ancestors murdered their way to power.

Within recorded history we see that modern humans view most of the people they encounter as subhuman or as direct relatives of monkeys, gorillas or other wild animals. There has been great profit and power generated by the simple propaganda that individuals who are the same species, but who show variation in physical appearance or religious practice, are fair game for predation.

The Ancestors were actual people, not magical beings.

At the same time, through “ownership” of dead or conquered cultures – people who can’t protest or refute the myths written by archaeologists and anthropologists – these peoples are exterminated twice, using the magic words of pseudoscientific interpretation. What we know about non-modern people is tainted by the peculiar belief that archaeologists and anthropologists have magical insight into the brains that filled the fragments of broken skulls.

If ancient people had been as confused about their world as their myths (and ours) suggest, they would not have survived a single generation. Although initiated by actual people and events, to be useful, myth must be stripped down to talking points that remain, no matter how irrational the story is after generations of repetition.

Does a base map or template exist within the human brain, about which we are unaware, but which directs our behavior just the same? What we experience as intuition may occur when pictures, symbols, or uncanny feelings arise unbidden from our instinctual (unconscious) memories and reactions. The human ‘sixth sense’ is instinct speaking from the deep wisdom of nature that is shared by all animals and is not the product of magic words that modern social people believe describe reality.

Words, however useful, are a barrier between our brain and our understanding of the brain.

Overdependence on word concepts has left us bereft of our animal thought processes.

The modern human animal demonstrates special attributes, and among these is an incessant obsessive focus on ourselves. We possess an incredible will to force our beliefs not only on each other, but onto the universe. An infantile need for attention activates our big brains. It is a need that cannot be calmed for long.

 

Psychotropic Drug Prescriptions / Link to Suicide, Violence in Military

Let’s face it: The “helping, caring, fixing” industry has a policy of “carpet-bombing” American children and adults with dangerous and lethal drugs and with absolutely no regard for human life – WHY?

PDF: https://www.veterans.senate.gov

A REVIEW OF HOW PRESCRIBED PSYCHIATRIC MEDICATIONS COULD BE DRIVING MEMBERS OF THE ARMED FORCES AND VETS TO ACTS OF VIOLENCE & SUICIDE

A Report by Citizens Commission on Human Rights International, April 2014

INTRODUCTION

The recent tragedies at Fort Hood and the Washington, D.C. Navy Yard are deeply concerning because of the increasing reports of military and veteran violence and suicide in our Armed Forces. Though there can be many reasons for killing oneself or others, the possible role of psychiatric drugs in these tragedies has not been effectively explored. It would be a serious mistake to ignore this factor.

  • Researchers have identified 25 psychiatric medications disproportionately associated with violence, including physical assault and homicide.
  • There are 22 international drug-regulatory agency warnings about these medications causing violent behavior, mania, psychosis and homicidal ideation.
  • There are almost 50 international drug-regulatory agency warnings about psychiatric drugs causing suicidal ideation.
  • One in six American service members was taking at least one psychiatric medication in 2010. More than 110,000 Army personnel were given antidepressants, narcotics, sedatives, antipsychotics and anti-anxiety drugs while on duty in 2011. [3]


  • Between 2005 and 2011 the military increased its prescriptions of psychoactive drugs (antipsychotics, sedatives, stimulants and mood stabilizers) by almost 700 percent, according to The New York Times.
  • Prescriptions written for antipsychotic drugs for active-duty troops increased 1,083 percent from 2005 to 2011, while the number of antipsychotic drug prescriptions in the civilian population increased just 22 percent. [5]
  • The Department of Defense Suicide Event Reports (DoDSERs) for 2012 reported that the Armed Forces Medical Examiner System (AFMES) found that, as of 31 March 2013, there were 319 suicides among Active component Service members and 203 among Reserve component Service members. 92.8 percent of the Service members were male, with 39.6 percent aged between 17 and 24.
  • DoDSERs were only included in this report if they were submitted by April 1, 2013, and thus there are discrepancies between the figures reported by the AFMES and the number of DoDSERs included in the DoDSER 2012 report. In addition, there were some DoDSERs that were submitted for events that were still pending a final determination as a suicide.
  • A total of 841 Service members had one or more attempted suicides reported in the DoDSER program for CY 2012.
  • Some 134 suicide DoDSERs (42.1 percent) and 452 suicide attempt DoDSERs (52 percent) indicated a history of a behavioral disorder.
  • The reports also indicated that “93 decedents (29.2 percent) were reported to have ever taken psychotropic [1] medications. A total of 63 decedents (19.8 percent) were known to have used psychotropic medications within 90 days prior to suicide.” However, the true figure is likely to be much higher, as almost 21 percent of both the “Ever Taken Psychotropic Medication” and the “Use of Psychotropic Medication last 90 days” questions were answered with “Data Unavailable.” Potentially up to 50 percent of those committing suicide had at some point taken psychiatric drugs, and up to nearly 46 percent had taken them within 90 days. [6]
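
A back-of-envelope check, sketched in Python, of how the “potentially up to 50 percent” upper bound in the bullet above is reached; the only inputs are percentages the report itself gives (the “Data Unavailable” share is approximate):

```python
# Upper-bound logic: decedents KNOWN to have taken psychotropics plus decedents
# whose data were unavailable bound how many could possibly have taken them.
known_ever_taken = 29.2   # percent of decedents, per the DoDSER 2012 report
data_unavailable = 21.0   # approximate share answered "Data Unavailable"

upper_bound = known_ever_taken + data_unavailable
print(f"upper bound, ever taken psychotropic medication: ~{upper_bound:.0f}%")  # ~50%

# The same addition, applied to the 19.8 percent "last 90 days" figure with that
# question's own "Data Unavailable" share, yields the report's 90-day upper bound.
```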

[1] Psychotropic: A term coined in the late 1940s by Ralph Waldo Gerard, an American behavioral scientist and physiologist, to medically describe medication capable of affecting the mind, emotions, and behavior—from the Greek, “mind-turning.”

  • The majority (55 percent) of service members who died by suicide during 2008-2010 had never deployed and 84 percent had no documented combat experiences. In the 2012 DoD Suicide Event report on suicide, 52.2 percent of completed suicides had not been deployed in the recent wars and 56.5 percent of suicide attempts had no reported history of deployment.
  • The suicide rate increased by more than 150 percent in the Army and more than 50 percent in the Marine Corps between 2001 and 2009. From 2008 to 2010, military suicides were nearly double the number of suicides for the general U.S. population, with the military averaging 20.49 suicides per 100,000 people, compared to a general rate of 12.07 suicides per 100,000 people. [10]
  • There are hundreds of “sudden deaths” among veterans who have been prescribed massive cocktails of psychotropic [1] drugs, which a leading neurologist says are “probable sudden cardiac deaths.” Yet the practice of prescribing seven or more drugs documented to cause cardiac problems, stroke, violent behavior and suicide (to name but a few of the adverse effects) is still prevalent.

PSYCHOTROPIC MEDICATIONS: ACTS OF VIOLENCE

  • FORT HOOD GUNMAN IVAN LOPEZ, 34, was taking Ambien, a sleep agent, and other psychiatric drugs for depression and anxiety when he shot dead three colleagues and injured 16 others before killing himself on April 2, 2014. [11]
  • WASHINGTON NAVY YARD SHOOTER AARON ALEXIS, 34, who had been prescribed Trazodone, killed 12 people and wounded 8 before being killed by police on Sept. 16, 2013. [12]
  • SOLDIER PFC. DAVID LAWRENCE, 20, and MARINE LANCE CPL. DELANO HOLMES were both taking Trazodone and other psychiatric medications when they killed a Taliban commander in his prison cell and an Iraqi soldier, respectively.

PSYCHOTROPIC MEDICATIONS: VIOLENCE RISKS

  • It is important to understand that the mental health system for our Armed Forces and veterans often involves the use of psychotropic [1] and neuroleptic [2] drugs. Between 2001 and 2009, orders for psychiatric drugs for the military increased seven-fold. [14] In 2010, the Army Times reported that one in six service members was taking some form of psychiatric drug. [15]
  • A National Institutes of Health website warns consumers to report if, while taking Trazodone—one of the drugs prescribed to the Navy Yard shooter—they are “thinking about harming or killing yourself,” or experience “extreme worry; agitation; panic attacks…aggressive behavior; irritability; acting without thinking; severe restlessness; and frenzied abnormal excitement….”
  • Psychologists have blamed the surge in random acts of violence among the U.S. military on the heavy use of prescribed drugs. “We have never medicated our troops to the extent we are doing now… And I don’t believe the current increase in suicides and homicides in the military is a coincidence,” states Bart Billings, a former military psychologist and combat stress expert.
  • The Food and Drug Administration (FDA) MedWatch system that collects adverse drug reports revealed that between 2004 and 2012, there were 14,773 reports of psychiatric drugs causing violent side effects including: 1,531 (10.4 percent) reports of homicidal ideation/homicide, 3,287 (22.3 percent) reports of mania and 8,219 (55.6 percent) reports of aggression.
  • Dr. David Healy, a psychiatrist and former secretary of the British Association for Psychopharmacology, estimates that 90 percent of school shooters were users of antidepressants. These same medications are prescribed to at least 6 percent of our servicemen and women.

Supporting Information

“We have never medicated our troops to the extent we are doing now… The current increase in suicides and homicides is no coincidence.”

-Dr. Bart Billings, Fmr. Col. & Army Psychologist

This PDF has 34 pages of horrifying information, charts and statistics KNOWN to the VA, Congress and the “empathy experts” who are drugging our soldiers and destroying families.

 

 

Drug Expiration Dates / What They Mean $$$


From Harvard Health Newsletter

FDA study gets to the heart of medicine expiration and safety

Updated: September 2, 2015

This is a dilemma many people face in some way or another. A column published in Psychopharmacology Today offers some advice.

It turns out that the expiration date on a drug does stand for something, but probably not what you think it does. Since a law was passed in 1979, drug manufacturers have been required to stamp an expiration date on their products. This is the date through which the manufacturer can still guarantee the full potency and safety of the drug.

Most of what is known about drug expiration dates comes from a study conducted by the Food and Drug Administration at the request of the military. With a large and expensive stockpile of drugs, the military faced tossing out and replacing its drugs every few years. What they found from the study is that 90% of more than 100 drugs, both prescription and over-the-counter, were perfectly good to use even 15 years after the expiration date.

So the expiration date doesn’t really indicate a point at which the medication is no longer effective or has become unsafe to use. Medical authorities state that expired medicine is safe to take, even medicine that expired years ago. A rare exception to this may be tetracycline, but the report on this is controversial among researchers. It’s true the effectiveness of a drug may decrease over time, but much of the original potency still remains even a decade after the expiration date. Excluding nitroglycerin, insulin, and liquid antibiotics, most medications are as long-lasting as the ones tested by the military. Placing a medication in a cool place, such as a refrigerator, will help a drug remain potent for many years.

Is the expiration date a marketing ploy by drug manufacturers, to keep you restocking your medicine cabinet and their pockets regularly? You can look at it that way. Or you can also look at it this way: The expiration dates are very conservative to ensure you get everything you paid for. And, really, if a drug manufacturer had to do expiration-date testing for longer periods, it would slow their ability to bring you new and improved formulations. (“New and improved” often means the same drug with a slight alteration, but a new name, new advertising campaign and much higher price!)

The next time you face the drug expiration date dilemma, consider what you’ve learned here. If the expiration date passed a few years ago and it’s important that your drug is absolutely 100% effective, you might want to consider buying a new bottle. And if you have any questions about the safety or effectiveness of any drug, ask your pharmacist. He or she is a great resource when it comes to getting more information about your medications.

From CNN MONEY / A special drug

Globally, more than 130 million people are estimated to be living with Hepatitis C. Left untreated, the disease can be deadly. But sofosbuvir, released in late 2013 by U.S. biopharmaceutical firm Gilead (GILD), is effectively a cure. (Along with a barrage of nonstop ads on TV for “Harvoni” – hyped as a cure now available to everyone)

It’s also expensive, costing $84,000 for a 12-week course in the U.S. Doctors often prescribe the drug — sold under the brand names Sovaldi and Harvoni — in combination with others, further raising the overall cost of treatment.

As a result, insurers and government healthcare providers often pay for its use in only their sickest patients.

But in India, a 12-week course of the drug’s generic version can be purchased for just $500.

Foster Care / legal abuse of children / prescription meds

This is how the United States of America “loves” its children:

“California Moves To Stop Misuse Of Psychiatric Meds In Foster Care”

Efforts to protect children in foster care from being inappropriately medicated with powerful antipsychotic drugs got a big boost on Tuesday, when California Gov. Jerry Brown signed three bills into law designed to reform prescribing. Overprescribing of psychiatric meds for foster youth is a persistent problem nationwide, with children given the drugs at double or triple the rate of those not in foster care.

In 2011, the federal Government Accountability Office found nearly 1 in 4 children in foster care was taking psychotropic medications, which include antipsychotics, antidepressants, mood stabilizers and stimulants. Hundreds of children were found to be taking five or more psychotropic medications at a time, and thousands were prescribed doses that exceeded FDA-approved guidelines. According to the report, monitoring programs fell short of guidelines established by the American Academy of Child and Adolescent Psychiatry. Many of the medications have side effects that include lethargy, weight gain, diabetes and tremors – (which can be permanent) (And far worse – psychotropic drugs that are forced on children are CAUSING brain damage and “psychotic” behavior.)

The California legislation, which covers 63,000 children and teens in foster care, will allow public health nurses access to medical records to monitor the foster children who are prescribed psychotropic drugs; identify the group homes that rely most on these medications and potentially require them to take corrective action (No consequences? Why am I not surprised?) and provide child welfare workers with better training and oversight tools to spot dangerous prescribing practices. 

“I hope the approval of this legislation tells our foster care youth that we love them, that their lives matter to all of us and that we care deeply about their future,” said state Sen. Jim Beall, a San Jose Democrat, author of two of the bills. (Really?)

The Oakland-based National Center for Youth Law, which was among the legislation’s sponsors, called it the most comprehensive effort in the U.S. to date to curb the misuse of psychotropics in foster care.

However, a bill that would have required a prior medical examination and ongoing monitoring before a juvenile court could authorize psychotropic drugs was pulled from the legislative package following intensive lobbying by associations representing physicians and group homes.

The bill’s author said he would reintroduce the measure in January.


Elaine Korry writes about health care and social policy from the San Francisco Bay Area.

“By the time DeAngelo Cortijo was 14, he had been in more than a dozen foster homes. He had run away and lived on the streets for months, and he had been diagnosed with bipolar and anxiety disorders, attachment disorder, intermittent explosive disorder or posttraumatic stress disorder. He had been in and out of mental hospitals and heavily medicated.” To read more: antipsychotic medications and foster care